From predicting who will be a repeat offender to deciding who is the best candidate for a job, computer algorithms are now making complex decisions in place of humans. But increasingly, many of these algorithms are being found to replicate the same racial, socioeconomic or gender-based biases they were built to overcome.
This racial bias extends to software widely used in the health care industry, potentially affecting access to care for millions of Americans, according to a new study by researchers at the University of California, Berkeley, the University of Chicago Booth School of Business and Partners HealthCare in Boston.
The new study, published Oct. 25 in the journal Science, found that a type of software program used to determine who gets access to high-risk health care management programs routinely admits healthier white patients ahead of black patients who are less healthy. Fixing this bias in the algorithm could more than double the number of black patients automatically admitted to these programs, the study revealed.