These algorithms look at X-rays and somehow detect your race


Millions of dollars are being spent to develop artificial intelligence software that reads X-rays and other medical scans, in the hope that it can spot things doctors look for but sometimes miss, such as lung cancer. A new study reports that these algorithms can also see something doctors don't look for in such scans: the patient's race.

The study's authors and other medical AI experts say the results make it more important than ever to check that health algorithms perform fairly for people of different racial identities. Complicating that task: the authors themselves are not sure what cues the algorithms they created use to predict a person's race.

The evidence that algorithms can read race from a person's medical scans came from tests on five types of imagery used in radiology research, including chest and hand X-rays and mammograms. The images included patients who identified as Black, white, and Asian. For each type of scan, the researchers trained algorithms using images labeled with a patient's self-reported race. Then they challenged the algorithms to predict the race of patients in different, unlabeled images.
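To make the setup concrete, here is a minimal sketch, not the authors' code, of that kind of experiment: fine-tune a standard image classifier on scans labeled with self-reported race, then predict race on held-out images. The directory layout, model choice, and hyperparameters are assumptions for illustration only.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # X-rays are single-channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical layout: one subfolder per self-reported label,
# e.g. scans/train/black, scans/train/white, scans/train/asian
train_set = datasets.ImageFolder("scans/train", transform=transform)
test_set = datasets.ImageFolder("scans/test", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

# Standard pretrained backbone with a new 3-way classification head
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Challenge the trained model with images whose labels it has never seen
model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        predictions = model(images).argmax(dim=1)
        correct += (predictions == labels).sum().item()
        total += labels.numel()
print(f"held-out accuracy: {correct / total:.2%}")
```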

Radiologists do not generally consider a person's racial identity, which is not a biological category, to be visible on scans that look beneath the skin. Yet the algorithms somehow proved able to detect it accurately for all three racial groups, and across different views of the body.

For most types of scans, the algorithms could correctly identify which of two images came from a Black patient more than 90 percent of the time. Even the worst-performing algorithm succeeded 80 percent of the time; the best was 99 percent correct. The results and the related code were posted online late last month by a group of more than 20 researchers with expertise in medicine and machine learning, but the study has not yet been peer-reviewed.
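The "which of two images" phrasing is the standard pairwise reading of the area under the ROC curve: an AUC of 0.9 means a randomly chosen positive example is scored above a randomly chosen negative one 90 percent of the time. A small sketch on purely synthetic scores, assuming scikit-learn, shows how the two quantities line up.

```python
# Toy illustration (invented data, not the study's) of the pairwise
# interpretation of AUC used in the article's "which of two images" framing.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)              # 1 = positive class (toy labels)
scores = labels + rng.normal(scale=0.7, size=1000)  # model scores, higher for positives

auc = roc_auc_score(labels, scores)

# Empirical pairwise check: draw one positive and one negative at random
# many times and count how often the positive is scored higher.
pos = scores[labels == 1]
neg = scores[labels == 0]
wins = np.mean(rng.choice(pos, 100_000) > rng.choice(neg, 100_000))

print(f"AUC: {auc:.3f}, pairwise win rate: {wins:.3f}")  # the two agree closely
```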

The results have stoked new concerns that artificial intelligence software may amplify inequality in health care, where studies show that Black patients and people from other marginalized racial groups often receive inferior care compared to wealthy or white people.

Machine learning algorithms are tuned to read medical images by feeding them many labeled examples of conditions such as tumors. By digesting many examples, an algorithm can learn patterns of pixels statistically associated with those labels, such as the texture or shape of a lung nodule. Some algorithms built that way rival doctors at detecting cancers or skin problems; there is evidence they can detect signs of disease invisible to human experts.

Judy Gichoya, a radiologist and assistant professor at Emory University who worked on the new study, says the revelation that image algorithms can "see" race in internal scans likely primes them to also learn inappropriate associations.

Medical data used to train algorithms often bears traces of racial inequality in disease and treatment, due to historical and socioeconomic factors. That could lead an algorithm searching for statistical patterns in scans to use its guess at a patient's race as a kind of shortcut, suggesting diagnoses that correlate with racially biased patterns in its training data rather than only the visible medical abnormalities that radiologists look for. Such a system might give some patients an incorrect diagnosis or a false all-clear. An algorithm might suggest different diagnoses for a Black person and a white person with similar signs of disease, as the toy example below illustrates.
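Here is a minimal, purely synthetic sketch of that shortcut failure mode; the feature names, numbers, and logistic-regression model are all invented for illustration and have nothing to do with the study's actual data or methods. When a feature acting as a race proxy is correlated with the diagnosis label in biased training data, the model leans on it and gives different predictions to patients with identical disease signals.

```python
# Synthetic demonstration of shortcut learning on a biased training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

# One genuine disease signal, plus a proxy feature that, in this invented
# "biased" dataset, is itself correlated with the recorded diagnosis.
disease_signal = rng.normal(size=n)
proxy = rng.integers(0, 2, size=n)  # stand-in for a learned race proxy
label = ((disease_signal + 1.5 * proxy + rng.normal(scale=0.5, size=n)) > 1).astype(int)

model = LogisticRegression().fit(np.column_stack([disease_signal, proxy]), label)

# Two hypothetical patients with identical disease signals but different
# proxy values receive different predicted probabilities of disease.
same_signal = 0.8
p_group0 = model.predict_proba([[same_signal, 0]])[0, 1]
p_group1 = model.predict_proba([[same_signal, 1]])[0, 1]
print(f"predicted risk, proxy=0: {p_group0:.2f}; proxy=1: {p_group1:.2f}")
```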

"We have to educate people about this problem and research what we can do to mitigate it," Gichoya says. Her collaborators on the project came from institutions including Purdue University, the Massachusetts Institute of Technology, Beth Israel Deaconess Medical Center, National Tsing Hua University, the University of Toronto, and Stanford University.

Previous research has shown that medical algorithms can introduce bias into care delivery and that image algorithms may perform unequally across demographic groups. In 2019, a widely used algorithm for prioritizing care for the sickest patients was found to disadvantage Black people. In 2020, researchers at the University of Toronto and MIT showed that algorithms trained to flag conditions such as pneumonia on chest X-rays sometimes performed differently for people of different sexes, ages, races, and types of health insurance.

