Posted by admin, 2022-02-15

Question   We don’t think of everyday devices as biased against race or gender, but they can be. Electrical engineer and computer scientist Achuta Kadambi is familiar with the problem both professionally and personally. "Being fairly dark-skinned myself," Kadambi says, he sometimes cannot activate no-touch faucets that detect light bouncing off skin. At one airport, he recalls, "I had to ask a lighter-skinned traveler to trigger a faucet for me."
  Medical devices, too, can be biased. In a recent article, Kadambi describes three ways that racial and gender bias can permeate medical devices and suggests a number of solutions.
  The first problem, Kadambi says, is physical bias, which is inherent in the mechanics of the device. Then there is computational bias, which lies in the software or in the data sets used to develop the gadget. Finally, there is interpretation bias, which resides not in the machine but in its user. It occurs when clinicians apply unequal, race-based standards to the readouts from medical devices and tests—an alarmingly common practice.
  Physical bias made news last December when a study at the University of Michigan found that pulse oximeters—which use light transmitted through skin and tissue to measure the oxygen in a person’s blood—are three times more likely to miss low oxygen levels in black patients than in white ones.
  Computational biases can creep into medical technology when it is tested primarily on a homogeneous group of subjects—typically white males. For instance, an artificial-intelligence system used to analyze chest x-rays and identify 14 different lung and chest diseases worked less well for women when trained on largely male scans. But training the system on a gender-balanced sample produced the best overall results, with no significant loss of accuracy for men.
  Stopping computational bias means making a much greater effort to recruit people from different populations to participate in the design and testing of medical devices. In addition to building diversity among researchers, Rachel Hardeman, a public health scientist at the University of Minnesota favors mandatory training of medical personnel, a step that might also help counter practices that lead to interpretation bias. California has moved in this direction, she notes, with a 2020 law requiring health-care providers treating pregnant women and their newborns to complete a curriculum aimed at closing racial gaps in maternal and infant mortality.
  Fairness, Kadambi argues, should be a criterion for evaluating new technology, along with effectiveness.
Which of the following is the author likely to agree with?

Options A. People should be friendlier to black people.
B. More funds should be donated to invent medical devices.
C. It's impossible to stop the three types of biases.
D. Equality should be taken into account when assessing technologies.

Answer: D

Explanation: This is an opinion question. The passage says that the devices themselves exhibit "racial bias" against black people, not that other people are biased against them, so option A is eliminated. Option B is unsupported: the passage makes no such appeal for funding, so it is eliminated. Paragraphs four through six describe the three types of bias in detail along with measures to counter them; for example, the first sentence of paragraph six says that stopping computational bias requires a much greater effort, not that it is impossible, so option C is eliminated. From the article as a whole, and the last paragraph in particular, the author argues that fairness should be considered alongside effectiveness when evaluating new technology, so option D is correct.
Please credit the original source when reposting: https://kaotiyun.com/show/LImZ777K