
admin  2022-02-15

Question   We don’t think of everyday devices as biased against race or gender, but they can be. Electrical engineer and computer scientist Achuta Kadambi is familiar with the problem both professionally and personally. "Being fairly dark-skinned myself," Kadambi says, he sometimes cannot activate no-touch faucets that detect light bouncing off skin. At one airport, he recalls, "I had to ask a lighter-skinned traveler to trigger a faucet for me."
  Medical devices, too, can be biased. In a recent article, Kadambi describes three ways that racial and gender bias can permeate medical devices and suggests a number of solutions.
  The first problem, Kadambi says, is physical bias, which is inherent in the mechanics of the device. Then there is computational bias, which lies in the software or in the data sets used to develop the gadget. Finally, there is interpretation bias, which resides not in the machine but in its user. It occurs when clinicians apply unequal, race-based standards to the readouts from medical devices and tests—an alarmingly common practice.
  Physical bias made news last December when a study at the University of Michigan found that pulse oximeters—which use light transmitted through skin and tissue to measure the oxygen in a person’s blood—are three times more likely to miss low oxygen levels in black patients than in white ones.
  Computational biases can creep into medical technology when it is tested primarily on a homogeneous group of subjects—typically white males. For instance, an artificial-intelligence system used to analyze chest x-rays and identify 14 different lung and chest diseases worked less well for women when trained on largely male scans. But training the system on a gender-balanced sample produced the best overall results, with no significant loss of accuracy for men.
  Stopping computational bias means making a much greater effort to recruit people from different populations to participate in the design and testing of medical devices. In addition to building diversity among researchers, Rachel Hardeman, a public health scientist at the University of Minnesota, favors mandatory training of medical personnel, a step that might also help counter practices that lead to interpretation bias. California has moved in this direction, she notes, with a 2020 law requiring health-care providers treating pregnant women and their newborns to complete a curriculum aimed at closing racial gaps in maternal and infant mortality.
  Fairness, Kadambi argues, should be a criterion for evaluating new technology, along with effectiveness.
Which of the following is an example of interpretation bias?

Options   A. The intrinsic limit of techniques.
B. Data sets trained on homogeneous subjects.
C. Programmers composed of a single gender.
D. Disparity of readings from devices and tests.

Answer: D

Explanation   This is an inference question. The question stem points to paragraph three, where the author defines the three kinds of bias. From "physical bias, which is inherent in the mechanics of the device," we know physical bias arises from a device's internal mechanics; option A is a paraphrase of this, so it is eliminated. From "computational bias, which lies in the software or in the data sets," we know computational bias arises from the software or the data sets, which eliminates options B and C. Option D corresponds to the fourth sentence of the paragraph, where "disparity" matches "unequal" and "readings" matches "readouts" as synonymous substitutions, so option D is correct.