While bias, discrimination and prejudice are often discussed, the conversation usually centres on people: how they view each other and what contributory factors are at play. It's far less common to consider that gadgets and devices could hold similar sorts of bias… until now.
Over the last few years, the issue of medical device bias has come increasingly to the fore, spanning everything from physical bias to computational bias and interpretation bias.
Physical bias stems from a device's own hardware, for example touch activation failing because the sensor cannot reliably detect light reflecting off darker skin. Computational bias, meanwhile, relates to issues within the software and the data sets it relies on, whereas interpretation bias refers to applying unequal, race-based standards when reading tests and results.
The use of health-monitoring medical equipment has soared since the global pandemic, but it's important to note that some people will receive more accurate results than others.
Measurement errors can be inconsequential, but when a device is being used because someone has a serious health condition that may require hospitalisation, an inaccurate reading could mean they aren't treated appropriately, which could prove life-threatening.
Of course, these devices are here to stay and are likely to become only more popular over time, so it's essential that they serve a diverse population and work effectively for the wider public.
In the UK, the government has just outlined action it intends to take to tackle ethnic and other biases in medical devices, including ensuring that pulse oximeters can be used accurately across a range of skin tones and removing racial bias from clinical study data sets.
It's widely agreed that, unless appropriate action is taken, ethnic and other unfair biases can appear throughout the entire life cycle of these devices.