CR investigates: How tech can discriminate

By: Consumer Reports

Technology is meant to improve our lives, but that doesn’t always happen. As a new Consumer Reports investigation reveals, some of the things that power our lives each day can contain hidden biases that result in unfair practices toward communities of color.

For decades, people of color were kept out of home ownership through a practice known as redlining. It’s now illegal, although its effects are still evident.

Much of the information used in redlining has been fed into new algorithms that end up doing essentially the same thing.

A new Consumer Reports documentary series called “Bad Input” sheds light on the ways in which technology is failing in home lending, medicine, and facial recognition and security. It raises the question: Can technology be racist?

The answer is yes. If tech is fed bad information, it will continue to give us bad outputs.

For example, since the beginning of the pandemic, pulse oximeters have helped save lives by monitoring a person’s blood oxygen level. But a study by the University of Michigan found that the devices aren’t as accurate for Black people as they are for White people. And that has delayed the care that many people of color need, which could have dire consequences.

Facial recognition is another example of how technology can go wrong. It can be found everywhere: on your phone, at the self-checkout at a store, even at events where security is scanning the crowd.

We’ve seen cases across the U.S. where people have been misidentified and faced criminal charges because of it.

To learn more about discriminatory technology, watch CR’s three-part documentary series, “Bad Input,” at BADINPUT.ORG.

Some ways to use technology more responsibly include limiting how often you post pictures of yourself and your family on social media, join public WiFi networks, or sign up for services that require you to enter personal information.
