Study finds racial bias in facial recognition tools


Many facial recognition systems misidentify people of color more often than white people, according to a U.S. government study released on Thursday that is likely to increase skepticism of technology widely used by law enforcement agencies.

The study by the National Institute of Standards and Technology (NIST) found that, when conducting a particular type of database search known as “one-to-one” matching, many facial recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces.

The study also found that African-American females are more likely to be misidentified in “one-to-many” matching, which can be used for identification of a person of interest in a criminal investigation.
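For readers unfamiliar with the two search modes, the minimal sketch below (Python) illustrates the distinction: one-to-one matching compares a probe image against a single claimed identity, while one-to-many matching searches an entire gallery. The threshold, random embeddings, and names are invented for illustration; this is not NIST's benchmark code or any vendor's algorithm.

```python
import numpy as np

# Hypothetical decision threshold; real systems tune this to trade off
# false positives against false negatives.
THRESHOLD = 0.6

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_one_to_one(probe, claimed_template):
    """One-to-one matching: is the probe the same person as the single
    claimed identity? Accepting an impostor here is a false positive."""
    return cosine_similarity(probe, claimed_template) >= THRESHOLD

def identify_one_to_many(probe, gallery):
    """One-to-many matching: search a whole gallery (e.g. persons of
    interest) and return the best identity above the threshold, if any."""
    best_id, best_score = None, THRESHOLD
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy usage: random embeddings stand in for a real face encoder's output.
rng = np.random.default_rng(seed=0)
probe = rng.normal(size=128)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
print(verify_one_to_one(probe, gallery["person_0"]))
print(identify_one_to_many(probe, gallery))
```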

While some companies have played down earlier findings of bias in technology that can guess an individual’s gender, known as “facial analysis,” the NIST study provided evidence that face matching struggled across demographics, too.

Joy Buolamwini, founder of the Algorithmic Justice League, called the report “a comprehensive rebuttal” of those saying artificial intelligence (AI) bias was no longer an issue. The study comes at a time of growing discontent over the technology in the United States, with critics warning it can lead to unjust harassment or arrests.

For the report, NIST tested 189 algorithms from 99 developers, excluding companies such as Amazon.com Inc that did not submit an algorithm for review. What it tested differs from what companies sell, in that NIST studied algorithms detached from the cloud and from proprietary training data.

China’s SenseTime, an AI startup valued at more than $7.5 billion, had “high false match rates for all comparisons” in one of the NIST tests, the report said.

SenseTime’s algorithm produced a false positive more than 10% of the time when looking at photos of Somali men. If such a system were deployed at an airport, a Somali man could pass a customs check roughly one in every 10 times he used the passports of other Somali men.
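The back-of-the-envelope sketch below spells out that arithmetic and the earlier “10 to 100 times” comparison; it uses only the rounded figures quoted in the article, and the per-group rates at the end are hypothetical, not values taken from the NIST report.

```python
# Expected impostor passes at a ~10% false match rate.
false_match_rate = 0.10   # "more than 10%" of comparisons in this test
attempts = 10             # hypothetical impostor passport presentations
expected_passes = false_match_rate * attempts
print(f"Expected checks passed: {expected_passes:.0f} of {attempts}")

# The "10 to 100 times" disparity is a ratio of per-group false match rates.
# These rates are illustrative placeholders, not figures from the report.
fmr_by_group = {"group_a": 0.0001, "group_b": 0.01}
ratio = fmr_by_group["group_b"] / fmr_by_group["group_a"]
print(f"group_b is falsely matched {ratio:.0f}x as often as group_a")
```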

SenseTime said the report “reflects an isolated case” and that what it submitted had bugs it has addressed. “The results are not reflective of our products, as they undergo thorough testing before entering the market (and) all report a high degree of accuracy,” it said.

Yitu, another AI startup from China, was more accurate and had little racial skew.

Microsoft Corp had almost 10 times more false positives for women of color than men of color in some instances during a one-to-many test. Its algorithm showed little discrepancy in a one-to-many test using photos of only black and white males.

Microsoft said it was reviewing the report and did not have a comment on Friday morning.

Congressman Bennie Thompson, chairman of the U.S. House Committee on Homeland Security, said the findings of bias were worse than feared, at a time when customs officials are adding facial recognition to travel checkpoints.

“The administration must reassess its plans for facial recognition technology in light of these shocking results,” he said.

