New York Times in Review: Facial-Recognition Systems Show Racial and Gender "Bias"


Software To Recognize Faces Is Found To Be Biased
Facial-Recognition Systems Found to Have Racial and Gender "Bias"
By Natasha Singer and Cade Metz

The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released recently, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African American and Asian faces 10 times to 100 times more than Caucasian faces, the National Institute of Standards and Technology reported. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

The technology also had more difficulty identifying women than men. And it falsely identified older adults up to 10 times more than middle-aged adults.

The new report comes at a time of mounting concern from lawmakers and civil rights groups over the proliferation of facial recognition. Proponents view it as an important tool for catching criminals and tracking terrorists. Tech companies market it as a convenience that can be used to help identify people in photos or in lieu of a password to unlock smartphones.

Civil liberties experts, however, warn that the technology – which can be used to track people at a distance without their knowledge – has the potential to lead to ubiquitous surveillance, chilling freedom of movement and speech. Last year, San Francisco, Oakland and Berkeley in California and the Massachusetts communities of Somerville and Brookline banned government use of the technology.

“One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse,” Jay Stanley, a policy analyst at the American Civil Liberties Union, said in a statement. “Government agencies including the FBI, Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”

The federal report is one of the largest studies of its kind. The researchers had access to more than 18 million photos of about 8.5 million people from American mug shots, visa applications and border-crossing databases.

The National Institute of Standards and Technology tested 189 facial-recognition algorithms from 99 developers, representing the majority of commercial developers. They included systems from Microsoft, biometric technology companies like Cognitec, and Megvii, an artificial intelligence company in China.

The federal report confirms earlier studies from MIT reporting that facial-recognition systems from some large tech companies had much lower accuracy rates in identifying female and darker-skinned faces than white male faces.


2020-01-12. 聯合報. D4. New York Times in Review. Translated by 莊蕙嘉; copy-edited by 樂慧生

說文解字看新聞 (Parsing the News, Word by Word), by 莊蕙嘉

This article reports that facial-recognition systems were found in a study to have sizable error rates. Facial recognition, a hot technology in recent years, is a form of biometrics: it uses computers and artificial intelligence to record and analyze facial features, such as the distance between the pupils, the position of the bones, and the size of the eyes, mouth and nose, and then matches those features against stored records to establish a person's identity.
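The matching step described above, comparing measured facial features against stored records, can be sketched as a simple feature-vector comparison. This is an illustrative sketch only: real systems use learned deep-network embeddings, and the feature values and threshold below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(features_a, features_b, threshold=0.9):
    """Declare a match when similarity exceeds a tuned threshold.
    The threshold here is invented for illustration, not from any real system."""
    return cosine_similarity(features_a, features_b) >= threshold

# Invented feature vectors (e.g. pupil distance, nose width, ...) for two photos
probe = [0.62, 0.31, 0.88, 0.45]
enrolled = [0.60, 0.33, 0.85, 0.47]
print(is_same_person(probe, enrolled))  # prints True: the vectors are very close
```

The bias the article describes arises when such similarity scores are systematically less reliable for some demographic groups than others, so a fixed threshold produces more false matches for those groups.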

recognize and identify are, broadly speaking, synonyms; both mean "to recognize or identify." If we draw a finer distinction, the former means to know who that person is or what that thing is, while the latter means to know someone or something and be able to distinguish them from others.


In addition, recognize can carry the approving sense of "to acknowledge," as in "recognize A as an outstanding scientist"; identify also has the sense of "to identify with," as in "A identifies himself with the (religious/political/ethnic) group."

Earth Science Has a Whiteness Problem
In U.S. Earth Science, 9 in 10 Doctorate Recipients Are White
By Emma Goldberg

When Arianna Varuolo-Clarke was growing up, her favorite evenings were spent watching the Weather Channel with her grandfather. She wanted to “chase thunderstorms” and understand where tornadoes came from, she said. She decided to become an atmospheric scientist. In 2014, she landed an internship at the National Center for Atmospheric Research as a college sophomore, and quickly realized that her path as a woman of color would not be easy.

“You’d walk through the halls and it’s a lot of old white men,” Varuolo-Clarke said. Still, she pushed forward and began her Ph.D. in atmospheric science at Columbia University in 2018.

The field’s lack of diversity gained new urgency in May when her graduate student cohort was targeted with a series of racist emails. The messages, sent to affiliates of the Lamont-Doherty Earth Observatory at Columbia by a person outside the community, said that black people were genetically inferior and did not belong in academia.

It was “hurtful and invalidating” to be told that she didn’t belong in the world that had drawn her in since childhood, Varuolo-Clarke said. “It was an isolated incident. But it brought to the surface what still needs to be done in the field.”

In a commentary last month in Nature Geoscience, Kuheli Dutt, Lamont-Doherty’s assistant director for academic affairs and diversity, wrote that “a lack of diversity and inclusion is the single largest cultural problem facing the geosciences today.”

The geosciences – which include the study of planet Earth, its oceans, its atmosphere and its interactions with human society – are among the least diverse across all fields of science. Nearly 90% of doctoral-degree recipients are white. In the country’s top 100 geoscience departments, people of color hold under 4% of tenured or tenure-track positions. A 2016 survey from the National Science Foundation showed that representation of people of color in geosciences has barely budged in the past four decades, although significant gains have been made in terms of gender balance.

Asian-Americans are better represented than other people of color, according to Dutt, accounting for 6% of those earning geoscience doctorates in 2016. Between 1973 and 2016, just 20 Native American women, 69 black women and 241 Hispanic women earned doctorates in the field, of some 22,600 total.

The field’s lack of diversity begins with a pipeline problem, geoscientists say. National surveys have shown that black people are less likely than white people to participate in outdoor activities.


