Neutral Algorithms? They’re Not So Neutral
By Claire Cain Miller
The online world is shaped by forces beyond our control, determining the stories we read on Facebook and the search results we see on Google. Big data is used to make decisions about health care, employment, housing, education and policing.
But can computer programs be discriminatory?
There is a widespread belief that software and algorithms that rely on data are objective. But algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, researchers say, algorithms can reinforce human prejudices.
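To make that feedback loop concrete, here is a minimal toy sketch (an illustration only, assuming a greedy click-rate learner; the group names, click rate, and selection rule are all invented and bear no relation to any real ad system). Even when two groups click an ad at exactly the same true rate, a learner that keeps favoring whichever group has the higher observed rate can end up showing the ad almost exclusively to one of them.

```python
import random

random.seed(0)

# Toy click-trained ad selector (invented; not any real system). Both
# groups click at the same true rate, so any disparity the learner
# produces comes purely from the feedback loop, not from real differences.
TRUE_CTR = 0.10

clicks = {"group_a": 1, "group_b": 1}        # smoothed click counts
impressions = {"group_a": 1, "group_b": 1}   # ads shown so far

def pick_group():
    """Greedily pick the group with the higher observed click rate."""
    return max(clicks, key=lambda g: clicks[g] / impressions[g])

for _ in range(10_000):
    g = pick_group()
    impressions[g] += 1
    if random.random() < TRUE_CTR:   # both groups share one true rate
        clicks[g] += 1

# Typically one group ends up with the vast majority of impressions:
# an early lucky streak raises its observed rate, the greedy rule then
# keeps choosing it, and the gap becomes self-sustaining.
print(impressions)
```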
Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, according to a new study by researchers at Carnegie Mellon University in Pittsburgh, Pennsylvania.
Research from Harvard University found that ads for arrest records were significantly more likely to show up on searches for distinctively black names or a historically black fraternity.
Algorithms, the series of step-by-step instructions written by programmers, and the online results they produce reflect people’s attitudes and behavior. Machine learning algorithms learn and evolve based on what people do online. The autocomplete feature on Google and Bing is an example. A recent Google search for “Are transgender,” for instance, suggested, “Are transgenders going to hell.”
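The mechanism behind such suggestions can be pictured as simple frequency ranking. The sketch below is a loose assumption about the general technique (real autocomplete systems are far more elaborate), with invented queries and counts: the system surfaces whatever past users typed most often under a given prefix, prejudices included.

```python
from collections import Counter

# Loose sketch of frequency-ranked autocomplete (invented data; real
# systems are far more elaborate): suggestions are simply the most
# common past queries sharing the typed prefix, so the feature echoes
# back whatever users type most, including prejudiced queries.

past_queries = Counter({
    "are tomatoes a fruit": 120,
    "are transgender rights protected": 40,
    "are trees alive": 75,
})

def autocomplete(prefix, k=3):
    """Return up to k of the most frequent past queries starting with prefix."""
    matches = Counter({q: n for q, n in past_queries.items()
                       if q.startswith(prefix)})
    return [q for q, _ in matches.most_common(k)]

print(autocomplete("are t"))   # ranked purely by past user behavior
```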
“Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination,” said David Oppenheimer, who teaches discrimination law at the University of California, Berkeley.
The Carnegie Mellon researchers built a tool to simulate Google users that started with no search history and then visited employment websites. Later, on a third-party news site, Google showed an ad for a career coaching service advertising high-paying executive positions 1,852 times to men and 318 times to women.
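The study itself used purpose-built browser automation; the sketch below is only a hypothetical, stripped-down stand-in for that experimental design. The serve_ad stub, its bias, and the profile counts are all assumptions made so the demo produces a visible disparity. The point is the shape of the audit: fresh profiles that differ only in declared gender, identical browsing, and a tally of the ads served.

```python
import random
from collections import Counter

random.seed(1)

def serve_ad(gender):
    """Stand-in for the ad system under audit. Pure assumption: the real
    system is a black box, and this stub is biased only so the demo
    shows a disparity."""
    p_exec = 0.85 if gender == "male" else 0.15
    return "exec_coaching" if random.random() < p_exec else "other"

def audit(n_profiles=1000):
    """Create fresh profiles that differ only in declared gender, let each
    'visit' the same sites, and tally the ads each profile is served."""
    tallies = {"male": Counter(), "female": Counter()}
    for gender in tallies:
        for _ in range(n_profiles):   # each iteration = one clean profile
            tallies[gender][serve_ad(gender)] += 1
    return tallies

for gender, counts in audit().items():
    print(gender, counts["exec_coaching"], "exec-coaching ads out of 1000")
```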
The reason for the difference could have been that the advertiser requested that the ads target men, or that the algorithm determined that men were more likely to click on the ads.
Google said in a statement, “Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed.”
Anupam Datta, one of the Carnegie Mellon researchers, said, “Given the big gender pay gap we’ve had between males and females, this type of targeting helps to perpetuate it.”
Companies can regularly run simulations to test the results of their algorithms. Mr. Datta suggested that algorithms “be designed from scratch to be aware of values and not discriminate.”
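One concrete form such a regular check could take is a demographic-parity test over impression counts. The sketch below is an assumption, not anything the researchers published: the parity_check function and its 0.8 threshold (borrowed from the four-fifths rule of thumb in US employment guidelines) are illustrative, applied here to the counts the study reported. Because the experiment used equal numbers of simulated men and women, raw counts can stand in for rates.

```python
def parity_check(counts_by_group, threshold=0.8):
    """Flag any group whose impression count falls below `threshold`
    times the best-served group's count (a demographic-parity ratio).
    Assumes the groups are the same size, so counts proxy for rates."""
    top = max(counts_by_group.values())
    return {group: {"ratio": round(n / top, 2), "ok": n / top >= threshold}
            for group, n in counts_by_group.items()}

# Impression counts reported in the Carnegie Mellon study:
print(parity_check({"men": 1852, "women": 318}))
# -> women's ratio is about 0.17, far below 0.8, so the check flags it
```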
Silicon Valley, however, is known for pushing out new products without considering the societal or ethical implications.
Deirdre Mulligan of the School of Information at the University of California, Berkeley, said, “There’s a huge rush to innovate, a desire to release early and often – and then do cleanup.”
Original article:
http://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html
2015-08-11 United Daily News, page G9 / UNITED DAILY NEWS; translated by Chang Yu-sheng. The original appeared in The New York Times Weekly, page 11, right column.