News Comparison: Conservatives Furious, Blast Facebook for Manipulating "Trending Topics"

kkhsu

Facebook, Facing Bias Claims, Shows How Editors and Algorithms Guide News
By MIKE ISAAC

SAN FRANCISCO — Facebook, the largest social media network, published internal editorial guidelines on Thursday, the company’s latest attempt to rebut accusations that it is politically biased in the news content it shows on the pages of its 1.6 billion users.

The 28-page document details how both editors and computer algorithms play roles in the process of picking what should appear in the “Trending Topics” section of users’ Facebook pages.

Facebook describes a list of processes it uses to display some of the most popular content across the network, including relying on algorithms to detect up-and-coming news trends as well as a team of editors who, much like a newsroom, direct how those topics are presented and decide what should be displayed to people who regularly use the service.
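
To make the algorithmic half of that pipeline concrete, here is a minimal sketch in Python of what "detecting up-and-coming news trends" could look like; the topic names, hourly share counts and spike threshold are invented purely for illustration and are not Facebook's actual system:

    # Hypothetical hourly share counts per topic (illustrative data only).
    hourly_shares = {
        "eurovision":   [120, 130, 125, 900, 1800],   # sudden spike in shares
        "local_sports": [300, 310, 290, 305, 300],    # steady, no spike
    }

    def detect_spikes(counts_by_topic, spike_ratio=3.0):
        """Flag topics whose latest share count far exceeds their recent average."""
        candidates = []
        for topic, counts in counts_by_topic.items():
            baseline = sum(counts[:-1]) / len(counts[:-1])  # average of earlier hours
            latest = counts[-1]
            if baseline > 0 and latest / baseline >= spike_ratio:
                candidates.append((topic, round(latest / baseline, 2)))
        # Biggest relative spike first; these would then go to human editors.
        return sorted(candidates, key=lambda pair: pair[1], reverse=True)

    print(detect_spikes(hourly_shares))  # -> [('eurovision', 5.65)]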

As the guidelines make clear, at practically every point in the process, a human editor is given the leeway to exercise his or her editorial influence.

The document was released just days after a report on the tech news site Gizmodo said Facebook editors had intentionally “suppressed” news topics from conservative publications trending across the network. The report also said editors were able to artificially inflate the importance of other topics by “injecting” them into the Trending section of users’ Facebook pages.

Since those claims surfaced, Facebook has been questioned by news sites across the political spectrum and by legislators in Washington. On Thursday, critics urged the company to consider the biases of its editors.

“As long as Facebook is hiring editors who lean left politically, those stories are going to get preferential treatment,” Erick Erickson, former editor in chief of the conservative website RedState and founder of another conservative site called The Resurgent, said in an email. “I’d hope that Facebook would take care to consider all views and all news.”

The company has continued to deny accusations of political bias, pointing to editorial rules that discourage Trending Topics staff members from taking one viewpoint or another.

“The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum,” Justin Osofsky, vice president for global operations at Facebook, said in a company blog post on Thursday. “Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period.”

The Guardian first reported on Facebook’s editorial guidelines.

As Facebook has noted several times this week, algorithms drive much of the decision-making for its Trending Topics, according to the documents. And the company said it has not found evidence that any editor intentionally manipulated the section to suppress conservative content.

But the guidelines, which have never before been made public, give insight into how editors guide and discover news items being shared widely across the social network, and how those editors decide what to promote inside the Trending Topics section.

While algorithms determine the exact mix of topics displayed to each person, based on that user’s past actions on Facebook, a team of people is largely responsible for the overall mix of which topics should — and more important, should not — be shown in Trending Topics.

For instance, after algorithms detect early signs of popular stories on the network, editors are asked to cross-reference potential trending topics with a list of 10 major news publications, including CNN, Fox News, The Guardian and The New York Times.
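
That cross-referencing step could, in spirit, look something like the sketch below; the outlet names come from the article, but the headline data, the keyword-matching rule and the function name are assumptions made purely for illustration:

    # Hypothetical recent headlines from major outlets (illustrative data only).
    MAJOR_OUTLET_HEADLINES = {
        "CNN":                ["Eurovision final draws record audience", "Markets edge higher"],
        "Fox News":           ["Eurovision winner announced in Stockholm"],
        "The Guardian":       ["Eurovision 2016: the night reviewed"],
        "The New York Times": ["Senate debates budget bill"],
    }

    def outlets_covering(topic_keyword, headlines_by_outlet):
        """Return the outlets whose recent headlines mention the candidate topic."""
        keyword = topic_keyword.lower()
        return [outlet for outlet, headlines in headlines_by_outlet.items()
                if any(keyword in headline.lower() for headline in headlines)]

    # Editor-facing check: how many of the major outlets are covering this topic?
    covering = outlets_covering("eurovision", MAJOR_OUTLET_HEADLINES)
    print(f"{len(covering)} of {len(MAJOR_OUTLET_HEADLINES)} outlets covering: {covering}")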

Editors are also entrusted to spot potentially large news stories bubbling up outside Facebook by using an algorithm that trawls more than a thousand automated feeds, including those of competitors like YouTube and Reddit, along with traditional news sites.

These editors can then introduce those trends into the Topics box, in order to “connect people to conversations on Facebook about newsworthy events as quickly as possible,” according to Facebook.
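
A simplified stand-in for that feed-trawling step is sketched below; a real system would fetch RSS/Atom feeds over the network and use far richer topic extraction, so the feed names, items and the "seen in at least two feeds" rule here are invented:

    from collections import Counter

    # Hypothetical items pulled from feeds outside Facebook (illustrative only).
    external_feeds = {
        "reddit/r/news": ["Volcano erupts in Chile", "Volcano ash disrupts flights"],
        "youtube/news":  ["Footage of Chile volcano eruption"],
        "wire_service":  ["Chile volcano forces evacuations", "Election polls tighten"],
    }

    def external_candidates(feeds, min_feeds=2):
        """Find keywords that surface in several independent external feeds."""
        feeds_per_keyword = Counter()
        for feed_name, titles in feeds.items():
            words = {w.lower().strip(".,") for title in titles for w in title.split()}
            for word in words:
                feeds_per_keyword[word] += 1  # counted once per feed, not per mention
        # Keywords seen in several feeds are candidates an editor could review
        # and then "inject" into the Trending section.
        return [w for w, n in feeds_per_keyword.items() if n >= min_feeds and len(w) > 4]

    print(external_candidates(external_feeds))  # e.g. ['volcano', 'chile']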

One former Facebook Trending Topics editor, who spoke on the condition of anonymity because this person had signed a nondisclosure agreement with the company, said it was up to the editors’ discretion to promote newsy topics that were not quite percolating on Facebook.

The guidelines were first created in 2014, according to a Facebook spokeswoman, and have continuously been updated over the last year and a half.

On Tuesday, Senator John Thune, Republican of South Dakota, sent a letter of inquiry to Mark Zuckerberg, chief executive of Facebook, asking the company to further explain its editorial guidelines and to disclose whether there was “any level of subjectivity associated with” the Trending Topics section.

Facebook said it planned to address Senator Thune’s questions, and that it was “continuing to investigate whether any violations took place.”

However, experts warn that fearing bias in human editors and trusting the neutrality of algorithms is a faulty premise. Algorithms are, after all, created by humans and therefore susceptible to the same unconscious biases.

“Imagine going back in time to the 1950s and building a machine-learning algorithm, based on historical data at the time, to decide who would be ‘successful’ in their jobs,” said Cathy O’Neil, a data scientist and author of the forthcoming book “Weapons of Math Destruction,” a study of how algorithms exacerbate inequality. “It would be only white men, because the data it had was picking up the sexism and racism of the time, and the data was informing the definition of success.”
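
Her point can be illustrated with a deliberately crude toy example; the records and the "model" below are invented solely to show how bias in historical training labels flows straight into predictions:

    from collections import defaultdict

    # Invented 1950s-style records: the label reflects who was hired and promoted
    # at the time, not actual ability -- the bias lives in the labels themselves.
    training_data = [
        {"group": "white_male", "successful": True},
        {"group": "white_male", "successful": True},
        {"group": "white_male", "successful": False},
        {"group": "woman",      "successful": False},
        {"group": "black_male", "successful": False},
    ]

    def train_success_rate_model(records):
        """'Learn' the observed success rate per group from historical labels."""
        totals, successes = defaultdict(int), defaultdict(int)
        for record in records:
            totals[record["group"]] += 1
            successes[record["group"]] += record["successful"]
        return {group: successes[group] / totals[group] for group in totals}

    model = train_success_rate_model(training_data)

    # The "prediction" simply replays the past: only the historically favored
    # group clears the bar, which is the failure mode O'Neil describes.
    for group, rate in model.items():
        verdict = "predicted successful" if rate > 0.5 else "predicted unsuccessful"
        print(group, verdict)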

Facebook’s stance, as it made clear on Thursday, is that the best way to handle these issues is with a mix of both human and machine input.

“Every tool we build is designed to give more people a voice and bring our global community together,” Mark Zuckerberg, chief executive of Facebook, said in a post to his Facebook page on Thursday evening. “For as long as I’m leading this company, this will always be our mission.”

Conservatives Furious, Blast Facebook for Manipulating "Trending Topics"

Whether Facebook's "Trending Topics" section shows political bias in the stories it selects has itself become a hot topic on social media. In response, Facebook published its internal editorial guidelines for selecting topics on May 12, stressing that there is no bias.

With 1.6 billion users, Facebook is arguably the most influential media platform, and in a U.S. presidential election year, whether its topic selection conceals political preferences has drawn close scrutiny.

The tech news site Gizmodo reported on May 9 that, according to a former Facebook employee, the company's internal staff "routinely suppressed news of interest to conservative readers" and "artificially" injected other stories into the Trending Topics list.

The report set off heated discussion on social media and even drew the attention of Republican Senator John Thune, who wrote to Facebook chief executive Mark Zuckerberg asking the company to explain whether there is any subjectivity in the "Trending Topics" section.

Thune also issued a statement: "Any attempt by a neutral and inclusive social media platform to censor or manipulate political discussion is an abuse of the public's trust and runs counter to the open values of the Internet."

Zuckerberg said he planned to speak directly with conservative leaders in the coming weeks to explain Facebook's position and discuss how to keep the platform as open as possible.

Facebook's 28-page editorial guidelines detail how "Trending Topics" are selected through a combination of computer algorithms and editorial review. Editors first use software to detect stories being widely shared on Facebook, then check whether these potential trending topics have been covered by ten major media outlets, including The New York Times, CNN, The Guardian, Fox News and BuzzFeed; coverage by these outlets signals that a topic is genuinely significant.

Editors can also search sites outside Facebook for major news, such as YouTube, Reddit or traditional news sites.

One former editor told The New York Times that for news topics not yet circulating widely on Facebook, it was entirely up to the editors' personal discretion whether to list them as "Trending Topics."

The Times noted that, judging from the editorial process Facebook disclosed, editors have room to exert influence at every step.

Facebook has not disclosed how many people serve on the "Trending Topics" editorial team; The Guardian reported that it has only twelve members.

Erick Erickson, former editor in chief of the conservative website RedState, said: "As long as Facebook is hiring editors who lean left politically, left-leaning stories are going to get preferential treatment."

Facebook has repeatedly denied any political bias and stressed that its editorial guidelines discourage editors from pushing their own viewpoints. Justin Osofsky, Facebook's vice president for global operations, wrote in a Facebook post: "Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period."

Original article:
http://www.nytimes.com/2016/05/13/technology/facebook-guidelines-trending-topics.html

2016-05-14, 聯合報 (United Daily News), A13, International; compiled and translated by 田思怡

