Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands. That is the central finding of a Wall Street Journal series, based on a review of internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management.
Time and again, the documents show, Facebook’s researchers have identified the platform’s ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them. The documents offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the chief executive himself.
Mark Zuckerberg has said Facebook allows its users to speak on equal footing with the elites of politics, culture and journalism, and that its standards apply to everyone. In private, the company has built a system that has exempted high-profile users from some or all of its rules. The program, known as “cross check” or “XCheck,” was intended as a quality-control measure for high-profile accounts. Today, it shields millions of VIPs from the company’s normal enforcement, the documents show. Many abuse the privilege, posting material, including harassment and incitement to violence, that would typically lead to sanctions. Facebook says criticism of the program is fair, that it was designed for a good purpose and that the company is working to fix it.
Researchers inside Instagram, which is owned by Facebook, have been studying for years how its photo-sharing app affects millions of young users. Repeatedly, the company found that Instagram is harmful for a sizable percentage of them, most notably teenage girls, more so than other social-media platforms. In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address.
Facebook made a heralded change to its algorithm in 2018 designed to improve its platform—and arrest signs of declining user engagement. Mr. Zuckerberg declared his aim was to strengthen bonds between users and improve their well-being by fostering interactions between friends and family. Within the company, the documents show, staffers warned the change was having the opposite effect. It was making Facebook, and those who used it, angrier. Mr. Zuckerberg resisted some fixes proposed by his team, the documents show, because he worried they would lead people to interact with Facebook less. Facebook, in response, says any algorithm can promote objectionable or harmful content and that the company is doing its best to mitigate the problem.
Scores of Facebook documents reviewed by The Wall Street Journal show employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding. Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses about organ selling, pornography and government action against political dissent, according to the documents. The documents also show the company’s response, which in many instances is inadequate or nothing at all. A Facebook spokesman said the company has deployed global teams, local partnerships and third-party fact checkers to keep users safe.
Facebook threw its weight behind promoting Covid-19 vaccines—“a top company priority,” one memo said—in a demonstration of Mr. Zuckerberg’s faith that his creation is a force for social good in the world. It ended up demonstrating the gulf between his aspirations and the reality of the world’s largest social platform. Activists flooded the network with what Facebook calls “barrier to vaccination” content, the internal memos show. They used Facebook’s own tools to sow doubt about the severity of the pandemic’s threat and the safety of authorities’ main weapon to combat it. The Covid-19 problems make it uncomfortably clear: Even when he set a goal, the chief executive couldn’t steer the platform as he wanted. A Facebook spokesman said in a statement that the data shows vaccine hesitancy for people in the U.S. on Facebook has declined by about 50% since January, and that the documents show the company’s “routine process for dealing with difficult challenges.”