Can You Say That Again? Lost Profits? Am I Saying That Right?
Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month, when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook's own research shows that it amplifies hate, misinformation and political unrest, but the company hides what it knows. One complaint alleges that Facebook's Instagram harms teenage girls. What makes Haugen's complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.
- Facebook's response to 60 Minutes' report, "The Facebook Whistleblower"
- Facebook whistleblower says company incentivizes "angry, polarizing, divisive content"
- Watch Live: Facebook whistleblower Frances Haugen testifies before Senate committee
Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.
Frances Haugen is 37, a data scientist from Iowa with a degree in computer engineering and a Harvard master's degree in business. For 15 years she's worked for companies including Google and Pinterest.
Frances Haugen: I've seen a bunch of social networks and it was substantially worse at Facebook than anything I'd seen before.
Scott Pelley: You know, someone else might have just quit and moved on. And I wonder why you take this stand.
Frances Haugen: Imagine you know what's going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.
Scott Pelley: When and how did it occur to you to take all of these documents out of the company?
Frances Haugen: At some point in 2021, I realized, "Okay, I'm gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real."
She secretly copied tens of thousands of pages of Facebook internal research. She says evidence shows that the company is lying to the public about making significant progress against hate, violence and misinformation. One study she found, from this year, says, "we estimate that we may action as little as 3-5% of hate and about 6-tenths of 1% of V & I [violence and incitement] on Facebook despite being the best in the world at it."
Scott Pelley: To quote from another one of the documents you brought out, "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world."
Frances Haugen: When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.
'Ethnic violence' including Myanmar in 2018 when the military used Facebook to launch a genocide.
Frances Haugen told us she was recruited by Facebook in 2019. She says she agreed to take the job only if she could work against misinformation, because she had lost a friend to online conspiracy theories.
Frances Haugen: I never wanted anyone to feel the pain that I had felt. And I had seen how high the stakes were in terms of making sure there was high quality information on Facebook.
At headquarters, she was assigned to Civic Integrity, which worked on risks to elections including misinformation. But after this past election, there was a turning point.
Frances Haugen: They told us, "We're dissolving Civic Integrity." Like, they basically said, "Oh good, we made it through the election. There wasn't riots. We can get rid of Civic Integrity now." Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, "I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous."
Facebook says the work of Civic Integrity was distributed to other units. Haugen told us the root of Facebook's problem is in a change that it made in 2018 to its algorithms, the programming that decides what you see on your Facebook news feed.
Frances Haugen: So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you.
The algorithm picks from those options based on the kind of content you've engaged with the most in the past.
Frances Haugen: And one of the consequences of how Facebook is picking out that content today is it is -- optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions.
Scott Pelley: Misinformation, angry content-- is enticing to people and keep--
Frances Haugen: Very enticing.
Scott Pelley:--keeps them on the platform.
Frances Haugen: Yeah. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money.
Haugen says Facebook understood the danger to the 2020 election. So, it turned on safety systems to reduce misinformation, but many of those changes, she says, were temporary.
Frances Haugen: And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety.
And that really feels like a betrayal of democracy to me.
Facebook says some of the safety systems remained. But, after the election, Facebook was used by some to organize the January 6th insurrection. Prosecutors cite Facebook posts as evidence--photos of armed insurrectionists and text including, "by bullet or ballot restoration of the republic is coming!" Extremists used many platforms, but Facebook is a recurring theme.
After the attack, Facebook employees raged on an internal message board copied by Haugen. "…Haven't we had enough time to figure out how to manage discourse without enabling violence?" We looked for positive comments and found this, "I don't think our leadership team ignores data, ignores dissent, ignores truth…" but that drew this reply, "welcome to Facebook! I see you just joined in November 2020… we have been watching… wishy-washy actions of company leadership for years now." "…Colleagues… cannot conscience working for a company that does not do more to mitigate the negative effects of its platform."
Scott Pelley: Facebook essentially amplifies the worst of human nature.
Frances Haugen: It's one of these unfortunate consequences, right? No one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.
That dynamic led to a complaint to Facebook by major political parties across Europe. This 2019 internal report obtained by Haugen says that the parties, "…feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook… leading them into more extreme policy positions."
Scott Pelley: The European political parties were essentially saying to Facebook the way you've written your algorithm is changing the way we lead our countries.
Frances Haugen: Yes. You are forcing us to take positions that we don't like, that we know are bad for society. We know if we don't take those positions, we won't win in the marketplace of social media.
Evidence of harm, she says, extends to Facebook's Instagram app.
Scott Pelley: One of the Facebook internal studies that you found talks about how Instagram harms teenage girls. One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse; 17% of teen girls say Instagram makes eating disorders worse.
Frances Haugen: And what's super tragic is Facebook's own research says, as these young women begin to consume this-- this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook's own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it's that it is distinctly worse than other forms of social media.
Facebook said, just last week, it would postpone plans to create an Instagram for younger children.
Last month, Haugen's lawyers filed at least 8 complaints with the Securities and Exchange Commission, which enforces the law in financial markets. The complaints compare the internal research with the company's public face, often that of CEO Mark Zuckerberg, who testified remotely to Congress last March.
Mark Zuckerberg testimony on March 25:
We have removed content that could lead to imminent real-world harm. We have built an unprecedented third-party fact checking program. The system isn't perfect. But it is the best approach that we have found to address misinformation in line with our country's values.
One of Frances Haugen's lawyers is John Tye. He's the founder of a Washington legal group called "Whistleblower Aid."
Scott Pelley: What is the legal theory behind going to the SEC? What laws are you alleging have been broken?
John Tye: As a publicly-traded company, Facebook is required to not lie to its investors or even withhold material information. So, the SEC regularly brings enforcement actions, alleging that companies like Facebook and others are making material misstatements and omissions that affect investors adversely.
Scott Pelley: One of the things that Facebook might allege is that she stole company documents.
John Tye: The Dodd-Frank Act, passed over ten years ago at this point, created an Office of the Whistleblower inside the SEC. And one of the provisions of that law says that no company can prohibit its employees from communicating with the SEC and sharing internal corporate documents with the SEC.
Frances Haugen: I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.
Facebook declined an interview. But in a written statement to 60 Minutes it said, "every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."
"If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago."
Facebook is a $1 trillion company. Just 17 years old, it has 2.8 billion users, which is 60% of all internet-connected people on Earth. Frances Haugen plans to testify before Congress this week. She believes the federal government should impose regulations.
Frances Haugen: Facebook has demonstrated they cannot act independently. Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety. I'm hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That's my hope.
Produced by Maria Gavrilovic and Alex Ortiz. Broadcast associate, Michelle Karim. Edited by Michael Mongulla.
Source: https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03/