Facebook is changing its name to “Meta.” The company’s annual Connect conference took place Thursday, shifting attention—at least momentarily—from a crisis spawned when a whistleblower released tens of thousands of damaging internal documents. Founder Mark Zuckerberg announced many new product ideas coming in the next few years or decades, but did not address the fallout from the revelations.
In a nearly 90-minute video presentation, Zuckerberg said he wants to move Facebook beyond being just a social media site. He announced the company is building a “metaverse” that includes futuristic products like virtual reality headsets and augmented reality glasses. Zuckerberg called the metaverse an “embodied internet” that puts users “in” the experience, rather than just looking at it. The Meta team acknowledged that many of the products shown in the presentation will not come to fruition for years.
“Our hope is that within the next decade, the metaverse will reach a billion people, host hundreds of billions of dollars of digital commerce, and support jobs for millions of creators and developers,” Zuckerberg said.
Scott Galloway, professor of marketing at NYU Stern and author of The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google, told The Dispatch the rebrand may actually be good for business if accompanied by structural change, but ultimately the goal of the announcement is clear: “Ninety percent of people will recognize this for what it is and that is a cheap attempt to distance itself from the past and their behavior.”
Even with the name change and new products, big questions remain: Can a rebrand make consumers forget the many problems that have roiled the company over the last six weeks? And will it change anything about the way the company does business?
Since mid-September, the Wall Street Journal has published a series of stories based on a trove of internal Facebook documents revealing that dissatisfied employees have grown increasingly outspoken over the company’s handling of misinformation, its effects on the mental health of teenagers, and accusations that it has contributed to human trafficking on a global scale.
(Disclosure: The Dispatch is a third-party partner in Facebook’s fact-checking operation.)
Troubling evidence in the documents shows the company is aware of how much damage its platforms are doing to the public, and sometimes does not do enough to prevent it.
Frances Haugen, a computer engineer and former Facebook employee turned whistleblower, collected and released tens of thousands of documents over the course of a sophisticated campaign to keep pressure on Zuckerberg and the company.
Haugen told The Dispatch via email that the system Facebook uses to filter content is simply not working. She said Facebook catches only 3 to 5 percent of hate speech, less than 0.8 percent of content that incites violence, and 8 percent of graphic violence.
“The problem with Facebook is not ‘bad’ ideas or ‘bad’ people,” Haugen said. “It is a system that gives the most reach to extreme, polarizing, and divisive content.”
Haugen and a team of lawyers filed official complaints with the Securities and Exchange Commission against the social media giant. Her team alleged that what the company and its CEO, Mark Zuckerberg, were telling the public did not match their own internal research on key issues like mental health in teens and misinformation on the platform.
She also brought the trove of documents to Jeff Horwitz and the Wall Street Journal. Haugen told Ben Smith at the New York Times that she “auditioned Jeff for a while” and chose him because he was less “sensationalistic than other choices I could have made.”
Horwitz and others at the Wall Street Journal wrote a series of articles and produced a podcast series called “The Facebook Files.” Haugen appeared on 60 Minutes to discuss her findings. And, on October 7, Smith reported that a consulting firm founded by a former Obama administration official working with Haugen approached several other news outlets with a coordinated plan to release the documents to more than just the Wall Street Journal. Articles have since appeared in the Associated Press, The Atlantic, NBC News, and others.
Galloway told The Dispatch that the way Haugen and her team have gone about distributing the information she got from inside Facebook has made all the difference: “They have ‘big teched’ Big Tech. … They’ve been very disciplined, very coordinated, very well resourced. Basically, she’s done to them what Big Tech has been doing to all of us for 10 years.”
The documents paint a portrait of a staff frustrated by perceived harm to consumers and what they see as an insufficient response from Zuckerberg and other corporate executives. Employees were especially agitated after the riots on January 6. Venting on an internal message board, many employees said the riot was a result of Facebook’s inaction.
“All due respect, but haven’t we had enough time to figure out how to manage discourse without enabling violence?” one staffer wrote. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”
At least one attempt by the company to deal with the challenge of misinformation launched in 2018, but it quickly backfired. Facebook changed its algorithm to try to foster more “meaningful social interactions,” or MSIs, as the company calls them. The goal was to have friends and family interact with one another.
The Wall Street Journal reported that company data scientists looked into claims that divisive content was going viral and serving to encourage media companies to create more of the same. Their findings? “Our approach has had unhealthy side effects on important slices of public content, such as politics and news.” The researchers added, “Misinformation, toxicity, and violent content are inordinately prevalent among reshares.”
Facebook even implemented changes to limit the spread of toxic misinformation in the spring of 2020 for content that fell into certain categories: civic and health. But when Facebook’s integrity team presented ways to implement the same changes across the platform, Zuckerberg said the plan would limit overall engagement.
Facebook’s role in the January 6 storming of the Capitol—militia groups and other participants used Facebook to coordinate plans and recruit people to D.C.—is just one example of problems cited by employees in the internal documents.
Documents also show the company is aware of human trafficking and drug cartel activity across the globe on its platforms, and employees claim the response from Facebook has not been strong enough. Some of the more egregious examples of illegal activity being conducted on Facebook include a Mexican drug cartel using the platform to hire hitmen, human traffickers in the Middle East luring women into sex slavery, and groups in Ethiopia and Myanmar inciting ethnic violence.
Haugen—and other employees at Facebook, according to the documents—said one reason why Facebook can’t stop these crimes from happening is language. Facebook claims its systems support several different languages, but Haugen says that’s not true and that the gap has serious repercussions: “Facebook has been actively misleading people on what its safety systems do — they claim to support 50 languages around the world, but most of those languages are missing the vast majority of safety/integrity systems found in English.”
The result is an unfiltered Facebook, which leads to radicalization, particularly in places where many different languages and dialects are used.
“Facebook is not delivering a product where the label matches what is in the can, and the consequences are ethnic violence in the global South,” Haugen said.
Facebook did not respond to a request for an interview from The Dispatch, and Haugen said no one from the company has reached out to her at any point since she revealed herself to be the whistleblower.
Sen. Richard Blumenthal of Connecticut, who chairs the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security, told The Dispatch it’s time to hold Facebook accountable: “There is ample credible evidence to start an SEC investigation. Potentially other federal agencies, like the FTC, should look into misrepresentation and deception by Facebook. I’m not reaching any conclusions at this point, but there is ample credible evidence for investigation.”
Galloway told The Dispatch that while he doubts this is the end of Facebook, it may send a warning shot to other Big Tech companies: “I do think that Facebook is going to have to structurally change. I think you are going to see a perp walk. … You’re going to see a chill sent down the spines of Big Tech executives when there’s a perp walk. None of this stops unless there’s a perp walk.”
Haugen stressed to The Dispatch that she will continue trying to make Facebook safer by talking to anyone who will listen about how to improve the platform. She told The Dispatch, “This is a marathon, not a sprint.”