November 8, 2021 – A slew of documents leaked from inside Facebook shows that the social media giant’s internal research uncovered a host of problems on the platform related to public health and other issues, yet the company did virtually nothing about them.
The files were leaked by a whistleblower, former Facebook employee Frances Haugen, who shared tens of thousands of documents with the Securities and Exchange Commission, Congress, and a consortium of news organizations. Since then, she has testified before the Senate Commerce Subcommittee on Consumer Protection and before European lawmakers.
Amplification of ‘Anti-Vaxxers’ and other misinformation
President Joe Biden caused a stir in July when he said that thanks to rampant misinformation about the COVID-19 vaccine, social media platforms like Facebook were “killing people.” “I mean really, look, the only pandemic we have is among the unvaccinated,” he said. “And they’re killing people.”
While he later walked back the statement, the leaked documents suggest he was not necessarily wrong.
According to the papers, in March, at a time when the White House was preparing a $1.5 billion campaign against vaccine misinformation, some Facebook employees thought they had found a way to counter those lies on the platform while prioritizing legitimate sources like the World Health Organization.
“Given these results, I guess we hope to release it as soon as possible,” wrote one employee.
But Facebook ignored some of the suggestions, and executives were slow to implement others. Another proposal, aimed at reducing anti-vaccine comments, was also ignored.
“Why wouldn’t you delete the comments? Because engagement is the only thing that matters,” Imran Ahmed, executive director of the Center for Countering Digital Hate, an internet watchdog group, told The Associated Press. “It drives attention, and attention equals eyeballs, and eyeballs equal advertising revenue.”
Facebook’s algorithms, which determine the content you see on your feed, also help spread misinformation.
“It’s not like the anti-vax contingent was created by Facebook,” says Dean Schillinger, MD, director of the Health Communications Research Program at the University of California-San Francisco. But the algorithm said, “‘OK, let’s find certain people with certain political beliefs and link them to anti-vaccine content,’” amplifying the misinformation. “That is certainly something new.”
If that wasn’t enough, it appears Facebook may have misled Congress about the company’s understanding of how COVID misinformation spread on the platform. In July, two top House Democrats wrote to Facebook CEO Mark Zuckerberg, requesting details on how many users had seen COVID misinformation and how much money the company made from those posts.
“At this time, we have nothing to share in response to the questions you have raised, other than what Mark has said publicly,” the company said in response.
But the leaked papers show that at the time, Facebook researchers had conducted multiple studies on COVID misinformation and produced large internal reports. Employees were able to estimate the number of views a widely shared piece of misinformation had racked up. Yet the company did not acknowledge this to Congress.
Keeping this knowledge a secret was a great missed opportunity to ensure science-backed information reached the general public, says Sherry Pagoto, PhD, director of the UConn Center for mHealth and Social Media.
“We know how misinformation spreads, so how can we think more about spreading good information?” she says. “They have all kinds of data on the characteristics of the messages that go far. How can we use what they know in the field of health communication to make a plan?”
In an emailed statement, a Meta spokesperson (amid the uproar, Facebook announced a new corporate name) said: “There is no silver bullet for fighting misinformation, which is why we take a comprehensive approach, including removing more than 20 million pieces of content that break our COVID misinformation policies, permanently banning thousands of repeat offenders from our services, connecting more than 2 billion people with reliable information about COVID-19 and vaccines, and partnering with independent fact-checkers.”
Ignoring the effect of Instagram on the mental health of vulnerable teens
Fighting misinformation is not the only way Facebook and its subsidiaries could have acted to protect public health. The company was also aware of its negative impact on the mental health of young people, but publicly denied it.
Instagram, which is owned by Facebook, is extremely popular with teenage girls. But the photo-sharing app repeatedly exposes them to images of idealized bodies and faces, which can lead to negative self-comparisons and pressure to look perfect.
Pro-eating disorder content is also widely available on the platform. For years, social science and mental health researchers have been analyzing the effect of social media on mental health, especially in adolescents. Studies have found links between Instagram use and depression, anxiety, low self-esteem, and eating disorders.
The Facebook documents revealed what Instagram researchers called a “deep dive into adolescent mental health.” And there were serious problems: Internal research showed that the platform made body image problems worse for 1 in 3 teens, and 14% of teens said Instagram made them feel worse about themselves. The data linked the use of the app with anxiety and depression. And among teens who reported suicidal thoughts, 6% of American users and 13% of British users linked that impulse directly to Instagram.
Jean Twenge, PhD, author of iGen: Why Today’s Superconnected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy, and Completely Unprepared for Adulthood, has been studying the effects of social media on young people for almost a decade.
“I was not surprised that Facebook found that social media could have important links to depression and self-harm. Academic research has shown this for years,” she says. “I was surprised at how deeply their research probed exactly the mindset of teenage girls using Instagram. Their research really built on what we already knew.”
As with Facebook’s findings on misinformation, the company publicly downplayed the negative effects of Instagram, including in comments to Congress, and did little to fine-tune teen users’ experience on the app.
“I think given what they knew about Instagram and mental health, it would certainly have been the right thing to do to make changes to the platform,” says Twenge.
In the emailed statement, the Meta spokesperson said: “Our research does not conclude that Instagram is inherently bad for teens. While some teens told us that Instagram made them feel worse when they struggled with issues like loneliness, anxiety, and sadness, more teens told us that Instagram made them feel better when they experienced these same issues.”
Responsibility for the public good?
While Facebook users may be surprised to learn how the company tends to put profits over the health of its customers, those who study public health are not.
“This is not a problem unique to social media platforms,” says Schillinger.
“Corporate entities frequently pursue policies that induce the public to participate in activities, to buy or consume products, to adopt behaviors that are not healthy for themselves or for others or for the planet. … Do you think Facebook is acting any differently from any other company in that space?”
That’s where the potential for regulation comes in, says Haugen, the whistleblower. She has called for it, as have many lawmakers in the wake of her revelations.
“Large organizations that have influence and access to many people should be held accountable for the well-being of that population, just as a matter of principle,” says sociologist Damon Centola, PhD, author of Change: How to Make Great Things Happen.
He compares the explosion of social media to the history of television, which has been regulated in numerous ways for decades.
“I think that provides us with a parallel to social media and the ability of the medium to influence the population,” he says. “It seems to me that organizations cannot get away with saying that they will not consider the public welfare.”
The so-called Facebook Papers are all the more damning, some experts say, because of the company’s defense: that its internal research was intended only for product development and therefore proves nothing.
That defense ignores the many peer-reviewed articles, published in respected journals, that reinforce the findings of the company’s internal research. Taken together, the two bodies of research leave little room for doubt that something needs to change.
“Think of it as environmental pollution,” says Centola. “Companies can know that they are polluting, but they can also say that it didn’t really matter, that it didn’t cause any harm. But then you get the documentation that says no, it has huge effects. That’s when it really matters.”
Social media as a force for good
But there is a potential upside to the Facebook Papers, experts say: The company clearly knows a great deal about how to spread messages effectively. With enough pressure, Facebook and other social media platforms could now begin to use those insights in a positive direction.
“Facebook should develop robust collaborations with trusted entities to develop content that is both true and promotes public health, while being engaging and algorithm-friendly,” says Schillinger. “If we can use the platform and the reach and the [artificial intelligence] Facebook has to deliver content that promotes health, the sky is the limit.”
And efforts like that may be on the horizon.
“We are focused on developing new features to help people struggling with negative social comparison or negative body image,” the Meta spokesperson wrote in the email. “We are also continuing to look for opportunities to work with more partners to publish independent studies in this area, and we are working on how we can allow outside researchers to have more access to our data in a way that respects individual privacy.”
That is not to say Facebook will voluntarily put public health ahead of the company’s need to make money without regulations forcing it to do so.
“I think Facebook is interested in improving its platform for users. But its first interest will always be to have as many users as possible spending as much time on the platform as possible,” says Twenge. “Those two desires are often at cross-purposes.”