Facebook, show us the mess

A pile of internal communications has given us a rare, unvarnished look into Facebook’s self-examinations and deliberations over how people are influenced by the company’s product designs and decisions.

Shira Ovide, The New York Times
Published: 28 Oct 2021, 04:54 AM
Updated: 28 Oct 2021, 06:12 AM

Perhaps the public and Facebook would benefit if these glimpses were not so rare. Facebook and other internet powers could help us understand the world by showing us a little more of the messy reality of running virtual hangouts for billions of humans.

Something that has pleasantly surprised me from the reporting on the documents collected by Frances Haugen, the former Facebook product manager, is how much thought and care Facebook employees seemed to have devoted to assessing the company’s apps and the ways they shape what people do and how communities and societies behave. Facebook, show us this side of yourself.

Casey Newton, a technology writer, made this case last month: “What if Facebook routinely published its findings and allowed its data to be audited? What if the company made it dramatically easier for qualified researchers to study the platform independently?”

And what if other companies in technology did the same?

Imagine if Facebook had explained out loud the ways that it wrestled with restricting posts with false information about fraud after the 2020 US presidential election and whether that risked silencing legitimate political discussions.

What if Facebook had shared with the public its private assessments of the ways that features to easily share lots of posts amplified hateful or bullying posts?

Imagine if Facebook employees involved in major product design changes could — like the US Supreme Court justices — write dissenting opinions explaining their disagreements to the public.

I know that some or all of that sounds like a fantasy. Organisations have legitimate reasons to keep secrets, including to protect their employees and customers.

But Facebook is not an ordinary organisation. It is among a tiny number of corporations whose products help shape how humans behave and what we believe.

Learning more about what Facebook knows about the world would help improve our understanding of one another and of Facebook. It would give outsiders an opportunity to validate, challenge and add to Facebook’s self-assessments. And it might make the company a little more trustworthy and understood.

Facebook has said that it believes the reporting about its internal communications has lacked nuance and context. Its reaction has included clamping down on internal deliberations to minimise leaks. And in my conversations with people in technology this week, there is a fear that Facebook, YouTube, Twitter and others will respond to weeks of tough reporting on Facebook by probing less into the effects of their products or keeping what they learn under lock and key.

But another path is to open up and reveal far more. That would not be entirely out of character for Facebook.

In 2015, the company publicly released and discussed research by its data scientists that found that the social network did not worsen the problem of “filter bubbles,” in which people see only information that confirms their beliefs. In 2018, Mark Zuckerberg published a lengthy post detailing the company’s examination of how people on Facebook responded to material that was salacious or offensive. The same year, Facebook disclosed an ambitious plan to share huge amounts of posts and other user data with outside researchers to study harmful information.

These efforts were far from perfect. Notably, the independent research consortium was dogged by botched data and disputes over preserving people’s privacy. But the efforts show that Facebook at times has wanted to be more open.

Nathaniel Persily, a Stanford Law School professor who was previously co-chair of the research consortium, recently drafted text for legislation that could grant independent researchers access to information about internet companies’ inner workings.

He told me that he thought of the research consortium as “road kill on the highway to something glorious”: a future of both voluntary and compelled transparency from large internet companies. He praised Twitter, which last week released an analysis of the ways its computer systems in some cases amplified views on the political right more than those on the left.

Twitter’s research was incomplete. The company said it did not know why some messages circulated more than others. But Twitter was honest about what it knew and what it did not, and it gave the public and researchers opportunities for further investigation. It showed us the mess.

© 2021 The New York Times Company