This week, revelations from a trove of internal Facebook documents came to light, thanks to Frances Haugen, a former employee of the social media giant. The documents reveal that the organization, as The Washington Post summarized, “privately and meticulously tracked real-world harms exacerbated by its platforms, ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content.”

Chris Martin is content marketing editor at Moody Publishers. He studies internet culture and the effects of social media on broader society for fun. In February he is publishing a book with B&H Publishing called Terms of Service, in the same vein as his newsletter.

Martin joined global media manager Morgan Lee and executive editor Ted Olsen to discuss what these documents reveal, what this means for all of us whether we’re on Facebook or not, and whether there’s a “Christian” way to react to this news.

What is Quick to Listen? Read more.

Rate Quick to Listen on Apple Podcasts

Follow the podcast on Twitter

Follow this week's hosts on Twitter: Morgan Lee and Ted Olsen

Follow our guest on Twitter: Chris Martin

Subscribe to his Substack: Terms of Service

Music by Sweeps

Quick to Listen is produced by Morgan Lee and Matt Linder

The transcript is edited by Faith Ndlovu

Highlights from Quick to Listen: Episode #286

Tell us a little bit about Frances Haugen and some of the key revelations that she is putting out there.

Chris Martin: The best way to summarize it is she's a former Facebook product manager. She's a veteran of Silicon Valley; she was at Pinterest and Google, I believe, before she was even at Facebook. She left Facebook in May of 2021 and took thousands, if not tens of thousands, of documents with her. Facebook has alleged that she shouldn't have done this, that she's in the wrong for doing so.

I don't know what the rules are on things like that. If these were publicly posted things within company communications, I'm not sure they have any case against her. But she leaked thousands of documents detailing the inner workings of Facebook, giving us, the broader public and Facebook users, the ability to see how this place works, because it's always been so shrouded in mystery. So many Silicon Valley companies feel that way because they're protecting so much intellectual property. This is one of the best pictures we've ever had of the inside of Facebook, because obviously they do a really good job painting the picture they want people to see of what it's like inside Facebook.

Everyone thinks that with these Silicon Valley companies like Facebook or Google, these folks just live at work. They've got beanbags and free food and all kinds of amenities that encourage that; they can bring their dogs, and so on. But this is one of the rawest looks we've gotten, through emails and posts by the people who actually work there.

A common Facebook ritual is the badge post, which is when people post on Facebook's internal company platform, called Workplace. When they're leaving, they'll post something like a picture of their badge and a little story, often heartfelt or honest, about what their time at Facebook was like and how much they enjoyed working with the team, or whatever.

Having not seen the Facebook Papers she's released, because they're not publicly available, my understanding is that she shared a lot of those badge posts from people who were brutally honest about how they were stonewalled when raising their concerns, things like that.

The biggest reason Frances Haugen went before a congressional subcommittee a couple of weeks ago, and the reason she was before a committee of Parliament this week, is that she claims Facebook has potentially violated securities law by not informing investors of issues within the company. This is a good thing: no random former Facebook employee can get the audience of a congressional committee just because they think the company is bad, but she has an actual claim that she believes they violated securities law. Whether or not that's true, I'm not a lawyer, I don't know, but what's come of it is much broader and more detailed than that.

When you were reading through the news reports this week, what was the overall gist that you were taking away from that?

Chris Martin: There were 50 articles posted on Monday alone, because there's this consortium of 17 different news outlets whose journalists were working together, holding all these articles until Monday, which is also when Facebook's earnings call was. They released them all at one time, and I've read 7 or 8 of the 50.

I do plan to read all 50 if I can, but what I've gathered is confirmation of everything I've ever thought. That's not really what I was looking for, but because Facebook is so shrouded in secrecy, you get these feelings.

When I was at Lifeway, I led social media strategy, so I oversaw our 270 social media accounts. I was in Facebook Business Manager seven hours a day sometimes, so I'm intimately aware of how Facebook works from a corporate user's perspective.

When you use Facebook to the degree that I have, you get these feelings about what Facebook values, how to get stuff in front of people, and how the algorithm works, because there's a point system for every piece of content that gets posted. I always had the feeling that likes might be worth one point, stronger reactions like the angry face or the heart might be worth five points, comments fifty points, and shares a hundred points, because that's how engagement works. So I would try to play that game and figure out how to get content in front of people, but Facebook never really says how that works. They never pull the curtain back on the Wizard of Oz booth and show you how things work.

But here in these documents, we find out that they do have a point system assigned to engagement: the more passionate and invested the engagement, the stronger the emotion, the more points the content receives, and the more likely it is to appear at the top of people's feeds.
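
To make that mechanic concrete, here is a minimal sketch of how a weighted engagement score like the one Martin describes could work. The weights and names below are illustrative assumptions drawn from his recollection, not Facebook's actual values or code:

```python
# Hypothetical engagement weights, per Martin's recollection; Facebook's
# real values are not public, so treat these numbers as illustrative.
ENGAGEMENT_WEIGHTS = {
    "like": 1,
    "reaction": 5,   # angry face, heart, and other emoji reactions
    "comment": 50,
    "share": 100,
}

def rank_score(engagement_counts: dict) -> int:
    """Weighted sum of engagement; higher scores surface higher in the feed."""
    return sum(
        ENGAGEMENT_WEIGHTS.get(kind, 0) * count
        for kind, count in engagement_counts.items()
    )

# A post drawing strong reactions, comments, and shares easily outranks
# one that merely collects likes.
calm_post = {"like": 500, "comment": 10, "share": 5}
angry_post = {"reaction": 200, "comment": 80, "share": 40}
print(rank_score(calm_post))   # 1500
print(rank_score(angry_post))  # 9000
```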

Ted Olsen: So, they are rewarding anger more than love.

Chris Martin: Yeah, that's my understanding. The Washington Post has an article on this that goes into a lot of detail; I think they even have screenshots showing this data. I think there are points assigned such that an angry face earns you more attention than a like.

What people like me have always felt around for in the shadows, and suspected was probably the case, we now know is the case. I've always had the feeling that Facebook knows the depths of how their platform affects people on a global scale and knows the problems they have, because in their public statements about election security or violence around the world, they'll often put the problem to the side, talk about all the good things they are doing, and not pay attention to the problem itself. But these internal documents show they understood the problem they had; they just didn't do as much about it as most people, including their internal employees who were raising the alarm, thought they should.

Two other revelations, more minor ones: I think the best thing that comes out of this for Facebook concerns the FTC's monopoly case against them.

The idea that Facebook is a monopoly is destroyed. This is a treasure trove of documents showing that Facebook is hemorrhaging young people, and they know it. They cannot compete with TikTok and Snapchat in reaching Gen Z, high schoolers and college students.

A point in Facebook's favor, as far as this stuff is concerned, is that these documents make it very clear that Facebook is losing young people left and right, and that maybe they don't have quite the monopoly they appear to have based on their sheer user numbers and revenue.

Another thing: it looks like plenty of the people Facebook hired to research how their platform is used for good and ill brought concerns to the fore that were just ignored or squelched in some way, by Mark Zuckerberg himself or by higher-ups in general.

So, people within Facebook knew there were problems, and there's been plenty of deliberation about these problems, but nothing was done, and a lot of the research was shelved, as we've seen in the past with the little leaks that have come to light. There's a great quote by Kevin Roose of The New York Times, who covers this stuff pretty regularly. To paraphrase, he said something to the effect of, “If you hire a bunch of people on the premise of changing the world, you need to be careful, or they'll take you seriously.” That's what Facebook is dealing with in Frances Haugen and a lot of these researchers: they came to Facebook with a sort of civic-mindedness, this idea of “I'm going to Facebook to change the world for good.” And when they try to do that and are pushed aside, clamped down on, and told a problem is recognized but nothing is going to be done about it because it would hinder growth, then you're going to have people who leak tens of thousands of documents to The Washington Post and The Wall Street Journal and everybody else.

So, is the basic complaint then that Facebook did not do enough to stop bad behavior or that they deliberately incentivized it?

Chris Martin: I think it's both. I don't think Facebook wants people to use their platform for ill, but I think they want to make money, and in line with their earliest slogan, “move fast and break things,” there seems to be an internal value of making as much money as possible at whatever cost. This gets into why I think Christians should care: the factors that lead to Facebook flourishing lead to the opposite for its users. Facebook flourishes when it makes money. It makes money when people spend more time on the platform. And people spend more time on Facebook when they're driven to strong feelings, more likely anger than anything else, by sensational content.

So, Facebook is incentivized to make people mad. That's how they make the most money, because that's how people spend the most time on the platform. For Facebook to make as much money as possible, its users have to be prevented from flourishing. Those things are inversely related: the more Facebook flourishes, meaning the more money it makes, the worse it is for its users.

I think that's true even more globally than in the US. There's a great quote I heard earlier this week: what a lot of these papers reveal is that as bad as we think Facebook is in the US or the West generally, we have the best version of this app, the best version of this site. People in the Global South and in countries where they don't speak English have even worse versions than we do, because Facebook hasn't attempted to moderate content in many languages.

I don't think Facebook incentivizes bad behavior because they want to wreak chaos. I think they incentivize certain actions to make themselves the most money, and those actions happen to be bad. This is not new data: The Wall Street Journal published an internal Facebook report from around 2017 or 2018 in which researchers presented a slideshow within the company saying, essentially, the angrier people get on our platform, the more time they spend, and Facebook said, “Oh, that's a nice presentation. We're not going to do anything about that.” That was a few years ago. So I think they don't do enough to stop bad behavior, and they benefit too much from bad behavior, or at least from mental unhealth.

Ted Olsen: There is some indication in some of these files that a lot of it is about maximizing user time on the site, maximizing their ability to serve a lot of advertising, and also mining a lot of user data. It does seem like some decisions were also made politically, in terms of the Trump administration in particular complaining about Facebook, and some conservatives claiming they were being silenced on Facebook or not getting views.

There was an effort to work the refs, I guess that would be one way to put it. There's some indication that working the refs was successful, and some people were called off on some of the January 6 efforts.

How much does working the refs play into some of what's going on with what Facebook has been incentivizing?

Chris Martin: Frances Haugen gave The Wall Street Journal advance access to everything everyone else got access to on Monday. When they ran their articles on September 13, Jeff Horwitz wrote that while Facebook says its rules apply to all, company documents reveal a secret elite that's exempt: a program known as XCheck, or crosscheck, has given millions of celebrities, politicians, and high-profile users special treatment, a privilege that many abuse. So, I think that's also an example of that.

Ted Olsen: It is no surprise that, as in other areas of life, the powerful get special rules. They can play by different rules than the rest of us users, in terms of both how the algorithm treats their posts and whether they get flagged for special attention.

Chris Martin: Right. Going back to the refs question, there's a great article by Jeff Horwitz, who has led this charge for The Wall Street Journal. They got access to these Facebook Papers, back when they were just called the Facebook Files, long before everyone else, back in September; that's who Frances Haugen went to first.

A specific example of crosscheck's hidden rules comes from back in June of 2020, when, as Jeff wrote in the Journal piece, a President Trump post came up during a discussion about those rules on the company's internal communications platform, Facebook Workplace. Mr. Trump had said in a post, “when the looting starts, the shooting starts,” and a Facebook manager noted that an automated system scored the president's post 90 out of 100, indicating a high likelihood that it violated the platform's rules. I don't know the ins and outs of how crosscheck works, but my understanding is that an AI reads the content and gives it a score indicating how confident the system is that, based on the words and how they're arranged in a particular Facebook post, it's breaking the rules: maybe an 88 out of 100, maybe a 50 out of 100. This one rated a 90 out of 100.

For a normal user's post, Horwitz writes, such a score would result in the content being removed as soon as a single person reported it. So the AI recognizes that the content is probably violating the rules, but it then requires a Facebook user to trigger a review by reporting it.
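
A rough sketch of that two-track flow, as Martin and Horwitz describe it, might look like the following. The threshold, the whitelist flag, and the function itself are assumptions for illustration; the real crosscheck system is far more complicated:

```python
REMOVAL_THRESHOLD = 80  # assumed cutoff; the Trump post scored 90 out of 100

def handle_report(violation_score: int, is_crosschecked: bool) -> str:
    """Hypothetical moderation decision once a single user reports a post.

    violation_score: 0-100 classifier confidence that the post breaks the rules.
    is_crosschecked: True for accounts on the "XCheck" VIP list.
    """
    if violation_score < REMOVAL_THRESHOLD:
        return "leave up"
    if is_crosschecked:
        # VIP content is routed to a separate review track rather than
        # being removed automatically, per the Journal's reporting.
        return "escalate to special review"
    return "remove immediately"

# The same 90-out-of-100 post: removed for a normal user on the first
# report, left standing pending review for a crosschecked account.
print(handle_report(90, is_crosschecked=False))  # remove immediately
print(handle_report(90, is_crosschecked=True))   # escalate to special review
```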

But the post was not deleted, and he wasn't taken off the platform, as would have happened to normal users who aren't presidents of the United States. I know a lot of people disagree with me on this, but the idea that Facebook is anti-conservative is so blatantly untrue, and all it takes is looking at the statistics. They had a tool, which they recently shut down, called CrowdTangle. It used to be its own company; Facebook bought it. Kevin Roose, who I cited before, would often post CrowdTangle data on Twitter during the 2020 election. CrowdTangle would show the ten most-engaged Facebook posts on any given day, and he created an automated Twitter account to post that list every day. Almost every single day, eight or nine of the ten most-engaged posts on Facebook were from Ben Shapiro, Donald Trump Jr., Fox News, conservative media personalities, or politicians in general. That was regularly the most-engaged content on Facebook.

Facebook shut down CrowdTangle in the last six months because it was not showing the picture they wanted it to show, and it was legitimate data. It was a company they purchased and ran out of Facebook, but they disbanded the team, the guy who founded it left Facebook, and they scuttled the whole thing. But I think the idea that Facebook is squelching conservatives is appealing because it creates a sort of enemy, and Facebook was created by a liberal Harvard student or whatever.

There's this idea that every Silicon Valley company is liberal and bent on squelching conservatives. I'm not saying censorship doesn't happen, or that we may not have a problem sometime in the near future; I think that's very possible. But the idea that Facebook is squelching conservatives is so easily refuted by the data Facebook themselves have presented over the years.

Morgan Lee: Yeah, we used to use CrowdTangle, and it was a helpful tool, especially as journalists, to get a sense of who's reading your stuff, who's sharing it, who's engaging with it. So that was disappointing when they shut it down.

Pivoting to talking about things in a more theological sense: we're talking about something that is going to touch the life of every single person who listens to this podcast, and in many ways almost every single person around the world. Having said all that, is there a “Christian” response to the news that we're finding out about Facebook this week?

Chris Martin: My advice is never simply to log off and delete your accounts. I think that course of action is totally fine, and maybe smart for a lot of folks, especially folks who are finding their self-worth and value in these platforms. In cases where you feel Facebook is making you mad, envious of other people, or making you feel poorly about yourself, then shutting down the apps and deleting your accounts is brilliant.

But I talk to my 85-year-old grandma every Sunday afternoon; I usually call her while I'm making dinner on Sunday evening. She has never used the internet in her life. She had a flip phone from Verizon, one of those clamshell ones, that she got probably in 2003, and she just replaced it in 2020 because my wife and I had a daughter and she wanted to be able to see pictures of her. We don't post pictures on the internet, and she didn't want me to have to keep mailing her pictures. So my dad set her up with an iPhone, and we have a shared Apple iCloud photo album where we upload photos. She doesn't know how to text. She doesn't know how to use a web browser. She has no apps beyond what Apple puts on the phone, but she can access the shared photo album to see pictures of our daughter.

I talk to her every Sunday, and Facebook comes up in our conversation probably three Sundays out of the month, because she's around friends who use Facebook. We were talking, and she told me that her niece Lisa, who helps around the house, had told her there was a post on Facebook of the American flag with “God bless America” or the Pledge of Allegiance written inside the picture of the flag. She said Lisa told her she had posted it three or four times and they kept taking it down, saying it was hate speech. Can you believe that?

She said, “Can you write to Facebook? I am liable to write them and tell them how terrible this is.” So I'm thinking through this and telling her there's probably some explanation: either it wasn't actually deleted, or there's some way it was violating the terms of service that we're not aware of.

So, I went and looked it up, and this is a well-documented piece of misinformation. The post is not being pulled down by Facebook; people post it and say in the caption, “Facebook pulled this down, share it to keep it up.” And my grandmother, who's never used the internet outside of the shared photo album and has never used Facebook, was duped by fake news on Facebook.

Honestly, it broke my heart. So here's the point of that, in my view: deleting your accounts isn't going to fix the problem.

So, what's the Christian response? I think the Christian response is to recognize that the water we're swimming in is toxic, like in David Foster Wallace's opening to his commencement speech at Kenyon College a number of years ago. An older fish swims by some younger fish and says, “Hey, how's the water?” And they say, “What the heck is water? I don't know what water is.” The older fish recognizes that these fish need water to survive; they're living in the water. And by asking how the water is, he theoretically knows the water could be toxic, poisonous, harmful to them in some way. I think we all need to realize that social media is the water we're swimming in. We can't get out; we're fish. We cannot escape the water. We will always, for the rest of our lives, be swimming in social media water, and I fear we don't even know it. We're the younger fish: we don't even know we're swimming in the water, let alone realize that the water is toxic.

My hope through all of this is that we start to recognize that the water is toxic. And because we're fish in this analogy, I don't think the answer is to try to extricate ourselves from the water. I don't think we can survive outside of social media; it will always invade in one way or another. But if we can just have the awareness that the water's toxic, maybe we can breathe a little more carefully, put on our little fishy gas masks or whatever we need to survive the toxicity. I think that looks like different kinds of Christian application: it looks like discipline, it looks like screen-time management, it looks like investing more time and energy in forms of media whose flourishing also leads to more human flourishing. I work for a book publisher, so I'm biased, but the more book publishers flourish, the more humans flourish. I think podcasts are a great medium: the more good podcasts flourish, the more we flourish.

I don't think that the more money Facebook makes, and the more Facebook flourishes, the more we flourish. Those work against one another, and I think one of the more Christian applications we could make of this is to invest time in forms of media whose goals are a little more aligned with ours than Facebook's, or even other social media platforms'.

Looking at the water analogy, what is the water in this situation? Is it Facebook as a company, or is it the internet?

Is it just a sin issue, where you have individual sinners sinning and you can never remove sin from the human heart, so we're just going to learn how to live in a sinful world and not do anything about the systems and incentives that exacerbate some of that sin problem?

Chris Martin: Before the pandemic, I felt like I was banging my head against the wall with this kind of conversation, because it was really hard to get anybody to recognize that the water is toxic. Everybody loves cat pictures and funny fail videos so much that they don't think about how Instagram is warping their understanding of what beauty is. Nobody wants to think about that. Nobody wants to think about the side effects of scrolling, because they love what they're scrolling and looking at so much. But through the pandemic, people started to rely on these platforms so much that the idol started to show its cracks. We put so much weight on these platforms that they started to show their issues, and I've had a much easier time getting an ear for conversations about the weaknesses and failures of all these platforms, not just Facebook, in the last nine months than I did in the nine or twelve months before that.

I used to think that social media is a neutral tool that we just use for sinful purposes because we're broken, sinful people. I now think that idea is incorrect, and here's why. Let's say Facebook is a hammer company, and a bunch of us are in Nissan Stadium down here in Nashville, where the Tennessee Titans play, which holds, let's say, 40 or 50 thousand people. Facebook comes in and says, “Hey, we're going to give you guys a tool. It's a hammer, and we want you to use these hammers to build things and make the world a better place.”

But the thing is, Facebook has sharpened the back side of the hammer, the claw you use to pull nails out, and made the front part, the blunt side that drives nails in, a little less appealing to use.

They've made it maybe not very smooth, or sometimes it falls off. It's like they've given us a tool that's incentivized to hurt people: the sharp end of the hammer is a lot more effective in this tool they've given us. I think Facebook has not given us a neutral tool. It's not like we just went out into the desert and discovered Facebook. Facebook is broken at its core because it's made by people who are broken at their core. Facebook is not a neutral tool, and it's not just Facebook; no social internet platform is neutral. Each has incentives that encourage certain actions. And they've given people a bunch of hammers with the sharp edge made even sharper, then said, “Hey, however many people you can hit and injure with this sharp thing, we're going to give you medals,” or something like that.

It's like they've incentivized using the tool poorly. You get more attention and more engagement the more you use their tool in the way they say it's not meant to be used, and, by the way, Facebook also benefits when you use the tool that way. I don't buy the idea that Facebook or any social internet company is neutral, and I think the best we can hope for is finding one whose incentives for flourishing align with ours. That's the best we're going to get; nothing is neutral. As for the water part of the question, we can't get out of the water.

I think we just need to recognize that the water is toxic. What is the water? I think it's the social internet, and I say “social internet” because it's broader than social media: Google is part of the social internet, but you don't think of it as a social media platform. “Social internet” is a much more accurate term for what we're dealing with here, where we connect with people around the world and exchange ideas at the speed of light.

I genuinely think, and this is a whole other discussion, that the social internet is one of the most consequential technological advancements in human history. Obviously, you needed the printing press, you needed electricity, you needed all kinds of things to get to this point.

But I think that lightning-fast, literally light-speed exchange of information allows for a level of damage and destruction that no technology has ever allowed. Many of the tools we use, Google and otherwise, work this way, so it can feel like we're picking on Facebook, but there's a reason: they're the largest in the world. They made $28 billion in the last quarter. It's not just that we don't like them or something like that. They're the biggest player here, and along with Google and a couple of others, they're the big dogs in the room.

All of these platforms have incentives for growth and for profit that don't necessarily align with human good and human flourishing. My biggest call to action, when I'm having these kinds of conversations with folks, whether it's through the newsletter, or speaking places, or just conversations with friends, is consciousness. So many of us spend 40 hours a week looking at a big screen so that on Friday night we can scroll on a smaller screen while our biggest screen plays Netflix in the background. Do we even think about this stuff? I just don't really know if we're considering what scrolling Instagram for four hours a day does to us.

My biggest interest is not to abolish social media or shout “down with Zuckerberg.” He is not the problem, and nobody can solve the Facebook problem. Facebook makes 615,417 moderation decisions every hour, meaning they take down over 615,000 pieces of content every hour. The Supreme Court has decided 246 First Amendment cases, ever. Theoretically, if you think about it this way, Facebook decides 615,000 First Amendment cases every hour.
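
To put that comparison in perspective, a quick back-of-the-envelope calculation using the figures cited here:

```python
# Figures as cited in the conversation.
decisions_per_hour = 615_417
scotus_first_amendment_cases = 246  # total, in the court's history

print(f"{decisions_per_hour * 24:,} decisions per day")         # 14,770,008
print(f"{decisions_per_hour * 24 * 365:,} decisions per year")  # 5,391,052,920

# At that pace, Facebook matches the Supreme Court's entire First
# Amendment caseload roughly every second and a half.
seconds = 3600 * scotus_first_amendment_cases / decisions_per_hour
print(f"{seconds:.1f} seconds")  # 1.4 seconds
```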

They've moved fast and built things so quickly that they can't control their own platform. So I think the very least we can do, as we're using these tools, whether we think of them as neutral or not, is ask these companies to manage themselves, and if they can't manage themselves, I think regulation is in order.

Now, that gets into a whole other conversation. I share the same concerns and nerves that Ted does, but I think we need something to happen. We can't un-ring the bell, as the cliché goes, and we need something to happen to be able to control this mass communication of information.

It's just human brokenness at a scale we've never seen before. I'm not the only one who's called it this, but the social internet is the modern Tower of Babel. We've built our tower; we may not share an actual language, but we share a sort of language. And now we're getting to the top of this tower and realizing maybe this wasn't such a good idea to begin with. I just think, at the very least, it merits asking a bunch of questions, but we can't expect that we're going to escape this water. You'll be like my 85-year-old grandma, who still gets duped by Facebook misinformation even though she's never seen Facebook.