As community professionals, we’re tasked with helping people start and participate in conversations that matter to them. We’re often held accountable to “engagement” metrics, such as the number of people participating in conversations and the sentiment surrounding those conversations.
But in this conversation with reporter Jeff Horwitz, you’ll learn that while Facebook obviously wants to gain more attention from users and increase time spent on the platform, there’s less internal consensus around the ethical dilemma of reaching these engagement goals by amplifying divisive groups and content. At the same time, a lack of concrete metrics to measure impact is perhaps one of the things stopping Facebook from taking a step back and thinking about how its platform is impacting the world.
Is Facebook already too much of a monolith to change its path? Or is Mark Zuckerberg still not convinced that the company is at the center of a moral dilemma when it comes to polarizing its members and advancing the spread of misinformation?
In this episode of Community Signal, Patrick talks to Jeff about the article and dives into the context surrounding the story, learned while talking to Facebook employees.
If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.
[00:00:04] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Sponsored by Vanilla, a one-stop shop for online community. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[00:00:25] Patrick O’Keefe: Hello and thank you for listening to Community Signal. A few weeks ago, The Wall Street Journal published a fascinating piece by Deepa Seetharaman and Jeff Horwitz discussing how Facebook executives had shut down efforts to make the site less divisive. It was revealed in the story that Facebook researchers found in 2016 that 64% of all new people joining extremist Facebook groups found them through Facebook’s own recommendation tools like “Groups You Should Join” and “Discover.”
That 2016 presentation said it directly, “Our recommendation systems grow the problem.” Deepa and Jeff’s article discusses the internal struggle to solve these issues and how many ideas were generally watered down or simply rejected. I was so interested in this story that I wanted to talk to one of the authors on the show and Jeff Horwitz joins us for this episode.
Our Patreon supporters are a group of listeners who find value in the program and want to support future episodes. They include Maggie McGary, Luke Zimmer, and Rachel Medanic. The support is deeply appreciated. If you’d like to join them, please visit communitysignal.com/innercircle.
Jeff Horwitz is a reporter for The Wall Street Journal focusing on Facebook. Before joining The Wall Street Journal, he was an investigative reporter for the Associated Press. Jeff, welcome to the show.
[00:01:32] Jeff Horwitz: Hey, thank you.
[00:01:33] Patrick O’Keefe: It’s great to have you on. I was really fascinated by this piece. As I mentioned on Twitter and told you, I actually paid for a Wall Street Journal subscription just to read it rather than trying to find it somewhere or try to skirt that. I was like, “This actually looks fascinating, so I am going to pay.” It was a really small amount of money because it’s my first-ever subscription with The Wall Street Journal. I’m going to pay this money, support that effort and actually read this piece in full as it is intended.
[00:01:56] Jeff Horwitz: Well, I’m delighted that the story was worth a really small amount of money.
[00:02:01] Patrick O’Keefe: Well, you never know, right? Because you got to get them in the door to retain them. That’s the offer that the Journal made to me. I took them up on it. [laughs] Just for context, Facebook is your beat, right? That’s really what you focus on at the Journal exclusively?
[00:02:14] Jeff Horwitz: Yes. I’ve been doing that for a year and a half. There’s a bit of broader social media and a few other little things thrown in. Honestly, to tell you the truth, Deepa, my co-author on this, also does a lot of Facebook. It’s, candidly, a bigger job than I think I can handle alone, and there are other colleagues who moonlight on it, too.
[00:02:32] Patrick O’Keefe: We’ll link to the story in the show notes, but let’s assume that the person listening at this moment has not read it. What are the main takeaways of the piece from your perspective?
[00:02:42] Jeff Horwitz: The main takeaway is that in the wake of the 2016 election and just the level of vitriol and misinformation that occurred, Facebook decided it really needed to take a look at its platform and figure out whether it might be making things worse in terms of divisiveness, polarization, et cetera. Inside, there were some very smart people brought in to do this, and its own integrity engineers separately had to combat the same issues on the algorithmic engineering side.
The company found that in some instances, its platform was increasing polarization. I’m paraphrasing here, but Facebook’s algorithms would spit out increasingly divisive content in an effort to boost engagement unless interrupted. Some of the most avid users of the platform tended to be the people who, from a social good and moderate-conversation point of view, would not be the ones you’d want to be the most vocal.
[00:03:45] Patrick O’Keefe: One of the numbers that really just hit me in the face, and I mentioned it in the intro, but just the 64% number that in 2016, Facebook researchers found that when it came to new people joining extremist Facebook groups, 64% of them found those groups through Facebook’s own recommendation tools like “Groups You Should Join” and “Discover.” Just as someone who builds platforms and thinks about recommendation and how things filter through, it’s just amazing because it’s probably one of the finest lead generation machines for extremism that I’ve ever seen before.
That research is like, “We have a problem and here’s how we’re going to tackle it.” I liked how the article told the story and I wanted to get into a few of the points. First, Facebook doesn’t always respond to every article that’s written about them. They don’t issue statements about every article, but they did about yours because yours was big enough and they’ve published this piece, “Investments to Fight Polarization” on their press site. I gave it a read and I was just curious. I wanted to give you a chance to reply to it, but also to ask you if anything stood out to you about it.
[00:04:47] Jeff Horwitz: I don’t take umbrage at Facebook disagreeing with the story. I think their read was that not every idea made it through, but they have made improvements to the platform. I don’t think that’s wrong. There are some things that have changed. Some of the ideas that were proposed that the engineers really were behind in terms of ways to potentially improve discourse on the platform and perhaps not mute but dampen the voices of some of the loudest and angriest users, they went through, but they didn’t go through with the scale that the engineers believed was necessary to solve the problem. I don’t know that I disagree with Facebook’s statement that they have done things. They certainly have. Question is, did they do enough to, say, solve the problem to the degree that the engineers believed was necessary?
[00:05:40] Patrick O’Keefe: Let’s take a break to talk about our sponsor, Vanilla.
Vanilla provides a one-stop-shop solution that gives community leaders all the tools they need to create a thriving community. Engagement tools like ideation and gamification promote vibrant discussion and powerful moderation tools allow admins to stay on top of conversations and keep things on track. All of these features are available out of the box, and come with best-in-class technical and community support from Vanilla’s Success Team. Vanilla is trusted by King, Acer, Qualtrics, and many more leading brands. Visit vanillaforums.com.
A primary premise of your story is that Facebook executives are simply choosing not to make the site less divisive. It’s not that they don’t have ideas or smart people, or at least access to smart people who want to tackle this issue. They’re making a choice. On Twitter, I follow an experienced community designer named Derek Powazek, who’s been on the show before and he said, “The hurdles aren’t technical. They’re not too hard to implement. They’re not too expensive. They just choose not to. Why did they make that choice? That’s what we should be asking them every day. We should make them explain.”
I was just curious. What are you hearing? Why do you think they are making these choices?
[00:06:52] Jeff Horwitz: I think some of this is it happened by accident and then constituencies developed. One of the things that some of the engineers really regretted was that they hadn’t been able to take action earlier because if they had, potentially, there wouldn’t have been this large ecosystem of misinformation and misinformation publishers that actually started defending its own interests. Making those changes once you’ve already got an established platform and communities on this platform is a lot harder, right? If you’re going to take out a group, it’s best to do it when it’s got 50 members rather than 50,000.
In terms of that quote you just read from that very smart guy who I’ve not previously heard of, but it was really good, I thought, one of the things that really surprised me about researching this over a few months was how many options they did have available. There were just so many different potential avenues to change discourse on Facebook. That’s not how the company talks. The company talks about it like, “Well, we can take it down or we can leave it up.”
[00:07:58] Patrick O’Keefe: [chuckles] There’s a lot of color in between.
[00:07:59] Jeff Horwitz: Yes. That’s like, “What do you want us to do?” This gets into weird Section 230 stuff, and I don’t want to go down that path too far. In some respects, Facebook is like a publisher in the sense that they are actually making choices, not just of what to take down or leave up, but what to distribute, what to promote, and what viewpoints are going to succeed. There are any number of ways to do that, and why engagement alone, which has historically been the guiding light for all this stuff, should be the solution, I don’t know that that’s obvious to me.
[00:08:38] Patrick O’Keefe: It’s such a classic problem in community and one that we talk about on this show: not only setting the tone from the start but also the tough choice that we make to harm, let’s say, vanity metrics temporarily with the idea that we’re making a change that makes a difference in the quality of the community long term. You mentioned constituencies that develop. I think we’re seeing a trend with Facebook and with Facebook vs. Twitter right now and this political climate we’re in and how the president responds to actions that you take and who he rails against or who he doesn’t. I just got a sense from reading your piece that, at least on the Facebook end, it feels like, and maybe this is wrong, some of these decisions are impacted by almost a fear of how they will be viewed by Republicans or by conservatives.
[00:09:29] Jeff Horwitz: Oh, yes. That was, I think, fairly explicitly understood by some of the engineers. It was implicitly one of the reasons, I think, why a lot of these things didn’t take off. One engineer, a guy named Carlos Gomez-Uribe, who headed the News Feed integrity operation, noted that it’s not the case in every country that there’s more misbehavior on the conservative side than on the progressive side. Right and left.
There is a much bigger ecosystem on the right, however, on Facebook. As a result of that, there’s a larger constituency. Even if you take, let’s say, a completely impartial rule, like something that’s just supposed to reduce clickbait headlines and doesn’t seem, in any way, partisan, you’re going to get disparate impact because that ecosystem is larger on one side. I think there was a lot of frustration about that.
[00:10:25] Patrick O’Keefe: I like to say that you can’t outrun the founders. What I mean is that I’ve found myself reporting up the chain before to CEOs, COOs, and founders who would continually weaken my initiatives. Ironically enough, they were initiatives aimed at encouraging more quality contributions to communities in some cases. I really connected with that part of your story that talked about how these ideas would get weaker as they went up the chain.
You mentioned engineers and talking to engineers, and part of the piece is really about the obstacles that change is facing. Take my case: I turned down a recruiter from Facebook in 2009 and I really don’t want to work there, but let’s say I decided, right? I was going to join and fix this. What are the obstacles that are in the way of these engineers?
[00:11:09] Jeff Horwitz: Facebook has this interesting process that has been branded “Eat your veggies.” The idea is that any significant engineering change to the platform for integrity purposes has to get run by the rest of the operation. We’re talking legal, policy, marketing, comms. Maybe something else too that I’m forgetting.
[00:11:32] Patrick O’Keefe: PR.
[00:11:32] Jeff Horwitz: On its face, it doesn’t seem like a particularly burdensome thing. The problem was that the process, per the engineers I was talking to, was set up so that, basically, the status quo was really the preferred outcome. For any proposed ranking change, the engineers had to basically submit a list of potential ways that it could go south on the company, right? It was being presented almost as this, “Heads, you don’t win. Tails, you really lose.”
What happened is a lot of ideas and plans that the engineers shared were very well-rounded. I have to say talking to these people, they did not seem like partisan firebrands. They seemed like they were deeply concerned about fairness, but they basically had a lot of their ideas either gated or watered down. One of the parts of the story that I think was most compelling to me was the way in which this stuff did get down-pedaled and softened in relation to an effort to perhaps quiet, to some degree, hyperactive users.
[00:12:46] Patrick O’Keefe: You mentioned the departments it would have to go through. To me as a practitioner, it reeks of– there was this story about this luggage company a while back. Gosh, what was it? I’m sure you know the story because it went everywhere. The CEO of the luggage company was basically mistreating her employees and it was bad. It was ugly. One of the things that she made them do was walk their ideas through the general Slack channel in front of the whole company.
The responses to those ideas would often be not just critical, but so leading, right? If you have to walk your ideas past people who don’t have to live with those challenges, goodness, it weighs on you. In your story where you mentioned public policy, legal, marketing, PR, and having to get approval, I was just thinking about how infuriating a process that has the potential to be.
Because like policy and legal, I can see. It’s not that you don’t want their input. It’s just that these are not people who have to live with the problem of divisiveness on Facebook. It made me think about the people who really should be thinking about that problem because you talk about engineers. Engineers can take a lot of different forms. Some would be well-versed on these problems, some would not.
One thing I was curious about was just these ideas. Facebook has a lot of people there who are social scientists or researchers or who are involved in content moderation and the people side of these things. A lot of those ideas, I would assume they came from a place where it was a collaboration between engineering and those people who are more directly exposed to the divisiveness issue.
[00:14:20] Jeff Horwitz: Yes, absolutely. There were plenty of Ph.D. sociologists roaming around that place. They obviously could consult with pretty much whomever they wanted outside. One of the things that really struck me was the engineers. There’s the stereotype of engineers being very narrowly focused and not aware of the world outside of their computer screens.
Man, these folks were really well-rounded. I think that partly has to do with the type of person who wanted to join the integrity operation, but there was some fairly– and just in the course of reporting, I had some fairly deep conversations with people about the nature of fairness in moderation and what it would mean to change civic discourse and what would be too much for Facebook to do, what are the lines that Facebook shouldn’t cross.
They were in some ways just a really deep bench at the company. I think that was one of the reasons why they were so frustrated to have a whole bunch of outsiders who were more from a policy or comms background questioning their judgment and whether they were perhaps bluntly imposing their bias on the platform or at least trying to.
[00:15:33] Patrick O’Keefe: As I listened to you, and this might not be how Facebook sees all issues, but to me, it sounds a lot like there’s this mentality that creeps over a company when it gets big. [chuckles] There are a lot of big companies that I wouldn’t want to work for because it just feels like I would be weighed down by bureaucracy, whether it’s intentional, to maintain the status quo, or not. It really feels like, at least on this issue, Facebook has become this almost old-school-mentality big company as opposed to a more nimble one. Obviously, they’re not a startup anymore, far from it, but just a more nimble company that can more readily adapt to change. It just feels like it’s weighed down in bureaucracy.
[00:16:15] Jeff Horwitz: I don’t know, though, that from the company’s point of view that’s a problem here. From the company’s point of view, it’s not like this is– I’m trying to think of the best tech analogy.
[00:16:25] Both: IBM.
[00:16:25] Patrick O’Keefe: [laughs]
[00:16:26] Jeff Horwitz: It came quickly, but it’s not like them watching their lunch get eaten in front of them over and over again. This was more intentional, right? This was, I think from Facebook’s point of view, exactly how the process should be working, which is that there are a whole bunch of constituents in everything Facebook does now. Some of them are on the policy and legal and comms team. It’s not just about building the product. That’s completely understandable for an organization that’s well north of 45,000 employees and growing now.
[00:16:58] Patrick O’Keefe: Going back to bringing things up the chain, the article talks about a specific change where the debate was elevated to Mark Zuckerberg, who made a decision, but– and I’ll quote from the article here, “also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.”
I wanted to ask you to– I don’t know if you could break that down a little bit because when you say “social good,” a lot of people think of charitable giving, right? Not necessarily ensuring that Facebook is less divisive or is more productive in conversation. Why is that social good treated with such disdain and, I guess, just a general lack of interest at this point?
[00:17:41] Jeff Horwitz: Yes. Let’s go back just briefly and discuss what that meant in this context. Because this change was basically just going to reduce the impact that hyperactive users had on the success of certain publishers. For example, if Occupy Democrats or something had this core group of people that was just rabidly spreading its links around the platform, like posting hundreds and hundreds of times, that would have less weight than would be given to an article that was, say, being posted by a whole bunch of people, but lightly, people who didn’t post excessively. I think it’s pretty common in a whole bunch of recommendation algorithms that you don’t necessarily let the heaviest users steer the ship, because then you end up with some weird outcomes, right?
[00:18:32] Patrick O’Keefe: You can disincentivize new people, too.
[00:18:35] Jeff Horwitz: Yes. The engineer that put this together, or that was presenting it, called it the happy face when he did the chart of its outcome, which is that on the far-right and the far-left, you would reduce the clout of those users. In the middle, you would actually be giving those people more of a voice. They really went for this idea, the engineers that were working on it. They thought it was excellent. It was non-biased. It was something that was going to hopefully just change the platform a bit and make it so that the angriest voices weren’t the loudest, right? It went through this process. It did get kicked up to Mark and Mark’s response was, “Do it, but cut it by 80%.”
[00:19:14] Patrick O’Keefe: [laughs] Which is almost like, “Don’t do it,” but still ––
[00:19:18] Jeff Horwitz: I think that some folks in the company view that as a victory for the idea, right? I think that for someone who thought that they had come up with a good way to address a problem, being told that they could address 20% of the problem, or address the problem 20%, didn’t always go down well.
[00:19:37] Patrick O’Keefe: I guess just going back to the question there, this disdain and lack of interest in these social good changes, is it just at a point where they feel like they can do no right at that high level and it’s not worth even bothering to try to confront this issue still or just having other priorities or just frankly being against it as an idea?
[00:19:58] Jeff Horwitz: Oh, there were so many things. I think one of the things that really hamstrung this effort internally was a lack of metrics. It’s really hard for a platform to tell what impact it’s having on its users. If Facebook was asking the question, “Are we making the world worse?” how do you even answer that question, right? Facebook is a very metrics-driven organization and that was a pretty hard thing to quantify. It was like if you can’t quantify it, then how do you know you’ve solved it? Thus, if you can’t quantify it, maybe we should just move on. I think that was part of it.
Part of it was the presence of constituencies that were not going to like the changes, right? I think quite correctly, some entities view Facebook as a battlefield.
Part of it was, I think, also just, yes, like institutional interest and the fact that Facebook was catching hell from pretty much everyone for a pretty long time. This is something that some of the engineers or some of the folks on the Common Ground team noted in documents, which is that they were going to need to find outside voices to work with them, people who are going to be validating their work. Because even if it was good, even if it was intended to improve the platform, and even if it was actually going to be successful, still nobody was going to trust Facebook. Very easy to be empathetic with that situation, right? At a time when Cambridge Analytica is going on in terms of the scandal component of it, not the actual events, admitting, “Hey, our platform has some problems,” is pretty rough. I think there was a reluctance to dig too deeply on some of those things.
[00:21:37] Patrick O’Keefe: In their reply to your story, Facebook mentioned that they had an integrity team of more than 35,000 people. I’m going to guess that some of those people are content moderators or at least similar in a function. A couple of weeks before your story came out, it was announced that Facebook had settled a lawsuit with 11,250 current and former content moderators, paying $52 million as compensation for the mental health issues they had experienced due to their work. Part of the issue of divisiveness on Facebook does relate to content moderation in my view. Your story focused on the top end of the hierarchy at Facebook and how those decisions are being made. I was curious. Did content moderation come up at all as you spoke to people?
[00:22:17] Jeff Horwitz: Yes. I think the Common Ground initiative people took this more academic approach to thinking through polarization and fighting on the platform. By the way, the interest in polarization, it was affective polarization, right? People could disagree all they want, but the goal was to try to make it so they didn’t hate each other. In your parenting group, maybe people have different views about vaccination schedules, but nobody thinks that the opposing viewpoint should be going to jail for child abuse, right? That’s the affective polarization versus just disagreeing, which I’m sure your listeners actually probably know more about than most. A lot of the tools that they were proposing were things that were aimed at giving moderators better control.
There was one proposal to create an ability to take a timeout in a group discussion. Let’s say your group was about to descend into a flame war on some vitriolic topic. You could either limit how frequently people could post on that topic, or you could take the entire topic and create a subgroup for it, so all of the social media warriors who wanted to duke it out could, but then everybody else, on the main thread, could go back to having a regular, normal discussion. There were a whole bunch of things like that, not even for the paid Facebook integrity people, but just giving group administrators more tools to do this work themselves.
[00:23:43] Patrick O’Keefe: It’s interesting because I mentioned this on a Twitter thread I’ll link to in the show notes, but it came up when I had Howard Rheingold on the show. Howard Rheingold is an online community pioneer, well-respected in the space, who was invited to speak at Facebook to their social scientists. One of the things he said, which I laughed at at the time and which sticks with me, is, “I don’t understand why they designed Facebook groups to function so poorly.” [laughs]
It’s one of those things, and it’s not really directly related to your story, but it is still related to the idea of how decisions get made around improvements to the platform and the conversation. It seems to be a recurring trend from conversations I’ve had over the years with people that, yes, Facebook will want your ideas or will pay for research. Of course, it’s a big machine and not everything is going to be approved.
Just in general, the way decisions are made skews toward Facebook’s own self-interest, which does make sense, and that’s keeping you engaged in at least the short term. On the community end of things, I like to look at things from a long-term perspective. It’s just frustrating to see that. I use Facebook. I don’t love Facebook. I would like to see Facebook be better. It’s just a recurring trend around people going to Facebook, talking to Facebook, and coming away not being thrilled. I don’t know if there’s anything you could talk about. I was just speaking freely. It can die right there.
[00:25:02] Jeff Horwitz: No, no, no. I think that was the way that some of the engineers approached it, right? A whole bunch of folks came in with the idea of like, “Hey, there’s really valuable things we can do here. Look, there’s low-hanging fruit. Let’s do it.” It turns out a lot of that fruit was out of reach. In some ways, I think this goes back to just being a really large organization with as many constituencies as Facebook has at this point.
It’s a product that really does work and it really is successful in just a truly profound way. Making changes to it, I think, is pretty hard. Even if there are so many other ways for groups to be or for News Feed to be, this is the way that has built Facebook into a wild, almost unparalleled success. Why would you give that up?
[00:25:53] Patrick O’Keefe: One of the central figures in your story is longtime Facebook engineer and eventual product lead, Chris Cox. Cox had led a cross-departmental group that you referenced earlier, Common Ground. They made proposals for ideas that Facebook could implement to combat polarization. According to your piece, the Common Ground group warned that some of their ideas were “anti-growth” and required that Facebook “take a moral stance.” Ultimately, it has been said that Cox lost most of those battles and he ended up leaving Facebook in March of last year, 2019. Now, we’re recording on June 12th and by some odd coincidence- [chuckles]
[00:26:27] Jeff Horwitz: He’s back!
[00:26:28] Patrick O’Keefe: -Cox announced yesterday that he’s returning to the chief product officer role at Facebook. I know you wrote about this and we’ll include the link in the show notes. I wanted to ask you about Cox’s return and what it means, but I’m going to muddy the water just a little bit and try to broaden the context. Because early in the morning, Eastern Time on the day your story went up, 17-year-old Darnella Frazier posted a video on Facebook of the murder of George Floyd.
That same day, Twitter decided to add a fact-checking notice to a couple of President Trump’s tweets. Twitter finally decided to take this stand. I feel like it’s forced Facebook back into the news over those issues and made someone, in this case Mark Zuckerberg, comment on Fox News, saying that Facebook wouldn’t do what Twitter had done. I can’t help but look at this broader context right now, the backdrop of the protests occurring across this country in search of justice and equality and to stamp out racism.
Trump, of course, later posted a tweet that included both the word “thugs” and the phrase, “when the looting starts, the shooting starts.” He posted it on Facebook and Twitter. Twitter hid it. Facebook decided to take no action. Many Facebook employees are now speaking up. They’re staging virtual walkouts and they’re discussing work stoppages. I guess if I want to wrap this all up, [chuckles] this big, complex thing, into a single thought, it’s really, is this a moment right now where the pressure coming from inside and outside of Facebook might actually force a change on the issues discussed in your article?
[00:27:59] Jeff Horwitz: I obviously can’t prognosticate that, but I see where you’re going. Honestly, given the recitation of all the things that have happened in the immediate wake, in terms of the moment, Chris Cox being back is significant. He was basically the chief patron of this whole effort, right? He didn’t run Common Ground, but it existed because I think he wanted it to work.
When he stepped out, the reason was disagreements over encryption and the future of the platform, which makes sense, right? He headed product. Now that he’s back, whether the company is going to do a little more is unclear. I think that Mark Zuckerberg did get caught by surprise by how much Twitter’s decision ended up reverberating inside of Facebook. We saw, late last week, the company sort of saying it was going to reconsider a whole bunch of things, right?
There were no promises made, but there was a lot of talk about thinking through things. What that is going to mean, it’s too soon to say. I think that some of the folks who really were fans of the polarization work are happy to see Chris Cox coming back. At the same time, a couple of folks that I spoke to for this story noted that there was a reason that he left in the first place. It was because the company was going to do what Mark wanted to do and that was that. How this works out now, I’m going to be curious.
[00:29:27] Patrick O’Keefe: It’s interesting to see how sometimes with these big companies, another example is escaping me, but there was one a while back and it’s killing me that I’m not remembering it. Like Twitter, Facebook, YouTube, when one of them decides to do something, the pressure for the rest of them to do it grows. For Twitter, just personally, it’s not like the Trump problem, if you view it as a problem, just popped up yesterday.
Again, if you believe it’s a problem, they’ve been kicking the can down the road for a while. Now, Twitter has decided to do something. That’s now making others ask questions about Facebook or YouTube or Instagram or other places that also house his content. It feels like it’s really a case of sometimes there’s a unified front, even if it’s unintentional and no one’s talking. Once one of them actually takes a stand on something, then it creates even more pressure for the others.
[00:30:15] Jeff Horwitz: I don’t know that Twitter historically would have viewed itself as the poster child for heavy content regulation, or for heavier content regulation. Traditionally, the company has not been on that side. It’s really fascinating seeing this distinction here. To some degree, it seems like maybe this is being a little bit blown out of proportion in the sense that they are fundamentally still on some very similar pages in terms of free speech and the idea that their algorithms are a good way to distribute content and that they don’t really want to get in the way and take a heavy hand. That seems like that’s a shared thing. Even if you have had Facebook explicitly saying, “Well, we wouldn’t do what Twitter did,” and you’ve got a whole bunch of Facebook employees saying, “Do what Twitter did,” right?
[00:31:04] Patrick O’Keefe: Right.
[00:31:05] Jeff Horwitz: It doesn’t seem to me like there’s historically been a huge gap. It candidly seems hard to imagine how a truly huge gap could develop given just the businesses that both of them have.
[00:31:17] Patrick O’Keefe: Jeff, thank you so much for finding some time for us. I really enjoyed the conversation and appreciate the context for the article.
[00:31:24] Jeff Horwitz: No problem. Thanks for having me.
[00:31:27] Patrick O’Keefe: We’ve been talking with Jeff Horwitz, Facebook reporter for The Wall Street Journal. We’ll link to his reporting in the show notes. Follow him on Twitter @JeffHorwitz.
Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead. Thank you for listening.
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.