Facebook’s Australian News Ban Will Lead to Even More Misinformation
What would happen if you woke up tomorrow and couldn’t share any news articles on Facebook? How would that impact the communities that you manage or the way you share information with family and friends? What if this ban included information provided by emergency services agencies for things like natural disasters, wildfires, and domestic violence? This situation is not a hypothetical one for Australian users of Facebook. Just last week, after Facebook failed to make an agreement to pay Australian news organizations for linking to their content, the company issued a ban that prevents sharing Australian or international news content on the platform.
In this episode, Patrick talks to Dr. Jennifer Beckett, a lecturer in media and communications at the University of Melbourne, about the immediate ramifications that this has had and what it might mean for communities on Facebook moving forward.
Dr. Beckett’s work also focuses on the mental health of digital workers, given the prevalence of moderation-related work, no matter the job title. As our field expands, Dr. Beckett points to the need for visibility and protection for the people who do this work.
Dr. Beckett and Patrick discuss:
- The well-being of content moderators and what some organizations are doing to protect their well-being
- The legal environment for community builders in Australia
- The need for better communication about the “toxicity” of communities
Our Podcast is Made Possible By…
If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.
Big Quotes
Facebook has issued a broad ban of news and essential information for its Australian users (4:53): “Facebook has used a really broad definition of what news is. They’ve blocked our Bureau of Meteorology, several emergency services, [and] Queensland Fire & Safety. … That’s problematic because there’s actually floods happening in Queensland at the moment. They’re using the platform to get really vital and immediate emergency information out. The domestic violence government support pages, they’ve all gone down. It’s really, really problematic.” –@JenniferBeckett
How Facebook’s ban of news in Australia may encourage misinformation and disinformation (6:42): “[This ban is] going to allow for a lot of non-news organizations to spread a lot of disinformation and misinformation because there’ll be nothing to curb that misinformation or disinformation in a community. If I’m running a community and somebody starts spouting QAnon conspiracy theories in my group, I’ve actually got no ability to now post fact-based, fact-checked news articles that debunk those conspiracy theories.” –@JenniferBeckett
Healthy communities can still be toxic to others (11:38): “The thing that makes [groups like the Proud Boys] so dangerous is, internally, they’re super healthy. They’re not toxic to their own members. They’re toxic to everyone else.” –@JenniferBeckett
Moderation isn’t about perfection (24:23): “Moderation is an effort that requires a lot of care, thought, and time. Even when you do your best, you’re not perfection.” –@patrickokeefe
When your job responsibilities undermine your personal identity (42:05): “Have you ever had to moderate content [or leave up content] that goes against your sense of self-identity and safety? I’m thinking about people of color and First Nations people, anybody in a minority group who’s doing this job, who suddenly has to remain professional while people are actually questioning their right to exist in many cases. Living in this constant state of cognitive dissonance can actually have physical ramifications, as well.” –@JenniferBeckett
About Dr. Jennifer Beckett
Dr. Jennifer Beckett is a lecturer in Media and Communications at the University of Melbourne, Australia. She researches online governance and the mental health of digital workers and teaches a Master’s level subject in community management. Before heading back to academia, she worked as an online and social media producer for the Australian Broadcasting Corporation.
Related Links
- Sponsor: Vanilla, a one-stop-shop for online community
- Dr. Jennifer Beckett on Twitter
- Dr. Beckett’s University of Melbourne profile
- The Australian Broadcasting Corporation
- Facebook will block Australian users and publishers from sharing news links in response to new bill (via The Verge)
- The Betoota Advocate, a satirical site that was blocked as part of Facebook’s ban on news
- Venessa Paech on Community Signal
- Australian Community Managers
- Community Signal Episode About Section 230
- A Community Management Perspective on the Violentacrez/Reddit Troll Story, an article about Reddit’s past, by Patrick
- Australia’s Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act
- Sarah Roberts, UCLA
- Arlie Russell Hochschild
- Quiip launches industry-first resilience training in response to overwork and burnout
- Dr. Beckett acknowledges that her research is done on the lands of the Wurundjeri people of the Kulin Nation
Transcript
[00:00:04] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Sponsored by Vanilla, a one-stop-shop for online community. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[music]
[00:00:25] Patrick O’Keefe: Hello, and thanks for tuning in. Dr. Jennifer Beckett has been studying the well-being of content moderators for years. That’s important to me, and we’ll discuss it on this episode as well as the legal environment for community builders in Australia and how it relates to the US and better communication around what toxicity means within our online communities.
Thank you to our amazing Patreon supporters for seeing the value in our show. This includes Jules Standen and Heather Champ and Rachel Medanic. If you’d like to become a backer, please go to communitysignal.com/innercircle.
Dr. Jennifer Beckett is a lecturer in media and communications at the University of Melbourne, Australia. She researches online governance and the mental health of digital workers and teaches a Master’s level subject in community management. Before heading back to academia, she worked as an online and social media producer for the Australian Broadcasting Corporation. Jennifer, welcome to the show.
[00:01:20] Dr. Jennifer Beckett: Thanks for having me.
[00:01:26] Patrick O’Keefe: It’s my pleasure. We happen to be talking on the very day, [chuckles] at least the very day that Facebook banned Australian Facebook users from sharing news articles no matter where the outlet’s located, and has banned articles from Australian news outlets from being shared to the site by any user no matter where they are located. Just to put that another way, I’m in the US, I tried to post a link to an Australian media outlet, I can’t. Then for people inside of Australia, you can’t post news at all. It was effectively banned, at least the major outlets. It’s a weird day to be talking. It not only extends to individual users, it extends to Facebook pages and Facebook Groups.
Of course, this is in response to regulatory measures in the country that were aimed at compelling companies like Facebook to pay news publishers because of news content shared on their platforms. Even in cases where news outlets might be promoting themselves and driving traffic to Facebook, the government wants Facebook to pay news outlets and support media within the country. I was thinking about the relevance to our show today. The immediate relevance is that there are community pros who use Facebook. Against my protesting, they are out there.
They use it as a platform on which to build community and many of them are in the media, some are in Australia, their work changed fundamentally today. A lot of those professionals won’t be able to link to their own websites. Just thinking about that community angle, what’s your reaction right now in the moment?
[00:02:46] Dr. Jennifer Beckett: My reaction is that it happened without any warning. Our treasurer, Josh Frydenberg, not my favorite human being, was actually on a phone call with Mark Zuckerberg earlier this morning and said that the conversation had been very productive, and that there was no indication from Facebook that they were going to drop the hammer today. I think it’s bad faith negotiation on the part of Facebook. It’s really designed to show the power that they have. I think what they’ve actually probably unintentionally done is shown to everybody the problem that that raises in terms of sharing information and things.
I think now people are probably a lot more aware of the outsized power of Facebook in these kinds of realms. I’m speaking as somebody who actually used to work for the Australian Broadcasting Corporation, doing the work of getting news articles onto Facebook and having conversations with members of the Australian public around really important issues. The Australian Broadcasting Corporation, by the way, just so everyone knows, is also the country’s emergency broadcaster. During the Black Summer at the beginning of 2020 and all through the COVID pandemic, floods, fires, anytime there is an emergency in Australia, the ABC is your emergency broadcaster.
The fact that the ABC now can’t share any content is really problematic from a perspective of, this is about helping to keep people safe. They have their own apps, and a lot of news organizations today were sharing information like, “This is where you can actually find us. We have our own website. Download our app.” I think that’s been really interesting. That’s all been happening on Twitter. Although one of our satirical papers, The Betoota Advocate, has also been banned, which is interesting, because it’s actually not breaking news, it’s comedy. It actually managed to post something on Instagram, I think. I’m not sure how it’s working across Facebook’s various owned apps.
It’s really, really problematic, and they didn’t just block news organizations, because the legislation is really broad about what it considers news. Facebook has used a really broad definition of what news is. They’ve also blocked our Bureau of Meteorology, several emergency services. I think Queensland Fire & Safety has gone down as well. That’s problematic because there’s actually floods happening in Queensland at the moment.
They’re using the platform to get really vital and immediate emergency information out as well. The domestic violence government support pages, they’ve all gone down. It’s really, really problematic. This is the funny part. They cast a really wide net, and they’ve actually taken themselves down. They’ve taken their own Facebook page down [laughs]…fun.
[00:05:32] Patrick O’Keefe: Within Facebook Groups, which I think is where most people who try to build community within Facebook probably are, what do you think this means? Especially if you’re within Australia, you can’t post a link to what is broadly defined as a news organization, however many URLs or domain names that is. Who’s left? Who are the links that you can share? How does that impact the information flow within these Facebook Groups for people in Australia?
[00:05:58] Dr. Jennifer Beckett: This is a really interesting question because Facebook groups are places that people meet to share news about a range of things often. Say you are in a disability support network, or you’re in a health Facebook group like I am, for health conditions or whatever. In those groups, we share news with each other to keep people informed and up to date about new things that are going on and to have those conversations. Now, not being able to share that kind of information in those groups is going to be really, really problematic for people who use those groups and that information to seek support around healthcare and various other things. That’s one problem.
The other and probably more immediate problem is that it’s going to allow for a lot of non-news organizations to spread a lot of disinformation and misinformation because there’ll be nothing to curb that misinformation or disinformation in a community. If I’m running a community and somebody starts spouting QAnon conspiracy theories in my group, I’ve actually got no ability to now post fact-based, fact-checked news articles that debunk those conspiracy theories in my group. Because Facebook groups tend to have stronger social ties, because they are more about building community rather than the one-to-many push that you use a platform like Facebook for more broadly if you’re a news organization or a social media professional or brand.
The problem then becomes I’ve got no way to stop that taking hold in my community. Because the ties are so strong, it’s so easy for it to get bedded down in that community. That’s going to cause a huge amount of problems for people.
[00:07:43] Patrick O’Keefe: I get the rationale. Who knows, though. I feel like it might end up being a miscalculation by Facebook. If you do something people don’t like and you cause them harm, it doesn’t matter if it’s for five seconds, five minutes, five hours, five days. They’re not going to like you again. It’s going to be hard to regain trust that has been lost here. You said something interesting. You mentioned QAnon. QAnon, from my perspective, I see it as a very US-focused thing. It’s a rebranding of a lot of anti-Semitic tropes put into place. Also, it’s relevant to Donald Trump and a secret person within the government and all of these things. Is QAnon something that is prevalent in Australia? Is it something that actually crops up in Facebook Groups in your communities?
[00:08:28] Dr. Jennifer Beckett: Yes, it is, because one of the great things about conspiracy theories, or should I say, one of the not so great things about conspiracy theories, is that they’re really malleable. They’re designed to flow and curve around different kinds of circumstances. We’ve seen how QAnon has obviously caught on a lot in Australia, although we won’t be seeing the effect on Facebook anymore. It’s able to veer off in different directions depending on events that happen. The thinking around the 45th’s loss suddenly becomes, “Oh, actually, no, it was this thing. This thing has happened.” That’s the thing around conspiracy theories.
What has come into Australia, we’ve actually imported a lot of that. Let’s be very clear, there is a lot of racism in this country. It’s really endemic. We have massive problems with it. In fact, Australians have often been leaders in these kinds of communities, things like Men Going Their Own Way, those communities we would consider more toxic to society. Australians have often been some of the leaders in those communities. It’s very easy for them to import that stuff and start folding it around Australian policies and politicians. It’s just adapted for the environment.
[00:09:45] Patrick O’Keefe: Got you. That makes a lot of sense. [chuckles] Great import from the US of A. Let’s stop for a moment to talk about our generous sponsor, Vanilla.
Vanilla provides a one-stop-shop solution that gives community leaders all the tools they need to create a thriving community. Engagement tools like ideation and gamification promote vibrant discussion and powerful moderation tools allow admins to stay on top of conversations and keep things on track. All of these features are available out of the box, and come with best-in-class technical and community support from Vanilla’s Success Team. Vanilla is trusted by King, Acer, Qualtrics, and many more leading brands. Visit vanillaforums.com.
I wanted to mention this. I think there’s some relevance to eSafety legislation, which I want to talk about as well. I know that you’ve been working on, to quote you, “The discussion of online toxicity via a community health perspective and the need to differentiate what we mean when we say that. Because what we might see as toxic from the outside is actually perfectly healthy on the inside. That’s a really huge part of the concern.” Talk about that. Talk about toxicity, that phrasing, and the confusion that you are trying to avoid.
[00:10:58] Dr. Jennifer Beckett: I think one of the things is that anyone who’s worked in the community space, which obviously your listeners all have, and I have as well, knows that community health is relative. It’s relative to the particular community. If I’m building up a community of QAnon supporters or the Proud Boys or whatever is toxic to society. This is why, when I say it like this, we tend to just look at something like that and go, “Oh, they’re toxic.” When we do that, then we miss how they fly under the radar in things like Facebook groups, how they’re not picked up. What is it about them that actually makes them so dangerous?
The thing that makes them so dangerous is, internally, they’re super healthy. They’re not toxic to their own members. They’re toxic to everyone else. I think when we talk about toxicity, we need to think about, “What are we actually referring to? Let’s differentiate that out.” Then think about how that works, then, from a governance perspective on a platform. Because with a lot of platform governance, we think of governance as being this stacked thing. You’ve spoken to Venessa Paech from Australian Community Managers before; she thinks about it as this nested approach to governance and moderation.
If everything in your platform is designed to have people report bad behavior, then if the group that you’re looking at is really, really healthy, no one in that group is going to report to you that the community guidelines of your platform are being breached, because that would be a breach of trust of the community norms established in that individual community. I think that’s part of the problem. That’s what helps them to skate along below the radar. Also, I think it has this problematic thing. I really appreciate what Reddit did earlier on in the year where they de-platformed a whole bunch of Reddit groups.
The thinking behind that was free speech takes you so far, but when these groups start to become toxic and dangerous to society, they have to go. We can’t platform them anymore. I think it’s really important to remind people that platforms are private corporations. They actually legally can, even under the First Amendment, decide what is said on their platforms and what is not said on their platforms. Moderation is doing that job. I think it’s disingenuous when platforms don’t acknowledge that they’re already operating outside of the bounds of that. I want to be really open here and say in Australia, we don’t actually have a right to free speech.
In the Constitution, what we have is what’s called an assumed right to free political expression, which operates in a similar way, but it’s not enshrined in the same way as it is in the US. What’s interesting from an Australian perspective is we’re being governed from a US perspective of speech. This probably goes beyond what your question was.
[00:13:56] Patrick O’Keefe: No, no. It’s true because, for us, moderation is the First Amendment. Moderation is not only protected by 230, but it also comes up when people say, “I want to be allowed to say this thing on this platform.” I especially liked when fact-checking started entering into the conversation with the major platforms. When Twitter fact-checks Donald Trump, that’s Twitter exercising its First Amendment rights. If you’ve got a problem with that, your real problem is with the First Amendment. It’s not with Section 230, because Twitter has the ability to say, “Hey, we’re going to fact check this.” This is Twitter’s opinion; this is Twitter sharing content.
You’re liable for the content you yourself create, but it’s protected by the First Amendment in that case. Reddit’s such an interesting example because Reddit has had to do a 180 to come to that realization of saying, “We’re going to de-platform these things,” because if you go back in time, Reddit, especially the founders, especially Alexis Ohanian, they profited off of terrible things. The sexualization of children is the example I like to point to. With the jailbait subreddits, there was reporting that came out and showed that they knew it was going on.
They knew what was happening. They just said, “We’re going to let these separate moderators do this thing. We’re going to get the traffic, we’re going to add that into our total daily average users, monthly average users. We’re going to sell out to Condé Nast, we’re going to profit, and we know what’s going on.” Even when they took it down, it was because the media was banging at their door. It was like, “We didn’t want to do this.” Even on something as obvious as that. We generally agree, we don’t want that content.
They were a totally different thing culturally. Now, they’ve had to do, like I said, a 180 to come to the point where, “Now we’re taking responsibility for this space a little bit better. We’re going to remove these things. We’re not just going to say, yes, we know this is happening, but we’re going to put our head in the sand.” Of course, they just raised a big round of funding. The world keeps spinning. It is an interesting example because they have turned that around, which I think is a credit to them. You referenced the difference between how we look at the First Amendment and how the right to political speech or free speech works over there. How does that impact platforms and online communities in a way that’s uniquely Australian and different from what we might see over here?
[00:16:06] Dr. Jennifer Beckett: I don’t necessarily think that it impacts our communities per se. Except that, as moderators, as community management professionals, we do have to spend a lot of time reminding Australians that we don’t have a right to free speech, because that’s the argument that’s always given to you when you moderate something off your platform. They say free speech, and you’re like, “Not how that works here.” I think that that’s the biggest thing. I think, also though, there’s no such thing as 100% free speech anywhere in the world. That’s just a fallacy. One, because, as we can see on platforms already, we moderate. As you said, that is the free speech thing, but that’s impacting on other people’s supposed free speech.
There’s so much in law that we also regulate. There’s laws around hate speech. There’s laws around inciting violence, all of those things. I don’t know if this counts in America, but in Australia, we have laws around those things, that you can’t go online and do a racist scream at somebody. They can charge you under that. There’s legislative requirements that say that’s not acceptable speech. There’s also things around misinformation. There’s legislation that is around saying that’s not an acceptable use of speech. I think the simple thing it comes down to is, you don’t have the freedom to walk into a theater and yell fire. That’s what 100% free speech would allow you to do.
[00:17:34] Patrick O’Keefe: Right. For listeners in the US, we have 230 obviously. The liability for most speech, there are exceptions to 230, falls on the speaker. Not so much the platform, not usually the platform that it’s hosted on. How is that different?
[00:17:46] Dr. Jennifer Beckett: In Australia, that’s the same because the platforms we’re using tend to have been created in the US. By default, we have that governance system already in place. This is what we mean by nested governance. We are already abiding by everything that the platform already has. Then we might create a community on something like Facebook and then put in other governance regulations there.
[00:18:08] Patrick O’Keefe: Is there any additional liability that comes from, say, being someone who’s fully operating in Australia versus using a platform based in the US? You’re hosting your own community, something like that.
[00:18:16] Dr. Jennifer Beckett: With hosting your own community, yes, then that’s problematic. In fact, actually, we saw in the last couple of years that news organizations, going back to news organizations, were actually successfully sued for defamation based on the comments and things that had been posted on their Facebook pages and various other social media platforms. It’s happened a couple of times. It falls on the publisher of the defamatory content here. We also have what’s commonly called the Sharing of Abhorrent Violent Material Act, which came into force post the Christchurch massacre. That is basically designed to force anybody who is hosting that kind of content to take it down rapidly or face sanction.
[00:19:01] Patrick O’Keefe: Now, we’re getting into eSafety legislation. I want to talk about that, or at least part of it, through the lens of a small online community operator. When we talk about laws that govern this work throughout the world (obviously, I talk about the US a lot), I always like to think about how it would impact the teenager starting an online forum, because that’s who I was. I was 13, 14, 15 when I got started. I launched my own online forum. I moderated it pretty well, honestly, in hindsight, and really tried hard. Most people who operate online communities are closer to that teenager than they are to Facebook.
Anytime we talk about laws, I like to think about, how is that person impacted? With these laws, I’m talking about these more recent efforts, especially. Over here, the main problem our legislators have is they think Facebook equals internet. They’re like, “I’m going to write a law, and it’s going to be about Facebook.” Yet the 15-year-old starting a forum would be subject to the same law, and they don’t have private investigators or whatever the latest 230 amendment is calling for. They just like Roblox or something, or they just like playing Minecraft, and they want to start a forum to talk about it. What is the current climate over there? Are teenagers in Australia, are those sorts of people, able to host online communities? Is that at risk? Where do you see it going?
[00:20:18] Dr. Jennifer Beckett: Look, I think that’s a really, really gray area, because legislation, particularly in the tech space, always falls well behind what people are actually doing, because it takes time to get stuff across. I do have concerns, particularly, if we go back to that defamation example. If teenage Patrick was hosting an online community in which people started to post defamatory content, then teenage Patrick could be sued for a not insignificant amount of money. You’re not Murdoch.
[00:20:54] Patrick O’Keefe: [chuckles] No. Frankly, I’m probably going to remove most things fairly quickly. Hey, I had to go to school on Monday. [chuckles] I didn’t get to it until Wednesday. I had homework, and it’s not like anyone was being murdered. It was maybe like piracy, or maybe someone said they didn’t like somebody or something. I got to it, but it was like a day later.
[00:21:11] Dr. Jennifer Beckett: Also, does teenage you know the intricacies of defamation law?
[00:21:15] Patrick O’Keefe: No, adult me doesn’t.
[laughter]
[00:21:17] Dr. Jennifer Beckett: That’s the thing. When we think about younger people starting communities and smaller platforms, often, for young people starting communities, you’re not aware of what the legislation is. You don’t know what’s protecting you, you don’t know what’s harming you, and you just don’t have the knowledge to be able to necessarily work with all of these legislations. There’s a lot of stuff that comes at you, which is why the job of moderation is so specialized, and people just do not acknowledge that. You have to be across all of these different things.
The new eSafety bill has gone through a process of consultation, and I’ve put in, with Australian Community Managers, a response to the draft legislation. That bill is actually designed to bring a lot of disparate legislation around safety online under one bigger umbrella. There are some problems with it because there’s always problems with legislation. Because legislation, the law, is often a very blunt tool. As you and I know, community is a very not blunt tool. It’s very nuanced. The legislation is more about something like generalized safety online.
What it does is it looks at things like cyberbullying of adults, so we’ll have the world’s first legislation on cyberbullying and abuse of adults online, and it offers recourse. There’s a lot of stuff there around taking down revenge porn or non-consensually shared images; all of that kind of stuff is under that. There’s been separate legislation, but it’s all being pulled in. As a community manager, you have to understand what those takedown orders would mean for you. What they’ve done is they’ve standardized takedown orders, shortening them from 48 hours to 24 hours, which brings us into line with most of the European takedown orders for this content.
As a community manager in a group, you’d have to be responsible for that. I think it’s going to affect owned communities more, rather than people running communities on big platforms, because it’ll be more on the platform to take it down at that point. Does that make sense?
[00:23:32] Patrick O’Keefe: Yes, that makes sense. To me, it sounds bad. [chuckles] One thing we have happening over here too, and I was talking about this today with someone, is just that people are going after 230. I’m not an absolutist, I guess would be the word. I’m not against change of things. I’m open to hearing ideas and how things might work. You can’t get around the fact that the people that are helping write these laws in some cases are the Facebooks of the world. They are writing laws that they can follow. It’s a tough thing because in a scenario like that, like you mentioned, there’s nuance in community building, and I want as many bad actors as possible off the internet.
Anybody who’s not trying, who doesn’t care, who’s creating spaces that lead to people getting harmed and especially knowingly doing so and profiting from it, I don’t have a lot of compassion for them. As you know, moderation is an effort that requires a lot of care and thought and time. Even when you do your best, you’re not perfection. That’s especially true for a kid or a small org or somebody trying to start an online community, that’s a single operator, which again, is most communities throughout the world. What happens is two things.
Number one, you are so sensitive that you remove everything. Anything, anybody whispers a threat to you, the big, the powerful, they whisper a threat to you as a person who doesn’t have a lot of power, and you’re taking that down. You can’t fight those people. You don’t have the money, or you just give up. You don’t try, or you just shut it down. In which case, who wins? I think there’s a debate to be had there. Is it a net positive for society that you give up, and it falls into the hands of people who maybe are bigger players or who have a certain amount of money?
Is it a net negative to society and creativity that you don’t feel like you can start a hosted platform, so you end up in what is essentially a monopoly of social media companies? I have to use either Facebook or I have to use YouTube because they are the big platforms that I can safely create a community on, where they’ll be responsible for all removal and will ultimately bear the biggest cost. I won’t host it, so they continue to win? Very complex. No answers here. [chuckles] I think there’s a lot to talk about there.
[00:25:44] Dr. Jennifer Beckett: I think one of the things I should clarify is that the takedown orders come after a complaint is made to the office of the eSafety Commissioner. We actually have– it’s an organization, independent of the government, that deals with eSafety. It started to protect the safety of children online. It was really about getting child pornography down and dealing with cyberbullying of kids and all of that stuff. I just want to be very, very clear that it’s not just anyone who can tell you to remove content. It’s going to come from the office of the eSafety Commissioner. When you make those complaints, you also have to show them that you’ve gone to the platform itself or to the community owner and asked for it to be taken down, and nothing has happened. It’s a staged thing.
[00:26:28] Patrick O’Keefe: Makes me feel a little better. [chuckles]
[00:26:30] Dr. Jennifer Beckett: They do investigations, and they look at context, and they do a range of different things. Having that as a regulator at the top of it, I think, is really important in protecting those smaller community owners. Obviously, there are instances where they might go, “You have to take it down,” and there are instances where they might say, “We’d strongly recommend that you do.” One of the things that we’re concerned about is every time someone reposts the content, you’re going to have to make a new complaint to the platform, and then you’d have to make a complaint, again, to the office of the eSafety Commissioner.
I think that that’s difficult for a couple of reasons. One, I don’t see a way around that loop. The problem then becomes that we know complaints are often made in communities to take things down, and they’re actually an abuse of process. They’re another way to just continue abusing the person by forcing them to go through these continual loops and getting them banned and having all sorts of things done because you’ve made these complaints about them. I think it’s very important for us as community managers, just more broadly, to understand how abuse of process can work and the impacts that it can have on people.
I think that the big social media platforms just don’t understand how that operates at all. I think those of us running smaller communities and owned communities have a real opportunity to think ethically about how our flagging processes work and what happens when people get flagged and how we make those decisions. I think that’s really, really important for the mental health of people on your platform, and also for getting rid of bad actors. You don’t want to get rid of somebody who’s actually the victim of abuse, and leave the abuser in.
[00:28:14] Patrick O’Keefe: If I were to ask someone, “What’s an online community that changed your life? What’s an online community that you really love, that gave value to you, that brought you through a tough time?” I think it would be almost unanimous that the community they named would be much smaller than Facebook’s user base. [laughs] That’s not a coincidence. I think a lot of the best stuff happens in smaller groups. More often than not, what I find is those people are trying to do their best, and they’ve realized that there is a limit, I think for a lot of people, on the volume of people at which you can build truly great communities.
Honestly, good moderation sometimes slows down some of that growth, which is why a lot of startups don’t do good moderation, I think, and they just prioritize growth, and so they ignore that for a while until down the road. Speaking of Reddit and their scale. There’s definitely something to it. There are so many good oases where, to your point, the abused find solace, focused niche communities that exist to serve them and their unique needs, communities they just don’t get from Facebook. I am, as I’m sure you are, a diversity-of-the-web person. I want to see more and more and more, which makes me feel a little better.
Hopefully, they’ll course correct. I think we need to do that in the US. I think with SESTA-FOSTA, probably what came from that was just a lot of safe spaces for sex workers being shut down out of fear when they passed that legislation here. It’s still on the books, but I think that people are slowly learning about that, and maybe it’s made them a little more thoughtful about future changes and hopefully, they will course correct too. I want to talk about the well-being of moderators because I could continue down this political road forever. I want to talk about moderator well-being because I know you’ve studied it. How long have you studied it for?
[00:29:51] Dr. Jennifer Beckett: I’ve been looking at this now for the past five or so years, because I came back into academia from having worked in industry. I drop in, and I’m like, “What is the number one thing that I think is the most important thing from my previous industry to have a handle on?” For me, that was the impact of doing that work, and there are people like Sarah Roberts at UCLA. I’d say she more broadly looks at the political economy of moderation: where people are situated, the volume of work they’re doing, the types of decisions that they’re making. She glosses briefly over some of the impacts of that, but it’s not really, from what I can see, the focus of her work.
Whereas, I just go, “That’s going to be the focus of my work.” More specifically, everyone looks at the sexy big platforms because there’s so much data, and I’m just like, “You know what? They do a lot of content moderation, but they don’t do the majority of content moderation on the web.” That’s done by community managers, social media managers. They’re all working in disparate kinds of spaces, or they’re volunteers. It’s in the small Facebook group that you set up to support people. It’s in the news comments, like the comment section of a news website. It’s your brand on social media. People have moderation as part of their jobs. It’s often not recognized that it’s part of their jobs.
They’re invisible when they do it, as moderators are, which is, I think, problematic and something that’s not discussed enough, like how that can increase the chances of people acting badly. You’re coming at an organization, not at a person, and so that depersonalization process that moderators go through, it protects the moderator to some extent. In doing that, are we also creating a situation in which it’s easier for people to post bad things? I think that was my big takeaway. It was like, “I’ve moderated content on news sites now for four and a half years.” Yes, it’s not pretty, and there were days where it did affect me, and I would go home and shut the door in my apartment and be like, “I’m not leaving for the rest of the evening because I just can’t with humans today.”
Then there are other days where I was like, “Humans are the best.” Again, people are just not thinking through that, and they’re not thinking through things like, when we talk about moderators, everyone’s like, “Oh, it’s just about taking down content,” and that’s not what moderation is. The best way to describe it is, yes, it’s part of what you do. Moderation is really like when you go to a panel discussion, and there’s always the person on the side making sure that everyone’s getting a chance to speak equally or shutting down conversations when they get too out of scope or people are starting to be abusive. That’s a moderator. They’re literally called a moderator.
That’s what a moderator online does as well. That’s part of the whole job. It’s very skilled. It requires a lot of the knowledge we were talking about before, legal issues and all sorts of stuff as well. Too often, though, when we look at research, it’s given to people who are straight out of university. It’s the thing the intern does, it’s the thing that the young person does. One of the things that I started to think about was like, “Are we missing so much of these harms just because the people are in these low-level roles?” There’s so much churn that we’re missing when people leave because they can’t cope anymore.
[00:33:25] Patrick O’Keefe: When you’ve studied moderator well-being, have you focused on a particular platform or a group of people working for a particular company? Who is that group of people, and what type of companies do they tend to work at?
[00:33:39] Dr. Jennifer Beckett: The types of companies they tend to work at are news publications, and they work for brands. I’m trying to think of a brand whose name stays the same in America and Australia. Say they work for Johnson & Johnson, and they’re doing the Dove Australia social media, or the Dove Australia Group as the community manager. That’s the kind of people that I’m looking at. I’m not so much interested in, “Let’s have a look at all of Reddit’s moderators.” That’s not what I’m interested in. I look across a range of different industries, and what I’m trying to do is look at how it affects the average digital worker who has this as part of their job. Community managers tend to fall under that.
I’d like to just acknowledge that my research is done on the lands of the Wurundjeri people of the Kulin Nation, and I’d like to pay my respects to their elders past, present, and emerging and also, pay my respects to any First Nations people who might be listening to this podcast.
[00:34:39] Patrick O’Keefe: In that time, you’ve been studying it for five years or so. Has anything changed? Has it gotten better?
[00:34:45] Dr. Jennifer Beckett: I think people are more aware of some of the harms, but they tend to think that it really only affects people like Facebook moderators. They still haven’t made the connection between large-scale commercial content moderation and the person moderating your Facebook group. They haven’t made that connection at all, and they haven’t made the connection between the kinds of work that community managers and moderators do in terms of managing other people’s mental health in a community as well. There’s an enormous amount of emotional labor that goes into it. I mean that in the true Arlie Hochschild sense of the word.
It’s like you having to moderate your own emotions as part of your job, for which you are paid. There’s a lot that goes on. I think it’s coming more to the fore, but it’s ignored down the line. I think places like the ABC have actually started doing some amazing stuff. The ABC, the Australian Broadcasting Corporation, has actually hired somebody to be the well-being coordinator, to work specifically in a well-being role for all of their digital workers, all their social media managers. They’ve moved into a community management model now, rather than a social media management model, which is going to work really well for them.
They can’t post any news on Facebook, but they’ve been developing their own platforms, which is great. They brought somebody in to do that. In Australia, we’ve also got organizations like Quiip, which has started to roll out resilience training to people who work across various corporations. You can actually have them come in and teach you what resilience is and how it works. I think in Australia, we’ve done quite a fair bit of work in this space, making sure that we’re starting to address some of these concerns, but that is all really quite new.
[00:36:31] Patrick O’Keefe: You just mentioned bringing in someone as the ABC does, which is great. What are some other things that the best companies are doing, the ones who get it, or even the ones who are starting to get it and are making changes? What are the things that you’re seeing that they’re doing?
[00:36:45] Dr. Jennifer Beckett: I’d say the ABC is a company that gets it. It’s a big media organization. It gets it right. I’m not convinced that I’ve seen other organizations get it just yet. I think they’re starting to. I’d say that news organizations, in general, seem to be starting to get it. I think that that’s because they’ve always had issues with PTSD, say, in reporters and things like that. They’ve always been across some of these issues. I think they already had foundations in place that they could start to plug stuff in around moderators and community managers who worked for them. I really don’t see it happening very well in other places.
The stuff that I see that does work is literally believing people when they tell you that this is what is happening and this is what it means. There’s that old thing, believe people when they tell you. People need to be making sure that their community managers, their social media managers, are at the table because they’re often just not at the table when discussions about policies and various things are being had. They’re just left out because, you know, it’s like this silo of other weird stuff that happens over here. I think that’s the problem. To me, the biggest problem that I see is just access to that care, because so much of employment now is debased employment, like you’re in insecure employment.
Even if it’s there, are people going to access it? If you say, “I’m having a really hard time, and I’m now going to take up all of these company funds in getting mental health care through the EAP,” or whatever it is. Then the fear is, I’m not going to get the work. I’m showing people that I’m weak, that I can’t do this. It goes against everything that you would tell somebody about resilience. Resilience is not just about keeping on going because you have to; it’s about recognizing when you need to stop and seeking help.
The fear is that, in the gig economy, in this neoliberal world that has decided the best idea is to make sure everyone’s in insecure employment because it benefits everybody else, the people who need care don’t get it.
[00:38:59] Patrick O’Keefe: This just goes back to a basic thing because what I was thinking about was like, “Someone’s starting to get it, what do they do?” One thing I’ve heard, which is very basic, is just being able to provide cover. Especially when you’re talking about community managers, maybe there’s one person who has to see this stuff, and no one else has a real strong idea of what it is that this work represents. They might say, “Oh, yes, I can hop into the queue and moderate some posts,” but they don’t have any idea of what they’re doing or what it means or how to apply those standards.
They’re not really trained to step in and do the job well. That makes you even less likely to take time off. Even just having someone else who can step in for you, which is important in a lot of ways, just in a selfish business way. I forget what it is that people say. I think unkindly, it’s something like the hit by a bus rule or something. It’s like if this one person disappeared tomorrow, who would step in? Just from a selfish business perspective, you want redundancy in your business. If you think this is an important core product, you don’t want to have it all on one person. It’s a positive for multiple reasons, not just for their mental health, even though that’s a vital, important thing.
I hate to even try to reframe it that way, frankly, because it sounds terrible, but it’s true. They need to have other people who can do this work and who can step in and take care of it ably. Frankly, a lot of people don’t even have that.
[00:40:18] Dr. Jennifer Beckett: Yes. I think the other thing that’s really, really important is to have that not be like, “Oh, I’m sick for the day,” or whatever, and somebody can step in and do it today, but to recognize that there are going to be days where content is posted that attracts particularly virulent responses, and to maybe not have the one person moderating that all day. Break it down; say you moderate it for an hour and you don’t look at it for the rest of the day.
You’re not dropping people into the bear pit and going, “Off you go.” Also, provide mechanisms for unofficial support, like peer support and various things. Open that up and have that available for people to form their own groups and their own networks to talk through this stuff. Be okay when they go, “Hey, you know what, we need some resources.”
[00:41:08] Patrick O’Keefe: Yes. That’s obviously super relevant to the media, and I get why you flag it, because I’ve talked to people who are moderating in the aftermath of a terrorist attack or Brexit or a major political shift or something that represents nationalism or populism. Those days, like when Donald Trump was elected (thank goodness that’s over), there’s a different tenor that day. Because he may have represented an attack on who they are as a person, it adds to the burden of the day because they themselves are dealing with stress about their own identity and about their own safety moving forward.
Now, they have to step into the community, the comments, whatever it is, and moderate people who either don’t care about that or also have that problem. It just adds to the burden of the day. I can imagine why that would be such a pressing example.
[00:42:01] Dr. Jennifer Beckett: That’s completely it. Because one of the questions that I ask people when I speak to them is, have you ever had to moderate content and leave it up, which you often have to do, or even just moderate content that goes against your sense of self-identity and safety? I’m thinking about people of color and First Nations people, anybody in a minority group who’s doing this job, who suddenly has to remain professional while people are actually questioning their right to exist in many cases. Living in this constant state of cognitive dissonance can actually have physical ramifications as well.
There’s a reason why confirmation bias is a thing. When we see things that we don’t agree with, it sets off that cognitive dissonance, that sense of discomfort. We immediately start to seek things that confirm our sense of identity and our position in the world. If you have to work under those conditions, that’s really problematic. This is where we need more diversity in hiring teams and more diversity in community management teams and social media management teams, so that those concerns can be front and center and we can work to mitigate those effects on people.
[00:43:19] Patrick O’Keefe: Jennifer, thank you so much for your time today. I’ve really enjoyed the conversation – deep at times, uncomfortably funny at others, perhaps. A pleasure to have you on. Thanks so much.
[00:43:28] Dr. Jennifer Beckett: Thank you.
[00:43:30] Patrick O’Keefe: We have been talking with Jennifer Beckett, a lecturer in media and communications at the University of Melbourne. Follow her on Twitter @JenniferBeckett.
For the transcript from this episode, plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad, and Carol Benovic-Bradley is our editorial lead. Until next time.
[music]
Your Thoughts
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.