The Role of Credibility in Community Management
With a global network of content moderators, Facebook may well be the largest employer of community professionals in the world. But even with these resources, its content moderation practices continue to make headlines. Outsourcing this work barely seems to help Facebook keep up with the volume of content that needs to be reviewed, not to mention the toll it takes on the often undervalued and underpaid people who are responsible for it. Ben Whitelaw, the engagement lead for the Engaged Journalism Accelerator, asks: when is Facebook going to start taking bigger risks to solve this problem?
While Facebook's moderation practices have lots of room for improvement, Ben also shares how the platform proved to be an asset when Times readers needed a space for discourse around Brexit. For newsrooms organizing communities today, Ben notes that Facebook's ease of use makes it simple to spin up new groups and show proof of concept, but that this isn't a full solution for long-term reader engagement. Patrick and Ben also discuss:
- How Brexit played out in the comments section at The Times and led to internal advocacy for readers
- Scaling the work of moderation and the importance of consistency and credibility in community management
- Whether or not big platforms like Facebook should be allowed to self-moderate
Big Quotes
On public policy playing out in the comments: “Our columnists who wrote about Brexit would get 1,200 comments on an article. … Some of [the comments] were very good. Some of them were very informed. The Times has got a pretty interested and educated audience. You would have lawyers and people who work in policy, people who are civil servants often anonymous, comment on intricacies of Brexit, and it made for very interesting discussion. … There was a real passion for the topic and a need for it to be heard. Obviously, in hindsight, we see that … legacy media in the UK didn’t really highlight some of the voices of people outside London who were wanting Brexit to happen, and that was playing out in the comment threads under articles.” -@benwhitelaw
The potential impact of governance of the web: “I am afraid that Facebook will trigger something legally that will harm online communities as a whole. I think we sometimes forget that the vast majority of online communities do not have Facebook’s problems and are probably fairly well managed and are one person operations with volunteers. … Government intervention, if it happens, really needs to be targeted at people above a certain scale, above a certain size. … If it falls hard on small communities, it’s really going to stifle a lot of voices.” -@patrickokeefe
About Ben Whitelaw
Ben Whitelaw is the engagement lead for the Engaged Journalism Accelerator, a program run out of the European Journalism Centre, where he is responsible for the growth and engagement of grantees and content programming. The Accelerator is a $1.7m program that provides funding, mentoring, resources and events to support a more participative kind of journalism across European news organizations of all sizes. It recently announced its first twelve funded news organizations.
The program is supported by the News Integrity Initiative at the Craig Newmark Graduate School of Journalism and Civil, the blockchain-enabled journalism community. Prior to joining the accelerator, he was head of audience development at The Times and The Sunday Times in London. Over seven years there, he built an award-winning team, ran several high profile editorial campaigns, including Cities fit for Cycling, and helped both newsrooms reach 500,000 subscribers before leaving in 2018.
Ben has also worked at The Guardian as a health and government IT journalist and set up Wannabe Hacks, an online platform dedicated to helping young journalists get into the industry. He writes a weekly newsletter about content moderation on the web and the policies, people and platforms that make it happen. Inevitably, it is called "Everything in Moderation."
Related Links
- Ben Whitelaw’s website
- Ben on Twitter
- Everything in Moderation, Ben’s newsletter about content moderation
- Wannabe Hacks, a platform dedicated to helping young journalists find their way in the industry
- The Engaged Journalism Accelerator and the European Journalism Centre
- The Engaged Journalism Accelerator’s first grantees
- The Correspondent
- Jay Rosen on Community Signal and The Daily Show with Trevor Noah
- Spaceship Media, an organization using journalism to spark discourse between communities at odds
- Mary Hamilton and Bruce Ableson on Community Signal
- The Knight Foundation
- Wikipedia page for Section 230 of the Communications Decency Act
- The Cleaners, a documentary about the people responsible for content moderation across the web's biggest social media platforms
- MeeTwo, an app that helps teenagers talk about difficult things
Transcript
[00:00] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[00:00:22] Patrick O'Keefe: Hello, and welcome to Community Signal. On the show, we talk to so many people interested in bolstering journalists and supporting the practice through a stronger community connection. Ben Whitelaw is working on one such effort. On this episode, we're talking about how much the news media should trust Facebook, whether big platforms can be left to their own devices to moderate, and whether or not you can serve an audience you aren't a part of.
Thank you to everyone who supports our show on Patreon, including Rachel Medanic, Luke Zimmer and Maggie McGary. If you’d like to learn more, visit communitysignal.com/innercircle.
Ben Whitelaw is the engagement lead for the Engaged Journalism Accelerator, a program run out of the European Journalism Centre. He is responsible for the growth and engagement of grantees and heads up program content and resources. The accelerator is a $1.7 million program that provides funding, mentoring, resources, and events to support a more participative journalism across European news organizations of all sizes. In October, it announced its first four grantees from Spain, Greece, Ukraine and the UK, and they will announce a further eight at the end of January. The program is supported by the Craig Newmark Journalism School at the City University of New York and Civil, the blockchain-enabled journalism community.
Prior to joining the accelerator, he was head of audience development at The Times and The Sunday Times in London. Over seven years there, he built an award-winning team, ran several high-profile editorial campaigns, including Cities fit for Cycling, and helped both newsrooms reach 500,000 subscribers before leaving in 2018. Ben has also worked at The Guardian as a health and government IT journalist and set up Wannabe Hacks, an online platform dedicated to helping young journalists get into the industry. He writes a weekly newsletter about content moderation on the web and the policies, people, and platforms that make it happen, called Everything in Moderation. Ben, welcome to the show.
[00:02:04] Ben Whitelaw: Thanks so much Patrick, great to be here. Glad to be part of the show.
[00:02:10] Patrick O'Keefe: Glad to have you. I have had a lot of people on this show who are working to help news organizations better engage with the people they serve through a mix of tools, strategy, and funding. There are a lot of folks trying to solve this problem with an eye on a larger problem, which I would say is the financial sustainability of news. It seems like we haven't quite seen the dawn yet, at least judging from the number of organizations that are cropping up to address this. What do you think it will take to turn the tide?
[00:02:38] Ben Whitelaw: Well, there are a lot of factors that play into that, I think, but one of the big things that I'm really interested in is having more organizations doing this work. I really don't think we've proven out the idea that having a more engaged audience is necessarily going to bring about financial sustainability. It's probably one of the best shots we have right now. I think there is a lot of really good work happening in the US, and the work that I'm doing with the accelerator team in Europe is starting to explore these ideas and produce some case studies that we can then evaluate and show.
I think only at that point can we start to really make the case that this is a route that we should be going down. We should be involving the community in the process of journalism, in the co-creation of stories and giving them a voice that they really don’t have or haven’t had since the dawn of the internet. Which is obviously ironic because, the internet is meant to bring about a dialogue and a conversation which has really never taken place.
[00:03:35] Patrick O’Keefe: Getting outside of the US. You were at The Times of London and I know that you received some press around your use of Facebook and how you converted that to paying subscribers. But, getting outside the US, what are some of the news organizations that you think are doing this really well?
[00:03:52] Ben Whitelaw: There is a whole swathe of organizations who are doing different aspects of this, what we call engaged journalism: trying to bring the community closer to the process of doing journalism.
[00:04:03] Patrick O’Keefe: Let me rephrase it. What are the ones that you look at and I don’t know if, “looking up to” is the right word. But, you’re like, “This is great, these are sort of the models,” is anyone out there?
[00:04:11] Ben Whitelaw: I would say that there are probably two that come to mind. I think the FT in the UK have been very strategic with the way that they engage their audience. That is obviously super specific; they're looking at a well-off, more mature, business-focused audience. But they haven't necessarily wasted time trying to do frivolous experiments. They have been very strict with themselves. They focus a lot on newsletters. They focus a lot on involving readers in comments, actually, underneath the articles. They have got a pretty coherent community there and so, they have done a really good job just by being very strict and not being too all over the place and scattered with what they're doing.
Then, obviously, in the last few years, you've seen The Correspondent in the Netherlands as well, who have really changed the conversation around bringing these communities into the news process. For those who don't know about them, they started up, I think it was in 2014. They had a big crowdfunding campaign. They recruited members, and those members are really involved in the process of news-gathering, of sourcing, of generally just having conversations with reporters about what they're interested in, and then that has expanded. Last year, actually, they did another crowdfunding campaign to bring it to the US, which will be The Correspondent, where a lot of the same principles of participative journalism are also going to be played out. It will be interesting to see how that takes place, because English-language news is obviously very different to Dutch news. But, again, it's all part of the experimentation process.
[00:05:42] Patrick O’Keefe: Listeners of the show will remember I recently had Jay Rosen on the show, who’s the US ambassador for The Correspondent and their move over here. It was funny because Jay was on the show and then like, I don’t know, a month later he was on The Daily Show with Trevor Noah, and I was like, well hey. He went on Community Signal, then he ended up on the other program. No, it’s unrelated, but it was a fun show with Jay.
[00:06:03] Ben Whitelaw: I’m sure Trevor is a listener of Community Signal.
[00:06:05] Patrick O’Keefe: Yes, I’m sure he is.
I just mentioned the press you received about your use of Facebook groups and converting people to being subscribers of the publication you worked for. Now, I don't know if the news media as a whole had warm, cuddly feelings about Facebook back in 2017 when those writeups were happening but, if they did, they've certainly declined at this point in 2019. Even Facebook as a wider platform, is that still a place that you recommend news outlets invest their resources now?
[00:06:36] Ben Whitelaw: I would say no. Facebook was always really a breeding ground for ideas. We used it at The Times as a means of experimentation. The nature of the platform means it's easy to get things up and running. For newsrooms that are slow and sometimes quite cumbersome when trying new things, we found it really useful for proving ideas out, and we saw our politics Facebook group grow a lot. We had a Brexit specialist group, where we had people from both the remain and the leave side come together and try to have a discussion about what is still rumbling on, actually, in terms of whether Britain is going to leave the EU. We used it in that quick-turnaround way: can we prove something out? Does this mean we can then bring that community elsewhere or bring them to our own platform? I wouldn't recommend doing it for a long-term project. I really like what organizations like Spaceship Media have done with it, where, again, they've tried to use the scale that Facebook has to bring different types of people together and foster a discussion in a place where they're very comfortable. News organizations, I would say, steer clear, but use it for proving out ideas.
[00:07:43] Patrick O’Keefe: The groups you created are still there. They’re still active, I looked at them as I was researching to have you on. They’re still providing value to The Times of London. But when Brexit happened, were you hosting onsite community at that time, like did you have comments?
[00:08:00] Ben Whitelaw: Yes, we did.
[00:08:01] Patrick O'Keefe: Tell me about that a little bit. I talked about this, I think, with Mary Hamilton of The Guardian when I had her on, but being in the moment of Brexit, which is, no matter how you view it, this historic moment that divided people in such a massive way, and the online communities of the noted news outlets over there were ground zero for the disagreement and the argument and the conversation around it. I'd just like to hear you talk a little bit about that moment, the lead up to it, now in hindsight. When you look back at it, what do you think about? What was that like?
[00:08:37] Ben Whitelaw: We probably didn't see it coming in terms of the reaction to it. Obviously, the politics story, no one really saw coming, but the reaction really was quite something. I remember in the months afterwards, we looked at the most read, the most engaged with, and the most commented stories and they were all Brexit, every single one of them was Brexit. Our columnists who wrote about Brexit would get 1,200 comments on an article, which, for a Times article… The comments were not small comments either.
They weren't one or two sentences; they were perhaps 200 or 300 words each. They were arguably mini articles in their own right. Some of them were very good. Some of them were very informed. The Times has got a pretty interested and educated audience. You would have lawyers and people who work in policy, people who are civil servants, often anonymous, comment on intricacies of Brexit, and it made for very interesting discussion. It was unfortunate that the nature of commenting software on news sites means you can't really establish a debate as such. There were very long threads, and sometimes it would descend into anger.
But what we did see was that there was a real passion for the topic and a need for it to be heard. Obviously, in hindsight, we've seen that one of the failings of the media is that we, legacy media in the UK, didn't really highlight some of the voices of people outside London, perhaps, who were wanting Brexit to happen, and that was playing out in the comment threads under articles.
[00:10:11] Patrick O'Keefe: What was your comment moderation like over there in terms of people? I'm sure it was a stressful time but, was it the sort of thing where you were assigning people and there were not enough hands? It's a lot for any newsroom or news organization to deal with. I always think about and sympathize with my friends in the media whenever, over here in the US, we have a president who I'm not a big fan of, and that's no surprise if you listen to the show or read my Twitter stream. He puts friends of mine in danger. I do believe that, regularly. When he does these things, he impacts those people in all ways.
That’s not to compare it to Brexit but, I always sympathize with the people behind the scenes who have to do the tough but necessary work of moderating those comments and trying to preserve some sense of standards. This is a very vague question but, in your time, in moderating comments and looking at that problem, was the announcement of Brexit– did you think of that as one of the biggest volume challenges that you’ve ever seen or was there something else?
[00:11:15] Ben Whitelaw: I think it was volume but also quality. We definitely struggled to keep up with the volume. Some days, there were 25%-30% more comments than before Brexit happened, and a lot of that was driven by those columnists' articles, those front-page articles that would get a lot of traction. There wasn't a lot of flex in the resources that we had. The nature of commenting and moderation teams in newsrooms, where I think the business value is still to be proven out, even though I personally feel like we're getting closer to that, means it's very hard to make the case to managing editors and senior editors that we should have more resource. You're doing more work with often the same resource. Things can easily slip. We definitely had to be careful with the way that certain commenters who we knew were very pro-Brexit, and actually some of those that were very anti-Brexit, spoke to each other, and some people went to quite extreme lengths to go at specific commenters. We had to make sure that was dealt with in a careful way.
By the time I left we could start to see some benefits to that approach. We started to bring some of the better commenters who were leaving interesting and interested comments into the newsroom actually to meet some of our politics journalists. That was really fascinating for us.
We tried to convert the super users, the people who you knew would often spend a lot of time each day leaving 20 to 25 comments; we brought them in to give them a sense of what The Times is about and really to help steer their behavior online. That is a nice way we tried to turn that difficult situation into a positive. Overall, yes, it was really difficult, and we didn't see it coming, which I think was one of our failings, probably.
[00:13:01] Patrick O'Keefe: I want to get off Facebook but I just wanted to ask you about this since it's news. What are your thoughts on their announcement that they're investing 300 million dollars into journalism projects? Only some of that has been committed to various programs or grants, like the Knight Foundation and several others. What's your takeaway on that? Is that good money? Is it bad money? Is it money that you should just take for what it is and use it however you want if you're a news organization or trying to support news organizations? What's your feeling on that?
[00:13:31] Ben Whitelaw: I think news organizations shouldn't turn down the opportunity to use that money to experiment, to find a model that's sustainable. Facebook has obviously played a role in the downturn in advertising, but the industry has also caused some of that itself. I think a lot of it is self-made. I think leaders in newsrooms, senior editors, CEOs of media companies, have a lot more responsibility to take than they probably do right now. Maybe that will bear out in the future. Yes, for now, I think that it's worth using the goodwill that Facebook have towards the media industry and trying to make that work. They're doing a lot of stuff, Facebook.
It's important to be able to choose what's right and what's not. They did come forward just in the last few years and say how Instant Articles was going to be a big way for news organizations to drive revenue, and it turns out that news organizations are making less than they would from programmatic ads. You have to take what they're saying with a pinch of salt, but also not be too principled about it.
[00:14:30] Patrick O'Keefe: It's money. Before the show, I asked you for a few challenges that you're thinking about. One of the things you said related to big platforms is about moderation and whether it should sit with tech companies or if that represents too great a threat to free speech. Talk about that a little bit.
[00:14:50] Ben Whitelaw: In the last year or so, I've been fascinated by the idea of moderation, it being something I did for a long time at The Times, but also it becoming one of the central debates on the internet. This unsexy idea of people saying yes and no to content presented by users is really becoming the crux of all of these conversations we're having around platforms, around free speech, around what we're allowed to say and what not to say. These platforms' whole business models depend on user-generated content. That's what they've provided advertising on the back of for the last 10 years.
Yet, there is this looming juggernaut of regulation, both in the US and Europe, which could potentially make that a much less profitable endeavor. I've become really interested, and I've started sending a weekly newsletter with some of my thoughts and rounding up some of the best stuff each week, and it does come back to this idea of, should the tech companies be in charge? Should it be the community guidelines, which have obviously been dreamt up by a policy team in an office somewhere, probably in San Francisco, and probably not a very diverse team, I would add?
Should that be the be-all and end-all, or should there be something country-specific, or cross-country, that is shared by governments and mandated by governments? Really, I don't have the answer, and I flip back and forth between the pros and cons of each, and right now, I'm on the fence. If anybody has any ideas, or if you have any thoughts, Patrick.
[00:16:20] Patrick O’Keefe: I’ll try. I think one of the things that I am afraid of, and we talked about this on an episode recently with Bruce Ableson. I am afraid that, Facebook will trigger something legally that will harm online communities as a whole. I think, we sometimes forget that the vast majority of online communities do not have Facebook’s problems and are probably fairly well managed and are one person operations with volunteers. Most online communities are just independent and run by someone who has an interest in that topic or even small companies and small businesses might have an online community where they only have one or two people working on it.
Most online communities are not that large. I have a community that's going to turn 18 in May. It's obviously very small compared to Facebook, but it's still an active community. I don't have any problems. That's just because I've managed it correctly for 18 years, and I've put time in and done the best I could. Sometimes, you make mistakes, but overall, it's a really good community I'm proud of. Honestly, it requires little moderation because that's the simple cheat: run the community well for five years with attention to detail, and then people get the idea and start doing it themselves.
Facebook’s problem to me is always that, their policies are fine, I don’t care about their guidelines, they read fine. That doesn’t matter. It’s always the application of them, and the same is true for Twitter. If you go to the Twitter community guidelines, they’re reasonable enough, they’re fine. They’re good guidelines. I could maybe update this part or that part or change this or that, but they’re fine. It’s application. Who’s applying them and how consistent that is and the training those people receive and how much is dedicated to that team.
I read in your newsletter that Facebook's going to outsource moderation to the Ireland office. My name is Patrick O'Keefe. I can't really fault them for giving moderation to the Irish. The thing is, they don't prioritize it. It's tough. My thing is that, over here, we have people that hold the pen for legislation, and I'm sure you have this over there too. We have people who I would never want telling me how to manage an online community holding the pen and writing that legislation. When we had the CEO of Alphabet, Google's parent, up in front of Congress a while back, Steve King from Iowa, who's a representative and frankly a racist, was up there saying, "My grandkids picked up this cell phone and they saw an ad that said something bad about me and that shouldn't happen."
He said he picked up the phone, this phone was secondhand, it was passed down to that child. You know what happens? Somebody on that phone looked you up and then gave it to your grandchild, and that's how they got the ad. This is a guy who talks about legislation and talks about what we have over here, Section 230 of the Communications Decency Act, which is a big deal and the legal basis for moderation in our country. It says that we can moderate without being liable for what remains. In other words, we can say this isn't appropriate without also accepting liability for saying that this is okay.
If you're a moderator of a forum like I am and you're doing the best you can, removing everything that violates your guidelines, but you leave a post up and that post said something bad about someone or, for whatever reason, was legally challenged, then you're not liable for that. Now, if I'm part of the speech, that's different; basically, what it boils down to is that the speaker of the speech is the one who's liable. That's essentially what the law is. That's what's dangled out there by these folks: that could change. Section 230 could be adjusted, could be changed because of Facebook, because of Twitter.
No one seems to be thinking about the vast majority of us who are just managing online communities, don't have these massive problems, and are doing a decent job. My big thing is just that government intervention, if it happens, really needs to be targeted at people above a certain scale, above a certain size, whatever that is, whether it's revenue or some other measurement. If it falls hard on small communities, it's really going to stifle a lot of voices.
[00:20:18] Ben Whitelaw: Yes, I think you're right. I think it shouldn't be blanket. These issues have come about because of several large social networks; they've been brought to light because of the problems they faced. I think you're right that they shouldn't try and legislate across the board. I mean, I think there's a problem generally about legislation and the speed with which the internet evolves, which makes it very hard to legislate for generally. There's a huge problem there. There's also the fact that so many people are on Facebook and are on YouTube, and it's across all of these countries as well.
How do you create something that is applicable and that those organizations can pass down to the user, if it is going to be something that governments legislate for? It's a real Gordian knot. It's really very complex, but it is some of the most important stuff that's going to go on in the next five to ten years. I'm glad to hear that your community is hopefully not going to be affected.
[00:21:09] Patrick O’Keefe: [laughs] We’ll see.
[00:21:10] Ben Whitelaw: There are some really big challenges. I've seen just this week that Facebook are opening up another office, an outsourced office, in Sofia in Bulgaria, with maybe 150 people there who are going to be looking at and watching Turkish content; that content's currently moderated in Dublin. So, there's a whole new set of staff — not staff, they're not employed by Facebook, but workers who are going to need to understand the intricacies of Facebook's guidelines, they're going to need to understand the nuances that come with moderating complex content.
That's a risk; that's a liability. When you create a team like that, and when it's outsourced and it lives miles away from the Menlo Park headquarters and reports back occasionally, that's a risk. Facebook can only deal with the challenges it has by creating those risks. So, I don't think it's going to stop anytime soon, frankly.
[00:21:58] Patrick O'Keefe: I think that's an interesting point: Facebook needs to take some risks to solve this issue, if it's going to be solved. My general perspective, and I'm not saying you feel differently, is that they don't care enough. I'm certain there are people there– For me, it's not some sort of deeper thing. I don't care, I don't mind Mark Zuckerberg, I don't mind Facebook. I use Facebook, I haven't had any massive, awful experience with Facebook where it's harmed me. I'm sure there are people there who are trying hard, who have good intentions. It's not like it's evil as a whole or all bad or anything like that.
But we all know, with moderation, and you talked about this earlier for news organizations, how hard it is to lobby for those resources. Even though it's core to Facebook's business, UGC is the whole thing, as you said, it's still probably a challenge for those people with good intentions to fight for those resources to be allocated and to make the case, in shareholder-value terms, that it's worthwhile to do that. I mean, there's enough anecdotal evidence. Every scandal is anecdotal evidence for better moderation: we could have stopped this from happening, this could have been done in a different way.
We wouldn't have looked so stupid removing this thing that we never should have removed if we had had better training. But I take that cynical perspective. They just don't care enough, and the number of moderators they keep adding, it's this astronomical number. I guess, technically speaking, are they the biggest employer of community professionals in the world now, if you really consider a moderator in another country to be a community professional? Probably. They're probably the biggest employer of people looking at content and moderating and doing community work in the world. Which is funny to think about — I'm going off in a different area.
It's just tough because a lot of those people probably don't have adequate training, mental health services, and so on and so forth, because they are really seeing the worst of the world in a lot of cases. In my community at karateforums.com, or even my day job, I'm not seeing anything that bad. I mean, I've had a couple of suicide threats to deal with, and certain key emergencies that I've had to elevate and work on and that have stressed me out, certainly; that's the job. But Facebook, Twitter, YouTube, especially for the people who do the moderation there, it's tough. I mean, there's that balance of being sympathetic versus calling out that it's not being supported by the business. Like you said, it's such a tough challenge. I don't know what the answer will be but, we'll keep talking about it here, I'm sure.
[00:24:20] Ben Whitelaw: I think it’s important too, I think the conversation is starting to change. The documentary, The Cleaners, I don’t know if you’ve seen that.
[00:24:26] Patrick O’Keefe: I haven’t.
[00:24:26] Ben Whitelaw: It shines a light on some moderators in the Philippines who are working for Facebook and a few other companies. These people are basically pulled off the streets. The third-party companies that Facebook outsources this work to go to the streets and say, we can give you a reasonable wage, you can go home, take that money, pay for your family, your children, whatever they need. People become moderators without any experience or even interest; they're doing their best. I think that has to change, that shift from it being something that somebody can do with very little experience or training to being recognized as what those of us who work in this space know it to be: a nuanced job with a variety of different skills that you need to have.
You need to have experience working in probably a low-risk community to begin with, to build up your understanding of some of those social interactions that go on and how you deal with them. I wonder, if Facebook were forced to stop outsourcing some of this moderation work, whether that would make them rethink the platform, because if they're then forced to employ these people, if they're forced to give them certain rights, a certain wage, or the training that comes with being staff, again, that starts to cut into profits. Whether that would bring the change that we want to see, I don't know. But I think, for me, one of the big problems is this outsourcing culture that we have, and this idea that anybody can do this work.
[00:25:58] Patrick O'Keefe: Yes, moderation is not busy work, I mean, that's just what it comes down to: moderation is not busy work. It's not answering the phone; to do it right matters. It takes, as you said, training and experience. I think your point about what's the starting point for those people is a good one. What do they see first? Do you throw them right into the deep end? Or do they have something a little easier to deal with before they really are exposed to the worst stuff? It's tough because moderation is viewed as, it's almost like they want human robots.
I mean, that's basically what it comes down to. They just want people who will push a button over and over again and do it within a second and a half. That probably works for 96%, 97%, 98%, 99% of their moderation decisions. It probably does. People file awful reports; they report things that aren't actually inappropriate. It probably works but, for the 1% or whatever small percentage that is, that's when somebody kills themselves, or that's when an act of violence is committed, or something happens, and there is an acceptable loss there, I think, at some point, that they just decided was fine.
I do hope that the tide turns and that might result in lower profitability, which is probably the thing that’s stopping it. But, it needs to be because, it’s important work. It can be a life and death matter. Anyway, we could talk all day about this.
Another thing you brought up that I found interesting, because I'm affected by this, is whether or not you need to be in the target audience of a community in order to serve that community, in order to create something that's useful for them, or is it simply enough to show empathy and, I assume, to be a competent professional, someone who knows how communities work and can manage them effectively? Why is that a thought for you? What's driving that?
[00:27:39] Ben Whitelaw: Well, I'm thinking about how communities will exist online in the future. I think there's going to be a rise of increasingly niche communities. Some of those will be branded, and some of those will exist within social platforms perhaps, but there will be communities. I think, for those communities to thrive and survive, you'll need people like some of your listeners, like us, with skills that can help make those as effective places to be as possible, to make them fulfilling, to make them have value. Then, I'm thinking, "Okay, is it possible for anybody to do that? Could I create a community of shark enthusiasts even if I hate sharks?"
Should I be brought into that community to help them thrive and grow? Or is it the case that you have to really understand what makes the community tick in order to create something for them and to talk in the way that that community talks? I think this is something I'm conscious of just in some of my work. In the past, when I was working at The Times, you'd try to talk to a fashion and beauty community and you don't want to be called out; you can easily be seen to be bluffing it a bit. But, as things scale online, as we start to have these more niche communities, how do we train people to go in and really shape how they work? Or is it the case that you're bringing up people from within that community to take the role of the community organizer or the community manager? How do we train these people? It's an educational question as much as anything else, I think.
[00:29:11] Patrick O’Keefe: Do you think that there is an element of this that relates — I think you touched on this with the fashion example but, just the idea of readily apparent credibility to that community. A couple of examples come to mind in movies and in Hollywood. I live in Hollywood so by the way, it’s top of mind. The whitewashing of casting, people playing something that they’re not, a role that, let’s say, in the book or in the source material was, I don’t know, an Asian person or a black person being played in the movie by a white person, or perhaps even the other way around, or — a friend of mine who is a man who manages a community for women.
Is there something, especially now, and I say now because there was a time with online community, I think this is still true to a large extent, but online communities were based more on handles. It was pseudonymity, it was– it's not anonymity; you have a handle, you have a name, but no one knows what you look like or necessarily where you are or where you're from. But, more and more, I find that to be– I don't know if less common is the word, but certainly, more identity tied to community tends to be cropping up. Is that part of this conversation?
Even if you appreciate that community, even if you have empathy for them, even if you're qualified as far as the skills on paper, let's say, or you've managed communities before and had success, is there something that can be said for the community seeing you as one of them and viewing you as credible, as opposed to seeing you as a member of some other camp or background and not viewing you as credible?
[00:30:35] Ben Whitelaw: I think you're right, I think you're spot on. I think credibility is a really good word for it. I guess my worry is that we understand how to build credibility in the real world, and there are ways that we can show that we have credibility, whether it's through the work that we've done or our education, or a bunch of other things that most people understand. But that credibility is, I think, hard to score online. There's not really a score for it. It doesn't transfer well between the different spaces where people exist online. If I am taking part in a football community and then I go to a work community, those two things don't tie up. I don't have the same credibility in one that I do in the other. I'm interested in how we, as we spend more time in these communities online, can evaluate our effectiveness and our credibility in each of these areas, and whether there is something that ties all of them together, whether it's some score or health metric perhaps that gives an indication that, yes, this person is able to do something meaningful over here and therefore would be useful to have as part of our community in whatever form. I think the idea of creating credibility in a lot of different areas is potentially quite exhausting, and maybe quite difficult.
But I think you’re right. I think you need to show that you are credible in whatever space that you’re interacting in. I’m just kind of interested in how that might scale a bit if we’re doing that a lot more in the future.
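The portable credibility score Ben wonders about here is entirely hypothetical, and nothing like it exists on any platform mentioned in this episode. Purely as an illustrative sketch of what "something that ties all of them together" could look like, here is a toy aggregation in Python; every field name, signal, and weight is an assumption made up for the example.

```python
from dataclasses import dataclass

@dataclass
class CommunityRecord:
    """Illustrative per-community signals; these do not map to any real platform's API."""
    community: str
    tenure_years: float   # how long the member has been active in this space
    posts: int            # contributions made
    removals: int         # contributions removed by moderators
    endorsements: int     # peer signals such as thanks or upvotes

def community_score(r: CommunityRecord) -> float:
    """Score one community on a 0-1 scale using hypothetical weights."""
    if r.posts == 0:
        return 0.0
    kept_ratio = 1 - (r.removals / r.posts)        # how often contributions stand
    tenure = min(r.tenure_years / 10, 1.0)         # cap the tenure benefit at 10 years
    endorsement_rate = min(r.endorsements / r.posts, 1.0)
    return 0.5 * kept_ratio + 0.3 * tenure + 0.2 * endorsement_rate

def portable_credibility(records: list[CommunityRecord]) -> float:
    """Aggregate across communities, weighting by activity so one quiet
    space doesn't dominate. This is the 'tie it all together' step."""
    total_posts = sum(r.posts for r in records)
    if total_posts == 0:
        return 0.0
    return sum(community_score(r) * (r.posts / total_posts) for r in records)

if __name__ == "__main__":
    history = [
        CommunityRecord("football-forum", tenure_years=6, posts=400, removals=4, endorsements=180),
        CommunityRecord("work-community", tenure_years=1, posts=40, removals=0, endorsements=10),
    ]
    print(f"Portable credibility: {portable_credibility(history):.2f}")
```

The hard part, which Patrick raises later in the conversation, is exactly what a toy like this leaves out: who keeps the score, and how easily it can be gamed.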
[00:31:59] Patrick O’Keefe: Yes, especially if you have someone who’s at an organization, like a news organization where you cover different beats and you’re not going to necessarily have the budgetary resources to have someone credible to moderate the comments for all those different beats.
I think it depends on the community to some extent. How it relates to me for example is that, martial arts community I mentioned, I’ve never taken the martial arts. I’ve run that community for 18 years. I thought the community was a great idea. I thought I could do it well.
Going back, I was a teenager at the time and a friend of mine was like, "Well, that's a great idea that's needed. If you don't do it, I will." I was like, "I want to do that." I did it and it's one of the longest-running online martial arts communities in the world. It's something I'm super proud of; the community, the way they talk to one another, it's a special thing. I tend to think that my lack of experience helps me sometimes because, over the years, I've made moderation decisions and people will be like, "Well, of course, you believe that, you yourself take karate." [laughter]
I can be the one to tell them, "Well, actually," because I don't hide this; it's not a secret. I say, "Well, actually, I don't take the martial arts. Now, I have people on my staff, my moderators, who are experienced martial artists, and in the rare cases where I need that knowledge, that's where I tend to go, and of course, I've picked up a lot over the years, over 18 years."
I mean, that's a case where maybe you can get away with it, and if you're good at managing a community, then it'll be okay. Because, again, it depends on the subject. A community targeted at, let's say, pregnant mothers or breastfeeding, it's probably odd for me to start that community if I've never been a pregnant mother and never breastfed and I don't even have kids. That's a tough thing for me to look credible in. But other things might be hobbies or passions or interests, or if you extend it over to — I think that this might fit more news media, but almost like the brand side of things. Like, call it ego, but I believe I can pretty much manage a community about anything, because I believe in my abilities to the point where I think that's true.
If I'm on the brand side, Q-tips. Q-tips wants to start a community? Okay, I can make that happen. I don't know where I'm going to start with Q-tips because that just doesn't seem very interesting to me. But let's think about it and maybe I'll say, "No," because, again, sometimes the answer is not to have a community, but let me look at it. Maybe there's something in Q-tips that will spawn some sort of great community interaction. There is that, but then, I think, on the other side of it, there are topics that are very sensitive, that are very personal, that have to do with identity. For those things, you really need to have, in some cases, that shared identity to be able to not only appear credible but actually be credible, and to recognize the issues that they're facing and be able to moderate and write guidelines and write policies effectively.
Even if you can't afford the budget, that doesn't mean you can't include people who match with that community in those conversations, in an advisory role, to help guide the way the policy should be. Because, as empathetic as you might be, if you're not in those shoes, you're likely missing something, even with the best of intentions.
[00:35:10] Ben Whitelaw: I really like that, the idea of surrounding yourself with people who know more than you. That's a really good maxim for life, frankly. In that particular case, you found people, your team, who know an awful lot about martial arts; they're your go-to. That's important, to be humble enough to say, "Okay, I don't know what the answer to this is, but I have people near me who do." I think that's often a good marker of a successful community. Let me ask you this about that community: what blowback did you have when people found out that you weren't an avid martial artist, and what was the thing that convinced them that you were the guy to host their community?
[00:35:51] Patrick O’Keefe: I would say looking back, surprisingly little blowback, that could be attributed to a few different things. If a community is well run, people tend not to notice moderation. A lot of moderation happens behind the scenes. A lot of that work is done behind the scenes. If you’re really doing a great job, a lot of the time that means that, they never think about it. That’s part of it. Then, if you’re talking about like a negative interaction, sometimes that would be with people who just have their post removed. They would maybe have an accusation of bias and then, I would say, “Well, I don’t have that bias. Basically I have other biases probably but this just isn’t one of them.”
What wins people over? Well, I've had staff who have been with me for 12, 13 years as volunteers. I have members in the community who have been there for 15 years. I have members who go away, like they left for Facebook. We've seen this more and more: they come back after six years because, and that's the funny thing about being around for so long, you can have members who went away for six years and then decided, "You know what, I miss that place, let me go back there." Which is a funny phenomenon.
The way you get around it is just to do the best you can and moderate fairly and apply those guidelines as equally and as fairly as you can. For the most part, you can win people over that way, just with your work. Now, of course, there are people who probably thought it was stupid and maybe left and didn't tell me; I'm sure there's a small percentage of those. I think that you can win people over; you probably just start at a bit of a deficit, where you have to overcome that deficit and then start from square one and build that trust.
At this point, when you get into managing a community after three, four, five, six, seven, eight, nine, 10 years, now 18 years, another form of credibility is just the time investment that you put into it. It's so empowering to be at a community for so long where you have loyal members versus coming into a community that's been around for a long time where you're new as the community manager or director of community or whatever. Because, as someone who's been around forever in this community, I was there before it was open, I wrote the guidelines, and the guidelines are the same now as they were 15, 16 years ago. Literally, it's like, well, the guidelines have been this way and this is the way they're applied. This community may or may not be for you, but there's no question of whether I should be making that call or not, with veteran members at least. I've gotten over that through just proving myself and being committed long-term and not selling out the community or having it be full of ads; I've never had pop-ups or any of that stuff. That's how you build that credibility over time.
[00:38:24] Ben Whitelaw: That's a testament to the work you've done and the longevity, because there are very few communities that have lasted that long or that started with a view to lasting that long. That's a testament to you. I hope the people who listen take that on board and realize that it does take time. This isn't a quick turnaround thing. You can't build Rome in a day.
[00:38:45] Patrick O'Keefe: You're right. I agree with you. To tie into that, you mentioned something about community health. I wanted to flag that because it was something you mentioned as well: the idea of community health measurements and how that'll be approached in the future. How do you think that'll change?
[00:38:58] Ben Whitelaw: I worry about it to some degree. I don’t know if you’ve seen Black Mirror?
[00:39:03] Patrick O’Keefe: I haven’t, I’ve heard about it. My girlfriend works at Netflix.
[00:39:07] Ben Whitelaw: I'm all about it. It's this dystopic series written by a guy called Charlie Brooker, who's a former Guardian columnist, actually. There's an episode of that where everyone has a score and you can vote people's scores up and down depending on their interactions with you in life. Everyone is kind of a hologram of each other. I worry that we could have a version of that where your interactions start to govern your credibility, as we were saying, and people can have an undue influence on that score. At the same time, there's something in the idea that you accrue something for the time and effort you invest online.
I don't think that happens. The internet is very flimsy in that regard. In a lot of cases, only 1% of people are active, whether it's forums or subreddits or wikis or whatever it is. I think there is something in the idea that only 1% of people will ever contribute because there aren't, I think, the benefits in a lot of cases for people to be more involved. We haven't quite got that right. I'm interested to see where that goes, whether people can start to create the mechanisms to get people more involved for the right reasons. One way that we counter some of the problems that we have at the moment with misinformation and fake news and the way the information spreads is by making it more profitable, personally speaking, to partake in communities in positive ways. I think, at the moment, it's too skewed towards those people who intend on creating havoc with this information. We need to try and flip it around the other way. Who knows what it will come to? Do you have any thoughts, or have any of your guests had any good ideas?
[00:40:50] Patrick O'Keefe: I don't know. I think this idea of scoring is frightening in the sense that we are always wondering who the scorekeeper is. That's the challenge. But I do think there is an opportunity. I think maybe Google might be one of the easiest conduits for this, the fact that what you post online is often indexed, even if it's through a pseudonym. If you go back and you search forums for ifroggy, you will find forum posts probably from the '90s. That name, ifroggy, has been associated with me, and my email, as you might know from responding to it, is patrick@ifroggy.com. I haven't made a big effort to separate myself from that.
A username can be tied to a person and you can view their history now. We’re getting to a point where we’re going to start to have, at some point, presidential candidates who have forum posts from the ’90s or maybe even earlier than that. We’re not there yet, not with that minimum 45 age here in the US but, we’re getting close. Another 10 years and that president will be my age. They will have forum posts online. They will have old things. Yes, they can clear their Twitter. Sure. They can clear certain things but something’s going to be there. For better or worse, they are going to be judged based upon that.
How that speaks to community health, I'm not sure, but I think that we're at this time where we're swinging back away from identity. I think a lot of kids, teenagers, are less interested in being identified by their name online or having mass attention given to them than maybe the generation prior. They like small communities, small groups of people. Like, my brother is not on Facebook. My youngest brother is not on Facebook. He never had a Facebook account, and maybe never will. Just seeing how the next generation of kids uses it is, I think, often a pretty good signpost of what's to come when they actually get to take over, which is now or soon.
[00:42:52] Ben Whitelaw: I think it's really interesting. I've come across an app called MeeTwo, which has just come out in the UK and is for teenagers who have issues around anything, frankly: homework, stress, anxiety. It's supposed to be a safe space for teenagers to talk to one another that actually has 100% moderation. They have a rule where every post is looked at by someone else. I don't know if it's another community member or if they have a full moderation team but, one of the things that they have, which I've seen, is words that you select. You pick three words and that becomes your name, your avatar, essentially.
You've got people who are shy-mini-star or wild-green-warrior. They're just random words that you can put together to create your own username. I think that's straight away a great equalizer in many senses for that particular community. It's great because you're not having to give any personal information about yourself. You're not having to use your real name. You're not having to give your age or anything. I think there is something in that; you're right. Whether it will shift again when that generation has to be judged based upon their work online, as I think a lot of people have been over the last ten years, who knows, but it's clear where their preference is right now.
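As a side note, the three-random-words handle Ben describes ("shy-mini-star", "wild-green-warrior") is easy to picture in code. This is a minimal sketch, not MeeTwo's actual implementation; the word pools and function name are made up for illustration.

```python
import secrets

# Hypothetical word pools; MeeTwo's real lists aren't public, so these are placeholders.
ADJECTIVES = ["shy", "wild", "brave", "quiet", "sunny"]
MODIFIERS = ["mini", "green", "silver", "lucky", "cosmic"]
NOUNS = ["star", "warrior", "otter", "comet", "willow"]

def generate_handle() -> str:
    """Pick one word from each pool to build a pseudonymous handle like 'shy-mini-star'.

    secrets.choice is used instead of random.choice so handles aren't predictable,
    which matters a little more in a space meant to protect teenagers' identities.
    """
    return "-".join(secrets.choice(pool) for pool in (ADJECTIVES, MODIFIERS, NOUNS))

if __name__ == "__main__":
    print(generate_handle())  # e.g. "wild-green-warrior"
```

The appeal, as Ben says, is that the handle carries no personal information at all: no real name, no age, nothing to trace back to the member.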
[00:44:11] Patrick O’Keefe: Ben, thank you so much for coming on the show. I’ve enjoyed the conversation.
[00:44:14] Ben Whitelaw: Brilliant. Thanks, Patrick. It’s been fantastic. Glad I finally got on Community Signal and it was so overdue. Great to be part of it.
[00:44:22] Patrick O'Keefe: We have been talking to Ben Whitelaw, engagement lead for the Engaged Journalism Accelerator. Subscribe to their newsletter at engagedjournalism.com/newsletter. Connect with Ben at benwhitelaw.co.uk and follow him on Twitter @benwhitelaw. Find his content moderation newsletter at getrevue.co/profile/benwhitelaw. Revue is spelled R-E-V-U-E.
For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead. See you soon.
Your Thoughts
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.