Why Has Clubhouse Been Plagued by Trust and Safety Issues?
If you were building a community product, how would you start? Who would you choose as your first hire? What efforts would you make to ensure that the product is inclusive, safe, and well-moderated?
In this episode of Community Signal, we’re joined by Danielle Maveal to do a deep-dive on audio-first platforms and specifically, Clubhouse. While every platform and community has moderation issues to work through, Clubhouse has made headlines and Twitter rounds for the lax moderation that has brought anti-Semitism, misogyny, and misinformation to the “stage” on the app. In this discussion, Danielle and Patrick discuss how other audio-first platforms have approached trust and safety and what steps they would take to scale the teams, communities, and norms that power them. And while they acknowledge that not every conversation or connection that happens on the platform is bad, they offer a reminder that we can all do something to hold platforms accountable.
The members and the content that we allow on our platforms dictate the culture that permeates our communities. If there’s one thing that Clubhouse proves, it’s that there is still room for platforms that are built with safety and inclusivity in mind from day one.
Danielle and Patrick discuss:
- The current landscape of audio-first communities
- How they would scale a team and membership base of a community product
- Why community guidelines, enforcement, and tools matter from day one
Our Podcast is Made Possible By…
If you enjoy our show, please know that it’s only possible with the generous support of our sponsors: Vanilla, a one-stop-shop for online community, and Localist, plan, promote, and measure events for your community.
Big Quotes
Community governance influences the culture of our communities (8:05): “I haven’t heard about anyone being removed from [Clubhouse]. I’m sure there have been some, but there’s no transparency. When is someone banned, or when are they muted? What if they are a repeat offender? What happens then? There’s no discussion of that happening publicly so [Clubhouse] feels like a brand new territory for these scammers to go and chat.” –@daniellexo
Not all audio-first platforms are terrible (14:55): “[At Clubhouse], I didn’t see a key community or trust and safety hire very early on to set the norms. I’m on two other audio apps [Space and Quilt] with absolutely no problems. They’re smaller, they’re growing at a smaller rate, but they have key community hires there. The social norms that are being developed are just completely different.” –@daniellexo
How do we make people care about trust and safety issues? (17:29): “I’ll have a conversation with someone who is a very reasonable person, and we’ll talk about Clubhouse, and all the issues that have been raised. Then I’ll see them ask for an invite. My mind is blown. We are not learning from lessons of the past. How do we make people care?” –@daniellexo
The value that Clubhouse members bring to the stage (24:35): “[Clubhouse] creators who are now put into these moderator and facilitator roles; they’re going to make the founders and investors rich. [Clubhouse] invites are positioned like a gift, but really [users are] creating the experiences that draw in hundreds of listeners, thousands in some cases, and they get absolutely nothing for it.” –@daniellexo
Scaling your moderation efforts with your community is a must (34:30): “Being able to shut down a bad thing while it’s happening [on an audio-first platform] is important and if you can’t do that, maybe it’s a clue that you shouldn’t be doing this. … Taking care of a call two days later is not going to be a workable solution to stopping bad things.” –@patrickokeefe
Building a community for the long haul, not for short-term virality (38:00): “One thing I would focus on is really keeping the testing pool super small. Trusted people only, no invites, or very few, and I would know who’s coming in the door. I wouldn’t be allowing more people in the door until I was ready to be responsive to live reports, so that I can come out very strongly against bad behavior.” –@patrickokeefe
About Danielle Maveal
Danielle Maveal has 15 years of experience launching, growing, and supporting brand and marketplace communities. She was on the founding team at Etsy and BarkBox and has since worked at Airbnb and Lyft. She’s now moving from building community teams and programs to building community products.
Related Links
- Sponsor: Localist, plan, promote, and measure events for your community
- Sponsor: Vanilla, a one-stop-shop for online community
- Clubhouse
- Clubhouse on Twitter
- Danielle Maveal’s website and Substack, Community Feelings
- Exclusive Social Media App ‘Clubhouse’ Had an Anti-Semitic Meltdown Over Yom Kippur, by Yair Rosenberg
- Jewish “Control” of the Federal Reserve: A Classic Antisemitic Myth, via ADL
- “You become hostage to their worldview”: The murky world of moderation on Clubhouse, a playground for the elite, via Vanity Fair
- Kia Richards, product compliance manager at Square, on Clubhouse’s disinformation problem
- Taylor Lorenz, journalist at the New York Times, documents Clubhouse’s moderation issues
- Other audio-first platforms referenced in this show include Airtime, Space, and Quilt
- Heather Merrick, director of customer support at Airtime, on Community Signal
- Tracy Chou, CEO at Block Party
- Wesley Faulkner on Community Signal
- Ustream
Transcript
[00:00:04] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Sponsored by Vanilla, a one-stop-shop for online community and Localist, plan, promote, and measure events for your community. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[music]
[00:00:28] Patrick O’Keefe: Hello and thanks for listening to our show. Have you heard about Clubhouse? It’s an audio-first platform that has had high-profile issues with anti-Semitism, racism, sexism, and more, due to an apparent lax approach to moderation, trust, and safety. Danielle Maveal is joining us to discuss.
Happy 2021 to you and our great Patreon supporters, including Heather Champ, Serena Snoad, and Rachel Medanic. If you’d like to join them, visit communitysignal.com/innercircle.
Danielle Maveal has 15 years of experience launching, growing, and supporting brand and marketplace communities. She was on the founding team at Etsy and BarkBox and has since worked at Airbnb and Lyft. She’s now moving from building community teams and programs to building community products. Danielle, welcome to the show.
[00:01:23] Danielle Maveal: Hey, thanks for having me.
[00:02:02] Patrick O’Keefe: It’s my pleasure. I wanted to have you on to talk about Clubhouse and audio-first platforms, as they are being called, because while we haven’t known each other for all that long, we met this year, I guess one of the good things to come out of 2020. As we record this on the last day of 2020, Clubhouse and audio-first platforms have sort of dominated our interactions, to the point where we have been randomly sharing things about Clubhouse with one another, so I figured why not bring that to the show.
Before we get into it, I want to set the table a little bit because despite Clubhouse popping up a lot in our feeds, our feeds, or at least my feed, are tuned in a specific way, and most of the world has no idea what Clubhouse is. If I mentioned it to most people, they would not have any idea. Let’s talk about Clubhouse. How would you describe it functionally?
[00:01:59] Danielle Maveal: Clubhouse is an audio social platform. Anyone can start a room and take the stage and they have ownership of mic usage, so other people can come in the room and they can pass the mic around. It’s audio-only sort of like this podcast, but if we had hundreds of other people here and you could bring a bunch of people into the conversation.
[00:02:21] Patrick O’Keefe: So kind of like audio chat rooms, I guess, is how I’ve come to think of it. I think a lot about– Oh, that’s wrong. I don’t think a lot about but I do sometimes think about Yahoo chat rooms from the ’90s, and Slack to me is a big… it’s kind of a Yahoo chat room with some really nice extra features. Clubhouse is sort of an audio chat room and Clubhouse has garnered a bit of a reputation early on for poor handling of trust and safety issues. If you Google Clubhouse and anti-Semitism or Clubhouse and misogyny, you’ll find plenty of examples of it.
Yair Rosenberg wrote about anti-Semitism on the platform over Yom Kippur, which is the holiest day of the Jewish year, and to quote his article, “Although this was a particularly high profile incident of anti-Semitism on Clubhouse, Jewish users said it was not an isolated incident, merely a more noticeable one. ‘I have never heard the Rothschild and Fed conspiracies mentioned so many times in my life as I have in the past two months on this app, across rooms,’ said one user. Clubhouse is currently in beta and hasn’t yet been rolled out to the general public. The question is, can it adjust to make situations like these less likely to occur?”
For those who don’t know, mentions of the Rothschild family are often a good clue that you’re dealing with someone peddling anti-Semitism, part of which includes convincing you that Jewish people control the world’s financial system, and we’ll include a link to an explainer on that in the show notes. At the time Rosenberg wrote his piece in September, he noted that Clubhouse had approximately 10,000 users.
Even if it was double that, part of the big concern that I have about Clubhouse is that if they couldn’t get it right with 10,000 or 20,000 users, and that’s total users, not users online at one time, and it’s clear that growth and scale is their goal, how will they get it right with 50,000, with 500,000, with five million? I guess the bigger point is just about cultivating reputation and how it sticks with you. Because there are plenty of streaming platforms online and there are plenty of people doing bad things online, but for some reason, here we are, you and I, talking about Clubhouse in particular.
Clubhouse has introduced newer tools and practices, they’ve made some changes over the last few months, but those are, at least so far, in words, not necessarily execution, which is the important part, and they may yet rise above this, but they wouldn’t have needed to create something to rise above if they had just been more thoughtful. What do you think is the recipe that led us here?
[00:04:40] Danielle Maveal: Absolutely, and I just heard they have 600,000 registered users.
[00:04:45] Patrick O’Keefe: Scary. [chuckles] I’m just kidding.
[00:04:47] Danielle Maveal: -and a team of eight.
[00:04:49] Patrick O’Keefe: Oh boy, okay.
[00:04:51] Danielle Maveal: Well, there’s one thing. Make sure you have the proper team in place at each stage to be active on the platform and understand what’s going on. I didn’t see that. I saw the founders in high-profile chats where there were hundreds and hundreds of people, and I’m sure those people act much differently when a founder is in the room. I didn’t see a lot of their team. I was on Clubhouse in the spring when it was much smaller, and I lasted a few weeks, and then I was out of there because of the behavior and there was no way to report a room, and this is when the DNA of a company is formed.
I’ve been part of a few founding teams of tech startups, and that’s what I saw take place. All of the things we built into moderation and community very early on scale with the company, including the cracks, so they put growth over moderation, over a safe place for people to chat, and it’s just now scaled. There was a Vanity Fair article this week, so it’s hitting the media as well, and we’ll see what happens.
[00:05:54] Patrick O’Keefe: That was a really interesting article and you hit on something there because bad things happen oftentimes when you prioritize growth over moderation, trust, and safety. For example, you might be backed by a well-known VC firm with celebrity connections, and you might lean into those connections to step on the gas and add a lot of new users who want to connect with those influential people on your platform and tap into that scarcity of attention, but you’re in this critical early stage where you are becoming the platform you will be in two, five, 10 years.
You’re allowing those norms to be dictated by people who will probably leave once too many people want their attention and who are only there because they can be a big fish in a small pond. Then after they leave, I don’t know, what are you really stuck with or will you have just cashed out by then?
[00:06:40] Danielle Maveal: Right, and there are no paid moderators so you’re thinking the people who are hosting, facilitating, moderating these conversations, most of them have fantastic intentions, not all of them do, so how does that scale and what are those people there for, and like you said, what is their end goal?
[00:06:59] Patrick O’Keefe: There was a Twitter thread by Kia Richards, who is a product compliance manager at Square and has worked in financial risk analysis. She expressed concern over how many fraudsters have been using Clubhouse in order to give financial advice and conduct transactions with unsuspecting victims. I’ve also seen tweets about health and medical advice being dispensed, anti-vax folks, a lot of guru types re-emerging to claim their slice of the influence pie, if you will. Taylor Lorenz of the New York Times has been all over this and has a really great Twitter thread that we’ll link to in the show notes.
I don’t know, I’m seeing this crop up now after the wave of racism, anti-Semitism, misogyny, I’m now seeing this wave of misinformation, of scamming. I don’t know, there’s something about Clubhouse right now that seems to have made it really ripe for, to use a word I don’t use often, charlatans. It seems like it’s really ripe for charlatans. Why is that?
[00:07:52] Danielle Maveal: I think it’s that it’s in the DNA, that they weren’t paying attention, they’re not moderating. Anyone can go onto the platform and take the mic and take center stage and know that there won’t be many repercussions. I haven’t heard, in all the Twitter threads I’ve read, about anyone being removed from the app. I’m sure there have been some, but there’s no transparency. When is someone banned or when are they muted? What if they are a repeat offender? What happens then? I think there’s no discussion of that happening publicly so it probably feels like a brand new territory for these scammers to go and chat.
A lot of what they’re doing that is very wrong is, I’m sure, hard to enforce guidelines against right now. Not everything is about guidelines and moderation.
[00:08:36] Patrick O’Keefe: To be honest, it’s not like I’m watching this, and I’m sure they’ve banned people, but the only time I’ve seen someone talk about being banned so far, on Twitter, I think it was an account called VCBrags. The reason they were banned was because they supposedly didn’t have a name and a photo attached to their profile, which, if that’s a standard you want to enforce, that’s fine. I don’t see a problem with enforcing it on that account because it just said VCBrags, but I think the thing that caught people’s attention was that it happened within 24 hours.
Within 24 hours of signing up, this VCBrags account was banned, and fair or not, and I’m being very fair here in my description, that creates this perception of, so how fast did the anti-Semites get banned? How fast did the misogynists get banned? How fast did the people pushing, I don’t know, some scam or some MLM get banned, right? Because it’s not so much that the VCBrags account should or should not be on there, but they got such a quick reaction. Outwardly, it looks like, why did that get such a quick reaction, but these other things seem to fester?
I think part of the problem with this misinformation and this scamming thing is this celebrity culture that exists around Clubhouse. I’m curious to hear what you think about this, but it seems to me, and you’ve certainly spent time on the app, as opposed to me, that they have all these celebrities and they have these influential people coming in there. To me it makes it seem like it’s a special thing that is more credible than your average Twitter feed or your Facebook profile. If you ran into a guy on a big social network and he said to you, “Danielle, hey, I’m Bob. I have a new way for you to double your money. Go to bobthefinancialwizard.biz,” right?
[00:10:09] Danielle Maveal: Yes.
[00:10:09] Patrick O’Keefe: You probably would not view that person credibly. You wouldn’t because you’re obviously well-spoken on internet safety and trust and safety issues, so I’m not saying you’d fall for this anyway, but if you went into a Clubhouse room that was full and Bob was on stage talking about how he could double your money, and everyone in the room was amazed. To your point earlier about moderators, which we’ll get into in a second, you don’t know who all the moderators are, but those people are all on stage and they’re talking about their experience with Bob and how Bob helped them double their money, that changes the dynamic, doesn’t it? Doesn’t Bob seem a lot more credible on Clubhouse potentially?
[00:10:42] Danielle Maveal: Yes, it’s invite-only and so you can actually see on someone’s profile who invited them. That initial group was VCs and celebrities and clout chasers now who are begging for invites so yes, I think you do trust them. You feel like everyone’s connected in some way. They talk a lot about this being a community and that’s really what upsets me because these people are there with full-on public personas, especially the people who are peddling their scams.
I have been present in conversations that are private conversations where we’re really being our real selves on there, but most of the time, it’s a public persona talking about self-aggrandizing, being the loudest voices in the room. It doesn’t have a community vibe to me. It is like an audience and you have a stage and you’re listening to people talk about their black and white opinions. It’s very hard to watch, especially in the name of community building. The VCs are investing in community right now and they’re obsessed with Clubhouse. I fear that this is what they think community is.
[00:11:53] Patrick O’Keefe: Let’s pause here to talk about our generous sponsor, Localist.
Localist is an event marketing platform that aggregates, automates, and analyzes all of your virtual, in-person, and hybrid events so you can grow and engage your community. Their platform allows you to centralize your event calendar, automate email and social media promotion, and measure and analyze the ROI of your events. Localist integrates with your existing tools and you can even predict future event success using their Event Reach and Event Score features.
You mentioned something interesting there, which reminded me of this quote from this Vanity Fair piece that was really good. The author of the piece is Tatiana Walk-Morris. Walk-Morris said that, “In the bubble that is Clubhouse, pseudo-intellectual monologues from powerful users can go unchecked, leaving them free to promote racist ideas under the guise of posing legitimate questions or playing devil’s advocate.”
Obviously, we weren’t even talking about sort of the racist stuff at this point, but that’s the hallmark of Holocaust denial: asking questions, playing devil’s advocate. Could six million Jews have really died? That’s how they sort of plant the seeds for what then becomes full-blown anti-Semitism. In our case, the same stuff happens, the same tactics are used by people who are trying to cultivate an audience to make that audience do something, whether that’s giving them money for a cause or selling them on flat belly tea or whatever it is. All of these things pull on some of the same psychological tactics.
The thing that you hit on there was, it’s an environment where the power dynamic is already present between somebody who has a big following and someone who doesn’t. If you got in a room with someone who no matter what they have done in their lives, good, bad, who they are, and they have millions of followers on Twitter, for example, and you’re in a room with them and they’re speaking, where does the power dynamic lie in sort of holding them accountable for their words?
I’ve seen multiple people pop up on Clubhouse in the articles I’ve read, that were people who are either mocked in other corners of the internet or have disgraced themselves through wrongdoing, through sexually abusing others, through criminal activities. They’ve been disgraced from those platforms and now all of a sudden, they’re sort of popping up again on Clubhouse and they have a pulpit there. They’re taking advantage of the early lax standards to build a following and to redeem themselves on the back of this hot platform because “Hey, Kevin Hart was there. I’m in it now. Oh, look at this other guy, isn’t he smart too?”
We all bear some responsibility for that, the participant, the listener. We have to have internet literacy. We have to train ourselves not to trust people so easily, but still, this is a platform that has a lot of money and some people are coming in and taking advantage of this early credibility they’d built up thanks to the names that gave them that money.
[00:14:51] Danielle Maveal: Yes, and you asked me earlier, what could they have done earlier on? I didn’t see a key community or trust and safety hire very early on to set the norms. I’m on two other audio apps with absolutely no problems. One is called Space and the other is called Quilt. They’re smaller, they’re growing at a smaller rate, but they have key community hires there. The social norms that are being developed are just completely different, and so are the dynamics, the dynamics of the stage and the microphone and the people they bring in. They say the Clubhouse community has ownership over that because it’s invite-only, but for sure, they see everyone who’s walking through the door and they do have a say over who comes onto that platform.
[00:15:36] Patrick O’Keefe: That’s really interesting because, as I mentioned earlier, there are so many streaming platforms, but we’re talking about Clubhouse. [chuckles] There’s another one I’d shout out, Airtime, who does a very similar thing to Clubhouse, just maybe in a little different way, and maybe the room discovery is different, but they’ve been around for like four years. They’ve raised money, they have video on their app. I was amazed the other day. I went to Twitter, and this isn’t the be-all-end-all way, but I searched for their handle on Twitter plus anti-Semitism, and there were no results. I searched for @joinclubhouse plus anti-Semitism, and there were many results.
That was the case with multiple other issues like racism and sexism and they had no results, so I’m like, “Wow.” So you could hide that stuff, you can, it can happen. It’s tough to be around for four years and have this platform that everyone knows about and eventually it leaks out, right? Eventually, we find out, but they seem to have handled it okay and I think Heather Merrick, who’s been on the show before and built out their trust and safety program, really deserves a lot of credit.
It’s that type of thing because what you’re alluding to, to me, is that people are going to say, this stuff exists, it’s everywhere. What are you going to do? How can you stop it? It’s just the internet, it’s just people, but there are platforms that are working out okay. I’ve used Space a little bit, on the conversation that you and Erin were kind enough to have me on, and it was good.
I don’t necessarily want to be on an audio call all the time with people every day, but I would use it again, whereas what I’ve seen about Clubhouse makes me exceedingly hesitant to use Clubhouse, and so it’s good, it’s important for us to know that there are platforms out there that are trying, and that don’t have these sorts of problems, or at least not problems exactly like this.
[00:17:17] Danielle Maveal: It’s really interesting, and like the investment in this, the valuation of Clubhouse is $100 million. To me, my mind is just blown. The number of people who are still joining– I’ll have a conversation with someone who is a very reasonable person and we’ll talk about Clubhouse, all the issues that have been raised. Then, on the other hand, I’ll see them ask for an invite. I don’t know, my mind is blown. We are not learning from lessons of the past. How do we make people care? That’s what I think about a lot. How do I make people care? You see me on Twitter talking about Clubhouse all the time, yet some people who are so close to me are actually logging in every day and joining those discussions.
[00:18:02] Patrick O’Keefe: It’s interesting because obviously, we care about these things more than the average person. I try to be sensitive about that. I know that nobody cares about community guidelines except Patrick, I get that. When I’m at Thanksgiving, no one cares about this stuff but Patrick. I get that, I understand. I think that our audience will get this. The people who listen to the show get this, but there might even be some people who discover the show, especially folks outside of the industry, and they listen to it and they might say, like I said, bad stuff exists everywhere. Focus on the positives.
I saw that Justine Bateman, who’s an actress, tweeted the other day, I think Christmas Eve, before Christmas, that it’s amazing, “LOL,” with the negative articles. If you search for Clubhouse, the only articles you find are negative. Basically, that’s not what Clubhouse is about. It’s about happiness and fun. Don’t let these articles take away your happiness and fun, basically. [Note from Patrick: Justine Bateman later deleted this tweet]
I was like, here’s the thing. I think that people are looking past things because it’s hot right now and because a celebrity they want to connect with is using it, not in Justine Bateman’s case maybe, but the average person’s case. To those people who are looking for a shot to get in the door, this is their shot. This is my shot. I can get someone’s attention in there. I get that and I want to be sensitive to that. It’s an opportunity for them.
I think what’s tough and confuses people is that there are good things happening on Clubhouse. It’s not all bad and they don’t want to miss out because, hey, some bad people are using it to turn others into racists or scam artists, or to scam them out of their money. It’s like, I’ve been thinking about, what do we say to those people? I don’t know if it’s to stop using the app, if it’s being more deliberate in your use, if it’s demanding more of Clubhouse, greater action, greater accountability, being louder about that and making that a part of the experience, not just promoting Clubhouse, but also promoting the problems with it? I don’t know.
I’ve had people on my feed too, just like you, and I’m like, “Dang, I like that person.” [chuckles] Dang it. I want to know them. I like that person, I like them in my feed, but they’re promoting Clubhouse here and Clubhouse just hurts me to my core, my trust and safety core. Have you approached any of those relationships, or just, what would you say?
[00:19:57] Danielle Maveal: Yes, I actually met a woman through another community I’m in, and she wanted to start a Clubhouse chat, so they often have someone who runs a weekly chat. I was talking to her about Clubhouse. She had played some of the things. I sent her a few links and I told her, why don’t you try to support Space or Quilt? She was actually trying to start a women’s only chat and Quilt is run by a woman founder. I think I did all that I could and then just today I saw her drumming up the Clubhouse chat that she was doing.
I think she’s a lovely person. I just think that there’s such a draw to it. I get that too. I’ve seen some chats on there where I’m like, “Oh, I wish I could go in, but I’m not going to.” I think the other thing that’s interesting, I saw someone on Twitter say, “I met these amazing Indigenous artists on Clubhouse. I had no idea I would meet other people like me on Clubhouse, and I’ve had amazing conversations with them and it literally has changed my life.” That’s what community is about and I love to see that.
It made me really think, what are we missing from the other platforms where this can happen? How come Indigenous artists can’t meet and build community with other Indigenous artists unless it’s on a toxic platform? That was an interesting thread I’ve been pulling on.
[00:21:16] Patrick O’Keefe: I think that’s an interesting point because having those good use cases is going to be good for Clubhouse and it’s going to allow them to skirt past these issues. Because I’ve taken a look at some of the positive things people said, and who’s there to like it? @joinclubhouse. When the negative things are said, there’s silence from @joinclubhouse. They are very much amplifying those positive things right now and hoping to get past this negativity and drown it out.
I don’t know, for me, what I’d say to people who have been vocal on Twitter is that it matters who you stand next to. If you want to have credibility in these areas, I’m not saying you don’t need to use Clubhouse, I’m not saying delete your account, get off there, but I think what you should do then is you should urge them to be better about these issues.
You should be loud about it. You should talk about it. You can say Clubhouse has been a great platform for you but you recognize these other issues. There are plenty of areas of life where it’s no longer acceptable, where it once was, for us to stand next to someone who is otherwise problematic because we ourselves derive a benefit from that. More so than ever before, we’re not doing that anymore.
I don’t love Facebook. I have a Facebook account. I’m also very critical of Facebook. There are things Twitter does that bother me. I’m critical of Twitter. I love Twitter to an extent, the connections that I’ve made in it. It’s okay to like something and also see it as having problems. I think Clubhouse is in this rabid fan base mode right now though, right, where there’s a potential to be a star, and I think, unfortunately, maybe not in the case of the folks you mentioned, I’m sure they had a really good interaction with other people just like them, but with some folks I think there’s a star-maker opportunity here where they’re afraid of losing that shot if they are critical of the app.
Because you can be an early adopter, you can be an influencer right now. Hey, I was early on Google+. I had a ton of followers on Google+. It didn’t amount to much [chuckles] in the end, but I get it. I would say if you love Clubhouse, then you should love Clubhouse enough to try to make it better, and that’s not just in taking on the labor yourself, which I think is–
Let’s transition to this because I was talking to Community Signal’s editorial lead, Carol Benovic-Bradley, about this episode. She mentioned how it sounded like the brunt of the in-the-moment moderation falls to Clubhouse’s community members, and that’s a theme you’ve mentioned to me too, about the number of people doing free labor and what they get out of it. I guess if you’re on the Clubhouse side, maybe you make an argument that it’s like Reddit’s subreddits, which people generally seem to accept.
Reddit has a bigger, more fleshed-out community team now, right, but you can make that argument. These are like individual communities people are going to own, whatever. What flies in the face of that is there was this interesting comment made by Tracy Chou of Block Party in that Vanity Fair article by Tatiana Walk-Morris, where she shared how one of the co-founders of Clubhouse, Rohan Seth, had contacted her earlier this year to talk about anti-harassment and moderation practices for their platform, and Chou had given Seth her time for free while suggesting that he pay others for their insights, what a genius idea, given that Clubhouse now had substantial funding.
The quote that stuck with me was, “Chou said she explained that moderation goes beyond adding blocking and other technical features, it also requires setting community norms and thinking through the platform’s policies. Though Seth seemed enthused about users taking on moderation roles on their own, Chou said she stressed the need for paid moderators who know what they’re doing.” What do you make of the reliance on users moderating Clubhouse right now?
[00:24:31] Danielle Maveal: Oh, my God, she’s brilliant. I love her. It really saddens me, but these creators who are now put into these moderator and facilitator roles, they’re going to make the founders and investors rich. Their invites are positioned like a gift, but really they’re creating the experiences that draw in hundreds and hundreds of listeners, and thousands in some cases, and they get absolutely nothing for it. I’m actually a little shocked because I think we’ve seen in the last few years that having millions of followers only gets you so far. A lot of these followers won’t actually pay for your products when you’re ready to sell, so it makes me sad that people are volunteering to do this hard work.
When I came on, not only were the larger creators turning into moderators and facilitators, but there were a few women who had met each other on Clubhouse and therefore loved Clubhouse, and they’re still there, and they created this magical onboarding experience just for me. I walked away from that thinking, oh my God, I can’t wait till Clubhouse hires one of them, because they had done that for, I would say, dozens, if not a hundred other people, and I haven’t seen that happen either. It’s saddening to me to see the same issues that came up with Vine and early on at Reddit; we’re seeing the same things over and over again.
[00:25:52] Patrick O’Keefe: Let’s pause here to talk about our generous sponsor, Vanilla.
Vanilla provides a one-stop-shop solution that gives community leaders all the tools they need to create a thriving community. Engagement tools like ideation and gamification promote vibrant discussion and powerful moderation tools allow admins to stay on top of conversations and keep things on track. All of these features are available out of the box, and come with best-in-class technical and community support from Vanilla’s Success Team. Vanilla is trusted by King, Acer, Qualtrics, and many more leading brands. Visit vanillaforums.com.
You mentioned something earlier I think that’s really relevant here, and I should say, I’ve worked with a lot of companies. I’ve started communities with no budget. I’ve managed communities with volunteers. I love volunteer moderation, it has a place with some communities. We’re not talking about a small forum I started as a teenager 20 years ago that makes no money, we’re talking about a company that’s been funded with 41 million-plus dollars.
There are lots of differences, lots of flavors of community out there, but when you have the process they have, which is that there’s no vetting standard for moderators. Within a subreddit, especially an established subreddit, you have caretakers in that subreddit and they’re vetting incoming moderators and who they want to take the reins in that community that they love and appreciate. That’s what happens most of the time. It’s what happens most of the time in, as you know, online communities. Moderators get promoted, and they get promoted because they are great members who value the community.
Here, you have situations where some person, some scammer, or somebody with an audience comes in and starts a room, has moderators, and the moderators are the people who just want to reinforce their scam. Right now, because it is so susceptible to that, it creates those echo chambers where, yes, you might have people taking on this labor, and some of them are doing a great job, and some of them are very selfless and doing this to cultivate a great experience on Clubhouse when the people behind the app aren’t doing it, but you’ve also got people who are there as bad moderators, reinforcing the issues, and if you question that thing, well, then the moderation is going to be that you’re removed. It almost reinforces the problem. I don’t know if it’s possible to report a bad moderator. It might just be part of user reporting, but that’s a real thing.
[00:27:57] Danielle Maveal: There’s no feedback on the moderators or the sessions. I’ve never had Clubhouse ask me about my experience when I leave a room. Definitely a missing piece.
[00:28:06] Patrick O’Keefe: Thinking about how people often approach these things when they come from a VC-backed mindset, their answer might be a reputation system.
[00:28:13] Danielle Maveal: You’re right, yes.
[00:28:15] Patrick O’Keefe: -of the best free workers we have to put in the rooms, the answer is a reputation system.
[00:28:19] Danielle Maveal: Yes, and the number of people. If there are thousands of people there, it must’ve been a great conversation.
[00:28:24] Patrick O’Keefe: We’ve talked a lot about Clubhouse, but I want to take a step back as you did earlier and think about audio-first platforms as a whole because I know you’ve spent a lot of time thinking about this. Clubhouse is getting the attention, they have money, they have high highs and low lows, but they aren’t the only one. You’ve mentioned Space, Quilt, I mentioned Airtime, but you could tell me right now that you don’t think something like Clubhouse should exist, right, and I wouldn’t say I couldn’t understand that point. I really would be like, I wouldn’t be disagreeing with you. I’d say I can see that perspective, but if these platforms exist, we’ve talked about a few things, but I want to tie it together here. What does a day one responsible moderation strategy look like? Where would you focus?
[00:29:02] Danielle Maveal: I would hire professional moderators and facilitators, and a community moderator is much different from a conversation facilitator. Those are different skillsets. You’d want to have one of each of those on your team and then slowly grow and bring people in and train them on how to host conversations. The problem with audio, too, is it’s really expensive. You’d also want to support it with tech, which I would hope, if you had an investment of over $10 million, you would do.
Now, they are recording some things on Clubhouse, but they delete it if there are no reports on that chat, which is also problematic for other reasons, so you don’t have to go back there. Recording the audio, transcribing the audio. I think we haven’t touched on the accessibility of audio, that’s another interesting thing to think about. Who can be part of the conversation, especially if it’s audio-only? These are things I would hire for, people who have been thinking through them. Discourse has been around for a long time. I’m sure there are community people who have been thinking through audio problems for a long time, so I think those would be key hires.
Or, like, my obsession recently is to have people who have been operators in the space be the founders, instead of people who just have an idea. Give people who have spent 10, 15 years working on a problem, who have rolled up their sleeves and gotten into the problem, give them the $10 million and have them figure it out. That’s my frustration with the VC-backed system.
[00:30:32] Patrick O’Keefe: It’s such a great point because we love thinking about these problems, okay, and we’ve solved them in many ways. It’s never 100% perfect, we have challenges, but we’ve seen it, we’ve done it, and this podcast is proof that we’re talking about it. You know, you mentioned such an interesting thing, which I cut from my notes because I was like, I don’t know if we have time for that, but you brought it up and I want to talk about it because it’s a great point, which is just how shocked I was when I found out they did not have live transcription.
It blows my mind that something could have so much funding but pay so little attention to accessibility. Like this is not a Herculean task now. You and I are recording on Google Meet. Google Meet has an awesome live transcription service. You put on closed captioning, my grandfather uses it to talk to my mom. It works great. I know Google, you might say that’s a large company, but it’s not– this is doable.
This is something we’re doing now and we’re doing it well, and it’s possible, so to me, that’s just a great example that you brought up of sort of the overall blind spot for empathy and consideration that exists at Clubhouse. You don’t want people who can’t hear or are hard of hearing to be Clubhouse users? Is that what you’re saying? Like those people don’t exist to you?
I can say it clearly: that’s terrible. And you know, you mentioned the role that could have in moderation, which I think is smart too. Moderating audio and video is, it goes without saying, hard. It’s tough, it’s really sensitive stuff. It’s not like text or like a forum thread, it’s different, and you know, when you make things accessible, as Wesley Faulkner said in our last show, when you do accessibility like this, when you make it accessible, you make things better for everyone, and that includes your own staff.
[00:32:05] Danielle Maveal: Absolutely.
[00:32:07] Patrick O’Keefe: You mentioned the recording thing. I wanted to come back to something you said, you talked about how they delete recordings after calls where there are no reports and that’s problematic. Talk about that some more. Why is that problematic? What do you think a service like this should do with those recordings?
[00:32:19] Danielle Maveal: Right, well, they could transcribe them and then have them saved if saving audio is too difficult, but I’m not sure. You and I have both seen that bad behavior isn’t a one-time thing, where someone comes out of the gates obviously racist and we kick them from the platform. Usually, it’s an escalation, and they start with baby steps and see how far they can go.
You need to look at someone’s behavior over time to really understand their motivations and whether they should be in the community or not. If you’re looking at reports one at a time, and if there was no reporting at that time, maybe somebody the next day goes, you know what? I felt really icky about that. I want to report it now. There’s no way to do that, so those two things are problematic to me.
[00:33:02] Patrick O’Keefe: Yes, if you don’t see it in the moment, you can’t record it, and I don’t know, I don’t want to say that everything we do needs to be recorded, but if you’re going to be a platform like this and you’re going to be audio-first, and in this case, audio-only, I don’t know how you can have a really reputable, responsible platform without having some level of recording. And, you know, you mentioned this tool where, if you report during a call, it will save the audio for them to listen to, encrypted privately.
I think that’s an example of Clubhouse’s tools improving, at least their written explanations of their tools, right, because again, it’s execution. They’re saying they’re doing some good things and hopefully, they are, but it wasn’t until November 2nd that they updated their terms of service to say that they could start recording audio tied to abuse reports. What happened before November? Presumably, they don’t know. Like that’s crazy.
[00:33:52] Danielle Maveal: Right. Yes, it’s not until something happens that they react. My favorite Clubhouse story is that somebody changed their name to Elon Musk, and when you go live in a room, it pings everyone you know and they get a notification. So MC Hammer, who’s on Clubhouse, got a ping, [laughs] and MC Hammer entered the room and spoke to “Elon Musk,” thinking Elon Musk was in there, and then he tweeted about it, [laughs] which is an amazing tweet by MC Hammer about identity in communities. I have that one saved. [chuckles]
[00:34:24] Patrick O’Keefe: Gosh, I mean, like I said, live audio, video, hard stuff and if you’re going to play there as a well-funded company, being able to shut down a bad thing while it’s happening is important and if you can’t do that, maybe it’s a clue that you shouldn’t be doing this. Like as a teenager, there’s a reason I didn’t do chat rooms. There’s a reason I didn’t launch a live streaming company. Because I thought to myself, well, let me think what’s going to happen here and can I handle that and I was like, no, actually I can’t, it’s just me, so I think I’m going to do this forum over here instead.
Of course, functionality-wise, I couldn’t have done it, whatever, the internet bandwidth and adoption weren’t there yet, but still, to me, it’s something to think about. All of their stuff that I’m hearing right now sounds very reactive, as in, after the call, but the damage will have been done. If you’re eight people with 600,000 users and you’re allowing them to just start calls as much as they want, it goes without saying that there’s bad stuff happening right now.
If you are not in a position where you have a system of escalating reports and can say, three people reported this, we need a human to look at it now, or whatever the bandwidth threshold is, then you’re just letting things happen, and you have too much money to let things happen. Taking care of a call two days later is not going to be a– it’s just not a workable solution to stopping bad things.
[00:35:42] Danielle Maveal: I’ve seen not only rooms not be shut down, but you can schedule events, and I’ve seen the names of some of them be screenshotted multiple times, things that are anti-Semitic or anti-LGBTQ, and they’ll stay live for hours if not days. Definitely, they’re reacting and not taking action as reports come in. Then again, if you’ve got a team of eight people running a platform for 600,000, I can’t imagine you can be too responsive.
[00:36:13] Patrick O’Keefe: Eight people, that’s just so frustrating. I’m a big proponent, as it sounds like you are, of the idea that if you’re handling content coming in from somewhere else, community people need to be among your first five, your first three hires. Like if your whole business is around other people making things on your platform, I think you said it so well: give money to people who have done it, not people who just have an idea about it; they should be at least part of the conversation.
Like the pitch deck, part of that should be, this person has dealt with these issues for these platforms and that’s why we think this is going to be a safe place. Of course, I don’t know how many people actually ask that. Probably some people do, but it’s got to be a very small percentage of venture capitalists. I once interviewed with some guys years ago who got like $500,000 in seed money. They had never done anything successful. They had done a few startups that, like, got a few articles on TechCrunch or something, and they wanted to redo text chat and make it empathetic.
I was like, okay, let’s talk to them. I flew to San Francisco, did this interview, which basically ended up with me giving them free advice for half a day. Their idea didn’t pan out. It didn’t happen. I don’t know what happened to the money. I think they both have regular jobs now, but like, I don’t know how to get $500,000. I probably don’t know the right person. I’m sure I could figure it out. Like, I know enough people doing this work. Maybe I could figure it out, but you guys just got $500,000 for this? Okay, so what’s my salary going to be, what’s my equity going to be here? I was like, I just don’t know. Like, I don’t know how this all works. I don’t know, that’s not a useful podcast right now, but goodness.
[00:37:48] Danielle Maveal: It’s a wild, wild space.
[00:37:52] Patrick O’Keefe: You mentioned some very helpful things that you would do on day one. I have a couple of things I’ll throw out there just so I’m not constantly only saying what’s wrong. I think that one thing I would focus on is really keeping the testing pool small, especially if I only had eight people. Super small, trusted people only, no invites, or very few, and I would know who was coming in, like you said, know who’s coming in the door. I wouldn’t be allowing more people in the door until I was ready to be responsive to live reports.
If I have a live platform and I’ve got money, I’ve got to be responsive to live reports so that I can come out very strongly against bad behavior. I’d absolutely want to throttle growth to ensure it was manageable and that it didn’t spiral out of control. I would give up the virality, I guess. I would sacrifice those big moments of spontaneity in pursuit of sustainable growth over a longer period of time.
There are all sorts of things you can do to throttle growth beyond limiting invitations. You could limit the times of day the platform can be used. You could make it so that Clubhouse events had to be scheduled, and scheduled only, not impromptu. They had to be approved, so you could review them. You could limit the number of events that could be hosted in a day. These aren’t forever solutions, but while you’re growing, they’ll throttle the growth and allow you to test things out and understand where the blind spots are.
My point is there are a lot of dials that you have access to between no events and hell. [chuckles] Between no events and unlimited events, there are a lot of things in between those two. It just seems like, we got more people, so why don’t we just let them use it? That’s just not a good idea when it comes to trust and safety principles and to ensuring that you have a good, safe, respectful platform.
[00:39:27] Danielle Maveal: I’m confused because early on at Etsy, we had chat rooms, we actually had a video space where people could pop in and put their video on without having to ask permission, that was wild, and forums and teams, like groups. There were about three or four of us, and we were online at all times. We knew what activity was going on at all times. A lot of spontaneous things did happen, but we were there and we knew it, so we could shut it down.
It wasn’t just us, the community team, but even our marketing team or customer service teams, we had engineers pop into forums and we just cared. We cared about the conversations that were going on and we cared about setting social norms, and we showed up. The spontaneity was there, I don’t think you have to give that totally away. It’s just you’re paying attention and you’re not focused 100% on growth, of course, we were interested in that, but we also were focused on making sure this was a human place where people felt safe and we were having constructive conversations.
[00:40:33] Patrick O’Keefe: I think the key thing you said there was you cared. I hate to be cynical. I hate to be so rude as to imply people don’t care, because maybe they do, but man, we are how many months into Clubhouse now? Six, seven? You said you were on in the spring. It’s December 31st as we speak. They’re hiring for three community roles, really, at the moment. One’s called Operations Program Manager, but it’s community-ish. Gosh, with that money, there are so many more talented people that you could lure over with equity and cash right now than you could have six months ago. Not only is there a ton of talent available, there’s a ton of talent on the market right now.
If I was in a position where I was hiring a team, I would have just a great time because I’d be able to hire from a very talented pool of available professionals. Even then, you could go to whoever you want and throw money and equity at them and say, “Hey, come over here, fix this. You’ve worked at a streaming company, you’ve worked at an audio company, fix this for us, figure it out.” They’re going to go into 2021 with– Maybe they have a person now, I don’t know. If they don’t, they’re going to go into 2021 without that person, which is just a great example of how not to do it.
[00:41:37] Danielle Maveal: Patrick, they did hire an incredible woman, by the way, from Gucci to run their creator pilot program.
[00:41:44] Patrick O’Keefe: You had me going for a second. Okay.
[00:41:47] Danielle Maveal: [chuckles] A key hire…
[00:41:52] Patrick O’Keefe: Which is going to incentivize managing your own channel, still. If you read between the lines there, I assume that’s where it’s going here. The best moderators are going to become curators and tastemakers on Clubhouse and hopefully, they’ll use that power to push for more changes. Again, we’re throwing money at that side of the table, and I’m not saying you shouldn’t throw money at that side of the table, but it’s amazing how far connections and influencers can carry you as a product.
Because I don’t think of Clubhouse as being– Maybe this is putting it down and being a little unfair, but I was thinking about Ustream when I was researching Clubhouse. Ustream.tv. Years ago, I think it was bought by IBM, and now it’s not a public product anymore, it’s whatever their cloud video thing is. There was a time Ustream was really popping. It was really a popular place. A lot of celebrities were using it, they were hosting live streams, and I never heard about this type of stuff on Ustream. I’m sure it existed. Ustream was really an innovative product.
Before Ustream, I don’t know that people were hosting their own video streams. I’m sure they were. There were probably some early people there, but Ustream really hit it. I know a million people doing audio calls. It’s not like it’s that unique of a concept here. They put it into a very infectious display, people are on it, this celebrity is here, this celebrity is there, but the functionality of Clubhouse, I don’t find it to be all that innovative. Maybe I’m being unfair there. I don’t know how you feel about it.
They really had a format that existed and made sense. Really, what did they bring to it? Some nice design and some celebrities? Good engineering? I guess? Those are all things that have value. What was laid on top of here beyond scarcity? What’s the value prop beyond access to influencers or is that just it? Because there’s a great opportunity here for someone to throw millions of dollars at a space that actually makes trust and safety first, and that’s clearly not what this was.
[00:43:41] Danielle Maveal: Right. The biggest thing they had going for them was their growth tactics, making it invite-only, and having VCs in a ton of conversations where people just wanted to be in front of them and be known by them. That’s going to be in the DNA of your company, and then is that going to scale to my cousins in Minnesota? Probably not. I think they created this model, and somebody will come and create a better version of it that is more inclusive and, hopefully, has better moderation, and it feels like a safer, more inviting space where you can actually feel like you can get on “the stage” and have a voice and speak to someone about a real topic with your real self, not your public self. If someone creates that, then that could actually become huge. I think there’s definitely a ceiling to Clubhouse because of the way they set it up. I’m hoping, but that’s my gut feeling about it.
[00:44:38] Patrick O’Keefe: Well, thank you, Danielle. I’ve really enjoyed the conversation. I appreciate the TikTok support as well. It’s been a pleasure and thanks for sharing all the audio-first knowledge with us today.
[00:44:47] Danielle Maveal: I will exchange gripes with you anytime, Patrick.
[laughter]
[00:44:53] Patrick O’Keefe: Happy new year. We’ve been talking with Danielle Maveal, consultant and founder at Joyce. Find Danielle at daniellexo.com and subscribe to her Substack, Community Feelings, at daniellexo.substack.com.
For the transcript from this episode plus links and highlights that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead and was a special contributor to this episode. Until next time.
[music]
Your Thoughts
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.