[00:00:04] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Sponsored by Vanilla, a one-stop shop for online community. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[00:00:25] Patrick O’Keefe: Black Lives Matter.
As community professionals and hosts, we have the power to cultivate thoughtful spaces online. We serve communities and, if you’re listening to this, I doubt you’re serving racists. Systemic problems can feel overwhelming, but small things make a difference. Your community, and how you manage it, regardless of its size, can be a part of the solution. I encourage you to think about that as you make choices that shape these platforms.
On May 28th, a couple of days after Twitter added a fact-checking notice to one of his tweets, Donald Trump signed an executive order targeting online communities and platforms. For years, I’ve talked about the importance of Section 230, especially to US-based community builders and how threats to it were looming from both Democrats and Republicans. Words have turned to action and here we are. I believe that holding Trump accountable for his rhetoric and fighting white supremacy are the same fight.
This executive order is designed to stop you, me and big platforms from doing exactly that. On this episode, we’re talking with attorney Anette Beebe about the resulting fallout and answering some of your questions. Thank you to our Patreon supporters for backing our show. This includes Heather Champ, Carol Benovic-Bradley and Maggie McGary. If you enjoy episodes like this, we’d love to have your support too. Visit communitysignal.com/innercircle for details.
Anette Beebe has worked in the legal field for over 21 years, spending close to eight years as in-house counsel, now general counsel for Xcentric Ventures, which operates one of the oldest consumer complaint forums, RipoffReport.com. Anette formed a solo practice in late 2012 that caters primarily to businesses that operate online and individuals who have concerns about online content. In her spare time, she works to educate youth and adults about repercussions from internet use through public speaking and online courses through her company, Smarter Internet Use, and blogs about fighting fair on the internet.
In April of 2019, she was involved with efforts to maintain the Texas Citizens Participation Act, the Texas anti-SLAPP law, and had a hand in the redraft of the modified law. Anette is a member of the Internet Lawyers Leadership Summit Group, a co-chair of the Digital Communications Committee within the American Bar Association’s Forum on Communications Law, a member of the First Amendment Coalition and a member of the International Association of Privacy Professionals.
Anette, welcome to the show.
[00:02:40] Anette Beebe: Hey, thanks for having me. I’m looking forward to this today.
[00:02:43] Patrick O’Keefe: I want to get into some specifics, but first, what are your overall impressions of the executive order?
[00:02:50] Anette Beebe: Oh, gosh. From a high level, I’m a little frustrated by it, but there has been sort of a long history, a compounding of attacks on Section 230 in different ways from different groups. Everyone has their own reason why they think Section 230 should be abolished or modified. The problem is we don’t typically agree on what that reason is. This was just another layer on top of prior issues, as I see it.
When I read the first draft, I got a little concerned about some of the things that were in it, but I was also reading it at eleven o’clock at night as I’m lying in bed [laughs] because it came out pretty late, when the “leaked version” was actually circulated. Then, of course, the internet erupted and everyone’s like, “How can he do this?” and those kinds of things. Now that I’ve had some time to digest it, I am less concerned about it in the immediate term, for at least some of my clients, but I’m not so encouraged by what’s perhaps going to happen in 2021.
[00:04:00] Patrick O’Keefe: Let’s talk about that. I guess two parts. What damage has it done already, do you think? Like, what’s the fallout now? Confusion? What’s happening right now, and then what do you think the fallout will be in 2021?
[00:04:10] Anette Beebe: What I have seen is, one, of course, there’s always confusion about Section 230, what it says, and its purpose. For some reason, and I just don’t know if people have not actually read the jurisprudence on this stuff or what, but there’s a huge history that pretty much outlines in the law what its purpose is. Then there’s the concept that platforms have to be neutral, those things that they’re going for.
The claim is that as soon as you decide to get involved in editorial decisions, it’s no longer applicable. I’m like, “Where are you guys pulling this from? This is actually not at all what it says.” You start with a bedrock of people misunderstanding it, and despite all the scholars and people who deal with it every day coming out and saying, “No, no, no, this isn’t what it is,” people just don’t pay attention to that.
Unfortunately, you already have people who have a very deep misunderstanding of what the law is to begin with. Then you get this executive order, and we have our president tweeting out that we’re going to be punishing or taking down or affecting Twitter in a way that is going to affect speech. Everybody gets this executive order not knowing really how to read it, not really understanding what he can and can’t do, and not really understanding how the DOJ, the FTC, and the FCC actually work. Just taking the president at what he has stated, they freak out. I represent clients who do have UGC platforms.
The immediate thing is they’re coming to me saying, “Do we have to do anything?” One, we’ve got platforms already getting nervous because, even if they’re the owners or the operators, they’re not the legal department, and they have some concerns because they haven’t read it. Then you have these people who feel that they have been censored in some way. I know that a couple of other counterparts of mine who also represent UGC platforms have seen threats of legal action too, and that’s expensive for websites.
It’s like death by a thousand duck bites when they start doing that, because now platforms are having to engage with their legal counsel, which, even if counsel is in-house, still costs money. It’s still resources being expended. With all the stuff that’s going on, this is one of the things that I have seen: people already starting to threaten litigation, meritlessly, against platforms for either curtailing speech or not doing what they think the platform should be doing.
[00:07:08] Patrick O’Keefe: I’d like to take a pause to welcome our generous new sponsor, Vanilla.
Vanilla provides a one-stop-shop solution that gives community leaders all the tools they need to create a thriving community. Engagement tools like ideation and gamification promote vibrant discussion and powerful moderation tools allow admins to stay on top of conversations and keep things on track. All of these features are available out of the box, and come with best-in-class technical and community support from Vanilla’s SuccessTeam. Vanilla is trusted by King, Acer, Qualtrics, and many more leading brands. Visit vanillaforums.com.
[00:07:40] Patrick O’Keefe: As I was developing this episode, I put a call out on our Twitter profile asking community pros to share their concerns with me so that we could talk about them. Amanda Petersen brought up this general idea of moderation waiving the protections of 230. Now, traditionally, this is a thought that we have to combat regularly. You just alluded to it: the idea that you shouldn’t take any action because if you take any action, that’s when you become liable. Obviously, that’s not so. So says Section 230, but do you think that the ultimate goal of Trump’s efforts, and those who are backing these changes, is to effectively disincentivize moderation?
[00:08:19] Anette Beebe: That is definitely one way that it could go, because from that side of the table, what you hear is that platforms censor too much. It typically revolves around what one may consider conservative speech. If there is suddenly going to be liability against platforms for “censoring too much,” then they’re going to go, “Great. Then we won’t censor at all, but get ready for a really, really ugly internet.”
[00:08:49] Patrick O’Keefe: I think this is one of the issues that’s come up most commonly over the years when I talk to community pros and even people who are interested in starting communities. It’s probably up there with the top two or three most misunderstood things about managing an online community, from the outside: this idea that people push around sometimes that if you do take any action, then you’re going to become liable for everything else.
In the US, we’re unique in a sense, in that we have this piece of legislation that empowers us to take action and to take some responsibility, without having to worry about what remains or what we might’ve missed, as opposed to taking none at all.
[00:09:26] Anette Beebe: Right. Getting rid of the moderator’s dilemma, a phrase that is utilized a lot in our space. It removes that concern. It’s very tough to get content moderation right 100% of the time. It’s very subjective. We have to remember that a lot of these platforms, especially the big ones, aren’t just operating here in the US, they’re global. Global norms may be very different from what we’re used to here in the United States.
[00:09:55] Patrick O’Keefe: That subjectivity is the problem. Josh Hawley, the Senator from Missouri, has made an assortment of public statements tied to Section 230. Oftentimes, it’s supportive of the president. To me, I’ll just say it: I think the youngest sitting senator doesn’t understand the internet. One of the things that comes to mind immediately as we’re talking now is this idea of political neutrality.
The idea is that a board of commissioners of sorts would decide whether or not a platform is effectively politically neutral. Just the way that sounds to the layman might seem like, eh, whatever, but in the actual interpretation and application of how that would work, who’s deciding what’s politically neutral? What the heck even is politically neutral? I’ve read various legal blogs and thoughts from people like Eric Goldman, and it’s like, what does it even mean to be politically neutral? It would effectively make the situation more confusing, I think, than it is right now.
[00:10:58] Anette Beebe: I believe that to be 100% true. If you think about any of the advertisements or any of the “political speech,” all of it is subjective. There’s always someone who says we are right, and you are wrong. In the world of politics, very rarely does everybody looking at the same set of circumstances come to the same result. Consequently, this is where we get all of this debate. I don’t think that there is such a thing as a neutral platform.
You can’t have a “neutral platform.” If you just allowed everybody to post anything and everything they want, there are going to be a lot of issues with that. Going back to the politics, I just don’t see that there’s any way to have a politically neutral platform. If that were the case and everyone saw everything the same way, why do we have two, or technically three, I suppose, if you include the independent runners, different concepts of our political parties?
[00:12:07] Patrick O’Keefe: It’s sort of a phantom, I think. I have conversations with members somewhat regularly, and I’ve had them many times over the years, where they want to do X and they’re not going to be able to do X, not on this platform, not here where I am managing it. The conversation is essentially, “I understand you want to do that thing, but unfortunately, that thing can’t happen here, so we can’t give you what you need in a platform. If that’s your expectation, if that’s what you need, then we can’t do that.”
Now, there are other platforms where that may be, and in most cases will be, allowed. There are other platforms out there that can give you what you need, because there’s a diversity in the platforms that exist, but it’s not something that we can do here. There are a lot of different approaches to managing communities and how things can work. I think that it’s tough. We have this glossy approach to moderation, where we’re afraid to say what moderation is. It’s removing stuff, [chuckles] it just is. It’s saying you can’t do certain things, and that’s okay.
It’s okay for you not to like it. It’s okay for you to say, “I’m not going to give you my money or my time. I don’t like you as a platform. I’m going to go find another platform.” The answer isn’t expecting every platform to approach every issue the same way; that’s impossible, a phantom. There are some exceptions, sex trafficking, that sort of thing, where obviously we should be able to agree, even though the legislation tied to some of those things can sometimes create arguably more harm than good, but for most issues, it’s just a matter of: this is how this platform works. I don’t think that’s a bad thing.
[00:13:43] Anette Beebe: No, I don’t either. I keep waiting for someone to get innovative. At one point, Myspace was super cool too. Then things changed and advanced. If the new argument is that politicians want to have a platform that can reach people, then perhaps the government needs to create its own. Now, they’re not going to be able to moderate it, because then you’re going to have a First Amendment issue with blocking people and that kind of stuff.
If they want to have an outlet where they can tweet out whatever version of that might be, then so be it. I just don’t think that they have to have access to every platform. If they don’t like the curtailing, well, then we need to get innovative and come up with something different that does meet, like you were saying, the needs and wants that they would like.
[00:14:36] Patrick O’Keefe: Are you expecting legal challenges to the EO itself or will those come later when there are actual procedural steps taken?
[00:14:43] Anette Beebe: This has been a question that a whole host of us have been discussing, and I am not entirely sure that we are going to see anything challenging the EO at the outset, as of right now, because if you go through and read it, and Eric Goldman did a great blog on this just recently, as he points out, there are only so many asks in there.
There’s not a whole lot of teeth to it. It outlines some requirements, or requests even, I guess I should say requests, of the DOJ and of the FTC and the FCC, and they’re free to maybe not do anything. Unless and until they agree to take action and try to do something or impose something in particular, I don’t necessarily know that it makes a whole lot of sense for any individual platforms to say anything, because right now it’s still basic.
[00:15:41] Patrick O’Keefe: It’s essentially one man’s opinion, more or less. One of the things that stood out to me, and I read the draft and then the whole finished order, was the approach to (c)(2)(A). A friend texted me and said that he would, “love to know the impact of deleting content that doesn’t fall under Section 230 (c)(2)(A), like political content, religious content and spam. Removal could be considered editing, which would make you a publisher under the executive order.” That’s him speaking.
Just to take that as a concern, I know I’ve seen that a lot on platforms, and I know you have too. To read (c)(2)(A) to our listeners, I’ll quote it here because it’s not that long: “Any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd,” and this is a word I haven’t said out loud, ever, “lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” That’s (c)(2)(A).
As community facilitators, we remove all sorts of content and behaviors that we deem harmful or inappropriate that may not sound like they belong in that list. For example, requiring that all text be in English so that moderators can properly read and review it, or removing generic political discussion, as I have on KarateForums.com, because it’s a martial arts community.
We don’t talk about politics there because it’s a drain on moderator time. If I had a politics forum on that site, it wouldn’t be about martial arts anymore. It would be 80% of moderator time sorting through people’s opinions on politics. Or take what Ravelry did, somewhat high profile, last year, when they banned pro-Trump knitting patterns. There’s a lot of that content, and just general spam, that gets removed. What about the moderation of those types of content?
[00:17:28] Anette Beebe: What was interesting, and one of the things that stuck out to me, is that last part that you read, the “otherwise objectionable” language and the “whether or not such material is constitutionally protected” clause. That last part was not quoted in the EO, which I found to be very interesting, because to me, as I’m reading it, regardless of whether or not it’s protected speech, you can take it down. That’s a plain reading; that’s what it says. That’s pretty much how this stuff has been interpreted going forward, because you can moderate whatever.
I explain this to my layman friends all the time: it’s their house, their rules. Or, for my friends that like to have cocktails, it’s their bar, their rules. Are they going to allow the unruly patron to come in there and just be crazy even though he’s paying his bar tab, or are they going to go, “Eh, you’re offending some of our users and we like our users, so we’re going to boot this guy out”? It’s that kind of concept. That was one of the things about the EO that just stuck out to me: they cherry-picked the language that they wanted to put into it and didn’t put the full definition in.
[00:18:51] Patrick O’Keefe: Just to repeat that as it’s written: you can remove those things whether or not they’re constitutionally protected, and that’s okay. So, my friend, don’t worry about removing spam or general religious or political content. You’re okay right now. They kind of misleadingly quoted that section in the EO because the full text may not fit the narrative.
[00:19:15] Anette Beebe: That’s exactly why I think they didn’t put it in there.
[00:19:17] Patrick O’Keefe: There are some exceptions to 230. Obviously, there’s FOSTA, and that’s a whole can of worms that I’m not going to open here, but, reading from the Goldman blog post that you cited: IP, federal criminal prosecutions, ECPA, and FOSTA. Beyond those four things, you’re good to remove spam. I don’t have to allow people to have MAGA hat avatars in my forums.
[00:19:37] Anette Beebe: If that is what you don’t want, and that’s the general group consensus, you can make it what you want. Every business is doing just that, isn’t it? Making it what you want and uniquely yours. You don’t have to cater to the entire population.
[00:19:54] Patrick O’Keefe: It’s interesting to me that the thing that seems to have been the last straw was Twitter adding a small notice, like, six, seven, eight words, beneath Trump’s tweet. It’s not quite the typical 230 use case that most community and platform operators will be aware of or run into because most of the time, it’s about something being removed or something not being allowed.
It made me think of, for example, Google. I think they still do this, but in the past, I’ve searched and I’ve seen a little notice on links in the results that says, essentially, “This triggered the virus scan,” or “This link is unsafe.” They added a notice to those links because they failed that filtering, whatever that filtering tech was, from Google or McAfee or– There are browser plugins, actually, like a McAfee plugin, that will add something to links.
I know that because, for some reason, my grandparents always have them enabled when I go to their house and open their computers, but that’s a thing that exists. There are little alerts that are added to links around the web. I was curious, is there anything unique to this idea of adding notices that we should be concerned about?
[00:20:56] Anette Beebe: Currently, the way the law sits, no. Section 230 has always been about reducing liability, immunizing platforms for putting stuff up or taking stuff down. Adding more speech takes it away from Section 230 and starts getting into your First Amendment stuff. There’s nothing wrong with adding more speech. What are they going to be held liable for, having a different opinion or suggesting that something is wrong?
Obviously, it started with what the president tweeted out about the mail-in ballots and whatnot. They have a different opinion, that, “No, this isn’t likely to cause fraud,” or whatever, and they can have that opinion. What would someone come and sue for? From a practical perspective, what are they suing for? Having a difference of opinion is their First Amendment right.
They can have a difference of opinion with the president. Thank you for living in the US; this is what we can do here. Adding additional content is fine. The one thing where they have to be careful is modifying the original content or otherwise materially contributing to content in a way that looks like they’re adopting it, in case that content turns out to be wrong.
I’m using this more in terms of defamation, because that’s really where we see most of these issues, in defamation cases and then the spin-off ancillary torts that tend to come from them, be it invasion of privacy or interference with contractual relations or those kinds of things. Them just saying, “Hey, we disagree with you,” or, “Hey, users, we understand that this is what he’s saying, but here are some other sources for you to consider,” that’s just more speech.
[00:23:00] Patrick O’Keefe: Thank you for taking that down. The short version of it is: adding stuff, fine; removing stuff, fine; editing stuff is murky, if you change the wording of it, how it’s directed, what someone’s saying, the meaning of it. Some moderators like to edit content. I have always tried to avoid that and tried to tell people just to pull it, but I can see there might be use cases where, hey, someone has a good post and it’s 1,000 words, and maybe there’s just a bad link at the end, and they’re like, “Okay, let me just edit out that link and not pull the whole thing down.” I can understand there might be use cases, but editing is a tricky spot.
[00:23:32] Anette Beebe: Editing can be, especially if it’s materially contributing to the content or materially altering it. Certainly, removing a link isn’t necessarily materially altering it. I haven’t seen any cases where that has been an issue. There may be a Section 230 case that I’m just unaware of, but I’ve never seen that be an issue, or fixing capitalization, some of those basic editorial things that one would do. Those have always been fine. It’s when you change, like, “is a bad guy” to “is not a bad guy” or whatever. That would be materially contributing or changing things up.
[00:24:18] Patrick O’Keefe: Since we’re taking down common misconceptions, and I’ve talked about this enough, but you’re an attorney, I’m not. Do you want to just briefly hit on publisher versus platform?
[laughter]
I’m sorry to ask, and I don’t even know that it takes more than a sentence, but just talk about that real quick, and that confusion, because even people who know what they’re talking about and talk about it online will constantly get inundated with publisher versus platform, to the point where I think some of them might even start to question themselves, because it is such a common drum that gets beaten, that there is somehow a meaningful difference, no matter the example that they’re given.
[00:24:53] Anette Beebe: [laughs] The publisher versus platform question. In the context of UGC sites, there really isn’t a difference. I try to explain it to people in terms of who’s creating the content. A lot of your scholars will talk about the bookstores and the magazine stands, but when you’re a platform, you’re allowing other people’s content. You’re not creating it, and you’re not even necessarily putting it out there as fact. You’re basically just showing the information.
I like to ask people, “Who created the content?” If the answer is not the platform, then the platform doesn’t have liability, because it isn’t the publisher. The publisher is the one who comes up with it, who creates it, who is materially involved in that information.
[00:25:47] Patrick O’Keefe: Thank you.
[00:25:47] Anette Beebe: If that makes any more sense.
[00:25:49] Patrick O’Keefe: This came up just the other day, because someone I know on Facebook asked, “What’s going on with this?” I said, “Okay, here it is. Here’s what happens. I deal with this. This is my life, basically. This has been my profession since I was 13 years old. I started moderating two years after Section 230 was passed. I’ve been at this since ’98. This is what it is.” Then someone’s like, “No, you can’t do that. They’re a publisher.” I’m like, “Okay, so The New York Times publishes articles, right? Okay, so you’re with me so far. Right? Okay, so they publish articles. That’s them. They’re liable for that stuff.
Now, they have comments beneath those articles, okay? That’s different. That’s not them, and so that’s where Section 230 kicks in.” That’s an easy-to-understand example, and yet the response I get is, “No, that’s wrong, not how it works.” They’re like, “Okay, well, I’ll see you later.”
[00:26:31] Anette Beebe: Actually, I had this exact debate over on my Facebook page with someone who was commenting, “Well, as soon as you start editing, or as soon as you start choosing what to put up.” I’m like, “No, that’s different.” It’s not them thinking up the content. It’s not them saying, “Hey, we’re going to run a story about Sally’s Dog Service down the street,” and figuring out what’s going in it.
Comments are somebody else’s unique thoughts that are going into that. That’s why platforms are not liable, but it is hard for some people to comprehend that difference. I keep waiting for someone to come up with a cartoon or a little pictograph or something that is so incredibly silly, yet captivating, that it will help people understand. If you ever see one out there, be sure to send it to me.
[00:27:25] Patrick O’Keefe: I think maybe you should. Get with an animator and you should make it. Put your Twitter name on it and let it spread through the internet.
[00:27:31] Anette Beebe: That would be really funny. [laughs]
[00:27:34] Patrick O’Keefe: I’ve mentioned Eric Goldman a couple of times. He’s a law professor and a past guest on the show, and we’ll link to the article he wrote in the show notes, but one of the things he said in it that I thought was interesting was, “How should social media services navigate this situation with downsides in every direction? As WOPR,” which is the supercomputer from the movie WarGames, “concluded, ‘the only winning move is not to play.’ If you can’t police them and you can’t stop them from lying, the least worst option for all social media services will be to dump all accounts by politicians or political candidates and exit that industry segment.” Less of a legal question, but what do you think about that as a solution?
[00:28:06] Anette Beebe: It absolutely is a solution, but then here’s the other side of that concept. Let’s say we get rid of all the politicians; no more politicians are allowed on there. Are they going to dump all of the users who have discussed politics? Because that very well may be an issue too. The argument is still that these platforms allegedly curtail conservative speech. It’s not just politicians, per se. It’s the everyday users who want to have dialogue on the platform that discusses politics.
Then are they going to start moderating out anything that discusses politics? And I don’t want to say it’s just an election year, as I feel like for the last four years, every day has been nothing but a discussion of politics. If you’re going to do that, are you going to alienate a good portion of your users? Remember, for many of these platforms, it’s still a business. They don’t want to get rid of everybody. Is it a solution? Sure, but are there repercussions to that solution? Absolutely, and that’s something that they have to think through.
[00:29:16] Patrick O’Keefe: I did see this tweet that I thought was funny, and true: without 230, who would host a guy who’s recommending people take untested pharmaceuticals? Who would be able to take that burden on if you were liable for hosting speech like Trump suggesting people take hydroxychloroquine? I know I wouldn’t. You couldn’t accept the burden that comes with drug recommendations.
[00:29:39] Anette Beebe: No, and a lot of speech would start getting curtailed. Without Section 230, we would likely get away from the internet that we know. We’ve never seen it because it’s never been tested, but we are so used to being able to freely communicate and share back and forth. If there becomes liability for that, then we’re likely going to go back to the old-school broadcast industry, where it’s a one-way conversation.
[00:30:09] Patrick O’Keefe: Where do we, meaning community pros, people concerned about this, where do we go from here? Taking a longer view, both Joe Biden and Trump have said they want to revoke 230. The law is getting it from, seemingly, all sides. I’m not a one-issue voter. I don’t vote based upon 230 or any one issue, generally. There are a lot of problems in the world. This is obviously an issue that matters to us, those of us who do this work. Is the writing simply on the wall? What can we do?
[00:30:36] Anette Beebe: It depends on who you talk to. I know some practitioners are like, “We are totally screwed in this situation.” Then there are other practitioners that still have hope, but for those that have hope, everyone needs to be involved and taking a position, whether it’s writing letters to their representatives explaining why things are important. I’ve noticed that a lot of these people who talk about it in Congress are a little bit old school, and perhaps just don’t understand, or are not taking the opportunity to more fully understand, the bigger picture.
I think it’s going to be really important for a lot of people who enjoy this way of life to get involved, notwithstanding the problems, because indeed there are bad actors, but there are always going to be bad actors. I have not seen any empirical studies that would suggest, one way or another, that the harms we hear about are so big that this is actually such a huge issue. We’re on a 24-hour news cycle, and people love dirty laundry. They are going to flock to the dirty laundry, and they’re going to continue to hear about it, but the main case that continues to be brought up against 230 is Herrick v. Grindr. I’m not saying that there aren’t more similar cases, but that’s one person.
How many issues are there really, versus how much are we perceiving it to be a big issue because we’re hearing about it all the time? I would love to see some studies done that actually weigh these types of issues out, so we can make better-informed, more educated decisions before we just put pen to paper and start legislating without really having a full understanding. Going forward, I think that if people enjoy this, it’s important to talk to your government. It’s important to try to get involved in explaining why the platforms have utility and what the impact would be, because I think there are a lot of people who chatter about it, but they’re not really taking a huge stand and getting super active.
[00:32:50] Patrick O’Keefe: Anette, thank you so much for joining me today. I’ve really enjoyed the conversation.
[00:32:54] Anette Beebe: Thank you. I’ve appreciated being here and having your ear to vent today.
[00:32:58] Patrick O’Keefe: We have been talking with Anette Beebe, principal attorney at Beebe Law, PLLC. Find her at beebelawpllc.com. For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead. Thank you for listening and stay safe out there.
[00:33:30] [END OF AUDIO]
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.