Kinks vs. Crimes and Gender-Inclusive Content Moderation at Grindr
Bodies aren’t moderated equally on the internet. Content moderation efforts, especially those at large, mainstream platforms, can suffer from policy-based bias that results in moderation centering a cisgender gaze. This reinforcing of heteronormativity can leave some of your most vulnerable community members – and potential community members – feeling alienated, ostracized, and simply unwelcome.
Last year, in her role as CX escalations supervisor at Grindr, Vanity Brown co-authored a whitepaper, Best Practices for Gender-Inclusive Content Moderation. Insightful, with a straightforward approach to making content moderation just a bit better, the whitepaper also struck me as a validation of the good, thoughtful moderation that has been going on for a long time.
Vanity joins the show to talk about these efforts, which are tempered by a realistic acknowledgement of the limitations of this work, and how our need to be in other places (like app stores) can often slow down the progress we’d like to make.
We also discuss:
- Why it’s not our job to guess the gender of our members
- The state of AI trust and safety tools
- ChatGPT, Midjourney, and how much to worry about them
Big Quotes
How bodies are moderated differently online (2:16): “We want folks to express themselves and their sexuality joyfully, without judgment. Of course, without any harm. But what does that look like? … There traditionally are [community] guidelines for females and guidelines for males, but the world is changing and folks are becoming more in tune with who they are, and we want to be able to treat them equally and let folks, especially I emphasize our trans users, who are uploading photos … and if they are showing the top, then they’re considered a woman if they have female-presenting breasts versus male. There are just a lot of nuances there that we saw as we were moderating content from a community who is very fluid with their gender expression.” -Vanity Brown
When do kinks create a moderation issue? (6:38): “[Kinks vs. crimes get] sticky when the kink looks like a crime. … Everything is about sex and kinks at Grindr. With this mass of kinky stuff, which of these things are harmful? I often echo that, in my work, I’m always driven … to do no harm. At the end of the day, are we harming someone? … Do we have a responsibility to protect them and keep them safe? As we continue to build trust with the community, we have to realize that folks are adults, too.” -Vanity Brown
Empathy sits at the core of good moderation (14:38): “If you can’t be empathetic for the things you are not … then you’re not really doing good thoughtful community moderation, trust and safety work. … Ultimately, if you want to be truly great at this work, you have to protect the people who aren’t you.” -Patrick O’Keefe
What can community pros learn from dating apps? (24:23): “[Community, moderation, trust, and safety pros] can learn from dating apps on the level of how personal and sensitive dating apps are in the content you’re sending back and forth. Folks using dating apps, a lot of times their heartstrings are attached, and their heartstrings are attached on a dating app, but not necessarily Amazon or shopping at Macy’s. … It’s just important to look at folks with a microscope and treat them with kindness as those in dating apps hopefully are doing when they’re handling their customers.” -Vanity Brown
About Vanity Brown
Vanity Brown is the CX escalations supervisor for Grindr, where she has worked in trust and safety for over 2 years, following more than 7 years at eHarmony. Vanity manages an escalations team of specialists devoted to handling the most complex cases that come through Grindr’s support channels.
Related Links
- Vanity on LinkedIn
- Grindr, where Vanity is CX escalations supervisor
- Best Practices for Gender-Inclusive Content Moderation whitepaper, co-authored by Alice Hunsberger, Vanity, and Lily Galib, which I found via Juliet Shen
- Grindr’s community guidelines
- OpenAI’s efforts to identify AI-generated text, which were only able to identify “likely” AI-written text 26% of the time, a bit more than the approximately 10% I mentioned during the show
- Love Light Community, a youth choir founded by Vanity, dedicated to “enriching the lives of youth and families in underserved communities through the transforming power of music and the arts”
- Love Light Community on Instagram
Transcript
[00:00:04] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Here’s your host, Patrick O’Keefe.
[00:00:18] Patrick O’Keefe: Hello and welcome to Community Signal. On this episode, we’re joined by Vanity Brown, who works on moderation, trust, and safety escalations for Grindr, which describes itself as the largest social networking app for gay, bi, trans and queer people. Our conversation is about not guessing the gender of your members, separating kinks from crimes, and the state of automated moderation tools.
Thank you to Carol Banovic-Bradley, Jenny Weigel, and Phoebe Venkat for supporting our show on Patreon. Our Patreon supporters tend to back us over a long period of time, and that means a lot. I’ve been sharing more bonus clips again recently, so if you’d like to learn more, please visit communitysignal.com/innercircle.
Vanity Brown is the CX escalations supervisor for Grindr, where she has worked in trust and safety for over two years following more than seven years at eHarmony. Vanity manages an escalations team of specialists devoted to handling the most complex cases that come through Grindr’s support channels. Vanity, welcome to the show.
[00:01:07] Vanity Brown: Thank you. I’m so excited to be here.
[00:01:10] Patrick O’Keefe: It’s a pleasure to have you on. The way that I found you, Vanity, was through a post by Juliet Shen on LinkedIn, where she shared a link to the Best Practices for Gender-Inclusive Content Moderation whitepaper that you co-authored with Alice Hunsberger and Lily Galib. I enjoyed reading it. I felt like it included solid advice that really validated the good work of community, moderation, trust and safety that’s been going on since the dawn. It was a really good read.
In that whitepaper, there is a set of belief statements for Grindr, and one of them is that “Grindr believes that it is not our role to guess the gender of our users.” Can you talk a little bit about that and how it manifests in your work?
[00:01:44] Vanity Brown: Absolutely. First of all, thank you for your kind words about the white paper. It is definitely something that I never saw a bunch of folks reading, but I’m so glad that you and others have found it so useful. I think that that statement is rooted in the fact that folks are looking to connect, folks are looking to hook up, and a lot of what we do is valuable to the LGBTQ+ community, because we are an LGBTQ+ app. I think that one of the biggest considerations was our trans users, and we are welcoming and open to folks regardless of their gender or preference.
We want folks to express themselves and their sexuality joyfully, without judgment, of course, without any harm, but what does that look like? I think that sexuality and gender is such a sensitive topic for folks, but for me, I noticed that there traditionally are guidelines for females and guidelines for males, but the world is changing and folks are becoming more in tune with who they are, and we want to be able to treat them equally and let folks, especially I emphasize our trans users, who are uploading photos and looking to connect, hook up, whatever, and if they are showing the top, then they’re considered a woman if they have female-presenting breasts versus male. There are just a lot of nuances there that we saw as we were moderating content from a community who is very fluid with their gender expression.
[00:03:08] Patrick O’Keefe: Yes, I thought it was really interesting that you talked about being neutral, or at least consistent, as much as possible across different genders, and then, if you can’t be in some cases, really relying on how the member identifies and how they choose to experience your product, and a little bit there also in the community guidelines. I noticed too, when I was reading, that there’s a struggle we have, right, between what we want to do, or what we would do in an ideal world, and then what we have to do to be in the app store, to be in someone else’s playground that they’re moderating and they’re responsible for. Can you talk a little bit about that struggle between the ideals that you want to achieve as a professional and the limitations that are out there when you have to play in other ecosystems?
[00:03:47] Vanity Brown: Absolutely. Patrick, your questions are so insightful, by the way.
[00:03:50] Patrick O’Keefe: Thanks.
[00:03:51] Vanity Brown: For me, I really learned from my boss, my mentor, Alice Hunsberger, who had meetings with some of these app stores to challenge the nipple policies and all of the other gender-discriminatory policies that exist, not only for our app, but for anybody else that wants to be on the playground. For us, I think that we were able to benefit from sharing that information. Sometimes in this world we assume that other folks are looking at the same things or exposed to the same challenges, but all of our work is centered in the LGBTQ+ community, and we get to hear a lot of that feedback.
I often want to save the world. Every day when I show up to work, I want to do something that I know is right and good for mankind, and it is often hard when there are these rules and you know that they just aren’t right. But somehow, living in this country, I’ve gotten used to it. It’s one of those things where you have to find your balance and you have to cut it off at a certain point, because you can become personally and emotionally drained from being invested in trust and safety work.
[00:04:56] Patrick O’Keefe: Yes, it can be something that consumes you. It’s hard to turn off. I would encourage folks to take a look at the Grindr community guidelines, because I always love a good doc, even a good long doc. I feel like short community guidelines, especially too short, are generally bad. That’s my opinion, because people will try to couch them in cutesy language, like, “don’t be a jerk,” which is effectively meaningless. Because, hey, what’s a jerk? I know a lot of jerks, and they don’t think of themselves as jerks. Grindr’s doc is very good about the details, and it talks about the things that Grindr allows and doesn’t allow. In this particular case, there’s more info in the Grindr community guidelines.
It’s a summary, and then you tap “more info” to get some more specific information, which I think is also very good. Related to this particular case, these guidelines might be US-specific as I read this, because I noticed the URL says US in it, so maybe it is, maybe it’s not, but the specific language is, “for those who identify as female and/or female-presenting, no exposed breasts or nipples, though we would love to free the nipple.” That’s just a recognition of what you have to do to operate here.
[00:05:57] Vanity Brown: Yes.
[00:05:58] Patrick O’Keefe: I wanted to go from there to one of the challenges that you mentioned that you were thinking about before the show, which was “kinks versus crimes, sex positivity for adults versus crimes, and understanding where to draw the line.” Those are your words. Let’s talk about that. I’ll start it off with this. In areas where it is on the line, or you’re not sure, what do you rely on to come to the decision of whether or not something is a kink or a crime?
[00:06:23] Vanity Brown: We are a kinky place at Grindr. I have to speak to that experience. I used to work at eHarmony, and that was a completely different realm. Kinks there were looked at as, no, you cannot. Let me tell you where it gets sticky. It gets sticky when the kink looks like a crime, and that’s the terminology that I came up with when we were developing policy, because everything is about sex and kinks at Grindr. With this mass of kinky stuff, which of these things are harmful? I often echo in my work that I’m always driven, and all of the teams that I lead, to do no harm. At the end of the day, are we harming someone? Are we putting someone in danger? Do we have a responsibility to protect them and keep them safe? As we continue to build trust with the community, we have to realize that folks are adults, too.
There’s that line of, you’re an adult, you do what you want, just don’t do these things, and a lot of times, it’s because it’s illegal, for example, soliciting, sex work, human trafficking. Some of those things can be gray areas, and that’s why we develop policy that will eliminate the appearance of human trafficking or set up things on the back end. Trust and safety is also risk prevention. There are ways for us to prevent things from happening, but I say all that to say, once people are out being kinky, we need to determine whether it is a crime. Consent is big, consent is important. We recently published a part of the help center that is about consent, because everyone has the ability to say yes or no, to give consent, and to withdraw that consent when they get ready.
I think where it gets really weird is when folks do ageplay, and that is often seen in places where folks are discreet and want to explore their sexuality. I think also when it becomes an exchange of money, that’s immediately a no-go, something we can’t allow. Most websites cannot. But I think anything involving children, or a kink that is about someone’s protected class, raceplay, ageplay, I’m not confirming or denying that any of this happens on any particular app, I’m just saying that those are kinks that do exist. In the BDSM communities, there are many groups and clusters of folks who explore their sexuality with consent, and I just think that, again, consent is important when you’re out meeting folks and having a good time.
[00:08:53] Patrick O’Keefe: I think it’s so fascinating to hear the mental processes that we go through, looking at something and what we apply to it first in our minds. Of course, we can set policies and handbooks, rules and written policies, but then there are all the things that we pick up working in a particular space for a long time, as you clearly have.
I think when I listened to you, there were two things I drew out from it. One is just the idea that a thing may be okay, but you don’t have to allow it here. That’s always something to think about, and I think it applies not just to kinks and crimes and adult-themed apps and experiences, but to every community under the sun, from martial arts to career communities to whatever. Just because people can talk about a thing, and that’s okay, it doesn’t mean you have to deploy moderation resources to support that thing in your app, on your product.

The other thing: if someone hears this, and they’re not you, not someone who’s worked at Grindr for 5, 10 years, they might get overwhelmed immediately, get scared immediately. They might have heard some basic-level things about FOSTA-SESTA, or different stringent policies in the US or globally, and it scares them immediately, and that’s okay.
This is true for veterans and experienced people, too, especially those venturing into new areas, new topic arenas, or new verticals. It’s okay not to know things. For all the things you’ve just mentioned, the groups you mentioned, there are bodies of people out there who are doing research, publishing knowledge, advocating for something, or talking about something, for so many different things. I’m not talking about the really inappropriate things, but for all the things you don’t understand that are okay, there are people out there doing those things and talking about them, and it’s okay for you not to know them.
It’s so important to be able to say to yourself, “You know what, I have no idea what this is, but part of my job is now to research it and talk to people who do.”
[00:10:43] Vanity Brown: That is so true. I want you to reiterate your question because what you said just brought up so many things for me that I want to comment on.
[00:10:50] Patrick O’Keefe: No worries. It’s not a question, it’s a statement. You can be someone who knows a lot about an area, and you don’t have to say anything back. It just scares people, that’s all. It scares people when they hear about these things. And it’s not that they’re scared of things they don’t understand, it’s not a phobia or that sort of thing. It’s more like, this sounds so important and potentially damaging to people, and I am responsible for this space, therefore I am frightened because I don’t understand it.
You just have to turn that fright into, “It’s okay. That’s a strength. I don’t know it, but I care. I care enough to go out and talk to people who do know it, to do research, and to form thoughtful policy based upon what the people who know it are saying.” That’s a superpower in a way. I don’t want to use that phrase, it sounds really gimmicky, but it’s a power in a way: the ability to recognize in yourself that you know your stuff, you are experienced in this work, but that experience has led you to believe that you don’t know everything, so when you encounter something new, you have the freedom to go talk to people who do.
[00:11:49] Vanity Brown: Oh, Patrick, I love that you said that, because it’s so dangerous to feel that you’ve arrived or mastered anything. I try to live my life as a lifelong learner. I tell everyone, “If I don’t know it, I’m going to find a person that does.” I will share this with you. I was born 100% Christian, in a Christian family, in church. I actually am an ordained minister, and I’ve been very private about my personal life and didn’t really exercise much of it because I was young with a lot of expectations on me. I honestly was one of the people that was miseducated about the queer community.
Even though I was queer myself, I was judgmental and thought that if God doesn’t like it, it was just easy to throw people away. Then I did some work with myself and then I started looking at my professional life and being able to work in spaces that are really showing what people do, not in public, but in private. It allowed me to see like, “Wait a minute.” I had to put away all of the wonderful Biblical teaching I received and really look at my soul and realize what feels right for mankind and what is terrible. You have to get to the bottom of yourself as a human, take off your titles and all of your wonderful studies.
You have to have that thing in you that gives a damn, that doesn’t want to see someone else mistreated or harmed. There’s a deeper work that people interested in safety do, and usually have done, that should be applauded. For those folks who work in trust and safety and have their beliefs set up, I am one who had a complete come-to-Jesus moment in terms of my thinking around what is acceptable and what is not. I learned so much.
I did not know about folks that were transgender and what their experience was in dating. I think there are a lot of assumptions and there’s a lot of just belief of what I see on the internet, but I’m so now proud to be able to say, “Hey, I don’t understand this experience, but I’ve been able to talk to someone who does.” Now opening up my eyes and just even being around them or seeing them and valuing their experience is so powerful.
[00:14:13] Patrick O’Keefe: Thanks for sharing that, Vanity.
[00:14:14] Vanity Brown: Yes.
[00:14:15] Patrick O’Keefe: I like to believe that at the core of good, truly good community moderation, trust, and safety work, however you define those things, however you apply it to yourself, is the ability to have empathy for everything you aren’t. If you only have empathy for the things that you are, meaning in my case, I’m a white man, I’m cisgender. I have empathy for people who are like me.
[00:14:37] Vanity Brown: Right.
[00:14:38] Patrick O’Keefe: If you can’t be empathetic for the things you are not, if you cannot have that empathy, then you’re not really doing good, thoughtful community moderation, trust and safety work. How you cultivate that empathy is on you, and it can be a combination of life experiences, education, seeking out knowledge and resources, building a strong network, all different types of things.
Ultimately, if you want to be truly great at this work, you have to protect the people who aren’t you. We all have that. We all have people who are not us, defined in different ways. If you can’t see past that, then I think you’re in trouble. You should probably pick another line of work. Let’s go from something very personal and meaningful and deep to something cold and businessy, which is metrics. I asked you about metrics before the show, and you mentioned that “keeping scammers off the platform is ongoing. We often measure the proactive banning of scammers. There are so many bad guys that do not get in and we are always proud of our security.” How do you know when a scammer doesn’t get in?
[00:15:34] Vanity Brown: That’s a good question.
[00:15:36] Patrick O’Keefe: Look, I flipped in it a totally different direction, but I was just curious about that. That’s a tough thing to measure, right?
[00:15:41] Vanity Brown: Yes. It depends on how you’re measuring or monitoring registration. We know some organizations have embargoed countries, restricted countries that you just don’t do business with. Unfortunately, online fraud comes out of very specific countries, so that’s one way: you know that they want to get in, and eventually you have to block them. Another way is at registration, if you have a preventive measure that is evaluating. Scam has a look, fraud has a look, and it depends on your product as to what that looks like.
[00:16:15] Patrick O’Keefe: Are you saying that when people are blocked as scammers, classified as scam accounts, you can see a drop or an increase in scam accounts over a period of time? Is that what you’re getting at?
[00:16:24] Vanity Brown: Yes.
[00:16:25] Patrick O’Keefe: I’m sorry I cut you off there.
[00:16:26] Vanity Brown: Some people don’t do it that way, and it’s not necessary, because it depends on at what point that person will affect the rest, at what point you’ve let the poison in. There are a lot of machine learning tools that allow folks to create an account but not make it to the pool, or require some type of verification in that way. I hope that makes sense.
[00:16:44] Patrick O’Keefe: Yes, it does. As someone who’s been at this for a while, how would you describe the state of automated moderation, trust, and safety tools right now?
[00:16:51] Vanity Brown: It has advanced so much. I’m actually amazed and really proud. I feel like a great-grandma in the trust and safety world because I remember the time when a lot of things were manual, and I was one of those folks processing things one by one. Now I think that the advancement in technology, and folks building whole data-driven businesses, has been helpful, and I’ve seen it develop over the last decade-plus with the proactive moderation and machine-learning prevention-type tools.
[00:17:21] Patrick O’Keefe: Yes, it seems like there’s a lot of vendors out there. Have you done much of a vendor evaluation?
[00:17:27] Vanity Brown: I have, but that’s not really my–
[00:17:29] Patrick O’Keefe: Got you. The reason I was going to ask is that I think the increase in vendors is exacerbating a problem that already existed, which is identifying tools that work and work well, because you have this segment of tools pitching set-it-and-forget-it, which is nonsense. I don’t know if we share that opinion, but that’s mine.
Evaluating those tools is hard, especially when, at Grindr, I assume, you have a member base that’s very protective of their data. You can come up with dummy data, make up data to serve through something to test it out, or you can rely on the vendor’s data and the examples they give you, and that can be dicey sometimes, I think. Do you have any tips on identifying a good vendor and putting them through the paces, especially AI- and ML-related vendors, and making sure that they can handle the type of content you have correctly?
[00:18:14] Vanity Brown: Absolutely. When we’re looking at these vendors, I was involved in a couple of reviews, more so in looking at the results of the data that they received. There are agreements, of course, that happen. You need to protect your privacy and all of your materials as a business, but it is important for them to assess real content and real examples from your database. There are ways to mask user profile information, anything personal, but I think you want to use actual data. I also think it’s important to have a really reliable and consistent representative from the vendor side who understands your industry and exactly what your user experience looks like.
It’s really a partnership in making sure that if you have any questions, if there is anything that needs to be tweaked, a good vendor will work with you and if it’s not something in a package that they’re trying to sell you, they will let you know. I think it’s important to have that communication, not just when you do your exchange, but an ongoing maintenance type of plan.
[00:19:11] Patrick O’Keefe: Sticking with automation, but backing up to something more general, I think where I would like to end is to get your perspective on the hype around ChatGPT and tools like it. I’m hearing noise in different directions. I have feelings, but I don’t want to share them first in case I poison the well; I want to hear your unfiltered thoughts. How much are you concerned about these tools, and we’ll loosely define them as AI creation tools, ChatGPT, Midjourney, things of that nature? How much are you concerned about those being used negatively on Grindr?
[00:19:47] Vanity Brown: I’m going to be 100% honest with you. I’ve been blocking out the noise so much so that I’m not even clear on why folks are going crazy. I actually–
[00:19:57] Patrick O’Keefe: Wow.
[00:19:57] Vanity Brown: I want to hear your opinion. There have been talks about it, folks all around me are talking about it, but–
Patrick O’Keefe: You’re very lucky, you’re very fortunate, you’ve done a good job.
[00:20:04] Vanity Brown: The privacy team is dealing with it and I haven’t had to.
[00:20:07] Patrick O’Keefe: Briefly, my feeling is… ChatGPT, you enter a prompt into it, it gives you back text that you can use. You can ask it for any number of things, make a recipe, plan a trip, write an article, whatever. It’s like, write my bio or I want to sound like a 32-year-old trans woman from Nebraska who is interested in X, Y, and Z, can you write my bio for me? Or you can give it more specific prompts. You can be like, I’m this, I am this, I am this and I am this.
You may be none of these things, but it will work to write a convincing bio or background information or whatever you want based upon what you give it. Same for photo tools, like Midjourney: I want to look like this, or I’m this person, I live in this area, I want this background, I want to be this person with this characteristic and this characteristic and this characteristic, and here’s my profile photo. There’s so much hype around all of this.
To me, and it’ll be interesting to hear your perspective maybe if you ever have to unblock the noise down the road, my feeling, and this might just be an old-head type of feeling, is that we’ve already had policies that talk about low-quality contributions, about impersonation, about inauthentic behavior for a very long time. It’s very simple and it’s very dumb, but on an online forum I’ve run for 22 years in May, I have a line in the guidelines that says no automated contributions [laughs] and it’s been there for 20 years, from thinking about the routine bots and automated account creation that we see. My feeling is that we already sort of moderate for these things in a different way.
Yes, it’s going to get more complex and more complicated and all those things, but ultimately part of our job is just to evaluate each piece of content that comes in. If we evaluate it and find that it’s impersonation, or it gets reported as impersonation, or it’s inauthentic, or it’s low quality, or there’s a trend in the origin country or the IP address or whatever it is, I think we can take care of it as we have taken care of other content in the past.
That said, it does make it easier for people to manipulate others, I think, and do so convincingly, and there are probably certain services that are more susceptible to that than others. There’s also a practical thing. The company behind ChatGPT, OpenAI, is testing a tool to identify creations by their own tool, the very thing that’s making this content, and their tool could only find it at like a 10% rate. Even then, it wasn’t a hundred percent certain; it only flagged text as likely to be AI-written.
If they, the people making this thing and unleashing it on the world in hopes of profit, cannot even identify what their own tool is making, why should we worry? We should worry about everything a little. But why should we worry so deeply that we stress ourselves out and lose sleep at night? I think we just need to apply our standards consistently and evenly, see where the world takes us, and not stress out too much.
[00:22:53] Vanity Brown: You know what? I figured that’s kind of what it was, and I feel the same, Patrick. I guess the reason why I haven’t been hyped is because, again, the policies exist. Everything you said, we’re just giving people a tool to do it faster. Maybe on the prevention end, there may be problems with folks’ tools not being able to receive or read the content as fast. People may just have to uplevel, but I think the folks who are panicked may not fully understand trust and safety and content moderation, and how the policies are developed.
I could just be assuming, but a lot of folks who are newer to the industry don’t understand that it’s not really a new issue. It’s just that people can do it faster. Anybody can put together a profile of somebody else’s information at any point.
[00:23:39] Patrick O’Keefe: Yes, 100%. Since I asked you about something you really didn’t know about or hadn’t looked at much, let me ask you this instead. I tend to think we’re all very related when it comes to online community, moderation, trust, safety, and policy pros. We have more in common than we have dissimilar, and there’s much we can learn from one another. A lot of your background is obviously in dating apps, which are their own sort of vertical with its own challenges and things we can learn from.
What’s one thing, just off the top of your head? Not the number one thing, I don’t want to put you in a weird spot. It’s like asking, what’s your favorite movie? What’s your favorite song? I have like 10 favorite movies and 10 favorite songs. But what’s something that the industry in general, professionals in general, can take away from a well-run dating app or a dating service? What’s a good lesson that you’ve learned that you think is applicable elsewhere?
[00:24:20] Vanity Brown: Yes. I think that’s a good question. I would say that folks can learn from dating apps on the level of how personal and sensitive dating apps are in the content that you’re sending back and forth. Folks using dating apps, a lot of times their heartstrings are attached, and their heartstrings are attached on a dating app, but not necessarily on Amazon or shopping at Macy’s. The way you handle them, the level of empathy, should always be considered.
With dating apps, hopefully, they’re loaded with empathy, at least in any type of trust and safety work, which I’ve helped develop. “No refund, you’re blocked” hits differently; it’s more emotional when you’re being blocked out of a place that is a place for community and human interaction. Especially for us in the queer community, it’s illegal to be gay in many places. It’s important to make sure that if you posted a picture of something bad because you’re having a really, really bad day, these are the decisions that I try to help make. We don’t want you to just go and make a bad decision and harm yourself if you’re mentally not okay in that moment, because you’re going through something.
I’m going to be honest, Patrick, trust and safety has changed so much in the last two to three years because of COVID. The number of users on everything has just blown up; I’m sure the stats are out there somewhere. There’s more activity, more content, and it’s just important to look at folks with a microscope and treat them with kindness, as those in dating apps hopefully are doing when they’re handling their customers.
[00:25:53] Patrick O’Keefe: We might just be a little better off as people who do this work if we treated that customer review of a product, or that post about fishing on Friday, with a little bit more empathy and a little bit more personal connection, as opposed to looking at it as something that can just be removed or thrown away or closed or deleted because it doesn’t seem that important. Vanity, thank you so much for your time today. It’s been a pleasure to chat with you. It’s been great.
[00:26:19] Vanity Brown: Thank you, Patrick. I’ve been all over the place, but I hope that this was good for you and those that hear it.
[00:26:25] Patrick O’Keefe: It was. We’ve been talking with Vanity Brown, CX escalations supervisor at Grindr. Vanity founded a youth choir in Los Angeles aimed at enriching the lives of youth and families in underserved communities, through the transforming power of music and the arts. To find out more visit lovelightcommunity.com and follow @lovelightchildrenschoir on Instagram. For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad. Until next time.
Your Thoughts
If you have any thoughts on this episode that you’d like to share, please leave me a comment or send me an email. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.