Season 5 Episode 6 Aug 26, 2021

Apple Pay Transparency Survey, and the Battle Against Twitch Hate Raids


An episode all about fighting discrimination.


In this episode, we speak with Cher Scarlett, software engineer at Apple, about her endeavor for salary transparency at Apple to battle pay disparity and the challenges she’s faced during this undertaking. And then we speak with Twitch streamer and moderator JustMeEmilyP, and Twitch moderator NLA, about the proliferation of Twitch hate raids and the tools and resources they and others have built to fight against them.


Saron Yitbarek

Disco - Founder

Saron Yitbarek is the founder of Disco, host of the CodeNewbie podcast, and co-host of the base.cs podcast.

Christina Gorton

Forem - Developer Advocate

Christina Gorton is a Developer Advocate at Forem. She is a LinkedIn Instructor and technical writer.


Cher Scarlett

Apple - Software Engineer

Cher Scarlett is a problem solver, creator, and innovator.



JustMeEmilyP (she/her) has been streaming on Twitch since 2019 after finishing her service in the USAF as a Weather Forecaster. Now, she’s enjoying full time streaming and moderating for her community, rollerskating, and enjoying life while advocating for others.



nlasouris (they/them) is a moderator for several communities on Twitch. While not a professional software developer, they have applied some data analysis and basic coding techniques to combat malicious bots since 2019.

Show Notes






[00:00:10] SY: Welcome to DevNews, the news show for developers by developers, where we cover the latest in the world of tech. I’m Saron Yitbarek, Founder of Disco.


[00:00:19] CG: And I’m Christina Gorton, Developer Advocate at Forem.


[00:00:23] SY: This week, we speak with Cher Scarlett, Principal Software Engineer at Apple, about her endeavor for salary transparency at Apple to battle pay disparity and the challenges she’s faced during this undertaking.


[00:00:36] CS: And I also think that some of these folks are afraid to have it validated that it’s not really a meritocracy at all.


[00:00:46] SY: And then we speak with Twitch Streamer and Moderator, JustMeEmilyP, and Twitch Moderator, NLA, about the proliferation of Twitch hate raids and the tools and resources they and others have built to fight against them.


[00:00:59] EP: The last week has been wake up, ban bots, wake up, ban, all these people over and over again from all the servers that we’re working on in Discord to like combat all this.


[00:01:08] SY: After this.




[00:01:20] SY: Joining us is Cher Scarlett, Software Engineer at Apple. Thank you so much for being here.


[00:01:25] CS: Hi. Thank you for having me.


[00:01:27] SY: So tell us about your developer background and what your role at Apple looks like.


[00:01:32] CS: So I’ve been a software engineer for 16 years. I actually started coding when I was in middle school, when I was 14, out of sheer curiosity, and I played this game called EverQuest and I had a guild and we needed a website. So I had to learn how to make one.


[00:01:48] SY: Very cool. And what do you do at Apple?


[00:01:51] CS: I work in global security on internal tools.


[00:01:55] CG: All right, Cher. You recently set out to create a pay transparency survey to gather salary data from your fellow coworkers. This has gotten a lot of attention recently. Can you talk about this undertaking and the impetus for doing it?


[00:02:10] CS: So I’m definitely not the first person. The reason that I actually got involved with some of the other pay transparency surveys was because I had noticed some troubling trends on Levels.FYI, and I wanted to see if those were indicative of anything. And so I poked around asking if anybody was interested in doing some sort of pay transparency initiative and I learned that there had been one that had started six months previously and also that morning, which is very serendipitous. But the people team shut them down because they claimed that a survey that was asking for inclusion and diversity information was prohibited. I recently learned a lot about employment rights because of what I went through with the Blizzard lawsuit. And I immediately recognized that that was illegal and that maybe it would be good if I was the one to start it because I have such an enormous platform and I know my rights and I’m very stubborn. So nobody is going to be able to message me and get me to take it down. So that’s what I did. And I made it so public because of the fact that it’s so hard to disseminate information inside of Apple unless you are in leadership.


[00:03:39] SY: And what did the survey look like?


[00:03:41] CS: So I made the survey in Typeform and I tried to ask a number of different demographic questions, including race and ethnicity, gender, disability disclosure status, and whether or not folks are remote, what area of the company they work in, what role they’re assigned and at what level, and also what job responsibilities they have in their role, because the way that Apple is structured, sometimes it’s hard to tell what somebody does from their organization and title. So getting those job responsibilities can be extremely helpful in determining similarity between two different employees. And we also asked for years of experience in the industry, years of experience inside of Apple, college education, years of work while in college, also bonuses, and of course, annual salary.


[00:04:43] SY: That’s intense.


[00:04:44] CS: It would be good to mention that one of the reasons that I made the webpage was because there were siloed groups of people organizing inside of Apple that I had insight into and I wanted to find a way to bring them all together.


[00:05:02] CG: So you have this data that you’ve collected. You’ve had to make multiple attempts to gather it because of some backlash from Apple itself. Can you talk about what that looked like and the challenges you faced trying to get this data?


[00:05:14] CS: I would say that the folks who had previously tried to gather pay transparency, they had tried to work with the people team to create surveys that would “not be prohibited” by company policy. And even then, they turned around, in this most recent one, they removed all of the inclusion and diversity information just to see if we could get some insight into pay bands within different organizations, and that ended up being shut down as well. They were told it was because it was being hosted on internal Apple tools, which was the corporate box account. And I think just people felt very frustrated that they were trying to work with Apple’s people team and still feeling like we’re not allowed to gather this information. I mentioned that I thought it might be illegal, quoted some things from the NLRB, and a lot of people were wondering, “Well, what do we do now?” But you could sense that there was definitely frustration and concern about the fact that here, these people are trying to do it internal to Apple, not talking publicly about it at all, and just trying to get some sort of insight working with the people team and not feeling like they’re allowed to do that, not feeling heard. And when I first hosted it, all that I got was support, both internally in Slack and then also externally on Twitter. It wasn’t until four days later, the following week, mid-week, that I started getting some troll responses and there were people who entered zero in the salary, which I would say accounted for most of the data that I had to remove. They just were entering all zeros, which I assume was to get the data at the end itself. That happens. 
But there were also people saying things like I was ruining Apple, that I’m the problem at every company I’ve worked at, swearing at me, and then it got to a point where I’m being told by other employees that there are people filing reports about me leaking data external to Apple, leaking PII, that they should avoid me, and actively trying to suppress their participation in the survey. And in some cases, I heard some even worse things that I haven’t talked about publicly and instead have reported to business conduct and external authorities.


[00:07:59] SY: And how did you overcome these many hurdles?


[00:08:02] CS: I mean I will say that I’m grateful that these types of things are the minority of responses that I received.


[00:08:10] SY: That’s good.


[00:08:11] CS: The majority of what I’ve received is actually a lot of private messages of support and gratefulness, which to me it’s sad that people feel grateful that I’ve done this because this is our legal right and we should be able to find this information and gather this information amongst ourselves without needing somebody with 40,000 Twitter followers being stubbornly in charge of it. That shouldn’t be the case. And solidarity was great, but also kind of heartbreaking is a lot of the stories I started to get that weren’t about salary that were actually just about discrimination. And that’s what led me to want to connect a lot of these people who were telling me that they were scared to talk about their experiences internally inside of Apple. A lot of them had tried to go to the people team, tried to go to employee relations, tried to go to business conduct, and either had their concerns ignored or dismissed. Or in some cases, they were the ones that ended up having investigations into them instead of into the people that they had reported.


[00:09:36] CG: Can you talk a little bit about the actual data that you gathered?


[00:09:41] CS: Right now, I have 2,400 responses. People can still participate and we want them to, because the larger the sample size, the more statistically significant the results are. We definitely have found that there is an indication of specific wage gaps in certain areas of the company. I don’t want to talk about specifics because I think that consent is very important. And we are trying very hard to present what we have found so far to the people team and to legal so that they have an opportunity to respond to us before we even share it internally, though we’re actually having trouble getting the people team to speak to us as a concerted group.


[00:10:27] SY: Can you speak to the methodology of the research? I was looking at the Twitter thread. I know that’s something that came up a few times. Are you analyzing this yourself? Do you have a team? Is it going to be professionally done? And I know you responded a little bit through this on Twitter. You said you have thousands of responses, what exactly are you doing with them and how are you getting to your results?


[00:10:50] CS: So we have some folks who work in data analysis and are data scientists who do this as their job at Apple, which is fantastic, and they’ve reached out to look at this data. Their methodology was to control for years of experience, years at Apple, location, of course, and salary first, and then also look at the respondent’s role in the company and different aspects of the data, so looking at the gender of the respondents, looking at the race and ethnicity of the respondents, and also isolating data by team so that when we do present these findings, it’s very clear that we’ve controlled as much as we can for people who should have similar salaries. And we’re paying very close attention to statistical significance as well, of course. We try to indicate where we’re getting close to statistical significance in some orgs and ask the people team to not engage in any alleged suppression tactics so that we can gather more data to understand if these initial findings are actually indicative of something or if it’s just because of the small sample size.
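The cohort-controlled comparison described here can be sketched in a few lines. This is only an illustration of the method: the field names, cohort keys, and sample records below are invented, not actual survey data.

```python
from statistics import median
from collections import defaultdict

def gap_by_group(records, cohort_keys, group_key, value_key="salary"):
    """Group records into cohorts of comparable employees, then
    compute the median salary per demographic group inside each cohort."""
    cohorts = defaultdict(lambda: defaultdict(list))
    for r in records:
        cohort = tuple(r[k] for k in cohort_keys)
        cohorts[cohort][r[group_key]].append(r[value_key])
    return {
        cohort: {g: median(vals) for g, vals in groups.items()}
        for cohort, groups in cohorts.items()
    }

# Illustrative records only -- not real survey responses.
sample = [
    {"role": "SWE", "level": 3, "gender": "w", "salary": 150000},
    {"role": "SWE", "level": 3, "gender": "w", "salary": 152000},
    {"role": "SWE", "level": 3, "gender": "m", "salary": 160000},
    {"role": "SWE", "level": 3, "gender": "m", "salary": 158000},
]
result = gap_by_group(sample, ("role", "level"), "gender")
```

A real analysis would also run significance tests on each cohort, which is the part the group says it is watching closely before drawing conclusions.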


[00:12:16] SY: So this data suggests that even though Apple says it has pay equity between genders, it still falls within the average gender wage gap in San Francisco of around 5%. What was your reaction to what your data suggest?


[00:12:30] CS: I think I simply wasn’t surprised. There were some specific areas where it’s indicative of something much worse than that. When you take all of the organizations together and then you end up with like an average that you do kind of get to that industry average, but then there’s some areas of the company where it’s very negligible and then there’s other areas of the company that are doing a lot of heavy lifting to get to that 5% to 6%.


[00:13:04] CG: So since this data has come out, what has Apple’s response been so far?


[00:13:08] CS: Other than to press? Nothing.


[00:13:11] SY: Is that surprising to you?


[00:13:13] CS: I think that there’s a part of me that’s naive and optimistic, even a little bit whimsical that I did believe that if we went to them and asked them if we could present our findings before sharing it internally that they would at least meet with us, and I’m not getting that from them. I myself and some of the data folks who have actually been the ones analyzing the data, they’re trying to separate us into meeting separately with our employee reps basically, which is extremely frustrating because it makes it seem like, “Why do they need to talk to us individually if they’re not understanding that this is a concerted activity?”


[00:14:00] SY: So throughout this process, have you been worried about some kind of retribution from Apple for rocking the boat?


[00:14:07] CS: Yes, and no. So I was recommended to get a lawyer once the suppression efforts started and I heard about reports. So I did do that, and I retained them immediately. So I do have some cushion there, as well as having a huge network, which obviously gives me a lot of privilege to feel like, “Okay, if I have to leave Apple, whether that’s by force or by choice, I can go get another job, but a lot of other people don’t have that avenue.” Right? So I understand why other people are scared and I do feel that pressure, but I think what scares me more is seeing things like the people team trying to separate myself and the other people who are actually working on the data. But then also further than that, there are comments here and there that make it seem like I am a disgruntled employee versus I’ve gathered information from 2,400 employees. And we’re a group of people participating in this activity. I think that that’s what scares me more. And the anonymous messages that I’ve talked about getting, well, they are in the minority, a lot of them feel, based on what other employees have said, like they may be coming from leadership, trying to get buy-in from other employees who may be on the fence to be against this initiative.


[00:15:40] CG: So pay transparency is a really hot topic in general and many times a contentious topic as I think we’re seeing here, especially within the tech industry. What has been the greater response from the overall tech community to you? I know you’ve mentioned some from Apple in general, setting out to do this and putting out this data. What has kind of been the response from the larger tech community?


[00:16:04] CS: Definitely overwhelming support. But again, you find those folks who, actually, you don’t find those folks, those folks find you, who are just blatantly against any sort of organizing that seems to have anything to do with social justice. And I’ve found myself accidentally in those spaces and reading what people have to say. And it’s frustrating because I try really hard to listen to other people’s opinions and engage genuinely in good faith and it feels like a lot of those conversations there isn’t that space to do that. I’ve had one conversation with somebody who I already know who disagrees with me on a lot of social topics who actually had some reasons why they wouldn’t participate in a pay transparency survey and that they don’t actually think that pay transparency is healthy. But that was one person out of all of the negative stuff that I read that had any sort of clear value that they wanted to have a discussion for me to understand their point and for them to understand where I’m coming from. And that’s something that frustrates me a lot.


[00:17:24] SY: And why is that? Why are so many people in tech so against the idea of pay transparency? What’s there for the people who have thought it through and do have a viewpoint? Why is it such a big deal?


[00:17:38] CS: You know, obviously, I don’t want to speak for everyone, but I do get the sense that for some people that are genuinely against this effort for good faith reasons, I think that they are concerned that pay transparency makes the workplace more toxic because there are people who “have more merit” that get more highly compensated as a result. And it’s not that I don’t think that that’s a valid concern. Maybe there are people who are over-performing and that’s why they have a much higher compensation. But if you start looking at who is pulling at the bottom and who is pulling at the top, in terms of demographics, you start to wonder how valid that is, like what systemic things may be at play that are saying that the majority of high achievers are from one demographic where the majority of lower performers are in another demographic that’s underrepresented. You can’t look at that and not think something is wrong. And if people maybe are lower performing, is there something in the workplace that is preventing them from excelling, whether that’s a bias that is coming in during reviews? Or is there something that is causing them so much distress or are they not being given opportunities? I think it’s important not just to look at this data from like, “Okay, fix their pay,” like what is causing these problems across the industry year over year, that almost every company that I’ve ever worked at has had to adjust salaries every year to create equity. Why is this happening over and over again? And I think that that’s what makes me not buy into the pay transparency makes things toxic because of perceived merit. And I also think that some of these folks are afraid to have it validated that it’s not really a meritocracy at all, that maybe even meritocracies aren’t possible.


[00:20:07] CG: So pay transparency can mean slightly different things to different people. Cher, I’m just wondering, how would you define what pay transparency is in this example, in this context? What does it look like to you?


[00:20:19] CS: To me, it’s disclosing pay bands for a role publicly to people applying for it and also internally to people who already work in those positions. One thing that I would love to see is for companies to have a tool that allows an employee to input their years of experience, their education level, and their role, and show them exactly how much they should be paid. And that way, if they’re not being paid that, they can go to their leadership and say, “Hey, I used your tool and it shows that my pay might be a little bit low. Can we address why this is?” And management and leadership should be actively concerned about whether or not there’s a systemic issue at play that may be hampering this person’s salary, or alternatively should be giving them the tools to help them succeed and reach their full pay potential.
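The tool described could be little more than a band lookup. Everything here (the roles, experience brackets, and dollar figures) is made up for illustration; Apple’s actual bands are not public.

```python
# Hypothetical pay bands keyed by (role, experience bracket); all numbers invented.
PAY_BANDS = {
    ("SWE", "0-2"): (110000, 140000),
    ("SWE", "3-5"): (140000, 175000),
    ("SWE", "6+"):  (170000, 220000),
}

def bracket(years):
    """Map raw years of experience onto a band bracket."""
    if years <= 2:
        return "0-2"
    if years <= 5:
        return "3-5"
    return "6+"

def expected_band(role, years_experience):
    """Return the (low, high) band an employee should fall into,
    or None if the role is unknown."""
    return PAY_BANDS.get((role, bracket(years_experience)))

def is_underpaid(role, years_experience, salary):
    """True when a salary falls below the bottom of the expected band."""
    band = expected_band(role, years_experience)
    return band is not None and salary < band[0]
```

In practice such a tool would also fold in education, level, and location, as the survey itself does, but the shape of the lookup stays the same.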


[00:21:24] SY: So what are your hopes and fears for the future now that you’ve spoken up about this issue at Apple?


[00:21:30] CS: My biggest fear is my own exhaustion, honestly. And I think that that maybe spread further to others as well. My hope is that before I possibly burn out that I have gotten everybody else that’s involved enough confidence and attention and organized effort that should I need to take a break from all of this that there’s a well-oiled machine and other folks leading who can continue on because I think that, and this is true in the workplace too, that the mark of a healthy team and a healthy organization is that people can take care of themselves and everything keeps going.


[00:22:19] SY: Well, thank you so much for being here.


[00:22:21] CS: Yeah. Thank you.




[00:22:31] SY: Coming up next, we speak to two Twitch moderators about the troubling proliferation of Twitch hate raids and what they and others are doing to fight it, after this.




[00:22:53] SY: So joining us is Twitch Streamer and Moderator, JustMeEmilyP, and Twitch Moderator, NLA. Thank you so much for being here.


[00:23:01] EP: Of course.


[00:23:01] NLA: Thanks for having us.


[00:23:03] SY: So tell us about these Twitch hate raids. What are they? How do they happen? And who do they affect? Emily, how about you start?


[00:23:12] EP: So as far as I know, the worst of them are coordinated in Discord. It’s confirmed they’re not on 4chan or any other forum right now. So that’s actually the most frustrating part, getting these platforms to work through it. So then from Discord, they stream it. They’ll show their screen. It’ll be a Twitch streamer. They’re having a good old time, just talking, doing what they do, and just spamming their chat with racially charged things, anything against any marginalized kind of group. They’re just going at it. They’ll reference each other and different hate raids in different channels. And then from Twitch, it goes into Twitter and to their landlords. I’ve seen them contacting their bosses. It doesn’t stop or start on Twitch, but Twitch is definitely enabling it.


[00:23:57] CG: NLA, you tell us a bit about like your experience with it so far, the Twitch hate raids?


[00:24:02] NLA: Yeah. I’ve definitely been present in chat, seeing a number of them. What I started noticing was that recently they’ve gone from just doing spam or mass follows with racist terms in their usernames to things like spam raids, and things seem to be in a state of flux. For a while, Twitch had disabled official notification of raids, where a streamer can sort of push their community into another channel or another chat. Often that’s used very positively for a community to share a streamer with another community so that they grow their audiences. But for a while, Twitch had disabled small raids. So if you had below five viewers, if you raided someone, it wouldn’t show up. And I think that was specifically to mitigate things like people creating a hundred bots and using all of them to sequentially mass raid someone. But recently, I’ve seen some of those come back. I’m not sure if it’s a Twitch backend thing where they started allowing the smaller raids because of the pushback from smaller creators that they weren’t able to get the exposure in the raids. But from there, some of the streams I’ve seen have had like 130 accounts back-to-back raiding with the official Twitch notification, with profile pictures that are swastikas. And for a half hour, every couple seconds a new one pops in, a new one pops in, a new one pops in. There are tools that exist that can help. A lot of people now have set their raids to only allow raids from people they follow, or have disabled raids completely, and things like that can help. But ultimately, the main problem that I see on Twitch is that they allow mass account creation. If only they were able to stop individual people from setting up VPNs and mass creating accounts. The worst batch I’ve seen lately was 1,800 accounts within the span of 10 minutes. They’re creating about 200 bots a minute and Twitch doesn’t do anything about them.
So I started using the Twitch API to start just watching every single username that gets created as they get created. So I have a script that’s running now and it has a latency of about a minute. So basically I’m seeing it run through a list of names in batches of a hundred, which was the API limit, of accounts that were created a minute ago. And so this batch of 1,800 that was created within the span of eight minutes, that trips certain pattern matching flags that I’ve set in my code. So I basically compiled the full list within 10 minutes of their creation. I can report those to Twitch. Twitch has reporting mechanisms, but none of them were actually acted on. And they only started deleting and banning those accounts once they actually did the hate raids they were created for. That’s what’s really frustrating right now is not only is mass account creation possible, but when we try to be proactive, when we try to make tools to prevent these from actually going out and hitting community members, it’s just infeasible for us to mass ban thousands of accounts that are created half an hour before they’re actually used and deployed. If you go live and right before you go live, you check the bot lists, you ban everyone there. It takes time to ban all of those. If you use external tools, you can ban about 200 a minute. But if you have 10,000 bots to go through from the previous day, it takes time to ban them all. You have to ban them, you have to block them, and then they create more while you’re live. And the ones that are created after you went live are the ones that hit you, sometimes. Other times they pull old sleeper bot accounts from two months ago. Twitch sent out a tweet recently talking about being able to better detect ban evasion. So I don’t know if it’s an IP-based tracker, but they’re saying they’re coming up with more channel level tools to allow moderators to do their jobs more effectively. But those tools are still reactionary. 
They’re still meant to combat after these accounts are created. If they attack, what can you do?
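The batch-scanning approach NLA describes, pulling newly created usernames a hundred at a time and running them through pattern-matching flags, might look roughly like this. The regexes and usernames below are invented, and the part that actually fetches batches from the Twitch API is left out.

```python
import re

# Hypothetical patterns for algorithmically generated bot names,
# e.g. a short base word followed by a long numeric tail.
BOT_PATTERNS = [
    re.compile(r"^hoss\w*_?\d{5,}$"),    # fixed base word + 5+ digit suffix
    re.compile(r"^[a-z]{4,8}\d{6,}$"),   # short lowercase word + 6+ digits
]

def flag_bots(usernames):
    """Return the subset of a batch of usernames that trips any pattern,
    preserving the order they were seen in."""
    return [u for u in usernames
            if any(p.match(u) for p in BOT_PATTERNS)]

# One illustrative batch, as if returned by a 100-name API page.
batch = ["normal_viewer", "hossfan_0012345", "abcde123456", "emilyp"]
flagged = flag_bots(batch)
```

A real script would loop over pages of freshly created accounts (NLA reports roughly a minute of latency per batch) and report each flagged list through Twitch’s reporting mechanisms.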


[00:28:43] CG: I know you mentioned to me in a message, you said something along the lines of, “They need to put limitations on new accounts,” so new accounts are automatically on slow mode in certain channels. And then I liked where you mentioned that account aging should hinder them from doing specific things like raids. And I thought that was a really simple, easy solution.


[00:29:03] NLA: Yeah. I think there are a lot of ideas that a lot of people have come up with. I’ve seen a bunch of them on the Twitch Do Better petition that’s been circulating, and they’re not all ideas from people like me. A lot of them are from the people who are actually getting hit by these raids, asking what they can do or what tools they need to actually manage these. And I think account age restrictions are a very easily implemented Band-Aid solution. So if you create an account and you can’t raid within the first week, or you’re on slow mode, that’ll help some things. It may not help with the follow bots. It may not help with mass spam where each account only spams once. So fundamentally, the mass account creation, which I keep coming back to, has to be where the problem is addressed. And if it’s not addressed there, then sure, account age restrictions will give us maybe a window of opportunity to detect things. But then when we report bots that we find, Twitch still has to act on that, and they haven’t been doing so.
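An account-age gate like the one proposed is straightforward to express. The one-week threshold and function names here are assumptions for illustration, not anything Twitch has implemented.

```python
from datetime import datetime, timedelta, timezone

MIN_ACCOUNT_AGE = timedelta(days=7)  # assumed threshold

def may_raid(created_at, now=None):
    """Allow raiding only once an account is at least a week old."""
    now = now or datetime.now(timezone.utc)
    return now - created_at >= MIN_ACCOUNT_AGE

def chat_mode(created_at, now=None):
    """Put brand-new accounts on slow mode instead of normal chat."""
    return "normal" if may_raid(created_at, now) else "slow"
```

As NLA notes, this only buys a detection window; it does nothing about sleeper accounts created weeks in advance.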


[00:30:10] SY: So have either of you been personally affected by one of these raids? And if so, can you tell us a bit about it?


[00:30:17] EP: The last week has been wake up, ban bots, wake up, ban all these people, over and over again, from all the servers that we’re working on in Discord to combat all this. I myself haven’t received a hate raid on my Twitch in a good amount of time. It speaks to how racially charged this is: I’m literally spearheading, trying to get these tools and consolidating things so that those without the energy don’t have to, and I still haven’t been hate raided all week. So it speaks to, as a white woman, they have not touched me. And banning all these people and staying on top of it, spreading all these server IDs and user IDs and trying to get people to preemptively ban? Easy. I’ll do it all day. So that’s definitely the fight this week, and it’s only ramping up, and you can already see that. So that’s the unfortunate part behind it. That’s why we need all of these different platforms to take us seriously and act.


[00:31:10] NLA: The channels I’ve been moderating for haven’t been hit directly by these hate raids, but these raids are affecting members of our community. And so they affect us all. And anyone who says that this doesn’t affect them, they haven’t been hate raided. You don’t have to be hit yourself to see that this is a problem. And right now it’s hitting so many people, and especially marginalized community members, that we all have to step up. We all have to do our part to make sure Twitch hears that this is a problem and takes appropriate action. And it seems like they’re aware of it. They’ve tweeted responses saying, “We’re working on this,” but this has been going on for a month now, I think. So many of us within the community have pitched in. I know some people have been coming up with string fuzzy logic matching tools to detect raids. Others have been building bots that automatically put streams in lockdown mode. Some people have been compiling lists of bots for others to ban. We’ve all been pulling together as moderators, as coders, in our off time, not as our full-time job, and we’ve compiled a suite of tools. Like this morning, I woke up to a Twitter DM, “Hey, can you run a check on such and such a pattern?” And we just pulled a list of accounts that look like bots, running back through the entirety of 2019. They were created gradually over time. And if we can do that through the publicly accessible Twitch API, someone with access to their back end, a full-time staff member, should be able to make progress in at least mitigating this. I’m not saying that I think Twitch can solve this in a night. But given the state of how bad it is, the fact that they haven’t done anything visible, other than tweeting saying, “Trust us, we’re working on it,” that’s just not good enough.
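The “string fuzzy logic matching” NLA mentions could be as simple as computing a similarity ratio against known spam templates, which catches messages with a few characters appended or swapped to dodge exact matching. The template, threshold, and sample messages below are illustrative only.

```python
from difflib import SequenceMatcher

# Placeholder templates -- real lists are curated by moderators.
KNOWN_SPAM = ["buy followers at example dot com"]
THRESHOLD = 0.85  # assumed similarity cutoff

def looks_like_raid_spam(message, templates=KNOWN_SPAM, threshold=THRESHOLD):
    """Flag a chat message that is nearly identical to a known spam template,
    even if a few characters were changed to evade exact matching."""
    msg = message.lower().strip()
    return any(SequenceMatcher(None, msg, t).ratio() >= threshold
               for t in templates)
```

The threshold is a tradeoff: too low and ordinary chat gets flagged, too high and a slightly mutated spam line slips through.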


[00:33:13] EP: I think about the sheer amount of effective, amazing tools we’ve made in the last month, on our own time, like you said, with our own day jobs going on, the amount that we’ve made and how incredibly strong they are, even in spite of Twitch dragging their feet on giving us more access to the API by verifying our bots. We’re still doing so much. And the fact that we can do all that in one month and they can’t speaks volumes.


[00:33:40] CG: Really, really wonderful points. I do stream on Twitch. And like you mentioned, just because I haven’t been affected directly yet doesn’t mean that we can’t do something about it now, especially because we have guests all the time and I don’t want my guests to feel unsafe. Right? So these are great points, and you both have talked about it a little bit. You’ve talked about how you’ve created things to help fight these raids. Emily, I’m wondering if you can talk a little bit about your site.


[00:34:08] EP: So it started with the Discord server, because my thing is organization. I’m like, “If we can get organized and we can work together, then we can beat it.” So that started like the last day of January this year. And it kind of slept for a while. It wasn’t so bad then. People didn’t talk about hate raids like they did, because it hurt. You don’t want to tell people they hurt you. And so people were quiet. And so the server, it blew up over the last month. Right? And so it’s a bunch of people coming together to hand me these tools they’re making over and over. There’s bots every day. There’s different ways to use Twitch tools. There’s so many different programs out there and it’s overwhelming and not everybody knows about them all. So I started compiling them in the server and there was just so much, and it was getting so disorganized that I was like, “Okay, we can’t keep doing it this way.” So my idea was to make the site in that way, it was also a shareable form of what we’re creating in the Discord. That way, even people that aren’t in the server, because we’ve had to lock it down so much, can still access everything that we’ve compiled and be able to share that with other people that aren’t directly assisting. And so it’s really just a compilation of everything people have come together to create. And I’m trying to keep up at this time. But why doesn’t Twitch have any of this?


[00:35:29] SY: So you’ve mentioned that Twitch hasn’t really done very much. You all, as individuals, as people who are not doing this as their full-time gig, have put in the work and have done a lot, made a lot of progress. Are there things within the Twitch platform itself that can fight against raids like this? Have there been any improvements on the platform in terms of features or tools to help you in your efforts?


[00:35:52] EP: I do have a page dedicated to Twitch’s own tools, only what they offer. It’s just not enough. And I think, like you mentioned, they actually took away one of the few tools we had, which I think was the raid cap of fewer than three people. But the thing is, the raids aren’t just coming through the raid feature. They generally just post their link in their Discord or on their YouTubes or whatever they’re doing in their groups, and they all just go to that link. It’s not even an actual in-platform raid.


[00:36:23] SY: I see.


[00:36:23] EP: So even that, it’s uncontrollable. There has to be more.


[00:36:28] NLA: Twitch definitely has some tools. So you can limit raids from people who don’t follow you or who you don’t follow, I believe, is the restriction, or you can disable raids entirely. There are things like AutoMod, which will flag racist or bullying or various different categories of chat messages to varying levels, and using that does okay, though it only catches some things. You can put words into your moderation lists to completely disallow certain terms in chat, but there are ways to get around that. There’s a unique chat mode, which prevents identical spam. But with algorithmically generated chat bots, they can just append a couple of characters and get around that pretty easily. So these tools are there. The fact that some people can use them effectively doesn’t mean that they’re good enough for combating everything that’s being thrown at us.
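The evasion NLA describes can be sketched in a few lines. This is a minimal, hypothetical model of an exact-duplicate chat filter (not Twitch’s actual implementation of unique chat mode), just to show why appending a couple of characters is enough to slip past it:

```python
# A toy exact-duplicate chat filter: a message is rejected only if it
# matches one we've already seen, character for character.
seen = set()

def allowed(msg: str) -> bool:
    """Allow a message unless it exactly matches a previous one."""
    if msg in seen:
        return False
    seen.add(msg)
    return True

assert allowed("repeated spam line")         # first copy passes
assert not allowed("repeated spam line")     # exact repeat is blocked
assert allowed("repeated spam line a1")      # two appended characters slip through
```

Because the bot only needs each message to be unique as a string, a random suffix defeats any filter that compares messages verbatim, which is why purely exact-match deduplication is so weak against generated spam.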


[00:37:30] EP: Nor are they good enough though.


[00:37:32] NLA: Yeah, I totally agree.


[00:37:33] EP: Because they’re getting creative now. They don’t use any of those terms that we have blocked. They use statistics about triggering events for certain groups of marginalized creators. It’s a string of very normal words put together into very harmful, targeted harassment. AutoMod cannot capture that. And that’s the issue.


[00:37:54] NLA: Or they’re substituting individual letters, like they’ve been using all sorts of different characters in the Unicode set. And there was a tweet recently looking at what happens if you take the generic word jogger.


[00:38:10] EP: Oh, yes.


[00:38:10] NLA: J-O-G-G-E-R. And if someone wanted to ban that word, and you go through the different character sets within the ASCII range as well as within Unicode, even just Roman characters, making substitutions here and there, I think he came up with 2.9 million different variations.


[00:38:34] SY: Oh my goodness! Wow!


[00:38:35] NLA: So if you wanted to individually ban every permutation of that, it would take eight days straight of entering terms. So again, it’s things that on the channel end, on the streamer end, it’s not feasible to keep up. This is something where if Twitch is able to do it more centrally, if Twitch can ban the bots more centrally, then it’s not something where each and every individual streamer who’s being hate raided has to go through and do their thousands of bans every year.


[00:39:10] CG: That definitely just does not seem sustainable. Not a great business model for them either, making this harder and harder for their streamers. So there’s an event coming up called A Day Off of Twitch where users plan on protesting these hate raids and Twitch’s response to it by not streaming on the platform, and this is on September 1st. Will you both be participating in this? What are your thoughts on it?


[00:39:35] EP: Yes. I feel that as a white woman, the best I can do is do everything for everyone else. So for those that are participating that day, I’m with you, and for those that are doing other protests, whether online or anything else, I’m doing it all. I’m just there to quietly echo and continue on what they want in a way that gets Twitch’s attention. I don’t know if you saw their demands that we posted today.


[00:39:59] SY: What were your demands?


[00:40:00] EP: The first of them is that Twitch needs to hold a round table with a group or groups of marginalized creators who have been affected, open that channel of honest discussion, and actually hear how it’s been and see how it’s been from platform to platform, with the way that Twitch is in the middle enabling it, hopefully to create and implement more tool sets. But as you said, at the level of Twitch, on the platform itself, and not, like you said, every single creator having to handle this, if they are lucky enough to know how to protect themselves. They want proactive protections that could be used or implemented immediately, so not waiting so long for things like account-age requirements for chatters. They’re talking about accepting and/or denying a raid, so streamers have a two-minute pop-up that shows up and says, “Hey, this person is trying to raid you. Accept or ignore.” The third would be removing the ability to attach more than three accounts to a verified email. We’re saying that because there are good bots out there. There are good things people are doing, and they use another email for it. But if you did something simple, like only allowing three accounts to be created from one email, so much of that mass account creation gets slowed down. Every single extra step counts for trolls. It makes it more work and they’re less likely to continue, more likely to give up. And the last one was to provide us with transparency and a timeframe for implementing these tools, not the blanket statements that they’re posting. Real, actual plans. I mean, I don’t know if you saw. You mentioned earlier whether Twitch has seen it, but Twitch Gaming itself was raided yesterday with a Black host.


[00:41:38] SY: Wow!


[00:41:39] EP: They cannot ignore it any longer. And so I really feel like A Day Off Twitch is just the start. It’s the kickoff, one day of, “Let’s see how the platform looks without these wonderful marginalized creators.”


[00:41:51] NLA: I wanted to mention this since one of you brought up it not being a good business model for Twitch. Up to this point, in a way, it hasn’t really affected their business model, because fundamentally most viewers aren’t affected by this, unfortunately. The bulk of the viewers on Twitch are sitting in some of those top few streams. There was a Twitter thread about how on the peak day within the past 30 days of Twitch viewing, 80% or 88% of Twitch viewers were watching the top 5,000 streams, I believe. I don’t remember the numbers exactly, but basically the top viewed streams live on Twitch had concurrent viewerships in the hundreds, and then the remaining 80-plus percent of streams that were live had a viewership averaging closer to five or six. So from Twitch’s end, in terms of where they get the money, in terms of where subs are going, in terms of where ads are being seen, those are in the very big streams that up to a point haven’t been affected by the hate raids. And that’s where, from the Day Off Twitch perspective, there was a take put on Twitter about how it’s not necessarily going to be effective. And from a purely financial perspective, I would agree with that. But in terms of actually having a spotlight shone on the issue, having people talk about it, having Twitch forced to acknowledge that this is a problem, I think that’s where the hashtags and the days off have some chance of actually pushing through. It’s just frustrating, because, for example, the fact that we’re doing this podcast is spurred by the fact that so many people are getting emotionally hurt and financially hurt by this, and those people are not Twitch. Twitch is barely feeling this. And I think that’s part of the problem: from a financial perspective, if they let this go on and all of the smaller streamers left the platform, they’re still left with such a big viewer base on those top viewed streams.


[00:44:26] SY: Yeah, they’re fine.


[00:44:28] NLA: And if those are fine and Twitch is okay, saying this is the way our community is, that’s definitely very telling about them ethically. But from a financial perspective, that’s not really a factor.


[00:44:42] SY: Is there anything else that we didn’t cover that you wanted to get to?


[00:44:46] EP: For the tools, the response tools, I would just say that if anyone has any more to add, please don’t feel any kind of pressure. But if you want to help people and you want that to go on the page, I put everything on there. Nobody has any certain preference, nothing like that. SmashBot is our baby, but that’s because it is the most amazing tool. It’s got a lot of potential and a lot of sponsor interest going on. There’s a lot on there. And I think the more we can add, the more we can tell people and the more that they know. We can still fight it, at least until Twitch does something.


[00:45:18] SY: And we’ll put a link to the page in our show notes so people can check it out.


[00:45:23] NLA: I will just say there’s a lot the community is doing, there’s a lot the community can do, but my hope is that this podcast and the words we’re saying gets out to the sponsors who have to make decisions about whether they want to advertise on a platform like Twitch, if this is the way Twitch is.


[00:45:41] SY: Thank you so much for joining us.


[00:45:42] EP: Thank you for having me.


[00:45:44] NLA: Thanks for having us.


[00:45:56] SY: Thank you for listening to DevNews. This show is produced and mixed by Levi Sharpe. Editorial oversight is provided by Peter Frank, Ben Halpern, and Jess Lee. Our theme music is by Dan Powell. If you have any questions or comments, dial into our Google Voice at +1 (929) 500-1513 or email us at [email protected] Please rate and subscribe to this show wherever you get your podcasts.