Season 9 Episode 7 Jun 22, 2022

How Technology is Used as a Tool for Abuse/Coercive Control

Pitch

Technology-facilitated abuse is any form of controlling behavior that involves the use of technology as a means to coerce or stalk a person.

Description

In this episode, we talk about how technology is being used as a tool for abuse and coercive control with Bindu Oommen Fernandes, executive director at Freedom Forward, and Sonya Mital, community engagement lead at Narika. We discuss possible warning signs of abuse to keep an eye out for, tactics abusers use on their victims, and the different ways technology is also helping survivors overcome these circumstances.

Hosts

Ben Halpern

Forem - Co-founder

Ben Halpern is co-founder and webmaster of DEV/Forem.

Suzanne Aitchison

Forem - Software Engineer

Suzanne Aitchison is a software engineer at Forem. She is passionate about accessibility and maintains a tutorials site for accessible web development (particularly with reference to React).

Guests

Sonya Mital

Narika - Community Engagement Lead

Sonya is Community Engagement Lead at Narika, where she raises awareness of domestic violence, technology abuse, healthy relationships, and special concerns for immigrant and South Asian populations.

Bindu Oommen Fernandes

Freedom Forward - Executive Director

Bindu Fernandes is executive director at Freedom Forward, a Bay Area nonprofit working to prevent the commercial sexual exploitation of youth in San Francisco by transforming the systems that too often contribute to their exploitation. Prior to this, she led an organization serving survivors of domestic violence and also worked at Google for over a decade leading policy and diversity & inclusion efforts for global teams.

Show Notes

Audio file size

49,298,702 bytes

Duration

00:51:21

Transcript

[00:00:00] SM: A lot of times these things aren’t designed to be nefarious, but they’re used for nefarious purposes, and we need to proactively strengthen these systems so that they have better protections against that.

 

[00:00:21] BH: Welcome to DevDiscuss, the show where we cover the burning topics that impact all of our lives as developers. I’m Ben Halpern, a co-founder of Forem.

 

[00:00:29] SA: And I'm Suzanne Aitchison, Senior Software Engineer at Forem. Today, we’re talking about how technology is used as a tool for abuse and coercive control with Bindu Fernandes, Executive Director at Freedom Forward, and Sonya Mital, Community Engagement Lead at Narika. Thank you both so much for being here.

 

[00:00:45] BF: Very nice to meet you.

 

[00:00:47] SM: Thank you for having us.

 

[00:00:48] BH: So before we get into today’s topic, can you tell us a little bit about your background? Bindu, can you go first?

 

[00:00:54] BF: My career started in tech at Google, where I worked in marketing, advertising, and DEI work. After that, I moved into being a small business owner for a couple of years, and then moved into the nonprofit space and social justice space. I then led an organization that worked in domestic violence at the intersection of cultural responsiveness and identities, and now lead an organization that works with young people that have been exploited or young people that have been trafficked. So I have really moved from corporate to small business to the nonprofit and social justice space.

 

[00:01:28] BH: And so that’s Freedom Forward that you’re talking about?

 

[00:01:30] BF: Correct.

 

[00:01:31] BH: And what specific services do you provide to accomplish this mission?

 

[00:01:36] BF: Freedom Forward is uniquely positioned to be an incubator of new, innovative ways to resolve child sex trafficking. What we’ve recognized is that there are a lot of organizations, great organizations, at the frontline doing important crisis work, but also there is the need for innovation and newness and just doing things differently at a systems level. So a lot of what Freedom Forward does is really think about three to four year-long projects or pilots to really ascertain a new way of doing something. And if it works, it works. And if not, we publish a learnings report that shares findings about what we’ve done. So one of the pilot initiatives or projects and models we’re working on is FAM. And FAM is a new way of thinking about foster care, because there are studies that show this big correlation between foster care and youth who’ve been trafficked. Lots of the young people that have been trafficked are coming from very vulnerable backgrounds. So this might look like foster care. This might look like an LGBTQ youth. As you identify with multiple vulnerable populations, you are more likely to also be exploited. So FAM is a model of rethinking foster care. And we’re in the third year of our pilot initiative. We also have HYPE Center, which is a youth center. It’s a youth drop-in center created by young people and for young people, for young people to get access to things like therapy, legal needs, food, et cetera. And so we’ve in the past run various other pilot initiatives and continue to hope to do more of that in the future as well.

 

[00:03:04] BH: At a high level, how do you balance the innovation with being cautious and not doing harm and ensuring that you are not going down the wrong lead or creating the wrong inferences and taking action on those? How do you strike that balance?

 

[00:03:19] BF: Yeah, that’s such a great question. I think it is centering all of our models in really the person and the group that we’re trying to serve. Right? Centering our community, centering our young people in these decisions. So one of the things Freedom Forward does is we have multiple youth advisory boards. We have a HYPE youth advisory board and a FAM youth advisory board as a way of really having our young people tell us what they need, for us to show up in the ways that our young people are asking, and to ensure that our solutions are being created by or co-created with our young people. One of the fundamental principles we’ve always had at Freedom Forward is that we want to be the adults that are showing up and doing better for our young people. We really don’t want our young people to have to do better in order to receive services. Right? Often we’re telling young people, “Hey, don’t bring your weapons, don’t have drugs.” Right? We’re telling them, “Do all these things to qualify or do all these things to show up and deserve our services.” But we’re really turning that on its head and saying, “Why do you have to do better? We, as adults, need to do better. Our systems are flawed and we need to figure all of that out with you.” So a lot of that power and decision making goes back to our young people. There are simple things, like our HYPE Center refrigerator is stocked by our young people. We have a grocery list right outside that says, “Hey, if you want something, you write it. We shouldn’t have to tell you what we think you should eat. You know what you need, and you should tell us, and we should rise to that occasion.” So a lot of it is really keeping the pulse on what our young people want, how we continue to center them in all of that decision making, and how we co-create solutions. Right before Freedom Forward was started or incorporated, the founder spent a year studying the entire landscape of what are all the parts of the ecosystem that exist and what are the pieces that don’t exist. And so really Freedom Forward was created to fill in all the gaps that weren’t there in the landscape and the ecosystem.

 

[00:05:12] SA: And Sonya, could you tell us a bit about your background?

 

[00:05:14] SM: So my background is actually in media and education. After graduating college, I taught English in Hong Kong for two years and then fled Asia because of the pandemic. Upon arrival, I heard all the horrifying stories about domestic violence during the pandemic, how it was rising to unsustainable levels. Some statistics said three times, some said 10 or 15 times ordinary levels, which are already too high. So I knew then that I really wanted to join the domestic violence advocacy space and try to make some type of impact with my education and media background. So I ended up joining Narika in 2020, and since then I have not left.

 

[00:05:54] SA: Can you tell us a bit about what Narika does?

 

[00:05:57] SM: So Narika is a Bay Area-based nonprofit which supports survivors of domestic violence, especially those in the South Asian and immigrant community, which we know is quite large in the Bay Area. We offer free and confidential helpline and case management services, support groups, prevention and awareness workshops, and wellness and economic empowerment programs. And all of this is recognizing, just as Bindu is saying, that all of our services need to be holistic and survivor centered because of the complex and wraparound nature of domestic violence. It often takes many, many years for a survivor to get back on their feet, to find any sense of long term stability, empowerment, and happiness. And so we really recognize that by making sure that all of our services are iterative based off of survivor feedback and responding to serious emerging needs, like technology abuse.

 

[00:06:45] BH: So I’ve heard the term technology abuse. Can we double click on that as a term? Is this official industry lingo that refers to one specific thing? Or does this refer to something kind of general? Can we talk about the terminology and start there?

 

[00:07:03] SM: Yeah, absolutely. So technology abuse, also known as technology-facilitated abuse, is really any form of controlling behavior that involves the use of technology as a means to coerce, stalk, or harass another person. So from our perspective, we’re looking at intimate partner violence and tech abuse that is used to facilitate intimate partner violence. And this is important because we often imagine technology abuse, or things like cyberstalking, for example, as coming from shadowy strangers in their basements or the dark web or a foreign country, but the reality is that over 60% of technology abuse and cyberstalking comes from an intimate partner or an ex-intimate partner. So it’s people that we know really well, whom we share our private lives with, who know the answers to our security questions. They know our mother’s maiden name and our first cat’s name or whatever it is. They oftentimes know our passwords and have easy access to our devices because we trust them. We trust them with our lives. So of course, when a domestic violence relationship happens, technology abuse is an easy way for an abuser to control the other person. And we see it in a lot of different ways. It can be very intense, very sophisticated ways, like with stalkerware, but it could also be something as simple as just trying to guess a password, or trying to get more information about bank accounts or joint bank accounts that you may share with your partner, or even sharing internet accounts and subscriptions and being able to monitor what content is being consumed or what communications are being made. So technology is great. It really does influence our lives, makes things easier, makes it easier for us to communicate with each other, but it also means that it’s an easy tool to coerce someone and control them.

 

[00:08:53] BF: Yeah. And just to add, I think, as we think about our young people, again, we have this spectrum of technology-facilitated abuse that might happen with the knowledge of the person that it’s happening to. So this can look like, “Hey, hand me your password. You love me. What do you have to hide?” Or threatening messages, tracking whereabouts, et cetera. But sometimes, of course, it’s happening without the knowledge of the person. Right? And that could look like checking home cameras, checking phone logs, et cetera. So exactly like Sonya shared, there is a spectrum: really simple things, like checking your email, versus really, really complicated, sophisticated things, and it can happen with a combination of a stranger, someone you know, and someone that you trust in your life as well.

 

[00:09:36] SA: So we mentioned that, in the majority of cases, the potential abuser is someone that’s known to the person. When we’re looking at those potential abusers, what kind of people are they in their lives? Is it really just partners? Especially when it comes to children, is it people that they’re meeting day to day? Who are these potential abusers?

 

[00:09:57] BF: Yeah. Again, it is a spectrum. The truth is, again, we are trained to think about strangers lurking on the internet, and that is real. Let’s be real about the fact that it is real. It still happens, and it has evolved into strangers that know exactly where to prey. They know exactly what to say. They know how to gain your trust. And so what starts as a relationship between a stranger and your child or a young person evolves quickly into, “Oh, this is a person I trust. This is the person I play games with online,” and so on. I always really think about who we give unrestricted access to our children to. Is it the relatives? Is it the friend? Is it someone in a social circle? Right? The people that we give unrestricted access to our children are the people, really, we need to start thinking about. And the truth is, the people that facilitate abuse through tech look and sound just like us. They don’t walk around with signs, and therefore are able to continue this really undetected.

 

[00:11:00] SM: Yeah. And I think there’s also the added notion that relationships and family are made easier through technology, and technology fills in gaps that are already present in relationships and families. So I’m thinking, for example, we have child monitoring devices for our babies in their rooms. We have Nest, for example, or thermostat or AC sort of smart home systems that can control the temperature. Even now, I’m seeing a lot of newer car features that have GPS systems factored in that connect to your phone, so you can see who’s using your car and where they’re going. So these things are ostensibly made to make our connections with each other stronger and safer, but they can also be used very creatively in a sinister way to track each other. And we’ve seen all of these used by abusers. So with the smart home systems I mentioned, we’ve had abusers who will go on business trips and then turn off or shut off the heating systems in the winter to freeze out their survivor or just make their life at home really uncomfortable. Or they’ll use these same child monitoring systems to monitor what their partner may be doing in the household, or even control garage openers, making sure that they’re locked to prevent the survivor from leaving, or tracking their car as well. So a lot of times these things aren’t designed to be nefarious, but they’re used for nefarious purposes, and we need to proactively strengthen these systems so that they have better protections against that.

 

[00:12:29] BH: Can we dive into more specific technologies that people might not necessarily think of, or ones that have specific examples? I know AirTags spoke to some of these issues when they were first launched, but I also know that there are concerns that haven’t been mitigated, or blind spots. Can we get into some examples that people might see in their daily lives but not immediately think of?

 

[00:12:55] BF: I think about, again, home security systems and remotely locking your doors in the home. It all, again, seems to have the intention of “let’s keep the strangers out and we’re all in this together,” but then can be used as a method of control: “Hey, okay, we’re in an argument. I’m going to remotely lock the door and you don’t get to get out until something happens.” Right? Changing temperature comes up quite a bit. Things like gifting devices and gifting technology are often, again, another way to monitor someone. So when your child is given a phone, when your partner gives you a phone, sometimes there’s a reason why you’re being gifted technology and devices. And part of it is to monitor. Again, simple things I think about are just phone logs, and having to have the discipline of clearing your phone logs, especially when a survivor is seeking help. So continuing that discipline of, “Oh, okay, I need to clear my phone log. I called a crisis counselor. I called my lawyer. I can’t have someone monitor this.” These are all, again, other ways where it may seem like it should be easier to ask for help, and these technologies should be used for good, but of course they can be misused.

 

[00:14:04] SM: Yeah, I definitely agree. And just to get back to Apple as well. So the AirTags example was a huge concern. Just to explain, these are tags that you can put onto your possessions to track where they are. So you might put one on, I don’t know, a laptop or important items so that you can track them in case they get stolen or you lose them. And so of course, you had a lot of stalkers and people just placing them in the purses of random women that you would see on the train. And so you’d have women going home and getting alerts that something was tracking them, or actually seeing someone following them because a tag had been planted on them. And then Apple announced a new iOS feature for the next upgrade, which will allow you to delete or edit texts that you have recently sent. So this is obviously also dangerous, because an abuser can send really threatening text messages to the survivor to incur emotional damage and make them feel unsafe, and then immediately delete them once the damage has already been transmitted. So this is another way in which it becomes more difficult for survivors to document the abuse, which is really important when you’re trying to prove that technology abuse is happening. The more sophisticated the technology abuse, the harder it is to track and document. And so this will be another way in which abusers can send messages to instill fear in the survivor without incurring any punishment for it.

 

[00:15:28] BF: Yeah. And just to add, I think all of this culminates in a person who’s experiencing this really wondering if they’re going crazy, right? When the temperature in your home is changing, when something you thought was somewhere is not there anymore, when a text message you swore was there changes, you’re second guessing everything that’s happening. Or some of these things, again, seem so outrageous that you tell somebody else and they’re like, “That can’t be real. Nobody can lock you in. What do you mean?” And so on. So I think this all really culminates in more disbelief when a survivor potentially speaks up, because it seems too much to believe, right? And then it results in someone who may have had that one bit of courage to speak out saying, “You know what? It’s not worth it. I’m not going to ask for help anymore because it just sounds outrageous. I don’t even believe it myself. I don’t know if I can ask for help.”

 

[00:16:19] SM: Yeah. And that’s also weaponized by the abuser as gaslighting, to say, “You’re crazy. You’re imagining things. This isn’t happening. That didn’t happen.” And then that again undermines the credibility of the survivor’s experience and testimony. We’ve even had survivors go to the police and say, “This is happening. I know that it’s him. I know that it’s him sending these messages and him controlling my environment,” and have officers or even other service providers say, “That’s not even possible. I don’t think that’s true.” Or, “Are you sure?” And it’s because technology is advancing at a pace where abusers are able to misuse it much faster than service providers can catch up and respond.

 

[00:16:59] BF: And it’s similar with policies, right? Legislation and policies cannot keep up and are trying to keep up with changes. And so you’re also seeing areas where there might be a law or a new law about something, and the people who are in charge of enforcing it are like, “We haven’t gotten trained in cyber safety and cybersecurity.” Or sometimes people saying, “Oh, I don’t even think that’s a thing. I don’t think you can complain about sextortion.” So how do our laws and our policies keep up with the evolving nature of technology-facilitated abuse, and how do we ensure that enforcement also happens and that every person knows their rights around it?

 

[MUSIC BREAK]

 

[00:17:55] BH: We’re going to get into some details about tactics for good practices, looking out for problems, and general community issues. But I’m curious, right off the bat, before someone gets into a position where they might feel threatened or there’s possible abuse and they want to take action: in terms of just generally practicing good digital hygiene, how might one communicate around adopting this? For example, if I tell my spouse, “I know it’s kind of neat that you can find the car’s location and that’s sort of practical, but in general, I just prefer not doing that.” Having that conversation with people in your life before you have reason to think that they’re going to abuse that, just practicing good behavior. How does one kick off that level of communication, that “I’m making the choice to turn off location on my messages to you, and it’s not you, it’s just something I’m doing”? How might one even have that initial conversation with their mom? Just, “Hey, I just don’t have the location turned on. It’s just what I do.” I’m just sort of curious how to kick that off with the people who you do trust.

 

[00:19:12] SM: Yeah. I actually relate to that a lot, because I have shared location with my family since high school for safety reasons. And it’s something we’ve just never changed and is still true, even though I am an adult woman. And I think my mom does still occasionally check on me sometimes just to make sure I’m safe. And I know it comes from a place of love and care and safety, but you’re absolutely right. And what you’re talking about is essentially a conversation on boundaries, extended to the tech space. And I think it’s important to just acknowledge that it does come from a place of love and care, but isn’t it a better extension of trust to establish boundaries around these practices, like when and in what situations we can use these tracking mechanisms? Establishing each person’s level of comfort as well, over how often you can check or whether they want to have these shared, I think is really important. And I think it does depend on context and the reasons for it. Like, I am a woman and I do feel safer when I have at least one other person who can check up on me. But is that necessarily the case for every person? Probably not. And yeah, I think it’s important to, again, place that conversation on a basis of trust. It is a more trusting relationship if we don’t need these things, but how can we also have legitimate safety practices that aren’t invasive or can’t turn into privacy issues?

 

[00:20:31] BF: Yeah. And when I think about parents and children, often, I think about how it’s easy to jump into the, “We’re shutting off all tech. No, no, nothing’s happening.” It’s easier to arrive in that space of fear and paranoia and anxiety, but really, of course, any study will tell you that doesn’t get us anywhere. And so really it is the information and conversation and dialogue that is key to maintaining safety in the online space. The truth of the matter is technology has made the world extremely small. And so you have our young children able to make friends and be befriended by really anybody. When I think about video games versus social media: on social media, your guard might be up maybe a little more, but in a video game, there is a presumption that we’re all here for the same reason, so I should trust you, right? Technically. And I think these all are, again, things that in conversation with our young children and with each other we’re able to recognize, learn, and name as potential areas where we need to be safe, and to consider what boundaries we keep around keeping our private information private, right? Like resisting the urge to post on Instagram when you are in that space. It takes someone five minutes to figure out that if you are here for this class, you are going to be here for that class the next week as well. Right? So really thinking about how we keep our private information private, and resisting the urge to open all of that up to the online world, where any of it can be shared and re-shared and stolen.

 

[00:22:05] SM: Yeah. And I also think consistent check-ins are good. We shouldn’t assume that once permission is granted, that’s the case for all time. For example, if you’re talking about a partner or a friend, just having regular check-ins like, “Hey, I noticed that this is still going on. Is that okay? Do you want to change that? Or I feel a certain way about this, can we change that? And here are the reasons why. I still love you. You’re still my friend or partner. I’m just feeling this way because I don’t think it’s necessary for our relationship.”

 

[00:22:32] SA: So some abusers or potential abusers will actually be working in the tech space and sort of be a bit more able to use even more sophisticated methods and tactics to track their victims. And I was wondering if you could tell us a bit about what those kinds of tactics are, how that’s being exploited.

 

[00:22:51] SM: The most common one would probably be very sophisticated stalkerware that is able to be planted into a device such that it’s almost undetectable. And those are very difficult to find. There are some ways, depending on the phone or the device or the operating system, to find out what it is, but a lot of times with those situations, we end up having to just provide a safe phone or find a new burner phone or device for that person. And it’s absolutely crucial to, at that point, be in a situation where you are apart from the abuser, so they cannot reinstall it through whatever means they did the first time. But there definitely are very sophisticated tools like stalkerware. Honestly, though, we find that very often it’s just simple devices, like the ones we’ve mentioned, that are used for purposes beyond their original intention. What’s most sophisticated is when they’re able to do it in really subtle and quiet ways that are almost undetectable for the survivor, and then gaslight them when they feel that something is wrong or that they’re going crazy.

 

[00:23:56] SA: Thinking about children in particular, we’ve mentioned before that they’re particularly vulnerable to, I guess, these trusting personal relationships that might leave them open to this. Can you share a bit about the overlap or the intersection between technology and the commercial sexual exploitation of children?

 

[00:24:15] BF: As I shared before, I think as our reliance on tech continues to increase, what really is happening is that traffickers, or people that want to prey, continue to have easy, unmonitored, consistent access to our children. Right? And this is, again, when I think about how life was before, right? We never really had the ability to have strangers or other people have unmonitored, easy, consistent access to our kids. Often what’s happening online is that this access can happen and stay persistent over days, weeks, and months, while the person who’s attempting to traffic this young person may lure them with promises of money or food or gifts or travel. And so all this, again, continues to build up because it is happening online. Of course, as I shared before, I think video games and related chat rooms are also dangerous in connection with social media, because, again, there’s this assumption of something that’s common, a common goal or a common motive of gaming. Typically, one thing to think about when someone is in a video game or playing games together is that part of grooming is sometimes moving, or encouraging the young person to move, to an off-platform chat in an attempt to say, “I want to continue to communicate with you outside of these games as well.” There are a lot of studies showing that many of the youth that have been trafficked are now being advertised online. Back in 2004, the percentage of trafficked youth who were advertised online was under 35%. But now we’re talking about over 75% of youth that are being trafficked or sold for sex also being advertised online. And so this again is blurring the lines from the past world, where solicitation would happen on the streets. If you brought a young person, a young child, and put them on the street, someone would notice. Sure, they’d be sold, but you would notice. But now all of that buying and all of that selling is moving online. And so the demand is easier to meet: accessing youth and purchasing youth for sex or for a party is easier because now the selling and the purchasing is happening online. There are also studies saying that two out of five kids have shared that they’ve been approached online by someone attempting to befriend them or manipulate them. Right? Again, back to the theme of: as technology continues to give unmonitored, unrestricted access to our young people, what does that mean for solicitation and having strangers or other people approach them? I also want to add the correlation between child pornography and trafficking: as the demand for child pornography increases in the online space, where it is primarily distributed, you will see that this correlates with an increase in the demand for trafficking and for purchasing young people. So it may start with child pornography and the desire to watch children, and then eventually becomes, “Well, there is an easy way, I can just purchase this young person for a party tonight.” The piece that is often difficult with minors or young people is that as they get older, you have young people in their teenage years believing that they are in control of the entire experience. They believe that it’s their choice, that, “Hey, I just have to go to a party tonight and I earn $5,000.” But remember that they’re still minors according to the law, and they cannot consent to being in the act. Right? And so recognizing that some of this gets extremely complicated when victims, or youth that are being trafficked, believe it is their power, or believe that it is their expression of love for their partner. When their partner tells them, “I think you should sleep with this person. We need some money because things are hard at home,” their expression of love, or their expression of “I want to feel powerful,” is to do this. And that gets very tricky, because you are a victim in many ways, but you also feel powerful in that experience as well.

 

[00:28:17] BH: How might a parent develop a mental model for what is and isn’t safe in today’s world? I consider myself pretty technically fluent, but also when I think about parenting, I sometimes think about how I was raised, which is not often applicable. And it seems like some people might make that mistake of mapping how they grew up onto what things are like today. What does it take to develop an accurate model for what’s actually safe and dangerous and making good choices?

 

[00:28:47] BF: Yeah. I feel like if we knew the answer, we’d all sleep better at night. I don’t know. I think it starts from being curious, learning, listening, coming from a place of wanting to understand, right? I think many of us inherited styles of, “Hey, you don’t pay the rent. You listen to what I’m telling you.” Right? But moving into the space of curiosity and moving into the space of engaging in dialogue and discussion. I really believe the best thing we can do for our young people is to stay curious, stay open, stay nonjudgmental, stay in a place of, “I will love you no matter what you bring up,” and encourage children to watch out for signs and flags. And that’s really the most we can do, because yes, it is simpler to take away someone’s phone and take away someone’s devices, but know that that will not keep our children safer, or maybe it will only postpone something. But really what we can do is continue to engage in conversation and dialogue about, “Hey, if you think these things are difficult or you think someone’s encroaching on all your boundaries, I think it’s time that you have a conversation with a trusted adult.” What I will also share is that there are many instances where the family is involved in the trafficking of a young person. And so in those cases, of course, everything changes, the dynamics change, and you want that young person to talk to a trusted adult, preferably someone in school, someone that can actually support them. But in instances where we’re really just thinking about how we start these dialogues, I think it comes from conversations, learning, and inquiry within the space.

 

[00:30:17] BH: And we’ve touched extensively on how technology can be abused, but how is technology also used to stop the abuse?

 

[00:30:27] SM: There are some really great apps that I’ve heard of that do support survivors. So I know that there’s one that makes it easy for survivors to record abuse and automatically backs it up, so that if the abuser discovers it and deletes it, there’s still a backup copy. I know that there are definitely apps that also help you identify security risks. There are obviously antivirus apps, some of which can help identify stalkerware. There’s also, I think actually, Bindu, you told me about this one, an app that masquerades as a notes app, but is actually a resource app for survivors. And so if an abuser enters it, it just looks like a notes app, but the survivor is able to utilize it for other functions. So there are really cool tools. As we’ve mentioned, technology is making lives easier. It’s designed to improve our lives and connect us with others, but it can be abused really easily. And I think the point is that while there are technological solutions, we also need to not focus always on solutions, but also focus on proactivity and preventing these situations from happening in the first place. Because you can have all of these really great apps, but it also means that there’s a risk of discovery. There are also ways in which the abuser can track what apps you’re downloading. Some of these services and apps are not free, so there may be a digital record if you pay for them, which your abuser can discover. Or even, a lot of times, I think on the iPhone, you can have it set so that you need to ask permission from the owner of a shared family account to be able to download or purchase an app. So as with anything digital, there is always a risk of discovery and an escalation of abuse. So I think our energies really should be focused on prevention and proactivity, as I mentioned.
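
To make the backup pattern Sonya describes concrete: below is a minimal sketch of the “save locally, mirror remotely” idea, where every saved entry is copied off-device immediately, so a later local deletion cannot destroy the evidence. The names (EvidenceEntry, uploadToRemote) and the storage layout are illustrative assumptions, not any real app’s API.

```typescript
// A minimal sketch of the "save locally, back up automatically" pattern.
// EvidenceEntry, uploadToRemote, and the storage layout are hypothetical.

interface EvidenceEntry {
  id: string;
  timestamp: string; // ISO 8601 time the incident was recorded
  note: string;      // description, message text, screenshot path, etc.
}

// Stand-in for an authenticated call to storage the abuser cannot reach.
async function uploadToRemote(entry: EvidenceEntry): Promise<void> {
  console.log(`Backed up entry ${entry.id} off-device.`);
}

const localStore = new Map<string, EvidenceEntry>();

// Saving and mirroring happen together, so deleting the local copy later
// (for example, if the abuser finds the app) cannot remove the backup.
async function recordEvidence(note: string): Promise<EvidenceEntry> {
  const entry: EvidenceEntry = {
    id: crypto.randomUUID(),
    timestamp: new Date().toISOString(),
    note,
  };
  localStore.set(entry.id, entry); // local copy, visible in the app
  await uploadToRemote(entry);     // backup copy, outside the device
  return entry;
}

// A local deletion leaves the remote backup intact.
function deleteLocal(id: string): void {
  localStore.delete(id);
}
```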

 

[00:32:12] BF: Yeah. And just to add, I think we’ve recognized that as we share many of these tools and solutions, we’re also really opening this up for people that cause harm to become aware of all the different ways someone can seek help, and all the different ways you can further exploit and abuse someone. So it is really a very slippery slope in terms of, as we continue to raise awareness, how much awareness do you raise, knowing that in your audience are both the people that cause harm and the people that receive harm. And so it definitely continues to be a challenge as we think about technology-facilitated abuse. One of the exciting things I had seen was a legal chatbot. Basically, it let you ask questions like, “Okay, this is happening. I’m in California. Is there a law connected to this?” Right? And it would spit out some answers, and it really was brilliant in many ways, because many times a person receiving harm does not want to speak to a person; that’s daunting in itself. And so the ability for it to be automated, the ability for someone to hide under their covers and type to a bot that’s supported with legal answers, which, by the way, is an area that is very inaccessible, legal services are extremely expensive, that really was a very beautiful and simple way for survivors to know their rights. And of course, there are many, many other simple things, like platforms that, based on the amount of money someone might want to spend on a supportive service and the type of support they need, like therapy versus legal versus medical, would spit out answers based on geography, all of that. So again, there’s a spectrum of easy, simple, and larger solutions. I think a lot of these tech solutions are begging to be created too. Often, I feel like in the nonprofit space, we’re trying to catch up with tech solutions and we just can’t, because at the heart of nonprofits, social workers or social activists are sometimes people that don’t know tech. And so really, how do we mesh these worlds where we say, “The issue continues to evolve at these alarming rates, but the solutions and the people that support these solutions, or solve for them, or help, aren’t evolving at that pace”? How do we make sure our crisis counselors and our medical teams and our schools and our enforcement teams all catch up, and that our solutions and our apps and our tools also catch up? That is really an important question.
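
The chatbot Bindu describes is, at its core, a lookup from a jurisdiction and an issue to plain-language information. A minimal sketch of that routing idea, with placeholder entries rather than real statutes:

```typescript
// A minimal sketch of the (jurisdiction, issue) -> information lookup behind
// the kind of legal chatbot described above. The entries are placeholders,
// not actual legal guidance.

type Issue = "stalkerware" | "threatening messages" | "nonconsensual images";

const entries: Record<string, Partial<Record<Issue, string>>> = {
  CA: {
    stalkerware: "[placeholder: plain-language summary of the relevant California law]",
    "threatening messages": "[placeholder: plain-language summary of the relevant California law]",
  },
};

// Returns the stored summary, or a fallback pointing to human help.
function answer(state: string, issue: Issue): string {
  const found = entries[state.toUpperCase()]?.[issue];
  return (
    found ??
    "No entry for that combination yet. A local legal aid organization may be able to help."
  );
}

// "Okay, this is happening. I'm in California. Is there a law connected to this?"
console.log(answer("ca", "stalkerware"));
```

The value Bindu points to is less in the lookup than in the interaction model: automated and private, with no requirement to speak to a person.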

 

[00:34:35] SA: Yeah. You mentioned that the issue continues to evolve as technology develops so quickly. And I’m curious about your perspective on how the pandemic impacted the snowballing, or potentially worsened, this technology abuse issue.

 

[00:34:50] SM: Yeah, definitely. At least for our organization, we saw a huge spike in domestic violence cases and also a big spike in technology abuse, because a lot of survivors were trapped in the same household as their abuser, who then had unrestricted access to all of their devices. There was also just a surge in consumption of new devices at a time when we were all at home and bored and ordering new things. I know at least 10 different families who got into Alexa and new devices during the pandemic because we were bored and needed connection with other people. There was an escalation in time spent on video games and time spent on the internet and time spent on phone calls and text messages, which then increased the amount of abuse that people could experience. And because there was also a severe escalation in abuse, it was much harder for survivors to call for help, because their abuser was in the next room and because a lot of times their calls or communications could be monitored. It was very easy for an abuser to get into their email and see with whom they were communicating. We had survivors whose email accounts suddenly sent messages saying they no longer wanted services from their lawyer. Or even plane tickets that survivors had bought to try and escape or leave or go to their home country, which were suddenly canceled because they were discovered in the survivor’s inbox. So just that closeness within the home, matched with the increase in online communication that we were witnessing at the time, was just a perfect concoction for a rise in tech abuse, which was very challenging to deal with as an organization.

 

[00:36:31] BF: I definitely remember the first couple of months where, similar to what you were saying, Sonya, people that would have called in for support literally couldn’t say the words anymore, because now the person harming you is in the same house. Before, you had the luxury that they would go to work, or when they were out on a chore, you were there at home and you could call for help, but now you couldn’t do that anymore. That assumption was gone. And so you were literally watching people that could not use their voices to ask for help anymore. I agree with you, Sonya, that it was the perfect storm. It really allowed for this escalation of bitterness and angst and violence. I feel like probably all families were tested. We were suddenly in more proximity with our families than we ever wanted or asked for. And for many people that were experiencing abuse, this escalated, right? It escalated, and there was no space to walk out in a huff and bang the door and cool off. Now you had to do it all at home, and it definitely resulted in a lot more violence. I would add, again, thinking about it systemically, we noticed of course during the pandemic such an increase in economic hardship. Right? And I think it also then trickled into the anger and the frustration, the violence that comes from, “Okay, I’ve lost the two jobs that I have now.” And what does it mean? I know someone whose partner had lost funding for their startup in Silicon Valley during the pandemic and decided that that was a justification to cause harm to this person. Right? And so you watched as decisions made in the workplace around the pandemic now resulted in more violence and anger and hatred at home. We’d see healthcare workers that came home and were accused by their partners of bringing COVID inside. And so anger and violence, again, would start. So everything around the pandemic that couldn’t be controlled would then manifest at home in many ways.

 

[00:38:26] SM: Yeah. And all of that is to say, too, that while all of this was increasing, it was always disproportionately affecting people of color, women, queer people, immigrants, very vulnerable communities that often have much less access to services and help or experience much higher barriers as well. And so we definitely witnessed that across the board.

 

[00:38:47] BF: Yeah. And I’ll just add that right when the pandemic started, every single organization, every single structure, was also really just trying to figure out how to do things. It resulted in courts that suddenly said, “Wait a second, we need to stop while we figure out how to move online.” You had crisis centers say, “Hold on a second. We don’t know how to do things online.” Right? And so while you saw the need continue to increase, you suddenly saw the support systems break, because the support systems weren’t ready to move everything online. They weren’t ready for the pandemic. And so you really saw, again, that perfect storm: the need and the issues really skyrocketed, while the people that were there to support this weren’t able to, or spent at least the first six months trying to figure out how to process everything online and how to move things online in this world.

 

[MUSIC BREAK]

 

[00:39:57] BH: What does it mean overall to practice effective digital hygiene? We touched at the top of the show on maybe setting some of those boundaries, but then ultimately, what’s the big picture of going about my day with the right overall strategies, tactics, and mindset on this whole topic?

 

[00:40:17] SM: Yeah. So I guess when I think of digital hygiene, I think of setting aside time intentionally to go through all of my technology, all of my devices, and check through my privacy settings and where all of my devices and accounts are logged in. Change passwords. It seems like every year there are new breaches when it comes to privacy. So changing passwords to safe passwords, ensuring that you’re using safe browsers and good antivirus, and backing up files. And all of these are very standard practices, but I guess what we need to factor in more, from a close intimate partner and family violence perspective, is also checking on which of our trusted people and friends have access to these things. Especially considering the pandemic, we have a lot of changes socially in our lives. We may have entirely new groups of friends, or have fallen out of contact with a lot of people that we were very close to before the pandemic. So I think a lot of people are kind of forgetting what sort of digital pathways existed before the pandemic and which are still present afterwards. So I think it’s important to do these kinds of hygiene checks very frequently. But all this is also to say that while it’s important to continue to maintain that hygiene, we again also have to pay attention to prevention measures and what tech workers can do to prevent these things from being abused in the first place. Because a lot of survivors feel that the onus is on them to always maintain their safety, always do these checks. That’s a very fearful state, and they have a right to continue to be able to use these devices freely and safely, and we should always do everything in our power to make sure that they can do that.
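
One way to picture the intentional, recurring checks Sonya describes is as a checklist where each item carries a review interval, and anything overdue gets flagged. A minimal sketch of that idea; the items and intervals below are illustrative examples, not recommendations.

```typescript
// A minimal sketch of a recurring digital-hygiene checklist: each item has a
// review interval, and anything overdue is flagged. Items and intervals are
// illustrative examples only.

interface HygieneItem {
  task: string;
  intervalDays: number; // how often this check should be repeated
  lastReviewed: Date;
}

const checklist: HygieneItem[] = [
  {
    task: "Review which devices and sessions are logged into each account",
    intervalDays: 30,
    lastReviewed: new Date("2022-05-01"),
  },
  {
    task: "Rotate passwords on key accounts",
    intervalDays: 90,
    lastReviewed: new Date("2022-03-15"),
  },
  {
    task: "Audit who has shared access (location, photos, subscriptions)",
    intervalDays: 30,
    lastReviewed: new Date("2022-06-10"),
  },
];

// Returns the items whose review interval has elapsed.
function overdue(items: HygieneItem[], now = new Date()): HygieneItem[] {
  const msPerDay = 24 * 60 * 60 * 1000;
  return items.filter(
    (item) =>
      (now.getTime() - item.lastReviewed.getTime()) / msPerDay > item.intervalDays
  );
}

for (const item of overdue(checklist)) {
  console.log(`Overdue: ${item.task}`);
}
```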

 

[00:42:01] BF: The word I loved from what you said, Sonya, was intentionality. Right? I mean, somewhere in the back of our minds exists the, “Oh, yeah. Oh, technology, and yes, it could be hacked at any point,” and so on, but I think being intentional about staying safe and staying alert and aware is important, continuing to recognize that it is evolving at paces none of us fully understand. And so continuing to stay informed and engaging in discussion with people we love about things like boundaries, about changes that are happening in tech, and so on. Of course, again, simple things I think about are always doing simple checks, like what apps we still have, do I still need all of them, changing and keeping strong passwords. Again, it seems very simple, but I feel like every once in a while we’re like, “Oh, yeah, I’m over it. I’ll just use the same password. It’s easier.” Right? And so again, remembering to stay alert and intentional about staying safe online, recognizing that everything is just easier to access and stays unmonitored online. And so with that lens, knowing that all the precautions we took in real life and in the real world are a little different from how we do things online.

 

[00:43:08] SA: What are some of the things that the tech community, engineers, developers, people listening to this podcast, can do to help prevent and stop this kind of technology abuse?

 

[00:43:22] BF: I’d start by saying learn about the issue. Really immerse yourself in learning about tech-facilitated abuse and tech-facilitated coercion. I’ve recognized, as I’ve shared this with many of my tech friends, that most of them come back and say, “Oh, wow! I didn’t even realize it had a label. I just thought I was being protective of this person.” And so really learning about ways that we may contribute to the issue in our own lives, and even learning about it through the lens of recognizing what technology-facilitated abuse is, I think is how we start. But I would just add that the most important thing someone can do is reach out to a local organization that works on a social issue, preferably a local one, preferably one that’s small, that really probably cannot afford to have someone do tech solutions for them, and offer to support them with solutions of some sort: offer to support with creating a chatbot or a routing system or even a website. Just reaching out and engaging in discussions about how to be a part of the solution is something that comes to mind. I would add: invite local organizations to your spaces. Invite them to your tech talks. Invite them to conversations, so that as people continue to develop new products and new solutions, they’re always thinking about this from the lens of intimate partner violence or youth trafficking and so on. Because sometimes, as we’re building solutions at Google and Apple and so on, we aren’t particularly thinking about these use cases, or that these could be used or reused for these reasons. So really inviting social organizations and nonprofit organizations into your spaces to have these conversations, and allowing more decision makers and more creators to say, “Oh, wait a second. I never thought that this is how it could be used,” would definitely help with us all really working towards getting upstream and getting ahead of the issue.

 

[00:45:15] SM: Yeah, I definitely agree. I think: just make better, safer technology that keeps these entirely new contexts in mind. Because we always think about, as we’ve mentioned, strangers, and how to protect our passwords or our information from strangers, but we have to consistently imagine scenarios in which the abuser or the person accessing your information lives in the same household and has constant access to your devices. And this is a completely different context and perspective that is just not treated with the same weight as we give Russian hackers or other forms of hackers. And I would also say, as Bindu mentioned, inviting us to be a part of the conversation is really important, because it still seems like every day these decisions are being made in silos, without effective information from the nonprofits that are collecting and picking up the pieces and the ramifications of what happens when these technologies are not designed carefully with vulnerable people and with women and survivors in mind. And I consistently come back to, again, Apple AirTags and all of these similar devices that are used against survivors, and it’s these vulnerable people that always bear the brunt of decision making that is not as careful or holistic as it should be.

 

[00:46:33] BF: Yeah. And the reality is, even as we’re having this discussion, it’s interesting that there might be an assumption that our tech sector or our tech workers haven’t experienced this or haven’t perpetrated it. Right? And yet it is so prevalent that it is absolutely happening within the tech industry. And so we need to create the space for those who’ve experienced this kind of harm within tech companies to also elevate their voices in a safe space. Right? No survivor at a tech company wants to say “I’m a survivor” to gain sympathy points. Right? And so if we don’t create safe spaces that will lead to the creation of better tools and platforms, we’re unable to, again, stay ahead of the problem.

 

[00:47:18] BH: Yeah. I mean, we’ve had support inquiries and abuse reports about targeted harassment on our platform. Targeted harassment is probably a tiny minority of the type of day-to-day stuff our support team deals with; mostly we’re talking about people being generally rude without it being about a personal relationship. But in the history of what we do with DEV and Forem and all of that, we’ve certainly had a few instances of targeted harassment. And the way we deal with it is usually a matter of the right type of human intervention. But upstream of that, I think the fact that these things get to us the way they do probably has to do with the report abuse button being fairly approachable and discoverable on pretty much every type of page where it seems relevant on our site. And that was sort of a design choice. I don’t know if it’s as typical for report abuse to be as visible as it is on our platform. It was just a small decision, but I think it’s made that funnel for us a little bit more approachable and appealing. And I feel like without that, it might not have even occurred to some people that there’s a person on the other end that actually might be able to help, beyond just the block button and things like that. That just came to mind as we were discussing, and it’s very much a support issue the way we dealt with it, but it was combined with, I think, some of our design choices to make the topic of contacting the team about abuse a little bit more visible and approachable compared to similar platforms. I also think these decisions ultimately become part of the organization. And so how do we, as an organization, ensure that someone down the line doesn’t decide, “Oh, why do we have this on every page? Does anyone use it? Do we look at the metrics and decide?”

 

[00:49:17] BF: Absolutely.

 

[00:49:18] BH: “We don’t get enough clicks on this button. Why is it here?” The choices are deliberate, but it’s hard to justify them unless it’s baked in. I’ve seen the thing happen where the good choice kind of fades away if people aren’t keeping up with the dialogue and making mitigating the problem part of the culture. You know?

 

[00:49:36] BF: Absolutely. And even documenting the intentionality behind why the button existed. Right? Also, as you say this, what immediately comes up for me is definitions of abuse. Right? People’s thresholds for what they consider abuse, their definitions of abuse, are vastly different. And so I even think about an info panel that allows us to say, “Hey, this is what we consider abuse. Any of this, naming any of this, constitutes your ability to report it.” Because again, my perception might be, “Oh, you looked at me this way. I might call that abuse.” Right? But then someone else has a very high threshold for that as well. So really normalizing, or naming and putting the boundaries or the box around it, I think also helps people who are questioning or wondering if this is happening to them to be able to actually ask for help. Because there is a circle of people that know something’s happening, and there’s a circle that don’t even know something’s happening, and a very, very small subset that will ask for help. So how do we really make space for all of these different circles to get the support and help that they need?
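
Putting Ben’s and Bindu’s points together, the design pattern is a report entry point that ships on every relevant page, with the platform’s definition of abuse named right beside it. A minimal DOM sketch of that idea; the element structure, the copy, and the “/report” route are invented for illustration, not Forem’s actual implementation.

```typescript
// A minimal DOM sketch of a report-abuse entry point that ships on every
// page and names what the platform considers abuse. The element structure,
// copy, and "/report" route are invented for illustration.

const EXAMPLE_DEFINITIONS = [
  "Targeted harassment of a person",
  "Threats or intimidation",
  "Sharing someone's private information",
];

// Appends the entry point (with an expandable definition list) to a page
// region, so it is discoverable rather than buried in a settings menu.
function mountReportAbuse(container: HTMLElement): void {
  const details = document.createElement("details");

  const summary = document.createElement("summary");
  summary.textContent = "Report abuse";
  details.appendChild(summary);

  const blurb = document.createElement("p");
  blurb.textContent =
    "A person on our team reviews every report. Examples of what we consider abuse:";
  details.appendChild(blurb);

  const list = document.createElement("ul");
  for (const definition of EXAMPLE_DEFINITIONS) {
    const item = document.createElement("li");
    item.textContent = definition;
    list.appendChild(item);
  }
  details.appendChild(list);

  const link = document.createElement("a");
  link.href = "/report"; // hypothetical report-form route
  link.textContent = "File a report";
  details.appendChild(link);

  container.appendChild(details);
}

// Called from every page template so the entry point is never absent.
mountReportAbuse(document.body);
```

Baking the entry point into the shared template, rather than behind a metric-driven flag, is one way to address Ben’s worry that a deliberate safety choice quietly fades away in a later redesign.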

 

[00:50:41] SA: Thank you both so much for joining us today.

 

[00:50:44] BF: Thank you for having us.

 

[00:50:45] SM: Thank you.

 

[00:50:54] BH: Thank you for listening to DevDiscuss. This show is produced by Gabe Segura. Our senior producer is Levi Sharpe. Editorial oversight by Jess Lee, Peter Frank, and Saron Yitbarek. Our theme song is by Slow Biz. If you have any questions or comments, email [email protected] and make sure to join our DevDiscuss Twitter chats on Tuesdays at 9:00 PM Eastern Time. Or if you want to start your own discussion, write a post on DEV using the tag “discuss”. Please rate and subscribe to this show on Apple Podcasts.