Season 8 Episode 4 May 4, 2022

A Surge In Hacks Against Russia, Privacy Concerns With Mental Health Apps, and Lego’s Big Move Into the Digital Space

Pitch

Russia faces an irony when it comes to a surge in hacking against the country.

Description

In this episode, we talk about Lego expanding its online ambitions and its plans to triple the number of software engineers on staff. Then we’ll speak with Joseph Menn, author of the book, Cult of the Dead Cow, and technology reporter at The Washington Post, about a piece he wrote titled, “Hacking Russia was off-limits. The Ukraine war made it a free-for-all.” Finally, we’ll speak with Jen Caltrider, who leads Mozilla’s Privacy Not Included guide, about their research which found that the vast majority of mental health and prayer apps are severely lacking in privacy protections.

Hosts

Saron Yitbarek

Disco - Founder

Saron Yitbarek is the founder of Disco, host of the CodeNewbie podcast, and co-host of the base.cs podcast.

Josh Puetz

Forem - Principal Engineer

Josh Puetz is Principal Software Engineer at Forem.

Guests

Joseph Menn

The Washington Post - Author and digital threats reporter

A security journalist for more than two decades, Joseph Menn is the author of the bestseller "Cult of the Dead Cow: How the Original Hacking Supergroup Might Just Save the World," named one of the 10 best nonfiction works of the year by Hudson Booksellers as well as one of the five cybersecurity books everyone should read by the Wall Street Journal. His previous book, “Fatal System Error,” was the first journalism tying Russian intelligence to organized criminal hacking groups.

Jen Caltrider

Mozilla - Lead, *Privacy Not Included

Jen Caltrider is just your average do-gooder privacy nerd. She lives on a mountain in Colorado with her wife, four dogs, and one cat. When she's not reading privacy policies she's probably reading something much more interesting like fantasy or crime thrillers. She loves technology even if she dreams daily of owning a farm and never logging onto a computer ever again.

Show Notes

Audio file size

53418662 bytes

Duration

00:55:39

Transcript

[00:00:10] SY: Welcome to DevNews, the news show for developers by developers, where we cover the latest in the world of tech. I’m Saron Yitbarek, Founder of Disco.

 

[00:00:19] JP: And I’m Josh Puetz, Principal Engineer at Forem.

 

[00:00:22] SY: This week, we’re talking about Lego expanding its online ambitions and its plans to triple the number of software engineers on staff.

 

[00:00:30] JP: Then we’ll speak with Joseph Menn, author of the book, “Cult of the Dead Cow”, and Technology Reporter at the Washington Post, about a piece he wrote entitled “Hacking Russia was off-limits. The Ukraine war made it a free-for-all”.

 

[00:00:42] JM: The irony here is that this is exactly the mix of criminal and government and activist hacking that has beset the US for so long, now happening quite suddenly to Russia.

 

[00:00:54] SY: Finally, we’ll speak with Jen Caltrider, who leads Mozilla’s Privacy Not Included guide, about their research, which found that the vast majority of mental health and prayer apps are severely lacking in privacy protection.

 

[00:01:05] JC: So I’m like, “Wait a minute, you’ll never share or sell my data without consent, but by just downloading and registering with the app, I’ve given you my consent. That doesn’t feel like it should be consent to me.”

 

[00:01:22] SY: So we have some news coming out of the land of building blocks. In a recent piece published by the Financial Times, Lego is planning to triple its approximately 400 software engineers in an effort to greatly expand its digital ambitions. Lego has been reluctant to get into the digital space in the past, thinking that it might derail their brand of physical play bricks and has missed out on opportunities of being the first to create something like Minecraft or Roblox, much to the chagrin of some of its developers. A senior official at Lego told the Financial Times in regard to the missed opportunities, “Digital has been like a ghost for them.” Up until now, any game related to Lego has been outsourced to third-party developers. But as a part of this venture, Lego has also partnered with Epic, the creators of the massively popular game, Fortnite, to develop a children’s specific metaverse in the next year. Niels Christiansen, Lego’s Chief Executive, told the Financial Times, “We need to give them good and safe experiences in there and make connections also to our physical products.” I think we can all imagine a world where digital play dates exist for children where they pick up and build things together via virtual or augmented reality, which in a world which has experienced a whole pandemic, with tons of lockdowns, as well as physical shipment issues, it doesn’t seem like a terrible move for a toy company. Josh, what was your reaction to this?

 

[00:02:46] JP: I was really surprised that Lego has 400 software engineers.

 

[00:02:50] SY: What are they doing? Okay. To be fair, I am not their customer. I don’t have kids. I don’t play with Legos. Even as a kid, I don’t really remember being much of a Lego person.

 

[00:03:03] JP: Oh, I’ve always been a Lego person.

 

[00:03:04] SY: Are you a Lego person? You know what it was?

 

[00:03:06] JP: Some people are really, really into Legos, like they’ve got little Lego organizers.

 

[00:03:10] SY: Oh, they are hardcore adult Lego people, for sure.

 

[00:03:14] JP: Yes. I enjoy Legos. I'm not like a hardcore Lego collector. But as a kid, that goes all over the place. Absolutely.

 

[00:03:21] SY: I don’t know. I’ve just always seen them as a physical toy company. Like the idea of Lego and software engineers, it never, never would have crossed my mind that they would have that many software engineers.

 

[00:03:35] JP: So I did a little digging around. Okay, 400 seems like a lot at first. You’re like, “What are all those engineers doing?”

 

[00:03:43] SY: Yeah.

 

[00:03:43] JP: Poking around their jobs website and poking around LinkedIn a little bit, I found that it kind of makes sense. So they have lots of engineers working on their e-commerce sites, obviously. Lego sells in all sorts of countries, so there are tons of e-commerce engineers. But also, there seemed to be a decent amount of phone apps that Lego itself publishes that tie in with some of the kits that they sell. So you might get a kit and you can scan it and it’ll bring some AR bricks into your phone and you can play with it that way, or different Lego sets have very specific applications for the phone that change them in different ways. So those all look like they’re developed internally. I didn’t find a lot on this, but I found some really interesting references to software tools that Lego uses internally.

 

[00:04:31] SY: Oh, interesting. Okay.

 

[00:04:31] JP: So things like design tools, creating their manuals, planning out what bricks are going to be used for which sets. That all sounds really cool. But it seems like the expansion, all of the engineers or a lot of the engineers that they’re hiring are mostly centered around these initiatives to get Lego into the metaverse or expand their digital footprint. And I wonder what you think about that as a potential revenue stream. Or do you think kids that are used to playing with these physical objects, could you imagine the brand loyalty from Lego’s physical bricks translating to a digital world?

 

[00:05:14] SY: I think that the Lego brand period is strong enough that if they executed well, yes, I think that could totally happen. I mean, to me, I think the biggest evidence of Lego’s brand working so well for it is the fact that they have a freaking Lego movie and it did well.

 

[00:05:34] JP: Right.

 

[00:05:34] SY: Like, “That’s what? What do you mean?” It’s so confusing. I can’t remember how much revenue it brought in. I think they even made a couple, but they did really well with their movies. And watching a two-hour Lego movie really makes no sense considering that the whole point of the product is to engage and play with it and use your imagination. There’s no relationship between… you know what I mean? Like watching Legos come to life versus you playing with Legos. I feel like that to me just really shows the brand strength of Lego. And I feel like if it’s executed well and if it’s really thoughtful, I think that going from physical to digital is definitely a possibility.

 

[00:06:17] JP: I wonder though, because all the Lego stuff that’s been really popular, like the movies, and there have been quite a few Lego video games that have been popular, they’ve all been developed outside of Lego. But those video games are generally not about playing with digital bricks. They’re very much like the movies. You’re controlling a character that happens to be a Lego action figure in a Lego area. So the success they’ve had in movies and games so far has the Lego brand on it, but it’s not taking the building experience and putting that in a digital world. And I’ve seen Lego try a number of times to demo AR apps and VR apps for building, and those really haven’t gone anywhere. So I wonder about the idea that Lego will be in the metaverse. I mean, we can talk about how nutballs metaverse planning in general is, but the success that they’ve had in media and in the digital space hasn’t come from replicating their physical toys. So I wonder if it’s maybe a mistake to think, “Oh, we’ll just take blocks. We’ll make digital versions of them.” Because you really do lose something not handling the blocks.

 

[00:07:30] SY: If it doesn’t work, I don’t think it’s going to be because they went digital. I think it’s going to be that they tried to do something in-house that they may not inherently be good at. So for example, before Lego did all the outsourcing and worked with third parties to expand their brand and do all those things, they did try to do a lot of that in-house. They hired a bunch of people. They tried to do a lot of things just with their own team. It didn’t work out. They decided that it’s not their core competency. They licensed it out and that’s when it started working. So I don’t think the question here is whether a product where we’re playing with digital Legos is going to work. I think the question is, “Can Lego build a digital Lego product that is going to work? Or does it make sense for them to stick to what they know, which is physical play, and outsource?” Like what they’re doing with Epic, outsource the execution, the creation of the digital version of Lego, to a digital company. I think that’s going to be the real question.

 

[00:08:39] JP: I wonder what trying to transition to digital means for Lego in terms of IP. So in researching the story, I was reading a little bit about the history of Lego and it’s something I wasn’t aware of: they almost went out of business back in 2003. And what changed was they pivoted from doing all these custom sets with a lot of custom pieces to reducing the overall number of pieces that they have to manufacture to keep costs down.

 

[00:09:06] SY: They focused.

 

[00:09:06] JP: And then bringing in IP from other brands. Like today, you can get Marvel Legos.

 

[00:09:12] SY: Star Wars, Harry Potter.

 

[00:09:13] JP: Star Wars, Mario Legos. They started using all these different IPs. And I wonder, they don’t have carte blanche to just do that in a metaverse space. They’re dealing with Epic. It’s not like they could just like bring Batman in and say, “Here’s a Batman Lego for online and digital.” They have to renegotiate all of those rights.

 

[00:09:29] SY: That’s true. That’s a really good point. Yeah.

 

[00:09:31] JP: And all those IP owners, they want their own piece of digital. So I wonder, going forward, if Lego will be able to have that same success, because a lot of their success right now is from using this other IP.

 

[00:09:44] SY: And I guess maybe that raises the question, what is Lego’s core competency? At this point, given the history you just described, and given that, yeah, I do remember not that long ago, they almost went out of business and they weren’t doing well and there was this huge turnaround effort when they hired the CEO and it was a whole thing. So it comes down to today. We know them as this childhood favorite toy. But if you actually look at the things that have worked for the company, are they just really good at licenses? You know what I mean? Is that the core competency? Because if the core competency is their deep and fundamental understanding of the way kids play, they understand play better than any toy company, right? Like they really get what brings joy from play and how to create these magical moments for kids, then I can see that core competency, and kind of the first principles of play, being applicable to digital. Obviously, it will look different, but I feel like if you have a fundamental understanding of it, you have a good shot anyway of translating that to digital. But if that’s not actually why they’re successful these days, if really it’s about, we built this really simple, classic, huge invention in toys, but really we’re successful because we’re smart business people negotiating these licenses, that to me makes them less likely to create a digital play experience that’s successful. So I guess it depends on why Lego is doing well now.

 

[00:11:22] JP: That’s an excellent point. It’s also something that I think a lot of us in companies right now need to focus on as well is what is your core competency. Exactly.

 

[00:11:30] SY: Yeah. Yeah. Absolutely. Coming up next, we talk about how the war on Ukraine has led to an unprecedented surge in hacks directed at Russia after this.

 

[MUSIC BREAK]

 

[00:11:59] SY: Here with us is Joseph Menn, author of the book, “Cult of the Dead Cow”, and Technology Reporter at the Washington Post. Thank you so much for being here.

 

[00:12:08] JM: Thanks for having me.

 

[00:12:09] SY: So you wrote a really fascinating piece about how hacking Russia was seen as off-limits, but the Ukraine war has changed that perception and led to a surge in hacks against the country. Can you talk about why hacking was kind of seen as out of bounds before now?

 

[00:12:26] JM: Well, the main reason is that if you attack Russia or any other country and it’s traceable and you come from, say, the United States, then Russia might escalate and might attack the United States. And it’s something that Russia in particular has shown itself repeatedly quite willing to do. In addition, it sort of undercuts the moral claim. Russian criminals have been attacking the US with ransomware, which has just been debilitating for many hospitals, some critical infrastructure beyond that. And they’ve been stealing money from American institutions for years. And the US government has been yelling at Russia, “You are responsible for criminal activity coming from your borders. You need to get this under control or else.” And that argument goes away if we’re doing the same thing to them. So those are major reasons. And finally, Russia is extremely good in cyberspace. They’ve got very talented people, both criminal and government, sometimes criminal and government at the same time. And you don’t want to attack somebody who’s super capable because they might retaliate against you.

 

[00:13:31] JP: So what is it about the current war in Ukraine that has led to a shift in perceptions in the hacker community about attacking Russia?

 

[00:13:40] JM: Well, there are a couple of things. First of all, there’s just moral outrage. Hackers have the same sorts of moral standards as other people and a great percentage of the world is horrified by what’s going on in Ukraine. That is the most straightforward reason that this is happening. But also, there’s a real sense that if Russia is going to retaliate in cyberspace, what’s left for them to do? I mean, there are things. I mean, it could get worse. They could attack the grid. They could do destructive attacks, et cetera, but it’s already so bad. Crippling hospitals with ransomware in the middle of a pandemic and attacking companies and stealing their data and not only demanding money, but then publishing the documents or distributing them is not only double or triple extortion in some cases and pretty horrific, but it can be a cover for nation-state stuff. Like one of the great problems with dealing with Russia in cyberspace over the last couple of decades is the intertwining of criminal stuff and government stuff. So if there’s really an espionage mission and Russia wants to obtain secret documents, political documents, selectively leak or edit documents and then leak them and get them into the mainstream media, criminal ransomware is a great cover for that. And that is clearly something that is going on. Why hold back anymore is the basic philosophy among many of the hacktivist groups. There are other people that are attacking as well. I mean, they’re just straight-up criminals. Because the Russian government and Russian cyber defenders are so distracted right now, why not go after them for other reasons? So that’s happening too. The irony here is that this is exactly the mix of criminal and government and activist hacking that has beset the US for so long, and it’s now happening quite suddenly to Russia.

 

[00:15:28] SY: Can you paint a picture for us of what this surge in hacks has looked like?

 

[00:15:35] JM: Well, one of the interesting things, I guess, is that the Russian population, in general, has a very distorted view of what’s happening in Ukraine. Russia has gone very far to control the news. You can be sentenced to 15 years in jail for calling it a war instead of a special military operation. And for a lot of the population, this is working. They don’t see what’s happening. They don’t know what’s happening. One of the first indicators that something is seriously wrong can be when their bank website has been DDoSed and they can’t connect to it. Or another major website has been defaced with messages about war crimes in Ukraine, maybe photos of civilian victims. So it’s a way of getting through to the population there that things are really bad. There are also email campaigns, texting campaigns, and I believe even faxing, where they’re trying to get the message across that what’s happening is not okay and that their government is responsible for some horrible things. Inside the government agencies, they are worried about losing reams and reams of emails. In one case, the national TV and radio broadcast agency lost 20 years’ worth of emails. That includes directives on what to air and what not to air. There are other agencies that are also super sensitive, ones that are in charge of monitoring social media for signs of dissent and what areas they’re most worried about. It’s a look inside the Russian government that almost nobody has ever gotten before.

 

[00:17:06] JP: Are there any major impacts that these hacks have had outside of distributing information to the population or taking information from government agencies? We hear a lot about hackers attacking infrastructure. Has anything like that happened in Russia?

 

[00:17:22] JM: There has been some of that and we don’t know who is behind a lot of it. So one of the complications here is that the government of Ukraine has been openly supporting hacktivists, which is something that hasn’t happened before. The Digital Ministry of Ukraine has been retweeting Anonymous and they have launched their own Telegram channel called the “IT Army” in which they suggest targets for hacking. Now a lot of that stuff is kind of amateur hour. It’s denial-of-service attacks and defacements that don’t last very long, on the smaller websites and so forth and so on. But some of the serious stuff that is going on is military hacking. And clearly, Ukraine is doing that against important Russian military and infrastructure targets. It certainly would not surprise me if other allied countries’ agencies are doing the same thing. And for all we know, some of these hacktivist groups also have government direction or government infiltration inside them, and it’s a way of hiding attribution. We don’t know which infrastructure has been hit. I mean, lights haven’t gone off. Water hasn’t been shut down. And that’s probably a good thing because at least the US has said that civilian stuff should be off-limits in cyberwar. There are very few rules or norms in cyberwar. So there’s concern that this stuff can escalate, but people allied with Ukraine don’t want to be the first ones to turn off the lights in Moscow.

 

[00:18:53] SY: Do we know where some of these hacks against Russia are coming from? Meaning any specific groups, hacker groups that have claimed responsibility? Anything like that?

 

[00:19:03] JM: Yeah. And we have to be careful about that phrase, claiming responsibility, because Anonymous in particular is famous for claiming responsibility for things, including, in this case, things that they didn’t do. But if you look at the leaks that have gone to a WikiLeaks-like site called Distributed Denial of Secrets, which in the past has published all sorts of US documentation, leaks from inside surveillance companies, police records showing officers turning a blind eye to domestic violence or that sort of thing, they’re mostly known for that in the United States, but they are going nuts with Russia leaks right now. And they verify them and they publish them. And they told me that they’re getting a lot of material from groups claiming some affiliation to Anonymous. And some of the most important ones have come from two new groups. One is called Against the West and one is called Network Battalion 65. I interviewed Network Battalion 65. A researcher I know interviewed Against the West. And there are some similarities. They’re both small groups. They’re both a bit cagey about whether they have intelligence backgrounds or are getting any help from intelligence agencies. And I just want to say, on Anonymous, Anonymous isn’t really a hacking group. It’s almost like a brand. If you say that you’re a member of Anonymous, hey, presto, you’re a member of Anonymous. And so there are many times when people are going to be hacking anyway, and they just say, “And this is in the name of Anonymous,” and Anonymous is happy to take credit for it. And it’s not like a conventional hacking group.

 

[00:20:38] JP: You mentioned this a little bit before, but right after the war began, the Ukrainian government put out a call for hackers and engineers to join an IT Army to fight against Russia. You sort of alluded to the fact that you didn’t think they were having a very big impact. I was wondering if you could talk about that. Are they having any impact at all? Is it mostly publicity? How would you gauge the response to that call?

 

[00:21:03] JM: So I think it is having an impact. One of the things about cyber operations is sometimes the impact isn’t known for a long time. Sometimes it’s never known. So that’s a big caveat. We don’t know how effective they’re being. There may be really important stuff that we’re not seeing. I also wouldn’t say that they haven’t been effective at all. As I was saying, the thing that gets the most noise is denial-of-service attacks and defacements, which aren’t that big a deal strategically, except that they spread the word and they also provide distraction and cover for more serious stuff. Also, I think in the information space, the propaganda space, what they’re doing is important because they are, again, spreading the word, trying to educate people in Russia about what’s happening on the ground, through these sorts of mass communication campaigns, which historically is part of hacktivism as well. I don’t want to undersell it, because actually a lot of warfare is determined by public will and public perception. And turning the tide of public opinion, if they manage to do it, is actually pretty significant and might hasten the end of the war.

 

[00:22:09] SY: So I want to talk about the book you wrote called the Cult of the Dead Cow, which covers a lot of the topics that we’ve talked about in this interview. Can you talk about the crux of that book?

 

[00:22:19] JM: So the Cult of the Dead Cow is about the oldest and most influential US hacking group. And it goes all the way back to the 1980s when they were bulletin board operators, 13 and 14-year-olds scattered around the country looking for some sense of connection. The group evolved repeatedly, completely transformed multiple times, but it was self-selecting. The only rule is you couldn’t ask to be a member. They pick you. Matt Blaze, the great computer scientist who exposed the flaws in the Clipper Chip and is one of the biggest experts on electronic voting these days, described it as a nerd Skull and Bones, which is pretty close. It’s this elite group, and they were probably best known about 20 years ago at DEFCON, gleefully throwing CDs into the crowd that would allow anyone to hijack a Windows box. And they did this sort of publicity stuff to put pressure on Microsoft to actually take security more seriously. And it, by and large, worked. They realized that Microsoft was a monopoly. They couldn’t get it to do things through normal channels, reporting bugs and flaws and issues with the architecture and even communicating directly to Microsoft customers, which many of them had done. So they decided to have a media circus at DEFCON and that actually did the job.

 

[00:23:38] SY: Smart.

 

[00:23:40] JM: So they’ve kept doing things like that and they kept evolving. There was a lot of overlap with the L0pht, the pioneering hackerspace, the first sort of publicly known hacker workshop in the Boston area. There were four members who were in both the L0pht and Cult of the Dead Cow. So it was sort of a good cop, bad cop thing. And some of them testified before Congress in ’98 that they could bring down the internet because it was so badly flawed within BGP, the Border Gateway Protocol. So they’ve done quite a number of things, and individuals in it have gone on to found billion-dollar security companies and to run cybersecurity grant-making at DARPA, the people that brought you the internet in the first place. And they’ve all done these amazing things. One of them is currently running for governor of Texas. His name is Beto O'Rourke, who I revealed in the book to have been an early member. But one of the big issues for me, why this book is important, is they invented the term “hacktivism”, which is what is going on now at a bigger scale than ever before, in part because of the book inspiring some newer people who hadn’t heard of Cult of the Dead Cow, or had only vaguely. They tried out a number of definitions for hacktivism, but one of them was security and hacking work in service of human rights, including the globally recognized right to information. It’s very hard to get a bunch of hackers to agree on anything, but information being available everywhere and censorship being bad is certainly a core tenet for most hackers who think about these things at all. And initially, they focused on China, which was censoring the net and tracking people, and they helped develop tools for evading censorship in the Great Firewall of China, which prevented a lot of people from seeing what was being reported elsewhere. So they contributed to the development of Tor.
They did their own sort of browser for Tor, and that prompted the people who were actually behind Tor to add a browser into their service, which now everybody uses. They were also instrumental in helping shape the development of a place called the Citizen Lab at the University of Toronto, which is now the foremost researcher studying how governments use technology to surveil their own people. So they’ve had this wide-ranging influence. People they taught and who worked with them have gone on to run security at Facebook, at Yahoo, have had very high and influential positions at Microsoft and Apple, and, as I said, in the US government. So if you look at their history, it’s sort of the evolution of attempts at figuring out the right ethics and how to do the most good for the most people. And it’s an idea that was handed off to Anonymous and other groups and now these new groups who explicitly say Cult of the Dead Cow is a model.

 

[00:26:32] JP: I was going to ask about that. How does the legacy of the Cult of the Dead Cow affect what’s happening today? Is the group still around? Or is it more of an influence of their members?

 

[00:26:43] JM: The group is still around. Some of them are extremely influential and they also do work quietly behind the scenes, in some cases assisting, and I’m not going to say which of the groups we’ve talked about, but they’re in the space providing advice, some technical help. And when I interviewed Network Battalion 65, my book was right behind the leader of the group that I was talking to. And he said, “Yeah, I’ve got it right here.” It’s an inspiration for a lot of people. In the book, I quote a guy named Dug Song, who’s like a fellow traveler, who founded Arbor Networks and then Duo Security, which was bought by Cisco for a billion or two a few years ago. And Dug said, “Too many people think about hacking as white hat and black hat, and it’s much more useful to think of it like the Dungeons & Dragons matrix, from lawful to chaotic on one axis and from good to evil on another axis.” And he said the Cult of the Dead Cow is chaotic good. Hitler was elected democratically. The people who were following his orders were lawful evil.

 

[00:27:56] JP: Right.

 

[00:27:56] JM: So just because there’s a law about something doesn’t mean it’s right. I think hackers in particular have had to wrestle with this a lot, especially in the olden days, because everybody in the beginning was breaking the law. Everybody was stealing phone service to be able to dial in to a bulletin board from another area code, and your parents would get stuck with hundreds of dollars of phone bills. And so they borrowed credit cards or credit card numbers or calling card numbers or whatever. They were all criminals. And they had to think about right and wrong. You may disagree with many of their decisions and where they wound up, and they certainly disagreed with each other, which is one of the things that goes into the book. I think the debates are very interesting. But they all had to think about it. They all had to think about whether a given law is worth breaking or not. And I think that’s really important. That sort of critical independent thinking is very important and something the hacking community needs to bring back. These days, you can go to a nice college and do cyber-y things and then get a nice job in a big corporation doing cyber-y things, without ever thinking about the moral implications of your decisions. And that means you can sort of sleepwalk into doing things that are wrong and bad for people, bad for security. It’s important to be an independent thinker. And that’s one of the things I was trying to get across with the book and that’s why it’s resonated with so many hackers today.

 

[00:29:22] SY: So with this myth of Russian cyber superiority being essentially broken, what do you think this means for the future of the country?

 

[00:29:33] JM: Well, it’s really interesting. I think that chaos is going to spread. I hope it doesn’t escalate to critical civilian services being shut down. I hope that it means that Russian ransomware gangs will stop attacking as many Western things. We don’t know about any of that. It could easily spiral more out of control. One of the risks, and why this sort of vigilante stuff is generally not officially encouraged, well, there are a couple of reasons. One of them is you can mess up existing government-backed operations. Another is that the people you’re attacking might be able to attribute the attack and come after you in such a way that it harms not just you, but others around you. They might take out your ISP. They might take out the company where you work during the day. There might be a lot of stuff. And then you have this kind of wild, chaotic escalation, which would be a bad thing. On the other hand, I think it’s a very significant change that broad numbers of people with some hacking skills or communication skills or related interests are now thinking about computer security and hacking in broad political terms. The Cult of the Dead Cow initially focused on the Chinese government because some of the members of the Cult of the Dead Cow worked for the US government in their day jobs. Some of the members of the Cult of the Dead Cow hated the United States government and would never work for them, but they all agreed that what the government of China was doing was wrong. We have that sort of thing happening now with Russia and Ukraine, where accounts like YourAnonNews with eight million followers on Twitter are urging people to go after Russia and shaming Russia. And that’s super interesting because movements are more powerful when they’re broad-based and not just one government yelling about another government.

 

[00:31:23] JP: Is this kind of citizen hacktivism and involvement in an armed conflict a new frontier that we’re seeing in cyberwar? Or do you think it’s a riff on the things that we’ve seen in the past? And I guess more broadly speaking, how do you think this will affect the future of warfare as time goes on?

 

[00:31:45] JM: To the first question, to the extent there is precedent for this, it would be what are known as patriotic hackers. So after the US, or apparently NATO, accidentally bombed the Chinese embassy in Yugoslavia some years ago, there was a wave of attacks on US websites by patriotic Chinese hackers, lots of defacements and written attacks on the American government by people who were not in the Chinese government, by random citizen patriots with some hacking skills. Russia has done some similar things. There were the notable attacks on Georgia and Estonia that coincided with Russian military action against those countries in cyberspace, and in Georgia’s case with people crossing the border wearing uniforms. And that’s murkier. There was some government influence or coordination, certainly in the Georgia attack, and there was encouragement in the Estonia attack. And then more recently, like I said, they used to have these blended operations where there’s sort of criminal stuff, but it also helps the government. I mean, one of the things that has been happening is that some versions of software, ransomware in particular, would get inside big American companies and search for things that they could use. They go for the CEO’s email. They go for things they could use for further extortion or whatever leverage, but they would also search government contractors, for example, for things marked secret or top secret. And that was obviously going to be handed to the government. So there’s that sort of cooperative thing. This kind of broad-based thing, regular people doing what they can think of on their own time against the government and government-connected corporations in Russia, that’s never happened before. This is brand new stuff. And the future, that’s a good question. I think maybe this is the way things are going to go, or maybe it’ll tend back to normal and we’re just in a particularly chaotic period.
And when some new balance is established, maybe there’s a change in leadership in Russia, maybe there’s a new Cold War, whatever. Once it sort of ossifies and new lines are set, my guess is that the armchair stuff will move on, that it won’t seem as urgent as it does right now, where every day brings more horrible news from Ukraine.

 

[00:34:22] SY: Well, thank you so much for joining us.

 

[00:34:24] JM: Thanks for having me.

 

[00:34:32] SY: Coming up next, we talk about how the vast majority of mental health and prayer apps have concerning privacy protections after this.

 

[MUSIC BREAK]

 

[00:34:57] SY: Here with us is Jen Caltrider, lead of Mozilla’s Privacy Not Included guide. Thank you so much for joining us.

 

[00:35:03] JC: Thank you for having me.

 

[00:35:04] SY: So you are the lead at Mozilla for a consumer guide you made called Privacy Not Included. Can you talk about what this guide is all about?

 

[00:35:12] JC: Yeah. So back in 2017, we looked around at the world, and at Mozilla, we care a lot about privacy, and we saw a lot of IoT products, connected products, like smart speakers and fitness trackers and the like coming on the market. And we saw a lot of reviews for these products that looked at features and reliability, but we didn’t see anything that looked at the privacy and security of them. So we had the crazy, brilliant idea to create a buyer’s guide focused just on the privacy and security of connected consumer products. And ever since then, we’ve kind of evolved it from just what can we actually pull off to now we have a couple of hundred products that we review on our site, from apps to fitness trackers, to smart speakers, to exercise equipment. We try and make it as accessible as possible to consumers to understand which products are protecting their privacy and which products are trying to make as much money off it or handle it poorly.

 

[00:36:07] JP: So at first glance, I wouldn’t think that Mozilla, a company that most people are familiar with because of their browsers, would be making a privacy guide for apps and connected devices. Can you tell us how the Privacy Not Included guide started?

 

[00:36:23] JC: So we’re a nonprofit. We’re not a for-profit tech company like Google or Microsoft. We’re a nonprofit and we’ve been that way since the beginning. And so what we do in the world, we do with a mission: to make the open internet better and safer and available to everyone. And so privacy is a big concern, and we want to help make consumers aware of what’s going on in their connected world. There’s not a lot of power that they have right now, but they can vote with their dollars. So we try to give them as much information as possible so they can go out and shop smart for connected products that are going to protect their privacy, and then put some pressure on the companies to say, “Hey, guess what, you don’t think people care about privacy with your products, but actually they do.” We have a fun creep-o-meter on our site on every product page where people can rate how creepy, or not creepy, they think a product is, and it’s something that the companies can see, and they do. They reach out to me and let me know, “Why is our product so creepy?” And we’re like, “Well, that’s what consumers think.” And so it’s a way to kind of educate consumers and try and help them shop smart, but also put pressure on the companies to do better.

 

[00:37:35] SY: So I’m curious, what are the security standards for this guide? What is the bare minimum that something needs to meet in order to be considered decently secure?

 

[00:37:46] JC: When we first started out, it was kind of seat of our pants. And over the last six years, we’ve evolved our methodology. And so right now we have minimum security standards, which are five things that we think every product that’s sold on the market should meet to be able to be sold. And that includes, “Does the product use encryption? If it requires a password, does it require a strong password? Does the company have a way to manage security vulnerabilities, like a bug bounty program? Does it push out security updates? And does the product have a privacy policy?” Those are the five things that we think every product should have to meet the minimum security standards. And if we can’t determine that a product meets any one of those five, it immediately gets our Privacy Not Included warning label. And so that’s how we look at products from a security perspective. From a privacy perspective, this is something that’s evolved a lot over the past few years. We look at things like what data the company collects on you, because there’s a difference between the data that a drumming app that you use to learn to drum collects on you and the data that a mental health app collects on you. There’s a difference between the data that a fitness tracker collects on you versus the data that a smart light might collect on you, for example. And so we look at that, and then we look at what the company does with that data. Do they share it with third parties? Do they sell it? Do they collect other information from other third parties and combine it to build an even bigger profile on you? And what control do consumers have over it? Does everybody have the ability to access their data to see what is out there and to request that their data be deleted in a timely manner? And what are the retention policies that those companies have for that data? And then finally, we look at a company’s known track record.
And what kind of track record do they have in respecting and protecting their users’ data? I kind of call this my Facebook test, because Facebook met our minimum security standards, as do many of the big companies. They have fairly strong security practices. Not everybody did when we started back in 2017, but most of them do now. But Facebook’s got a terrible track record at protecting their users’ data and respecting their users’ data. And so we look at those. And the way that we usually handle it is we have three dings that we can assign: What data does it collect? How can you control it? And what’s the company’s known track record? And if any company gets two dings on those, they immediately get our Privacy Not Included warning label. There have been a couple of instances where just getting one ding was egregious enough, where their privacy policy and their data collection and selling practices were just so egregious that we dinged them for that. But yeah, so that’s kind of how we review products. We also look at AI, but AI is just a really hard thing to look at with a lot of these new products because there’s not a lot of transparency there. So we kind of started reviewing it just to see if we could learn anything. And what we’ve learned is we really can’t learn much, but it’s something we’re keeping an eye on.
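As a rough illustration, the rubric Caltrider describes, five minimum security standards plus three privacy dings, can be sketched in code. The field names and warning-label logic below are assumptions for illustration, not Mozilla’s actual tooling:

```python
from dataclasses import dataclass, field

@dataclass
class ProductReview:
    # The five minimum security standards described above
    uses_encryption: bool
    requires_strong_password: bool
    manages_vulnerabilities: bool   # e.g., has a bug bounty program
    pushes_security_updates: bool
    has_privacy_policy: bool
    # Privacy "dings": subset of {"data_collected", "data_control", "track_record"}
    dings: set = field(default_factory=set)

    def gets_warning_label(self) -> bool:
        meets_minimum_security = all([
            self.uses_encryption,
            self.requires_strong_password,
            self.manages_vulnerabilities,
            self.pushes_security_updates,
            self.has_privacy_policy,
        ])
        # Failing any one security standard, or collecting two of the
        # three privacy dings, earns the warning label.
        return (not meets_minimum_security) or len(self.dings) >= 2
```

Here a product fails the whole review if it misses even one of the five security standards, or if it picks up two of the three privacy dings, matching the rules described above.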

 

[00:40:41] JP: So the guide has lots of reviews of apps and gadgets. And in the latest iteration, your team looked at mental health and prayer apps. What led your team to specifically focus on these apps?

 

[00:40:53] JC: Oh, gosh, we live in the world, and we, along with everybody we know, are caught up in this mental health crisis, right? The pandemic and what’s going on in the world were just really taking a toll on people. My partner, Misha, is Ukrainian, and he and all his friends and family are struggling with the war there. And we’ve seen this in the past with Silicon Valley tech companies, where they see this: “Oh, there’s an opportunity here to make money.” We saw it at the beginning of the pandemic with video call apps, right? Suddenly everybody needed a video call app, like Zoom or FaceTime or whatever, to connect with people because we couldn’t go out in the world. And these apps grew like crazy, but the privacy and the security kind of struggled to keep up. Making money was a priority, whereas keeping up with privacy and security for these apps was kind of secondary. And so my partner Misha and I decided we wanted to look into this because it felt really important now. So we looked into mental health and prayer apps. We included prayer apps because for a lot of people, prayer apps are mental health apps, and we wanted to look at those, and we’d also heard about some problems with them. So yeah, that’s kind of how we decided to tackle this particular area.

 

[00:42:04] JP: So let’s get into some of the results. I’m certain all of these apps are wonderfully private and adhere to the highest level of data security, right? Can you go over some of the major findings? Were there trends that you saw across these apps?

 

[00:42:20] JC: Oh, absolutely, trends. The data economy is alive and well when it comes to mental health apps. And by that, I mean, I’ve been telling this story to people when they ask about this: I just got a drum kit, right? I’m going to learn to drum. I need to beat something to, like, reduce my stress, but I don’t know how to drum. So I thought, well, I’ll get an app to teach me. Being a normal person, I Google, “What are the best drum apps?” And then I look at them, and then I go to the app store and I actually read the privacy policy because I’m a nerd like that. And the first app I found sounded good. So I go to the app store, and it was made by a Chinese developer and didn’t have any privacy information, including a privacy policy. So of course, I noped out of there really quickly. But then I found another app that sounded good. And its policy looked like most privacy policies do. It was, “We’ll collect data, personal information for interest-based advertising. We might combine it with third parties to get more information about you for personalization.” As I was reading this drumming app’s policy, it hit me: “This drumming app’s privacy policy looks like the privacy policies I’ve been reading for mental health apps.” And that should not be the case, right? I don’t care if somebody knows I practice drumming three times a week, but I do care if somebody knows that I see a therapist for anxiety and depression three or four times a week, or any of the things that mental health apps can collect. And so one of the biggest trends we saw was that these apps collect a lot of personal information. They do the same sort of advertising, pretty much across the board, that every other app we download does, which I don’t think should be the case, because we’re talking about a much higher level of sensitive, intimate, emotional information that these companies can collect and use.
I mean, some of them just come flat out and say, “Your personal information is a business asset that we’re going to use to target your advertising, to personalize our service, to try and get you to buy more stuff.” And that’s scary. I don’t feel like that should be the case for any app, but I really don’t feel like it should be the case for mental health apps. And then one thing that we had a hard time doing was confirming that a lot of these companies met our minimum security standards, in part because they wouldn’t reply to our questions and they didn’t have good public documentation of their security practices, but also in part because we came across a number of apps that allowed weak passwords, everything from a single “1” being allowed as a password to eight 1s being allowed as a password on one app. There was one app where you could just enter any email address and then you would get a simple link to access the app. And so my partner, Misha, entered my email address. He was like, “Hey, I signed you up for this app.” And I was like, “Thanks, dude!” So the security practices for these mental health apps still raise a lot of questions. Trying to find a bug bounty program for any of these, where security researchers could report security vulnerabilities, was next to impossible. And that shouldn’t be the case. If there’s a security vulnerability in a mental health app, you would think the security team would want to learn about that right away and fix it right away. The other trend that we did see is that the apps made by non-profits, like the rain app, the Sesame Workshop app, the Anxiety Canada app, and PTSD Coach, made by the US Department of Veterans Affairs, had really stellar privacy practices, because they’re not trying to make money off you. They’re non-profits providing a service to the world. But they didn’t have solid security practices, or at least we couldn’t confirm them, which makes sense, because they’re smaller. They don’t have security departments.
They don’t have the resources to kind of maintain security and monitor privacy emails and things like that. Whereas the bigger companies might be a little better on security, but super questionable on privacy.
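The weak-password findings above come down to a missing strength check. Here is a minimal sketch of the kind of validation those apps lacked; the thresholds are illustrative, and real policies should also check candidates against breached-password lists:

```python
import re

def is_strong_password(pw: str, min_length: int = 8) -> bool:
    """Reject trivially weak passwords like a single "1" or eight 1s."""
    if len(pw) < min_length:
        return False
    # Require at least two character classes (lower, upper, digit, symbol)
    # and more than one distinct character, so "11111111" fails even
    # though it satisfies the length requirement.
    classes = sum(bool(re.search(p, pw)) for p in (r"[a-z]", r"[A-Z]", r"\d", r"\W"))
    return classes >= 2 and len(set(pw)) > 1
```

A check like this would have rejected both of the passwords the researchers found being accepted.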

 

[00:45:59] SY: Were you shocked by your findings in the space? Because, I guess, specifically for mental health, I would assume that something like HIPAA would come into effect and maybe protect consumers in some way or kind of prevent some egregious privacy and data sharing issues. Is HIPAA even relevant in the context of these mental health apps?

 

[00:46:25] JC: To answer your first question, I was surprised. And I’m a jaded privacy researcher. But when I started researching these mental health apps and realized that they see this data economy free-for-all as a business model, that they’re targeting people at their most vulnerable to make money, it just felt so craven to me and it disgusted me. The other line I’ve used in this process is that researching mental health apps isn’t good for your mental health, and it’s not. It really kind of jades you to the world. The HIPAA question is a really good one, and I think it’s one that people are struggling with, because HIPAA protects conversations with your therapist. It protects relationships between a patient and a doctor. Right? But everything surrounding that is fuzzy and not necessarily protected. So if you sign up with an app, like, for example, Talkspace or BetterHelp, that’s an online therapy app, they’re going to collect a bunch of information on you, and then they’re going to try and connect you with a therapist, and then off you go. And if you actually go to BetterHelp or one of their other sites, they do Pride Counseling, Teen Counseling, and Faith Counseling as well, you land on their website and you’re like, “Oh, this looks like something that would be good for me. I can’t find a therapist. Everybody’s booked out. Maybe I can try this.” And the first thing you do is click the little red Get Started button, and you immediately get taken into this intake questionnaire where it asks some pretty sensitive questions along the lines of, “What’s your gender identity? What’s your sexual orientation, if you’re on Pride Counseling? How frequently are you depressed? How frequently are you anxious? Are you having suicidal thoughts?” Things like that that are pretty intimate.
And that’s before you’ve ever been pushed to look at a privacy policy or given any kind of privacy information. It’s like taking a BuzzFeed quiz, just answering all these questions. You’re like, “I don’t know where this information’s going. I don’t know who’s using it,” but you just do it. Then once you do all that, they’re like, “Pay us and we’ll connect you with a therapist.” And you’re like, “Okay.” And you pay them. Then once you talk to a therapist, the conversations between you and the therapist, my understanding is those are covered by HIPAA, but not everything surrounding that: the intake questionnaire, the fact that you’re meeting with a therapist. You could log into these apps through Facebook. Facebook then potentially can know, “Hey, you’re using this therapy app. You’re using this therapy app four times a week. You’re using a therapy app that is targeted at gay people,” things like that, and that’s more information that Facebook potentially knows about you. So yes, HIPAA does come into play, but there’s so much fuzziness around it.

 

[00:49:02] SY: Yup.

 

[00:49:02] JP: So what can be done about some of these problematic apps? I’m curious if you see privacy policies changing in the future and what will incent these companies to do that? I mean, we can vote with our wallets and our feet and not use the app, but do you think it’s government regulation? Do you think it’s something else that will push these companies to take privacy and security, especially around health and wellness more seriously?

 

[00:49:31] JC: What I always tell people with Privacy Not Included is that companies push too much of the responsibility to protect privacy off on consumers, and it shouldn’t be that way. Companies should do better. Committing very clearly that they’re not selling your data would be the first thing I would love to see these companies do. Second, and this is a huge one, is clarifying what consent looks like, because when you log into these apps or when you read the privacy policy, they’ll crow at the top, “We’ll never share or sell your data without your consent.” And you think, “Okay, great. I’m safe. They’re never going to share or sell my data without consent, so I can proceed.” But then you’re like, “Well, what does consent look like?” In one privacy policy I was reading, they crowed, “We’ll never share or sell your data without your consent.” And then, trying to find out what consent looks like based on their privacy policy, there were lines in there like, “Once you register with the app, you’ve given us consent to use your personal information in all the ways listed in this privacy policy.” So I’m like, “Wait a minute, you’ll never share or sell my data without consent, but by just downloading and registering with the app, I’ve given you my consent? That doesn’t feel like it should be consent to me.” Or they’ll say things like, “You can withdraw your consent by deleting the app.” And you’re like, “Oh, so the only way I can withdraw my consent is if I just stop using the app completely.” That also doesn’t feel right. And so really clarifying what consent looks like to have your data shared or used for targeted interest-based advertising, for research purposes, or the like seems really important to me. So does making sure everybody has the same rights to access their data and delete their data.
A lot of these privacy policies say, “Depending on the jurisdiction where you live and the legal protections there, you might have the right to access or delete your data.” And that mostly means that if you live under GDPR in Europe or under CCPA in California, you’ll have special rights to access or delete it, but those rights aren’t extended to everybody who uses the app but doesn’t live under those special data protection laws. And we feel like everybody should have the same right to access and delete their data no matter where they live. Then, when companies say they’re sharing data with third parties, a lot of times they’ll say, “We’ll share data with third parties. For example, we may share with X, Y, and Z, for X, Y, and Z purpose.” And then they use this really vague language about who they may be sharing it with and what they may be using it for. And that’s intentionally vague, because they’ll say, “Sometimes it’ll be for legitimate business purposes,” and they’ll talk about that, but they won’t talk about some of the other purposes that are less about legitimate business, like advertising. So really clarifying what third parties they’re going to share your personal information with and how it’s used, I would love to see. And lastly, not collecting data from external third-party sources and combining it, unless you have a legitimate business reason to do that. A lot of these companies will say, “We may collect information from third-party sources such as public sources, social media sources, or even data brokers and combine it with the information we collect about you.” And that just means they’re building an even bigger profile on you. And so either committing to not doing that or, if you have to do it for a legitimate business purpose, clarifying that. I think those are the five things I would like to see these companies do to make me feel safer about how they’re collecting and using data.
Will that happen without any intervention or me talking about it? Probably not. There’s no incentive for a company to make these changes if they’re making money otherwise. So is public policy the way to go? Most likely. These companies aren’t going to volunteer to make less money, but it’s what we want. There needs to be regulation. There needs to be public pressure. And I know that there’s a lot going on in the world right now. People have really big concerns, and does this kind of thing rise to the top? It probably doesn’t right now, but that doesn’t mean it shouldn’t be something that we focus on and try and find solutions to.
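One of the asks above, extending data access and deletion rights to every user rather than gating them by jurisdiction, is simple to express in code. A hypothetical sketch, where the function names, record shape, and retention window are all assumptions for illustration:

```python
from datetime import datetime, timedelta

# Assumed retention window; the point above is that companies should
# state and enforce one rather than holding data indefinitely.
RETENTION_LIMIT = timedelta(days=90)

def handle_data_request(user_record: dict, request_type: str) -> dict:
    """Fulfill an access or deletion request with no jurisdiction check:
    the same rights apply whether or not the user lives under GDPR or CCPA."""
    if request_type == "access":
        return {"status": "fulfilled", "data": dict(user_record)}
    if request_type == "delete":
        user_record.clear()
        return {"status": "deleted"}
    raise ValueError(f"unknown request type: {request_type}")

def is_past_retention(collected_at: datetime, now: datetime) -> bool:
    # Data past the retention window should be purged proactively,
    # not held until someone happens to ask.
    return now - collected_at > RETENTION_LIMIT
```

The design choice being illustrated is simply the absence of a jurisdiction branch: every request path works the same for every user.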

 

[00:53:25] SY: So what are some things people can do to protect themselves from seemingly predatory apps?

 

[00:53:32] JC: Yeah, a few tips. If people are using these apps and want to take measures to protect themselves: never sign in with any social media login. That opens the door for even more personal information collection on you. Opt out of as much information collection and sharing as you’re able to. Go to the website and see if they have any way to opt out of that, and if they do, do it. One interesting thing I learned in a conversation with a friend who uses one of these online therapy apps was that after her first session, her therapist mentioned that she was taking handwritten notes and would delete them after the session and wouldn’t be uploading the notes into the app’s system, because she didn’t trust the app’s system. And I think that was great. And this is another indication that therapists also have their own privacy concerns around this. And so if you are having an online session with a therapist, a video session obviously, ask the therapist to take notes offline and not upload them to the system as a way to help protect your privacy. Because even if those notes are covered by HIPAA laws, data leaks, data breaches, data hacks, employees that have access and snoop, they can all happen. Almost every single privacy policy I read says, “Hey, the internet is not a hundred percent safe. We can’t guarantee the safety and security of your information.” So take that extra step if you can. And then periodically go in and ask the companies to delete your data, so there’s less of a chance that your data is just sitting around where it could be compromised.

 

[00:54:59] SY: Well, thank you so much, Jen, for being here.

 

[00:55:01] JC: Thank you guys for having me on.

 

[00:55:14] SY: Thank you for listening to DevNews. This show is produced and mixed by Levi Sharpe. Editorial oversight is provided by Peter Frank, Ben Halpern, and Jess Lee. Our theme music is by Dan Powell. If you have any questions or comments, dial into our Google Voice at +1 (929) 500-1513 or email us at [email protected]. Please rate and subscribe to this show wherever you get your podcasts.