“Digital vulnerabilities that affect one, affect us all.” These are some of the books that helped us understand the complex connections between technology development, cybersecurity, privacy, geopolitics, and human nature. Here's how they inspired our explorations and what we learned from them to improve our mindset and skillset.
“It doesn’t go wrong, it starts wrong.”
This captures the privacy and security challenges that existing and new technologies alike are facing.
When you read about how government agencies are accessing your personal data, don’t get caught up in the actual collection event. You should be just as (or more!) concerned that companies are intentionally developing and promoting software products which allow them or third parties to spy on you.
This is why software developers play a key role in making sure the product they build doesn’t negatively impact the user.
Because the relationship between cause and effect is a very convoluted one in technology, reading books about cybersecurity is a good way to understand why we need it and how its principles help us. The main benefit of these books is that the pros who write them have done the research and broken down this complexity so anyone who wants to can easily understand it.
Inspired by our recent reads, we have a conversation about ethics in the technology development community and how this impacts our lives.
You’ll learn how the WannaCry ransomware attack triggered a new era in cybercrime tactics with global reach and very visible impact. You’ll also hear about the role government agencies sometimes play in enabling cybercriminals to create and distribute malicious software. Additionally, you’ll learn about the impact that internet overuse is having on people and how it makes bad actors’ “jobs” much easier.
In this episode, you will learn:
Connect with Dave:
Connect with Andra:
[00:41] Dave Smyth: So, I just finished reading This Is How They Tell Me the World Ends by Nicole Perlroth. She's a New York Times journalist, or was, or might still be. And I had no idea that there was this whole trade in buying and selling zero-day exploits, and that governments have been stockpiling these for as long as software has been around, almost, and just how much access they have to everything. They can, basically, infect anything with lots of these exploits, some of which are zero-click – so you can be sent a message or something, and just receiving the message will exploit a device. It's absolutely wild what she talks about in the book, and it seems like it's absolutely everywhere.
[01:34] Andra Zaharia: It is. It's very generalized. And if I may just unpack that a little bit for people who don't necessarily master the vocabulary, the language, the lingo, I guess: in cybersecurity, zero-day vulnerabilities are security holes in software that the software makers don't know about. So, they don't know these holes exist – someone else found them and hasn't reported them, so the makers don't know to fix them. It's just like having a big house and one of your windows is open, and you can't see it because you don't really go into that room, but a neighbor saw it. And they're not telling you, because they're interested in using that window to come in and get stuff out of your fridge every now and then, whenever they feel like it.
[02:16] Dave Smyth: Your neighbor's cat won't tell you.
[02:18] Andra Zaharia: Yes, exactly. That's a nice and fun way of putting it. And hopefully, that'll make it a bit more digestible and palatable than how bad the actual thing is. And the other thing is that, yes, governments do trade these things constantly; it's part of their attack and defense strategy. The market for these types of vulnerabilities is absolutely huge, and so is their value. Imagine having a security hole that you can use to get into people's email accounts. You do not want to tell other people about it. You want to keep it to yourself, so you can swoop in on whoever, wherever, for as long as you want. What I wanted to highlight about this book – which sounds very doom and gloom, because things are really bad – is where I think the alignment with empathy is: these kinds of books help you have more empathy for the people working in cybersecurity, because you realize the complexity and the burden of having to deal with these things on a daily basis, constantly, and the responsibility and pressures that come with it, because they are huge. These are matters of national security, of societal stability, of us having access to water and gas and other things that we take for granted because we've always had them. So, tell us more about this book, and how the world ends, I guess.
[03:53] Dave Smyth: So, there's so much I didn't know. This book was a whole education for me. But I had no idea that things like Stuxnet and WannaCry – I think it was WannaCry that shut down the NHS over here with ransomware. I may be misquoting this, but I think those came from NSA or CIA-gathered zero-days that had been exploited and strung together. I think maybe they were released somehow. They were put out into the wild, and then people started stringing them together. They were almost like point-and-shoot things. One of the book's conclusions is that it's wild that governments have been stockpiling these things for their own use without telling companies like Microsoft, who are, obviously, a huge target for this stuff because their software is used everywhere. Not forgetting that if that stuff gets out, it can be used to do all sorts of things, like target dams and try to shut them down, or turn off the lights in the dead of winter.
[05:00] Andra Zaharia: I remember the WannaCry weekend like it was yesterday. It was a Friday afternoon/evening in 2017, and news started to hit – I was working for a cybersecurity company – and everyone just came online at the same time. It was probably one of the most memorable experiences of my life, and a bit of a growing-up experience, a bit of a coming-of-age-in-cybersecurity experience. So, we got news in the cybersecurity industry – everyone was on Twitter talking about the fact that there had been a leak of a range of security vulnerabilities in software, along with tools to exploit them, a couple of months before WannaCry happened. So, someone found a way to deploy a bit of malicious software that could replicate itself, and that's the worm that you probably hear about. And that is terrible, because basically, this thing crawled exposed systems that had this vulnerability, which no one knew about, and no one had had time to update their systems and close that security gap. It would crawl systems, infect them with ransomware – which effectively encrypts data and locks you out of the system – and then replicate on every other device connected to the network. And we don't mean just servers and laptops; we mean industrial equipment and medical technology, and things like that. So, we got wind that this was happening. And as you can imagine, almost every company in the world – not almost, probably all of them – first of all, called their employees to tell them to shut everything down, phones and laptops, that moment, because it was spreading and they couldn't stop it. You couldn't stop it unless your device was disconnected from the internet. And then everyone wanted to buy any security solution – all of the security solutions – in that moment.
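[Editor's note] For readers who think in code, the scan–infect–replicate behavior Andra describes can be sketched as a toy graph traversal. This is purely illustrative – the device names are made up, and real worms like WannaCry spread by exploiting a specific Windows SMB flaw (EternalBlue), not by walking an adjacency list:

```python
from collections import deque

def spread(network, patient_zero, patched):
    """Breadth-first spread: from each infected device, the worm
    reaches every connected device that lacks the patch."""
    infected = {patient_zero}
    queue = deque([patient_zero])
    while queue:
        device = queue.popleft()
        for neighbor in network.get(device, []):
            if neighbor not in infected and neighbor not in patched:
                infected.add(neighbor)  # exploit succeeds: encrypt, then keep spreading
                queue.append(neighbor)
    return infected

# Toy network: a server connects two laptops; one laptop connects medical gear.
network = {
    "server": ["laptop-1", "laptop-2"],
    "laptop-1": ["server", "mri-scanner"],
    "laptop-2": ["server"],
    "mri-scanner": ["laptop-1"],
}

# Patching even one machine prunes every path that ran through it.
print(spread(network, "server", patched={"laptop-2"}))
```

The kill switch mentioned a bit later in the episode worked at a different layer: the malware checked whether a certain domain existed before spreading, so registering that domain halted new infections.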
[06:56] Andra Zaharia: Except that this was such a big issue. The NHS had trouble, hospitals were shutting down, car factories were shutting down, all sorts of companies. In logistics, Maersk had to actually rebuild – because of a subsequent attack – their entire IT infrastructure in 10 days. That's one of the top three shipping companies in the world, on which – as we now know, in 2022 – the world depends to bring stuff from point A to point B. Imagine being a company expecting a shipment, and the entire system is down; you don't know when it'll arrive, if it will arrive at all. And it took months for companies to recover from this. So, it was totally terrible. And there's an adjacent story to this: the ethical hacker who stopped this attack, by registering the kill-switch domain hard-coded into the malware, was actually arrested by the FBI, and there's a whole story behind that. I'll link to another podcast where he did an in-depth interview, to show you the darker qualities of the system and its inability to deal with nuanced things and to understand the complexity behind these kinds of events. So, that attack was truly terrible. It was the first globalized cyberattack that had such a real and undeniable impact in the – what we call – "real world" – the internet is still the real world, it's just in a different shape – such a big and undeniable effect that it just changed the world forever. We'll never look at security or do security the same way again, and you bet that cybercriminals and the mob – not exaggerating with this – and state actors will try to replicate this in the future; they're doing their best to find something else that works again on this scale and at this magnitude. So, I just wanted to share that story because it brought up all of these feelings and emotions that were packed into that weekend, specifically, and into the weeks and months that followed.
[09:14] Dave Smyth: Yeah, it's pretty wild to think that a single hack, or a single exploit, or set of exploits packaged up in a nice little infectious file, or however it got around, could have that sort of devastating impact. She says, at the end of the book, that these sorts of rationalizations for keeping these tools are buckling. Governments have defended this stockpiling because the tools are critical for war-planning or national security or keeping an eye on adversaries. But she says the rationalizations are buckling because they ignore the fact that the internet is connecting us in a way where we're all inextricably linked, and she says, “Digital vulnerabilities that affect one, affect us all,” which is undeniable. This stuff will still be going on right now – she's talking in the book about the pandemic; it's an incredibly recent book. And of course, the pandemic is connecting us even more than we were connected before. So, that's just one little bit of the book's themes. But it's a really striking thing: the lack of empathy in stockpiling these things for their own purposes, not thinking about the wider impact if those things got out, as they did.
[10:37] Andra Zaharia: And as they continue to do, because this keeps on happening every now and then. And what I appreciate about these books – and why I think it is so valuable, every now and again, to read a book that covers these topics, or to get started on your journey towards strengthening your understanding of cybersecurity and privacy issues and their implications – is the fact that these books give you a full and rich perspective. They bring together all of the important events. They show you how they're connected, how they impact each other, how they precede each other, and how everything builds on top of everything else. It's systems thinking, basically, which is one of the key mental frameworks that cybersecurity uses – generally, a critical thinking tool that we can all use, because we need it in our lives. This is something that I truly appreciate, because it is a huge effort to put these things together in a way that's easy to understand. So, instead of reading tons of articles or trying to keep up with the news, which is so difficult – if you want to be empathetic towards yourself and still give yourself that space to think about these things and to get informed, reading this book is an absolutely great place to start. I'm not just saying this as a devoted book lover, because I will always love books more than anything else; I am saying this from the perspective of a reader who has educated herself throughout the years with the help of these authors, who do the difficult work of putting this together. Because now some of these things may seem obvious to me, because I've been in cybersecurity since 2015, but there are so many other things that I don't know.
And just seeing them together helps you understand, again, their connectedness, but also helps improve your clarity, and helps you make sense of them and try to use that information in an actual way, which I think is very, very powerful. So, you're on a bit of a reading streak in terms of privacy and security, Dave, aren't you?
[12:50] Dave Smyth: Well, yeah, I'm not much of a fiction reader. So, what have you been reading?
[12:56] Andra Zaharia: Not as much as I want to in this particular area. I have to balance nonfiction with fiction reading. I need both, I just need both. But one of the books that has stuck with me, that I know you read as well, and that has made a big impact on me was Edward Snowden’s Permanent Record. I have a nice little story about how I actually got this book, the physical version of it. It had been on my list for a while. But in 2019, I attended DefCamp, which is a cybersecurity conference in Romania that gathers the cybersecurity community from Romania, but also from throughout Europe, especially the eastern part. I was presenting at the event, and at the end of day one, someone came to me, gave me the book, and told me, “Hey, this is for you. I think you're doing a good job. So, thanks for that.” Which was totally unexpected – I was taken aback, I was very humbled and delighted. I love getting books. And on the first page, he wrote a dedication that freedom comes from the heart, which was very touching; it was a very emotional thing. And I picked it up only last year, honestly. I mean, it stayed with me, I always had my mind on it, but I believe that sometimes we read books when we need to read them and when we can make the most of them. And once I read it – we know so much about the Snowden revelations, and basically how he changed the paradigm, how he changed what we knew. He confirmed what some of us suspected about the system and how it works. But I saw so much of myself in his childhood, or parts of it – because I wasn't a tech whizz like he was. But in parts of it: his fascination with technology, with the internet, the possibilities that it opened up; how that degraded and got tangled up and messed up, and became a tool to gather power and influence and to exert that power and influence.
To me, there are plenty of incredibly powerful examples in there, including technologies for spying, the use of ancient technology that only one person knows how to operate at this point, keeping secrets, and spying on virtually everyone. I truly recommend that you read the book and take from it what you need, and make of it what you will. But I was particularly struck by two quotes.
[15:35] Andra Zaharia: So, he says that technology doesn't have a Hippocratic Oath. And so many decisions that have been made by technologists in academia, industry, the military, and government since at least the industrial revolution have been made on the basis of “can we,” not “should we.” And the intention driving a technology's invention rarely, if ever, limits its application and use. So, just like we were talking about at the beginning, when you were mentioning the zero-day vulnerabilities – it is true, technology is such a powerful thing. He also mentions that, basically, the people who create technology and who wield most of the power by using it have no incentive to act against themselves, and that's how systems get corrupted. There's no real way to enforce this at this point in time – and before GDPR, there wasn't even that. We don't have a set of strong enough laws and rules to create this Hippocratic Oath, and to make sure that the ethical implications are considered before we do things – not after Facebook takes over the world, but perhaps just a bit further back in that journey and process. These are very big topics to think about. But I do believe that we can all make a contribution, no matter how passionate we are about technology and security and privacy, or if we're simply users who value their privacy and just want technology to serve them and not the highest bidder.
[17:15] Dave Smyth: It's really interesting, actually. The thing about the Hippocratic Oath reminds me of Heather Burns – she works in the privacy and ethics field. She wrote something last year, I think, about how the people developing software aren't trained in ethics or any of these things. And often those are the people making the decisions about these things, either because they're deferred to – they deal with it all the time, so they should know – or because no one's thinking about the ethical side, or how things should work, or the implications of it. It's a really interesting and pervasive issue, I think. And also with technology, when you're writing code or creating something, there's a lot of room to do stuff in there. Even if it's open-source, unless it's actually checked, it's very easy to do stuff that's on the edges of what you might consider acceptable in terms of privacy or ethical concerns. Whatever you do in terms of trying to enforce things, it all comes down to the individuals or the teams writing this stuff. It's very hard to know what's actually going on. There's a story in This Is How They Tell Me the World Ends about a test: in how many lines of code could you say, for certain, there are no security exploits? And somebody in one of the three-letter agencies says, “I could say with 100% accuracy that there wouldn't be something in, like, 10,000 lines of code.” And then they find out that's not true – people can demonstrate that they can hide stuff in 10,000 lines of code. That's not very many lines of code. And a Hippocratic Oath seems like a very good idea, because just as you trust the doctor who's advising you based on their experience, and you just have to trust what they're saying, we need to do the same thing with code, and people need to understand the possible impact of what they're writing and creating.
I know, at the moment, the people who are doing that aren't armed with the training or the knowledge to advise on it, unless they do all of that work themselves – which often they're not being trained or set up to do. If people have CPD in their jobs, how much of that is going on privacy, data protection, ethics, or education in that arena? I bet, in a lot of cases, it's only when it's mandated, because the companies need to cover their backsides.
[20:01] Andra Zaharia: Exactly. And I think that this is so important, because, generally speaking, people still think that technology is separate from our real lives. There's the online life – which is not that real, it's virtual, it's very abstract – and real life. But that is such a bogus distinction; it doesn't exist anymore, it hasn't existed for more than a decade. And yes, developers do have responsibility. Their work might impact people in the sense that a medical application might malfunction – which ties directly to someone's health – but also, like you mentioned, someone could use a security vulnerability to release water from a dam and flood an entire area, take down electricity networks, and do so many other things with direct and immediate impact on people's lives. And I know that sounds like sci-fi, but it is not, and I've attended conferences where this was presented – like, imagine crashing a satellite. For people who saw Gravity, the movie, that script is based on a potential real-life scenario where a satellite that loses its trajectory crashes into others, and potentially then crashes into Earth [21:22 inaudible] damage communications, because that would be the least of our problems. It could crash on Earth and potentially kill people and damage buildings and so many other things. So, these are things that technology makes possible. And yes, I truly believe that there are people doing a lot to educate others, investing a lot of personal time and resources to educate themselves and to pass on this knowledge – just like you're doing with your business and your articles, and like we're trying to do now. But that's because we want to; we are making this effort because we care about it. And it's not that most developers don't care about it, it's that they don't have the time.
When you have to keep up with technology and do your job and have a life, it's so difficult to juggle all of these things and make sure that you're doing the right thing – especially if you're working on a specific project where you don't know how it's going to be assembled with other people's work, or where it's going to be shipped off to. It could be used in war, for killer drones, or other things like that.
[22:30] Andra Zaharia: This is the kind of conversation that these kinds of books start. And I think that's what makes them so valuable to read, and that's how they help create these bridges between people, and create more respect and understanding for the type of work that people do, or could be doing, or should be doing, and how we might help each other get there. Because it's not one person's effort – even a small group's effort is not enough to create lasting change in individuals, and then in teams and companies, and hopefully, escalating in a good way from there.
[23:10] Dave Smyth: Going back to what you were saying about people thinking that the online world is different from the real world. Not to keep going back to This Is How They Tell Me the World Ends, but she says in that book that the barrier between the physical and digital worlds is wearing thin, which I thought was a really nice way of putting it – the lines are becoming more and more blurred, especially when you see the potential for physical-world impact from things that are happening in the digital world. In summary, at the end of the book, she talks about 9/11: she says it's now arguably easier for a [23:49 inaudible] or nation-state to sabotage the software embedded in the Boeing 737 MAX than it is for terrorists to hijack planes and send them careering into buildings. And I thought that was a really great summary of how things have changed, and in a very short period of time – that's only two decades. These worlds are becoming intertwined in ways that we're kind of aware of, but at the same time, we're sort of oblivious to.
[24:16] Andra Zaharia: I think it's kind of a boiling-frog situation, where we're sitting in gradually heating water until something along the lines of “Don't Look Up” happens, and then it's just watching the train wreck and not being able to do anything about it. But there are things we can do right now to try to avoid a potentially world-ending scenario, hopefully, because there are tons of people doing great work. There are all of these authors doing so much research. There are so many organizations, initiatives, NGOs, and people putting themselves at risk and doing so many things to keep us safe – things that we don't even imagine, things that we don't even know about, not even us, who are actively involved in the space. And if any of them are listening, I just wanted to say thank you. I have huge respect for them. And I think that it is a big task, a big challenge, to take on a mission like this, simply because it's so complex and has so many ramifications, and you have to be so well-equipped mentally, emotionally, and intellectually to make good decisions in such a complex environment. And that's why I believe that reading these types of books and talking about them is essential, because they help keep our minds clear when the lines between reality and virtual constructs are getting blurred the way they are – hey, Meta. How do we know if what we're seeing, feeling, hearing is real or not? There are deepfakes; there's the entire Orwell “Nineteen Eighty-Four” scenario that is so relevant today, just with much better tech. So, how do we make sure that we stay clear-minded and that we are still able to [26:17 inaudible]? It's by constantly educating ourselves and getting all of the help and support we need, including by reading books like these.
[26:28] Dave Smyth: Yes. Agreed.
[26:34] Andra Zaharia: I want to stay on the back of this, since we got into the whole topic of thinking clearly and how our thinking gets distorted. One of the books that has stayed with me over the past years – and I read this a while back – is called “The Shallows: What the Internet Is Doing to Our Brains” by Nicholas Carr. And this was before The Social Dilemma documentary came out, before so many pieces of research came out, and before many other documentaries and things explained how social media and the internet, generally, are screwing with our brains. And he makes some very interesting observations in that book – that what we're experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization. He says that we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest. And that means that we're using less of our mental abilities to think for ourselves, to draw personal conclusions, to make inferences, to simply jog our critical thinking. With this hunter-gatherer behavior, we're just constantly amassing information. And that actually changes your brain's biology – it changes the neural pathways in the brain that govern your most recurring habits. And that's bad, it's not great.
[28:06] Dave Smyth: So, is he arguing that we're not able to store as much information? Or is it more that our brains are changing, that we're not able to drill down into [28:19 inaudible] what's actually happening, that we've got a more shallow understanding of information?
[28:27] Andra Zaharia: Exactly. You framed it perfectly. He says that we're slowly losing our ability to think deeply, to foster our own ideas, and to basically think for ourselves instead of being told what to think, with all of these influences accumulating in our brains and swaying us one way or the other. I know this is by now a cliché, but when you're constantly drinking from the firehose of information, it is difficult to find and create space for deep thinking, and to just sit with things. If you're constantly pulling up your phone to doomscroll, if you're constantly glued to Netflix watching a constant stream of something, there is no space in your life to just sit and let your brain process, and be, and come up with those great ideas that we get in the shower – because that's when our brain is disconnected from all sources of information, and we need more of that. So, I'm just trying to make a case for being kinder towards ourselves and giving ourselves that space without feeling guilty that we're falling behind and missing out on a bunch of things. We'll never be able to keep up; that is an illusion that's not worth chasing, in my opinion.
[29:49] Dave Smyth: Cal Newport makes some similar arguments in Digital Minimalism. He talks about Henry Thoreau and Walden, and the cabin he built in the woods, how he spent time there, and how he did all his thinking on walks. And then there was a president who spent hours walking for similar reasons. It’s funny, because it's partly a sort of insatiable human appetite for this stuff. But it's also partly that we're being conditioned to…
[30:23] Andra Zaharia: Push the button.
[30:24] Dave Smyth: Yeah, and tech companies are exploiting our need to see what the notification says, to get that little hit. I was watching a talk by Cindy Gallop – she couldn't remember who came up with this quote, but they said that “every time you get a little notification or a like or a retweet or something, internally, you're feeling like you're receiving a little pellet of love.” And she was talking about that in a positive way, like: how can we send those? How can we make people feel like that? But actually, there's also a dark side to that, which is that we're being manipulated by our feelings. Our appetite for that is being deliberately increased by tech companies exploiting our basic psychology and lizard brains.
[31:12] Andra Zaharia: Yep, we're basically making our brains obese. It's too much for their own good. And I always try to highlight this in conversations with people, from everything that I've learned about myself and about others through therapy, coaching, and reading about neuroscience and how everything works together – because all that we've talked about now is deeply tied together, although these seem like different topics. Everything that I've learned points to the fact that our biology has limits, and the way big tech is currently using its power is severely challenging those limits. We are not built for this. We are built for walks in the forest, or walks anywhere. We are built for one-to-one conversations. We're not built to have our eyes glued to the screen, to lead static lives, and to have these constant dopamine hits, like you said, condition our brains into wanting more and more without stopping to ask ourselves why, and whether this matters in the end. Because at the end of our lives, I don't think that either of us is going to regret not spending more time on the internet, or not listening to all of the podcasts that we could have listened to, or anything else. So, please, by all means, you can pause this podcast and go for a walk. If you choose to take us with you, that's even better. But if this prompts or triggers or inspires you to do something for yourself in terms of self-empathy, by all means, create that space intentionally for yourself. Because no one, absolutely no one, is going to do it for you. No matter how much they care about you, they cannot. This takes personal willpower.
[33:05] Dave Smyth: And actually, all the evidence shows that the companies and apps that we're using, they're going to try and exploit it as much as they can. So, I think what's difficult about it is that it's actually really a lot of work to turn these things off to make that time. It feels to me like it's beyond self-care, it's like you actually have to work on it to disconnect. Last year, I took email off my phone. And then towards the end of the year, I had to install the app for something, I think I needed a code to pick something up, and then there was something else where I needed it. And then a couple of months went by and the app was still on my phone, even though I deleted it a couple of times in the meantime. I finally deleted it again a week or two ago, and it makes a huge difference. I think I'd gone, maybe, six, seven, eight months without it on my phone, and then suddenly it was on my phone, and then suddenly, it was really hard to delete it again. And I've already noticed the impact of that. I might sit down, open my phone without really thinking about it, and then look at my email. And I kind of want to look at it, but I know that I don't want to, and it not being there is enough of a barrier for me to not look at it in that moment. But if it was there, I would totally open it even if I thought “I don't want to.” All to say that it's really hard work to keep on top of this stuff and to disconnect in the way that we all know or feel like we should, but actually doing it is hard and it's an ongoing effort.
[34:44] Andra Zaharia: It is. The best thing that we can do for ourselves is to make bad habits difficult to engage in and good habits easy to sustain. And it's the psychology of habit-building and rewiring your brain, which is perhaps a topic for a different episode. But I just wanted to highlight this because I am lucky to have as one of my best friends someone who has done tons of research and read everything about this. She actually works with other people to help them get their digital consumption and their digital lives under control, so technology doesn't end up controlling them. And I've learned a ton of things from her. And yes, one of the things is that I haven't had notifications on my phone, at all, for three-plus years. And I only realized how bad they are when I meet up with other people and I see their phones constantly buzzing or popping up with notifications, and then I realize, like, “Whoa! This is a lot. This is very distracting to me.” And if I had these on my phone, I know that it would be difficult. I do, sometimes, check my email as well. I do sometimes still log into Twitter, for example, or LinkedIn, more rarely, from the browser – I don't have the apps on my phone. But I have set limits for those. Perhaps we could touch on tiny habits that have helped us create this space for reading books and letting our brains have a breather, and just perhaps engaging in the lost art of doing nothing.
[36:31] Dave Smyth: I think you can probably tell me more about that.
[36:36] Andra Zaharia: I like that our experiences complement each other. I don't think we'll ever run out of things to share, and hopefully, try to help other people with. And getting back to the book topic: if there are books you find interesting, that you think we should talk about, that are related to cybersecurity, privacy, or ethical debates in this space – please let us know, share them with us. We'd love to read them and pick the ones that inspire us to have conversations that will hopefully help you.
[37:14] Dave Smyth: Yeah, there are so many that we can talk about. And I'm sure this won't be the last time we talk about some books on these topics.
[37:21] Andra Zaharia: Hopefully not.