Cyber Empathy

Ethical hackers and the legacy of the Hacker Manifesto

Episode Summary

In this captivating episode, we dive into the depths of hacker culture and explore the multifaceted aspects of The Hacker Manifesto with seasoned cybersecurity expert Tom Eston. Going beyond the surface, Tom sheds light on the driving forces behind cybersecurity enthusiasts and imparts invaluable insights on ethical hacking and leadership. Join us as we uncover the power of curiosity, the challenges of burnout, the golden age of training, ethical dilemmas, and the delicate balance young hackers must strike in this ever-evolving landscape. Tom's experiences and wisdom will leave you with a fresh perspective on cybersecurity and its impact on society. Don't miss this enlightening conversation filled with practical examples and thought-provoking discussions that challenge the status quo.

Episode Notes

Thinking of The Hacker Manifesto solely as a rant against corporations, the government, and all authorities is reductive. The text also appeals to curiosity and a constructive rebellious spirit, which form the basis of hacker culture.

Curiosity fuels hackers to break things down and figure out how they work so they can improve them. It also drives them to ask questions that are different, unexpected, and that lead to paths less traveled.

In today's episode, we go down one of those paths with experienced leader, team manager, and security professional, Tom Eston. In his over 20 years of cybersecurity work, Tom has successfully led ethical hacking teams and improved industry-standard testing methodologies. He has also been creating podcasts since long before they were cool, and now you can listen to him on the Shared Security Podcast, which he co-founded and co-hosts.

Throughout our conversation, Tom looks at the 1986 Hacker Manifesto from an unexpected angle, shares his thoughts on the potential of the massive amount of information available to aspiring ethical hackers, and offers examples of how to guide young white-hat hackers and help them calibrate their moral compass.

We also talk about the time Tom faced an ethical dilemma as a leader and his experiences being on the receiving end of empathy in cybersecurity.

Episode Transcription

[00:42] Andra Zaharia: “Another one got caught today, it's all over the papers. "Teenager Arrested in Computer Crime Scandal", "Hacker Arrested after Bank Tampering." Damn kids. They're all alike. But did you, in your three-piece psychology and 1950's technobrain, ever take a look behind the eyes of the hacker? Did you ever wonder what made him tick, what forces shaped him, what may have molded him? I am a hacker, enter my world...” This is the beginning of the Hacker Manifesto, written on January 8th, 1986, and one of the most referenced pieces of writing in the entire cybersecurity industry. It was written when everything was just beginning. And it is fantastic that I got to talk about the Hacker Manifesto with today's guest, Tom Eston, who has over 20 years of experience in IT, and especially in information security; who has seen hacker history develop; and who has seen people transform their curiosity and channel it into an entire career that actually keeps multimillion-dollar companies safe, keeps hundreds of thousands of employees around the world safe, and generally makes the world a better place. Tom is also the co-host of the Shared Security podcast, and he has been contributing to hacker culture since before it was cool, since before becoming an influencer was a thing. It was really wonderful to talk to him about all the practical ways in which empathy helps offensive security specialists, defensive security specialists, and any other type of professional in this space actually do their job better, actually derive more satisfaction from it, and, most of all, actually stick to their healthy moral code. This is a wonderful episode that I cannot wait for you to listen to, to draw inspiration from, and, hopefully, to pass on to someone else who resonates with these principles and values. Thanks again for listening to the Cyber Empathy podcast. Enjoy.

[03:29] Andra Zaharia: Tom, it is so great to talk to you about cyber empathy. It is so great to be able to just have a glimpse at your amazing career that spans decades, that spans so many experiences, and that is so firmly rooted in the hacker culture. So, welcome to Cyber Empathy.

[03:51] Tom Eston: Oh, thank you, Andra. Really a pleasure to be on your show. So, thank you very much.

[03:56] Andra Zaharia: It was so wonderful to talk to you for your podcast, the Shared Security podcast, and to delve into your body of work, because you've created so much, not just through your work as an employee; you have this entire body of work that has contributed to the community and to educating everyone in it, whether they're offensive security professionals, blue team, purple team, or anything in between. So, I was wondering, since you've been making security podcasts since way before it was cool, and you've basically been a podcaster for over 15 years: what have you seen drives people to invest the mental energy, the emotional energy, and the time in this? What drives you, first of all, and what have you seen drives your guests to get involved and really be present and share their best stories?

[04:53] Tom Eston: Wow, that's a great question. I guess I go back to just how I've always been in my life, in terms of how I always feel I want to give back to others no matter what I do, whether that's in my personal life or my career. I've always had this sense of “I have to keep doing more, and I have to keep helping and keep educating people.” And this goes way back. I was in the military here in the United States, in the Marine Corps. And I felt that by going into the military, I was doing something greater than myself, something that helped others as well. After that, I went to college, got my degree, started off with various technology jobs, and eventually landed in security. I think that's where I really felt my calling was: in cybersecurity, or, back then, in its infancy, it was called information security. For me, it's always been about awareness and education. And like you said, no matter what kind of role I've had. Mainly, in the last several years, I've been focused on offensive security. But even when I was the person patching the systems and really getting my foot in the door in the industry, I still felt like I was doing things for other people, if that makes sense.

[06:21] Andra Zaharia: It really does. It shows that your North Star is actually helping people; it really comes through in everything that you do, in how available you are to people who want to work with you, talk to you, and learn from you. And this is deep proof of practical empathy, of something that you're cultivating and doing every day, and have been doing for so many years. How do you maintain the energy for it? Because there's a very fine balance between being altruistic and generous, and doing that to the point where it can burn you out.

[07:03] Tom Eston: That's a great question. To be honest, I have encountered burnout in my career, because I've been in that situation where I've worked too hard and ignored my own mental health. Unfortunately, it impacted my family and my relationships, because there was a point in my career where I was almost too focused on it. I mean, I was too focused on it and on where I wanted to go with it. I think that's a lesson I had to learn by actually hitting rock bottom and actually burning myself out. And in fact, it got so bad, I had to change jobs, because the job was just consuming me and taking me away from the things that mattered in my life, like my family, relationships, friendships, and things like that. In fact, one of the jobs I would consider a little bit toxic in a way, too; I was being influenced in a negative way by the people that I worked with, by the environment, and those things. So it hasn't all been a bed of roses, so to speak, in my career. I've had my struggles, and I've had challenges that I've had to overcome. And I think burnout, unfortunately, happens to a lot of us in this industry. What I've learned is to accept that, and I've also come to recognize, in my day-to-day activities as a leader and as a manager, who the individuals are that might be burnt out or showing signs of burnout. And now I'm of the mindset of, “How can I help them as a leader and as a manager? How can I get them through that burnout?” Because I know what it's like, I've been there, and it's a horrible feeling. But it was also a lesson in my life, just something I had to overcome.

[08:54] Andra Zaharia: Thank you for sharing this so openly. Because, like you mentioned, this is something that's really common, I'm glad that more and more people are talking about it, that more and more people are sharing their experiences: how they found help, how they built or used their existing support system, how they made decisions, just like you did, to improve their context and to do what's right for them, in the sense of staying aligned with what they believe. Because when that fracture between our principles and where we work, or how we're forced to work, happens, that's something that's quite painful. And I love that people like you end up in leadership positions, because you can be the leaders that you wish you would have had when you were in your colleagues' shoes. That's a really wonderful thing. Because burnout is definitely a problem for people in offensive security, in defensive security, and in mixed roles, I was wondering if the fact that you were first a defender makes you a better offensive security specialist, and how? I'd really love some detail around that.

[10:13] Tom Eston: I love that question, because I do believe that if you start more on the defense side, or more in the general IT technology space, before you get into offensive security, it definitely makes you a better professional in offensive security. I lived that. And I think it's because you have to understand how to build things before you break them. I was fortunate enough to have opportunities to be that builder, or, I'll say, that fixer. Like I mentioned, in one of my early jobs I was the person going out and patching all the systems for security vulnerabilities. And not just security vulnerabilities: we were doing upgrades to the email system, we were doing Active Directory maintenance. If I didn't have that experience of being on that other side, I wouldn't quite understand how to properly, or maybe even ethically, break things in a way that demonstrates a security vulnerability or a risk to an organization. And this is what I tell people getting into this industry all the time: the more experience you can have on the defender side, on the blue team side, or just working on a help desk and talking to people, the more you're gaining those critical skills that are going to help you in an offensive security career in the future. Just having conversations with people, learning to talk to people, learning to handle difficult situations or have difficult conversations: it's all going to pay off tenfold later down the road in your career. Having those experiences has been invaluable for me. It's been interesting growing up in the industry when it was in its infancy, starting out on a help desk, doing all these different IT jobs, and building that up into where I'm at today. It's given me a lot of insight and a lot of great experience I can look back on.
And now, hopefully, I can tell others, “Hey, this is a great path that worked for me, even though that path is not linear.” We all know that paths change, you go in different directions. I'm sure you have as well in your career, and that will continue. I fully expect that my future in this field may be completely different. We have no idea, but that's all part of the fun and interesting journey that we're on.

[12:44] Andra Zaharia: It so is. And thank you for highlighting the importance of building the technical skills for whatever role you want to pursue in cybersecurity initially, because, like you said, that can change. It's just better when you also develop the emotional maturity that actually boosts those skills. I've found that in the leaders that I've worked with, in the people in the industry that I respect, admire, and learn from, such as yourself. You combine these two things. The emotional maturity is very visible in the way that you have conversations, in the way that you show up for the community. And to me, for instance, one of the most powerful things that has transformed not just my mission in cybersecurity, but has also really created a powerful emotional tie to it, is that I have been able to see these conversations in public. I've been able to see the Twitter threads, the forum comments, the videos and podcasts, everything. This is an incredible thing to be able to witness and to be able to access for free. If you want it, you can be there.

[14:05] Tom Eston: I've never seen so many great free training opportunities for people in this industry. It's a goldmine of information. I've never seen it like this before. It's just incredible. So, to your point, there is so much material out there to learn from, not just technical skills, but things like your podcast and other people doing great things talking about empathy and leadership. Wherever it is that you feel you need to learn more about a particular topic, you can find it now. It's the golden age of training, I guess, if I had to coin a term.

[14:44] Andra Zaharia: It definitely is. It can be overwhelming, and having the skill to carve an educational path through it is a project unto itself, but it's definitely wonderful. And when I think about this, when I think about how the field has evolved, I see that the culture behind it has changed. But the culture is still rooted in something very elemental and very fundamental for the cybersecurity industry. And I'd like to travel back in time a little bit and talk about the Hacker Manifesto, which was written in 1986 and has driven cybersecurity, first in its darkest corners, and then in ways that have permeated mainstream culture through movies, TV shows, and things like that. I would like to ask you how you've seen this manifesto either continue to be relevant or influence people in the industry who have become leaders and people who shape opinions.

[15:52] Tom Eston: This is a great topic, for sure. To give you a little perspective, when the Hacker Manifesto came out in 1986, I was 11 years old. So, I didn't even know it existed, even though I had started tinkering around with computers. I remember my first computer was an Apple IIe. And I remember convincing my parents that I needed a separate phone line so I could create a bulletin board system where people could connect into my computer and then we could talk, chat, and do all these cool things. I think I was about 13 years old when I had developed all this and had this phone line. And I remember there were some things that I could get involved with around cracked software. They would call them “warez” back then. That was actually my first ethical dilemma, if I had to say, where I saw the opportunity to possibly do something illegal, even though this was all still very new, still in its infancy. And I think that's what the Hacker Manifesto touched on: “Those damn kids are trying to be creative, and they're doing things against the corporations and against the government.” I think the manifesto empowered a lot of kids in that generation to explore, and I was one of those kids. I didn't know about the manifesto at the time, but after reading it, I still felt that I could do something different, that I could start exploring things that hadn't been explored with this new technology. And that's essentially what the Hacker Manifesto is about. It's definitely a rant: “We're against the corporations. We're against the government. We're against the authority.” But it explores the curiosity that really is the basis for hacker culture. And it is the basis, in my opinion, of what cybersecurity is today. It is that curiosity, that “how do I make this thing do something it wasn't designed to do?” And that is still all very good, and that's really positive.

[18:15] Tom Eston: But the difference that I see today, compared to back in 1986 when this was written, is that there was no industry around cybersecurity then. There were no vendors, there was nothing yet. The industry itself was still in a conceptual phase. There were no laws, there were no regulations, there was no privacy or anything to consider. And that's where we're at today: the things he talks about in the Hacker Manifesto are, obviously, illegal now. And as I always like to say, nobody in offensive security looks great in an orange jumpsuit. So, that's the moral dilemma now: Do people continue to do things that push the envelope? And what are those ethical boundaries? I think all of us as professionals have to think about that and about what it means to us and to our industry. Because I'll tell you, it doesn't matter what type of career or position you have in cybersecurity, you will encounter some type of ethical situation. You're going to gain access to sensitive information. You're going to gain access to things that you probably shouldn't be seeing, by the nature of the work that you're doing. I talk to pen testers all the time who could easily take this data, sell it, and make tons of money, and no one would even know what happened, because they've got access to this information. So ethics, your morals, all of these things are just so critical in this day and age, which I think goes back to what was written in the Hacker Manifesto.

[19:56] Andra Zaharia: Thank you for that comprehensive overview, and for highlighting something that another podcast guest, Jenny Radcliffe, also mentioned: we don't really have this in our education system, or at least not prominently enough. We don't have this element of talking about principles, about this backbone that helps us make better decisions. And that's where culture has a big influence: if we manage to highlight positive examples, examples of ways that you can channel that rebellious spirit, that curiosity, ways where you can give yourself permission to experiment, but also do good and make sure that you're thinking about why this happens and who it affects, and always think about the human at the end of the system. This is one of the situations where I think developing our ability to practice empathy, not just think about it, not just use it as a buzzword, but actually practice it, can make a difference. And I was wondering if you remember any particular situation or conversation where you showed someone that you really understood them, or where they understood you and helped you deal with an ethical dilemma.

[21:20] Tom Eston: I have a great story about that. I want to take you back to the first time I had to fire somebody in my career. It was when I was at a small consulting firm, running a small pen test team of ethical hackers. As you know, in offensive security, we are paid by companies to hack into their systems, legally, of course: we have permission to do so. Well, I had hired this gentleman right out of college. Super smart, very talented. I gave him a chance. I wanted to see what this person could do, and my whole team validated, “Yes, this guy's got a lot of talent.” Now, in this company, we used to have this room called the War Room, where all the pen testers sat together and did all of our work as a team, where we could bounce ideas off each other, have conversations, and learn from each other. On his first day, I remember one of my employees came downstairs to my office and said, “Hey, so-and-so is doing something I don't think you would approve of.” I'm like, “Well, what's going on?” He's like, “Well, he's demonstrating this new tool, and he's showing how we can extract credit card numbers from this one site.” I go, “Is this site one of our clients?” He's like, “I don't think so.” And I'm like, “Oh, boy.” So I go upstairs, I see what he's doing, and I'm like, “Okay, stop what you're doing right now. You need to come with me downstairs.” I took him into the office and I said, “What are you doing? That's not authorized. You're hacking some website that you didn't have permission to test.” And he's like, “Well, I was just going to demonstrate this tool, and it's just some no-name site. No one's really gonna care.” And I'm like, “You're getting credit card information. This is very serious. And you're putting the company's reputation at risk, because guess whose IP address you're hacking from? It's ours, the company's. That's going to come back to us.”

[23:22] Tom Eston: And I just had it. I told him, “I'm sorry, but I've gotta let you go. This is unacceptable.” And he was really upset with me. He's like, “What do you mean? Why are you firing me? Can't you see this awesome thing I just did by showing the others on the team how to use this tool?” I'm like, “Listen, you need to understand: what you did is wrong.” It really set me back a bit, because what I understood about him was that while he was smart, intelligent, and all of that, his moral compass, his ethical compass, was not like the rest of my team's or my own. And what I learned from that is that there are people like that, and it's one of those things of “how do I interview for that?” Hindsight is always 20/20, but I didn't interview him about his ethical code or his moral code, so how would I know he would do something like that? So I told him on his way out, I said, “Hey, I know you disagree with me. I know you think what you did was okay. But I hope you take this as a lesson that you can't do this in a professional environment. What you did is illegal. It puts the company at risk and it puts you at risk. And maybe this is something you'll learn from later in life.” But I never heard from him again after that. That, to me, was my first encounter with someone who, obviously, had a different ethical or moral compass than most people in this industry, and a big lesson learned for me.

[24:54] Andra Zaharia: There's something else that stood out to me from your story: the fact that someone on your team flagged this immediately. I think that's one of the greatest things that can ever happen in any sort of group, whether it is a professional group or a community: to actually say something and do something about unethical behavior that we notice, to be able to correct it, to be able to confront it, and to have those difficult conversations that you mentioned. I think that's incredibly valuable. And that's how culture actually gets passed on from person to person. That's how we form examples. And when those people end up teaching others and being models for other people, then you already know that they have a very healthy internal structure.

[25:48] Tom Eston: That is a great point. I'm always a big believer in “you become who you associate with.” And if you create a culture in an organization where it's okay for people to come to me, or to the other managers or leaders, when there are ethical or questionable situations, that is a sign of a healthy culture. Because that's what you want, and that's what you want to encourage as a leader in your organization.

[26:17] Andra Zaharia: Could you point to a particular situation, let's say, where you were on the receiving end of empathy, that really represented an inflection point for you, where it really made a difference for you personally and for your career as well?

[26:34] Tom Eston: I've had several situations where I've been shown a lot of empathy for what I do as a leader. I've sometimes had to make very difficult decisions, either about a direction we take in a business that may have impacted people or their positions, or in situations where the decisions I make affect the financial stability of a company. I've been lucky enough to have some of my leaders actually show empathy towards me: “Hey, Tom, that must have been a very difficult decision that you made. I fully support you in that.” Or I've had conversations like, “Hey, I think the decision you made, Tom, was wrong. I want to let you know that.” I've had it on both sides. I've been in the military, and that's a lot of yelling and screaming: “You did this wrong. We're punishing you.” But on the opposite side, I've also had more compassionate leaders say it in a certain way, telling me, “Hey, you were wrong, but I want you to learn from this.” They've had more of that coach type of mentality. Yes, I felt horrible for making the wrong decision. But the real leaders who have inspired me in my life and in my career have been the ones who told me, first of all, that it's okay to fail; you're going to fail, and that's totally fine. How you learn from that failure is the bigger lesson that I've gotten out of those conversations. That's what I tell my team and the people that I lead: you can fail; it's just going to happen. We're all human, we're all going to make mistakes. But what do you learn from that failure? And how can you be better? That's the better message. And I've always been thankful to have leaders who have worked with me through my failures.

[28:32] Andra Zaharia: That's a fantastic example of empathy: creating that space where people can sit with the uncomfortable stuff, knowing that there's a way forward and that people still trust them to learn, to get better, and to improve. Obviously, those are decisions that aren't as bad as the one you told us about earlier, so there are definitely degrees and nuances throughout this. But yes, creating that space, holding that space for someone, is such a powerful thing. And providing the psychological safety for people to know that they can actually do this, I think, is incredibly generous. It's a very powerful example of “we're here for each other,” even in a business context, because punishing people really severely for everything, and treating all wrongdoings the same, has never helped anyone learn something really valuable. Because we are in this industry of punishing people who click on links, and of the other clichés that are still constantly highlighted and emphasized over and over again. How do you teach this balance between control and responsibility? Because, as ethical hackers, you can manipulate systems and you know how to get them to do what you want them to do. How do you teach young people to combine these two concepts? You have this incredible control and skill, but you have this responsibility not just toward your employer, but toward the actual people who use the technology. You have the responsibility to do your best to figure out why they're vulnerable and what people can do to fix that.

[30:36] Tom Eston: That is a great question. It is so challenging, thinking about all the different situations that somebody in offensive security could be in. I go back to thinking about not just an ethical hacker, but someone doing digital forensics, as another example. How do you take what you're doing and translate that back to how it affects the person or the company on the other end? For a penetration tester, a lot of that comes down to how you report what you found. How do you take something that's very technical in nature, especially if it was really complex and you had to chain different vulnerabilities together to demonstrate that you gained access to sensitive information? The lesson I try to teach pen testers is about how to translate that language into normal speech, so that somebody who isn't technical, like an executive, is going to understand what the impact of those technical vulnerabilities is. So, a lot of it comes down to communication and communication skills, especially for very technical people who, by their nature, just want to speak technical to other technical people. But that's not the reality of this job in this industry.

[31:56] Tom Eston: You have to find those ways of translating technical concepts into realistic, risk-based conversations, which is a skill set that just takes time to develop. So I've had to work with some of the more junior people who come into the company I work for now, and teaching them through examples is a good way to show them: “Hey, here is a paragraph of a technical write-up. And here's the after, how we changed this to be a little less technical and more risk-based, to show an impact.” Once they start understanding that, then, over time, their writing gets a little bit better and those soft skills really start to improve. And as you move up in your career, you're just going to be put in situations where you'll have to speak in front of a group of executives or a board of directors, or they're going to want to hear from that technical person about how you broke in or how you gained access to something. So communication skills, I always go back to, are some of the most important things that you can learn, even above the technical concepts and how to use a tool and all those things. The more time you can spend earlier in your career working on those soft skills, public speaking, communication, and writing, the more valuable you're going to be later on as you progress through your career.

[33:23] Andra Zaharia: I really appreciate you highlighting that, because the best people, the people that we love learning from, do exactly that. They're great storytellers. They're super engaging writers. They are authors. They are speakers. They do all of these things, making the extra effort to travel, to be there, to engage with communities, and that's what moves us forward, that's what teaches generations. And just like you mentioned at the beginning of our conversation, that ability to give back, to contribute to keeping this virtuous cycle going, is one of the most generous things that we can do for each other in this industry, and for the people who we serve at the end of the day. One of the things I wanted to highlight from what you said is that you have a very kind perspective in acknowledging that it takes time to develop these skills. That's so wonderful to mention, because the idea of hacking your way through things has actually permeated mainstream culture, and hustle culture has obviously spilled into our information security industry as well. There's this concept that you can somehow find shortcuts, somehow do this faster and better, without wasting any time at all. But sometimes you actually have to spend time on things, do the work, and go through the stages, because that's how you actually internalize the most valuable concepts, lessons, and abilities. And there's no shortcut to that, whatsoever.

[35:07] Tom Eston: That's so true. And that goes back to the hacker mindset of "I'm going to find a shortcut. I'm going to automate this, so I'm going to write a script to replace this manual work, or I'm going to find a way to do something better." These are all great things. We want our teams and our people to do these things because that's what drives business. You can't do everything manually all the time; we need automation. We need all of these things. You don't have to be a great public speaker. You don't have to be the greatest writer. But there are little things you can learn and do to improve the way you write or the way you speak. I can tell you, I've seen amazing technical consultants come through who were so good at their job, and they'd always say, "I can't speak in front of people, I can't do this, I can't do that." But you know what, they tried, and they put in just a little bit of extra effort to improve something they knew was a deficiency or that they just weren't that great at. I always look for those individuals who at least want to try and want to better themselves, even though they know they're not going to be the greatest public speaker or the greatest writer.

[36:32] Andra Zaharia: And that's all it takes: trying to overcome those self-limiting beliefs. It's 2023, but we're still taught, I don't want to say gender roles, but roles associated with certain kinds of jobs: a technical job is strictly technical, a communication job involves no technical background or understanding. And that's superficial. These stereotypes hurt us so much. They're actually ingrained in us. Even if we don't want to be, we're still influenced by them. And we have to be mindful, self-aware, and very intentional about noticing them and starting to change these things. And giving examples of "Hey, you can actually overcome these self-limiting beliefs," because I bet every generation will have them one way or another; we just have to deal with them, and that's it. I actually wanted to unpack another aspect of penetration testing that you mentioned. Besides breaking down these vulnerabilities and attack chains and helping developers and other IT specialists fix them, there's this huge component of communicating to people in decision-making positions, such as executives, business leaders, and so on. How do you convince people that those attack scenarios can actually happen? Because I find that one of the biggest points of resistance is, "Okay, I understand this. But if you had such a hard time figuring out how to chain all these vulnerabilities and breach the business, why would anyone put in that amount of work to hack us?" I find that the "it can't happen to me" bias is so powerful, and it takes so much to break it down. How do you do it?

[38:32] Tom Eston: That is a great question. And one of the challenges we encounter in offensive security all the time: I can't tell you how many times I've been in those boardrooms or those conversations where someone says, "Yeah, that's really cool. That's a cool hack that you did. But realistically, could that really happen to us? How much time, effort, and money should I really be spending remediating something that you demonstrated?" And I think it all comes back to that risk conversation I mentioned. I think there is a problem in this industry where a lot of things are overblown. Just recently, I talked on my podcast about "juice jacking," about how the FBI has come out and said, "Look out for those charging stations at the airport. Don't plug your phone's USB cable into a charging station; it's going to siphon off all your data." And you look at that, and it's coming from the government, so it must be a very serious warning: this could happen to you, it will happen to everybody. But realistically, unless you're being specifically targeted by a nation-state because you have something a nation-state might want, the average person is not going to be targeted by a juice-jacking attack. So it's the same thing when we report a vulnerability to a client: we really have to look at it objectively and give them a risk rating or something else to show them. When I communicate on a report what is serious and what isn't, I need to be able to back that up with some type of data, some type of real-world scenario or situation, to show that yes, you do need to take this seriously, or maybe it is a little bit lower risk, and then, going back to your threat profile or how you perceive risk in your organization, this might not be that big of a deal, even if it is a really cool hack or something impressive that you did.
I think that's the challenge for us as professionals: we have to take a step back, really look at it objectively, and then have those risk-based conversations with our organizations.

[40:49] Andra Zaharia: And it probably helps a lot to have people on the team to do this with, to have these conversations, and to help each other just notice biases, blind spots, and things like that.

[41:03] Tom Eston: Yeah, and the opposite is true, too. There could be something that is low level for us because we see it in all organizations, like a low-level vulnerability, but to another organization it might be a critical or high vulnerability because of how that particular issue might impact their business or their business processes. So I've seen it on the other side, too.

[41:26] Andra Zaharia: And that's why I believe automation cannot replace this kind of thinking. This type of critical thinking is what's so valuable about this industry. It's such a valuable skill that ties back not only to the ethical part of ethical hacking but also to actually breaking down technology, understanding its consequences, and communicating all of that. I find that this is a core skill that also helps you a lot in life, generally speaking, not just in your cybersecurity job. Fear, uncertainty, and doubt are such a huge problem in the industry. And again, they actually block communication and connection instead of helping to persuade people. How do you build confidence with people to have these conversations? After a pen test, do you also share what you tried that didn't work, to show them that they're actually doing some things right? Because just having someone tell you, "You're doing a lot of things wrong," can be really disheartening.

[42:35] Tom Eston: Oh, it's very disheartening. It's a horrible feeling when you walk into a room or get a report and it's nothing but bad, and you're doing everything wrong. It's a horrible position to be in. I completely agree that you have to show the good things along with the bad. And in fact, that's how your messaging should really start: "You're doing these good things. We saw all these observations that were positive." That could be anything, like "Your team alerted us that they saw us," or "We saw how you ran your incident response process, and that was really good." So it doesn't always have to be tied to the work that was scoped or the exact project you're doing. I always try to call out those other positive things that were observed for the team. You're sending a message of, "While we did find all these bad things, there are all these other good things going on in the organization that you should be proud of, and you should start to leverage those teams' strengths in your security program." But I'm always a fan, too, of showing the entire methodology you followed and saying, "Hey, we ran all of these tests, and these tests came back with no results, meaning you're doing a good job in these areas. And here are the things that could use some improvement or some work." So, I think it is all about framing that conversation, depending on the type of work you're doing, to highlight the positives and the good things about what's going on and what you observed, but also the negative things. It's all in how you frame it in your conversations.

[44:19] Andra Zaharia: And the level of detail and specificity you add to your conversation, and just really being present and involved in the moment, taking the time to read the room and feel what others are feeling. Again, empathy comes up exactly there, when you're able to understand what the burning issue is for the other person. And sometimes it's difficult, because what people tell you will often be different from what they actually feel.

[44:50] Tom Eston: Oh, that's so true. I always go back to "know your audience." Whenever you're writing something or speaking in front of people, you have to know who you're speaking to, because that is going to change the content you're about to deliver, but also the tone of your conversation, because people like to hear things differently depending on their personality. You consume information differently than other people. So I always tell people in organizations: if you're setting up a meeting to discuss a pen test report, always ask who's going to be in the meeting, what their title is, who they are. In fact, I've gone as far as looking people up on LinkedIn and trying to understand, based on their posts, whether they're maybe a little more extroverted or introverted. Those little nuggets of information will really help you craft your message appropriately for the audience you're delivering it to.

[45:37] Andra Zaharia: And it also uses the skills that penetration testers already have, which is one of the great ways to do this. I find that highlighting that people already have the skills they need for the job, the project, the role, the task at hand opens their eyes to "Oh, I'm actually able to do this." It helps them gain a little self-confidence and not feel as intimidated. And it pushes back against that idea of "this is not for me, this is not part of my skill set." Framing changes so many things. So, we've gone through so many topics, but I see them all as connected. There's this crazy map in my head right now that I wish I could put out, but I don't have the visual skills for that. Well, I don't have them yet, speaking of self-limiting beliefs. Which principles would you add to, let's say, the 2023 update of the Hacker Manifesto? What are two things you wish ethical hackers would see as representative of the job, the challenge, and how things work in this dynamic, in our day and age?

[47:02] Tom Eston: I did touch on an update around that earlier. I would say that all the tenets of it still hold true: yes, a lot of the hacker mindset is about going against authority, against the status quo. It's doing things differently. It's drawing out creativity and breaking things down to find out how they work and to do things better. Those are all really positive things. But you have to do it in an ethical and moral way. Go back to the manifesto: he wrote it after he was arrested. So, if that gives you some indication, you don't want to do things that are going to compromise your moral compass and the things you believe in. And I think the way to think about it now is: with everything you do, think about how it impacts others. That's one of the things you could take away from a more modern manifesto, thinking about it from an empathetic perspective: "When I access this system, or when I do these things on a pen test, how does that impact the people on the other end? How does that impact the defenders? How does that impact the company, the organization?" And when you start having those conversations with yourself, things start opening up, like, "Oh, I didn't quite think of that." And it might impact somebody's life or cause an issue, especially in our field, where any little thing we do could take a system down, cause an outage, cause an impact. That's the mindset we all have to start adopting: how does this impact others? And the more of that we can do, I think the better off we'll all be in this industry.

[48:51] Andra Zaharia: That is a beautiful message to wrap up this conversation with. Thank you, Tom, for being such a fantastic guest, and such a generous person, and for everything that you do. We're really lucky to have you in this industry.

[49:04] Tom Eston: Well, thank you, Andra. It's been a pleasure to be on your show, and thank you again.