It is very hard to make people who have never experienced a cyberattack treat the need for best practices, such as two-factor authentication, with urgency. Most users see best practices as an unnecessary burden they do not need. That is why those in the cybersecurity space need to empathize with the user while advising them.
One of the biggest barriers to promoting safe practices in cyberspace is attitude – both from users and from specialists in the industry.
People who’ve never experienced a cyberattack have a difficult time understanding the need to take precautions in their digital lives, because the consequences are abstract and far removed from their daily routines. This is absolutely natural, given that information security has existed for barely a moment compared to the span of human evolution.
Most people see best practices as an unnecessary burden they don’t have time for. That’s why those in the cybersecurity space need to empathize with each user to be able to help and support them effectively.
Our guest today is Joe Giddens, the Director of Content & Communication at CybSafe – a company focused on lowering human cyber risk by educating, nudging, and supporting teams. He’ll help us understand the communication challenge in the cybersecurity space.
In this episode, you’ll hear about Joe’s experience in law enforcement and how he ended up in cybersecurity. You’ll also learn how familiarization makes it more difficult for experts in our industry to show empathy. Additionally, you’ll find out some honest, hard truths about practicing empathy while helping people acquire the digital security skills they need – and why they need them.
In this episode, you will learn:
Connect with Joe:
[00:42] Joe Giddens: Security should be a byproduct of our empathy. It should be a byproduct of our ability to understand and help people with their problems. If we know what keeps people awake at night, we can support them. And that starts with talking with people and understanding the problems they have in the workplace, and indeed outside the workplace as well.
[01:05] Andra Zaharia: Throwing more information at people and throwing more technology at the problem won't necessarily fix one of the biggest challenges in cybersecurity today, which is closing the gap between information and acting on it. We keep telling people all of these things that they have to do, but it doesn't make it any easier for them to actually do them or to see progress, especially when they end up getting the blame for not doing enough or not being careful enough, clicking those links that they shouldn't, and so on and so forth. So, today, I'm talking about this particular issue from a number of angles, and we also dive into a piece of research that's particularly helpful for everyone in the cybersecurity industry, with Joe Giddens, a truth-telling, no-holds-barred guest who really believes in standing up for people and championing them in cybersecurity, helping them feel empowered, and helping them have positive experiences with cybersecurity. We talked a lot about self-fulfilling prophecies and desensitization, how that happens for people who work in the industry, and how to try to reconnect and close that gap between people who know a lot more about these things than everyone else does; everyone else who uses technology for things that are not necessarily related to security, but to their own regular lives. It's important to make cybersecurity something that's tangible, something that's positive, something that people feel they have a natural inclination towards. There's so much that we can do with what we already know, and Joe really does a great job at highlighting this. I am excited to share this conversation with you and to help you get to know another great person behind the great work that happens in cybersecurity. I'm inspired by Joe's story, his approach, and his honest and vulnerable way of showing up in this community.
And I hope we get to connect to many, many more people like him because I know that that's the start of real change in this industry and beyond it. Thanks for listening.
[03:42] Andra Zaharia: Joe, lovely to finally talk to you and get to explore a bit of your experience, with empathy in mind and many, let's say, stories and experiences along the way.
[03:54] Joe Giddens: Likewise, Andra, it’s great to meet you at last. We've been going back and forth for a number of years, I think. And I think we hold very similar viewpoints that both of us shout from the rooftops. And I'm excited to talk about them.
[04:08] Andra Zaharia: Oh, yes. And speaking about that, something that I picked up, something that we have in common, is that we both come from non-technical backgrounds. And one of the things that I first wanted to explore is your origin story, given that you have a bachelor's degree in music and arts management, but you ended up spending almost 10 years investigating and preventing cybercrime as part of the Metropolitan Police. And I was wondering, how did that happen exactly? What did that journey look like?
[04:41] Joe Giddens: Like most 19 or 20-year-olds, when I went to university, I had no idea what I wanted to do. Music was an interest but it was nothing more than that. So, I found a degree that said music in the title and I went to do that. And it was more driven out of the fact that I think in the UK, at the time, it was the last year where student fees were remaining quite low. So, if I left it for a year, it would have been more expensive for me so I just picked a course and I went and did it. And then like anyone who does music, I decided I wanted to join the police force about halfway through that course. So I started making moves toward that as I was studying. What happened is I didn't really see a path in music. Like I say, it was an interest rather than a passion, or actually anything I was good at. So, I picked the police. And after leaving the university, I signed up for the Metropolitan Police in the UK. Spent two years in uniform, working on the streets of Tottenham, which is a lovely place, despite what anyone says. And after a little while of doing that, I moved into detective work, initially working in a domestic violence unit, helping victims of domestic violence and abuse. And then after that, I moved into serious crime. At the time, serious crime was anything underneath a murder — so, all the lovely things like rapes, high-risk missing people, and serious assaults. It was tough work and it took its toll, but it was meaningful and I enjoyed it.
[06:22] Andra Zaharia: It does sound like you had to deal with some very difficult and very emotionally intense situations while you were doing this. So, that sounds honestly quite impactful, especially for someone in, let's say, the first years of their career, that's something difficult to experience as a maturing adult.
[06:42] Joe Giddens: It was. And I use the words “it was” and not “it is” because, at the time, you don't really recognize what's happening. The change is slow, and you build up this armor, which manifests in the form of dark humor, probably drinking a little bit too much, and spending time around your colleagues, who become your family. And it left its mark on me when I left the force. You come out and you realize the world isn't as bad as it seemed, but you'd been seeing the bad side of it on a daily basis. And there was a lot of working through that trauma. Like I say, the shield that had built up had to come down, which was a process. And on a tangent here, for anyone who is working in law enforcement and is regularly exposed to the bad side of life: just talk to people about it. Don't cover it up, don't drink it away, don't laugh it away. It's not normal. And talking really helps. Understanding yourself, your colleagues, and other people is a really good thing to do.
[07:46] Andra Zaharia: I really appreciate you sharing that, especially because I feel that it is a more intense formula of something that people in cybersecurity experience as well, people who have to deal with cyber incidents, and who are always kind of the ones who have to push others to do the right thing even if they don't want to. But what you went through is a thousand times more intense and more visceral than working in cybersecurity perhaps is. So you already went into this with a lot of emotional practice in what it means to deal with emotionally charged situations, I guess.
[08:24] Joe Giddens: Yeah, it's very easy to sleepwalk into a negative attitude, and you don't realize it because it becomes your attitude, it is who you are. And it happens very gradually, as a result of desensitization to bad things. And I think you draw a great comparison: as security professionals, we're probably desensitized to the bad things that are happening, because we're dealing with breaches. And when things go wrong, when our colleagues aren't doing the things they need to, it's generally not because they don't want to; it's because they're in the wrong situation, the wrong environment, or they've got other priorities that are just as important as what we deal with.
[09:03] Andra Zaharia: That's true. Plus, I think that there are so many layers of abstraction that separate people from cybercrime, which is why it's so difficult to see the potential consequences of your actions, or lack thereof; it's so difficult to connect the cause to its effect. And I think that's a real challenge that's going to persist for quite some time because, biologically, we're just not built for it yet.
[09:30] Joe Giddens: I agree. Technology moves quicker than anything, and it sure as hell moves quicker than how we adapt to our environments. I think the first computer was invented maybe 80-90 years ago. And prior to that are millions of years of evolution to get us to where we are, and we're expecting people to thrive in these environments, which have sprung up over the last 100 years. That's just unreasonable. As security professionals, obviously, we're immersed in it more, so it's more familiar to us. But on the theme of empathy, we just need to realize this doesn't come naturally to people. It doesn't come naturally to humans, and it doesn't come easily to people who don't do this day to day.
[10:10] Andra Zaharia: That is so true. This effort to adapt, I think, is such an important thing to keep in our awareness. And it brings me back to: how did you adapt to crossing over into cybersecurity, taking all of that experience you'd gathered and applying it in this extremely abstract field that seems devoid of feelings?
[10:33] Joe Giddens: So, after the stint in serious crime, I started specializing in cybersecurity investigations or cybercrime investigations. I did this for four or five years. And during that time, I had the opportunity to work with and speak with and interact with many different types of victims of cybercrime. And it was an experience. I don't know if it was a good experience or a bad experience, but I'm grateful for having had that experience because it taught me a lot about people and how they deal with things and how they go about their day-to-day lives. I came across this analogy, which I read in a book called The Black Swan, by Nassim Taleb. He makes this analogy of a turkey. And the turkey lives on a farm with the farmer, and every day the farmer feeds the turkey. And every day the farmer feeds the turkey, the turkey starts to form a stronger view or opinion of the farmer: Because the farmer is feeding me, the farmer must be my friend. And every time the turkey is fed, this view just grows stronger and stronger, until Christmas. And at that point, the farmer and the turkey, they're no longer friends because the turkey ends up dead on a dinner plate. It's a bit of a morbid metaphor, but I think it works really well for cybercrime and what I was doing at the time. This mentality of “It will never happen to me.” And every day that goes past where it doesn't happen to you, you form this stronger point of view, this stronger opinion of “Well, it's not happened to me yet, so it's less likely to happen to me.” And the whole time we're sleepwalking into risk; then it happens, then the breach occurs, then the theft occurs, or the incident which rocks people's worlds. And it's not through carelessness; people are just focused on other things. This is just how we're built as people, this is how we think, these are deep-rooted survival mechanisms, which help us be who we are and do what we do. So, it's important to understand that.
And, again, layering that onto other areas of work and other areas of what people do: their motivation is not security, their motivation is doing a good job; it's pleasing clients, it's meeting deadlines, it's hitting targets. That's the main goal that drives other areas of the workforce. We can't forget that as security professionals. And it's where security becomes friction. In some cases, so much friction that people have to find workarounds to it. These were the sorts of things I brought with me into security work and CybSafe and what I'm doing now.
[13:28] Andra Zaharia: That's a very powerful metaphor, and it really highlights, let's say, the number of biases that we have to deal with and the number of things that we would benefit from working on. This list of biases feels a little endless. And I think that cybersecurity actually does inherently offer some mental models, frameworks, and concepts which can really help improve our self-awareness, not just around digital security and privacy issues, but generally speaking; it just helps make your mind clearer. That happens if you're interested, if you have a natural affinity for these kinds of things. For example, me, as an anxious person who has dealt with anxiety her entire life, I think that part of the appeal of cybersecurity for me was that it gave me a way to create some predictability and manage risk better in my life, and that's helped me a lot. So, I love that you're highlighting these mental models and biases, things that carry over into multiple areas of our lives and are not just specific to cybersecurity. And I was wondering, if we think of the patterns and biases that we all share, whether the victims of cybercrime that you dealt with had anything in common in their first reactions to realizing that the farmer had turned against them.
[14:55] Joe Giddens: It was shock. It was always shock. I heard the phrase “we just thought it would never happen to us” so many times in my career. I think that summarizes it as best as I can: “We never thought it would happen to us.”
[15:08] Andra Zaharia: Disbelief, I think, is something that often occurs. Well, I think it happens to everyone, honestly speaking, simply because, again, the consequences and the inner workings of cybercrime are so hidden from view, they're not physical and they're not palpable in any way until it's too late. Although I hate using that cliche, it is true nonetheless.
[15:35] Joe Giddens: That’s the point I was making about desensitization. As security professionals, we see this day in and day out. I came across this phrase: “Familiarization is the best form of anesthetic.” And we just really need to understand that not everyone sees things from our point of view, not everyone is exposed to the number of incidents, blogs, phishing emails, and bad behavior that we see day in and day out. And empathy here is, again, like I was saying before, understanding things from other people's points of view. I think true empathy is living things from other people's points of view. But that's a little more difficult.
[16:23] Andra Zaharia: It is. There are several layers; there's cognitive empathy and emotional empathy. And whichever form you start experiencing is, I think, still very valuable, especially in the work that you're doing now, in a space that's very challenging in the sense of trying to codify and analyze human behavior around cybersecurity, how that changes, and trying to make it measurable, so you can track progress in terms of how people are actually acquiring all of this information, how they're making use of it, and how it feeds into their behavior at the end of the day. Which, again, is very difficult because human behavior is so diverse and so unexpected at times, just like in the metaphor. So I wanted to ask a little bit about what personally drew you to this particular area of cybersecurity, which obviously has a lot more to do with human behavior than with technology, technology being an enabler at the end of the day.
[17:26] Joe Giddens: It was more through chance than anything else. It was a series of fortunate events, which led me to CybSafe. We started off as a very small outfit, just five or six of us in a small room. We had a vision to build a piece of software that helped people be more secure and safe online and when they use technology. I guess our understanding grew as the company grew to where we are now. We're a behavioral risk platform, and we influence and measure people's security behaviors. And there's so much to unpack there. But really, the transformation that we've gone through over the last five years and the work that we're doing have been built around the fact that you can know things, people can have knowledge, they can have understanding, but that doesn't translate to behavior. It just doesn't. I know drinking is bad for me, and I drink too much. I know I shouldn't smoke, but sometimes I do. I know I should exercise, and I probably don't exercise enough. I know these things, I am aware of them, I have this knowledge. But to take that to behavior, you need more, a lot more than just what we're giving people at the moment.
[18:45] Andra Zaharia: And what is that specifically? What fills that gap? What brings information and knowledge into actual practice?
[18:54] Joe Giddens: It’s a series of things built around the situational, physical, mental, and social context of what people are doing. Let me give some examples. I know I should exercise more. There are devices you can wear on your wrist, like an Apple Watch or a Fitbit, that tell you when you've been sitting down too long. So, I'm taking that knowledge, and I'm getting a prompt, a nudge in the right direction, from a wearable, and it's telling me: “Hey, you've been sitting for too long. It's time to get up and move.” That's an example of nudging someone at the right time, in the right way, with the right information, building on the knowledge they already have. I should do more exercise, I have that knowledge; I just need a timely nudge to turn it into behavior. Staying on this theme, because there's a lot to unpack here: some of these wearables also let you set goals for yourself: “I want to do this many thousand steps a day.” That's a tangible thing to aim at, and it's a tangible thing to achieve. And when you do it each day, a little firework goes off on the watch or the thing that you're wearing, and it says, “Well done,” and you feel good about it. It's a dopamine hit. So, there's an example of goal setting. And then you can compare how you're doing to your friends and family, and you can compete with them, so there's a social aspect to it. Just on this one theme of moving and exercising, there's a lot of understanding that goes into helping people do it. The challenge we have with helping people be more secure is, I think, probably even greater than helping people be healthier. There are over 70 known security behaviors. And for each one, we require research, understanding, data, and knowing what works and what doesn't, and how that changes from person to person, because everyone's different. The challenge is not insignificant, but it's a fun one.
And that's what we're doing, that's what our work revolves around.
[21:04] Andra Zaharia: And it's a meaningful one as well. I really see how this type of research, this type of work, has such long-term benefits. Security is now fundamental to global stability, to societal stability; it contributes to so many things, and we're so highly dependent on it. But again, people in the industry see this more than people outside of it, simply because they're more involved in the nitty-gritty of it. And I loved how you emphasized helping people make use of their existing information. For such a long time, the main approach in cybersecurity education has been to tell people that they're not safe enough, they're not doing enough, they're clicking on the wrong things, and they're acting in ways that are non-compliant or unsafe. Generally, we've just been making people feel bad about themselves, which never led anyone to actually change their behavior, or to do so sustainably. And instead of seeing cybersecurity as an ally, a benefit, a source of information, knowledge, and experience that can improve their lives overall, they've always seen it as a chore, as friction, as “Oh, here's this other thing that I have to take care of.” So, thank you for highlighting how this could change: by making things more tangible, like you said, by making them feel more relatable, and by putting them in a positive context that people can actually enjoy, because we really need a lot more of that in our lives, generally speaking.
[22:37] Joe Giddens: I think you've hit the nail on the head there. Security should be a byproduct of our empathy. It should be a byproduct of our ability to understand and help people with their problems. If we know what keeps people awake at night, we can support them. And that starts with talking with people and understanding the problems they have in the workplace, and indeed outside the workplace as well. And then doing what we can to support them in solving those problems. If we do that, security should just come naturally, because we build environments in a way that facilitates secure behavior.
[23:16] Andra Zaharia: Plus, that's a fundamental need that we have. Everyone can relate to the need to be safe, to feel safe, to feel that you are in an environment that supports you and trusts you, where you trust other people enough to be vulnerable, to go to them with your issues and your questions without feeling inadequate, stupid, or less of a professional because you don't know how to do X, Y, or Z. And you’ve uncovered a great many things about this particular topic in the report that you recently launched. I love its name: “Oh Behave!” The first time I saw it, I read it in a voice in my head and thought, “Oh, that's such a great name.” That's exactly what some security people are trying to tell others: “Would you just behave already? I'm sick of trying to tell you to behave. Just do it.” What kind of reactions have you seen to the report? And then we'll dive into a few highlights, because I think there's a lot of great information there, which we couldn't possibly cover in a single conversation, but we can talk about some highlights.
[24:24] Joe Giddens: It's interesting that you've read the title in that way. It's finger-wagging, it's like telling someone off: Oh Behave! When we put it together, it was a little more tongue-in-cheek, based on the Austin Powers catchphrase: “Oh, behave!” So, the reaction to the report — and I should say at this point, it's not just a CybSafe report; it was put together in collaboration with the National Cybersecurity Alliance and Get Cyber Safe, from the States and Canada, respectively. The goal of the report is to shed some light not on what people are doing—we know what people are doing, we've got that data by the bucketload—but on why they are doing it. Why are they acting in such ways when they interact with technology? And I guess the themes to draw out are that people want to be safe, they want to be secure. But a lot of the time, security is just too overwhelming. A good example is multi-factor authentication. Multi-factor authentication — what a word, that's a mouthful. What we mean when we say it is that it's a really good way to protect your accounts. But for some reason, it's got this long, complicated name that doesn't really mean much to people unless you unpick each part of it. So, one of the questions is around multi-factor authentication. And a lot of the answers were, “I don't know what you're talking about. What is this?” Of course, one of the most fundamental things people should be doing in security is protecting their accounts and their online information with MFA. But we've got a framing problem here. We've got a problem where we're telling people to use something, but we're doing so in such a way that they don't even know what we're talking about. So, there's information we can take from that: if we just spoke to people, if we communicated using their language, using words they are familiar with, they might be more likely to adopt this.
And again, it's not through lack of trying or because they don't want to, it’s simply because they don't understand.
[26:33] Andra Zaharia: That is absolutely true. I've seen this behavior. I try to always kind of eat my own dog food: first of all, observe myself while I interact with tech products that require certain security layers, notice when I feel frustrated, make a mental note of it, and realize that probably millions of other people have the same experience as me. I look at my mom, I look at my dad, I look at friends who I've been nagging about cybersecurity for a long time and who are already kind of responsible about it. But I also look at people who are outside of tech, who use tech but aren't inherently interested in technology and don't spend that much time with it intentionally, and I look at their behavior and I see all of these things. My mom works in insurance, so she has to follow all of these rules, and she knows why. But still, sometimes systems fail her. And I see, just as you mentioned, that her final, let's say, duty is to the customer. And when the customers need her and she can't serve them, that takes a heavy emotional toll; it adds a lot of stress to her life. And these are moments that keep repeating billions of times across people's lives around the world. That's something I feel we may sometimes lose sight of. And seeing these things reflected in a report such as this is, I think, so important, no matter what role you have in the cybersecurity industry or in technology in general.
[28:10] Joe Giddens: One of the fundamental changes that has happened in our lifetime is this: when I was growing up, you could still have a job that didn't require you to use technology. It's not an option anymore to use technology or not; we have to use it. If we want to live and work in the modern world, we can't get away from that. Everything we do is connected in some way to technology: our success as people, our safety, our well-being. Is it a fundamental need? I don't know if I'd go that far. But it's undeniable that anyone in the modern workplace has to use technology to be successful. And our approach to helping people stay safe in that environment is still: “Here's some training. Here's some information.” It's not so much building the environment to be safe; it's expecting people to be safe in an environment they are unfamiliar with. And that's reflected in the report as well. Lots of people have access to training, and that doesn't necessarily translate into security behavior — and not everyone even has access to training. So, we're failing on two counts. We're not giving people information in the right way, or in a way that they can use it, and the information we do give them doesn't help them.
[29:36] Andra Zaharia: And still, at the end of the day, they're the ones who get blamed. We're putting people in a situation where they just can't win: they're damned if they do, they're damned if they don't, no matter how much they try. To see this in context, to just sit with this realization and with the discomfort that it produces — I think that's such an important thing to feel, honestly. We need to be talking about emotions in the decision-making process, and how they inspire and trigger actions, a lot more than we do in cybersecurity. It is inherently an industry that wants to codify everything, that wants to pick things up, scale them, turn them into products and software. But throwing more technology at the problem isn't helping, as we can see. I wanted to talk particularly about one of the insights mentioned in the report, one called learned helplessness. I think that's something that really catches your eye as you go through the insights. And I wanted to ask what stereotype sits behind it, and what better path forward or solution you see to this particular issue.
[30:49] Joe Giddens: So, learned helplessness. There's a phrase that has tainted our industry for way too long, like a bad smell that won't go away. And we all know what the phrase is: “People are the weakest link.” And my blood boils when I hear it, because if you tell someone they're crap at something, they'll be crap at it. I don't know how many times this has been repeated and churned out in different ways and in different words. But you touched on it earlier: it's the whole fear, uncertainty, and doubt as the basis for our messaging. And if we're basing our messaging on fear, uncertainty, or doubt, then people are scared; they're scared to do the wrong thing; they're scared to act in the wrong way. And as far as I know, there are no academic or scientific studies which show that scaring people, shaming people, making people fearful, or telling people off is a good way to motivate positive behaviors. It's a good way to alienate people.
[31:48] Andra Zaharia: Or even turn it into a self-fulfilling prophecy at the end of the day. And I think that people choose to use this particular phrase because it serves their own ego, because it makes them feel better about themselves, because they know more and they feel like they're in a position of power over others. But again, that small, let's say, show of superiority will lead to nothing good.
[32:22] Joe Giddens: Yeah, it's egocentric, and it's a good way to cover up the fact that you're doing a bad job, as far as I see it. There are so many things that have to happen before a phishing email lands in someone's inbox: your network has to be configured wrong, your email filters have to fail, someone has to have made a conscious decision not to buy the right software or configure it correctly, for that email to get into an inbox. To blame the person at the end of the chain is just a real cheap way out. And like I say, I think it's bad people covering up the fact that they're doing a bad job. This, I think, is what we see, and this is the learned helplessness: we blame the people that we're protecting, or are supposed to protect, and then we're surprised when they don't engage with us, or come to us, or report things to us. I don't know the numbers, but I would be interested to see how often people in the security team get invited to the pub versus everyone else in the company. I think they are probably quite lonely.
[33:27] Andra Zaharia: That is true. And a previous guest on the podcast actually talked about the value of approachability, and how she discovered, through various experiences, that no one in the security team where she worked talked to people in other teams in the company, which was a huge surprise to her. They kept receiving a lot of problems that were easily solved through just one conversation with the person who had no idea that they were doing something wrong, obviously. And this kind of truth-telling that you're doing now is exactly why I wanted to talk to you for this podcast. I love that people are taking a real, honest stance on these things, that you're leading that effort, and that you're doing this consistently, and not just for, let's say, business goals. There are a lot of people in the industry who'd rather sugarcoat things and sweep the trash under the carpet to make sure the impeccable image of the company prevails. But again, people who've been in this industry for a while know that it's about being flexible, that it's about finding nuance, and that thinking in absolute terms helps no one, honestly.
[34:34] Joe Giddens: Absolutely, it’s no secret. I work for a vendor in the space, and there's been just as much bad practice, bad messaging, and bad behavior from vendors. It would be very easy to just create a company, sell some training, and live a good life. We could live a real good life just selling training to people and telling them it works. That's not what we're trying to do here. The reason we're so forthright and punchy with our message, and why we're happy to stand up and shout it from the rooftops, is because we believe in this. We're not here to make a quick buck, and no one in this industry should be; we're here to help people, we're here to keep people safe, we're here to fundamentally change how we approach security. It's not about them and us, it shouldn't be. It's just us. We are here as one; that's how we're going to change. That's the fundamental shift that we're driving towards. And we have to, because we're only going to rely on technology more.
[35:38] Andra Zaharia: So true. Thank you for articulating that so beautifully. It's people like you who pull this industry forward, who show that you don't have to go into hustle mode to build a business that's valuable, that you can build both a business and a good living while helping people, and that you don't have to compromise on one to have the other. And I think that's so important, because people see the distance between what we say and what we do. And when that distance is big enough, it's impossible to build trust. But before I go off on that tangent, I wanted to actually give back to you and ask if there are any particular moments or experiences where you've been on the receiving end of empathy from another person, in the industry or beyond it, that have felt like an inflection point or a pivotal moment for your journey; not just professionally, because I know it has a lot to do with the kind of person that you are.
[36:39] Joe Giddens: Yeah, I guess it would be joining CybSafe, the company, and meeting Oz, our CEO. He took a chance on someone with very little experience, a somewhat questionable attitude, but a lot of passion, I think that's fair to say. And he showed me real patience, empathy, understanding, and support, and has been fundamental in getting me to where I am today. And I'm very thankful for that. I'm very grateful.
[37:10] Andra Zaharia: That's wonderful to hear. I hope that many more people do the same thing for others. I can see that there's an undercurrent of people who are constantly doing this, who are constantly investing in others, whether through their companies, by hiring people, by mentoring them, or just by lifting them up in the community and giving them the self-trust they need to use what they already know and move ahead, just like you're trying to do with the people you serve: your customers and the people in their companies. Thank you for all of the stories and examples that you've shared, whether they were personal or related to your mission in the industry. I would love to keep this conversation going for a long, long time. But I'm going to leave it to listeners to discover your work and perhaps get in touch with you if they're interested in taking the conversation further. I do hope that we get another chance to touch base in the future, see how things have changed, and hopefully report on big improvements that you're contributing to.
[38:16] Joe Giddens: I would love that. And just to say, thank you very much for the opportunity. I've seen this podcast around the space. It's amazing what you're doing. I appreciate it.
[38:26] Andra Zaharia: I'm very thankful for that. Thanks so much, Joe!