People are critical to advancing cybersecurity on all fronts, whether it’s keeping an organization safe or building safer software. Using security software or rolling out an awareness course is not enough. You need to understand how people interact with the system and where following best practices fails them – and why.
This allows the creation of user-friendly policies that make people feel supported instead of hounded for their mistakes. A more empathetic approach to building relationships with cybersecurity (specialists, concepts, and practices) encourages people to ask for help when they identify a potential threat because they don’t feel judged.
Our guest today is Erlend Andreas Gjære, co-founder & CEO of Secure Practice, a Norwegian company that creates data-driven tools to engage, influence, and cultivate security within organizations. He specializes in security and people, focusing on security awareness, training and culture, human risk, behavior, and user experience.
In this episode, you will hear about the role of emotions in human behavior as it manifests and relates to cybersecurity, based on Erlend’s experience as a researcher. You’ll also learn why communication is one of the most important components of making things work in this space. Additionally, you’ll discover real examples that show why fear-based communication is ineffective in getting people to adopt safer behaviors.
In this episode, you will learn:
Connect with Erlend:
[00:41] Erlend Andreas Gjære: We did some user studies and found that very few people are asking for help because they fear feeling incompetent, or feel like they're wasting other people's time. They don't know who to contact, they're uncertain about “Is this my responsibility?” So much emotion going into a very specific problem they have at a very specific time.
[01:06] Andra Zaharia: Everyone has some sort of relationship with cybersecurity, whether it's conscious or not. If you sit and think about it, you probably have a feeling stirring, a trigger that makes you feel either good or bad about this topic, or just plain indifferent, and that's natural. Unless you work in this space or have positive experiences around cybersecurity topics, you probably don't have the fuzziest, nicest feelings about it. And again, that's really not wrong. But to be able to make real changes in behaviors, and to be able to use our cybersecurity know-how, whether it's minimal or advanced, we need positive reinforcement; we need to feel confident in our ability to protect ourselves. So, how do we do that? Well, first of all, by changing the way that people interact with cybersecurity products and measures, and especially with training and education in this space. And the topic of emotions and how they influence our behavior is exactly what I have discussed with Erlend in this episode. He and I debated a couple of interesting topics. And we really did a deep dive into his experience as a research scientist, because Erlend spent six years working in research before he transitioned to industry work and founded Secure Practice, a company that now develops services that help measure and manage human risk in a very different, privacy-focused way. Whether it's fear, shame, or guilt that you feel about cybersecurity, whether it's curiosity or indifference, I hope this episode will serve you well, because there are some very interesting realizations that Erlend has shared, and a couple of very practical ways in which we can improve how we—the people in cybersecurity—behave and interact with people whose job is not to keep technology, data, and people safe. So, I hope you'll take a lot from this episode, and I'll see you in the next one.
[03:43] Andra Zaharia: So, Erlend, I am so excited to talk to you today. We've been exchanging messages for almost a year now on topics that are very close to our hearts, such as the role that emotions play in cybersecurity. And I'm really glad that today we're going to get to debate them. So, thank you for being here and thank you for sharing with us.
[04:03] Erlend Andreas Gjære: Well, thank you so much, Andra, for inviting me and wanting to discuss this topic. Emotion is such an interesting part of actually what makes us human. And the human side of security and working with people, it's really so much more than the technology, it's so much about people, and then people are so much about emotions — what actually makes us human and how we respond to anything really in life. So, life is much more complex than a simple formula of “Just do this, do that, and then you're safe.”
[04:37] Andra Zaharia: I couldn't agree more, especially because I feel that in the past years, a lot has changed in the way that we see technology and the interactions between humans and technology. And we now acknowledge a lot more that we can't really transform everything that humans do into formulas, because people introduce so much randomness, I guess, and so many unexpected types of behaviors and reactions, that it challenges us to think differently, to do things differently, and to do more meaningful work at the end of the day. And in talking about this kind of meaningful work, I wanted to dive a little bit into your background: how you personally got invested in studying this area of cybersecurity and how you ended up working and building a company in this space.
[05:27] Erlend Andreas Gjære: I think the word you used a couple of times there, “different,” is one of the really big factors for my mission, personally, in the security space. I have a background in technology, from informatics, and I worked as a research scientist on usable security. Based on my human-computer interaction background, I wanted to also apply this in security and privacy. But as a research scientist, I also got the opportunity to work with my own colleagues at the research foundation I was in. I had 2,000 colleagues at the time, and nobody was doing anything on security awareness and training. This was 10 years ago; it wasn't that common. It was all about firewalls, encryption, backups, antivirus, and such. Security wasn't really mature on the organizational side. I kind of took on the challenge of trying to deal with this human risk, if you call it that, trying to help people stay secure. And I got really intrigued by how different people are. With 2,000 colleagues from 70 nationalities, I got to see all the diversity among people. And trying to engage all of them was a really big challenge for me. I was kind of fresh out of school, with a couple of years of working experience, and trying to approach this big group of people I didn't know very much about, trying to tell them how to stay safe, was a really steep learning curve. It was also a very big eye-opener in terms of how different people are, and how human people are, in fact.
[07:09] Andra Zaharia: And I think that this is something very important that speaks to some of the choices that companies make, because it's so much easier to think that if you just throw more money at technology, it's going to alleviate a lot of the risk. Obviously, technology is easier to implement even if it's incredibly complex, whereas working with people, just like you mentioned, people from so many backgrounds who respond to so many different stimuli and triggers, is just 1,000x more difficult in general. So, what did you find appealed to people the most in those early days of your work toward building a security-aware culture in companies?
[07:53] Erlend Andreas Gjære: Well, the first thing I learned is, of course, we often say in security, “Yeah, you need top-level commitment, management commitment.” It always sounds a bit controversial when I say it, but one of the experiences I had was that I had to do a lot of work to get top-level commitment, but I had a supporter, a champion, so I was allowed to do my work. But building security culture bottom-up was so important, at least there, because when we surveyed people about their main motivators for secure behavior, I think only about 2% said, “Management telling me.” So, it's something about how intrinsically motivated people are, or not. We cannot simply tell people to do X and expect they will do X, even though they are employees in our company and the company policy says so. We actually need to understand people to be able to have an influence on their behavior and try to help them stay safe.
[09:01] Andra Zaharia: I couldn't agree more. And this reminded me of something that I learned in therapy, actually: adults do things because they want to; only children do things because they're told to. So, we're basically trying to treat adults like children in some of these educational initiatives, and that's not going to work, because people don't want to be treated like less of a person, or like incapable people who need hand-holding. And that's where positive reinforcement comes in and plays a huge role, doesn't it?
[09:36] Erlend Andreas Gjære: It may be a natural outcome of how many engineers have actually been working with security over the years. It's been mainly an engineering topic, even though security is so much more than technology. But the consequence of having engineers who are used to thinking just like me, like one plus one equals two, is a challenge when you face the complexity of people. But yeah, your parallel to children, and raising children, is also very interesting. My wife is a teacher. She also co-founded Secure Practice with me because we wanted to build something different. We wanted to build a security company with a focus on people. And not simply by using technology (although we build technology to help mitigate human risk), but by trying to understand how we can help people. As a teacher, of course, you have a role where you could tell children what to do, but is that the best way of bringing up children, just telling them what to do? So, I think, also from this area, this discipline, we can learn so much, like the multidisciplinary approaches to working with people in security in organizations. That is so important; we cannot simply leave this to technologists. It's way too important and too complex for that. Having many disciplines involved, like psychology, sociology, journalism, or communications, is a very big win for the security area, and something we should really strive for when we recruit and build a security organization.
[11:22] Andra Zaharia: I really love that you emphasize this approach, especially because I think that it really plays into the hacker mindset. Most technical people who are really passionate about this space, not just as a money-making job but as a personal mission, are actually multidisciplinarians; they learn a lot from various fields, they're very open-minded, they're flexible, and they work a lot on cultivating their critical thinking. So, what we're trying to encourage everyone to do is to see that they can make a contribution no matter what their role is, and that the way they work on their self-development applies to how they serve people through their jobs as well. It's the same thing: you apply your learning model to try to serve other people's learning models, at the end of the day. And I was wondering, because we're talking about nurturing this critical thinking mindset, what was something that worked for you? What kind of experiences did you find helpful in evolving your role in the industry?
[12:26] Erlend Andreas Gjære: In terms of moments of realization, I think one of the first things I did when I started doing the security awareness work at the research company was trying to build a learning curriculum for people. And then I wanted to communicate this, make it accessible and attractive, and get people to actually volunteer to spend time on it. But I really had a hard time getting through to corporate communications to be able to actually send a message to all employees. I would only be allowed to publish a story on the intranet, and then people would practically have to look it up. The results reflected how many people actually followed up on that. And this is where I discovered the thing about simulated phishing, which is a very interesting topic, and I think there are many opinions about it. But I was fresh out of school, almost, and I thought, “Oh, well, this is cool. Let's try to send a fake email to everyone,” because I wouldn't be allowed by corporate communications to send this message to everyone. So, I was kind of hacking the communications protocol in the company by just doing this. I got manager approval from my champion, but apart from that, nobody knew. What we ended up with was a massive amount of feedback. I got like 250 emails when we did this exercise with 2,000 employees. I spent two or three days simply responding to emails of various kinds. Some people were really enthusiastic: “Wow! You actually tricked us. We never expected this. It was really fun. You really got me.” Some positive, of course. Some were like, “Why did you waste my time on this? I'm not interested.” Of course, you can understand people's emotional response. Some were like, “Yeah, I got this parcel notice, and I went to the post office, but they didn't have a parcel for me. But then I found that it was a phishing link, and you sent it. So now I've spent an hour at the post office and I didn't get anything.” They were like, “Yeah, I wasted my time, and it was just before Christmas.”
[14:53] Erlend Andreas Gjære: I spent three days just answering people and trying to empathize with them, because I really had to connect, and use the opportunity to connect with people. I didn't know the explicit term “empathy” at the time, but I intuitively knew: “Okay, we did this. We didn't have any cover, we just did it. Now, I just have to make sure we don't blow up everything and make a lasting poor impression.” Rather, I found that responding to people and empathizing with their situation and their experience was a really, really good way to connect. So, going from a one-directional monologue from the security department, we actually got to connect with 250 people who were somehow engaged with security. And here's what I learned later on, when I read Nonviolent Communication by Marshall Rosenberg, which is my favorite book of all time: the art of empathy is that only one of the parties has to know it to make it work. So, as long as you can start the conversation with empathy, then you will affect the other party. That was not a very empathic way to say it, but you will get the response that will bring you both forward. And I think I learned a lot personally from doing the hard work of responding to people and connecting in this way, both positive and negative. That was the big moment for me, when I learned how different people are and how powerful empathy can actually be.
[16:32] Andra Zaharia: I find this kind of one-on-one work brings us the most insights, and the most valuable ones, because when we talk about communication, quantitative data will only take us so far; we need this kind of qualitative research and approach. We need to understand all of these nuances to be able to understand what group of people resonates with what kind of messaging, what kind of practical experiences, and so on. So, thank you for sharing that. I find that this kind of honesty about our shortcomings, our feelings, and our experiments is so important to talk about, because otherwise, from the outside, it may seem like everyone knows what they're doing from the start, which is not true. We're all experimenting at one point or another. And there are corners of this industry that tend to over-promise on some things, and tend to disregard the kind of difficult work that you just mentioned: replying to hundreds of emails, taking the time to understand why people do what they do, and seeing how to use that information moving forward. So, I was wondering, what were the changes that you made as you progressed through your career, based on that initial experience?
[17:52] Erlend Andreas Gjære: Well, for one thing, it triggered my interest: “Okay, how can we go about this thing, security awareness and training? And how can we really reach people?” And yeah, we can make a big boom. But what next? We got the attention. Now we have it. What next? How do we actually increase risk understanding, increase interest? I came up with this hypothesis, and I still think it kind of holds: if I had unlimited time and resources, what could I do to actually make an impact on people in terms of their security and risk behavior? I tried starting this experiment, and it was actually what you said: one-on-one. If I spent 15 minutes with every single one of my colleagues, talking to them, would there actually be any more efficient way to increase security awareness and competence? The advantage of doing this, obviously, is that you can actually meet people where they are and start a conversation. I did this with 40-50 people; it takes a lot of time. But what I found was that everyone has some kind of relationship with security. It has an impact on everyone's lives. And finding just the common ground, finding something you have in common as a starting point for a dialogue, that is an ideal place to start, I think. And I really wanted to continue, but then people are like, “Yeah, but Erlend, it doesn't scale.” Yeah, I can't do this 2,000 times. Maybe I could, but I would spend a year. I calculated how much time I would need: about a year and a half to reach all of them, including a half-hour break every day, and not doing anything else.
[19:49] Erlend Andreas Gjære: But then, everyone has personal experiences. Somebody was hacked or scammed, or they have this policy in the company where they're like, “I don't follow this,” or “I don't understand why.” There's so much emotional expression in just having these conversations. People have these emotional responses relating to the topic of security. Since I was also a researcher, I went to the literature, and I found this really great study, a paper from a research group at University College London. It was called something about rule breakers and excuse makers, and they did a very similar thing. They didn't do new interviews, but they went back into the raw material from various interviews from security studies, and they tried to identify expressions of affect and emotion in the dialogues with people, and I found this really, really interesting. So, we have all these emotional variations among people: if you bring up security as a subject, people have fear, they have uncertainty, they are apathetic, or they are annoyed with the compliance policies in the company, or they are champions who may be naive but positive still. There are so many variations here. And they brought up these names; they also tried to connect this with the level of risk understanding. So, you would have these personas, names for stereotypical people: you have the champions, the apathetic, and the ignorant. Some of them are controversial and not very thoroughly founded in the research, but still, it's a very intriguing idea. We have all these words for emotions that affect people in their relationship with security.
So, long story short, this really brought me to thinking that maybe there could be something more to doing research in this area: how can we relate to people's emotions, find this common ground, and then take them from there and help them reach a higher level of risk understanding?
[22:17] Andra Zaharia: Thank you for describing that and for adding all of these rich details. I think they make a huge difference, especially for those seeking some facts and specifics to start with, or to enrich their own practice of whatever it is that they are doing, whether in a technical role, a leadership role, or anything else, because, yes, the conversation around emotions, human behavior, and cybersecurity is fairly recent. I think that we may tend to forget that. And I feel that this conversation coincides with the role that cybersecurity now plays in society at large, because it's such a huge factor in global stability that you can't ignore or sweep its shortcomings under the rug and just say, “Yeah, we're handling this through technology.” That cool-headed, emotionless approach only works for some aspects. For the rest of them, you need that human touch; you need to understand people and where they are to be able to factor this into their education at the end of the day. This continuous education that we all have to undergo simply because we have to live with technology. And we depend on technology to improve our lives, to improve our careers, to take care of our families. It's not something that we can just shut the door after and say, “I'm leaving technology at the door.” That hasn't been an option for a really long time. You can't compartmentalize things like that anymore. So, when that comes into play, all of this emotional universe opens up. And I know that some people get the heebie-jeebies when we talk about these things; they're like, “Ah, that's too touchy-feely for me.” But is it? Because when we read a bit about neuroscience and educate ourselves, we understand that emotions are actually essential to decision-making, and that as humans, we cannot make choices if we don't have the capacity to experience emotions tied to those choices.
So, even the most cool-headed, logical person has emotions, and they influence them quite a lot. What is it that, for example, you and your wife took away from her background as an educator that you integrated into your work? What kind of triggers, behaviors, or learning patterns have been integrated into your work at the company that you've built?
[24:50] Erlend Andreas Gjære: So, I mentioned the book; she was the one who bought it and showed me the part about empathizing with people. And I think that was a major eye-opener, just putting a word on everything, one of those life-changing books to read. I think the part about helping people is very universal, something all companies somehow strive to do. It's a very high-level thing. But still, in cybersecurity, we've been very used to security technology just doing the work in the server room, and it's all techie. And yeah, I still meet people who are skeptical, people who don't believe that there is actually potential in working with people: “It's just bad. Let's just try to make technology work so that people don't have to stay safe on their own.” From the pedagogy side, the helping-people side, there's understanding people and their special needs, to use that word from education. If someone has special needs, you cannot judge them for that. If we think about people's needs: what do you need? How can we help you? I think this has triggered me to try to find ways to help people in situations where they would otherwise feel helpless. In technology and security, people often find themselves a bit helpless; either it's “common sense” or it's, “Oh, this techy stuff is scaring me.” Technologists have been so good at using fear-based communications, like, “We are under attack!” And that's not very sustainable. You can provoke a reaction using fear, but it's not very predictable which reactions you get. Some people may actually just back off: “I'm not concerned about this,” or “What's the chance of anything like this hitting me?” But I think there are other emotional sides we need to interface with, rather than fear, to reach people and help them. We started the company with one very specific use case scenario where many people feel a bit helpless: when they're facing a suspicious email and they don't know if it's legit or a scam. Maybe it's important, maybe it's dangerous. How do you actually go about this?
[27:40] Erlend Andreas Gjære: We did some user studies and found that very few people are asking for help because they fear feeling incompetent, or feel like they're wasting other people's time. They don't know who to contact, they're uncertain about “Is this my responsibility?” So much emotion going into a very specific problem they have at a very specific time. “I have this email. Okay, maybe I just leave it here and hope things work out.” Very few people dare ask a colleague, maybe the one next door in your office, where you can have a discussion, which is very good. You're taking the first step to engage in a skeptical way, and if you don't find the answer there, you're much more likely to engage with the next step as well. But just making it more approachable and friendly, I think, was very critical to our mission when we started developing this service, which is now called Mail Risk. It's a button you can click in your email application if you receive a suspicious email. It's not a button for simply reporting an email, which is pretty commonplace these days, but a function that analyzes the email based on technical signals and also on other people's suspicions. So, we use crowd-sourced input from users and analyze in real time. Our goal has been all along to give people an answer, and we promise people: we will give you an answer. Either we do it automatically (we have a rate of automatic responses and detections on things that have passed through the spam filter), or, as a fallback, we allow people to say, “Okay, you're not sure? Just click here and we'll do a manual investigation.” This is where we have built a network of partners who do these manual investigations and provide the user with a very good response within just a few minutes. They don't have to interface directly with a human on the other side, because many people fear this. We've lowered the threshold for getting help, which is so important.
[30:03] Andra Zaharia: Incredibly important, plus it gives them information that they can use later on. It's not enough if you report an email and never hear back; you never know what happened to it, you never know if your hunch was real, whether it was fact-based or not. You never get better, you never get smarter about these things in the future. Actually giving people a clear answer, a clear benchmark, a clear example that they can use in the future makes a world of difference, because it builds their self-confidence. It makes them feel like, “Oh, okay, I know this now. I acted on a hunch and that was correct.” And that actually improves your intuition in the long run. I feel that we're so fascinated, both in the industry and outside of it, with how people misuse technology, how all of these big hacks happen, and how all of these teenagers manage to break into companies. But we're not nearly as fascinated with people who do quiet, good things that work, that aren't that spectacular, but that help the ecosystem be safer overall, whether it's a bit of technology or a particular kind of behavior like the one you're working on. And I feel that that needs a lot more visibility, a lot more talking about, a lot more “Let's look at how this works, and why it works, and how it makes things better,” because it improves things for everyone and serves as a positive example we can show people: “Hey, this is actually doable. It's not the big, bad, scary, intricate thing that's not for you; it's actually for you.” Because, just as you mentioned, everyone has some sort of relationship with technology and security; we just need to explore what that looks like. And I feel that that's so important to mention and emphasize because it defeats the stereotype that people don't care about security. We don't understand exactly how people care about security, but perpetuating that stereotype really helps no one.
[32:16] Erlend Andreas Gjære: Absolutely. I think the word “self-confidence” is really essential here. It's also called self-efficacy in science, which describes to what extent you believe in your own capability to do something. Do you believe you are capable of enabling two-factor authentication on your Facebook account? Do you believe you're able to spot a phishing email if you receive it in your inbox? All these questions will reveal how confident a person feels, but they also reveal the uncertainty. And I think finding these uncertainties means accepting them for what they are, because enabling two-factor authentication is really complex. Spotting a phishing email is really complex, actually.
[33:12] Andra Zaharia: Yeah, just telling people to go do it is really not motivating at all; it's not nuanced. It doesn't feel like there's something rewarding at the end for every other person we know. We're not promising them something palpable that will make their lives better; we're just telling them, “Hey, you should do this. Just do it. No questions asked.”
[33:36] Erlend Andreas Gjære: I don't mean to be overly critical, but we have had national and international campaigns on “stop, think, click” and reporting emails. How do you actually help people with that? Should you be suspicious all the time and report emails as well? There's a lot of “what's in it for us as the IT department,” but what's in it for you, the user, to report an email? That's one of the big opportunities we saw. If we can actually motivate people and include them on the journey, going from “Okay, maybe this is difficult, but we can help you,” then we can build this self-confidence, self-esteem, and self-efficacy. Then people will experience, “Okay, now, actually, I'm pretty good at catching these phishers. So, now, I think I could be capable of enabling two-factor authentication as well.” So, impacting people's lives in a positive way, and giving feedback. Feedback is really important.
[34:49] Andra Zaharia: Yes, giving them that visibility, giving them that sense of control over the things that they do, especially because we all need a bit more of that sometimes, honestly, especially when things get tough. I have a question for you that actually comes from a conversation that we had on LinkedIn a couple of months back. Do you believe you can make a good business case out of empathy? And if yes, how do you see making this kind of business case to people who are more focused on their personal KPIs and their objectives in terms of growing the business? Empathy feels like one of those touchy-feely things that is difficult to translate into a financial or business benefit.
[35:38] Erlend Andreas Gjære: I guess empathy has somehow become a buzzword, which is kind of good; it levels up the conversations on understanding people. As for the business case, it's “What's in it for the company?” Well, happy employees are more important than ever. You want to maintain a good relationship with your colleagues and employees; people move between jobs maybe more than ever. I think we're building on universal values here to increase people's happiness, job satisfaction, and also their competence in security. So, investing in these areas is just a simple business outcome for any organization. It's universal. When we built the business case for our own approach (because we've been developing our service, and we were never meant to be an email security company, although we have to do some machine learning to provide the help we want to give), we've been developing a platform for measuring and managing human risk. We've been doing this because we saw it necessary to move away from the very transactional nature of how security awareness and training is done today. We're in October, there's a national campaign, and everyone has to complete their e-learning before the 31st of October, or your manager will review it. Here I come knocking on your door, saying, “You have to do this compliance thing because I told you to.” And we know people don't like being told what to do; they do it anyway, but they become annoyed. For some people, it's actually counterproductive to bother them just because companies want their KPIs in order; we get a less secure workforce because they have more negative emotions toward security.
[37:49] Erlend Andreas Gjære: The same goes for phishing simulations. Yes, we built the platform to do this as well. It's not that we shouldn't do it, but you really have to be conscious about why you're doing it. I think this is also where technologists who play around in the IT department feel a lot of power in making a power move like, "Yeah, now we're going to phish our employees." It's really not a nice way to be a colleague, and it's not a good way to go about building a positive security culture either. People will be afraid, frightened, and embarrassed if they are tricked by a phishing simulation. What we have been focusing on is moving away from, "Okay, you failed the phishing simulation, now you have to do the mandatory training, or else we'll tell your manager, and we know who you are." Since we're a Norwegian, European company, we had this privacy mindset from the beginning: building privacy-friendly software. And this is also a bit of a touchy-feely thing; reducing regulatory risk is what the legal department will say. We are using privacy to increase trust, so that people will actually trust, and maybe even love, getting help from our system because we don't reveal who they are. We don't reveal that Erlend has failed a phishing simulation. And I actually did fail one this summer, with our summer interns. On their first day at work, they sent me a phishing email that was so targeted. I was just going to a conference, and I got this post-conference notification right when I was leaving. They were like, "Yes! We got you."
[39:34] Erlend Andreas Gjære: Everyone can be tricked, even me, of all people, who has sent all these phishing emails. But it's about building that trust: you will not be exposed as an individual. We move away from the very transactional nature where we accuse you of being wrong, or doing wrong, or not doing what you should, and instead just try to measure this risk, because it is a risk that people click phishing links, and it is a risk that people aren't doing any training. Instead of identifying people so you can go and nag them about it, we just allow you to measure. And then, through our platform, you can also do targeted interventions based on anonymous groups of people, maybe even based on risk scores, because we do individual risk measurements, but we don't expose them to our customer, the employer. We did this privacy work together with the Norwegian Data Protection Authority, actually, for a year. It's called the Regulatory Sandbox for Responsible Artificial Intelligence, which is a long name. But there is a public report available about all the privacy work we did, where we talked to the Data Protection Authority about legal risk, and we talked with employees and companies to learn, "How would you feel impacted if you were actually faced with this kind of system, these measurements, follow-ups, etc.?" We also did a workshop with the Norwegian [41:10 inaudible] for equality and non-discrimination, and with labor organizations. So we tried to actually ask all the difficult questions: How can you measure risk without harming people? And I think "without harming people" is the keyword here, because organizations could quite easily do things that would, in fact, harm people.
[41:32] Andra Zaharia: That's true, especially since we spend so much of our life at work, and so much of our identity is tied to what we learn at work. So many individuals develop their sense of self and their maturity at work. It's our playground; it's the space where we grow as humans. There's so much talk about balance, obviously, and that's a good thing, but we cannot deny this connection, this conditioning of ourselves as humans through the environment we have at work and through the colleagues we interact with. So I'm really glad that you, and people like you, are doing work that's different, taking different approaches, questioning and challenging the status quo, and providing options for people to build different relationships with security, to experience the positive spectrum of emotions and engage with it, and to actually get some sense of reward, significance, and self-confidence out of these experiences, so that we might slowly change the way things are done in the industry. It reminds me of something I got from Seth Godin, which has stayed with me ever since: people like us do things like this. When we change the kind of things that people in security do, and the way we do them, it changes our role in general as well, and I think it makes us a bit better as humans, for ourselves and for others, through the work that we do. So I'm really glad that I got to talk to you about all of these things, and that we went into so much detail. I think this is such a valuable and powerful example for other people who are perhaps looking for different approaches, different ways to contribute, and different ways to think about their work.
[43:36] Erlend Andreas Gjære: Yeah, we just talked about metrics, and I think we often get what we measure. This is often how metrics go wrong, because we measure things that make people take alternative routes just to hit the metric they need. But we focus on two particular dimensions, inspired by the paper I mentioned (we also did some research in this area). Competence, meaning knowledge and skills, is one thing. But then affect, the emotional side, is the other one. Am I actually measuring whether people are happy about security, whether they feel confident, their self-efficacy? We measure this. That brings a new dimension, and it's really positive to build on this positive approach. So I think we often get the results we deserve, and we really, really need to advance this field. I hope we can make a contribution here.
[44:43] Andra Zaharia: Yeah. I can't wait to see what you do next, Erlend. Thank you so much for today, for this conversation, and I hope that we might be able to return to this topic a while from now and share with people how things have evolved for the better.
[45:00] Erlend Andreas Gjære: Thank you for the invitation and conversation.