
Cyber Crime Junkies
Translating Cyber into Plain Terms. Newest AI, Social Engineering, and Ransomware Attack Insight to Protect Businesses and Reduce Risk. Latest Cyber News from the Dark web, research, and insider info. Interviews of Global Technology Leaders, sharing True Cyber Crime stories and advice on how to manage cyber risk.
Find all content at www.CyberCrimeJunkies.com and videos on YouTube @CyberCrimeJunkiesPodcast
Shark in the Water! Quantifying Your Chances of Getting Hacked 🔥
🔥New Episode🔥 Former Intelligence Officer, Dan Elliott, current head of Zurich Resilience, joins us and we discuss:
🎯new ways to fight cyber crime,
🎯how to get prepared for a breach, and
🎯how to manage risk in small business.
Growth without Interruption. Get peace of mind. Stay Competitive-Get NetGain. Contact NetGain today at 844-777-6278 or reach out online at www.NETGAINIT.com
Have a Guest idea or Story for us to Cover? You can now text our Podcast Studio direct. Text direct (904) 867-4466
🎧 Subscribe now http://www.youtube.com/@cybercrimejunkiespodcast and never miss a video episode!
Follow Us:
🔗 Website: https://cybercrimejunkies.com
📱 X/Twitter: https://x.com/CybercrimeJunky
📸 Instagram: https://www.instagram.com/cybercrimejunkies/
Want to help us out? Leave us a 5-Star review on Apple Podcast Reviews.
Listen to Our Podcast:
🎙️ Apple Podcasts: https://podcasts.apple.com/us/podcast/cyber-crime-junkies/id1633932941
🎙️ Spotify: https://open.spotify.com/show/5y4U2v51gztlenr8TJ2LJs?si=537680ec262545b3
🎙️ Youtube (FKA Google) Podcasts: http://www.youtube.com/@cybercrimejunkiespodcast
Join the Conversation: 💬 Leave your comments and questions. TEXT THE LINK ABOVE. We'd love to hear your thoughts and suggestions for future episodes!
Chapters
- 00:00 Introduction to Dan Elliott and His Journey
- 02:59 Transitioning from Intelligence to Cybersecurity
- 05:55 Moving to Australia: Opportunities and Challenges
- 09:00 Using Analogies to Explain Cyber Risks
- 12:03 Understanding the Cybersecurity Landscape in Australia
- 14:58 Quantifying Cyber Risk for Businesses
- 18:05 The Importance of Incident Response Planning
- 21:05 The Role of Regulation in Cybersecurity
- 23:57 The Future of Cybersecurity and Business Preparedness
- 32:27 The Importance of Layered Security
- 33:43 AI and Social Engineering: A Double-Edged Sword
- 38:04 Deepfakes: The Emerging Threat
- 40:24 The Reality of Remote Work and Deepfakes
- 43:29 Detection Challenges in Cybersecurity
- 47:09 Quantifying Risk: The FAIR Methodology
- 51:32 Preparing for the Inevitable: Knowledge is Power
- 52:23 The Shark in the Water Analogy
- 56:18 Evolving Perspectives on Cybersecurity
Topics: New Ways To Fight Cyber Crime, How To Get Prepared For A Breach, How To Manage Risk In Small Business, How Business Can Fight Cyber Crime, Day Of Data Breach, How To Fight Back Against Cyber Crime, How To Handle A Breach, How To Measure Tech Risk, How To Prepare For A Breach, How To Prepare For A Data Breach, Risk Management, AI, Dan Elliott, Risk Appetite, Cyber Crime Junkies, Incident Response
Speaker 2 (00:10.606)
What do you think cybersecurity really is? I'll tell you what it's not. It's not firewalls, technology stacks, and strong passwords. I mean, look, 100% of breaches this year had what in common? They all had firewalls, they all had passwords, and they didn't stop anything. Cybersecurity is about a holistic view of managing risk. It's about knowing exactly how much a breach would cost you
in dollars, in trust, in survival. In this episode, I sit down with a former intelligence officer who's now helping small to mid-size organizations decode the true cost of cyber risk. And that's exactly where most organizations are dangerously exposed. So if you run a business, you need to hear this, and we hope that it resonates. Small talk sucks, so let's dive in.
Speaker 2 (01:10.584)
Catch us on YouTube, follow us on LinkedIn, and dive deeper at cybercrimejunkies.com. Don't just watch, be the type of person that fights back. This is Cybercrime Junkies, and now the show.
Speaker 2 (01:30.318)
Alright, welcome to the studio. I am here today with Dan Elliott, a former Canadian intelligence officer who today serves as head of cyber resilience for Australia and New Zealand for Zurich Resilience Solutions Australia. Dan is a legendary storyteller, able to translate complex risk into plain terms. Also a proud dad and incredible character creator,
as seen in his LinkedIn carousels and stories explaining various complex cybersecurity risks, ransomware gangs, and common exploits, all in character-driven short form. Very concise, very cool stuff. Dan Elliott, welcome to the studio.
My God, that intro, how can I live up to it?
Dude, I will always, you just bring me along to your meetings and I will be like, hang on, before he starts: do you know who he is? Let me, let me explain. You are in Australia and I am not. And this is the magic of technology. You know, it's just incredible. What time is it there? So that everybody can have a good sense of this.
Speaker 1 (02:42.87)
Nine o'clock at night, 10 o'clock at night, something like that. I try not to give the time because it's all relative.
It is relative. It's 7 a.m. here. So you're welcome. That's good. I'm an early riser, but I'm not usually doing a podcast at seven. But no, this is good. So, man, tell me what brought you here. Last time we spoke, and you and I have obviously stayed in touch, you were living your life in Canada and were essentially a police officer, then an intelligence officer.
You know, kind of walk us through that, just generally. I know we know each other and you've been on before, but for those who don't watch us all the time, which is probably almost everybody, could you just kind of update us on your trajectory? And then I want to get into some of your insight and some of the recent things.
Yeah, sure. So, 17 years with the Canadian government, six of which were with CSIS, the Canadian Security Intelligence Service, which, for the Aussies listening, is kind of ASIO and ASIS; for the Brits, it's MI5 and MI6. And for Americans, it's kind of an FBI, CIA, DOJ all slammed together. And I did two roles there. I was a field officer, so I worked out in the field and ran sources, did operations. And then I was a
risk manager for the operations in-house, and I was doing translation there: assessing risk, managing it, and explaining what we were doing.
Speaker 2 (04:16.406)
It's so true, because of the importance of the initiatives, to get initiatives done, to get an operation completed, you can't get lost in the weeds. The vision and the mission have to be clear from the top down. Sorry, I didn't mean to interrupt.
You're, that's exactly.
Processing out loud, man. It's what Myers-Briggs tells me. It tells me I do good. I don't know.
It is what it is. We had people who were thinking about whether or not they wanted to have their name attached to things. We had field guys who truly believed in every single thing they wrote and every single thing they did. And you had to find a middle ground, right? So I spent a lot of time, when I wasn't working in the field, trying to find ways to help field officers explain what they were doing
We have to talk.
Speaker 1 (05:05.398)
in a way that would get buy-in from senior leadership and that would make, you know, as in that old saying, the juice worth the squeeze. The risk had to make sense. And when I moved into the private sector, it was a situation where I'd gotten to a point in my career where
you had to be 100% in. I mean, I had friends who had lovely ex-wives, and that really wasn't my trajectory. And I had other friends who had just fully bought into the constant moving and the lifestyle, and it really wasn't where I wanted to go. So I flipped over to the private sector, and I was essentially the equivalent, I guess, of a field CISO for Zurich. The big insurance company had an advisory and consulting group,
and I ran their cyber division for the country. So I did that for a couple of years, which was a lot of fun. And I got to work with all sorts of industries, which, having...
You ran it for like the majority of all of Canada? You were driving it. Well, you're a lot more important than I even thought you were. I'll have to be nicer to you off stage. That's good. Even though there's only like 20 of you up there, from what we hear down in the States; there's not that many Canadians, apparently. So anyway.
And I would head for
Speaker 1 (06:10.926)
Sounds good.
Speaker 2 (06:25.654)
Tell me, just this past year you made a segue. You're still with Zurich, or a different division, maybe a different organization completely under the same parent, but they moved you and your family all the way down to Australia, which is the other side of the world for us. So what drove you to do it? What's the opportunity? Walk us through what your team is like, what you all are doing.
So it was a great opportunity. My wife and I had always loved the idea of living down here, but it was kind of a retirement plan. Freedom 55 was I'd end as head of station in Australia, and then I'd just stay, you know, come and find me. But when I left, it was kind of figuring out how to put the pieces together so that I could make that move.
And when the company decided that they wanted to replicate what we did in Canada in Australia and New Zealand, I was very quick to put my hand up, and they were happy to offer me the role. We have family up and down the coast here, so we'd been here a ton of times. We knew the area. I tried very hard to network into the CISO community here. It's been wonderful.
You do a lot of public speaking. You're doing a lot of events, producing a lot of content, really driving the message, which is very consistent with everything you've been doing since I met you. So just keep it up. I will tell you, I have used the bear, not Bear in the Big Blue House, the bear in the woods, as an analogy, because it is so applicable. Right? For those who may not know: hikers going through the woods, it's dark, the terrain is rough, there's a bear.
That bear, we can never outrun it. We can climb trees; it can kill us with a single swipe. But, you know, how do we get through the woods? That bear is the analogy of a threat actor or hacker, like a black hat hacker. So very, very applicable. Everybody can relate to it. And you explained that Zurich uses it often to explain complex risk to boards. Walk us through that. Tell us how you've done it. And does your bear have an outfit?
Speaker 2 (08:34.914)
Did you like create a 3D image of the bear and put it on LinkedIn? How's it working?
So coming down here, I've had to switch because there are no bears in Australia.
So what is, is it like a, is it a Joey? What is it?
It's a shark in the water now. I had to really bring it to life.
You gotta have a new story. We can still relate to sharks. That's why we don't go in the water. We just put some tan oil on and sit on the beach.
Speaker 1 (09:02.092)
Well, that's no longer an option, right? So, I mean, it's part of that cognitive dissonance, right? If I start to explain things in terms of technical controls and black hat hackers and the tools they have and the controls we have to put in place, I lose people very quickly. But if I tell a visceral story, even if people haven't seen a bear and just have a concept of it, they have a picture in their mind immediately, and it allows them to attach something, you know,
emotionally triggering, visceral, funny, whatever they picture, to it. And when I came down here, I immediately realized that there was something lacking in my bear analogy. So I switched. Most of the people that grow up here near the shore are told some variation of the story that you can swim wherever you want, just make sure there's somebody slower and larger than you that's hurt.
So they are told the same message from childhood in Australia. That is a great audience, because that is perfect, right? When I explain that to people, they always feel relieved, because rural health care here in the Midwest is just pummeled by cyber attacks, enterprise environments with nonprofit budgets, right? It's just so hard,
and they're doing everything they can, but they don't always have all the controls. They're not as prepared as they need to be. And they get sold by vendors that talk gigabit and, like, we will stop breaches and we will do all this complex stuff, and it doesn't resonate, right? We have to tell a story. I'll tell you, when I've talked to them, I've just explained: it's like a bear.
The bad news is it'll kill us, essentially, and we can never outrun it. Right? That's the fear, uncertainty, and doubt, the typical security awareness training. But the good news is we don't have to outrun it. When we see that bear coming, step one is lace your shoes up. And then when the hiker next to you goes, dude, what are you doing? You're not going to be able to outrun that bear. You just look at them and go, I don't have to. I just have to outrun you.
Speaker 2 (11:13.964)
You, buddy. And then go, right? And that's really what it is. Because when we think about the modus operandi and the mindset of hackers, right, they're not targeting, unless it's like Change Healthcare or Nike, then they're targeting them; MGM, they're targeting them, of course. But the majority of the breaches are occurring in the SMB space. Those are not household names. They are not going after ABC Manufacturing in Cincinnati.
They are going after every small organization that has a certain open vulnerability so that they can get in, whatever the easiest path is. And so all you have to do is make it obnoxious for them. Move the needle a little and they will move on, you know, to somebody less prepared. Is that what you're seeing? I mean, you see global data and insight, but that's so clear to us.
Yeah, yeah. Coming down here, the biggest shift I saw was, well, there's two. The first one is the markets. They're a hell of a lot bigger. I mean, I haven't fully moved into my office here, but I share it with a huntsman spider that's about the size of my
The bugs are bigger, right? Like the bugs are.
Speaker 2 (12:30.36)
Size of your hand. You're like, you keep the den. I'm going over here.
Yeah, you stay above seven feet, I'll stay below. But the breakdown of businesses: about 97% of the businesses down here are under 60 million in revenue a year.
Yeah, so that's... Yep.
And also, they're very tech forward, they adopt technology fast, but they don't adopt security nearly as quickly.
It's like a startup. Whenever you talk to a startup, they're investing in every type of gadget and app and data analysis, and nothing on security. And you're like, it's okay for a little bit. It's everybody's risk appetite, right? But boy, it's gonna fly too close to the sun very quickly. So no one's gonna eventually buy it if you don't eventually have those controls, right? You're not gonna get the money that you really deserve
Speaker 2 (13:28.344)
for the thing you're building.
Yeah, and you said one of the key things right there. I've hit a lot of what we call the middle market, and it is that risk appetite. In a lot of organizations, they haven't outlined that. And in cyber, one of the soapboxes that I stand on a lot is quantifying risk, because they can't develop an appetite off of red, amber, and green.
Yeah. We just always use the SMB space.
Speaker 1 (13:59.552)
So, you know, if you haven't understood what your risk appetite is, then you don't even know how much you should be spending on security. And so you spend nothing or you spend too much, you know, and it ends up.
Or you spend it in the wrong place, right? You buy some tool or some system; meanwhile, you haven't invested in policies and practices, and at least done an incident response plan and a tabletop. You're like, well, that seems more advanced. I'm like, no, before going after the bells and whistles, why don't you start at the beginning and at the end? Like, prepare for the end, prepare for boom, right? And start at the beginning with a clear,
as best as you can, quantification of the risk, possibly with an assessment, because then they can kind of see and know what the risk is. Every other department can do it, right? HR can say 11% of our employees are looking for jobs; sales can be like, we're going to crush our quarterly number by 12%. And then cyber or IT goes in and we're like, we're yellow. What are they supposed to do with that? That's not translating. That's not explaining to
business leadership: where are you yellow? What does that mean? At what stage are you, and are we okay with you? Because they might be okay with it if they know what it means, but they don't understand what it means.
And those, I think, are the crosswalk conversations that have to happen. I still go back to one of the first CFOs that I dealt with in Canada, who admitted to me that he was getting essentially red-amber-green RAG reports, once a quarter, from the CISO. There were roughly 50 lights on it, and he didn't know what they meant. He didn't know if red meant they were closing tomorrow and green meant they could stop investing. And he said, it's been two years, I don't
Speaker 1 (15:51.184)
even know if they're the same lights. I don't know if he's switching them out as we go. I don't know what they mean. And I'm too embarrassed, after two years, to tell him that I don't understand what I'm looking at. Right? I mean, those...
But if he's unclear, it's not the CFO's fault. And I don't even want to say fault, because everybody has good intent. It's not like somebody's wrong here; it's just that there's a better way of doing it. If a teacher's lesson doesn't resonate with the child, it's not the child's fault. You didn't teach it right. If you go into a meeting and you give a presentation and people misunderstand you, that's not their fault. That's your fault, right?
So we have to work on that internal business case, explaining in their language what that means.
I think that's the key piece. And everybody has to find their own way to what works for them. You know, is it based on the organization? Is it based on numbers? Is it based on statistics? Is it based on, you know, our current goals, our appetite in color terms? I don't know. But the reality is, if you haven't understood what the business units are trying to achieve, what their goals are and what their fears are, then you can't really work to facilitate that. And I think that's important. It's less important
exactly what controls you're applying. It's about applying controls to help alleviate some of the risk and the fear that they have, and applying the right controls so that they can still achieve the goals of the business and hit those lines. And I think that's where, if you start speaking the exact same language as the other business units and they can measure one to one, they can say, okay, it's this likely that we're gonna get hit; it's gonna cost us this much.
Speaker 2 (17:18.498)
my plan.
Speaker 1 (17:38.83)
But if we invest this much in controls, we can alleviate that, and that'll offset this loss we would have had through these business units here. And you get to a point where you're talking in actual earned income versus expenditure, rather than, well, I need an extra 20% this year just because, I mean, we still want to stay yellow, right?
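The trade-off being described here, expected loss avoided versus what the controls cost, can be sketched as a quick return-on-security-investment calculation. All figures below are hypothetical, purely to illustrate the shape of the conversation:

```python
# Hypothetical inputs -- illustrative only, not real actuarial figures.
breach_probability = 0.28    # estimated annual likelihood of a material incident
expected_cost = 1_200_000    # projected loss if it happens, in dollars
control_spend = 150_000      # proposed annual security investment
risk_reduction = 0.60        # estimated cut in likelihood from the new controls

# Annualized loss expectancy before and after the controls.
ale_before = breach_probability * expected_cost
ale_after = breach_probability * (1 - risk_reduction) * expected_cost

# Return on security investment: loss avoided, net of what the controls cost.
loss_avoided = ale_before - ale_after
rosi = (loss_avoided - control_spend) / control_spend

print(f"Expected annual loss: ${ale_before:,.0f} -> ${ale_after:,.0f}")
print(f"Loss avoided ${loss_avoided:,.0f} vs spend ${control_spend:,.0f}; ROSI = {rosi:.0%}")
```

Framed this way, the conversation is in dollars avoided per dollar spent, which a CFO can weigh against any other expenditure, rather than in colors.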
So that's a lot of what I spend my time doing. And the nice thing is Australians are very receptive to hearing what's going on. They want to look over the fence. They want to know what other organizations are doing, what other regions are doing. And they're happy to be first. They're happy to step out and say, you know what, we don't want to run into these problems. We don't want this to be our story. We don't want to be on the front page for the wrong reasons.
And I think that new language of speaking about cyber risk is something that really resonates with them here. Plus, man, you're sitting on the doorstep to a lot of the threat actors here, so you end up being caught out by a lot of things first, before they move across to the US or EMEA.
Why is that? What's the macro view of that? It seems like they're testing certain tactics, protocols, et cetera, on you all down under, and on our friends over in the UK, before they hit us. Meanwhile, we're still TikToking our way through our entire lives here, our whole family in the background, right where their school is. And we're like, got our license, picture of our driver's license right here.
You know, we have nothing to worry about. We're Americans. We have nothing to worry about. Yeah, exactly. Welcome to the Wild West.
Speaker 1 (19:34.7)
You know, I'll say it's probably investment. There are probably two things, but a chunk of it is that US companies still invest two to three times more than Aussie, Canadian, or UK companies do on security.
In total amounts, or do you mean pro rata? Like the percentage per individual? Okay, interesting.
The stat last year was about $620 per employee on security, and here it was about $260. So that's a big disparity. So you have to hit the right spot if you're going to do it here. I tell the boards here, I said, if you're going to spend less than half, you're jumping out of a plane and landing in a teacup. You better be right on point. And I think it's well known enough
that people are using the same technology here that they're using in the US, but there's less spent. So unless companies are spending it perfectly and really allocating it right, they're going to have some open vulnerabilities that may be blocked off, or may have layered defenses, in the US, but that they're missing here.
Let me ask you this. From a high level, we don't have to get into the weeds, but from a high level, when you're trying to quantify risk for business leaders, whatever size business, do you do it in essentially buckets? Like, do you break it up from your devices, to users, to your infrastructure, to your storage? Is that how you do it? You know, like, we're more at risk here, less at risk here? Even with the numbers, people are so
Speaker 2 (21:17.922)
hesitant to do it, because they're like, it's not a science, because we're trying to predict intervening criminal activity. And I'm like, yeah, but that's like saying insurance companies don't know what premium to set because someone's going to drive a car. Of course you never know. But there's tons of data where you can kind of go, and especially you can know, I'm more at risk in this niche compared to this one. So should we invest in it?
Is it aligned to our goals? Does it make sense? Right? Am I off track? I'm trying to read about it and really understand it without having to do, you know, what is it, like Monaco accounting, and getting into the weeds of how to calculate a specific risk. Because I'm like, dude, that's too much math. I just want to understand it. I don't want to do the homework.
You're right. I think the biggest thing is that it starts with a really large data set to begin with. The more data you have to start with, of what caused other breaches, what vulnerabilities were present, what it cost, how long people were down for, all of those kinds of costs, left of boom, right of boom,
play into a piece of it. So it's that garbage in, garbage out. If you don't have enough good quality structured data to begin with, then it's not gonna mean anything, because you're comparing yourself to nothing. When we go in and do it, we have a ton of data. I mean, fortunately or unfortunately, depending on which side of the coin you're on, we've paid out a lot of money in breaches over the years. So we have a ton of data to be able to say what caused things and how much it cost. And then
there's a standard methodology now. I mean, the FAIR methodology for quantifying risk is generally industry accepted. It's a good system to start from; where it tends to lack is the amount of data you're testing it against. So we start to bring in what an organization has in place, what controls they have in place, and that's across kind of their entire NIST framework. And then you look at what industry they're in, what turnover there is,
Speaker 1 (23:35.522)
what region they're in, and then you run a Monte Carlo analysis against what it's likely to cost.
Hey, I said Monaco. I meant Monte Carlo.
But once you put it all together, you don't get an exact science. You get a probability. And I think, to your point, once we start accepting that gray is okay, it's not about, you know, either the numbers are perfect or we don't use numbers. You have to say, my numbers are defensible. I feel comfortable going in front of a board, or my CIO, or the exco, and saying, these are the numbers,
this is the probability, and the numbers we came up with had enough data behind them that there's a high likelihood. Otherwise you're guessing. Otherwise you might as well cover your eyes and throw a dart, and that's the control we're gonna have.
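The process described above, breach data and control posture in, a Monte Carlo run out, a probability rather than an exact number, can be sketched in a few lines. This is a toy FAIR-style simulation with made-up inputs (event frequency, loss distribution), not Zurich's actual model:

```python
import math
import random
import statistics

random.seed(7)

# Hypothetical inputs -- placeholders for what a real FAIR analysis would
# derive from breach data, controls, industry, and region.
EVENTS_PER_YEAR = 0.6   # average loss-event frequency (lambda)
LOSS_MEDIAN = 250_000   # median cost of a single incident, in dollars
LOSS_SIGMA = 1.0        # spread of the lognormal single-incident loss
TRIALS = 100_000        # number of simulated years

MU = math.log(LOSS_MEDIAN)  # lognormal parameter: exp(MU) is the median loss

def simulate_year() -> float:
    """One simulated year: a Poisson number of incidents, each with a lognormal cost."""
    # Count Poisson arrivals within one year via exponential inter-arrival times.
    count, t = 0, random.expovariate(EVENTS_PER_YEAR)
    while t < 1.0:
        count += 1
        t += random.expovariate(EVENTS_PER_YEAR)
    return sum(random.lognormvariate(MU, LOSS_SIGMA) for _ in range(count))

losses = sorted(simulate_year() for _ in range(TRIALS))
ale = statistics.mean(losses)         # annualized loss expectancy
p95 = losses[int(0.95 * TRIALS)]      # a 1-in-20 "bad year"
print(f"ALE ~ ${ale:,.0f}; 95th-percentile year ~ ${p95:,.0f}")
```

The output is exactly the "probability, not exact science" point above: a defensible range (expected annual loss plus a bad-year figure) that a board can set a risk appetite against.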
Yeah. And I wanted to ask you: you are a strong advocate for incident response planning right up front, tabletop exercises. And to me it's very logical. It makes perfect sense, because until you're fully assessed, and you understand your risk, and you've been able to quantify that, and that can take a long time to do, the first thing out of the gate is, well, let's prepare, should it happen tomorrow while we're doing this. Right? Let's prepare
Speaker 2 (24:54.114)
so that we know the game plan of who does what, hour one, day one, day two, et cetera. Right? Because otherwise, the day of a breach, everybody's scrambling. First of all, they're going to hear about it from the media or from law enforcement; they won't even know about it, because they're not ready. And then they're going to be like, Carl's supposed to be doing this. Carl left six months ago. Right? And then they're trying to figure it out, because they're not staying up to date.
And it's like fire drills in school. We do them all the time. How do you not do that? We've all been doing that since school. And yet we build these phenomenal brands and we're like, eh, should it happen, it's fine. Or they're like, oh, I've got a disaster recovery plan. But those aren't the same, are they, Dan?
BCP, DRP, and your IRP. I mean, everybody loves their acronyms, but they are not the same. I've had...
BCP: business continuity plan. DRP: disaster recovery plan. And IRP: incident response plan. The incident response plan is for the data breach.
Right away, too. And, you know, I always call them corporate fire drills. When I go into companies, I use that same concept: little Johnny doesn't want to get caught in the classroom the day of a fire, so you teach him where to line up and where to go. And I think it's the exact same thing doing tabletops. You can't do the same tabletop over and over again. So you have to find out: where are we weakest? Where do we want to build? What skill do we want to work on this time?
Speaker 1 (26:31.79)
And build them differently each time, and get people used to them being a learning environment. And then adjust your incident response plan and your playbooks based on the outcome there, so that everybody has that muscle memory. They'll never get to a hundred percent, but it gives them the mental space to address the extra 20. They'll get to that 80 percent.
The data is good. The data shows that organizations, especially smaller ones, that do that save money: the ability to respond faster, the amount they wind up paying. It's so wise. And comparatively, generally speaking, that investment is one of the lower ones, compared to rolling a SOC and SIEM out across an entire organization. Yeah.
You're right on point there. I mean, I saw it as soon as I started looking at the insurance numbers: the biggest costs were business continuity, kind of that business interruption piece. When I was working on the intel side, it was still the earlier days of ransomware, and the numbers were big, and we assumed that that was the big number, that that was what was costing companies, because we left before
any of the business continuity stepped in; we left before businesses got back up and running. Seeing it now in the private sector, and seeing that ransomware is 10% of the overall cost of the incident because they were down for weeks, they weren't back up past 60% operation for a month. You know, the long tail of these losses is painful.
And in America, we love a good lawsuit. I am telling you, dude. You take a drive down Route 66, or you're driving in the Midwest, you know how you've got the attorneys on the billboards, the Hammer or whatever it is, right? And they've got these great pictures, and they're like above the board, they do it in like 3D, so they're like coming at you. Bizarre. Anyway, now you'll see, like, victim of a
Speaker 2 (28:46.52)
data breach? Is your data at risk? Contact these attorneys. Because there are class action suits now, even against smaller organizations, because those organizations were holding so much data. And I'm like, man, that's what changes things. And while being sued is not good for business, right, what it does, I believe, is ultimately, in the grand scheme of things, get people to act. Because unless...
You know, it's very similar to the automobile and how we drove those things, flying through windshields, right, for like 60 years before somebody goes, hey, how about seat belts? You're not going to require those of me, right? And there was a big resistance: I haven't had to spend money on seat belts before, why would I need them now? And now we don't even think about it. It's second nature. Right? So.
But see, I think, and this is one of the things I saw going from North America down to Australia, is that typically, ideally, regulation is the trailing indicator, right? Industry starts to move in that direction first, because they recognize it's costing money, and then regulation steps in to catch all the
That's a great point. That's in general how it's supposed to work. But sometimes it doesn't, right? Because now, here, you're like, we have to be CMMC compliant to do business with the Department of Defense. Can't believe I have to have those things. Or health care, like a smaller health care provider says that, and the response from a lot of people is, we're patients, you should be doing this stuff. We thought you already were, for the most part.
Everybody thought they were already locked down better. Yeah.
Speaker 1 (30:38.068)
I think cyber has really exposed a lot of those weaknesses — those lagging industries and those lagging organizations that thought, "she'll be right, we'll worry about it tomorrow." And I mean, there isn't going to be one fix. It's not like we put a padlock on our door and then think we're done for the day, right? But we still put a padlock on our door.
You understand that it requires layers — that there are other security things I have to put in place. But you don't walk out of your house and go, mate, she'll be right, I won't lock my doors. Mm-hmm.

Well, I'd say the majority of people still lock their doors when they go out of the house every day. Yeah, exactly. Let me ask you — you know, AI and some of the ways it's been leveraged for social engineering have really gotten remarkable: deepfakes, the manipulation of video and audio. I mean, just in the last three, four months, Dan — the quality of it. It's not just for Instagram reels or TikTok videos. It is very good. They're being used in live Teams meetings, Zoom meetings, et cetera. What are you seeing
from a global risk perspective, or down in your region? What is the impact? What are you seeing? Where do you believe we're heading with that? That's a big question, but I just really want to get your take on it.
If I could answer that thoroughly, I'd be making more money. You know, I don't think...
Speaker 2 (32:06.498)
Or it might not even be you. Like, you could be somebody else. Exactly. Right.
No, I think that AI is about moving faster. It will take talented people — whether they're talented for good or for bad — and it has been allowing them to move faster at whatever they're doing. It's not the be-all and end-all of sovereign safety. It's not the be-all and end-all of destruction. You're not gonna see, in the next, say, 12 months, deepfakes really ripping apart the industry. It still takes time and compute power and effort
and research behind it. You're also probably — 12, I'm going to get caught out on this — 12 to 24 months away from AI actively conducting social engineering attacks. I mean, social engineering, that's my bread and butter. I love doing that stuff. I shouldn't say that — I mean, that's where I train, that's what I use to train, and I think it's important. Ethical influence and social engineering are two sides of the same coin.
It is nuanced. It is hard. I've tried to play around with some AI agents and models to see if they can leverage that. I think the next big thing will be when you can build AI agents that can effectively use social engineering tactics independently, without a driver in the seat.
Speaker 2 (33:30.786)
Yeah — like if you're able to set up an effective callbot, or an effective live Zoom video mask, a virtual camera.
It changes. Even in your conversations back and forth, usually they'll miss a beat. They'll miss some of those indicators you could play off of — specific social psychology tools, whether it's authority or cognitive dissonance or social proof. They'll miss one, or they'll use it incorrectly. But I think that's the step criminal hackers themselves are still executing really effectively. AI can control some of the tooling, but
hackers by and large are still getting in using basic, foundational — we'll call them social engineering tactics.
Logging in — or logging in because they buy session cookies and we reuse passwords — just the fundamentals, right? Can I share a story with you, though? So I was talking to a firm doing marketing and app development for law firms here in the Midwest. They'd developed an app and platform, so they were expanding, they were growing —
currently only maybe a 25-, 30-employee organization, right? They had like three, four openings, and they were hiring developers — some app devs, some web designers, things like this. You know where this is going, right? And that talent isn't always in the small town or regional town they were in, so they opened it up to remote workers. And when I spoke with them, they said — literally, they said — it's just exhausting. They said 15 out of the last 20 interviews they'd had
Speaker 2 (35:14.658)
have been deepfakes. Like, the resumes are good — almost too good. And then the people get on and they don't live up to it. They look exactly like the person, and by traditional hiring processes the person's clean, right? There's no criminal background, et cetera, et cetera — because the freaking identity is stolen, right? They're not gonna steal the identity from somebody in prison. So they have that. And then they get on the interview and they flub the interview. It's the only protection the organization has.
But they said 15 out of the last 20. And I pressed them hard about whether they were exaggerating that number. They said, no, that's honestly a conservative number. I'm like, wow — and they're a small business. That's why I'm asking about it, because it really is coming up a lot, right? I mean, I don't think deepfakes are going to be showing up on all of our Teams and Zoom meetings right away, but for the right organization, they will.
So I just want people to be aware that it even exists, because most people think it's science fiction. So talk to me — what's your take on that?
If I track back four years, we thought it was going to be 10 years before deepfake video could ever be used in live settings. Now we look at it, and today it's possible — I mean, possible with, you know, the compute power to do it. Where we see it most is in those tech sectors where certain states are trying to get people into the U.S., to place them within tech roles in specific organizations. And the reason it works in certain sectors is
they can have remote workers that never come into the office. So you can have a remote worker on the other side of the world and you're okay with that, or a remote worker you think is on the other side of the country and you're okay with that. And it's being used mostly by criminal organizations or state actors to infiltrate certain levels of the tech sector within the U.S. — to gain access, usually, to the technology and that supply chain rather than to the funds, rather than to the —
Speaker 1 (37:19.798)
company. But it's still not nearly as common when we take a cross-section of multiple industries.
Not when you compare it to info stealers and malware attacks and ransomware attacks. When you parse it out that way, no, it's not. It's not the number one threat. It's not the rising threat. Yeah. So that's what you're seeing too, in terms of risk — it's really not something underwriting needs to pay special attention to, right? Yeah.
I mean, that's one of those things where — so, fleet operators down here, because there are so many trucking companies, fleet operators have specific types of risks they have to be aware of that your average retail company with four trucks doesn't have to consider. And this is one of those specialized risks that, you know, people who have a lot of DevSecOps running remotely need to consider. They have to be aware of it, and hopefully whoever's providing security for them is talking about that.
But if we look across all sectors — I mean, I deal a lot with the financial sector, with, you know, superannuation funds, with manufacturing, with mining — they just aren't seeing it. And I think that's because the business structure is different. The expectation of hiring, in this specific case, is different. The way of work — when you see people in person, it just doesn't work. That point of ingress doesn't exist.
So you invest a lot of time and effort to pull off this deepfake, and what are you gonna get out of it? And I think that's really the trade-off.
Speaker 2 (38:54.158)
I want to ask you about detection. So many organizations struggle with it, and there are tools — and maybe it's just a cost thing, or maybe it's one of those things the industry isn't articulating in business terms well enough, right? Meaning, why they're not investing in detection services and tools. You know, the data isn't changing that much. When you hear about an attack — the actual live launch, the boom itself, right?
We always find out they were in for months beforehand. Meanwhile, they have IT companies monitoring that network, but the business owners don't understand: they're not monitoring it for hackers. They're monitoring uptime and downtime, disk space, health and optimization, patching, right? That's what they're doing. They're not looking. They don't have the tool sets on there, or the engineers don't have the skill set, because they're IT, not security, and they're —
It's not that they couldn't do it; the point is that's not what they're using that same technology to look for. So how is the insurance industry — how are you advising on risk around that? Is that part of quantifying the risk and saying, we don't have eyes here? So if I'm inside and I'm there for a month, nobody in this room will be able to know. And then I can move laterally and escalate privileges, and nobody in this room would know.
Is that how you're approaching it, or is that part of the discussion? Yeah.
That's definitely part of the discussion. I think we're slowly getting to the point where some sort of EDR — a managed EDR product — is becoming table stakes, but it's still unobtainable for a good chunk of the SMB market. So that's when you start to look at: okay, how are we segmented? Do we have micro-segmentation? What do our backups look like? Can they be accessed? You know, you start to look at that
Speaker 1 (40:51.298)
confluence of other indicators as to how quickly the business will get back up when the attacker finally pokes his head up and says, "Hey, I'm here." And I think that's the trade-off back and forth. So when I go into an organization and we start going over the numbers for their quantified risk, one of those conversations is: okay, how much would, you know, an MDR cost you per year, and what is an incident going to cost you coming out the other side? And what can you save? Because ultimately, when we —
The idea — one of the things I usually try to lean on when we go in and have those conversations is: knowing how likely one of these attacks is and how much it's gonna cost you, are you buying too much insurance? Are you transferring too much risk, or not transferring enough? And where can you start to save and move that needle with other controls or other spending? So yeah, detection and response is definitely important, but no single solution is a panacea.
You have to look at them all together. And I think the size and scale of an organization really decides whether that's the right tool for them, how they're applying it, and what other tools they have in place instead.
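The cost-benefit conversation described here — weighing what an MDR costs per year against what an incident would cost coming out the other side — can be sketched as a back-of-envelope expected-loss calculation. Every figure below (incident probability, incident cost, MDR price, and the assumed 60% loss reduction) is a hypothetical placeholder for illustration, not an actual underwriting number:

```python
# Back-of-envelope comparison: annual expected loss with and without a
# managed detection and response (MDR) service. All numbers here are
# hypothetical placeholders, not real actuarial data.

def annual_expected_loss(prob_incident: float, avg_incident_cost: float) -> float:
    """Annualized expected loss = probability of an incident x its average cost."""
    return prob_incident * avg_incident_cost

baseline_loss = annual_expected_loss(0.25, 300_000)        # no managed detection
with_mdr_loss = annual_expected_loss(0.25 * 0.4, 300_000)  # assume MDR trims 60% of expected loss
mdr_cost = 30_000                                          # hypothetical annual subscription fee

net_benefit = baseline_loss - (with_mdr_loss + mdr_cost)
print(f"Expected annual loss without MDR: ${baseline_loss:,.0f}")
print(f"Expected annual loss with MDR (incl. fees): ${with_mdr_loss + mdr_cost:,.0f}")
print(f"Net annual benefit of the control: ${net_benefit:,.0f}")
```

Under these made-up numbers the control pays for itself; change the MDR price or the assumed reduction and it may not — which is exactly the "are you buying too much insurance, or not transferring enough risk" question.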
Yeah, because you'll hear owners or leaders say, well, if I invest in this, will that lower my overall risk? And you're like, well, no — it'll lower the risk from that type of event. It's not lowering your social engineering risk; that would be education, test phishing, et cetera. It's not lowering your preparation risk for a data breach; that's incident response planning and tabletops. Right? So you have to be more specific.
You had mentioned the FAIR methodology — as if I were an actuary. No, you said there are standard mechanisms for measuring risk. Can you walk us through that, just high level? Sure. Because I always want to help business owners ask the right questions of their IT teams, you know? Yeah.
Speaker 1 (42:52.046)
So the FAIR methodology of risk is essentially quantifying the value at risk. What is my infrastructure worth? What would it cost to get my infrastructure back up and running? What is the maturity of my organization? What is the likely cost of an incident? What is the probability of an incident, based on my industry and based on my country? And it starts to run through those
aggregated loss scenarios. If we go back to, you know, high school probability math — when we start running through different permutations and estimating, well, there's a 10% chance of this happening, there's a 40% chance of this happening, and I can offset by 10% for this, and you start to build that tree down — what it really comes down to is that you can't do that on paper. You run those numbers through a proper system that
can take those probabilities against each of those mitigating controls and adjust for them. And that's really what the methodology comes down to: we're applying traditional risk methods, traditional probability math, to controls that aren't perfect, in a risk space that's largely governed by people. It's not governed by the natural world, where
this will always happen after this, and this will always happen because of this. So a decision tree is the easiest way to look at it. Then we start to adjust the probability of something happening based on the controls you have in place and based on the types of risks you have in your industry or your environment.
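The decision-tree math Dan describes — percentage chances of different loss events, each offset by imperfect controls, aggregated into loss scenarios — is typically run as a Monte Carlo simulation rather than on paper. Here is a minimal sketch; every distribution, probability, and control effectiveness is invented purely for illustration and is not Zurich's or the FAIR standard's actual model:

```python
# Minimal Monte Carlo sketch of a FAIR-style aggregated loss scenario:
# sample event frequency and per-incident loss, apply imperfect mitigating
# controls, and summarize. All parameters are illustrative placeholders.
import random

def simulate_annual_loss(runs=100_000, seed=42):
    random.seed(seed)
    losses = []
    for _ in range(runs):
        total = 0.0
        # Hypothetical: 10% annual chance of a ransomware event
        if random.random() < 0.10:
            loss = random.uniform(100_000, 500_000)  # per-incident loss range
            if random.random() < 0.80:               # backups restore 80% of the time...
                loss *= 0.5                          # ...halving the impact
            total += loss
        # Hypothetical: 40% annual chance of a phishing-led breach
        if random.random() < 0.40:
            loss = random.uniform(20_000, 120_000)
            if random.random() < 0.90:               # imperfect control (e.g. MFA)...
                loss *= 0.9                          # ...offsets a modest 10%
            total += loss
        losses.append(total)
    losses.sort()
    return {
        "expected_annual_loss": sum(losses) / runs,
        "p95_loss": losses[int(runs * 0.95)],        # tail value-at-risk estimate
    }

result = simulate_annual_loss()
print(f"Expected annual loss: ${result['expected_annual_loss']:,.0f}")
print(f"95th-percentile loss: ${result['p95_loss']:,.0f}")
```

The outputs of interest are the expected annual loss and a tail percentile; changing one control's effectiveness and re-running shows how much that single control moves the needle.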
Speaker 2 (44:38.272)
Amazing. That's very useful. You've got a lot of content on that, so I might hit you up on some of it and put some images up for people so they can see it, if that's okay — like a sample redacted report or something, so they can say: this is the type of stuff Zurich and Dan's team have been doing for years, and this is where we need to evolve you to. Right? Because as a business owner, I want to know what my risk is. I want to be able to go in and
dig down as deep as I can, right? So that I understand whether I'm comfortable with it — because it's never going down to zero, so you need to figure out what you're comfortable with. And right now the vast, vast majority have no idea. They just don't know. They assume that because they haven't been breached yet, that's how it'll stay. The response to that is: how do you know you haven't been breached? That's number one.
Number two, that's like saying, let's not have a fire extinguisher anywhere in the home or the building because we haven't burned down before. Right? Let's not put a roof over our heads because we haven't flooded before. There are so many different analogies for why that's not good reasoning. Yeah.
I think that yesterday is a poor indicator of tomorrow. In certain spaces, you... So I harken back to that one story I was told: when did Noah build the ark? Before it started raining. And that's kind of this space.
Yeah — it hasn't started raining, so I don't need an ark. It's not flooding, so I don't need an ark.
Speaker 1 (46:15.874)
Yeah.
And I think that's it — the more knowledge you have in advance, the better prepared you are for that bad day, which will likely come. And how bad it is depends on how prepared you are.
Of course. What is the shark-in-the-water analogy? Tell us.
So it's very simple — very similar to the bear in the woods. The concept is: everybody's out swimming — consider everybody out swimming as their own business — and they're all swimming around, and because there's so much tide and so much rip, some people are getting pulled out, some people are getting pushed in, and you never truly know how good a swimmer anybody else is. You may guess, you may think, but you don't really test it until you see the shark fin.
And maybe it's a dolphin, maybe it's a shark, but you're not gonna wait and see. You don't need to be the fastest swimmer. You just need to be able to measure yourself against what the other swimmers are doing, and then always be further in than the slow guy who's closest to the shark. And I think that that —
Speaker 2 (47:24.172)
I believe we have a new story to tell. That's a beautiful story. How is that not a carousel, like your carousels lately? That's a really good analogy, though — everybody can picture it. I just had a beautiful beach in my mind, which makes people happy, as opposed to being in the wet woods with a bear. Not everybody has experienced that, but everybody's been to the beach in some form or fashion, right? I love that.
Well, and I think it's the idea that —
Consider that one stolen. I'm just letting you know.
You can — yeah, steal away. I mean, the biggest thing for us is we've gotten to a point where, in that analogy, you have to know what the other swimmers are doing and can do. When I finally got out into the ocean and did a bit of surfing, one of the things I learned is that you have to be watching what the other swimmers and surfers are doing — otherwise you're gonna get hurt, or you're gonna hurt someone.
A lot of what we do with the red-amber-green ratings is: we're worried about what we're doing, but we're not really caring what other people are doing. We're not benchmarking. We're not measuring in terms that everybody else can relate to. And one of the things I try to get across to businesses with that analogy is that you're worried about the shark, but you also need to be worried about all the other swimmers. You need to know where they are and where they're moving, because they might be a good indicator of where a shark's coming from. And that's really the soapbox I want to —
Speaker 1 (48:51.456)
stand on and get companies acquainted with: the idea that, you know, you learn your business, learn your controls, learn what other people are doing — what others are doing successfully and not so successfully — and the best way to do that is to start speaking the same language. Find a commonality where you can measure against each other.
Well, and that's where the maturity of the organization comes in — like, if you're not looking at what other similar organizations, your size and in your field, are doing, you won't know how you compare and whether you're maturing, right? When you think of, like, the Gartner slide — an organization matures over time as it adopts technology, it becomes less of a roadblock to their development, they start leveraging advances in it, and it's doing what you really thought — what you were sold —
that it was going to do in the first place, right? And you're not going to be able to know that if you're not looking at the whole beach, right? You've got to see the whole beach.
That's exactly it. So we're slowly evolving. The one benefit I had coming down here was that the analogy just wouldn't have developed quite the way it did if I didn't have a beach in front of me. I had that wonderful impetus for coming up with something better, because other than in zoos, they really have no bears down here to speak of. Yeah, my bear in the woods died pretty quickly. That's so funny.
That's good.
Speaker 2 (50:14.486)
I had no idea — like, that wasn't part of this — but I love the fact that it's: Dan, man, I moved to Australia, and my bear-in-the-woods thing — I'm shooting it and it's not going anywhere, I've got to come up with something else. I'm doing a presentation, I look out, it's a beautiful beach, I pivot, and I go, well, let me explain — and then you just go and do it, and you're like, this one's even easier and clearer. So, very nice. High five to you — that is fantastic.

So as we wrap up — family's good, they're adapting, they're loving it down there. You had visited there numerous times, and you had family there, you said. So it wasn't entirely new to you.

I mean, Australia — it's welcoming. It's a welcoming place. The people are fantastic. The bugs — the bugs and snakes are horrible — but the people make up for it. So yeah, the wife and kids absolutely love it here. I won't say it in front of Australian immigration — hopefully they're not listening — but they'll have to drag us out of here kicking and screaming. No, it's a lovely place.

Yeah, I don't think I have a huge Australian immigration enforcement following, but maybe after today they're like, we're gonna build a wall and go get Dan. So hey — what initiatives do you have coming up? What's on the horizon?

Yeah, it's thankfully a quieter couple of months. We're coming into winter here, so — I would say people hibernate, but they just surf less. So I'm doing a lot of one-on-one conversations with organizations, because risk quantification still isn't that well known, even though it's starting to trickle in. And then coming into the end of August — August, September, October —
Speaker 1 (52:04.054)
I've got a pile of conferences where I'm going out, waving the flag, and talking about social engineering for good and about risk quantification — really getting that message out so organizations can bring it home and start to apply it themselves.

That's fantastic. Well, we'll have the links to your organization, obviously — and obviously, connect with Dan, follow him on LinkedIn. Do you have any other social media you'd like to mention? Are you an Instagram guy? I don't think you're a Facebook guy. I don't know where you want people to find you, but clearly LinkedIn — you've got great insight there. And really, so many people
have commented to me that they learned so much from you. So thank you for doing so much. It's fantastic. Separately, what are you using to generate the characters on your carousels? You were trying out different AI image creation tools when we last spoke, but that was a long time ago. What have you been using lately? I mean, I have no time to create them myself, but if I ever slow down, I would love to do stuff like that. It's so fun.

It makes you want to read the carousel. You're like, look at that cute guy, or look at that cute, you know, skunk in a hacker mask — I want to find out more. Right? It's just so interesting. It's engaging.

It's kid-approved. That's a... So yeah, I've gone through a bunch of different tools and tricks. The easiest one I always come back to is OpenAI's ChatGPT — DALL·E has come a long way. And it comes down to how comfortable you are prompting. Two, three years now — it feels like forever — but yeah, I've gotten to a point now where I can —
Speaker 2 (53:46.934)
Yep.
Speaker 1 (53:54.99)
kind of carve out the consistency and get it to work. And I still spend far too much time doing it, but my kids have fun deciding.
Yeah — and you can talk to them about it. It's a way of translating what you do for a living into plain terms for the kids. I love it.
I get requests from them on a monthly basis to start getting stickers for the different hackers.
Yeah. Or playing cards — like, I wanted to take your carousels, grab all of them, and almost do an episode where you give a short explanation of each one. It would be so interesting for an organization — just to be like: that's that, that's who these guys are, that's a good story. Then they start to understand the foe they're up against. Yeah.
We did — I did a count. My wife asked me a couple of weeks ago: I'm up over 90 groups that I've covered over the last couple of years. So yeah, and they just won't — it's like whack-a-mole. They just won't stay down. So —
Speaker 2 (55:02.37)
Well, a lot of the same guys rebrand — they join different groups and rebrand.
I mean, you just have to look at Scattered Spider — one of those groups right now where it doesn't matter how many of them you arrest, two more members pop up with, you know, a new tool they bought off somebody else. So yeah, it's interesting, it's engaging, and it's one of those ways I can help my — my kids are five and eight, so, you know, I don't want to scare them with what's going on out there, but I need them to be knowledgeable before they
get to —
It's got to go beyond fear. It's got to be about empowerment — because if we do the fundamentals, again, we are swimming faster than the slower swimmer, right, by that chart. So — Dan Elliott, thank you so much, man. Dude, you are always welcome here. I will not wait a year or so to set this up again. I'm sure with the changes and things that will happen, we'll have plenty to talk about.
So I wish you, your children, your wife, and your family all the best. Stay in touch, stay safe, and I will be in touch, as we always communicate. I really, really appreciate your time today.
Speaker 1 (56:17.75)
Looking forward to it, and thanks so much for having me.
No problem. Well, enjoy your night. I'm going to go start my day.