Cyber Crime Junkies

Safeguarding Your Privacy: How to Delete Your Data

Cyber Crime Junkies. Host David Mauro. Season 5 Episode 2

David Mauro interviews Merry Marwig, a data privacy advocate, and Jeff Jockisch, founder of Obscure IQ, about how data brokers work and how to delete your data. They discuss the intersection of privacy and personal data, the role of data brokers, the lack of transparency in data collection, and the potential risks of AI and deepfake technology.

Send us a text

Get peace of mind. Get competitive. Get NetGain. Contact NetGain today at 844-777-6278 or reach out online at www.NETGAINIT.com
 
Imagine setting yourself apart from the competition because your organization is always secure, always available, and always ahead of the curve. That’s NetGain Technologies – your total one source for cybersecurity, IT support, and technology planning.

Have a Guest idea or Story for us to Cover? You can now text our Podcast Studio direct. Text direct (904) 867-4466.

A word from our Sponsor-Kiteworks. Accelerate your CMMC 2.0 compliance and address federal zero-trust requirements with Kiteworks' universal, secure file sharing platform made for every organization, and helpful to defense contractors.

Visit kiteworks.com to get started. 

🎧 Subscribe now http://www.youtube.com/@cybercrimejunkiespodcast and never miss an episode!

Follow Us:
🔗 Website: https://cybercrimejunkies.com
📱 X/Twitter: https://x.com/CybercrimeJunky
📸 Instagram: https://www.instagram.com/cybercrimejunkies/

Want to help us out? Leave us a 5-Star review on Apple Podcast Reviews.
Listen to Our Podcast:
🎙️ Apple Podcasts: https://podcasts.apple.com/us/podcast/cyber-crime-junkies/id1633932941
🎙️ Spotify: https://open.spotify.com/show/5y4U2v51gztlenr8TJ2LJs?si=537680ec262545b3
🎙️ Google Podcasts: http://www.youtube.com/@cybercrimejunkiespodcast

Join the Conversation: 💬 Leave your comments and questions, or text us at the number above. We'd love to hear your thoughts and suggestions for future episodes!

How Data Brokers Work. How to Delete Your Data.





Chapters

00:00 Introduction and Background
06:15 Privacy Expectations in the Digital Realm
13:40 Risks of AI and Deepfake Technology
38:14 Practical Steps to Protect Personal Information
44:45 The Dangers of Inaccurate Data and the Need for Data Management
52:26 Taking Control: Managing and Deleting Personal Data

Keywords


Don’t Miss the Video episode: https://youtu.be/sXRw4-NWlFs


Title: How Data Brokers Work. How to Delete Your Data.

How Do Data Brokers Work, How To Delete Your Data, how to delete your information from the internet, how to delete your data from company, importance of deleting personal data online, what happens if personal data is leaked, why are data brokers bad, dangers of data brokers, how do data brokers work, data privacy, data brokers, personal data, transparency, AI, deepfake technology, privacy laws, Need for regulating data brokers, what do data brokers do, cyber crime junkies, mauro

D. Mauro (00:01.948)
Well, welcome everybody to Cyber Crime Junkies. I'm your host, David Mauro, and in the studio today, I'm very excited. We have Merry Marwig, a data privacy advocate, and Jeff Jockisch, founder of Obscure IQ. Both of them specialize in privacy, in the intersection of privacy with our own personal data, our own personal brands, and then the brands of organizations.

and there's a lot of topics to talk about. So welcome both of you.

Merry Marwig (00:35.438)
Thanks David, great to be here.

Jeff Jockisch (00:37.783)
Hey David, how are ya?

D. Mauro (00:38.908)
Great, great to sit down with you both. So let's start with Jeff. Jeff, walk us through, kind of tell us what Obscure IQ is and what inspired you to kind of get involved in the privacy sector.

Jeff Jockisch (00:55.863)
Well, Obscure IQ is a privacy recovery company. We help organizations and individuals recover privacy from an individual standpoint that's really helping you delete and understand your digital footprint. And from an organizational perspective, it's really sort of understanding how your employees' personal data might be hurting your threat surface.

and how helping you manage some of that employee personal data can reduce your cyber risk.

D. Mauro (01:32.412)
Phenomenal. So I want to dig into that, but I also want to introduce Merry. So even though she's been on the podcast before, Merry, you're a privacy advocate. Well, like, how did you get involved in the industry? What drives you to help organizations with managing privacy?

Merry Marwig (01:54.574)
I got into privacy because I was personally drawn into it. I had my data used in ways that I did not consent to, that I didn't know about, and I felt really uneasy about that, just having people be able to follow me, track me, see what I'm doing, and that's information that I wasn't actively giving out there. Like, I'm very active on LinkedIn. I control that, but...

D. Mauro (02:10.492)
Hmm.

Merry Marwig (02:21.422)
What's very problematic are data brokers, which is what we'll dive into a bit today. These companies that collect information about people that they don't have a business relationship with. So how would an individual like myself even know where to turn? And so I got pulled into privacy in that direction. I've done a number of different roles at companies. I've helped individuals through my nonprofit work.

D. Mauro (02:26.364)
Yes.

Merry Marwig (02:47.726)
I've also helped companies operationalize privacy with a number of US-based privacy laws that are coming on the books. But yeah, really just feeling creeped out by data brokers is what drew me in originally.

D. Mauro (03:01.724)
So you got started in the industry through creeps online, basically, is what I'm hearing. So Jeff, what got you in? What was it? Whenever somebody is involved in cybersecurity or privacy, there's usually either an event or they were into computers or they were in the military. They have this bigger vision of serving and protecting.

Jeff Jockisch (03:07.351)
Ha ha ha.

Merry Marwig (03:07.694)
Thank you very much.

D. Mauro (03:29.98)
Did you have creeps online? What was it?

Jeff Jockisch (03:33.353)
Well, no, I was actually doing data science for a search engine. Right. And I started to really wonder about digital identity, right. And how people were using that. And we actually had a lot of young people using our service. And so I was dealing with some of the CAN-SPAM laws and the tax laws and, you know, essentially how much information people were sharing with a search engine.

D. Mauro (03:36.956)
Uh...

D. Mauro (03:54.364)
Mm-hmm.

D. Mauro (04:03.196)
Yeah.

Jeff Jockisch (04:03.657)
And so privacy started to become important to me. But at the same time, I started seeing articles from people like Kashmir Hill, who's very big on this stuff now. She was even back then, this was back like in 2013. I think she wrote an article about how data brokers were selling lists of people with erectile dysfunction and people who were domestic violence abuse victims and people who were in auto crashes, right? Selling...

D. Mauro (04:16.092)
Wow.

D. Mauro (04:31.548)
So really sensitive, yeah, like, yeah, so, yeah, so really sensitive topics that clearly people did not consent to that being sent out.

Jeff Jockisch (04:32.599)
to ambulance chasers. And I'm like, what the hell is this?

Right.

Jeff Jockisch (04:44.567)
Right, this was even before we had any privacy laws on the books, right? Like GDPR or CCPA, right? So that was pretty astonishing to me. And that's when I started thinking, hey, data brokers, this is sort of scary.

D. Mauro (04:59.228)
Well, to me, it gets just high level before we get into some of the case studies and some of the applications of what you all do. You know, when I think about it and I just think constitutionally and I think of our privacy expectation, it's in the physical realm, we have an expectation, a reasonable expectation of privacy, right? And if we do things in our own

homes, we're allowed to, right? We have freedom. But that doesn't mean you can go ahead and do those things out in public, or in the front yard, because literally once you cross that physical barrier, you are giving up, you are waiving that right of privacy, that reasonable expectation of privacy. However, what I'm seeing here is online.

We're just clicking away because we want to get to the website or we want to get to the app or play the game or try something out online, a new AI SaaS program, whatever it is, right? And we're just clicking away and we're giving our information not knowing the extent that it's actually being used. Is that a fair statement?

Jeff Jockisch (06:15.479)
Yeah, I think it's really sort of a thing that sort of develops slowly over time. When we were originally, we had this right of privacy and an expectation of privacy. And back in the 70s, the Supreme Court sort of passed some laws that essentially said that if you give your information to third parties, that the government and other organizations sort of have access to it, that you...

D. Mauro (06:28.316)
Mm-hmm.

D. Mauro (06:39.164)
Mm-hmm.

Jeff Jockisch (06:45.367)
violate that you give up that reasonable expectation of privacy, right? And that doesn't necessarily mean the same thing in the commercial sense, right? But it plays in the same realm. And now we just give our information to third parties, and we're giving away that right. And we need to take that back.

D. Mauro (06:47.836)
You waive it, yeah.

D. Mauro (07:08.604)
Mm-hmm.

Jeff Jockisch (07:14.167)
It's sort of sad how that sort of works, right? We shouldn't, because we give our information to one person, mean that we're giving away all of our privacy.

D. Mauro (07:17.276)
Oh, yeah, there's been a lot of. Yeah.

Yeah, and this is it.

Merry Marwig (07:23.278)
I'm going to challenge you on that a little bit because I'd say another part of the problem here is that people were not properly informed with what was actually happening with their data, much like a weather app. Like you get a weather app on your phone because you want to know what the weather is. But did you truly understand that you are the product and your location is being sold and like all of like 800 to a thousand different trackers?

now know exactly where you are geographically because that was a condition for telling you the weather. So I would say, yeah.

D. Mauro (07:57.084)
Right, because yeah, isn't there the phrase that is very common in the privacy industry, at least, and I'm going to steal one from your industry, and if it's not, I'm going to make it, and that is, if it's free, you're the product. Right? Like when you're getting online, you're accessing things, and it doesn't cost you anything, and you're downloading this game, you're accessing this application, and it's not costing you anything, it's costing you something.

Merry Marwig (08:26.446)
Exactly. You're paying some money, but that hasn't been clear. And now it's just starting to come up that all of this reselling of your data, you now no longer have control over it. You don't even really know what was collected. Did you actually read the privacy notice? I'd argue many people don't. Right.

D. Mauro (08:32.668)
now.

D. Mauro (08:44.604)
Nobody nobody reads those things. Right.

Jeff Jockisch (08:47.479)
Right. I actually shared with some people earlier, I think it was on Valentine's Day, sort of my thoughts on this, right? That I think consumers are actually OK sharing information with brands, their personal data. The problem is, is that they're not OK with them sharing it then with 100 of their best friends, right? Or their data partners, right? And that's what the consumers don't understand, right? That

D. Mauro (09:02.876)
Mm-hmm.

D. Mauro (09:09.564)
Right.

Jeff Jockisch (09:15.863)
that it's not sharing it with whoever this app is that they're using, or whoever this brand is that they're giving their information to. It's all those data partners downstream. Because if I give it to that one company, it's going to go to 50 of their partners, or 500 of their partners, and then 500 of their partners. And pretty soon, your personal data is everywhere.

D. Mauro (09:38.108)
Yeah, the magnitude of it. I mean, let's go back to the example of the weather app. OK, well, I want the weather app to know my location because it's got to tell me what my weather is here. Otherwise, it's just going to tell me generally what's going on in the state. Right. So having precise location, especially, you know, when it comes to tornadoes or hurricanes, that's important for the weather app. But, like, we don't know that all of a sudden

they're selling it to 500 other companies. So what you're saying is that's actually what's really happening.

Merry Marwig (10:15.95)
Yeah, I'd also just like to contextualize what type of data we're talking about. You know, you've got your standard, your name, your address, you know, publicly available information that's not as consequential. But what we're also talking about is highly sensitive data, your health data or information about a health condition, your financial data, biometric data, genetic data, your precise geolocation data.

Jeff Jockisch (10:18.647)
it is.

Merry Marwig (10:41.486)
Private communication. So you think you're having a private communication with someone on a dating app. No, that's not private. Your sexual behavior, your calendar invites. Do you want random data brokers to have access to who you're making calendar invites with? Information about your web browsing activity, which I'd argue a lot of people don't publicly disclose that to random people because it's private. It's like that. Right.

D. Mauro (11:08.348)
No, it's exactly right. And they go far beyond that, don't they? Like, there are many apps out there, and you guys can speak to this better than me, but I know of certain apps that are like, we're going to track all of your keystrokes on your phone. And most people are like, sure, I want the app. And so they do it and it's free. But they're capturing almost everything you're typing on your device. And then...

Are they selling that as well? I mean, that has so many different issues.

Jeff Jockisch (11:42.775)
I mean, that could actually capture your passwords that you're using for other apps, right? Can capture.

D. Mauro (11:45.82)
Yeah, right. I mean, I guess facial recognition can help you, I assume. I assume. Like if you have an iPhone and you use facial recognition to log into the app, but at some point you are typing the password in so that facial recognition can cover for that.

Merry Marwig (12:04.142)
Keystrokes can also be biometric. It's kind of like your gait. It's unique to you. The way you type is also unique to you. Much like a signature is a unique way that you move your hand, the way you type is unique to you as well. So it could actually be used to specifically identify you as a person if combined.

D. Mauro (12:09.852)
Right.

D. Mauro (12:22.076)
Absolutely. If there's an email with a misspelling of let me know, like, let me know what your thoughts are. Let me know. 100% of the time, I'm always misspelling that. For some reason, it's L-E and then T-me-know. Like, I always have a space in there. But I think that's what you're talking about, right? There's like a digital footprint that I have in the way that I'm typing on my phone or something.

Merry Marwig (12:49.614)
That's right. That's right. So where does this stuff?

D. Mauro (12:51.228)
Well, now I just disclosed that. I'll have to edit that out later. So I just gave that away, because now they can phish me, right? They can totally phish me. They'll just send out, like, it had to be him. He misspelled that all the time.

Merry Marwig (12:54.734)
The question.

Merry Marwig (13:04.974)
What?

Jeff Jockisch (13:05.239)
Well, they probably won't phish you, they'll phish your mom, right? Right?

D. Mauro (13:07.804)
Oh, yeah, exactly. Right. Yeah. Have at it. So that's exactly right.

Merry Marwig (13:13.134)
You're, you're touching on something, David, that I think is really important that we're actually not talking a lot about in the industry, which is using data broker data to just augment the next level of targeted spear phishing and social engineering. Because, like, you know, you see all these AI personalization software, you know, sell more of whatever bips and bops that you're selling. The bad.

D. Mauro (13:31.004)
Absolutely.

D. Mauro (13:36.764)
Mm-hmm.

Merry Marwig (13:40.398)
bad guys can also do the same. And that's something that Jeff has done some work on. So I'd love to hear your perspective too.

Jeff Jockisch (13:49.559)
Yeah, so sorry, I've got a little background noise. Hopefully you can't hear that. AI is getting really good at being able to convince people of things, right? There's actually a new study out that shows that AI is actually twice as good as people at convincing people of arguments. And that's just sort of the tip of the iceberg. So if you imagine,

D. Mauro (13:50.044)
Absolutely.

D. Mauro (13:53.948)
That's all right.

D. Mauro (14:04.508)
Oh yeah.

Jeff Jockisch (14:19.639)
I take an AI, right? And I feed it all of your personal information, right? It can now start to craft arguments and try to convince you to give up your credentials, for instance, right? To log into your computer or to log into your office, right? Or to log into your bank account, right? It will do all of these social engineering and phishing attacks way more effectively, right? Than a human was able to do in the past. And even if it wasn't, right? It can do it a hundred times or a thousand times or

10,000 times, right? So that over time, even if Merry was a great social engineer, right? She can only do that several times, right? That AI can do it 10,000 times. So eventually it's going to get you.

D. Mauro (14:51.708)
Oh, at scale, at scale. Yep.

D. Mauro (15:01.628)
Right. Yeah, that's unbelievable. So let's talk about data brokers. We use that phrase a lot. What are they? Could you just define some terms for us? Like what are, like define data brokers for us.

Jeff Jockisch (15:18.999)
It's really a tough subject because there are legal definitions and there are dictionary definitions and then there are very broad definitions. Let's just say that a data broker in its most basic form is somebody that aggregates a bunch of information and then sells it to somebody else. That's the simplest definition.

D. Mauro (15:22.14)
Yeah.

D. Mauro (15:37.244)
Hmm.

Okay.

Is it all legally available information or is it information that people don't know is being released?

Jeff Jockisch (15:50.519)
Well, generally, in its most basic form, a data broker is going to grab information from public records, right, and then enhance that information in some way. But sometimes they might, you know.

D. Mauro (16:01.404)
So this is when you're looking for, no, I'm sorry to interrupt. I'm sorry, like an example is when you're looking for somebody, let's say you want to find an old friend or something, you Google them and there might be 15 of them, but then you have these sites online like People Search and Radaris and all these other ones. Those are examples of data brokers, right?

Jeff Jockisch (16:27.095)
Yeah, yeah. But there's so many variations, right? Those are what we call sort of classic data brokers. There's so many variations, right? I mean, it's hard to sort of define what a data broker really is, because in sort of my definition, right, a data broker doesn't necessarily have to collect just those public records, right? But there are people that are collecting vast profiles on us, right? So let's say if...

D. Mauro (16:31.996)
Mm-hmm.

D. Mauro (16:52.796)
Mm-hmm.

Jeff Jockisch (16:56.951)
you know, Target, right, the store, right, is collecting lots of consumer information so that they can sell us something. Well, they're not necessarily selling that data, but they've got a pretty vast profile on this. Could...

D. Mauro (17:09.404)
Right. Because they have 10 years of history of what we've purchased.

Jeff Jockisch (17:13.527)
Right. I mean, would we consider them a data broker? Not legally, they're not a data broker, but they've got a pretty massive amount of information on us. What would we call them? Right?

D. Mauro (17:17.66)
Right.

D. Mauro (17:23.964)
Right.

Merry Marwig (17:26.158)
in.

D. Mauro (17:26.3)
Well, I would say they're data collectors at this point. Now, if they start to sell that data, then they're becoming a broker, essentially. Yeah, we're not talking about any particular company, of course. Yeah, right. Yes. That's...

Jeff Jockisch (17:36.439)
Right. Well, then, and they're not necessarily selling it, right? But there are other people that are profiling us, right? Or look at Facebook, right? Facebook's creating these massive profiles. They're not technically selling the data, right? But they're letting other people access it to give us stuff or to target us. Right. So are they a data broker?

D. Mauro (17:54.812)
Mm-hmm. Well, I think many people would argue, yeah, clearly they are. Yeah.

Merry Marwig (17:58.062)
Thank you.

Jeff Jockisch (18:00.503)
But you see how there's so many different sort of variations on how you might share or sell or give it to your partners, right? So how do you actually define it?

D. Mauro (18:07.644)
Right.

D. Mauro (18:11.644)
Yeah.

Merry Marwig (18:16.142)
Well, you gentlemen are touching on something that has been problematic. Like, how do you actually define this? And we're starting to see laws shape up to address that problem. I will point you to California's law. They have a new law called the Delete Act, where they do define it: a data broker is a business that knowingly collects and sells to third parties the personal information of a consumer with whom they don't have a direct relationship. So that could be selling that data for money.

D. Mauro (18:16.54)
Well.

D. Mauro (18:42.524)
Okay.

Merry Marwig (18:44.462)
That could be sharing that data, as Jeff was alluding to, with partners. So like, for example, there was a food delivery app that was sharing people's food order and location data with their marketing partners. And either customers were not aware of this, or, when they did say, hey, I don't want to do that, they weren't able to actually honor those requests to stop sharing because the data had been

transferred so many times that the ownership had just changed hands. You can't get it back. It's like trying to put glitter back into a tube. Good luck. You know, that's part of the problem here. So defining them is problematic. There is a new law that was just proposed. It's a bill, a federal law, the American Privacy Rights Act. Did I get that one right, Jeff? There are always new acronyms out there.

D. Mauro (19:13.532)
Right. They couldn't get it back. Yeah.

D. Mauro (19:20.092)
Right.

Merry Marwig (19:37.294)
They have addressed data brokers as well, but very light touch. It's to like just create a list basically of data brokers, because that's another problem. It's not just how do we define them, but like how do we find them? Because there are those people search sites that you mentioned and that's just the surface level. Then there's stuff that's way deep, like data brokers that just sell lists of all of your mobile advertising numbers on your phone, you know, it may not have your.

D. Mauro (20:04.636)
Right.

Merry Marwig (20:05.934)
your name on there, but that's an identifier to you. So yeah, trying to wrap ourselves, wrap our minds around how big this problem is. Just to give you some context, the data broker industry is presently valued at $200 billion right now. And that's set to expand because the data that they have is going to fuel the AI market, which is expected to surpass $1.8 trillion. So there's...

D. Mauro (20:09.66)
Mm-hmm.

D. Mauro (20:29.756)
Oh, yeah. Right.

Merry Marwig (20:34.318)
a big role for data brokers to play in this AI era that we're embarking on.

D. Mauro (20:39.548)
Well, yeah, when we think about deep fake technology and things like that, feeding all that information in so that they can create an avatar that can say and sound exactly like you is just horrifying, to say the least.

Jeff Jockisch (20:52.631)
You know the LLMs that are out there now, right? You know, you've got ChatGPT and you've got Gemini and Anthropic has one, right? There's a whole bunch of different LLMs out there. There are already several things that we call dark LLMs, right? That people are using for crime, right? Yeah, right. So.

D. Mauro (21:08.796)
Mm-hmm. Oh, absolutely. Right.

Well, they can. I mean, when you talk about social engineering and issues like that, one of the red flags used to be poor grammar. Right. And, you know, that's gone away because AI has eradicated that. Right. You know, you're able to program it to speak in the way of a person of this education from this region. And it'll have the nuance and the syntax and everything else.

And now deepfake audio and deepfake video are doing the same thing, right? The same syntax, the same speech patterns, the pauses, all of those little idiosyncrasies. So when we talked about those sites, like the people search things, there was that article that Brian Krebs wrote where he talked about the broker Radaris. They have this publicly available data.

But then the co -founders then are found to be operating even outside the United States, which really has a whole issue, doesn't it? Because how do you control, how does the US pass a law that's going to govern a Russian national or something like that? Right? Like, how are they going to enforce that?

Jeff Jockisch (22:33.719)
It's, yeah, it's problematic in that way. It may also be problematic if they're still operating out of Russia based upon this new executive order that Biden just signed, right? That essentially says that we can't share bulk sensitive data with foreign adversaries. So how are they actually getting their data? Somebody's selling that data to them, right?

D. Mauro (22:43.58)
Yep.

D. Mauro (22:54.332)
Why do you think that, yeah, let's talk about that for a second. Why do you think, yeah, why do you guys think that was passed? I mean, obviously I know why it's a good law or it seems to make sense, right? Because we don't want adversaries, you know, people from people to be able to profit like, like it's a stock and trade people's data that could be used by an adversary, somebody that wants to hurt them. We, I get that part.

But why do you think that came out now? Are there stories of organizations selling data to adversaries in the past? I think there are, aren't there?

Jeff Jockisch (23:34.935)
Yeah, there's been a bunch of stories that have been coming out. They don't get a whole lot of news coverage, but there was a story about a data broker out of California that was selling location data for soldiers. And actually a bunch of the location brokers have been doing that kind of thing. But it's more than that. I think part of it is also that legislation like the Fourth Amendment Is Not For Sale Act has been coming out,

D. Mauro (23:50.428)
Right.

D. Mauro (24:03.836)
Hmm.

Jeff Jockisch (24:04.631)
where some people are trying to cut off the government's ability to circumvent the Fourth Amendment by buying personal information. That's a good thing in a lot of ways because we don't really want the government to circumvent our Fourth Amendment rights. The government though has a little bit of a different view on this. They need to be able to,

D. Mauro (24:14.844)
Right.

D. Mauro (24:29.596)
Shocking.

Jeff Jockisch (24:31.863)
track bad guys, right? And that's one of the ways they do it. Right. But their biggest concern is not that we cut off that data. Their biggest concern is that we cut off that data while leaving that avenue available to foreign adversaries. Right. And I think the Biden administration executive order is a way of trying to get ahead of that. Right. And essentially say, first, let's cut off that access to foreign adversaries. And then maybe we can talk about doing something with this.

D. Mauro (24:32.956)
Right. Oh yeah.

D. Mauro (24:47.068)
Right.

Jeff Jockisch (25:01.303)
Fourth Amendment problem we have here.

D. Mauro (25:02.972)
Right, exactly. Because at the end of the day, if people are law-abiding, there's nothing to worry about anybody seeing what you're doing. In the other sense, you still have, again, a reasonable expectation of privacy, right? Most people believe that when they are chatting on certain apps, they're going to be private. Otherwise, they wouldn't say some of the things they say, right? Like, you assume that, but it's not the case, right?

and a lot of that data then gets sold.

Unbelievable. One of you guys had brought up some efforts that some individual businesses are doing. So what happened with General Motors?

Merry Marwig (25:50.798)
So GM, this is another one of those articles by Kashmir Hill; Jeff mentioned her. She's a reporter at the New York Times. She's been on this topic for over a decade now. She did breakthrough reporting on GM, which was sharing driver data. So with the OnStar thing, it can tell how hard you brake, if you accelerate really fast, if you're driving erratically,

D. Mauro (25:58.076)
Mm -hmm.

Merry Marwig (26:17.486)
where you started your trip, where you ended your trip, how long it took. So all of that data is tracked. And what they were doing was selling that data, which most consumers signed up for that for like safety reasons. They were selling that data to two data brokers that then resold it to driver insurance companies. So...

D. Mauro (26:32.86)
Right.

Merry Marwig (26:39.982)
If you happen to be a sports car lover and you drive your car really hard, because that's the point of having a sports car on a track or something, now all of a sudden you've become uninsurable because your insurance says, wow, you brake really hard, you accelerate, we no longer want to insure you, and they drop you. So this is again what I was talking about earlier: people really were kind of fleeced. They thought they were signing up for something over here, safety.

And really this data is being used in ways where they had no idea of the downstream effects. So, again, you sign up for a weather app for weather, but it turns out they're selling your data. You sign up for an OnStar safety feature on your car and it ends up costing you your insurance. So I think there's a transparency problem right now, but there are a lot of people that make good money.

D. Mauro (27:29.468)
It seems like it's a disclosure problem, right? It's almost like a false advertising problem, right? Because they may disclose it, but it's in the fine print. And there's a whole host of case law that says you still have to disclose it openly and conspicuously, right? It's why you can't have a sign warning of a dangerous area that's located around the back of it, so that people can walk onto the dangerous area. Oh, well, there is a sign over there, right? It has to be posted and visible and obvious to people so that they can decide whether to take that risk or not.

Merry Marwig (28:10.158)
I think that's the idea. One thing I do want to mention about GM, there was public backlash due to that reporting and they did stop selling data to those two data brokers because of that reporting. So I just want to introduce the concept of brand damage here through business practices that may be legal technically, but unethical when it comes to your customer relationships.

D. Mauro (28:35.356)
think about this: when you go and you're shopping for insurance, they all have that app, or they'll send you a little piece of plastic that you plug into your car, or there's an app that you download, and then you drive with the app for like 90 days or six months, and then they give you some discount, right? But the thing is, you're doing that during that limited period of time

Jeff Jockisch (28:35.575)
Yeah.

D. Mauro (29:03.068)
so that you can get a discount on your insurance. What you may not know, and this is the issue, is that they're selling all of that data to other carriers, insurers, tire makers, all this other stuff, right? Because, right?

Jeff Jockisch (29:19.127)

Yeah. What's crazy, David, is when you buy a car today, you can't actually buy one that doesn't have several different location trackers embedded in it, new cars, right? So all of these cars have a crazy amount of smart technology, just like your phones now, right? Your car is tracking you as much as your phone is, and it's pretty scary. You can't even disable it.

D. Mauro (29:33.052)
Right.

D. Mauro (29:44.124)
Mm-hmm.

All right.

Jeff Jockisch (29:50.519)
I don't think you can actually buy a new car that does not have this stuff enabled. You can't buy a GM car that does not have OnStar enabled.

D. Mauro (29:54.396)
Right.

D. Mauro (29:58.524)
So what you're saying, what I'm hearing is buy a 1985 Camaro and leave my phone at home is what you're saying basically, right? Like.

Jeff Jockisch (30:04.246)
Hahaha!

Jeff Jockisch (30:08.822)
You're right, you're right, you know. Well, even that doesn't completely stop the location tracking because there's a network of automated license plate readers around the country. So if you have a license plate on that car, they're going to track you that way.

D. Mauro (30:23.676)
Oh, so now I also have to take the license plate off. All right, let me put that on my list: remove license plate. But then I'm getting pulled over. So, holy cow.

Jeff Jockisch (30:27.127)
Hahaha.

Merry Marwig (30:31.886)
Well, there's a hack for that, guys. You can get a license plate cover that has one part that shoots out so it's visible head on. But if it's slightly at an angle, you can't see the full license. Just saying.

Jeff Jockisch (30:46.647)
Ha ha.

D. Mauro (30:47.74)
Well, the issue is, look, the cybercriminals are like, cool, we're going to go shop for that now. And the point is, we're not trying to find ways around safety. We're just trying to maintain a reasonable expectation of privacy. I mean, that's the whole point of it. When people want to address this issue of data brokers, meaning,

Jeff Jockisch (30:55.255)
Haha.

D. Mauro (31:17.404)
And I'm not saying all data brokers are bad. I'm waiting to meet one that I like, so let's see. But it's about this: I'm agreeing to do one thing, but then things are being done without my knowing consent. Yes, I might have clicked OK to get this app because I wanted my weather, not because I knew you were going to be selling my data to 500 or 1,000 different places. But what can we do to

Jeff Jockisch (31:22.807)
Haha.

D. Mauro (31:47.42)
take back some of the stuff from the data brokers? In that Krebs article, it was crazy, because the people that are advertising on some of these sites that say, we'll help you delete your stuff, they themselves are the owners of... like, they're data brokers. They're capturing all of that, and then they're not deleting your data. They're just capturing more information. That's what...

as Brian Krebs pointed out, I'm like, well, that makes matters even worse.

Jeff Jockisch (32:21.623)
Yeah, I actually have to give you a little bit of a scoop here, David. We were deleting information for some of our clients, and Radaris sent us an email back and said, hey, you know, we deleted that information, but if you want to delete more of your information, click here, right? And so we clicked through and started filling out some information, not of actual clients, right? And it essentially took us to that other site that Brian mentioned,

D. Mauro (32:45.692)
Mm-hmm.

D. Mauro (32:51.26)
Yeah, right.

Jeff Jockisch (32:51.607)
OneRep, right? And so the two companies that he was mentioning in his two exposés, Radaris and OneRep, are actually now working together, right, to help consumers out.

D. Mauro (33:04.508)
Share data. Right. Unbelievable. Well, because it's so valuable, right? If you can predict what people buy, or where people are, or what medical conditions they have, then you can target ads and products, and maybe leverage it in social engineering or cybercrime. There are so many different elements that can be used

that the person isn't aware of, right? And that's what's really unnerving. So how do we solve this? You guys work in the industry. You have...

Merry Marwig (33:47.342)
I want to just mention, like you just alluded to, like what this is used for. It's used for advertising. It's used for doxing. It's used for government and military purposes. In the advertising world, you'll get segments like parents of preschoolers. That's a segment of people you can buy. Christian churchgoers. Wealthy and not healthy. Heavy drinkers.

D. Mauro (33:53.34)
Hmm?

D. Mauro (34:05.756)
Mm-hmm.

Merry Marwig (34:13.678)
behind on bills. Those are literally some of the segments that you can buy. A lot of your credit card swipe data also ends up in these data sets, and it can really tell a lot about you. In fact, there's kind of a joke I do sometimes at new companies, where you show someone your Amazon purchase list and they try to guess who it belongs to. You can tell who someone is, right?

D. Mauro (34:34.364)
Right. Yeah, because you can tell. Right? Yeah.

Merry Marwig (34:38.19)
So if you don't want that data out there... Or, something we also haven't discussed on this podcast yet: what do you do when the information's wrong and people are making decisions on it? Oh my gosh, this had a real-life consequence for me. So I'll tell you a quick story. I signed up at my pharmacy for one of those loyalty points things. You type in your number every time you buy something and you get some coupons,

D. Mauro (34:48.252)
Yeah, how do you correct it? That's a great point. We didn't even address that.

D. Mauro (35:00.7)
Mm-hmm.

Merry Marwig (35:05.87)
and I gave a fake birthday because I was like, they don't need to know my real birthday for coupons. So I gave a fake birthday. Well, COVID comes around and it's time to get vaccinations. And I go to my pharmacy and I try to get a COVID shot and they refuse to give me one because the data they have on me, my birthday didn't match my ID, which is my correct birthday. And I told them, I'm like,

D. Mauro (35:11.292)
Right.

D. Mauro (35:23.612)
because your birthday was.

D. Mauro (35:30.94)
Mm-hmm.

Merry Marwig (35:32.398)
Whatever data set you're pulling from is a marketing data set. That's a secondary use that I never agreed to. I couldn't get it. Thankfully, I had another pharmacy in my neighborhood, because I live in a big city, and I could go to another place and just be an anonymous person. But can you imagine if someone didn't have that choice? The false data floating around in these data sets could really have life-and-death implications for you. And I will also just plug this:

D. Mauro (35:38.396)
But they were making medical decisions based on it.

Merry Marwig (36:02.478)
Data accuracy is more problematic for women than it is for men. Think about it: many women have name changes throughout our lives. So now you've got incomplete data, and now you're seen as a less trustworthy person when you're getting some sort of credit, like a social credit score or whatever score is being applied to you. So accuracy is also a problem.

D. Mauro (36:11.676)
Oh, sure. Right.

D. Mauro (36:22.3)
Right.

Jeff Jockisch (36:26.999)
Yeah, these data brokers want to have very thick profiles on you, right? They want to aggregate all the information they can. But if you have a common name, or even a vaguely common name, information about you gets aggregated up, and they'd rather have something connected to you that's not you than have nothing about you. So oftentimes, you know, if they think it's 80% connected to David, it's going to be in your profile.

D. Mauro (36:48.316)
Unbelievable.

D. Mauro (36:54.62)
Unbelievable. And this is everywhere. I mean, I came across something. One of my kids was downloading a gaming system, I won't name it, and before they clicked agree to the T's and C's, I said, hang on, let me look at that. And there were 803 partners that they were going to sell all of the data to. I was like, just to get a game? That is crazy.

There's no way anybody realizes that, right? And especially for people that are younger, it really worries me about identity theft and issues concerning that, because children's identities are stolen so often, because parents don't monitor their kids' credit, right? Their kids are six years old, seven years old. They're not taking a loan out, right? So.

Why would we monitor their credit? And the truth is, because people are going to use their identities for a decade. We have so many cases from our work, where we meet people and they're like, our kid's trying to go to college, but they have a foreclosed condo in Nevada, right? And they're like, how is that possible? Oh, because somebody was using their identity for 10, 15 years before they came of age. It's really troublesome.

Jeff Jockisch (38:15.095)
I would give a couple of pieces of advice to your viewers, David. I mean, obviously, I think you should start cleaning up your whole digital footprint, right, services that we offer. But there are a couple of things you can do short of that, too. One is to freeze your credit. That's the easiest thing; you should definitely do that. Another thing you can do is start...

D. Mauro (38:28.092)
Mm-hmm.

D. Mauro (38:35.036)
Right. That is something we always recommend to everybody. Yep. And if people don't know, it is free. It is free. There are three different credit bureaus; you can go and fill out the forms, and it's really simple. We were in line, I think, last fall. In recent times, I've always been freezing our credit. We were getting...

Jeff Jockisch (38:40.151)
Right.

D. Mauro (39:00.444)
a substantial amount of going-back-to-school stuff, or whatever it was, and they offered a store card. We never do that, right? But this was actually going to be a significant amount of money, so we said, all right, we'll do that, we just won't use the card. So literally from my phone, I unfroze my credit, they ran it, and then I froze my credit back. It was pretty easy to do. So, I mean, it just... you know, I've had Brett Johnson on, who

used to run ShadowCrew, one of the largest. The US Secret Service called Brett the original cybercrime godfather. And his recommendation is, if people would just freeze their credit, a lot of these guys would go out of business. He's like, it stops 80 to 90% of all identity theft, because nobody can take out credit or use your name without you unfreezing your credit. So it's a really, really good

Jeff Jockisch (39:58.327)
Appreciate that. Another thing, if you haven't done it yet: turn off your mobile advertising ID. Most people on the Apple ecosystem can turn that off pretty easily. But about 97% of people using Android phones have not turned off their mobile advertising ID.

D. Mauro (39:59.676)
Recommendation, Jeff.

Jeff Jockisch (40:23.831)
Yeah, it's essentially location tracking. And it's easy to turn off. You just have to go into your location settings and say, don't send me personalized ads. For some reason, people think, oh, personalized ads, that sounds good, right? I'm going to get coupons. Well, you might get a couple of coupons that you're going to like, but all the stuff that you're not going to like is really, really bad for you.

D. Mauro (40:24.955)
Most people don't know what that is. Can you explain what that is?

D. Mauro (40:31.548)
Ah.

D. Mauro (40:41.948)
Ahhhh

Right. Right.

D. Mauro (40:55.26)
Yeah, that's great advice. That's excellent. What type of things are you guys doing at Obscure IQ?

Jeff Jockisch (40:59.447)
Well, what do we do?

D. Mauro (41:04.892)
Yeah, I mean, you do it for both individuals as well as organizations. So let's talk about organizations from a high level and then let's get into individuals.

Jeff Jockisch (41:05.687)
Yeah, yeah.

Jeff Jockisch (41:10.999)
Yeah, so for organizations, what we do is essentially scan your employees and find out which of them have really large footprints, and sort of correlate that to their position and what they do in your organization, and say, hey, you know, this is a really large risk. Maybe you need to work with this employee and see if we can delete their information, right? Or this set of employees, because that threat surface is really problematic,

D. Mauro (41:24.892)
Mmm.

D. Mauro (41:37.34)
Got it.

Jeff Jockisch (41:38.359)
especially for smaller and medium-sized organizations. Some larger organizations may have that on lockdown, but frankly, most of them don't. They don't really see employee personal data as a threat surface yet. It's just not a thing.

D. Mauro (41:41.98)
Absolutely.

D. Mauro (41:55.132)
And that's a great example of where privacy and cybersecurity intersect, because a lot of organizations don't have a zero-trust infrastructure or configuration set up. So if somebody is able to compromise one person that has a wide footprint, they're able to access other things that that one employee may not even know how to access, but threat actors would, right? And they'll be able to go and escalate privileges.

Jeff Jockisch (42:19.351)
You look at the casino heists that happened last year, right? That was privilege escalation from lower-level managers. It cost them $100 million, right?

D. Mauro (42:25.916)
Yep. Yep.

D. Mauro (42:32.156)
Correct. Yep, all the way up. Yep.

Merry Marwig (42:33.23)

In addition to using privileges incorrectly like that, there's also the threat of blackmail. The information that you find from employees, maybe it's things they'd like to keep private. And now you have someone who could be, you know, persuaded to do something against the company's interest for someone else's gain. So that's also a threat surface with that data.

D. Mauro (42:44.252)
Hmm.

D. Mauro (42:48.988)
Right.

Jeff Jockisch (42:59.511)
Well, and companies are starting to realize that they need to protect their C-suite, and maybe their top IT department employees, but it's virtually every employee, right, who's a threat.

D. Mauro (43:00.476)
Absolutely. Right. Yeah.

D. Mauro (43:15.196)
Mm-hmm. Right. Well, it's so true, because so many people are, on the one hand, curating their lives on social media and putting things out there that they may not realize. And the amount and the level of OSINT that can be done, the open-source intelligence, the investigation of somebody: I've seen it with my own eyes. From just a picture of a person in a field, they're able to tell by the soil and the light exactly what time of day, what time of year, where the person was located, where they were standing. It's remarkable and scary at the same time. Yeah, it really is.

Jeff Jockisch (43:46.231)
Ha ha.

Jeff Jockisch (43:56.375)
It's really scary. Well, and that's how we actually differentiate ourselves from other data deletion services. I mean, companies like DeleteMe and Optery and Incogni and Kanary, those guys actually have a great service, right? They'll delete your sort of base footprint, right? And it's going to help make you less visible. What we do differently is we actually do an OSINT scan of your footprint, go out and find all the information about you,

Merry Marwig (44:01.39)
You're sick.

D. Mauro (44:14.268)
Mm-hmm.

D. Mauro (44:19.228)
Mm-hmm.

Jeff Jockisch (44:25.527)
and then we can actually more effectively delete a much wider and deeper footprint. They're not doing that. They're doing a broker scan and deleting stuff, and it is valuable. All right.

D. Mauro (44:29.02)
UGH

D. Mauro (44:33.276)
Oh yeah, right.

Which is helpful, sure, that's helpful. Right. But good old-school gumshoe investigation really helps, right? Because you're doing what the cybercriminals will be doing when they want to social engineer you. Yeah.

Jeff Jockisch (44:45.399)
Great, great.

Jeff Jockisch (44:51.479)
Exactly, right? I mean, you have to actually know what's out there first to be able to delete it all, because it's not just in the surface-level data brokers. It's also that those data brokers don't really want to delete your data, David.

D. Mauro (44:58.652)
Right.

Jeff Jockisch (45:07.127)
They have this belief that they should be able to keep your information. And so if I, for instance, deleted your information from, let's say, Pipl, or BeenVerified, or essentially any of the people-search websites that really want to keep your information, they're going to delete your index. You're not going to show up on their website. You're not going to show up on Google. But if I happen to know that you've got a

D. Mauro (45:08.956)
Of course not. It's money.

D. Mauro (45:34.94)
Mm-hmm.

Jeff Jockisch (45:36.823)
David@AOL.com account, or an old username from when you used to play on Xbox, or an old address that you used to live at, I could probably go to those websites, type that stuff in, and up pops your entire profile, even after you've been deleted. But if I do an OSINT investigation and find that stuff about you, chances are I can get them to delete those indexes too. And now you're not going to pop up. You're much safer.

D. Mauro (45:43.132)
Mm-hmm.

D. Mauro (45:47.26)
Right.

D. Mauro (45:54.748)
Right, exactly.

Jeff Jockisch (46:05.015)
That's the kind of deal that you really need.

D. Mauro (46:06.524)
Right. Yeah. And I think that's really important, Jeff, because of the level that someone will go to. So many people think, well, no one's going to really stalk me like that, or whatever. But they do it at scale, and they use AI now, and they go multiple levels deep, right, because they want an entire dossier on a person. You know, when you mentioned the MGM breach and the

Caesars breach, people think they did a 10-minute scan of LinkedIn, made a phone call, and got in. That's not what happened. It was months and months of research. They had questions, backgrounds, they knew the histories, all of this information, so they could impersonate somebody. Any question that could be asked, it's like the person is sitting right there. And that's

Jeff Jockisch (46:41.943)
Ha ha.

Merry Marwig (47:02.862)
I think that's touching on something important, David: we all need to be safe together. It's not enough just to protect your data; you have to protect the people around you too, whether that's your family or your colleagues. If you're trying to get to the CFO, who else would you go after? If you're trying to get to the CEO, who in their orbit do you also need to go after? So it's an issue that does affect us personally, but it is also a societal-level issue. So that's what's interesting

D. Mauro (47:15.708)
Mm -hmm.

D. Mauro (47:19.516)
Right.

Merry Marwig (47:31.822)
about what Jeff is doing with the corporate offering that he has where you can, you know, handle your high visibility or high net worth individuals in your organization and take care of groups of people, not just each one as a standalone.

D. Mauro (47:48.7)
Right, exactly. I mean, that's so critical, because think about it: if they can take over any of those people, then from a cybercrime perspective they could launch exfiltration, remote access trojans, ransomware, you name it, because they can take over those accounts. Unbelievable. It's unbelievable.

Merry Marwig (48:09.39)
Yeah.

Merry Marwig (48:13.294)
I also want to just throw out one thing: I actually hired Jeff to do this for me. So he did OSINT on me. Because I wanted to know, you know. I thought I had excellent hygiene and that they weren't going to find anything, but no, they did. They found stuff from like 20 years ago, from pretty much when I was just a kid, but...

D. Mauro (48:18.926)
Oh, okay. So there's a twist. That's good to know. So that's good. Yeah.

D. Mauro (48:33.852)
Really? Like an old MySpace account or something? Like crazy. Yeah, that's great.

Jeff Jockisch (48:38.679)
Yeah.

D. Mauro (48:42.876)
Well, that's really good. That's really good. And think about it. There are so many people that, and again, this gets back to the individual cyber hygiene that we always talk about, right? Because so many of us reuse passwords. We reuse some of the same credentials over and over and over again. And then maybe those were lousy, so we're like, well, I've got a really good password now. I've got a really good one, so I'm using this one on everything. Right?

Merry Marwig (48:43.502)
You know?

D. Mauro (49:09.692)
Like, I put it in one of those test-your-password checkers and it says it's really good, so I'm using this one on everything. No, no, no, no. Because once that gets sold to somebody, who sells it to somebody else, and five layers down they get breached because they have lousy cybersecurity practices, right? Then all of a sudden that great password is for sale on the dark web, which, by the way, is accessible from everybody's computer.

Merry Marwig (49:14.222)
Yeah.

Merry Marwig (49:38.318)
So it's funny you say that, because Jeff did find one of those great bad passwords of mine. And I know better. I know better. It's just, you know, sometimes you have these throwaway dumb accounts where you're like, oh, I just need to get in. It's a problem of scale. That's why it's really hard to do this by yourself. So when I did this exercise, I opened up one of my password managers and I'm like, okay, so I have like 200 accounts that I know about.

D. Mauro (49:46.62)
Right.

D. Mauro (49:54.14)
Mm-hmm.

Merry Marwig (50:04.782)
But I'm old enough to remember the internet before we had password managers. So I know there are accounts out there and that the information's been sold. I also know that, David, because I have a subscription to an identity theft monitoring service, and I get these alerts all the time: your identity has been compromised, your identity has been compromised. I'm like, okay, but what do I do? That's the next level here. So what we're doing is going to the problem, which is deleting the data as best we can,

D. Mauro (50:11.196)
Absolutely.

D. Mauro (50:18.684)
Mm-hmm.

D. Mauro (50:25.852)
Right.

Merry Marwig (50:34.83)
you know, getting the vacuum, like sucking up some of that glitter that fell out of the jar. We're never putting it all back in, but you can go sweep it up. And that's kind of where...

D. Mauro (50:35.132)
Right.

D. Mauro (50:39.548)
Right.

You do realize Jeff and I can't relate to the glitter jar analogy. I'm like, is that a problem? I haven't experienced that issue, but I can conceptually visualize it, so that's good. Right. Exactly. Well, that's remarkable. And it's right at that intersection where...

Jeff Jockisch (50:42.423)
Hahaha!

Jeff Jockisch (50:47.383)
Ha ha ha!

Merry Marwig (50:56.142)
as sprinkle some glitter.

D. Mauro (51:10.236)
where cybersecurity meets privacy, and so much of cybercrime can be stopped just by individuals caring about it. And I think it's almost a cultural thing. I know our brethren overseas view data differently. They view it as a fundamental human right. I've talked to CISOs and people involved in all levels of cybersecurity

over in the UK and various other parts, even Australia, and they really view data differently. They view their own personal data differently. And it seems like America is kind of coming around, but it's slow. It seems to be a slower evolution. Meanwhile, we're just curating our data, our videos, on TikTok and giving it all to ByteDance and everything else. And so...

Jeff Jockisch (51:58.967)
Hehehehe

Jeff Jockisch (52:04.247)
I think we might need some sort of scared straight stuff. When we do footprint audits of people, we generally get some crazy responses because we find interesting stuff. Well, people...

D. Mauro (52:07.548)
It's.

D. Mauro (52:21.212)
Like what? Like what? Yeah. I mean, don't give away who it was, but share with us some of the things that you found.

Jeff Jockisch (52:22.807)
No, of course. Right?

Usually... well, I mean, I'm just talking about even normal stuff, like showing people their breached passwords. And almost everybody is like... I mean, you can go to Have I Been Pwned, right, and see all the different breaches you've been in, right? And that'll mildly wake people up. But it doesn't have the same impact as actually showing you a password that's been compromised, one that you thought was safe. So many people, when we do a footprint audit and show them that, they're like,

D. Mauro (52:37.98)
Right.

It's always shocking. Yep.

D. Mauro (52:54.588)
Mm-hmm. Right.

Jeff Jockisch (52:56.535)
Oh my God, right? You can see they actually just want to cut the interview short and go change a password.
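The Have I Been Pwned check Jeff describes can also be scripted. HIBP's public Pwned Passwords range API works on k-anonymity: only the first five hex characters of the password's SHA-1 hash are ever sent, and the match against the returned suffixes happens locally, so the password itself never leaves your machine. A rough Python sketch, assuming the standard public range endpoint (no API key is needed for password lookups):

```python
# Minimal k-anonymity lookup against the Have I Been Pwned
# "Pwned Passwords" range API. Only the first five hex characters
# of the SHA-1 hash are sent; the rest is matched locally.
import hashlib
import urllib.request
from typing import Tuple

API = "https://api.pwnedpasswords.com/range/"

def hibp_prefix_suffix(password: str) -> Tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into the 5-char prefix
    sent to the API and the 35-char suffix matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password: str) -> int:
    """Return how many times the password appears in known breach
    corpora (0 means not found). Requires network access."""
    prefix, suffix = hibp_prefix_suffix(password)
    with urllib.request.urlopen(API + prefix) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<35-char-hash-suffix>:<count>".
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count.strip())
    return 0

if __name__ == "__main__":
    # No network needed for this part: just the hash split.
    prefix, suffix = hibp_prefix_suffix("password")
    print(prefix)  # 5BAA6 -- the well-known SHA-1 prefix of "password"
```

Anything above zero from `pwned_count` means the password has appeared in a breach and should be retired, which is exactly the wake-up call Jeff describes showing people.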

D. Mauro (53:04.956)
Right, absolutely. Unbelievable. Well.

Merry Marwig (53:09.87)
The hope, though, is that we don't just scare people, because it is frightening when you realize this information is out there, and that there really is now a step you can take. That's what I hope listeners get from the interview that we've done today: there are now companies like Obscure IQ, and the litany of other ones that Jeff rattled off earlier, that are helping to get to the source of the problem, to solve the real problem, which is the data itself. Get rid of the data, get rid of the risk. So...

D. Mauro (53:20.316)
All right.

D. Mauro (53:36.124)
Right, because it's out there, right? It's out there, and taking those steps to manage your own digital footprint, I think, is absolutely critical. So we will have links to Obscure IQ in the show notes, and we encourage everybody to check out some of the great work that Jeff and the team are doing. And Merry, continue to

Merry Marwig (53:40.206)
Yes.

D. Mauro (54:03.324)
be an advocate. This is a great topic, and it's one that we will always have more and more content about as cybercrime continues, because believe it or not, cybercrime is not going down. It tends to be going up. We'll continue to monitor things. We thank you both for joining us today.

Jeff Jockisch (54:25.559)
Appreciate it, David. Thanks, guys.

D. Mauro (54:30.3)
Thank you, really appreciate it.

D. Mauro (54:36.892)
Awesome. So if you...

