Cyber Crime Junkies

A Story of Innovation in Privacy. Merry Marwig.

Cyber Crime Junkies-David Mauro Season 4 Episode 30

This is the story of Merry Marwig and innovative approaches to data privacy, the second episode in our privacy series with Merry. Merry Marwig, a privacy professional with DataGrail, discusses how to reduce risk when managing data privacy.
 
She emphasizes the value and power of data, as well as the need for transparency and consent. Merry also discusses the role of privacy in business and the challenges of privacy compliance. She concludes by highlighting the intersection of privacy and security and the importance of aligning the two. 


Send us a text

Have a Guest idea or Story for us to Cover? You can now text our Podcast Studio direct. Text direct (904) 867-446

Get peace of mind. Get Competitive-Get NetGain. Contact NetGain today at 844-777-6278 or reach out online at www.NETGAINIT.com  
 
Imagine setting yourself apart from the competition because your organization is always secure, always available, and always ahead of the curve. That’s NetGain Technologies – your total one source for cybersecurity, IT support, and technology planning.

🎧 Subscribe now http://www.youtube.com/@cybercrimejunkiespodcast and never miss an episode!

Follow Us:
🔗 Website: https://cybercrimejunkies.com
📱 X/Twitter: https://x.com/CybercrimeJunky
📸 Instagram: https://www.instagram.com/cybercrimejunkies/

Want to help us out? Leave us a 5-Star review on Apple Podcast Reviews.
Listen to Our Podcast:
🎙️ Apple Podcasts: https://podcasts.apple.com/us/podcast/cyber-crime-junkies/id1633932941
🎙️ Spotify: https://open.spotify.com/show/5y4U2v51gztlenr8TJ2LJs?si=537680ec262545b3
🎙️ Google Podcasts: http://www.youtube.com/@cybercrimejunkiespodcast

Join the Conversation: 💬 Leave your comments and questions. TEXT THE LINK ABOVE. We'd love to hear your thoughts and suggestions for future episodes!

D Mauro (00:02.249) THIS IS EPISODE 2 of MERRY MARWIG's Privacy Series

How To Reduce Risk When Managing Data Privacy

Summary

Merry Marwig, a privacy professional with DataGrail, discusses how to reduce risk when managing data privacy. We explore the importance of privacy in today's digital world. She shares her personal journey into the field of privacy and highlights the need for awareness and education. Merry explains the impact of privacy harms and the vast amount of personal data being collected. She emphasizes the value and power of data, as well as the need for transparency and consent. Merry also discusses the role of privacy in business and the challenges of privacy compliance. She concludes by highlighting the intersection of privacy and security and the importance of aligning the two. 

Merry Marwig discusses various aspects of privacy, including its role in organizations, the operationalization of privacy, data monetization, user awareness, moving beyond passwords, personal privacy, and data deletion. She also highlights the importance of authorized agents for privacy requests and the need for least privilege access in privacy tools.

Key Takeaways

Privacy is a multidisciplinary field that requires awareness and education.
Privacy harms, such as identity theft and stalking, can have significant emotional and financial impacts.
The vast amount of personal data being collected by data brokers and companies raises concerns about privacy and consent.
Consumer privacy laws, such as GDPR and state-based laws, aim to give individuals more control over their data.
Privacy and security are separate but related issues that need to be addressed together for effective data protection.
Privacy is still being figured out in organizations, with different departments often handling it in different ways.
Operationalizing privacy requires collaboration between various teams, including GRC, risk compliance, legal, executives, and the board.
Data monetization has become normalized, with companies collecting and selling personal data without consumers being fully aware.
User awareness about privacy practices and data collection is crucial.
Moving beyond passwords and implementing stronger authentication methods is essential for better security.
Personal privacy can be compromised, and hiring authorized agents can help manage privacy requests.
Automation can help operationalize privacy requests and save time and resources.
Privacy tools should have least privilege access to ensure data security.
Companies should carefully consider the access they give to privacy providers and focus on the specific privacy operations needed.
Chapters

00:00 Introduction and Background
03:18 How Privacy Became a Passion
05:34 Privacy Harms and the Need for Protection
07:37 The Vast Amount of Personal Data
08:30 The Impact of GDPR
09:55 The Value and Power of Data
11:29 Privacy as a Career Path
11:39 Data Privacy Technology Companies
12:11 The Power and Danger of Data
13:40 The Use and Abuse of Data
15:07 Lack of Consent and Transparency
16:06 State-Based Consumer Privacy Laws
18:10 Consumer Rights and Expectations
19:07 Accessing and Controlling Personal Data
20:21 The Monetization of Data
22:04 The Relationship Between Privacy and Security
23:01 Data Protection and Data Use
25:33 The Importance of Privacy by Design
27:09 Privacy Harms and Lack of Awareness
28:22 The Impact of Historical Context on Privacy
32:33 The Implications of Data Brokers
34:23 Managing Data Sprawl
36:03 The Importance of System Inventory
39:47 The Threat of Weaponizing Access Requests
41:49 The Historical Context of Privacy
42:56 The Cultural Differences in Privacy
45:22 The Use and Abuse of Data
47:24 The Need for Privacy Education
49:52 The Importance of Privacy in Business
52:34 The Growing Landscape of Privacy Laws
54:46 The Impact of State Privacy Laws
56:25 The Challenges of Privacy Compliance
58:22 The Need for Automated Privacy Solutions
01:00:20 The Intersection of Privacy and Security
01:01:08 The NIST Cybersecurity Framework for Privacy
01:04:45 The Changing Landscape of Privacy
01:06:09 The Importance of Privacy and Security Alignment
01:10:09 Privacy in Organizations
01:11:32 Operationalizing Privacy
01:13:13 Data Monetization and Privacy
01:14:45 User Awareness and Privacy
01:15:15 Moving Beyond Passwords
01:17:32 Password Hygiene and Authentication
01:18:44 Personal Privacy and Data Deletion
01:19:32 Authorized Agents for Privacy Requests
01:20:30 Operationalizing Privacy Requests
01:23:21 Least Privilege Access in Privacy Tools

Topics: how to reduce risk when managing data privacy, how managing data privacy compares to cyber security, risks when managing data privacy, innovative approaches to data privacy, best ways to manage data privacy, best practices for managing data privacy, latest on data privacy and consent, top approaches to data privacy, how the US handles data privacy, data privacy in the United States, approaches to managing data privacy, data privacy and consent, approach to data privacy, role consent plays in data protection and privacy, data privacy compliance, data subject rights, data privacy vs data security, recent changes in data privacy and consent

All right, well, welcome everybody to Cybercrime Junkies. I'm your host, David Mauro. Very excited about today's episode. We've got Merry Marwig with us, and she's a privacy advocate and a privacy consultant with DataGrail. We're gonna hear all about her background, her expertise and DataGrail and some of the things that they do while we discuss and explore privacy and its intersection with cybersecurity.

She works with top brands that are DataGrail clients, places like Salesforce, Amazon, Instacart, and more, focusing on kind of building customer trust and reducing risk. And you've been in the privacy industry and the data security industry for many years. You hold a CIPP, and we're going to find out what these acronyms mean in just a minute, Merry

It's a CIPP/US, a CIPM, and an FIP certification from the International Association of Privacy Professionals, and you earned a master's in poli-sci from U of I. So welcome to the studio.

Merry Marwig (01:15.746)
Thanks for having me, so excited to be here.

D Mauro (01:18.193)
No, we're very excited. So let's get the acronyms out of the way first. So CIPP, what does that mean?

Merry Marwig (01:25.442)
Certified Information Privacy Professional for the United States. So the IAPP, the International Association of Privacy Professionals, is one of the top professional organizations that we have for privacy. They have great content. They do certifications. I've earned the US one, and privacy in the United States is very fractured compared to other jurisdictions like the EU. So, oh my gosh.

D Mauro (01:30.683)
Excellent.

D Mauro (01:49.789)
It's very different, right? I mean, especially in the areas where they have GDPR and all of those, where they hold privacy as a fundamental human right, unlike we do here. Right.

Merry Marwig (02:01.974)
Right. Over there in the EU, which they do have a certification for that. That's more of a comprehensive privacy law that covers everyone. In the United States, we have sectoral laws. So privacy for healthcare, privacy for education, for finance stuff. We have privacy now. Absolutely. And now we're starting to get state-based consumer privacy laws.

D Mauro (02:06.294)
Yeah.

D Mauro (02:12.833)
Hmm. Certainly. Yep.

Yeah, all the all the different verticals, right?

Merry Marwig (02:25.862)
And so that's where we specialize. I also have a Certified Information Privacy Manager. That's what the other acronym stands for. And the FIP, that stands for Fellow of Information Privacy. So I've been around long enough to earn a fellow distinction.

D Mauro (02:26.254)
Right.

D Mauro (02:33.082)
Yep, the CIPM.

D Mauro (02:41.429)
Wow, very cool. And what's a FIP? What's a F-I-P?

Merry Marwig (02:44.618)
a fellow of information privacy.

D Mauro (02:48.257)
Very cool. So I'm very familiar with all the cybersecurity certs. So as I'm learning more about privacy, this is really good. Let's back up just generally kind of what inspired you to get into privacy as a career? I mean, you were coming from political science. So that was my background too. So I stemmed into cybersecurity. How did you segue into privacy?

Merry Marwig (03:18.798)
Privacy found me. And the interesting thing about our profession is there's no one natural path, really. We see it. Right. Absolutely.

D Mauro (03:27.065)
No, right, in either profession, neither cybersecurity nor privacy, right? It's a mix of marketing, politics, technology, legal, like the desire to pass laws, all of this combined,

Merry Marwig (03:45.482)
It's very multidisciplinary. So there are now starting to be higher educational programs geared for privacy, but really at this stage, I'd say it just kind of has to find you or you have to have a passion for it. And so that's how I got involved. I became very aware of my lacking privacy rights because I'm a victim of a few privacy harms that fundamentally shaped my life.

D Mauro (03:47.72)
Mm-hmm.

D Mauro (03:54.122)
Right.

D Mauro (04:12.685)
Oh, yeah.

Merry Marwig (04:14.742)
So I had my identity stolen. That's kind of a natural way for a lot of people to get involved and immersed in this topic. And then later.

D Mauro (04:24.133)
And for those listeners that have undergone that, because we get a lot of comments and a lot of emails from listeners. So if anybody has had identity stolen, issues like that, please feel free to reach out right on our site, cybercrimejunkies.com. There's a link right there you can get in touch with us. We can direct you into various aspects of places where you can get some help, but it is brutal when that happens. It's a long journey, isn't it?

Merry Marwig (04:53.966)
Absolutely. And you know, I solved my identity problem back then by just paying my way out of it because I didn't know there were resources like what you just offered, David. And that was a huge lift for me. When I was a young person starting out in life, I didn't have extra money to spend on that. So it really cut into my wellbeing. So, you know, we have to think about it that way. It's emotional costs, financial costs.

D Mauro (05:03.438)
Right.

D Mauro (05:12.492)
Right.

D Mauro (05:20.753)
I was going to say even you have far beyond just the financial impact, right? I mean, it really it really affects the emotions and your just your sense of belonging, your sense of safety, your own psychological well-being.

Merry Marwig (05:25.503)
Absolutely.

Merry Marwig (05:34.89)
Absolutely, and that was the second privacy harm that you're touching on that really got me interested in privacy. I was heavily stalked by someone with a lot of access to commercial data. We'll get into this a little bit today about the vast amount of personal data floating out there that we may not even be aware of. Just the problem is at a scale that is just so large.

D Mauro (05:43.735)
Ugh.

D Mauro (05:58.105)
Yeah, I really think listeners are going to be shocked by some of the information that you've shared with me so far, because it has blown us away when we were reviewing it.

Merry Marwig (06:07.726)
Absolutely. There's, oh my gosh, how do we even start? But like data brokers, there's thousands of them that literally broker in personal data. You may not have a business relationship with these companies, so why would they have information about you? But it's legal or it's slowly becoming less legal.

D Mauro (06:26.301)
Well, yeah, yeah. I mean, well, in here, let's jump into that in just a second, because there's a couple segues that we can go to get there. And that is, you know, we always encourage our listeners to check out, like, haveibeenpwned.com, right? You can see whether you've been breached, and I guarantee you have. And then what'll be shocking is you won't recognize some of the places that have leaked your data, right? And...

and you'll be like, who are they? And you'll look and you'll find they're a data broker. So you're in, and a lot of it has to do with clicking on cookies, accepting cookies. A lot of it has to do with giving out information to download papers or to get information from websites. There's a whole way in the ads that track us, our social media use; there's a whole bunch of ways that they track our data. It's shockingly scary.
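[Editor's note: Have I Been Pwned also exposes a free "Pwned Passwords" range API that is easy to show in code. It uses k-anonymity, so only the first five characters of the password's SHA-1 hash ever leave your machine; checking an email address against breaches, as described above, uses a separate, keyed API not shown here. A minimal sketch:]

```python
# Minimal sketch: check a password against Have I Been Pwned's k-anonymity
# range API. Only the first 5 hex chars of the SHA-1 hash are sent upstream.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode()
    # The API returns lines of "HASH_SUFFIX:COUNT" for every hash on the prefix.
    for line in body.splitlines():
        tail, _, count = line.partition(":")
        if tail.strip() == suffix:
            return int(count)
    return 0

print(pwned_count("password123"))  # large count -> this password is burned
```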

Merry Marwig (07:22.506)
it more.

D Mauro (07:22.913)
Getting back into what we were talking about, your initiation into privacy, it kind of stemmed right after they formed GDPR. That was really kind of the beginning of privacy for most of us, right?

Merry Marwig (07:37.994)
It was actually right before that. So I used to randomly work in the hospitality technology industry. And I got invited to a conference in 2017. And almost the entire conference was about how hoteliers were going to operationalize for this thing called GDPR. I was like, what's GDPR? What is this? And people there were like, you know.

D Mauro (07:40.564)
Mm.

D Mauro (07:57.363)
Mm-hmm.

Well, yeah, I mentioned that industry got hit right off the bat, right? Travel. Yeah.

Merry Marwig (08:04.558)
Right. Well, hospitality has a lot of information and there are some major cases of issues with privacy harms, like when Marriott bought Starwood, I think it was like 20 years ago.

D Mauro (08:16.661)
We have several episodes on that. We talk about it all the time. It's a part of the trifecta of the largest data breaches in history. Right. The OPM breach, the Office of Personnel Management, the, yeah, the Marriott Starwood breach, which happened. And then the Anthem breach, which happened right here in Indianapolis all around the same time. And on the dark web.

Merry Marwig (08:19.243)
Yeah.

Merry Marwig (08:24.91)
All right.

Merry Marwig (08:30.266)
I'm part of that.

D Mauro (08:43.689)
that data, they were massive breaches, right? And on the dark web, they would have been worth billions. And in our security awareness trainings, when we train people, we always say, how much do you think it was sold for on the dark web? And people are like 10 billion, 5 billion. The answer? Zero, because it wasn't about a cyber crime for financial gain. It was about data collection. Right? It was about privacy, right? At the end of the day, it was about

Merry Marwig (08:49.879)
Yeah.

D Mauro (09:13.761)
They just, you know, it was the four people from the Chinese government that the DOJ indicted, because it's about espionage.

Merry Marwig (09:26.89)
Absolutely, information is power. And that's why when I found out about the GDPR back in 2017, so it went into effect in 2018. So everyone was freaking out. We have a year to figure out how to operationalize this and I'll say we are still trying to figure that out. Not very many companies have everything worked out yet. And on top of that, we're getting more and more laws. But so I decided at that time, just given my history,

D Mauro (09:35.917)
Correct.

D Mauro (09:43.338)
Right.

Merry Marwig (09:55.742)
as a person who suffered some of those harms, like just psychological harm, financial harm, that I should move into this direction with my career. So I did. So I first started at a company called g2.com. They do software reviews. They didn't have a data privacy technology taxonomy yet because the technology was so new. So how do you, as a company, operationalize any of this? That's a tough question.

D Mauro (10:12.011)
Yep.

D Mauro (10:22.587)
Right.

Merry Marwig (10:23.998)
And so through that, I researched this area. I also did data security research and through that process, I got to know a lot of founders at data privacy technology companies. And through that. That's it.

D Mauro (10:38.729)
And that's where we met Daniel, the founder of DataGrail.

Merry Marwig (10:42.974)
That's how I met Daniel. So Daniel is the CEO of DataGrail. It's one of the leading providers of data privacy technology services for companies. So we work with big brands, like you mentioned, Instacart, Amazon, Salesforce as a customer. If you want to see a really great example of what a public-facing privacy program looks like, I highly encourage you to check out Salesforce's Privacy Operations Center.

where you can do a number of actions, which we'll get through, what your rights are as a person. So yeah, I've been working there for.

D Mauro (11:14.805)
Well, and they even have that, is that why they have that ad with Matthew McConaughey on it? Where he's talking about how AI is the Wild West and data is the new gold and how Salesforce keeps things private? Is that, is that part of it, right? Well, you should, you should not have a life and watch TV at night. It's, it's on. So anyway. Yeah.

Merry Marwig (11:29.354)
Yeah, I actually haven't seen that, that I've heard about.

Merry Marwig (11:39.71)
Oh my gosh. I've heard it's really good, but you know what? I'm going to push back on that though. I don't think data is the new gold. I think it's the new plutonium. It's very powerful.

D Mauro (11:48.014)
Oh, wow. Interesting little like word ninja move there. Yeah, very good.

Merry Marwig (11:54.622)
It's very, very powerful in the right hands, and it's very, very dangerous in the wrong hands. So, that's how I'd like folks to start thinking about this, because the data is about you. It's about individual people. That's what's at stake here. We're not worried about companies losing information about their go-to-market strategy for next month. No customer really cares about that, but they do care if you lose information about where they live.

D Mauro (11:59.405)
Correct.

D Mauro (12:16.606)
Right.

D Mauro (12:20.613)
your own health, your own psychological treatment, like all of the things that an individual may deem sacred, right?

Merry Marwig (12:32.85)
Absolutely. Where do you go to church? Who do you love? What do you like to watch? You know, where do you go? Location data, I think, is not well understood of how powerful that can be. Like, even if you take away identifiers, you can figure out very fast who someone is just based on their movements, you know. There are locations. Yeah.

D Mauro (12:35.593)
Right.

D Mauro (12:54.225)
Oh, yeah. Well, and what's scary, you just brought up something that's really scary. So a lot of doing this podcast and being in the field that I'm in, I know a lot of hackers, and I know a lot of good hackers, white hat hackers and black hat hackers. And I'll tell you, it's scary when they challenge each other, like somebody takes a picture, and then they'll put it up and they'll be like, Okay, where am I? And man, it takes

You are at this intersection in the middle of Scotland, and they're able to identify it just from all of those markers; they're able to look at the sun angle to figure out what time of day it was taken. Oh my gosh, it's scary. So sorry, I didn't mean to go down that rabbit hole, but it really shakes me up whenever I see that.

Merry Marwig (13:40.774)
Imagine drinking... Oh no.

Merry Marwig (13:46.322)
Imagine you don't even need a picture. Imagine you just need someone's last five credit card swipes. How many Dunkin' Donuts has Merry Marwig been at in the last two hours? We know. Oh, she's probably within this area. Or we know she goes here every single day. She's probably there at this time.

D Mauro (13:48.651)
Yeah.

D Mauro (13:52.651)
Right.

D Mauro (13:57.456)
Right, exactly.

D Mauro (14:05.865)
Well, and the data tracking, though, in the beginning, I believe, is there for a good reason, isn't it? Because, for example, we use this example when we do our trainings, and that is, you know, let's say we're taking a road trip and we're in Dubuque, Iowa, and we swipe our credit card in a gas pump, right? Our bank will text us and say, is that you? Like, how do they know? Like, we could be at home or at the office

and buy things on Amazon from China all day long and no one's gonna text us. But why, when we do that, will they immediately know? Because they know what our behavior is and they know that that's an anomaly, right? And we want them to do that because if that wasn't us, we wanna be alerted so we could tell them that would be fraud. So I mean, in the beginning or at least a part of this, there's a good, there's a benevolent

Merry Marwig (14:48.866)
Yes.

D Mauro (15:05.033)
purpose to it, isn't there?
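[Editor's note: the "is that you?" text described here boils down to a distance check against learned behavior. A toy sketch of that idea follows; the field names, home-location profile, and threshold are hypothetical, not any bank's actual system.]

```python
# Toy sketch of behavioral anomaly detection on card swipes: flag a
# card-present transaction that is far from the cardholder's usual area.
from math import asin, cos, radians, sin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def is_anomalous(txn, profile, max_km=300):
    if not txn["card_present"]:   # online purchases aren't judged by location
        return False
    km = distance_km(txn["lat"], txn["lon"], profile["home_lat"], profile["home_lon"])
    return km > max_km            # far from home -> send the "is this you?" text

profile = {"home_lat": 39.77, "home_lon": -86.16}            # Indianapolis
swipe = {"lat": 42.50, "lon": -90.66, "card_present": True}  # gas pump, Dubuque, IA
print(is_anomalous(swipe, profile))  # True -> trigger the fraud alert
```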

Merry Marwig (15:07.662)
Yeah, I think about things just like the plutonium example. There's use cases and abuse cases, right? So what is happening? But I think the problem that's not well understood right now is that there are use cases that we're not aware of. So what sort of tracking is happening on you that you just don't know? Maybe you just agree to some terms of service or some privacy policy thing because you were trying to get through your day and just get a service.

D Mauro (15:12.461)
Mm-hmm. Exactly. Yeah.

Merry Marwig (15:36.598)
But part of that may have included, oh, well, we retained the rights to sell and share the fact that you went to that gas station and that you bought a thing of beef jerky. And so now we're going to target you with lots of natural beef jerky products or whatever. And did you agree to that? And that's part of the privacy problem right now is, at least especially in the American context, we lack consent. We lack transparency that any of these data practices are happening. And so that's where we're starting to see state-

based privacy laws, which we have, I think it's like 13, 15 now. We just passed two more this year. So within the last three weeks, New Jersey and New Hampshire passed a state-based privacy law.

D Mauro (16:19.909)
Now, what's the aim of those? Is the aim of those to provide, like, informed consent? Is that really the aim of them? Because I know that Congress was addressing, for example, TikTok, right. And part of the issue there is, hey, we just want to curate our lives, our dance videos online, we don't necessarily even realize that you're recording all of our keystrokes

Merry Marwig (16:46.638)
Great.

D Mauro (16:47.021)
throughout, like everything we type in, every time we type in a password, you have a recording of that, we didn't know that. We just wanna record our dance strokes, like our dance moves, right? Like I think because people weren't informed of what they were giving consent for.

Merry Marwig (17:03.582)
100%. And so one of the things that I, the questions I ask myself is how do these companies make money? Because the service that they're offering us might be free. We don't pay on some of these platforms. How do they monetize this data? Who's ultimately getting that? Something you mentioned about the keystrokes is the dynamics of them. So the dynamics of how you type can identify you as an individual. It's like the same as your

D Mauro (17:10.155)
Right.

D Mauro (17:14.866)
Mm-hmm.

Merry Marwig (17:30.594)
signature, like only you can sign this biometric. How hard you type-

D Mauro (17:34.813)
Right. Well, they can tell whether you're right-handed, left-handed, how you, what words you misspell often, and all of it.

Merry Marwig (17:41.23)
Yes, it's like your gait, on your fingertips. It's itself an identifier, a biometric, but that's something I really challenge people to peel back. How does this company make money, before you sign up for any kind of service or just agree to the terms and services of this? And it's really an issue of lacking transparency. I think a lot of people have expectations of their privacy, and then when they find out there's a company selling their data, sharing their data, or using it for a...
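[Editor's note: the keystroke-dynamics "signature" described here is typically built from two timings per keystroke: dwell time, how long a key is held down, and flight time, the gap before the next key. A toy sketch with a made-up event format:]

```python
# Toy sketch of keystroke dynamics: typing rhythm as a behavioral fingerprint.
def typing_features(events):
    """events: list of (key, press_ms, release_ms) tuples in typing order.

    Returns dwell times (how long each key is held) and flight times
    (gap between releasing one key and pressing the next) -- the raw
    material a profiler would feed into a matching model.
    """
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwells, flights

sample = [("h", 0, 95), ("i", 160, 240), (" ", 310, 370)]
dwells, flights = typing_features(sample)
print(dwells)   # [95, 80, 60] -> how long/hard each key is held
print(flights)  # [65, 70]     -> the rhythm between keys
```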

D Mauro (17:43.361)
Mm-hmm.

Merry Marwig (18:10.302)
another purpose that they didn't expect. That's where this conflict is coming in, where the companies are saying, well, you agreed to this 100-page document that you should have read, whereas it's being marketed as something completely different. So there's a clash in expectations. The consumer privacy laws are really there to give consumers rights, to understand what is going on, so to make better informed

choices, like to your point, informed consent. Yes, I allow you to sell my data to third parties in exchange for this free service. You know, there's a, what's the phrase? There's a phrase called like, it's like pay or data, something like that. Pay or okay. That's what it is. It's you either pay up for, like, a subscription or you okay the data sharing, right? But that's just not really been very clear until now. So that's a step, but then you also have other rights like

D Mauro (19:01.077)
Right.

Merry Marwig (19:07.01)
You have the right to access the data that a company has on you. What kind of categories of data are they collecting? Like for example, if you are a car owner of a smart car, you could say, hey, what types of data do you have on me? And you might be shocked to learn that there's a car company that collects sexual orientation data. Why? What does that have to do with it? I don't understand.

D Mauro (19:32.045)
It's very relevant to the cars that you buy, you know. I'm like, what? Like, how is that even relevant? Yeah, right.

Merry Marwig (19:39.575)
That's my exact question, but for them, it's like covering your bases on the types of data: if you have access to it or collect it, then you can sell it.

D Mauro (19:48.885)
So why do you think that is? Why do you think that they want that data? Is it about how they sell ads, right? And the targeting of ads? Because if you've ever been on the I wanna buy ad side of the table, right? You wanna know, I wanna buy ads. And the information that they offer you is shocking. They're like, well, you can run an ad for this type of person.

and find exactly the type of person that's shopping for this thing. And you're like, how can, how can you run my ad like that? Like, I'm older, so I remember just television, just regular television. Every ad would come on the same for everyone. They weren't able to target that ad for the exact demographic, right? They would kind of do it generally based on the television show that was on.

Merry Marwig (20:21.646)
I'm sorry.

Merry Marwig (20:32.618)
Yes.

D Mauro (20:45.237)
but it wasn't very targeted. Now they're able to pinpoint exactly who sees what at.

Merry Marwig (20:51.598)
Absolutely. It's like divorced moms of young children making $150,000 a year who work in, you know, XYZ industry and are up at 2 a.m. scrolling, doomscrolling. Yes, like that's like a list you can buy. Yeah, I mean, data has value. Like, you wouldn't collect it if you couldn't sell it, you know. That's part of the surveillance economy.

D Mauro (20:55.809)
Yeah.

Right.

D Mauro (21:04.145)
Exactly. Right. Yeah.

D Mauro (21:14.945)
Right, right. So let's talk about that bridge, that overlap, that kind of Venn diagram between data privacy and data security. Because it's really interesting, because all of this, I mean, those that are charged with defending an organization's brand, right. It's all about, you know, what type of

Merry Marwig (21:27.876)
Yeah.

D Mauro (21:41.249)
data, first of all, most organizations have no idea the vast amounts of data they have, or even where it's located, right? And we learned that after a data breach, because after a data breach, we're like, so where is, like, what happened? And then where's your data? And then that's where you're always like, oh, then there's another copy over there. And then there's another copy over there. And you're like, oh my, you guys don't even know where your data is. Right. And that's very common, unfortunately.

Merry Marwig (22:04.238)
Yes. This is so common. And how privacy and security relate, they're like cousins. Sometimes people conflate the two and they're like, oh, security or privacy is just a subsector of security. No, not at all. They're two separate issues. So I consider security to be data protection of all sorts. So company data and then including personal data, customer data. Privacy.

D Mauro (22:07.999)
Yeah.

D Mauro (22:30.429)
Right, the employees as well as the customers.

Merry Marwig (22:33.994)
Yeah, because employees are considered by law data subjects in certain jurisdictions. So California, EU, some of the new US ones don't. They're just consumers. But data privacy, on the other hand, is about data collection and data use. What are you doing with this data? Where does it go? What systems process it? Who do we share it with? How long do we keep it?

D Mauro (22:38.806)
Hmm.

Merry Marwig (23:01.366)
What legal basis do we have to hold this data? If you're in an EU context, there are only six legal ways you can process a person's data. You know, what's happening? What business processes do we use this for? You know, like a good example, something that drove me crazy was when, I think it was a social media platform, that was collecting phone numbers for the purposes of second-factor authentication,

where they're like, oh, give us your cell phone. We're gonna send you SMS codes for this. And then their marketing team got ahold of the same data and started using it for a marketing purpose. And that was not what the people who signed up for the operation signed up for. So, I think that's a good point.

D Mauro (23:30.794)
Hmm.

D Mauro (23:36.502)
Right.

D Mauro (23:43.841)
Unbelievable. No. That gets back to that informed consent thing again, right? Like that's not what we said you can use my cell phone for.

Merry Marwig (23:49.919)
Right.

Merry Marwig (23:54.03)
Correct. Like, imagine if you were like printing pictures. This was one that I came across recently. I uploaded my pictures to a photo printing place and they were like, you agreed to allow us to collect the location data of your photos. I'm like, oh my gosh, where is this going to end up? You know, and like what sort of, you know, data?

D Mauro (23:58.986)
Right.

D Mauro (24:09.525)
Yeah, they're secure. Heh. Right.

D Mauro (24:18.129)
Yeah. You're like, I just want my picture of my dog on a pillow. Like, that's all I'm trying to do. Like, I don't want you to know that I was in Georgia in August. Like, right. And you're giving them all that in order to process the order. And most people, most people never read the Ts and Cs. They just don't. They just don't. They just click and they go, yeah, of course, I agree. I want to get this pillow. And that's it.

Merry Marwig (24:23.424)
Yeah!

Merry Marwig (24:29.019)
Right.

Merry Marwig (24:43.81)
That's one of the greatest lies of the internet, "I agree." People don't. You're starting to see that now with the cookie pop-ups. They're like, click here, agree to being tracked. Most people just either dismiss it or click away, but no, we don't agree. It's just how the surveillance capitalist world has evolved. But getting back to that, we are starting to see companies take...

D Mauro (24:47.369)
Yes.

D Mauro (24:52.855)
Right.

Merry Marwig (25:09.75)
the right steps forward here. So GDPR was a regulatory issue that we're still operationalizing. How do you service people's data rights? So you need, what you said before, it was like, you need to know what data you're processing. You have to be able to understand the risks of processing that data and either do something about it. So like.

that might include securing it properly or having the data retention schedules very clearly outlined and then actually destroying it after that. Or making sure that only certain parts of the business, like the security team, has access to those phone numbers for two-factor authentication purposes and not the marketing team for ads.
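[Editor's note: the two controls named here, retention schedules that actually expire data and purpose-scoped access (the security team gets the 2FA phone numbers, marketing does not), can be sketched as a simple policy check. The field names, teams, and retention period below are invented for illustration.]

```python
# Hedged sketch of "purpose binding" plus retention: data collected for one
# purpose shouldn't be readable by another team, and it should expire.
from datetime import datetime, timedelta, timezone

POLICY = {
    "phone_number": {
        "purpose": "two_factor_auth",
        "teams": {"security"},        # marketing is deliberately absent
        "retention_days": 365,
    },
}

def can_access(field, team):
    rule = POLICY.get(field)
    return rule is not None and team in rule["teams"]

def is_expired(field, collected_at, now=None):
    now = now or datetime.now(timezone.utc)
    return now - collected_at > timedelta(days=POLICY[field]["retention_days"])

print(can_access("phone_number", "security"))   # True  -> the agreed purpose
print(can_access("phone_number", "marketing"))  # False -> not what users consented to
old = datetime.now(timezone.utc) - timedelta(days=400)
print(is_expired("phone_number", old))          # True  -> due for destruction
```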

D Mauro (25:53.201)
Well, and we're seeing that, yeah, we're seeing that in some of the large data breaches. So on the other side, what's happening is certain organizations, their networks aren't configured. This is why there's such a big push for zero trust. Because there's like, okay, social engineering occurs, somebody's targeted for their two-factor authentication, they get them on WhatsApp,

and they get them to approve the multi-factor authentication. And then they get in. But that person's over in this department. But they wind up getting access to the data in the financial end or the code, the source code of the organization. That person didn't even know that they could access it. But the threat actors figured that out. So how is the, some of that is the.

technical configurations that were set up, or the fact that they don't have zero trust, they don't have to reauthenticate at each level as they escalate privileges. But some of it is also that organizations willingly share data, and part of it is they want the data, the marketing department wants that data, right? And so they're going back and forth.

Merry Marwig (27:09.922)
You've just touched on a very good point where I feel like in security, we're dealing with usually external threat actors or, like, malicious insiders, like data misuse. But the problem with privacy is actually, like, the agreed-upon uses of this data. The business is like, oh, we thought this was all copacetic. Like that's not, so it's a threat.

D Mauro (27:19.533)
Mm-hmm.

D Mauro (27:26.373)
Right. Yeah.

D Mauro (27:30.704)
Mm-hmm.

Merry Marwig (27:34.754)
just lack of awareness of what processing is happening, what is happening over here in the company versus somewhere over here or like a product? Yes.

D Mauro (27:39.553)
Well, yeah, so it winds up being a threat to a consumer by agreement, right? It's like internal agreement at an organization to share data winds up harming a consumer.

Merry Marwig (27:51.154)
Exactly. And so that's why there's this concept of privacy by design. I actually was kind of horrified to learn of this term because I thought this was already happening where like when you build a product, you have someone on the team being like, Hey, this is how there could be a privacy harm. You know, someone could do something wrong with this or like not aligned with our values. That's actually a thing.

D Mauro (28:10.849)
Yeah, what are the different types of privacy harms? You talked about, you had an image, we can put it up on the screen. It was like the typology of privacy harms by Citron and Solove.

Merry Marwig (28:18.339)
See you.

Merry Marwig (28:22.706)
Yeah, so those two are fantastic privacy researchers. They've done foundational work in our area. And so they've really listed out the various experiences an end person, so in a consumer context or an employee context, could feel from a privacy harm. So that would be physical harm. So like if your location data gets out there and someone's using it to literally figure out where you are, like that's.

part of my concern as a person who's been stalked, my physical location is very important to me and the privacy of that information. Then there's reputational harm. Say you did something that you're not proud of. I don't know, there was a really interesting case. Many years ago, there was a breach at a dating site called Ashley Madison.

D Mauro (29:13.961)
Yeah, we talk about it often in our security awareness trainings, because that's a site that was based on privacy, right? It was for married couples that wanted to have affairs, and the whole thing got breached. The attackers then extorted them to get payment. They didn't pay, and so they released the names of all these people, so lots of harm came from that.

Merry Marwig (29:18.125)
You know.

Merry Marwig (29:36.194)
Right. Oh my gosh. And if you read some of the user stories there, like just heartbreaking situations. So it's not as black and white as, you know, anyone on their moral high horse would expect. It's like tragic, tragic stories. So yes, reputational harms, emotional harms, you know, just like having to deal with these thwarted expectations, economic harms, relationship harms.

D Mauro (29:54.284)
Oh yeah.

D Mauro (30:07.093)
Well, there was a breach of, I mean, there's so many, especially in the healthcare space, when there's breaches, there was a breach of an online therapy app, and it has caused a lot of damage to people because they have their innermost sacred thoughts and fears and anxieties expressed, and now they're literally for sale online, right? I mean, that's...

Merry Marwig (30:32.842)
Absolutely. So we've seen that where, so and this is what gets very interesting. So a lot of the times the trackers that we are being tracked by online will figure out information about us. Like whether we're depressed or we may have a medical condition or maybe you're pregnant. That's for an advertiser. That's

D Mauro (30:36.055)
That's it.

D Mauro (30:50.375)
Hmm.

D Mauro (30:55.457)
Hmm

Merry Marwig (30:58.302)
very valuable information because people who, for example, may be depressed, may be more willing to buy something. People who are pregnant are shown to have a propensity to switch brands. So it's like a very critical time to possibly capture a new customer. That was the issue with the Sephora case that you might've heard of a few years ago. The California attorney general who at the time

D Mauro (31:27.509)
under the CCPA.

Merry Marwig (31:28.25)
Basically, you know, there was this case with Sephora, where Sephora was selling the shopping cart contents of someone to a third party, and that third party would say, oh look, they're buying prenatal vitamins, maybe we should put them in the possibly-pregnant group, and then sell that information.

D Mauro (31:47.425)
Really? So Sephora is the makeup company, right? Okay. Got it. Got it. So they were selling, like, the discarded or just the shopping cart items that somebody might have put in and then removed for whatever reason, right? Whether they buy it or not.

Merry Marwig (31:50.218)
It is, but they also sell vitamins and they also sell other stuff. Like, it's usually, you know, their target audience is women.

Merry Marwig (32:04.714)
Yes, well, no, what they were talking about. Yeah, and so that was the issue because especially in this era now where in some states, you have fewer and fewer bodily autonomy rights, that could be very dangerous data to people in states like Texas or whatever. So there's harm in that case. So that would be a chilling effect harm, right? Where you might not wanna buy stuff online anymore. The vitamins issue was an example from many years ago that Target

D Mauro (32:20.393)
Right. Yeah.

D Mauro (32:26.345)
Yep. Right.

Merry Marwig (32:33.374)
was also caught up in. There was a woman who was pregnant and didn't tell her family members yet. She lived with her folks, and then all of a sudden their household started getting coupons in the mail for baby stuff. And the person's face was so red.

D Mauro (32:50.901)
because Target had captured some of the data that she had been looking at.

Merry Marwig (32:55.766)
She had bought the prenatal vitamins and then they put her in a marketing data set. And so the household was getting a bunch of coupons for baby stuff. And one of the parents of this person was like, why are we getting all these baby stuff coupons? Right? And so just a simple act of you thinking you're just going to the store, you're gonna buy yourself some vitamins, you haven't told anyone, that's your personal business. You know, you show up in these data sets, right? And that can be very harmful.

D Mauro (33:18.842)
Right.

Merry Marwig (33:22.538)
Imagine, I don't know what this person's story was, but maybe she didn't feel safe telling her family that she was pregnant, right?

D Mauro (33:27.617)
Right. Well, maybe they were gonna kick her out or something, right? Like, I mean, you don't know, like every family's different. Yeah. Holy cow.

Merry Marwig (33:32.214)
This is a real thing. Right. How do customers make sure that they don't do that? So that's kind of where we are with these consumer privacy laws that we're starting to see. It's like giving people back some rights. I can tell a company, don't sell or share my data. I've told companies, delete everything you have on me. You can access it. What do you have on me? Things like that.

D Mauro (33:40.555)
Right.

D Mauro (33:55.585)
Right.

D Mauro (34:00.414)
Right.

Merry Marwig (34:01.222)
I actually do employ a service, so you'll start to see this more. They're called authorized agents, which are allowed by California law. Basically, you can give permission to a company or another person to go and do data deletion or data access requests on your behalf or do not sell and share.

D Mauro (34:20.433)
Oh yeah, almost like your advocate, like your privacy advocate, right?

Merry Marwig (34:23.718)
Yes, there's a few out there. One, Consumer Reports recently came out with a free app. It's not comprehensive, but it's a start. I personally pay for a service that does this. It's not cheap, but it's comprehensive. Just to get my data out of there, especially my location data from my mobile identifying numbers, the advertising numbers. So if you don't turn that off, do.

D Mauro (34:41.386)
Yeah.

D Mauro (34:47.29)
Oh yeah.

Merry Marwig (34:51.402)
But yeah, I mean, it's really about retaining control, making sure that... Oh.

D Mauro (34:55.657)
So hang on back up just a second. So what's that tip that you just gave to our listeners? Like remove the...

Merry Marwig (35:03.742)
the mobile, the MAID number, the mobile advertising identifier. So basically the apps that are on your phone usually have access to that, and you can turn that off. The other tip would be, if you want to reduce your digital footprint, you can either go company by company, which is very laborious and only covers the companies you know about, to delete your data or to restrict the selling or sharing of your data,

D Mauro (35:07.881)
Right.

D Mauro (35:13.178)
Okay, good advice.

D Mauro (35:33.196)
Right.

Merry Marwig (35:33.802)
or you can employ an authorized agent service that will do that for you. Not all of them are very comprehensive. There are gaps right now, but there is a new law in California that's trying to resolve that. It just got passed last year called the California Delete Act. Basically data brokers, so any company that sells or shares information about a consumer with whom they don't have a direct relationship.

D Mauro (35:51.201)
Mm-hmm.

Merry Marwig (36:03.63)
has to register with the state. And then in the next couple of years, they have to figure out a way to do deletion requests. If the person goes to the state and says, hey, I want all the data brokers in the state of California to delete all of the information they have on me, they don't have to go ask each data broker individually. It's a lot.

D Mauro (36:20.821)
That's a big ask on companies, isn't it? That's a huge ask because that's a really big manual task to go and find that user, find their data, and then delete it.

Merry Marwig (36:35.394)
Well, and that's what we do at DataGrail. So we help companies do that. What data are you processing, first and foremost? Because, like, as you said, it's a mess. There's a lot of applications out there that process this. So we typically find that in an organization with like 500 to 1,000 employees, they'll have 1,500 different applications, but IT may only be aware of like 200. There's a lot of, you know, shadow IT.

D Mauro (36:41.993)
Mm-hmm.

D Mauro (36:58.229)
Right.

Right, because of shadow IT, right? Because of all these, there's internet of things devices, people get grants or money or they invest themselves and they tie in different apps and everybody's got an AI app that they wanna try out and all these things get tied into the network.

Merry Marwig (37:19.074)
download an extension that nobody realized was sucking up data. It's a lot. So that's why it cracks me up when I hear about companies trying to do this manually. I'm like, how? How are you doing this manually? This scale is just too big. So we help companies understand what they're processing first and foremost. We also help them understand the risk of specific data, holding the risk themselves as a company and the risks to their consumers.

D Mauro (37:20.885)
Mm-hmm. Yep.

D Mauro (37:32.397)
Right.

Merry Marwig (37:48.05)
And then we also do things like processing these consumer requests. So, you know, if I'm a customer of a makeup company and I want to know what data they collect on me, I have a, I don't have a right in Illinois, but if I was from California or the other states that have these laws, I would have a right to that. And so what do you give them? You know, you have to find that data in the systems, pull it back, put it together, send it out. And it all has to be accounted for and tracked,

D Mauro (38:08.905)
Right.

Merry Marwig (38:18.147)
because this is regulated. It's not just a nice-to-have where you're like, well, we're doing the right thing, sure.
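[Editor's note: "find that data in the systems, pull it back, put it together, send it out," all tracked, is essentially a fan-out over per-system connectors plus an audit trail. A minimal sketch under that assumption; the connector interface is hypothetical, not DataGrail's actual API.]

```python
# Minimal sketch of fulfilling a data subject access request across systems.
import json

class Connector:
    """One per system holding personal data (CRM, email tool, analytics, ...)."""
    def __init__(self, name, records):
        self.name, self.records = name, records
    def find(self, email):
        return [r for r in self.records if r.get("email") == email]

def access_request(email, connectors):
    report, audit = {}, []
    for c in connectors:
        hits = c.find(email)
        if hits:
            report[c.name] = hits
        audit.append({"system": c.name, "matches": len(hits)})  # regulator trail
    return report, audit

crm = Connector("crm", [{"email": "a@example.com", "tier": "gold"}])
ads = Connector("ad_platform", [{"email": "a@example.com", "segment": "new_parent"}])
report, audit = access_request("a@example.com", [crm, ads])
print(json.dumps(report, indent=2))  # what gets assembled and sent to the person
```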

D Mauro (38:20.107)
Right.

D Mauro (38:23.797)
Well, it can't just be emailed as an attachment because that could be compromised too, right? You would have to have like a secure encrypted, you know, encrypted on one end, encrypted on the other, and then sent. Like there's a whole mess of things you have to think about.

Merry Marwig (38:40.674)
Well, so you just actually touched on something that I've talked about. This doesn't really get talked about a lot, but, like, weaponizing access requests. So pretending to be someone else in order to get commercial data about someone. So the FTCs are-

D Mauro (38:47.999)
Right.

Yeah.

D Mauro (38:54.129)
Absolutely. I mean, especially we just saw that large, massive, the mother of all data leaks with 26 billion credentials. So it's just becoming easier and easier to impersonate somebody or to be impersonated online. And so from that point, then they could make data requests and get more of your data.

Merry Marwig (39:02.518)
right.

Merry Marwig (39:11.776)
Excer-

Merry Marwig (39:17.554)
Absolutely. So that is a threat vector as well, especially if you're doing like social engineering, right? So you find you're doing like a whaling attack. You find someone you want to like try to get into their system. Then you go impersonate them. You get a bunch of commercial data about them. You pretend to be the company they do the most business with. Send a few crappy phishing links because it's contextual and appropriate. And like they would recognize it. You know, what have you given up? So that's why you need to verify these requests as well. Um, so you need to have.

D Mauro (39:23.403)
Right.

D Mauro (39:40.79)
Mm-hmm.

Merry Marwig (39:47.346)
We have a product called Smart Verification at DataGrail, which basically provides contextual information. So it's not something you can just easily Google or find out about someone, but something they would only know. It's kind of like the, you know, something-you-know, something-you-are types of authentication methods. But it's something to keep in mind when you're operationalizing this, because it's a hard problem. So there's really no way to do it manually. I just laugh when people tell me that, like, oh, how?

That's privacy theater.
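[Editor's note: knowledge-based verification of this kind can be sketched as a challenge built from account history that an impersonator couldn't Google. This illustrates the general technique only; it is not DataGrail's implementation.]

```python
# Sketch of contextual verification before honoring a privacy request:
# ask something only the real customer would know.
import secrets

def build_challenge(account):
    """Pick a non-public fact from the account's own history."""
    order = secrets.choice(account["orders"])
    options = ["$12.99", "$45.00", "$88.10", order["total"]]  # decoys + truth
    return {"question": f"What was your order total on {order['date']}?",
            "options": sorted(options),
            "answer": order["total"]}

def verify(challenge, response):
    return response == challenge["answer"]

account = {"orders": [{"date": "2024-01-03", "total": "$23.50"}]}
ch = build_challenge(account)
print(verify(ch, "$23.50"))  # True  -> proceed with the access request
print(verify(ch, "$12.99"))  # False -> don't hand data to an impersonator
```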

D Mauro (40:17.185)
So let me tell you, oh yeah, that's unbelievable. So let me ask you this. Why is it that you think that in the US we struggle so much with our data compared to other countries, like the EU? Like why did they, why were they the first ones to have GDPR and we don't have anything like that? Like what?

Why, why, why is that do you think?

Merry Marwig (40:49.162)
Well, it really goes back to World War II and the horrors of the Holocaust. That's kind of where the modern concepts of privacy started. In Germany afterward, there was obviously, as you know, a lot of data. Yeah.

D Mauro (41:07.154)
That's a really good point. Yeah. They were able to track down who had what religious beliefs, who had what medical conditions, etc.

Merry Marwig (41:18.218)
It's horrifying, but that's where this concept really began of how you can weaponize data against people about things that are who they are, their religion, their ethnicity, their race, their family members, their citizenship, basic stuff that you can't, it's highly sensitive information, but the data privacy.

D Mauro (41:19.734)
Yeah.

D Mauro (41:25.537)
Mm-hmm.

D Mauro (41:29.713)
Right. Yeah.

D Mauro (41:35.514)
Mm-hmm.

D Mauro (41:42.422)
Meanwhile, in America, we're curating our lives on TikTok, voluntarily, right? Yeah.

Merry Marwig (41:49.098)
Yeah, well, I mean, I think that's part of it, too. So like we don't.

D Mauro (41:51.669)
Like taking videos with our school and our house in the background and our cars and like driving around in cars with like the images of how many people are in the car and in the family and stuff and you're just giving it all away for free.

Merry Marwig (41:56.526)
Oh my gosh.

Merry Marwig (42:06.566)
Oh, I know those little stick figures on people's minivans. I'm like, that you're hurting.

D Mauro (42:09.394)
Yeah, my family cringes whenever they see that. And they're like, why would we not do that? I'm like, trust me, there's a thousand reasons why you would not do that.

Merry Marwig (42:15.042)
Yeah.

or the bumper stickers of like, you know, my students. So, I don't know. You're like, no.

D Mauro (42:19.985)
Oh, yeah, exactly. I'm a proud. Yeah, my, my kid is a proud student of whatever. Right. And then their kid is taken. And you're like, what happened? I have no idea how that happened. Like, well, it's, it's not that complicated.

Merry Marwig (42:30.058)
Yeah. So I would say it's really just the cultural experiences and the historical context of privacy in our two distinct histories, and that's why you see more of a focus in Europe. I even used to live in Germany. It was a fascinating time. I lived in geographically what used to be East Berlin. So it wasn't just, yeah, it was

D Mauro (42:49.099)
Hmm.

D Mauro (42:55.288)
Oh, interesting.

Merry Marwig (42:57.53)
I, the people I met there, they weren't just dealing with, you know, only the aftermath of World War II, but what happened in, you know, communist Germany after that. And the, there's an incredible film. If anyone who's watching the podcast likes film, I highly recommend this movie called The Lives of Others. It was from like 2005, I don't know, 2004, 2005. Wonderful film explaining the surveillance state and how that can really

D Mauro (43:07.042)
Right.

Merry Marwig (43:26.406)
affect people in their lives and their job opportunities and things like that. You know, in the United States, I feel like we just, we're so individualistic. We don't realize that we're actually part of a capitalistic surveillance society. Now, everything we do, we're generating bits and bobs of data. And I think people have been, it's easy to see the use cases, but harder to see the abuse cases of that.

D Mauro (43:40.449)
Mm-hmm.

Merry Marwig (43:52.07)
One thing I always try to tell my friends who are like, oh, Merry, who cares? I have nothing to hide or whatever, is that data can be used to do lots of things. Like for example, contextual pricing. Maybe you're a rich person and you're like, I have nothing to hide, nothing to worry about. I can just pay my way out of this thing. Like, oh, okay, well, do you want the data about you to be used to charge you a much, much higher price than your neighbor who doesn't have this kind of data available?

D Mauro (44:05.275)
Right.

D Mauro (44:21.901)
Right. Yeah.

Merry Marwig (44:23.05)
Right? Like, I don't know, a t-shirt for your neighbor might cost 50 bucks, but for you because of contextual information about your zip code, your income, which they can get from data brokers and like, you know, what subset in a data set you are, like, you know, man with dog in this zip code, you know, who works in this industry, you know, now that t-shirt's 150 bucks for you. Is that fair? But...

D Mauro (44:35.199)
Right.

D Mauro (44:43.099)
Mm-hmm.

D Mauro (44:47.455)
Right.

Merry Marwig (44:49.442)
Things like that, I think people just don't realize how much is going on in the background because we don't see it. Much like the personalized.

D Mauro (44:56.553)
Well, at the end of the day, they're going to be like, I can afford the $150 t-shirt. I don't really care anyway, right? But it's much bigger than that is the point, right? Because it, and as you get older and you go through more life experiences, there are going to be things you do not want out there, right? It's just part of life or something about family members that you don't want out there, right? It's just private and sacred. And so.

Merry Marwig (45:02.087)
lol

Merry Marwig (45:22.186)
Yeah. Here's a good example. If you're trying to get life insurance, right? If you have an app, let's say you download an app and it's tracking your weight loss journey or whatever you're doing and you fail spectacularly like everyone else who doesn't get on the program or whatever. And this app, which was free, how they made money was by selling your data. So they have your name and how much you weigh and how you didn't do very well on your diet plan.

D Mauro (45:28.17)
Right.

D Mauro (45:34.723)
Mm-hmm.

D Mauro (45:41.823)
Right.

Merry Marwig (45:51.594)
Well, if a life insurance company buys that data and they can say, oh,

D Mauro (45:54.853)
It can be used to decline coverage. It could be used to charge you higher rates, all of that.

Merry Marwig (45:59.11)
Yes. And this is a problem with period tracking apps. So menstruation apps, there are a lot of menstruation apps where you can track your cycle. And people would be like, well, why would a retailer be interested in that? Well, it's not just retailers. Well, as we talked about earlier, women who are pregnant, as I've explained, are in a position to change brands. It's a transitional time.

D Mauro (46:21.845)
Right. Change brands. Well, just the fact that you know that, right? The fact that we now know that women who are pregnant change brands; I guarantee you back in the 60s, people didn't know that. Even big advertisers weren't aware of that phenomenon. But because we're able to track the data and look at trends, we're able to see things like that.

Merry Marwig (46:35.542)
Yeah, right.

Merry Marwig (46:47.49)
Right. Well, and then who else would be interested in that data? So I've read articles where employers will buy that data and use it to determine your absentee rate. So if like in your menstruation app, you say like, oh, I had a rough day, I had to call off work because I didn't feel well, now you're pegged as someone who may be absentee and so they won't extend that job offer to you, right?

D Mauro (47:13.857)
Wow.

Merry Marwig (47:13.898)
All because of a period tracking app. You just wanted to put it in your calendar, but you didn't realize the implications of it.

D Mauro (47:18.101)
Right. But because that data got sold by somebody that sold it to your employer, now all of a sudden you have that mark against you.

Merry Marwig (47:24.334)
Right. So, getting back to your Matthew McConaughey ad from Salesforce that I haven't seen as a...

D Mauro (47:32.573)
Yeah, data is the new gold, apparently.

Merry Marwig (47:35.91)
But they're now advertising on that. Like we're a privacy forward company, right? Because we're at a point now where consumers are getting more and more aware of this and they're rejecting it. And so smart companies are really leaning into privacy and saying, you know, these are things that we will do and things that we won't do. And the things that we are going to do, we're going to be transparent about it and you're going to know about it. So there's going to be no surprises, you know.

D Mauro (47:39.917)
Uh huh. Right.

Merry Marwig (48:01.814)
So don't come at me and say like, we had no idea you were selling or sharing my data for this end, or maybe we just won't do that. We won't use that as a revenue stream for us. So we're starting to see more and more companies figure out that privacy is actually a business differentiator. Like look at Apple. If you had to choose between an Apple phone and a not Apple phone, and privacy was a factor for you, which one would you choose? You'd probably choose the one that's being marketed more as the privacy forward company, right?

D Mauro (48:07.052)
Right.

D Mauro (48:19.021)
Mm-hmm.

D Mauro (48:31.629)
Right.

Merry Marwig (48:31.958)
So there are benefits to investing in this and really meeting the expectations your customers already have.

D Mauro (48:39.753)
Right. And well, no technology is perfect, because even Apple has the new Journal app, I believe, in the latest Apple update. And that automatically tracks certain things, like certain geographies, I believe. I read a report where there's not only some geo tracking, but it also allows kids to journal

and just lock it all down with their own Face ID, which creates a division between children and parents, right? Because there's such a push on parents to not wanna violate their kids' privacy, but to know when they're communicating about things like suicidal ideation or cyberbullying and things like that. And so there are platforms like Bark and things like that

that can assist parents with that. So it's such a challenge to try and manage all of those technologies while balancing privacy as well. It's tough. And then, yeah, what about data sprawl? You mentioned that quite a bit, like there's...

Merry Marwig (49:52.682)
I did not know that.

D Mauro (50:02.433)
What can organizations do about data sprawl? There are so many different variants and usages of the data. What are you seeing in the industry? What are different organizations starting to do?

Merry Marwig (50:18.222)
So data sprawl is a problem. With every customer, we find systems that they were not aware they had in their ecosystem. So shadow IT is something that not only security practitioners need to know about, but also privacy practitioners, because if you're processing data, it's in scope from a privacy regulatory perspective. So you can't just be like, well, we didn't know. Well, you need to know.

D Mauro (50:37.91)
Right.

Merry Marwig (50:46.934)
So that's something the company I work at treats as a differentiator. There are tools out there where you basically fill out a spreadsheet, which is never gonna be complete. So you want something that's automated and that's gonna flag systems as they come online. And we do that a number of ways. Usually we'll hook into your SSO. We can hook into your procurement tools. We can hook into your accounting tools. We can look at how different systems talk to each other to make that map.

So understanding the processing environment is the first step and really like the hardest step, David. That's really the most important part. If you're gonna operationalize privacy, if you're gonna say to somebody like, we're gonna provide you this access request that you asked for, you can't just give them a half-hearted answer. And regulators are not gonna care either if it was hard because there are tools out there to do it. So.
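To make that discovery pattern concrete, here is a minimal sketch of unioning candidate systems from several sources of truth into one inventory. The connector functions and system names are hypothetical placeholders, not DataGrail's actual API.

```python
# Sketch of multi-source system discovery; the connectors below are
# hypothetical stand-ins for real SSO, procurement, and finance hooks.
from dataclasses import dataclass

@dataclass(frozen=True)
class System:
    name: str
    source: str  # where we first learned about it

def discover_from_sso() -> set[str]:
    # e.g. apps assigned in your identity provider
    return {"Salesforce", "Workday", "SomeNewAITool"}

def discover_from_procurement() -> set[str]:
    # e.g. vendors with active contracts
    return {"Salesforce", "Mailchimp"}

def discover_from_accounting() -> set[str]:
    # e.g. recurring SaaS line items in the expense ledger
    return {"Mailchimp", "Workday", "RandomAnalyticsCo"}

def build_inventory() -> list[System]:
    sources = {
        "sso": discover_from_sso(),
        "procurement": discover_from_procurement(),
        "accounting": discover_from_accounting(),
    }
    seen: dict[str, System] = {}
    for source, names in sources.items():
        for name in names:
            seen.setdefault(name, System(name, source))
    return sorted(seen.values(), key=lambda s: s.name)

for system in build_inventory():
    print(f"{system.name:20} first seen via {system.source}")
```

The point of pulling from several sources is that no single one is complete; the union is what approximates the real processing environment.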

D Mauro (51:41.844)
Right.

Merry Marwig (51:44.31)
Getting a hold of your system inventory list is step one. And then figuring out what data is in there and the sensitivity of it. A challenge that a lot of companies have right now is that different states consider different data sensitive or not. So.
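Once the inventory exists, sensitivity tagging runs into exactly the problem Merry names: each state statute draws the "sensitive" line differently. A toy sketch of a per-jurisdiction lookup follows; the category sets are illustrative assumptions only, not a legal mapping, so check the actual statutes.

```python
# Toy per-state sensitivity lookup. These category sets are invented
# for illustration -- consult the statutes before relying on any mapping.
SENSITIVE_BY_STATE = {
    "CA": {"precise_geolocation", "health", "biometric", "ssn"},
    "WA": {"health", "biometric"},  # My Health My Data is health-focused
    "VA": {"precise_geolocation", "health", "biometric"},
}

def is_sensitive(category: str, state: str) -> bool:
    return category in SENSITIVE_BY_STATE.get(state, set())

print(is_sensitive("precise_geolocation", "CA"))  # True
print(is_sensitive("precise_geolocation", "WA"))  # False -- the point:
# the very same field can change legal status across state lines.
```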

D Mauro (52:01.437)
Right. So, first of all, the number of new privacy laws that have come online in the last few years has been monumental, right? The number of US citizens covered under privacy laws has increased exponentially in just the last, what, five, six years.

Merry Marwig (52:23.301)
Yes, yes. So there's this graphic I'd love to show.

D Mauro (52:27.601)
Yeah, we can put it up on the screen now. It shows the percentage of US residents covered by state privacy laws skyrocketing.

Merry Marwig (52:34.738)
Right. And this is part of the challenge that I really want to educate people on, especially in the American market, which is where I spend a lot of my time, and also the EU; we also do global. There are many, many more privacy laws globally, but that would be a whole other podcast.

D Mauro (52:49.977)
Right. Yeah, exactly. And I have to bring in some of my UK and EU friends.

Merry Marwig (52:56.294)
Yeah. So basically in the United States, consumer privacy laws are just, they're popping up like every other day. It started with California, then we got Virginia, Colorado, Utah, Connecticut, then a slew of other ones are coming online this year, next year. By 2026, just under half of the United States population will be covered.

And why it's important to think about this not in a state-by-state way, but rather about individual people, is because they're the ones with the rights, right? This is about privacy, about people, your data. So when I come to a company and say, show me my data, if they give me your data, I'm like, wrong data, where's mine?

D Mauro (53:34.779)
Right.

D Mauro (53:38.451)
Right.

D Mauro (53:43.181)
Right, exactly. Do all of these states have, and I'm not asking for a legal opinion or anything. Disclaimer, none of this is legal advice. Please consult a licensed attorney. But do a lot of these give private causes of action? Like, is there a private right to sue? Or just some states? Okay.

Merry Marwig (53:52.491)
That one.

Merry Marwig (54:03.702)
Yes. Some states, not all. So, Washington State's My Health My Data Act is something I would really strongly encourage you to look at.

D Mauro (54:14.485)
That's a big one because that does give a private right to sue.

Merry Marwig (54:17.302)
It has a private right of action, which means you could be sued by a lot of people. So not just the attorney general can come after you, but everyday regular customers can as well. And the reason why I highlight that one is because

D Mauro (54:22.538)
Right.

D Mauro (54:29.977)
Insert class action attorneys here, right? Because that's where the danger is, right? Now all of a sudden privacy matters, because when you have to pay out on a class action, it's not like a slip-and-fall accident, right?

Merry Marwig (54:33.198)
Yeah.

Merry Marwig (54:46.03)
Correct. And well, the thing about that law, too, is they define health data so broadly that it could really implicate almost any data. So the intent...

D Mauro (54:56.377)
Oh really? That's much broader than HIPAA or the HHS rules, right.

Merry Marwig (55:00.186)
HIPAA, oh my gosh. This is the problem with the US market and privacy: there's sectoral law, there are federal agencies doing stuff, and there are these states. HIPAA, I'm not even gonna touch that, it's its own beast; I'm staying on consumer privacy. But the issue with the health data is, again, the vitamins, right?

D Mauro (55:08.953)
Mm-hmm.

Right.

D Mauro (55:17.779)
Yeah, it is.

Merry Marwig (55:27.886)
We're getting back to these prenatal vitamins, or you're looking for something and it's implicating you as having some sort of disease or whatever. And it might just be, I don't know, you bought orthotics for your feet. Whatever random thing happens, you're just buying your foot inserts, and it's, oh, do you have diabetes? Do we put you on the diabetes list or whatever? So basically, kind of innocuous data

D Mauro (55:28.161)
Right.

D Mauro (55:35.882)
Right.

D Mauro (55:42.865)
Right.

D Mauro (55:49.705)
Right.

Merry Marwig (55:56.11)
today might actually be considered consumer health data under that law. So I would really encourage the folks listening to this podcast to look into that. What sort of data are you processing at your company, and is it in scope of that law? That's coming online more fully in March. They've implemented...

D Mauro (56:11.989)
Hmm

D Mauro (56:15.421)
And is that only going to apply to people who live in Washington, or to anybody whose data they gather there, which could be any of us?

Merry Marwig (56:25.274)
So that's also interesting. So it's for Washington state residents, but also includes anyone's data that's processed or collected in that state. So if your data's collected there, well, think about what type of tech companies are there.

D Mauro (56:38.729)
Right, so that could be anybody. Yeah.

D Mauro (56:44.533)
Right. A whole mess of them. Right. Yeah.

Merry Marwig (56:46.762)
Right, lots of them. So that's one to keep in mind. California's law is very robust as well. Something of note there is they have a regulatory agency, the only dedicated one, with a group of people going after these privacy issues. And so they are on top of it. That's called the California Privacy Protection Agency. They're well-funded. They have a big team. They are...

getting ready to start doing some more enforcement there. Up until...

D Mauro (57:18.833)
I remember when it first came out, we were doing these security awareness trainings and we were explaining, I'm like, you don't understand how this is gonna change things. You see the billboards of the trial attorneys now talking about if you're injured in a truck accident; you're gonna start seeing them saying, if your data's been exposed or if you've been part of a data breach. And everybody's like, yeah, that's never happening, don't worry about it. And now it's everywhere. And I'm like, I'm telling you, this is where it's going.

It's the new tort practice, basically.

Merry Marwig (57:51.466)
Yeah, if you're interested in that, I highly recommend you check out BIPA in Illinois, the Biometric Information Privacy Act, which also has a private right of action. And the cases are wild. Some of those deal with basically clock-in, clock-out technology that scans your fingerprint. And every single time you swipe your finger, that's like a $5,000 fine per person. Every time. So $10,000 a day if they do a swipe in, swipe out. It's wild.
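The arithmetic is what makes those cases wild. A back-of-the-envelope sketch using the per-scan figure Merry cites; BIPA damages are assessed per violation and heavily contested in court, so treat the $5,000 as her illustrative number, not settled law.

```python
# Back-of-the-envelope BIPA exposure, using the figures cited above.
employees = 200
scans_per_day = 2          # swipe in, swipe out
workdays = 250             # roughly one working year
damage_per_scan = 5_000    # dollars -- the speaker's example figure

exposure = employees * scans_per_day * workdays * damage_per_scan
print(f"${exposure:,}")    # $500,000,000 -- half a billion dollars
```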

D Mauro (58:13.845)
Yeah, right.

D Mauro (58:20.036)
Unbelievable.

Merry Marwig (58:22.202)
Yeah, privacy is definitely something I'd put on your radar for 2024 if it's not already there, especially because I think we're going to start seeing a lot more enforcement from these regulators. I really hope that companies realize that their customers demand this and it's really a customer play, because nobody wants to be surprised. Nobody likes to find out the hard way that the data they provided a company was used in an unintended way.

So maybe get out ahead of it in that way; but if you can't, use these pending regulatory actions as the fuel to light your fire.

D Mauro (59:02.825)
Yeah, absolutely. And it's really about risk, right? I mean, this is about business risk. It's about calculated risk. And it's very similar to the conversation we have when we're talking cybersecurity, right? Because business owners and leaders in organizations, even if it's not a private business, but it's a government entity, they want to manage their risk in a reasonable way, right? And they want to do it in a cost-effective way.

And oftentimes they're challenged, because they haven't had to expend budget in the past for some of these things. And now all of a sudden there's some new regulation, now we have to do it, we just want to check a box, we just want to do the minimum amount. But really, it's about balancing risk, right? Because if you don't take certain precautionary measures,

it's going to happen, and then you're going to pay exponentially more, right? And the likelihood of that happening continues to grow every month that goes by. And the privacy risk is very similar to the cybersecurity risk. You have a visual about how cybersecurity and privacy posture relate, and we'll put that up; walk us through that. There's, um,

Merry Marwig (01:00:20.998)
Yes.

D Mauro (01:00:26.501)
a high risk impact, and then it goes down to medium risk and then lower risk. How does that maturity work? Because in the cybersecurity space, we always like talking about maturity scales, right? Because organizations sometimes have concerns and literally just don't know what to do, right? They don't know, am I at high risk right now? We're doing X and Y. I hear I'm supposed to do 50 things. Okay, well, we can't afford 50 things,

but we're doing two. Which ones do we do next? What do we do to move the needle the most in the most cost-effective way? So walk us through that visual, if you don't mind.

Merry Marwig (01:01:03.958)
Right.

Merry Marwig (01:01:08.07)
So I hope the listeners on this podcast are going to be familiar with the NIST cybersecurity framework that they've put together very, very well.

D Mauro (01:01:15.165)
Yes, most are. We live in it every day, mostly.

Merry Marwig (01:01:19.342)
There's one for privacy. It's the same. They overlap. So this makes it really easy. And so this graphic that I made, I kind of mashed the two together to show how they relate. And it's the same. It's like, OK, you're just getting started out. You really just don't understand what your risk is. And it's very reactive. Maybe you get your first consumer request, and they're like, tell me all about the data you have. And you're like, whoa, I've never done one of these before. What do I do? And now you're trying to figure that out. So then the next step.

D Mauro (01:01:23.233)
Phenomenal.

D Mauro (01:01:42.401)
Hmm?

Merry Marwig (01:01:48.65)
The next tier, same as the CSF, is risk informed. You know what is happening, but you just haven't been able to address the risk. We find that the biggest problem is automating this, dealing with privacy risk from a technical standpoint. Like, I am of the mind...
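For reference, the NIST CSF implementation tiers being walked through here are Partial, Risk Informed, Repeatable, and Adaptive. A small sketch of the cyber-to-privacy mash-up Merry describes; the privacy readings paraphrase her talk track, not official NIST text.

```python
# CSF implementation tiers with a privacy-program reading of each.
# Right-hand descriptions paraphrase the discussion, not NIST itself.
TIERS = [
    ("1. Partial",       "Reactive: first access request arrives and nobody knows what to do"),
    ("2. Risk Informed", "You know the risk exists but haven't operationalized a response"),
    ("3. Repeatable",    "Tools and staff in place; requests follow a defined process"),
    ("4. Adaptive",      "Automated discovery and fulfillment that keeps pace with change"),
]

for tier, privacy_reading in TIERS:
    print(f"{tier:18} {privacy_reading}")
```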

D Mauro (01:02:05.933)
Right. So they might have piecemeal solutions in place for some departments, but not all, etc.

Merry Marwig (01:02:12.99)
Yeah. Well, and you'll see privacy 1.0 tech solutions were really about spreadsheets: oh, just put everything that you're processing in a spreadsheet. That's not how modern businesses work. If you ran your cybersecurity program on spreadsheets... yeah, it's not dynamic. The landscape changes all the time; same thing with privacy. Every time you get a new system,

D Mauro (01:02:25.473)
Right.

D Mauro (01:02:30.743)
You got bigger issues if you're doing that.

Merry Marwig (01:02:39.81)
that needs to be pulled into your privacy visibility. So you want an automated way to detect that, bring it in, and deal with the risks. We'll see this a lot at DataGrail, where some of our customers are like, oh, somebody just onboarded some new AI tool, and we didn't know, they didn't tell us, but our system found it, our shadow IT detection. Oh, great. Well, so now you can go to that person and be like,

D Mauro (01:02:57.218)
Hmm.

D Mauro (01:03:01.985)
Mm-hmm.

Merry Marwig (01:03:06.75)
yes: no, you're not going to do this. Or like, hey, I found out that you're sharing sensitive customer data with this unsanctioned third-party SaaS tool. Let's not do that. So it's really about getting visibility into your privacy risk. And it's the same as for cybersecurity, but at a data level. So then it's repeatable: you get the tools and the people on staff to deal with this risk. We definitely think that the

D Mauro (01:03:17.292)
Right.

D Mauro (01:03:24.97)
Right.

Merry Marwig (01:03:34.678)
build-it-yourself way is not the way to go, because it's such a dynamic, changing regulatory environment that having purpose-built privacy tech in place is important. Some people are like, oh, just use a security tool for that. No, there are privacy operations on top of that. So it's not just detection...
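The unsanctioned-tool alerting described above reduces, at its simplest, to a diff between what discovery finds and what's approved. A minimal sketch with made-up system names:

```python
# Diff discovered systems against a sanctioned registry, flag strays.
sanctioned = {"Salesforce", "Workday", "Mailchimp"}
discovered = {"Salesforce", "Workday", "Mailchimp", "SomeNewAITool"}

for system in sorted(discovered - sanctioned):
    # In a real pipeline this would open a ticket for the system owner.
    print(f"ALERT: unsanctioned system in use: {system}")
```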

D Mauro (01:03:56.361)
Yeah, there really are, because security tools are more about detection or remediation, reducing the risk of the impact, whether it's ransomware or whether it's exfiltration. But when security incidents happen, and they kind of have their arc, the privacy damage starts to explode, right? So really, in a

Merry Marwig (01:04:00.791)
Yes.

Merry Marwig (01:04:19.89)
Absolutely.

D Mauro (01:04:25.133)
comprehensive way, I would almost think as part of the incident response plan, or as an organization is talking about building their plan for what happens in a data breach, right? Well, the biggest part of that is: how are we going to notify our customers? How are we going to engage law enforcement? How are we going to deal with the media and the long-term play? Because the damage does not come from, oh, we're going to have to pay for monitoring for a couple million people.

Merry Marwig (01:04:45.102)
But absolutely.

D Mauro (01:04:55.157)
People aren't happy with that anyway; it's useless. And all of us have like 30 of them because we've all been in a million breaches, right? It's not useful. It's more about: what are you doing with our data, and how are you responding to it? How did you prepare?

Merry Marwig (01:05:10.782)
Absolutely. You're touching on a point that, for people who do cyber incident response, the privacy person at your organization must be a part of that, because when a breach happens and it implicates your customers' personal data, these people now have legal privacy rights. That is new. So you have to respond to them within a certain timeframe. If it's Europe, it's 30 days. If it's the US, mostly 45 days.

D Mauro (01:05:32.031)
Right.

Merry Marwig (01:05:39.734)
That's not very long when you're dealing with an incident, right? 45 days comes and goes. And then, how are you going to do that at scale? So, you know, let's say you have a million customers and, you know...
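Using the windows cited here (roughly 30 days in the EU, 45 days under most US state laws, both extendable in practice, so confirm against the actual statute), deadline tracking is simple date arithmetic:

```python
# Due-date tracking for privacy rights requests, using the rough
# 30-day (EU) / 45-day (US states) windows mentioned above.
from datetime import date, timedelta

RESPONSE_WINDOW = {"EU": timedelta(days=30), "US": timedelta(days=45)}

def response_due(received: date, jurisdiction: str) -> date:
    return received + RESPONSE_WINDOW[jurisdiction]

print(response_due(date(2024, 3, 1), "EU"))  # 2024-03-31
print(response_due(date(2024, 3, 1), "US"))  # 2024-04-15
```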

D Mauro (01:05:40.01)
Right.

D Mauro (01:05:43.661)
Correct.

D Mauro (01:05:48.061)
Right. Well, what about the requests that come in after that? Because I didn't even take that into consideration. After the breach, yes, there's you needing to proactively notify. But how many requests are going to come in after that, right?

Merry Marwig (01:05:53.686)
That's what I'm saying. So you get 10.

Merry Marwig (01:05:59.806)
Yes.

Merry Marwig (01:06:09.362)
Right. And so that's why I laugh when people say they do this stuff manually. I'm like, how? How are you going to respond? And a regulator is not going to be like, oh, I know you never invested in this and didn't take it seriously, so it's fine.

D Mauro (01:06:13.985)
How? Right.

D Mauro (01:06:22.569)
Yeah, "it's fine, it's fine." That's like a doctor who doesn't wash his hands and just operates straight from the barbecue, right, and kills somebody. And they're like, dude, we know you didn't want to buy the expensive tools, and you didn't want to bother doing that. It's fine, don't you worry, right? Like, no. These rules are in place for a reason, and they're going to enforce them.

Merry Marwig (01:06:35.424)
Thank you.

Merry Marwig (01:06:48.934)
That's what I'm saying. The world has changed around us. The world has changed totally. You know, I use this analogy; maybe it's not very good. Where I grew up, I used to live in kind of a semi-rural area, and we had country roads, a two-lane road with literally nothing but farms, so you could see. So growing up, speeding was not a problem, right? Who cares?

D Mauro (01:06:51.369)
All due respect to doctors out there. Please don't be mad at me. Yeah.

D Mauro (01:07:05.15)
Hmm?

Yeah, they still have those around Chicago suburbs. I grew up in them too.

D Mauro (01:07:15.357)
No. You're not going to hit anything. There's nothing out there. Right.

Merry Marwig (01:07:18.414)
Right? A cop might pull you over, but not really. So you'd speed. But then what was really interesting is the area where I grew up started to develop. We started getting little malls and a school and a movie theater and all these shops. And that same country road was now a four-lane highway with cross streets, right? People coming in and out, stoplights. So if I was speeding on that...

D Mauro (01:07:22.081)
Right.

D Mauro (01:07:27.894)
Yeah.

D Mauro (01:07:41.342)
Right.

You can't be blowing down that road anymore, right?

Merry Marwig (01:07:47.166)
Because the risk to me has changed. I could get t-boned, I could blow a red light and hurt someone or kill myself, you know? And that's what I'm trying to get at here: you can continue speeding, I guess, if you want to, but the world around you has changed rapidly. And I don't think we have really grasped, in security and privacy, how fast and how much this has changed.

D Mauro (01:07:49.789)
Right. Yes.

D Mauro (01:08:10.736)
No.

No, no. And listeners of the podcast know that we come across this; my team works here in the Midwest. We come across it all the time with, say, a hypothetical healthcare organization in rural Kansas, right? And they're like, why do we need this? We're over here, no one's gonna get to us. And we try to explain:

But you're online. When you get online, you enter their world, right? And it is very complex. And so we're not in Kansas anymore.

Merry Marwig (01:08:54.006)
We are not in Kansas anymore. I feel like whatever data you're collecting, if it's valuable to your company, your privacy program and your security program have to match that value, right? So if it's location data, sensitive personal information, you've got to have a security and a privacy program that's going to be commensurate for that. So like you're right, we're not in Kansas anymore.

D Mauro (01:09:00.395)
Mm-hmm.

Merry Marwig (01:09:19.198)
We have to do better, and we have the tools in place to do that. We have the regulatory pushes to do this. Consumer expectations are changing. I see this a lot, too, when we work with B2B companies: they know they need to get a defensible privacy program in place because it's slowing down their deals. So there's sales velocity at play too, where you're like, ah, we didn't invest in this, and now when I'm trying to onboard a new B2B customer, they're asking me questions about my

D Mauro (01:09:40.586)
Right.

Merry Marwig (01:09:48.526)
privacy program and I have nothing to respond to them about. So that's another thing too.

D Mauro (01:09:53.517)
Well, it's so true and that is the same in cybersecurity because so often now even smaller organizations who are vendors or subcontractors of a larger organization, they're being asked now by that larger organization, what is your cybersecurity posture? What layers of security do you have? And now they're getting asked about privacy, right? And they're like, I don't even have my cybersecurity layers intact, let alone my privacy. And they're like, well, who's your privacy person?

Merry Marwig (01:10:09.128)
Right.

D Mauro (01:10:23.029)
Who's our what? They're like, it's our CEO, I guess. I don't know. Like talk to HR. Like nobody knows, right? It's another hat that somebody has to wear.

Merry Marwig (01:10:30.926)
Well, and there's the hot

Merry Marwig (01:10:34.926)
potato of privacy, because if you don't have a purpose-built job for that, then sometimes security gets it. Sometimes legal does. Sometimes it's kind of all over the place. And that's what's so funny. We deal with so many different types of people from different professional backgrounds dealing with privacy, because organizations are still figuring out where this fits in the organization. So, for the listeners on this podcast, we do see security practitioners get kind of handed this

D Mauro (01:10:43.626)
Right.

D Mauro (01:10:57.143)
Mm-hmm.

Merry Marwig (01:11:04.386)
thing called privacy and like how do you operationalize that? You know, it's a team sport. You're gonna need your GRC person. You're gonna need your risk compliance people. You're gonna need your legal team. You're gonna need your execs. You're gonna need your board. You're gonna need your investors all on the same page here. And the easiest way to make that connection to the work that you're doing today is to understand when a data breach of personal information becomes a privacy harm.

D Mauro (01:11:12.875)
Right.

D Mauro (01:11:22.466)
Yep.

Merry Marwig (01:11:32.086)
So it's not that you just send the letters, the "oh, you're part of a data breach," and you're done. That's just the beginning, you know? I always laugh when people buy new software and the salespeople are like, okay, we're done. I'm like, no, that's the start. Same thing in privacy.

D Mauro (01:11:39.602)
Right. Yep.

D Mauro (01:11:47.185)
No, that's when the work begins. Yeah, that's exactly right. Well, I think this has been great, by the way. The conversation has been absolutely fantastic. And we'll address the questions; we'll gather those up. What's most shocking to me is how much our data is sold and then resold

without us being aware of it, right? You have one last visual that talks about a study that was done, about how our data is touched by thousands of different companies when we just click "I accept the T's and C's" for one app, right? Like, how is that...

Merry Marwig (01:12:39.907)
Right.

D Mauro (01:12:45.549)
How did that get by? How is nobody manning the store at that point? I'm still boggled by that. Part of this whole problem that businesses or organizations are having, having to deal with all this additional regulation now, is because the data brokers have kind of created this monster, in a sense, right?

Merry Marwig (01:13:13.062)
Yes, so the graphic is from a study that Consumer Reports did. They have a number of very interesting privacy studies, by the way. I'd also highly recommend your listeners check out the privacy in automobiles report they've done. It is fascinating, the type of data that car companies are collecting. I would argue that a big part of...

D Mauro (01:13:19.894)
Yeah.

D Mauro (01:13:33.971)
Really?

D Mauro (01:13:39.649)
owning an EV myself, I find I'm a little disturbed now. Now I'm going to have to go check that out.

Merry Marwig (01:13:45.31)
Well, what's interesting is car companies have really become data brokers. It's a revenue source for them. So it's not just selling you an IoT device called a car; now, what data are they collecting and selling about you? This is where monetization of data has just become so normalized. It's part of the surveillance capitalism society that we live in. But the problem is that consumers haven't really been privy to that.

D Mauro (01:13:50.285)
Sure. Right.

D Mauro (01:13:55.538)
Mm-hmm.

D Mauro (01:13:59.604)
Sure.

Merry Marwig (01:14:12.602)
So this graphic you're showing is basically showing that 2,230 different companies touched one individual person's data from Facebook, which, as we all know, has a lot of different types of tracking in use and things like that. But that's a lot.

D Mauro (01:14:29.033)
Yeah, they're really good. They're a fantastic company when it comes to privacy, aren't they? Sorry. No offense. We're not sponsored by them. It's okay. And if we're live streaming to them at some point, I'm sure this video will be killed. But that's all right. Not worried.

Merry Marwig (01:14:33.87)
I'm gonna go to bed.

Merry Marwig (01:14:38.911)
You know, and that's so interesting.

Merry Marwig (01:14:45.09)
make it impossible. But what's interesting is a lot of people and users of Facebook don't even know that, right? They just want to share pictures and...

D Mauro (01:14:52.533)
No, they have no idea. They're still responding to the quiz about who was your favorite teacher back in high school, where did your parents meet, what is your favorite food? Because that's a quiz on Facebook, not knowing that it's really a social engineering tactic, because those are the same security questions used to recover your passwords. Yeah.

Merry Marwig (01:15:15.262)
100%. This has nothing to do with privacy, really, but I'm a huge proponent of maybe moving on from passwords and doing other types of authentication methods. Passwordless would be great. Or multi-factor, maybe, to start.

D Mauro (01:15:27.041)
Oh yeah.

D Mauro (01:15:30.377)
Yeah, or biometrics, some type of key. I mean, a lot of people in the security industry have passkeys or hardware keys, you know, like a physical thing that has to go in. Oh yeah.

Merry Marwig (01:15:42.826)
Yeah, we need to move past knowledge-based authentication factors, passwords. You know, like I think you shared with me this morning, 26 billion records were consolidated and are now for sale on the illicit web right now.

D Mauro (01:15:59.393)
Here's the big issue: not only are we terrible as a culture at creating good passwords, but once we get a good one, right? We all have a really good one. But then once we get that good one, we're using it on everything. We're like, I've got my good password, I'm putting it on Facebook, I'm putting it in my banking app, I'm using it at work, blah, blah.

But what we don't know is that it gets sold and mixed, and it's for sale right now. And they do credential stuffing. They just go and run this password-spraying software, and they will take that password, or algorithmic changes to that really good one that you had, right? And they're going to get into anything you have.

In our trainings, we walk people through what really happens when criminals do become you online. They literally can make people homeless. They are you. When they can compromise one account, they can compromise almost all of them, and then they're able to just take over your life digitally. And what have you done recently that didn't involve a computer or a phone or an electronic credit card payment?

Try to identify many activities that you've done that didn't. When they're able to take over your digital life, they're controlling all of that, right? So there's not much else left that's not part of that life. And that's where the risk comes in.
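One practical defense against the credential stuffing described here is checking whether a password already appears in public breach corpora. Below is a minimal sketch against the Have I Been Pwned "Pwned Passwords" range endpoint; its k-anonymity design means only the first five characters of the SHA-1 hash ever leave your machine. Treat it as a sketch: no retries or timeout handling.

```python
import hashlib
import urllib.request

def times_pwned(password: str) -> int:
    """Return how often a password appears in known breach dumps,
    via the Pwned Passwords k-anonymity range API: only the first
    five hex chars of the SHA-1 hash are sent over the wire."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    with urllib.request.urlopen(
        f"https://api.pwnedpasswords.com/range/{prefix}"
    ) as resp:
        body = resp.read().decode()
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# A reused "really good" password is still burned once it's in a dump.
print(times_pwned("Summer2024!"))  # nonzero means: never reuse it
```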

Merry Marwig (01:17:32.117)
Absolutely.

Merry Marwig (01:17:38.024)
Absolutely.

I completely agree, David. It's funny, because I know better, and I do have decent password hygiene, I would say, and I use multi-factor authentication where available, and you know...

D Mauro (01:17:52.429)
Yeah, I make no comment. I make no comment, because we're all guilty of it, right? Like, I would love to say I don't have one really good password that I've reused. Of course I have. Of course we all have. But there are some things that we do now. I've gotten a lot better, especially since starting this podcast a few years back. But it's just human nature.

Merry Marwig (01:18:02.78)
Oh, absolutely!

Merry Marwig (01:18:07.874)
I know!

Merry Marwig (01:18:20.694)
I know, but you're just trying to get through your day. You know, and it's the same thing when you click "yes, I agree" to data practices even though you don't really agree. But I was gonna...

D Mauro (01:18:25.1)
Yeah.

D Mauro (01:18:30.057)
Right. Yeah, I just want to download the white paper. Oh, there's some cool study, or there's something I could use, I just want to do that. So giving your information, I'm like, sure, sure, whatever. Here, just give me the download, right? Or let me download the app. It's crazy.

Merry Marwig (01:18:44.187)
Yeah, it really is. So I hired a firm to do open-source intel on me and see what they could find. And I was horrified. I was like, oh man, I thought I was doing well. And they were able to find stuff, including passwords, like my go-to crappy password for when I'm trying to download something silly. Which, you know, anyway.

D Mauro (01:18:54.15)
Oh, did you really?

D Mauro (01:19:07.974)
Right.

Merry Marwig (01:19:11.862)
And I was like, oh man. But that was part of me using an authorized agent for my privacy rights requests, because there are just too many. I don't have a full-time job to sit around and delete myself from all these companies, so I hired someone else to do it.

D Mauro (01:19:19.285)
Wow.

D Mauro (01:19:25.737)
I know. So there are companies that will do like an OSINT investigation on you.

Merry Marwig (01:19:32.374)
Well, there are kind of layers of authorized agent companies. There are ones that are more automated, and they'll just fire off emails to companies. So check out the Consumer Reports one; it's called Permission Slip. It's an app you can download. Read the privacy policy, though, before you do that. And basically they'll fire off emails to big brands saying, hey, we were authorized by so-and-so to delete their data; please go ahead and do that.

D Mauro (01:19:49.386)
Mm-hmm.

Merry Marwig (01:19:59.746)
But then there are companies that do this as a service. They do much more in-depth work on your behalf. It's much more comprehensive. There are about 50 of those types of companies out there. And if you want something that can...

D Mauro (01:20:13.217)
So that's Permission Slip by Consumer Reports. Cool.

Merry Marwig (01:20:15.854)
Consumer Reports, yes, but there are ones you can actually pay for. So that's a freebie; you get what you pay for. But there are companies you can hire to do this.

D Mauro (01:20:22.513)
Right, of course. Yeah, but if people are searching for that, they'll come across the more extensive ones.

Merry Marwig (01:20:30.73)
Right. Well, so here's how this ties back into privacy, though. If you're one of those companies... for example, when I did Permission Slip, they had companies in there like Home Depot and AMC Theatres. So now Home Depot is getting all of these privacy requests from this app called Permission Slip on behalf of the millions of people who've downloaded it. And imagine, almost everybody shops at Home Depot, right? So now all of a sudden you get this flood

D Mauro (01:20:52.133)
Oh my gosh.

D Mauro (01:20:57.738)
Hmm?

Merry Marwig (01:20:59.822)
of consumer privacy requests. How are you as a company going to operationalize that? I don't know; I haven't talked to them. I should. But yeah, stuff like that, where the scale of what's happening in privacy is also...

D Mauro (01:21:06.573)
How did they? Do we know?

D Mauro (01:21:11.251)
Yeah.

D Mauro (01:21:15.893)
Holy cow, yeah. I didn't even think about that. That's a great point. Here's an app that's there to help people, and now we've just caused a company to employ five people, because they have to start dealing with this, or to engage managed services, outsourced contracted services, or both, right?

Merry Marwig (01:21:33.762)
Right, right. Well, and that's why I'm really strongly suggesting people look into automation capabilities to meet that demand. So we have a customer, a big, big enterprise, that had 300 different people operationalizing one data deletion request. That's because they would go to the system owners and be like, delete so-and-so out of this CRM, delete so-and-so out of the marketing tool,

D Mauro (01:21:42.106)
Oh yeah.

D Mauro (01:21:58.665)
What? Oh my gosh.

Merry Marwig (01:22:01.554)
delete so-and-so out of the payroll system, you know? Now you're bothering system owners who are like, I have my own job to do; I can't just delete for you all day. So that's why, if you get automation, you can automate those deletions, system-based deletion or access requests. So from a process that involved 300 people, we got down to one person, the privacy manager who oversees the software. So that's a lot of

D Mauro (01:22:10.278)
Right.

D Mauro (01:22:17.421)
Mm-hmm.

D Mauro (01:22:30.017)
Very cool.

Merry Marwig (01:22:31.114)
employee time saved, and also less distraction, because a lot of engineers are asked to, you know, make a privacy program from scratch, and that's not their core competency.
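A sketch of the fan-out Merry describes: one deletion request dispatched to per-system connectors instead of 300 humans. The connector class and its delete call are hypothetical stand-ins, not any vendor's real API.

```python
# One deletion request fanned out to per-system connectors; the
# Connector class here is a hypothetical stand-in for real system APIs.
class Connector:
    def __init__(self, system: str):
        self.system = system

    def delete_subject(self, subject_id: str) -> bool:
        # A real connector would call the system's deletion API here.
        print(f"[{self.system}] deleted records for {subject_id}")
        return True

CONNECTORS = [Connector(s) for s in ("CRM", "Marketing", "Payroll")]

def fulfill_deletion(subject_id: str) -> dict[str, bool]:
    # Track per-system status so one privacy manager can audit the run.
    return {c.system: c.delete_subject(subject_id) for c in CONNECTORS}

status = fulfill_deletion("customer-12345")
assert all(status.values()), f"incomplete deletion: {status}"
```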

D Mauro (01:22:33.499)
Yup.

D Mauro (01:22:42.729)
Well, they're asked, and that's the whole thing. A lot of IT engineers, network engineers who have backgrounds in networking, because it's technology, business leaders are like, okay, you're handling cybersecurity. And they're like, sure, I got it, don't you worry. But the point is that they're not cybersecurity experts. Cybersecurity is a whole different category. It's a whole different, you know,

Merry Marwig (01:22:59.787)
Yeah.

D Mauro (01:23:13.073)
ecosystem. Everything about it: the certifications, the experience, the training, everything, right? There are so many different layers. So...

Merry Marwig (01:23:21.746)
And here's something else to consider: if you do, which I'm hoping people do, get a system to deal with your privacy operations, one of the things I'd really recommend people consider when they're buying a tool is how invasive the privacy tool is itself. How much access are you giving this third party to go do these deletion requests or understand your data map? Because having a least...

D Mauro (01:23:46.629)
Oh my god, I didn't even think about that because... Well, and threat actors are going to target those organizations that are doing that.

Merry Marwig (01:23:49.502)
Right. So now, I'm asking you to describe that.

Merry Marwig (01:23:56.374)
Well, that's why you need a tool that has least-privilege access to solve the privacy problem. We see a lot of companies that are like, we can be everything to all people, we can do all of this stuff, just give us access. And I'm like, whoa, sirens blazing. Let's not do that. If you're trying to operationalize a consumer's data privacy request to delete their data, you don't need to give a privacy provider all of this access just to do that.

D Mauro (01:23:59.209)
Right. Yep. Exactly.

D Mauro (01:24:09.811)
Mm-hmm.

D Mauro (01:24:13.47)
Right.

Merry Marwig (01:24:26.198)
Go do the privacy operation; don't try to do all of this extra stuff. There are companies in the ecosystem that do things like network sniffing to find data as it's moving, and you're like, whoa, could you do that another way? So that's something I would really encourage your listeners to delve deep into: how much access are they willing to give in order to do the privacy operations? So...
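Merry's least-privilege point translates to scoping whatever credential the privacy tool gets. A sketch contrasting a narrow grant with the "just give us access" anti-pattern; the scope strings are invented for illustration, since real systems name their permissions differently.

```python
# Illustrative scope strings only -- real systems name permissions
# differently. The point: grant delete-by-subject, not read-everything.
REQUESTED_BY_GOOD_TOOL = {"crm:subject.delete", "marketing:subject.delete"}
REQUESTED_BY_GREEDY_TOOL = {"crm:*", "marketing:*", "network:capture"}

ALLOWED = {"crm:subject.delete", "marketing:subject.delete"}

def excess_scopes(requested: set[str]) -> set[str]:
    """Return the scopes a vendor asks for beyond what the job needs."""
    return requested - ALLOWED

print(excess_scopes(REQUESTED_BY_GOOD_TOOL))    # set() -- nothing extra
print(excess_scopes(REQUESTED_BY_GREEDY_TOOL))  # sirens blazing
```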

D Mauro (01:24:26.47)
Right.

D Mauro (01:24:38.404)
Mm-hmm.

D Mauro (01:24:43.209)
Right.

D Mauro (01:24:54.893)
Right. That's a really good insight, because, yeah, leave the network sniffing to the threat actors; they'll do it for us, and we won't know it, and nobody will know it. And that's why we're employed, unfortunately. So, excellent. Hey, Merry Marwig.

Merry Marwig (01:25:04.906)
Yes. Thank you. I was like, oh my god. It's like, like.

Merry Marwig (01:25:14.622)
like you can do this other ways. So, yeah.

D Mauro (01:25:22.229)
Thank you so much. This has been a great conversation.

Merry Marwig (01:25:25.879)
Yeah, I've really had a lot of fun. It's so nice to talk with you and thanks to everybody who listened.

D Mauro (01:25:30.681)
No, absolutely fantastic. We'll have links to your LinkedIn and to DataGrail right in the show notes, so check them out. And I'm sure it's not the last time that we will be speaking. Absolutely, thank you so much. Appreciate it. Thanks, everybody.

Merry Marwig (01:25:44.418)
Yeah, I hope it's not the last time. Thanks, David.

