Cyber Crime Junkies

Identity and Brand Protection. Hannah Sutor.

May 10, 2023 Cyber Crime Junkies-David Mauro Season 2 Episode 41

Public speaker, educator, DevOps specialist, and privacy advocate Hannah Sutor joins us in the Cyber Crime Junkies studio to discuss new insights on identity and brand protection online and best practices for protecting personal data.

Connect with Hannah:  https://www.linkedin.com/in/hannah-sutor


Video of the full episode: https://youtu.be/nfENN3-wt30

Topics: brand protection with identity authentication, best policies to limit cyber liability, best practices for protecting personal data online, identity protection best practices, security best practices for individuals, best ways to protect people from cyber crime, how AI will affect cybersecurity, parenting tips for children online, how to protect families online, new security insights from an expert, parenting tips for online gaming, parenting in a technology world, and risk and brand protection.


 

PLEASE CONSIDER SUBSCRIBING to our YouTube channel. It's FREE and it will help us help others. 

Our Video Channel @Cybercrimejunkiespodcast https://www.youtube.com/channel/UCNrU8kX3b4M8ZiQ-GW7Z1yg 

 Connect with us. 

 

David Mauro LinkedIn: https://www.linkedin.com/in/daviddmauro/ 

Cyber Crime Junkies LinkedIn: https://www.linkedin.com/in/cybercrimejunkies/ 

Cyber Crime Junkies Instagram: https://www.instagram.com/cybercrimejunkies/ 

Cyber Crime Junkies Facebook: https://www.facebook.com/CyberCrimeJunkies 

Podcast Cyber Crime Junkies: https://cybercrimejunkies.buzzsprout.com 

Site, Research and Marketplace: https://cybercrimejunkies.com 

 

Thanks for watching! -David, Mark, Kylie and Team 

 

@CCJ Music Credits: Two Guitars by Admiral Bob (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) license.


Try KiteWorks today at www.KiteWorks.com

Don't Miss our Video on this Exciting KiteWorks Offer!


The Most Secure Managed File Transfer System. 








Identity And Brand Protection

 


 

[00:00:00] It's always in the news: cyber criminals attacking great organizations, wreaking havoc on the trust of their brand. We socialize cybersecurity for you to raise awareness, interviewing leaders who built and protect great brands. We help talented people enter into this incredible field, and we share our research and blockbuster true cyber crime stories.

This is Cyber Crime Junkies, and now the show.

All right, welcome, welcome everybody to Cyber Crime Junkies. I am your host, David Mauro, and in the studio today is Hannah Sutor. Hannah is a public speaker, educator, DevOps specialist and a privacy advocate. She works as [00:01:00] a senior product manager at GitLab, focuses on authentication and authorization in DevSecOps within that context, and is a remarkable kind of privacy advocate.

So much so that our own kind of OSINT investigation couldn't find that much out about her online. So we're gonna all find out about her in person, so that'll be great. Welcome. Hey, thanks David. Good to be here. No, we really appreciate you being here. So let's talk about identity and brand protection.

But first I want you to kind of, like, let's back into that, right? So tell everybody what your current role is, what you're currently doing in your professional life, and then we'll get into some of the passion projects that you've got going. Okay, sure. So, by day, I guess you could say:

I work at GitLab. I'm a product manager, working on authentication, authorization, and identity. So we are sort of the boundaries of [00:02:00] how you authenticate into our DevSecOps platform. , the credentials that you may have, the, you know, security and usability balance on how administrators choose to enforce controls on those credentials.

And really about keeping things secure and accessible is kind of the, the balance we try to find there for our customers. Makes sense. How did you, how did you get into that kind of field? Is this something that you, when you were a kid, you were like, that is when people log into websites, man, I'm gonna make sure that is secure.

Like, what is it that drove you to do that? I think I sort of stumbled upon it, and then it's definitely become a passion of mine, and I really can't see myself anywhere else now. I love it. So I have been in product now for about eight years. Prior to that I was a software developer. It was a little bit heads-down for me, and I started, I didn't even know about product.

 This was like before product was as big of a thing as it is now. I started asking too many questions and was like, why, why are you asking me [00:03:00] to code this? Can I sit in on the customer call? You know, And they were like, actually, if you're this interested in, you know, interacting with the customer and finding the why, like you might be a good fit for product.

So, I ended up in product. And then slowly over time, got a job doing API product management, cloud-platform-level services. And then those services expanded to include identity. And then in the past three jobs I've had, I've sort of narrowed in on identity specifically, looking after the identity portion of various products.

I think I like the fact that security is a big aspect of it, right? And then you still have the human side. So I like the fact that it's not API-only. API-only is straightforward, which is nice: you don't have anyone arguing over the placement of a button or the shade of gray when you're PMing for API products.

But I like the fact that I own a little bit of an API and a front-end component, and that it's a really [00:04:00] important part of the product, right? If identity goes down, then the whole product goes down. So it's a lot of responsibility, but I like it. Okay, so I agree completely, and it's also something that gets to the very heart of the credibility of any product, right?

It's only as credible as it is secure, in a lot of senses. So why does that appeal to you? Like, what is it about security? What I find is a lot of people have come from the military, or they come from all different types of backgrounds, and they're all kind of drawn to cybersecurity cuz there's this kind of belief in serving others or protecting others.

There's this greater calling that they recognize. In your content creation, in your privacy advocacy, you definitely hit on that. So what is it? Is there something that kind of inspired you when you were a [00:05:00] kid to be that way? Have you thought about that, like, why you do that?

Like what, why you do that? Yeah, I think it's a good question and I, I've done enough personality tests and self-help to kind of understand a little bit about where this comes from. And I think it's because I have like an innate sense of things need to be fair. And things need to be just, and just because you know you have a bad actor doesn't mean that a good person should be compromised.

And that that's just okay, right? Like, we need to make it so that doesn't happen, or there needs to be some negative consequence. So I think a lot of it just comes from my innate sense of justice, which I recognize sometimes I can be overly that way, and it's not always a great thing, but I think that's definitely where it comes from: just this feeling like we need to protect the people who can't protect themselves.

Right? And like, you know, if you're up to no good, we're gonna find you, or we're gonna make it so you can't do what you think you're gonna [00:06:00] do. So it's good motivation. Yeah, I didn't mean to speak over you, sorry. So a less technical version of you would've been a trial lawyer or a prosecutor.

Or a police officer or something in the military, right? Yeah, that's what it seems like. But now it does not seem like that. Yeah. But now that you learned how to code when you were younger, you developed some technical skills, so you went down this path. Yeah. And I've even thought about, like, public service, mm-hmm.

Doing it. Mm-hmm. And then, you know, the more I talk to people, they're like, well, being in public service you can have an impact on the people you interact with that day, usually, like being a firefighter or police officer. But one advantage we do have being in tech is that it's really scalable, and so potentially you can protect, yeah.

Thousands of people per day. Mm-hmm. So that's appealing to me as well. Yep. Yeah, the scalability of tech is really the key, right? Yeah. When things are put in place to protect [00:07:00] people, it protects thousands or millions.

Right. Totally. So, you know, you talk a lot about, and, and we will have links to your Instagram and your LinkedIn in the show notes because when we were developing this podcast and we were kind of just kind of filtering out our social media, you were one of the first people that we saw online kind of talking about individual privacy and identity authentication and the importance of that.

And it's really good content. I mean, the things that you point out, I kind of learn something almost every week. I'm like, oh hey, that's a good site to check out. Like, you've always got some good tips or some good things. So let's explore some of that, cuz I think the listeners and the viewers will find some benefit in it.

You talk a lot about data brokers, and I talk a lot about this in our client meetings. We do these security [00:08:00] presentations, and we talk a lot about it cuz people don't realize it. Like, they go to a website, they didn't fill out anything, and then all of a sudden somebody's emailing them about it, and they're like, how in the world did that happen?

So walk us through some of the things that you think people don't realize in terms of the data brokerage arm that is happening in the background. Yeah, I think we don't ever really look under the covers of whatever app we install on our phone, for example. Like, sure, maybe it's providing you a wallpaper with a motivational quote, but what is it doing behind the scenes?

What data is it taking from your phone in exchange for that free app? Right, right. I think what I parrot on my social is: if you're not paying for the product, you are the product. Right, right. And I think even using something like Gmail, right, which I use. Like, I'm not gonna say [00:09:00] we should never use any of these products, right?

Because there's a benefit to them, and then there's a cost. And the cost is that, you know, your data is taken. You may get targeted ads. They may be building, well, they are building a data profile on you and then selling that. And is it worth it to you for, you know, the great email service that Gmail is, for example?

So I think understanding that whenever you're getting an online product for free, that software is not free to make. Every time you visit a webpage, there is a computer hosting it and serving you the page, and all of that costs money. The bills for cloud services are huge, and there needs to be a way to pay them, and companies are paying.

Generally, if you're not paying for it, they're paying for it by selling your data. And that's what you mean by we are the product. Yeah, that's exactly right. Like, they can make X amount if you buy something, but they can make so much more by selling all the information about [00:10:00] who you are and what your preferences are, and the time of day, your location, other things that you look at.

Because then that can be used to sell to other marketing companies that want to target you in the future, right? Yeah. I mean, isn't that what it's all about? That's what data brokerage is. Yeah, it's kind of tying in different data points of things we either curate in our lives online or are unaware that we're giving up.

Yeah. So to me there seem to be two elements there. One, we all need to educate people: if you're not aware, you're giving up this right that a lot of people are not aware of. Right. And the other thing is, when you curate, be careful of what you do, because there are so many things that can be taken advantage of that go beyond what your expectation of privacy would be.

Yeah, [00:11:00] exactly. And building off the second part of what you said, I also get concerned with, you know, the algorithm-fed reality that we all live in at this point, right? Almost everything we look at online, if you're tied into your ecosystem, you know, using a browser where you're signed in, looking at a service where you're signed in, everything that's fed to you is based off of what they think you wanna see or what they think you'll engage with, right?

To up their engagement and their stickiness metrics. Yeah. And whenever you're giving up all of that data, you're also losing the ability to have an unbiased reality presented to you, which I think is much more compelling than, oh, they're gonna show me a targeted ad, right? Like, that's only so scary.

I mean, it's weird and creepy at times, right? But I think people are almost accustomed to that now. Everybody knows, hey, they're gonna target me with ads. Like, I looked for shoes online or [00:12:00] boots online, and now all of a sudden I'm starting to see boots all the time.

Right? But it gets creepy, doesn't it? We've seen it in our own family. We've tested it: we've literally said, I'm really interested in a really big green truck, man. Like, we don't own trucks. We were like, we want a huge, like, GMC something, and we're just saying it into our voice assistant device.

Right. Or something like that. And then, like, 30 minutes later, my son pulls up something and he's like, check it out, we're getting ads for, like, green trucks. Yeah. How did that happen? Yeah. That's data brokerage, right? Yep. That's data brokerage at work. And, yeah, like I was saying, it's one thing to get targeted ads, but I do think about the larger picture of what this data can be used for, even if right now it's targeted ads.

Right, right. What's it going to be in the future? Because the past two months have shown [00:13:00] us a thing or two about AI, which honestly wasn't even really on my radar a couple months ago, and now it's, I don't know, 80% of my day. Yeah. And so you wonder what could happen in the future.

I just saw there was a company that does online alcohol recovery counseling, mm-hmm, and they ended up leaking a hundred thousand of their customers' health information and contact information to data brokers. So I think things like that are very concerning as well, where in the future maybe insurance companies, right, can discriminate against you based on some of this data, things like that.

Can discriminate against you based on some of this data, things like that. Right? Yeah. And that, that was a there was actually a couple instances involving mental health care. Yes. Better help, I think. Yeah, exactly. Where, where, where they go and they're breached and then leaked and then like extorted for that and, and you're just like, that's a whole other level of like, that's a level above just trying to make profit.

You know what I mean? Yeah. Then it's really almost like it borders on, that's black-and-white criminal [00:14:00] behavior, as opposed to just theft. Right, right. Theft is just for profit, but this is actually just cruelty. It is, yeah. It's definitely concerning. So let me ask you:

What can the average person do? Like what, what are some of the things, because what we find is one of the battles is A, to educate, but B, to inspire people to care about it. Because some people are just like, well, what's somebody gonna do anyway if they get my Gmail account? Like I, that doesn't matter.

Right. But there is so much that can be done. Yeah. What I try to do is make data protection really approachable. So, you know, I understand that not everyone's gonna buy, like, a Raspberry Pi and filter their internet traffic to make it safer. I try to provide approachable things that I would do or I would advise my friends to do.

Right. If you have like five minutes between giving your kids a snack or watching their show, right? Mm-hmm. What can they do? The iPhone [00:15:00] has good-ish privacy controls built in. Mm-hmm. In terms of being able to request that apps don't track you with one toggle. Not all of them obey it, but it's something you can do quickly to at least try to improve.

And that's done right in the settings of the iPhone, right? Yep. I have a reel on it somewhere. I forget the exact setting, but if you would just Google like Apple iPhone, do not track. You could find it. Yep. You can also with one toggle take away apps from knowing your precise location. I think our constant location data being fed to apps is not good.

And an app, like your meditation app, or, you know, any of the apps that you have, they don't need to constantly know where you are, especially down to the precise location, which can pretty much tell them exactly which room in your house you're in. There's another location setting sort of up from that.

That's a more general location. And then you can also request no location, and only give a location when it comes back and asks while you're using it, like, okay, now can I use your location? For example, if you need it sometimes for Google Maps, right. And things like that, because it's about an expectation of privacy, I think.

Like, if you are curating your life and you are hiking and you wanna have a picture of you there, then you are volunteering the information that you are at that location at that time. Right? Right. But there are risks to that, right? Because by doing that, that means you are not home, and somebody could take advantage of that. That means you're not at work, which could cause problems. It means you're also not somewhere else.

You don't want to be giving away your geolocation all the time without even knowing it, because that can be taken advantage of by people that don't have good intent.

For sure. From, you know, a personal safety perspective, that's definitely an aspect. And then, going back to [00:17:00] the data broker aspect, right, that's also very valuable information to advertisers, you know? Yeah. An advertiser wants to know how many times you were at Walmart versus how many times you were at Target, and, oh yeah.

Your phone's location data can tell them exactly that, right? So there's a lot that can be deduced from that information. So what recommendations do you have for people to protect their own individual identities? We talk oftentimes about protecting the organization's brand that we all serve, right? When we have a job, whether we know it or not, we are serving an organization's brand.

We have a duty every time we get online to protect that brand. That's why security needs to matter to people. But individually, we all have a brand, and we all have our own finances, our own health information, our own family information, our private thoughts, medical treatment that we get, et cetera, that we don't expect to be publicized.

So what's your recommendation on ways that individuals can protect that? [00:18:00] I think one of the easiest things to start doing, if you haven't already: the main way that our identities are tied together in these back channels is through our primary email address and our phone number. So if you've been using your primary email address for a long time, like I have, start making different ones. You can even take your normal email address, add a plus sign, and then add a word you can easily remember, and use that as your email address rather than the same one you always use.

Or the iPhone, again, has a nice built-in feature where, if you're typing in an email address, it will pop up and say random email, or something like that.

What it does is generate a random email address, and then it forwards mail to your iCloud email address on the backend. So, especially if it's something sensitive, start to cover up those really identifying factors. If you can, maybe sign up for a Google Voice number and use that as the [00:19:00] phone number instead of your primary.
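The plus-sign trick works with Gmail and several other providers that treat everything after a `+` in the local part of an address as a tag still delivered to the base mailbox (check your provider; not all honor it). A minimal sketch of minting a per-site alias, where the helper name is our own:

```python
def plus_alias(address: str, tag: str) -> str:
    """Build a plus-addressed alias, e.g. user+shop@gmail.com.

    Mail to the alias still lands in the base inbox, but the tag
    reveals which site later leaked or sold the address.
    """
    local, domain = address.rsplit("@", 1)
    return f"{local}+{tag}@{domain}"

# One alias per signup makes leaks traceable and easy to filter.
print(plus_alias("user@gmail.com", "newsletter"))  # user+newsletter@gmail.com
```

If spam starts arriving at `user+newsletter@gmail.com`, you know exactly which signup leaked, and you can filter that alias to the trash.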

And then simple things, honestly: use strong passwords, right? These are, like, so boring, but so many people still don't. It's so shocking, right? It's still shocking. And when I talk to people, even pretty bright business owners, they're like, I've got a really good password.

I use it. And I'm like, hey, that's great. What do you use it for? All my finance stuff. And I'm like, but once you use it for one thing, I think the danger in the data brokerage process is that, what we don't realize is, that password and username can be sold without us knowing it, right?

And then they get breached or leaked, and then that great password, no matter how strong it is, how long, how many characters, whatever, is now tied to us and available on the market. And so it doesn't matter, because we've used it on more than one site. [00:20:00] And yeah, it's boring and it's dry, but people still don't do it.
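The fix for reuse is a long, random, unique password per site, which is exactly what a password manager generates for you. A sketch of the idea using Python's `secrets` module (a cryptographically secure generator, unlike the `random` module):

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password.

    secrets.choice draws from the OS CSPRNG; random.choice is
    predictable and should never be used for credentials.
    """
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# A distinct password per site means one breached site cannot
# unlock the others, no matter where the dump is sold.
vault = {site: generate_password() for site in ("shop", "bank", "email")}
```

A 20-character password over this roughly 72-symbol alphabet has far more entropy than any memorable phrase, which is why storing it in a manager beats memorizing and reusing one "really good" password.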

I know, right? It's so true. And it's annoying, right? Like, it's not a fun task. You know, my account was breached, I think it was from a dark web leak. Mm-hmm. And they were successful because it happened at, like, two in the morning. They were able to order some stuff on an account where I stupidly left the credit card attached, and then it must have gotten some kind of traction or upvotes in the dark web world, because they were able to successfully use it.

And it still haunts me to this day. Even though I've changed the password, turned on MFA, done all of the controls, I still wake up to emails saying, like, at 2:00 AM someone from country X was trying to log into your account. So they are ruthless, I feel. Yep. And it just goes to show that even someone who, you know, lives in this world and knows what they're doing.

Yeah, yeah. You can still be compromised. It's not foolproof. Absolutely. So [00:21:00] again, use multifactor authentication. I think I saw a statistic that it stops something like 95% of breaches. It's not perfect, and SMS is the weakest form, but it's still much better than nothing. So if the choice is use SMS or not use MFA at all, please sign up and use SMS.
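Authenticator-app codes are stronger than SMS because nothing travels over the phone network. For the curious, here is a simplified sketch of how those six-digit codes are derived (the TOTP scheme from RFC 6238, standard library only; real apps add clock-drift tolerance and provision the secret via QR code):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp=None, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password.

    Server and authenticator app share `secret`; both derive the
    same short-lived code from the current 30-second window, so a
    phished code is useless minutes later.
    """
    if timestamp is None:
        timestamp = time.time()
    counter = int(timestamp) // step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII key "12345678901234567890" at T=59s
print(totp(b"12345678901234567890", timestamp=59, digits=8))  # 94287082
```

Because the code changes every 30 seconds and never crosses the SMS network, SIM-swapping and text-message interception get the attacker nothing.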

Absolutely. And then there are certain sites, before you go to buy something, like ScamAdviser.com and some other vendor-related ones, that will show you whether a site has been flagged. And then what about freezing your credit, or freezing your kids' credit, or recommending to other family members to freeze their credit?

Yeah, that's one of my, you know, biggest recommendations. Again, one that's not exciting. I have mine frozen. And my dad actually, he's not very tech savvy at all, called me, and he was seeing some weirdness with just a loyalty card he uses at the hardware store. Mm-hmm. Right. Yeah. And he's like, suddenly I went to three [00:22:00] different stores, and they're all saying that I've never been there and my loyalty account doesn't exist anymore.

Like, what do you think is going on? And I'm like, please go freeze your credit. I don't know, but exactly, someone probably has a hold of your information. So I think that is an excellent thing to do, especially, you know, for kids as well, with their social security numbers.

And it'd probably be the first thing I would suggest to older people as well. And it's easy to do online, but make sure you do it with all the bureaus. And for parents, yeah, for parents to do it for their kids. Yes. Because one of the things that we've found is that kids come out, and they have, you know, graduated high school, they're trying to get their first loan or buy their first car or whatever it is, and oh my gosh, they have a condo that is foreclosed in Nevada, and they're like, oh my gosh, what happened? It's like, cuz somebody's been using your ID for 10 years, cuz nobody ran their credit. Nobody looked at it, and you didn't freeze it, right? So parents should always, I always [00:23:00] recommend that, right?

Nobody looked at it and you didn't freeze it. Right? So parents should always kind of, I always [00:23:00] recommend that. Right? Yeah, just freeze. Freeze the credit, and you can, if, look, there's nothing wrong. Your credit score, your FCO score still goes up while your credit is frozen. Doesn't affect that. It's just that if you want to buy a house or take out a loan or get a new credit card, you just have to voluntarily unfreeze it, and it can be done right within an hour.

They'll unfreeze it, you can go borrow the money and then freeze it back up again, and then they can't take out new credit lines under your identity. Yep. Totally. Yeah, I applied for a credit card and forgot mine was frozen and had to do the whole unfreezing thing. Yeah, exactly. Absolutely.

So let me ask you, and we appreciate this. I know that you've got some speaking engagements coming up; we'll get into that right before we wrap up. But I wanted to ask you about how ChatGPT and AI and all of these factors, how are they affecting identity protection? I mean, clearly it's affecting security, because [00:24:00] we see things like the Samsung breach, where some of the employees were actually using ChatGPT.

It's great for ideation, it's great for coding, it's great for so many things. Any of these AI platforms really are phenomenal. But you have to be careful, don't you, about the information you input, because by putting in information that is confidential, then the whole world has it.

Yeah, I think it's really been interesting to watch the excitement around it all, and the fact that, you know, even the most seasoned tech pros at this point are feeding proprietary code to it. It's really tempting, cuz it's so exciting what you can get back out of it.

But definitely, I think more and more companies are building up policies around this. Probably as we speak, many legal departments are getting their ducks in a row. But from a personal perspective, I posted recently my thoughts on it in general when it comes to personal identity and [00:25:00] privacy, and I think where I landed at the moment is that I'm a little concerned about the fact that everyone is contributing, right, mm-hmm.

To GPT, both by what we write on the internet and by what we talk to it about. And then we're all getting information back out of it. And, right, I know there are a couple of other LLMs, but most people are engaged with GPT at the moment. Right. So I just worry about the groupthink, or what evolves when we're all feeding the same thing and then all getting information back out of the same thing.

So that concerns me, just from, yeah, I don't know. Like, I don't think one source of information is ever a good thing, and I know we're not using it for all information or anything, but that's my concern with it at the moment. What about, I mean, there's a correlation there between that and us kind of voluntarily giving it so much information. I see a correlation between that and TikTok. [00:26:00] Mm.

I see a correlation between that and TikTok. [00:26:00] Mm. Because TikTok is a hugely debated issue, and they're, people are like, let me voluntarily curate my life online. Mm-hmm. Like, I have a following. I have content, I wanna share it. And there's nothing wrong with doing that, right. I don't think anybody is saying, no, you can't, you don't have this freedom of nobody.

I don't believe that anybody intentionally wants to limit freedom of speech or freedom of expression, nothing like that. They're just concerned that we don't know what's being done behind the scenes with that app. I think that's the concern, and whether there's evidence of that or not is up to people with certain levels of security clearance that we don't know about.

But what are your thoughts about that? Have you formulated any opinions? I haven't seen you post on that. I did see you say you are looking into it, that you see it's an issue, but I was just curious to get your feedback. I think, so, setting TikTok specifically aside, what you just said really resonated.

And [00:27:00] honestly, that was kind of like my first jumping off point for my privacy research. Before I was educating, I was doing research mainly, and I met with a lot of privacy professionals. And what I was trying to figure out is if a company says they're collecting data X, y, and Z about you, right? In their privacy policy is usually where that's detailed.

If they say they're doing that, is there ever a check, like, are they actually doing that, or are they collecting more than what they're saying? Right. They say they're not transferring it to third parties, but are they? There's no audits, there's no real check and balance.

Right. There's none. And I think with the GDPR there are some government checks into that, and there have been some fines for TikTok and Meta and others in the US, so I guess there's a very, very, very small chance that you may get audited. Right. But I almost wish, my idea was, like, whenever the label organic [00:28:00] is on something, right?

Like, it means something; we know that there's a standard. Could there be, you know, the same thing for tech, where there's some transparent-tech badge, where we know, okay, this app says it's doing this and it's actually doing this? I think there's a huge disconnect there. And like you mentioned, it's probably not purposeful, but I think a lot of times, you know, legal and the developers actually implementing, are they aware of what each other is doing?

I don't know. Well, and to me it's almost like plain text and plain labels, like when there's warning labels on products. I mean, I know the US is extremely litigious compared to other countries, but because of that we have warning labels on ridiculous things. Like, you know, your hair dryer. If you have ever looked at the warning label on a hair dryer, it's like, don't use it when you're in the tub.

And I'm always like, mm-hmm, what lawsuit was that? Like, who did that, got in the tub and then sued, oh, you didn't warn me? But [00:29:00] in a certain sense, a lot of this could be eradicated if an app was required to have a warning. Like, by the way, if you click accept on the terms and conditions, you are giving them the right to, like, check all of your keystrokes, to get a copy of all of your contacts, to be able to sell that data to third parties.

People don't realize that cuz nobody has ever read the terms and conditions when they click install on an app. Nobody has. Yeah. I mean, I think that's where the disconnect is. That's where the injustice is, right? Yeah. Because people, you know, post things on the internet and they expose that they are traveling somewhere at that time.

Right. Well, you're voluntarily doing that. Right. The best practice there is you take the videos, you take the pictures, you record them, and then you post them after you [00:30:00] get home. Because otherwise it's the time and place. It's what it's indicating, that it's you. You're telling everybody where you're not, right?

You're not at home, and you won't be at home until next Thursday. So now everybody knows your home is sitting there, open prey to be robbed. So how do we address that? Where do we go from there in terms of the non-transparency with data collection?

Yeah. How do we solve this? Is it a law that has to be in place? Does the US need something like GDPR? What's the next step? I mean, some privacy protection law would be great. There are some individual states, I think there's about seven states. CCPA? Yeah.

And there's CCPA. There's things like that. It's hard though when it only applies to certain states. Right. I think at a national level, [00:31:00] I think that was one of the main arguments I saw with the whole TikTok thing, is, like, instead of passing or proposing meaningful privacy legislation, right,

They were trying to ban a specific app. Yeah. Right. So I agree with that general sentiment, that it seems almost like a cop-out, right? Yeah. For us not getting the protections we deserve as citizens. Right. I'm trying to do what I can just as one person with educating, right? Like, I show people, hey, you actually can see what data an app is collecting.

Here's how. And, you know, if you're gonna use it, fine, but at least be aware of what it's doing in the background. So there's some element of at least educating boots on the ground until we can get there with legislation. Now, what do you think? Yeah, no, I agree. I think that you have to keep doing what you're doing.

We have to keep raising awareness, because that's the socializing of the risk, of the [00:32:00] being taken advantage of when we don't realize it. Right. People don't realize this stuff. So I think that by socializing it, and it's a good translation, because when people think of cybersecurity, they think of a hacker in a hoodie and they think of something very, very complicated in reality.

Right? And it's not that. It's by being able to translate that into everyday, practical effects for individuals, right? You know, 100% of businesses are run by people, right? And so by bringing it down to the individual level, I think it really raises awareness and makes people care about it.

Nothing's really gonna change until they kind of mandate it. And I agree, and I think they have to do that, but, you know, I've been a student of politics my whole life, and all politics is local. It all has to start with individuals, right? It all has to start with a grassroots effort, and then it gains momentum, because [00:33:00] while you're doing that there, someone else in another part of the country is doing it also.

And then we kind of feed each other, and then, thanks to the internet, they start to see it and the message kind of bubbles up to the top. And I think that's really great work that you're doing. What are some of the other resources that you advise people on?

Like, there's haveibeenpwned.com. That lets people find out if they've been part of a data breach, and I implore people all the time to go check that site out, because you'll be so surprised. You'll find that your username, your passwords, all of your information has been part of a data breach from a company you don't even know about.
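As an aside for the technically inclined: haveibeenpwned.com also exposes a free Pwned Passwords "range" API built on k-anonymity, so you can check whether a password has appeared in a breach without ever sending the password itself. This is a minimal Python sketch, not an official client; the function names are mine, and only the first five characters of the SHA-1 hash are sent to the service:

```python
import hashlib
import urllib.request


def hash_prefix_suffix(password: str):
    """Split the SHA-1 of a password into the 5-char prefix that is sent
    to the API and the 35-char suffix that is matched locally."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return sha1[:5], sha1[5:]


def pwned_count(password: str) -> int:
    """Return how many times a password appears in known breaches.
    Only the 5-character hash prefix ever leaves your machine."""
    prefix, suffix = hash_prefix_suffix(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode()
    # The response is one "SUFFIX:COUNT" line per breached hash that
    # shares our prefix; we look for our own suffix locally.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0
```

Calling `pwned_count` on a common password would return a large breach count, while a strong random one should come back as zero; either way, the service never learns which password you checked.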

And that has to do with data brokerage, right? They took your data, they sold it somewhere else. Anything else? Any other kind of cool sites or tips that you've come across lately? So there's one I love called Privacy Not [00:34:00] Included, and it's by the Mozilla Foundation, who does a lot of great work.

Mm-hmm. And they kind of do what we've been talking about, where, okay, the Alexa Dot, right? Mm-hmm. Like, it tells you it collects this, but we did some testing and it actually collects this, so it gets a, you know, bad privacy rating. Oh really? It does it for apps and it also does it for physical products.

Oh, that's great. Like, you know, your Roomba, the camera on your Roomba, is it sending photos back up to the cloud? You know, things like that. Oh yeah, that's something people would not expect. Right. Yeah. Right. How much is your smart refrigerator, like, recording in the room and things?

Exactly. Yeah. So I think that's a really interesting one. And that's Privacy Not Included. Yep. Privacy Not Included, at the Mozilla Foundation. And they do really easy-to-read ratings. I think it's an A-to-F scale. Oh, that's great. You know, of how privacy-aware certain products are. So yeah, that's one of my favorites and I'm a huge fangirl of their work.

That's [00:35:00] great. Yeah. Any others you can share? You know, one thing you can do, I hate doing this, is just Googling yourself, right? Well, it's always awkward. No one likes it. But what you can do, and I hate that it's this way, but these data brokers, like, you know, the people-search sites and yellow pages and Radaris and some of these ones that always come up, and they tell you, oh, you know, she's this old and she lives at this address and this is her family, right?

You can make requests. It's usually buried somewhere on that website, and you have to go through each individual site and request it on each one. And that's when I say I hate it, I hate that part, that it's so manual. Right. But you can request that they remove your information. And are they required to by law? Like, within a couple weeks, will you see your information gone from those sites?

I am skeptical. I've had mixed experiences. I think if you're a California resident, because of the CCPA, they are required. Oh. For other [00:36:00] states, I don't know if they are within a certain timeframe. And some of them, I've clicked on the request to remove me and I get a 404. So it's, to me, very dependent.

It's hard to say which ones remove you and which ones don't, but it's worth a try. Yeah. And then sadly, you also have to keep on top of that, because I think it's every four months or so they re-scan public records and put it all back out there. Wow. Now there are some paid services. Yeah, I was gonna say, I bet there's services that'll do that for you.

Yeah, there's paid services that will do that for you. I don't have any recommendations because I haven't looked into it that much. Right. Nor are we sponsored by them, so we're not gonna list their names. Exactly. I know there's some credit cards that offer it as well. Yeah, with their credit card.

Yeah. All right. Good stuff. Last thing and then I'm gonna let you go. You talk about privacy breadcrumbs. Can you elaborate on that? I mean, I've heard you kind of mention that. Is that, [00:37:00] like, what should people do when they go to a website and every website pops up cookies, right? I mean, I know what I tell people to do, but I'm interested in hearing what you have to say about that.

Yeah, this is a fun one. Thanks for asking. So cookie consents popping up on every website, right? Mm-hmm. They're required to by law. They're not required to be very transparent, so they can make you think, by looking at it, that I have to accept them all to make it go away, I have to accept it all to use the website.

But you don't. If it has a reject button, you can click reject. Or sometimes they make you click further into it, and you might have to say customize, and then you can click reject or turn them off. And usually you only have to keep on essential cookies, which are the one or two needed to keep the site running.

All the rest usually involve, like, you know, that creepy stuff of, I went on this website to buy this thing. I didn't buy it. I never put in my information. But now they're emailing me saying, mm-hmm, [00:38:00] like, you left this in your cart. Right. You can usually get rid of all of that by rejecting tracking cookies.

And so some of my favorite Instagram reels I put together, you know, like five screen caps of me tapping reject or no on those. And I put it to, like, a catchy thing, and I'm like, hey, did you know you can actually click no? You don't have to click yes. And people are like, what? I always thought I had to click yes to make the website work.

Right. So that's like an easy thing to do. And it's kind of empowering. Every time I click reject, I'm like, yeah. Right. And the site still works. You're not booted off the internet. Yeah. Yeah. And another thing is session cookies, right? And session hijacking.

We just saw one of the biggest tech YouTubers get hacked. And his site was taken down for a while, and it was brutal. And what had happened is somebody had fallen for a phish or something and clicked, and the malware in the background grabbed the session cookies. [00:39:00] So when you log into

Google or, you know, a browser or a site, they'll often ask you, a pop-up will happen and it'll say, stay logged in. Like, do you wanna remain logged in that way? Session cookies in and of themselves aren't bad, because they allow you to not have to keep re-authenticating, not have to keep logging in. But the problem is, if there's malware out there that will capture those, and if they can capture those, then MFA doesn't matter.

Your passwords don't matter. They are logging in live, just like you were; they're re-logging in under that same session. So everything in your browser, they can absolutely access, and it's really, really scary. So my recommendation is always go for a little bit of inconvenience and don't allow that. Right.
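On the defender side, how much a stolen session cookie is worth depends partly on how the server issues it. This is a hedged sketch using Python's standard http.cookies module (the cookie name session_id is made up for illustration): HttpOnly keeps scripts from reading the cookie, Secure keeps it off plain HTTP, SameSite limits cross-site sending, and a short Max-Age bounds how long a stolen copy stays valid.

```python
from http.cookies import SimpleCookie


def make_session_cookie(token: str) -> str:
    """Build a hardened Set-Cookie header value for a session token."""
    cookie = SimpleCookie()
    cookie["session_id"] = token          # illustrative cookie name
    morsel = cookie["session_id"]
    morsel["httponly"] = True             # not readable by JavaScript
    morsel["secure"] = True               # only sent over HTTPS
    morsel["samesite"] = "Strict"         # not sent on cross-site requests
    morsel["max-age"] = 3600              # one hour: limits the replay window
    return morsel.OutputString()


header = make_session_cookie("abc123")
```

None of these attributes stops malware that is already on the machine from stealing the cookie, but the short lifetime and scoping shrink what an attacker can do with it.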

Just, you know, even if you have a password manager, you could just log back in through the password manager anyway, but don't stay signed in. Yeah. Yeah, that's really [00:40:00] concerning, because like you said, if they have that token, then it bypasses sort of all of the other checks, right? Because a lot of products have checks like, oh, are they logging in from a location they don't normally log in from?

But if you have the cookie that has that location already associated with it, it bypasses a lot of the other checks we do for security. So yeah, that's a really important one to protect. Absolutely. Well, hey, thank you so much for today. Before we go, you've got some public speaking coming up.

What events are you presenting at? So I'm doing two privacy basics workshops. These are more targeted for smaller audiences, like employee working groups. And what I do is give sort of approachable privacy advice. Here's, you know, five things you can do in five minutes to help improve your privacy online.

And then education, obviously, around why we should care, a lot of what we talked about today. And then I'm also speaking at Identiverse, which is a big digital identity conference at the end of May. What I'm talking about [00:41:00] there is the business of identity, and how identity can actually be a revenue driver for your business.

In the past, it's been seen as kind of a thing we have to do to keep the lights on, right? Provide authentication. But there's a lot of ways it can actually contribute to your company's bottom line when done correctly. So I'm excited to talk about that there. Well, I'm not gonna let you go until you at least tell us a little bit about that.

So is that okay? Because not everybody that's listening here or watching is gonna be at that conference. So, just generally speaking, I don't wanna take all the secret sauce, but how does that work? Like, how does a business, well, what are some of the ideas to generate identity as a revenue source?

So, as we talked about, identity and security are really tightly coupled, right? Mm-hmm. There's, you know, a sort of goodwill in the security and identity community of, like, we shouldn't charge for basic security features. [00:42:00] Right. And I agree. So the basic things needed to secure your product, we should not be charging extra for those.

But there are a lot of advanced-level features. Like I always say, we provide sort of a toolkit for people to decide how secure they wanna make their instance of our product. And they can balance that with accessibility. So there are some more advanced security features that are really cutting edge.

And they're honestly not for everyone. So it also depends on the technical level of our customers who want to implement this for their audience. They need to know their audience. But I would say that's where the rubber meets the road in terms of increasing your ARR. Yeah. Yeah. Makes perfect sense.

If the customer of a certain product line is in the finance industry, you may charge a premium because you'll give them advanced security features tagged on to that product line or service line. If they're just in an unregulated [00:43:00] field, then maybe it doesn't really matter.

They bear more of a risk cuz they don't wanna pay the cost, but they can get the product or service at a lower cost. So that's interesting. And then also, another aspect of it is having a swift, you know, sort of user acquisition experience. And that is all done through your first impression of the product, which is usually creating an account, right?

That's a really important step in having someone become a paid user, right? And how can we make that process as streamlined as possible while still respecting security and building trust? And that also helps contribute to the bottom line. Excellent. Excellent. Well, Hannah Sutor, thank you so much for the insight today.

It was really, really good. I think people got a lot out of it, so thank you. We really appreciate it. We'll have links to your information in the show notes. Check that out, follow her. She's got great advice. I learn a little bit of something every single week, and it helps us as individuals, helps our [00:44:00] families, as well as the organizations' brands that we serve, stay secure.

So thank you so much. We appreciate it. Awesome. Thanks, David. Bye. Thanks. Bye. Hey, well, that's a wrap. Thank you for listening. Our next episode starts right now. Please be sure to subscribe to our YouTube channel. It's free. And download the podcast episodes, available everywhere you get podcasts. To support our show and get exclusive pre-release episodes and bonus content, please subscribe to Cyber Crime Junkies Prime. Link in the description and show notes. And thanks for being a cyber crime junkie.