Cyber Crime Junkies

Hidden Cyber Security Risks In Software-As-A-Service (SaaS) Platforms

March 21, 2024 Cyber Crime Junkies-David Mauro Season 4 Episode 37

How safe are software-as-a-service (SaaS) platforms? Yasir Ali, Founder and CEO of Polymer, explores hidden cyber security risks in SaaS platforms and innovative ways to reduce risks in software messaging apps. Polymer is a data loss prevention (DLP) platform for SaaS. It leverages AI to automate the protection of sensitive information across SaaS apps like Salesforce, Google Drive, Slack, Microsoft Teams, and Zoom.

We all use these platforms. Did you know there are little-known dangers in software-as-a-service platforms? Risks exist when working on and sharing documents and messaging. Everyone assumes that when they chat on Slack or Teams it's encrypted and nobody outside the organization can access it, or that confidential information shared internally through SaaS platforms will remain internal and confidential.

Well, we are wrong. There are many risks: some intentional, from third-party threat actors, and some created inadvertently by internal employees' actions, which can cause big breaches. We explore them today.

This is the story of startup founder and CEO Yasir Ali and how he is debunking common cyber security myths about software.

TOPICS:

  • hidden cyber security risks in software as a service platforms
  • innovative ways to reduce risks in software messaging apps
  • how safe are software as a service platforms
  • little-known dangers of software as a service platforms
  • new ways to reduce risks in software as a service platforms

Takeaways

  • Polymer provides a control layer for data loss prevention in SaaS platforms, offering real-time redaction and data observability.
  • AI adoption in SaaS platforms is increasing, and there is a need for AI governance and regulations to address data security concerns.
  • Deepfake technology poses risks to organizations, and Polymer is exploring ways to mitigate these risks.
  • Zero trust architecture and data protection measures are crucial for ensuring the security of sensitive information in SaaS platforms.

Chapters

  • 00:00 Introduction and Background
  • 01:12 Genesis of Polymer
  • 04:11 Data Protection and Control
  • 05:37 Risks of Oversharing in SaaS Platforms
  • 07:07 Functionality of Polymer
  • 08:05 Real-Time Redaction and Data Observability
  • 09:28 Customization and Flexibility of Polymer
  • 11:20 AI Governance and Concerns
  • 13:21 AI Adoption in SaaS Platforms
  • 15:26 AI Regulations and Governance
  • 22:17 Polymer's Role in Zero Trust Architecture
  • 24:24 Myths and Misinformation about Data Security
  • 26:21 Yasir Ali's Journey as an Entrepreneur
  • 28:14 Future Plans for Polymer
  • 30:48 Conclusion and Contact Information


Try KiteWorks today at www.KiteWorks.com

Don't Miss our Video on this Exciting KiteWorks Offer!

The Most Secure Managed File Transfer System.

D. Mauro (00:03.337)
Well, welcome everybody to Cybercrime Junkies. I'm your host, David Mauro, and we are joined today by Yasir Ali, who is the founder and CEO of Polymer, a data loss prevention platform for SaaS and AI. It automates the protection of sensitive data across SaaS apps — common ones that you can think of, like Google Drive, Slack, Microsoft Teams, Zoom.

and it's a really phenomenal platform and he's got a great story and great inspiration and some ideas that we're going to focus and talk about AI and some of the myths and some of the common risks that organizations need to address. Mr. Ali, welcome.

ya (00:51.702)
Thank you, David. Really exciting to be here. Really looking forward to our chat here.

D. Mauro (00:57.769)
No, I'm really glad that you joined. Thank you for taking the time. So tell us a little bit. You're an experienced startup founder. Walk us through Polymer and some of your evolutions here.

ya (01:12.718)
Yeah, so Polymer actually came out of some of the challenges I saw during the consulting business I was running, focused on data privacy and technology, working with finance organizations. I used to be, back in the day, on Wall Street as a bond trader. After the financial crisis, I changed my stripes and moved into more of a technology role. And that's kind of where the Polymer idea germinated from: to solve

compliance and privacy issues which were plaguing organizations I was working with, and I thought I could do it better because I didn't see any product in the market doing it properly. And it's been three-plus years now, yeah, at this company.

D. Mauro (01:57.769)
That is fantastic. You know, I used to trade options, so... but I also know that I couldn't develop something like this myself. Did you get, like, subject matter experts or somebody that could develop this stuff, or did you have the wherewithal and the know-how to develop it?

ya (02:15.63)
Well, I have a phenomenal team. My co-founder, Usman, he's an amazing chief technology officer from day one, and then the rest of the product and sales and engineering team, which is 25 strong now total. They are the real worker bees. I'm just a salesperson who comes up and talks on podcasts.

D. Mauro (02:17.641)
Okay, good. Yeah, that's where I was wondering. Yeah.

D. Mauro (02:35.017)
That's great. Yeah, that's exactly right. That's as do I, right? So I mean, I would like to think of it as that we translate. Like we translate all of the technical features and benefits and the technology into like business impact, right? We explain why it matters and why it's so important.

ya (02:54.414)
I guess that does have some value.

D. Mauro (02:56.393)
Yeah, exactly, because otherwise nobody would understand what they say half the time. So let's talk about Polymer. Walk the ladies and gentlemen through what it is. Tell us what it is. It's a quick install, like a 15-minute install, as I understand it.

ya (03:00.526)
That's right.

ya (03:13.55)
Yeah, so every single, pretty much any single organization right now is probably using either Microsoft 365 or Google Gmail. They might have a customer facing ticketing system like a Zendesk or a Salesforce. They might be writing their own code using GitHub, for example. They might also be using Slack or Teams within the environment to chat amongst themselves. So all these platforms,

D. Mauro (03:32.263)
Mm-hmm.

ya (03:42.478)
As you're discussing business, as you're working with partners, your customers, your patients, you might be accidentally, or in some cases by design, sending or transmitting sensitive information. And this could be patient records, customer banking information — hey, go wire me this money here, for example — or other intellectual property like designs or strategy documents, which you might be sharing through your OneDrive or SharePoint with a link

to the rest of the world or to your partners. And all those things over time create a risk profile on your organization's data. And Polymer basically is in the business of providing a control layer that connects in. It doesn't get in the way of the business. It sits in the back, behind the scenes, monitoring any data changes, any new file being created, any file being shared —

D. Mauro (04:13.321)
Right.

ya (04:39.374)
someone is basically emailing a document to himself or someone is sharing a document to their own Gmail account or downloading it.

D. Mauro (04:46.537)
And that happens all the time. All of these programs are SaaS programs. What we mean by that — most, almost all of our listeners know what that is, but it's software as a service. It basically means a subscription service: you get online, you get on your browser, and you're able to use these services. Think of Office 365, your Microsoft, think of Gmail, all things like that. All of the things that you just mentioned, like Salesforce, et cetera. And

People do that all the time, right? Especially when they're communicating among teams, right? And they're like, oh, I've got this file I need to share with them. And then you go and you get the share privileges and it's like share with anybody that gets the link, right? Or share just with this one person, et cetera. Most people just share with the world and then they send it over. And that does create a risk.

ya (05:37.006)
And what's shocking is that everyone is doing it, but people are finally waking up to at a company level, realizing that this is a risk, a very big risk, and which has led to some very large data breaches and data leaks in the past couple of years and happening all the time. And so it comes down to basically data protection, some data governance guardrails.

D. Mauro (05:48.233)
Yeah, it's not good. It's not good that we're doing it. Right.

ya (06:07.086)
and maybe training employees into being better data safeguards, or data citizens of their own customer information or the company data. And because it's a collaborative platform — where you can share a file with a link, or correspond with someone openly on a ticket, like taking a snapshot of your ID card and sharing it on a chat — it doesn't mean it's more secure just because it's in the cloud. It's in fact very insecure.

And we've had some major, major breaches, which I'm sure we'll get into, talking about why these risks are clear and real and present.

D. Mauro (06:45.193)
Absolutely. What does Polymer do when it sees this occurrence? Does it block it? Does it alert somebody? Does it tie the event and notify the security operations team so they can verify it? How does it work?

ya (07:07.63)
All of the above. So Polymer basically, once it connects, the first thing it does is a historical scan to give you a picture of your risk environment. What does it look like? What information is sitting publicly, or things you might not have looked at in years and years that are still sitting out there, shared with your third-party contractors — just regular stuff. We'll provide you a menu of how best to reduce the risk, with like three or four different steps. So it's a playbook, basically. That's like

D. Mauro (07:09.193)
Oh, great.

ya (07:37.294)
like day one, or step one. And step two is ongoing. So ongoing data observability: we're inspecting each and every data asset — a file, message, email, image, whatever it might be — applying AI on it to understand what information is within that data, and then basically applying a security policy on top of it. Is this sensitive data? Is this being shared with folks it should not be shared with?

And then, obviously, that's the visibility component, and then there are remediation options or warning options. So a SOC can get an alert if something matches a given threshold — maybe one person name or one phone number is not enough, but if there is a file containing a hundred phone numbers, I might want to know about that as a SOC team. Usually SOC teams are very small. Exactly.
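The threshold-based alerting Yasir describes — one phone number is noise, a hundred in one file is an incident — can be sketched roughly like this. The regex, function name, and threshold are illustrative assumptions, not Polymer's actual API:

```python
import re

# Illustrative detector: count phone-number-shaped strings in a document
# and alert the SOC only once a threshold is crossed.
PHONE_RE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def scan_document(text: str, threshold: int = 100) -> dict:
    """Return findings plus whether they cross the alert threshold."""
    matches = PHONE_RE.findall(text)
    return {
        "phone_numbers_found": len(matches),
        "alert_soc": len(matches) >= threshold,  # one stray number stays quiet
    }

# A file with 150 phone-number-like entries should page the SOC;
# a chat message containing a single number should not.
bulk_file = "\n".join(f"Customer {i}: 555-010-{i:04d}" for i in range(150))
result = scan_document(bulk_file)
```

A real DLP engine would use richer entity recognition than one regex, but the policy shape — detect, count, compare against a per-entity threshold — is the same.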

D. Mauro (08:25.219)
Right. And all of that can be customized, right? Like all of that. So as you work with an organization — and you've got some really good organizations like CVS Health, RSA Security, Edward Jones; that's really impressive — as you're doing this and you get to know them, you work with them, then you fine-tune it, right? You get to know their business, their day-to-day operations.

ya (08:53.486)
Exactly. And the system is flexible. We try to initially have everyone open up remediation mode and remediation can even look like things like we can do real time redactions. If you have an ID card floating around, we can go zap out only the sensitive items within the ID card and leave the rest of the ID card intact. So we have lots of other nifty technology.

D. Mauro (09:11.305)
Oh, really? You can actually — I didn't mean to interrupt you, buddy, but let me back up. That's impressive. So you can do real-time redaction — you can capture and stop the confidential, sensitive part from being shared.

ya (09:28.678)
Yes, and not only that, but depending on who's looking at it or wants to look at it, we can unlock pieces of that PDF, pieces of that image, depending on the privilege of the user. So you might look at a document which is redacted, containing person name and banking information. I might have access to person name, but you should have access in finance to banking information. So you can press a button and unlock. It'll recreate a file for you containing those things unredacted. So there's a lot of flexibility in terms of how we deploy remediation.

And the users can take charge of getting or overriding those remediation if they have access to it.
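The privilege-aware unredaction described here — the same redacted document unlocking different fields for different users — can be sketched as follows. The groups, entity kinds, and data model are hypothetical, not Polymer's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical entity-level permissions: each group may unredact
# only certain kinds of sensitive entities, as in the finance example.
GROUP_PERMISSIONS = {
    "support": {"person_name"},
    "finance": {"person_name", "bank_account"},
}

@dataclass
class Entity:
    kind: str   # e.g. "person_name" or "bank_account"
    text: str   # the underlying sensitive value

def render(entities: list[Entity], group: str) -> str:
    """Rebuild the document, unredacting only what this group may see."""
    allowed = GROUP_PERMISSIONS.get(group, set())
    return " ".join(
        e.text if e.kind in allowed else "[REDACTED]" for e in entities
    )

doc = [Entity("person_name", "Jane Doe"), Entity("bank_account", "DE89 3704")]
```

Here `render(doc, "support")` yields `Jane Doe [REDACTED]`, while `render(doc, "finance")` reveals both fields — the "press a button and unlock" behavior, gated by group membership.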

D. Mauro (10:07.553)
That's actually really helpful. When I think of all the various security layers and useful platforms that all help an organization reduce risk, this is one that's really customizable. It's really specific. That's great. You should really start a company with this. Oh, you did. That's really good, man. This is great.

ya (10:23.918)
And...

ya (10:30.622)
Now, thank you. And this is becoming a pretty interesting feature, actually, for AI chatbots really. That's kind of where this is becoming pretty important: where someone is trying to type a social security number or banking information into ChatGPT, our platform can go and zap it out before it goes into ChatGPT, for example, and things like that for sure.
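The "zap it out before it goes into ChatGPT" step amounts to scrubbing the prompt before it leaves the organization. A rough sketch with made-up patterns — real products use far richer detection than these two regexes:

```python
import re

# Hypothetical pre-prompt filters: replace obvious SSN and account-number
# shapes with placeholders before the text reaches an external chatbot.
FILTERS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),   # US SSN shape
    (re.compile(r"\b\d{8,17}\b"), "[ACCOUNT]"),        # long digit runs
]

def scrub_prompt(prompt: str) -> str:
    """Return the prompt with sensitive patterns redacted in place."""
    for pattern, placeholder in FILTERS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

safe = scrub_prompt("My SSN is 123-45-6789 and account 1234567890123")
```

After scrubbing, `safe` reads `My SSN is [SSN] and account [ACCOUNT]` — the chatbot still gets a usable prompt, minus the sensitive values.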

D. Mauro (10:40.105)
Yeah.

D. Mauro (10:51.089)
Yeah, so let's segue to that. So AI — ever since, you know, OpenAI came out with ChatGPT, and now there are various variations, and every company in the world that has any kind of automation is claiming that they have AI. And we've seen issues where people are overstating the AI. But the traditional machine learning and the traditional generative AI — very common. There's a lot of...

ya (11:03.944)
Thank you.

D. Mauro (11:20.201)
questions, and business owners don't know what to do. A lot of businesses are just saying, just don't use it, when that could be a very competitive disadvantage, right? You want your people to use it and leverage it to speed up repetitive processes and help with ideation and ideas. So talk to me about what role Polymer has with AI, and some of what you're seeing in the

in the industry about like governance and AI concerns as it relates to data security.

ya (11:54.136)
That's a pretty broad question, but let me try to break it down. So our thesis is — well, you have ChatGPT, obviously everyone's using it. That's been in the cultural zeitgeist as the upper echelon of what AI can do for a user:

D. Mauro (12:02.793)
I like to start broad and then go to and then and then whittle down.

ya (12:21.922)
summarize documents, write resumes, write job postings, write LinkedIn posts. I mean, it's just incredible what we can do single-handedly. At a user level, that's cutting edge — and obviously Copilot and Bing search and all that good stuff. That's been changing how we as individuals do our jobs more effectively. At an enterprise level, yes, some organizations — most organizations, I think I would say — have allowed some form of ChatGPT usage, Copilot usage of some sort.

But when you think about an enterprise adoption layer, other than ChatGPT — the one-offs with the users interacting — how do I unlock all this data, all this information I've collected over the years, all this business knowledge I have as a company? How do I unlock it and make my entire teams more productive? And we see a very small number of companies looking to build their own models. Yes, the investment banks of the world, who always were cutting edge,

always had billions of dollars of technology budget — technology companies themselves have the expertise, have the teams to go and build their own generative models. There's a very small subsection of those. The vast majority of AI enablement is taking place within SaaS platforms themselves. Hey, Copilot, summarize this for me. Hey, Google Gemini, search my documents. Write code for me.

D. Mauro (13:33.193)
Correct. Right. Right. Right, right, Sonic, right, Sonic, write me this LinkedIn post, whatever, right.

ya (13:50.094)
Exactly. These are basically things which are already available within enablements that are going on within SaaS. So we see that as the biggest AI adoption cycle in 2024 and 2025: AI adoption happening within SaaS itself. A close second to that is: OK, I have these documents sitting out there.

Can I employ other pre-trained models which are specific to my business use case? So companies like Cohere, Anthropic — you also have Hugging Face, which has a lot of open-source models sitting out there. I might have an engineer who can point those pre-trained models at my email system and allow a semantic search, a ChatGPT-type search, to be available for all my employees. The issue with that use case is, again, from a

data protection perspective: does my line worker on the factory floor have access to emails that only my management should be privy to, for example? And that's kind of where the data protection piece starts getting important: what is the output coming out to the user, and how do I restrict it for certain groups of users versus not? And that's the piece where we're playing in. We basically have opened up our platform,

enabling services like Cohere, which will go live in a couple of weeks — Gemini will be next — through our platform, where we are providing data protection guardrails while safely enabling AI in your enterprise.

D. Mauro (15:26.209)
Yeah, because otherwise, whatever is put into the machine learning can be drawn out of it, right? I mean, and that's really where the risk is, the confidentiality risk. Like, and we've seen, I thought it was Samsung or one of the other, was it Samsung where they were putting in the parts of source code for a model of one of their products because it was fixing the bugs in it, and that was great.

ya (15:33.344)
Exactly.

ya (15:42.958)
Yes, big one.

D. Mauro (15:55.047)
problem is, that's now public, right? So unless it's done correctly... So they went full circle and said now nobody can use it, which really is not in anybody's...

ya (16:08.622)
Yeah, no, exactly. And that's a big risk out there. But also, even if these models are closed — Copilot might say it's a private Copilot environment which no one else has access to — even if that's the case internally, who has access to that information? I might not want a reference to a document which contains patient data, for example. What are the guardrails around that? Can I put in some GRC — governance, risk,

D. Mauro (16:20.201)
Mm-hmm.

ya (16:38.486)
compliance — controls over what's going in or what's coming out of these models? And that's where Polymer comes in.

D. Mauro (16:44.205)
Excellent. So you help with compliance in financial institutions, HIPAA compliance for healthcare institutions, et cetera. Yeah, that's phenomenal. What is your take on where AI governance is going, like here in the US? I know over in the EU they've passed certain initial

ya (16:54.092)
Yes, I'm Max Medhi, yes.

D. Mauro (17:12.891)
AI regulations. Do you have any, I won't hold you to this. I was just curious since you're in the industry and you've developed this phenomenal platform, what is your take about, are we gonna get certain regulations do you think? Do you think that has an appetite? Is there talk in the industry about some type of AI regulations coming down in the US?

ya (17:38.19)
I mean, there's a lot of work happening. The Biden administration put out the White House initiative on AI governance, which was basically a kitchen sink of every single lobbyist's input thrown into one document.

D. Mauro (17:42.673)
Yep.

D. Mauro (17:50.973)
Yeah, right. It had like every platitude, every keyword on the planet, right? Right.

ya (17:56.206)
But it's a start. It's a start. Then on the other side, you have the regulatory bodies like NIST and CMMC. You also have OWASP, the open-source alliance, which have put out directives in terms of the top 10 risks they see. And these risks — the categories of those risks — revolve around data biases of the models,

D. Mauro (18:03.483)
Yep.

D. Mauro (18:10.859)
Mm-hmm.

ya (18:24.322)
data poisoning from a training perspective — but a majority of them relate to some shape or form of data loss: what sensitive data could be leaking, going in or coming out of these models, that could put my data at risk. And so we are not involved in any of the bias and model-training components, but we do touch upon the data loss prevention and some of the data GRC functions of those things, and we're watching those pretty closely. It's still early days.

D. Mauro (18:26.009)
Right.

ya (18:53.294)
But things are moving fast. From what I'm hearing from Washington, there's a lot of Congress-level demand. They're asking for hearings. They're bringing people in to educate them on this. I guess they're hearing from the constituency about what kind of risk AI is posing to their voting base. And so we are seeing a lot of under-the-radar action going on,

behind closed doors.

D. Mauro (19:24.521)
Absolutely. What about — I have to ask about this. Deepfakes have led to some major business email compromise scenarios, and the FBI had issued a warning back in July 2022. Will Polymer have a role

in helping organizations control their data to avoid certain deep fake issues, or is that really just something that's kind of different because it's kind of a different type of risk?

ya (20:04.334)
I mean, deepfake is like a very hairy problem. It's still evolving, but I can tell you one example of deepfake within email, from the phishing attacks, where basically we've been hearing of these incidents more and more. You get an email sent to you. There's a long chain of email, which has your CEO, your CFO, or other folks in your immediate circle in the company talking about some matter.

D. Mauro (20:08.841)
Right.

D. Mauro (20:14.217)
Hmm.

D. Mauro (20:29.609)
So it looks legit, right? So it looks legitimate. Yeah. Right.

ya (20:32.832)
looks completely legitimate, and you are like 50 deep, 20 deep into this chain, and then you get this last email: please send via the wire instructions here. And we have seen people fall into that trap. This other example I saw recently, which is pretty scary, was someone asked for wiring instructions. The employee said, I don't want to send it — this seems fishy. The CEO says, no problem, let's get on a Zoom call.

He had like six, seven other people in the Zoom call — their CMO, their CFO, a bunch of other folks in the company. And then at the end of the call, it was like, okay, send the money over. And what was eventually realized was that the employee who was being attacked was the only real person in this Zoom call. The rest were fake. So it's in real time. It's a...

D. Mauro (21:20.297)
Right, all of the other ones were deep faked. In real time. In real time. Yeah.

ya (21:28.138)
Unreal. And I'm sure there are companies, there are products out there which will be solving the problem. We're not quite there yet, but the text -based thing is something we are definitely keeping a close eye on. And we have a few customers requesting some features on our product to be able to solve those.

D. Mauro (21:42.953)
Yeah, well, and the messaging right now is done every day by everybody. So the problem that you are solving is in the grand scheme of things, much more common and has much more of an impact right now than deepfake. I was just asking about it since it tends to be. Yeah, no, it really is. That's phenomenal. How will Polymer play when it really is? I can see a direct alignment, but in

ya (21:58.286)
That's what we like to think.

D. Mauro (22:12.165)
organizations that are creating zero trust architecture.

ya (22:17.486)
Yeah, that's somewhat of a subjective answer. I can't get too crisp, because zero trust could mean slightly different things depending on what part of the tech stack you come in at. So as long as at the SSO level you're connecting MFA, there's zero trust to get into the environment. We do help with zero trust at the data level in one aspect,

D. Mauro (22:26.089)
Hmm? Exactly. Right. True.

ya (22:46.392)
and that is around reducing the amount of sensitive data that's floating around unredacted, unencrypted, in your platform. So even if you're in the environment, for you to be able to access certain documents requires you to do another check — be it MFA or an unredaction step or whatever that might be — and that does make it a little bit harder

to take the information and do a Grand Theft Auto-type leak — the video leakage which cost the game publisher tens of millions of dollars in monetary loss. And our underlying engine basically is built on entity-level permissions. So not all customers are mature enough to be able to deploy this on our side, but we have the ability, for this redaction piece, if you were to unredact,

that based on the group you're in as an employee, you can unredact or see different parts of a data set depending on the permissions. And that gets at zero trust at the data level, but I would say the implementation of that gets a little bit friction-heavy at a large organization.

D. Mauro (24:01.193)
That's phenomenal. Yeah, I can see that's why I've been taking notes. I can see how this would drastically help. So tell me about some of the myths or misinformation that is out there concerning data security in the marketplace today.

ya (24:24.16)
SaaS is all encrypted. I'm safe because all the data going back and forth over the wire is encrypted, so I don't have to worry about it. My login, my SSO, is safe enough — once you're in, you're part of the family; you can see anything and look at anything and everything. These are the two biggest myths that lead to the biggest amount of hacks, data breaches, and data theft out there in the marketplace.

D. Mauro (24:52.435)
Absolutely, yes.

ya (24:53.838)
And we know — I come from financial services, you come from financial services — so we know that financial services organizations have been ahead of the game here from a data security perspective: Chinese walls; very, very strict policies around data, document and email classifications; who can see what; need-to-know bases; retention policies that are very strict around seven years — why increase the risk if you can remove this data off your environment, purge it out?

So I think I see the world moving towards where financial services in the US at least has been for the last like 15 years. It's getting there.

D. Mauro (25:35.689)
Yeah, no, I would agree completely. Yeah, and it's gonna take the rest of the commercial industry and manufacturing and everything else, it's gonna take them 10, 15 years to catch up to financial services. Yep, that's exactly right. So that's phenomenal. Let me ask you, what caused you, I understand you were a trader and the financial crisis happened, we were all there. What caused you to...

ya (25:46.494)
Yeah.


D. Mauro (26:05.833)
to create the startup, Polymer? I mean, like, what was it? Did you grow up always wanting to be an entrepreneur? Were you inspired by somebody? Did you recognize the void? And how did you recognize it? Those are like three different questions. You can answer any of them.

ya (26:21.512)
So, you know, no, usually we as humans have a tendency of weaving a backward-looking story and filling in the gaps after the fact. So I'll try not to take that lens and instead picture myself three years ago. This is pre-COVID. I was on a bunch of different consulting projects where I had tried

D. Mauro (26:32.957)
Yeah.

ya (26:48.92)
different product ideas in the past that didn't work. And one of the superpowers, or fortunate positions I found myself in, was that because I used to work at Deutsche Bank as a consultant, CIBC, and so on and so forth, I had an email domain that let me see how the wider data analysis and data privacy world was evolving post-GDPR.

I would get hundreds of demos. I would try out different products hands-on myself. I used to be a C++ developer on Bear Stearns' trading desk, so I had a little bit of technical background going in. That gave me a sense of the different dimensions, the different buckets of products in startup land, and what it required. I used to get on these sales calls with people who would pitch to me, and I would understand what it took for some of these companies to get built. So I was learning from the wider market.

D. Mauro (27:37.351)
Mm -hmm.

ya (27:43.982)
and realized what I was good at, what these guys were good at, and what they were trying to solve for. And really it was the simple idea: can I take this little feature-widget-type thing and raise money on it? I was fortunate enough to pitch it to an accelerator who took a chance on me, and raised a little bit of money. It was off to the races at that point. To be honest, COVID hit right after that. So there's no...

D. Mauro (27:57.853)
Hmm.

Yeah.

ya (28:11.904)
ejection button on the other side. Consulting was finished at that time. No one was hiring; things were shut down for six months. And then it was like, okay, all the ships are burned, you have to make the most of where you're at. So it's a combination of having your back against the wall, having to make do with something, and just getting a little lucky with a few early customers.

D. Mauro (28:34.857)
That's phenomenal. I mean, that's the story of every great startup, right? Necessity is the mother of invention. So that's phenomenal. Well, there's clearly a need and a void here, because everybody's using these and they're all doing exactly what we've talked about, right? Everybody is oversharing or...

ya (28:47.598)
and not giving up. I think that's the main thing.

D. Mauro (29:03.485)
communicating and sharing these files, thinking that everything is secure within these SaaS programs. And without some eyes on it and some ability to redact and fix it, we're at greater risk than a lot of organizations even realize.
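The "ability to redact" mentioned here, stripping sensitive data from messages and files as they flow through SaaS apps, can be sketched minimally. A real DLP product such as Polymer uses far richer detection (ML classifiers, context, entitlements); the patterns below are illustrative assumptions only:

```python
import re

# Hypothetical patterns for a few common sensitive data types.
# These are illustrative, not production-grade detectors.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(message: str) -> str:
    """Replace detected sensitive tokens with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[REDACTED-{label}]", message)
    return message

print(redact("My SSN is 123-45-6789, reach me at jane@example.com"))
# My SSN is [REDACTED-SSN], reach me at [REDACTED-EMAIL]
```

In a real deployment this kind of transform would run in-line between the user and the SaaS API (Slack, Teams, etc.), so the sensitive value never lands in the platform's stored history.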

ya (29:20.846)
When we started three years ago, we were laughed out the door. They said SaaS is encrypted, there's no need for it. So it did take a while for the market to come our way, to be honest. It was pretty early days.

D. Mauro (29:26.729)
Hmm. Well, several large breaches can help with that, right? Yeah, absolutely.

ya (29:34.208)
That definitely helped, yes. But this is a classic example of an idea that probably would not have worked, but we just got lucky with a thesis that is proving out and becoming more relevant in the age of AI.

D. Mauro (29:50.345)
Well, we wish you nothing but the best, man. This is phenomenal. We're going to have links to your socials, as well as links to Polymer. We encourage our listeners to check those out. We'll have those in the show notes. This is really interesting. I always get wary when somebody says, oh, I've got a box and it guarantees security for your whole organization. We all know that's not true, because...

The other group of listeners that are hackers on the show are listening going, yeah, tee it up. I'm about to blow that box up, right? But here, this is a really useful layer. And this is something that is easy to install, and the impact is phenomenal. So we wish you nothing but the best. We're firmly behind you on this. This is great. So we encourage all of our listeners to check out Polymer, reach out to Yasir Ali directly through his socials, and we'll go from there.

ya (30:41.518)
Thanks.

D. Mauro (30:48.969)
What is on the horizon for you? What is coming up?

ya (30:53.902)
So we basically have opened up a platform to deploy third-party services in areas we don't cover. Software supply chain risk, for example: looking for malware while code is being written. That's a piece where we have allowed a third-party app to build and connect through our platform. And we're looking to build more services, so other services are being embedded now through our platform. So basically that construct of understanding the data flows and...

D. Mauro (30:59.783)
Great.

ya (31:21.422)
creating security policy to make sure sensitive data does not get leaked. Once that foundation is set, we have realized that you can deploy a bunch of other services on top of it, like encrypting devices, backing up devices, and all those other things. So we're looking to really open up this platform to be something bigger than what it started off to be. That's going to be a pretty interesting next chapter for us.

D. Mauro (31:31.785)
Mm -hmm.

D. Mauro (31:42.121)
Thank you.

D. Mauro (31:45.961)
Yeah, that's phenomenal. And those are logical next extensions of the platform. Makes perfect sense. That's great. Is there any way people can get in touch with you if they're interested in a demo or in finding out more, other than, obviously, Polymer?

ya (31:55.062)
Appreciate it.

ya (32:08.364)
Yeah, so email Wiley at Polymer HQ. And then obviously we have links on our website to connect with the team, and we can definitely walk you through it. We are very consultative, so we will tell you upfront whether we can be helpful or not. We don't like to waste our customers' or prospects' time, because everyone's busy. So just chat with us. We look at all sorts of use cases coming our way, and if it makes sense, we'll build it out if we don't have it already.

D. Mauro (32:34.313)
That's fantastic, man. Well, I wish you nothing but the best, and we will have you on during your next iteration, all right? As you guys expand and extend, then we'll circle back and talk about how this has grown.

ya (32:48.142)
Really appreciate you having me, David. This was an absolute pleasure. Thank you so much. Appreciate it.

D. Mauro (32:52.521)
Thank you so much, sir.