Transcript

Patrick Spencer (00:03.15)
Hey everyone, welcome back to another Kitecast episode. I’m your host for today’s call, Patrick Spencer. I’m thrilled to be joined by two cybersecurity compliance leaders, Heather Noggle, who I’ll let her introduce herself here in a moment, but just as a high level introduction, she’s a regular speaker on various circuits, podcasts and webinars. She’s an advisory board member for the Missouri Cyber Center of Excellence. She’s the founder and principal at Codistac.

And then Arun DeSouza, if you’ve been watching any of our Kitecast episodes, you’ll know he’s appeared in a couple. We did one deep dive with him on his background, which is fascinating, you know, the journey he’s had in cybersecurity. And then he joined me for, I think it was the CMMC one, I’m trying to remember which, one of our recent podcasts where we looked at one of our industry thought leadership reports, survey reports, and he provided some great insights. And he has joined us again for

Arun DeSouza (00:50.762)
Mm-hmm.

Patrick Spencer (01:02.414)
Our annual survey report where we look at data security and compliance issues. It’s a very, very major report. We’ll have that as a link at the bottom of today’s podcast. It’s 50-plus pages. It has a bunch of charts and graphs. We surveyed 461. Don’t ask me why it’s 461 versus 460. Industry professionals in cybersecurity, compliance, and risk management, as well as IT, around the globe. So there’s some

Patrick Spencer (01:29.838)
valuable insights that our audience will find useful and we’re gonna look at least at some of them during today’s podcast and I’m gonna be quiet and let Heather and Arun pontificate on those but before we dive into some of the questions I have for them I thought we’d do a little deeper dive in terms of an introduction for each of them. Heather, tell our audience where you’re based and what you do and how your career has evolved.

Heather Noggle (01:56.283)
Sure, I am Heather Noggle. I’m in the lower part of Missouri, the southwest corner called Springfield, Missouri area. And I work with the Missouri Cybersecurity Center of Excellence. It is a nonprofit that is teaching the next generation of practitioners and getting them some work experience in addition to their self-study and their degrees and whatever walks of life that our interns are working in. We bolster that so they can get jobs.

And we also are working to make the Missouri region and all of region D where we’re located better at cybersecurity, local and county governments, local cities, et cetera. And then I also do some consulting work through my firm, Codistac. I have a long career in export compliance software. So I come at everything in compliance from just a slightly adjacent angle. And also a lot of security people tend to come from the network background and the systems engineering, and I’m more software. So that’s who I am.

Patrick Spencer (02:50.306)
That’s great. That’s an interesting background. Everyone I interview has a different background. Arun, remind our audience about your evolution in the cybersecurity compliance space.

Arun DeSouza (03:04.555)
Yeah, thank you for the warm welcome, Patrick. It’s good to spend time with you. And I particularly appreciate the opportunity to chat with you and Heather today. A few words: I have over 20 years of experience as a CISO and I’ve built two security programs from the ground up. Now, the second program was unique insofar as I built an integrated security and privacy program from the outset, you know, anticipating at the time the regulatory winds of change, right?

And then as the Chief Information Security and Privacy Officer, I led the corporate mobilization for GDPR and HIPAA. So I made data characterization, protection, and governance a priority, powered by a policy-centric framework, in collaboration with key business functions such as HR, legal, internal control, and internal audit, and an overarching information security and privacy council.

My team, along with IT and other stakeholders, worked to deploy people, process and technology safeguards to ensure data privacy and minimize enterprise risk while enabling the business. And fast forward to today, now I’m an executive advisor with Netskope and I continue to stay abreast of the whole data protection space and privacy, but as well as related macro trends such as data sovereignty, observability, zero trust,

SASE, and AI, among others.

Patrick Spencer (04:32.59)
AI, we’re going to talk about that in a little bit. That’s one of the questions we had in the report; I think everyone has to ask that question. This year, for the first time, and we’ve done this in a couple of adjacent studies and reports that we’ve published, we developed a risk score algorithm that basically accounts for three different factors that we ask our survey participants about.

Arun DeSouza (04:36.137)
Hahaha

Patrick Spencer (05:00.142)
One, how many breaches related to data did you have in the past year? Two, what did the litigation cost? Litigation being just one factor of the total cost that one experiences from data breaches, as we know as well as our audience. And then three, the time to actually remediate. We created this algorithm, which you’ll find in the report, and then it scores it. We rationalized it on a scale of zero to 10. And it came out that we were at about a five in terms of average.

We interpret it as, you know, a good algorithm, so we don’t have something that’s too far off in one direction or the other. But we also looked at it from the perspective of, you know, critical risk, high risk and so forth. We had 15% of the total number of responses fall into that high critical risk area, and quite frankly, if we added in the next component, which was high risk, we had

Patrick Spencer (05:56.878)
I think almost half of the participants fall into that area. Arun, let’s start with you. You’ve had a look at those findings, at a high level anyway. What’s your reaction to seeing risk quantified in this way, the way we went about it with this algorithm? What’s it mean when the median organization sits at almost five, 4.84?
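
For listeners who want to experiment, the report’s actual formula isn’t spelled out in the conversation. A minimal sketch of a three-factor score like the one Patrick describes, with placeholder ceilings and equal weights as assumptions, might look like this:

```python
def risk_score(breaches, litigation_cost, remediation_days,
               max_breaches=20, max_cost=10_000_000, max_days=365):
    """Combine the three survey factors into a single 0-10 score.

    Each factor is clipped to an assumed ceiling, normalized to [0, 1],
    then the three are averaged and rescaled to 0-10. The report's real
    weights and ceilings are not published here; these are placeholders.
    """
    factors = [
        min(breaches, max_breaches) / max_breaches,
        min(litigation_cost, max_cost) / max_cost,
        min(remediation_days, max_days) / max_days,
    ]
    return round(10 * sum(factors) / len(factors), 2)

# A hypothetical organization: 5 breaches, $2M in litigation,
# 90 days average time to remediate.
print(risk_score(5, 2_000_000, 90))  # 2.32
```

Rationalizing everything onto a fixed 0-10 scale this way is what makes a median of 4.84 comparable across survey years, whatever the raw units of each factor.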

Arun DeSouza (06:12.233)
Yeah, so thank you, Patrick.

Arun DeSouza (06:21.237)
Yeah, you know, of the few questions that you assigned me to prepare some thoughts for, this was kind of the hardest one, honestly. I might wax loquacious for a minute, so pardon me. So first of all, based on your three factors, it’s very insightful to understand the state of cyber risk in the industry, at least limited to the survey response sample size that you have, right? However, in the immortal words of Aaron Levenstein,

Arun DeSouza (06:49.151)
What statistics reveal is suggestive, but what they conceal is vital. So in my view, these results indicate three macro observations regarding the state of enterprise risk as highlighted in the report. Number one, ecosystem risk. So many enterprises in a critical state in cybersecurity indicates a broader systemic risk, such that

vulnerabilities in any one organization can have cascade effects across interconnected systems and supply chains to amplify ecosystem risk. Number two is cyber resilience, right? It’s absolutely mission critical and vital for enterprises to prioritize cybersecurity investments and implement robust security measures in the form of appropriate people, process and technology safeguards.

And of course, enhance the ability to prevent, detect, and respond to cyber attacks. And thirdly, strategic alignment. I couldn’t help myself. Essentially, so many enterprises being in or near a critical state really implies a lack of adequate investment in cybersecurity, in my view. Especially in the current dangerous milieu of dynamically evolving and sophisticated cyber threats, enterprises are frequently attacked, right?

So in the absence of the strategic mobilization noted, enterprises will find it hard to detect attacks by myriad threat actors in a timely fashion and may consequently incur major financial and reputational losses. Now, the one caveat that I would have is that I think your results are being masked due to the present set of factors selected, right? I believe maybe a future iteration could expand these key dimensions,

adding, for example, people factors such as insider threat, training and awareness, negligence, and the cyber skills gap; technology factors like identity compromise, vulnerabilities, and misconfigurations; as well as process factors, for example, digital transformation or ecosystem compromise, emerging technology integration, legacy governance, and so on and so forth. Thoughts?

Patrick Spencer (09:03.643)
It always depends on the type of data, I think, that’s involved in the breach, right? Some data is more sensitive than others, and that’s a question we could add for next year’s report and that gets baked into our algorithm. We have four or five, or, well, if we go with your list, maybe 50 elements as part of the algorithm. Heather, do you have any thoughts on that? Go ahead, Arun, sorry.

Arun DeSouza (09:20.099)
No, well, it’s just my thoughts, right? No, no, I was simply saying…

Heather Noggle (09:27.692)
My thoughts always tend toward the people in the process because as the organization shifts and grows with whatever is changing out in the environment, what probably isn’t changing is the way we look at it and the way we view it and whether we’re seeing the risk.

Arun DeSouza (09:28.992)
Nope.

Heather Noggle (09:42.363)
And so the more strategic we are about, okay, this is changing. How does this affect what we care about and what we protect? And to do that as early as possible, I think any reports that come out that get us thinking are super helpful. And this is no exception.

Patrick Spencer (09:57.262)
That’s a good point. Now, Heather, here’s a question for you. We looked at the data from the standpoint of third parties and how many third parties organizations actually exchange sensitive data or private data with. And we discovered that organizations that exchange data with between a thousand and five thousand third parties actually had a higher risk score than those that exchange data with five thousand plus. Why do you think that’s the case?

Heather Noggle (10:19.284)
I don’t know.

Heather Noggle (10:27.549)
I think if you’re looking at more than 5,000 third-party vendors, you are probably ahead of the game and really working to manage that because it is such a risk. We really haven’t focused until this year on third-party as the big risk factor. And I think we’re starting to do that with what you’re finding and what we’ve seen in the Verizon DBIR that third-party risk is the new hotness. And therefore, we now need to.

get ahead of that with the medium businesses. In years past, medium businesses, medium-sized businesses have been ahead of the game because they’ve been adopting things on a more frequent scale than maybe the more monolithic big organizations. But I think the big organizations are having their heyday and doing a better job of taking care of what it is that they’re protecting and maybe understanding that. And with third-party risk, it’s not a new thing. We can go all the way back to the Target breach.

It is now though going to be something we have to absolutely focus on, which means we need to treat third parties as assets. We need to know exactly who they are, what they’re providing us and all of the data about them so that we can secure where they touch, what’s important to us. An alternative way to do that is to really look at what’s important to us and then be very careful about what we select as third party vendors. So it depends on where we are in the life cycle of how mature we are in our relationships and what we can change.

Patrick Spencer (11:53.71)
There’s a lot of best practices one needs to follow when dealing with third parties and it’s more than a checklist that needs to be adhered to. Is that an accurate assessment?

Heather Noggle (12:03.115)
Mm-hmm.

Heather Noggle (12:06.753)
Very accurate. You know, we’re talking SBOMs now, and knowing, in software integration, we’ve been doing that a long time. Who are we integrating with? What data are we sending? How are we securing that? How frequently are we double-checking to make sure that’s secure? And the technologies behind that have evolved over time, and how we manage that. I’ve actually shuttled flat files back and forth. I’ve been in the industry that long.

And so, you know, how do we secure that? I can talk about AS2, I can talk about modern methods, but those things change, but what we’re protecting doesn’t change so much. So we need to have a very systematic approach to how are we protecting, what are we protecting, and how do we inspect that so we get into the governance piece of it. And so now we’re governing our third parties and we need to do that very intentionally.

Patrick Spencer (12:31.01)
Ha ha ha ha.

Patrick Spencer (12:53.454)
Yeah. Arun, data classification. You go ahead, Arun. Sorry.

Arun DeSouza (12:54.643)
And if I may, insights Heather. I think we…

Arun DeSouza (13:02.955)
No, no, not to worry. I was just saying, fantastic insights, Heather. The thing that strikes me is, you know, when you’re managing, like you mentioned, over 5,000, that by nature means that you’re a much bigger organization. So you’ve got a lot more resources and, you know, people and stuff. And many a time you’ve got a platform-based approach. I mean, obviously it’s people, process, technology, but if you’ve got a platform, whatever it is that you have, I think it’s much easier to

Arun DeSouza (13:29.749)
you know, manage these and the control issues. And I think that might be something that was not asked, but might be one of the reasons why these organizations are doing better: because, you know, they can see in real time some of the risks, you know, whether there was a phishing attack compromising one of the vendors or whatever, right? So I think that it makes a lot of sense to me and I fully agree with what Heather said.

Patrick Spencer (13:53.374)
I did not, I could, that’s a good idea. We’ll add that in a supplemental report. We actually could analyze the data from the standpoint of the number of privacy-enhancing technologies, for example, and practices that they have in place versus their risk scores. That’s a great point. That would be an interesting exercise. Arun, there was a counterintuitive finding, or I thought it was, perhaps, in the report, where organizations that are somewhat confident in their data control have

Heather Noggle (14:00.705)
Yeah.

Patrick Spencer (14:22.734)
higher risk scores than those who say they’re not confident at all. You would think those that are not confident at all would be, you know, severe risk, but that’s not the case. Why do you think that’s the case? There’s always, you know, an overconfidence level when you talk about, you know, someone’s baby. You never call someone’s baby ugly, right? You think that’s what’s going on here?

Arun DeSouza (14:46.347)
Well, you know, I haven’t kept up with this, but in a sidebar to preface my remarks: for many years I used to hear that, you know, they used to have this blindfolded throwing of darts at the keyboard and predicting the stocks that are going to win, right? So I think it reminded me of that story, and I’m going to explain in a minute. First of all, like it says in your report, overconfidence breeds complacency, right? So this may well be the case in many cases.

And this is to me a clarion call, because risk posture monitoring shouldn’t be static, it should be dynamic and cyclical, right? Because people, process and technology safeguards should be adapted as the risk appetite or security posture changes due to macro factors or business conditions such as M&A or joint ventures and things of that nature. Now, back to why I prefaced it like that, blindfolded: it’s because I think it’s very important for CISOs, especially in large organizations, to be more

quantitatively focused than qualitatively focused, right? So not just blinking and saying, okay, this is really yellow or green; to some extent, if they have a scorecard that quantifies certain key factors tied to the business plan, it’s much easier to make that statement. But if you just sort of eyeball it, which I think a lot of people are doing, I think that’s the reason why we see that. I think CISOs don’t have time, but I think that’s where public-private partnership or cross-

industry groups are very, very important, because in sharing best practices they help each other out. I think that’s pretty much what it is: you know, people don’t have a lot of, you know, this hybrid approach of qualitative and quantitative. Heather, what do you think?

Heather Noggle (16:32.101)
I fully agree. I was going to highlight the qualitative approach of feeling confident. It’s a feeling. And so we’re asking an organization and its leadership how it feels. And if it’s had a good year, it might feel pretty good. But like,

if, when you buy a brand new car, you drive it off the lot, you’re already incurring damage, wear and tear to the car. So its value goes down. So if we rest on our laurels, hey, we did really well last year, we’re feeling secure, then there’s not enough quantitative there to back it up of what changes tomorrow, when we learn about a new breach, or, if we don’t know all of our technologies a hundred percent, maybe somebody has some shadow IT. And it’s just so evolving.

I think that’s what I want to add. I think you had a really good insight there, Arun, on all the big things we need to focus on, but quantitative over qualitative in terms of determining confidence is a big one.

Patrick Spencer (17:25.014)
Do the two of you think we have an issue related to that where, you know, it needs to be quantitative when it comes to classifying our data? Do we know what kind of data we have and where it resides? And moreover, even if you do, then you still need to have the mechanisms in place, because you can know, I have this pile of highly confidential IP data sitting in here, I only want certain people to have access to it.

Patrick Spencer (17:53.654)
Unless you have the controls in place, you can’t really manage that actual risk. What are your thoughts on that front?

Heather Noggle (18:02.344)
I think knowing the data, knowing what you’re protecting is more important than protecting it because if you don’t have consensus on what you’re protecting, it’s very interesting in some of the conversations I have where we use a word in a group of people and they interpret that word differently. So unless we’re really clear on what we’re protecting and it will change from time to time, it’s very hard to protect it. So everyone can add the deep insights to that.

Arun DeSouza (18:28.627)
No, just repeat the question again, Patrick, so I can modulate.

Patrick Spencer (18:34.446)
We’re talking about data classification versus actual protection controls being in place. The handoff doesn’t always happen between the two as it relates to being quantitative in our cybersecurity strategy.

Heather Noggle (18:37.089)
Mm-hmm.

Arun DeSouza (18:51.817)
Yeah, so I think we’re talking about data at a holistic level, right? Not just the metrics around success, right? Because I think the answer is different for each case. I think in the second case, the one we started with, it’s very important to have, you know, a balanced scorecard with five or seven key factors aligned to the business, and then, with an information security and privacy council, for example, identifying key success factors and measurables, and kind of keeping it and iterating it.

However, on the other side, when we look at data as a whole, I think classification alone is not enough, because, as you know, classification can’t be done manually because of the scale and pace of data growth, right? So the first thing you need is a data classification policy. But as we all know, once you have that policy, you can’t manually shine the light and microscopically manage it. You need a platform, right? I think we’ve talked about this subject before. So I think that’s where AI can be a great help,

Heather Noggle (19:28.929)
Mm-hmm.

Arun DeSouza (19:45.674)
because it can actually go and it can identify something and say, well, look, this thing has got something that looks like a social security number, so it probably is one. So fix it, the permissions are off. But beyond that, I think we’ve got to the point, we’ll talk about that again. See, back in the day, we talked about visibility, right? So visibility of data. So platforms can do it. But now, I think with AI, we have the advantage of actually using not just, I would say,

passive visibility and alerting, but even observability and proactive fixing of the data problems. You know, obviously there could be some level of supervision, et cetera. If the platform finds an egregious store of data that’s not well protected, it can, you know, temporarily just shut down access to it and send an alert to the CISO, versus leaving the risk, kind of a thing, right? So I think with AI, I see it

Patrick Spencer (20:16.75)
Corrected.

Arun DeSouza (20:43.093)
transmuting, becoming more dynamic, with the ability for security teams to be more efficient, not only in data protection, but also in so many other areas like security operations, identity management, or what have you.
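
As an illustration of the kind of automated classification Arun describes, here is a deliberately simplified sketch: one regex detector for SSN-like tokens plus a triage step that flags records for access review. Real data-security platforms use many detectors and ML models; the pattern, function names, and quarantine behavior here are all illustrative assumptions.

```python
import re

# One illustrative detector: a US Social Security Number-shaped token.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def classify_record(text):
    """Label a record 'restricted' if it contains an SSN-like token,
    else 'unclassified'. A real platform would apply many detectors."""
    return "restricted" if SSN_PATTERN.search(text) else "unclassified"

def triage(records):
    """Return the records a platform would quarantine; in practice it
    might revoke access and alert the CISO rather than just report."""
    return [r for r in records if classify_record(r) == "restricted"]

docs = ["Employee SSN: 123-45-6789", "Q3 roadmap draft"]
print(triage(docs))  # ['Employee SSN: 123-45-6789']
```

The point of the sketch is the workflow, discover, classify, act, not the detector itself, which is the part AI platforms replace with far richer models.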

Patrick Spencer (20:57.816)
Yeah, I agree. It’s everything both of you just said. Heather, talking about industries, or let’s transition over to that topic, the report listed some interesting findings on that front. Energy was at the very top of the list when it comes to risk. Life sciences, surprisingly, was relatively low. Technology was second in terms of the highest. Did you have any takeaways for our audience in terms of

Patrick Spencer (21:25.574)
the findings around industries, any surprises, either higher or lower?

Heather Noggle (21:28.257)
The energy one is a bit surprising with regulations, because you compare that with healthcare and all of the regulation there and the need to protect that data. To draw on the export compliance piece: when you’re exporting, you need to list the lot, its expiration date, and all that information about the lot at that level. A pharmaceutical company might be exporting giant amounts of pallets, and you still have all that data traceable down to that level.

Going back through that, I think technology is not that surprising because of the overconfidence factor. If we go back to that, I think that that’s where we’re pointing and that’s why it is as high as it is, is we work really hard, we’re doing really good things, they’re just not full coverage.

Patrick Spencer (22:16.984)
Yeah.

Arun DeSouza (22:18.699)
Yeah, and if I may add a couple of things, that didn’t surprise me at all. The reason is, you know, if you look at the energy and the utility sector, there’s a lot of Internet of Things devices there, the IoT devices in that space, and all the technology and the OT are meshed and networked. I mean, just take, for example, a Ring camera or your TV. We’ve already seen it in private life. So I think that’s why the energy

Patrick Spencer (22:33.486)
and old technologies that are 30 years old,

Arun DeSouza (22:47.817)
they don’t know, and we’ve seen some high-profile hacks. So that’s the first thing. Pharma is so highly regulated, and like banks and whatnot, you know, they have such a big budget because they’ve got to comply and check the box. I know people say compliance is not security, but compliance can help security because you tend to get more money. And the tech sector doesn’t surprise me at all, the reason being, even in this sort of somewhat, you know, slowing-down situation,

Heather Noggle (22:51.615)
Mm-hmm.

Heather Noggle (23:05.345)
Yeah.

Arun DeSouza (23:13.525)
There’s so much turnover in the tech industry, whether in security teams or IT teams. And you couple that with the need to adopt innovative technologies very fast. I think by nature, that adds a lot of risk. I mean, if it’s a little bit older company, they would be more measured in their approach. But tech companies are losing people so fast, and they want to implement things so fast. That to me is a perfect storm. And I’m not at all surprised by these things. So, very valuable insights.

Patrick Spencer (23:37.23)
Hmm.

Interesting. No, that’s good. Knowing we’re on the mark with the data points is always positive from an expert standpoint. Arun, go ahead.

Heather Noggle (23:47.968)
I have one more quick thing to add on that too. I think that hiring in the tech sector is broken and I think that that interplays with how secure and how we implement security. So that might be a topic for another day, but that to me also is just the hiring and how we manage people turning over. Go ahead.

Arun DeSouza (24:02.821)
No.

Patrick Spencer (24:09.326)
or an extra report. She just gave us another report to work on, I think. Arun, one thing that I found interesting in the report, and I think the two of you probably did as well, is this concept of organizations that don’t know their third-party count. We asked them, do you know how many third parties you have? They are 42% more likely not to know their breach frequency.

Arun DeSouza (24:15.295)
Yeah.

Patrick Spencer (24:37.518)
That compounds into what we might call an exponential exposure risk, right? The exposure risk goes up because of the lack of visibility. And that was a key point that we had in the report: there are a lot of unknowns out there, and organizations should know. We were talking to senior leaders for the most part when we did this survey, so we’re not talking about folks who are down in the trenches who wouldn’t have visibility into some of the questions that we posed.

Arun DeSouza (25:01.279)
Yeah, a few things come to my mind. First of all, I think it’s vital that organizations have a cross-functional third-party risk management process centered on policy and periodic risk assessment, say quarterly or semi-annually or what have you, right? Now, to your point, I think at least two high-level metrics are vital. Number one is risk tiers, because you’ve got to classify all your, at least your major, suppliers or vendors by potential impact

and likelihood of a cybersecurity incident. That’s a given. But you also need to be able to do security posture characterization, if you will, to be able to illustrate overall cybersecurity resilience across your supply ecosystem. And back to the point I was mentioning before, it’s ideal if you can invest in a platform, because today many companies don’t have the time or the budget, and doing it through Excel spreadsheets, they’re outdated as soon as you do them. So I think, assuming that you have a policy-based framework, cross-functional collaboration, and at least these two levels of tiers, I think having a platform-based approach where you have visibility across the supply chain is very helpful.

Any other thoughts?
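
The risk-tiering Arun mentions is often implemented as a simple impact-times-likelihood matrix. The thresholds, tier labels, and vendor names below are illustrative assumptions, not figures from the report:

```python
def risk_tier(impact, likelihood):
    """Map a vendor's impact and likelihood ratings (each 1-5) to a
    tier via an impact x likelihood score. Thresholds are illustrative."""
    score = impact * likelihood
    if score >= 15:
        return "tier 1 (critical)"
    if score >= 8:
        return "tier 2 (high)"
    if score >= 4:
        return "tier 3 (moderate)"
    return "tier 4 (low)"

# Hypothetical vendors rated as (impact, likelihood).
vendors = {"payroll-saas": (5, 3), "office-supplies": (1, 2)}
for name, (impact, likelihood) in vendors.items():
    print(f"{name}: {risk_tier(impact, likelihood)}")
```

In practice the tier then drives cadence: a tier 1 vendor might get a quarterly posture review, a tier 4 vendor only a check at contract renewal.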

Heather Noggle (26:16.232)
Supply chain to me has always been physical, as you look at it from the export piece. And so translating that into software, of course, that makes sense. My company was typically a supplier, a third party, and all the evolution of how we used to say, here’s how we’re secure, versus what we have to do now, it’s been interesting to look through that. And I think that’s going to be, even in the future, a bigger thing: third parties are going to have to more deeply attest to their own security than they even do now,

in a more standardized way than even, say, a SOC 2. So I see that coming. And then in addition, I think it is extremely important. And to go back, I mean, if we’re just going to look at identify, detect, protect, then we’re at the very beginning of identify. If we don’t know who our partners are, and we don’t have that in as deep and measurable a data sort of way, then that’s where we start: making sure that we’ve identified them all, what they do, what they touch, what they can take.

Patrick Spencer (27:14.126)
No, very, very true. So, all right, we’ve got to talk about AI on the podcast now, Dave. And we did a standalone report, actually on the same data set, back in June that we released. So we’ll stick that in as a link.

Arun DeSouza (27:15.539)
Yeah, well said.

Arun DeSouza (27:21.055)
Ha ha ha ha!

Patrick Spencer (27:37.134)
We’ve gotten great publicity around it; it appeared in the Wall Street Journal and some other places. But Heather, let’s start with you on the AI question. We found that while 64% track AI-generated content, only 17% have technical governance frameworks in place to do so. That’s in the June report. The 64% is in this new report that we’re publishing. The risk score data at the same time shows organizations with

no AI governance plans score 5.23, to be precise, versus 4.12 for those with fully implemented ones. This shows a substantial risk reduction if you have those things in place. Do you have any thoughts there, Heather?

Heather Noggle (28:23.903)
Without governance you can’t manage, and so those who are working without governance are going to be experiencing shadow AI.

They’re not going to know what’s leaving, without something to protect them, if they don’t have that data loss prevention in place. And it is still a wild, wild west thing with AI and what we can do with it, whether we’re talking about just general LLMs and practices over whether people can access them at work, or in-house AI and how that works. I think that governance is the extraordinarily key missing piece,

And the United States is behind some other countries in how they're adopting and what they are doing with AI. So I have a fairly laser-centric approach to that, and we're probably gonna add a bunch of wonderful things to it. So I'm gonna turn it over to you.
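The data loss prevention idea mentioned here can be sketched as a minimal outbound-content gate sitting in front of an external LLM. This is only an illustration: the pattern list and function names are invented, and real DLP products use far richer detection (classifiers, data fingerprinting, exact-match dictionaries):

```python
import re

# Hypothetical patterns a minimal DLP gate might flag before text
# leaves the organization for an external LLM service.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "api_key": re.compile(r"\b(?:sk|pk)_[A-Za-z0-9]{16,}\b"),
}

def check_outbound(text: str) -> list[str]:
    """Return the names of sensitive-data patterns found in `text`."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def allow_prompt(text: str) -> bool:
    """Block the prompt from leaving if any pattern matches."""
    return not check_outbound(text)
```

The point of the sketch is simply that without some gate like this, "what's leaving" is invisible to the organization.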

Arun DeSouza (29:10.101)
Yeah, thanks, Heather. I recently did a series of courses at Johns Hopkins on AI, so I learned a lot in a short time. I'm certainly not an expert by any means. I thought, hey, let's learn about this cool stuff. And some things have come to my mind. And this is not to pick on or say anything about any organization.

Patrick Spencer (29:23.227)
He’s the typical PhD, Heather.

Heather Noggle (29:24.001)
You bet.

Arun DeSouza (29:37.356)
I've been talking to a lot of people, a lot of friends in the industry, and these are the things that I've understood through my various conversations. Not all organizations are facing these, but in no particular order: first is a lack of strategic vision, business alignment, and leadership buy-in, because at the end of the day, just like cybersecurity and privacy, where does AI fit into the business's long-term vision and objectives, right? Upper management needs to articulate that, and then call in functions like…

IT and legal to see what that does to the enterprise risk, right? Now, to me, I think AI should be one of the key enterprise risks. You've got to start with that. But I think many companies don't have an AI policy or reference use cases, and those are separate things. The policy is a policy, but if you have an ever-evolving set of use cases for HR or legal, say, you want to do this, you can use AI; you want to do that, you can't use AI, and keep…

maintaining that, then that can be a trusted guide for the people and the business functions, because today people are just flying blind. We talk in cybersecurity about the skills shortage, but my God, what about AI? Demand for AI skills, I think we can all agree, far outweighs the supply today, right? That's for sure. And then…

Heather Noggle (30:34.197)
Mm-hmm.

Arun DeSouza (30:58.347)
The other thing is, okay, you want to use AI, right? Take the example of your data classification system. We talked about AI there. How do you fit these things into the lens of integration, interoperability, and backwards compatibility with existing systems, right? Because you can't just plug it in, because it wasn't designed for that. And what risk can you have? And just like with Power BI back in the day, you could take, you know,

data from one protected system, then two and three, and make this view, and all hell would break loose. So people need to start thinking about all these issues. And the fact of the matter is deployment costs, right? Because the thing is, and I'm talking not just about the technology per se, it's the people and process costs, the time and effort. To me, you know, deploying AI is kind of like the old days of Six Sigma, right? People wanted to deploy Six Sigma.

What they forgot is that in order to deploy Six Sigma on top of people's existing jobs, a lot of companies realized it was not that easy. I think the same is happening with AI. People are just diving into it because it's a cool thing to do, but they're not doing the due diligence. I think scaling and complexity have their place as well, because people don't realize it's kind of like the shadow IT Heather was describing earlier: Department A does one thing, and how cool is that, can we use your model, make some tweaks to it?

Heather Noggle (32:01.546)
Yeah.

Arun DeSouza (32:17.459)
And they just rush to deployment, right? And it suddenly gets so complex, because just like shadow IT, shadow AI deployments are also expanding. So why do I mention that? Because to me, it's just like mobile security when it first started: people would say, okay, this is very important, let's just put it out there, and all these breaches would happen. They didn't do proper evaluation. They didn't do vulnerability assessment. They didn't do much of anything until the space matured. So I think…

What I'm trying to say is we need, and this is just my phrase, an AI development and deployment lifecycle, based and centered on policy and business partnership, with clear metrics and things of that nature. At least that's what I think. I mean, it's a lot, I know, but it's one of my favorite topics these days.

Patrick Spencer (33:04.619)
It was.

Heather Noggle (33:04.639)
I think it mirrors the SDLC, the software development or system development life cycle, very closely. When we look at AI, if we approach it appropriately, it doesn't change that much in terms of where we understand things need to fit together and how we govern them. But it doesn't seem that way. I like to call that the technology hype trap: oh, I hear about this great new thing, I've got to implement that somewhere in my company, how do I do that? And then in the early days it's not where it needs to be.

Arun DeSouza (33:16.843)
Mm-hmm.

Patrick Spencer (33:34.894)
Yeah, that's a great point. You know, for those in our audience, if you haven't checked out the new IBM 2025 Cost of a Data Breach Report, do so, because it has a very, very substantial amount of information on AI. We were talking before the podcast started about a couple of findings that I found quite interesting. Of all the AI-related breaches that happened, 97% of them lacked controls.

Heather Noggle (33:37.301)
We’re still in the early days.

Patrick Spencer (34:05.134)
That's why, right? If you don't have the right governance policies and technical controls in place, you're more likely to be breached. The average data breach that was related to AI cost $670,000 more than the average data breach. Data breach costs actually went down a little bit this year, which is the first time, I think, in the past

five or six years, or maybe the first time since they started publishing that report a decade ago. So there's some substantial risk involved if you aren't following the right strategy to begin with, or if at this point you don't have the right policies and controls in place. So, all right, we could have a whole separate conversation.

Arun DeSouza (34:45.387)
One thing, Patrick, one thing I just want to mention. You know, we've been talking a lot about what companies can or cannot do, but the thing is, sometimes you inherit problems too. For example, I don't know if either or both of you have heard of this thing called EchoLeak. It was a zero-click AI vulnerability that exploits Copilot's use of historical contextual data to silently execute hidden prompts without user interaction, right?

The unique thing about this attack method is it relied on embedded, invisible prompt injection. And why do I mention that? Because it demonstrates a broader class of risks tied to GenAI features like summarization. And it brings to mind the importance of securing not just AI models, but also the environments they act in. I learned about this from Trend Micro, because I subscribe to them, and I'm just recollecting from memory what they said.

I think it's important, just as when you onboard SaaS programs or SaaS systems, that as you start to take on systems with AI, you apply the same sort of due diligence.
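One piece of that due diligence, in the spirit of the EchoLeak discussion, might be scanning inbound documents for hidden prompt-injection payloads before an AI assistant summarizes them. This is a hedged sketch: the markers below are illustrative red flags, not EchoLeak's actual payloads.

```python
import re

# Illustrative red flags: invisible Unicode characters often used to
# hide text from human readers, plus instruction-like phrases aimed at
# the model rather than the reader.
INVISIBLE_CHARS = {"\u200b", "\u200c", "\u200d", "\u2060", "\ufeff"}
INJECTION_PHRASES = re.compile(
    r"ignore (all )?(previous|prior) instructions|you are now|system prompt",
    re.IGNORECASE,
)

def scan_document(text: str) -> list[str]:
    """Return human-readable findings for suspicious content in `text`."""
    findings = []
    if any(ch in INVISIBLE_CHARS for ch in text):
        findings.append("invisible characters present")
    if INJECTION_PHRASES.search(text):
        findings.append("instruction-like phrase aimed at the model")
    return findings
```

A real control would sit alongside model-level defenses, since keyword lists are easy to evade; the sketch only shows where such a gate fits in an onboarding checklist.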

Patrick Spencer (35:52.174)
That’s a great point.

Patrick Spencer (35:58.636)
Yeah, very, very true. When you onboard an AI system, whether it's a private one or a public one in your environment, you need to go through a checklist just like you do for a normal SaaS solution, to make sure that you have all the proper security controls.

Patrick Spencer (36:14.798)
Well, I'm being cognizant of time. I want to make sure we cover a couple more topics here before we wrap things up. We could have a two-hour conversation, I think, from the report, because there were a lot of interesting things in there. That's what happens when you use AI to do all this cross analysis; now you can ferret out a lot more things than you can with just humans doing it. The scoring rubric that we had, you know, heavily weights detection time. And we saw, and this is for you, Heather, that 31% of organizations, going

Patrick Spencer (36:44.666)
back to the third-party issue, with 5,000-plus third parties in their environment take over 90 days to detect breaches. Man, there's a lot that can go on in 90 days, unfortunately.

Heather Noggle (36:57.675)
Yes, we're back at identify, and now we're at detect. Detect should not be that difficult if identify is done well and the right tooling is in place. However, the reality is, I like to think about it with an analogy: maybe there's one corporate refrigerator, and there's a hidden drawer in there, and somebody put something in there, and we can detect that in a matter of days by smell.

That's not necessarily always the case if the right security infrastructure and tooling isn't in place. The 90-day dwell time to an outsider seems like a very long time. I've seen other reports where, in some organizations that are less mature, it's a year or something like that. So I'm not surprised by it. I am disheartened, because 90 days is a long time to do damage, to exfiltrate data and cause problems with it. So I'd like Arun to talk a little bit, and then I want to weigh in with something to see if he says what I'm thinking.

Arun DeSouza (37:52.716)
Now, that's a hard act to follow, Heather. I'm going to try my best. So I think, in addition to what Heather said, in my view organizations should prioritize breach detection speed improvements by focusing on a combination of proactive measures, robust monitoring, and an effective incident response plan. And there, I'm going to say it: at least semi-annual, if not

quarterly, tabletop exercises, driven by key playbooks developed for things like supply chain or AI, and keeping those up to date. Because preparedness is essential; anytime you hit a breach or an incident, you know, the paper plan is okay, but how do you know it's going to work very well? So include the business functions in it, making sure that, you know, communications know what needs to go on in case of a severe breach. All that needs to be, you know, practiced.

Heather Noggle (38:23.883)
Ten more times.

Arun DeSouza (38:51.637)
Perfect. And I think many companies don't do that enough, especially in the age of AI, and especially when people are diving in feet first. So I think preparedness is key. In addition, you need to implement or deploy continuous monitoring, automate threat detection and response, and also regularly test incident response plans beyond just the tabletop exercises, which are educational. So I'm talking about calibrating the plans on an ongoing basis, right?

So focusing on these areas will help organizations identify and contain breaches faster and minimize potential damage. But that's not enough. We talked earlier about observability and having a paradigm shift from just passive insights to observability. So I think really leveraging platforms, whether it's a SIEM or SOAR platform or an EDR platform or data classification that has observability in it,

Heather Noggle (39:19.339)
Mm-hmm.

Arun DeSouza (39:46.645)
to take proactive corrective action with some level of supervision, I think that would be much needed. Thoughts, Heather?

Heather Noggle (39:53.729)
The tabletops are what I was looking for. I tend to focus in security on the people, and then lightly on the process, with the technology supporting. I think you and I complement each other, because you've got the deep technical knowledge you can bring in there to look at technology and process, in addition to where it touches people, in things like tabletop exercises and intentional planning and strategy. So thank you for adding all that, because I couldn't have done it that well. Appreciate that.

Arun DeSouza (40:18.315)
You know, teamwork makes the dream work, Heather. You're my new friend; it's the least I can do.

Heather Noggle (40:21.377)
Indeed.

Patrick Spencer (40:25.134)
All right, let's see. I'm looking at our questions here. We're gonna skip one or two, but there's a couple of these I wanna make sure that we cover. Privacy-enhancing technologies: those without any privacy-enhancing technologies scored high, no big surprise, on the risk score index, versus all those who said they had a significant investment, which meant they had multiple technologies and processes in place.

They scored much, much lower. Arun, what's your sense here? We actually found that the adoption rate was pretty low here in terms of responses. I was surprised that it was this low. Do you have any thoughts on our findings?

Arun DeSouza (41:13.035)
I'm very shy, Patrick, I've got to think… No, just kidding. In my long experience, almost a decade as Chief Privacy Officer, you know, I've actually fought these battles, so I can speak a little bit about them. I think the reason is a variety of things. Number one, it's complexity and usability, right? Because these PET systems, as you introduced them, require a higher degree of technical expertise than the average user

possesses, right? Or because there's no training, because many times you take the system and you just put it out there and people use it however they can. So there's got to be a proper sort of onboarding plan in that case. That's the first thing. And in fact, I would say that many times, you know, we need to bring the users into this complicated technology ahead of time, even in the pilots, getting their feedback. And I think that's one of the main reasons, in my view, that these systems

may fail. Next is cost, right? Because it's not just the cost of the platform or the time spent evaluating before buying; the deployment and maintenance can be expensive and can also require major computing power, right? Especially for the many organizations that are challenged by limited resources, like many of the ones that took the survey and don't have it, I'm sure. The third thing, I think, is infrastructure and standards.

Because a lack of robust, widely adopted infrastructure and standardized solutions for privacy protection hinders interoperability and overall adoption. And interoperability is key, especially, for example, in the health care space, because the risks of breaches are so high. And I think there's not always the proper framework. So that's a scary thing.

And of course, I have to say auditability and governance challenges are something Heather can relate to, I'm sure. It's difficult to audit for regulatory compliance and accountability while balancing responsible use. That's a hard one, you know, and I always struggle with that a little bit, making it clear to upper management. Then I think there's limited awareness and understanding, right? Because, you know, it's sort of a niche skill set. Privacy is…

Arun DeSouza (43:30.059)
Very few organizations have dedicated privacy organizations; it's more like a loosely federated coalition, you know, where the CISO knows a little, Legal knows a little, HR may know a little, and they don't even do it at a programmatic level, like having a privacy program manager and all. So how do you then make users understand, right? I mean, I used to do boot camps and, you know, privacy training, but I had a limited budget. But we need to do all that.

Then, I think we talked about this in another area: complacency, right? Because people are so worried about oversharing data, and then they sort of get lulled into a false sense of security, and it undermines the caution that, hey, we need to be doing something more, right? And then the trust deficit, because we've seen so many data breaches and whatnot, even with password managers and so many other things, that…

Many times, even me as an end user, I’m skeptical about a provider’s ability to prioritize data privacy, which to me is a significant barrier. Heather, Patrick, did any of that resonate with you?

Heather Noggle (44:35.039)
A lot. Privacy seems like it is just not the focal point in a lot of companies. Like you said, and sometimes this happens with exports as well: there's not an export compliance officer until there is. And so there's not a privacy officer until there is. There's usually some sort of series of catalytic events that say, okay, now's the time, let's do this. And then that's a nice place for change, and for adopting an approach where privacy is something that's not just a subset of security.

Patrick Spencer (45:05.88)
Do you see a move and adoption by some of your clients to put in place a data privacy officer? Or are they still tacking that on to multiple people within the organization, or adding it to the CISO's responsibilities? What do you see from a trend standpoint?

Heather Noggle (45:29.035)
Arun, will you handle that one? Because the organizations I work with tend to be smaller and do not have a privacy officer.

Patrick Spencer (45:34.038)
They don’t have one.

Arun DeSouza (45:37.132)
Yeah, well, you know, I think definitely a few observations about that. I've come from the automotive industry, you know, the midsize enterprise, right? And even there, sometimes it's not that easy unless you, for example, have some tailwinds. Like, for example, if you're working in Europe and you need to comply with GDPR, right? Or you operate in China or are a Chinese company and have to comply with the data privacy law, or in the United States, HIPAA or California.

Many companies won't do it. That's the one thing. But the hidden thing that I've observed is, many times I've heard people make these comments, two of them. One is to say, you know, this GDPR, for example: we're this small company, they're not going to go after us, they're going to go after Google and all those big companies. But that's kind of like playing Russian roulette, in my view, right? One shouldn't be doing that.

So that's the first thing, like saying, okay, is it really worth it? The other thing that I've run into very often in my career, and it's happened to me at least a couple of times: "Why do you talk so much about privacy? You know, you've given us cybersecurity training. You've talked to us about the confidentiality, integrity, and availability triad. That's perfect. So why do you talk so much about privacy?"

And I said, well, it's great that you remember your training. That's wonderful, and thank you for that. However, I would say it this way: cybersecurity is about data, you know, that's a cornerstone, and privacy is about people. It's when the data is about people that security and privacy become inextricably intertwined.

Heather Noggle (47:33.983)
Mm-hmm.

Arun DeSouza (47:35.444)
And that's back to my earlier point about the lack of awareness and understanding. So those are the two things. With that being said, I think for sure, if you have those regulatory tailwinds, at least go to the point of appointing a DPO, because GDPR requires it; even Singapore's data protection law requires it. And if you don't have enough resources, at least there's got to be a programmatic approach, and I think that should probably be led by the CISO. And I think one of the things,

not to get on the soapbox: many a time the CISO is under IT, and I think the CISO should be a peer of the CIO and report to the Chief Risk Officer. And one day, maybe, that umbrella of enterprise risk, security, and privacy will be under the Chief Risk Officer, and a qualified CISO could be the Chief Risk Officer. So I think that's needed, but we're not there yet, right? Because otherwise it's just going to be, you know, the example I'll give you is, you know, some of us may have been to the disco, you know.

You go to the disco, you see all those beams of light, and it's all really cool. But if you take all those beams of light and you collimate them, you get a laser. And that's what I see in the field of privacy: it's like a discotheque, all these dancing beams of light. There's not enough coherence and focus, if that makes any sense.

Heather Noggle (48:47.027)
Now I’m seeing black cats roller skating at the disco ball as a cybersecurity topic and privacy topic because I’ve got to write about that. That sounds very fun.

Patrick Spencer (48:52.846)
I’m sorry.

Yeah, I didn't think we'd be talking about discos on this podcast. All right. We've got to wrap things up, but before we do so, I'd like both of your perspectives on takeaways, or what organizations should do next. Heather, let's start with you. You know, maybe we're at a critical inflection point this year. Almost half are at a point of high to critical risk. That means those organizations are at risk of incurring, in many instances, millions of dollars in

Heather Noggle (48:58.642)
Yeah, I know.

Patrick Spencer (49:25.272)
detrimental impact to your company, whether it's financial penalties, reputation loss, litigation costs; the list could go on and on, right? They need to do something. What do you recommend to these organizations? Where should they start? How would they prioritize the list of actions?

Heather Noggle (49:43.971)
Well, first, don't panic. Panic should never be a priority. But sometimes, after reading one of these reports, if you know that something in the report is a sore spot, then panic might be the first response. So that's what not to do. What to do is not a one-time thing.

We have to be strategic, and it can't be that everybody just sits down once and figures out what we're going to do. It has to be something on some cadence that says: this is a priority to us, here's the reason why, here is who it affects, the stakeholders, and here's who needs to be at the table. Because if we ask those questions and get those answers, then the team that works out how to be better at security might be comprised of different people than

we might've thought it would be years ago. You can go back to what Arun said: now we're looking at security as not really just a technical thing; it's how we enforce it. And now that technology is so integrated in our lives, it sits under the CIO or it's seen as a technical thing, and it needs to evolve into being seen as one of the prime risk factors for any organization, because we are so interconnected. So I would say: take a look at how our strategy is going to be ongoing and who needs to be at the table. And it may be different than who we thought.

Patrick Spencer (50:56.33)
Interesting. Arun, thoughts on your part?

Arun DeSouza (51:01.375)
Yeah, I think based on the report, the thing that I would say, and I think Heather touched upon it earlier, is enhancing visibility. Because you can't protect what you don't see, right? Whether it's for third parties, AI, compliance, and what have you. But you don't just enhance visibility; as we talked about earlier, it's boosting visibility with observability and orchestration, to be proactive instead of reactive and to initiate corrective workflows.

So we talked over some of that earlier, but then how do you do that? I would submit a few points. First is to enhance automation with intelligent insights, because observability will provide the data and insights necessary to build more intelligent and responsive orchestration workflows. Then, accelerating and optimizing incident response: orchestration can automate the execution of predefined responses to incidents

detected by observability, enabling faster and more efficient incident resolution, right? So, to your earlier question, that's how you can reduce the time for detecting breaches and all that. Then, streamlined operations and reduced downtime: combining real-time visibility and automated workflows for quicker issue identification and remediation, which ultimately reduces downtime and improves overall system reliability. And now, sort of the softer ones.

Two of them: improve collaboration and communication, right? A unified platform for visibility and orchestration will foster better communication and collaboration between the different teams, both on the IT side and the business side. So in case things need to be done, people will actually be able to enact some of the tabletop exercise lessons or the annual things that they did, right? And they can work together more effectively.

Heather Noggle (52:25.515)
Mm.

Arun DeSouza (52:47.987)
And last, back to my laser analogy: a laser focus on continuous improvement and optimization. Because the insights gained from observability can be used to continuously refine and optimize orchestration workflows, leading to ongoing improvements in efficiency, performance, and reliability.
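The orchestration idea outlined above, predefined responses triggered by observability signals, can be reduced to a toy rule lookup. The alert names and actions below are invented purely for illustration; a real SOAR platform would layer enrichment, approvals, and audit logging on top:

```python
# Minimal sketch of observability-driven orchestration: alerts matched
# against playbook rules trigger predefined response actions.
PLAYBOOK = {
    "impossible_travel_login": ["disable_account", "notify_soc"],
    "mass_file_download": ["revoke_session", "snapshot_activity", "notify_soc"],
}

def respond(alert_type: str) -> list[str]:
    """Return the predefined actions for an alert, or escalate if unknown."""
    return PLAYBOOK.get(alert_type, ["escalate_to_human"])
```

The design point is the fallback: anything the playbooks don't cover goes to a human, which is the "some level of supervision" mentioned earlier.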

Patrick Spencer (53:11.118)
That’s great. Are there any final thoughts on what Arun said?

Heather Noggle (53:15.13)
Combine everything I've said: from the high level, get the people going, and put the process and the technology to it. And I think you have the answer there completely: it is the people, the process, and the technology, and the people make the processes and the technology.

Patrick Spencer (53:30.52)
That's great. Well, I really appreciate all of your time today. You guys had some fabulous insights that our listeners will certainly find helpful. I encourage everyone to check out the report; it doesn't require registration. One click and you have the actual report. Make sure you click on that at the end of the podcast. And I welcome your insights and ideas in terms of how we can enhance and improve the report next year and in years to come.

Before we conclude, Heather, how should folks get in touch with you if they’re interested in knowing more about your consulting services?

Heather Noggle (54:08.65)
You can reach me on LinkedIn; it's probably the easiest way to do it. Just my name, Heather Noggle. And then I also have HeatherNoggle.com, and hnoggle@codistac.com for email. Great.

Patrick Spencer (54:18.73)
And we'll include the links at the end of the podcast as well. Arun, I assume it's kind of the same process for you as well.

Arun DeSouza (54:27.017)
Yeah, the same. LinkedIn is the best way. It's always with me. So yeah, thanks.

Patrick Spencer (54:32.238)
Well, we encourage our listeners to reach out to both Heather and Arun if you think you need their expertise as you tackle some of your cybersecurity compliance projects. Well, that concludes another Kitecast episode. I think this is up to episode 47 or 48; I've lost track. We're almost at 50. I know there are some great episodes, including the one where we talked to Arun about his background, and then the other one where we did a deep dive on the

other reports. So check those out. I look forward to having both of you on another Kitecast episode in the future. Thanks.

Heather Noggle (55:09.121)
Thank you.

Arun DeSouza (55:11.093)
Thanks, Patrick. Thanks, Heather. Appreciate it.

Heather Noggle (55:15.733)
Thank you both.
