This Kitecast episode features Chris Pogue, Director of Digital Forensics at CyberCX, a cybersecurity veteran with 25 years of experience. Chris brings unique insights from his extensive background spanning penetration testing, executive leadership, and military instruction. As an adjunct professor at Oklahoma State University, he teaches both international business and digital forensics, emphasizing the critical importance of communication between technical and non-technical stakeholders.
Chris introduces CyberCX as “the biggest cybersecurity company you’ve never heard of”—a pure-play security firm with 1,500 professionals globally. Founded in Australia through the acquisition of 24 boutique security firms, CyberCX stands apart by focusing exclusively on cybersecurity expertise without the distractions of hardware sales or software development. With specialized teams including 200 penetration testers and 40 incident responders, they offer comprehensive security solutions tailored to each client’s unique risk profile.
The conversation reveals alarming trends in the threat landscape, including the surprising resurgence of SQL injection attacks targeting forgotten systems and unpatched vulnerabilities. Chris explains that once an exploit is announced, threat actors typically begin targeting it within 24 to 48 hours, yet organizations often take 60 to 90 days to implement patches. The podcast also explores how ransomware tactics are evolving from simple data encryption to targeting operational technology and critical infrastructure, creating more leverage by disrupting business continuity rather than just threatening data exposure.
Third-party risk management emerges as a critical concern, with Chris noting that the traditional “castle and moat” security model has become obsolete in today’s interconnected business environment. He describes how business email compromise attacks frequently move laterally across supply chains, with compromised trusted partners becoming vectors for invoice fraud and malware distribution. The conversation also touches on the emerging role of AI in creating more convincing phishing campaigns and voice synthesis attacks.
The conversation takes a fascinating turn when Patrick raises concerns about generative AI security risks. Patrick references a comprehensive Stanford study that found approximately 8.5% of content being loaded into public AI tools is private or confidential information that shouldn’t be exposed. Chris acknowledges this emerging risk, noting that while many organizations initially responded with blanket prohibitions against AI use, this “ostrich head in the sand” approach has proven ineffective, instead giving rise to what Patrick calls shadow AI. Both experts agree that AI security represents an evolving frontier where cybercriminals are already developing techniques to retrieve sensitive data that organizations have inadvertently exposed to these systems.
Drawing on decades of experience, Chris offers this compelling perspective on security investment: “In my career, I have yet to find an organization who under-invested in cybersecurity and was thankful that they did later.” With data breach costs averaging $4.88 million globally and over $9 million in the United States, the economic argument for proactive security becomes increasingly clear. Don’t miss this eye-opening discussion on the frontlines of cybersecurity defense.
LinkedIn:
https://www.linkedin.com/in/christopher-pogue-msis-6148441/
CyberCX:
https://cybercx.com/
Transcript
Patrick Spencer (00:01)
Hey everyone, welcome back to another Kitecast episode. I’m your host for today’s show, Patrick Spencer. Today’s podcast is going to be a real treat. I’m speaking with Chris Pogue, a cybersecurity luminary with a 25-year journey that spans everything from hands-on penetration testing to executive leadership, in public as well as private sector roles. Through his experience as a global head of operations, a chief information security officer, a military instructor, and
a university professor, among other things, he’s developed a rare combination of technical expertise and business acumen that allows him to decode complex security challenges through a strategic lens. Currently, Chris serves as the director of digital forensics over at CyberCX, and we’re going to have an interesting conversation about that organization, because they’re doing some really cool things. He’s also an adjunct professor over at Oklahoma State University.
Chris, thanks for joining me today. I’m looking forward to this conversation.
Chris Pogue (01:02)
Yeah, thanks for having me, Patrick. It’s a real pleasure to be here.
Patrick Spencer (01:05)
Well, thank you. So, you know, some in our audience may not be familiar with CyberCX. Let’s start by talking a bit about the organization, what you guys do, how long you’ve been around, who your clients are, and so forth.
Chris Pogue (01:19)
Yeah.
Yeah, it’s a, I you, it’s a great organization. It’s the biggest cybersecurity company you’ve never heard of. predominantly because it is a Southern hemisphere company. So born and raised in the Australian government. The co-founders, one of them, Alastair McGibbon, the other one, John Petteritis came out of, one came out of government, one came out of Telco. And they said, look, everyone is sort of nibbling around the edges. We want to do…
a very strong, very deep, pure play cybersecurity company. And so they gobbled up 24 different boutique organizations, put them under one banner, which is the CyberCX banner. while as an organization, we’ve only been officially around for five years, the companies that CyberCX purchased during that acquisition phase have been around for as much as 20 years. And so we have a very deep and long pedigree in the Australian New Zealand market. Well then,
through a combination of organic and inorganic growth. Today, we sit at about 1,500 cybersecurity professionals, and we are pure play cyber. We don’t do anything else. We don’t sell hardware. We don’t sell software. We don’t schlep cloud storage. We are pure play cybersecurity experts, which makes us one of the largest teams on the planet. And then just because of market dynamics and size, the next…
Patrick Spencer (02:21)
Wow.
Chris Pogue (02:40)
logical moves for the company were the US and the UK. And so we have a presence of about 25 people here in the United States, but our capabilities are the global team, right? We have 200 penetration testers. We have, you know, 40 incident responders. We’ve got about 250 people in our strategy and risk team. So very wide, very deep capabilities. And,
like I said, a very wide swath of different, unique pillars within the organization.
Patrick Spencer (03:16)
Interesting. So you have an advantage in that you’re not distracted by all the other things that you could be involved with from a professional services standpoint. You guys are focused only on cybersecurity, whereas the big five, you know, they’re doing that, and they’re doing, you know, supply chain methodology, they’re doing business re-engineering, among many other things. You guys are totally focused. So when someone engages you, they’re getting your full organization and the scope of your expertise in cybersecurity only.
Chris Pogue (03:46)
Correct. And you know, I’ve been in cybersecurity long enough to know that no investigation, no pen test, no strategic uplift is ever singularly focused, right? There’s always a, well, we need some help with cloud, and we need some help with networking, and well, we really want to do a threat hunt, and well, we have to do a pen test. And so it’s very easy to have a conversation about one of our service offerings, and then it becomes five conversations, right? About, how do we…
How do we build this and wrap this and make it, you know, part of this complete breakfast, right? Remember those commercials when we were kids? And so that is our strength, right? And as you pointed out, we’re not distracted. We don’t have things that are me-toos. We just do this, and that’s all we do, which is very powerful. And then we focus on tradecraft. Like, yes, we want to make money, and yes, we, you know, want to…
compete and go to market and all the things a business should do. But one of the things that makes CyberCX so fulfilling to work at is we genuinely focus on tradecraft. We want subject matter expertise to be as deep as possible. And so we encourage training and education and cross-training and cross-function. And so when you get someone on the team, you know that they’re going to be deep in what you’re asking them to do, but they may have expertise in other areas.
Or they’ll know enough to go, look, I can have about a five-minute conversation about this, but I tell you what, let me get one of the foremost experts, you know, in the Southern Hemisphere on this very topic, and we’ll get them on the phone, and then we can create really robust solutions very quickly.
Patrick Spencer (05:26)
Interesting. Before we jump into some other details around your thoughts, for example, around where digital forensics and investigations are going, and some of the reports you guys publish. The Verizon Data Breach Investigations Report just came out, so it’s probably worth a brief touch on it as well. But before we do so, let’s talk a bit about your background. You’ve been over at CyberCX, I think, for about three, four years now, if I remember correctly.
Chris Pogue (05:55)
I just passed two, so I’m at two years, two months now.
Patrick Spencer (05:56)
I do. Great.
And then you teach part-time over at Oklahoma State University. That’s what you’ve been doing recently. Talk about your career. I mean, you have an expansive career in both public and private sector, military experience, a lot of cool things that you’ve done.
Chris Pogue (06:14)
Yeah. So, I’ve been at CyberCX two years, little over two years with, you know, in the forensics role. But as you said, I’m an adjunct professor at Oklahoma State University at the Spear School of Business. So I teach international business and digital forensics. And so I get to write the courses, which was fantastic. So I get to mash my students together. So when we run tabletop exercises,
I bring in my business students and give them a lens of what this looks like if they were going to be part of a data breach scenario in their jobs in the future. It also teaches my, you know, my MSIS students that one of the most important lessons I think that anyone entering this industry can learn is how to communicate. Like it’s easy to communicate with other technical people, right? We all sort of speak the same language and we have our own jargon and acronyms, but it’s really difficult to get.
non-technical people to understand what we’re talking about sometimes. It’s a different language. so teaching them early on how to translate what it is that you’re trying to say into the language of your target audience. Cause I can tell you as a former CISO going into a board meeting or an executive meeting, speaking tech does not get you far, right? Everyone starts wondering what’s for lunch as opposed to, you know, listening to you talk about speeds and feeds and ones and zeros and firewalls and
and all the stuff that we like talking about. But if you can make that transition and think, well, what is the chairman of the board really interested in? And what is the CEO really interested in? And what is the CFO really interested in? And how do I nuance my language so that it resonates with them? Well, then I can communicate what I want. I just do it in a different way. And then they go, OK, well, I want those things. So therefore, you can go do your things, Chris.
really important piece that I’ve learned over the years, both consulting and being a CISO is that language is just so important. And if you don’t get that right, you’re just gonna struggle. And so I get the opportunity to teach that to my students. So now coming out of university, they’re ready to hit the ground running and have a little bit of an advantage over some of their other competitors in that landscape.
Patrick Spencer (08:38)
That’s great. A lot of your students aren’t necessarily cybersecurity professionals, it sounds like. They’re business majors, they’re finance majors, marketing majors maybe even, who knows?
Chris Pogue (08:46)
Mm-hmm.
Yeah, it’s all across the spectrum. so ⁓ even though my tech students are typically, you know, they want to go into cybersecurity, right? They’re getting a master’s degree in data science or data analytics or management information systems. Like they’re true technicians. ⁓ But as you said, my business students could be everything from finance majors to econ majors to, you know, marketing, entrepreneurship, right? They’re business people. And then how do we mash them together?
And how do we get them speaking the same language? So it really is a microcosm of the much larger industry, but so important for them to learn it at those early stages. And I was just, in fact, I was just on a panel in India last week where we talked about this exact thing with a number of universities across India, having to deal with students are very smart and they’re very technical, but now you’re in the global job market.
and how do you take that technical knowledge that you have and not let it pigeonhole you into, you’re just a technician. All you’re going to understand is technical stuff, but you understand the business of that technical stuff and how it interacts with, you know, all of the different aspects of the business and what you’re trying to accomplish and things like that. So ⁓ really different ⁓ perspective, but I think it’s one that’s gaining notoriety and traction.
Patrick Spencer (10:13)
You guys are engaged, you know, by organizations that are trying to be proactive and strategic in addressing, you know, potential gaps in their cybersecurity methodology, or their infrastructure, if you will, or their processes. But you also are brought in after the fact as well to deal with, you know, we had a data breach, we need to do forensics analysis to help us figure out, you know, where it happened, what data was exposed, what type of data was exposed, you know, where it went,
and so forth. From a forensics analysis standpoint, are you seeing trends in the marketplace in terms of changes in the types of attacks? Are they getting more sophisticated? Are they using the same old tools because they’re still effective in terms of getting past the network perimeter and into valuable data repositories? Or are you seeing more AI being used and more sophisticated attacks?
Chris Pogue (11:09)
⁓ yes. And I’ll, and I’ll explain and I’ll explain what all those mean. Right. So I actually just finished a blog post and it’s being edited by our marketing team now, ⁓ called the more things stay the same. And it’s, we talk about the resurgence of attacks, like SQL injection is making a comeback. ⁓ and so was like, when we really dig into why some of these older attacks, ⁓ are, starting to resurface.
Patrick Spencer (11:11)
I figured that was the answer.
Chris Pogue (11:38)
We’re seeing ⁓ some, I don’t know if I call them alarming just yet, but some interesting characteristics. We see unpatched systems, systems that have been forgotten about in large infrastructures, or the thoughts that, well, this is an older vulnerability, no one’s attacking this anymore. So almost a laissez-faire approach to older patches and older operating systems, which is not the case. They’re still there, they’re still present.
I did some research recently that showed once a exploit is announced, it’s typically 24 to 48 hours before threat actors will start pouncing on it, and averages to patches between 60 and 90 days. But then you move into unpatched systems that are older that maybe people didn’t know about, or was a reserve system that was an older laptop that was given out to someone on a temporary basis as a loan or something like that. So we still see these older things.
popping up. So don’t think just because SQL injection is what 35 years old now that it’s gone. It is absolutely not gone. And then, you know, the other part, the second part to that question is what are the trends? I think globally, right, BECs and ransomware and phishing are still king, right? It’s every organization everywhere in the world, you know, whether we’re talking to victims, whether, you know, I’m an InfraGuard member in the New York City chapter.
Patrick Spencer (12:45)
Hmm.
Chris Pogue (13:06)
I work with the Secret Service Cyber Fraud Task Force in a number of cities. In Australia, we’re members with the Australian Federal Police, the AFP, and the NCA in the United Kingdom. Everyone sees the same stuff. And I think the concepts work, so why stop using them? Right? We’re hacking the human. We’re getting people to click on things. So the technicians are spending lots of money trying to figure out, well, how do I protect you from yourselves?
But we still have cases where CEOs click on well-crafted phishing emails, pop-ups: I don’t want a virus, I want to scan for a virus, I’m going to click on that, not knowing you don’t have that virus protection program, you have another one. And so we see a lot more of those. Vishing, or voice phishing, is becoming more prevalent. So we’re concerned about deepfakes, but not video. Video is really, really hard to do
Patrick Spencer (13:59)
Yep.
Chris Pogue (14:05)
from an AI perspective, because you’ve got to match facial expressions, hand movements, intonation, voice pitch. And one of the things I teach my students is only 7% of communication is the words we use, right? 93% is everything else. And AI is not good at that other 93%. But if there’s no visual component, right, they’re a little better at it. So that’s a bit of a concern. And so that’s where we start to see the maturation of attacks and the integration of AI,
Patrick Spencer (14:18)
Everything else, yeah.
Chris Pogue (14:34)
and, sort of looking down the road at how quantum is going to be used, we’ll see more polished attacks. Like, it used to be, “I am Nigerian prince, I have money to share.” Those are long, long gone. Like, the phishing emails we get are super clean. They look good. They’re timely. You know, they hit on holidays. They hit on tax season, right? You know, the NFL draft is coming up, so there’s going to be lots of NFL draft ones. Like, there’s a whole resurgence.
Patrick Spencer (14:47)
Sounds real now.
Chris Pogue (15:04)
And then once they get access, right, they take you to a mirror site. So we’re seeing legitimate sites get poisoned and become mirror sites, and so they become data harvesters. You log in, you know, MFA… you know, session hijacking is not a super difficult attack. So even if you have MFA, don’t think that’s the silver bullet. And so we see that sort of stuff really prevalent all over the world. And, you know, as technicians, we think, well, these are pretty easy to spot, but remember, the
overwhelming majority of technical users are just users, right? They’re not IT security professionals. So, yeah, I think we’ll continue to see that. Like, if it ain’t broke, don’t fix it. But I think we will see the evolution of AI-generated attacks, AI-generated botnets, you know, quantum… leveraging quantum computing for, you know, decryption and things like that. I think that’s all on the horizon.
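Chris’s warning that MFA isn’t a silver bullet comes down to timing: a hijacked session cookie is replayed after MFA has already succeeded. One common mitigation, sketched very loosely below, is binding each session token to client attributes captured at login and revoking the session when a replay arrives from a different context. The in-memory store and string fingerprint are illustrative assumptions, not a production design.

```python
import secrets

# token -> client fingerprint (e.g., a hash over IP range + user agent)
sessions: dict[str, str] = {}

def create_session(fingerprint: str) -> str:
    # Issued only after login AND MFA succeed; token is unguessable.
    token = secrets.token_urlsafe(32)
    sessions[token] = fingerprint
    return token

def validate(token: str, fingerprint: str) -> bool:
    # A stolen cookie replayed from another device/network won't match the
    # fingerprint captured at login, so we kill the session entirely and
    # force re-authentication (including MFA) rather than trusting it.
    if sessions.get(token) != fingerprint:
        sessions.pop(token, None)
        return False
    return True

tok = create_session("203.0.113.7|Firefox")
ok_user = validate(tok, "203.0.113.7|Firefox")   # legitimate user continues
ok_thief = validate(tok, "198.51.100.9|curl")    # hijacker is rejected
```

Real fingerprints are heuristic (mobile IPs rotate, user agents change), so production systems pair this with short token lifetimes and rotation rather than relying on any single signal.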
Patrick Spencer (16:02)
Well, staying on the topic of attacks, you know, with the new Verizon report that just came out, there was a section on ransomware: it’s up. I remember a conversation I had on a webinar or podcast with Charles Carmakal over at Mandiant a couple of years ago, or a year and a half ago or so, where we were seeing, like with the MOVEit attack, more organizations refusing to pay ransom, because the data is out there.
Chris Pogue (16:28)
Hmm
Patrick Spencer (16:31)
It’s already been exposed by, you know, due to regulations, they need to announce that it was hacked anyway. And they were simply opting out of paying ransom. The report from Verizon this year seemed to indicate that trend is continuing. Is that something that you’re seeing when you’re doing forensics analysis post cleanup when a data breach has happened?
Chris Pogue (16:51)
Yeah, I think it’s both, right? Because you’ll see, and exactly like the report says, and exactly like the folks you spoke to at Mandiant and said, organizations are getting better at doing backups, right? I mean, that’s kind of the magic bullet for ransomwares. Can you restore from backup, right? And then you’re OK. Then you only lose from the time of the attack to the backup. But that’s…
We got to remember what’s being reported is only what’s being reported. Like we don’t see what’s not being reported under reported or misreported. Right. So you have to kind of take those statistics with kind of a, you know, a little bit of skepticism because we’re not getting the whole picture. ⁓ cause I guarantee we have worked with, with clients that have paid and they’re like, look, we don’t have another option. They, they, the first thing they’re going to do when they get in is they’re going to find those backup servers. Right. And they’re going to try to take them offline and crit them or delete them.
So if you remove backups, then you better hope you have tape backups ⁓ or ⁓ some sort of backups offline. So that’s the advice that we give organizations is if you have backups, test them. Don’t think just because someone said we have immutable backups that you really do, like go out there and make sure. And then if you have tape backups, make sure you can restore from tape. It works, right? All that sort of stuff. And then the other part, we see double dipping a lot.
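The “test your backups” advice lends itself to automation: a restore drill that actually restores a sample and verifies content hashes, rather than trusting a backup job’s exit code. The sketch below is illustrative only; `shutil.copy` stands in for a real restore from tape or immutable storage, and the paths are invented.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def restore_drill(source: Path, backup: Path, scratch: Path) -> bool:
    """Restore the backup into scratch space and hash-compare it to the source.

    In a real drill the restore step would pull from tape or an immutable
    store; copying a file stands in for that here.
    """
    scratch.mkdir(exist_ok=True)
    restored = scratch / source.name
    shutil.copy(backup, restored)             # the "restore" step
    return sha256(restored) == sha256(source)

with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    src = tmp / "ledger.db"
    src.write_bytes(b"critical business data")
    bak = tmp / "ledger.db.bak"
    shutil.copy(src, bak)                     # the "backup" step

    good = restore_drill(src, bak, tmp / "scratch")   # intact backup verifies

    bak.write_bytes(b"critical business dat!")        # simulate silent corruption
    bad = restore_drill(src, bak, tmp / "scratch")    # drill catches it
```

The corruption case is the whole point: a backup job can report success every night while the media it wrote is unrestorable, and only an actual restore-and-verify cycle surfaces that.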
Patrick Spencer (18:05)
Yeah.
Chris Pogue (18:16)
So organizations will get extorted and then ransomed. So they say, hey, we’ve got your, you know, your highly regulated data. We’re going to release it on GitHub or, you know, some sort of dump site, and if you don’t pay us our two bits, right, all of your data is going to be out in the ether. And that’s where, you know, okay, we’ve already reported it, the data is already gone, like, you don’t have to whack us in the head with that. But then they go, okay, and you’re encrypted, which means you can’t do anything. Like, we’ve seen whole corporations…
Patrick Spencer (18:44)
in action.
Chris Pogue (18:46)
And now we have to look at other types of OT systems, where, you know, it’s not necessarily getting the data back, it’s your inability to do your jobs, right. So think of a port, you know, all the ships coming in and out, you know, where they go and what’s on those ships, the manifests. Like, that’s all computer-driven. You lock that up, you lock the port up. You lock the port up, there are no imports, right? You’ve got all these other problems. Same thing with emergency management systems, building management systems.
All this OT, which we say is typically air-gapped from IT networks, that’s not always the case. So I think that is going to be a trend that we see. Not as much, ha ha, we have your data, give us money or we’re going to release it. It’s, ha ha, we have locked up your building and no one can get in, no one’s using the elevators, no one’s using the escalators. Or, we’ve locked up your payment processor. Ha ha, you’re done. You’re not processing payments
until you do this. So I think less data brokerage and more arresting functionality.
Patrick Spencer (19:52)
Yeah, that makes a lot of sense. The report also talked a bit about third-party risk and then the human element. Let’s talk about third party. When you have organizations… we do an annual study like you guys, and we find that two-thirds of organizations have over 2,500 third parties that they exchange data with, right, in various ways: email, file sharing, file transfer, web forms, and so forth.
Chris Pogue (20:01)
Hmm.
Mm-hmm. Right.
Patrick Spencer (20:22)
That’s a huge exposure risk because you don’t always know what kind of security controls those organizations have in place. So one, how do you vet vendors when you’re consulting with a client? What recommendations do you have for them? And then there’s some governance controls, or we argue that’s the case based on our business, that you need to have in place to ensure that the data that you have is only seen by those who need to see it, who need to edit it. There’s controls in place to restrict
who they can send it to. And then we have this possessionless editing capability where the data never leaves your environment. Someone can edit it, but it stays in your environment and they can’t pass it around and it can’t be breached outside of your environment. So what are you seeing on those two fronts?
Chris Pogue (21:07)
Yeah, so third-party risk, that’s such a huge challenge. Because, I mean, like you said, you don’t know what everyone’s security posture is. And everyone… we have these sheets you have to fill out, and organizations are doing their due diligence saying, well, you have to show your compliance. You’ve got to show your SOC 2 Type 2 or whatever regime you’re beholden to. You’ve got to show us all that stuff. And I think that gives you a measure of confidence.
I mean, nothing’s unhackable and no one’s bulletproof. So just knowing that if you have a supply chain, whoever the pieces in the supply chain are, depending on your reliance on them to execute your business model, it’s a risk. Let’s say you’re a book publisher. You don’t make paper. The company that makes the paper and cuts it into the little sheets that you then bind into a book, what if your paper distributor gets…
gets compromised, right? And their operations get arrested, and they can’t provide paper, and now you can’t make books. And so your core competency is damaged. There are hundreds of those across the business. So I think, best-case scenario: are you doing your due diligence with these companies? Are you doing business with reputable companies? Are you asking them to make sure they’re giving you timely data? Like, your SOC 2 Type 2 compliance attestation can’t be three years old, right? You’ve got to show that you’re on top of this stuff.
And then just knowing it’s a risk, and, you know, walking through: what do you do if you have this? And tabletopping. Like, when we run tabletop exercises, we will include third parties, because it might not be you. Your stuff might be completely fine, and it’s someone else’s that brings you to your knees, right? So how do you tell your customers, the board, you know, your executive team that we have ground to a halt because of something that’s not our fault at all?
And then there’s another piece of it that’s really interesting, when we see BECs and how attackers move across and spread their wings, so to speak. You know, business email compromise happens. They assume my identity. They are sending and receiving emails as if they were me. So it’s very easy to send an email to a trusted client or to a trusted partner with an attachment or an update: hey, I’ve just changed our payment terms. You know, you’re used to talking to the controller.
And so for them to say they’ve got a new bank account doesn’t seem very strange. Yeah. And so we see a lot of that, where we see invoice fraud, and then we’ll see expense fraud, right? We’ll see: I went to this event, I’m going to use AI to generate a receipt and submit that as an expense. And so that lateral movement across the supply chain… like, we’ve worked a number of BECs within the last year that were downstream because of supply chain. And so…
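The invoice-fraud pattern Chris describes, a compromised partner “updating” their bank details, is exactly what a vendor-master check can catch: any mismatch between the account on an incoming invoice and the account verified at onboarding gets held for out-of-band confirmation. A toy sketch follows; the vendor record, IBANs, and function are invented for illustration.

```python
# Vendor master file: bank details verified when the vendor was onboarded.
vendor_master = {
    "Acme Paper Co": {"iban": "GB29NWBK60161331926819"},
}

def screen_invoice(vendor: str, invoice_iban: str) -> str:
    """Return 'pay', or 'hold' if details changed or the vendor is unknown.

    'hold' means: confirm the change by calling the vendor on a number
    already on file, never one printed on the invoice itself, since a
    BEC attacker controls everything inside the email thread.
    """
    known = vendor_master.get(vendor)
    if known is None or invoice_iban != known["iban"]:
        return "hold"
    return "pay"

decision_ok = screen_invoice("Acme Paper Co", "GB29NWBK60161331926819")
# A BEC-style "we've changed banks" invoice gets held, not paid.
decision_fraud = screen_invoice("Acme Paper Co", "DE89370400440532013000")
```

The control is procedural more than technical: the code only flags the change, and the out-of-band phone call is what actually defeats the compromised email thread.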
Patrick Spencer (23:34)
Actually correct. Yeah.
…in a rep…
Chris Pogue (23:58)
Yeah, it’s, it’s, it’s complex. so as business leaders, right. I encourage all of my customers and those that I speak within the community, like your security used to be really easy. was like this castle moat, you know, analogy that we would use. And it’s just like, I got to protect my thing. That’s that has that ship has sailed my friend, you know, now it’s, got to worry about my thing and I got to worry about everybody else’s thing and how it interacts with my thing. And we haven’t even started talking about API’s yet.
because that represents a whole other challenge is, well, what if my thing is plugged into your thing? Can I trust that while your UI might be secure, you may not have those same controls within your APIs. so that represents, so it’s this really complex web that if you’re not living it and breathing it every day, like you’re really going to struggle to keep up.
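Chris’s API point, that a locked-down UI can sit in front of unauthenticated endpoints, usually traces back to auth being enforced per-route instead of centrally. Below is a deliberately tiny sketch of a router where one dispatch check covers every endpoint, so an API path can’t silently skip the control the UI login provides. The router, token set, and routes are all invented for illustration.

```python
VALID_TOKENS = {"secret-token-123"}  # illustrative; real systems use OAuth, JWTs, etc.
routes = {}

def route(path, public=False):
    # Routes are private by default; being public is an explicit opt-out.
    def register(handler):
        routes[path] = (handler, public)
        return handler
    return register

def dispatch(path, token=None):
    # Auth is enforced once, here, for every route. The UI and the API
    # cannot diverge, because neither can bypass this check.
    handler, public = routes.get(path, (None, False))
    if handler is None:
        return 404, "not found"
    if not public and token not in VALID_TOKENS:
        return 401, "unauthorized"
    return 200, handler()

@route("/login", public=True)
def login():
    return "login page"

@route("/api/customers")  # protected by default, not by per-route opt-in
def customers():
    return ["alice", "bob"]
```

The design choice worth noting is the default: when every new endpoint is private unless someone deliberately marks it public, a forgotten API route fails closed instead of open.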
Patrick Spencer (24:30)
New camera arms.
Chris Pogue (24:53)
And I know it’s kind of self-serving, right? We’re a cybersecurity company, so it’s going to sound like that. But man, if I promise you, if you’re not living and breathing this on a daily basis and you’re dealing with the PRC, the PLA, the DPRK, Volt Typhoon, Salt Typhoon, Ransom Hub, all of these threat actor groups who are 20 times bigger than any resources we can throw at the problem, like you’re just…
What’s it called ballistic podiatry, right? You’re just shooting yourself in the foot unnecessarily.
Patrick Spencer (25:27)
There’s an analogy a military guy would only know, think.
There’s a lot of third party applications that organizations use for exchanging information with third parties, as you well know. a lot of those are legacy technologies that may not have the latest security capabilities built into them, know, double encryption, 256 encryption at least, right? DLP, SIEM, know, AI capabilities around anomaly detection and so forth.
You know, is that a real concern when you’re consulting the organizations? Is that a point you touch on with them? The unfortunate aspect is a lot of those are a bit difficult to rip and replace, right? To get rid of something requires a new acquisition ⁓ and a whole vetting process. And then you need to get everyone trained and up to speed on that new technology. How do you deal with that as an organization? Change is not always easy, but sometimes necessary.
Chris Pogue (26:28)
No,
no. And your point about rip and replace, right? I remember working with a client once that said, look, we want to move from technology A to technology B. Technology B is superior. We want it. But it will cost us a million dollars just to get rid of technology A, right? And that doesn’t count things like lost opportunity costs, like you mentioned. You know, what if I miss something? I’ve got to fine-tune it, you know, and then I have to have a period where I’m, you know, making sure that my changes and my fine-tuning
are instrumented properly to get the results I want, that I’ve got training and education in place. Like, that is a bumpy road. And some people are like, yes, this isn’t the best tool, yes, it has some challenges, but, right, there are alligators closer to the boat. Like, let’s deal with those and worry about this, you know, transformation, or digital transformation, or uplift later. And so of the businesses that we consult with, all of them
have similar problems. There is not any organization in the Fortune 500, or the Fortune 1000, or the 33 million SMBs in the US, give or take, that is coming in pristine. There is always a mountain of data, or legacy systems, or new CISOs that inherited this, or outgoing CISOs that didn’t pay attention to that because they were looking toward retirement. There’s a litany of reasons.
And all of them are, I mean, they’re understandable, right? Businesses, their primary focus is to make money, and you want to do that in a secure way. And you want security to be seen as an enabler versus a detractor from the business, which isn’t always the case. So really understanding… I think this is now becoming more prevalent with consultants, as opposed to just leaning on CISOs: having a really wide aperture of not just cybersecurity, but of business.
What is your core competency? How do you make money? What are your infrastructure challenges? What are your supply chain challenges? It’s this really big thing that isn’t one thing anymore. It’s an amalgamation of a lot of things, which is becoming increasingly complex. And so to any organization, we would say, look, don’t feel bad. Everybody has this. But you can’t go it alone. Hire CyberCX or hire another firm.
It doesn’t have to be us, but don’t go it alone. Get someone that does this on the regular to help lead you and guide you and help security be an enabler to the business, because you can’t do it alone. You just have to understand the nuances of the business, how it makes money, how it goes to market, how you work with your customers, and make that part of the security planning, as opposed to a bolt-on. Bolt-on security has never worked. It never will. It’s always going to be, well, I spent money on this, I didn’t get the outcomes I want,
because you approached it completely wrong.
Patrick Spencer (29:23)
You make a good point. We published a report on the preparedness level of the DIB with regard to CMMC 2.0, which we can talk a bit about because you have some experience in that area. We published it jointly with Coalfire, and we found that organizations that have engaged experienced third parties like yourselves were much more likely to have their encryption house in order,
to have advanced technologies in place, and to have all their security policies and practices fully documented, and so forth. I assume that’s what you find as well: organizations that are really immature are doing most things in-house. There are certainly exceptions, there always are, but many of them are coming to you with things in a bit of disarray because they just haven’t had that help. And sometimes it’s good to have an external party come in who doesn’t
have the blinders on that exist within the organization. Internal teams are going to have certain biases, and they’re just going to be blind to certain things that are going on in the organization. They have a lot of work to do, and you help walk them through that process. Are you seeing that in the DIB, particularly when we talk about CMMC 2.0, which, you know, organizations need to get prepared for? Katie Arrington is a big fan of it, always has been, and it’s going to have teeth very, very soon.
Chris Pogue (30:31)
Yeah.
Yeah, and it’s already dropped. So if you want to do business with the federal government, that’s it. Thou shalt be CMMC compliant. Doesn’t matter if you make battleships or you’re filling vending machines, right? Thou shalt be compliant. So, yeah, I think the challenge there is a business lesson. It has nothing to do with cybersecurity. It has everything to do with core competency. And this is a Good to Great thing, right? You know, core competency, the hedgehog principle, right?
Patrick Spencer (30:51)
Or he does.
Chris Pogue (31:20)
whatever language you want to use from that Jim Collins book. But organizations have something they’re focused on. It’s what they do best. It’s how they make money. Whether they’re making candy or they’re an amusement park or whatever, they have a thing that they’re doing. Now you throw cybersecurity on top of that, or really anything. If that is not your core competency
and you do it internally, you are relegating it to non-experts, because your organization is focused and pointed in one direction, to be an expert in one thing. It’s the same concept. Why doesn’t an amusement park bake its own bread? Why doesn’t it manufacture its own trash cans, or the steel for the rides? It’s just not what you do. And cybersecurity is no different. I think it’s finally caught up to
the position that all these other aspects of the business are in. Organizations increasingly are looking at this going, look, we have a few people that know about this, and they might be very good, but we just have a few people. We need a thousand people and their brains helping us on this because of the increased complexity. So it’s not that you can’t have the internal function, but I would limit the internal function for that exact reason:
they probably have a dozen other things that they’re doing at the same time. By relinquishing that to an expert, you get someone like us: I have nothing else. I don’t think about anything else. I don’t do anything else. All of the experts within CyberCX are singularly focused on cybersecurity and our clients’ wellbeing. That is our core competency. And, man, I would say it’s infinitely less expensive to pay my bill than it is to pay for a breach.
Patrick Spencer (33:03)
Interesting.
Yeah, absolutely.
Chris Pogue (33:13)
Right? The IBM
Ponemon report says globally it’s about four and a half million dollars, and in the United States about nine million, you know, nine and change. My bill is not four and a half million dollars. I wish it was. It wouldn’t shock me. Yeah. So, I will tell you, listeners, after doing this for 30 years, and if anyone has a contradictory example, I would love to hear it.
Patrick Spencer (33:23)
I’ll bet you it’s over 10 million when it comes out in July.
Chris Pogue (33:39)
But in my career and in my experience, I have yet to find an organization that underinvested in cybersecurity and was thankful later that they did. I’ve just not run across it, ever. It’s always, you brief the board, especially after an incident, which is where I spend most of my time, and they constantly look backwards and go, how did we get here? And then the next question is, how do we make sure we’re never here again?
And on those two things, I don’t want to dime anybody out, but I want to say, look, we could have answered that stuff six months ago for you. We can run a pen test, we can run compromise assessments, we can do a business impact analysis to go, look, if this were to happen tomorrow, this is what it would look like. We’ve wargamed this in the military for hundreds of years, going through combat scenarios and threat simulations and things like that. That goes back to the Romans. Actually, it goes back before that, to the Greeks, right?
Patrick Spencer (34:36)
Greeks.
Chris Pogue (34:38)
Yeah, there was a hoplite general named Archilochus,
if I’m saying it right. But he basically said, you don’t rise to the level of the occasion; you fall to the level of your training. Right? So if you don’t go through these processes, you’ll find yourself in a position where you’re not prepared mentally, you’re not prepared emotionally, you’re not prepared technically to deal with it. And then that’s why the bill becomes $4.5 million or $9 million versus
Patrick Spencer (34:49)
Yeah.
Chris Pogue (35:06)
If you look at it proactively, you’re like, yeah, I’ve got to make an investment of half a million upfront, but I will never pay that four and a half million. So again, I’m getting the better end of the deal.
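The back-of-the-envelope comparison Chris makes here can be sketched as a simple expected-loss calculation. The breach-cost figures echo the IBM/Ponemon numbers cited in the episode; the probability figures are purely illustrative assumptions, not numbers from the conversation:

```python
# Expected-loss sketch: proactive security spend vs. doing nothing.
# Breach costs approximate the IBM/Ponemon figures cited in the episode;
# the annual breach probabilities are illustrative assumptions only.
proactive_spend = 500_000        # the upfront investment Chris mentions
avg_breach_cost_us = 9_000_000   # approximate US average breach cost

p_breach_without = 0.25          # hypothetical annual likelihood, no investment
p_breach_with = 0.05             # hypothetical likelihood after investment

expected_loss_without = p_breach_without * avg_breach_cost_us
expected_loss_with = proactive_spend + p_breach_with * avg_breach_cost_us

print(f"Expected annual loss, no investment:   ${expected_loss_without:,.0f}")
print(f"Expected annual loss, with investment: ${expected_loss_with:,.0f}")
```

Under these assumed probabilities the proactive spend comes out well ahead, which is the "better end of the deal" argument in miniature; the real inputs would come from a business impact analysis like the one Chris describes.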
Patrick Spencer (35:21)
You know, speaking of regulations, we talked about CMMC. In Australia, where your parent company is based, they have IRAP, and we did IRAP certification three or four years ago. It was one of the first things I worked on, a press release and some other things, when I first joined Kiteworks. From your vantage point, are these regulations, like FedRAMP, you know, we’re FedRAMP High ready now, and we’ve been FedRAMP Moderate
Chris Pogue (35:47)
Hmm.
Patrick Spencer (35:50)
since 2017, I think it is. Are these regulations helping to lower risk within organizations? Are they driving better security standards? Are organizations better because of them? What’s your perception?
Chris Pogue (36:06)
Yeah, I think to an extent, right? The comparison that I frequently use is the airline industry versus the medical industry. Before you get on any flight, commercial or private, the pilot’s got a checklist, and the pilot goes through that checklist. Now, that doesn’t mean planes don’t go down, and it doesn’t mean there’s not pilot error or mechanical error or things like that.
Patrick Spencer (36:26)
You don’t take off unless everything checks out.
Chris Pogue (36:36)
But you are minimizing the amount of risk based on prior knowledge. So that’s why we have the NTSB. A plane goes down, they descend on it, they figure out what happened, they extract the lessons learned, they apply them to the airline industry, and the airline industry gets smarter and better. So even though we see planes that have challenges from time to time, it is orders of magnitude safer to fly than it’s ever been. It’s something like a 0.05 percent failure rate,
which is, you know, something like one flight in every 10 million. It’s super, super safe because we’ve learned things. So if we take that mindset and say, look, we’ve seen stuff go wrong for 30 years, and if we find all of those things and build a checklist, just like pilots do, and we tick all those boxes, we’re reducing the likelihood of something that we know could likely happen from happening.
Patrick Spencer (37:32)
Hmm.
Chris Pogue (37:33)
It doesn’t mean nothing bad is ever going to happen, but it means we’re in a better place. And the reason I bring in the healthcare industry is that it’s a great example of not doing that. I don’t know the statistics off the top of my head, but there are a lot of people that go into hospitals and either don’t come out or come out in worse shape than they went in. And then hospitals have lawyers, and they don’t want investigations. So they use this: well, it was a mystery of medical science, or
medicine is complex, or people respond differently. There’s always some reason. And I have friends that are doctors, right? I think they’re all brilliant people, but I think the culture is one that’s less like the NTSB and more litigious. So we don’t have the same level of safety. It’s safer than it’s been, but it hasn’t kept pace with the airline industry. And I think that’s why. I think it’s the exposure: what did we do wrong? What happened?
What lessons can we learn from this? What do we need to change? How do we implement that? And that’s what these regimes are meant to do, whether it’s CMMC or IRAP or the Essential Eight in Australia, or CCPA or GDPR, Sarbanes-Oxley, Gramm-Leach-Bliley, PCI, whatever. They’re just tick boxes to make sure you’re not making the mistakes of the past. Now, there are new mistakes, and hopefully we evolve and get better over time. But the caution that we always give our clients, and whenever I speak at conferences and things like that,
Patrick Spencer (38:35)
Yep.
Chris Pogue (39:01)
and this has been for years, this isn’t me, this is security professionals in general: don’t mistake compliance for security. They are not the same thing. They will never be the same thing. You’re compliant with a list, just like a checklist on an airliner. It doesn’t mean you are 100 percent safe. It just means it’s less likely that something’s going to go wrong.
Patrick Spencer (39:14)
Yep.
Yeah, that’s a great analogy. I’m going to steal that. Every cybersecurity podcast would be remiss if it didn’t touch on GenAI. Specifically, I’m thinking about the risks that AI poses to organizations, not how organizations can use AI. That’s a whole separate conversation. And there are definitely some advantages to AI, if we use it in the right way, that can result
Chris Pogue (39:25)
Go right ahead.
Patrick Spencer (39:51)
in improving and bolstering cybersecurity and compliance. But you have employees that are using corporate devices, they’re using their own devices. There’s a plethora and it keeps growing in leaps and bounds of different AI tools that are available for them to use. And the reality is there’s human error that happens. That was one aspect in the Verizon report that came out, which is not new. That’s been the case forever, I think, since digital technology existed.
employees are loading stuff into these AI tools that they shouldn’t be loading into. If it’s a private AI tool that is hermetically sealed, that the corporation has developed, and we have that here at Kiteworks, then it’s okay to load it into those tools because it’s not going to be exposed publicly. But with these public gen AI tools, ⁓ you can load PII data, PHI data, corporate secrets.
M&A activity that’s taking place, financial documents, strategic plans. You can go down a whole long list of things that you don’t want to have exposed in those tools. There’s a great Stanford study that I published a blog post on. I think it’s coming out today, actually. It’s a 648-page report, I think. It’s hefty. But in it, they found that, I think it’s 8.5 percent of the content that’s being loaded into the AI tools
Chris Pogue (41:07)
Mm.
Patrick Spencer (41:14)
is private or confidential and shouldn’t be loaded there by employees. So are you seeing that as a big risk with your clients? Are they thinking that far in advance? And what are they doing? Initially, when AI came out, it was, well, we’re just not going to use it. We knew that ostrich-head-in-the-sand approach wasn’t going to work; we predicted that from day one. What are you telling your clients, and what are you seeing on that front?
Chris Pogue (41:40)
Yeah, it’s funny. I have students that do the same thing, right? We want them to use AI like a tool, not as a crutch. When I went to college, I’m 52, we still used the Dewey Decimal system, and then you could use this really cool new thing called the internet. And the grumpy old curmudgeons were saying, well, back in my day, we had to look things up in the Dewey Decimal system. And so I think
AI becomes that next evolution of the previous generation saying, back in my day. So I think there’s a little bit of generational technology angst. How do you use it effectively, like a tool, as opposed to just throwing everything in there? So most organizations are either blocking it, saying you can’t use it, or they have their own, like within M365, they’ve got their own Copilot,
which is contained to their instance, and so they can use that within their organization. That becomes okay. But I don’t know a way to prevent it entirely, because you have shadow IT, and so people can take stuff home. Yeah, yeah. And they can put stuff in there. So I don’t know. I don’t have a good answer. When you put data into the model, the model gets smarter.
Patrick Spencer (42:53)
You have shadow AI.
Chris Pogue (43:07)
And, you know, based on how you ask it questions and respond to those prompts, you get better responses and better outcomes. So that data is actually benefiting everyone else, except, you know, if you’re exposing PII or ePHI. But how would you even get to that? I haven’t asked AI yet, or ChatGPT, hey, show me Social Security numbers for 500 people, and I don’t know that it would do that, right? So I’m sort of leaning on
the hope that the developers will put some sort of mechanism in there to keep AI from responding to those prompts. But that’s crossing my fingers at this point. I think, by and large, organizations accept that people are going to use it. They have corporate policies that say not to, and then they use some sort of internal mechanism. Like I said, if they’re an M365 shop, they use Copilot. But
I don’t think that’s the last word on that. I don’t even think we’re close to the last word on that. Because there are security threats. There are attacks that can leverage AI. There are malware modifications that you can make through AI to evade detection. Could I get around a WAF and run a DDoS attack with AI-generated packets, right?
I think there’s a lot that we haven’t figured out yet. So I would say watch this space. I think there’s going to be a lot of activity here. And I think industry is looking to the government, the government is looking to academia, and everyone is looking to working groups, like InfraGard, like the Cyber Fraud Task Force, to say, hey, help us figure out what we do here. Because I don’t think anyone has a really good answer just yet, other than everybody be careful. We don’t quite know what’s going to happen with this.
Because once the data is out there, it’s out there. There’s no putting the genie back in the bottle.
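One concrete shape the "internal mechanism" Chris mentions can take is a pre-submission filter that flags obvious PII patterns before a prompt ever leaves the organization. This is a minimal illustrative sketch, not a description of any product discussed in the episode; the regex patterns are deliberately simplified examples, far short of a real DLP rule set:

```python
import re

# Illustrative pre-submission filter: flag prompts that appear to contain
# PII before they are sent to a public GenAI tool. The patterns below are
# simplified examples only; real DLP engines use far richer detection.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_prompt(text: str) -> list[str]:
    """Return the names of the PII patterns found in the prompt text."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

prompt = "Summarize this: John's SSN is 123-45-6789, email john@example.com"
findings = scan_prompt(prompt)
if findings:
    print(f"Blocked: prompt contains {', '.join(findings)}")
```

A gateway like this only catches pattern-matchable data; it does nothing for strategic plans or other free-text secrets, which is why policy and contained tools like a tenant-scoped Copilot still carry most of the weight.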
Patrick Spencer (45:04)
Cybercriminals are figuring out how to write reverse prompts to retrieve data that organizations have inadvertently loaded into the AI. That’s quite possible. It was proven within the first year after these AI tools were released. They had already figured that aspect out, unfortunately.
Chris Pogue (45:08)
Hmm.
Yeah, that wouldn’t surprise me.
Yeah.
Yeah, with the whole AI and quantum computing thing, I think right now we’re monkeys throwing darts at dartboards. We can guess, but human beings are typically really bad at predicting the future. I just think we need to be cautious and understand that we’re playing with things that are probably not going to give us the return that we want, and may make things a bit worse for us in the end.
Yeah, I’d be really cautious with any predictions. I think it’s chimpanzees and dartboards.
Patrick Spencer (45:56)
We would argue that you need to have the right governance controls in place, and Kiteworks can help with that: if you have private data, it’s stored in Kiteworks, and it’s transferred through Kiteworks, whether it’s file share or MFT or email. If it’s in Kiteworks, it’s controlled. You’re only going to be able to share it with certain people, and you know who you shared it with and whether they opened it. And then with possessionless editing, the data doesn’t leave your environment. So I think there are some technology capabilities out there, but
Chris Pogue (46:00)
Yeah.
Hmm.
Patrick Spencer (46:25)
getting the right policies in place as an organization is also critically important. And then, you know, if your organization’s using AI, realize there are some applications for AI that need to be done in that hermetically sealed box, so you need to develop your own AI tool that you use within your organization. So, yeah, a lot left to do, obviously, on that front. Well, we’re out of time, unfortunately. This has been a really interesting conversation, Chris.
Chris Pogue (46:42)
Yeah, yeah, for sure.
Patrick Spencer (46:51)
Number one, how can organizations get in touch with CyberCX? What’s the best way to do so?
Chris Pogue (46:56)
Yeah, you can go to our website, cybercx.com. If you’re in the Southern Hemisphere, cybercx.com.au, or cybercx.co.nz, N-Z for New Zealand. You can also look us up on LinkedIn. You can call me, you can email me, chris.pogue at cybercx.com. We have a huge LinkedIn presence; you can follow us there and reach out. We have 12 disciplines of cybersecurity, and this is all we do.
We’d love to provide you some additional information about the capabilities that we have and how we can help your organization.
Patrick Spencer (47:38)
That’s great. We’ll include a link, obviously, with the podcast on the web page, and it will go out on the different podcast platforms at the same time. So make sure to check out CyberCX as well as Chris’s LinkedIn profile. Well, Chris, we appreciate your time today. I look forward to talking to you in the future.
Chris Pogue (48:00)
Awesome. Thank you so much, Patrick. Really appreciate it.
Patrick Spencer (48:02)
Well, and for our audience, thanks for listening to another Kitecast episode. You can check out other Kitecast episodes at kiteworks.com slash Kitecast.