Matthew Rosenquist is an industry-recognized pragmatic, passionate, and innovative CISO and strategic security expert with 30 years of experience. On this episode, Matthew talks about cybersecurity and the challenges that come with it. In addition, Matthew highlights the fast-moving cryptocurrency market along with the pros and cons of this new digital currency. Matthew shares personal stories related to cybersecurity attacks, as well as examples of countries that have dealt with similar issues. Listen all the way through for more information on how to “Keep Up With Matt”.
00:00:01:29 - 00:00:26:09
Narrator
At a crossroads of uncertainty and opportunity, how do you navigate forward? This podcast focuses on making smart choices in a rapidly changing world. We investigate the challenges of being at a crossroads and finding opportunities that arise out of disruption. Listen in on future-forward conversations with the brightest luminaries, movers and shakers. Let's navigate forward together and create what's next.
00:00:26:12 - 00:00:59:02
Lisa Thee
Hello everyone, and welcome to the Navigating Forward podcast. My name is Lisa, and I will be your host today. We enjoy bringing you the brightest luminaries, movers, shakers and thinkers in the area of technology. And today we have the luxury of interviewing Matt Rosenquist, who is one of my favorite CISOs out there in cybersecurity. He is currently the CISO for Eclipse and also on 14 advisory boards, including the Smart Cities Economic Development Forum for the G20 countries, as well as Brandon University and the University of Phoenix.
00:00:59:03 - 00:01:03:12
Lisa Thee
So thank you for joining us on the podcast today, Matt.
00:01:03:14 - 00:01:06:04
Matthew Rosenquist
Yeah, it's my pleasure to be here.
00:01:06:07 - 00:01:15:20
Lisa Thee
So can you tell us a little bit about your background and where you're from? And is there anything in your childhood that you think eventually led you to being a thought leader in this space?
00:01:15:23 - 00:01:36:29
Matthew Rosenquist
Yeah, well, I grew up all over the United States, so, you know, I had the pleasure of seeing lots of different cultures and so forth. And, you know, my father actually was a homicide detective. And so that kind of got me involved. And I loved, you know, the seventies and eighties TV shows, whether it was Columbo or whatnot.
00:01:37:01 - 00:02:00:05
Matthew Rosenquist
But, you know, in those kinds of shows, you're dealing with an intelligent adversary. It's not just Mt. Everest to climb. You're dealing with somebody that may have more resources than you, may have a head start, may be smarter than you. And there is that gamesmanship that you have to figure out: okay, how do I win? How do I get an advantage, and how do I do that over time?
00:02:00:07 - 00:02:29:02
Matthew Rosenquist
So, you know, when I embraced technology at the same time, there was just a natural fit for cybersecurity. And by the time I joined Intel, I had already spent several years with another company doing internal investigations for theft, fraud, embezzlement, you know, charge fraud, things like that. So again, blending that in with the growth of technology back in the day, and as the Internet blossomed, it was just a natural fit.
00:02:29:02 - 00:02:33:23
Matthew Rosenquist
And I've loved it ever since. I've loved every single day I've been in the industry.
00:02:33:25 - 00:02:48:19
Lisa Thee
So it's a natural element of mission, having a reason to prevent these bad actors from being successful, as well as the innovation and the challenge of the gamesmanship, that keeps you showing up every day.
00:02:48:22 - 00:03:06:00
Matthew Rosenquist
Yeah, and there is a personal aspect, right? There is a selfish aspect that I'll share. I love technology. I really do. I love all the gadgets and gizmos, and I want the intelligent cars that I can just pile into and have drive me where I want to go. I want all those toys. I really like tech.
00:03:06:03 - 00:03:21:13
Matthew Rosenquist
I also realize that, you know, as we become more dependent on it, there are risks. Right. Somebody can take that away from you. And now we've got tech not only for transportation, but food delivery, logistics, power.
00:03:21:16 - 00:03:24:09
Lisa Thee
Clean water, health care.
00:03:24:11 - 00:03:54:17
Matthew Rosenquist
And so if the trust goes down, the innovation then dies, it begins to wither. So it's because of that long-term relationship. But I'm a strategist, so yesterday is really just a history lesson. Today is kind of interesting, but it's really about tomorrow and what's down the road. If we don't promote and get to a good place where we can manage the security, the privacy and the safety of our technology, we won't get all those cool toys, right?
00:03:54:17 - 00:04:02:04
Matthew Rosenquist
The innovation will begin to wither. And so I want to make the world's technology, you know, more trustworthy.
00:04:02:07 - 00:04:20:01
Lisa Thee
I love some of the words that you're using and some of the ways that you're using them in combination. And the reason I say that is, often when you hear people talk about privacy, you don't recognize that the other side of the coin can be safety for groups that are maybe marginalized, that are also accessing the same technology.
00:04:20:01 - 00:04:36:20
Lisa Thee
And I know you've spent some time thinking about the ethics of cybersecurity and artificial intelligence. Do you want to share just kind of your broader point of view about how we give people as much privacy as possible without eroding the public safety standards that we all enjoy in the physical world?
00:04:36:23 - 00:05:25:23
Matthew Rosenquist
Yeah, and there's natural tension in that, right? Privacy in most countries is considered a right. But on the other hand, when you make private or conceal bad actions that harm society, you become a detriment to the citizenry. And we don't want that either. So there is a natural tension. How do you respect the rights and the choices of citizens who are acting appropriately and being constructive and helping society, but at the same time not provide the shields and the hideouts for the really bad people that are harming, and intentionally harming, our children, our souls, the community and our future?
00:05:25:26 - 00:05:54:01
Matthew Rosenquist
So it's not simple, right? It's always a tension and it's always a debate. And one thing that both sides can agree to: there's no easy solution. It's something we're going to have to feel through and understand as the technology beneath us is constantly changing, as our social value structures become more aware and more important. But all of that plays in, because you have to have support; it can't just be one company, one group, one government.
00:05:54:08 - 00:05:58:19
Matthew Rosenquist
You have to really... it's a team effort, and it's not...
00:05:58:22 - 00:06:00:11
Lisa Thee
The bad guys could not agree.
00:06:00:13 - 00:06:02:08
Matthew Rosenquist
To exploit it.
00:06:02:10 - 00:06:19:12
Lisa Thee
And what's funny to me is we are on opposite sides of that tension, right? I tend to bias more towards the rights of marginalized groups and children. You tend to bias more towards the rights of privacy for the individual. But we can always come to a different place where we go, oh yeah, I never really thought of that.
00:06:19:12 - 00:06:26:18
Lisa Thee
It's really important to look at both sides and not have unintended consequences that are more problematic than the original problem itself, right?
00:06:26:20 - 00:06:45:11
Matthew Rosenquist
Yeah, I would say it's more than just important. It is crucial to success. You have to see it from both angles, because if you have your blinders on and you're just, oh, I just want privacy for people, everything you do should be private, you're missing it. You're not realizing there are going to be a lot of those unintended consequences and there are going to be a lot of victims out there.
00:06:45:11 - 00:07:06:18
Matthew Rosenquist
Victims, and it's horrendous, right? And the same is true on the other side: oh no, we just need to open everything up and identify the bad people and go after them. Now other victimization can happen, especially with governments suppressing people. And we've seen bad things, like in World War Two, right? If you could identify a certain sector of people, you're talking genocide.
00:07:06:18 - 00:07:25:27
Matthew Rosenquist
And we see that not only in World War Two, but even afterwards in other parts of the world. So you've got bad things on either end of the spectrum. We have to be able to respect each other and incorporate the views from both sides, because ultimately we really want the same thing. We want the betterment of society.
00:07:26:00 - 00:07:28:14
Lisa Thee
And it's about finding the balance, right?
00:07:28:14 - 00:07:32:25
Matthew Rosenquist
Yes, Yes. It's finding that optimal balance, which could also be a moving target.
00:07:32:25 - 00:07:47:06
Lisa Thee
It takes looking at it regularly, because that balance is going to shift as our values shift, as our technology innovates, and it's always going to be a lag cycle with our policies trying to keep up with the innovations that are going out in the wild.
00:07:47:07 - 00:08:07:21
Matthew Rosenquist
And with the innovation of the bad guys out there, as they leverage new technologies and come up with new attacks and new ways to hide, conceal and do harm, we have to then learn what we don't know, right? Learn about that and also adapt to those as well. So there's also an intelligent adversary here that we have to deal with.
00:08:07:23 - 00:08:19:27
Lisa Thee
Yeah, from my collaborations with law enforcement on improving trust and safety online, I can definitely say that oftentimes the bad actors tend to be the early adopters of the newest technologies, and they're incredibly savvy.
00:08:19:29 - 00:08:20:16
Matthew Rosenquist
Yes.
00:08:20:19 - 00:08:32:01
Lisa Thee
And sometimes it takes a while for the people that are trying to bring light to dark places to have the same access to information and tools that are available otherwise, right?
00:08:32:03 - 00:08:52:07
Matthew Rosenquist
Oh, yeah. You know, there are some great examples of that. If you look at cryptocurrency or something like that, you know, you saw very early on some bad actors jump on that. Now, you know, I believe it's a great technology that will help everybody around the world and equalize and create equity in banking and a whole bunch of wonderful things.
00:08:52:09 - 00:09:08:12
Matthew Rosenquist
But if we only look at the first two seconds of that story, you would think, okay, this is just for criminals. No, it's just that the bad guys jumped on it first. And they do. They do for everything. They are very good at adopting things for their advantage.
00:09:08:14 - 00:09:35:08
Lisa Thee
So, yeah, the first time I ever heard about Bitcoin was from the Sacramento Sheriff's Department, and it was being used to facilitate human trafficking. And so it just locked in my mental model that it's for illegal activities. And then in talking to some, you know, crypto experts that regulate the industry, they helped me see different uses for it, like in countries that are getting destabilized by the government, where it is a trustworthy, reliable alternative, such as what happened in Venezuela.
00:09:35:08 - 00:09:43:23
Lisa Thee
You needed something that was independent of the government to keep society functioning when things were falling apart. So that helped me have a more balanced point of view.
00:09:43:25 - 00:10:05:11
Matthew Rosenquist
And to get back to the criminal side: when it first came out, I had great discussions with the FBI and with the DEA, because I work with law enforcement agencies as well, and they were petrified. No, this is bad. And, you know, I calmed them down and said, okay, is it bad because you don't have an understanding of it and the tools and capabilities like you do with other things? They're like, yes.
00:10:05:19 - 00:10:25:07
Matthew Rosenquist
I said, okay, you're absolutely right. We need to evolve to get those tools and capabilities. But let's also talk about the good parts. And they're like, there are none. I said, wait a second, right? When you go in, you arrest somebody and they've got, you know, some money or drugs or whatever on them. Really, you're just arresting them for that.
00:10:25:07 - 00:10:47:08
Matthew Rosenquist
I said, how do you know about all the transactions before that? They said, well, we don't, you know, unless you find some magic, you know, written ledgers somewhere. I said, guess what? With Bitcoin, you will be able to see every transaction before, and you just saw their eyes light up like, oh, we can attribute, we can see all of it. Yes, you can, right there.
00:10:47:08 - 00:10:52:09
Matthew Rosenquist
Like, oh, well, maybe this isn't so bad now.
00:10:52:11 - 00:11:07:24
Lisa Thee
And it seemed so marginal when I was learning about it five, six, seven years back. Now my CPA is sending me questions about whether I did any digital currency transactions, right? Yeah, it's amazing how things kind of come out of the innovator's cycle.
00:11:07:26 - 00:11:30:19
Matthew Rosenquist
Being pseudo-anonymous is not the same as anonymous. So, again, when we get caught up, we jump to conclusions, and there are always two sides. It's a tool. It can be used by the good guys, it can be used by the bad guys. And right now the good guys are using it in great ways and coming up with tens of millions of dollars of busts, whereas that just wouldn't happen with normal techniques, with cash and the other things criminals use.
00:11:30:21 - 00:11:44:01
Matthew Rosenquist
So it's a little secret, but it's actually a good thing. Now, if you catch somebody and they're using cryptocurrency, law enforcement really likes it, because they've got a lot of good tools to help them. And it's not just a skill...
00:11:44:03 - 00:11:46:17
Lisa Thee
For the industry to make it happen, right?
00:11:46:20 - 00:12:03:20
Matthew Rosenquist
Yes, I do. I do. So they just did something very interesting and caught somebody, and were able to look and go, oh, wow, they have all these transactions, right? And they were able to piece together things that they just simply would never have been able to do.
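To make the traceability point concrete, here is a minimal sketch of why a public ledger helps investigators: every transaction points at the funds it spends, so you can walk the chain backwards from a wallet of interest. The data structure below is a deliberately simplified, hypothetical stand-in (real Bitcoin transactions reference previous outputs rather than a single predecessor), not a real blockchain API.

```python
# Toy, hypothetical ledger: each transaction records who paid whom and
# which earlier transaction the funds came from. Real Bitcoin links
# transactions through inputs/outputs, but the tracing idea is the same.
LEDGER = {
    "tx3": {"sender": "addr_C", "receiver": "addr_SEIZED", "amount": 2.0, "prev": "tx2"},
    "tx2": {"sender": "addr_B", "receiver": "addr_C", "amount": 2.1, "prev": "tx1"},
    "tx1": {"sender": "addr_A", "receiver": "addr_B", "amount": 2.5, "prev": None},
}

def trace_back(tx_id):
    """Follow the 'prev' references to reconstruct the chain of custody."""
    history = []
    while tx_id is not None:
        tx = LEDGER[tx_id]
        history.append(tx)
        tx_id = tx["prev"]
    return history

for tx in trace_back("tx3"):
    print(f'{tx["sender"]} -> {tx["receiver"]}: {tx["amount"]} BTC')
```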
00:12:03:22 - 00:12:29:26
Lisa Thee
And that's where I've devoted my career in terms of ethics: how do we get better tools into the hands of the people that are defending our legacy, our children, to be able to look at patterns of behavior? Usually these things don't happen in isolation. And so being able to get large amounts of data, to be able to find the signal through the noise and to find the people that you need to focus on, because you're never going to catch every criminal.
00:12:29:27 - 00:12:47:27
Lisa Thee
You want to catch the criminals with the priority of who's impacting the most people and doing the most damage. So it really is critical to be looking at it from more of an analytics point of view and to be able to look at the bigger picture, which I agree technology tools enable in a way that just isn't feasible in the real world.
00:12:48:04 - 00:13:11:06
Matthew Rosenquist
And AI is an incredible tool. When you talk about data analysis, and even to the extent of potentially data prediction, it's one of those tools that we'll look back on and go, wow, this fundamentally changed everything into a new data world. So it is a powerful tool, but the bad guys are also using AI as well. Again, they tend to dive into technologies first, just as we talked about.
00:13:11:06 - 00:13:25:18
Matthew Rosenquist
So we've already seen bad guys use AI to be able to scale out and to improve their operations. So a tool is a tool. It can be used for good, it can be used for malice. We have to deal with both sides of that.
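As an aside on what "finding the signal through the noise" can look like at its simplest, here is a minimal, hypothetical sketch: flag the days whose event counts sit far outside the baseline using a simple z-score. Real detection pipelines use far richer features, robust statistics and models; the data here is invented purely for illustration.

```python
# Minimal outlier flagging over invented data: one day with a login
# spike stands out against an otherwise steady baseline.
from statistics import mean, stdev

daily_logins = [102, 98, 110, 95, 101, 97, 640, 99]  # hypothetical counts

def flag_outliers(values, threshold=2.0):
    """Return (index, value) pairs more than `threshold` std devs from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [(i, v) for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

print(flag_outliers(daily_logins))  # -> [(6, 640)]
```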
00:13:25:21 - 00:13:42:19
Lisa Thee
It just helps you scale faster in most cases, right? So with that in mind, Matt, I think we've all seen a lot more in the news cycles around cybersecurity breaches. I even find myself getting to the point where I'm like, yeah, yeah, another one. We're almost getting too used to it.
00:13:42:19 - 00:13:44:10
Matthew Rosenquist
Don't do that. Don't do that.
00:13:44:13 - 00:14:05:27
Lisa Thee
Oh, there are like 735,000 things to worry about on any given day, and we can only rank and rate so many of them. But one that really, really stands out to me is the SolarWinds attack that happened towards the end of 2020. And I know that you had some experience managing through that, being in CISO leadership during that time.
00:14:05:27 - 00:14:10:01
Lisa Thee
Do you mind kind of pulling back the veil of the next click down of the media story?
00:14:10:01 - 00:14:35:25
Matthew Rosenquist
And for those who don't know the SolarWinds attack: it actually originated with FireEye, which is a well-respected security company out there. They realized that their environment was hacked, which is not a good thing for a security company, a top-tier security company. But as they were digging in, they actually found the root cause. And it was an I.T. company that provides services to most of the Fortune 500 and to governments, and it's widely used.
00:14:35:27 - 00:15:01:12
Matthew Rosenquist
And their product, their I.T. product itself, had been compromised, and somebody very, very smart had inserted extra malware, we call it a Trojan, right, embedded into that product. And so when it updated, right, as products normally do, it brought this in, and it allowed the attackers basically a backdoor into all the organizations that had installed it.
00:15:01:15 - 00:15:10:09
Matthew Rosenquist
And in the industry, we call this a supply chain attack, and it is the ultimate when it comes to scalability, because instead of having to attack...
00:15:10:09 - 00:15:11:00
Lisa Thee
18...
00:15:11:00 - 00:15:34:14
Matthew Rosenquist
...thousand individual companies and figure out how to do that, you have one company that services those 18,000, and now you get into all of them. And that's exactly what these attackers did. And there is very high confidence in the analysis that it was probably a nation state. You're talking tens of thousands of hours working on this, many, many millions of dollars to develop it.
00:15:34:16 - 00:15:57:17
Matthew Rosenquist
And it was hugely successful. And for the industry itself, from a cybersecurity perspective, it wasn't a surprise. We've actually been talking about this exact scenario for more than a decade, trying to ring that bell. And again, as you said, there are, you know, 10,000 things you have to worry about every day. If it's not immediately a squeaky wheel, you can tend to push it off.
00:15:57:17 - 00:16:27:26
Matthew Rosenquist
Well, people have pushed it off. So when it did hit, and most of the Fortune 500 were impacted, many of the United States government agencies were impacted, critical infrastructures of the US were impacted and made vulnerable, all of a sudden, overnight, it really was a watershed moment of opening people's eyes to the risks that, you know, a subset of the security community was already talking about and saying, please, please, please pay attention to.
00:16:27:29 - 00:16:52:20
Matthew Rosenquist
So that really kind of put I.T. departments and technology departments and security departments across the globe, it kind of threw us on our side, because in most cases, there aren't the normal controls in place that would resist that. The industry right now is struggling to figure out, well, how do I trust my third-party supplier, right?
00:16:52:20 - 00:17:13:19
Matthew Rosenquist
How do I look at the code? How do I compartmentalize? How do I do all these things? And nobody has a perfect answer. There's not a product you can buy. There's no special lock box. There are steps that you can take to improve your position, but even the general structures, what's accepted practice for software developers and so forth, aren't sufficient.
00:17:13:21 - 00:17:34:05
Matthew Rosenquist
So we're going to see a fundamental change in how code is developed, how it's pushed out, how it's validated, what kind of access it's going to have, how much you're going to trust it. Fundamental things are going to change in the industry, because this particular attack, you know, was able to access a whole bunch of things in the impacted companies.
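One small, concrete piece of "don't blindly trust the update" is verifying a downloaded artifact against a digest the vendor publishes out of band before installing it. The sketch below uses a hypothetical file name and a placeholder digest, and note that it only catches tampering after the vendor built the artifact; a compromised build pipeline, as in the SolarWinds case, would still pass this check, which is why the deeper changes to how code is developed and validated are needed.

```python
# Hedged sketch: refuse to install a vendor update whose SHA-256 digest
# does not match the value published out of band. File name and expected
# digest are hypothetical placeholders.
import hashlib
import sys

EXPECTED_SHA256 = "replace-with-vendor-published-digest"  # placeholder

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def safe_to_install(path):
    digest = sha256_of(path)
    if digest != EXPECTED_SHA256:
        print(f"Digest mismatch for {path}: {digest}")
        return False
    return True

if __name__ == "__main__":
    if not safe_to_install("vendor-agent-update.pkg"):  # hypothetical artifact
        sys.exit("Refusing to install unverified update")
```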
00:17:34:08 - 00:17:46:23
Lisa Thee
So you mentioned something about the industry believing it was likely a nation state. Can you share a little bit about why that is the suspicion, given the behavior of how the breach happened and what happened afterwards?
00:17:47:00 - 00:18:13:19
Matthew Rosenquist
Yeah, absolutely. So attribution in cybersecurity is actually really, really difficult. In the digital world, you can hide yourself, you can try to be stealthy, you can masquerade, I can make myself look like you, I can implicate others. There are all sorts of ways of hiding and deception that can happen. So in most cases, the forensics is trying to piece things together.
00:18:13:21 - 00:18:51:23
Matthew Rosenquist
You know, what language was the code written in? Was there Cyrillic in some of the information? What were the tactics that they used? Did they use a particular command and control server that we know is from this attacker? Lots of different things. But again, each one could be just subterfuge, right? Intentional misdirection. But when you look at the bigger picture, and in this case the exploits that they used, the sophistication that they had within the code, what it was doing, how it was updating, how it was, you know, remaining stealthy, how it was being controlled: top notch.
00:18:51:26 - 00:19:24:13
Matthew Rosenquist
And if you look at how long it took, you're looking at, in an attack, what we call a threat agent, right, that was very patient. Cybercriminals are not patient, right? They want to smash, grab, get as much money as they can and go. Now, this particular threat agent was extraordinarily patient. They probably either paid a lot of money for some of the exploits, probably in the millions, or they developed them themselves, discovered them themselves, also probably costing millions.
00:19:24:15 - 00:19:51:29
Matthew Rosenquist
Right. So there was a lot of investment, time and patience involved. And when you start getting those kinds of combinations, there's really only one threat actor you're looking at: you're looking at nation states. It's another 20 to 100 million dollar investment for this particular hack. There's not a company or even a top-tier cybercriminal crew that's probably going to invest that much.
00:19:52:01 - 00:20:02:11
Matthew Rosenquist
So that's how we kind of indirectly say, yeah, it's probably a nation state. And at that level, there's only a handful out there that could pull it off.
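As a toy illustration of one of the weak forensic signals mentioned above (language artifacts such as Cyrillic strings left inside a sample), here is a minimal sketch. The file path is hypothetical, the check only catches UTF-8-encoded strings, and, as Matthew notes, any single indicator like this can be planted as misdirection; real attribution weighs many signals together.

```python
# Toy check for one weak attribution signal: Cyrillic text embedded in a
# suspicious file. Only handles UTF-8-encoded strings; a single hit means
# very little on its own and can be deliberate misdirection.
import re

CYRILLIC_RUN = re.compile(r"[\u0400-\u04FF]{3,}")

def cyrillic_strings(path):
    with open(path, "rb") as f:
        text = f.read().decode("utf-8", errors="ignore")
    return CYRILLIC_RUN.findall(text)

hits = cyrillic_strings("suspicious_sample.bin")  # hypothetical sample path
print(f"{len(hits)} Cyrillic string(s) found:", hits[:5])
```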
00:20:02:11 - 00:20:22:16
Lisa Thee
So, going with that, Matt, I would really love to hear a little bit more about what were some of the vulnerabilities, based on the installation of the software, that made so many companies trust it. Because I think nobody shows up to work saying, I want to expose us to a cybersecurity breach. Nobody shows up and says, I want it to happen on my watch.
00:20:22:16 - 00:20:30:23
Lisa Thee
So can we just demystify a little bit of why it was so successful? Because unfortunately, everybody who wants to do bad things is now studying that, right?
00:20:30:24 - 00:21:08:14
Matthew Rosenquist
Yes, they are. And I guarantee you every other nation state out there is trying to duplicate this, and many probably already are. There are other ones out there that we don't know of yet that are doing exactly this with different products, and we're trying to find them. But to give you some examples of what they did: when they originally hacked SolarWinds, instead of going straight into the code or modifying what a developer was doing, what they actually did, which is really, really smart... Software is basically developed and tested and ready to go.
00:21:08:17 - 00:21:31:01
Matthew Rosenquist
Then they're going to compile it, and that's kind of the last step: compile it and then distribute it. And what they chose to do, because they had access to all those stages, is wait until all the code was done and tested and in that compile stage, and then inject their code as it was being compiled. So they waited until that point.
00:21:31:01 - 00:21:52:26
Matthew Rosenquist
They compromised the compiling environment to inject their code in, and again, they would have had to have tested, off network, that code with their code to make sure it didn't break anything, because that would have been noticed. And then it basically injected at that point. And then, you know, the company said, okay, we've compiled our code, let's distribute it to our customers.
00:21:53:03 - 00:22:24:26
Matthew Rosenquist
And so they leveraged that company's distribution network to get it to all the different companies. Another thing that most people don't realize about this particular product, and I'll go ahead and say it: it was designed purely from the perspective of the amount of rights and access that it needed. It wasn't elegant. It was kind of brute force. When you installed this product, almost every anti-malware program would, you know, raise a flag and go, hey, this is malware, right?
00:22:24:26 - 00:22:47:08
Matthew Rosenquist
This is their normal, perfectly good product out of the box, and it would raise flags because it needed so much access and it was sending data in very unusual ways that would normally flag as malware. But as part of the installation process, they actually told customers, yeah, we know it's going to get flagged, go ahead and turn off all those alerts.
00:22:47:11 - 00:23:09:00
Matthew Rosenquist
So purposely turn off your security tools to trust our product perpetually in the future. That's a bad business decision, and if any product requires you to do that, you should be going, no, you've designed this poorly, you've gone off track, right?
00:23:09:03 - 00:23:17:03
Lisa Thee
I mean, obviously there had to be a precedent for people to trust it, and there's probably going to be more of that in the future that people have to be aware of, right?
00:23:17:09 - 00:23:35:15
Matthew Rosenquist
Yeah. Well, we hadn't seen major attacks like that before. So, of course: I don't have to worry about, you know, one of my trusted vendors and their product. I just have to worry about the malware, the email, the social engineering. My trusted vendors, you know, I've got an agreement with them. It's a legal agreement. That's how I'm going to protect myself.
00:23:35:17 - 00:23:56:10
Matthew Rosenquist
And so we'll just sign the legal agreement. Okay? Sure. I'll just trust them. And that's really kind of blind trust. So when this did happen, everybody's now taking a step back going, wait, that legal contract really didn't help me. It didn't help the 490, or however many, out of the Fortune 500, it didn't help the government agencies, it didn't help all those critical infrastructures.
00:23:56:12 - 00:24:25:05
Matthew Rosenquist
Right. That could have been taken down. Now, we got lucky because once it was detected, the attacker, depending on the type of attacker, they really had two paths to go down. They could either quietly start removing it. Right. And just gather what information they could passively, or they could basically burn down the environments that they had compromised, do critical harm.
00:24:25:07 - 00:24:52:04
Matthew Rosenquist
And it's a scorched-earth policy. And we've seen that with some malware and some attackers. In this case, the attacker, a nation state, chose not to do scorched earth, not to conduct what would be largely considered cyber war, and instead passively started just backing away, gathering more information, and just letting it be. In fact, they were even self-deleting to not do harm.
00:24:52:07 - 00:24:53:26
Matthew Rosenquist
And so that.
00:24:53:29 - 00:25:07:27
Lisa Thee
What would have happened if it had affected, like, a power plant, like our critical infrastructure? What if they had chosen to do a scorched-earth model, given the places that they had gotten to? What could a real-world outcome of that attack have been?
00:25:08:01 - 00:25:28:13
Matthew Rosenquist
We would still be recovering. So given the access that it had to government systems, to critical infrastructure, to the Fortune 500, Fortune 1000, Fortune 10,000, right, if they really wanted to do it, they would have been able to take out the electrical grids. They would have been able to take out the Internet.
00:25:28:13 - 00:25:51:29
Matthew Rosenquist
They would have been able to take out all of our communication stations, our logistics-linked food supplies, right? It's got to be on trucks, and those are managed by logistics systems and communications. So water, again, electricity, the lights going out. And when you start adding all those things together... if the power goes out here today, we're in California and it's going to be 106 degrees.
00:25:51:29 - 00:26:26:21
Matthew Rosenquist
Right. We're going to have some rolling blackouts. One critical infrastructure going out, we can recover from it. But if several of them go out, you get what's considered a cascade failure. And I don't want people to panic or anything, but when you get into a cascade failure, then even your crisis response capabilities are knocked out of the game. And so you can't communicate to tell people, you can't coordinate efforts, you can't bring in what you need immediately for survival, and you start getting back to pencil and paper for everything.
00:26:26:28 - 00:26:48:27
Matthew Rosenquist
And when you can't get gasoline in your car, when you don't have electricity in your home, when the hospitals are not accepting patients and they can't care for them, you can't spend your money, right? Things start going down and the water that's coming out of your tap now smells funny and all your toilets are backed up.
00:26:48:29 - 00:26:55:24
Lisa Thee
Okay. You've convinced me that everyone needs to invest in their cybersecurity.
00:26:55:26 - 00:27:12:29
Matthew Rosenquist
Cascade failures are really, really, really bad. We haven't seen them yet, right? We've talked about it for years, like a digital Pearl Harbor. In most cases, that's considered cyber war, in fact, in every classification. And if you do that to a country, you're enacting war and you're going to cause casualties.
00:27:13:01 - 00:27:22:18
Lisa Thee
So, yeah, it's almost like the Cold War, right? We all kept building up reserves to say, if you nuke me, I'll nuke you. And there's a parallel here, right?
00:27:22:23 - 00:27:25:23
Matthew Rosenquist
We already are. It is a parallel, very much so.
00:27:25:26 - 00:27:32:00
Lisa Thee
You can't put the cat back in the bag, right? Yeah. Thank you for clarifying that. I just want to.
00:27:32:00 - 00:27:33:01
Matthew Rosenquist
Happy.
00:27:33:04 - 00:27:42:25
Lisa Thee
So why does it matter? I'm not even sure most people know what SolarWinds' technology does. Why would so many companies use it?
00:27:42:25 - 00:28:04:10
Matthew Rosenquist
It was a very affordable way of monitoring your servers and being able to remotely see if they're, you know, up or down. Right? When your Internet goes out, there's some server or group of servers that are misbehaving and causing errors or blue screens. If you're the I.T. department, you want to be alerted to those as fast as possible, and preferably just as they start to fail.
00:28:04:12 - 00:28:31:24
Matthew Rosenquist
Right. You want to be alerted so you can go in and redirect and, you know, reboot things and do whatever you're going to do. That's really what this does. It's simply an I.T., an information technology, monitoring tool that would, you know, keep track of different applications and servers and services within an infrastructure. But because of that, it needed access to all those servers, all those networks, all those core functions.
00:28:31:27 - 00:28:37:26
Matthew Rosenquist
And if you have access to that, then you can shut those things down, you can corrupt those, you can do some really bad things.
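For a rough sense of what that kind of up/down monitoring looks like at its simplest, here is a minimal, hypothetical sketch: poll each host's health endpoint and alert when one stops responding. The host names are invented; a real product of this kind does far more, which is exactly why it needs such broad access to the environment.

```python
# Minimal up/down check over hypothetical internal health endpoints.
# A real monitoring tool tracks many more signals and would page an
# on-call engineer instead of printing.
import urllib.request

HOSTS = [
    "https://app-server-1.internal/health",   # hypothetical
    "https://db-server-1.internal/health",    # hypothetical
]

def is_up(url, timeout=5):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

for url in HOSTS:
    if not is_up(url):
        print(f"ALERT: {url} is not responding")
```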
00:28:37:29 - 00:28:44:26
Lisa Thee
Thank you for demystifying that. I think that really helps everybody get a grounding in what we just lived through.
00:28:44:28 - 00:28:56:24
Matthew Rosenquist
We dodged a bullet. We really, really did. But it was a wake up call. We got lucky. So we need to learn from it, though. We need to make sure that we're not put in that vulnerable of a position again.
00:28:56:27 - 00:29:04:23
Lisa Thee
Yeah. What are the thought leaders of the industry leaning towards in terms of mitigating risk going forward, now that we know?
00:29:04:25 - 00:29:24:16
Matthew Rosenquist
There is not one easy solution? There's a lot of work going on on the political side. So if you can see that the president at states was actually just in Europe and talking to other nations, potentially even the nation that attacked us to help clarify, okay, what are things that are off limits? And there were some specific discussions.
00:29:24:16 - 00:29:48:28
Matthew Rosenquist
Which critical infrastructures are off limits, that I'm not going to touch yours and you're not going to touch mine. Talking with NATO to say, okay, what about Article 5, which basically says if one country is attacked, then all of them respond. We have that for kinetic warfare; for digital warfare, it in theory applies. But again, if you don't have clear attribution, you can't see troops in uniforms crossing the border.
00:29:49:05 - 00:30:26:17
Matthew Rosenquist
And I can make it look like somebody else, so it becomes very, very tough. So they're working to figure out, okay, what attribution would be needed and how we would do that. And then, okay, now it's a team effort. We're working on the technology side to help limit some of those ingress points. We're trying to change the industry to bolster up the application and DevOps environments, the OpSec and DevSecOps, to, you know, make sure that people can't insert code or compromise, you know, different aspects of the development.
00:30:26:20 - 00:30:50:13
Matthew Rosenquist
We also want to make sure that, you know, code is written properly, so security tools will work in lockstep with it and you don't have to turn them off or blind them to certain technologies. But most of all, it's about letting the executives, the leadership, know this is a viable threat, it's a viable threat to you. You've got potentially hundreds of suppliers to you.
00:30:50:15 - 00:31:13:22
Matthew Rosenquist
And if you're not vetting them, and if you're not confident in them or compartmentalizing them, you're putting your core business and your customers, because it could flow through you down to your customers and partners, all in jeopardy. So the most important thing we can do is to make sure that we're communicating with the executives and making sure that they understand this is an important issue.
00:31:13:28 - 00:31:42:03
Matthew Rosenquist
It's not something that's just going to go away. Someone's not just going to come up with a nifty app or tool that's going to fix this overnight. This is something that they have to look out for. They have to put it in the hands of the security professionals in their organization and have them come up with plans that are optimal for that company, for your company, to mitigate the risks now and to start embracing some of those changes moving forward as the rest of the industry comes up as well.
00:31:42:06 - 00:32:01:02
Lisa Thee
And it's important to resource it properly, and to make sure that when these security professionals come back to you and say this is what we need, it's not a wish list of things they're just coming up with, right? It's the minimum effective dose that's required, and that does require people, money and time, right?
00:32:01:03 - 00:32:20:21
Matthew Rosenquist
It does. And there's going to be additional friction, unfortunately, right? Any time you institute something and you're trying to address this, you're going to introduce additional friction and costs, more than likely. So, you know, there are challenges there, and don't set the expectation of, hey, you know, CISO, go fix this for us in perpetuity. There is no fix yet, right?
00:32:20:21 - 00:32:44:15
Matthew Rosenquist
We can get better. We can manage the risk by lowering the potential impacts and consequences and likelihoods. But it's a long story, and telling them to start addressing it now is chapter one; there are many chapters that are going to be unfolding over time. So don't expect that there's just a product or a switch that you can flip to make yourself secure.
00:32:44:17 - 00:32:51:19
Matthew Rosenquist
It's an evolutionary process. It's tied to the growth of your company and the technologies that you use.
00:32:51:22 - 00:33:29:22
Lisa Thee
Yep. And that makes a lot of sense to me and the consulting that we provide in the data-for-good space, because those policies are ever changing and evolving. And there's always a natural tension between the teams that are developing the policies, the technical teams that have to implement tools to ensure alignment with those policies, and then the operators that are looking at the outputs of the tools to manage the human game, right? There are always going to be those three elements, plus the engineers and those at the product manager level that could be optimizing for many different things at the same time, and making sure that those risks are really clear to them as to
00:33:29:25 - 00:33:42:06
Lisa Thee
why they need to think about it, and why it might be the right trade-off to make to have a slightly higher-friction user experience, because you don't want to take on risk that is avoidable, correct?
00:33:42:09 - 00:34:03:10
Matthew Rosenquist
Yeah. And you used a great word: tradeoffs. You know, nobody's ever perfectly happy, right? You always want something cheaper, faster, better. And in security, okay, it's the risks that we manage, it's the cost to the organization, and it's the friction that you're introducing, either internally to support it or even for your customers. And again, you're kind of squeezing the bubbles there.
00:34:03:12 - 00:34:25:11
Matthew Rosenquist
And it is about tradeoffs and it's about consciously making a decision, setting those optimal targets, putting the right people in charge and empowering them with the right resources to go make that a reality and realizing that target may shift over time. And so there has to be adaptability in it again to manage those tradeoffs.
00:34:25:14 - 00:34:49:20
Lisa Thee
And also executive alignment, right? Especially in highly matrixed organizations, you need to be making sure that you're empowering the product owners across your organization to understand the sense of urgency, and what you measure, you achieve, right? It's important to have OKRs and dashboards and to make the invisible visible to the decision makers, or else sometimes there are just missteps that happen.
00:34:49:23 - 00:35:09:28
Matthew Rosenquist
More often than not, right? If you don't have senior executive support, if they're not also toeing the line, right, they have to be able to show by example and show that support and be competent enough to be able to discuss it, not only internally with employees and contractors and so forth, but also with business partners and even customers.
00:35:10:00 - 00:35:35:24
Matthew Rosenquist
Now, all the executives don't have to be experts, but like any business risk, and cyber is one of many, they at least need to understand it and be comfortable speaking about it at a high level, because cyber is part of doing business now. It's just the cost of doing business. So those risks have to be incorporated into the entire corporate risk model, including competition and, you know, all those other kinds of things.
00:35:35:27 - 00:36:08:04
Lisa Thee
So one of the topics in the news right now that I think is personally very interesting, and I know you have a point of view on, is some of the reforms that are being proposed, both in the US and also in Parliament, affecting Section 230 of the Communications Decency Act. That's the US version, and Parliament has a different slant on it, but it's around the full immunity of tech platforms from having responsibility for user-generated content that's put on their platforms.
00:36:08:06 - 00:36:34:24
Lisa Thee
A lot of the rhetoric and the conversation that's happening in the media is really around protecting free speech. But we all know that there are more implications as we increase the privacy dials and protect free speech than just on the surface. So I would love your point of view in terms of where you think the right balance might make sense so that we are making adjustments and we're not breaking systems.
00:36:34:26 - 00:37:01:10
Matthew Rosenquist
Yeah, and again, this gets into some really sticky issues, which I absolutely love. So, you know, from my position, I believe privacy should be an absolute right. Unfortunately, in the US only one state even lists it as a right for its citizens, and that's California. No other state has it in their constitution, only California. When we go to Europe and other parts of the world, privacy actually is declared as a right.
00:37:01:12 - 00:37:32:28
Matthew Rosenquist
But that doesn't mean that it creates an impenetrable shield for bad actors to do what they're doing. So when I see the evolution of our digital species, being able to communicate and being able to share things in a productive way, we also need to have the tools to help protect victims of that exact same technology.
00:37:33:01 - 00:38:09:18
Matthew Rosenquist
So it's a matter of finding that balance. And I'm a huge advocate of allowing, especially, law enforcement, because that's who we rely on, to be given specific tools so that they can do the investigation, so that they can do what is necessary. And I'll give you an example, right? We want law enforcement to be able to investigate a particular criminal when there's probable cause, and look at their digital communications and see what's going on, so that they can quickly understand, yes, this is bad, and let's get them out.
00:38:09:18 - 00:38:29:13
Matthew Rosenquist
Right. Let's get a case built, get them arrested, and get them out of there so that they're not causing harm. But at the same time, we don't necessarily want the government to have those widespread tools to be able to do that to all the citizens, because the majority of them are law abiding, and that would be a violation of our privacy.
00:38:29:16 - 00:38:57:05
Matthew Rosenquist
So as an example, there's always this debate: should law enforcement be able to crack phones? And you've got two streams. On one hand, law enforcement says, yes, we should be able to crack all phones at any time and be able to do it remotely and see all the content, right? And they've even gone to the extreme of saying, let's get rid of encryption.
00:38:57:08 - 00:39:21:27
Matthew Rosenquist
Well, wait a second, right? Like, my financial transactions are encrypted. You know, I don't want the criminals getting into that. And on the other side, you've got people that say, no, everything needs to be encrypted and law enforcement should not have access to any of that. Okay, where's that right balance? Well, let's take it and apply different examples.
00:39:21:29 - 00:39:40:04
Matthew Rosenquist
Right. In order for law enforcement to come into my house and do a search and seizure, they have to have a writ, they have to have a warrant. They have to have, you know, somebody outside of the executive branch, in the judicial branch, approve it. And it's just for my house. It's not for everybody in my neighborhood.
00:39:40:11 - 00:40:05:03
Matthew Rosenquist
It's not for everybody in my city or state. It's my house. And you have to have probable cause. Well, when we look at the cell phone issue: should law enforcement have the ability to sniff all traffic from all people, from all phones? No. But if they have probable cause, should they have the ability to go after the traffic from a particular user?
00:40:05:07 - 00:40:29:16
Matthew Rosenquist
Yeah. Should they have the ability to crack a particular user's phone with that same oversight, right, separation of duties? Yeah. You know, in my mind that is okay. But again, you know, some of the tools that they're using, which are fantastic, actually require physical possession of the phone. They hook it up to this nifty little, really expensive device, and it chews on it and it breaks it.
00:40:29:22 - 00:40:57:02
Matthew Rosenquist
And now you can look at the data. I think that's a great tool, because that tool doesn't decrypt everybody in my neighborhood. It doesn't decrypt everybody on my cell tower, in my city or in my state. It is a specific-use investigation tool that I firmly believe, yes, law enforcement should have, and it is incredibly difficult to abuse that type of power when it's specific and you have oversight controls over it.
00:40:57:04 - 00:41:25:25
Lisa Thee
And I hear you. If you are a child that has had a crime committed against them, that was documented, and that is being spread all over the Internet and you can't get it taken down, is your privacy less valuable than the privacy of the person that is distributing that information? There have to be some checks and balances, or else we are in a position where it's the Wild West.
00:41:25:25 - 00:41:32:19
Lisa Thee
I mean, you know, for the first 20 years of the automobile, it wasn't a requirement to have seatbelts because there weren't that many accidents on the road.
00:41:32:22 - 00:41:40:08
Matthew Rosenquist
But there were a lot, actually. In Chicago, the only two cars that were there actually crashed into each other. But yeah.
00:41:40:11 - 00:42:03:06
Lisa Thee
That's a high number. That's the point where street signs were required, and lights were required, and seatbelts were required to be put into cars before you saw them. We didn't have driver's tests to see if anybody was a good enough driver that they didn't need a seatbelt. You still get a ticket if you don't wear it, right? And I think we're starting to get to that 20-year mark, getting a little bit long in the tooth on
00:42:03:08 - 00:42:08:26
Lisa Thee
everybody can just do whatever they need to do, and we'll just trust that companies will take care of it to the best of their ability.
00:42:08:26 - 00:42:38:04
Matthew Rosenquist
I definitely don't trust the companies. Heck no, right? And that gets into the other thing. I'm not big on regulations. I'm really, really not. But when we talk about privacy, it's the unusual situation where the financial incentives... normally the financial incentives align with consumers and good practices and so forth, because consumers will then buy and promote and so forth. When it came to privacy, that wasn't the case.
00:42:38:06 - 00:43:08:27
Matthew Rosenquist
And we had a lot of companies where the financial incentives were flipped, and it was more beneficial for them to harvest data without telling people. And if data was breached, don't tell anybody. Don't tell them that we lost your data, or we're selling your data, or we're doing all sorts of bad things with it. And so we actually did have to have regulation to turn those tables and require it, because if we didn't have that, companies wouldn't report that data breaches happened.
00:43:08:29 - 00:43:34:02
Matthew Rosenquist
Right? They just wouldn't. They didn't, until we made it a law. So, you know, we have to look at some of the financial incentives to see where we need regulations. And in those situations where the incentives are skewed, that's probably the place that we need them. We see a lot of social media companies, and a lot of other companies as well, acting unethically.
00:43:34:04 - 00:43:56:29
Matthew Rosenquist
And this is a great conversation that I have with lots of companies. You know, ethics, and cyber ethics in general, is, okay, what kind of company are you going to be? Where do you want to be? What's your future here? How are you either contributing, or potentially exposing bad things to your customers, your clients, and the people that you're touching?
00:43:57:01 - 00:44:18:23
Matthew Rosenquist
And a lot of times companies are moving so fast, they're seeing the dollar signs for a product or a service, and they're not taking due care, they're not doing due diligence, they're not thinking about it strategically. And when bad things happen, they either go, well, that's kind of okay because we're making enough money, or they're saying, oh, we didn't think about that.
00:44:18:23 - 00:44:22:12
Matthew Rosenquist
Yeah, I guess we just really harmed 2 billion people. Okay.
00:44:22:15 - 00:44:24:20
Lisa Thee
Sorry we started that coup. We didn't mean that.
00:44:24:20 - 00:44:48:15
Matthew Rosenquist
Yeah, sorry about that. Sorry that a whole bunch of people in country X, Y, Z were attacked in the middle of the night by the secret police of that government because they attended an independence rally, right, a democracy rally. They just showed up, and because technology was used to identify them, the secret police went and grabbed them in the middle of the night.
00:44:48:15 - 00:45:12:00
Matthew Rosenquist
And they've never been seen since. We've had situations like that now. So again, it can be life and death. It really can. And the one company that I'm thinking of where that happened, when that company realized that their technology was used for that, they had their teams, their engineers, up for almost two weeks, 24 by 7, coming up with a fix.
00:45:12:02 - 00:45:22:24
Matthew Rosenquist
So the government in that country couldn't do it again. So that's great. It would have been nice if that had been thought of ahead of time, because there are a lot of people that never came home.
00:45:22:24 - 00:45:37:11
Lisa Thee
What if we spent 12 weeks before we deployed the product to think through all the consequences, and weren't doing a 24-hour cycle after the fact, right? Safety by design, security by design.
00:45:37:12 - 00:45:40:26
Matthew Rosenquist
Ounce of prevention, pound of cure. Yes.
00:45:40:26 - 00:45:42:03
Lisa Thee
Yeah. We have to.
00:45:42:03 - 00:45:59:06
Matthew Rosenquist
Have the right mindset in there. You also have to have a mindset that's a little bit paranoid, a little bit, you know, able to red-team it and go, okay, well, how can bad guys manipulate this? Because most of the technology out there is really trying to satisfy people, and it's not thinking about how the truly malicious and diabolical are going,
00:45:59:06 - 00:46:03:23
Matthew Rosenquist
I'm so glad you did that. You know, I can now do X, Y, and Z.
00:46:03:26 - 00:46:22:06
Lisa Thee
Yeah. And when I was running my own startup company, that was the question I would always put out to my team: if a nation state bad actor purchased our company tomorrow and decided to flip it on its nose for a different purpose, are we comfortable that we put this technology in the world? Because once it's in the wild, you can't take it back.
00:46:22:06 - 00:46:29:26
Matthew Rosenquist
You can't. That genie doesn't go back in the bottle, no matter how hard you push. Yeah, it's true. You have to think of it.
00:46:29:28 - 00:46:47:06
Lisa Thee
I think you've made us all much more aware of what we've been reading in headlines and news cycles, and given us the next click down to actionable information that executives, leaders, innovators and thinkers can take and figure out how to apply to their own areas. I really appreciate you taking the time with us today to do that.
00:46:47:11 - 00:47:09:06
Matthew Rosenquist
Thank you. Gosh, thank you. I mean, ultimately, we all have a role, whether we're the CEO or we're the person cleaning up the desks at the end of the night. We all have a role as consumers and as leaders in business. So, you know, we have to step up. And if we don't have the knowledge, reach out to the industry, bring in the right people, ask.
00:47:09:09 - 00:47:31:13
Matthew Rosenquist
I know myself and my colleagues, we're constantly working with companies and helping them with insights, because we're really all on the same side. We are. We want technology to be trustworthy. We want the proliferation of businesses and everything else to connect and enrich the lives of people all over the globe. It takes a team effort.
00:47:31:15 - 00:47:38:00
Lisa Thee
That's awesome. So for people that are inspired and want to keep up with what you're doing, what are some of the best ways to find you?
00:47:38:02 - 00:48:12:11
Matthew Rosenquist
Probably the best way is LinkedIn. You can find my profile and you can follow me there; you'll be joined by 190,000 other followers. I do a lot of blog posts and so forth. I just started a YouTube channel, Cybersecurity Insights, so you can search YouTube for Cybersecurity Insights and my name and it'll come up. I'm doing, you know, short video blogs about different topics and different issues, taking stories in the news and, you know, looking at them through a lens of what does this mean long term?
00:48:12:11 - 00:48:32:15
Matthew Rosenquist
Why should we really be concerned about this? Or is this a really bad idea? And kind of tearing that apart. In fact, right now I'm just starting about a ten-part series on ransomware and some of the challenges there. We're seeing those crop up in the news, in critical infrastructure and everything else. But, you know, tearing it apart:
00:48:32:15 - 00:48:50:11
Matthew Rosenquist
What are the impacts? Let's look at the attackers. You know, look at the options of what we can do, why it's such a problem, what they're going to do next. Those are the kinds of discussions that I love to have. So you can follow me on Cybersecurity Insights on YouTube and definitely follow me on LinkedIn.
00:48:50:13 - 00:49:13:20
Lisa Thee
And if any of you were enjoying the banter between Matt and me around the double-edged sword of privacy and safety, you can check out helpnetsecurity.com. We wrote a three-part article series in a little bit more depth about our points of view on the different areas and how we came to a joint recommendation of something in balance that can help us all get to the tipping point that we think is reasonable for our times today.
00:49:13:20 - 00:49:16:18
Lisa Thee
So please feel free to check that out as well.
00:49:16:20 - 00:49:25:20
Matthew Rosenquist
And bring your opinions. Again, we all have opinions, and it's important that we see things from all those aspects. So don't hesitate, you know, give your voice.
00:49:25:23 - 00:49:28:00
Lisa Thee
Thank you so much for joining us today, Matt. I really appreciate it.
00:49:28:00 - 00:49:33:02
Matthew Rosenquist
Thank you, Lisa. My pleasure. Bye, everyone.
00:49:33:04 - 00:49:43:03
Narrator
Thanks for listening to the Navigating Forward podcast. We'd love to hear from you. At a crossroads of uncertainty and opportunity, how do you navigate forward? We'll see you next time.