Time
59 minutes
Difficulty
Beginner
CEU/CPE
1

Video Transcription

00:03
Hello and welcome to Episode 11 of Competencies of the Effective CISO. So, competency 11: risk.
00:10
Just a quick reminder: we will be skipping next week
00:15
due to the security conferences that are occurring, and we will have our final session, on leadership, competency 12,
00:26
on August 15th. It's been an awesome, awesome journey with you all, and I'm looking forward to finishing this off really well. Ed, really excited. Take it away.
00:39
Alright. I guess next week, with Black Hat and DEF CON, I hope people take the time to enjoy the conferences. My observation is
00:48
I have been going to these conferences now since the eighties. I used to attend the old NCSC Conference in Baltimore, on the waterfront, in the
00:57
late eighties. I've been doing it forever. The time you spend preparing for a conference is pretty vital. I find that if you go to a conference with a plan,
01:07
then it's a much more effective use of your time than
01:11
just kind of going and saying, I'm going to check out the vendors and
01:15
pick some talks. It's much more effective to spend the time in advance, before you go,
01:23
listing out what you think your priorities might be in terms of the types of companies, or the specific companies, you'd like to visit with,
01:30
or
01:32
which talks are going to make sense. I always do that with my little team at TAG Cyber. We divvy up the work,
01:38
picking areas that make sense for each of us. It's almost like the difference between walking into a hardware store
01:45
just to walk up and down the aisles and look at stuff, versus knowing you're going to build, say, a birdcage, so you know exactly what you need. It's a very different experience in a hardware store when you're building something
01:57
than when you're not. So keep that in mind. I have so many colleagues and friends who go and just sort of breezily wander up and down the aisles,
02:08
and that always strikes me as a spectacular waste of time. So I hope you'll do some prep time. I think far too few people
02:17
prepare for conferences; they just go. So I hope that's useful.
02:22
Now, we're at competency 11, and this is an interesting one. Our last two
02:29
are the most challenging.
02:31
Risk is our competency 11, and the question that should immediately come to mind is:
02:38
do I mean
02:40
that risk is a competency where you're a risk taker,
02:45
or do I mean that risk is a competency where you're a risk avoider? And I get that the business school answer is, well, I'm a risk manager, a risk optimizer. Baloney. You're one or the other.
02:58
We all know
03:00
that there are not three types of people. There are two types of people. There are people who, like me,
03:06
are always an hour early for everything, because they've thought through every possible scenario
03:13
that could cause them to be late.
03:15
My family jokes that I'm always way early for everything.
03:21
But I do that
03:22
because I factor the worst-case scenario into everything I do, blah, blah, blah.
03:28
You all know that person.
03:31
But then you also know
03:34
that person who does the opposite. One of my first bosses at Bell Labs was a person I just thought was fantastic, and he's one of my favorite bosses. His name's Tom Curtis. He's retired now. What a wonderful manager.
03:46
But man, that dude, if you were going to travel,
03:49
he'd be the guy rushing through the door of the airplane as you're flying somewhere, tie untied,
03:57
bag still kind of disheveled, just as they're closing the door and the airplane's going to push off, with a smile ear to ear, just delighting in the fact that he didn't waste one second of time
04:10
waiting in some stupid airport. He optimized his morning, had a nice breakfast, and so on. You know that person, too.
04:17
There are not too many people in between. You're the one or the other, and I know some of you will say, well, usually this or that. But we all have our tendencies.
04:28
I think you can be an effective CISO with either.
04:30
But I really believe that the former
04:33
is a little better suited to the job. If you're a crazy risk taker, then I recommend you become an inventor, go into sales or marketing,
04:48
become someone who
04:50
can parlay that kind of risk into advancing the business.
04:55
But I really do think that the chief information security officer generally
05:00
is someone who's doing everything he or she can to avoid risk. I'm just saying this because we all know that person. Most of my friends
05:10
who do this job
05:12
are that type of person. But not always; there are different tribes. My friend Gary McGraw,
05:17
some time ago, while he was over at Cigital,
05:23
I think they're Synopsys now,
05:25
came up with this concept of different tribes of CISOs. I think we talked about that at the beginning of this course; I showed you some pictures and we talked about
05:32
the fact that there are different tribes, and I think there are also different tribes of risk.
05:38
So there's no question that you can point to some people
05:43
who are in this role, in the top,
05:46
or one of the top, positions managing information security risk for a corporation,
05:50
who would have this tendency of being, in some sense, a risk taker, willing to be that person who just hops onto the airplane 30 seconds before departure. Gosh, I would sooner die than do that. I feel like if I'm only 30 minutes early for something, I'm late. So
06:10
I think for the most part there are these tribes,
06:14
but I'd say be reflective about your own
06:17
personal tendency.
06:19
And if you are a risk taker, you may have to compensate a little bit, because your job
06:25
in the CISO position
06:27
is, yes, to manage risk, and yes, to accept some risk, but not to like it. I think you're not supposed to like risk. You're supposed to be categorizing risk in the debit column
06:39
of the ledger, not the asset column of the ledger. And if you work in financial services, you know that in many cases risk
06:46
is considered an asset.
06:50
So we'll start with that. Now, we always write our
06:54
little sentence here just to kind of codify our belief
07:00
around risk, and there are a couple of key words here that I think we need to focus on. First, we say that the effective CISO understands that risk is the
07:09
primary driver in prioritizing safeguards. Then the second phrase that's important is that it must be properly balanced;
07:18
risk has to be balanced. And the other one is cost: balancing with cost constraints and the needs of the business. So, really, the way this all works is that when you try to understand risk, when you try to understand
07:32
the potential scenarios that could be
07:38
negative to a business or to an organization,
07:41
it becomes your goal to
07:44
prioritize how you handle this sort of thing, because, let's face it, you're going to have cost constraints. The business is going to have to run. You can't just shut down the business to shut down risk. That's an absurd
07:58
scenario; it makes no sense.
08:01
So prioritizing is what this is all about, and also tailoring those safeguards to the business. Let me give you an example.
08:09
I do a lot of consulting
08:11
and often
08:13
consulting around CISO
08:16
team design. You know, I probably have
08:20
more clients than you'd imagine who contact me when they're putting together either a new team, or they're a mid-sized business that's getting bigger and it's their first team,
08:31
or it's a first-time CISO, or someone who, for whatever reason, needs to make some strategic decisions about
08:39
security. What I always find
08:43
is that they go to these generic kinds of sources, like a course that someone might be teaching on setting up a CISO organization,
08:54
and it lays out a sort of formula
08:56
for what kind of construction goes into a team. And I'm sort of guilty of that at TAG Cyber; I have my 50
09:05
areas that I list in my research that I think are important for a CISO. But the really, really successful CISO starts with the business and works backwards. I had a client I was talking to this morning. I think I can describe this without
09:20
giving away who they are, even close. But
09:24
they're in a business that involves satellites.
09:26
They were talking to me about all this IT security stuff and how they
09:31
work that in, and I was listening. And then I just asked: what is the scenario that has you the most freaked out?
09:37
And the scenario that had them the most freaked out was something that didn't come anywhere near IT security. It was
09:45
hijacking of mission control and affecting,
09:50
or negatively affecting,
09:52
the trajectory or the orbit of one of their satellites, whether in an act of war, an act of vandalism, or an act of sabotage.
10:00
That was the scenario they were the most concerned with,
10:03
and all the IT security controls that they were putting into their LAN were, in some sense, orthogonal to that problem, because the mission control was separated
10:13
from the IT part of the business. So when I asked them, were they sure it was separated? A lot of quiet. And everyone listening to my voice gets the point here:
10:24
their risk,
10:26
potentially, here,
10:30
is in the assumptions they were making,
10:31
that nobody could, quote unquote, get to the control system for the satellites.
10:37
The risk is that maybe they're wrong. Maybe there are ways in that they don't know about. And that's what they should be focusing on, not these conventional IT security controls on the local area network that are so comfortable and familiar and
10:50
consistent with compliance documents, and that meet the expectations of anybody who might ask them about what they're doing.
10:58
The correct way to manage risk is to start with the business, start with what you do.
11:03
What are the things that can happen that keep you up at night, really? And the things that don't? You've got to figure that out,
11:13
because there's a big difference, for example,
11:16
between a negative consequence that really is not good for your business, your shareholders, your stakeholders, your employees,
11:24
but it just stays there,
11:26
versus a negative consequence that does all the things I just said but also maybe could kill a bunch of people, like a plane crashing, which
11:35
is the kind of thing that has consequences way beyond the organization of the airline. Versus, let's say you print comic books for a living; there are not too many scenarios
11:46
where there's going to be loss of life. So the risks are different, the safeguards are going to be prioritized differently, the cost
11:56
availability is going to be different for security,
11:58
and the needs of the business are just different for different missions. So
12:03
a really effective CISO does not just stamp out NIST-compliant
12:09
programs that include all the things you learned in SANS
12:13
or Cybrary. You know, Cybrary has amazing courses that you can go through, but they're not going to have a course in how
12:22
you control satellites. Maybe Leif does have one; I'm guessing not.
12:31
But if that's the business you're in, then that's where your risk is. And you're not going to get that from a course; you're not going to get that from a book; you're not going to get that from a template.
12:39
You're going to derive
12:41
your understanding of that risk from the business. Do you follow? It's extremely important.
12:48
Now, before we get into the habits, or the thoughts, or the principles
12:52
that I've pulled from the last 30 years of doing this
12:56
for CISOs,
12:58
there's a little crib sheet that I use whenever I'm making a decision about safeguards. I want to show it to you; we'll go through it quickly and then we'll get back to the risk thing. But this is a crib sheet that has been helpful to me. I used to carry it around in my pocket,
13:15
and when I was making a decision, I would always wonder which of these eight lines
13:22
I was following.
13:24
Let's take a look here. You see, we're going to start with that dot in the middle.
13:31
So we just start with some unity, or origin, which is an abstraction of wherever you are. I make no
13:39
absolute or even relative judgment about your security; being that dot in the middle, you just are where you are. You're some mid-tier bank and you do what you do on security, or you're
13:50
DISA and you do what you do on security, or you're a little company,
13:52
or you have an amazing program with great, great security and it's really awesome. Wherever you are, you're somewhere.
14:03
And then a decision has to be made about security.
14:07
And it strikes me that you can look at two variables, and I know you could look at n variables, I know that any decision has a whole vector of consequences,
14:16
but let's just look at two. Let's look at,
14:18
when making a decision about security: am I making security better or worse? I mean, that's kind of deliberate, but it's true; I'm making it better or worse.
14:30
And we all know that there are scenarios where we're making it worse. Say you decide
14:35
you're going to allow a third party to come into the enterprise using source-based authentication at your gateway,
14:43
and just trust that there's not going to be IP address spoofing,
14:46
and they come in, and you just go, I've got no choice. I mean, I'll do IAM authentication so they're not going to get into my systems, but they're going to get through my gateway if they just advertise a source IP address that is within range.
15:01
You've just decreased security in your enterprise when you decide to do that, period. You just do; it makes things easier. But there may be a zillion reasons why you do that. It might be reducing your spend
15:13
by some spectacular amount. That's why people outsource.
15:16
So you see the little arrow from the dot down to the bottom left, to the little circle that says remove system security and accept risk. It means I'm decreasing security.
15:31
We're not really talking about risk here, even though risk is in the title. But I've just made a decision
15:37
about security, and the spend went down, which is good, but security also went down. It's sort of rational; it's not like I have two things happening in conflicting directions.
15:50
Now let's say I go to the opposite part of that line. So instead of the bottom left, the top right:
15:56
there's the one we're all comfortable and familiar with. You're going to
16:00
introduce a security system you buy from some vendor. You
16:06
introduce some new control that costs money; you buy it, license it, install it, manage it. But
16:12
you have more security, so you go, all right,
16:15
makes sense.
16:18
I'm increasing security, I'm increasing my spend; it all makes sense. In this sort of context, it's a perfectly rational decision. So along that line, the line that has slope one, essentially,
16:37
the decisions make sense, right? If I take security out and I save money, I'm cool with that. If I add security and I spend money, I'm cool with that.
16:47
Those are both scenarios
16:48
that make perfectly good sense to me.
16:51
On the other hand,
16:52
on the other sort of axis here, let's look at the top left,
16:59
where I'm making a decision where I'm decreasing security and I'm also increasing my spend. You can see how that would be a spectacularly unpopular and unreasonable decision,
17:11
and I'll let you think through a lot of different scenarios where that happens. But it does happen. You might have, for example, a severe insider problem. Let's say
17:23
you're suspicious
17:25
that insiders are hollowing you out and committing sabotage, and your solution is, I'm going to do more security training
17:33
and teach people more about the security systems that we've got in place so that they're better operated.
17:40
Well, if you believe you have a sabotage problem, then you're training the saboteurs to understand your security better: decreased security, increased spend. I'm making that scenario up; you could make up ten that are maybe a zillion times more appropriate. But you get the idea. And then the bottom right here
17:59
is also sort of an unusual one, but certainly a good one.
18:03
That's the case where I'm increasing security and reducing spend.
18:07
Every vendor at Black Hat next week will tell you that they're in that category. They're going to say, you buy our tool and you can get rid of that loser thing that you have from our competitors, that bunch of jerks;
18:21
get rid of their thing, ours is cheaper, it's better. You move into that innovation best case.
18:26
The straight up and down here, like the
18:30
y-axis
18:32
on this chart:
18:33
to the right of the y-axis, the positive x values
18:40
all look like pretty good decisions. And on the left,
18:44
the negative x values strike me as not so good.
18:48
And as I said, I carried this little crib sheet around, and when I was making a decision, I would always ask myself, where am I here? This is not deep methodology; there's nothing mathematical here, nothing all that terribly foundational.
19:04
But I found this a useful crib sheet, because I can't tell you how many times I would just make a decision because I was busy,
19:12
where it seemed like the right thing, or I was being talked into it in some sense. And then, boom, I'd consult my crib sheet and come to the conclusion: wait a minute,
19:23
I'm moving down and to the left. That doesn't seem like such a good
19:29
decision here. Do we really want to do this?
19:33
And
19:33
sometimes you have no choice. If you look at the axis
19:38
that cuts through the origin, that starts in the bottom left and goes up to the top right,
19:45
you could argue that
19:47
anything below that seems somewhat rational, and anything above that, I don't know.
19:52
But there are a lot of interesting ways that you can look at this. Everything below the x-axis is a reduction in spend; everything above the x-axis is an increase in spend. And so on; you get the point.
20:06
So you might want to consider this. I thought I'd show this to you
20:11
because it's really not from any book. It's something that I wrote down,
20:15
and I liked it, and I've been showing it to students for 20 years, just sort of sharing it, and for some percentage of them
20:22
it may be a useful management tool.
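That crib sheet can be captured as a tiny decision helper. Here is a minimal sketch in Python; the quadrant labels are my own paraphrase of the chart, not an official tool. Given your judgment of whether a decision moves security up or down and spend up or down, it names the quadrant:

```python
def crib_sheet(delta_security: float, delta_spend: float) -> str:
    """Classify a security decision by its direction on the crib sheet's
    two axes: change in security (x) and change in spend (y)."""
    if delta_security > 0 and delta_spend > 0:
        return "buy a control: more security, more spend (rational)"
    if delta_security < 0 and delta_spend < 0:
        return "remove system security, accept risk: less security, less spend (rational)"
    if delta_security > 0 and delta_spend < 0:
        return "innovation best case: more security, less spend"
    if delta_security < 0 and delta_spend > 0:
        return "worst quadrant: less security, more spend -- rethink this"
    return "on an axis: movement in only one dimension (or none)"

# The outsourcing example: weaker gateway authentication, big cost savings.
print(crib_sheet(-1, -1))
```

Plugging in the training-the-saboteurs example, `crib_sheet(-1, +1)`, lands in the quadrant he warns about.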
20:26
Now, let's spend a little bit of time here on risk principles. We've got about 40 minutes left,
20:33
and I'm going to walk you through a bunch of principles that
20:49
I think a security person, a cybersecurity manager, an executive, a CISO needs to understand. It doesn't necessarily have to be something that
20:56
comes naturally to you. And look, let's face it, this is different than a CRO in a bank;
20:56
CROs in banks have a different job. That is not what we're talking about here.
21:02
So risk comes in many different flavors. Information risk is different than finance risk. And yes, I do understand
21:10
that when you're measuring information risk, in many cases the units will be dollars, as we'll see in a minute.
21:17
But they are different.
21:18
And a CRO
21:19
is a position
21:22
that is aligned with a fundamentally different purpose and mission
21:27
than the information risk issues that a CISO needs to deal with. So keep that in mind as we go through this. I'm not training CROs here, nor could I ever consider myself
21:38
even qualified to talk to a CRO, much less to train you to be one.
21:44
But what I am qualified to share with you
21:47
is that, in a lifetime of working as a CISO,
21:52
I do understand the principles that have helped me, and helped some of my friends and peers, get through some pretty tough situations, including recognizing when things presented to you may just flat out be nonsense. You should be able to smell when something is nonsense.
22:10
Which brings me to
22:11
the first principle here.
22:14
So, Monte Carlo estimates,
22:17
you know,
22:18
using Excel spreadsheets and little scripts that you write in Excel:
22:22
it's not
22:25
really considered a best practice.
22:27
In fact, there are so many risk platforms that are better.
22:32
But I would say it's the most common practice, right? I mean,
22:36
you get Excel with your Office 365 subscription.
22:40
It's really easy to code little math expressions that will
22:47
take your estimate of the likelihood of a particular thing happening,
22:53
bounced off of the likelihood of some range of potential loss to the business. There is a,
23:00
say, between 10 and 25 percent chance
23:04
that we will lose between 10 and 15 million dollars
23:10
in the next six months as a result of some sort of credential problem. People make those statements, and I've got to tell you, I put a picture up here of the dice:
23:22
that estimate is really only as good as the guesses that you make, right? And
23:27
there are a lot of interesting things we can say; I'll share one in a minute. But the bottom line is,
23:33
when somebody provides to you some sort of a risk statement
23:38
that's based on estimations that are made
23:44
in order to support, in many cases, a very large number of Monte Carlo simulations,
23:49
be careful, because
23:52
people doing this
23:53
are going to remind you that they ran a million of them.
23:56
So there are a lot of Monte Carlo simulations,
24:00
but they're only going to be as good
24:03
as the garbage that went into the mill. Let's take an example.
24:06
Let's say I have a population of integers that represent whatever; let's say these are loss values.
24:15
Now I've got a bag of numbers, imagine, and they have a low and a median and a high. And so,
24:23
well,
24:25
let's take the middle of those numbers. We'll just call it the median.
24:30
It turns out
24:32
that in that population,
24:33
if I just pick five of the numbers, five elements in that population, at random...
24:41
So I've got a bag of numbers. There is a median;
24:44
there always is.
24:45
And I just close my eyes and pull out five jelly beans, you know, five numbers.
24:52
If you do the math, and I'll explain what the math looks like in a minute,
24:56
it turns out there's essentially a 94 percent chance, 93.75 percent,
25:02
that that median will live somewhere between the highest and lowest values in that random sample.
25:11
Think about that.
25:12
Get a bag of jelly beans; they're all numbers.
25:15
It could be whatever, however big you want it to be,
25:18
and I reach in
25:21
and pull five out.
25:22
Okay.
25:23
And we know that there is a median in that bag. So if it's a bunch of numbers from,
25:30
you know, one to a million, I take all the values and I can calculate the median. It's basically in the middle.
25:37
And
25:38
it turns out, pulling five jelly beans out, there's a 94 percent chance
25:45
that if I line the jelly beans up,
25:47
the lowest number and the highest number
25:49
will live on either side of that median. 94 percent. How is that?
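Before walking through the coin-flip argument, the rule-of-five claim is easy to check empirically. Here's a quick simulation sketch, my own illustration rather than anything from the course, assuming a bag of numbers from one to a million: draw five values at random and count how often the population median lands between the sample's smallest and largest values.

```python
import random
import statistics

def rule_of_five_trials(population, trials=100_000, seed=1):
    """Estimate the chance that the population median falls between
    the min and max of a random sample of five elements."""
    rng = random.Random(seed)
    median = statistics.median(population)
    hits = 0
    for _ in range(trials):
        sample = rng.sample(population, 5)
        if min(sample) <= median <= max(sample):
            hits += 1
    return hits / trials

# "a bunch of numbers from one to a million"
population = list(range(1, 1_000_001))
print(rule_of_five_trials(population))  # close to the analytic 0.9375
```

The analytic answer, 1 − 2 × (1/2)^5 = 93.75 percent, is exactly the all-heads-or-all-tails argument that follows.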
25:56
Well, think about it. In any population, if I flip a coin
26:07
and it comes up heads, that means I picked a number that's above the median,
26:11
or if it comes up tails, I picked a number below the median.
26:19
Well, then one coin flip can simulate whether the first jelly bean that you pulled from the bag is above or below the median.
26:21
So
26:29
you could argue that if the sample size is one, I have a 50-50 chance that I'm above or below the median, right?
26:30
Now, say I picked two.
26:38
Well, if I wanted the median to live between those two numbers, I would have to draw one number that's higher
26:41
and one number that's lower than the median, right?
26:47
What are the chances of that? We'll do the math. I'm flipping a coin,
26:53
and it turns out, in any random sample of five, the only way the median falls outside the sample is if all five numbers live above, or all five live below, the median.
26:59
That's the equivalent of flipping heads five times in a row or flipping tails five times in a row.
27:03
Do the math: there's a 93.75 percent chance that that doesn't happen. Isn't that interesting? This book down here on the left, How to Measure Anything in Cybersecurity Risk:
27:11
excellent book. You guys see Stuart McClure there, from BlackBerry, and my good friend Dan Geer from In-Q-Tel.
27:18
But
27:19
really kind of a cool concept, right? You can play these games with the numbers, and you can have people speaking very confidently and backing it all up. But what it comes down to is this:
27:33
one of my favorite books in graduate school was How to Lie with Statistics.
27:37
I still think that it all comes down to what it is you're talking about here. You can dazzle with numbers, and you can dazzle with percentages, and you can
27:47
dazzle with statistics. But the question is: did you even make reasonable judgments in the first place?
27:52
Like those values that were sitting in the bag: are they reasonable? I get that maybe there are a lot of them, and you did everything you could to come up with things that are reasonable.
28:03
But again, overall, these Monte Carlo simulations, where I run through and spin the dial and spin the dial and spin the dial, do it a million times using the probabilities that you assigned at the beginning, and just let it run through and sort of
28:21
pick values
28:22
that take into account the probabilities that you coded into the numbers: all right, I'm going to get some sort of a graph,
28:29
and I'm going to come to the conclusion that, overall, our cybersecurity is pretty good, and here's why, blah, blah, blah, and use all those Monte Carlos to support it.
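The spreadsheet exercise being described can be sketched in a few lines. This uses the hypothetical risk statement from earlier, a 10 to 25 percent chance of losing 10 to 15 million dollars in six months, as the guessed inputs; the point is that however many trials you run, the output inherits whatever quality those two guessed ranges had.

```python
import random

def simulate_losses(trials=1_000_000, seed=7):
    """Monte Carlo over a guessed risk statement: likelihood drawn from
    10-25%, and, if the event occurs, a loss drawn from $10M-$15M."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        p_event = rng.uniform(0.10, 0.25)           # guessed likelihood range
        if rng.random() < p_event:
            losses.append(rng.uniform(10e6, 15e6))  # guessed impact range
        else:
            losses.append(0.0)
    return losses

losses = simulate_losses(100_000)
expected = sum(losses) / len(losses)
print(f"expected six-month loss: ${expected / 1e6:.1f}M")  # roughly 0.175 * $12.5M
```

Run it a hundred thousand times or a million times; the smooth-looking distribution that comes out is still just those two guesses, stirred.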
28:38
Like I said,
28:40
the effective CISO recognizes that those estimates are only as good as the underlying guesses. And don't fool anybody: these are guesses, these are not measurements. You know darn well
28:52
that it's a bunch of guesses. I just did a Charlie Ciso cartoon last week
28:56
where I had Mary and Charlie talking, and
29:00
Charlie says, how'd you come up with these incredible numbers? And Mary said, well, I just hit random divide in Excel on a bunch of made-up numbers. And everybody kind of laughed, because they know that's so true
29:11
of the way people do risk. Now, number two.
29:15
You can do all these numbers, you can come up with estimates, you can come up with all sorts of things, but I still believe that in 2020, and we'll see whether artificial intelligence changes this,
29:27
experienced human judgment
29:30
can still, I think, improve risk-related decisions. It's one thing to do the gearhead stuff. Look, I sat on the board of a large bank
29:38
with numbers put in front of me, and I get that there is this
29:42
data-driven approach
29:45
that is not only recommended but is in many cases demanded amongst executive teams. I get that; I'm a data-driven guy, right? We're all kind of gearheads here.
29:57
I'm not avoiding the math.
30:00
I'm saying that human judgment makes a big difference. Here's an example.
30:04
I love this
30:08
picture, right? 50 years ago, we landed on the Moon,
30:11
and this book by Gene Kranz, which you should definitely buy; it's a great book.
30:17
Failure Is Not an Option really is a term
30:21
that I think has become synonymous with NASA at that time. But the reason I bring this up is because there was a scenario that occurred literally
30:30
50 years ago, almost to the day, a couple weeks ago,
30:34
where this panel here, this is the panel
30:38
that Neil Armstrong and Buzz Aldrin were staring at as they were guiding the lunar module
30:45
to the surface of the moon. This was their iPhone screen,
30:52
so to speak. I guess if you held your phone up in front of this thing, it's probably the size of a couple of iPhones, right?
31:02
But as they were guiding and descending to the moon,
31:07
the software
31:07
started blinking an unusual alarm, a 1201 and 1202 alarm. They hadn't known about any of that. Some of you know that my
31:18
earliest job, when I was in grad school, was writing flight software for the space shuttle,
31:22
inertial measurement, inertial guidance software, with my friend Phil Laplante.
31:27
And
31:30
I remember there was some old Apollo software that found its way into some of the real-time executive programs that were being built then. But at any rate,
31:40
those things started blinking a 1201, 1202 alarm, which basically says
31:45
the computer's overloaded.
31:47
It's taking in too much data, too much telemetry, and now it just can't handle it.
31:56
And as they're coming down, really close, Neil Armstrong says, I need a read on that alarm.
32:04
And they went to one of the engineers and said,
32:07
dude, what do we do?
32:08
And again, this experienced human judgment, you see; the read
32:13
was basically, it looks like we're still getting
32:17
inertial data, inertial guidance data, and it seems like things are working.
32:23
This 1201, 1202 had not been a problem during any of the preparation.
32:30
And they thought,
32:30
let's go. Luckily,
32:34
they decided, the human decided,
32:37
take the risk, go, we're going down there to land,
32:39
knowing full well that you could strand two astronauts on the moon; two lives could have been lost. A human being
32:46
made that decision, and it turned out to have been the right one. My guess is, if that had been programmed today and that alarm had popped up, they probably would not have landed.
32:59
So I'll let you decide whether that was right or wrong, and I'll let you decide whether, if you were the programmer,
33:06
you would have made the same call. But that's the decision that was made.
33:09
And you know something? Here's a good one for you.
33:15
What I said,
33:15
Are you the kind of person who
33:17
is late or early? You know, you that person and I'm that person who's always early. So I was thinking of every contingency. I drive my family freaking crazy
33:29
with every contingency for everything.
33:31
I have so many contingencies, and I think it's just drives me crazy sometimes. But let me ask you a question. And I watched an interview with Buzz Aldrin on this New Jersey guy.
33:44
I'm asking, What were you guys thinking about when you landed on the move, you know,
33:49
and everybody sort of laughing and looking going on. It must be an amazing feeling of land
33:53
High five.
33:54
I feel like it's great. You must have just been this incredible sense of relief and best of all, just looked at him like what I'm talking about
34:02
that the instant they landed on the moon,
34:07
they had to prepare for an emergency takeoff just in case. That's what they did
34:12
when they landed.
34:14
They weren't sure if something was damaged. They weren't sure that if they had to just haul *** out of there quickly and that was the procedure not to sit there and relax, but to think of what could have gone wrong here. And do we have to get out of here quickly, then credible. A love
34:34
that story because it's so indicative
34:37
of the kind of thinking that I recommend for all of you. That's my advice:
34:42
that you think that through, that you be that type of person.
34:45
I just paused and looked at the chat. I see some people there talking about frameworks like FAIR and so on. Absolutely. Keep in mind, I'm not here to train you on risk frameworks, so I'm not gonna cover FAIR, but that is a wonderful framework. There's no question that from a risk perspective,
35:05
there are some really good
35:07
conceptual and engineering models that I think you should pay attention to. A lot of vendors use it. I'm thinking of this as more of a leadership course; I try really hard not to do
35:17
the computer security part. I think I'm about 95% successful there. When we were ready to begin, Leif and I were talking, and I said for this course
35:25
I wanted the pictures to be of things like,
35:29
you know, the command
35:30
screen in the lunar module, instead of, you know, the things we see every day. So I hope that's okay. I'm not gonna cover FAIR, but that is a good model.
35:42
So at any rate,
35:44
I love this picture, and I think that's a very useful story, because it does illustrate the way you should be thinking about risk.
35:52
Let's go to the next one.
35:53
Um, you should not be doing reckless gambling or speculation.
36:00
So I like talking about this one after the scenario we just went through.
36:05
When you're sitting in the CISO role and somebody says, hey,
36:08
um, we need to launch. We need to launch the service. But
36:14
none of our audit systems are working. We're going to be collecting no audit logs, no telemetry,
36:20
um, and we won't have any eyes or ears into whether or not we're under attack.
36:24
Should we launch?
36:25
Well, that's a risk management decision, but it shouldn't be reckless gambling.
36:31
Reckless gambling means
36:34
it's just a flip of the coin, totally with no means of controlling it. I can't affect or influence the 50-50 probability that exists when I flip a coin. That's just gambling.
36:49
So risk management is not that. And this is a great book to read, by one of my personal heroes, Bernard Baruch. I love his story. I've read
36:59
two of his biographies, loved his time that he spent helping President Wilson and President Roosevelt. Wonderful, amazing man.
37:07
And I work just a few blocks from Baruch College,
37:12
where I'm setting up a little internship program to bring in some of the students over there. But Bernard Baruch was a speculator,
37:20
a financial speculator. That was his propensity. He was the guy who's a risk taker.
37:28
And yet during wartime
37:30
he adopted a very different stance. He became a risk manager. So you can be a speculative type of person, as he was. He became a millionaire very quickly, at a time when being a millionaire
37:43
probably meant more like being a billionaire today.
37:46
But he's a really interesting man. And you know, if you're like me and you read computer science stuff all the time,
37:53
what a wonderful break
37:55
to have a book like this. Now I remember,
38:01
a year after I was married (I've been married now for 34 years),
38:07
Um, I had this book,
38:08
and my poor wife puts up with my weirdness. We went somewhere on a trip, and I lost this book, the one that I'm showing here, which I was reading 34 years ago,
38:20
and, um,
38:21
so upset. And then I went into a used book store in New Hampshire.
38:27
I was at a security workshop in 1990, I think it was,
38:31
and I found the book. It's not an easy book to find. I'm guessing maybe now there's Amazon; there was no Amazon then. But I remember how happy I was when I found this
38:45
book by Bernard Baruch, written in the sixties or something. He's somebody to look at if you're a risk manager; if you do what we do, these are the kinds of people you should learn from. Amazing guy.
38:54
Next: risk managers must understand and manage worst case scenarios. You know, I already traveled through some of this,
39:01
but my kids love these kinds of books, like the Worst-Case Scenario Survival Handbook. Like, you're sitting on a plane and the pilot, you know, has a heart attack. What do you do?
39:15
But I gotta tell you, that's the kind of game you should play with other CISOs. You should be thinking worst case scenario. And here, of course, is Hurricane Katrina's impact
39:24
on New Orleans. And I think you had way too many city managers
39:30
that had not thought through the worst case scenarios. Listen, let me say this. I want you to hear it. And I want you to memorize this
39:37
When you are the CISO, it is not your job to manage the average scenario. It's your job to account for the worst.
39:45
That is, that's the job.
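That principle can be made concrete with a toy sketch (the scenario names and all the likelihood and impact figures below are hypothetical, purely for illustration): ranking scenarios by likelihood gives a very different priority list than ranking them by worst-case impact, and the CISO's lens is the latter.

```python
# Hypothetical scenarios with made-up likelihood and impact figures,
# used only to contrast average-case and worst-case prioritization.
scenarios = [
    # (name, estimated annual likelihood, worst-case impact in dollars)
    ("Phishing-led credential theft", 0.60, 2_000_000),
    ("Regional data center outage",   0.05, 40_000_000),
    ("Ransomware across all sites",   0.02, 120_000_000),
]

# Picking by likelihood reflects "manage the average scenario" thinking.
by_likelihood = max(scenarios, key=lambda s: s[1])

# Picking by worst-case impact reflects the job as described here.
by_impact = max(scenarios, key=lambda s: s[2])

print("Most likely:", by_likelihood[0])
print("Plan hardest for:", by_impact[0])
```

Run as written, the two picks differ, which is exactly the gap between the business's view of risk and the security leader's.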
39:47
Let others in the business say the chances are we'll be okay.
39:52
That's not your job to say. Well, yeah, I understand that we're optimizing against some sort of a bell curve; maybe the top of the bell curve is the most likely case, okay, optimize your resources for that. But there's also a scenario where things can go very wrong and be very, very, very bad.
40:14
And the job is to figure out how to manage that, how to prioritize, how to account for that sort of thing. And I love, in the chat, Lane's "Chief Impossible Scenario Officer." That's really good. I'm actually gonna have to steal that.
40:27
That's a really good point.
40:29
But, you know, that's kind of the idea here. It is impossible, in a sense,
40:34
to just have all the solutions for something like Katrina.
40:37
But that's the job. The job is: what are you gonna do if something really wrong happens here? I know back in telecommunications,
40:45
um, you know, I spent a large part of my career in telecom,
40:50
most of the telecom companies would have disaster response procedures that would start by flying resources out to the affected place. Like if there's a hurricane or
41:02
catastrophe or some natural disaster, you fly resources there. That's chapter one, paragraph one, sentence one, word one: fly out.
41:15
what would you do on 9 11
41:16
on 9 11 all airplanes are grounded.
41:20
So did you think through that? Obviously didn't take the worst case scenario into account. The worst case scenario is not flat work because you can't fly out because there's no planes.
41:31
So you get the point
41:32
like having that tendency, a willingness to really consider the worst cases
41:38
that is your job. Even if the most likely case isn't going to travel anywhere near that boundary condition
41:45
you hope.
41:46
You know, you don't expect a 9/11 kind of thing every week, but you should be planning for it. That case should be part of the equation.
41:54
Now, the next one:
41:58
risk management starts with organizational culture. I think many of you who do this sort of thing would agree. You might even suggest that this should be the first,
42:07
um,
42:08
first bullet here: the organizational culture really does dictate whether or not you get a good response. And that means trusting
42:15
the decisions that are made
42:19
from a risk perspective. That's what it means. It means trusting those decisions. And look, here's a great book, I hope you've read it, by Michael Lewis,
42:29
one of my favorite authors.
42:30
The Fifth Risk came out recently, and the story is basically that
42:36
he went and visited a bunch of agencies to read the transition plans that they had created for the Trump administration,
42:45
and it gave him an opportunity to really dig into how some of the organizations
42:50
that support the United States operate. Here's an example, one from the Department of Commerce, in fact,
43:00
which manages NOAA,
43:01
and takes this picture that shows a hurricane brewing,
43:07
you know, off the southeast coast of the United States.
43:09
Um, and it's a lovely book, an easy short read. I think some of the chapters were printed in The Atlantic, and Michael Lewis is such a great writer, whether you like him or not. And I know there's some politics involved in some of this; it's sort of critical of Trump.
43:27
Step aside from that; this really is not a political book. It's a celebration
43:34
of the civilian agencies in the U.S. government,
43:38
and what an amazing job they do in managing risk. And it's all about culture. If you work at NOAA, you know, that's the organization that does,
43:45
essentially, the weather.
43:47
You know what the culture is like there. It's a culture of data, a culture of accurate interpretation of trends, and a culture of trusting your eyeballs. Like, look what's happening in this picture. I mean, it's pretty obvious you've got an issue here. If you
44:06
play the time-lapse
44:07
movie back
44:09
10 hours,
44:10
then that eye of that hurricane is off to the right somewhere.
44:15
And we all know that if an organizational culture is flippant,
44:19
it might say, ah, who knows what's gonna happen. But if you follow the data and study the trends long enough, you know exactly what's gonna happen. And that's part of the culture of an organization. Not everything is as obvious as this. But if you see trouble brewing with human-resource-related
44:37
issues in one of the business units,
44:39
and something doesn't seem right, and you know that things are not right and they're heading in a direction where it looks like it's gonna explode on the coast of your company,
44:50
do you take action or not? Well, tell me about the organizational culture, and I'll tell you whether you took action.
44:55
and that's the point.
44:57
So to do this properly, you really do have to have a culture that understands and trusts
45:04
the guidance that comes from the people who are managing risk. I think that's absolutely essential.
45:09
Look at this one: risk transfer has always been an important management option.
45:16
Well, you know what that means today. It means this, right?
45:22
Insurance.
45:22
Um,
45:23
Look here on the left: The Invisible Bankers, written in the eighties. It's an explanation of the insurance industry. I may have shown this to you earlier in the course.
45:34
I think it's a must read. Um, Andrew Tobias, the financial writer,
45:38
who is still active today, wrote a really cool book called Getting By on $100,000 a Year. The title was a joke.
45:45
It was like, who could ever say
45:47
they're just getting by on $100,000 a year? In the eighties, it would be like saying you're getting by on $1,000,000 a year today.
45:55
Well, you know, with inflation now, it is tough to get by on that for a lot of families. Where Mom and Dad both work making 50K each, with a couple of kids,
46:07
they're probably struggling at 100,000. Anyway, he wrote that book, but The Invisible Bankers was a description of the insurance industry that I think is still spectacularly good.
46:16
He goes back and explains the history, and I put this picture here. This is a carved tablet of the Code of Hammurabi, which included certain codes around insurance for sailors who were venturing off and would, you know, pay a little extra
46:34
in case something went wrong, so that their loved ones would be taken care of. Insurance. That's where insurance comes from: these basic decisions around
46:45
how to account for risk, how to manage it and, in this case, how to transfer it.
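Since the lecture frames insurance as risk transfer, here is a minimal, hypothetical sketch of the underlying arithmetic (the breach probability, impact, premium, and payout cap are invented numbers, not actuarial guidance): transfer starts to look attractive when the premium falls below the expected loss the policy would actually cover.

```python
# Toy model of a risk-transfer decision. All figures are invented
# for illustration; real cyber-insurance pricing is far more involved.

def expected_annual_loss(probability: float, impact: float) -> float:
    """Chance of a breach in a given year times its estimated cost."""
    return probability * impact

def transfer_looks_attractive(probability: float, impact: float,
                              premium: float, payout_cap: float) -> bool:
    """Compare the premium to the expected loss the policy would cover."""
    covered = expected_annual_loss(probability, min(impact, payout_cap))
    return premium < covered

# Hypothetical: 5% annual breach chance, $4M impact, $3M payout cap.
print(expected_annual_loss(0.05, 4_000_000))  # 200000.0
print(transfer_looks_attractive(0.05, 4_000_000, 100_000, 3_000_000))  # True
print(transfer_looks_attractive(0.05, 4_000_000, 200_000, 3_000_000))  # False
```

The point of the sketch is only that the decision is a comparison you can reason about, not a coin flip, which is the speaker's distinction between risk management and gambling.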
46:50
So, for cybersecurity teams: you need to understand what risk transferral means.
46:55
You need to understand how it's done, and also the business relationships and business decisions around it. Like, I've said this before and I'll say it again:
47:06
as long as CISOs don't have to pay for insurance, they're always going to be for it,
47:10
right? If you have an insurance group that buys your insurance
47:15
Oh,
47:15
hey, buy as much as you want, I'm all for it. You know, nobody's going to be against
47:22
um, buying cyber insurance
47:23
if you don't have to pay for it.
47:27
But as soon as it starts maybe hitting the
47:30
the budgets of security teams, then maybe at that point you might see some differences
47:37
in whether, you know it's just a given that you're going to go ahead and pay the money
47:42
to an insurance company
47:45
to transfer risk over to them. That may not be an obvious given, the way it sort of is today. So take some time and understand risk transferral; in the context of cybersecurity, that means insurance, and that means reading books like The Invisible Bankers, so you have more than just a
48:02
little baby surface understanding of what's going on. Look, the reason I show you these books
48:08
is because I think you have to travel around an issue to understand it.
48:13
You can't just bullseye something, get a briefing on it, and be an expert.
48:19
You have to travel the issue. You have to nibble around the edges. You have to look at the adjacent issues. You have to read some background. You have to reflect
48:29
on its history. Then you understand something, maybe 5%. That's why it freaks me out when anybody considers himself an expert in anything. I've made a life of trying to understand cybersecurity, and I'm at the point now where I'm willing to say I think I have some expertise. But anytime I give a talk, the first
48:47
thing I always want to say is
48:49
God, there's probably 20 people here who know more about this than me. Because it's so hard to really become an expert in anything,
48:57
an extremely difficult thing to do,
48:59
I'm such a non-expert in everything I can think of except cybersecurity, which I've spent four decades trying to understand. For insurance, you're not going to be an expert, but you can travel it. You can learn enough so that you can make some valid decisions.
49:14
Last one here,
49:15
risk can lurk
49:17
in unexpected places. Right.
49:21
Um, and how you handle that,
49:23
I think, is interesting. It can be reflective of your integrity, your judgment, your
49:30
style. Here's one of my favorite examples, from some time ago. This is right off the T-Mobile website, you know, from a few years ago.
49:38
Um, T-Mobile wrote a letter to consumers titled "T-Mobile CEO on Experian's Data Breach." Now I want to read that title to you again:
49:47
T-Mobile CEO
49:50
on Experian's data breach.
49:52
And
49:52
it says in the second paragraph, we have been notified by Experian,
49:58
a vendor that processes our credit applications, that they have experienced a data breach.
50:07
Then at the bottom: obviously I am incredibly angry about this data breach, and we will institute a thorough review of our relationship with Experian. Now go back and think about this. Say you're a T-Mobile customer.
50:19
Do you know anything about Experian? Do you care who they are? Frankly, do you give a *** who they are? You signed with T-Mobile. If they want to do credit applications with a third party, then go for it. If they want to do credit applications internally, that's just a business decision. If they said, we've been notified by our T-Mobile business unit
50:38
that does credit applications that they experienced a data breach,
50:42
and I'm really annoyed that this business unit had this data breach, that's not really T-Mobile, it's just a business unit, you'd say that was ludicrous.
50:50
So the risk lurked in a third party. I totally get it here. I mean,
50:55
I can understand if you feel his pain here. I get it.
51:01
But this isn't the way it works, right? I mean, if your third party gets hacked and then your customers are hit as a result, that's your data breach. This should be "T-Mobile CEO on our
51:13
data breach," not Experian's. I think it's an interesting thing, when the risk lurks in an unexpected place like a third party.
51:21
It's an interesting kind of reveal
51:23
when some sort of decision is made either announcing something about the breach or how you manage it or how you respond to it or how you,
51:34
in some sense, you know, uh, deal with the impact of that breach. I think it reveals quite a bit about the executives, about the culture, about the team, about everything. So
51:45
So that's something you should be doing: looking for scenarios like this and trying to understand whether or not they
51:53
are, you know, instructive in the way this stuff goes on. An interesting footnote here at the bottom of this letter, which I've cut off:
52:02
um, and again, I'm not picking on T-Mobile. There may be some people here who work for T-Mobile, and I spent my life in the same industry as you, and I take no pleasure
52:15
in problems; every company in telecom has had issues. So this has nothing to do with
52:22
T-Mobile specifically. But, uh, as you always see in these kinds of breaches, some identity protection service was offered, and when you click on it,
52:31
it said the service is brought to you by, guess who:
52:36
Experian. So I guess I thought it was just
52:38
an interesting way
52:40
of handling this sort of thing. The picture of the guy up here, he's a creative guy,
52:49
and I guess he's gonna try to merge with Sprint now. A creative person.
52:52
But I thought in this case, maybe not one of his better days or better decisions in handling this third party risk.
53:00
Let's go to our case study now. I had some fun writing this one.
53:06
So here's what the case study for Lesson 11 really amounts to.
53:13
We have,
53:15
you know, our hero talking to somebody, Jeffrey,
53:19
who works for a small company in Philadelphia. He says that they design and create compounds; they make chemicals. I made them organic because,
53:28
well,
53:29
you know, it just seemed like that'd be more consistent with the story here. So they make these organic compounds.
53:36
And he says, you know, we're in an interesting business. We make these aromatics and we sell them, blah blah blah. Nice two-factor authentication,
53:45
data stored securely and encrypted. Pretty standard stuff. And then:
53:50
well, one of our chemists last year was combining a bunch of things. I had him, or her, combining a bunch of acids, you know, presumably doing some testing on something or other. Just because you're developing organics doesn't mean you wouldn't be using
54:07
some dangerous other chemicals.
54:08
This person was combining things, and they made this
54:13
crazy explosive that blew the lab to bits, literally blew the place up.
54:19
And going back and reviewing what happened, they realized
54:23
that they had accidentally created something interesting, like a bomb. Somebody called the Department of Homeland Security and DoD, and they said, hey, come take a look at this, and they reproduced the
54:37
explosion. And the military folks were actually quite interested and said, wow, this is really interesting. We could make this cool bomb.
54:45
And the security guy, the CISO, protests a little bit and says, hey, you know, we have some pretty serious security problems here.
54:53
We're gonna be developing weapons,
54:57
you know, bombs. That's a little different than aromatics for perfume.
55:01
And, you know, he was really pushing back here. And he says that what happens is, um,
55:09
you know, they did, in fact, start to see some very unusual activity coming in.
55:15
Um, you know, and the guy tells his bosses, we need to really dig into this security problem. And the boss gets furious and says, you know, I'm gonna bring a third party in. You just step away; there's an investigation with the FBI. You're hurting our business.
55:31
What's going on here is a new line of business, and you're creating nothing but
55:35
trouble.
55:36
And at that point, you know, to make a long story short: what is the issue here? What does the CISO do? You've all been in these scenarios where you smell trouble brewing in an area where the business is very excited to see new revenue.
55:52
You've got this guy, or gal, running an aromatics company. Suddenly he's got DoD contracts that look like they could be worth more than the whole company,
56:01
and the security person is freaking out: hey, there are some problems here, there are break-ins, there are issues, there's this, there's that.
56:08
This does not look like a good situation.
56:12
So the CEO responds by saying, we're getting rid of you; we're bringing in an MSS. So
56:19
this is an interesting one because this is a case where you see a senior leader deciding not to worry so much about risk
56:27
and just forging forward because there's revenue. So it's a good one for you to sit down with your team and try to go through. What would you do? Have you ever been in a situation like this? Would you stay in the job?
56:37
Was the CEO being reasonable?
56:40
You know, do you think senior management has a good understanding of risk? And is this the kind of thing where an external MSS would maybe be better? Maybe you're gonna get the same sort of judgment from an MSS, even though my observation
56:54
is that an awful lot of MSSes sell,
56:57
um, a one-size-fits-all solution without really digging into the underlying risk.
57:02
They manage devices, they collect telemetry, they notify you of alarms. And whether you are a chemical company or a comic book company, it's gonna look the same.
57:14
So that's why I made this an MSS, because I do know that most MSSes are not gonna be providing risk-based services.
57:21
So I hope this is a useful one for you. And, you know, just to sort of close here at the top of the hour:
57:28
I hope that my thoughts here
57:32
on risk are important for you,
57:35
because you really do have to decide, you know, are you a risk taker or a risk avoider?
57:39
Even if you're a risk taker, that's fine. You can still be very effective in the role, but recognize that the job, the CISO job, is primarily a risk management activity.
57:51
It's probably more conducive to somebody who's extremely nervous about worst case scenarios and tries to avoid risk as much as possible. And while you do trust the data, you also factor in your human judgment based on experience.
58:06
So those are my thoughts on risk. I'm sort of monitoring the chat a little more this week. I see, you know, a couple of things here.
58:13
Um, I think we're in good shape. Posting links to the course materials? Absolutely, we can do that.
58:19
And thanks to some of you who are confirming some points. Also, we're gonna skip next week, but make sure you come back in two weeks, because that's the week we talk about leadership.
58:30
And I think that is the most important competency of the 12.
58:35
And if you had to pick one, that's the one that is the most essential. You have to have
58:40
leadership capability to be a CISO. You cannot just be a manager; you have to also be a leader. So everybody enjoy Black Hat and DEF CON next week,
58:50
um, and we will see you in two weeks. Everybody have a wonderful couple of weeks, and we'll talk to you then. Thanks.

CISO Competency - Risk

This is the eleventh course in Ed Amoroso's Twelve Competencies of the Effective CISO, which focuses on the CISO Competency in Risk Orientation. Developing a complete risk structure and framework for enterprise security prioritizes safeguards, minimizes expenses, and maximizes support and mitigation for business operations.

Instructed By

Ed Amoroso
CEO, CSO, CISO of TAG Cyber
Instructor