Time
1 hour 2 minutes
Difficulty
Advanced
CEU/CPE
1

Video Transcription

00:04
Hello and welcome to Episode Nine: Competency Nine, Threats, of the Twelve Competencies of the Effective CISO. With that is Ed Amoroso. Super excited: this is our ninth session. Just a quick reminder that we're going one week longer. Thank you, Ed, for allowing us to do that.
00:22
The twelfth session will be not next week but the week after, and then the week after that, so you'll be receiving an email about that. Also, for those of you that were interested in the XP tool, Ed and I are in discussions on that, and we'll be in touch very soon. Thank you for your interest; several of you have reached out. With that, Ed, super excited for this session. Take it away.
00:50
Okay, so let's start with a story. About two years ago. Let me think about that: '16 was, what, three years ago?
00:59
I was asked to give a presentation to a board here in New York City, and I was going to talk tech. The request was that I cover cybersecurity technology in terminology that the board members could understand. I've been a board member, so the presumption was that I would know how to do it.
01:22
That's the right presumption. But at the last minute, they asked if I could give up half my time to a government employee, a retired government employee, who could talk about threat. And he got up and gave an amazing presentation on threat for 30 minutes. The reason I bring it up is because of the second half, while I was speaking: it struck me that the board had been less interested in my explanation of technology and much more interested in his explanation of threat. I think you're going to find that this competency, being able to
01:57
understand and provide good insight into threats, is really important. And you're going to find that at the senior-most level, the thirst for information related to threat (cyber threats, threat actors, motivations, foreign intrigue, political drama, that kind of stuff) is unending; the appetite grows the higher you go in an organization. And I've found that as you go lower in an organization, the appetite is generally much more appropriate. People who work in the security operations center tend to have a good, rational kind of balance in their need for, and their interest in, threat.
02:40
But as I said, once you get up to the board level,
02:44
you're going to find that people have this unending thirst for more. It's almost like a gossipy kind of thing: why would somebody be attacking us? Where are they from? Can you show me a picture of them? And they'd like to see a picture of somebody looking terrifyingly scary so they can all gasp. That's the presumption, and we all know that, for the most part, it's just a lampooned version of the kind of threat information that really matters. So, as we always do, a statement here of the obligation
03:15
that you will have as a CISO, our ninth obligation. And the fact that this is ninth doesn't mean it's not the most important; it's just number nine because you have to enumerate them somehow. But this is an important one: the effective CISO maintains accurate, realistic insight. That's important. You're not just some gabber sharing a bunch of lurid details about threats and threat actors. Rather, you're able to provide good insight. That's the job. You have to be the person who can do that. And that's a skill that requires some practice. In particular, it requires some restraint, because it's up to you to make sure that you're not tossing a bunch of silly hyperbole around.
04:03
Now, I've always taken the approach of trying to provide information in the context of teaching. I always felt, with boards that I've dealt with, that when they're interested in threat, I want them to learn. I want them to understand the context of the threat, and that requires maybe a little bit of history, so that they can recognize that cyber threat has evolved.
04:30
And it's evolved from a very, very humble, in some sense almost non-cyber-related origin: really, from this guy, Abbie Hoffman. I alluded to Abbie Hoffman earlier in our lectures. But Abbie Hoffman, really, in my mind, is the father of the cyber vandal. In a sense, that book, Steal This Book, is in my mind the template for just about everything that's come since. It's certainly a template for 2600 magazine and some other hacker quarterlies that I'm sure a lot of you are familiar with, because it was playful and it was mischievous and it was interesting. I still have an early-edition copy of the book, and I have some more recent editions, and they're just delightful plays on essentially taking advantage either of norms or protocols or systems that are in place.
05:28
I'd mentioned at one point that Abbie Hoffman was the guy who said: instead of putting your mother's address in the 'to' field and your address in the 'from' field and putting a stamp on it (that's how you mail a letter to Mom), just put your mother's address in the 'from' field, put your address in the 'to' field, and don't put a stamp on it, and it will get returned, postage unpaid, to Mom. So you've accomplished the same thing without having to spend any money on a stamp.
05:58
Take a minute and think about how you'd solve a problem like that. By the way, it's not as easy as you think. Do you block it? Do you toss it? Whatever. I think the way the Postal Service deals with that is that they hold it, delay it, and then eventually contact you that something's happened. But again, you wonder: is all that time and effort and money and process worth whatever stamp was not purchased? It's a complex argument. The reason I have this picture of a lunch counter is because Abbie Hoffman is the guy who said,
06:36
and this plays right into hacks of the TCP protocol. A lot of times, when I'm teaching non-technical folks about TCP/IP and how you hack it, I'll often hearken back to this hack from Abbie Hoffman, where he says: have your friend sit at one of the chairs, down where you see the little pie plate in the picture, and then you sit a little closer over here, maybe where this little glass case is, about three or four chairs away.
07:06
You order coffee, and your friend should order a full meal. You're obviously separated, but you're close enough that you might actually be able to reach out and touch each other, right? You're close enough that there's some proximity, and you hope nobody sits between you. He's got, you know, a full meal; you sit there sipping coffee, and you make sure that the counter person knows that you've only had coffee.
07:33
And then, when it comes time for your friend to pay his check, what he does is he gets the check, walks over to you, hands you his check quietly, and you give him your check, which you've asked for: you know, whatever coffee is. And then he goes up to the counter with your check, pays the dollar or two for the coffee, and then scoots out. And you sit there and dawdle a little bit, your coffee in front of you, and wait until your friend is long out of the parking lot and, you know, a few blocks away.
08:07
And what you do then is you raise holy heck. You say, 'Hey, I only had coffee!' And the person says, 'Oh my gosh, you got the wrong bill, you got the wrong check.' They might run out to the parking lot; even if they found the guy, he could say, 'Oh my gosh, I'm so sorry, I didn't even look. Somebody gave me the wrong check.' He could get away with it with total impunity. The bottom line is, you're probably going to get away with it: you're going to pay for the coffee, your friend's going to pay for the coffee, and then you go to another lunch counter and you switch places.
08:46
So it's a wonderful hack of a protocol. And you get the idea: the book is filled with things
08:58
that, from a threat perspective, help you understand the early kind of threat that we dealt with: a bunch of deadbeats trying to rip us off. Abbie Hoffman was kind of a revolutionary, and he would have said he was trying to bring the government down, but I don't believe him. I think he was just kind of being a little bit hippie-ish, and he became sort of the patron saint for this type of thing. This is
09:26
Kevin Poulsen,
09:26
who now writes for Wired magazine.
09:30
Kevin Poulsen was a serial hacker, and this book about him by Jonathan Littman is spectacular if you've not read it. It's a little bit out of date (it covers roughly the early nineties), but I still think it's wonderful. So Kevin Poulsen was a hacker in California.
09:46
And
09:48
You may know that the Bell System, in the mid-eighties, was broken up into a bunch of different pieces. The local Bell operating companies separated off from AT&T, and one of the local Bell operating companies was a company called PacBell, which was essentially the phone company for California and a couple of other states.
10:11
What Kevin Poulsen was doing with some of his friends was breaking into PacBell and learning how the systems worked. They were actually, literally, climbing in the window of the PacBell building, stealing manuals, and leaving snarky notes for the security team. That kind of hits home for me, because you'll know that I basically was in that position for quite some time, so the idea of a hacker leaving me notes on my desk would have been a little chilling. But at any rate,
10:41
one of the things he figured out was how the 800-number translation system worked. As many of you know, when you dial 1-800-MATTRESS, that number actually corresponds to a real phone number, with a real area code and so on. And there's a translation system, a system that takes 1-800-M-A-T-T-R-E-S ('leave off the last S for savings') and translates it to the real number. And Poulsen knew how to get into that system.
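To make the mechanics concrete, here is a small sketch of what an 800-number translation lookup does conceptually: vanity letters map to keypad digits, and a database maps the toll-free digits to the real routable number. The translation table and the target number here are entirely made up for illustration; the real system is telephone switching infrastructure, not a Python dictionary.

```python
# Conceptual sketch of 800-number translation: a vanity number like
# 1-800-MATTRES is mapped to digits via the phone keypad, then looked
# up in a translation database that returns the real routable number.
# The table entries below are fictitious.

KEYPAD = {c: d for d, letters in {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}.items() for c in letters}

# Hypothetical translation database: toll-free digits -> real number.
TRANSLATION_DB = {
    "8006288737": "+1-212-555-0142",  # 1-800-MATTRES (invented target)
}

def to_digits(vanity: str) -> str:
    """Convert a vanity number like '1-800-MATTRES' to plain digits."""
    out = []
    for ch in vanity.upper():
        if ch.isdigit():
            out.append(ch)
        elif ch in KEYPAD:
            out.append(KEYPAD[ch])
    return "".join(out)[-10:]  # keep the last 10 digits (drop the leading 1)

def translate(vanity: str) -> str:
    """Look up the real number the toll-free number routes to."""
    return TRANSLATION_DB.get(to_digits(vanity), "unassigned")

print(translate("1-800-MATTRES"))  # -> +1-212-555-0142
```

The point of Poulsen's hack is exactly this: whoever can read or write that translation table controls where the call actually routes.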
11:20
So sometime in the early nineties, there's this wacky radio DJ, Rick Dees. Remember that weird guy who did 'Disco Duck' back in the seventies? He had a call-in program on the radio in the nineties, a really cheesy thing: 'Hey, you, the tenth caller!' You know, one of those kinds of guys. And he was giving away a Porsche (you can see the picture here, the Porsche), and it was like, tenth caller on such-and-such a day wins the Porsche. All of L.A. was going nuts; everybody was excited.
11:48
So in the minutes leading up to the call-in, Kevin and his friends broke into the PacBell system, and what they did is they diverted the 800 number for the radio station to some other number, to garbage, and they could see the real number. So they knew the real number, and when it came time to call, Kevin Poulsen and his friends (but mainly Kevin) called the number once, called the number twice, and so on, and then, tenth call: 'Hey, caller number ten!' Kevin Poulsen. He wins a Porsche.
12:22
And the reason he got caught, I guess, is because he did it a second time. You win two Porsches, and the FBI gets a little suspicious. And the really funny thing is, he was on the run (the FBI was after this guy), and there was this NBC show, like Unsolved Crimes or some such thing, and they had an 800 number you could call if you had information about Kevin Poulsen. And the joke is that he apparently broke in and diverted the 800 number for the call-in show that people would use to phone in information to the FBI about him. It's really quite funny.
13:03
Now look, he's a hacker, and he belongs, in some sense, in a situation where justice is brought. But Kevin Poulsen was not trying to bring the country down. In fact, as I said, he writes for Wired magazine. He's a good writer; he's a responsible citizen. He did these things as a kid. So when you're thinking threat, this kind of thing is a problem. If you run the radio station, it's a gigantic problem to have a bunch of kids win the car by diverting the phone system. If you work at PacBell, it's a terrible embarrassment that you can't control your systems.
13:39
But it's not going to be the kind of threat that causes society to go off the rails. So again, when you're explaining to a board or to an executive team, you add the nuance to make sure that they understand: these are threats, these are security problems that must be dealt with, but the intensity stops somewhere; there's a threshold. Kevin Poulsen, if he knew, for example, that someone was going to die as a result of what he was doing, I'm pretty sure he wouldn't have done it. He was a teenager winning a car. I could picture myself doing something that foolish before my brain caught hold of reality and I got some judgment. So it's very important to keep that in mind.
14:30
Let's keep going now. Some number of years before Kevin Poulsen, there's this wonderful, wonderful woman, Dorothy Denning. Here's a picture of her. She had written the first cybersecurity textbook, called Cryptography and Data Security, written in 1983 or '84 or something. I remember, because I think I wrote the second textbook in cybersecurity, in '93. When I wrote the proposal to do it,
15:03
they said, 'Well, we already have a security book,' and it had been about ten years earlier. So the review board had to decide whether ten years was a sufficient duration to have a second, quote-unquote, book on cybersecurity. Dorothy had written the first. But at any rate, she got herself into a little bit of a sticky situation with the Clipper chip, which NSA was designing and AT&T was going to build some phones to, where you had this idea that you would escrow a piece of the key, so that if the government decided it wanted to eavesdrop on a captured conversation, to retrieve content from a conversation, they could have a judge give them the okay, and then they'd go retrieve the escrowed key.
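The escrow idea can be sketched in a few lines. This is a simplification for illustration, not the actual Clipper design: it just shows the split-key concept, where a key is divided into two random shares held by separate escrow agents, neither of which reveals anything on its own, and only both together recover the key.

```python
# Sketch of the split-key escrow concept behind Clipper: a key is split
# into two random XOR shares held by separate escrow agents. Either
# share alone looks like random noise; XOR-ing both recovers the key.
# This is a toy illustration, not the real Clipper/LEAF protocol.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares for two escrow agents."""
    share1 = secrets.token_bytes(len(key))                 # random pad
    share2 = bytes(a ^ b for a, b in zip(key, share1))     # key XOR pad
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    """With authorization, combine both escrowed shares to get the key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

key = secrets.token_bytes(16)
s1, s2 = split_key(key)
print(recover_key(s1, s2) == key)  # -> True
```

The policy fight was never about this math, which is simple; it was about who holds the shares and under what process they can be combined.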
15:54
That was Clipper, and Dorothy actually took the pro side of that. She thought it was a good idea, and she got destroyed in the hacker community; they really went nuts on her. And I was really mad, because I thought, you know, I don't think they realized who they were dealing with. This is the inventor
16:14
of intrusion detection, basically. Her IDES model in 1986 was the first observation that behavioral analytics on audit trails could be used as a means for detecting patterns or anomalies. She's just amazing, one of my heroes.
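Denning's core observation can be illustrated with a toy example. This is not her actual IDES statistical model, just a minimal sketch of the idea: build a profile of normal behavior from audit records, then flag activity that deviates too far from that profile. The audit counts below are invented.

```python
# Minimal sketch of the IDES idea: profile a subject's normal behavior
# from audit records, then flag observations that deviate too far from
# the profile. Real IDES used richer per-subject metrics; this simply
# applies a mean/standard-deviation threshold to daily event counts.
from statistics import mean, stdev

def is_anomalous(history: list[int], observed: int, k: float = 3.0) -> bool:
    """Flag 'observed' if it lies more than k standard deviations
    from the mean of the historical audit counts."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) > k * sigma

# Hypothetical audit trail: failed logins per day for one user.
baseline = [1, 0, 2, 1, 0, 1, 2, 1]
print(is_anomalous(baseline, 1))    # ordinary day -> False
print(is_anomalous(baseline, 40))   # sudden burst of failures -> True
```

The same shape of reasoning, with far richer features, underlies the behavioral-analytics products that security operations centers run today.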
16:30
She wrote a review of my first book, and it was the greatest moment of my life: she said she liked it. But this book, Information Warfare and Security (you should go look at this book), was written in 1998, which means it was three years before 9/11, and man, does it get to it. It talks about the problems that are likely to come from nation-states. She was writing this 21 years ago, and it could have been written five minutes ago. This is a good one to get; if you don't have this book on your bookshelf, go get it on eBay. Again, remember I told you that the skill here, the CISO skill, is to have insight and judgment.
17:11
So it's not just sitting there watching the CyberWire and being able to read back what you heard. Insight means you have depth, which means you read books like Dorothy Denning's book, which means you realize that half the stuff she talked about 21 years ago is essentially the same as the stuff we're talking about today. And once in a while, it's not. Once in a while, some things are new, and those are the things where, when you're sharing with your CEO, you're able to say: that is new, and this other stuff is not. That's why the history is so important. That's why this habit of reading is so important. You guys probably get sick of all these book recommendations; I've kind of lost count.
18:00
I guess we've probably made at least 50 book recommendations so far; we should probably get a little coupon from Amazon or something. But in this book, she talks about how nation-states are likely to utilize information warfare, either as a complement or as a direct means for engaging with their enemies. And I think it's one of the most important books that I'm aware of in cybersecurity, because she's the first one, really, to lay this out. Yeah, I know there are other guys who have written big fat books about information warfare and all that, but set those aside. This is the real stuff. Dorothy Denning
18:44
is a computer scientist, and she can separate the real from the hyperbole. And if you asked her today, she would say that the thing that's super interesting now is not so much cyber information warfare, but rather information meddling on social media, making use of forums to spread misinformation. She would tell you that that's the kind of thing that's difficult, because back when she wrote this book, it was confusing stuff. It was messy; you didn't understand it; it wasn't well put together; it wasn't on the news every day. It was her trying to make sense of it and proposing structures. That's where we are today with things like election tampering. If you say, 'Gosh, that's so confusing, there's no clear structure, it's not like the CIA model of cybersecurity,' well, the reason is because it's new. And if you go back far enough, cybersecurity was new, and it was just as confusing.
19:42
So Dorothy Denning is somebody to whom we all owe a debt of gratitude, because she's the first one, really, to bring this to our attention, and we'd be a lot further behind where we are if we hadn't had her help.
19:56
Now, a sort of modern person to look at is, you know, Krebs. I know Brian Krebs; I think he's wonderful. I feel like the guy carries the weight of the world on his shoulders, because he's become the go-to person for reporting problems surreptitiously. He's really an interesting guy. When you call him, it sounds like there's all kinds of noise in the background, and he's carrying on ten conversations, and the phones are ringing; it's like you've called The New York Times newsroom.
20:29
And I don't think he has a lot of people around him; I think he's just busy. But he's wonderful, and there's a good book, much more up to date, on the cybercrime stuff. You see that thing there, the card skimmer? That's a physical piece of plastic that gets inserted into an ATM, and it's visual. I've found that visual stuff works well with boards and executives. I got a call from the producers of the Today show a couple of years ago, and they said, 'Oh my God, we're doing a piece on card skimmers. Can you get us one?' And I go, 'Yeah, I guess.' They wanted this thing, so we went on eBay, bought it, and gave it to them, and that was kind of funny; they could have done the same thing.
21:17
But what was particularly interesting is that we could go on eBay and buy one. It wasn't that big a deal to get a card skimmer that we could give to the producers of the Today show for a segment. Whether it was working, I mean, I didn't test it. The thing comes in a box, it looks like this plastic thing, and we gave it to them. You could have made it on a 3D printer and I wouldn't have known the difference, because we didn't test it. But I'm guessing it was real. We paid maybe 50 bucks or something like that; it was certainly less than 100.
21:49
But that thing, look at that: it's visual evidence that it's relatively easy to do certain types of tampering in a way that can lift your credit card, lift your credentials, lift your information. So keep that in mind as you're sharing information about threat, in particular with senior executives. They love something visual that they can see or touch or pass around the boardroom, so you want to develop the habit of being somewhat of a showperson. So that's Krebs. Now, this guy's an interesting one: Daniel Ellsberg. If you've not read the Pentagon Papers, then you and I don't share a generation. But this was
22:38
one of the most important books of the seventies. So Daniel Ellsberg is a whistleblower, an insider. He had access to the inner workings of what was going on in the Nixon administration. I mean, he's the guy who was taking this information about war estimates and other kinds of things that he thought were being cooked, and he would make copies of them and share them with reporters, illegally. This was classified stuff, and he decided, the hell with it, I think it's important that I do it. So he did it.
23:15
And as a result, he's viewed today by most people as a pretty reasonable guy at worst, and as a superhero at best. If you go to some hacking conferences, they will bring Daniel Ellsberg out onto the stage to a standing ovation, where he's seen as sort of the patron saint of people like Julian Assange and Chelsea Manning and the others who
23:48
have taken it upon themselves to do this sort of thing. Now, look, I'm not making a value judgment here; it is a threat to your company. I mean, you would hope that if you're breaking the law, and someone sees that there is law-breaking, that that person does blow the whistle. There are whistleblower laws that protect people like that, and shame on you if you're the one breaking the law; that's not reasonable. But a lot of times it's not so clear, right? What happens if somebody has got it totally wrong, and just from their vantage point they see one thing, and it's actually something very different? I think everybody in the room listening to me knows what a SCIF is.
24:30
Suppose you're working in a bank and you stumble onto this SCIF, and maybe there's a reason the SCIF is there; maybe it's a way for the government to be sharing threat information back with you. Most of you have probably had that experience. Let's say you have that, and then somebody stumbles onto it, sees that it's this locked room, somehow gets in, sees all the super-secret stuff, and goes, 'Oh my God!' Takes a bunch of pictures, calls The New York Times, and says, 'I found a secret room in our bank.' And then there's a headline: Bank Has Secret Room.
25:08
Is that reasonable? Or do we just say, 'My God, it's a SCIF. Every company has a SCIF, and you can't talk classified if you don't have one,' and so on. You get the point. So it gets a little funny. That's why I think you need to recognize that insiders are the most complex, and that's why I like the idea of you as a person with some insight into what's going on in threat: this is you explaining it to your team and to your customers and to your board. I'm just giving you examples here of some of the people who have helped us understand the dimensions of threat.
25:47
But the habit here, the competency for the CISO, is the ability to explain these types of threats: explain the history, explain the context, differentiate between what's a real problem and what's not a real problem. If you're a highly ethical corporation, and you treat your people well, and they're managed fairly, and you do everything you can to be an ethical and amazing employer, then, I don't care what anybody says, your insider threat will be less than that of someone who does the reverse of all those things. Now, I didn't say zero; I just said less. So the best way to deal with disgruntlement amongst insiders seems to be to be a great employer. But no, you can't always control that.
26:37
Now, compromise is a different story. You're a wonderful employer, a wonderful company, but you're developing some super-important critical chemical for something, a bomb or something. Well, then there's going to be the possibility of compromise, where somebody is planted into your environment to steal that stuff. And that's not going to hinge on whether you're nice people or not; that's going to hinge on the criticality and asset value of the resources that are available in your company.
27:07
So again, that's the concept here with Ellsberg. Nixon was breaking the law, so we look back on it and we say, 'Thank you, Daniel Ellsberg.' But if somebody today decides that Nancy Pelosi or Donald Trump is breaking the law and leaks to the Times about it? We're so polarized as a nation that half the country thinks it's terrible and half thinks it's great. It's a very complex situation. So as you're explaining insider threat to your team and to your board, keep that in mind, and keep the Ellsbergs in mind.
27:41
And this guy is a longtime friend of mine; I want to tell you a couple of stories about him. The Looming Tower is a book about Al Qaeda. The reason I have it here for Dick Clarke is that it was made into a streaming series, and Clarke is played by one of the actors in the series, along with Ali Soufan and other people who at the time were dealing with Al Qaeda.
28:07
But when Dick Clarke was working for Bill Clinton, he'd essentially laid out PDD-63, the basis for what we would now consider modern critical infrastructure protection. He kind of morphed into becoming the cyber czar, and I'm sure he did the terrorism czar, both under Clinton and then later under George W. Bush. And under George W. Bush, something happened, I'm not sure what, but he sort of fell out of favor, either with Condoleezza Rice or whomever, and I sense that he was pushed over to do cybersecurity almost as a way of pushing him into something nondescript.
28:56
But later on, he'd written a book very critical of the George W. Bush administration, and I remember Dick Cheney going on Rush Limbaugh's program, and I listened, and they were talking about Clarke and the fact that Dick had been very critical of the administration. Again, my political views are irrelevant, but what I found galling about the conversation is that they lampooned the cyber assignment that he had been put on. Rush Limbaugh, talking to Dick Cheney (I don't do a very good Rush Limbaugh imitation), said something like, 'Oh, so Dick did the cyber. That's, ah, viruses on computers.' And they both had a great laugh that he'd been sort of banished from real terrorism to this goofy stuff, viruses on computers, not real terrorism. And look, it could have been any comparable liberal person on TV; they probably would have lampooned it as well. At the time, people didn't take what we do seriously, on both sides of the political spectrum. They just didn't.
30:08
So I think it's interesting to point at Dick, because when he did dive into cyber, he really dove into it. That's how I got to know him; we became friends. He made his rounds and spent a lot of time with people who were doing it, spent time with me. He's got a new book out, which I think is quite good. So again, Dick Clarke is an example of somebody who spent his entire career trying to understand threats, and he's an expert in sort of the terrorist angle around cybersecurity. He'd be the first person to tell you that we're not well prepared for the kinds of things that could be lobbed back at us. I think there was an op-ed that he wrote in The Wall Street Journal just this past Sunday laying out that we'd better be pretty darn careful if we're going to take an offensive posture, because we may not all be ready on the defense, and I blogged about that as well. So again, you need to have your own opinion. You may think that Clarke is all wet; I don't give a darn. If you're deciding that you want to advance in this career, you want to be a chief information security officer, you want to really do it, then you have to have a position on threat, and you need to have stories and angles and explanations and insight. You see how I'm using things from my life here, my friends, my colleagues, to illustrate points? You need to do the same thing.
31:37
So here's another guy you probably know well: Kevin Mandia. If you have not read the Mandiant report, you should. It provides spectacularly good insight into the APT actions of that nation-state, the PRC. I don't know how many people actually read it; I did, cover to cover. In fact, one of the appendices actually kind of ticked me off, because it explains something that a lot of people in the community were using as the basis for detecting certain types of intrusions, and by publishing it in the Mandiant report, it was no longer a very good indicator. Nevertheless,
32:19
Kevin is an example of a guy, coming from the FBI, who understands the day-to-day practical reality of the problems that occur. And I think that if you really want to have some impact with your board, with your executive team, with customers, and with your own team, I don't know how you could not read the Mandiant report. There are a lot of them that have come since, but this is the best one, the first one, the most consequential one. I do think this is something that comes as close to required reading in our field as I've ever seen.
32:58
Maybe the second thing that comes close to required reading is something that you might find surprising. It's this. I think it's funny that the introduction is by Alan Dershowitz; that guy's all over the news now, right? Everywhere you look, there's just so much news that you can't keep track anymore.
33:20
But the Mueller report is tough to read. I actually did read it, in fact, and I blogged about it. Here's my blog here; if you follow me on LinkedIn, you can read the notes that I post. I was interested in lessons from the Mueller report, and I couldn't care less about the politics. In fact, I didn't even read Volume Two, because the obstruction-of-justice thing is irrelevant from a cybersecurity perspective; I just wanted to know what happened.
33:47
And, you know, first of all, the redactions are hard. It's really hard to read
33:53
something this complicated
33:55
when every other sentence, big paragraphs, and entire pages are blacked out. It's hard.
34:02
People say most of it's there. But I can tell you, having read every word of Volume One,
34:08
it was not easy. But nevertheless, there are a few things that do come out of the Mueller report. And again, you can read my post about it. I'll read you the first sentence — I published this on June 29th: "Over the past month, I've lugged the Mueller report in my backpack. Every spare moment has been spent skimming it, its footnotes,
34:28
pondering the terse legalese
34:30
and cursing at the endless blacked-out sentences." My copy was accidentally dropped into a hotel pool, by the way — that was in Positano
34:38
in Italy. It went to the bottom of the pool, and I gasped and pulled it up, let it dry out in the Italian sun for a day, and then it was left for two wonderful days next to the toilet at home.
34:50
And I missed out on other things because I spent so much time reading the report. But I got through it.
34:55
And I'll tell you what: it is not easy to read. Like, I'm a reader,
35:01
and it is the hardest thing I have ever read. It's got so many footnotes — the footnotes are thick,
35:08
the writing is thick, and the blacked-out parts are hard. You can kind of try to guess. In fact, I think it would be fun to cryptanalyze some of the redactions, because you can figure things out — they give you the length of the wording.
35:23
So it'll be a space, and then about an inch and a half of black, and then a space, and then the rest of the sentence,
35:31
and you probably could do some reverse engineering to figure out what it says. I think that would be kind of fun, because it's like known-plaintext cryptanalysis — you are given a hint: you're given the length of
35:46
the writing. They didn't compact the redactions into a single uniform block of black; you can see each one's length, so you have a nice hint. It's not ciphertext-only cryptanalysis. But nevertheless, you should go through this, because as I was reading the Mueller report, I was thinking, oh my God,
36:05
this thing reads like that. It reads like the Mandiant report.
36:08
It's just a more modern version of the Mandiant report, as far as I can tell. You know, here's the Chinese doing something, and here's the Russians doing something.
36:19
And if you do cybersecurity, then darn it, you've got to read this thing. You have to be the expert, and you've got to learn to put the stupid politics aside.
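As a side note on the redaction idea above, here is a toy sketch (my own illustration, not from the talk) of how a redaction's measured width can narrow a guess list. The glyph width, the 1.2-inch measurement, and the candidate strings are all made-up assumptions:

```python
# Toy "length hint" analysis of a redacted span, as described above.
# All numbers here are illustrative assumptions, not measured values.

def estimate_chars(redaction_width_in: float, char_width_in: float = 0.10) -> int:
    """Estimate how many characters fit in a blacked-out span,
    given an assumed average glyph width for the report's typeface."""
    return round(redaction_width_in / char_width_in)

def candidates(words, redaction_width_in, tolerance=1):
    """Keep only candidate strings whose length matches the estimate."""
    target = estimate_chars(redaction_width_in)
    return [w for w in words if abs(len(w) - target) <= tolerance]

# A hypothetical 1.2-inch redaction suggests roughly 12 characters,
# which rules most of this (made-up) candidate list in or out.
guesses = candidates(["GRU", "Roger Stone", "WikiLeaks", "Guccifer 2.0"], 1.2)
print(guesses)
```

In practice this only ever shrinks a guess list rather than confirming anything — which is roughly the point being made: the redactions leak length information, so it is not ciphertext-only analysis.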
36:30
Maybe you hate Mueller, maybe you don't. Remember that story I told you at the outset,
36:35
where I had given a talk to a board,
36:37
you know, and the first half was done by a former government guy who was talking about threat,
36:46
and people liked it. And then I got up afterwards and talked about tech.
36:51
Well, that was Robert Mueller. That's who I was tag-teaming with. And afterwards we shook hands — I'd known him from
36:58
FBI days, and he was just a retired guy. He wasn't the Robert Mueller, you know, of now; he was retired from the FBI. We shook hands and we thought, hey, why don't we do —
37:08
why don't we do a gig together? Let's do the, uh —
37:10
let's talk to boards. You know, you're a former FBI guy and I'm a former AT&T CSO, so between the two of us, we could go do it. And I remember we shook hands very warmly — I have his card — let's do this.
37:24
Um, and then he became Robert Mueller, and suddenly everyone was like, oh my God.
37:29
But again, he's just a guy
37:31
who was talking about threat. There's not some conspiratorial thing to him; that's not where his head was. He was going to talk to boards with me, you know? So I'm just saying,
37:45
put the stupid politics aside, whichever side you're on — and shame on both sides.
37:53
We are technologists, and we should be reading the Mueller report to understand the details of what went on — what happened.
38:00
There was an incident. There was a break-in, just like there was a break-in here,
38:05
and there was a break-in there. Put aside the politics, and we learn the TTPs and some of the specifics of what happened here and there. And that's, again, why I blogged it and wrote up cyber lessons — I tried to pull out what I thought would be useful from Mueller. There are a few questions for you
38:23
as you think through,
38:25
you know, what you're doing on threat.
38:28
The first is: do you do a bug bounty program? Man, I hope you do.
38:32
These are some headshots from Google's Hall of Fame.
38:37
I just copied them when I was doing this back when I was a practitioner, and most of the people I know would agree
38:46
that Google was the first and best, and a lot of people copied them. But you can do this with vendors — wonderful companies like Synack and HackerOne; all these companies are really good, and they'll help you. You should have a bug bounty program, because it connects you to a research community that's willing to bang on your stuff and help you understand threat. You should
39:06
definitely do that. If you don't, shame on you.
39:09
Two: are you connected to law enforcement? If you're not, join InfraGard, or
39:15
find your local FBI guys and invite them to come over for pizza. They will come.
39:22
Do not wait until you get hacked to figure out who your local FBI representatives are.
39:29
On a random Tuesday, invite them for pizza, and they will come. Ask who the cyber security team is. Again, I understand if you're Joe Bag-of-Donuts they may not come.
39:40
But if you work for a bank or an insurance company or a manufacturing company — something that's
39:46
of meaningful size — they will definitely come meet you. Shake hands, exchange cards, and hear a little bit about how to contact them. Then, if you do need to reach out to them, you can do so having already established a relationship. And InfraGard is a really good source of threat information and kind of up-to-date intelligence.
40:07
This one — there's a SCIF, by the way. That's a portable SCIF. Most people don't have a SCIF that looks like that, but I guess if you embedded it in a wall in a room, that's what it would look like. A SCIF is just a place where you can talk and it's not tapped.
40:21
Um, you probably should have some cleared staff. If you're more than a —
40:25
let's say you're north of $100 million in revenue — you should probably have some cleared staff with whom you can talk classified.
40:34
So the government, if there are specific situations you should be aware of,
40:38
or if there are TTPs — TTPs are tactics, techniques, and procedures —
40:43
um, they're going to say, here's a denial-of-service attack happening, and here's what it looks like. They want to tell you this, but they'll only tell you it over a secure phone. So you've got to have a SCIF where you can get on the bat phone and talk to somebody. This is not evil stuff.
40:58
This is somebody wanting to share with you: hey, look, here's how you really ought to be configuring your, uh, your NetScout, your Arbor device. You know, here's what we see; I hope this helps you. Great, thank you, government, click. And then you go back to the lab, and you have a little bit of advice that you got over secure phones. That's not
41:16
evil. I know — maybe I'm naive, and people abuse that.
41:22
But my personal experience is that the vast majority of stuff that goes on in a SCIF
41:27
has been discussions of intelligence and threat. So you should have one.
41:31
Um, how about this:
41:35
DEF CON. I hope you go.
41:37
Um, it's in August, coming up. That's the Spot the Fed track there; the guy with the Hawaiian shirt runs Spot the Fed.
41:45
Um, I don't think any of the four people up there are, in fact, feds. Um,
41:51
I think they were all wrong. The guy way on the right, somebody, I guess, would have guessed — he looks like an FBI person;
41:58
you know, he just looks like one, but apparently he wasn't. It turns out there are nine hours of video, but whatever — you should go. I have a funny Charlie the CISO
42:07
cartoon that Rich and I are working on right now for Black Hat and DEF CON, so watch for our cartoon.
42:15
But DEF CON is a useful source of threat information. It's a little uncomfortable — I've gone a couple of times, I've spoken there — and it's, ah,
42:22
uncomfortable; I feel like a fish out of water.
42:28
I don't dress the way everybody dresses there. When I go to a conference, I'm usually wearing
42:34
dress slacks and a nice shirt and a jacket. Uh, so I lose that
42:40
and put on something else, but I just don't look right. So —
42:44
But it is a tremendous source of threat
42:46
information and a wonderful conference, and important stuff — you should definitely go. Then, finally, the user behavioral stuff for insiders. I think this is important. You need to be able to explain to your board
43:00
and to your executive team that you do understand that threats
43:05
primarily emanate from the inside, and you've got to figure out what's going on. These behavioral tools have become very important resources in that respect. Now let's real quickly go through our case study, and then we've got an interesting guest that I can't wait to
43:22
introduce here — a very, very wonderful guest whom you're going to enjoy. But the case study for this week is our hero, Emily, talking about a situation where somebody in her environment
43:35
was pointed to as potentially compromised by his or her
43:39
government.
43:40
And she knew what was going on, and the FBI approached her. She wasn't sure what to do. And then finally the FBI said: wait a minute, this guy is cooperating with us; he's helping us. Can you just leave him alone? We'd prefer that you not tell anybody.
43:55
And that's kind of the case study — what does she do? Because that's all voluntary.
44:01
So what do you do if the FBI approaches you and asks if you'd be of assistance to them, but that you not tell your boss, not tell the CEO, not tell anybody around you — again, voluntarily; they're not telling you to do it, they're just asking if you'd be willing? And that happens frequently when you're in this role.
44:21
If you're uncomfortable with that kind of thing, then don't
44:23
opt to be a CISO,
44:25
because you get put into the most uncomfortable situations all the time. So read this case study carefully, and you can go over it with your teams: is the FBI being reasonable, what are her options, and so on. It's a very rich —
44:43
a short but very rich little case study.
44:45
And I suspect you're going to enjoy going over it with your team. So that's the basic thing here on threat. Now, I've invited Chris — Chris Hodson; he's the CISO at Tanium. Chris, if you can see the chart here, you can see I took your LinkedIn cartoon
45:01
and I projected it up on the screen. I hope that's okay. I hope you're not mad, but I thought it looked
45:07
so cool
45:07
with everything else — it's pretty colorful. Is that okay? Okay, great. Yes — Chris, thanks for joining us. I do appreciate you spending a bit of time with the students here.
45:20
Hi — it's an absolute pleasure, Ed. And I've actually been able to join from the start, so a lot of the stuff you were saying there around contextualizing threats and understanding who would possibly want to compromise information or assets in your environment certainly resonated with me, and with a number of conversations I've had with board execs.
45:37
You've been at this a while, Chris. You know, a lot of the folks who are here,
45:43
you know, are executives with great promise — a lot of high-potential folks listening who are thinking that that CISO role is something that's probably in the not-too-distant future.
45:57
When you're having a beer with somebody and they mention that they're thinking about it, what comes to mind? What advice do you offer people about the role?
46:06
What do you usually tell them?
46:08
I think, after I check that they're not insane — I'm joking, of course — it's really trying to understand their motivations: why the CISO role?
46:27
The way I see it, CISOs have generally come through one of two tracks. They've either come through
46:32
kind of a very technical track — this is the direction that I took, right? They've worked maybe as a systems engineer, and they've worked as a technical architect, and then they might have run a function, and then they kind of have the risk and the business angle added, and they become a CISO. Or you get the second angle, which is somebody who has maybe had
46:52
10 to 15 years'
46:53
experience in an organization at an executive level. They understand the political players and how to get things done in a company, and then they move into the security practice. I see plenty of people in kind of finance or audit take that track. But I get curious — I honestly ask them
47:13
why. You know, what is it about being a CISO? Is it
47:16
holistically managing the security function? Is it genuinely wanting to materially reduce risk in an organization? I think
47:24
industry reports have kind of hurt our industry, with people thinking that every CISO gets this inordinate amount of money; and going into it with those motivations behind you, I don't necessarily think, is the right approach, because there are, in a lot of cases, a lot of long hours, and quite often —
47:44
I know you've seen it — quite often we are still, in some organizations, perceived to be this
47:49
function of "no", and, you know, a business unit that potentially can almost hamstring the business in some way, shape, or form. So people's motivations vary. But I think it's very important to go in with your eyes open: as technology continues to move at the breakneck speeds that we see,
48:07
there's going to be this requirement to be on top of your game and constantly learning. I don't know a good CISO — I'll be honest with you — who doesn't still have their head in a book or who isn't in some way keeping up with changes in technology.
48:21
Chris, tell us about your journey. You've had some really interesting jobs — I would argue the one you're in now might be the best of all. But what's been your journey? What have been some of the highlights?
48:32
Yeah, I think it's been pretty linear, but that's because I've continued to learn, to develop, and to enjoy what I do. I came, as I said, from that very technical track, so I started out in systems engineering when I left my A-levels — college education; this would be
48:51
roughly high school in the U.S. The first role I actually wanted to do was something in journalism,
48:54
something in writing — I had a passion for sports, and I thought, I want to do something in writing. It didn't work out that way, for various reasons in my home life and, I suppose, my
49:06
ability to go to university. So I went down more of a vocational route, with training kind of embedded within the role. So I got into development.
49:15
Development was the first kind of computer-engineering-style role I had. So I learned Visual Basic 5 and 6, really getting into the nuts and bolts of how systems work. I'd always been curious about that, from both a hardware and a software perspective, and then kind of took that forward.
49:32
I self-enrolled here in the UK with a national IT learning centre.
49:37
I went down — as many of us have — that MCSE route, right? You know, Microsoft. This is pre Apple's position in enterprise environments; this is pre-virtualization, really. This is, you know, if you wanted to build a lab,
49:52
it was generally a Microsoft Windows environment on physical servers in your home.
49:58
That's how we built stuff. And I went down that track for Windows 2000 and 2003.
50:04
Um, I got to that crossroads, right? A crossroads of technology, of understanding how things work. But then, once you've elected to do an MCSE — for anyone who doesn't know, sorry, that's a Microsoft Certified Systems Engineer —
50:17
you get that opportunity to go down two tracks: either a database-and-messaging track or a security track.
50:25
I was very interested in how things work, and equally in how to break things. So I spent a lot of time looking at proxy server architectures — sort of Microsoft ISA Server — understanding a little bit more around data security and data in flight and how cryptography played such a pivotal role.
50:43
I spent quite a bit of time doing that, and it's at that stage that I kind of earned my stripes in an
50:49
entry-level kind of systems security engineer position, and started to look at a number of kind of consultancy and contracting roles here in the UK. So I spent some time in financial services, quite a bit of time in the retail space. Architecture has always been kind of a passion of mine — systems
51:08
and solutions architecture, and latterly
51:13
more of the enterprise security architecture role, which is much more focused on translating business requirements into security solutions using
51:22
frameworks and methodologies such as SABSA and TOGAF. My last role on the end-user side was at a large retailer here in the UK,
51:31
running their cyber security functions: defining their strategy, their architectural blueprints, and also resilience — business continuity was also something that fell under my purview. In the last four or five years, I've moved to what some people would call the dark side, which is vendor land.
51:50
So I spent two and a half years over at Zscaler —
51:52
a web security organization, for those of you who don't know (they're a lot more than that, but broadly) — and a lot of my time there was as their CISO for the EMEA region and also their data privacy officer. Privacy and the governance side of security is something I've spent quite a bit of time on.
52:09
Um, today I serve as Tanium's CISO here in Europe. I absolutely love it: very close to technology, feeding back customer requirements, and
52:17
working with, I suppose, security executives in large organizations on some of the stuff that you were talking about, right? So understanding what's important to an organization,
52:29
designing controls commensurate with
52:31
the classification of information, but also the threat landscape.
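As a toy illustration of that pairing — controls commensurate with both data classification and the threat landscape — here is a sketch. This is my own simplified model, not Chris's framework; the tiers, labels, and scoring are all made-up assumptions:

```python
# A toy model (illustrative only) of picking a control tier from
# data classification and threat exposure. Real frameworks are far
# richer; this just shows the "commensurate controls" idea.

CLASSIFICATION = {"public": 1, "internal": 2, "confidential": 3, "restricted": 4}
THREAT = {"low": 1, "moderate": 2, "high": 3}

def control_tier(classification: str, threat_level: str) -> str:
    """Combine the two factors into a coarse control tier."""
    score = CLASSIFICATION[classification] * THREAT[threat_level]
    if score >= 9:
        return "tier-3: encryption, MFA, continuous monitoring"
    if score >= 4:
        return "tier-2: standard hardening and periodic access review"
    return "tier-1: baseline controls"

# Restricted data in a high-threat landscape demands the strongest tier;
# public data facing low threat gets only the baseline.
print(control_tier("restricted", "high"))
print(control_tier("public", "low"))
```

The point of the multiplication is simply that neither factor alone should drive control spend: highly classified data in a quiet threat landscape, or public data facing aggressive actors, both land in a middle tier.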
52:36
You know, I'm big into threat modeling, profiling, and learning from assessment activity — be that penetration testing or red team activities — and I echo what you were saying there about the importance for organizations of having that interface into the research community as well.
52:53
And as I said, that's the main thing, really. — That's a wonderful story. You know, Chris, one of the things we've covered in this class is kind of the
53:02
a couple of different journeys that you can go through in this business: one being kind of a vertical journey, where you go to one company, like IBM or some big firm, and you
53:13
wander around the company and eventually come to the security role, and you really become an IBM expert — versus the horizontal
53:22
kind of role, where you stay in information security but have a number of different stops along the way
53:28
at different companies. It sounds like you took the latter, and I'm just curious what your observation was there, because you've had the benefit.
53:37
You've seen a lot of different organizations. But do you ever sort of lament — say, gosh, I wish I'd gone to work at
53:45
Ericsson or Amazon and stayed there for 25 years? What are your thoughts on the pros and cons of those two
53:52
types of careers?
53:54
I think there are pros and cons, obviously, to both. I think one of the pros of working horizontally is that you start to understand — and I'm generalizing here — sort of the risk appetite
54:07
within particular industry verticals. And I think we're seeing less of it,
54:12
less of it now, but when I started out — I'm sure you get it over in the U.S. as well — you'd see job specs that would say "must have 10 years' banking experience", right? And, you know, you're wanting to break into financial services, maybe as a security architect or a designer,
54:28
and you're looking at that and thinking, well, this is a real cyclical challenge, right? It's a chicken-and-egg: how do you go and work for an organization that wants 10 years' banking experience if you can't get any banking experience? That almost seemed to be a way of keeping out anybody not already working in banking. And
54:45
I think that's changed these days. Back then, financial services had a particular kind of
54:50
risk tolerance — the level was low — whereas maybe some other sectors had a little bit more appetite for risk. Right now, every organization in existence is a technology company, and I know that may be a bit of a trite statement in 2019, but I believe it to be true.
55:06
Everyone has an e-commerce platform; everyone these days has a mobile app — pretty much every industry, virtually every company.
55:13
Everyone's processing personal information. You know, there are these requirements. This horizontal kind of experience that I gained from moving across retail, banking, and media —
55:24
you still have stakeholder perceptions, and you still have different constructions of boards and of the hierarchy of senior management. But I think it's less relevant now to have that horizontal, or certainly cross-vertical,
55:40
um, experience. But what it brought to me
55:44
is exactly that: understanding what's important to different businesses, and also how to articulate risk. And, as you mentioned on this call,
55:52
how to highlight and understand different threats — and by threats I mean the events themselves, but also the actors involved in that exchange. So understanding why, for example — maybe this is slightly overused — nation-state actors may be particularly interested in
56:10
a specific industry vertical or a particular organization because of some of the information, you know,
56:15
in their environment; or financially motivated cyber criminals and the types of organizations they target — and you touched on it shortly before, the TTPs, the tools, techniques, and procedures that they may use to action on their objectives. So I suppose
56:31
working for different organizations in different verticals gives you a more rounded understanding of,
56:37
I suppose, the societal relevance of cyber security.
56:42
But then, on the flip side of that, if you've worked in the same organization for 20 years, you do have that kind of long-standing experience of what makes the company tick — you know, what's the motivation, what's the culture. Let's say, for example, you're going to run a security awareness program:
57:00
if you've been in an organization for 20 years, you probably have a strong understanding of how to make messages resonate with a broad cross-section
57:08
of stakeholders. But again, the flip side of that positive —
57:14
and I don't see this very often, but I see it sometimes — is that if you've only worked in one organization, you only really understand one way of doing things from a technology and a security perspective. So at that point, I would strongly recommend to anyone who sticks it out in an organization for 20 years:
57:30
make sure you're going to industry catch-ups, right — maybe the DEF CONs out there —
57:35
and make sure you're also taking part in your own industry vertical's cyber-sharing initiatives as well, getting a much better read of what the industry as a whole is experiencing.
57:44
That's wonderful advice. You know, Chris, another topic that we've sort of hit on in our lectures here is around
57:52
personal ideology and belief and so on. And
57:55
cyber security used to be, when we all first started, just a gearhead thing — how do you detect viruses and stop them? You know, it was about as ones-and-zeros as it gets. But now the topics we deal with hit on things that in many cases
58:09
might collide somewhat with your own personal ideologies. There might be issues related to privacy, or to how we handle certain nation states, or to supply chain issues and third parties. These have become really sensitive issues at some point.
58:29
How does one go about it?
58:30
I don't know — this isn't an easy question; I apologize, because I know we didn't discuss this question in advance, and you're probably saying, boy, this guy's going off the rails here. But how do we
58:43
deal with that problem when, at work, decisions have to be made? Do we deal with China, yes or no?
58:49
You know, there's a threat issue, but there's also the personal
58:52
ideology that creeps in, whether it be left, right, or middle. Have you ever had to deal with any of that?
58:59
I think there were occasions — you know, many CISOs that I've spoken to
59:05
have maybe come across stuff in their
59:07
day-to-day, and I certainly have. — It's been an issue for me; you know, working at a telco, there are a lot of issues that pop up that you may or may not agree with, or that you totally agree with —
59:21
you know, it really requires that you look in the mirror a couple of times.
59:23
Absolutely. I think the thing to do — and this is my personal view on this; again, it's not the view of my employer or anyone I've worked for — is that qualification as you're embarking on
59:37
a career with an organization. People talk about due diligence, and we talk about due diligence in cyber when we consider third parties with whom we may be entering into a business relationship — maybe a processor or a controller from a data privacy perspective, maybe a third party
59:53
running your IT systems. In the same way that we, as CISOs,
59:58
would have our security function perform security due diligence, you have really got to understand the business model and the motivation of organizations that you're dealing with or potentially working for. Now, this becomes increasingly difficult because, as I mentioned earlier, technology is pervasive
60:14
in essentially everything that we do. But the other consideration for the CISO in 2019 is that in the last couple of years we're starting to see very physical repercussions of cyber events. People almost felt comfortable when the worst-case scenario of a cyber attack was a data breach.
60:32
And I'm not belittling that — you know, there are
60:35
colossal reverberations and collateral damage from that — but now we're seeing, you know, mistakes or oversights in the cyber realm cause loss of life. You know, they can cause very tangible, physical —
60:47
like I said — repercussions. So it's incredibly important that we understand the industry in which the organization that we do business with, or work for, sits. And I think we can also understand and learn a lot about culture by, you know, spending time with employees and by looking at reviews of organizations online. But you're right:
61:07
There are times when, as a CISO, you need to remember what your day job is.
61:12
That is to protect the information, the assets, and — from a cyber context, I suppose — the people of a company, and that has to be your number one, right? And if personal beliefs clash with that, then I think — easier said than done, but — potentially you should look to,
61:29
you know, move on. — Yeah, I would agree,
61:32
especially as the industry seems to suggest we have such a skills shortage; hopefully you shouldn't struggle to find another employer. — Listen, Chris, on behalf of the whole class here
61:45
and Leif and the team over at Cybrary, we want to thank you for spending time. It has been our honor to
61:52
listen and learn from you, and we really do appreciate it. — It's been an absolute pleasure. I love things like this. And I can see a few of the messages on the right-hand side here about people being at Black Hat. I'm actually going to shoot across to Black Hat and DEF CON this year, so
62:06
if anyone's coming across, drop me a message in Slack, and hopefully we can meet up. — They will recognize you from the cartoon,
62:13
man. Thanks so much for this. — You bet. Thanks, everybody. We'll see you all next week. Thanks, Leif.

CISO Competency - Threats

This is the ninth course in Ed Amoroso's Twelve Competencies of the Effective CISO, which focuses on the CISO Competency in Threat Insights. The CISO must maintain accurate and realistic insights into the evolving threats facing cyber security teams, and produce qualitative and quantitative assessments for risk decisions.

Instructed By

Instructor Profile Image
Ed Amoroso
CEO, CSO, CISO of TAG Cyber
Instructor