WEBVTT

0:00:03.620000 --> 0:00:07.800000
 Beyond first response, deep analysis.

0:00:07.800000 --> 0:00:12.900000
 So in the previous video, we got an
 introduction to first response or

0:00:12.900000 --> 0:00:17.940000
 hot triage as it's alternatively
 referred to.

0:00:17.940000 --> 0:00:22.900000
 And we got an understanding as to what
 an incident responder does immediately

0:00:22.900000 --> 0:00:30.120000
 after receiving communication of an
 incident or immediately after having

0:00:30.120000 --> 0:00:32.680000
 an incident escalated to them.

0:00:32.680000 --> 0:00:36.240000
 And we got an idea, a rough idea.

0:00:36.240000 --> 0:00:41.360000
 I should say, an idea of what the first
 five minutes look like, roughly

0:00:41.360000 --> 0:00:46.840000
 speaking. Of course, that period
 can be longer, but not too long.

0:00:46.840000 --> 0:00:51.480000
 And we got an understanding
 of what that looks like using

0:00:51.480000 --> 0:00:56.040000
 some practical, scenario-based
 examples.

0:00:56.040000 --> 0:00:59.360000
 So we're now turning our attention to
 deep analysis, which is something

0:00:59.360000 --> 0:01:01.520000
 I mentioned in the previous video.

0:01:01.520000 --> 0:01:06.340000
 So this video seeks to clarify or
 to explain that a little bit more.

0:01:06.340000 --> 0:01:10.340000
 So first things first, what does deep
 analysis mean in the context of

0:01:10.340000 --> 0:01:14.680000
 incident response generally, but more specifically
 the detection and analysis

0:01:14.680000 --> 0:01:18.720000
 phase and more specific than
 that, the analysis phase.

0:01:18.720000 --> 0:01:24.120000
 So in incident response, deep analysis,
 quote unquote, is the transition

0:01:24.120000 --> 0:01:31.320000
 from a rapid "yes or no, how big" validation
 to a forensic-level investigation

0:01:31.320000 --> 0:01:36.560000
 or analysis that seeks to determine
 how the threat operated, what was

0:01:36.560000 --> 0:01:41.500000
 modified, which data was accessed, or
 which system was accessed, and how

0:01:41.500000 --> 0:01:45.480000
 the threat can be eradicated
 without collateral damage.

0:01:45.480000 --> 0:01:50.720000
 Okay, so deep analysis as its name
 suggests is quite deep in terms of

0:01:50.720000 --> 0:01:54.520000
 what it seeks to determine, right?

0:01:54.520000 --> 0:01:59.260000
 And it is triggered, or you know, you
 perform deep analysis when first

0:01:59.260000 --> 0:02:04.000000
 response, which we explored in the previous
 video, when the first response

0:02:04.000000 --> 0:02:10.680000
 findings show that, A, the activity is
 confirmed as malicious and not yet

0:02:10.680000 --> 0:02:13.940000
 fully scoped. Okay, so when you're
 dealing with an incident where you

0:02:13.940000 --> 0:02:17.740000
 know you've confirmed that it exists
 during the first response phase,

0:02:17.740000 --> 0:02:22.360000
 or you know, within the first five
 minutes, so to say, but you're not

0:02:22.360000 --> 0:02:29.500000
 entirely sure, or you're not really
 100% sure of what the the scope is

0:02:29.500000 --> 0:02:30.520000
 or what you're dealing with.

0:02:30.520000 --> 0:02:36.500000
 So that's one prompt for deep analysis.
 B, you know, the other trigger

0:02:36.500000 --> 0:02:43.140000
 for deep analysis could be, you know,
 the involvement of or the fact that

0:02:43.140000 --> 0:02:48.800000
 a crown jewel or critical system, or regulated
 data, is involved in the incident,

0:02:48.800000 --> 0:02:53.720000
 or is being affected by the incident,
 or another trigger could be when

0:02:53.720000 --> 0:02:58.440000
 containment decisions require surgical
 accuracy, for example, root cause

0:02:58.440000 --> 0:03:04.160000
 patching, selective rollbacks,
 or legal evidence preservation.

0:03:04.160000 --> 0:03:09.380000
 So within, you know, this particular
 slide here, I've sort of outlined

0:03:09.380000 --> 0:03:12.820000
 as best as I can, what
 deep analysis entails.

0:03:12.820000 --> 0:03:15.980000
 And as you've probably guessed by taking
 a look at the course sections,

0:03:15.980000 --> 0:03:17.460000
 you know, it's organized.

0:03:17.460000 --> 0:03:22.760000
 And as I mentioned earlier, analysis
 can be broken down categorically

0:03:22.760000 --> 0:03:28.200000
 into two types. And I'm not referring to
 the, you know, the types of analysis,

0:03:28.200000 --> 0:03:33.620000
 but really the categories of
 analysis with regard to what is

0:03:33.620000 --> 0:03:35.420000
 being analyzed specifically.

0:03:35.420000 --> 0:03:39.900000
 And you know, one of which is endpoint
 analysis or host analysis, those

0:03:39.900000 --> 0:03:42.000000
 two are used interchangeably.

0:03:42.000000 --> 0:03:47.140000
 So whenever you hear host analysis or
 endpoint analysis, just know you're

0:03:47.140000 --> 0:03:48.840000
 referring to the same thing.

0:03:48.840000 --> 0:03:52.940000
 And then there's network analysis
 or network centric analysis.

0:03:52.940000 --> 0:03:57.700000
 So over here, I've sort of outlined
 what deep analysis entails or what

0:03:57.700000 --> 0:04:00.300000
 it looks like for both endpoints.

0:04:00.300000 --> 0:04:04.540000
 And you know, when you're talking about
 network centric analysis or activity,

0:04:04.540000 --> 0:04:09.220000
 as it were. So starting off with endpoint
 centric analysis, what are the

0:04:09.220000 --> 0:04:10.360000
 core tasks here?

0:04:10.360000 --> 0:04:14.060000
 Well, of course, this is not extensive
 or exhaustive, but I'm just trying

0:04:14.060000 --> 0:04:18.840000
 to give you a lay of the land as it
 were as to what the core tasks are.

0:04:18.840000 --> 0:04:21.540000
 So you have full memory forensics.

0:04:21.540000 --> 0:04:25.460000
 So think of, you know, using tools
 like Volatility, Rekall, etc.

0:04:25.460000 --> 0:04:30.280000
 And by the way, when we talk about
 forensics specifically, we will not

0:04:30.280000 --> 0:04:36.000000
 be focusing on, you know, what you'd
 call regular or professional or

0:04:36.000000 --> 0:04:38.640000
 specialized forensics within this course.

0:04:38.640000 --> 0:04:42.600000
 We will be covering that in its own dedicated
 course, but we will be taking

0:04:42.600000 --> 0:04:46.040000
 a look at forensics within this course,
 but you know, to a limited or

0:04:46.040000 --> 0:04:50.620000
 certain extent. In so far as it relates
 to what you need to be able to

0:04:50.620000 --> 0:04:52.440000
 do as an incident responder.

0:04:52.440000 --> 0:04:56.940000
 In any case, you perform full memory
 forensics using tools like

0:04:56.940000 --> 0:05:02.260000
 Volatility or Rekall to extract injected
 code, credentials, malware configs,

0:05:02.260000 --> 0:05:08.180000
 etc. You also perform disk and registry
 analysis to find persistence mechanisms,

0:05:08.180000 --> 0:05:11.780000
 timestamps, deleted files, etc.
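
To make the registry side of persistence hunting concrete, here is a minimal Python sketch. This is not tooling from the course; the autorun key list, the sample export, and the "AdobeUpdate" entry are all hypothetical illustrations of the idea:

```python
import re

# Common autorun locations attackers abuse for persistence
# (an illustrative subset; real investigations check many more).
AUTORUN_KEYS = [
    r"Software\Microsoft\Windows\CurrentVersion\Run",
    r"Software\Microsoft\Windows\CurrentVersion\RunOnce",
]

def find_autorun_entries(reg_export_text):
    """Scan exported .reg text for values under known autorun keys."""
    hits = []
    current_key = None
    for line in reg_export_text.splitlines():
        m = re.match(r"\[(.+)\]", line.strip())
        if m:
            key = m.group(1)
            # Only track values that live under a known autorun key.
            current_key = key if any(a.lower() in key.lower()
                                     for a in AUTORUN_KEYS) else None
            continue
        if current_key and "=" in line:
            name, _, value = line.partition("=")
            hits.append((current_key, name.strip().strip('"'),
                         value.strip().strip('"')))
    return hits

# Hypothetical export containing the kind of fake "AdobeUpdate"
# run key described in the video.
sample = r'''
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Run]
"AdobeUpdate"="C:\\ProgramData\\Adobe\\beacon.exe"

[HKEY_CURRENT_USER\Software\SomethingBenign]
"Setting"="1"
'''

for key, name, value in find_autorun_entries(sample):
    print(f"{name} -> {value}")
```

The same filtering idea applies whether the data comes from a live hive, a forensic image, or an EDR registry snapshot.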

0:05:11.780000 --> 0:05:16.120000
 This is quite common, although quite
 specialized, you know, if you have

0:05:16.120000 --> 0:05:20.180000
 the capabilities or the abilities to
 do so, you'd also be performing,

0:05:20.180000 --> 0:05:26.500000
 you know, binary triage or reverse
 engineering using tools like Ghidra,

0:05:26.500000 --> 0:05:28.580000
 CyberChef, etc.

0:05:28.580000 --> 0:05:33.580000
 And, you know, what exactly, what types
 of binaries are you, you know,

0:05:33.580000 --> 0:05:35.420000
 triaging, analyzing or
 reverse engineering?

0:05:35.420000 --> 0:05:39.400000
 Well, it's going to be, you know, the
 executables that have been dropped

0:05:39.400000 --> 0:05:43.060000
 or the scripts that, you know, have
 been involved in the incident or the

0:05:43.060000 --> 0:05:49.540000
 attack. You'll also be performing, you know,

0:05:49.540000 --> 0:05:54.580000
 super timeline construction or broadly
 speaking timeline development.

0:05:54.580000 --> 0:05:58.740000
 And some of the common tools for timeline
 development would be Plaso or

0:05:58.740000 --> 0:06:03.520000
 Timesketch, which involves, you know,

0:06:03.520000 --> 0:06:08.900000
 combining file system, event
 log, and registry timestamps.
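
Conceptually, a super timeline is just events from different evidence sources normalized onto one clock and sorted. Here is a toy sketch of that idea; the event data is made up, and Plaso/Timesketch do this at scale with far richer parsers:

```python
from datetime import datetime

# Hypothetical events from three evidence sources, each with its
# own record of what happened when.
filesystem = [("2024-01-01T00:02:10",
               "MFT: beacon.exe created in C:\\ProgramData\\Adobe")]
eventlog = [("2024-01-01T00:02:05",
             "Security 4688: powershell.exe -w hidden ...")]
registry = [("2024-01-01T00:02:40",
             "Run key written: AdobeUpdate")]

def build_super_timeline(*sources):
    """Merge timestamped events from many sources into one sorted view."""
    merged = [(datetime.fromisoformat(ts), desc)
              for source in sources for ts, desc in source]
    return sorted(merged, key=lambda e: e[0])

for ts, desc in build_super_timeline(filesystem, eventlog, registry):
    print(ts.isoformat(), desc)
```

Once merged, the ordering itself tells the story: the PowerShell process creation precedes the file drop, which precedes the persistence write.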

0:06:08.900000 --> 0:06:11.220000
 There is also user artifact review.

0:06:11.220000 --> 0:06:15.420000
 So think of things like ShellBags,
 jump lists, browser history, etc.

0:06:15.420000 --> 0:06:18.940000
 And the typical tools and artifacts
 associated with endpoint-centric

0:06:18.940000 --> 0:06:22.800000
 analysis would be as follows.

0:06:22.800000 --> 0:06:26.880000
 Well, in terms of artifacts,
 you'd have RAM dumps.

0:06:26.880000 --> 0:06:33.020000
 You'd also have the MFT and USN journal. In
 terms of tools now, you know, Sysmon, and Windows

0:06:33.020000 --> 0:06:37.100000
 logs, you know, Windows logs would
 be more so artifacts, but you also

0:06:37.100000 --> 0:06:42.860000
 have scheduled task XML, malicious
 binaries, so on and so forth.

0:06:42.860000 --> 0:06:48.760000
 And don't worry, you know, I'm being
 quite brief with, you know, the core

0:06:48.760000 --> 0:06:52.520000
 tasks and the tools and artifacts specific
 to endpoint-centric analysis,

0:06:52.520000 --> 0:06:55.780000
 when we get to the endpoint analysis
 section of the course, we'll dive

0:06:55.780000 --> 0:07:00.280000
 deep into the core tasks, right, and
 I'll sort of lay out what exactly

0:07:00.280000 --> 0:07:03.820000
 we'll be covering within this course,
 of course, keeping in mind what

0:07:03.820000 --> 0:07:08.980000
 I mentioned about the fact that some
 of the analysis pertinent to digital

0:07:08.980000 --> 0:07:13.440000
 forensics or specialized forensics, we
 will be exploring in its own dedicated

0:07:13.440000 --> 0:07:17.640000
 course. In any case, we then have network
 centric analysis and what are

0:07:17.640000 --> 0:07:18.760000
 the core tasks here?

0:07:18.760000 --> 0:07:22.600000
 Well, you have things like full packet
 capture, replay to decrypt or inspect

0:07:22.600000 --> 0:07:28.580000
 payloads. The tools would be, you know,
 Wireshark, or the Zeek PCAP analyzer.

0:07:28.580000 --> 0:07:32.580000
 We also have beacon pattern analysis, you
 know, tools that would be applicable

0:07:32.580000 --> 0:07:43.700000
 here would be RITA, or techniques like
 protocol carving in order to extract

0:07:43.700000 --> 0:07:49.020000
 exfiltrated files, certificates, HTTP
 streams, so on and so

0:07:49.020000 --> 0:07:53.380000
 forth. And then you also have, you know,
 threat intel correlation of ASNs,

0:07:53.380000 --> 0:08:00.300000
 TLS fingerprints, JA3/JA4 hashes, and
 then graphing lateral flows, you

0:08:00.300000 --> 0:08:03.520000
 know, with tools like Malcolm, you
 know, the Splunk traffic app to map

0:08:03.520000 --> 0:08:07.560000
 movement between subnets, and the typical
 tools and artifacts you'll find

0:08:07.560000 --> 0:08:11.380000
 here are obviously going to be PCAPs,
 which we'll be exploring how to analyze

0:08:11.380000 --> 0:08:17.100000
 in the network analysis section of this
 course; NetFlow, Zeek logs, TLS handshake

0:08:17.100000 --> 0:08:19.560000
 data, DNS query logs.
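
To make "beacon pattern analysis" concrete: tools like RITA essentially look for connections that recur at suspiciously regular intervals. This is a toy version of that statistic; the timestamps are fabricated and real tools use far more robust scoring:

```python
from statistics import mean, pstdev

def beacon_score(timestamps):
    """Coefficient of variation of inter-arrival intervals.
    Lower jitter between connections means more beacon-like."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(intervals) < 2 or mean(intervals) == 0:
        return None  # not enough data to judge regularity
    return pstdev(intervals) / mean(intervals)

# Hypothetical flows (seconds): one host phoning home every ~60s,
# one host browsing normally.
beaconing = [0, 60, 120, 181, 240, 300]
browsing = [0, 5, 90, 95, 400, 410]

print(beacon_score(beaconing))  # near zero, very regular
print(beacon_score(browsing))   # much larger, irregular
```

In practice you would compute this per source/destination pair over Zeek conn.log or NetFlow data and alert on the most regular talkers.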

0:08:19.560000 --> 0:08:25.160000
 So hopefully this gives you a lay of the
 land when it comes to deep analysis

0:08:25.160000 --> 0:08:30.820000
 or endpoint and network-centric analysis.

0:08:30.820000 --> 0:08:35.840000
 So now that we have that out of the
 way, let's, you know, let's bring

0:08:35.840000 --> 0:08:39.280000
 it together. Let's try and understand
 deep analysis, you know, through

0:08:39.280000 --> 0:08:43.340000
 the use of an example scenario, but
 not only that, we're also going to

0:08:43.340000 --> 0:08:47.200000
 include the first five minutes or first
 response or hot triage as it were

0:08:47.200000 --> 0:08:51.140000
 to show you how you move from first
 response or the first five minutes

0:08:51.140000 --> 0:08:54.600000
 to deep analysis and what
 triggers deep analysis.

0:08:54.600000 --> 0:08:57.880000
 So that's why these examples
 are very important.

0:08:57.880000 --> 0:09:01.300000
 Okay, so initial alert
 triage and escalation.

0:09:01.300000 --> 0:09:05.120000
 So right over here, you can see that
 we have a very nice image of, you

0:09:05.120000 --> 0:09:09.600000
 know, an incident report or a case on
 TheHive, not a lot of info contained

0:09:09.600000 --> 0:09:12.720000
 there, but just to give you an idea
 of what it looks like, although, you

0:09:12.720000 --> 0:09:14.380000
 know, you've already seen
 what that looks like.

0:09:14.380000 --> 0:09:19.820000
 In any case, we have an alert here
 that was triggered at midnight.

0:09:19.820000 --> 0:09:23.760000
 Okay. And you can see Splunk
 raises a critical alert.

0:09:23.760000 --> 0:09:27.400000
 The tier one analyst has performed initial
 triage and has escalated this

0:09:27.400000 --> 0:09:33.140000
 to you, the on-call incident responder for
 validation and analysis if required.

0:09:33.140000 --> 0:09:38.620000
 Okay. And when they're referring to analysis,
 or when we're talking about

0:09:38.620000 --> 0:09:41.540000
 analysis here and whether it's required,
 we're referring to deep analysis.

0:09:41.540000 --> 0:09:45.360000
 So the ticket or case contains
 the following information.

0:09:45.360000 --> 0:09:47.100000
 And by the way, I just want
 to point out something.

0:09:47.100000 --> 0:09:50.120000
 The reason I keep using ticket and case
 interchangeably is because a lot

0:09:50.120000 --> 0:09:54.200000
 of the incident handling or management
 platforms use different nomenclature

0:09:54.200000 --> 0:10:01.980000
 for the encapsulation of incidents,
 the communication of incidents.

0:10:01.980000 --> 0:10:05.340000
 And so you'll typically see them being
 called, you know, an incident or

0:10:05.340000 --> 0:10:09.300000
 a ticket or a case in the case
 of TheHive, no pun intended.

0:10:09.300000 --> 0:10:13.820000
 In any case, this is what the ticket
 or case contains, right?

0:10:13.820000 --> 0:10:15.720000
 That has been now escalated to you.

0:10:15.720000 --> 0:10:17.720000
 You've been tagged in it as the owner.

0:10:17.720000 --> 0:10:20.720000
 So this is what it says, it says
 a hidden PowerShell session.

0:10:20.720000 --> 0:10:23.920000
 This is what the SOC tier one analyst
 said, a hidden PowerShell session

0:10:23.920000 --> 0:10:29.440000
 has been executed on the system
 with a host named FIN-SQL-01.

0:10:29.440000 --> 0:10:34.640000
 And this script, this PowerShell session
 appears to have been used to

0:10:34.640000 --> 0:10:39.320000
 download a binary or executable
 called beacon.exe.

0:10:39.320000 --> 0:10:43.180000
 Interesting. From a remote server,
 and they've provided us with the IP

0:10:43.180000 --> 0:10:46.720000
 address. So based on everything we've
 learned within this course thus

0:10:46.720000 --> 0:10:50.640000
 far, you should have a good
 idea of what the IOCs are.

0:10:50.640000 --> 0:10:54.440000
 The first IOC is going to be PowerShell.

0:10:54.440000 --> 0:10:56.460000
 That's not really an IOC.

0:10:56.460000 --> 0:11:01.680000
 You would combine it with, you know, for
 example, the more realistic

0:11:01.680000 --> 0:11:05.360000
 IOC or the more applicable
 IOC, which is beacon.exe.

0:11:05.360000 --> 0:11:10.700000
 That's a file, you know, that
 you can actually get a hash of,

0:11:10.700000 --> 0:11:11.920000
 so on and so forth.

0:11:11.920000 --> 0:11:16.180000
 You also have another IOC, which in
 this case is an IP address, which

0:11:16.180000 --> 0:11:20.540000
 you can then use, you know, to pivot
 through logs, perform correlation,

0:11:20.540000 --> 0:11:23.900000
 identify the scope, so on and so
 forth, all of that good stuff.
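
Pivoting on an IOC is, at its simplest, filtering every log source you have for that indicator and seeing which hosts light up. Here is a minimal sketch; the log records, field names, and IPs (documentation ranges) are invented for illustration:

```python
# Hypothetical, already-parsed log records from a SIEM export.
logs = [
    {"host": "FIN-SQL-01", "dest_ip": "203.0.113.7", "file": "beacon.exe"},
    {"host": "HR-LAPTOP-02", "dest_ip": "203.0.113.7", "file": "outlook.exe"},
    {"host": "DEV-WEB-03", "dest_ip": "198.51.100.9", "file": "nginx"},
]

def pivot(records, ioc):
    """Return the set of hosts whose records mention the IOC in any field."""
    return {r["host"] for r in records
            if any(ioc == str(v) for v in r.values())}

print(pivot(logs, "203.0.113.7"))  # scope: who talked to the suspect IP?
print(pivot(logs, "beacon.exe"))   # scope: where did the binary land?
```

Each pivot answers a scoping question, and every new hit becomes a new place to look for further indicators.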

0:11:23.900000 --> 0:11:26.860000
 Hopefully it's starting to
 come together in your mind.

0:11:26.860000 --> 0:11:31.440000
 So now we're dealing with the first
 five minutes or hot triage of first

0:11:31.440000 --> 0:11:33.520000
 response. So we've just got it.

0:11:33.520000 --> 0:11:34.820000
 What do we do first?

0:11:34.820000 --> 0:11:38.420000
 We open the ticket, we read the information
 included by the tier one analyst,

0:11:38.420000 --> 0:11:42.160000
 which you saw is possible
 to do in just one minute.

0:11:42.160000 --> 0:11:44.420000
 We did it in the previous slide here.

0:11:44.420000 --> 0:11:47.220000
 So that's what I wanted to outline.

0:11:47.220000 --> 0:11:51.680000
 In any case, the outcome is confirm
 the affected host, which we did, or

0:11:51.680000 --> 0:11:54.920000
 we haven't confirmed it yet, but you
 know, we got the affected

0:11:54.920000 --> 0:11:58.020000
 host. We got the host name; the
 user really wasn't there.

0:11:58.020000 --> 0:12:01.800000
 Initial hash, we didn't get that,
 but we can easily get that.

0:12:01.800000 --> 0:12:05.540000
 Moving on to the, you know, first and
 second minutes or minute one and

0:12:05.540000 --> 0:12:10.560000
 two, we would rerun the same search
 attached within the case

0:12:10.560000 --> 0:12:15.660000
 itself or the ticket and, you know,
 would expand the time window to plus

0:12:15.660000 --> 0:12:20.920000
 five minutes. And let's say for the
 purposes of this example that we,

0:12:20.920000 --> 0:12:24.340000
 the outcome is that we identify three
 matching PowerShell events.

0:12:24.340000 --> 0:12:27.540000
 Okay. So we know that
 something's going on.

0:12:27.540000 --> 0:12:29.300000
 We validated it.

0:12:29.300000 --> 0:12:32.320000
 And now we have minutes two and three.

0:12:32.320000 --> 0:12:37.360000
 This is where we pivot on the IOCs,
 which was the hash for beacon.exe

0:12:37.360000 --> 0:12:40.900000
 and the IP address that we got.

0:12:40.900000 --> 0:12:45.840000
 So the outcome of that was that
 no other hosts show activity.

0:12:45.840000 --> 0:12:50.080000
 And we also identified, you know, that
 what I mentioned in the previous

0:12:50.080000 --> 0:12:57.000000
 video with regards to criticality or
 whether or not the system is a crown

0:12:57.000000 --> 0:13:02.100000
 jewel as it were, we discover that this
 asset is a production SQL server.

0:13:02.100000 --> 0:13:07.160000
 And then, okay, we move on to the next
 action or the next step of first

0:13:07.160000 --> 0:13:11.340000
 response, which is, you know, containment
 or isolation as it were.

0:13:11.340000 --> 0:13:17.680000
 So that's EDR isolates the host
 and the firewall blocks the IP.

0:13:17.680000 --> 0:13:19.980000
 So the active beaconing is stopped.

0:13:19.980000 --> 0:13:20.920000
 That's the outcome.

0:13:20.920000 --> 0:13:25.420000
 And then minutes four to five, we update
 the ticket, attach logs, and page

0:13:25.420000 --> 0:13:32.500000
 the DBA owner. DBA, as always, is an abbreviation
 for database administrator.

0:13:32.500000 --> 0:13:38.840000
 And in this case, the outcome is incident
 is validated, scoped as a single

0:13:38.840000 --> 0:13:44.460000
 host and tagged accordingly as critical
 or, you know, the fact that this

0:13:44.460000 --> 0:13:45.920000
 is a critical asset.

0:13:45.920000 --> 0:13:50.720000
 So you may be asking yourself now because
 that's the core of what this

0:13:50.720000 --> 0:13:54.320000
 video is about. The first five
 minutes seem pretty good.

0:13:54.320000 --> 0:13:57.700000
 I mean, we, you know, based on the outcomes,
 it looks like we know what

0:13:57.700000 --> 0:13:58.560000
 we're dealing with.

0:13:58.560000 --> 0:14:05.400000
 I think we can just pass this along to
 the, you know, containment eradication

0:14:05.400000 --> 0:14:06.960000
 and recovery teams.

0:14:06.960000 --> 0:14:08.880000
 Or we can do that ourselves.

0:14:08.880000 --> 0:14:10.160000
 Well, not really.

0:14:10.160000 --> 0:14:12.880000
 If you come to think about
 it, this is a big deal.

0:14:12.880000 --> 0:14:17.460000
 This is a big deal because we have this,
 you know, interesting executable

0:14:17.460000 --> 0:14:20.000000
 called beacon. We don't
 know what it's about.

0:14:20.000000 --> 0:14:23.660000
 We just know that it's
 limited to one host.

0:14:23.660000 --> 0:14:28.120000
 We know we've stopped the beaconing,
 but do we know exactly what else

0:14:28.120000 --> 0:14:38.040000
 it's doing, apart from beaconing? You know,
 the reason deeper analysis is now required

0:14:38.040000 --> 0:14:43.920000
 is because a mission-critical database
 server ran untrusted code.

0:14:43.920000 --> 0:14:47.680000
 That's a big no-no; like,
 this is a big deal.

0:14:47.680000 --> 0:14:52.560000
 As a result, regulators, just as an
 example, you know, may require proof

0:14:52.560000 --> 0:14:56.420000
 of whether data was read or modified
 because remember, if that database,

0:14:56.420000 --> 0:15:00.740000
 which in this case, it was a production
 database, a mission critical database,

0:15:00.740000 --> 0:15:05.620000
 I should say, because that database,
 you know, may be serving, may be

0:15:05.620000 --> 0:15:11.320000
 hosting or storing, you know, customer data,
 personally identifiable information,

0:15:11.320000 --> 0:15:17.760000
 just based on the merits of, you know,
 regulatory requirements and compliance,

0:15:17.760000 --> 0:15:23.580000
 you have to determine, concretely
 or accurately,

0:15:23.580000 --> 0:15:28.560000
 whether data on that database was read,
 was accessed or modified in any

0:15:28.560000 --> 0:15:34.000000
 way, shape or form, in relation to this
 malicious executable called beacon

0:15:34.000000 --> 0:15:39.940000
.exe. So the bottom line here that I want
 to point out as to why deep analysis

0:15:39.940000 --> 0:15:45.800000
 is required is because root cause and
 persistence must be conclusively

0:15:45.800000 --> 0:15:51.760000
 identified and located in the context
 of persistence before re-adding

0:15:51.760000 --> 0:15:56.180000
 the server to production or before
 you can say, yeah, we responded to

0:15:56.180000 --> 0:15:59.240000
 that incident. We, you know,
 contained everything.

0:15:59.240000 --> 0:16:04.180000
 We eradicated the threat and we, you
 know, we restored the system back,

0:16:04.180000 --> 0:16:08.740000
 you know, using a backup or a snapshot,
 you know, it's good to go, it's

0:16:08.740000 --> 0:16:10.160000
 production ready.

0:16:10.160000 --> 0:16:16.840000
 So the reason why, you know, we would
 want to perform deep analysis, you

0:16:16.840000 --> 0:16:19.700000
 know, after first response
 is fairly obvious.

0:16:19.700000 --> 0:16:24.980000
 And as I said, it came down to that
 executable and the fact that, you

0:16:24.980000 --> 0:16:27.060000
 know, we don't know what else it did.

0:16:27.060000 --> 0:16:32.100000
 And that now hopefully has, you know,
 opened up your mind to the real

0:16:32.100000 --> 0:16:39.440000
 questions. And that is, okay, how do we
 analyze what a malicious executable

0:16:39.440000 --> 0:16:46.920000
 did on a Windows system outside of,
 you know, how it got in so the, you

0:16:46.920000 --> 0:16:49.100000
 know, the PowerShell command, etc.

0:16:49.100000 --> 0:16:51.240000
 How do we check for persistence?

0:16:51.240000 --> 0:16:55.660000
 How do we see whether the process
 is actively running?

0:16:55.660000 --> 0:17:00.700000
 And how do we communicate this information
 about the executable beacon

0:17:00.700000 --> 0:17:05.180000
.exe to the containment eradication
 and recovery teams?

0:17:05.180000 --> 0:17:08.120000
 Because remember, they're going to need
 information about how to properly

0:17:08.120000 --> 0:17:13.620000
 contain or isolate the system, how
 to eradicate the threat, how to get

0:17:13.620000 --> 0:17:15.900000
 rid of beacon.exe.

0:17:15.900000 --> 0:17:19.120000
 And I'm not talking about just terminating
 the process and deleting the

0:17:19.120000 --> 0:17:24.720000
 executable. Because again, a lot of
 these beacon.exes, quote unquote,

0:17:24.720000 --> 0:17:29.780000
 usually have some very nifty, or should I
 say very nasty, persistence mechanisms

0:17:29.780000 --> 0:17:34.960000
 built in to deal or to account for
 the fact that, you know, the first

0:17:34.960000 --> 0:17:39.340000
 line of defense would be, in most cases,
 going to be to delete the executable.

0:17:39.340000 --> 0:17:44.060000
 So once beacon.exe was executed for
 the first time, did it download

0:17:44.060000 --> 0:17:48.480000
 any additional droppers or executables?

0:17:48.480000 --> 0:17:54.560000
 You know, did it drop a rootkit
 or something of that sort that

0:17:54.560000 --> 0:17:59.280000
 is still maintaining access or still
 giving the attacker access?

0:17:59.280000 --> 0:18:02.520000
 So hopefully you're starting to understand
 now what deep analysis is all

0:18:02.520000 --> 0:18:08.000000
 about. So it goes beyond, you know,
 just what you do after, you know,

0:18:08.000000 --> 0:18:09.240000
 with first response.

0:18:09.240000 --> 0:18:15.800000
 So still sticking to this particular
 example of beacon.exe.

0:18:15.800000 --> 0:18:20.740000
 What do we do, you know, in the
 context of endpoint analysis?

0:18:20.740000 --> 0:18:28.700000
 And we'll talk about how to
 prioritize evidence collection

0:18:28.700000 --> 0:18:34.720000
 using a process called evidence triage
 in the next set of videos that

0:18:34.720000 --> 0:18:37.560000
 will pretty much explain why
 we're starting with

0:18:37.560000 --> 0:18:38.460000
 memory forensics.

0:18:38.460000 --> 0:18:44.720000
 It has to do with the data sources,
 or the pieces of evidence, or the

0:18:44.720000 --> 0:18:48.040000
 sources of evidence that are,
 A, the most volatile.

0:18:48.040000 --> 0:18:56.840000
 So, you know, A, that are the most volatile
 and, B, provide the most investigative,

0:18:56.840000 --> 0:19:02.220000
 essentially have the most
 investigative value.

0:19:02.220000 --> 0:19:05.720000
 So it goes without saying that when
 you're dealing with a process, the

0:19:05.720000 --> 0:19:09.340000
 RAM is going to be the most important
 source of information about what

0:19:09.340000 --> 0:19:12.020000
 went on on a system and what
 is currently going on.

0:19:12.020000 --> 0:19:14.060000
 So we have memory forensics, right?

0:19:14.060000 --> 0:19:18.700000
 So what we do in this case is dump the
 RAM with a tool like Velociraptor,

0:19:18.700000 --> 0:19:20.940000
 or you can use any other tool.

0:19:20.940000 --> 0:19:25.240000
 And you know, you can run Volatility's
 malfind and find injected shellcode

0:19:25.240000 --> 0:19:28.420000
 in sqlservr.exe.

0:19:28.420000 --> 0:19:34.240000
 Based on that, we extract the C2 URL
 and AES key from the process strings

0:19:34.240000 --> 0:19:38.000000
 because it's windows and
 this was an executable.

0:19:38.000000 --> 0:19:40.960000
 We have to check disk and
 the windows registry.

0:19:40.960000 --> 0:19:43.480000
 So disk and registry examination.

0:19:43.480000 --> 0:19:49.220000
 So we parse the MFT and Sysmon event 11
 to confirm beacon.exe dropped into,

0:19:49.220000 --> 0:19:50.760000
 you know, a particular folder.

0:19:50.760000 --> 0:19:53.520000
 In this case, it looked like
 it dropped very creatively.

0:19:53.520000 --> 0:19:56.820000
 So I should say, into
 C:\ProgramData\Adobe.

0:19:56.820000 --> 0:20:00.340000
 Okay. When we analyze the registry,
 you can see the registry

0:20:00.340000 --> 0:20:02.880000
 hive shows a new run key.

0:20:02.880000 --> 0:20:06.820000
 So HKEY_CURRENT_USER\Software\Microsoft\
 Windows\CurrentVersion\Run\AdobeUpdate.

0:20:06.820000 --> 0:20:14.620000
 Okay. You hopefully know what that
 means. In any case, you can

0:20:14.620000 --> 0:20:19.240000
 start to see that an executable does
 a lot of, as I said, nifty or nasty

0:20:19.240000 --> 0:20:23.200000
 things beyond its intended purpose
 because remember, attackers want to

0:20:23.200000 --> 0:20:24.820000
 maintain the access on a system.

0:20:24.820000 --> 0:20:29.620000
 They want to evade detection and they'll
 go, you know, to great lengths

0:20:29.620000 --> 0:20:34.740000
 to ensure that. So we then
 have binary triage.

0:20:34.740000 --> 0:20:41.100000
 So this is now when we're actually exploring,
 analyzing, or reverse engineering

0:20:41.100000 --> 0:20:44.240000
 the beacon.exe executable.

0:20:44.240000 --> 0:20:49.020000
 And this is typically done in a very
 controlled environment in a sandbox.

0:20:49.020000 --> 0:20:54.220000
 But static analysis reveals,
 you know, a 7-Zip self-extractor

0:20:54.220000 --> 0:21:04.220000
 plus a Mimikatz DLL.

0:21:04.220000 --> 0:21:05.740000
 And Mimikatz, by the way, is used by a lot
 of different parties, but also by threat actors

0:21:05.740000 --> 0:21:10.360000
 to access, you know, to essentially
 dump credentials, or for credential

0:21:10.360000 --> 0:21:17.200000
 access. And on Windows, it's facilitated
 through, you know, in many ways.

0:21:17.200000 --> 0:21:20.540000
 Firstly, Mimikatz needs to be run in
 a privileged context, which means

0:21:20.540000 --> 0:21:24.360000
 the attacker needs elevated
 privileges in order to access,

0:21:24.360000 --> 0:21:32.440000
 for example, the LSASS process cache
 or the SAM database, for example,

0:21:32.440000 --> 0:21:38.120000
 but based on what we do with regard
 to static analysis, we can see that

0:21:38.120000 --> 0:21:45.040000
 static strings reveal that this executable
 has within it, you know, the

0:21:45.040000 --> 0:21:50.200000
 7-Zip self-extractor, plus it also
 has another goodie, which is this

0:21:50.200000 --> 0:21:55.240000
 Mimikatz DLL. Now, execution in a sandbox,
 so it's executed offline because

0:21:55.240000 --> 0:22:00.340000
 we want to be as careful
 as possible, confirms,

0:22:00.340000 --> 0:22:02.360000
 you know, the credential dump module.

0:22:02.360000 --> 0:22:07.160000
 So we can pretty much tell at this point
 what the purpose of beacon.exe

0:22:07.160000 --> 0:22:09.420000
 was in totality.

0:22:09.420000 --> 0:22:14.960000
 Not 100%, but we know it's beaconing back
 to an attacker or C2 infrastructure.

0:22:14.960000 --> 0:22:19.120000
 But in addition to that, it's
 also dumping credentials.

0:22:19.120000 --> 0:22:21.220000
 We then develop a timeline.

0:22:21.220000 --> 0:22:26.140000
 So we combine MFT, prefetch, Windows
 event 4688, and Sysmon into

0:22:26.140000 --> 0:22:31.620000
 Timesketch, identify initial execution via
 an RDP session from a particular

0:22:31.620000 --> 0:22:33.520000
 laptop. So in this case, HR

0:22:33.520000 --> 0:22:38.380000
 laptop 02, eight minutes earlier.

0:22:38.380000 --> 0:22:42.200000
 Okay, so we're now starting to understand
 much better what we're dealing

0:22:42.200000 --> 0:22:46.600000
 with. We then have IOC expansion
 and enterprise hunt.

0:22:46.600000 --> 0:22:48.420000
 This is very important.

0:22:48.420000 --> 0:22:54.380000
 So hashes of the beacon.exe shellcode and
 7-Zip payload are searched across

0:22:54.380000 --> 0:22:56.740000
 the EDR fleet.
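
An enterprise hunt on file hashes boils down to: hash what you recovered, compare against everything the fleet has seen. A minimal sketch of that idea; the hostnames, file contents, and EDR inventory are fabricated stand-ins for real telemetry:

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical "files seen on each host" as reported by an EDR.
fleet = {
    "FIN-SQL-01": [b"beacon payload v1"],
    "STG-APP-01": [b"beacon payload v1", b"harmless installer"],
    "HR-LAPTOP-02": [b"harmless installer"],
}

# IOC set: hashes of the artifacts recovered during deep analysis.
ioc_hashes = {sha256(b"beacon payload v1")}

def hunt(fleet_files, iocs):
    """Return hosts holding any file whose hash matches an IOC."""
    return sorted(h for h, files in fleet_files.items()
                  if any(sha256(f) in iocs for f in files))

print(hunt(fleet, ioc_hashes))  # hosts inside the blast radius
```

The two extra hits here mirror the scenario in the video: the hunt is what turns "one compromised SQL server" into the true scope.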

0:22:56.740000 --> 0:22:59.760000
 And we find two more hits
 on staging servers.

0:22:59.760000 --> 0:23:05.460000
 So this is what we're talking
 about when we say we're

0:23:05.460000 --> 0:23:08.260000
 trying to identify the blast
 radius or the scope.

0:23:08.260000 --> 0:23:15.340000
 We can see, oh, it's not just affecting
 that particular database,

0:23:15.340000 --> 0:23:19.960000
 that SQL database server; it's
 also affecting others.

0:23:19.960000 --> 0:23:22.620000
 It appears, in this case, staging servers.

0:23:22.620000 --> 0:23:24.240000
 So what do we do next?

0:23:24.240000 --> 0:23:29.180000
 Well, we essentially queue these staging
 servers for separate containment.

0:23:29.180000 --> 0:23:33.940000
 Okay. The key thing here is that you
 need to be documenting all of this.

0:23:33.940000 --> 0:23:36.040000
 This is very important.

0:23:36.040000 --> 0:23:39.120000
 And then, over here, we have

0:23:39.120000 --> 0:23:42.820000
 root case identification.

0:23:42.820000 --> 0:23:44.340000
 I know it said toot case.

0:23:44.340000 --> 0:23:48.200000
 I managed to correct that really quickly
 while you weren't looking.

0:23:48.200000 --> 0:23:51.620000
 So we have root case identification
 as the name suggests.

0:23:51.620000 --> 0:23:57.900000
 We're trying to identify the root cause,
 sorry, not root case, root cause.

0:23:57.900000 --> 0:24:00.180000
 Okay. And I corrected that as well.

0:24:00.180000 --> 0:24:02.220000
 So root cause identification.

0:24:02.220000 --> 0:24:08.180000
 So RDP logs show a successful login
 with a domain admin account

0:24:08.180000 --> 0:24:10.720000
 called service backup.

0:24:10.720000 --> 0:24:14.500000
 And we then perform correlation
 with AD logs.

0:24:14.500000 --> 0:24:19.660000
 And we discover that there was a
 password spray attack,

0:24:19.660000 --> 0:24:24.760000
 and consequently a prior alert
 on this particular HR laptop.
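
The correlation that surfaces a password spray, one source failing against many distinct accounts, can be sketched like this (hypothetical event data; in practice you'd query the SIEM or parse Event ID 4625 failed-logon records):

```python
# Hypothetical failed-logon events (e.g. Windows Event ID 4625): (source_ip, account).
failed_logons = [
    ("203.0.113.7", "alice"),
    ("203.0.113.7", "bob"),
    ("203.0.113.7", "carol"),
    ("203.0.113.7", "service-backup"),
    ("10.0.0.5",    "dave"),
]

def spray_candidates(events, min_accounts=3):
    """One source failing against many distinct accounts suggests a password spray."""
    accounts = {}
    for ip, user in events:
        accounts.setdefault(ip, set()).add(user)
    return [ip for ip, users in accounts.items() if len(users) >= min_accounts]
```

A spray tries one or two passwords against many accounts, so grouping failures by source rather than by account is what makes it visible.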

0:24:24.760000 --> 0:24:29.520000
 So what we can tell is that initial
 access was facilitated via this HR

0:24:29.520000 --> 0:24:36.500000
 laptop. Okay. Via the password spray,
 the attacker then used the access

0:24:36.500000 --> 0:24:43.300000
 they got on HR laptop zero two
 to RDP into the SQL server.

0:24:43.300000 --> 0:24:46.760000
 We found out the successful login there.

0:24:46.760000 --> 0:24:51.840000
 And once they were there, they then
 pulled via PowerShell, or

0:24:51.840000 --> 0:24:53.800000
 downloaded, beacon.exe

0:24:53.800000 --> 0:24:59.060000
 from that remote server. beacon.exe,
 apart from pinging back to a C2

0:24:59.060000 --> 0:25:08.440000
 server, also, pretty much when
 executed, as stated

0:25:08.440000 --> 0:25:13.060000
 here, just so I'm accurate,

0:25:13.060000 --> 0:25:19.860000
 self-extracts 7-Zip and
 the Mimikatz DLL.

0:25:19.860000 --> 0:25:25.680000
 And then performs credential
 dumping, and presumably

0:25:25.680000 --> 0:25:31.060000
 sends those credentials back to the

0:25:31.060000 --> 0:25:34.340000
 attacker-controlled C2 infrastructure.

0:25:34.340000 --> 0:25:38.220000
 And of course, at that point, we would
 perform network analysis or traffic

0:25:38.220000 --> 0:25:45.360000
 analysis in a sandbox environment
 to confirm what we've already identified,

0:25:45.360000 --> 0:25:51.620000
 namely which C2 IP address or domain

0:25:51.620000 --> 0:25:54.780000
 the beacon.exe is connecting to.

0:25:54.780000 --> 0:25:59.160000
 But we can also identify the
 frequency of the beaconing.

0:25:59.160000 --> 0:26:03.680000
 So how often it calls back, what
 information it's sending.
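
Measuring beaconing frequency from captured callback times can be sketched as follows (hypothetical timestamps; in practice these would come from pcap analysis of the sandboxed host's traffic):

```python
from statistics import mean, pstdev

# Hypothetical C2 callback times, in seconds from the start of the capture.
callbacks = [0, 60, 121, 180, 241, 300]

def beacon_profile(times):
    """Inter-arrival deltas: low jitter around a fixed average suggests beaconing."""
    deltas = [b - a for a, b in zip(times, times[1:])]
    return mean(deltas), pstdev(deltas)
```

A steady interval with small jitter (here roughly every 60 seconds) is the signature of an automated call-home rather than interactive traffic.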

0:26:03.680000 --> 0:26:08.640000
 And if it does successfully dump credentials,
 how it sends those credentials

0:26:08.640000 --> 0:26:12.980000
 Is it in an encrypted format? Based

0:26:12.980000 --> 0:26:19.940000
 on whether or not the data that
 was being sent to the

0:26:19.940000 --> 0:26:23.140000
 C2 was encrypted, and whether
 you're able to analyze

0:26:23.140000 --> 0:26:28.280000
 it, you can determine, through
 captured traffic from that

0:26:28.280000 --> 0:26:32.060000
 particular system or on the network,
 what the attacker was able to get

0:26:32.060000 --> 0:26:35.500000
 or to access and consequently exfiltrate.


0:26:35.500000 --> 0:26:40.240000
 In any case, the deliverables or outputs
 of deep analysis in alignment

0:26:40.240000 --> 0:26:44.000000
 with this particular
 example, are as follows.

0:26:44.000000 --> 0:26:47.900000
 So you have the output in one column
 and more importantly, how it feeds

0:26:47.900000 --> 0:26:49.680000
 containment and recovery.

0:26:49.680000 --> 0:26:53.640000
 Or containment, eradication
 and recovery, I should say.

0:26:53.640000 --> 0:26:57.280000
 So the first output is the IOC set.

0:26:57.280000 --> 0:27:01.280000
 So hashes, IP addresses,
 registry keys, etc.

0:27:01.280000 --> 0:27:05.260000
 This is, you know, in terms of how it feeds
 or how it's used in the containment

0:27:05.260000 --> 0:27:07.060000
 eradication and recovery phases.

0:27:07.060000 --> 0:27:10.700000
 You know, you have new firewall block
 rules for that particular IP, let's

0:27:10.700000 --> 0:27:16.000000
 say, EDR quarantine rules
 and SIEM detection rules, improved

0:27:16.000000 --> 0:27:18.920000
 ones, or fine-tuned ones, I should say.
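
Turning the IOC set into draft containment artifacts, such as firewall block rules and EDR quarantine entries, might look like this (a sketch with placeholder values and generic rule syntax, not any specific vendor's format):

```python
# Hypothetical IOC set from the deep analysis; values are placeholders, not real IOCs.
iocs = {
    "ips": ["198.51.100.23"],
    "hashes": ["e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"],
}

def draft_firewall_rules(ips):
    """Draft deny rules for analyst review; syntax is generic, not vendor-specific."""
    return [f"deny ip any host {ip}" for ip in ips]

def draft_edr_quarantine(hashes):
    """Draft EDR quarantine entries keyed on file hash."""
    return [{"action": "quarantine", "sha256": h} for h in hashes]
```

Generating drafts rather than pushing rules directly keeps a human in the loop before containment changes go live.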

0:27:18.920000 --> 0:27:24.180000
 You also have your root cause report:
 compromised RDP credentials.

0:27:24.180000 --> 0:27:27.420000
 What do you do with this, given that
 credentials were compromised?

0:27:27.420000 --> 0:27:32.280000
 In terms of how it feeds into
 containment, eradication

0:27:32.280000 --> 0:27:39.120000
 and recovery: immediate password resets
 are performed, MFA is rolled

0:27:39.120000 --> 0:27:44.060000
 out, and there's an update made
 to the RDP audit policy.

0:27:44.060000 --> 0:27:47.620000
 And that could be different things like
 only accepting connections from

0:27:47.620000 --> 0:27:52.420000
 a particular IP or a particular system,
 only internal addresses, limiting

0:27:52.420000 --> 0:27:55.400000
 the number of failed authentication attempts,
 because the way the attacker

0:27:55.400000 --> 0:27:56.600000
 got in was through a password spray

0:27:56.600000 --> 0:27:58.680000
 attack conducted over a period of time.

0:27:58.680000 --> 0:28:02.080000
 And then forensic timeline
 and data access list.

0:28:02.080000 --> 0:28:07.200000
 So the database administrator verifies
 the database integrity and the

0:28:07.200000 --> 0:28:12.900000
 legal or compliance team evaluates breach
 notification duty if, you know,

0:28:12.900000 --> 0:28:19.100000
 data was accessed and/or modified at
 all, with regards to

0:28:19.100000 --> 0:28:23.860000
 the actual data within the database,
 the Microsoft SQL database, that

0:28:23.860000 --> 0:28:27.960000
 is, in the case of this example.
 And then, detection gaps.

0:28:27.960000 --> 0:28:32.000000
 So the output here is that

0:28:32.000000 --> 0:28:36.220000
 there weren't any PowerShell
 downgrade detection rules.

0:28:36.220000 --> 0:28:40.780000
 So, in terms of how it feeds containment,
 eradication and recovery: the detection

0:28:40.780000 --> 0:28:44.840000
 engineering team or personnel write
 new controls, and the SOC playbook,

0:28:44.840000 --> 0:28:49.180000
 if it exists, is updated.

0:28:49.180000 --> 0:28:53.840000
 So, hopefully this gives you a better
 understanding of

0:28:53.840000 --> 0:28:57.060000
 how you move from first response
 or the first five minutes.

0:28:57.060000 --> 0:29:03.460000
 I'll just call it first response or
 hot triage. How you move

0:29:03.460000 --> 0:29:07.080000
 from that, leveraging the outputs of
 first response, to determine whether

0:29:07.080000 --> 0:29:12.300000
 you need to perform deeper analysis and
 then based on the actual incident

0:29:12.300000 --> 0:29:16.640000
 itself, determining what type of artifacts
 or what type of evidence

0:29:16.640000 --> 0:29:20.980000
 you need to collect or acquire.

0:29:20.980000 --> 0:29:27.420000
 And, you know, how to then systematically
 go about analyzing them to learn

0:29:27.420000 --> 0:29:32.440000
 more about the root cause
 of the incident.

0:29:32.440000 --> 0:29:37.880000
 The scope, and so on. And all of
 this feeds into the

0:29:37.880000 --> 0:29:42.500000
 containment eradication
 and recovery phases.

0:29:42.500000 --> 0:29:45.820000
 So, the key takeaway from this particular
 video is the following.

0:29:45.820000 --> 0:29:50.160000
 So the bottom line is that deeper analysis
 or deep analysis or full analysis.

0:29:50.160000 --> 0:29:56.020000
 You may hear any one of those variants
 being used to refer to

0:29:58.700000 --> 0:30:05.440000
 this, but deeper analysis
 turns the quick

0:30:05.440000 --> 0:30:09.720000
 "we have malware on the following system"
 finding into a complete and accurate

0:30:05.440000 --> 0:30:09.720000
 narrative that includes key information
 like the entry vector or the initial

0:30:09.720000 --> 0:30:11.640000
 access vector which we found.

0:30:11.640000 --> 0:30:15.260000
 It was the HR laptop, I believe.

0:30:15.260000 --> 0:30:16.260000
 Let's confirm this.

0:30:16.260000 --> 0:30:18.460000
 I haven't been taking my notes.

0:30:18.460000 --> 0:30:22.820000
 Yeah, it was HR Laptop 02, so
 that's how they got in, and we know

0:30:22.820000 --> 0:30:27.760000
 the vector that they used to get
 in and when, in terms of timeline,

0:30:27.760000 --> 0:30:29.640000
 they got in.

0:30:29.640000 --> 0:30:35.580000
 We also get information like the lateral
 reach or the scope so lateral

0:30:35.580000 --> 0:30:37.940000
 movement or how many other
 systems are affected.

0:30:37.940000 --> 0:30:43.900000
 There were two other staging
 systems apart from HR laptop.

0:30:43.900000 --> 0:30:48.920000
 The HR laptop didn't have
 beacon.exe executed on it.

0:30:48.920000 --> 0:30:55.220000
 It was used as a pivot point or a jump
 box onto the key crown jewel system,

0:30:55.220000 --> 0:30:56.200000
 which is the SQL server.

0:30:56.200000 --> 0:31:00.160000
 So it looks like this intruder
 knew what they were doing.

0:31:00.160000 --> 0:31:05.240000
 So we found that, in terms of beacon.exe,
 the malware in this case, or the

0:31:05.240000 --> 0:31:11.180000
 binary. It was executed on
 the two staging servers.

0:31:11.180000 --> 0:31:16.060000
 Right. So we found out the scope of the
 incident with regards to how many

0:31:16.060000 --> 0:31:19.980000
 systems it affected and then
 credential exposure.

0:31:19.980000 --> 0:31:26.280000
 We didn't dive too much into this, apart
 from the password

0:31:26.280000 --> 0:31:30.060000
 spray attack and the credentials
 obtained as a result of the

0:31:30.060000 --> 0:31:31.520000
 password spray attack.

0:31:31.520000 --> 0:31:35.080000
 And then data impact, which again,
 you know, I didn't want to get into

0:31:35.080000 --> 0:31:39.580000
 because with this example,
 we'd have gone

0:31:39.580000 --> 0:31:44.300000
 into the nitty-gritty, and because it
 is theoretical,

0:31:44.300000 --> 0:31:49.040000
 I want to limit the complexity. And then,
 finally, concrete eradication

0:31:49.040000 --> 0:31:52.540000
 steps. So deep analysis allows you
 to get all of this information.

0:31:52.540000 --> 0:31:57.340000
 There's much more, like IOCs, and the
 fact that this can help improve

0:31:57.340000 --> 0:32:01.660000
 detection. You can, you know, pass along
 your findings to the detection

0:32:01.660000 --> 0:32:03.620000
 engineering teams, all that good stuff.

0:32:03.620000 --> 0:32:07.320000
 But really, from an incident response
 perspective, limited to the incident

0:32:07.320000 --> 0:32:08.680000
 you're analyzing.

0:32:08.680000 --> 0:32:11.060000
 This is the key information you want.

0:32:11.060000 --> 0:32:14.400000
 And then you want to be able to tell the
 containment eradication and recovery

0:32:14.400000 --> 0:32:19.720000
 team or teams. Again, you may be responsible
 for some of that, but you

0:32:19.720000 --> 0:32:21.760000
 know, let's just assume
 it's a separate process.

0:32:21.760000 --> 0:32:26.600000
 You need to be able to give that to
 the responsible individuals and have

0:32:26.600000 --> 0:32:31.460000
 them, you know, contain, eradicate
 and recover those affected systems.

0:32:31.460000 --> 0:32:35.760000
 So bottom line is that this ensures that
 containment is precise and recovery

0:32:35.760000 --> 0:32:38.400000
 is both safe and compliant.

0:32:38.400000 --> 0:32:42.740000
 Okay. So that brings us to the end
 of this video, barring a couple of

0:32:42.740000 --> 0:32:45.100000
 spelling mistakes, which
 I do apologize for.

0:32:45.100000 --> 0:32:51.460000
 I think I explained deep analysis or
 full analysis the way I wanted to.

0:32:51.460000 --> 0:32:55.700000
 And I'm really glad about the fact
 that I included the example because

0:32:55.700000 --> 0:32:59.280000
 I was sort of hesitant as to whether
 to include it now or later.

0:32:59.280000 --> 0:33:02.900000
 But with that being said, that's
 going to be it for this video.

0:33:02.900000 --> 0:33:05.100000
 And I will be seeing you
 in the next video.

