WEBVTT

0:00:03.620000 --> 0:00:08.820000
 Log analysis with Splunk investigating
 a Linux intrusion.

0:00:08.820000 --> 0:00:10.420000
 So welcome everyone.

0:00:10.420000 --> 0:00:13.660000
 In this video, we're finally going
 to get our hands dirty with regards

0:00:13.660000 --> 0:00:19.000000
 to endpoint analysis, specifically,
 SIEM-based log analysis.

0:00:19.000000 --> 0:00:22.700000
 And in this case, we're going to be, again,
 as the title suggests, investigating

0:00:22.700000 --> 0:00:25.800000
 a Linux intrusion or analyzing it.

0:00:25.800000 --> 0:00:31.000000
 We're actually going to be following
 the incident response process or

0:00:31.000000 --> 0:00:37.020000
 first response methodology or workflow
 that I laid out earlier with regards

0:00:37.020000 --> 0:00:41.740000
 to validation, scoping, et cetera.

0:00:41.740000 --> 0:00:47.340000
 And the idea of this video, which is
 practical, so this video has a lab

0:00:47.340000 --> 0:00:51.820000
 associated with it, is to, again, give
 you an idea as to what this looks

0:00:51.820000 --> 0:00:55.160000
 like, especially when you're
 dealing with an intrusion.

0:00:55.160000 --> 0:01:02.240000
 So to set this stage, I've created a
 brief here for you that sort of gives

0:01:02.240000 --> 0:01:08.360000
 you an idea of what you're supposed
 to do or what's the incident that

0:01:08.360000 --> 0:01:15.460000
 has been communicated or escalated to
 you entails in terms of your analysis.

0:01:15.460000 --> 0:01:20.580000
 So the case ID, again, this is
 just fictional, is 2025-104.

0:01:20.580000 --> 0:01:25.180000
 The asset in question, this is very
 important, is a system or hostname

0:01:25.180000 --> 0:01:30.640000
 called Linux 01, which is a
 production Ubuntu server.

0:01:30.640000 --> 0:01:35.320000
 And in terms of the handoff, the tier
 one analyst escalated this particular

0:01:35.320000 --> 0:01:41.020000
 ticket to you with the high
 severity category or label.

0:01:41.020000 --> 0:01:46.900000
 So pretty much you've been tasked with
 analyzing existing logs using Splunk

0:01:46.900000 --> 0:01:52.900000
 to validate the existence of suspicious
 activity on a Linux server.

0:01:52.900000 --> 0:01:58.760000
 So your objective is to perform endpoint
 analysis using a log analysis

0:01:58.760000 --> 0:02:04.700000
 specifically through the use of a SIEM,
 or the SIEM, which in this case

0:02:04.700000 --> 0:02:11.220000
 is Splunk. So you need to build a story
 and timeline of events with regard

0:02:11.220000 --> 0:02:15.640000
 to what has occurred on the server and
 use that information to plan out

0:02:15.640000 --> 0:02:17.460000
 the required next steps.

0:02:17.460000 --> 0:02:20.960000
 But in addition to that, it's very
 important to understand that again,

0:02:20.960000 --> 0:02:25.140000
 we are going to need to validate that
 there is suspicious or malicious

0:02:25.140000 --> 0:02:28.660000
 activity going on on that server.

0:02:28.660000 --> 0:02:29.740000
 So keep that in mind.

0:02:29.740000 --> 0:02:33.860000
 Again, the principles of first
 response still apply here.

0:02:33.860000 --> 0:02:38.200000
 So the next steps would be to perform
 searches in Splunk to identify and

0:02:38.200000 --> 0:02:42.460000
 or validate potential suspicious
 or malicious activity.

0:02:42.460000 --> 0:02:46.540000
 And then based on the data found, identify
 any gaps in logging that may

0:02:46.540000 --> 0:02:49.260000
 exist and then plan out the next steps.

0:02:49.260000 --> 0:02:54.520000
 You know, when we're talking about next
 steps, this would be, you know,

0:02:54.520000 --> 0:02:59.620000
 the actions to take with regards to, in
 this case, containment, eradication,

0:02:59.620000 --> 0:03:04.860000
 et cetera. So the bottom line is that these
 next steps can only be determined

0:03:04.860000 --> 0:03:08.000000
 by the results of the log analysis
 that we'll be performing.

0:03:08.000000 --> 0:03:11.120000
 So with that being said, that
 brings us to the lab demo.

0:03:11.120000 --> 0:03:14.320000
 So as I mentioned, this video
 has a lab associated with it.

0:03:14.320000 --> 0:03:17.200000
 It's going to be the lab
 just below this video.

0:03:17.200000 --> 0:03:19.200000
 The title of the lab is log analysis.

0:03:19.200000 --> 0:03:22.680000
 Now, before we get started, you will
 be provided with access to a Windows

0:03:22.680000 --> 0:03:26.680000
 system that has Splunk deployment
 already set up on it.

0:03:26.680000 --> 0:03:29.880000
 You just need to click on the Splunk
 icon on your desktop and it'll bring

0:03:29.880000 --> 0:03:36.140000
 it up. There are other systems, you
 know, apart from the Linux system

0:03:36.140000 --> 0:03:38.220000
 like a domain controller.

0:03:38.220000 --> 0:03:41.920000
 So all of the systems within the environment
 or that are pertinent to

0:03:41.920000 --> 0:03:45.180000
 this lab have been provided to you
 in the lab documentation as well as

0:03:45.180000 --> 0:03:46.500000
 the credentials.

0:03:46.500000 --> 0:03:49.900000
 But really for what we'll be doing,
 given that we'll be using Splunk,

0:03:49.900000 --> 0:03:51.720000
 there's really not much required.

0:03:51.720000 --> 0:03:54.620000
 So when you start off the lab, you'll
 get access to a Windows system.

0:03:54.620000 --> 0:03:58.380000
 Just open up Splunk, you know, or
 just click on the Splunk icon

0:03:58.380000 --> 0:04:02.680000
 on the desktop, or the thumbnail, and it
 will bring it up in a browser.

0:04:02.680000 --> 0:04:05.520000
 You don't need to authenticate and
 we can pretty much get started from

0:04:05.520000 --> 0:04:11.120000
 that point. In addition to that, the
 solutions for this lab are also outlined

0:04:11.120000 --> 0:04:14.080000
 in the solutions tab of this lab.

0:04:14.080000 --> 0:04:15.320000
 So keep that in mind.

0:04:15.320000 --> 0:04:17.640000
 With that being said, let's
 not waste any more time.

0:04:17.640000 --> 0:04:22.260000
 I'm going to start off my lab and I'll
 see you there in a couple of seconds.

0:04:22.260000 --> 0:04:25.700000
 All right. So I'm currently
 within the lab environment.

0:04:25.700000 --> 0:04:29.620000
 And as you can see, I've started up Splunk
 Enterprise here and we'll just

0:04:29.620000 --> 0:04:32.000000
 go into search and reporting.

0:04:32.000000 --> 0:04:40.000000
 And what we want to do first is just
 expand the timeline here to all time.

0:04:40.000000 --> 0:04:43.220000
 And I've, you know, already explained
 why that's the case, especially

0:04:43.220000 --> 0:04:45.220000
 in the lab environment.

0:04:45.220000 --> 0:04:48.740000
 And now we can begin our search.

0:04:48.740000 --> 0:04:52.580000
 So first things first, we want to,
 you know, just say we want to query

0:04:52.580000 --> 0:04:57.640000
 all indexes. So we'll use the, we'll
 say index equal to wildcard here.

0:04:57.640000 --> 0:05:02.340000
 And then we also want to limit the results
 to, and this is very important

0:05:02.340000 --> 0:05:04.420000
 to just the host in question.

0:05:04.420000 --> 0:05:09.880000
 So host in this case would
 be equal to Linux 01.

0:05:09.880000 --> 0:05:11.960000
 And we want to hit enter.

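For reference, the search being described at this point would look roughly like this in SPL. The exact host value is an assumption; use whatever hostname the lab documentation gives (for example, Linux 01 may be rendered as linux-01):

```
index=* host="Linux-01"
```

The index=* wildcard queries every index, and host= restricts the results to the endpoint in question, so only events from that system are returned.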
0:05:11.960000 --> 0:05:14.300000
 All right. Give that a few seconds.

0:05:14.300000 --> 0:05:18.440000
 Now we have a couple of fields
 here that are very interesting.

0:05:18.440000 --> 0:05:20.420000
 So there's some suspicious activities.

0:05:20.420000 --> 0:05:25.060000
 So the first thing I want to do, I'll
 click on the source type syslog.

0:05:25.060000 --> 0:05:28.420000
 That's the one we want or that's pertinent
 here because we have quite

0:05:28.420000 --> 0:05:31.040000
 a few other, you know, source types here.

0:05:31.040000 --> 0:05:34.720000
 And I've already explained what source
 types are, but syslog is the one

0:05:34.720000 --> 0:05:36.680000
 pertinent to Linux.

0:05:36.680000 --> 0:05:39.060000
 So I'm going to click on that.

0:05:39.060000 --> 0:05:40.740000
 That's going to filter
 it a little bit more.

0:05:40.740000 --> 0:05:42.280000
 It'll add it to your search here.

0:05:42.280000 --> 0:05:44.360000
 So source type is syslog.

0:05:44.360000 --> 0:05:50.020000
 And now if we take a look at the sources
 pertinent to syslog, syslog being

0:05:50.020000 --> 0:05:53.240000
 the source type, we have the following.

0:05:53.240000 --> 0:05:58.900000
 So the first thing we can do is take
 a look at the auth.log log, which

0:05:58.900000 --> 0:06:01.360000
 is where we can sort of track
 authentication attempts.

0:06:01.360000 --> 0:06:05.520000
 So if something is going on, something
 malicious is going on on this

0:06:05.520000 --> 0:06:09.160000
 Linux system, Linux 01, we probably
 want to start there.

0:06:09.160000 --> 0:06:12.060000
 So I'm going to add that to the search.

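Adding the source type and source narrows this further. The query at this stage would be roughly the following; the source path is an assumption, since Splunk commonly ingests Linux authentication events from /var/log/auth.log:

```
index=* host="Linux-01" sourcetype=syslog source="/var/log/auth.log"
```

Clicking a field value in the Splunk sidebar appends exactly this kind of filter to the search bar for you.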
0:06:12.060000 --> 0:06:12.720000
 And there we are.

0:06:12.720000 --> 0:06:17.860000
 So we can see, you know, there's quite
 a bit of activity, but let's take

0:06:17.860000 --> 0:06:22.560000
 a look at this here in terms of what
 we have, you know, the latest logs

0:06:22.560000 --> 0:06:26.540000
 we have. Now, of course, you'll see
 that these logs are from 2023, but

0:06:26.540000 --> 0:06:29.080000
 again, that's besides the point.

0:06:29.080000 --> 0:06:31.900000
 Let's take a look here.

0:06:31.900000 --> 0:06:34.300000
 Okay, interesting.

0:06:34.300000 --> 0:06:39.860000
 So we can see that there's some interesting
 stuff going on here with regards

0:06:39.860000 --> 0:06:45.600000
 to a session being closed
 for root and new admin.

0:06:45.600000 --> 0:06:46.920000
 That's very interesting.

0:06:46.920000 --> 0:06:49.660000
 We've now got a user called new admin.

0:06:49.660000 --> 0:06:51.500000
 That is mighty interesting.

0:06:51.500000 --> 0:06:58.200000
 So let's see if we can take, let's see
 if we can see anything suspicious.

0:06:58.200000 --> 0:07:01.760000
 So, malicious... ah, there's an account
 here called malicious account.

0:07:01.760000 --> 0:07:05.120000
 Okay, so let's add that in here.

0:07:05.120000 --> 0:07:07.960000
 Malicious account.

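Appending a bare term to the search makes Splunk do a free-text match across the raw events. Assuming the account appears in the logs under a name like the one shown on screen, the search would look roughly like:

```
index=* host="Linux-01" sourcetype=syslog source="/var/log/auth.log" "malicious account"
```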
0:07:07.960000 --> 0:07:14.300000
 Okay, let's see whether we have any activity
 that correlates to our search

0:07:14.300000 --> 0:07:18.240000
 here. So what we're doing, you know,
 searching all indexes at the host,

0:07:18.240000 --> 0:07:21.660000
 we're limiting it to the, you know,
 system in question or the endpoint

0:07:21.660000 --> 0:07:26.220000
 in question, the source type is going
 to be syslog and the source or the,

0:07:26.220000 --> 0:07:29.800000
 you know, the log that we're looking
 at, or that we're trying to analyze, is auth

0:07:29.800000 --> 0:07:32.720000
.log, and we saw something, you
 know: malicious account.

0:07:32.720000 --> 0:07:35.980000
 So I'm assuming this is a user here.

0:07:35.980000 --> 0:07:38.460000
 So let's see activity pertinent to this.

0:07:38.460000 --> 0:07:42.800000
 So we can see new session here.

0:07:42.800000 --> 0:07:49.300000
 Ah, okay. So we have an authentication
 attempt for malicious account from...

0:07:49.300000 --> 0:07:51.180000
 This is very interesting.

0:07:51.180000 --> 0:07:57.980000
 So it's authenticating with the host Linux
 01, this particular user malicious

0:07:57.980000 --> 0:08:02.000000
 account. But this IP, if you take a
 look at the lab documentation, this

0:08:02.000000 --> 0:08:07.660000
 IP belongs to the domain controller,
 which is a bit worrying here.

0:08:07.660000 --> 0:08:13.160000
 And if we scroll to the bottom here,
 we can see that there we are.

0:08:13.160000 --> 0:08:16.540000
 So this is when the user malicious
 account was created.

0:08:16.540000 --> 0:08:22.860000
 So we can see that the user that created
 malicious account was lab admin.

0:08:22.860000 --> 0:08:27.560000
 Okay. We can, you know,
 that's verified here.

0:08:27.560000 --> 0:08:32.740000
 Um, and then of course they ran the
 command, uh, adduser, which, you

0:08:32.740000 --> 0:08:36.420000
 know, by this, we can tell that lab
 admin was part of the sudo,

0:08:36.420000 --> 0:08:41.640000
 uh, group and can run, you know, um,
 commands or binaries

0:08:41.640000 --> 0:08:43.160000
 with root privileges.

0:08:43.160000 --> 0:08:47.240000
 So we can see that the new user was created.

0:08:47.240000 --> 0:08:50.860000
 Um, malicious account
 was added to a group.

0:08:50.860000 --> 0:08:54.880000
 So /etc/group, let's see if we can drill
 down here and see what's going

0:08:54.880000 --> 0:09:00.200000
 on here. So, um, malicious, gid 1004.

0:09:00.200000 --> 0:09:02.720000
 We don't need to confirm that right now.

0:09:02.720000 --> 0:09:06.640000
 Um, okay. Okay. So something
 is definitely going on.

0:09:06.640000 --> 0:09:08.960000
 So we've been able to validate that
 there's something going on.

0:09:08.960000 --> 0:09:12.520000
 We also have been able to determine
 or roughly determine, you know, the

0:09:12.520000 --> 0:09:16.440000
 scope. So we're dealing with, we know
 that Linux 01 is, there's obviously

0:09:16.440000 --> 0:09:22.900000
 something going on here, but domain
 controller 01 also seems to be, uh,

0:09:22.900000 --> 0:09:27.180000
 seems to have been affected by this
 in that it was used to auth.

0:09:27.180000 --> 0:09:30.440000
 Uh, it was the system used by the attacker
 to authenticate with Linux

0:09:30.440000 --> 0:09:34.440000
 01 as malicious account, which
 is very interesting.

0:09:34.440000 --> 0:09:37.040000
 So some lateral movement is going on.

0:09:37.040000 --> 0:09:41.840000
 Um, so we can see right over here,
 Linux useradd, new user, malicious

0:09:41.840000 --> 0:09:43.560000
 account, uid.

0:09:43.560000 --> 0:09:48.560000
 Uh, and then, uh, let's take
 a closer look at this here.

0:09:48.560000 --> 0:09:51.320000
 So, okay. Interesting.

0:09:51.320000 --> 0:09:53.600000
 So that appears to be, uh, okay.

0:09:53.600000 --> 0:09:58.680000
 So that's, we know that the user was added,
 and then they changed the password.

0:09:58.680000 --> 0:10:01.860000
 So we can see this here and by the
 way, all of this is coming from the

0:10:01.860000 --> 0:10:05.360000
 auth.log file, and then they changed
 the user information.

0:10:05.360000 --> 0:10:07.660000
 Very interesting.

0:10:07.660000 --> 0:10:12.520000
 So, um, now let's do what we would
 do in first, um, response, which is

0:10:12.520000 --> 0:10:17.860000
 sort of, um, using our SIEM, um, extend
 our timeline or sort of broaden

0:10:17.860000 --> 0:10:22.340000
 the timeline that we're searching for
 here specific to a log or an event

0:10:22.340000 --> 0:10:23.800000
 that's interesting.

0:10:23.800000 --> 0:10:28.680000
 But before we do that, uh, I can also
 see something very interesting here,

0:10:28.680000 --> 0:10:34.420000
 um, that I didn't, I didn't point
 out and that is that, uh, yeah.

0:10:34.420000 --> 0:10:36.260000
 So it was actually not lab admin.

0:10:36.260000 --> 0:10:40.800000
 It was new admin that was trying to create
 malicious account here, or created

0:10:40.800000 --> 0:10:45.540000
 it. So let's actually, um, new admin.

0:10:45.540000 --> 0:10:46.840000
 That's very interesting.

0:10:46.840000 --> 0:10:49.800000
 So let's see new admin here.

0:10:49.800000 --> 0:10:52.480000
 Just going to search for that.

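The pivot to the second account is the same search shape with a different term (again, the exact username string is whatever the logs actually show):

```
index=* host="Linux-01" sourcetype=syslog source="/var/log/auth.log" "new admin"
```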
0:10:52.480000 --> 0:10:54.980000
 Uh, huh. Okay. There we go.

0:10:54.980000 --> 0:10:58.120000
 Very nice. So this new admin.

0:10:58.120000 --> 0:11:01.780000
 Okay. Yeah. So this looks like
 another user that was created.

0:11:01.780000 --> 0:11:08.300000
 So we have two users that were created
 by the intruder, and they did

0:11:08.300000 --> 0:11:11.940000
 so using the account lab admin,
 which is a standard account.

0:11:11.940000 --> 0:11:13.640000
 That's not a malicious account.

0:11:13.640000 --> 0:11:15.340000
 We can see, yeah, there we are.

0:11:15.340000 --> 0:11:17.460000
 So lab admin created new admin.

0:11:17.460000 --> 0:11:22.660000
 And then this was added to
 group, um, right over here.

0:11:22.660000 --> 0:11:24.940000
 And then, um, yeah.

0:11:24.940000 --> 0:11:31.660000
 Okay. New group, uh, new admin,
 right over here, useradd.

0:11:31.660000 --> 0:11:34.160000
 Okay. Password changed.

0:11:34.160000 --> 0:11:38.760000
 And then, um, new admin, you
 know, elevated privileges.

0:11:38.760000 --> 0:11:42.600000
 Uh, and, uh, so we can see now,
 actually, hold on, lab admin.

0:11:42.600000 --> 0:11:48.660000
 Yeah. So they, this was the, uh, the command
 they used to create new admin.

0:11:48.660000 --> 0:11:54.500000
 Okay. Um, and then over here lab
 admin switched to new admin.

0:11:54.500000 --> 0:11:57.180000
 Okay. So they switched to that new user.

0:11:57.180000 --> 0:12:01.340000
 And then this is where they
 added malicious account.

0:12:01.340000 --> 0:12:05.880000
 So once they, you know, got access to
 the account they created new admin,

0:12:05.880000 --> 0:12:08.440000
 we can see there we are.

0:12:08.440000 --> 0:12:12.680000
 So they are authenticated
 as new admin to Linux 01.

0:12:12.680000 --> 0:12:15.420000
 They then created malicious account.

0:12:15.420000 --> 0:12:18.400000
 And then modified, uh, hold on.

0:12:18.400000 --> 0:12:19.480000
 So this is new admin.

0:12:19.480000 --> 0:12:22.300000
 Yeah. They modified the SSH config,
 which is interesting.

0:12:22.300000 --> 0:12:26.960000
 Now if we go back to our previous search,
 which was malicious account,

0:12:26.960000 --> 0:12:30.660000
 that's where we actually verified
 some of the other activity.

0:12:30.660000 --> 0:12:38.520000
 Let me just hit enter here with regards
 to authenticating, um, you know,

0:12:38.520000 --> 0:12:41.820000
 how they authenticated with
 Linux 01 as malicious account.

0:12:41.820000 --> 0:12:45.820000
 We can see it comes from the IP address
 of the domain controller, which

0:12:45.820000 --> 0:12:48.060000
 is quite interesting.

0:12:48.060000 --> 0:12:53.400000
 Okay. So, um, we know a few things.

0:12:53.400000 --> 0:12:58.740000
 We have, um, in terms of points of
 compromise, there's lab admin, new

0:12:58.740000 --> 0:13:00.540000
 admin, and malicious account.

0:13:00.540000 --> 0:13:05.920000
 And we know the system DC 01 is also
 involved in this suspicious activity.

0:13:05.920000 --> 0:13:13.520000
 So, um, what we can do now is let's take
 this authentication attempt here.

0:13:13.520000 --> 0:13:15.900000
 So accepted password.

0:13:15.900000 --> 0:13:19.620000
 You can see, this is from Linux 01:
 accepted password for malicious account

0:13:19.620000 --> 0:13:21.840000
 from the domain controller here.

0:13:21.840000 --> 0:13:26.560000
 So if we just drill down and we go to
 time, so again, principles of first

0:13:26.560000 --> 0:13:29.820000
 response, um, we have
 time right over here.

0:13:29.820000 --> 0:13:34.820000
 We're going to say nearby events
 plus minus five minutes.

0:13:34.820000 --> 0:13:38.020000
 Apply this in here.

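The "nearby events" option in the UI just rewrites the search's time range around the selected event. The equivalent hand-written search, with placeholders standing in for five minutes either side of the authentication event's timestamp, would be roughly:

```
index=* host="Linux-01" sourcetype=syslog source="/var/log/auth.log"
    earliest=<event_time - 5m> latest=<event_time + 5m>
```

This is the first-response principle in search form: anchor on one interesting event, then widen the window around it.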
0:13:38.020000 --> 0:13:40.160000
 Okay. Interesting.

0:13:40.160000 --> 0:13:45.940000
 All right. So, uh, let's
 see what we have here.

0:13:45.940000 --> 0:13:49.740000
 Okay. So plus five.

0:13:49.740000 --> 0:13:59.980000
 Um, okay. So let's get rid of this
 malicious account because we don't

0:13:59.980000 --> 0:14:03.460000
 want to limit it just to
 that particular user.

0:14:03.460000 --> 0:14:09.100000
 Um, all right. Let's
 broaden this a bit.

0:14:09.100000 --> 0:14:14.700000
 So yeah, we can still
 see, uh, accepted password.

0:14:14.700000 --> 0:14:18.200000
 So plus five, let's expand
 this a little bit.

0:14:18.200000 --> 0:14:21.940000
 Let's get rid of the source auth.log
 to see what other activity they were

0:14:21.940000 --> 0:14:24.060000
 doing, apart from authentication, right?

0:14:24.060000 --> 0:14:25.700000
 Remember, that's very important.

0:14:25.700000 --> 0:14:29.480000
 Logs can be used to track authentication,
 process creation, file creation,

0:14:29.480000 --> 0:14:34.600000
 etc. So let's limit it just to the source
 type syslog with the same plus

0:14:34.600000 --> 0:14:37.140000
 minus five minutes.

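Dropping the source filter while keeping the source type and the same window broadens the view to every syslog-sourced event in that time frame, something like:

```
index=* host="Linux-01" sourcetype=syslog
    earliest=<event_time - 5m> latest=<event_time + 5m>
```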
0:14:37.140000 --> 0:14:45.400000
 Um, that we, uh, you know, configured
 in our time filter, um, or

0:14:45.400000 --> 0:14:47.280000
 our time frame as it were.

0:14:47.280000 --> 0:14:49.560000
 So, uh, let's see.

0:14:49.560000 --> 0:14:54.100000
 Okay. This is just a syslog.

0:14:54.100000 --> 0:15:01.220000
 Uh, huh. So syslog, oh, syslog.

0:15:01.220000 --> 0:15:05.560000
 Uh, let's see. Um, okay.

0:15:05.560000 --> 0:15:08.500000
 Session opened here.

0:15:08.500000 --> 0:15:10.840000
 Let's see. Um, okay.

0:15:10.840000 --> 0:15:13.200000
 This is just standard stuff.

0:15:13.200000 --> 0:15:17.820000
 Um, let's see in terms of source type.

0:15:17.820000 --> 0:15:24.020000
 Let's try and, uh, what other source
 types do we have in here?

0:15:24.020000 --> 0:15:25.960000
 Let me get rid of this.

0:15:25.960000 --> 0:15:28.960000
 My bad. Let's leave that as it is.

0:15:28.960000 --> 0:15:31.540000
 Let's go ahead and get rid of that.

0:15:31.540000 --> 0:15:35.600000
 So just, uh, the host Linux 01, it's
 still maintaining that timeline.

0:15:35.600000 --> 0:15:38.160000
 You can see right over here.

0:15:38.160000 --> 0:15:45.240000
 Um, if I just hover over it, we used
 the authentication, um, or the log

0:15:45.240000 --> 0:15:48.940000
 that sort of outlined the authentication, uh, you
 know, we used the initial authentication

0:15:48.940000 --> 0:15:54.540000
 as malicious account, I believe,
 um, as the starting point.

0:15:54.540000 --> 0:15:57.660000
 So plus five, uh, minus five.

0:15:57.660000 --> 0:16:03.460000
 So anytime, you know, five minutes prior
 to that, five minutes after that.

0:16:03.460000 --> 0:16:07.840000
 And right over here, we can
 see interesting Linux 01.

0:16:07.840000 --> 0:16:11.660000
 This is coming from the
 source, uh, .viminfo.

0:16:11.660000 --> 0:16:12.880000
 Oh, interesting.

0:16:12.880000 --> 0:16:16.280000
 So we have a malware
 file right over here.

0:16:16.280000 --> 0:16:17.860000
 Might be interesting.

0:16:17.860000 --> 0:16:22.940000
 So, um, because, you know, this
 is the .viminfo in malicious account's home.

0:16:22.940000 --> 0:16:24.780000
 Let's see. Okay.

0:16:24.780000 --> 0:16:27.940000
 The .viminfo file was generated.

0:16:27.940000 --> 0:16:34.660000
 Um, all right. Okay.

0:16:34.660000 --> 0:16:38.080000
 So we can see execution of
 a command: write and quit.

0:16:38.080000 --> 0:16:39.500000
 So they modified it.

0:16:39.500000 --> 0:16:40.780000
 They, okay, there we are.

0:16:40.780000 --> 0:16:42.240000
 So malicious content.

0:16:42.240000 --> 0:16:42.640000
 This is malware.

0:16:42.640000 --> 0:16:47.420000
 So again, this was just added to the
 file just to demonstrate, you know,

0:16:47.420000 --> 0:16:51.480000
 what this would look like, but it's
 saved in the home of malicious account,

0:16:51.480000 --> 0:16:54.340000
 um, malware file.

0:16:54.340000 --> 0:16:58.480000
 And right over here, we're able to
 determine that there's some malware

0:16:58.480000 --> 0:17:00.500000
 deployed on Linux 01.

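If you wanted to go straight to these editor artifacts, you could filter on the .viminfo source directly. The wildcarded path here is an assumption about how the source is named in this lab:

```
index=* host="Linux-01" sourcetype=syslog source="*.viminfo"
```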
0:17:00.500000 --> 0:17:02.000000
 Mighty interesting.

0:17:02.000000 --> 0:17:06.700000
 Okay. So, uh, that's pretty much all
 I wanted to highlight now to just

0:17:06.700000 --> 0:17:09.640000
 summarize what I did here.

0:17:09.640000 --> 0:17:12.620000
 Um, firstly, you know, what we did

0:17:12.620000 --> 0:17:16.240000
 is we configured the source type and
 the source to look for authentication

0:17:16.240000 --> 0:17:19.180000
 attempts pertinent to the host Linux 01.

0:17:19.180000 --> 0:17:23.400000
 We found creation of a couple of accounts,
 the first was new admin and

0:17:23.400000 --> 0:17:25.300000
 then malicious account.

0:17:25.300000 --> 0:17:32.080000
 We also saw an authentication attempt to
 Linux 01 from the domain controller.

0:17:32.080000 --> 0:17:38.240000
 So DC 01 via or using the malicious
 account user, which was successful.

0:17:38.240000 --> 0:17:41.100000
 So we've found out quite
 a lot of information.

0:17:41.100000 --> 0:17:45.080000
 And with that being said, now, um, this
 brings us to the end of the practical

0:17:45.080000 --> 0:17:48.860000
 demonstration. I'm going to switch
 over back to the slides and we can

0:17:48.860000 --> 0:17:53.420000
 sort of analyze what we found and then
 take a look at the next steps with

0:17:53.420000 --> 0:17:56.140000
 regards to what we can do
 as incident responders.

0:17:56.140000 --> 0:17:59.520000
 Now, if you've absorbed everything I've
 taught you, then you know, this

0:17:59.520000 --> 0:18:02.500000
 is where we would get into, you
 know, proper endpoint analysis.

0:18:02.500000 --> 0:18:07.160000
 So actually getting onto the system
 here, analyzing this malware, stuff

0:18:07.160000 --> 0:18:10.000000
 like this, still performing
 our validation.

0:18:10.000000 --> 0:18:14.440000
 We'd also need to include DC 01 as part
 of our investigation, which we're

0:18:14.440000 --> 0:18:15.380000
 not going to do here.

0:18:15.380000 --> 0:18:19.720000
 This is just to give you, you know,
 high level overview, but, you know,

0:18:19.720000 --> 0:18:25.000000
 quite tangible or practical experience
 with what a real intrusion would

0:18:25.000000 --> 0:18:30.740000
 look like, and how you, um, you know,
 pick a particular,

0:18:30.740000 --> 0:18:32.900000
 you see like an IP address
 and then pivot.

0:18:32.900000 --> 0:18:36.360000
 Use that as your pivot point.

0:18:36.360000 --> 0:18:41.920000
 So we, you know, we have also, you may
 or may not have been noting this

0:18:41.920000 --> 0:18:46.280000
 down. We have also sort of built a timeline
 of events starting obviously

0:18:46.280000 --> 0:18:50.400000
 with, you know, the lab admin
 creating new admin.

0:18:50.400000 --> 0:18:57.340000
 And then, it appears, this was at 6:02, you
 know, there's creation of the malware

0:18:57.340000 --> 0:19:02.160000
 file. We can of course get into execution
 of it, but that's, you know,

0:19:02.160000 --> 0:19:04.400000
 not within the scope of what
 I wanted to demonstrate.

0:19:04.400000 --> 0:19:07.640000
 In any case, let's switch back to the
 slides and we can go through the,

0:19:07.640000 --> 0:19:12.220000
 you know, what, what we've found
 and then the next steps.

0:19:12.220000 --> 0:19:18.060000
 All right. So let's take a look at the
 timeline of events based on what

0:19:18.060000 --> 0:19:22.900000
 we found. So based on the analysis of
 logs or log analysis, we performed

0:19:22.900000 --> 0:19:26.500000
 with Splunk. We have put together
 an interesting timeline of events.

0:19:26.500000 --> 0:19:33.280000
 So lab admin, which, you know, it's fair
 to say was compromised, possibly

0:19:33.280000 --> 0:19:36.500000
 compromised, created a
 user called new admin.

0:19:36.500000 --> 0:19:41.880000
 Lab admin then switched to, you know,
 switched users using su as you saw

0:19:41.880000 --> 0:19:46.320000
 there to act, you know, they switched
 to new admin, the user they just

0:19:46.320000 --> 0:19:52.700000
 created. And then new admin then created
 another account called malicious

0:19:52.700000 --> 0:19:55.540000
 account, which we saw first.

0:19:55.540000 --> 0:20:01.940000
 And then malicious account logged in
 from DC 01 using SSH, which we saw

0:20:01.940000 --> 0:20:06.120000
 there, that was tracked in the auth.log
 file, which means

0:20:06.120000 --> 0:20:11.000000
 that DC 01 is possibly compromised, because
 how could, you know, an attacker

0:20:11.000000 --> 0:20:17.720000
 SSH into Linux 01 from the domain controller
 without having access to

0:20:17.720000 --> 0:20:18.660000
 it or some form of access.

0:20:18.660000 --> 0:20:23.140000
 And they did so via SSH, which is
 also quite interesting, or opens up

0:20:23.140000 --> 0:20:29.040000
 the, it opens up the box, you know,
 in terms of what may have happened

0:20:29.040000 --> 0:20:33.440000
 to DC 01. So, you know, we saw
 that we're sort of expanding

0:20:33.440000 --> 0:20:38.440000
 our scope. In any case, malicious account
 also created or uploaded some

0:20:38.440000 --> 0:20:40.740000
 form of malware onto Linux 01.

0:20:40.740000 --> 0:20:45.740000
 They most likely, you know, just
 wrote it or pasted it in.

0:20:45.740000 --> 0:20:48.580000
 And we saw that they used Vim to do so.

0:20:48.580000 --> 0:20:56.160000
 We also saw the, you know, the path of
 the file, so where it was actually

0:20:56.160000 --> 0:21:01.240000
 saved, and it was saved in the home directory
 of the newly created user account

0:21:01.240000 --> 0:21:02.580000
 called malicious account.

0:21:02.580000 --> 0:21:06.140000
 Okay. Now, this is not really a timeline
 of events in the sense that we

0:21:06.140000 --> 0:21:07.540000
 have timestamped everything.

0:21:07.540000 --> 0:21:11.660000
 But again, we don't want to start, you
 know, with anything too complex.

0:21:11.660000 --> 0:21:15.460000
 But hopefully this is giving you an
 idea as to what it's really like and

0:21:15.460000 --> 0:21:19.380000
 the info you should be gathering
 or documenting.

0:21:19.380000 --> 0:21:21.920000
 In any case, you then have the next step.

0:21:21.920000 --> 0:21:26.980000
 So, you know, what we've found thus
 far pretty much, as said, opens up

0:21:26.980000 --> 0:21:31.220000
 the field in terms of some additional
 deeper analysis that can be performed

0:21:31.220000 --> 0:21:37.880000
 that's specific to, you know, particular
 IOCs, you know, for example, the

0:21:37.880000 --> 0:21:40.100000
 malware, but also other
 systems like DC 01.

0:21:40.100000 --> 0:21:45.620000
 But what we can pass along in addition
 to that is, you know, specific

0:21:45.620000 --> 0:21:52.120000
 to containment or eradication,
 specifically, is the following.

0:21:52.120000 --> 0:21:57.020000
 So, first thing that's obvious is the
 lab admin, new admin and malicious

0:21:57.020000 --> 0:21:58.700000
 accounts must be disabled.

0:21:58.700000 --> 0:22:02.660000
 Secondly, we need to evaluate or analyze
 which of those accounts are legitimate

0:22:02.660000 --> 0:22:07.240000
 and are needed and change the passwords
 and secure as necessary.

0:22:07.240000 --> 0:22:10.400000
 So, this would be, you know,
 containment as it were.

0:22:10.400000 --> 0:22:15.120000
 Further analysis would be, you know,
 if possible, take DC 01 offline

0:22:15.120000 --> 0:22:21.160000
 for further investigation, you know,
 depending on the business criticality

0:22:21.160000 --> 0:22:24.420000
 of that system, which in this case,
 given that that's a common host name

0:22:24.420000 --> 0:22:28.420000
 for a domain controller, it's probably
 wise to take it offline or to air

0:22:28.420000 --> 0:22:29.900000
 gap it as it were.

0:22:29.900000 --> 0:22:33.980000
 So, and again, I've sort of highlighted
 why this is important.

0:22:33.980000 --> 0:22:37.400000
 This is only possible if there are
 redundant domain controllers that

0:22:37.400000 --> 0:22:40.520000
 can sort of, you know,
 take over operations.

0:22:40.520000 --> 0:22:44.000000
 If we can disconnect it from the network,
 then immediate further investigation

0:22:44.000000 --> 0:22:47.960000
 is required, which is something
 we would be doing still.

0:22:47.960000 --> 0:22:52.060000
 But you need to pass along, you know,
 these containment and eradication

0:22:52.060000 --> 0:22:57.740000
 intelligence or instructions
 to the relevant team.

0:22:57.740000 --> 0:23:05.000000
 If you're handling that, then, you
 know, you need to determine when to

0:23:05.000000 --> 0:23:11.520000
 perform these actions, but continuing
 on, we should also do the same with

0:23:11.520000 --> 0:23:14.860000
 Linux 01, since it is possibly
 compromised with malware.

0:23:14.860000 --> 0:23:17.540000
 So, these are those key decisions.

0:23:17.540000 --> 0:23:22.360000
 And of course, a lot of this still
 warrants a lot of further analysis.

0:23:22.360000 --> 0:23:26.320000
 But remember, what we're doing is something
 very specific and nuanced here

0:23:26.320000 --> 0:23:30.660000
 in terms of techniques, techniques for
 what, techniques for endpoint log

0:23:30.660000 --> 0:23:35.900000
 analysis, you know, in this case, you
 know, using a SIEM and, you know,

0:23:35.900000 --> 0:23:39.540000
 we're taking a look at log analysis
 in the context of endpoint analysis.

0:23:39.540000 --> 0:23:42.220000
 So, these are the types
 of specialized skills.

0:23:42.220000 --> 0:23:46.700000
 And this is the structured understanding
 that you should have of, you

0:23:46.700000 --> 0:23:48.880000
 know, the types of endpoint analysis.

0:23:48.880000 --> 0:23:58.120000
 And then with regards to the types
 of endpoint analysis, in this case,

0:23:58.120000 --> 0:24:02.840000
 you know, we're dealing with log analysis
 and we've taken a look at how

0:24:02.840000 --> 0:24:04.680000
 to do it with a SIEM, right?

0:24:04.680000 --> 0:24:09.160000
 So, with that being said, that's
 going to be it for this video.

0:24:09.160000 --> 0:24:12.880000
 Hopefully you enjoyed that and you've
 got, you know, your taste of what

0:24:12.880000 --> 0:24:19.980000
 this process is like with regards to receiving
 or having an incident escalated

0:24:19.980000 --> 0:24:23.100000
 to your ticket or a case, whatever
 you want to call it and using that

0:24:23.100000 --> 0:24:25.160000
 information to perform
 your investigation.

0:24:25.160000 --> 0:24:30.940000
 Now, as I said, if you've absorbed
 everything that we've covered, you

0:24:30.940000 --> 0:24:35.120000
 know, to this point in the course,
 then this should have made a lot of

0:24:35.120000 --> 0:24:39.520000
 sense to you because you sort of understood,
 okay, first response, we

0:24:39.520000 --> 0:24:44.240000
 need to broaden the timeline once you've
 identified something interesting.

0:24:44.240000 --> 0:24:47.900000
 This is where the concept of pivoting
 or using a particular log as a pivot

0:24:47.900000 --> 0:24:56.040000
 point to, you know, detect
 or really identify other malicious

0:24:56.040000 --> 0:25:00.940000
 activity, broaden your scope, documentation,
 you know, we also explored

0:25:00.940000 --> 0:25:05.420000
 validation. And then now from this point,
 you know, this prompts further

0:25:05.420000 --> 0:25:10.660000
 analysis, but you also saw the
 process of documenting your findings,

0:25:10.660000 --> 0:25:14.200000
 timeline construction, and
 all that good stuff.

0:25:14.200000 --> 0:25:18.340000
 So hopefully everything's tied together
 in this video, which again, if

0:25:18.340000 --> 0:25:20.960000
 it did, I hope you have
 a smile on your face.

0:25:20.960000 --> 0:25:24.260000
 In any case, that's going
 to be it for this video.

0:25:24.260000 --> 0:25:26.440000
 And I will be seeing you
 in the next video.
