






















































Hacking Exposed Computer Forensics Blog


































































































































Daily Blog #703: Looking back at AWS EBS Direct Block access API










 

 Hello Reader,

         It's been a while! I've ... been busy? It's hard to describe the last 16 months away from this daily blog other than as a series of career-defining events in my 'leveling up' at KPMG, along with the launch of my new SANS class FOR509: Enterprise Cloud Incident Response and Forensics!

 

Oh, and there was that collab with CrowdStrike (https://www.youtube.com/watch?v=7DHb1gzF5o4).


So yeah, little busy. 


But! That doesn't mean I haven't wanted to go back to posting, sharing, and pushing myself to learn more. It should come as no surprise that a lot of what I've been researching has been less operating system focused and more cloud focused. So today let's talk about interesting ways to access snapshots.


This whole discussion came out of teaching FOR509 in DC (shout out to my CDI students!). You see, in Azure, when you make a snapshot you can download it, copy it, or access it as a VHD without restoring it to another volume (I'll post about this tomorrow), but within AWS there is no way to download a snapshot as an image through any of the standard EBS calls. (If you know of a way to do this in AWS other than direct block access, please let me know!)


In discussing this with others, they brought up an interesting use case for the direct block access API I hadn't thought of! While I was nerding out thinking about using this API as a rapid triage tool to read file systems without accessing entire snapshots, others have realized you can access every block to create an image of the snapshot! This would allow you to fully download a snapshot (either in the cloud or on prem) and store it in a dd file without having to restore it. I'll be writing a script to do this and post it here on the blog (it would make a good blog post!) and put it into our FOR509 GitHub (https://github.com/dlcowen/sansfor509), which contains all the cloud provider related scripts we are creating for the class.
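For the curious, here's a rough sketch of what that script could look like, using boto3's EBS direct APIs (list_snapshot_blocks and get_snapshot_block are the real calls; the snapshot ID and output path are placeholders, and this is an untested outline rather than the finished script):

```python
# Sketch: rebuild a dd image from an EBS snapshot via the direct
# block access APIs. Snapshots only list blocks that contain data,
# so anything we never write stays as sparse zeros in the output.

def block_offset(block_index, block_size):
    """Byte offset of a snapshot block inside the reconstructed image."""
    return block_index * block_size

def download_snapshot_to_dd(snapshot_id, out_path):
    import boto3  # assumed installed and configured with credentials

    ebs = boto3.client("ebs")
    with open(out_path, "wb") as img:
        token = None
        while True:
            kwargs = {"SnapshotId": snapshot_id}
            if token:
                kwargs["NextToken"] = token
            resp = ebs.list_snapshot_blocks(**kwargs)
            block_size = resp["BlockSize"]  # typically 524288 (512 KiB)
            for blk in resp["Blocks"]:
                data = ebs.get_snapshot_block(
                    SnapshotId=snapshot_id,
                    BlockIndex=blk["BlockIndex"],
                    BlockToken=blk["BlockToken"],
                )["BlockData"].read()
                img.seek(block_offset(blk["BlockIndex"], block_size))
                img.write(data)
            token = resp.get("NextToken")
            if not token:
                break
```

Each get_snapshot_block response also carries a Checksum worth verifying, and list_snapshot_blocks returns VolumeSize if you want to extend the image file out to the full volume length.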


So I hope you found this interesting, I certainly did, and that you find some use for this in your investigations!


Read more about the AWS Direct Block Access API here: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-accessing-snapshot.html

Also Read: Daily Blog #702



















Daily Blog #702: Sunday Funday 8/9/20 - Extensible Storage Engine (ESE) database Challenge










Hello Reader,

           It's been a while! I wish I could tell you what all I've been up to, but needless to say real investigations got so crazy between May and August that I couldn't even find time to blog without losing even more sleep. So let's pick up where we left off with a Sunday Funday! This week we address a database format we are seeing more and more as developers realize what a useful alternative it is to SQLite on a Windows system. This week is all about ESE databases!



The Prize:


$100 Amazon Giftcard

And an appearance on the following week's Forensic Lunch!

The Rules:

  1. You must post your answer before Friday 8/14/20 7PM CDT (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dlcowen@gmail.com. Please state in your email whether you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:

When looking at Extensible Storage Engine (ESE) database artifacts (also known as 'Jet Blue' or .edb files):

1. Recover deleted records from an ESE database, either from a live database or from the transaction journal

2. Determine which applications other than IE, Search Index, and SRUM make use of it

3. Determine how to avoid data loss when copying it from a live system
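As a head start on that last one: a database copied from a live system will usually be in a "dirty shutdown" state, and the ESE file header records this. Here's a minimal sketch, assuming the header layout documented in Joachim Metz's libesedb project (signature at offset 4, database state at offset 52, with 2 meaning dirty shutdown and 3 clean); verify those offsets against the format documentation before relying on this:

```python
import struct

ESE_SIGNATURE = 0x89ABCDEF  # EDB header signature per libesedb's format notes
STATES = {1: "just created", 2: "dirty shutdown", 3: "clean shutdown"}

def ese_state(header):
    """Return the shutdown state recorded in an ESE database file header."""
    if struct.unpack_from("<I", header, 4)[0] != ESE_SIGNATURE:
        raise ValueError("not an ESE database header")
    state = struct.unpack_from("<I", header, 52)[0]
    return STATES.get(state, "unknown (%d)" % state)
```

A dirty copy generally needs its transaction logs replayed (esentutl /r) before most parsers will accept it, which is exactly where collecting the journal files alongside the database pays off.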

Also Read: Daily Blog #701 



















Daily Blog #701: Magnet Virtual Summit CTF 2020 Results










Hello Reader,
         If you watched the live commentary, boy were you in for a treat! So much so that I deleted the video afterwards. No reason to let that hot mess live on forever.

What will live on forever though is the winners of the CTF!


Congratulations Evangelos aka theAtropos4n6 for winning 1st place! We will hopefully see you on the Forensic Lunch Friday!

In second place was Oleg Skulkin aka 0x136, with the long-time CTF feud between evandrix of Singapore and Adam Harris aka harrisonamj going to evandrix this time for the 3rd place finish.

Also Read: Daily Blog #700


















Daily Blog #700: New version of Plaso












Hello Reader,
          Ryan Benson's #130 Daily DFIR tweet mentioned something I think is interesting:





He pointed out that there is a new version of Plaso out, which by itself is good news, but what's interesting is that they have now switched to libfsntfs for NTFS parsing.

Why is that interesting?

Every previous version of Plaso and other DFVFS-backed tools made use of TSK's native support for NTFS. Libfsntfs is Metz's NTFS library, written to handle all of the edge-case NTFS conditions he found, provide faster speeds, and extend what is possible with support for things like case-sensitive entries, which in NTFS is interesting all by itself.
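As one small, illustrative example of the low-level bookkeeping any NTFS parser has to get right: NTFS packs every file reference into 64 bits, a 48-bit MFT entry number plus a 16-bit sequence number, and the split has to be exact or you end up pointing at the wrong entry:

```python
def decode_file_reference(ref):
    """Split a 64-bit NTFS file reference into (MFT entry, sequence number)."""
    return ref & 0x0000FFFFFFFFFFFF, ref >> 48
```

The sequence number increments each time an MFT entry is reused, which is how parsers detect references that point at since-deleted, recycled entries.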

I think we should have a look at this library Wednesday. Why not tomorrow? Tomorrow is when we do Magnet Virtual CTF commentary live on YouTube!





















Daily Blog #699: Sunday Funday 5/10/20 - Auditd Challenge












Hello Reader,



       We've bounced from Windows to OSX and around the cloud. What we haven't done, though, is venture into the deep waters of Linux forensics. Today let's help out our fellow examiners who are in the trenches with few landmarks to guide their way in the Linux forensics wasteland with this week's challenge focused on Auditd.




The Prize:


$100 Amazon Giftcard

And an appearance on the following week's Forensic Lunch!



The Rules:




  1. You must post your answer before Friday 5/15/20 7PM CDT (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dlcowen@gmail.com. Please state in your email whether you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post








The Challenge:


On a Linux system with Auditd enabled, answer the following questions:


1. What new data sources does Auditd create?

2. What tools support the data?

3. What can an examiner determine from Auditd?

4. How long is the data retained?
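If you want to poke at the format while building an answer, here's a minimal, hypothetical sketch of pulling apart a single audit.log record (the usual type=... msg=audit(epoch.millis:event_id): key=value layout). Real events span multiple records that share an event ID, which this deliberately ignores:

```python
import re

# One audit.log line looks roughly like:
# type=SYSCALL msg=audit(1589126400.123:4567): syscall=59 exe="/usr/bin/id" ...
AUDIT_RE = re.compile(r"type=(\S+) msg=audit\((\d+)\.(\d+):(\d+)\): (.*)")

def parse_audit_record(line):
    """Parse one auditd record into type, timestamp, event id, and fields."""
    m = AUDIT_RE.match(line)
    if not m:
        return None
    rtype, secs, millis, event_id, rest = m.groups()
    # Naive key=value split; values containing spaces would need real quoting.
    fields = dict(pair.split("=", 1) for pair in rest.split() if "=" in pair)
    return {
        "type": rtype,
        "timestamp": float("%s.%s" % (secs, millis)),
        "event_id": int(event_id),
        "fields": fields,
    }
```

Grouping records by that event ID is what lets you stitch a SYSCALL record to its matching PATH and CWD records for a full picture of one action.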



















Daily Blog #698: Solution Saturday 5/9/20 - Updating a Previous Challenge on KnowledgeC












Hello Reader,

         It was a week of returning champs coming to see who could win, and this week that was Oleg Skulkin, who did some solid work updating a previous challenge on KnowledgeC. So congrats Oleg, another win for the board!



The Challenge:

KnowledgeC on iOS is a jam-packed knowledge resource, but on OSX it seems to be less used.

1. What does each table in the KnowledgeC database correspond to, activity-wise?

2. What data is logged in each table?

3. What data is not logged?

4. Is there a similar data source that would fill in the gaps?



The Winning Answer:

Oleg Skulkin






Know Your KnowledgeC






I use macOS devices quite often, for example to read blogs and for general web surfing, but I don't look at them from a forensic perspective very often, so Sunday Funday gives me a good opportunity to do it.






KnowledgeC is a fairly well-known source of forensic artifacts; many forensic tools even extract relevant data from it automatically (e.g. Magnet AXIOM, and Plaso also has a parser for it, mac_knowledgec).





Regarding research, Sarah Edwards proved that KnowledgeC is power (https://www.mac4n6.com/blog/2018/8/5/knowledge-is-power-using-the-knowledgecdb-database-on-macos-and-ios-to-determine-precise-user-and-application-usage). We also already had a Sunday Funday on this topic, focused on macOS Mojave, which Tun Naung (https://twitter.com/tunnaunglin) won (https://www.hecfblog.com/2019/03/daily-blog-642-solution-saturday-3919.html).




But now we have macOS Catalina (10.15), so it's high time to look at this data source again.




In fact, there are two knowledge databases on macOS: system context and user context. The first is located under /private/var/db/CoreDuet/Knowledge, the second under /Users/username/Library/Application Support/Knowledge.




Let's start with the first one, the system context database. This copy was obtained from a macOS image presented at a recent Champlain CTF.

There are 16 tables in the database:




[Screenshot: system context database tables]


Most of the tables are empty. The most interesting things start with the ZOBJECT table. The ZSTREAMNAME column contains information about the data streams. In the database I'm looking at there are several streams:









  • com.apple.spotlightviewer.events
  • /safari/history
  • /media/nowPlaying
  • /display/isBacklit
  • /app/inFocus
  • /app/activity
  • /activity/level/feedback
  • /activity/level



The ZVALUESTRING column contains additional information. For example, for /app/inFocus it shows the application used; for /safari/history, the URL. That's not all: for Safari-related activity and /media/nowPlaying there is additional metadata in the ZSTRUCTUREDMETADATA table, and the corresponding ID can be found in the column with the same name. For example, for Safari history it will store the webpage's title in the Z_DKSAFARIHISTORYMETADATAKEY__TITLE column:






updating a previous challenge on KnowledgeC







Of course, we shouldn't forget about the timestamps: there are three columns in the ZOBJECT table, ZSTARTDATE, ZENDDATE, and ZCREATIONDATE, all containing timestamps in Mac Absolute Time format.




It’s time for an SQL
query!
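The core trick in any KnowledgeC query is that timestamp conversion: Mac Absolute Time counts seconds from 2001-01-01 UTC, so you add 978307200 and hand the result to SQLite's datetime(). Here's a stand-in demo against a mocked-up, minimal ZOBJECT table (the real one has far more columns):

```python
import sqlite3

# Mac Absolute Time counts seconds from 2001-01-01 00:00:00 UTC, so
# adding this offset turns it into a Unix epoch SQLite understands.
MAC_EPOCH_OFFSET = 978307200

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ZOBJECT (ZSTREAMNAME TEXT, ZVALUESTRING TEXT, ZSTARTDATE REAL)"
)
# A single mocked row standing in for a real KnowledgeC record.
conn.execute(
    "INSERT INTO ZOBJECT VALUES "
    "('/safari/history', 'https://www.hecfblog.com', 610000000)"
)
row = conn.execute(
    "SELECT ZSTREAMNAME, ZVALUESTRING,"
    " datetime(ZSTARTDATE + ?, 'unixepoch') FROM ZOBJECT",
    (MAC_EPOCH_OFFSET,),
).fetchone()
print(row)  # ('/safari/history', 'https://www.hecfblog.com', '2020-05-01 04:26:40')
```

The same `ZSTARTDATE + 978307200` pattern works for ZENDDATE and ZCREATIONDATE, and you can bake the offset directly into the SQL when running it in DB Browser for SQLite.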















Let’s move on to the
user context database. I got this one from our iMac. It’s used very often, so
there should be a lot of data in the database.





The tables are the same – we have 16 of them. Let's look inside the ZOBJECT table. Here are the streams available in ZSTREAMNAME:








  • /portrait/topic
  • /portrait/entity
  • /notification/usage
  • /knowledge-sync-deletion-bookmark/
  • /knowledge-sync-addition-window/
  • /display/isBacklit
  • /app/usage
  • /app/intents







First of all, we have some information about database synchronization. This means the database may contain not only information about this iMac but also synced data, for example from an iPhone. There's a table called ZSYNCPEER that includes some information about these devices:












There's another useful table, ZSOURCE. Here we can find whether it was a WhatsApp message, a phone call, or an SMS. It can also help us understand some uncommon data types. For example, we can see that /portrait/topic refers to Pinterest and /portrait/entity to Safari.





Let's look inside ZSTRUCTUREDMETADATA, especially at the Z_DKINTENTMETADATAKEY__SERIALIZEDINTERACTION column. Here we can see some BLOBs. Let's export one of them; this can be done, for example, with DB Browser for SQLite. In fact, it's a binary plist. But that's not all: there is another plist inside, under NS.data. In my case it was a WhatsApp message, and I could get not only the phone number (also available in Z_DKINTENTMETADATAKEY__DERIVEDINTENTIDENTIFIER) but also the contact's name. The same can be done with phone calls – we can recover the phone number. Unfortunately, we can't recover the message body.
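That double decode can be sketched with Python's plistlib. The flat NS.data key below is a simplification of my own making: real serialized interactions are NSKeyedArchiver archives, so expect to dig through the keyed archive structure rather than one convenient key:

```python
import plistlib

def decode_nested_plist(blob):
    """Decode a binary plist whose payload is itself another binary plist."""
    outer = plistlib.loads(blob)          # the exported BLOB
    inner_bytes = outer["NS.data"]        # simplified key name, see note above
    return plistlib.loads(inner_bytes)    # the plist hiding inside
```

plistlib handles both XML and binary plists transparently, so the same helper works regardless of which form the exported BLOB takes.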





Again, we can gather a lot of information about application usage from the /app/usage stream:















Let’s write an SQL query
to gather this information:
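Here's a stand-in for that sort of query, run against a mocked-up ZOBJECT table: it totals seconds per bundle ID from the /app/usage stream (column names as in Sarah Edwards's research; check them against your own database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ZOBJECT (ZSTREAMNAME TEXT, ZVALUESTRING TEXT,"
    " ZSTARTDATE REAL, ZENDDATE REAL)"
)
# Mocked /app/usage rows: bundle ID plus start/end in Mac Absolute Time.
conn.executemany(
    "INSERT INTO ZOBJECT VALUES ('/app/usage', ?, ?, ?)",
    [
        ("com.apple.Safari", 610000000, 610000300),
        ("com.apple.Safari", 610001000, 610001060),
        ("com.apple.mail", 610002000, 610002030),
    ],
)
# Total usage seconds per application, busiest first.
rows = conn.execute(
    "SELECT ZVALUESTRING, SUM(ZENDDATE - ZSTARTDATE) AS seconds"
    " FROM ZOBJECT WHERE ZSTREAMNAME = '/app/usage'"
    " GROUP BY ZVALUESTRING ORDER BY seconds DESC"
).fetchall()
```

Adding the datetime() conversion from earlier to ZSTARTDATE/ZENDDATE turns the same query into a human-readable application usage timeline.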













As you can see, the first record is from April 4, 2020, and today is May 3, 2020, so the database stores data for only about 30 days.





So, what other similar
data sources are available? For example, another interesting database is
located under /private/var/db/CoreDuet/People. It’s interaction.db. There are
11 tables inside:












If we look inside ZINTERACTIONS and ZCONTACTS, we can gather some information about calls the user made. Again, it seems the data is written to the database as part of the synchronization process, and, of course, it'll contain different datasets depending on the device.




















Daily Blog #697: Forensic Lunch 5/8/20 - Jack Farley, Josh Brunty, Kevin Pagano, Tom Pace, Jim Arnold










We talk about DFIR with experts by David Cowen - Hacking Exposed Computer Forensics Blog


Hello Reader,

        Another week of crisis times means another weekly Forensic Lunch!



This week on the Forensic Lunch we had:

  • Jack Farley
  • Josh Brunty
  • Kevin Pagano
  • Tom Pace
  • Jim Arnold

You can watch it here:

https://youtu.be/fPzSm-hofA0



















Daily Blog #696: Free Autopsy Training











Hello Reader,

       I know right now not everyone is heads down in DFIR investigations like we are, and I know that we are fortunate to retain our jobs and keep doing the work we love. So for those of you who know individuals looking to transition into DFIR, or those already in it who are looking to grow their skills but currently have zero budget to do it, I have some good news.



The fine folks over at Basis Technology, who fund things like The Sleuth Kit, Autopsy, and OSDFCon, have made their very successful Autopsy training class free!



This is an 8-hour on-demand course that normally costs $495! However, you only have one more week left to claim it, as they have set an end date of 5/15/20 for this amazing deal.



So if you or someone you know has the time and wants to get skilled up, what are you waiting for?

https://www.autopsy.com/support/training/covid-19-free-autopsy-training/
























































