Initialization vectors

Friday, May 24, 2024

Full File System extractions in Zip - MAC times

How do zip files generated by extraction tools used in digital forensics manage file timestamps?

The zip file specification does not define fields for creation or access times. The only time-related field defined by the specification is the modified timestamp.

 

Zip file specification - Modified Timestamp

Full File System (FFS) extractions from mobile devices are zip files, yet when they are processed by digital forensic tools the creation and access times can be seen alongside the modified timestamp. These three timestamps are known as the MAC times. If the only timestamp defined by the specification is the modified one, where do the other two come from?

Processed FFS - Notice MAC times for keychain-2.db

The creation and access times are kept in the extra field defined at the end of the specification. When these FFS extractions are opened with common zip managers like 7-Zip or WinRAR, these MAC times are not displayed because those managers have not been set up to read and interpret the extra field.

7-Zip view of keychain-2.db - Notice the empty created and accessed columns


In order to validate the MAC times on these particular FFS extractions one can use the following Python script: extract_timestamps.py

Script location: https://github.com/abrignoni/Misc-Scripts/

The script needs the path to the FFS extraction and the path, internal to the zip, of the file whose associated MAC times you would like to see.

Script usage:

Terminal usage of script


The output will be to the screen as follows:

Script output to the screen

Note that the MAC times at the end of the output all come from the extra field while the modified time at the start is taken from the specification field for that purpose.
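For those who want to poke at the extra field directly, below is a minimal Python sketch (not the extract_timestamps.py script itself) that walks the extra field records of a single zip entry and decodes the Info-ZIP 0x5455 extended timestamp record when present. Extraction tools may store the MAC times under different, vendor-specific record IDs, so the loop also lists any record IDs it does not recognize.

import struct
import sys
from datetime import datetime, timezone
from zipfile import ZipFile

def dump_extra(zip_path, member_path):
    with ZipFile(zip_path) as zf:
        info = zf.getinfo(member_path)
        # Modified time from the specification-defined DOS date/time field.
        print("Modified (spec field):", datetime(*info.date_time))
        data = info.extra
        offset = 0
        while offset + 4 <= len(data):
            header_id, size = struct.unpack_from("<HH", data, offset)
            body = data[offset + 4:offset + 4 + size]
            if header_id == 0x5455 and body:  # Info-ZIP extended timestamp record
                flags = body[0]
                times = struct.unpack_from("<%dI" % ((len(body) - 1) // 4), body, 1)
                labels = [name for name, bit in (("mtime", 1), ("atime", 2), ("ctime", 4)) if flags & bit]
                for label, value in zip(labels, times):
                    print(label, datetime.fromtimestamp(value, tz=timezone.utc))
            else:
                # Vendor-specific records end up here; at least show that they exist.
                print("extra field record 0x%04X, %d bytes" % (header_id, size))
            offset += 4 + size

if __name__ == "__main__":
    dump_extra(sys.argv[1], sys.argv[2])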

Hope the previous has been informative and helpful. For questions or comments you can find me on all social media here: https://linqapp.com/abrignoni

Tuesday, April 9, 2024

New parser for Uber app geo-locations in iOS using iLEAPP

 

New parser for Uber app in iOS using iLEAPP
🗜 Data contained in LevelDB data structures
⏳ Timestamps
📍 GPS coordinates + horizontal accuracy
🚘 Speed
🗺 Active trip information
🔗 Get it here: https://github.com/abrignoni/iLEAPP

Thanks to CCL Solutions & Alex Caithness for the LevelDB libraries used in this artifact.
Libraries are located here: https://github.com/cclgroupltd/ccl_chrome_indexeddb


#DFIR
#FLOSS #FOSS #MobileForensics #DigitalForensics

 

Sunday, April 7, 2024

New VLEAPP parser

New VLEAPP parser for Dodge RAM 1500 extractions 
 📍 GPS locations from 2 sources 
 🛣️ Current road names 
 🛑 Road speed limits 
 🚗 Vehicle speeds 
 🔗 Get VLEAPP: https://buff.ly/3VLCXfS

The plan is to really dig into vehicle extractions and create as many parsers as I can from the end of July to December.

There is a real need for more parsing platforms that provide alternate methods for validation and report presentation. Hopefully open source tools can start moving the field in that direction.

#DigitalForensics #VehicleForensics #DFIR #MobileForensics

Wednesday, February 7, 2024

What is cacheV0.db and why are there only images in it?

Last week the awesome Heather Charpentier (my co-host on the Digital Forensics Now Podcast) and I were working on building a parser for Google Chats in iOS. As we were looking for the location where images shared via the chat are stored, we came across a SQLite database called cacheV0.db in the /private/var/mobile/Data/Application/GUID/Library/Caches/com.google.Dynamite/ImageFetcherCache/ directory.

The cacheV0.db file in context.


Even though we found the images tied to the chats somewhere else in the application directory, this database had a smaller-resolution copy of all the files that were sent via the chats, including user avatars that were not shared via a user-attributable action. The database also contained images from deleted chats that no longer remained in the folder where the chat images are kept.

It seems this database is similar in function to the Glide Image Manager Cache found in some Android apps, which generates and keeps a thumbnail of every image that has been rendered by the app's interface. In this context rendering means showing the image to the user within the interface of the application. You can watch a video detailing the Glide Image Manager Cache functionality and its forensic significance here: https://youtu.be/Rlp-h9V6FI0

The cacheV0.db database is comprised of only one table, called cache, that contains only two fields: id and data. The id field is of integer type and is sequentially incremented starting at one. The data field is of blob type and contains the thumbnail-like images mentioned previously.

The cache table.
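As an illustration, here is a minimal Python sketch that assumes only the schema described above (a single cache table with an integer id and a blob data field). It dumps every blob to disk and guesses the image extension from the file signature; the paths are examples only.

import sqlite3
from pathlib import Path

def dump_cache(db_path, out_dir):
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    con = sqlite3.connect(db_path)
    try:
        for row_id, blob in con.execute("SELECT id, data FROM cache ORDER BY id"):
            if not blob:
                continue
            if blob.startswith(b"\xff\xd8\xff"):
                ext = "jpg"
            elif blob.startswith(b"\x89PNG"):
                ext = "png"
            elif blob[:4] == b"RIFF" and blob[8:12] == b"WEBP":
                ext = "webp"
            else:
                ext = "bin"  # unknown format, keep the raw bytes
            (out / f"cache_{row_id}.{ext}").write_bytes(blob)
    finally:
        con.close()

dump_cache("cacheV0.db", "cacheV0_images")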

Some details about how the database is implemented per our observations:

  • We were not able to find any direct connection between the images in the database and the database that contains the chats. It seems to behave like Glide, where the images in the database are used by the application for rendering purposes but are separate from the actual images being sent and received in chat interactions.
  • We knew deleted images were in the database because Heather had created the dataset and had extensive documentation of her process. We knew which images were missing from the main image directory and we found copies of them in the database.
  • We found another cacheV0.db database in the Google Voice app in iOS. It would seem this is an image rendering management process used across Google apps. We have not seen this database, so far, outside of Google apps.

In summary it seems this database:

  • Is used to keep and manage images used by the application for rendering to the user.
  • Keeps copies of images after the source files have been deleted.
  • Is used by Google applications.

If anyone comes across additional implementations of this database do share your findings.
In order to automate the parsing of these databases in iOS I have created the Image CacheV0 parser in iLEAPP.

iLEAPP parser for cacheV0.db

iLEAPP is a free, Python-based, open-source, community-driven platform for parsing iOS extractions for digital forensics. You can find the tool here: https://github.com/abrignoni/iLEAPP

For questions or comments find me on all social media here: https://linqapp.com/abrignoni



Tuesday, January 16, 2024

SQLite 3.45 introducing binary JSON

Have you heard about binary JSON in SQLite? I hadn't. Today I was made aware of it by digital forensics examiner and software developer extraordinaire Alex Caithness.


The latest SQLite version (Version 3.45.0) has the ability to encode and decode JSON data from plain text to binary format and back. Details of this functionality can be found here: https://sqlite.org/draft/jsonb.html

Why would this data need to be in binary format? Per the jsonb specification there will be a reduction in data size as well as faster processing speed.

 After downloading SQLite 3.45 on a Windows VM, I generated some synthetic plain text JSON data.

To test conversion from JSON to binary JSON I created a simple database with a table called data that has two fields: keyf and jsonblobdata. The field definition for the jsonblobdata field has to be BLOB.

After importing the data I encoded the blob by using the following query:
UPDATE data set jsonblobdata = jsonb(jsonblobdata);

After the UPDATE query I ran a SELECT query to see how the data would look.

JSON binary blob

Here is the blob field view with hex.

JSON binary blob with hex

One of the issues with binary data structures is that text searching an extraction becomes less and less productive. SQLite binary JSON does not seem to compress data that much, but I can foresee a future where it will, just like LevelDB or other formats. Being aware of compressed binary data, and of the need to access it in clear-text format, will be a key skill for digital examiners today and into the future.

In order to present the blob data in clear text I used the following query:
SELECT keyf, json(jsonblobdata) from data;

The json() function does all the work for you.

Clear-text JSON


After that one can deal with the JSON data as one usually does.
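Putting the pieces together, here is a minimal Python sketch of the same round trip using the sqlite3 module. It assumes the interpreter is linked against SQLite 3.45.0 or later; older builds will raise an error on jsonb().

import sqlite3

con = sqlite3.connect(":memory:")
print("SQLite version:", sqlite3.sqlite_version)  # needs 3.45.0 or later

con.execute("CREATE TABLE data (keyf TEXT, jsonblobdata BLOB)")
con.execute("INSERT INTO data VALUES (?, ?)",
            ("sample", '{"name": "test", "values": [1, 2, 3]}'))

# Encode the plain text JSON into SQLite's internal binary (jsonb) format.
con.execute("UPDATE data SET jsonblobdata = jsonb(jsonblobdata)")

# The stored value is now a binary blob...
raw = con.execute("SELECT jsonblobdata FROM data").fetchone()[0]
print(type(raw), raw[:8])

# ...and json() decodes it back to clear text for review or export.
print(con.execute("SELECT keyf, json(jsonblobdata) FROM data").fetchone())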
For questions or comments find me on all social media here: https://linqapp.com/abrignoni



 


Thursday, July 7, 2022

4Cast Awards 2022 voting in progress...

Voting for the SANS 4Cast awards has started. Voting closes on July 24, 2022. Honored to be nominated for the following:

- DFIR Non-Commercial Tool of the Year for xLEAPP
- DFIR Social Media Contributor of the Year
- Digital Forensic Investigator of the Year

If the tools have helped you do consider voting. 

Thank you!

🗳 Vote for xLEAPP here: https://tinyurl.com/vote4xleapp

Friday, December 24, 2021

Android Tor Browser Thumbnails. What?

Tor Browser investigations usually don't go beyond possible user-saved bookmarks. Thanks to a find by Loicforensic@protonmail.com (no online presence) we can locate Tor Browser thumbnails of opened tabs in the following Android directories:

  • /data/data/org.torproject.torbrowser/cache/mozac_browser_thumbnails/thumbnails
  • /data/user/0/org.torproject.torbrowser/cache/mozac_browser_thumbnails/thumbnails
The thumbnail files are named in a GUID format with a .0 extension. For example: 8c7defaa-12b9-44f4-ae78-cc8850b92ab4.0

These thumbnails are WebP images: RIFF containers with a WEBP form type and VP8-encoded image data (the WEBPVP8 signature is visible in the file header).


They can be easily viewed by opening them with the Chrome browser. In order to facilitate review I have made an artifact for the Android Logs Events And Protobuf Parser (ALEAPP) framework. Using the PIL (Pillow) library in Python we can convert the files to PNG format for easy reporting.
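Here is a minimal sketch of that conversion idea using Pillow; the file name below is just an example and this is not the ALEAPP artifact code itself.

from pathlib import Path
from PIL import Image  # requires a Pillow build with WebP support

def thumbnail_to_png(src, dest_dir="."):
    # Pillow detects the RIFF/WebP format from the content, not the .0 extension.
    dest = Path(dest_dir) / (Path(src).stem + ".png")
    with Image.open(src) as img:
        img.save(dest, "PNG")
    return dest

print(thumbnail_to_png("8c7defaa-12b9-44f4-ae78-cc8850b92ab4.0"))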


Here is ALEAPP's TOR Thumbnails report. The report contains the modified time, the thumbnail converted to PNG, the filename, and the file location path.


ALEAPP can be downloaded here: https://github.com/abrignoni/ALEAPP

Thanks to Josh Hickman (https://twitter.com/josh_hickman1) for his most excellent Android 12 test image which enabled the creation of this artifact. You can get his Android 12 test image here: https://thebinaryhick.blog/2021/12/17/android-12-image-now-available/

Any questions or any comments I can be reached on twitter @AlexisBrignoni and email 4n6[at]abrignoni[dot]com.

Wednesday, July 21, 2021

vLEAPP - Vehicle Logs Events And Properties Parser

 Short version:

Take logical extractions from cars, trucks, and infotainment systems and parse them for interesting artifacts.

vLEAPP is Python 3 code and can be downloaded here:
https://github.com/abrignoni/VLEAPP

Long Version:

The need to analyze cars for digital forensic artifacts has grown recently as vehicles come with smart mobile features by default. From GPS coordinates, contact databases, and call logs to even automated driving, the forensic value of these items cannot be overstated. Sadly there are not many tool options for parsing these data sources. vLEAPP aspires to be an open source platform the community can use to aggregate forensic artifacts found on the most mobile of data sources: cars.

This project started from Geraldine Blay's idea of being able to parse any car data source in a way that easily enables backtracking from report data to source data. We decided to use the xLEAPP code base to do so.

Challenges

Dealing with cars brings a host of challenges to the examiner. Some are:

  • Data extraction.
    • In order to pull data from infotainment systems special tooling is usually needed. Many times a chip-off is required. This can be a labor intensive process that requires extensive training.
    • vLEAPP plays no role in the data extraction process.

  • Lack of standardization.
    • Different brands will have different ways of developing their navigation, infotainment, and sensor data recording systems. Sometimes these vary even across models of the same brand. It goes without saying that the digital forensics process has to be well executed. Artifact identification and parsing automation are needed in this field.
    • Hopefully with the arrival of Google's Android Auto and Apple's CarPlay there will be a more unified data source type across vehicle brands.

  • Unfamiliar file systems
    • File systems in use by cars might not be recognized by many forensic tools. The QNX file system by BlackBerry is one example. Some examiners resort to carving in the hopes of getting relevant data from these non-supported file systems. Be aware that branded forensic tools might not help where other, more traditional computing processes will. For example, QNX file systems can be accessed using a Linux Ubuntu distribution. After accessing the logical files in the QNX file system you can package them all up in a zip file for analysis in any tool or by hand (a minimal sketch of that packaging step follows this list). The following video is a step by step process on how to do so.
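Here is a minimal sketch of that final packaging step, assuming the QNX partition has already been mounted read-only on a Linux machine (the mount point below is hypothetical):

import shutil

# Packages the mounted logical files into qnx_extraction.zip for analysis elsewhere.
archive = shutil.make_archive("qnx_extraction", "zip", root_dir="/mnt/qnx_infotainment")
print("Created:", archive)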

Solutions

vLEAPP provides a way to report on forensic artifacts using Python while abstracting away the generation of HTML, KML, TSV, and SQLite reports. The examiner focuses on where the data is located and what to pull from it; vLEAPP handles the rest. Here is a video showing how it works.



If you are not familiar with Python or how to run scripts check this short video out. It will guide you from installation to script usage. Really easy and straightforward.



Conclusions

New data sources that are case relevant will continue to surface. As digital forensic examiners we will be well served to learn some coding. Alex Caithness said it best: Learn to code because every artifact exists because of code.


If you would like to learn Python from a digital forensics examiner's perspective and contribute to this or any of the other xLEAPP projects check out the following DFIR Python Study Group playlist. It will take you from knowing no Python to parsing protobuf files and SQLite databases.


Any questions or any comments I can be reached on twitter @AlexisBrignoni and email 4n6[at]abrignoni[dot]com.

Be safe. Take care.

-Brigs


Monday, May 10, 2021

CLEAPP it! - ChromeOS Logs Events And Protobuf Parser

Short version:

Process data extractions from Chromebooks using the ChromeOS Logs Events And Protobuf Parser (CLEAPP).

CLEAPP is made in Python 3 and can be downloaded here:
https://github.com/markmckinnon/cLeapp


 

Long version:

Until not too long ago, extracting data for forensic analysis from Chromebooks seemed impossible. Thanks to Daniel Dickerman's workflow we can extract data provided you have a username and password for the device.

Check out the peer-reviewed process here:
https://dfir.pubpub.org/pub/inkjsqrh/release/1

Thanks to Magnet Forensics the process has been automated and now its implementation is available as a free software tool called the Magnet Chromebook Acquisition Assistant. 

You can do it!!

To get the free tool go here:

https://www.magnetforensics.com/resources/magnet-chromebook-acquisition-assistant/

Now what?

So now you have an awesome extraction from the device. You will receive a file named extracted.tgz.

extracted.tgz

What do you do with it? How can you dig into the contents? Use CLEAPP for it. You can get CLEAPP here: https://github.com/markmckinnon/cLeapp

Two step process:

  1. Extract the tgz file.
  2. Select the extracted data location with CLEAPP and press process. 
Simple!!! 
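For step 1, a minimal Python sketch using the standard tarfile module could look like this (the output folder name is just an example):

import tarfile

# Unpack the acquisition output into a working folder for CLEAPP to process.
with tarfile.open("extracted.tgz", "r:gz") as archive:
    archive.extractall("chromebook_extraction")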

This project is Mark McKinnon's brainchild and it is based on the Android Logs Events And Protobuf Parser community project. ALEAPP can be found here:
https://github.com/abrignoni/ALEAPP

Currently CLEAPP parses 38 artifact categories. The project wouldn't be what it is without the contributions from Alex Caithness and Ryan Benson. Thank you so much!

Thank you gentlepeople <3

Installation

If you are familiar with how iLEAPP or ALEAPP works then you already know how to use CLEAPP. These projects are written in Python. If you are not familiar with how to run Python scripts just follow the steps in the following video.


You can also use the provided executables contained in the release version for CLEAPP. Those can be found here:
https://github.com/markmckinnon/cLeapp/releases/tag/v1.0

Using CLEAPP

Run the cleappGUI.py script for the graphical user interface version. It will look like this:

Click around and done

Notice the list of modules on the left. You can parse all or select individual modules. CLEAPP is pretty fast so for most purposes running with all modules enabled is recommended.

Here is a short list of some modules it supports:
  • Chromebook device details
  • Chromebook device logs
  • Chromium Browsers
  • Instagram Threads
  • Chromium LevelDB data stores (Thanks Alex Caithness & Ryan Benson)
  • Microsoft RDP
  • Real VNC
  • Google Docs
  • and tons more...

After CLEAPP finishes processing the output will be in the following formats:
  • HTML report
  • Tab separated values text files for every artifact
  • KML files for artifacts that have geolocation data points
  • SQLite timeline file for artifacts that have timestamps
  • SQLite contacts file for artifacts that have contacts information
The HTML report contains the categories and artifacts on the left of the report.

HTML report

The Device Details tab will have information on the Chromebook like the serial number, current operating system version, and more.

Device Details

One of the interesting facts about Chromebooks is that they can run Android apps. As time permits I plan to merge all ALEAPP artifacts for use in CLEAPP and make sure that both projects support Android artifacts. 

Since this is a community project we will be more than happy to have additional collaborators. 

Get ready for this: Mark has provided Autopsy integration for CLEAPP right off the bat. Check it out here:
https://medium.com/@markmckinnon_80619/cleapp-autopsy-plugin-59ba312beccc


Any questions or any comments I can be reached on twitter @AlexisBrignoni and email 4n6[at]abrignoni[dot]com.

-Brigs

Thursday, April 15, 2021

Android version without the build.prop file

 Short version

Use one of the following files to determine the Android version of a digital forensic extraction that is lacking the \system\build.prop file:

  • \data\system\usagestats\0\version
  • \data\system_ce\0\usagestats\version

Long version

Most automated tools that parse Android full file system extractions depend on the /system/build.prop file to determine the Android version, among other device identifiers. Due to how variable Android implementations and data extraction software are, a build.prop file might not always be available. Is there a way to determine the Android version of an extraction by only looking at the userdata directory? The answer is yes. This was useful to me since some of my digital forensics tooling for Android extractions would benefit from programmatically identifying the Android version when a build.prop file is not available.

by: https://gs.statcounter.com/android-version-market-share/mobile-tablet/worldwide

Usagestats

One of the ways Android devices keep track of application activity is by registering events in the usagestats directory. Depending on the Android version these are kept in XML or protobuf format and can be found in one of the following locations:

  • \data\system\usagestats\0\
  • \data\system_ce\0\usagestats\

For a quick explanation on usagestats and their applicability see here: https://abrignoni.blogspot.com/2019/02/android-usagestats-xml-parser.html

To parse usagestats data you can use ALEAPP (Android Logs Events And Protobuf Parser) here: https://github.com/abrignoni/ALEAPP

The usagestats folder contains a plain text file called version. It is usually composed of two lines: the first one a number and the second one a series of alphanumeric values separated by semicolons.

Usagestats/version file from a Samsung SM-G960U

The filename clearly indicated that the content had to be versioning related. Since the build.prop file is well understood and documented I made a comparison between the two to try and determine the provenance of the version file's content, if possible.

Notice in the following image the side by side comparison and color coding of similar values of files extracted from a Samsung device.


From the version file's second line we can determine the following:

  • First item = Version release = Android version
  • Second item = Codename = Fingerprint
  • Third item = Incremental build version
  • Fourth item = CSC = Country Specific Code
Thanks to Kevin Pagano (@KevinPagano3) for identifying that the fourth value is a CSC and for leading me to a list that matches codes with values. These CSC values seem to be Samsung specific.

 https://www.androidsage.com/2017/07/12/list-of-samsung-galaxy-country-specific-product-code-csc-and-country-region/

It is of note that Pixel devices have fewer values within the version file. There is no CSC and no final value. Still, the Android version was the same as the one located in the build.prop file. This was true across all sample extractions I was able to check.

Version file from a Pixel phone

What about the number on the first line? Through the process of testing the values on the second line a pattern appeared for the number in the first line. It is as follows:

  • 3 = Android 8, 9
  • 4 = Android 10
  • 5 = Android 11 
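As a rough illustration, here is a minimal Python sketch of these heuristics, assuming the version file layout observed in the Samsung and Pixel samples described above:

from pathlib import Path

FIRST_LINE_MAP = {"3": "Android 8 or 9", "4": "Android 10", "5": "Android 11"}

def parse_version_file(path):
    lines = Path(path).read_text().splitlines()
    first = lines[0].strip()
    result = {"usagestats_version": first,
              "os_guess_from_first_line": FIRST_LINE_MAP.get(first, "unknown")}
    if len(lines) > 1 and lines[1].strip():
        fields = lines[1].strip().split(";")
        # Field meanings per the build.prop comparison above.
        result["android_version"] = fields[0]
        if len(fields) > 1:
            result["codename_fingerprint"] = fields[1]
        if len(fields) > 2:
            result["incremental_build"] = fields[2]
        if len(fields) > 3:
            result["csc"] = fields[3]  # Samsung Country Specific Code; absent on Pixel
    return result

print(parse_version_file(r"data/system/usagestats/0/version"))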

Jessica Hyde (@B1N2H3X) suggested I take into consideration how the file would look, if it existed at all, after an operating system upgrade. Great point! Thanks to Josh Hickman (@josh_hickman1) that was really easy to do. He has well-documented Android test images for community use and testing. By looking at the values within the version file on his Android 10 extraction and then the values on the upgraded Android 11 image, I was able to determine that an upgrade produces an additional file, called migrated, in the directory where version resides.

The migrated file.

The migrated file contains the number from the first line of the pre-upgrade version file. After the upgrade, the version file contains numbers consistent with the current version. The next image shows the value contained within the migrated file in the Android 10 extraction.

Migrated file with a value of 4

Now compare it with the values within the version file in the same Android 11 extraction.

Version file with a value of 5

This behaviour was also confirmed by Carlos Eduardo (@GalloDu) with his own upgraded device data.

Confirmation of migrated & version relationship

Based on this behaviour it follows that the presence of a migrated file indicates a major operating system upgrade. By comparing the contents of the migrated and version files the analyst can determine which version the device was upgraded from and which it was upgraded to.

Pending work

For this analysis I only had access to Samsung and Pixel files. It would be useful if migrated and version files from other vendors (LG, OnePlus, etc.) were shared to see how they might differ and/or what additional data they might provide, if any.

Implementation 

I have made a parser for the version file within ALEAPP. The script will identify and use the Android version number contained in either the build.prop or version file for reporting and artifact purposes. To quickly view the data click the Device Details tab on ALEAPP's report home page.

Device details tab with Android version data

As always, I can be reached on twitter @AlexisBrignoni and email 4n6[at]abrignoni[dot]com.

-Brigs




Tuesday, September 15, 2020

It's alive! - Attachment links in Discord

What happens to the URL links inside Discord chats if you copy-paste them into an internet connected browser? You might be surprised to know that...

In the past I have written about the structure of Discord chats in the following platforms:

Windows:

https://abrignoni.blogspot.com/2018/03/finding-discord-app-chats-in-windows.html

macOS:

https://abrignoni.blogspot.com/2018/03/finding-discord-chats-in-os-x.html

iOS: 

https://abrignoni.blogspot.com/2018/08/finding-discord-chats-in-ios.html

Linux:

https://abrignoni.blogspot.com/2018/08/finding-discord-chats-in-linux-dfir.html

Android:

https://abrignoni.blogspot.com/2017/07/discord-app-forensic-artifacts-in.html

Viewing extracted data using an Android emulator:

https://abrignoni.blogspot.com/2017/08/viewing-extracted-android-app-data.html

Timely updates to the research have been provided by generous folks, like @TheKateCain,  here:

https://abrignoni.blogspot.com/2020/08/update-on-discord-forensic-artifacts.html

For the last couple of days I've been working on creating a parser for Discord JSON chat files using iLEAPP. If you are not familiar with iLEAPP, it is a Python 3 framework designed to parse useful forensic artifacts from iOS devices. More on iLEAPP here. I wanted to validate some findings on a case I am working on with the amazing @i_am_the_gia, and as part of the process I used the newly created parser on @Josh_Hickman1's excellent iOS testing images. You can get his testing images here.
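As a rough illustration of the idea (this is not the iLEAPP parser itself), here is a minimal Python sketch that walks a directory of Discord chat JSON files and collects attachment URLs. It assumes the cached messages follow the Discord message schema, where a message may carry an attachments list of objects with a url key; adjust for your own data.

import json
from pathlib import Path

def find_attachment_urls(root):
    urls = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            data = json.loads(path.read_text(encoding="utf-8", errors="ignore"))
        except json.JSONDecodeError:
            continue  # not a JSON chat file
        messages = data if isinstance(data, list) else [data]
        for message in messages:
            if isinstance(message, dict):
                for attachment in message.get("attachments", []):
                    if isinstance(attachment, dict) and "url" in attachment:
                        urls.append((path.name, attachment["url"]))
    return urls

for name, url in find_attachment_urls("discord_cache"):
    print(name, url)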

Here is iLEAPP's  HTML report for the chat:


Here is the output for the Discord user's email and user ID:


At that same moment I watched the most amazing trailer for The Mandalorian Season 2 thanks to @KevinPagano3. As you all should know by now, the Child steals every scene with just how cute it is.

Going back to my report, I copied one of the URLs in the attachment column and pasted it into an internet connected browser to see if it would come up. In the past (2017) I did some testing on Discord for Android and found that links in chats could be copy-pasted into a browser and accessed from anywhere by anyone.

With Josh's image I confirmed that was still the case. And what did the URL image in the chat contain?


Coincidence? I think not. :-D
As always, I can be reached on twitter @AlexisBrignoni and email 4n6[at]abrignoni[dot]com.

May the Force be with you too.
-Brigs