Initialization vectors: 2024

Friday, May 24, 2024

Full File System extractions in Zip - MAC times

How do zip files generated by extraction tools used in digital forensics manage file timestamps?

The zip file specification does not define fields for creation or access times. The only time field the specification defines is the modified timestamp.

 

Zip file specification - Modified Timestamp

Full File System (FFS) extractions from mobile devices are zip files, yet when they are processed by digital forensic tools the creation and access times can be seen alongside the modified timestamp. Together these are known as the MAC times. If the only timestamp defined by the specification is the modified one, where do the other two come from?

Processed FFS - Notice MAC times for keychain-2.db

The creation and access times are kept in the extra field, defined at the end of the specification. When these FFS extractions are opened with common zip managers like 7-Zip or WinRAR, the MAC times are not shown because those managers do not read and interpret the extra field.

7-Zip view of keychain-2.db - Notice the empty created and accessed columns


In order to validate the MAC times on these particular FFS extractions, one can use the following Python script: extract_timestamps.py

Script location: https://github.com/abrignoni/Misc-Scripts/

The script needs the path to the FFS extraction and the internal zip path of the file whose MAC times you would like to see.

Script usage:

Terminal usage of script


The output is printed to the screen as follows:

Script output to the screen

Note that the MAC times at the end of the output all come from the extra field, while the modified time at the start is taken from the field the specification defines for that purpose.
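To illustrate where these values live, here is a minimal sketch (not a substitute for the script above) that walks a zip entry's extra field and pulls out any timestamps it finds. It assumes the tool stored them in one of the two common extra blocks, the Extended Timestamp (0x5455) or NTFS (0x000A) block; a given extraction tool may use others.

```python
import struct
import zipfile
from datetime import datetime, timezone

def parse_extra_timestamps(extra: bytes) -> dict:
    """Walk a zip entry's extra field and collect any timestamps found.

    Handles two blocks commonly used for MAC times (an assumption --
    a given tool may use others):
      0x5455  Extended Timestamp (32-bit Unix epoch seconds)
      0x000A  NTFS (64-bit FILETIME values)
    """
    times = {}
    i = 0
    while i + 4 <= len(extra):
        header_id, size = struct.unpack_from('<HH', extra, i)
        data = extra[i + 4:i + 4 + size]
        if header_id == 0x5455 and data:
            flags = data[0]
            offset = 1
            for bit, name in ((1, 'modified'), (2, 'accessed'), (4, 'created')):
                if flags & bit and offset + 4 <= len(data):
                    epoch, = struct.unpack_from('<i', data, offset)
                    times[name] = datetime.fromtimestamp(epoch, tz=timezone.utc)
                    offset += 4
        elif header_id == 0x000A and size >= 32:
            # NTFS block: 4 reserved bytes, then tag 0x0001 holding three FILETIMEs
            tag, tag_size = struct.unpack_from('<HH', data, 4)
            if tag == 0x0001 and tag_size >= 24:
                for name, ft in zip(('modified', 'accessed', 'created'),
                                    struct.unpack_from('<QQQ', data, 8)):
                    # FILETIME counts 100-ns intervals since 1601-01-01
                    times[name] = datetime.fromtimestamp(
                        ft / 10_000_000 - 11644473600, tz=timezone.utc)
        i += 4 + size
    return times

# Usage (hypothetical paths; note that some tools write the full set of
# times only into the local file header, not the central directory):
#   with zipfile.ZipFile('ffs_extraction.zip') as z:
#       info = z.getinfo('private/var/Keychains/keychain-2.db')
#       print(parse_extra_timestamps(info.extra))
```
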

Hope the previous has been informative and helpful. For questions or comments you can find me on all social media here: https://linqapp.com/abrignoni

Tuesday, April 9, 2024

New parser for Uber app geo-locations in iOS using iLEAPP

 

New parser for Uber app in iOS using iLEAPP
🗜 Data contained in LevelDB data structures
⏳ Timestamps
📍 GPS coordinates + horizontal accuracy
🚘 Speed
🗺 Active trip information
🔗 Get it here: https://github.com/abrignoni/iLEAPP

Thanks to CCL Solutions & Alex Caithness for the LevelDB libraries used in this artifact.
Libraries are located here: https://github.com/cclgroupltd/ccl_chrome_indexeddb


#DFIR
#FLOSS #FOSS #MobileForensics #DigitalForensics

 

Sunday, April 7, 2024

New VLEAPP parser

New VLEAPP parser for Dodge RAM 1500 extractions 
 📍 GPS locations from 2 sources 
 🛣️ Current road names 
 🛑 Road speed limits 
 🚗 Vehicle speeds 
 🔗 Get VLEAPP: https://buff.ly/3VLCXfS

The plan is to really dig into vehicle extractions and create as many parsers as I can from the end of July to December.

There is a real need for more parsing platforms that provide alternate methods for validation and report presentation. Hopefully open-source tools can start moving the field in that direction.

#DigitalForensics #VehicleForensics #DFIR #MobileForensics

Wednesday, February 7, 2024

What is cacheV0.db and why are there only images in it?

Last week the awesome Heather Charpentier (my co-host on the Digital Forensics Now Podcast) and I were working on building a parser for Google Chats in iOS. As we were looking for the location where images shared via the chat are stored, we came across a SQLite database called cacheV0.db in the /private/var/mobile/Data/Application/GUID/Library/Caches/com.google.Dynamite/ImageFetcherCache/ directory.

The cacheV0.db file in context.


Even though we found the images tied to the chats elsewhere in the application directory, this database held a smaller-resolution copy of every file sent via the chats, including user avatars that were not shared through any user-attributable action. The database also contained images from deleted chats that no longer remained in the folder where the chat images are kept.

It seems this database is similar in function to the Glide Image Manager Cache found in some Android apps, which generates and keeps a thumbnail of every image that has been rendered by the app's interface. In this context, rendering means showing the image to the user within the application's interface. You can watch a video detailing the Glide Image Manager Cache functionality and its forensic significance here: https://youtu.be/Rlp-h9V6FI0

The cacheV0.db database is comprised of a single table called cache with only two fields, id and data. The id field is of integer type and is sequentially incremented starting at one. The data field is of blob type and contains the thumbnail-like images mentioned previously.

The cache table.
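Given that simple schema, the blobs can be carved out for review with a short Python sketch. The function name, output naming, and magic-byte checks below are my own choices for illustration, not part of any existing tool:

```python
import sqlite3
from pathlib import Path

def dump_cache_images(db_path: str, out_dir: str) -> int:
    """Export every blob in cacheV0.db's cache table to its own file.

    Assumes the schema described above: a single table `cache` with an
    integer `id` column and a blob `data` column. The file extension is
    guessed from magic bytes (JPEG/PNG) and defaults to .bin otherwise.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    con = sqlite3.connect(db_path)
    count = 0
    for row_id, blob in con.execute('SELECT id, data FROM cache ORDER BY id'):
        if blob.startswith(b'\xff\xd8\xff'):   # JPEG magic bytes
            ext = '.jpg'
        elif blob.startswith(b'\x89PNG'):      # PNG magic bytes
            ext = '.png'
        else:
            ext = '.bin'
        (out / f'cache_{row_id}{ext}').write_bytes(blob)
        count += 1
    con.close()
    return count
```

Running it against a copy of cacheV0.db yields one image file per row, which can then be compared against the images recovered from the main chat directories.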

Some details about how the database is implemented per our observations:

  • We were not able to find any direct connection between the images in this database and the database that contains the chats. It seems to behave like Glide, where the images in the database are used by the application for rendering purposes but are separate from the actual images being sent and received in chat interactions.
  • We knew deleted images were in the database because Heather had created the dataset and had extensive documentation of her process. We knew which images were missing from the main image directory, and we found copies of them in the database.
  • We found another cacheV0.db database in the Google Voice app in iOS. It would seem this is an image rendering management process used by Google. We have not seen this database, so far, outside of Google apps.

In summary it seems this database:

  • Is used to keep and manage images used by the application for rendering to the user.
  • Keeps copies of images after the source files have been deleted.
  • Is used by Google applications.

If anyone comes across additional implementations of this database do share your findings.
In order to automate the parsing of these databases in iOS I have created the Image CacheV0 parser in iLEAPP.

iLEAPP parser for cacheV0.db

iLEAPP is a free, Python-based, open-source, community-driven platform for parsing iOS extractions for digital forensics. You can find the tool here: https://github.com/abrignoni/iLEAPP

For questions or comments find me on all social media here: https://linqapp.com/abrignoni



Tuesday, January 16, 2024

SQLite 3.45 introducing binary JSON

Have you heard about binary JSON in SQLite? I hadn't. Today I was made aware of it by digital forensics examiner and software developer extraordinaire Alex Caithness.


The latest SQLite version (Version 3.45.0) has the ability to encode and decode JSON data from plain text to binary format and back. Details of this functionality can be found here: https://sqlite.org/draft/jsonb.html

Why would this data need to be in binary format? Per the jsonb specification there will be a reduction in data size as well as faster processing speed.

After downloading SQLite 3.45 on a Windows VM, I generated some synthetic plain-text JSON data.

To test conversion from JSON to binary JSON I created a simple database with a table called data holding two fields: keyf and jsonblobdata. The field definition for jsonblobdata has to be BLOB.

After importing the data I encoded the blob by using the following query:
UPDATE data SET jsonblobdata = jsonb(jsonblobdata);

After the UPDATE query I ran a SELECT query to see how the data would look.

JSON binary blob

Here is the blob field view with hex.

JSON binary blob with hex

One of the issues with binary data structures is that text searching an extraction becomes less and less productive. SQLite's binary JSON does not seem to compress data that much, but I can foresee a future where it will, just like LevelDB and other formats. Being aware of compressed binary data, and of the need to access it in clear-text form, will be a key skill for digital examiners today and into the future.

In order to present the blob data in clear text I used the following query:
SELECT keyf, json(jsonblobdata) FROM data;

The json() function does all the work for you.

Clear-text JSON


After that one can deal with the JSON data as one usually does.
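The whole round trip can also be scripted. Here is a minimal Python sketch using the standard library's sqlite3 module; the table and column names mirror the queries above, and it requires the linked SQLite library to be 3.45 or newer, since that is the version that introduced jsonb():

```python
import sqlite3

def roundtrip_jsonb(payload: str) -> str:
    """Store plain-text JSON, encode it to binary JSONB, decode it back.

    Requires SQLite >= 3.45 for the jsonb() function.
    """
    con = sqlite3.connect(':memory:')
    con.execute('CREATE TABLE data (keyf TEXT, jsonblobdata BLOB)')
    con.execute('INSERT INTO data VALUES (?, ?)', ('k1', payload))
    # Encode in place, mirroring the UPDATE query above
    con.execute('UPDATE data SET jsonblobdata = jsonb(jsonblobdata)')
    blob, = con.execute('SELECT jsonblobdata FROM data').fetchone()
    assert isinstance(blob, bytes)  # the column now holds a binary blob
    # Decode back to clear text with json()
    text, = con.execute('SELECT json(jsonblobdata) FROM data').fetchone()
    con.close()
    return text
```

Note that json() returns the canonical minified form of the JSON text, so whitespace from the original input will not survive the round trip.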
For questions or comments find me on all social media here: https://linqapp.com/abrignoni