Tuesday, February 4, 2014

Decoding GSM/SMS Timestamps



GSM timestamps (TP-SCTS)

Probably one of the most common timestamps found within mobile phones is the GSM/SMS timestamp format. Within the GSM specification 03.40, it's referred to specifically as the SCTS, or Service Center Time Stamp. The format is pretty simple, but I've gotten a lot of questions about parts of it, especially the timezone offset.

Let's look at a few examples in some feature phones. First, from a Samsung SGH-T429:

SMS records from a Samsung SGH-T429 phone - Highlighted portions are the timestamps

Here you can see two timestamps highlighted. By the way, the obfuscated portions are SMSC numbers and phone numbers that I felt it best to eliminate.

Now, looking at those timestamps, we have the following:

9001425100704A and 9001723165954A. Now to the formatting... These timestamps are stored in a reverse-nibble, binary-coded decimal (BCD) format. Let's break down what that means by splitting the first one into bytes:

90 01 42 51 00 70 4A

Now, each of these bytes represents, in order, the year, the month, the day, the hours, the minutes, and the seconds, followed by the last byte, which is the timezone offset. But... we have to reverse the order of the hex digits (nibbles) within each byte first (and we'll ignore the timezone for now).

09 10 24 15 00 07

Now, this makes more sense: 2009-10-24 at 15:00:07. Or, in good, old-fashioned American: October 24th, 2009 at 15:00:07 hours. Just remember that they are listed in descending order, biggest unit to smallest.

Now, the second timestamp we can decode easily as 2009-10-27 at 13:56:59. The reverse-nibbling... you get used to it after a while.

OK, now on to the timezone offset. That seventh byte represents the offset from Coordinated Universal Time: UTC, GMT (not really), or whatever you want to call it. This last byte is reverse-nibbled as well, but it's signed to indicate whether the offset is negative or positive. Let's check it out.

First, starting with 4A, then reverse nibbling, we have A4, which in binary is 1010 0100. That first bit, if it's a 1, means a negative offset; if it's a 0, a positive offset. So we know this one is negative. What I do, in my mind, is convert that first bit to a sign: -010 0100. Converted back into decimal, 010 = 2 and 0100 = 4, and we combine those nibbles as decimal digits (that's the binary-coded decimal part) to get -24. Now, the easy bit: we multiply that by 1/4-hour increments to get the offset in hours. -24 x 1/4 = -6, so this offset is -6 hours from UTC. So, as a review:

Offset in the timestamp:        4A

Reverse nibble it:                   A4

Convert to binary:                  1010 0100

"1" equals negative:               -010 0100

Convert digits to decimal:      -24

Multiply by .25 hours:            -24 x .25 = -6 hours offset from UTC
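
If you'd rather let a few lines of code do the nibble shuffling, here's a minimal sketch of the steps above in plain Python (not a PA script). It assumes a 7-byte SCTS given as 14 hex characters and years in the 2000s:

#Minimal sketch of the decoding described above (plain Python, not a PA script).
#Assumes a 7-byte SCTS given as 14 hex characters and a 20xx year.

def decode_scts(hex_str):
    swapped = [hex_str[i + 1] + hex_str[i] for i in range(0, 14, 2)]  #reverse the nibbles in each byte
    yy, mm, dd, hh, mi, ss = [int(b) for b in swapped[:6]]            #first six bytes are plain BCD digits
    tz = swapped[6]                                                   #seventh byte is the timezone offset
    sign = -1 if int(tz[0], 16) & 0x8 else 1                          #high bit set = negative offset
    quarters = int(str(int(tz[0], 16) & 0x7) + tz[1])                 #remaining digits, read as decimal
    offset_hours = sign * quarters * 0.25                             #offset is stored in quarter hours
    return "20%02d-%02d-%02d %02d:%02d:%02d" % (yy, mm, dd, hh, mi, ss), offset_hours

print(decode_scts("9001425100704A"))   #-> ('2009-10-24 15:00:07', -6.0)

The second Samsung timestamp and the Mediatek examples below decode the same way.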


Let's do another one from a random Mediatek Chinese phone:


Three SMS records in a Mediatek-based phone



OK, this one has three records on screen (from Cellebrite's Physical Analyzer). The three dates are as follows:

21 20 41 41 33 62 02 = 12 02 14 14 33 26 20 (2012-02-14, 14:33:26) with 20 x .25 = UTC +5 offset.

21 20 41 41 33 72 02 = 12 02 14 14 33 27 20 (2012-02-14, 14:33:27) with the same UTC +5

21 20 41 41 33 82 02 = 12 02 14 14 33 28 20 (2012-02-14, 14:33:28) again UTC +5

These were system messages, BTW.

Looking at the 02 offset code, it's easy to see that when we reverse nibble it (20), then look at the binary, 0010 0000, we see that the zero is the leading bit, which means it's positive. Then we just take the decimal digits 2 and 0 represented by these nibbles (decimal 20) and multiply them by .25 hours and we have UTC +5.

Half-hour timezones?

OK, one more that I thought was interesting. This one comes from one of those countries in the world where they have half-hour offset timezones. Here's a link to a page that describes some of those: http://www.timeanddate.com/time/time-zones-interesting.html.

Anyway, this is interesting to me because in this particular example, which is from Iran, PA shows the offset erroneously as UTC+5, when it's actually UTC + 5:30. Since Iran is either UTC+3:30 or UTC+4:30, maybe somebody somewhere set a clock wrong.

Screenshot from a popular mobile forensics application, showing UTC+5, when it should be UTC+5:30
We can clearly see the timestamp (11119190046422), which decodes as 2011-11-19 at 09:40:46. But... the timezone offset is 22... 22 x .25 hours is 5 and a half hours, not the five hours shown here. Anywho, probably something to take a look at if you're dealing with phones in those parts of the world with strange timezones.
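
For what it's worth, the little decoder sketched earlier in this post agrees with the 5:30 reading:

print(decode_scts("11119190046422"))   #-> ('2011-11-19 09:40:46', 5.5), i.e. UTC+5:30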

Lastly, there is one good application for working with timestamps and verifying the ones you're getting from your mobile forensics tool of choice. That program is called Clocksmith, and it handles most of the common timestamp formats that you will probably see in phones, as well as some computer ones. You can get it from Evigator (http://www.evigator.com/free-apps/). Most of the time the main tools will break out the dates and times just fine, but... you have to be able to verify them yourself if need be.

Greg Thomson
H-11 Digital Forensics
www.h11dfs.com


References:
http://en.wikipedia.org/wiki/GSM_03.40
http://www.etsi.org/deliver/etsi_gts/03/0340/05.03.00_60/gsmts_0340v050300p.pdf
http://www.timeanddate.com/time/time-zones-interesting.html
http://en.wikipedia.org/wiki/Binary-coded_decimal



Tuesday, January 21, 2014

Parsing the stock Android Browser for Searched Items in Physical Analyzer


While digging through some Android extractions in Physical Analyzer, I noticed that, although the Google Search items were found by PA, the searches made from within the browser were not found. Now, I'm only talking about the stock Android browser here, not Chrome, Firefox, or any other browser.

The stock Android browser databases I'm interested in are found in the following path in the user data partition:

/data/data/com.android.browser/databases/browser.db or
/data/data/com.android.browser/databases/browser2.db

This is, of course, a SQLite database that typically also contains the web history for the browser in another table.

The table I care about here is the "searches" table. Here's a look at it using Physical Analyzer's database view tool. It should be noted that you COULD just export this as a .csv file and manually convert the Unix dates yourself, but... what I want to do is add these searches to the Searched Items under the Project Tree with their own category "Android Browser."



View of browser.db from within Physical Analyzer

So, you can see that this is a very simple table, with the search terms under the "search" column and the date (in Unix epoch time with milliseconds) under the "date" column.

What I was doing with these before was just exporting the entire table out to a .csv as I mentioned and using Evigator's Clocksmith to manually convert the dates and then pasting them back into the table in a new column. Something like this:

Using Excel to add the converted timestamps to the browser searches

By the way, Clocksmith is an excellent add-on tool which just happens to be free.

While the export to .csv trick works just fine, it's far from quick. Each manual conversion takes 15-20 seconds, so if you have dozens of search terms, it may take a while. I have seen some people do it with an Excel formula that converts the Unix date to a human-readable format, but that's too hard for me.
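
For reference, if you'd rather do the conversion in a couple of lines of plain Python than in Excel, something like this works (the value below is made up purely for illustration):

from datetime import datetime

def unix_ms_to_utc(ms):
    #Unix epoch milliseconds -> human-readable UTC string
    return datetime.utcfromtimestamp(int(ms) / 1000.0).strftime("%Y-%m-%d %H:%M:%S")

print(unix_ms_to_utc(1389123456789))   #made-up value, prints 2014-01-07 19:37:36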

Anyway, I'll get to the point, which is a script that will do this for us and put the results right back into Physical Analyzer where we can use them just like anything else found there in the Project Tree.

SQLiteParser

Physical Analyzer includes a SQLite parsing module called, appropriately, SQLiteParser. It is quite easy to use (yes, even easier than importing sqlite3). All you need are the full path of the database, the table name of interest, and the column names from within the table. In this case, the database full path is /data/data/com.android.browser/databases/browser.db, the table name of interest is searches, and the column names are search and date.


We need to do the following:

- Find the file
- Set the file as our "database" for the parser; this tells PA to treat the file as a SQLite db
- Set the table name that we want to read
- Have PA pull the columns of interest from that table and assign them to values in a "searched item"
- Add those searched items to the DataStore, which makes them show up in the Project Tree

So, here is one way to do it:



#script for finding /data/data/com.android.browser/databases/browser.db or browser2.db and pulling the searches
#table from that SQLite database, adding the records to the Physical Analyzer datastore and displaying them
#in the Analyzed Data portion of the Project Tree under Searched Items - Android Browser
#Written for Physical Analyzer
#This script is not published by Cellebrite and Cellebrite makes no guarantees, either implied
#or express as to its suitability or performance or indeed its utility for any particular
#purpose.
#Greg Thomson
#H-11 Digital Forensics

from physical import *
import SQLiteParser
import clr
clr.AddReference ('System.Windows.Forms')
from System.Windows.Forms import MessageBox

#We need to iterate through the file systems and find the android browser database.
#(This will vary depending on the make and model of Android phone and whether it's a
#physical or file system extraction. Some have multiple partitions and/or file
#systems, some just one.)

#find the file and set the db to that file
count_db = 0
for fs in ds.FileSystems:
    for f in fs.Search('com\.android\.browser/databases/browser(2)*\.db$'):
        if f.AbsolutePath.endswith('browser.db') or f.AbsolutePath.endswith('browser2.db'):
            count_db += 1
            db = SQLiteParser.Database.FromNode(f)

#show a message if it didn't find anything
if count_db == 0:
    MessageBox.Show('Sorry, the Android Browser search history wasn\'t found.\
    \r\n\r\nEither this phone doesn\'t use the stock Android Browser or it\'s not an Android phone.',
    'Database not found.')
    quit()


#setting up a new Searched Item and assigning it values from the database    

tableName = "searches"                               #setting the table to the searches table

count_items = 0

for record in db.ReadTableRecords(tableName):         #read the records one by one in the
                                                      #'searches' table
    srch = SearchedItem ()                            #setting a variable 'srch' to hold
                                                      #each new Searched Item
    srch.Value.Value = record ['search'].Value        #setting the "Value" value to our
                                                        #new Searched Item
    #here we convert the Unix timestamp to a PA timestamp (without milliseconds)
    srch.TimeStamp.Value = TimeStamp.FromUnixTime(int(record['date'].Value)/1000)
    srch.Source.Value = "Android Browser"              #setting the "Source" value to our
                                                       #new Searched Item
    ds.Models.Add(srch)                                #adding the new Searched Item to the
                                                       #Datastore
    count_items += 1
    
content = "Successfully added " + str(count_items) + " Android Browser search history items to the Project Tree."
title = "Success"
MessageBox.Show(content, title)

BTW, if anyone knows a good way to format source code on blogger, let me know. I'm new at this.

After running this on an Android phone that does have searches in the Android browser, you should see this, as well as a popup box telling you how many items were added:

Results of the android_browser_search.py script run on an Android phone with searches from the stock browser

If you run it on a non-Android phone or an Android phone that doesn't contain that file, you will see this:

Results of running the script on a phone that doesn't have the database

After you have one script written that parses SQLite records, it only takes a few minutes to modify it for other apps that you may run into. Note: always double check to make sure you have the table and column names correct and that you're pulling the appropriate data to the appropriate places.

In any case, you should always know where the data came from, what its significance is, how it's formatted, etc. Even when your forensic tool does it all for you, you should always be able to go to the raw data and verify it for yourself.

Good luck. Remember.

H-11 Digital Forensics

Sunday, January 5, 2014

Adding Vendors to Bluetooth Devices in Physical Analyzer



One of the nicest features about today's forensics software is its expandability. No matter how many people a forensics tool producer hires, they are NOT going to be able to anticipate every need. First of all, there are just too many different devices, applications, operating systems and users to be able to decode everything we may run across. Second, most programmers are not forensics people. They may not completely understand exactly what it is we want when it comes to data extraction types, reporting, analysis and organization of data. So, the addition of a scripting tool like Python to their product can help immensely in customizing our experience with the tool.

Looking up vendors by their MAC address

Every time I see an extraction in any software for Bluetooth devices, I want to be able to find out what the manufacturer/vendor of the device is in order to match that record with another actual device, whether it's a computer or another mobile device. Let's be honest, I'm not terribly interested in Bluetooth headsets or hands-free kits, but... anyway...

There are various places on the Internet where you can find the vendor or manufacturer of a device by its MAC (media access control) address. Just about any networkable device will have a MAC address, which is its physical address. One of the sites is hwaddress.com, another is macvendors.com.

A MAC address consists of two parts, each three bytes long:

The first part is the OUI (Organizationally Unique Identifier), which names the vendor or maker of the device. The second part is a unique identifier for the device, which is somewhat like a serial number. For example, the following MAC address, 40:6c:8f:31:1a:0b, is assigned to an Ethernet card, and if we take a look on macvendors.com, we'll see that the manufacturer is Apple. The 40:6c:8f part tells us it's an Apple device and the 31:1a:0b part is just the unique identifier for our particular device (Ethernet, WiFi or Bluetooth adapter).
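
If you just want to split a MAC into those two halves in code, a couple of lines of plain Python do it (the Apple mapping in the comment is just the single example from above, not a real OUI table):

def split_mac(mac):
    #first three bytes = OUI (vendor), last three = device-specific identifier
    parts = mac.lower().split(":")
    return ":".join(parts[:3]), ":".join(parts[3:])

oui, device_id = split_mac("40:6c:8f:31:1a:0b")
print(oui)         #40:6c:8f  -> Apple, per macvendors.com
print(device_id)   #31:1a:0b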

When I've come across MAC addresses in Physical Analyzer, whether they belong to Bluetooth devices or WiFi networks, I usually look them up to find out what brand they are. That way it should be easier to match it to the actual device, should we eventually come across it.

But the idea for this script really comes from me trying to find out if I could add to an existing piece of analyzed data or model in Physical Analyzer. It turns out you can.

I noticed that the Bluetooth Devices table has an "Information" column, but I couldn't recall seeing it used, so I decided to write a script to add vendors to that column. The first method I tried, just to get it working, was a hard-coded table, where I looked up the vendors (on macvendors.com or hwaddress.com) for the MAC addresses already present:


from physical import *                     #necessary import for PA

item = 0                                   #set counter to 0

#these are the vendors, in order, for the devices already listed in PA

vendors = ["BlackBerry", "Huawei", "LG", "Samsung", "LG", "Samsung", "Samsung"]

#iterate through all the bluetooth devices in PA and add the corresponding vendor from the
#table to the device's "Info" property/attribute

for device in ds.Models[BluetoothDevice]:
    device.Info.Value = vendors[item]      #set the Information column from the table above
    item += 1


So, here's what I got from that (for that particular phone):


Screenshot 1: View in Physical Analyzer before the script, showing the existing Bluetooth devices enumerated

Screenshot 2: View in Physical Analyzer after running the script. Now the vendors populate the Information column.

Screenshot 3: And you can, of course, add the information to your report.

Once I had that working, I had to find a way that I could do the same thing without having to manually look up the vendors for each device. This script was only marginally better than just doing it completely manually using any number of OUI searches on the Internet. So, after several minutes of research (thank you, Google), I discovered that the urllib2 library does work in Physical Analyzer's Python implementation.

Anyway, urllib2.urlopen can be used to access http://api.macvendors.com to pull the vendor for a particular MAC address. Their API page is located here: http://www.macvendors.com/api

And, yes, in case you're wondering, I did donate to their page. I'm amazing.

With urllib2, the script became even simpler and works on any phone with Bluetooth devices enumerated by Physical Analyzer. Here it is:

#necessary imports, both for scripts in PA and to use urllib2
from physical import *
import urllib2

url = "http://api.macvendors.com/"                  #set our url to the proper location

for device in ds.Models[BluetoothDevice]:           #iterate through the BT devices
    device.Info.Value = urllib2.urlopen(url + str(device.MACAddress)).read()


That last line does this:

1. Pulls each MAC Address from the existing BT devices as "device.MACAddress".
2. Converts the MAC Address to a string, as required by urllib2.urlopen.
3. With that and the url from before, runs urllib2.urlopen, which reads the vendor name from the macvendors API.
4. Assigns that result to the BT device's "Info" property (device.Info.Value sets this).


So, the script just iterates through the existing Bluetooth devices, pulls each MAC Address from the device's "MACAddress" property, converts it to a string required by urlopen, looks it up, gets the vendor, then populates that info in the device's "Info" attribute/property.

The results look like this (note, this particular phone only has three BT devices enumerated):


Screenshot 4: The vendors (pulled from macvendors) populating the Information column




Hopefully you can get some use out of this, not just to add vendors to Bluetooth Devices, but for adding information carved to any type of Analyzed Data. Make sure, however, that you are certain it's correct before you go adding stuff willy-nilly to your results. Document everything.

Later.




H-11 Digital Forensics

Thursday, January 2, 2014

Parse PDU-format SMS messages

Parse PDU-format SMS messages from MCU of Samsung SGH-E900 phone

OK, since I primarily work with Cellebrite's Physical Analyzer, I'll post one script I wrote to parse PDU-formatted SMS messages out of the unallocated space of a Samsung SGH-E900 phone (cheap feature phone) that had a ton of deleted messages hanging out in the nether regions of the device's NOR (MCU) memory. These weren't all parsed by Physical Analyzer automatically. In fact, most of them weren't.

I wrote the first version to build the requisite memory range from "chunks." In this case it's just one chunk, but hopefully it makes sense. Then I learned I could do it as a "subrange" of an existing memory range, in this case the MCU image from the extraction. Here are both versions.

I will mention that I suck as a programmer and I wouldn't mind at all if someone could clean this up.



Chunk Version

#This script works for any phone that has PDU-formatted SMS messages
#Written for Physical Analyzer


from physical import *                     #necessary for any script written for PA
from PhoneUtils.GSM import PDUParser       #import parser for PDU-formatted SMS

MCU = ds.MemoryRanges[1]                   #choose the MCU for the E900 phone
sms = Chunk (MCU, 0x1E5CE35, 0x53)         #set the offset/length of the SMS in the MCU
chunks = [sms]                             #define the list of chunks (one chunk here)
raw_sms = MemoryRange (chunks)             #define memory range from MCU

hasMMC = True                              #does it have an SMSC number?

parsed_sms = PDUParser.TryParsePDU (raw_sms, hasMMC).SMS #parse out the PDU-formatted SMS

ds.Models.Add (parsed_sms)                 #Add to the datastore
                                           #It will now show up in the Project Tree





Sub-Range Version

#This script works for any phone that has PDU-formatted SMS messages
#Written for Physical Analyzer

from physical import *                      #necessary for any script written for PA
from PhoneUtils.GSM import PDUParser        #import necessary parser for PDU-formatted SMS

MCU = ds.MemoryRanges[1]                    #choose MCU for the E900
offset = 0x1E5CFD9                          #set the offset of the SMS in the MCU
length = 0x53                               #and its length
raw_sms = MCU.GetSubRange (offset, length)  #define raw_sms as a subrange of the MCU
                                            #based on offset and length
hasMMC = True                               #does the message contain an SMSC number?

#parse out the PDU-formatted SMS using PA's function
parsed_sms = PDUParser.TryParsePDU (raw_sms, hasMMC).SMS   

ds.Models.Add (parsed_sms)                  #Add to the datastore
                                            #It now shows up in the Project Tree







Anywho, you can test this on any phone that has PDU-formatted SMS messages. You just have to specify the proper memory range and the offset and length. I want to expand this eventually to use grep to search through the unallocated space for these messages and parse them automatically. For further information, read the Python Scripting Guide found under the Help Menu item in Physical Analyzer. There aren't enough people doing this stuff and I want to get some collaborators.
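
Until that automated search exists, here's a rough sketch of what the looping half might look like, reusing only the PA calls shown above. The offset list is just the two offsets from this post standing in for a real hit list, and I'm guessing at how a failed parse behaves, so treat it as a starting point rather than a finished parser:

#Rough sketch: loop the sub-range approach over a list of candidate offsets.
#The offsets here are placeholders (the two from this post); in practice you would
#fill the list from your own hex/grep searches of unallocated space.
from physical import *
from PhoneUtils.GSM import PDUParser

MCU = ds.MemoryRanges[1]
candidate_offsets = [0x1E5CE35, 0x1E5CFD9]           #placeholder hit list
length = 0x53
hasMMC = True

for offset in candidate_offsets:
    raw_sms = MCU.GetSubRange (offset, length)
    result = PDUParser.TryParsePDU (raw_sms, hasMMC)
    if result is not None and result.SMS is not None:   #assumption: skip offsets that don't parse cleanly
        ds.Models.Add (result.SMS)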

Here's a reference to PDU-format SMS:

PDU-Format SMS Messages

My employer, H-11 Digital Forensics
I hope to make this a place where mobile forensics practitioners, Python experts, hackers and people with too much time on their hands can share scripts and plugins to make mobile-device examination easier for all of us.