I've been fascinated by software-defined radio (SDR) for years, but investigating the GSM cellular network always felt just out of reach. All of that changed earlier this year, when someone finally posted a workable, step-by-step guide for decoding (but not decrypting) GSM cellular radio traffic.
For those not familiar, SDR moves much of the functionality of a radio's circuitry into software rather than hardware. An SDR is a stripped-down radio that outputs digital data to a host (usually a computer), where software decodes the data into audio. For instance, with an SDR and the proper software, you need only change some code to make the radio play FM instead of AM. In recent years, cheap $20 SDR USB dongles have come onto the market, replacing hardware that used to cost thousands. (To get started, I suggest checking out http://www.rtl-sdr.com)
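That "just change some code" claim is worth making concrete. Given the complex (IQ) baseband samples a dongle produces, switching demodulators really is a one-line difference. A minimal sketch in Python with NumPy (the function names are mine, not from any SDR library):

```python
import numpy as np

def demod_am(iq):
    """AM: the audio is simply the envelope (magnitude) of the IQ samples."""
    return np.abs(iq)

def demod_fm(iq):
    """FM: the audio is the instantaneous frequency, recovered here as the
    phase difference between consecutive IQ samples."""
    return np.angle(iq[1:] * np.conj(iq[:-1]))
```

Everything upstream (tuning, sampling) stays the same; only the math applied to the samples changes.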
An SDR can easily pick up cellular phone transmissions, but making sense of the data is complicated. Now you can read the detailed post here and make it happen. If you follow it EXACTLY, it will work!
UPDATE: Since the "2G Sunset" on 1/1/2017, I have been unable to find any local GSM transmissions in the 850 MHz band using "Kalibrate." I suspect the solution is to search the higher 1900 MHz band, but that would require a new, more expensive SDR with a wider frequency range.
As mentioned in an earlier post, I've been working with the Netduino microcontroller board, a .NET version of the open-source Arduino. In preparation for designing a home energy monitoring system, I created a small project to measure and post data from the Netduino. The simplest circuit I could find consisted of a photocell and a resistor wired into one of the analog inputs. Every 60 seconds, the Netduino measures the value from the photocell and posts the data to the Cosm website, where it is graphed and displayed for all to see.
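The measure-and-post loop is simple enough to sketch. The real project runs C# on the .NET Micro Framework; this is a Python sketch of the same logic, with `read_adc` and `post_datapoint` as hypothetical stand-ins for the Netduino's analog read and the HTTP call to Cosm:

```python
import time

ADC_MAX = 1023  # the Netduino's analog inputs return a 10-bit value

def light_percent(raw):
    """Scale a raw ADC reading from the photocell divider to 0-100%."""
    return 100.0 * raw / ADC_MAX

def post_datapoint(value):
    """Placeholder for the HTTP request that sends a datapoint to Cosm."""
    print("posting %.1f%%" % value)

def run(read_adc, interval=60.0, cycles=None):
    """Measure the photocell every `interval` seconds and post the value."""
    n = 0
    while cycles is None or n < cycles:
        post_datapoint(light_percent(read_adc()))
        time.sleep(interval)
        n += 1
```

The brighter the light, the lower the photocell's resistance, so the divider's output voltage (and the percentage above) tracks ambient light.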
Data collection, analysis, and display are a particular interest of mine, given my background in SQL Server design and programming. Granted, this project produces a tiny amount of data, but it is a first step in integrating a microcontroller board with online graphing. One of the great features of Cosm is that you can set triggers for events to happen based on your data. For instance, when the daylight value reaches 66%, I could send a tweet announcing that it is daybreak in Philadelphia. Cosm also provides graph widgets for any public data stream on their site, such as the one below.
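The trigger idea boils down to edge detection on a threshold: fire once when the value rises past the level, then re-arm only after it drops back below, so you don't tweet on every sample above 66%. Cosm evaluates its triggers server-side; this is just a conceptual sketch of that behavior, with names of my own choosing:

```python
class Trigger:
    """Fire an action once when a value rises to the threshold,
    then re-arm after the value falls back below it."""

    def __init__(self, threshold, action):
        self.threshold = threshold
        self.action = action
        self.armed = True

    def update(self, value):
        if self.armed and value >= self.threshold:
            self.armed = False          # suppress repeat firings
            self.action(value)
        elif value < self.threshold:
            self.armed = True           # value dropped; re-arm

# Example: announce daybreak once per sunrise
daybreak = Trigger(66.0, lambda v: print("Daybreak in Philadelphia! (%.0f%%)" % v))
```

Without the re-arm step, a sensor hovering near the threshold would fire the action on every reading.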
It turns out that the opening sequence of The Big Bang Theory is great for gauging the amount of video compression applied to the channel on which it's airing. The rapid-fire images prove tricky for most compression algorithms used by cable TV, satellite, and fiber providers. With little or no compression, the images in the title sequence flow smoothly. The more compressed the signal, the more pauses or stuttering you'll see. For instance, when viewing the opening on the local independent station over the air with an HD tuner, little to no compression is visible. However, when viewing the same opening on TBS (via Verizon fiber), the sequence noticeably pauses on two of the images. (Most often on the image below.)
A difficult question to answer is where the offending compression is being introduced. It could be from the uplink/downlink of the TBS signal to the local cable company. Or it could be the local cable company itself compressing video to fit more channels on their system. Although the compression artifacts are fairly minimal, they could make some viewers wonder why the producers chose to dwell on certain images, thus creating meaning where none exists. In any case, the next time you watch the opening titles, keep an eye out for tell-tale compression artifacts.