Project 3 - Smart Phone Sensing

Everyday technologies such as smart phones are increasingly being used as sensing devices and even to create large wireless sensor networks. This is primarily driven by the widespread use of smart phones, their relatively low cost, and their increasing sensing capabilities. For example, these devices are able to capture location (GPS, triangulation), motion (accelerometer, gyroscope), acoustic signals (microphone), light and proximity (e.g., using light sensors), and images/videos.

In this project we are primarily interested in the GPS and accelerometer sensors. You will develop a simple smart phone application that tracks your location "on demand" (i.e., whenever you push a certain button in your smart phone app's user interface). At the same time, the device also records the accelerometer readings. Your application will allow a user to turn these measurements on or off at any time. Your application will display the following information:
  • The current location (latitude and longitude); you will need the GPS sensor for this.
  • The steps taken by the user since tracking was turned on; you will need to extract "steps" from the accelerometer readings.
  • The stride length (the distance traveled by one foot during each step), which must be computed using the GPS data and accelerometer readings. The stride length should be shown both as the most recent stride and as the average stride length since the beginning of the measurements.
  • The velocity with which the user travels. Again, this should show both the current velocity and the average velocity since the beginning of the measurements.
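As a starting point for extracting steps from the accelerometer readings, a common approach is to detect peaks in the acceleration magnitude. The sketch below illustrates this idea with a simple two-threshold detector on raw (x, y, z) samples; the class name and the threshold values are hypothetical and would need to be tuned against data from a real phone.

```java
/** Simple threshold-crossing step detector over accelerometer magnitudes
 *  (illustrative sketch; thresholds must be tuned on-device). */
public class StepDetector {
    // Hypothetical thresholds around gravity (~9.8 m/s^2).
    private static final double HIGH = 11.0; // m/s^2: peak of a step
    private static final double LOW  = 9.0;  // m/s^2: reset between steps

    /** Counts a step each time the magnitude rises above HIGH,
     *  re-arming only after it falls back below LOW (debouncing). */
    public static int countSteps(double[][] samples) {
        int steps = 0;
        boolean armed = true;
        for (double[] s : samples) {
            double mag = Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
            if (armed && mag > HIGH) {
                steps++;
                armed = false;
            } else if (!armed && mag < LOW) {
                armed = true;
            }
        }
        return steps;
    }

    public static void main(String[] args) {
        // Synthetic trace: two spikes above HIGH separated by a dip below LOW.
        double[][] trace = {
            {0, 0, 9.8}, {0, 0, 12.0}, {0, 0, 8.5},
            {0, 0, 9.8}, {0, 0, 12.5}, {0, 0, 8.0}
        };
        System.out.println(countSteps(trace)); // prints 2
    }
}
```

On a real device the samples would come from a `SensorEventListener` rather than an array, and the thresholds (or a smoothing filter before thresholding) would need experimentation. The average stride length then follows directly from this count: GPS distance traveled divided by the number of detected steps.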
In addition, whenever the device loses GPS (e.g., this can also be simulated with another button in your interface that tells the device to ignore GPS readings), it should predict the location using the accelerometer, computed stride length, and the user's direction before the GPS readings were lost. The device should clearly indicate whenever the app operates in this GPS-denied mode. Once GPS returns (e.g., by pressing the "GPS on/off" button in your app), the app should indicate the location error (e.g., by how many feet/meters the predicted location differed from the actual location).
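One simple way to predict the location in GPS-denied mode is dead reckoning: advance the last known position along the last known bearing by (steps × stride length), then, once GPS returns, report the distance between the predicted and actual positions as the localization error. The sketch below (class and method names are my own, not part of any provided sample) shows a flat-earth position update and a haversine distance for the error:

```java
/** Dead-reckoning position prediction and localization error (sketch). */
public class DeadReckoning {
    private static final double EARTH_RADIUS_M = 6371000.0;

    /** Predicts a new (lat, lon) in degrees after walking `steps` steps of
     *  `strideM` meters on compass bearing `bearingDeg` (0 = north),
     *  using a small-displacement flat-earth approximation. */
    public static double[] predict(double latDeg, double lonDeg,
                                   int steps, double strideM, double bearingDeg) {
        double d = steps * strideM;                 // distance traveled, meters
        double br = Math.toRadians(bearingDeg);
        double dLat = (d * Math.cos(br)) / EARTH_RADIUS_M;
        double dLon = (d * Math.sin(br))
                / (EARTH_RADIUS_M * Math.cos(Math.toRadians(latDeg)));
        return new double[]{latDeg + Math.toDegrees(dLat),
                            lonDeg + Math.toDegrees(dLon)};
    }

    /** Haversine great-circle distance in meters, usable as the
     *  localization error between predicted and actual GPS fixes. */
    public static double distanceM(double lat1, double lon1,
                                   double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dp = p2 - p1, dl = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dp / 2) * Math.sin(dp / 2)
                 + Math.cos(p1) * Math.cos(p2) * Math.sin(dl / 2) * Math.sin(dl / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    public static void main(String[] args) {
        // Example: 100 steps of 0.75 m due north from a sample location.
        double[] p = predict(40.0, -88.0, 100, 0.75, 0.0);
        // Sanity check: the predicted point lies ~75 m from the start.
        System.out.printf("distance from start: %.1f m%n",
                distanceM(40.0, -88.0, p[0], p[1]));
    }
}
```

In the app, the bearing before GPS was lost can be estimated from the last few GPS fixes, and `distanceM` applied between the dead-reckoned position and the first fix after GPS returns gives the error to display.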

To achieve these goals, please refer to the Android sensing tutorial and the two code samples (one for GPS, one for accelerometer) provided here.

Development Environment
This assignment is to be completed using the Android smart phone platform. The four desktops in the DARTS Lab (Linux) have Eclipse and the Android SDK (Software Development Kit) pre-installed. You can also install Eclipse and the Android SDK on your own computer; instructions can be found here: "Installing the SDK". The SDK also includes an Android emulator that can be used to test simple code. However, to test the GPS and accelerometer components, you will need to deploy your code to actual phones. The lab has several Google Nexus phones available for this purpose (you may use any other Android phone as well). Please do not remove the phones from the lab without the permission of the instructor.

Deliverables
The deadline for this project is midnight of December 6th.
  • All source files that have been developed must be placed into your dropbox into a subdirectory called "project3".
  • A brief report (at most two pages) describing your implementation, any challenges faced in the implementation and testing, and any unresolved issues. In this report, also describe at least five experiments you have performed and provide information such as: location of the experiment (start and end point), route taken (feel free to add a map or graph), where GPS was turned off and back on, what the average velocity and stride readings were, what the total step count was, and what the localization error was when GPS was turned back on.
  • You will demonstrate your implementation to the TA before the submission deadline (either during the Friday lecture time before the deadline or any other time that you and the TA agree on).