

Tapping into Android's sensors

Monitor your environment from near or far


The Android platform is ideal, especially for Java™ developers, for creating innovative applications that use hardware sensors. Learn about some of the interfacing options available to Android applications, including the sensor subsystem and recording audio snippets.

What type of applications might you build that would leverage the hardware capabilities of an Android-equipped device? Anything that needs electronic eyes and ears is a good candidate. A baby monitor, a security system, or even a seismograph comes to mind. Though you can't literally be in two places at once, Android might help bridge that gap in some practical ways. Throughout this article, keep in mind that the Android device in use is not merely a "cell phone" but perhaps a device deployed in a fixed location with wireless network connectivity, such as EDGE or WiFi. Download the source files for the examples in this article.

Android sensor capabilities

One refreshing aspect of working with the Android platform is that you can access some of the "goodies" within the device itself. Historically, the inability to access the underlying hardware of a device has been frustrating to mobile developers. Though the Android Java environment still sits between you and the metal, the Android development team brings much of the hardware's capability to the surface. The platform is open source, so you have the flexibility to roll up your sleeves and write some code to accomplish your tasks.

If it is not already installed, you might want to download the Android SDK. You can also browse the contents of the android.hardware package and follow along with the examples in this article. The android.media package contains classes that provide useful and novel functions.

Some of the hardware-oriented features exposed in the Android SDK are described below.

Table 1. Hardware-oriented features exposed in the Android SDK

android.hardware.Camera
A class that enables your application to interact with the camera to snap a photo, acquire images for a preview screen, and modify the parameters used to govern how the camera operates.

android.hardware.SensorManager
A class that permits access to the sensors available within the Android platform. Not every Android-equipped device will support all of the sensors in the SensorManager, though it's exciting to think about the possibilities. (See below for a brief discussion of available sensors.)

android.hardware.SensorListener
An interface implemented by a class that wants to receive updates to sensor values as they change in real time. An application implements this interface to monitor one or more sensors available in the hardware. For example, the code in this article contains a class that implements this interface to monitor the orientation of the device and the built-in accelerometer.

android.media.MediaRecorder
A class, used to record media samples, that can be useful for recording audio activity within a specific location (such as a baby nursery). Audio clips can also be analyzed for identification purposes in an access-control or security application. For example, it could be helpful to open the door to your time-share with your voice, rather than having to meet with the realtor to get a key.

android.media.FaceDetector
A class that permits basic recognition of a person's face as contained in a bitmap. You cannot get much more personal than your face. Using this as a device lock means no more passwords to remember: biometrics capability on a cell phone.

android.os.*
A package containing several useful classes for interacting with the operating environment, including power management, file watcher, handler, and message classes. Like many portable devices, Android-powered phones can consume a tremendous amount of power. Keeping a device "awake" at the right time so it is in position to monitor an event of interest is a design aspect that deserves attention up front.

java.util.Date, java.util.Timer, java.util.TimerTask
When measuring events in the real world, date and time are often significant. For example, the java.util.Date class lets you get a time stamp when a particular event or condition is encountered. You can use java.util.Timer and java.util.TimerTask to schedule tasks that run periodically or at a single point in time (a short sketch follows this table).
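
As a brief illustration of those timing classes, the hypothetical helper below uses java.util.Timer and java.util.TimerTask to log a time-stamped entry once per minute; a real application might sample a sensor or evaluate a condition in the run() method instead.

import java.util.Date;
import java.util.Timer;
import java.util.TimerTask;

import android.util.Log;

// Hypothetical helper class: logs a time-stamped entry once per minute.
public class PeriodicLogger {
    private static final String TAG = "PeriodicLogger";
    private final Timer timer = new Timer();

    public void start() {
        timer.schedule(new TimerTask() {
            @Override
            public void run() {
                // java.util.Date supplies the time stamp for the event
                Log.d(TAG, "Periodic check at " + new Date());
            }
        }, 0, 60 * 1000); // start immediately, then repeat every 60 seconds
    }

    public void stop() {
        timer.cancel(); // stop the background timer thread
    }
}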

The android.hardware.SensorManager class contains several constants that represent different aspects of Android's sensor system, including the following (a short snippet after the list shows how some of these constants appear in code):

Sensor type
Orientation, accelerometer, light, magnetic field, proximity, temperature, etc.
Sampling rate
Fastest, game, normal, user interface. When an application requests a specific sampling rate, it is really only a hint, or suggestion, to the sensor subsystem. There is no guarantee of a particular rate being available.
Accuracy
High, low, medium, unreliable.
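
To make the names concrete, the snippet below collects a few of these constants as they appear in the SDK used for this article; the class is purely illustrative, and the particular choices (game-rate sampling, for example) are assumptions rather than recommendations.

import android.hardware.SensorManager;

// Purely illustrative: collects a few SensorManager constants in one place.
public class SensorConstantsDemo {
    // Sensor types are bit flags, so several can be combined with |
    static final int SENSORS_OF_INTEREST =
            SensorManager.SENSOR_ORIENTATION | SensorManager.SENSOR_ACCELEROMETER;

    // The sampling rate is only a hint to the sensor subsystem
    static final int PREFERRED_RATE = SensorManager.SENSOR_DELAY_GAME;

    // Accuracy values arrive through the onAccuracyChanged callback
    static boolean isUsable(int accuracy) {
        return accuracy != SensorManager.SENSOR_STATUS_UNRELIABLE;
    }
}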

The SensorListener interface is central to sensor applications. It includes two required methods:

  • The onSensorChanged(int sensor, float[] values) method is invoked whenever a sensor value has changed. The method is invoked only for sensors being monitored by this application (more on this below). The arguments to the method include an integer that identifies the sensor that changed, along with an array of float values representing the sensor data itself. Some sensors provide only a single data value, while others provide three float values. The orientation and accelerometer sensors each provide three data values.
  • The onAccuracyChanged(int sensor, int accuracy) method is invoked when the accuracy of a sensor has been changed. The arguments are two integers: One represents the sensor, and the other represents the new accuracy value for that sensor.

To interact with a sensor, an application must register to listen for activity related to one or more sensors. Registering takes place with the registerListener method of the SensorManager class. The code example in this article demonstrates how an application registers and unregisters a SensorListener.

Remember, not every Android-equipped device supports every sensor defined in the SDK. Your application should degrade gracefully if a particular sensor is not available on a specific device; one approach is sketched below.
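
A minimal sketch of that check, assuming the SDK 1.x API used throughout this article and a class like the one in Listing 1 later in this article (a SensorManager reference named sm, a log tag named tag, and an activity that implements SensorListener), might query the bitmask returned by SensorManager.getSensors() before registering:

    // Illustrative only: register solely for sensors the device reports as present.
    @Override
    protected void onResume() {
        super.onResume();
        int wanted = SensorManager.SENSOR_ORIENTATION | SensorManager.SENSOR_ACCELEROMETER;
        int supported = sm.getSensors() & wanted;   // bitmask of available sensors
        if (supported != 0) {
            sm.registerListener(this, supported, SensorManager.SENSOR_DELAY_NORMAL);
        } else {
            // Fall back to a reduced feature set rather than failing outright
            Log.w(tag, "Required sensors are not available on this device");
        }
    }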

Sensor example

The sample application simply monitors changes to the orientation and accelerometer sensors (see Download for the source code). When changes are received, the sensor values are displayed on the screen in TextView widgets. Figure 1 shows the application in action.

Figure 1. Monitoring acceleration and orientation

The application was created using the Eclipse environment with the Android Developer Tools plug-in. (For more information about developing Android applications with Eclipse, see Related topics.) Listing 1 shows the code for this application.

Listing 1. IBMEyes.java
package com.msi.ibm.eyes;
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
import android.hardware.SensorManager;
import android.hardware.SensorListener;
public class IBMEyes extends Activity implements SensorListener {
    final String tag = "IBMEyes";
    SensorManager sm = null;
    TextView xViewA = null;
    TextView yViewA = null;
    TextView zViewA = null;
    TextView xViewO = null;
    TextView yViewO = null;
    TextView zViewO = null;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
       // get reference to SensorManager
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        setContentView(R.layout.main);
        xViewA = (TextView) findViewById(R.id.xbox);
        yViewA = (TextView) findViewById(R.id.ybox);
        zViewA = (TextView) findViewById(R.id.zbox);
        xViewO = (TextView) findViewById(R.id.xboxo);
        yViewO = (TextView) findViewById(R.id.yboxo);
        zViewO = (TextView) findViewById(R.id.zboxo);
    }
    public void onSensorChanged(int sensor, float[] values) {
        synchronized (this) {
            Log.d(tag, "onSensorChanged: " + sensor + ", x: " +
                    values[0] + ", y: " + values[1] + ", z: " + values[2]);
            if (sensor == SensorManager.SENSOR_ORIENTATION) {
                xViewO.setText("Orientation X: " + values[0]);
                yViewO.setText("Orientation Y: " + values[1]);
                zViewO.setText("Orientation Z: " + values[2]);
            }
            if (sensor == SensorManager.SENSOR_ACCELEROMETER) {
                xViewA.setText("Accel X: " + values[0]);
                yViewA.setText("Accel Y: " + values[1]);
                zViewA.setText("Accel Z: " + values[2]);
            }            
        }
    }
    
    public void onAccuracyChanged(int sensor, int accuracy) {
        Log.d(tag, "onAccuracyChanged: " + sensor + ", accuracy: " + accuracy);
    }
    @Override
    protected void onResume() {
        super.onResume();
      // register this class as a listener for the orientation and accelerometer sensors
        sm.registerListener(this, 
                SensorManager.SENSOR_ORIENTATION |SensorManager.SENSOR_ACCELEROMETER,
                SensorManager.SENSOR_DELAY_NORMAL);
    }
    
    @Override
    protected void onStop() {
        // unregister listener
        sm.unregisterListener(this);
        super.onStop();
    }    
}

The application was written as a normal activity-based application because it simply updates the screen with data obtained from the sensors. In an application where the device may be running other activities in the foreground, constructing the sensor monitoring as a service would be more appropriate.
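
A minimal sketch of that service-based approach is shown below, assuming the same SensorListener API used in Listing 1; the class name is hypothetical, and a production version would also need to be declared in AndroidManifest.xml and would likely manage power and notification of interesting events.

import android.app.Service;
import android.content.Intent;
import android.hardware.SensorListener;
import android.hardware.SensorManager;
import android.os.IBinder;
import android.util.Log;

// Hypothetical background monitor built on the same SensorListener interface.
public class SensorMonitorService extends Service implements SensorListener {
    private static final String TAG = "SensorMonitorService";
    private SensorManager sm;

    @Override
    public void onCreate() {
        super.onCreate();
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        sm.registerListener(this,
                SensorManager.SENSOR_ACCELEROMETER,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onDestroy() {
        sm.unregisterListener(this);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started with startService(), not bound, in this sketch
    }

    public void onSensorChanged(int sensor, float[] values) {
        // React to readings here: log them, compare to a threshold, raise an alert
        Log.d(TAG, "sensor " + sensor + ", x: " + values[0]);
    }

    public void onAccuracyChanged(int sensor, int accuracy) {
        Log.d(TAG, "accuracy changed to " + accuracy);
    }
}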

The onCreate method of the activity gets a reference to the SensorManager, where all sensor-related functions take place. The onCreate method also establishes references to the six TextView widgets you'll need to update with sensor data values.

The onResume() method uses the reference to the SensorManager to register for sensor updates via the registerListener method:

  • The first parameter is an instance of a class that implements the SensorListener interface.
  • The second parameter is a bitmask of the desired sensors. In this case, the application is requesting data from SENSOR_ORIENTATION and SENSOR_ACCELEROMETER.
  • The third parameter is a hint for the system to indicate how quickly the application requires updates to the sensor values.

When the activity leaves the foreground, you want to unregister the listener so the application no longer receives sensor updates; in this example, that is done in onStop(). Unregistering is accomplished with the unregisterListener method of the SensorManager. The only parameter is the SensorListener instance.

In both the registerListener and unregisterListener calls, the application passes this. The class definition includes the implements keyword, declaring that the class implements the SensorListener interface; that is why the activity itself can be passed to registerListener and unregisterListener.

A SensorListener must implement two methods: onSensorChanged and onAccuracyChanged. The example application is not really concerned with the accuracy of the sensors, but rather with the current X, Y, and Z values of the sensors. The onAccuracyChanged method does essentially nothing; it just adds a log entry each time it is invoked.

The onSensorChanged method is invoked almost constantly because the accelerometer and orientation sensors send data rapidly. Look at the first parameter to determine which sensor is sending data. Once the sending sensor is identified, the appropriate UI elements are updated with the data contained in the array of float values passed as the second argument. The example simply displays those values; a more sophisticated application might analyze them, compare them to previous values, or run them through a pattern recognition algorithm to determine what the user (or the outside environment) is doing, as in the sketch that follows.
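
As a hypothetical example of that kind of analysis, the fragment below could be added to the class in Listing 1 and called from the SENSOR_ACCELEROMETER branch of onSensorChanged; it compares the current accelerometer magnitude with the previous reading and logs a sudden jolt. The threshold is an arbitrary value to tune, not something taken from the SDK.

    // Hypothetical jolt detector: flags a large change in accelerometer magnitude.
    private float lastMagnitude = 0f;
    private static final float JOLT_THRESHOLD = 5f; // arbitrary; tune per device

    private void checkForJolt(float[] values) {
        float magnitude = (float) Math.sqrt(
                values[0] * values[0] + values[1] * values[1] + values[2] * values[2]);
        if (Math.abs(magnitude - lastMagnitude) > JOLT_THRESHOLD) {
            // A real application might time-stamp the event or notify the user here
            Log.d(tag, "Sudden movement detected, delta: " + Math.abs(magnitude - lastMagnitude));
        }
        lastMagnitude = magnitude;
    }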

Now that you've examined the sensor subsystem, the next section reviews a code sample that records some audio on an Android phone. The sample was run on the DEV1 development device.

Using the MediaRecorder

The android.media package contains classes to interact with the media subsystem. The android.media.MediaRecorder class is used to take samples of media, including audio and video. The MediaRecorder operates as a state machine. You need to set various parameters, such as source device and format. Once set, the recording may begin for an arbitrary amount of time until subsequently stopped.

Listing 2 contains code for recording audio on an Android device. The code shown does not include the UI elements of the application (see Download for the full source code).

Listing 2. Recording an audio snippet
MediaRecorder mrec ;
File audiofile = null;
private static final String TAG="SoundRecordingDemo";
protected void startRecording() throws IOException
{
   mrec = new MediaRecorder(); // create the recorder before configuring it
   mrec.setAudioSource(MediaRecorder.AudioSource.MIC);
   mrec.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
   mrec.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
   if (audiofile == null)
   {
       File sampleDir = Environment.getExternalStorageDirectory();
       try 
       { 
          audiofile = File.createTempFile("ibm", ".3gp", sampleDir);
       }
       catch (IOException e) 
       {
           Log.e(TAG,"sdcard access error");
           return;
       }
   }
   mrec.setOutputFile(audiofile.getAbsolutePath());
   mrec.prepare();
   mrec.start();
}
protected void stopRecording() 
{
   mrec.stop();
   mrec.release();
   processaudiofile();
}
protected void processaudiofile() 
{
   ContentValues values = new ContentValues(3);
   long current = System.currentTimeMillis();
   values.put(MediaStore.Audio.Media.TITLE, "audio" + audiofile.getName());
   values.put(MediaStore.Audio.Media.DATE_ADDED, (int) (current / 1000));
   values.put(MediaStore.Audio.Media.MIME_TYPE, "audio/3gpp");
   values.put(MediaStore.Audio.Media.DATA, audiofile.getAbsolutePath());
   ContentResolver contentResolver = getContentResolver();
   
   Uri base = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
   Uri newUri = contentResolver.insert(base, values);
   
   sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, newUri));
}

In the startRecording method, an instance of a MediaRecorder is instantiated and initialized:

  • The input source is set to the microphone (MIC).
  • The output format is set to 3GPP (*.3gp files), which is a media format geared toward mobile devices.
  • The encoder is set to AMR_NB, an audio format that samples at 8 kHz. The NB is for narrow band. The SDK documentation explains the different data formats and the available encoders.

The audio file is stored on the storage card rather than in internal memory. Environment.getExternalStorageDirectory() returns the location of the storage card, and a temporary file name is created in that directory. This file is then associated with the MediaRecorder instance by a call to the setOutputFile method. The audio data will be stored in this file.

The prepare method is invoked to finalize the initialization of the MediaRecorder. When you're ready to commence the recording process, the start method is called. Recording takes place to the file on the storage card until the stop method is invoked. The release method frees resources allocated to the MediaRecorder instance.

Once the audio sample has been taken, there are a few actions that can take place:

  • Add the audio to the media library on the device.
  • Perform some pattern recognition steps to identify the sound:
    • Is this the baby crying?
    • Is this the owner's voice, and should we unlock the phone?
    • Is this the "open-sesame" phrase that unlocks the door to the secret entrance?
  • Automatically upload the audio file to a network location for processing.

In the code sample, the processaudiofile method adds the audio to the media library. An Intent is used to notify the on-device media application that new content is available.

One final note about this snippet of code: If you try it as-is, it won't record any audio. You will see a file created, but it will contain no audio data. You need to add a permission to the AndroidManifest.xml file:

<uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission>

At this point, you've learned a bit about interacting with Android sensors and recording audio. The next section takes a broader view of application architecture related to data gathering and reporting systems.

Android as a sensor platform

The Android platform boasts a healthy variety of sensor options for monitoring the environment. This array of input options, coupled with capable computational and networking functions, makes Android an attractive platform for building real-world systems. Figure 2 shows a simplified view of the relationship between inputs, application logic, and notification methods or outputs.

Figure 2. Block diagram for an Android-centric sensor system

This architecture is flexible; application logic can be split between the local Android device and a server-side resource that can tap into larger databases and computing power. For example, an audio track recorded on the local Android device can be POSTed to a Web server where the data is compared against a database of voice patterns. This is clearly just scratching the surface of the possibilities. Hopefully you're motivated to dig deeper into bringing Android to platforms beyond mobile phones.
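
As a sketch of that last idea, the hypothetical helper below POSTs a recorded 3GPP clip to a server using java.net.HttpURLConnection; the URL is made up, the server side is not shown, and the application would also need the android.permission.INTERNET permission in its manifest.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

import android.util.Log;

// Hypothetical upload helper: streams the recorded clip to a server for analysis.
public class AudioUploader {
    private static final String TAG = "AudioUploader";

    public static int uploadClip(String path) throws IOException {
        URL url = new URL("http://example.com/audio/upload"); // made-up endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "audio/3gpp");

        FileInputStream in = new FileInputStream(path);
        OutputStream out = conn.getOutputStream();
        byte[] buffer = new byte[4096];
        int count;
        while ((count = in.read(buffer)) != -1) {
            out.write(buffer, 0, count); // stream the file body to the server
        }
        out.close();
        in.close();

        int status = conn.getResponseCode(); // 200 means the server accepted the clip
        Log.d(TAG, "Upload finished with HTTP status " + status);
        conn.disconnect();
        return status;
    }
}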

Summary

In this article, you got an introduction to Android sensors. Sample applications measured orientation and acceleration, and interacted with the recording capabilities using the MediaRecorder class. Android is a flexible, attractive platform for building real-world systems. The Android space is maturing rapidly, and it is going places. Be sure to keep an eye on this platform.


Downloadable resources


Related topics

