Activity Recognition

Did you know that your phone "automatically" knows when you are still, walking, running, on a bike, on a bus, etc.? In this lecture, we will learn about Android's Activity Recognition (AR for short) capability. In essence, the phone uses sensing data from the accelerometer embedded in the phone to infer the activity of the user -- this is typically called passive sensing. The way this works is that Android offers an API that will return the current activity -- well, to be more specific: Android returns one or more activities and their confidence levels when you use the AR API.

The theory behind passive sensing of activity is as follows. Let me explain through an example: people young and old walk very differently, right? They have different gaits, speeds, etc. Therefore there is no single mathematical model that captures this physical activity across the general population. What Google has done is collect a lot of training data from lots of different people, asking them to label their activities: walking, running, still, on bike, in car, etc. Then they trained a machine learning "classifier" (or model) to make inferences (read: predictions) about what the user is doing using periodic unlabelled accelerometer data from users' phones. Google trains the classifier with a massive amount of data. The data captures the walking styles (in my example) of a generalized population of people. They then build that classifier and probably push it to your phone. The phone records your accelerometer data for a period (say 30 seconds), computes "features" (e.g., statistics) and then uses these features as input parameters to the classifier model: the model spits out the "classes" (e.g., walking, running) based on the input. Android makes these inferences available to apps -- assuming the user gives the app permission.
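
To make that concrete, here is a minimal sketch of the "features" step (not Google's actual pipeline, which is proprietary -- FeatureExtractor and extractFeatures are hypothetical names I made up for illustration):

// A hedged sketch: compute simple statistical "features" over a window of
// accelerometer magnitude samples. A trained classifier would take a feature
// vector like this as input and output an activity class.
public class FeatureExtractor {

    // samples: accelerometer magnitudes sqrt(x*x + y*y + z*z) collected
    // over a window (e.g., 30 seconds).
    public static double[] extractFeatures(double[] samples) {
        double sum = 0;
        for (double s : samples) sum += s;
        double mean = sum / samples.length;

        double sqDiff = 0;
        double max = Double.NEGATIVE_INFINITY;
        for (double s : samples) {
            sqDiff += (s - mean) * (s - mean);
            if (s > max) max = s;
        }
        double stdDev = Math.sqrt(sqDiff / samples.length);

        // The feature vector: mean, standard deviation and peak magnitude.
        return new double[]{mean, stdDev, max};
    }
}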

In MyRuns4 you need to use the AR API to periodically get the user's activity. You display the current activity on the Google Map. And when the user saves the current "automatic exercise" (i.e., when the app determines the type of exercise -- walking, running, biking, etc.) the app determines the majority inference type it has seen and saves that. So, for example, if a user was still for 5 mins and walked for 10 mins but mostly ran (25 mins), then running is saved, because the vast majority of inferences recorded were running. A simple way to compute that majority is sketched below.
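
Here is a minimal sketch of the majority vote, assuming the app records one activity type per inference (ActivityVoter and majorityActivity are hypothetical names, not part of the MyRuns skeleton):

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ActivityVoter {

    // inferences: the activity types (e.g., DetectedActivity.RUNNING)
    // recorded periodically during the workout.
    public static int majorityActivity(List<Integer> inferences) {
        Map<Integer, Integer> counts = new HashMap<>();
        int bestType = -1;
        int bestCount = 0;
        for (int type : inferences) {
            int count = counts.getOrDefault(type, 0) + 1;
            counts.put(type, count);
            if (count > bestCount) {
                bestCount = count;
                bestType = type;
            }
        }
        return bestType; // e.g., RUNNING if most inferences were running
    }
}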

What this lecture will teach you

Demo project

The demo code used in this lecture includes:

We will use the ActivityRecognition app to demonstrate how to code up an application that periodically gets the activity of the user.

The code comprises the MainActivity, which creates a service to connect to Play activity recognition. The service manages the connection and requests activity updates using a PendingIntent. The PendingIntent fires an IntentService to handle the activity updates from Android and post broadcasts back to the MainActivity, which updates the activity and confidence level (e.g., ON_FOOT, 90) on the UI.

Resources

There isn't a section in the book on Activity Recognition. Here are a few resources other than these notes if you need any additional pointers:

Your phone knows what you are doing

Many apps are being built to take into account what the user is doing. For example, if you are driving, your phone might automatically pair with the car speaker and take advantage of the display in the car to show directions, manage interruptions, etc. This app exists and it's called Android Auto. It won't work in my crappy 16-year-old Subaru but might in your car. It is a very cool app that incorporates AR. There are many more apps taking AR into account, particularly mobile health and sports apps.

Android's Activity Recognition API automatically detects the following user activities using small amounts of periodically processed accelerometer data -- typically these are called activities or classes:

The AR API also returns:

Note that Android's Activity Recognition API (AR API) may stop reporting the user's activity if the phone has been still for a while. The logic there is: the phone has been still for some time, so let's save battery and not keep running the AR pipeline (another name for a set of algorithms). The phone has a number of low-power sensors that will start the reporting again once they detect movement. It does this to save energy. In fact, it might surprise you, but your phone is stacked with a number of very low-powered chips that are sensing light, sound, temperature, metal objects, elevation, etc. Your phone is a very capable and "smart" device.

The API returns a list of activities with their associated confidence levels (between 0-100, where 100 is very high confidence and 0 is no confidence). Why would it not return just a single activity (e.g., walking)? Because you might be on foot (standing but moving), walking, just getting on your bike, moving off in your car, or in transition between walking and running -- all of these activities might look very similar if you are just looking at the raw X, Y, Z accelerometer signals. Therefore the classifier tells you what it "thinks" you are doing with varying degrees of confidence. For example, it might return the following in one invocation of the AR API:
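
ON_FOOT    confidence: 62
WALKING    confidence: 30
RUNNING    confidence: 8

(These particular numbers are just illustrative.)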

It makes sense that the classifier/AR API would return a list of the 3 activities and their respective confidence levels. Then it's up to you, the developer, to interpret the results as you wish -- e.g., select the activity with the highest confidence level, or only display the activity if it exceeds a confidence threshold (if activity.confidence > 70). Your app could say, em: I want to be quite sure that the user is running.
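
Here is a hedged sketch of both strategies (the threshold of 70 is just an example; result is an ActivityRecognitionResult, which we cover later in these notes):

// Pick the detected activity with the highest confidence, and only act on it
// if it clears a threshold (e.g., 70).
DetectedActivity best = null;
for (DetectedActivity activity : result.getProbableActivities()) {
    if (best == null || activity.getConfidence() > best.getConfidence()) {
        best = activity;
    }
}
if (best != null && best.getConfidence() > 70) {
    // We are quite sure, e.g., best.getType() == DetectedActivity.RUNNING.
    Log.d(TAG, "Confident activity: " + best.getType());
}

(The API also offers ActivityRecognitionResult.getMostProbableActivity() if you only want the top activity.)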

You need to update a number of files and resources to get AR to work. Let's look at how to do that now.

Updating the Manifest

To set up AR you have to update your manifest by declaring any services you might implement (we implement two services in our example ActivityRecognition app). The app also has to request the user's permission to use ACTIVITY_RECOGNITION, as shown below.


<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION" />

<service android:name=".services.DetectedActivityIntentService" />
<service android:name=".services.ActivityDetectionService" />

Don't forget you need to update your build.gradle file and import Play Services under the dependencies node. You have already done that for location and maps for MyRuns. It is worth noting that to use location, maps and activity recognition you have to have Play incorporated into your app. These services are part of [Google Mobile Services (GMS)](https://www.android.com/gms/) and the phone serves as a "client" to Google "server"-side services. Question: I understand why you need Play to get maps, but why for reading the location and accelerometer on the phone? It's because Google and Apple want that information for various reasons. Think about why you think they want your data. It could be to monetize it and/or show congestion on the route you are traveling using Google Maps. There are many reasons.
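
For example, the dependency might look like this (the version number here is illustrative -- use a current Play services release):

dependencies {
    // play-services-location also contains the ActivityRecognitionClient used below.
    implementation 'com.google.android.gms:play-services-location:21.0.1'
}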

requestActivityUpdates -- Register for activity recognition updates

To register for activity recognition updates we first need to create an activity recognition client and then request activity updates using a PendingIntent. We also set the DETECTION_INTERVAL_IN_MILLISECONDS; that is, the desired time between activity detections. This can be set to 0 for the fastest update rate, but there could be a lot of noise in the signal. It is better to ask Android to assess the activity over a larger window -- 30 secs, for example. The bottom line is that larger values will result in fewer activity detections and less impact on battery life. As with all services associated with sensors (location, AR), there is a cost in battery performance if you are using the service or API at a high rate. But sometimes you want very fine location updates or very fine activity updates. It is worth keeping the cost in mind: you don't want to develop a great app and have people remove it because it drains the battery. And users will do that. If your app changes their recharging behavior, your app is toast.
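
In the demo code the interval lives in the Constant class; a 30-second window might be declared like this (the value is just an example):

// Desired time between activity detections; 0 would request the fastest
// (noisiest, most battery-hungry) rate.
public static final long DETECTION_INTERVAL_IN_MILLISECONDS = 30 * 1000;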

Typically applications want to monitor user activities in the "background" (when the user pushes the app into the background and it's not visible) and perform an action when a specific activity is detected -- for example, high confidence of walking. This can be accomplished with a PendingIntent callback (typically an IntentService, as in the case of our demo code), which will be called with an intent when activities are detected. The intent recipient can extract the ActivityRecognitionResult using extractResult(android.content.Intent). More on that shortly.

Note, the PendingIntent is created with the PendingIntent.FLAG_UPDATE_CURRENT flag. This tells Android that any request previously registered with requestActivityUpdates using the same PendingIntent will be replaced by this request, which simply updates the extras.

Also note that the call to requestActivityUpdates will keep the Google Play services connection active, so we need to call removeActivityUpdates(PendingIntent) when we no longer need AR updates. In our example code we do this when the ActivityDetectionService is killed, in onDestroy().


mActivityRecognitionClient = new ActivityRecognitionClient(this);
Task<Void> task = mActivityRecognitionClient.requestActivityUpdates(
        DETECTION_INTERVAL_IN_MILLISECONDS,
        mPendingIntent);
                            

In ActivityRecognition we start the ActivityDetectionService, which in onStartCommand() gets the client, creates a pending intent that fires the IntentService DetectedActivityIntentService (with a worker thread) to handle the activity updates, and then requests activity updates, as shown below in the ActivityDetectionService snippet.


public int onStartCommand(Intent intent, int flags, int startId) {
    super.onStartCommand(intent, flags, startId);

    mActivityRecognitionClient = new ActivityRecognitionClient(this);
    Intent mIntentService = new Intent(this, DetectedActivityIntentService.class);
    mPendingIntent = PendingIntent.getService(this, 1, mIntentService, PendingIntent.FLAG_UPDATE_CURRENT);
    requestActivityUpdatesHandler();

    return START_STICKY;
}

Note, a PendingIntent will be sent for each activity detection update. That means every time Android has a new update (based on DETECTION_INTERVAL_IN_MILLISECONDS) it will create a new DetectedActivityIntentService, and the worker thread will process the update and broadcast the result back to the UI thread. This happens for every update. This is nice because the handler does not run on the UI thread. Again, think threaded design. Starting to get that?

The call to requestActivityUpdates will return a Task object for apps to check the status of the call. If the task fails, the status code for the failure can be found by examining getStatusCode(). More specifically, ActivityDetectionService sets up listeners to handle success and failure of the request for updates. In our code we simply log success ("Successfully requested activity updates") or log an error in the case of failure, as shown in the code below.


// helper method for onStartCommand to request activity updates and set task listeners.
public void requestActivityUpdatesHandler() {
    if (mActivityRecognitionClient != null) {
        Task<Void> task = mActivityRecognitionClient.requestActivityUpdates(
                Constant.DETECTION_INTERVAL_IN_MILLISECONDS,
                mPendingIntent);

        // Adds a listener that is called if the Task completes successfully.
        task.addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void result) {
                Log.d(TAG, "Successfully requested activity updates");
            }
        });

        // Adds a listener that is called if the Task fails.
        task.addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                Log.e(TAG, "Requesting activity updates failed to start");
            }
        });
    }
}
        

If we follow the control flow of mActivityRecognitionClient.requestActivityUpdates, the pending intent will be fired once AR has an update, creating a short-lived IntentService -- the service's worker thread onHandleIntent() processes the activity update and broadcasts the result back to the MainActivity to display on the UI.

removeActivityUpdates

Once an app has issued requestActivityUpdates it opens a connection to Play services that periodically provides updates via callbacks for new activities. Once the service (in our case ActivityDetectionService) is killed, it needs to remove the PendingIntent and the Play services connection by calling removeActivityUpdates, as shown in the snippet of code below. Note, we set up callbacks for success (onSuccess) and failure (onFailure) and handle those conditions, in this case by logging. This removes all activity updates for the specified PendingIntent. The call returns a Task object, as discussed above.


// remove the activity requested updates from Google play.
@Override
public void onDestroy() {
    super.onDestroy();
    // need to remove the request to Google play services. Brings down the connection
    removeActivityUpdatesHandler();
}

// remove updates and set up callbacks for success or failure
public void removeActivityUpdatesHandler() {
    if (mActivityRecognitionClient != null) {
        Task<Void> task = mActivityRecognitionClient.removeActivityUpdates(
                mPendingIntent);

        // Adds a listener that is called if the Task completes successfully.
        task.addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void result) {
                Log.d(TAG, "Removed activity updates successfully!");
            }
        });

        // Adds a listener that is called if the Task fails.
        task.addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                Log.e(TAG, "Failed to remove activity updates!");
            }
        });
    }
}

ActivityRecognitionResult.extractResult(intent) -- getting results back

OK, this is a lot of plumbing to get the activity of the user -- or, as we will see, potentially a list of activities with confidence levels. In our code the DetectedActivityIntentService onHandleIntent() extracts the ActivityRecognitionResult using extractResult.

In the code below we extract the result from the intent, then pop off each activity and broadcast the result to the UI. Take a look at the demo code to see how the broadcast receiver is wired up and how the many interactions between the MainActivity and the services fit together. If you run the code you can simulate different activities by shaking the phone to emulate walking, running, etc. You will see (look at the LogCat) different classes being returned by the API.


@Override
protected void onHandleIntent(Intent intent) {

    Log.d(TAG, "onHandleIntent()");
    ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);

    // Get the list of the probable activities associated with the current state of the
    // device. Each activity is associated with a confidence level (between 0-100).
    List<DetectedActivity> detectedActivities = result.getProbableActivities();

    for (DetectedActivity activity : detectedActivities) {
        Log.d(TAG, "Detected activity: " + activity.getType() + ", " + activity.getConfidence());
        broadcastActivity(activity);
    }
}
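
The broadcastActivity() helper sends each detected activity back to the MainActivity. Here is a hedged sketch of what it might look like -- the action string and extra keys here are made up, so check the demo code for the real names:

// Broadcast one detected activity (type + confidence) back to the UI.
private void broadcastActivity(DetectedActivity activity) {
    Intent intent = new Intent("ACTIVITY_UPDATE");            // hypothetical action name
    intent.putExtra("type", activity.getType());              // hypothetical extra keys
    intent.putExtra("confidence", activity.getConfidence());
    LocalBroadcastManager.getInstance(this).sendBroadcast(intent);
}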
                        

In our code, for each activity returned in the list from result.getProbableActivities(), we broadcast the type and confidence level to the UI thread, which in turn displays those values on the UI. Assuming, say, the requested update period is 30 seconds (it could be faster or slower, of course), you would see the screen update to one of the activities. Try it. Take the demo code for a run, or walk, or stand still, or dance, or bike, or get in a car. It will guess your activity with good accuracy. Note, if the phone has been sitting on the desk, still, for a while, it will take some time for it to "wake up" from low-powered still mode and give updates. So it is not your code but the Android system optimizing its sensors and pipelines.


private void handleUserActivity(int type, int confidence) {
    String label = "Unknown";
    switch (type) {
        case DetectedActivity.IN_VEHICLE: {
            label = "In_Vehicle";
            break;
        }
        case DetectedActivity.ON_BICYCLE: {
            label = "On_Bicycle";
            break;
        }
        case DetectedActivity.ON_FOOT: {
            label = "On_Foot";
            break;
        }
        case DetectedActivity.RUNNING: {
            label = "Running";
            break;
        }
        case DetectedActivity.STILL: {
            label = "Still";
            break;
        }
        case DetectedActivity.TILTING: {
            label = "Tilting";
            break;
        }
        case DetectedActivity.WALKING: {
            label = "Walking";
            break;
        }
        case DetectedActivity.UNKNOWN: {
            break; // label stays "Unknown"
        }
    }

    Log.d(TAG, "broadcast:onReceive(): Activity is " + label
            + " and confidence level is: " + confidence);

    mTextARLabel.setText(label);
    mTextConfidence.setText(confidence + "");
}
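
For completeness, here is a hypothetical sketch of how the MainActivity might wire up a BroadcastReceiver that feeds handleUserActivity() (the action and extra names match the broadcastActivity() sketch above, not necessarily the demo code):

// Receive activity updates broadcast by DetectedActivityIntentService.
private final BroadcastReceiver mReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int type = intent.getIntExtra("type", DetectedActivity.UNKNOWN);
        int confidence = intent.getIntExtra("confidence", 0);
        handleUserActivity(type, confidence);
    }
};

// In onCreate()/onStart():
// LocalBroadcastManager.getInstance(this).registerReceiver(
//         mReceiver, new IntentFilter("ACTIVITY_UPDATE"));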

The Magic of Machine Learning -- and why you should just do it!

Finally, Google's backend AR classifier has 100s of millions of samples from many different users all over the world. So even if you are a 10-year-old kid, an old fella like me, or my dear 86-year-old mum from Ireland -- assuming we all used the sample code and put the phone in our bag/pants, etc. -- Android AR would be smart enough to say we were all walking (assuming we all were walking), even though we would all walk very, very differently, with different gaits, speeds, etc. How does it do this? The magic of machine learning and big data! The more data, the better the accuracy of the classifier running on your phone or in the cloud.

Hey, one piece of advice, students: don't leave Dartmouth without taking as many ML/data science classes as you can -- it's the new algorithm. It is impacting every sector. We live in a data-driven world. It's so exciting. Be part of the future -- just do it!