Android Activity Recognition API

Nowadays, almost everyone carries a smartphone, and we use these devices throughout our day-to-day lives. One of the best things about the Android applications on our phones is that they try to understand their users better and better. Many applications today use the user’s location to provide location-related feeds. A common example is a news application that takes your current location and presents the news relevant to that location.

If you are an Android developer, then to give your users a better experience of your application, you have to understand them well. You should know what your users are doing at any instant of time: the more you know about your users, the better the application you can build for them. Many applications already rely on this kind of activity recognition. For example, a mileage-tracker app can start recording when you begin driving a car or riding a bike and stop when you stop, which lets it calculate the distance you traveled on a particular day. Another example is a health and fitness app that determines how many meters or kilometers you ran or walked on a particular day, so that it can estimate the calories you burned.

What is the Activity Recognition API?

The Activity Recognition API is an interface that periodically wakes the device, reads bursts of data from the device’s sensors, and then analyzes this data using powerful machine learning models.

Activity detection isn’t an exact science, so rather than returning a single activity that the user is definitely performing, the Activity Recognition API returns a list of activities that the user may be performing, with a confidence property for each activity. This confidence property is always an integer, ranging from 0 to 100. If an activity is accompanied by a confidence property of 75% or higher, then it’s generally safe to assume that the user is performing this activity and adjust your application’s behavior accordingly (although it’s not impossible for multiple activities to have a high confidence percentage, especially activities that are closely related, such as running and walking).
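For illustration, here is a minimal Kotlin sketch of how such a confidence check might look. It assumes you already have an ActivityRecognitionResult in hand; how to obtain one from a PendingIntent is shown later in this article.

import android.util.Log
import com.google.android.gms.location.ActivityRecognitionResult

// Minimal sketch: log only the detected activities whose confidence crosses a threshold.
fun logConfidentActivities(result: ActivityRecognitionResult) {
    val threshold = 75 // confidence is an Int between 0 and 100
    result.probableActivities
        .filter { it.confidence >= threshold }
        .forEach { activity ->
            Log.d("ActivityRecognition", "type=${activity.type}, confidence=${activity.confidence}")
        }
}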

We’re going to display this confidence percentage in our application’s UI, so you’ll be able to see exactly how this property updates, in response to changing user activity.

The Activity Recognition API can detect the following activities:

  • STILL: Detected when the mobile device is not moving at all, i.e. the user is sitting somewhere or the device is lying at rest.
  • ON_FOOT: Detected when the mobile device is moving at a normal human pace, i.e. the user carrying it is either walking or running.
  • WALKING: A sub-activity of ON_FOOT, detected when the user carrying the mobile device is walking.
  • RUNNING: Also a sub-activity of ON_FOOT, detected when the user carrying the mobile device is running.
  • IN_VEHICLE: Detected when the mobile device is in a bus, car, or some other kind of vehicle, i.e. the user holding it is traveling in a vehicle.
  • ON_BICYCLE: Detected when the device, or the user carrying it, is on a bicycle.
  • TILTING: Detected when the mobile device is being lifted and its angle relative to a flat surface changes, for example when it is picked up from a table.
  • UNKNOWN: Returned when the Activity Recognition Client is unable to determine the current activity.
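As a quick illustration of how these constants are used in code, the following small Kotlin sketch maps each DetectedActivity type to a readable label (a fuller version of this mapping appears later in this article’s MainActivity):

import com.google.android.gms.location.DetectedActivity

// Sketch: turn a DetectedActivity type constant into a human-readable label.
fun activityLabel(type: Int): String = when (type) {
    DetectedActivity.STILL -> "Still"
    DetectedActivity.ON_FOOT -> "On Foot"
    DetectedActivity.WALKING -> "Walking"
    DetectedActivity.RUNNING -> "Running"
    DetectedActivity.IN_VEHICLE -> "In Vehicle"
    DetectedActivity.ON_BICYCLE -> "On Bicycle"
    DetectedActivity.TILTING -> "Tilting"
    else -> "Unknown" // covers DetectedActivity.UNKNOWN and any new types
}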

Detect when users start or end an activity

It might be necessary to design your app to identify when a user starts or stops a particular activity, such as walking, biking, or driving. For example, a mileage tracking app could start tracking miles when a user starts driving, or a messaging app could mute all conversations until the user stops driving.

The Activity Recognition Transition API can be used to detect changes in the user’s activity. Your app subscribes to a transition in activities of interest and the API notifies your app only when needed. This page shows how to use the Activity Recognition Transition API, also called the Transition API for short.
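This article’s sample app uses the plain ActivityRecognitionClient API rather than the Transition API, but as a rough sketch, subscribing to activity transitions might look like the following (the pendingIntent is a placeholder for whatever component you want the events delivered to):

import android.app.PendingIntent
import android.content.Context
import com.google.android.gms.location.ActivityRecognition
import com.google.android.gms.location.ActivityTransition
import com.google.android.gms.location.ActivityTransitionRequest
import com.google.android.gms.location.DetectedActivity

// Sketch: ask to be notified when the user starts or stops walking.
fun requestWalkingTransitions(context: Context, pendingIntent: PendingIntent) {
    val transitions = listOf(
        ActivityTransition.Builder()
            .setActivityType(DetectedActivity.WALKING)
            .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_ENTER)
            .build(),
        ActivityTransition.Builder()
            .setActivityType(DetectedActivity.WALKING)
            .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_EXIT)
            .build()
    )
    ActivityRecognition.getClient(context)
        .requestActivityTransitionUpdates(ActivityTransitionRequest(transitions), pendingIntent)
        .addOnSuccessListener { /* subscribed successfully */ }
        .addOnFailureListener { e -> /* handle or log the failure */ }
}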

How can I use the Activity Recognition API?

Google Play’s Health & Fitness category is packed with apps dedicated to measuring and analyzing your day-to-day physical activities, which makes it a great place to get some inspiration about how you might use Activity Recognition in your own projects. For example, you could use the Activity Recognition API to create an app that motivates the user to get up and stretch when they’ve been stationary for an extended period of time, or an application that tracks the user’s daily run and plots their route on a map, ready for them to post to Facebook (because if Facebook isn’t aware that you got up early and went for a run before work, did it even really happen?).

While you could deliver the same functionality without the Activity Recognition API, this would require the user to notify your app whenever they’re about to start a relevant activity. You can provide a much better user experience by monitoring these activities and then performing the desired action automatically.

Although fitness applications are the obvious choice, there are lots of ways you can use Activity Recognition in applications that don’t fall into the Health & Fitness category. For example, your app might switch to a “hands-free” mode whenever it detects that the user is cycling, request location updates more frequently when the user is walking or running, or display the quickest way to reach a destination by road when the user is traveling in a vehicle.

Activity Recognition

Earlier, user activity was detected using LocationClient and ActivityRecognitionApi. Both of these have since been deprecated, and we now have to use ActivityRecognitionClient instead (hopefully it won’t be deprecated again; the main intent behind the change was to provide a more efficient API).

Here is a code snippet that can be used to detect the activity.

// Create the client and request activity updates, delivered to a PendingIntent.
// (Constants.DETECTION_INTERVAL_IN_MILLISECONDS and mPendingIntent are defined elsewhere in the project.)
ActivityRecognitionClient mActivityRecognitionClient = new ActivityRecognitionClient(this);
Task<Void> task = mActivityRecognitionClient.requestActivityUpdates(
        Constants.DETECTION_INTERVAL_IN_MILLISECONDS, mPendingIntent);

// Later, in the component that receives the PendingIntent, extract the result:
ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
List<DetectedActivity> detectedActivities = result.getProbableActivities();
for (DetectedActivity activity : detectedActivities) {
    Log.e(TAG, "Detected activity: " + activity.getType() + ", " + activity.getConfidence());
}

Setup Project for Android App

Open Android Studio and create a project with the Empty Activity template.

Adding dependencies and permissions

The Activity Recognition Client requires the Google Play services dependency. To add it, put the following line in your app-level build.gradle file:

implementation 'com.google.android.gms:play-services-location:16.0.0'

After adding the Google Play services dependency, add the ACTIVITY_RECOGNITION permission to the AndroidManifest.xml file:

<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION" />
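Note that the permission above is the Google Play services permission that matches the dependency version used here. On Android 10 (API level 29) and higher, the platform additionally requires the android.permission.ACTIVITY_RECOGNITION runtime permission, so if you target newer devices you would also declare that permission and request it at runtime. A minimal sketch of such a check (assuming compileSdkVersion 29 or higher; the request code 100 is arbitrary):

import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import android.os.Build

// Sketch: on Android 10+ the ACTIVITY_RECOGNITION permission must also be granted at runtime.
fun requestActivityRecognitionPermissionIfNeeded(activity: Activity) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q &&
        activity.checkSelfPermission(Manifest.permission.ACTIVITY_RECOGNITION)
            != PackageManager.PERMISSION_GRANTED
    ) {
        activity.requestPermissions(arrayOf(Manifest.permission.ACTIVITY_RECOGNITION), 100)
    }
}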

Adding the UI of the application

So, the dependencies and permissions are in place. The next step is to add the UI for our MainActivity. The application will have one TextView to display the name of the current activity, one TextView to display its confidence percentage, and two Buttons to start and stop tracking. The activity_main.xml file looks like this:

<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextView
        android:id="@+id/txt_activity"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_margin="24dp"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="48dp"
        android:textAllCaps="true"
        android:textColor="@color/colorPrimary"
        android:textSize="18dp"
        android:textStyle="bold"
        app:layout_constraintBottom_toTopOf="@+id/txt_confidence"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

    <TextView
        android:id="@+id/txt_confidence"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center_horizontal"
        android:layout_margin="24dp"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:textAllCaps="true"
        android:textSize="14dp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/btn_start_tracking"
        android:layout_width="240dp"
        android:layout_height="wrap_content"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="8dp"
        android:text="Start Tracking"
        app:layout_constraintBottom_toTopOf="@+id/btn_stop_tracking"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

    <Button
        android:id="@+id/btn_stop_tracking"
        android:layout_width="240dp"
        android:layout_height="wrap_content"
        android:layout_alignParentRight="true"
        android:layout_alignParentBottom="true"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="8dp"
        android:text="Stop Tracking"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

</android.support.constraint.ConstraintLayout>

Now, add the strings values in the res/values/strings.xml file:

<resources>
    <string name="app_name">Activity Recognition</string>
    <string name="activity_in_vehicle">In Vehicle</string>
    <string name="activity_on_bicycle">On Bicycle</string>
    <string name="activity_on_foot">On Foot</string>
    <string name="activity_running">Running</string>
    <string name="activity_still">Still</string>
    <string name="activity_tilting">Tilting</string>
    <string name="activity_walking">Walking</string>
    <string name="activity_unknown">Unknown</string>
</resources>

We are now done with the UI part of the application, so let’s move on to the coding part.

Creating an IntentService

After building the UI, our next task is to create a class that extends IntentService. This class receives the list of probable activities detected for the user (WALKING, RUNNING, ON_FOOT, and so on) and broadcasts each of them to the rest of the app. The code of my DetectedActivitiesIntentService is:

// Use the TAG to name the worker thread.
class DetectedActivitiesIntentService : IntentService(TAG) {

    override fun onCreate() {
        super.onCreate()
    }

    override fun onHandleIntent(intent: Intent?) {
        val result = ActivityRecognitionResult.extractResult(intent)
        // Get the list of the probable activities associated with the current state of the
        // device. Each activity is associated with a confidence level, which is an int between
        // 0 and 100.
        val detectedActivities = result.probableActivities as ArrayList<*>
        for (activity in detectedActivities) {
            broadcastActivity(activity as DetectedActivity)
        }
    }

    private fun broadcastActivity(activity: DetectedActivity) {
        val intent = Intent(MainActivity.BROADCAST_DETECTED_ACTIVITY)
        intent.putExtra("type", activity.type)
        intent.putExtra("confidence", activity.confidence)
        LocalBroadcastManager.getInstance(this).sendBroadcast(intent)
    }

    companion object {
        protected val TAG = DetectedActivitiesIntentService::class.java.simpleName
    }
}

Activity Running in Background

Our next step is to write the code for the MainActivity.kt file. But before that, keep battery performance in mind: if you ask the Activity Recognition Client to update activities at a very frequent interval, it will drain the device’s battery faster. If you want the application to continuously track the ongoing activity, it makes sense to run the detection in a background service, but battery consumption should still be taken care of.
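For example, instead of the 1-second interval this demo uses (DETECTION_INTERVAL_IN_MILLISECONDS in MainActivity below), a production app might poll far less often. The constant below is purely illustrative; pick whatever interval suits your use case and battery budget:

import java.util.concurrent.TimeUnit

// Hypothetical: a more battery-friendly detection interval than the demo's 1000 ms.
// Longer intervals mean fewer wake-ups and less battery drain, at the cost of slower updates.
val BATTERY_FRIENDLY_DETECTION_INTERVAL_MS: Long = TimeUnit.SECONDS.toMillis(30)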

So, make a class that will detect the Activities in the background. Here is the code:

class BackgroundDetectedActivitiesService : Service() {

    private lateinit var mIntentService: Intent
    private lateinit var mPendingIntent: PendingIntent
    private lateinit var mActivityRecognitionClient: ActivityRecognitionClient

    internal var mBinder: IBinder = LocalBinder()

    inner class LocalBinder : Binder() {
        val serverInstance: BackgroundDetectedActivitiesService
            get() = this@BackgroundDetectedActivitiesService
    }

    override fun onCreate() {
        super.onCreate()
        mActivityRecognitionClient = ActivityRecognitionClient(this)
        mIntentService = Intent(this, DetectedActivitiesIntentService::class.java)
        mPendingIntent = PendingIntent.getService(this, 1, mIntentService, PendingIntent.FLAG_UPDATE_CURRENT)
        requestActivityUpdatesButtonHandler()
    }

    override fun onBind(intent: Intent): IBinder? {
        return mBinder
    }

    override fun onStartCommand(intent: Intent, flags: Int, startId: Int): Int {
        super.onStartCommand(intent, flags, startId)
        return Service.START_STICKY
    }

    fun requestActivityUpdatesButtonHandler() {
        val task = mActivityRecognitionClient.requestActivityUpdates(
                MainActivity.DETECTION_INTERVAL_IN_MILLISECONDS,
                mPendingIntent)
        task.addOnSuccessListener {
            Toast.makeText(applicationContext,
                    "Successfully requested activity updates",
                    Toast.LENGTH_SHORT)
                    .show()
        }
        task.addOnFailureListener {
            Toast.makeText(applicationContext,
                    "Requesting activity updates failed to start",
                    Toast.LENGTH_SHORT)
                    .show()
        }
    }

    fun removeActivityUpdatesButtonHandler() {
        val task = mActivityRecognitionClient.removeActivityUpdates(mPendingIntent)
        task.addOnSuccessListener {
            Toast.makeText(applicationContext,
                    "Removed activity updates successfully!",
                    Toast.LENGTH_SHORT)
                    .show()
        }
        task.addOnFailureListener {
            Toast.makeText(applicationContext, "Failed to remove activity updates!",
                    Toast.LENGTH_SHORT).show()
        }
    }

    override fun onDestroy() {
        super.onDestroy()
        removeActivityUpdatesButtonHandler()
    }

    companion object {
        private val TAG = BackgroundDetectedActivitiesService::class.java.simpleName
    }
}
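One detail the code above assumes but does not show: both service classes must be registered in the AndroidManifest.xml, otherwise the PendingIntent will never reach DetectedActivitiesIntentService and the background service will never start. Assuming the classes live in the application’s root package, the entries inside the <application> element would look roughly like this:

<!-- Inside the <application> element of AndroidManifest.xml -->
<service
    android:name=".DetectedActivitiesIntentService"
    android:exported="false" />
<service
    android:name=".BackgroundDetectedActivitiesService"
    android:exported="false" />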

Code for MainActivity.kt

So, our final task is to write the code for the MainActivity.kt file. Here, a BroadcastReceiver is used to receive the activity updates: whenever there is a change in the detected activity, the new activity and its confidence are received and shown in the UI. Following is the code for the MainActivity.kt file:

class MainActivity : AppCompatActivity() {

    private val TAG = MainActivity::class.java.simpleName

    internal lateinit var broadcastReceiver: BroadcastReceiver
    private lateinit var txtActivity: TextView
    private lateinit var txtConfidence: TextView
    private lateinit var btnStartTracking: Button
    private lateinit var btnStopTracking: Button

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        txtActivity = findViewById(R.id.txt_activity)
        txtConfidence = findViewById(R.id.txt_confidence)
        btnStartTracking = findViewById(R.id.btn_start_tracking)
        btnStopTracking = findViewById(R.id.btn_stop_tracking)
        btnStartTracking.setOnClickListener { startTracking() }
        btnStopTracking.setOnClickListener { stopTracking() }
        broadcastReceiver = object : BroadcastReceiver() {
            override fun onReceive(context: Context, intent: Intent) {
                if (intent.action == MainActivity.BROADCAST_DETECTED_ACTIVITY) {
                    val type = intent.getIntExtra("type", -1)
                    val confidence = intent.getIntExtra("confidence", 0)
                    handleUserActivity(type, confidence)
                }
            }
        }
        startTracking()
    }

    private fun handleUserActivity(type: Int, confidence: Int) {
        var label = getString(R.string.activity_unknown)
        when (type) {
            DetectedActivity.IN_VEHICLE -> {
                label = "You are in Vehicle"
            }
            DetectedActivity.ON_BICYCLE -> {
                label = "You are on Bicycle"
            }
            DetectedActivity.ON_FOOT -> {
                label = "You are on Foot"
            }
            DetectedActivity.RUNNING -> {
                label = "You are Running"
            }
            DetectedActivity.STILL -> {
                label = "You are Still"
            }
            DetectedActivity.TILTING -> {
                label = "Your phone is Tilted"
            }
            DetectedActivity.WALKING -> {
                label = "You are Walking"
            }
            DetectedActivity.UNKNOWN -> {
                label = "Unknown Activity"
            }
        }
        Log.e(TAG, "User activity: $label, Confidence: $confidence")
        if (confidence > MainActivity.CONFIDENCE) {
            txtActivity.text = label
            txtConfidence.text = "Confidence: $confidence"
        }
    }

    override fun onResume() {
        super.onResume()
        LocalBroadcastManager.getInstance(this).registerReceiver(broadcastReceiver,
                IntentFilter(MainActivity.BROADCAST_DETECTED_ACTIVITY))
    }

    override fun onPause() {
        super.onPause()
        LocalBroadcastManager.getInstance(this).unregisterReceiver(broadcastReceiver)
    }

    private fun startTracking() {
        val intent = Intent(this@MainActivity, BackgroundDetectedActivitiesService::class.java)
        startService(intent)
    }

    private fun stopTracking() {
        val intent = Intent(this@MainActivity, BackgroundDetectedActivitiesService::class.java)
        stopService(intent)
    }

    companion object {
        val BROADCAST_DETECTED_ACTIVITY = "activity_intent"
        internal val DETECTION_INTERVAL_IN_MILLISECONDS: Long = 1000
        val CONFIDENCE = 70
    }
}

Now, run the application: the detected activity and its confidence will be displayed on the screen, and the full list of probable activities will appear in the log.

Conclusion

Here, we learned how to use the Activity Recognition Client in our application to determine what users are doing at any instant of time. The client reports the ongoing activities together with a confidence percentage that tells you how likely it is that each detected activity is actually taking place.
