
Introduction to Machine Learning on Mobile

Artificial Intelligence and ML apps are becoming more pervasive in our everyday lives. They serve many different use cases, for instance virtual personal assistants, commute predictions, product recommendations, and security. Working in the background, they can help us even when we are not aware they are there. Most of you reading this blog have probably received a notification from a map application suggesting the best time to leave work based on traffic conditions, or product recommendations based on your previous purchase patterns.

Machine learning applications are rapidly entering our daily lives as technology advances toward smartphone-centric solutions. Social media platforms like Facebook, LinkedIn, and Instagram have grown rapidly in recent years. A smartphone's power, productivity, and value must exceed the buyer's expectations for it to be a successful product. This trend creates demand for more advanced mobile apps that deliver heightened performance without requiring too much space or power.

In this article, let us discuss Machine Learning on mobiles, its architecture and deployment, along with one of the most used libraries for on-device Machine Learning, TensorFlow, and its features. But before we look into the architecture and deployment, let us first ask: why mobiles? Because mobile phones are accessible to most people worldwide, and most of us have one in our pocket. We will also walk through a movie recommendation system using the TensorFlow Lite and Firebase codelab.

IMAGE 1: REVAMPING THE USAGE OF MOBILE APPS

Today we live in a world of mobile applications. They have become such a part of our everyday lives that we depend on them for almost everything, from daily needs like groceries and food delivery to booking our favourite year-end holiday spot.

Global smartphone penetration is also increasing rapidly. Smartphones have replaced personal computers as the most important smart connected devices, and innovations in mobile technology and business models are transforming every walk of human life.


IMAGE 2: NUMBER OF SMARTPHONE USERS WORLDWIDE

IMAGE 3: TOP 3 COUNTRIES BY SMARTPHONE USERS


Now, let us come to Machine Learning. Why has Machine Learning become so popular in recent times? It is not a new subject; it has existed for decades, so why is it in focus now? The reason is simple: the data explosion. Mobile devices made it possible to generate user data like never before. Ten years ago we could not upload images to the cloud the way we do today, because mobile phone penetration then was nothing like it is today. 4G, and now 5G, connectivity makes it possible even to live-stream video, which means more data is flowing around the world than ever before. The next era is predicted to be the era of the Internet of Things (IoT), where we will have even more data from sensors on smart devices.

All this data is a gold mine only if we can put it to proper use: derive insights that bring value and surface patterns that open new business opportunities. Machine Learning on mobile devices is the right tool to unlock the hidden potential in these piles and piles of data.
Machine Learning can be applied to mobile applications in any domain: healthcare, finance, games, communication, or practically anything under the sun.

Why Use Machine Learning on Mobile Devices?

The current trend suggests that implementing Machine Learning on mobile devices is gaining popularity in app development. Although Machine Learning is traditionally associated with computers, modern mobile devices pack enough processing power to perform many of the same tasks just as effectively.

Some classic examples of Machine Learning on Mobiles are as follows:

1) Speech Recognition

2) Image Classification

3) Gesture Recognition

4) Translation from one language to another

5) Robotic Patient-Monitoring Systems

6) Movie Recommendations

Ways to implement machine learning in Mobile Applications:

The following are the four main activities to be performed for any Machine Learning problem.

1. Define the Machine Learning Problem

2. Gather the Data Required

3. Use the data to build/train a model

4. Use the Model to Make Predictions

IMAGE 4: STEPS TO IMPLEMENT MACHINE LEARNING IN MOBILE APPLICATIONS

Utilizing Machine Learning Service Providers:

There are many ML service providers we can use to build Machine Learning models for mobiles. The following is a list of some of them, and the list keeps growing.

Google Cloud Vision

Microsoft Azure Cognitive Services

IBM Watson

Amazon Web Services 

Steps Involved in building a Machine Learning Model 

The diagram below summarizes the steps involved in building a Machine Learning model in mobile application development.

IMAGE 5: STEPS INVOLVED IN BUILDING A MACHINE LEARNING MOBILE APPLICATION


Add Recommendations to our app with TensorFlow Lite and Firebase:

Recommendations allow apps to use ML to intelligently serve the most relevant content to each user. They take past user behaviour into account to suggest app content the user might like to interact with in the future, using a model trained on the aggregate behaviour of a large number of users.

In this article we will learn how to collect data from the app's users with Firebase Analytics, build a Machine Learning recommendations model from that data, and then use the model in an Android app to run inference and obtain recommendations. Our objective is to create a model that recommends movies to a user based on their past movie preferences. The model will analyze the user's list of previously liked movies to determine which films they are most likely to enjoy watching in the future.

What We Will Do:

Step 1: Integrate Firebase Analytics into an Android app to collect user behavior data.

Step 2: Export that data into Google BigQuery.

Step 3: Pre-process the data and train a TensorFlow Lite recommendations model.

Step 4: Deploy the TensorFlow Lite model to Firebase ML and access it from your app.

Step 5: Run on-device inference using the model to suggest recommendations to users.

What We Will Need:

1. Android Studio version 3.4+

2. A test device with Android 2.3+ and Google Play services 9.8 or later

3. A connection cable, if using a physical device

Now let us discuss each step in some detail.

To get the sample code from GitHub, clone the repository:

git clone https://github.com/FirebaseExtended/codelab-contentrecommendation-android.git

The repository contains two folders:

1. start — the starting code that you will build upon in this codelab.

2. final — the complete code for the finished sample app.

From Android Studio, select the codelab-recommendations-android directory from the sample code download. (File > Open > .../codelab-recommendations-android/start).

We will now have the start project open in Android Studio.

Step 3: Create a Firebase Console Project:

Create a new Project

1. Go to the Firebase Console.

2. Select Add Project (or Create a Project if it's your first one).

3. Select or enter a project name and click Continue. Make sure 'Enable Google Analytics for this project' is enabled. Follow the rest of the setup steps in the Firebase Console, then click Create project. Alternatively, you can add Firebase to an existing Google Cloud project.

Step 4: Add Firebase

From the overview screen of your new project, click the Android icon to launch the setup workflow.

Enter the codelab's package name com.google.firebase.codelabs.recommendations, then select Register app.

Add the google-services.json file to your app:

After adding the package name and selecting Register app, click Download google-services.json to obtain your Firebase Android config file, then copy the google-services.json file into the app directory of your project.

Once the file is downloaded, you can skip the next steps shown in the console.


Add Google Services plugin to your app:

The google-services plugin uses the google-services.json file to configure your application to use Firebase. The following lines should already be added to the build.gradle files in the project:

app/build.gradle

build.gradle
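
The starter project already contains this wiring; for reference, it typically looks like the following. The plugin version shown here is illustrative, so keep whatever version the project already pins.

build.gradle (project level)

buildscript {
    dependencies {
        // Google Services Gradle plugin (version is illustrative)
        classpath 'com.google.gms:google-services:4.3.15'
    }
}

app/build.gradle

apply plugin: 'com.android.application'
// Reads google-services.json and configures Firebase at build time
apply plugin: 'com.google.gms.google-services'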

Sync your project with Gradle files:

To make sure that all dependencies are available to our app, we should sync the project with its Gradle files at this point. Select File > Sync Project with Gradle Files from the Android Studio toolbar.

Run the Starter App:

Now that we have imported the project into Android Studio and configured the google-services plugin with our JSON file, we are ready to run the app for the first time.

Connect your Android device and click Run in the Android Studio toolbar.

The app should launch on the device. At this point we can see that the app shows a tab with a list of movies, a Liked Movies tab, and a Recommendations tab.


We can click on a movie in the list to add it to our liked list. After completing the remaining steps of the codelab, we will be able to generate movie recommendations in the Recommendations tab.

Step 5: Add Firebase Analytics to the App:

In this step we will add Firebase Analytics to the app to log user behavior data, in our case which movies the user likes. This data will be used in aggregate in later steps to train the recommendations model.

Step 6: Add the Firebase Analytics Dependency:

The following dependency is necessary to add Firebase Analytics to our app. It has already been included in the app/build.gradle file.

implementation 'com.google.firebase:firebase-analytics-ktx:21.1.0'

Set up Firebase Analytics in the app:

The LikedMoviesViewModel contains functions to store the movies the user likes. Every time the user likes a new movie, we want to send an analytics log event to record that like.

Add the onMovieLiked function with the code shown below to register an analytics event when the user clicks like on a movie.

LikedMoviesViewModel.kt

import com.google.firebase.analytics.FirebaseAnalytics
import com.google.firebase.analytics.ktx.analytics
import com.google.firebase.analytics.ktx.logEvent
import com.google.firebase.ktx.Firebase


class LikedMoviesViewModel internal constructor (application: Application) : AndroidViewModel(application) {

    ...

    fun onMovieLiked(movie: Movie) {
        movies.setLike(movie, true)
        logAnalyticsEvent(movie.id.toString())
    }
       
}

Add the following field and function to log an Analytics event when a movie is added to the user's liked list.                                                                                     
LikedMoviesViewModel.kt

import com.google.firebase.analytics.FirebaseAnalytics
import com.google.firebase.analytics.ktx.analytics
import com.google.firebase.analytics.ktx.logEvent
import com.google.firebase.ktx.Firebase

class LikedMoviesViewModel internal constructor (application: Application) : AndroidViewModel(application) {
    ...
    private val firebaseAnalytics = Firebase.analytics

    ...

    /**
     * Logs an event in Firebase Analytics that is used in aggregate to train the recommendations
     * model.
     */
    private fun logAnalyticsEvent(id: String) {
        firebaseAnalytics.logEvent(FirebaseAnalytics.Event.SELECT_ITEM) {
            param(FirebaseAnalytics.Param.ITEM_ID, id)
        }
    }
}
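
Note that FirebaseAnalytics.Event.SELECT_ITEM and FirebaseAnalytics.Param.ITEM_ID are Analytics' built-in constants for the standard select_item event and its item_id parameter; using the standard names rather than custom ones makes these events easy to locate later, both in the Firebase Console and in the BigQuery export.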

Testing your Analytics Integration:

Now in this step, we will generate analytics events in the app and verify that they are being sent to the Firebase Console.

Enable Analytics Debug Logging:

Firebase Analytics is designed to maximize user battery life: it batches events on the device and only sends them to Firebase occasionally. For debugging purposes, we can disable this behavior and see events as they are logged in real time by running the following command in the shell.

Terminal

adb shell setprop debug.firebase.analytics.app com.google.firebase.codelabs.recommendations
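
When you have finished debugging, debug mode can be turned off again with the corresponding command documented for Firebase Analytics:

Terminal

adb shell setprop debug.firebase.analytics.app .none.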

Verify Analytics events are generated:

In Android Studio, open the Logcat window to examine the logging from your app, and set the Logcat filter to the string "Logging event".

Now, verify that a "select_item" Analytics event is emitted every time you like a movie in the app.

At this point, we have successfully integrated Firebase Analytics into our app. As users use the app and like movies, their likes are logged in aggregate. We will use this aggregate data in the rest of the codelab to train our recommendations model.

Export Analytics Data to BigQuery:

BigQuery is a Google Cloud product that allows us to examine and process large amounts of data. In this step, we will connect our Firebase Console project to BigQuery so that the Analytics data generated by our app is automatically exported there.

Enable BigQuery export:

1. Go to the Firebase Console.

2. Select the settings gear icon next to Project Overview, then select Project settings.

3. Select the Integrations tab.

4. Select Link (or Manage) inside the BigQuery block.

5. Select Next in the About Linking Firebase to BigQuery step.

6. Under the Configure integration section, click the switch to enable sending Google Analytics data and select Link to BigQuery.

We have now enabled our Firebase Console project to automatically send Firebase Analytics event data to BigQuery. This happens automatically with no further interaction; however, the first export, which creates the analytics dataset in BigQuery, may take up to 24 hours. After the dataset is created, Firebase continuously exports new Analytics events to BigQuery into the intraday table, and groups events from past days in the events table.
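
Once data starts arriving, you can explore the exported events with standard BigQuery SQL. As a rough illustration (the project name and the analytics_XXXXXX dataset id below are placeholders; the real dataset id comes from your Analytics property), a query like the following would count likes per movie from the select_item events we log in this codelab:

SELECT
  param.value.string_value AS movie_id,
  COUNT(*) AS like_count
FROM
  `your-project.analytics_XXXXXX.events_*`,
  UNNEST(event_params) AS param
WHERE
  event_name = 'select_item'
  AND param.key = 'item_id'
GROUP BY movie_id
ORDER BY like_count DESC;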

In order to train a recommendations model, we need a lot of data. Since we don't have an app generating large amounts of data yet, we will import a sample dataset into BigQuery.

Use BigQuery to Obtain Model Training Data:

Now that we have connected our Firebase Console project to BigQuery, our app's analytics event data will automatically show up in the BigQuery console after some time. Meanwhile, let us import a sample dataset for training our recommendations model.

Steps to import the sample dataset:

1. Go to the BigQuery dashboard in the Google Cloud console.

2. Select your project name in the menu.

3. Select your project name at the bottom of the BigQuery left navigation to see details.

4. Select Create dataset to open the dataset creation panel.

5. Enter 'firebase_recommendations_dataset' for the Dataset ID and select Create dataset.

6. The new dataset will show up in the left menu under the project name. Click it.

7. Select Create table to open the table creation panel.

8. For Create table from, select 'Google Cloud Storage'.

9. In the Select from GCS bucket field, enter gs://firebase-recommendations/recommendations-test/formatted_data_filtered.txt

10. Enter 'recommendations_table' for the Table name.

11. Under Schema, check the Auto detect (Schema and input parameters) box, then select Create table.

Preprocess Data and Train the Recommendations Model:

Here we will use a Colab notebook to perform the following steps:

1. Import the BigQuery data into the Colab notebook.

2. Preprocess the data to prepare it for model training.

3. Train the recommendations model on the analytics data.

4. Export the model as a TensorFlow Lite model.

5. Deploy the model to the Firebase Console so we can use it in our app.

Before we launch the Colab training notebook, we first have to enable the Firebase Model Management API so Colab can deploy the trained model to our Firebase Console.

Enable Firebase Model Management API

Creating a bucket to store Machine Learning models

In your Firebase Console, go to Storage and click Get started.

Now go to the Firebase ML API page on the Google Cloud Console and click Enable.

Next, we will use a Colab notebook to train and deploy the model. Open the Colab notebook using the link below.

https://colab.research.google.com/github/FirebaseExtended/codelab-contentrecommendation-android/blob/master/Firebase_ML_on_device_recommentations.ipynb

Complete the steps in the notebook. After finishing them, we will have a TensorFlow Lite model file deployed to the Firebase Console that we can sync down to our app.

Download the Model in Your App:

In this step we will modify our app to download the model we just trained, using Firebase Machine Learning.

Adding the Firebase ML dependency:

The following dependency is needed in order to use Firebase Machine Learning models in our app. It should already be added.

app/build.gradle

implementation 'com.google.firebase:firebase-ml-modeldownloader:24.0.4'

Downloading the Model with the Firebase Model Downloader API:

Now let us add the code below to RecommendationClient.kt to set up the conditions under which the model download occurs, and to create a download task that syncs the remote model to our app.

RecommendationClient.kt

private fun downloadModel(modelName: String) {
    val conditions = CustomModelDownloadConditions.Builder()
        .requireWifi()
        .build()
    FirebaseModelDownloader.getInstance()
        .getModel(modelName, DownloadType.LOCAL_MODEL, conditions)
        .addOnCompleteListener {
            if (!it.isSuccessful) {
                showToast(context, "Failed to get model file.")
            } else {
                showToast(context, "Downloaded remote model: $modelName")
                GlobalScope.launch { initializeInterpreter(it.result) }
            }
        }
        .addOnFailureListener {
            showToast(context, "Model download failed for recommendations, please check your connection.")
        }
}
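
One detail worth noting: the modelName argument passed to getModel must match the name under which the Colab notebook deployed the model to Firebase ML, otherwise the download will fail. You can verify the deployed name on the Firebase ML page of the console.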

The TensorFlow Lite runtime lets us use the model in the app to generate recommendations. In the previous step we initialized a TFLite interpreter with the model file we downloaded. In this step we will first load a dictionary and labels to accompany the model during inference, then add pre-processing to generate the inputs to our model and post-processing to extract the results from the inference.
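
The initializeInterpreter helper referenced above is not shown in this article. A minimal sketch of what it might look like, assuming RecommendationClient holds a tflite: Interpreter? field and the project depends on the org.tensorflow.lite runtime (an assumption for illustration, not the codelab's exact code):

import com.google.firebase.ml.modeldownloader.CustomModel
import org.tensorflow.lite.Interpreter

/** Hypothetical sketch: create a TFLite interpreter from the downloaded model. */
private fun initializeInterpreter(model: CustomModel?) {
    // CustomModel.file is null if the model file is not on disk yet.
    model?.file?.let { modelFile ->
        tflite = Interpreter(modelFile)
    }
}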

Load Dictionary and Labels:

The candidate labels used by the recommendations model to generate recommendations are listed in the file sorted_movie_vocab.json in the res/assets folder.

Now copy the following code to load these candidates.

/** Load recommendation candidate list. */
private suspend fun loadCandidateList() {
    return withContext(Dispatchers.IO) {
        val collection = MovieRepository.getInstance(context).getContent()
        for (item in collection) {
            candidates[item.id] = item
        }
        Log.v(TAG, "Candidate list loaded.")
    }
}

Implementing Pre-Processing:

In the pre-processing step, we change the form of the input data to match what our model expects. Here, if the user has not yet liked enough movies, we pad the input to the model's fixed input length with a placeholder value.

The code is as below.

/** Given a list of selected items, preprocess to get tflite input. */
@Synchronized
private suspend fun preprocess(selectedMovies: List<Movie>): IntArray {
    return withContext(Dispatchers.Default) {
        val inputContext = IntArray(config.inputLength)
        for (i in 0 until config.inputLength) {
            if (i < selectedMovies.size) {
                val (id) = selectedMovies[i]
                inputContext[i] = id
            } else {
                // Padding input.
                inputContext[i] = config.pad
            }
        }
        inputContext
    }
}
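
To make the padding concrete: assuming config.inputLength is 10 and config.pad is 0 (illustrative values; the real ones come from the model's config), a user who has liked the movies with ids 1, 5, and 7 would produce the input array [1, 5, 7, 0, 0, 0, 0, 0, 0, 0].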

Run interpreter to generate recommendations

Now we use the model we downloaded in the previous step to run inference on our pre-processed input. We set the input and output types for our model and run the inference to generate our movie recommendations.

The code for the same is as below.

/** Given a list of selected items, returns the recommendation results. */
@Synchronized
suspend fun recommend(selectedMovies: List<Movie>): List<Result> {
    return withContext(Dispatchers.Default) {
        val inputs = arrayOf<Any>(preprocess(selectedMovies))

        // Run inference.
        val outputIds = IntArray(config.outputLength)
        val confidences = FloatArray(config.outputLength)
        val outputs: MutableMap<Int, Any> = HashMap()
        outputs[config.outputIdsIndex] = outputIds
        outputs[config.outputScoresIndex] = confidences
        tflite?.let {
            it.runForMultipleInputsOutputs(inputs, outputs)
            postprocess(outputIds, confidences, selectedMovies)
        } ?: run {
            Log.e(TAG, "No tflite interpreter loaded")
            emptyList()
        }
    }
}
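
As a hedged usage sketch, a ViewModel might call recommend from a coroutine like this (client, likedMovies, and viewModelScope are illustrative names here, not the codelab's exact wiring):

viewModelScope.launch {
    // Each Result pairs a candidate movie with its confidence score.
    val results = client.recommend(likedMovies)
    results.forEach { Log.d(TAG, "Recommended: $it") }
}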

Implement Post-Processing:

Finally, in this step, we post-process the output from our model. The code for the same is as below.


/** Postprocess to get results from tflite inference. */
@Synchronized
private suspend fun postprocess(
    outputIds: IntArray, confidences: FloatArray, selectedMovies: List<Movie>
): List<Result> {
    return withContext(Dispatchers.Default) {
        val results = ArrayList<Result>()

        // Add recommendation results. Filter null or contained items.
        for (i in outputIds.indices) {
            if (results.size >= config.topK) {
                Log.v(TAG, String.format("Selected top K: %d. Ignore the rest.", config.topK))
                break
            }
            val id = outputIds[i]
            val item = candidates[id]
            if (item == null) {
                Log.v(TAG, String.format("Inference output[%d]. Id: %s is null", i, id))
                continue
            }
            if (selectedMovies.contains(item)) {
                Log.v(TAG, String.format("Inference output[%d]. Id: %s is contained", i, id))
                continue
            }
            val result = Result(
                id, item,
                confidences[i]
            )
            results.add(result)
            Log.v(TAG, String.format("Inference output[%d]. Result: %s", i, result))
        }
        results
    }
}

Now re-run the app. As we select a few movies, it should automatically download the new model and start generating recommendations.

Finally, let us look into some of the advantages and disadvantages of on-device Machine Learning.

Advantages of On-Device Machine Learning:

Low Latency: There is no round trip to the server and no wait for results to come back.

Privacy: Data processing happens on the user's device, which means Machine Learning can also be applied to sensitive data that you would not want to leave the device.

Works Offline: Running ML models on-device means we can build ML-powered features that don't rely on network connectivity. In addition, because data processing happens on the device, we reduce the usage of the user's mobile data plan.

Low Cost: On-device ML makes use of the processing power of the device rather than additional servers or paid cloud compute cycles, which makes it cost-effective.

Limitations:

Let us also discuss some of the limitations of on-device Machine Learning.

1. Due to the nature of Machine Learning, trained models can become rather large. Although this is not a problem when they run on a server, it can be limiting when the same models run on a client/mobile device.

2. Mobile devices are more restricted with respect to storage, memory, compute resources, and power consumption.

Conclusion:

Hence we can conclude that on-device models need to be much smaller than their server counterparts, which typically means they are less powerful. Your use case will determine whether on-device models are a good fit. In this article we have seen how to deploy a model on an Android device using Android Studio, and how to use Firebase Analytics to analyse the data generated from the app and recommend movies to users based on their likes in the app.


About the Author: K Sumanth is a Data Science enthusiast, passionate about data, who believes that data always has a story to tell. He is persistently learning, implementing, and solving business problems involving complex and large data. He is known for his clear, concise, and lucid explanations of Data Science concepts. He has a PG Diploma in Data Science and Analytics from the University of Texas at Austin, McCombs School of Business.
