Tutorial: Create an Android app to detect and frame faces in an image

In this tutorial, you create a simple Android application that uses the Face service Java class library to detect human faces in an image. The application shows a selected image with each detected face framed by a rectangle. The complete sample code is available on GitHub at Detect and frame faces in an image on Android.

Android screenshot of a photo with faces framed by a red rectangle

This tutorial shows you how to:

  • Create an Android application
  • Install the Face service client library
  • Use the client library to detect faces in an image
  • Draw a frame around each detected face


Create the project

Create your Android application project by following these steps:

  1. Open Android Studio. This tutorial uses Android Studio 3.1.
  2. Select Start a new Android Studio project.
  3. On the Create Android Project screen, modify the default fields, if necessary, then click Next.
  4. On the Target Android Devices screen, use the dropdown selector to choose API 22 or higher, then click Next.
  5. Select Empty Activity, then click Next.
  6. Uncheck Backwards Compatibility, then click Finish.

Create the UI for selecting and displaying the image

Open activity_main.xml; you should see the Layout Editor. Select the Text tab, then replace the contents with the following code.

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <ImageView
        android:id="@+id/imageView1"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_above="@+id/button1"
        android:contentDescription="Image with faces to analyze"/>

    <Button
        android:id="@+id/button1"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_centerHorizontal="true"
        android:text="Browse for face image"/>

</RelativeLayout>

Open MainActivity.java, then replace everything below the package statement with the following code.

The code sets an event handler on the Button that starts a new activity to allow the user to select a picture. Once selected, the picture is displayed in the ImageView.

import java.io.*;
import android.app.*;
import android.content.*;
import android.net.*;
import android.os.*;
import android.view.*;
import android.graphics.*;
import android.widget.*;
import android.provider.*;

public class MainActivity extends Activity {
    // Request code used when launching the image picker.
    private final int PICK_IMAGE = 1;
    private ProgressDialog detectionProgressDialog;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        Button button1 = (Button)findViewById(R.id.button1);
        button1.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                // Launch an activity that lets the user pick a picture.
                Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
                intent.setType("image/*");
                startActivityForResult(Intent.createChooser(
                        intent, "Select Picture"), PICK_IMAGE);
            }
        });

        detectionProgressDialog = new ProgressDialog(this);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == PICK_IMAGE && resultCode == RESULT_OK &&
                data != null && data.getData() != null) {
            Uri uri = data.getData();
            try {
                // Load the selected picture and show it in the ImageView.
                Bitmap bitmap = MediaStore.Images.Media.getBitmap(
                        getContentResolver(), uri);
                ImageView imageView = (ImageView) findViewById(R.id.imageView1);
                imageView.setImageBitmap(bitmap);

                // Uncomment the following line after you add the
                // detectAndFrame method later in this tutorial.
                //detectAndFrame(bitmap);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
Now your app can browse for a photo and display it in the window, similar to the image below.

Android screenshot of a photo with faces

Configure the Face client library

The Face API is a cloud API, which you can call using HTTPS requests. This tutorial uses the Face client library, which encapsulates these web requests, to simplify your work.
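To make that encapsulation concrete, here is a minimal sketch of the request the client library assembles for you on each call. The FaceApiRequest class and its method names are illustrative, not part of the client library; the /detect path, query parameters, and Ocp-Apim-Subscription-Key header follow the Face REST API's documented conventions:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the HTTPS request that the Face client library builds
// under the hood. These helper names are hypothetical.
public class FaceApiRequest {

    // Assemble the detect URL with its query parameters.
    static String buildDetectUrl(String endpoint, boolean returnFaceId,
                                 boolean returnFaceLandmarks) {
        return endpoint + "/detect"
                + "?returnFaceId=" + returnFaceId
                + "&returnFaceLandmarks=" + returnFaceLandmarks;
    }

    // The headers every request carries: the subscription key
    // authenticates you, and the content type tells the service
    // that a raw image body follows.
    static Map<String, String> requestHeaders(String subscriptionKey) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Ocp-Apim-Subscription-Key", subscriptionKey);
        headers.put("Content-Type", "application/octet-stream");
        return headers;
    }

    public static void main(String[] args) {
        System.out.println(buildDetectUrl(
                "https://westcentralus.api.cognitive.microsoft.com/face/v1.0",
                true, false));
    }
}
```

The client library sends a POST with the image bytes as the body and parses the JSON response into Java objects, so you never handle these strings directly.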

In the Project pane, use the dropdown selector to select Android. Expand Gradle Scripts, then open build.gradle (Module: app).

Add a dependency for the Face client library, com.microsoft.projectoxford:face:1.4.3, as shown in the screenshot below, then click Sync Now.
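If you prefer to edit the file directly rather than follow the screenshot, the dependency goes in the dependencies block; the other entries shown here are placeholders for whatever Android Studio generated in your project:

```gradle
dependencies {
    // ...entries generated by Android Studio...
    implementation 'com.microsoft.projectoxford:face:1.4.3'
}
```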

Android Studio screenshot of App build.gradle file

Open MainActivity.java and append the following import directives:

import com.microsoft.projectoxford.face.*;
import com.microsoft.projectoxford.face.contract.*;

Add the Face client library code

Insert the following code in the MainActivity class, above the onCreate method:

private final String apiEndpoint = "<API endpoint>";
private final String subscriptionKey = "<Subscription Key>";

private final FaceServiceClient faceServiceClient =
        new FaceServiceRestClient(apiEndpoint, subscriptionKey);

Replace <API endpoint> with the API endpoint that was assigned to your key. Free trial subscription keys are generated in the westcentralus region. So if you're using a free trial subscription key, the statement would be:

apiEndpoint = "https://westcentralus.api.cognitive.microsoft.com/face/v1.0";

Replace <Subscription Key> with your subscription key. For example:

subscriptionKey = "0123456789abcdef0123456789ABCDEF";

In the Project pane, expand app, then manifests, and open AndroidManifest.xml.

Insert the following element as a direct child of the manifest element:

<uses-permission android:name="android.permission.INTERNET" />

Build your project to check for errors. Now you're ready to call the Face service.

Upload an image to detect faces

The most straightforward way to detect faces is to call the FaceServiceClient.detect method. This method wraps the Detect API method and returns an array of Face objects.

Each returned Face includes a rectangle to indicate its location, combined with a series of optional face attributes. In this example, only the face locations are required.
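The rectangle arrives as a top-left corner plus a width and height, while Android's Canvas.drawRect expects left, top, right, and bottom coordinates, so the right and bottom edges must be derived. A minimal sketch of that conversion (the FaceBox class is hypothetical, for illustration only):

```java
// Sketch: convert the service's left/top/width/height rectangle
// into the left/top/right/bottom corners that drawRect expects.
public class FaceBox {
    final int left, top, right, bottom;

    FaceBox(int left, int top, int width, int height) {
        this.left = left;
        this.top = top;
        this.right = left + width;    // right edge = left + width
        this.bottom = top + height;   // bottom edge = top + height
    }

    public static void main(String[] args) {
        FaceBox box = new FaceBox(40, 60, 100, 120);
        System.out.println(box.right + ", " + box.bottom);
    }
}
```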

If an error occurs, an AlertDialog displays the underlying reason.

Insert the following methods into the MainActivity class.

// Detect faces by uploading a face image.
// Frame faces after detection.
private void detectAndFrame(final Bitmap imageBitmap) {
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    imageBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
    ByteArrayInputStream inputStream =
            new ByteArrayInputStream(outputStream.toByteArray());

    AsyncTask<InputStream, String, Face[]> detectTask =
            new AsyncTask<InputStream, String, Face[]>() {
                String exceptionMessage = "";

                @Override
                protected Face[] doInBackground(InputStream... params) {
                    try {
                        publishProgress("Detecting...");
                        Face[] result = faceServiceClient.detect(
                                params[0],
                                true,         // returnFaceId
                                false,        // returnFaceLandmarks
                                null          // returnFaceAttributes:
                                /* new FaceServiceClient.FaceAttributeType[] {
                                    FaceServiceClient.FaceAttributeType.Gender }
                                */
                        );
                        if (result == null) {
                            publishProgress(
                                    "Detection Finished. Nothing detected");
                            return null;
                        }
                        publishProgress(String.format(
                                "Detection Finished. %d face(s) detected",
                                result.length));
                        return result;
                    } catch (Exception e) {
                        exceptionMessage = String.format(
                                "Detection failed: %s", e.getMessage());
                        return null;
                    }
                }

                @Override
                protected void onPreExecute() {
                    //TODO: show progress dialog
                }

                @Override
                protected void onProgressUpdate(String... progress) {
                    //TODO: update progress
                }

                @Override
                protected void onPostExecute(Face[] result) {
                    //TODO: update face frames
                }
            };

    detectTask.execute(inputStream);
}

private void showError(String message) {
    new AlertDialog.Builder(this)
    .setTitle("Error")
    .setMessage(message)
    .setPositiveButton("OK", new DialogInterface.OnClickListener() {
        @Override
        public void onClick(DialogInterface dialog, int id) {
        }
    })
    .create().show();
}
Frame faces in the image

Insert the following helper method into the MainActivity class. This method draws a rectangle around each detected face.

private static Bitmap drawFaceRectanglesOnBitmap(
        Bitmap originalBitmap, Face[] faces) {
    Bitmap bitmap = originalBitmap.copy(Bitmap.Config.ARGB_8888, true);
    Canvas canvas = new Canvas(bitmap);
    Paint paint = new Paint();
    paint.setAntiAlias(true);
    paint.setStyle(Paint.Style.STROKE);
    paint.setColor(Color.RED);
    paint.setStrokeWidth(10);
    if (faces != null) {
        for (Face face : faces) {
            FaceRectangle faceRectangle = face.faceRectangle;
            canvas.drawRect(
                    faceRectangle.left,
                    faceRectangle.top,
                    faceRectangle.left + faceRectangle.width,
                    faceRectangle.top + faceRectangle.height,
                    paint);
        }
    }
    return bitmap;
}

Complete the AsyncTask methods, indicated by the TODO comments, in the detectAndFrame method. On success, the selected image is displayed with framed faces in the ImageView.

@Override
protected void onPreExecute() {
    detectionProgressDialog.show();
}

@Override
protected void onProgressUpdate(String... progress) {
    detectionProgressDialog.setMessage(progress[0]);
}

@Override
protected void onPostExecute(Face[] result) {
    detectionProgressDialog.dismiss();

    if (!exceptionMessage.equals("")) {
        showError(exceptionMessage);
    }
    if (result == null) return;

    ImageView imageView = findViewById(R.id.imageView1);
    imageView.setImageBitmap(
            drawFaceRectanglesOnBitmap(imageBitmap, result));
}

Finally, in the onActivityResult method, uncomment the call to the detectAndFrame method, as shown below.

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);

    if (requestCode == PICK_IMAGE && resultCode == RESULT_OK &&
            data != null && data.getData() != null) {
        Uri uri = data.getData();
        try {
            Bitmap bitmap = MediaStore.Images.Media.getBitmap(
                    getContentResolver(), uri);
            ImageView imageView = findViewById(R.id.imageView1);
            imageView.setImageBitmap(bitmap);

            // Uncommented to detect and frame faces in the selected image.
            detectAndFrame(bitmap);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Run the app

Run the application and browse for an image with a face. Wait a few seconds to allow the Face service to respond. After that, you'll get a result similar to the image below:



In this tutorial, you learned the basic process for using the Face service and created an application to display framed faces in an image.

Next steps

Learn about detecting and using face landmarks.

Explore the Face APIs used to detect faces and their attributes such as head pose, gender, age, facial hair, and glasses.