Firebase

How to use Firebase ML Kit for face detection in an Android app?

Discover the ins and outs of using Firebase ML Kit for face detection in Android apps. Learn with a detailed guide offering step-by-step instructions and top best practices.

Overview

Firebase ML Kit brings some serious machine learning power straight into Android apps, boasting impressive face detection features. By weaving ML Kit into an app, developers can spot and track faces, pinpoint facial landmarks, and even pick up on expressions, all happening in real-time.

Setting it up? It's about adding Firebase to the project, sorting out ML Kit dependencies, and handling image data.

This beginner-friendly guide covers the vital steps to get face detection up and running with Firebase ML Kit. Your app will be ready for advanced image analysis without breaking a sweat.


How to use Firebase ML Kit for face detection in an Android app?

Step 1: Set Up Firebase in the Android Project

  1. Make sure you've got Android Studio up and running.
  2. Start a new Android project or just open one you already have.
  4. Open the Firebase Console (console.firebase.google.com) and select or create a project.
  4. Add your Android app to your Firebase project.
  • Enter your app's package name.
  • Follow the steps to download the google-services.json file.
  5. Drop the google-services.json file into the app/ directory of your project.

  6. Add this classpath to your project-level build.gradle file:
    ```groovy
    buildscript {
        dependencies {
            // Google services Gradle plugin
            classpath 'com.google.gms:google-services:4.3.10'
        }
    }
    ```

  7. In the app-level build.gradle file, apply the following plugin and dependencies:
    ```groovy
    apply plugin: 'com.android.application'
    apply plugin: 'com.google.gms.google-services'

    dependencies {
        // Firebase ML Kit Vision (face detection) and Analytics
        implementation 'com.google.firebase:firebase-ml-vision:24.0.3'
        implementation 'com.google.firebase:firebase-analytics'
    }
    ```

Step 2: Add Camera Permission to AndroidManifest.xml

  1. Open up AndroidManifest.xml.
  2. Add these permissions and features:
    ```xml
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature android:name="android.hardware.camera" android:required="true" />
    ```
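
On Android 6.0 (API 23) and above, the manifest entry alone is not enough: the CAMERA permission must also be granted at runtime. A minimal sketch of the check, using the AndroidX `ActivityCompat`/`ContextCompat` helpers inside your activity (the request code `CAMERA_REQUEST` is an arbitrary constant introduced for this example):

```java
import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// Add these members to your Activity and call ensureCameraPermission()
// before opening the camera. CAMERA_REQUEST is an arbitrary request code.
private static final int CAMERA_REQUEST = 1001;

private void ensureCameraPermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        // Prompts the user; the result arrives in onRequestPermissionsResult()
        ActivityCompat.requestPermissions(
                this, new String[]{Manifest.permission.CAMERA}, CAMERA_REQUEST);
    }
}
```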

Step 3: Implement the Camera Source

  1. Create a file called CameraSource.java to handle camera operations.

  2. Implement the code to preview camera input:
    ```java
    import android.content.Context;
    import android.hardware.Camera;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import java.io.IOException;

    public class CameraSource extends SurfaceView implements SurfaceHolder.Callback {
        private Camera camera;

        public CameraSource(Context context) {
            super(context);
            camera = Camera.open();
            SurfaceHolder holder = getHolder();
            holder.addCallback(this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                camera.setPreviewDisplay(holder);
                camera.startPreview();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            // Handle surface changes (e.g. restart the preview) here
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            // Stop the preview before releasing the camera
            camera.stopPreview();
            camera.release();
        }
    }
    ```

Step 4: Initialize Firebase Vision for Face Detection

  1. In your main activity, set up Firebase Vision and create a face detector instance:
    ```java
    import android.os.Bundle;
    import androidx.annotation.NonNull;
    import androidx.appcompat.app.AppCompatActivity;
    import com.google.android.gms.tasks.OnFailureListener;
    import com.google.android.gms.tasks.OnSuccessListener;
    import com.google.firebase.ml.vision.FirebaseVision;
    import com.google.firebase.ml.vision.common.FirebaseVisionImage;
    import com.google.firebase.ml.vision.face.FirebaseVisionFace;
    import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetector;
    import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetectorOptions;

    import java.util.List;

    public class MainActivity extends AppCompatActivity {
        private CameraSource cameraSource;
        // Keep the detector as a field so detectFaces() can use it
        private FirebaseVisionFaceDetector detector;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            cameraSource = new CameraSource(this);
            setContentView(cameraSource);

            FirebaseVisionFaceDetectorOptions options =
                    new FirebaseVisionFaceDetectorOptions.Builder()
                            .setPerformanceMode(FirebaseVisionFaceDetectorOptions.ACCURATE)
                            .setLandmarkMode(FirebaseVisionFaceDetectorOptions.ALL_LANDMARKS)
                            .setClassificationMode(FirebaseVisionFaceDetectorOptions.ALL_CLASSIFICATIONS)
                            .build();

            detector = FirebaseVision.getInstance().getVisionFaceDetector(options);
        }

        // Public so CameraSource can pass frames in (see Step 5)
        public void detectFaces(FirebaseVisionImage image) {
            detector.detectInImage(image)
                    .addOnSuccessListener(new OnSuccessListener<List<FirebaseVisionFace>>() {
                        @Override
                        public void onSuccess(List<FirebaseVisionFace> faces) {
                            for (FirebaseVisionFace face : faces) {
                                // Handle detected faces here (bounding box, landmarks, smile probability)
                            }
                        }
                    })
                    .addOnFailureListener(new OnFailureListener() {
                        @Override
                        public void onFailure(@NonNull Exception e) {
                            // Handle detection failure here
                        }
                    });
        }
    }
    ```
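
Note that `detectInImage()` runs asynchronously while the camera delivers a new preview frame many times per second, so frames can queue up faster than the detector can process them. A common approach is to drop frames while a detection is still in flight; a plain-Java sketch (the class name `FrameGate` is our own, not part of the tutorial's code):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Drops frames while a detection is still running, so the detector
// never has more than one frame in flight at a time.
public class FrameGate {
    private final AtomicBoolean busy = new AtomicBoolean(false);

    // Returns true if the caller may process this frame; returns false
    // (skip the frame) while an earlier detection is still pending.
    public boolean tryAcquire() {
        return busy.compareAndSet(false, true);
    }

    // Call from both the success and failure listeners of the detector.
    public void release() {
        busy.set(false);
    }
}
```

In `onPreviewFrame`, call `tryAcquire()` before converting the frame and `release()` in both listeners of `detectFaces()`.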

Step 5: Capture Camera Frames and Detect Faces

  1. Modify the CameraSource class to capture frames and use the face detector:
    ```java
    import android.content.Context;
    import android.graphics.Bitmap;
    import android.hardware.Camera;
    import android.view.SurfaceHolder;
    import com.google.firebase.ml.vision.common.FirebaseVisionImage;

    public CameraSource(final Context context) {
        super(context);
        camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                Camera.Parameters parameters = camera.getParameters();
                int width = parameters.getPreviewSize().width;
                int height = parameters.getPreviewSize().height;
                // Convert the NV21 preview frame to a Bitmap, then wrap it for ML Kit
                Bitmap bitmap = BitmapUtils.getBitmapFromYUV(data, width, height);
                FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(bitmap);

                // Hand the frame to the detector in MainActivity
                ((MainActivity) context).detectFaces(image);
            }
        });
        SurfaceHolder holder = getHolder();
        holder.addCallback(this);
    }
    ```

  2. Create BitmapUtils class to convert YUV data to Bitmap:
    ```java
    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.graphics.ImageFormat;
    import android.graphics.Rect;
    import android.graphics.YuvImage;
    import java.io.ByteArrayOutputStream;

    public class BitmapUtils {
        public static Bitmap getBitmapFromYUV(byte[] data, int width, int height) {
            // Preview frames arrive in NV21 format; compress to JPEG, then decode to a Bitmap
            YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            yuvImage.compressToJpeg(new Rect(0, 0, width, height), 50, out);
            byte[] bytes = out.toByteArray();
            return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
        }
    }
    ```
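
An NV21 frame is a full-resolution Y (luma) plane followed by an interleaved V/U plane at half resolution in each dimension, so a valid buffer holds `width*height + 2*ceil(width/2)*ceil(height/2)` bytes. A small helper (our own addition, not part of the tutorial's classes) to sanity-check buffers before conversion:

```java
public class Nv21Utils {
    // NV21 = full-resolution Y plane + interleaved VU plane at quarter
    // resolution, with chroma dimensions rounded up for odd sizes.
    public static int expectedLength(int width, int height) {
        int chromaWidth = (width + 1) / 2;
        int chromaHeight = (height + 1) / 2;
        return width * height + 2 * chromaWidth * chromaHeight;
    }

    // True if the preview buffer matches the declared preview size.
    public static boolean isValidFrame(byte[] data, int width, int height) {
        return data != null && data.length == expectedLength(width, height);
    }
}
```

Checking this before calling `getBitmapFromYUV` guards against crashes when the preview size changes mid-stream.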

By following these steps, Firebase ML Kit Face Detection can be effectively integrated into an Android app. Each step ensures proper setup and use of the camera and Firebase Vision API to detect faces in real-time.

Explore more Firebase tutorials

Complete Guide to Firebase: Tutorials, Tips, and Best Practices

Explore our Firebase tutorials directory - an essential resource for learning how to create, deploy and manage robust server-side applications with ease and efficiency.

Why are companies choosing Bootstrapped?

40-60%

Faster with no-code

No-code tools allow us to develop and deploy your new application 40-60% faster than traditional app development methods.

90 days

From idea to MVP

Save time, money, and energy with an optimized hiring process. Access a pool of experts who are sourced, vetted, and matched to meet your precise requirements.

1 283 apps

built by our developers

With the Bootstrapped platform, managing projects and developers has never been easier.

hero graphic

Our capabilities

Bootstrapped offers a comprehensive suite of capabilities tailored for startups. Our expertise spans web and mobile app development, utilizing the latest technologies to ensure high performance and scalability. The team excels in creating intuitive user interfaces and seamless user experiences. We employ agile methodologies for flexible and efficient project management, ensuring timely delivery and adaptability to changing requirements. Additionally, Bootstrapped provides continuous support and maintenance, helping startups grow and evolve their digital products. Our services are designed to be affordable and high-quality, making them an ideal partner for new ventures.

Engineered for you

1

Fast Development: Bootstrapped specializes in helping startup founders build web and mobile apps quickly, ensuring a fast go-to-market strategy.

2

Tailored Solutions: The company offers customized app development, adapting to specific business needs and goals, which ensures your app stands out in the competitive market.

3

Expert Team: With a team of experienced developers and designers, Bootstrapped ensures high-quality, reliable, and scalable app solutions.

4

Affordable Pricing: Ideal for startups, Bootstrapped offers cost-effective development services without compromising on quality.

5

Supportive Partnership: Beyond development, Bootstrapped provides ongoing support and consultation, fostering long-term success for your startup.

6

Agile Methodology: Utilizing agile development practices, Bootstrapped ensures flexibility, iterative progress, and swift adaptation to changes, enhancing project success.

Yes, if you can dream it, we can build it.