1. Introduction¶
BCC Face is an Android library meant to be integrated into an Android application.
It uses the device’s camera to take a picture of a face for biometric purposes. It provides a simple active liveness test, requiring the person to smile for about a second and/or look to the right or left. The liveness test includes an option to speak the instructions, facilitating the workflow for users. Additionally, it provides a passive liveness test that can be used to check if the photo was taken from a real person without requiring user interaction.
This manual is updated to BCC Face Mobile Android version 4.6.1.
Attention
The passive liveness test is available only from version 4.4.0 onwards.
1.1. Requirements¶
BCC Face is an Android library, and must be imported into the target project.
Minimum Android Version: Android 6.0 (SDK 23), “Marshmallow”.
The mobile device must have a camera.
The native app must be built with Android technology.
Development environment: an Android IDE is required, such as Android Studio (recommended).
Additional external dependencies:
- Google ML Kit, face-detection version 16.0.2;
- Lottie, version 3.0.0;
- Libyuv-android, version 1.0.0;
External services:
- Firebase, by Google. An account is required, but no charges will be applied since the library uses only on-device APIs.
2. Installation¶
2.1. Adding the Library in the App Project¶
BCC Face is provided by Griaule as a .aar file.
To add the library, go to your project directory, open the app folder, and create the libs/bccface directory. Then, add the bccfacelib-release.aar dependency. The folder structure must be similar to this:
The next step is to make these files visible to the Gradle dependencies. To do this, add the following line to the build.gradle (:app) file, inside the dependencies block:
dependencies {
[...]
implementation fileTree(dir: 'libs/bccface', include: ['*.aar'])
}
Inside the build.gradle (:app) file, also add the compile options and set Source Compatibility and Target Compatibility to 1.8 (Java 8):
compileOptions {
sourceCompatibility = 1.8
targetCompatibility = 1.8
}
2.2. Setting up Google ML Kit¶
It is recommended to follow the instructions provided in “Option 1: Use the Firebase console setup workflow” of the Firebase for Android documentation: Add Firebase to your Android project.
An account is required, but no charges will be applied since the library uses only on-device APIs: See Firebase pricing details.
If Option 1 was chosen, make sure to generate the google-services.json file and place it in the android/app/ directory.
Make changes to the following files:
android/build.gradle
buildscript {
    ...
    dependencies {
        ...
        classpath 'com.google.gms:google-services:4.3.3'
    }
}
android/app/build.gradle, add to the bottom of the file:
...
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-kapt'
apply plugin: 'com.google.gms.google-services'
...
android {
    ...
    buildFeatures {
        viewBinding = true
        dataBinding = true
    }
}
...
dependencies {
    ...
    // Add the line below when using analytics
    implementation 'com.google.firebase:firebase-analytics:17.2.2'
}
2.3. Setting up all dependencies¶
Make changes to the following files:
android/build.gradle
allprojects {
    repositories {
        ...
        maven { url 'https://jitpack.io' }
    }
}
android/app/build.gradle
...
dependencies {
    ...
    implementation project(path: ':bccfacelib')

    // ANDROIDX //
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    implementation 'com.google.android.material:material:1.1.0'

    // Google MLKIT //
    implementation 'com.google.mlkit:face-detection:16.0.2'

    // LOTTIE //
    implementation 'com.airbnb.android:lottie:3.0.0'

    // LIBYUV //
    implementation 'com.github.xiaoxiaoqingyi:libyuv-android:v1.0'

    // CAMERA X //
    def camerax_version = "1.2.0-rc01"
    // CameraX core library using camera2 implementation
    implementation "androidx.camera:camera-camera2:$camerax_version"
    // CameraX Lifecycle Library
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
    // CameraX View class
    implementation "androidx.camera:camera-view:$camerax_version"
}
3. Usage¶
3.1. Parameters and Constructor¶
To properly use the BCC Face library, there are some required parameters.
A simple library usage example is shown below:
BCCFaceBuilder(this, this).initializeCapture()
The BCCFaceBuilder class constructor receives the following parameters:
- context: Context - The application context.
- delegate: BCCFaceDelegate - Interface responsible for notifying capture events (e.g. failure or success).
The BCCFaceBuilder class is responsible for handling the usage configuration for BCCFace. The following parameters are accepted for configuring biometric capture and software behavior:
- buildSmileCheck(smileProbability: Float = 0.8f) - Adds a smile to the liveness test and defines the acceptance threshold. This feature is enabled by default.
- removeSmileCheck() - Removes the smile from the liveness check.
- buildRotationCheck(livenessConfigList: List<HeadRotationCheck>, headRotationAngle: Float = 20f) - Defines a list of liveness tests for head rotation and the maximum rotation angle. This feature is enabled by default. The head rotation options are:
  enum class HeadRotationCheck { randomRotation, leftRotation, rightRotation; }
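For instance, the rotation check can be restricted to a single direction with a tighter angle. A minimal sketch using the method and enum above (the 15f threshold is an illustrative value, not a recommendation):

```kotlin
// Sketch: require only a left head turn for the active liveness test,
// with a maximum rotation angle of 15 degrees (default is 20f).
BCCFaceBuilder(this, this)
    .removeSmileCheck() // keep only the rotation check active
    .buildRotationCheck(
        listOf(HeadRotationCheck.leftRotation),
        15f
    )
    .initializeCapture()
```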
- removeHeadRotation() - Removes head rotation from the liveness check.
- addPassiveLiveness() - Adds the passive liveness test. This feature is used to check whether the captured photo is from a real person without requiring any user interaction. To use this feature, you MUST disable the active liveness checks (removeSmileCheck() and removeHeadRotation()) that are added by default.
- buildSpeechSettings(speechSettings: SpeechSettings?) - Defines the criteria for accessibility speech, using the following parameters:
  class SpeechSettings(
      val volume: Float = 1.0f,
      val startsMuted: Boolean = true,
      val pitch: Float = 1.0f,
      val speed: Float = 1.0f
  )
- volume - The audio volume, between 0.0 and 1.0.
- startsMuted - Defines whether the instructions start muted (true for muted).
- pitch - Defines the voice pitch for the instructions, between 0.5 (low) and 2.0 (high).
- speed - Defines the voice speed for the instructions. This value must be positive.
The pre-defined values can be accessed through the static variable:
class SpeechSettings {
    companion object {
        val defaultSpeechSettings = SpeechSettings()
    }
}
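As an example, speech could be enabled from the start at a reduced volume, keeping the defaults for the other fields. A sketch (the 0.8f volume is illustrative):

```kotlin
// Sketch: accessibility speech starts unmuted at 80% volume;
// pitch and speed keep their default values.
BCCFaceBuilder(this, this)
    .buildSpeechSettings(
        SpeechSettings(
            volume = 0.8f,
            startsMuted = false
        )
    )
    .initializeCapture()
```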
- removeSpeech() - Removes accessibility speech.
- setReviewEnable(enable: Boolean) - Defines whether the biometric capture review screen is enabled or disabled.
- setInstructionEnable(enable: Boolean) - Defines whether the instruction screen is enabled or disabled.
- forceLanguage(language: BCCFaceAPI.BCCLanguages?) - Forces the instructions to be shown in a single language. If the device language is not supported, English will be used. The supported languages are:
  enum class BCCLanguages(val locale: Locale) {
      ptBR(Locale("pt", "br")),
      enUS(Locale.US),
      esMX(Locale("es", "mx")),
      deviceLanguage(Locale.getDefault());
  }
- removeLanguage() - Removes the forced language.
- enableFlipCameraButton(enable: Boolean) - Activates (true) or deactivates (false) the camera flip button (front or back camera). Default: enabled (true).
- setCameraInitialDirection(CameraFacingDirection.<FRONT or BACK>) - Defines the initial camera direction; choose between:
  - CameraFacingDirection.FRONT - Front camera (default setting).
  - CameraFacingDirection.BACK - Back camera.
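The configuration methods above can be chained on a single builder. A sketch combining the instruction screen, the review screen, and a forced language:

```kotlin
// Sketch: show the instruction and review screens and force the
// on-screen instructions to Brazilian Portuguese.
BCCFaceBuilder(this, this)
    .setInstructionEnable(true)
    .setReviewEnable(true)
    .forceLanguage(BCCFaceAPI.BCCLanguages.ptBR)
    .initializeCapture()
```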
All camera direction settings are obtained through the combination of the enableFlipCameraButton and setCameraInitialDirection methods. For example:
- Start with the front camera and enable the camera flip button:
  BCCFaceBuilder(this, this)
      .enableFlipCameraButton(true)
      .setCameraInitialDirection(CameraFacingDirection.FRONT)
- Start with the back camera and disable the camera flip button:
  BCCFaceBuilder(this, this)
      .enableFlipCameraButton(false)
      .setCameraInitialDirection(CameraFacingDirection.BACK)
For reference, here is a full list of parameters and default values:
var showPhotoReview: Boolean = false,
var showInstructionScreen: Boolean = false,
var stopVerifyEyesOpenTimeout: Long = 20,
var useICAO: Boolean = false,
var smilingProbabilityThreshold: Float = 0.8f,
var rotateAngle: Float = 20f,
var livenessList: MutableList<LivenessTypes> = mutableListOf(LivenessTypes.SMILE, LivenessTypes.ROTATION_RANDOM),
var language: BCCFaceAPI.BCCLanguages? = null,
var speechSettings: SpeechSettings? = SpeechSettings(),
var activityResults: FaceCaptureActivityResults? = null
Attention
The active liveness test (using smile and random head rotation) is enabled by default. To use only the passive liveness check, remove the active liveness methods from the builder before adding it:
BCCFaceBuilder(this, this)
.removeSmileCheck()
.removeHeadRotation()
.addPassiveLiveness()
Here is a code snippet to initialize a capture using only the passive liveness test:
// From the desired Activity...
// Create the builder
val faceBuilder = BCCFaceBuilder(this, this)
// Set up the builder
faceBuilder
    .removeSmileCheck()
    .removeHeadRotation()
    .addPassiveLiveness()
// Initialize the capture from the builder
faceBuilder.initializeCapture()
3.2. Return Values¶
The results from the last facial capture can be retrieved using the faceCaptureDidFinish method from the BCCFaceDelegate interface:
fun faceCaptureDidFinish(
data: BCCFaceReturnData,
analytics: BCCFaceReturnAnalytics
)
The data object contains both images captured during the process:
class BCCFaceReturnData {
val originalPhoto: Bitmap?
val croppedPhoto: Bitmap?
// Passive Liveness Result
val passiveResult: ByteArray?
}
The returned properties are:
- originalPhoto (image) - The original photo taken by the camera.
- croppedPhoto (image) - The cropped photo, which is the face image cropped from the original photo.
- passiveResult (ByteArray) - The passive liveness capture result, returned as a JPEG ByteArray. It is present only if the capture was successful. This data can be saved or exported directly to a file, or sent over the network (for networking, it can easily be encoded, e.g. as a Base64 string).
Note
The originalPhoto and croppedPhoto properties return null when the capture is performed only for passive tests (i.e., without including an active test).
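Since passiveResult is a JPEG ByteArray, it can be written straight to disk or Base64-encoded for transmission. A minimal sketch using standard java.io and java.util APIs (the file name and the helper function are illustrative, not part of the library):

```kotlin
import java.io.File
import java.util.Base64

// Sketch: persist and encode the passive liveness result.
// `passiveResult` would come from BCCFaceReturnData in faceCaptureDidFinish;
// the parameter here is a placeholder for illustration.
fun handlePassiveResult(passiveResult: ByteArray?) {
    val bytes = passiveResult ?: return // nothing captured

    // Save the JPEG content directly to a file (path is illustrative).
    File("passive_liveness.jpg").writeBytes(bytes)

    // Or encode it as a Base64 string for network transmission.
    val encoded: String = Base64.getEncoder().encodeToString(bytes)
    println("Encoded ${bytes.size} bytes into ${encoded.length} Base64 chars")
}
```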
If the user aborts the capture, closing it before the biometrics are captured, the faceCaptureDidAbort method will be called. You can implement this method to handle this scenario.
3.3. Sample Project¶
This is a functional sample project for a face capture using BCC Face Mobile Android:
package com.example.bccfaceexample
import androidx.appcompat.app.AppCompatActivity
import android.os.Bundle
import com.example.bccfaceexample.databinding.ActivityMainBinding
import com.griaule.bccfacelib.analytics.BCCFaceReturnAnalytics
import com.griaule.bccfacelib.faceApi.BCCFaceBuilder
import com.griaule.bccfacelib.faceApi.BCCFaceDelegate
import com.griaule.bccfacelib.faceApi.BCCFaceReturnData
class MainActivity : AppCompatActivity(), BCCFaceDelegate {
private lateinit var binding: ActivityMainBinding
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
binding = ActivityMainBinding.inflate(layoutInflater)
setContentView(binding.root)
setupListeners()
}
private fun setupListeners() {
binding.startCaptureButton.setOnClickListener { initializeCapture() }
}
private fun initializeCapture() {
BCCFaceBuilder(this, this)
.setInstructionEnable(true)
.setReviewEnable(true)
.initializeCapture()
}
override fun faceCaptureDidAbort(
analytics: BCCFaceReturnAnalytics
) {
// ...
}
override fun faceCaptureDidFinish(
data: BCCFaceReturnData,
analytics: BCCFaceReturnAnalytics
) {
// ...
}
}