1. Introduction
BCC Face is a library distributed as a .framework file and integrated into an iOS application.
It uses the device's camera to take a picture of a face for biometric purposes. It provides a simple active liveness test, requiring the person to smile for about a second and/or look to the right or left. The liveness test includes an option to speak the instructions, facilitating the workflow for users. Additionally, it provides a passive liveness test that can be used to check whether the photo was taken of a real person without requiring user interaction.
This manual is updated for BCC Face Mobile iOS version 4.4.0.
Attention
The passive liveness test is available only from version 4.4.0 onwards.
1.1. Requirements
- Git
- CocoaPods, available at https://cocoapods.org/.
2. Installation
2.1. Installing Dependencies
1 - Add the following pod to the application dependencies in the Podfile:
pod 'GoogleMLKit/FaceDetection'
Note
If the application does not have a Podfile, it can be created in the root folder of your Xcode project by running pod init in the terminal.
It is preferable to use dynamic frameworks, which can be enabled with the use_frameworks! flag in the Podfile.
A Podfile example with a target called BCCs-Sample is shown below:
platform :ios, '13.0'

target 'BCCs-Sample' do
  use_frameworks!

  pod 'GoogleMLKit/FaceDetection', '~> 3.0.0'
  pod 'lottie-ios', '~> 3.3.0'
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['DEVELOPMENT_TEAM'] = "YOUR_DEVELOPMENT_TEAM_KEY"
      config.build_settings['BUILD_LIBRARY_FOR_DISTRIBUTION'] = 'YES'
    end
  end
end
Important
It is recommended that your application use the same minimum supported iOS version as this framework: iOS 13.0, as in the example above.
2 - Close the Xcode project, open a terminal and go to the folder where the Podfile is, and then run:
pod install
After the execution finishes, a file with the .xcworkspace extension will be created in the same folder.
3 - Open the new .xcworkspace file.
Warning
From now on, whenever you want to open the project, open it through this .xcworkspace file, as it includes the dependencies.
2.2. Importing and Configuring
2.2.1. Importing the Project
1 - Open the project using the .xcworkspace file.
2 - Add the BCCFace.framework file to the project, then add it to the framework list of your application:
- Move the .framework file to the project file tree. If there is already a framework folder, it is recommended to move the file there.
- Open the project settings and go to the General tab.
- Click and drag the .framework to the project tree, under the "Frameworks, Libraries, and Embedded Content" section.
- Change the BCCFace.framework setting from "Do not embed" to "Embed & Sign".
3 - Change the target version of your project to a minimum of iOS 13.
Note
It is recommended to disable iPad as a target.
2.2.2. Initial Configuration
This version has no dependency on Firebase, nor does it require an initial configuration call in the AppDelegate. The only initial configuration needed is that the application must request camera usage permission. To do so, add the following key in the info.plist file, at the Information Property List:
Key: Privacy - Camera Usage Description
Value: Allow access to camera
The key's value is the message shown to the user when requesting camera use permission. This value can be blank or filled with a custom message.
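The system shows the permission prompt automatically the first time the camera is used. If you prefer to check or request access yourself before starting the capture (for example, to show a custom explanation first), a minimal sketch using AVFoundation could look like this; the function name is illustrative:

import AVFoundation

/// Requests camera permission, then runs `onAuthorized` on the main thread.
func ensureCameraAccess(onAuthorized: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        onAuthorized()
    case .notDetermined:
        // Triggers the system prompt described by the Info.plist key above
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted {
                DispatchQueue.main.async { onAuthorized() }
            }
        }
    default:
        // Denied or restricted: the user must enable access in Settings
        break
    }
}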
3. Usage
3.1. Parameters and Constructor
To properly use the BCC Face library, there are some required parameters.
A simple library usage example is shown below:
BCCFaceBuilder(self, delegate: self).initializeCapture()
The BCCFaceBuilder class constructor receives the following parameters:
- hostVC: UIViewController - The view controller that calls the capture screen.
- delegate: BCCFaceDelegate - The interface responsible for notifying capture events (e.g. failure or success).
The initializeCapture method also accepts an optional parameter, as shown below:
public func initializeCapture(
    _ navController: UINavigationController? = nil
) { ... }
If you want the navigation to run through a navigation controller, you must provide it when calling the method.
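For example, to push the capture flow onto an existing navigation stack, pass the host's navigation controller; otherwise, call the method with no arguments:

// Present the capture screen through the current navigation controller
BCCFaceBuilder(self, delegate: self)
    .initializeCapture(self.navigationController)

// Or present it without a navigation controller
BCCFaceBuilder(self, delegate: self)
    .initializeCapture()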
The BCCFaceBuilder class is responsible for handling the usage configuration for BCCFace. The following methods are available for configuring biometric capture and software behavior (a combined example is shown after this list):
- buildSmileCheck(with smileProbability: ClosedRange<Float> = 0.5...1.0) - Adds the smile liveness test and defines the acceptance threshold. This feature is enabled by default.
- removeSmileCheck() - Removes the smile liveness check.
- buildRotationCheck(_ rotationChecks: [HeadRotationCheck], headRotationAngle: ClosedRange<Float> = -6.0...6.0) - Defines a list of head rotation liveness tests and the maximum rotation angle. This feature is enabled by default. The head rotation options are:

enum HeadRotationCheck {
    case randomRotation
    case leftRotation
    case rightRotation
}

- removeHeadRotation() - Removes the head rotation liveness check.
- addPassiveLiveness() - Adds the passive liveness test. This feature is used to check if the captured photo is from a real person without requiring any user interaction. To use this feature, you MUST disable the active liveness checks (removeSmileCheck() and removeHeadRotation()) that are added by default.
- buildSpeechSettings(_ speechSettings: SpeechSettings) - Defines the criteria for accessibility speech, using the following parameters:

public class SpeechSettings {
    public let volume: Float
    public let startsMuted: Bool
    public let pitch: Float
    public let speed: Float
}

  - volume - The audio volume, between 0.0 and 1.0.
  - startsMuted - Defines whether the instructions start muted (true for muted).
  - pitch - Defines the voice pitch for the instructions, between 0.5 (low) and 2.0 (high).
  - speed - Defines the voice speed for the instructions. This value must be positive.

  The pre-defined values can be accessed through the static variable:

public static let defaultSpeechSettings = SpeechSettings(
    volume: 1.0,
    startsMuted: true,
    pitch: 1.0,
    speed: 0.5
)

- removeSpeech() - Removes accessibility speech.
- setReviewEnable(_ enable: Bool) - Defines whether the biometric capture review screen is enabled or disabled.
- setInstructionEnable(_ enable: Bool) - Defines whether the instruction screen is enabled or disabled.
- forceLanguage(_ language: BCCLanguages?) - Forces the instructions to be shown in a single language. If the device language is not supported, English will be used. The supported languages are:

public enum BCCLanguages: String {
    case ptBR = "pt-BR"
    case enUS = "en"
    case esMX = "es"
    case deviceLanguage = "deviceLanguage"
}

- removeLanguage() - Removes the forced language.
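As a reference, here is a minimal sketch combining several of these builder methods, assuming the build/set methods chain the same way as the removal methods in the passive liveness example below; the threshold and language values are illustrative only:

// Sketch: configure an active liveness capture with custom settings
let builder = BCCFaceBuilder(self, delegate: self)
    .buildSmileCheck(with: 0.7...1.0)             // stricter smile threshold (illustrative)
    .buildRotationCheck([.leftRotation], headRotationAngle: -6.0...6.0)
    .buildSpeechSettings(.defaultSpeechSettings)  // spoken instructions with the default voice
    .setReviewEnable(true)                        // show the capture review screen
    .forceLanguage(.enUS)                         // force English instructions

builder.initializeCapture(self.navigationController)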
For reference, here is a full list of parameters and default values:
var smileProbability: ClosedRange<Float> = 0.5...1.0
var headRotationAngle: ClosedRange<Float> = -6.0...6.0
var openEyesProbability: ClosedRange<Float> = 0.8...1.1
var livenessChecks: [LivenessChecks] = [.smileDetection, .headRotationRandom]
var speechSettings: SpeechSettings? = .defaultSpeechSettings
var language: BCCLanguages? = nil
var showPhotoReview: Bool = false
Attention
The active liveness test (using smile and random head rotation) is enabled by default. To use only the passive liveness check, the active liveness checks must be removed from the builder before adding it:
BCCFaceBuilder(self, delegate: self)
    .removeSmileCheck()
    .removeHeadRotation()
    .addPassiveLiveness()
Here is a code snippet that initializes a capture using only the passive liveness test:
// From the desired ViewController...
// Create the builder
let faceBuilder = BCCFaceBuilder(self, delegate: self)
// Setup the builder
faceBuilder
    .removeSmileCheck()
    .removeHeadRotation()
    .addPassiveLiveness()
// Initialize the capture from the builder
faceBuilder.initializeCapture(self.navigationController)
3.2. Return Values
The results from the last facial capture can be retrieved using the faceCaptureDidFinish method from the BCCFaceDelegate interface:
func faceCaptureDidFinish(
    data: BCCFaceReturnData,
    analytics: BCCFaceAnalytics
)
The data object contains the images captured during the process:
public struct BCCFaceReturnData {
    // Previously 'photo', renamed to conform to the same standard as Android
    public internal(set) var originalPhoto: UIImage
    public internal(set) var croppedPhoto: UIImage?
    // Passive liveness result
    public internal(set) var passiveResult: Data?
    // LEGACY API: same as 'originalPhoto'
    public var photo: UIImage { self.originalPhoto }
}
The returned properties are:
- originalPhoto (UIImage) - The original photo taken by the camera.
- croppedPhoto (UIImage) - The face image cropped from the original photo.
- passiveResult (Data) - The passive liveness collection, returned as JPEG data. It is present only if the capture succeeded. This data can be saved directly to a file or sent over the network (for example, encoded as a Base64 string), as sketched below.
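As an illustration, here is a minimal sketch of exporting the passive liveness data, assuming a BCCFaceReturnData value named data; the function and file names are illustrative:

import Foundation

func exportPassiveResult(_ data: BCCFaceReturnData) {
    // The result is optional: it is only present on a successful capture
    guard let passive = data.passiveResult else { return }

    // Save the JPEG data to the app's documents directory
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("passive_liveness.jpg")
    try? passive.write(to: url)

    // Or encode it for a network payload
    let base64Payload = passive.base64EncodedString()
    print("Passive liveness payload: \(base64Payload.count) characters")
}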
If the user aborts the capture, closing it before the biometrics are captured, the faceCaptureDidAbort method will be called. You can implement this method to handle this scenario, as in the sketch below.
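For example, assuming the delegate is a view controller, a minimal handler might simply inform the user; the alert text is illustrative:

func faceCaptureDidAbort(
    analytics: BCCFaceAnalytics
) {
    // Sketch: tell the user the capture was cancelled
    let alert = UIAlertController(
        title: "Capture cancelled",
        message: "The face capture was closed before completion.",
        preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(title: "OK", style: .default))
    present(alert, animated: true)
}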
3.3. Sample Project
This is a functional sample project for a face capture using BCC Face Mobile iOS:
import UIKit
import BCCFace

class ViewController: UIViewController {

    @IBOutlet weak var photoTaken: UIImageView!
    @IBOutlet weak var startCaptureButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    @IBAction func startCapture(_ sender: UIButton) {
        BCCFaceBuilder(self, delegate: self)
            .initializeCapture(navigationController)
    }
}

extension ViewController: BCCFaceDelegate {

    func faceCaptureDidFinish(
        data: BCCFace.BCCFaceReturnData,
        analytics: BCCFace.BCCFaceAnalytics
    ) {
        self.photoTaken.contentMode = .scaleAspectFill
        self.photoTaken.image = data.photo
    }

    func faceCaptureDidAbort(
        analytics: BCCFace.BCCFaceAnalytics
    ) {
        // ...
    }
}