Face Detection Tutorial Using the Vision Framework for iOS | raywenderlich.com

Thanks @konsdor!

Regarding smile detection: do you have the option to use ARKit instead? If so, check out this tutorial:

https://www.raywenderlich.com/5491-ar-face-tracking-tutorial-for-ios-getting-started

There’s a section called Blend Shape Coefficients, which are used to describe expressions present in a face. There is a blend shape coefficient for the left side of the mouth smiling and one for the right side of the mouth smiling. You could easily access those to detect a smile.
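For reference, here's a minimal sketch of what reading those coefficients might look like, assuming an ARSCNViewDelegate callback and a smile threshold of 0.5 (both choices are mine, not from the tutorial):

import ARKit

// Hypothetical delegate standing in for your ARSCNViewDelegate.
class FaceTracker: NSObject, ARSCNViewDelegate {
  func renderer(_ renderer: SCNSceneRenderer,
                didUpdate node: SCNNode,
                for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // Each blend shape coefficient runs from 0.0 (neutral)
    // to 1.0 (maximum expression).
    let left = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
    let right = faceAnchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0

    // Treat it as a smile when both mouth corners are raised.
    // The 0.5 threshold is an assumption; tune it to taste.
    let isSmiling = left > 0.5 && right > 0.5
    print("Smiling: \(isSmiling)")
  }
}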

Thank you for the suggestion. As far as I know, AR Face Tracking requires a TrueDepth camera, and I need something that will work on older devices. I found that a CIDetector with options [CIDetectorSmile: true], cast to [CIFaceFeature], would work, but I just don’t know how to implement it in your code.

I haven’t used CIDetector before, but it looks like you could add the code to the captureOutput(_:didOutput:from:) function. You would need to:

  1. Convert the imageBuffer to a CIImage
  2. Create a CIDetector using something like:
let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [:])
  3. Then run the features method on the detector:
let features = detector?.features(in: ciImageFromImageBuffer) as? [CIFaceFeature] ?? []
  4. Then check whether any of the features has .hasSmile == true

You may want to create the detector once, outside of the captureOutput function, so you can reuse it. Also check the documentation for CIDetectorTypeFace: it mentions that you can make detection more efficient by passing an orientation option. See the sketch below.
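Here's a rough, untested sketch of how steps 1–4 might fit together, using a hypothetical SmileDetector class to stand in for the tutorial's view controller:

import AVFoundation
import CoreImage

// Hypothetical delegate class; the tutorial's view controller
// already conforms to AVCaptureVideoDataOutputSampleBufferDelegate.
class SmileDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

  // Create the detector once and reuse it across frames.
  let smileDetector = CIDetector(
    ofType: CIDetectorTypeFace,
    context: nil,
    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

  func captureOutput(_ output: AVCaptureOutput,
                     didOutput sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
      return
    }

    // 1. Convert the CVImageBuffer to a CIImage.
    let ciImage = CIImage(cvImageBuffer: imageBuffer)

    // 2 & 3. Run the face detector, asking it to evaluate smiles.
    // You could also pass CIDetectorImageOrientation here to make
    // detection more efficient.
    let features = smileDetector?.features(
      in: ciImage,
      options: [CIDetectorSmile: true]) as? [CIFaceFeature] ?? []

    // 4. Check whether any detected face is smiling.
    let isSmiling = features.contains { $0.hasSmile }
    print("Smiling: \(isSmiling)")
  }
}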

Thanks @toojuice!

I was not able to convert the imageBuffer to a CIImage.
I tried this:
var ciImageFromImageBuffer = CIImage(image: imageBuffer)
But got the error: Cannot convert value of type ‘CVImageBuffer’ (aka ‘CVBuffer’) to expected argument type ‘UIImage’

The imageBuffer in the tutorial is of type CVImageBuffer, so the way to convert it to a CIImage is using:

... = CIImage(cvImageBuffer: imageBuffer)

Thank you so much! It works now!

Hi @toojuice,

The smile detector works fine, but only in portrait mode. How can I make it work in landscape?

I ran your code, but it doesn’t work on iOS 11.

guard
    let results = request.results as? [VNFaceObservation],
    let result = results.first
    else {
        print("results.first")
        // 2
        self.faceView.clear()
        return
}

results.count is 0 on iOS 11; on iOS 12 it works normally.

Hi,

The project is set up in Portrait mode. See the General tab for the project target. To support landscape, you first need to check one or both of the Landscape boxes.

You may also need to check UIDevice.current.orientation and adjust some views accordingly when the device is in landscape.
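As a minimal sketch (the class name and the closure body are placeholders, not tutorial code):

import UIKit

// Hypothetical view controller; the tutorial's own view controller
// would override the same method.
class CameraViewController: UIViewController {
  override func viewWillTransition(to size: CGSize,
                                   with coordinator: UIViewControllerTransitionCoordinator) {
    super.viewWillTransition(to: size, with: coordinator)
    coordinator.animate(alongsideTransition: nil) { _ in
      if UIDevice.current.orientation.isLandscape {
        // Adjust frames or the preview layer's video
        // orientation for landscape here.
      }
    }
  }
}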

Hi @nqdung2306,

I wrote the tutorial using iOS 12 and, unfortunately, I don’t have any devices running iOS 11 anymore, so I can’t help you check/debug it.

Sorry

This tutorial is more than six months old, so questions regarding it are no longer supported. Thank you!