There's a section called Blend Shape Coefficients, which are used to describe expressions present in a face. There is a blend shape coefficient for the left side of the mouth smiling and one for the right side of the mouth smiling. You could easily access those to detect a smile.
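As a sketch of that approach (assuming an ARSCNViewDelegate receiving ARFaceAnchor updates; the 0.5 threshold is an arbitrary choice for illustration, not a recommended value):

```swift
import ARKit
import SceneKit

// Sketch: read the smile blend shape coefficients from an ARFaceAnchor.
final class FaceTrackingDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // Coefficients range from 0.0 (neutral) to 1.0 (fully expressed).
        let left = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        let right = faceAnchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0
        // Treat the face as smiling when both mouth corners are raised.
        if left > 0.5 && right > 0.5 {
            print("Smile detected")
        }
    }
}
```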
Thank you for the suggestion. As far as I know, AR face tracking requires a TrueDepth camera. I need something that would work with older devices. I found that a CIDetector with the option [CIDetectorSmile: true] (casting its features to [CIFaceFeature]) would work, but I just don't know how to implement it in your code.
I haven't used CIDetector before, but it looks like you could add the code to the captureOutput(_:didOutput:from:) function. You would need to:
Convert the imageBuffer to a CIImage
Create a CIDetector using something like:
let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [:])
Then run the features method on the detector:
let features = detector?.features(in: ciImageFromImageBuffer) as? [CIFaceFeature] ?? []
Then check whether any of the features has .hasSmile set to true
You may want to create the detector once, outside of the captureOutput function, so it can be reused across frames. Also check the documentation for CIDetectorTypeFace — it mentions that you can make detection more efficient by passing the image orientation as an option.
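Putting the steps above together, a minimal sketch (the SmileDetector class name is a placeholder; note that CIDetectorSmile is passed to features(in:options:), not to the initializer):

```swift
import CoreImage
import CoreVideo

// Sketch: create the CIDetector once and reuse it for every frame,
// since constructing a detector is relatively expensive.
final class SmileDetector {
    private let detector = CIDetector(
        ofType: CIDetectorTypeFace,
        context: nil,
        options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    )

    func isSmiling(in imageBuffer: CVImageBuffer) -> Bool {
        // Wrap the CVImageBuffer from captureOutput(_:didOutput:from:).
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        // CIDetectorSmile must be passed here, otherwise hasSmile
        // is never populated on the returned features.
        let features = detector?.features(
            in: ciImage,
            options: [CIDetectorSmile: true]
        ) as? [CIFaceFeature] ?? []
        return features.contains { $0.hasSmile }
    }
}
```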
I was not able to convert the imageBuffer to a CIImage.
I tried this:
var ciImageFromImageBuffer = CIImage(image: imageBuffer)
But got the error: Cannot convert value of type 'CVImageBuffer' (aka 'CVBuffer') to expected argument type 'UIImage'
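That error appears because CIImage(image:) takes a UIImage; for a CVImageBuffer, CIImage has a dedicated initializer. A minimal sketch:

```swift
import CoreImage
import CoreVideo

// CIImage(image:) expects a UIImage; for a CVImageBuffer use the
// cvImageBuffer: initializer instead.
func ciImage(from imageBuffer: CVImageBuffer) -> CIImage {
    CIImage(cvImageBuffer: imageBuffer)
}
```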
The project is set up in Portrait mode. See the General tab for the project target. To support landscape, you first need to check one or both of the Landscape boxes.
You may also need to check the UIDevice.current.orientation and adjust some views accordingly when it is in landscape mode.
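For example, one place to react to rotation is viewWillLayoutSubviews(); the class name here is a placeholder, and what you rearrange depends on your layout:

```swift
import UIKit

// Sketch: adjust the layout when the device rotates.
final class CameraViewController: UIViewController {
    override func viewWillLayoutSubviews() {
        super.viewWillLayoutSubviews()
        if UIDevice.current.orientation.isLandscape {
            // Rearrange views for landscape here.
        } else {
            // Restore the portrait layout here.
        }
    }
}
```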