Building a Camera App With SwiftUI and Combine | raywenderlich.com

Learn to natively build your own SwiftUI camera app using Combine and create fun filters using the power of Core Image.


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/26244793-building-a-camera-app-with-swiftui-and-combine

Thanks for your article.
Well done, but I have a question.
On my development phone (an iPhone 6s), if the position on AVCaptureDevice is set to back instead of front, the screen updates too slowly with the Comic and Crystal filters.
Is there any way to improve this?

The back camera definitely captures at a higher resolution, so that might be what's slowing it down. Does the Monochrome filter work well on the back camera? I'll take a look to see if I can find an optimization for the code, and I'll try to get my hands on an iPhone 6s.
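In the meantime, one common mitigation for older hardware is to cap the capture resolution before the frames ever reach the filter pipeline. Here's a minimal sketch, assuming your camera manager has access to the AVCaptureSession (the configureSession function name is just for illustration):

```swift
import AVFoundation

// Sketch: cap the session's resolution so each frame is cheaper to filter.
// Assumes you can reach the AVCaptureSession used by the camera manager.
func configureSession(_ session: AVCaptureSession) {
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // .hd1280x720 (or even .vga640x480) is far cheaper to run Core Image
    // filters on than the back camera's full native resolution.
    if session.canSetSessionPreset(.hd1280x720) {
        session.sessionPreset = .hd1280x720
    }
}
```

Fewer pixels per frame means less work per filter pass, which should help the Comic and Crystal filters keep up on a 6s.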


Love the article. For some reason, while setting up the Combine pipeline, Xcode tells me that CGImage doesn't have a .create method. Any ideas?

Under the Extensions group, there’s a file called CGImageExtensions.swift, which should include an extension on CGImage to add the create method.

Can you see that in the downloaded material?
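For reference, here's a rough sketch of what an extension like that might look like; it builds a CGImage from a camera frame's CVPixelBuffer via Core Image. This is my own approximation, not necessarily the exact code in the materials:

```swift
import CoreImage
import CoreVideo

// Sketch of an extension like the one in CGImageExtensions.swift:
// convert a CVPixelBuffer from the camera into a CGImage.
extension CGImage {
    static func create(from cvPixelBuffer: CVPixelBuffer?) -> CGImage? {
        guard let pixelBuffer = cvPixelBuffer else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        // In production you'd want to reuse a single CIContext,
        // since creating one per frame is expensive.
        let context = CIContext()
        return context.createCGImage(ciImage, from: ciImage.extent)
    }
}
```

If the file is present but Xcode still can't find the method, check that CGImageExtensions.swift is included in the app target's membership.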

Thanks for the great tutorial!

Can you tell whether or not this approach is efficient in terms of resources and performance?
For example, how does the simple Combine pipeline (without CI filters) compare with the classic approach using UIViewRepresentable and AVPreviewLayer?

Thanks for the kind words!

I haven't done any performance tests. However, without the CIFilters, I would imagine using an AVPreviewLayer would be faster. An Apple engineer explained to me that the preview layer has the shortest path from the camera sensor to the screen pixels, so that will always win.
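For anyone who wants to compare the two paths, here's a minimal sketch of the "classic" approach: wrapping AVCaptureVideoPreviewLayer in a UIViewRepresentable. The session is assumed to come from your own camera manager, and this is a simplified version (a custom UIView subclass with layerClass would handle resizing more cleanly):

```swift
import SwiftUI
import UIKit
import AVFoundation

// Sketch: the classic preview path for comparison with the Combine pipeline.
struct ClassicPreview: UIViewRepresentable {
    let session: AVCaptureSession

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Keep the preview layer sized to the hosting view.
        uiView.layer.sublayers?.first?.frame = uiView.bounds
    }
}
```

The trade-off: the preview layer renders frames directly without touching your code, but you can't insert Core Image filters into that path the way the Combine pipeline allows.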

How would you update this guide to use the built-in structured concurrency APIs, with async/await and Task, to replace the outgoing GCD/DispatchQueue approach?
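One possible direction (not from the tutorial): bridge AVCaptureVideoDataOutput's delegate callbacks into an AsyncStream, so frames can be consumed with for await inside a Task instead of hopping between dispatch queues. A sketch of that idea, with all names here being my own:

```swift
import AVFoundation
import CoreMedia

// Sketch: expose camera frames as an AsyncStream so they can be consumed
// with structured concurrency. Register an instance of this class via
// output.setSampleBufferDelegate(_:queue:) as usual.
final class FrameStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var continuation: AsyncStream<CVPixelBuffer>.Continuation?

    // Frames still arrive on the queue passed to setSampleBufferDelegate;
    // we just forward them into the stream.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        continuation?.yield(buffer)
    }

    // Access this once; each call creates a fresh stream and replaces
    // the stored continuation.
    var frames: AsyncStream<CVPixelBuffer> {
        AsyncStream { continuation in
            self.continuation = continuation
        }
    }
}

// Usage sketch:
// Task {
//     for await pixelBuffer in streamer.frames {
//         // Filter the frame and publish it to the UI here.
//     }
// }
```

The delegate callback itself still comes from AVFoundation on its own queue, so this doesn't remove GCD entirely; it just moves the consumption side of the pipeline into async/await.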