Tutorial on obtaining depth data from iPhone cameras

I’ve been trying to find a tutorial on how to obtain metadata (specifically depth data) from the iPhone’s TrueDepth camera.

Tutorials like the one below show how to visualize depth data, but they don't show how to actually obtain the depth data values:
Image Depth Maps Tutorial for iOS: Getting Started

Do you guys provide a tutorial that covers this, and if not, would you consider making one in the future?

Hi @toojuice, do you know this by any chance? Thanks in advance! :]

Yup, I can help. I’m away from my computer right now, but will post some code to do this later today.

Ok, you can do something like this:

import CoreVideo

func value(of pixelBuffer: CVPixelBuffer, x: Int, y: Int) -> Double {
    // Get the width and height of the depth map (which is a CVPixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    if x < 0 || y < 0 || x >= width || y >= height {
        return Double.nan
    }

    // Lock the base address of the CVPixelBuffer before reading from it...
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    // ...and don't forget to unlock it when you're done!
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let addr = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        return Double.nan
    }

    // Rows can be padded, so derive the row stride (in Float16 elements)
    // from bytesPerRow rather than from the width
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let rowStride = bytesPerRow / MemoryLayout<Float16>.stride

    // View the CVPixelBuffer as something that can be accessed like a 1D array
    let floatBuffer = addr.assumingMemoryBound(to: Float16.self)

    // Since the array is 1D, we need to convert (x, y) to a position along it
    let pos = y * rowStride + x
    let depth = floatBuffer[pos]

    return Double(depth)
}
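
In case it helps to see where that pixel buffer comes from, here's a rough sketch (the class name PhotoDepthReader is just made up for illustration) of reading a value out of the depth map delivered with a photo capture. It assumes you've already enabled depth delivery on your AVCapturePhotoOutput and AVCapturePhotoSettings:

import AVFoundation

// Hypothetical delegate showing where the depth map would come from:
// the AVCapturePhoto delivered by a capture with depth delivery enabled.
class PhotoDepthReader: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depthData = photo.depthData else { return }

        // depthDataMap is the CVPixelBuffer you pass to value(of:x:y:)
        let map = depthData.depthDataMap
        let center = value(of: map,
                           x: CVPixelBufferGetWidth(map) / 2,
                           y: CVPixelBufferGetHeight(map) / 2)
        print("Depth/disparity at center: \(center)")
    }
}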

I believe depth map CVPixelBuffers use Float16 (kCVPixelFormatType_DepthFloat16), which is why value(of:x:y:) reads the buffer as Float16. Depending on what kind of depth map you requested, the value you get back will be either depth or disparity, which is the inverse of depth.
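
If you want to be sure you're working with depth (and with Float16) before you index into the buffer, you can check depthDataType on the AVDepthData and ask it to convert itself first. Rough sketch; the helper name depthMap(from:) is just mine for illustration:

import AVFoundation

// Normalize whatever the capture gave us (disparity, Float32 depth, etc.)
// to 16-bit depth so the Float16 indexing above stays valid.
func depthMap(from depthData: AVDepthData) -> CVPixelBuffer {
    if depthData.depthDataType == kCVPixelFormatType_DepthFloat16 {
        return depthData.depthDataMap
    }
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)
    return converted.depthDataMap
}

AVDepthData does the disparity-to-depth conversion for you here, so you don't have to invert the values by hand.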

