AVCaptureVideoPreviewLayer size

I am using AVFoundation in an app to give access to the device's camera — specifically, an AVCaptureVideoPreviewLayer added to a UIView laid out on the storyboard.

When the camera is enabled (by tapping a camera button), the preview is not placed or sized where I would like or expect: it leaves a blank space at the top of the UIView and extends past the bottom of the screen / UIView. The app is locked to portrait orientation and does not need to rotate.

The UIView occupies most of the screen (375 × 567, origin at x = 0, y = 100), laid out in Interface Builder in Main.storyboard.

It is backed by this code:

@IBOutlet weak var viewCamera: UIView!
var layer: CALayer {
    return viewCamera.layer
}

var cameraSession: AVCaptureSession!
var device: AVCaptureDevice!
var cameraLayer: AVCaptureVideoPreviewLayer!

@IBAction func buttonClickCamera(_ sender: UIBarButtonItem) {
    buttonTextCamera.isEnabled = false
    setupCameraSession()
}

func setupCameraSession() {
    
    cameraSession = AVCaptureSession()
    cameraSession.sessionPreset = AVCaptureSession.Preset.photo
    
    device = AVCaptureDevice.default(for: AVMediaType.video)
    
    do {
        let deviceInput = try AVCaptureDeviceInput(device: device!)
        
        if cameraSession.canAddInput(deviceInput) {
            cameraSession.addInput(deviceInput)
        }
    } catch {
        print(error.localizedDescription)
    }
    
    let cameraOutput = AVCaptureVideoDataOutput()
    if cameraSession.canAddOutput(cameraOutput) {
        cameraSession.addOutput(cameraOutput)
    }
    
    cameraLayer = AVCaptureVideoPreviewLayer(session: cameraSession)
    // TRIED USING: cameraLayer.videoGravity = AVLayerVideoGravity.resizeAspect
    cameraLayer.frame = layer.frame
    layer.addSublayer(cameraLayer)
    
    cameraOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    
    cameraSession.startRunning()
    
}

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    processCameraBuffer(sampleBuffer: sampleBuffer) { result in
        // SOME APP PROCESSING / LOGIC
    } 
}

I can provide a screenshot if needed.

Can I calculate the size of the UIView, or of the camera capture input? Or force the size of either or both?
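For context, this is the kind of "forcing the size" I have in mind — only a sketch, and I'm not sure `viewDidLayoutSubviews` is the right hook, or whether `bounds` vs. `frame` matters here:

```swift
// Sketch of re-applying the preview layer's size on every layout pass.
// Assumptions: viewDidLayoutSubviews is an appropriate place for this,
// and cameraLayer may still be nil before the camera button is tapped.
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // bounds is in viewCamera's own coordinate space,
    // whereas frame is in its superview's coordinate space
    cameraLayer?.frame = viewCamera.bounds
}
```

Is something along these lines the right approach, or should the size come from the capture input instead?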

I have tried setting the cameraLayer.videoGravity property to both .resizeAspect and .resizeAspectFill, but nothing changed.
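Concretely, these are the two variants I tried, each placed immediately after creating the layer in setupCameraSession():

```swift
// Variant 1: letterbox the video inside the layer's bounds
cameraLayer.videoGravity = .resizeAspect

// Variant 2 (tried separately): fill the bounds, cropping if needed
cameraLayer.videoGravity = .resizeAspectFill
```

Neither changed the placement of the preview, which makes me suspect the problem is the layer's frame rather than the gravity setting.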