Hello,
I need to call ffmpeg from a Swift routine I have written, but I am not sure how to get an NSImage back from it. I have tried linking against the ffmpeg libraries directly, but that has proven more difficult than I expected (unless someone has gone that route successfully and wants to share with the rest of the class).
Ultimately, I need a thumbnail, a proxy file, and the metadata from a given video at “itemPath”, all of which will go into either a Core Data or Firebase database, but one thing at a time.
import AppKit

func CreateThumbnailFromURL(itemPath: URL) -> NSImage {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg") // launchPath is deprecated
    // The ffmpeg options I want: -ss 120 -v error -i itemPath -s 480x270 -frames:v 1
    task.arguments = ["-ss", "120", "-v", "error", "-i", itemPath.path, "-s", "480x270", "-frames:v", "1"]
    try? task.run() // launch() is deprecated
    task.waitUntilExit()
    return thumbnail // <-- this is the part I can't figure out: where does the NSImage come from?
}
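For what it's worth, here is a rough, untested sketch of one approach I'm considering: have ffmpeg write a single frame to a temporary PNG, then read that file back with NSImage(contentsOf:). The /usr/local/bin/ffmpeg path, the 120-second seek, and the optional return type are just guesses from my setup, not something I'm confident is the right way to do this.

import AppKit
import Foundation

// Rough sketch: dump one frame to a temporary PNG, then load it as an NSImage.
func createThumbnail(from itemPath: URL) -> NSImage? {
    let outputURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("png")

    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg") // assumed install location
    task.arguments = [
        "-ss", "120",            // seek offset; placeholder value from my setup
        "-v", "error",           // keep ffmpeg quiet unless something goes wrong
        "-i", itemPath.path,
        "-s", "480x270",
        "-frames:v", "1",        // grab a single frame
        outputURL.path           // the output file I was missing in my version above
    ]

    do {
        try task.run()
    } catch {
        return nil               // ffmpeg binary missing or not executable
    }
    task.waitUntilExit()
    guard task.terminationStatus == 0 else { return nil }

    // Returns nil if ffmpeg didn't produce a readable image file.
    return NSImage(contentsOf: outputURL)
}

Writing to a temp file and reading it back seemed simpler than trying to pipe raw image data out of ffmpeg's stdout, but I don't know whether that's the cleaner way to do it.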
Any ideas?
Thanks.