Thursday, 12 November 2020

Trouble applying scaleTimeRange on multiple videos in an AVMutableComposition video

I am attempting to merge videos and apply scaleTimeRange to each of them (to make them slow motion or sped up); however, it is not working as desired. Only the first video gets the time-range effect, not the rest.

The work is done in the merge-videos function; it is fairly simple, but I cannot figure out why the time-range scaling works only for the first video and not for the ones after it.
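To show what I mean by the speed effect, this is the basic single-clip version of what I am going for (a simplified sketch, not the code from the project; the 2.0 multiplier is just an example, and a 2× stretch means half-speed slow motion):

import AVFoundation

/// Sketch: insert one clip into a composition and stretch it to half speed.
func makeSlowMotionComposition(from url: URL) throws -> AVMutableComposition {
    let asset = AVAsset(url: url)
    let composition = AVMutableComposition()

    guard
        let sourceVideo = asset.tracks(withMediaType: .video).first,
        let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
    else { throw NSError(domain: "merge", code: -1) }

    let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
    try videoTrack.insertTimeRange(fullRange, of: sourceVideo, at: .zero)

    // Stretch the inserted range to 2x its duration, so it plays at half speed.
    let newDuration = CMTimeMultiplyByFloat64(fullRange.duration, multiplier: 2.0)
    videoTrack.scaleTimeRange(fullRange, toDuration: newDuration)

    return composition
}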

Here is a test project that reproduces the issue; it contains my current code: https://github.com/meyesyesme/creationMergeProj

This is the merge function I use, with the time-range scaling currently commented out (uncomment it to see that it does not work):

func mergeVideosTestSQ(arrayVideos: [VideoSegment], completion: @escaping (URL?, Error?) -> ()) {

    let mixComposition = AVMutableComposition()

    var instructions: [AVMutableVideoCompositionLayerInstruction] = []
    var insertTime = CMTime(seconds: 0, preferredTimescale: 1)

    /// for each segment, add its video and audio tracks (and their duration) to the composition
    for videoSegment in arrayVideos {

        let sourceAsset = AVAsset(url: videoSegment.videoURL!)

        let frameRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: sourceAsset.duration)

        guard
            let nthVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
            let nthAudioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid),
            let assetVideoTrack = sourceAsset.tracks(withMediaType: .video).first
        else {
            print("could not create composition tracks or find a video track")
            return
        }

        let assetAudioTrack = sourceAsset.tracks(withMediaType: .audio).first

        do {
            try nthVideoTrack.insertTimeRange(frameRange, of: assetVideoTrack, at: insertTime)
            if let assetAudioTrack = assetAudioTrack {
                try nthAudioTrack.insertTimeRange(frameRange, of: assetAudioTrack, at: insertTime)
            }

            // MY CURRENT SPEED ATTEMPT:
            // let newDuration = CMTimeMultiplyByFloat64(frameRange.duration, multiplier: videoSegment.videoSpeed)
            // nthVideoTrack.scaleTimeRange(frameRange, toDuration: newDuration)
            // nthAudioTrack.scaleTimeRange(frameRange, toDuration: newDuration)

            // layer instruction: transform the track and hide it once its segment ends
            let nthInstruction = ViewController.videoCompositionInstruction(nthVideoTrack, asset: sourceAsset)
            nthInstruction.setOpacity(0.0, at: CMTimeAdd(insertTime, sourceAsset.duration))

            instructions.append(nthInstruction)
            insertTime = insertTime + sourceAsset.duration

        } catch {
            print("insertTimeRange failed:", error)
        }
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: insertTime)
    mainInstruction.layerInstructions = instructions

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    mainComposition.renderSize = CGSize(width: 1080, height: 1920)

    let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

    // remove any previous export so a new file can be written
    let fileManager = FileManager()
    try? fileManager.removeItem(at: outputFileURL)

    print("now will export")

    /// start an export session and set the path and file type
    if let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) { // DOES NOT WORK WITH AVAssetExportPresetPassthrough
        exportSession.outputFileType = .mov
        exportSession.outputURL = outputFileURL
        exportSession.videoComposition = mainComposition
        exportSession.shouldOptimizeForNetworkUse = true

        /// export the file and report the result exactly once
        exportSession.exportAsynchronously {
            if let error = exportSession.error {
                completion(nil, error)
            } else {
                completion(exportSession.outputURL, nil)
            }
        }
    }
}

You'll see this behavior: the first video is scaled correctly, but the following videos are not, and their opacity gets set at the wrong times. I have tried different combinations, and this is the closest I have gotten.
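For reference, this is the kind of bookkeeping I mean by "different combinations": expressing the scaled range relative to the insertion cursor (since scaleTimeRange takes a range in the composition track's own timeline) and advancing the cursor by the scaled duration. This is a simplified sketch of my own, not the exact code in the repo, and the speed parameter here is a duration multiplier (2.0 stretches to half speed, 0.5 compresses to double speed):

import AVFoundation

/// Sketch: append one clip's video track to a composition track at `cursor`,
/// scale the inserted range, and return the new cursor time.
func appendScaled(_ asset: AVAsset,
                  to track: AVMutableCompositionTrack,
                  at cursor: CMTime,
                  speed: Float64) throws -> CMTime {
    guard let source = asset.tracks(withMediaType: .video).first else { return cursor }

    let sourceRange = CMTimeRange(start: .zero, duration: asset.duration)
    try track.insertTimeRange(sourceRange, of: source, at: cursor)

    // The clip now occupies [cursor, cursor + asset.duration] in the track's
    // timeline, so the range passed to scaleTimeRange starts at cursor, not zero.
    let insertedRange = CMTimeRange(start: cursor, duration: asset.duration)
    let newDuration = CMTimeMultiplyByFloat64(asset.duration, multiplier: speed)
    track.scaleTimeRange(insertedRange, toDuration: newDuration)

    return CMTimeAdd(cursor, newDuration)
}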

I've been stuck on this for a while!



from Trouble applying scaleTimeRange on multiple videos in an AVMutableComposition video
