I've implemented screen recording with RPScreenRecorder, which records the screen as well as mic audio. After a few replays, I stitch the resulting videos together using AVMutableComposition.
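The stitching step itself works fine, but for context it looks roughly like this. This is a minimal sketch rather than my exact code: the mergeReplaysAtURLs:toURL: method name, the export preset, and the lack of audio-track handling are all simplifications.
#import <AVFoundation/AVFoundation.h>

// Rough sketch of the stitching step: append each replay's video track onto
// one AVMutableComposition and export the result. Error handling and the
// audio tracks are omitted here for brevity.
- (void)mergeReplaysAtURLs:(NSArray<NSURL *> *)replayURLs toURL:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (NSURL *url in replayURLs) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (sourceTrack == nil) { continue; }
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:sourceTrack
                             atTime:cursor
                              error:nil];
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"merge finished with status %ld", (long)exportSession.status);
    }];
}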
For screen recording and getting the video and audio sample buffers, I am using
- (void)startCaptureWithHandler:(nullable void(^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error))captureHandler completionHandler:(nullable void(^)(NSError * _Nullable error))completionHandler;
Most of the time it works great and I receive both video and audio samples. But sometimes it only sends me audio buffers and no video buffers at all, and once I hit this problem it doesn't go away until I restart the device and reinstall the app. This makes my app very unreliable for the user. I think this is a ReplayKit issue, but I haven't been able to find other developers reporting it. Let me know if any of you have come across this issue and found a solution.
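To rule out my own handler dropping frames, I count which buffer types arrive; when the bug hits, the video counter simply never moves while audio keeps coming. This is just a rough diagnostic sketch: the LogSampleBuffer helper and the counters are illustrative, not part of my recording code.
#import <ReplayKit/ReplayKit.h>

static int64_t videoSampleCount = 0;
static int64_t audioSampleCount = 0;

// Called from the capture handler purely for diagnostics: when the bug
// occurs, videoSampleCount stays at 0 while audioSampleCount keeps growing.
void LogSampleBuffer(RPSampleBufferType bufferType)
{
    switch (bufferType) {
        case RPSampleBufferTypeVideo:
            videoSampleCount++;
            break;
        case RPSampleBufferTypeAudioApp:
        case RPSampleBufferTypeAudioMic:
            audioSampleCount++;
            break;
    }
    if ((videoSampleCount + audioSampleCount) % 300 == 0) {
        NSLog(@"ReplayKit samples so far - video: %lld, audio: %lld",
              videoSampleCount, audioSampleCount);
    }
}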
I have checked the configuration multiple times and haven't found any issue with it, but here it is anyway.
NSError *videoWriterError;
videoWriter = [[AVAssetWriter alloc] initWithURL:fileString fileType:AVFileTypeQuickTimeMovie
error:&videoWriterError];
if (videoWriterError != nil)
{
printf("+++ error while initializing videoWriter: %s",[videoWriterError.localizedDescription UTF8String]);
}
NSError *audioWriterError;
audioWriter = [[AVAssetWriter alloc] initWithURL:audioFileString fileType:AVFileTypeAppleM4A
error:&audioWriterError];
if (audioWriterError != nil)
{
printf("+++ error while initializing audioWriter: %s",[audioWriterError.localizedDescription UTF8String]);
}
CGFloat width = UIScreen.mainScreen.bounds.size.width;
NSString *widthString = [NSString stringWithFormat:@"%f", width];
CGFloat height = UIScreen.mainScreen.bounds.size.height;
NSString *heightString = [NSString stringWithFormat:@"%f", height];
NSDictionary *videoOutputSettings = @{AVVideoCodecKey : AVVideoCodecTypeH264,
                                      AVVideoWidthKey : widthString,
                                      AVVideoHeightKey : heightString};
videoInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoOutputSettings];
videoInput.expectsMediaDataInRealTime = true;
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary * audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
[ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil ];
audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
[audioInput setExpectsMediaDataInRealTime:YES];
[videoWriter addInput:videoInput];
[audioWriter addInput:audioInput];
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:nil];
[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {
    // handler body omitted: appends the buffers to the writer inputs
} completionHandler:^(NSError * _Nullable error) {
    // completion handler omitted: logs any start-capture error
}];
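For completeness, inside the handler block the buffers are routed to the two writers roughly as below. This is a simplified sketch; the HandleSampleBuffer function is just an illustration of the routing, and the real code also deals with RPSampleBufferTypeAudioApp and with finishing the writers.
#import <AVFoundation/AVFoundation.h>
#import <ReplayKit/ReplayKit.h>

// Simplified version of what the capture handler does with each buffer:
// start the matching writer on its first sample, then append.
static void HandleSampleBuffer(CMSampleBufferRef sampleBuffer,
                               RPSampleBufferType bufferType,
                               AVAssetWriter *videoWriter,
                               AVAssetWriterInput *videoInput,
                               AVAssetWriter *audioWriter,
                               AVAssetWriterInput *audioInput)
{
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    switch (bufferType) {
        case RPSampleBufferTypeVideo:
            if (videoWriter.status == AVAssetWriterStatusUnknown) {
                [videoWriter startWriting];
                [videoWriter startSessionAtSourceTime:pts];
            }
            if (videoWriter.status == AVAssetWriterStatusWriting &&
                videoInput.isReadyForMoreMediaData) {
                [videoInput appendSampleBuffer:sampleBuffer];
            }
            break;

        case RPSampleBufferTypeAudioMic:
            if (audioWriter.status == AVAssetWriterStatusUnknown) {
                [audioWriter startWriting];
                [audioWriter startSessionAtSourceTime:pts];
            }
            if (audioWriter.status == AVAssetWriterStatusWriting &&
                audioInput.isReadyForMoreMediaData) {
                [audioInput appendSampleBuffer:sampleBuffer];
            }
            break;

        default:
            break;
    }
}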