I'm using WebRTC to create a peer connection for sharing the screen and audio. I capture the screen with ReplayKit, which produces CMSampleBufferRef samples that I convert to RTCVideoFrame objects. To get the sample buffers I'm using:
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    ...
} completionHandler:...];
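For context, the complete call looks roughly like this (a sketch: it assumes only video buffers are forwarded, and didCaptureSampleBuffer: is the method shown at the end of this post):

[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError * _Nullable error) {
    // Forward only video buffers to the WebRTC pipeline; audio goes down a separate path.
    if (error == nil && bufferType == RPSampleBufferTypeVideo) {
        [self didCaptureSampleBuffer:sampleBuffer];
    }
} completionHandler:^(NSError * _Nullable error) {
    if (error != nil) {
        NSLog(@"startCapture failed: %@", error);
    }
}];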
The problem appears after I send the app to the background and bring it back a few times: at that point ReplayKit stops calling its capture handler.
This only happens when I hand the CMSampleBufferRef frames over to WebRTC, so the ReplayKit problem is clearly related to WebRTC. If I remove this single line from my code, the problem doesn't occur (though obviously WebRTC no longer works):
[self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];
The only way I've found to make it work again is to restart the device; even killing the app and relaunching doesn't help.
This is how I create the RTCVideoTrack:
- (RTCVideoTrack *)createLocalVideoTrack {
    self->source = [_factory videoSource];
    self->capturer = [[RTCVideoCapturer alloc] initWithDelegate:self->source];
    // Constrain the outgoing format before frames start flowing.
    [self->source adaptOutputFormatToWidth:441 height:736 fps:15];
    return [_factory videoTrackWithSource:self->source trackId:@"ARDAMSv0"];
}
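The track is then attached to the peer connection in the usual way (a minimal sketch; peerConnection and the "ARDAMS" stream id are placeholders for my actual setup):

RTCVideoTrack *videoTrack = [self createLocalVideoTrack];
// Add the local video track to the connection so it gets negotiated and sent.
[peerConnection addTrack:videoTrack streamIds:@[ @"ARDAMS" ]];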
And here is how I convert the CMSampleBufferRef to an RTCVideoFrame and send it to WebRTC:
- (void)didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Drop anything that isn't a single, valid, ready video sample.
    if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 ||
        !CMSampleBufferIsValid(sampleBuffer) ||
        !CMSampleBufferDataIsReady(sampleBuffer)) {
        return;
    }

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    // Wrap the pixel buffer and convert the presentation timestamp to nanoseconds,
    // which is what RTCVideoFrame expects.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs =
        CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC;
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                             rotation:RTCVideoRotation_0
                                                          timeStampNs:timeStampNs];

    // Feed the frame to the video source; this is the line that triggers the issue.
    [self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];
}
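(A side note on the timestamp: RTCVideoFrame wants nanoseconds, which is why the CMTimeGetSeconds value is multiplied by NSEC_PER_SEC. An equivalent integer-only formulation, sketched here just for clarity and not something I believe is related to the problem, would be:)

CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
// Rescale the presentation time to a nanosecond timescale without a double round-trip.
int64_t timeStampNs = CMTimeConvertScale(pts, (int32_t)NSEC_PER_SEC, kCMTimeRoundingMethod_Default).value;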