
How to properly release an AVCaptureSession

  •  30
  • Codo  · Tech community  · 14 years ago

    I'm using the AVFoundation classes to capture the live video stream from the camera and to process the video samples. That works nicely. However, once I'm done, I have trouble properly releasing the AVFoundation instances (capture session, preview layer, input and output).

    When I no longer need the session and all the associated objects, I stop the capture session and release it. This works most of the time. However, sometimes the app crashes with an EXC_BAD_ACCESS error.

    The Apple documentation mentions the problem: stopping a capture session is an asynchronous operation, i.e. it does not take effect immediately. In particular, the second thread continues to process video samples and to access various instances such as the capture session or the input and output devices.

    So how do I properly release the AVCaptureSession and all the related instances? Is there a notification that reliably tells me the AVCaptureSession has finished?

    Here's my code:

    AVCaptureSession* session;
    AVCaptureVideoPreviewLayer* previewLayer;
    UIView* view;
    

    Setting up the instances:

    AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    session = [[AVCaptureSession alloc] init];
    
    NSError* error = nil;
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: camera error: &error];
    [session addInput: input];
    AVCaptureVideoDataOutput* output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput: output];
    
    dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
    [output setSampleBufferDelegate: self queue: queue];
    dispatch_release(queue);
    
    previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession: session] retain];
    previewLayer.frame = view.bounds;
    [view.layer addSublayer: previewLayer];
    
    [session startRunning];
    

    Cleanup:

    [previewLayer removeFromSuperlayer];
    [previewLayer release];
    [session stopRunning];
    [session release];
    
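    For completeness, the delegate callback that keeps running on that second thread looks roughly like this (a sketch; the actual frame processing is an assumption). If self has already been released when it fires, these accesses crash with EXC_BAD_ACCESS:

    - (void) captureOutput: (AVCaptureOutput*) captureOutput
      didOutputSampleBuffer: (CMSampleBufferRef) sampleBuffer
             fromConnection: (AVCaptureConnection*) connection
    {
        // Runs on the "augm_reality" dispatch queue, i.e. on the second thread.
        CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(frame, 0);
        // ... process the frame; this typically touches ivars of self ...
        CVPixelBufferUnlockBaseAddress(frame, 0);
    }
    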
    7 Answers  |  last activity 4 years ago
        1
  •  18
  •   Community CDub    13 years ago

    This is the best solution I've found so far. The basic idea is to use the dispatch queue's finalizer: once the dispatch queue is deallocated, we can be certain that there is no more activity on the second thread that processes the sample buffers.

    static void capture_cleanup(void* p)
    {
        AugmReality* ar = (AugmReality *)p; // cast to original context instance
        [ar release];  // releases capture session if dealloc is called
    }
    
    ...
    
    dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
    dispatch_set_context(queue, self);
    dispatch_set_finalizer_f(queue, capture_cleanup);
    [output setSampleBufferDelegate: self queue: queue];
    dispatch_release(queue);
    [self retain];
    
    ...
    

    Unfortunately, I now have to stop capturing explicitly. Otherwise releasing my instance won't free it, because the second thread now also increments and decrements the retain count.
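
    For reference, the explicit teardown now looks roughly like this (a sketch assuming the ivars from the question; the method name tearDownCapture is made up). The extra [self retain] from above is balanced later by capture_cleanup:

    - (void) tearDownCapture
    {
        // Stopping explicitly is required: the queue context now also retains
        // self, so relying on dealloc alone would keep everything alive.
        [session stopRunning];
        
        [previewLayer removeFromSuperlayer];
        [previewLayer release];
        previewLayer = nil;
        
        [session release];
        session = nil;
    }
    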

        2
  •  4
  •   Codo    14 years ago

    I posted a very similar question in the Apple Developer Forum and got an answer from an Apple employee. He said it is a known problem:

    This is a problem in iOS 4.0-4.1 that has been fixed; the fix will appear in a future update. For the time being, you can work around it by waiting for a short period (e.g. half a second) after stopping the AVCaptureSession before disposing of the session and data output.

    dispatch_after(
        dispatch_time(0, 500000000),
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), // or main queue, or your own
        ^{
            // Do your work here.
            [session release];
            // etc.
        }
    );
    

    I still prefer the approach with the dispatch queue finalizer, because this code just guesses when the second thread might be finished.

        3
  •  3
  •   kiranpradeep    9 years ago

    According to the current Apple documentation ( 1 ), [AVCaptureSession stopRunning] is a synchronous operation that blocks until the receiver has completely stopped running, so these issues should no longer occur.
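
    If stopRunning really does block until the session has stopped, the straightforward cleanup from the question should be safe as it is (a sketch under that assumption):

    // Assuming -stopRunning is synchronous, no more sample buffers arrive
    // after this call, so releasing the objects right away is safe.
    [session stopRunning];
    
    [previewLayer removeFromSuperlayer];
    [previewLayer release];
    [session release];
    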

        4
  •  2
  •   VLegakis    14 years ago

    Solved! It may have been the sequence of actions when initializing the session. This one works for me:

    NSError *error = nil;
    
    if(session)
        [session release];
    
    // Create the session
    session = [[AVCaptureSession alloc] init];
    
    
    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;
    
    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                               defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // Handling the error appropriately.
    }
    [session addInput:input];
    
    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];
    
    
    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    
    // Specify the pixel format
    output.videoSettings = 
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    
    // If you wish to cap the frame rate to a known value, such as 15 fps, set 
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, 15);
    
    previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    [delegate layerArrived:previewLayer];
    
    NSNotificationCenter *notify =
    [NSNotificationCenter defaultCenter];
    [notify addObserver: self
                selector: @selector(onVideoError:)
                name: AVCaptureSessionRuntimeErrorNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStart:)
                name: AVCaptureSessionDidStartRunningNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStop:)
                name: AVCaptureSessionDidStopRunningNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStop:)
                name: AVCaptureSessionWasInterruptedNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStart:)
                name: AVCaptureSessionInterruptionEndedNotification
                object: session];
    
    // Start the session running to start the flow of data
    [session startRunning];
    

        5
  •  2
  •   Mahyar    13 years ago

    Using the queue finalizers, you can use a dispatch_semaphore for each queue and then continue with your cleanup routine once they have fired.

    #define GCD_TIME(delayInSeconds) dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC)
    
    static void vQueueCleanup(void* context) {
      VideoRecordingViewController *vc = (VideoRecordingViewController*)context;
      if (vc.vSema) dispatch_semaphore_signal(vc.vSema);
    }
    
    static void aQueueCleanup(void* context) {
      VideoRecordingViewController *vc = (VideoRecordingViewController*)context;
      if (vc.aSema) dispatch_semaphore_signal(vc.aSema);
    }
    
    //In your cleanup method:
    vSema = dispatch_semaphore_create(0);
    aSema = dispatch_semaphore_create(0);
    self.avSession = nil;
    if (vSema) dispatch_semaphore_wait(vSema, GCD_TIME(0.5));
    if (aSema) dispatch_semaphore_wait(aSema, GCD_TIME(0.5));
    [self.navigationController popViewControllerAnimated:YES];
    

    Keep in mind that you also have to set the sample buffer delegates of your AVCaptureVideoDataOutput/AVCaptureAudioDataOutput objects to nil; otherwise they never release their associated queues and therefore never call their finalizers when you release your AVCaptureSession.

    [avs removeOutput:vOut];
    [vOut setSampleBufferDelegate:nil queue:NULL];
    
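    For completeness, the finalizer wiring that the two cleanup functions above rely on could be set up roughly like this (a sketch; the vQueue/aQueue and vOut/aOut names are assumptions):

    // Video queue: the output retains it, so we can release our reference
    // right away; vQueueCleanup fires once the last owner lets go of it.
    dispatch_queue_t vQueue = dispatch_queue_create("videoQueue", NULL);
    dispatch_set_context(vQueue, self);
    dispatch_set_finalizer_f(vQueue, vQueueCleanup);
    [vOut setSampleBufferDelegate:self queue:vQueue];
    dispatch_release(vQueue);
    
    // Audio queue, wired up the same way with aQueueCleanup.
    dispatch_queue_t aQueue = dispatch_queue_create("audioQueue", NULL);
    dispatch_set_context(aQueue, self);
    dispatch_set_finalizer_f(aQueue, aQueueCleanup);
    [aOut setSampleBufferDelegate:self queue:aQueue];
    dispatch_release(aQueue);
    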
        6
  •  2
  •   souvickcse    11 years ago
    -(void)deallocSession
    {
        [captureVideoPreviewLayer removeFromSuperlayer];
        for (AVCaptureInput *input1 in session.inputs) {
            [session removeInput:input1];
        }
        
        for (AVCaptureOutput *output1 in session.outputs) {
            [session removeOutput:output1];
        }
        [session stopRunning];
        session = nil;
        outputSettings = nil;
        device = nil;
        input = nil;
        captureVideoPreviewLayer = nil;
        stillImageOutput = nil;
        self.vImagePreview = nil;
    }
    

    I called this function before popping and pushing any other view. It solved my low-memory warning problem.
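
    For example, the call could be hooked in like this before navigating away (using viewWillDisappear: here is just one possible place):

    - (void)viewWillDisappear:(BOOL)animated
    {
        [super viewWillDisappear:animated];
        [self deallocSession];
    }
    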

        7
  •  1
  •   VLegakis    14 years ago

    NSNotificationCenter *notify =
    [NSNotificationCenter defaultCenter];
    [notify addObserver: self
                selector: @selector(onVideoError:)
                name: AVCaptureSessionRuntimeErrorNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStart:)
                name: AVCaptureSessionDidStartRunningNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStop:)
                name: AVCaptureSessionDidStopRunningNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStop:)
                name: AVCaptureSessionWasInterruptedNotification
                object: session];
    [notify addObserver: self
                selector: @selector(onVideoStart:)
                name: AVCaptureSessionInterruptionEndedNotification
                object: session];
    

    AVCaptureInput* input = [session.inputs objectAtIndex:0];
    [session removeInput:input];
    AVCaptureVideoDataOutput* output = (AVCaptureVideoDataOutput*)[session.outputs objectAtIndex:0];
    [session removeOutput:output];  
    

    This works, but please let me know if you see any gotchas. I would prefer to use it asynchronously.

    Thanks