
Getting raw frames of a recorded video

  •  2
  •  Ganesh  ·  10 years ago

    I am new to Objective-C and iOS technology. I am recording video through code, and at runtime I have to process each frame as raw data. How can I achieve this? Please, can anyone help me? Thanks in advance. This is my code so far:

    - (void)viewDidLoad
    {
        [super viewDidLoad];
    
        [self setupCaptureSession];
    
    }
    

    The viewDidAppear method:

    -(void)viewDidAppear:(BOOL)animated
    {
        if (!_bpickeropen)
        {
            _bpickeropen = true;
            _picker = [[UIImagePickerController alloc] init];
            _picker.delegate = self;
            NSArray *sourceTypes = [UIImagePickerController availableMediaTypesForSourceType:_picker.sourceType];
            if (![sourceTypes containsObject:(NSString *)kUTTypeMovie])
            {
                NSLog(@"device not supported");
                return;
            }

            _picker.sourceType = UIImagePickerControllerSourceTypeCamera;
            _picker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie, nil]; //,(NSString *) kUTTypeImage
            _picker.videoQuality = UIImagePickerControllerQualityTypeHigh;
            [self presentModalViewController:_picker animated:YES];
        }
    }
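
    Note that when availableMediaTypesForSourceType: runs above, _picker.sourceType is still the picker's default (the photo library), so the movie check does not really ask about the camera. Below is a minimal sketch, not part of the original post, that queries the camera source type explicitly using the documented isSourceTypeAvailable: class method:

    // Sketch: ask about the camera source type explicitly before presenting the picker.
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        NSLog(@"camera not available on this device");
        return;
    }
    NSArray *cameraMediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    if (![cameraMediaTypes containsObject:(NSString *)kUTTypeMovie])
    {
        NSLog(@"video recording not supported on this device");
        return;
    }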
    

    // Delegate routine that is called when a sample buffer was written

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Get the pixel buffer for this frame and lock it so its memory can be read.
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);

        CVPixelBufferLockBaseAddress(cameraFrame, 0);

        GLubyte *rawImageBytes = (GLubyte *)CVPixelBufferGetBaseAddress(cameraFrame);

        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);

        NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];

    PROBLEMS: 1. (Here I am getting the raw bytes only once.) 2. (After that, I want to store those raw bytes as a binary file in the application path; see the sketch after this method.)

        // Do whatever with your bytes
        NSLog(@"bytes per row %zd", bytesPerRow);

        [dataForRawBytes writeToFile:[self datafilepath] atomically:YES];

        NSLog(@"Sample Buffer Data is %@\n", dataForRawBytes);

        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
    }
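
    Regarding problem 2 above: writeToFile:atomically: overwrites the file on every frame, so only the last frame ends up on disk. Below is a minimal sketch of appending each frame's bytes to a single binary file instead; the appendFrameData: name is made up for illustration, and it assumes the datafilepath helper referenced above returns a writable path (for example inside the Documents directory):

    // Sketch: append the bytes of every captured frame to one binary file.
    // dataForRawBytes is the NSData built in the delegate method above.
    - (void)appendFrameData:(NSData *)dataForRawBytes
    {
        NSString *path = [self datafilepath]; // assumed helper returning a writable file path

        NSFileManager *fileManager = [NSFileManager defaultManager];
        if (![fileManager fileExistsAtPath:path])
        {
            // Create an empty file the first time through.
            [fileManager createFileAtPath:path contents:nil attributes:nil];
        }

        NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:path];
        [handle seekToEndOfFile];      // move past the data written for earlier frames
        [handle writeData:dataForRawBytes];
        [handle closeFile];
    }

    Opening and closing the file handle for every frame is slow; a real implementation would keep one handle open for the whole recording, but the calls above are enough to show the idea.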

    Here I am setting the delegate for the output.

    // Create and configure a capture session and start it running
    -(void)setupCaptureSession
    {
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    
    
    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;
    
    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                               defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input)
    {
        // Handling the error appropriately.
    }
    [session addInput:input];
    
    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];
    
    
    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    
    // Specify the pixel format
    output.videoSettings =
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey]; //kCVPixelBufferPixelFormatTypeKey
    
    
    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    // output.minFrameDuration = CMTimeMake(1, 15);
    
    // Start the session running to start the flow of data
    [session startRunning];
    
    // Assign session to an ivar.
    //[self setSession:session];
    

    }
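
    One detail worth fixing in the snippet above: session is a local variable, so under ARC it can be deallocated as soon as setupCaptureSession returns, at which point capture stops. A minimal sketch of keeping it alive, assuming a property named session (the name matches the commented-out setSession: call but is not from the original code):

    // In the class extension or header: keep a strong reference so the
    // capture session is not deallocated when setupCaptureSession returns.
    @property (nonatomic, strong) AVCaptureSession *session;

    // At the end of setupCaptureSession, instead of the commented-out line:
    self.session = session;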

    I appreciate any help. Thanks in advance.

    1 Answer  |  10 years ago
        1
  •  3
  •   Abhinav Singh    10 years ago

    You can look into the AVFoundation framework. It gives you access to the raw data generated by the camera.

    This link is a good beginner-level project for getting started with the AVFoundation camera.

    To get individual frames from the video output, you can use the AVCaptureVideoDataOutput class.

    Hope this helps.

    Edit: You can look at AVCaptureVideoDataOutputSampleBufferDelegate, in particular the captureOutput:didOutputSampleBuffer:fromConnection: method. This is called every time a new frame is captured.

    If you do not know how delegates work, this link is a good example of how delegates are used.
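
    As a minimal sketch of how that delegate wiring fits together (the ViewController class name is just an example, not from the question):

    // ViewController.h (example)
    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    // Declare conformance so the compiler knows this class implements the
    // sample-buffer delegate callbacks.
    @interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
    @end

    // When configuring the capture session (as in the question's setupCaptureSession):
    //   [output setSampleBufferDelegate:self queue:queue];
    // AVFoundation then calls captureOutput:didOutputSampleBuffer:fromConnection:
    // on that queue once for every captured frame.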