
What is the best way to record a video with augmented reality?

  •  4
  •  user924  ·  6 years ago

    What is the best way to record a video with augmented reality? (adding text, images, and logos to frames from the iPhone/iPad camera)

    Previously, I figured out how to draw text into a CIImage ( How to draw text into CIImage? ) and how to convert a CIImage back to a CMSampleBuffer ( CIImage back to CMSampleBuffer ).

    I got almost everything working, except for appending the new CMSampleBuffer to an AVAssetWriterInput.

    But this solution is not good anyway: it eats a lot of CPU while converting a CIImage to a CVPixelBuffer ( ciContext.render(ciImage!, to: aBuffer) ).

    So I want to stop here and find some other way to record a video with augmented reality (for example, dynamically adding (drawing) text onto frames while encoding the video into an mp4 file).

    Here is what I have already tried and don't want to use anymore...

    // convert original CMSampleBuffer to CIImage, 
    // combine multiple `CIImage`s into one (adding augmented reality -  
    // text or some additional images)
    let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let ciimage : CIImage = CIImage(cvPixelBuffer: pixelBuffer)
    var outputImage: CIImage?
    let images : Array<CIImage> = [ciimage, ciimageSec!] // add all your CIImages that you'd like to combine
    for image in images {
        outputImage = outputImage == nil ? image : image.composited(over: outputImage!)
    }
    
    // allocate this class variable once         
    if pixelBufferNew == nil {
        CVPixelBufferCreate(kCFAllocatorSystemDefault, CVPixelBufferGetWidth(pixelBuffer),  CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, nil, &pixelBufferNew)
    }
    
    // convert CIImage to CVPixelBuffer
    // (note: create the CIContext once and reuse it; allocating it per frame makes things even slower)
    let ciContext = CIContext(options: nil)
    if let aBuffer = pixelBufferNew {
        ciContext.render(outputImage!, to: aBuffer) // >>> IT EATS A LOT OF CPU <<<
    }
    
    // convert new CVPixelBuffer to new CMSampleBuffer
    var sampleTime = CMSampleTimingInfo()
    sampleTime.duration = CMSampleBufferGetDuration(sampleBuffer)
    sampleTime.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    sampleTime.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
    var videoInfo: CMVideoFormatDescription? = nil
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, &videoInfo)
    var oBuf: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, true, nil, nil, videoInfo!, &sampleTime, &oBuf)
    
    /*
    try to append the new CMSampleBuffer into a file (.mp4) using
    AVAssetWriter & AVAssetWriterInput... (I ran into errors with it; the original buffer works ok
    - from func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection))
    */
    

    Is there a better solution?

    1 Answer  |  6 years ago

  •  0
  •   user924  ·  6 years ago

    Now I'll answer my own question.

    The best way is to use an Objective-C++ class ( .mm ), where we can use OpenCV to easily and quickly convert from a CMSampleBuffer to a cv::Mat and back to a CMSampleBuffer after processing.

    We can easily call Objective-C++ functions from Swift.
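
    As a rough sketch of this approach (assuming the camera delivers kCVPixelFormatType_32BGRA frames; the function name and text position are illustrative, not from the original post), the .mm file could wrap the frame's pixel buffer in a cv::Mat without copying and draw directly into it:

    // ARFrameProcessor.mm — hypothetical helper, exposed to Swift via a bridging header
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #import <opencv2/opencv.hpp>

    // Draw text in place on a BGRA sample buffer coming from captureOutput(...)
    void DrawTextOnSampleBuffer(CMSampleBufferRef sampleBuffer, const char *text) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (pixelBuffer == NULL) { return; }
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        void *base        = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t width      = CVPixelBufferGetWidth(pixelBuffer);
        size_t height     = CVPixelBufferGetHeight(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

        // cv::Mat header over the existing buffer memory — no pixel copy, so
        // drawing modifies the frame and the ORIGINAL CMSampleBuffer can be
        // appended to AVAssetWriterInput unchanged (no CIImage round-trip).
        cv::Mat frame((int)height, (int)width, CV_8UC4, base, bytesPerRow);
        cv::putText(frame, text, cv::Point(40, 80),
                    cv::FONT_HERSHEY_SIMPLEX, 1.5,
                    cv::Scalar(255, 255, 255, 255), 2);

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }

    From Swift you would call DrawTextOnSampleBuffer(sampleBuffer, "Hello AR") on each frame inside captureOutput before appending the buffer to the AVAssetWriterInput; since the drawing happens in the buffer's own memory, the expensive ciContext.render step goes away entirely.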