
Getting two consecutive frames from the camera in Processing

  • Malo Maisonneuve  ·  7 years ago

    I've been trying several ways to do this in Processing, but each time the frames I end up with aren't truly consecutive. Does anyone know the "right way" to do this?

    Thanks in advance!

    2 Answers  |  7 years ago
        1
  •   George Profenza    7 years ago

    In theory, this should be a matter of listening for captureEvent() to grab a new frame, keeping track of whether the first frame has already been recorded, and if it has, recording the second frame right after it.

    Here's a basic commented sketch to illustrate the idea (press any key to grab another pair of frames):

    import processing.video.*;
    
    Capture camera;
    
    PImage firstFrame;
    PImage secondFrame;
    
    void setup(){
      size(1920,480);
    
      camera = new Capture(this,640,480);
      camera.start();
    }
    void draw(){
      image(camera,0,0);
      if(firstFrame != null){
        image(firstFrame,640,0);
      }
      if(secondFrame != null){
        image(secondFrame,1280,0);
      }
    }
    //this is the callback from the video library when a new camera frame is available
    void captureEvent(Capture c){
      //read a new frame
      c.read();
      //if the first frame wasn't recorded yet, record(copy) its pixels
      if(firstFrame == null){
        firstFrame = c.get();
      }
      //otherwise, if the first frame was recorded but the second wasn't, record the second:
      //the else means this runs on a later captureEvent, so the two frames are consecutive
      else if(secondFrame == null){
        secondFrame = c.get();
      }
    }
    
    void keyPressed(){
      //reset consecutive frames on keypress
      firstFrame = secondFrame = null;
    }
    

    In theory (as you can see in the Processing Video Library's source code), captureEvent fires only when a new camera sample is ready. In practice, you'll notice that two consecutive frames can look practically identical (even though they are separated in time), or differ only by the noise you pointed out in the comments.
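
    To convince yourself that captureEvent() really does deliver distinct frames, a minimal diagnostic sketch along these lines (my own addition, not part of the original answer, assuming the same 640x480 capture as above) can log the interval between consecutive captureEvent() calls; even frames that look identical arrive at separate times:

    import processing.video.*;
    
    Capture camera;
    int lastFrameTime = -1; //millis() timestamp of the previous captureEvent
    
    void setup(){
      size(640,480);
    
      camera = new Capture(this,640,480);
      camera.start();
    }
    void draw(){
      image(camera,0,0);
    }
    //log the interval between consecutive camera frames:
    //even when two frames look identical, they arrive at distinct times
    void captureEvent(Capture c){
      c.read();
      int now = millis();
      if(lastFrameTime >= 0){
        println("time since previous frame: " + (now - lastFrameTime) + " ms");
      }
      lastFrameTime = now;
    }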

    It sounds like what you're after is a consecutive frame that is noticeably different from the previous one. If that's the case, you can play with the FrameDifferencing example ( Processing > Examples > Libraries > Video > Capture > FrameDifferencing ).

    Here's a tweaked version of the sketch above that uses Golan Levin's frame differencing code to record the second frame only if it differs even slightly from the first:

    import processing.video.*;
    
    Capture camera;
    
    PImage firstFrame;
    PImage secondFrame;
    PImage diff;
    
    void setup(){
      size(1920,960);
    
      camera = new Capture(this,640,480);
      camera.start();
    
      diff = createImage(640,480,RGB);
    }
    void draw(){
      image(camera,0,0);
      if(firstFrame != null){
        image(firstFrame,640,0);
      }
      if(secondFrame != null){
        image(secondFrame,1280,0);
      }
      image(diff,0,480);
    }
    //this is the callback from the video library when a new camera frame is available
    void captureEvent(Capture c){
      //read a new frame
      c.read();
      //if the first frame wasn't recorded yet, record(copy) its pixels
      if(firstFrame == null){
        firstFrame = c.get();
        println("recorded first frame at",new java.util.Date());
      }
      //same for the second frame, but check if the first frame has been recorded first
      //same for the second frame, but check if the first frame has been recorded first
      if(firstFrame != null && secondFrame == null){
        //if the difference between the first frame and the current frame is even ever so slightly above the threshold, record the second frame
        if(difference(firstFrame,camera) > 100){
          secondFrame = c.get();
        }
      }
    
    }
    
    int difference(PImage first,PImage second){
      final int numPixels = 640*480;
      //make sure the pixels[] arrays of both frames (and of the diff image we write into) are available
      first.loadPixels();
      second.loadPixels();
      diff.loadPixels();
      int movementSum = 0; // Amount of movement in the frame
      for (int i = 0; i < numPixels; i++) { // For each pixel in the video frame...
        color currColor = first.pixels[i];
        color prevColor = second.pixels[i];
        // Extract the red, green, and blue components from current pixel
        int currR = (currColor >> 16) & 0xFF; // Like red(), but faster
        int currG = (currColor >> 8) & 0xFF;
        int currB = currColor & 0xFF;
        // Extract red, green, and blue components from previous pixel
        int prevR = (prevColor >> 16) & 0xFF;
        int prevG = (prevColor >> 8) & 0xFF;
        int prevB = prevColor & 0xFF;
        // Compute the difference of the red, green, and blue values
        int diffR = abs(currR - prevR);
        int diffG = abs(currG - prevG);
        int diffB = abs(currB - prevB);
        // Store the per-pixel difference in the diff image (displayed in draw())
        diff.pixels[i] = color(diffR, diffG, diffB);
        // Add these differences to the running tally
        movementSum += diffR + diffG + diffB;
      }
      diff.updatePixels();
      return movementSum;
    }
    
    void keyPressed(){
      //reset consecutive frames on keypress
      firstFrame = secondFrame = null;
    }
    

    In this example, 100 is an arbitrary threshold. The maximum possible value is 255*3*640*480 (0-255 per channel * number of channels * width * height).
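
    To put that threshold in perspective, here is a rough snippet (my own addition, assuming the same 640x480 RGB capture as above) that works out the maximum possible return value of difference() and expresses a threshold relative to it:

    //maximum possible output of difference(): every pixel changes fully on all three channels
    int maxDifference = 255 * 3 * 640 * 480; // = 235,008,000
    
    //the 100 used above is a tiny fraction of that maximum
    float thresholdAsPercent = 100 / (float)maxDifference * 100;
    println("threshold is", thresholdAsPercent, "% of the maximum difference");
    
    //a threshold could also be expressed as a fraction of the maximum, e.g. requiring
    //roughly 1% of the total possible change before recording the second frame
    int onePercentThreshold = (int)(maxDifference * 0.01f);
    println("a 1% threshold would be", onePercentThreshold);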

        2
  •   Alex Cohn    7 years ago

    I looked at the KetaiCamera sources and issue reports , especially this one . Unfortunately, this code is not built to deliver live camera frames.

    You could try the AndroidCapture project as a starting point, and modify its native Android class to achieve your goal.