Extracting frames from the stream
Thanks for a great framework.
I've followed the samples and set up some nice pipelines of filters on a live video stream.
Is there a way to capture the individual frames for further processing? I want to analyse each frame after it has been processed; previously I was using AVCaptureSession, which called didOutputSampleBuffer for each frame. Would I implement a filter, or is there a better way?
Thanks
I started off inheriting from GPUImageFilter but GPUImageRawDataOutput makes a lot more sense.
Am I right in thinking that I should create a new class that inherits from GPUImageRawDataOutput and override newFrameReadyAtTime:atIndex: as below, so that whenever it is called I have a new frame to run some processing on (some OpenCV processing in my case)?
Thanks
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
    NSLog(@"newFrameReady");

    // Let the superclass render the new frame into the raw data output first.
    [super newFrameReadyAtTime:frameTime atIndex:textureIndex];

    // my code here (e.g. read -rawBytesForImage and hand it to OpenCV)
}
There's no need to subclass. Simply set the newFrameAvailableBlock to a block that you'd like to run on every new processed frame.
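Something along these lines should do it, assuming a raw data output that is already a target of the last filter in your chain; the variable names are just placeholders, and the accessors used are the standard ones on GPUImageRawDataOutput (rawBytesForImage, bytesPerRowInOutput):

#import "GPUImage.h"

// Sketch: run per-frame processing on every processed frame, no subclass needed.
// Assumes `rawDataOutput` is a GPUImageRawDataOutput that is already a target
// of the last filter in the chain.
__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
[rawDataOutput setNewFrameAvailableBlock:^{
    GPUImageRawDataOutput *output = weakOutput;
    if (output == nil)
    {
        return;
    }

    // Raw BGRA bytes of the latest processed frame.
    GLubyte *bytes = [output rawBytesForImage];
    NSUInteger bytesPerRow = [output bytesPerRowInOutput];

    // Hand the buffer to OpenCV (or any other analysis) here.
    NSLog(@"Processed frame available at %p, %lu bytes per row", bytes, (unsigned long)bytesPerRow);
}];

The __weak reference just avoids a retain cycle, since the output retains its block.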
Aha! That's fantastic, it works a treat.
Thank you
Marcus
How do you want to extract them? There are several ways to do this.
For grabbing UIImages, you can use the -imageFromCurrentlyProcessedOutput method on any filter. However, this is fairly slow because of the need to convert to a UIImage.
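If that slower route is fine for your use case, it is only a line or two; here sepiaFilter is just a placeholder for whatever filter ends your chain (and if the method isn't present in your copy of the framework, check the GPUImageOutput header for the current name):

// Sketch: pull the latest processed frame out as a UIImage (CPU round trip, so slow).
// `sepiaFilter` is a placeholder for any filter already receiving frames.
UIImage *processedImage = [sepiaFilter imageFromCurrentlyProcessedOutput];
// processedImage can now be analysed, saved, or displayed like any other UIImage.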
There's also the GPUImageRawDataOutput class, which has callbacks that let you access the raw bytes at that point in the filter chain.
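For completeness, a rough sketch of how that raw-data route slots into a live camera chain; the preset, the sepia filter, and the 640x480 size are just example choices, and the size passed to the raw data output should match your camera preset:

#import "GPUImage.h"

// Sketch: camera -> filter -> raw data output, so the processed bytes
// end up on the CPU for further analysis.
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];

// The size here is an example and should match the session preset above.
GPUImageRawDataOutput *rawDataOutput =
    [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(640.0, 480.0)
                                 resultsInBGRAFormat:YES];

[videoCamera addTarget:filter];
[filter addTarget:rawDataOutput];

// Set rawDataOutput.newFrameAvailableBlock (as in the earlier sketch)
// to get a callback for every processed frame.
[videoCamera startCameraCapture];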