Getting YUV420SP Data from the Camera on iOS


Two frameworks need to be imported:

#import <AVFoundation/AVFoundation.h>

#import <AssetsLibrary/AssetsLibrary.h>


The containing class must conform to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
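For context, here is a minimal sketch of what the declaration might look like. The class name CameraViewController and its UIViewController superclass are assumptions for illustration; captureSession and _captureInput match the names used in the snippets below:

// Hypothetical class name, shown only to illustrate the protocol conformance
@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic, strong) AVCaptureSession *captureSession;

@end

@implementation CameraViewController {
    AVCaptureDeviceInput *_captureInput;
}

// ... the methods shown below go here ...

@end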


Setting up the session:

- (void)setSession {
    // Use the front camera as the capture input
    _captureInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self getFrontCameraDevice] error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    // Deliver sample buffers on a dedicated serial queue
    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];

    // Request YUV420SP (bi-planar, video range) frames
    NSString *key = (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:_captureInput];
    [self.captureSession addOutput:captureOutput];
    [self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];
}

NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];

This sets the pixel format of the frames the camera delivers to YUV420SP (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, i.e. NV12: a Y plane followed by an interleaved UV plane).
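If you want to be defensive about it, here is a small sketch (reusing the captureOutput variable from the setSession code above) of checking that this format is actually offered before forcing it; availableVideoCVPixelFormatTypes lists the pixel formats the output supports:

    // Check that the bi-planar video-range format is available on this output
    NSNumber *desired = @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange);
    if ([captureOutput.availableVideoCVPixelFormatTypes containsObject:desired]) {
        [captureOutput setVideoSettings:@{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : desired}];
    } else {
        NSLog(@"420YpCbCr8BiPlanarVideoRange is not available on this output");
    }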

[self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];

This sets the capture resolution (here 640×480).
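A hedged sketch of how the session might be started from the caller's side; canSetSessionPreset: and startRunning are standard AVCaptureSession calls, while invoking this from viewDidLoad is only an assumption about the surrounding view controller:

    // e.g. in viewDidLoad of the hosting view controller (assumption)
    [self setSession];

    // Fall back gracefully if 640x480 is not supported on this device
    if (![self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        [self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
    }

    // Frames are not delivered to the delegate until the session runs
    [self.captureSession startRunning];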

/**
 *  Get the front-facing camera
 *
 *  @return the camera device
 */
- (AVCaptureDevice *)getFrontCameraDevice {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position] == AVCaptureDevicePositionFront) {
            return camera;
        }
    }
    return nil;
}
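Note that devicesWithMediaType: is deprecated as of iOS 10. For projects targeting newer systems, here is a sketch of the equivalent lookup via AVCaptureDeviceDiscoverySession (the method name frontCameraDeviceModern is made up for illustration):

- (AVCaptureDevice *)frontCameraDeviceModern {
    // Available on iOS 10+; returns nil if no front camera is found
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                mediaType:AVMediaTypeVideo
                                                                 position:AVCaptureDevicePositionFront];
    return discovery.devices.firstObject;
}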

Handling frames in the AVCaptureVideoDataOutputSampleBufferDelegate callback:


#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTime duration = CMSampleBufferGetDuration(sampleBuffer);

    // Base address of plane 0 (the Y plane)
    Byte *imageAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);

    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Repack the bi-planar YUV420SP frame into planar YUV420P
    Byte *buf = malloc(width * height * 3 / 2);
    memcpy(buf, imageAddress, width * height);            // copy the Y plane

    size_t a = width * height;                             // write offset of the U plane
    size_t b = width * height * 5 / 4;                     // write offset of the V plane
    for (NSInteger i = 0; i < width * height / 2; i++) {
        memcpy(buf + a, imageAddress + width * height + i, 1);  // U sample
        a++;
        i++;
        memcpy(buf + b, imageAddress + width * height + i, 1);  // V sample
        b++;
    }

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    free(buf);   // hand buf off (or free it) once the YUV420P data has been consumed
}


Rearranging the YUV420SP data into YUV420P: in YUV420SP (NV12) the Y plane is followed by a single interleaved UVUVUV... plane, while in YUV420P (I420) the U samples and V samples each occupy their own plane, so the loop below de-interleaves them.


    Byte *buf = malloc(width * height * 3 / 2);
    memcpy(buf, imageAddress, width * height);            // Y plane copied as-is

    size_t a = width * height;                             // where the U plane starts in buf
    size_t b = width * height * 5 / 4;                     // where the V plane starts in buf
    for (NSInteger i = 0; i < width * height / 2; i++) {
        memcpy(buf + a, imageAddress + width * height + i, 1);  // even index: U sample
        a++;
        i++;
        memcpy(buf + b, imageAddress + width * height + i, 1);  // odd index: V sample
        b++;
    }

The data in buf is now laid out as YUV420P and can be handed to OpenGL ES for display.
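One caveat: the snippet above assumes the interleaved UV plane starts exactly at imageAddress + width * height and that neither plane has row padding, which CoreVideo does not guarantee. Below is a minimal, hedged sketch of the same NV12-to-I420 repacking that instead asks the CVPixelBuffer for plane 1's base address and respects each plane's bytes-per-row; the function name copyI420FromPixelBuffer is made up for illustration.

// Sketch: copy an NV12 (420YpCbCr8BiPlanar*) pixel buffer into a tightly
// packed I420 buffer, honoring per-plane base addresses and row strides.
static Byte *copyI420FromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    uint8_t *yBase  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    uint8_t *uvBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    size_t yStride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t uvStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

    Byte *buf  = malloc(width * height * 3 / 2);
    Byte *dstY = buf;
    Byte *dstU = buf + width * height;
    Byte *dstV = buf + width * height * 5 / 4;

    // Y plane: copy row by row to drop any stride padding
    for (size_t row = 0; row < height; row++) {
        memcpy(dstY + row * width, yBase + row * yStride, width);
    }

    // UV plane: de-interleave UVUV... into separate U and V planes
    for (size_t row = 0; row < height / 2; row++) {
        uint8_t *uvRow = uvBase + row * uvStride;
        for (size_t col = 0; col < width / 2; col++) {
            *dstU++ = uvRow[2 * col];       // U sample
            *dstV++ = uvRow[2 * col + 1];   // V sample
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return buf;  // caller frees
}

For a 640×480 frame this produces a 307,200-byte Y plane followed by two 76,800-byte chroma planes, 460,800 bytes in total.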




