Capturing the Camera Stream with AVCaptureSession
Step 1: Initialize the AVCaptureSession and add the input and output
#import <AVFoundation/AVFoundation.h>

// Create and configure a capture session and start it running.
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower-resolution video frames, if your
    // processing algorithm can cope. Here we specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
        return;
    }
    [session addInput:input];

    // Create a video data output and add it to the session.
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];

    // Deliver sample buffers to the delegate on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format.
    output.videoSettings =
        [NSDictionary dictionaryWithObject:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // To cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration. (Note: this property is deprecated since iOS 5; on
    // newer systems set activeVideoMinFrameDuration on the AVCaptureDevice.)
    output.minFrameDuration = CMTimeMake(1, 15);

    // Start the session running to start the flow of data.
    [session startRunning];

    // Assign the session to an ivar.
    [self setSession:session];
}
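The sample holds the session in an ivar but never shows the other half of its lifecycle. A minimal teardown sketch, assuming the `session` property set above and the same manual-retain-release memory management as the rest of the code (`teardownCaptureSession` is a hypothetical name, not part of the original sample):

```objectivec
// Hypothetical teardown counterpart to setupCaptureSession.
// Assumes the `session` ivar/property assigned above.
- (void)teardownCaptureSession
{
    [self.session stopRunning];   // stop the flow of data
    [self setSession:nil];        // the setter releases the session under MRC
}
```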
Step 2: Implement the AVCaptureVideoDataOutputSampleBufferDelegate protocol method
// Delegate routine that is called when a sample buffer is written.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data.
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    < Add your code here that uses the image >
}
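Note that this delegate callback runs on the serial queue passed to `setSampleBufferDelegate:queue:`, not on the main thread. If "your code that uses the image" touches UIKit, hop to the main queue first. A sketch, assuming a hypothetical `imageView` property:

```objectivec
// The capture callback runs on a background queue; UIKit is main-thread
// only, so dispatch the display work back to the main queue.
// (`imageView` is a hypothetical outlet, not part of the original sample.)
dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = image;
});
```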
// Create a UIImage from sample buffer data.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the number of bytes per row for the pixel buffer.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height.
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (!colorSpace) {
        NSLog(@"CGColorSpaceCreateDeviceRGB failure");
        // Unlock before returning so the pixel buffer is not left locked.
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return nil;
    }

    // Get the base address of the pixel buffer.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

    // Create a Quartz direct-access data provider that uses data we supply.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress,
                                                              bufferSize, NULL);

    // Create a bitmap image from the data supplied by the provider:
    // 8 bits per component, 32 bits per pixel, little-endian BGRA to match
    // the kCVPixelFormatType_32BGRA setting above.
    CGImageRef cgImage =
        CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                      kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                      provider, NULL, true, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    // Create and return an image object representing the Quartz image.
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return image;
}
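As an aside, if the frame is destined for Core Image processing rather than UIKit display, a shorter route (available since iOS 5) wraps the pixel buffer directly instead of copying it through Quartz:

```objectivec
// Wrap the pixel buffer in a CIImage without the manual Quartz conversion.
// Useful when the frame feeds a Core Image pipeline instead of a UIImage.
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
```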