Getting a Real-Time Video Stream from the Camera on iOS (Work in Progress)
I recommend first reading this blog post: http://www.cnblogs.com/kenshincui/p/4186022.html — it gives a good overview of recording and playing back video and audio.
First, the demo below converts the camera's video stream into images (JPEG):
//
//  ViewController.m
//  Real-Time Video Demo (实时视频Demo)
//
//  Created by 程磊 on 15/4/11.
//  Copyright (c) 2015 nightGroup. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>
// Keep a strong reference to the session so ARC does not release it
// (and stop capture) as soon as setupCaptureSession returns.
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupCaptureSession];
}

- (void)setupCaptureSession {
    NSError *error = nil;

    // Create the session; it coordinates the flow of data from inputs to outputs.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device; the preset determines the capture resolution.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice. This returns the back camera by default;
    // see the sketch after the code for switching to the front camera.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Failed to create device input: %@", error);
        return;
    }
    [session addInput:input];

    // Create a video data output and add it to the session.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Deliver sample buffers to the delegate on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format. On iOS, AVCaptureVideoDataOutput only supports
    // kCVPixelBufferPixelFormatTypeKey here; the frame size comes from the
    // session preset, not from width/height keys.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    // Camera preview layer.
    AVCaptureVideoPreviewLayer *preLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preLayer.frame = CGRectMake(0, 0, 320, 240);
    preLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:preLayer];

    // If you wish to cap the frame rate to a known value, such as 15 fps,
    // configure the device. (The old output.minFrameDuration property is
    // deprecated; activeVideoMinFrameDuration requires iOS 7 or later.)
    if ([device lockForConfiguration:&error]) {
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }

    // Start the session running to start the flow of data.
    [session startRunning];
    // Assign the session to the property declared above.
    self.session = session;
}

// Delegate routine that is called when a sample buffer was written.
// Note: this runs on the queue passed to setSampleBufferDelegate:queue:,
// not on the main thread.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Create a UIImage from the sample buffer data.
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // ...process the image here, e.g. convert it to JPEG (see the sketch below).
    (void)image;
}

// Create a UIImage from sample buffer data.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address and the number of bytes per row of the pixel buffer.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height.
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    // (the layout matches the 32BGRA pixel format requested above).
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context.
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer.
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space.
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image.
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image.
    CGImageRelease(quartzImage);

    return image;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
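The intro says the demo turns the video stream into JPEG images, but the delegate above only produces a UIImage. A minimal sketch of the missing step, assuming the goal is NSData suitable for saving or network transfer (the 0.8 compression quality is an arbitrary choice, not from the original post):

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
// Convert to JPEG data; 0.8 is an assumed compression quality (range 0.0-1.0).
NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
// jpegData can now be written to disk or sent over the network.
// Remember this runs on the capture queue; dispatch to the main queue
// before touching any UIKit views.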
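As the comment in setupCaptureSession notes, defaultDeviceWithMediaType: returns the back camera. One possible way to pick the front camera instead, sketched against the iOS 8-era API this post was written for (devicesWithMediaType: was deprecated in later iOS versions in favor of AVCaptureDeviceDiscoverySession):

// Replace the defaultDeviceWithMediaType: line with something like this
// to use the front camera instead of the back camera.
AVCaptureDevice *device = nil;
for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (d.position == AVCaptureDevicePositionFront) {
        device = d;
        break;
    }
}
// Fall back to the default (back) camera if no front camera was found.
if (!device) {
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}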