AVFoundation Programming Guide - Editing
At the heart of AVFoundation's editing API are compositions. A composition is simply a collection of tracks from one or more media assets. The AVMutableComposition class provides an interface for inserting and removing tracks, as well as managing their temporal orderings; Figure 3-1 shows how a new composition is assembled from existing assets.
Figure 3-1 AVMutableComposition assembles assets together
Using the AVMutableAudioMix class, you can perform custom audio processing on the audio tracks in your composition, as shown in Figure 3-2. For example, you can specify a maximum volume or set a volume ramp for an audio track.
Figure 3-2 AVMutableAudioMix performs audio mixing
For editing, you can use the AVMutableVideoComposition class to work directly with the video tracks in your composition, as shown in Figure 3-3. With a single video composition you can specify the desired render size, scale, and frame duration for the output video. Through a video composition's instructions (provided by the AVMutableVideoCompositionInstruction class), you can modify the background color of your video and apply layer instructions. The layer instructions (provided by the AVMutableVideoCompositionLayerInstruction class) can apply transforms, transform ramps, opacity, and opacity ramps to the video tracks in your composition. The video composition class also lets you introduce Core Animation effects into your video through its animationTool property.
Figure 3-3 AVMutableVideoComposition
To combine your composition with an audio mix and a video composition, you use an AVAssetExportSession object, as shown in Figure 3-4. You initialize the export session with your composition and then simply assign the audio mix and the video composition to its audioMix and videoComposition properties, respectively.
Figure 3-4 Use AVAssetExportSession to combine media elements into an output file
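The wiring just described can be sketched as follows. This assumes mutableComposition, mutableAudioMix, and mutableVideoComposition have been built as in the examples that follow, and that outputURL is a placeholder file URL of your choosing:

```objc
// Initialize the export session with the composition.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
    initWithAsset:mutableComposition
       presetName:AVAssetExportPresetHighestQuality];

// Hand the custom audio and video processing to the session.
exportSession.audioMix = mutableAudioMix;
exportSession.videoComposition = mutableVideoComposition;

// outputURL is a hypothetical destination for the exported movie file.
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Inspect exportSession.status (and exportSession.error) here.
}];
```

The full export example at the end of this chapter fills in the output URL handling and completion logic.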
You create your own composition using the AVMutableComposition class. To add media data to the composition, you must create one or more composition tracks using the AVMutableCompositionTrack class. The simplest case is creating a mutable composition with one video track and one audio track:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
// Create the video composition track.
AVMutableCompositionTrack *mutableCompositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// Create the audio composition track.
AVMutableCompositionTrack *mutableCompositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
When adding a new track to a composition, you must provide both a media type and a track ID. Although audio and video are the most common media types, you can specify others as well, such as AVMediaTypeText. If you specify kCMPersistentTrackID_Invalid as the preferred track ID, a unique identifier is automatically generated and associated with the track.

To add media data to a composition track, you need access to the AVAsset object in which the media data is located. You can use the mutable composition track interface to place multiple tracks of the same media type on the same composition track, one after another. The following example shows how to add two different video asset tracks in sequence to the same composition track:

// You can retrieve AVAssets from a number of places, like the camera roll for example.
AVAsset *videoAsset = <#AVAsset with at least one video track#>;
AVAsset *anotherVideoAsset = <#another AVAsset with at least one video track#>;
// Get the first video track from each asset.
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *anotherVideoAssetTrack = [[anotherVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// Add them both to the composition.
[mutableCompositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
[mutableCompositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, anotherVideoAssetTrack.timeRange.duration) ofTrack:anotherVideoAssetTrack atTime:videoAssetTrack.timeRange.duration error:nil];
Where possible, you should have only one composition track for each media type; unifying compatible asset tracks keeps resource usage to a minimum. You can query a mutable composition to find out whether any of its existing composition tracks are compatible with the asset track you want to insert:

AVMutableCompositionTrack *compatibleCompositionTrack = [mutableComposition mutableTrackCompatibleWithTrack:<#the AVAssetTrack you want to insert#>];
if (compatibleCompositionTrack) {
    // Implementation continues.
}
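The fallback implied by the snippet above can be sketched roughly as follows. Here assetTrack is a hypothetical AVAssetTrack you want to insert, and the error handling is deliberately minimal:

```objc
// Look for an existing composition track that can accept this asset track.
AVMutableCompositionTrack *targetTrack =
    [mutableComposition mutableTrackCompatibleWithTrack:assetTrack];
if (!targetTrack) {
    // No compatible track exists, so create a new one of the same media type.
    targetTrack = [mutableComposition addMutableTrackWithMediaType:assetTrack.mediaType
                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
}
// Append the asset track's full time range at the end of the composition track.
NSError *error = nil;
[targetTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                     ofTrack:assetTrack
                      atTime:targetTrack.timeRange.duration
                       error:&error];
```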
A single AVMutableAudioMix object can perform custom audio processing on each of the audio tracks in your composition individually. You use instances of the AVMutableAudioMixInputParameters class to associate the mix with specific tracks in your composition. An audio mix can be used to vary the volume of an audio track; the following example shows how to set a volume ramp on a specific audio track so that the audio slowly fades out over the duration of the composition:

AVMutableAudioMix *mutableAudioMix = [AVMutableAudioMix audioMix];
// Create the audio mix input parameters object.
AVMutableAudioMixInputParameters *mixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:mutableCompositionAudioTrack];
// Set the volume ramp to slowly fade the audio out over the duration of the composition.
[mixParameters setVolumeRampFromStartVolume:1.f toEndVolume:0.f timeRange:CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)];
// Attach the input parameters to the audio mix.
mutableAudioMix.inputParameters = @[mixParameters];
Every video composition must have an array of instructions containing at least one video composition instruction, which you create with the AVMutableVideoCompositionInstruction class. Using video composition instructions, you can modify the composition's background color and apply layer instructions. The following example creates an instruction that sets the background color to red for the entire composition:

AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration);
mutableVideoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
An AVMutableVideoCompositionLayerInstruction object can apply transforms, transform ramps, opacity, and opacity ramps to a video track in the composition. The order of the layer instructions in a video composition instruction's layerInstructions array determines how video frames from the source tracks are layered and composed for the duration of that instruction. The following code fragment shows how to set an opacity ramp that slowly fades out the first of two videos before the second one begins:

AVAssetTrack *firstVideoAssetTrack = <#AVAssetTrack representing the first video segment played in the composition#>;
AVAssetTrack *secondVideoAssetTrack = <#AVAssetTrack representing the second video segment played in the composition#>;
// Create the first video composition instruction.
AVMutableVideoCompositionInstruction *firstVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set its time range to span the duration of the first video track.
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration);
// Create the layer instruction and associate it with the composition video track.
AVMutableVideoCompositionLayerInstruction *firstVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack];
// Create the opacity ramp to fade out the first video track over its entire duration.
[firstVideoLayerInstruction setOpacityRampFromStartOpacity:1.f toEndOpacity:0.f timeRange:CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration)];
// Create the second video composition instruction so that the second video track isn't transparent.
AVMutableVideoCompositionInstruction *secondVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set its time range to start where the first video track ends and span the duration of the second.
secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration);
// Create the second layer instruction and associate it with the composition video track.
AVMutableVideoCompositionLayerInstruction *secondVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack];
// Attach the first layer instruction to the first video composition instruction.
firstVideoCompositionInstruction.layerInstructions = @[firstVideoLayerInstruction];
// Attach the second layer instruction to the second video composition instruction.
secondVideoCompositionInstruction.layerInstructions = @[secondVideoLayerInstruction];
// Attach both of the video composition instructions to the video composition.
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = @[firstVideoCompositionInstruction, secondVideoCompositionInstruction];
A video composition can bring the power of Core Animation into your composition through its animationTool property, which you can use for tasks such as watermarking video, adding titles, or animating overlays. The following example shows how to add a watermark layer on top of the video:

CALayer *watermarkLayer = <#CALayer representing your desired watermark image#>;
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, mutableVideoComposition.renderSize.width, mutableVideoComposition.renderSize.height);
videoLayer.frame = CGRectMake(0, 0, mutableVideoComposition.renderSize.width, mutableVideoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
watermarkLayer.position = CGPointMake(mutableVideoComposition.renderSize.width/2, mutableVideoComposition.renderSize.height/4);
[parentLayer addSublayer:watermarkLayer];
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
Putting it all together, the following code samples show how to combine two video asset tracks and an audio asset track into a single video file. They illustrate how to:
1: Create an AVMutableComposition object with multiple AVMutableCompositionTrack objects
2: Add the time ranges of AVAssetTrack objects to compatible composition tracks
3: Check the preferredTransform property of a video asset track to determine the video's orientation
4: Use AVMutableVideoCompositionLayerInstruction objects to apply transforms to the video tracks within the composition
5: Set appropriate values for the renderSize and frameDuration properties of the video composition
6: Use the composition in conjunction with the video composition when exporting to a video file
First, create a mutable composition with a video track and an audio track:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
Next, retrieve the first video track from each video asset and insert the two video tracks and the audio track into the composition (firstVideoAsset, secondVideoAsset, and audioAsset are assumed to have been obtained earlier):

AVAssetTrack *firstVideoAssetTrack = [[firstVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *secondVideoAssetTrack = [[secondVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration) ofTrack:firstVideoAssetTrack atTime:kCMTimeZero error:nil];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondVideoAssetTrack.timeRange.duration) ofTrack:secondVideoAssetTrack atTime:firstVideoAssetTrack.timeRange.duration error:nil];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration)) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
Both video segments must have the same orientation, because portrait and landscape video cannot be combined on the same composition track. The preferredTransform property of each video asset track reveals how it was recorded:

BOOL isFirstVideoAssetPortrait = NO;
CGAffineTransform firstTransform = firstVideoAssetTrack.preferredTransform;
// Check the first video track's preferred transform to determine if it was recorded in portrait mode.
if (firstTransform.a == 0 && firstTransform.d == 0 &&
    (firstTransform.b == 1.0 || firstTransform.b == -1.0) &&
    (firstTransform.c == 1.0 || firstTransform.c == -1.0)) {
    isFirstVideoAssetPortrait = YES;
}
BOOL isSecondVideoAssetPortrait = NO;
CGAffineTransform secondTransform = secondVideoAssetTrack.preferredTransform;
// Check the second video track's preferred transform to determine if it was recorded in portrait mode.
if (secondTransform.a == 0 && secondTransform.d == 0 &&
    (secondTransform.b == 1.0 || secondTransform.b == -1.0) &&
    (secondTransform.c == 1.0 || secondTransform.c == -1.0)) {
    isSecondVideoAssetPortrait = YES;
}
if ((isFirstVideoAssetPortrait && !isSecondVideoAssetPortrait) ||
    (!isFirstVideoAssetPortrait && isSecondVideoAssetPortrait)) {
    UIAlertView *incompatibleVideoOrientationAlert = [[UIAlertView alloc] initWithTitle:@"Error!" message:@"Cannot combine a video shot in portrait mode with a video shot in landscape mode." delegate:self cancelButtonTitle:@"Dismiss" otherButtonTitles:nil];
    [incompatibleVideoOrientationAlert show];
    return;
}
Each video segment gets its own composition instruction, and a layer instruction applies that segment's preferred transform so it renders in the correct orientation:

AVMutableVideoCompositionInstruction *firstVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set the time range of the first instruction to span the duration of the first video track.
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration);
AVMutableVideoCompositionInstruction *secondVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set the time range of the second instruction to start where the first video track ends and span the duration of the second.
secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration);
AVMutableVideoCompositionLayerInstruction *firstVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
// Set the transform of the first layer instruction to the preferred transform of the first video track.
[firstVideoLayerInstruction setTransform:firstTransform atTime:kCMTimeZero];
AVMutableVideoCompositionLayerInstruction *secondVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
// Set the transform of the second layer instruction to the preferred transform of the second video track.
[secondVideoLayerInstruction setTransform:secondTransform atTime:firstVideoAssetTrack.timeRange.duration];
firstVideoCompositionInstruction.layerInstructions = @[firstVideoLayerInstruction];
secondVideoCompositionInstruction.layerInstructions = @[secondVideoLayerInstruction];
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = @[firstVideoCompositionInstruction, secondVideoCompositionInstruction];
Finally, size the video composition to fit both videos and give it a frame rate:

CGSize naturalSizeFirst, naturalSizeSecond;
// If the first video asset was shot in portrait mode, then so was the second one if we made it here.
if (isFirstVideoAssetPortrait) {
    // Invert the width and height for the video tracks to ensure that they display properly.
    naturalSizeFirst = CGSizeMake(firstVideoAssetTrack.naturalSize.height, firstVideoAssetTrack.naturalSize.width);
    naturalSizeSecond = CGSizeMake(secondVideoAssetTrack.naturalSize.height, secondVideoAssetTrack.naturalSize.width);
}
else {
    // If the videos weren't shot in portrait mode, we can just use their natural sizes.
    naturalSizeFirst = firstVideoAssetTrack.naturalSize;
    naturalSizeSecond = secondVideoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
// Set the renderWidth and renderHeight to the max of the two videos' widths and heights.
if (naturalSizeFirst.width > naturalSizeSecond.width) {
    renderWidth = naturalSizeFirst.width;
}
else {
    renderWidth = naturalSizeSecond.width;
}
if (naturalSizeFirst.height > naturalSizeSecond.height) {
    renderHeight = naturalSizeFirst.height;
}
else {
    renderHeight = naturalSizeSecond.height;
}
mutableVideoComposition.renderSize = CGSizeMake(renderWidth, renderHeight);
// Set the frame duration to an appropriate value (i.e. 30 frames per second for video).
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
Export the composition and save the result to the Camera Roll (UTTypeCopyPreferredTagWithClass comes from the MobileCoreServices framework, and ALAssetsLibrary from the AssetsLibrary framework):

// Create a static date formatter so we only have to initialize it once.
static NSDateFormatter *kDateFormatter;
if (!kDateFormatter) {
    kDateFormatter = [[NSDateFormatter alloc] init];
    kDateFormatter.dateStyle = NSDateFormatterMediumStyle;
    kDateFormatter.timeStyle = NSDateFormatterShortStyle;
}
// Create the export session with the composition and set the preset to the highest quality.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
// Set the desired output URL for the file created by the export process.
exporter.outputURL = [[[[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil] URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]] URLByAppendingPathExtension:CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension))];
// Set the output file type to be a QuickTime movie.
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mutableVideoComposition;
// Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
            if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
                [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
            }
        }
    });
}];