How to Play, Record, and Edit Videos in iOS


Source: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios

By jneuman on June 26, 2012

Learn how to play, record, and edit videos on iOS!

This is a blog post by iOS Tutorial Team member Abdul Azeem, software architect and co-founder at Datainvent Systems, a software development and IT services company.

Update 8/14/12: Fixes and clarifications made by Joseph Neuman.

Recording videos (and playing around with them programmatically) is one of the coolest things you can do with your phone, but surprisingly few apps make use of it.

This is likely because learning the technology behind video recording and editing in iOS – AVFoundation – is notoriously difficult.

And to make it worse, there is very little documentation on how to accomplish anything with AVFoundation. One of the few resources available is the WWDC 2010 AVFoundation session video, but it only takes you so far.

There should be an easier way. Thus was born this tutorial! :]

In this tutorial, we’ll give you hands-on experience with the AVFoundation APIs so you can start using them in your own apps. You’ll learn how to:

  • Select and play a video from the media library.
  • Record and save a video to the media library.
  • Merge multiple videos together into a combined video, complete with a custom soundtrack! :]

Are you ready? Lights, cameras, action!

Getting Started

Let’s get started by creating a simple app that will allow you to play and record videos and save them to files.

Start Xcode and create a new project with the iOS\Application\Single View Application template. Enter “VideoPlayRecord” for the project name, choose iPhone for the Device Family, make sure the “Use Storyboards” and “Use Automatic Reference Counting” options are checked, and save the project to a location of your choice.

Next up, add some of the necessary frameworks to your project.

Select the root of the project in the “Project Navigator” pane in the left sidebar to bring up the project information in the central pane. If the project target is not selected, select it, then switch to the “Build Phases” tab.

Now click the triangle next to the “Link Binary With Libraries” section to expand it. Here you can add additional libraries/frameworks to your project.

[Screenshot: the Link Binary With Libraries section under Build Phases]

Click the (+) button to add frameworks. You can select multiple items in the dialog that opens by command-clicking on each item. Add the following frameworks to your project:

  • AssetsLibrary
  • AVFoundation
  • CoreMedia
  • MediaPlayer
  • MobileCoreServices
  • QuartzCore

In this project, you’ll create an app with four screens. The first will simply have three buttons that will allow you to navigate to the following three screens:

  • Video Play
  • Video Record
  • Video Merge

Get Your Story Straight

Select MainStoryboard.storyboard in the main window to see a view controller. You need this view controller to be embedded in a navigation controller because there are going to be multiple screens in the app.

To do this, first click the view controller to give it focus, then select Editor\Embed In\Navigation Controller from the menu. The view controller is now embedded in a navigation controller, connected via a root view controller relationship.

Now, drag three UIButtons from the Object Library (at the bottom half of the right sidebar – if the Object Library isn’t selected, it’s the third tab) to the view controller. Once you’ve placed them in the view to your satisfaction, set the titles of the buttons as follows:

  1. Select and Play Video
  2. Record and Save Video
  3. Merge Video

You can set the titles for each button by tapping the button to select it, and then editing the Title property for the button in the Attributes Inspector, which is the fourth tab in the top half of the right sidebar.

Next, set up three view controllers for the views that will be displayed via these buttons. Do this by creating three UIViewController subclasses using the iOS\Cocoa Touch\UIViewController subclass template. Name the new classes PlayVideoViewController, RecordVideoViewController, and MergeVideoViewController. As you’re using storyboards, make sure you uncheck the “With XIB for user interface” option for each class.

Now switch back to MainStoryboard.storyboard and drag three UIViewControllers from the Object Library onto your storyboard. Select each view controller object in turn and switch to the Identity Inspector (the third tab in the top half of the right sidebar) to set the class for each view controller as follows:

  1. PlayVideoViewController
  2. RecordVideoViewController
  3. MergeVideoViewController

[Screenshot: the three new view controllers with their classes set in the Identity Inspector]

Now you’ve got to hook all of these things together. You’ll do this by creating a segue from each button to the new view controller it will load.

Select each button in turn, ensure that the Connections Inspector (sixth tab in the top half of the right sidebar) is open, and drag from the “Push” connector to the relevant view controller.

[Screenshot: dragging from a button’s Push connector to a view controller]

Once you’re done, your storyboard should look similar to the screen below:

[Screenshot: the completed storyboard with all four view controllers connected]

Great, you’ve set up the basic UI! Build the application and run it to ensure that the three buttons work as intended, each leading to a secondary screen.

If you’re confused about storyboards and how to set them up, don’t worry! There’s a tutorial for that. Check out the Beginning Storyboards in iOS 5 tutorial series.

Now that your UI is working, it’s time to create those secondary screens and give some substance to the form!

Select and Play Video

Switch to MainStoryboard.storyboard and create a new button titled “Play Video” in the Play Video View Controller. Hook the new button to an action in the PlayVideoViewController class by doing the following:

  1. Switch to the Assistant Editor view by tapping the middle button in the Editor section of the toolbar at the top of the Xcode window. This should open up a split view where you can see both the interface and its matching class.
  2. Tap on the new button you just created to select it.
  3. Switch to the Connections Inspector (sixth tab in the top half of the right sidebar).
  4. Control-drag from the Touch Up Inside event to the line beneath the @interface line in the PlayVideoViewController source code, and let go.
  5. You should see a dialog similar to the one in the image below. Type in “playVideo” as the action name and click “Connect.”

[Screenshot: the action connection dialog with “playVideo” entered as the name]

You just set up an action in PlayVideoViewController named playVideo, which will be executed whenever you tap the “Play Video” button. But you still have to implement the new playVideo action.

Start by adding the following import statements to the top of PlayVideoViewController.h:

#import <MobileCoreServices/UTCoreTypes.h>
#import <MediaPlayer/MediaPlayer.h>

The MediaPlayer.h header gives you access to the MPMoviePlayerViewController class that you’ll use to play the selected video. UTCoreTypes.h defines a constant named kUTTypeMovie, which you’ll need to refer to when selecting media.

Now add the following code to the end of the @interface line below the #import statements:

<UIImagePickerControllerDelegate, UINavigationControllerDelegate>

This sets up the PlayVideoViewController as a delegate for UIImagePickerController and UINavigationController, so that you can use the UIImagePickerController in your class. Specifically, you’ll be using it to browse videos in your photo library.

What is this UIImagePickerController class provided by Apple? It offers a basic, customizable user interface for taking pictures and recording movies, along with some simple editing functionality for newly-captured media. If you don’t need a fully-customized UI, it’s generally better to use an image picker controller to select audio and video files from the media library.

To browse media, you need to open an instance of UIImagePickerController as a pop-up view. Add the declaration for a method to do this in PlayVideoViewController.h, above the @end line:

// For opening UIImagePickerController
-(BOOL)startMediaBrowserFromViewController:(UIViewController*)controller usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate;

Now switch to PlayVideoViewController.m and add the following code to the playVideo method:

[self startMediaBrowserFromViewController:self usingDelegate:self];

The above code ensures that tapping the “Play Video” button will open the UIImagePickerController, allowing the user to select a video file from the media library.

Now, add the implementation for startMediaBrowserFromViewController to the bottom of the file (but above the final @end):

-(BOOL)startMediaBrowserFromViewController:(UIViewController*)controller usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate {
    // 1 - Validations
    if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum] == NO)
        || (delegate == nil)
        || (controller == nil)) {
        return NO;
    }
    // 2 - Get image picker
    UIImagePickerController *mediaUI = [[UIImagePickerController alloc] init];
    mediaUI.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    mediaUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
    // Shows the controls for moving & scaling pictures, or for
    // trimming movies. To hide the controls, use NO.
    mediaUI.allowsEditing = YES;
    mediaUI.delegate = delegate;
    // 3 - Display image picker
    [controller presentModalViewController:mediaUI animated:YES];
    return YES;
}

In the above code, you do the following:

  1. Check if the UIImagePickerControllerSourceTypeSavedPhotosAlbum (the defined source) is available on the device. This check is essential whenever you use a UIImagePickerController to pick media. If you don’t do it, you might try to pick media from a non-existent media library, resulting in crashes or other unexpected issues.
  2. If the source you want is available, you create a new UIImagePickerController object and set its source and media type. Only “kUTTypeMovie” is included in the mediaTypes array, as you only need video. You can include “kUTTypeImage” in the array to select images as well.
  3. Finally, you present the UIImagePickerController as a modal view controller.

Now you’re ready to give your project another whirl! Build and run.

If you have any videos in your media library, you should see them presented, similar to the following screenshot, when you tap the “Select and Play Video” button on the first screen, and then tap the “Play Video” button on the second screen.

Note: If you run this project on the simulator, you’ll have no way to capture video. Plus, you’ll need to figure out a way to add videos to the media library manually. In other words, I recommend you test this project on a device!

Once you see the list of videos, select one. You’ll be taken to another screen that shows the video in detail. Tap the “Choose” button to actually select the video here.

Hang on! If you tap “Choose,” nothing happens, except that the app returns to the Play Video screen! This is because you haven’t implemented any delegate methods to handle the actions you carried out while displaying the image picker.

UIImagePickerController has a delegate callback method that is executed when media is selected. Implement this method by adding the following code to the end of PlayVideoViewController.m:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // 1 - Get media type
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    // 2 - Dismiss image picker
    [self dismissModalViewControllerAnimated:NO];
    // Handle a movie capture
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        // 3 - Play the video
        MPMoviePlayerViewController *theMovie = [[MPMoviePlayerViewController alloc]
            initWithContentURL:[info objectForKey:UIImagePickerControllerMediaURL]];
        [self presentMoviePlayerViewControllerAnimated:theMovie];
        // 4 - Register for the playback finished notification.
        // The notification's object is the underlying MPMoviePlayerController.
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(myMovieFinishedCallback:)
            name:MPMoviePlayerPlaybackDidFinishNotification object:theMovie.moviePlayer];
    }
}

The above code does the following:

  1. Gets the media type so you can verify later on that the selected media is a video.
  2. Dismisses the image picker so that it’s no longer displayed on screen.
  3. Verifies that the selected media is a video, and then creates an instance of MPMoviePlayerViewController to play it.
  4. Adds a callback method that will be executed once the movie finishes playing.

The myMovieFinishedCallback: method referenced in step #4 needs to be implemented. Add the following code to the end of PlayVideoViewController.m:

// When the movie is done, release the controller.
-(void)myMovieFinishedCallback:(NSNotification*)aNotification {
    [self dismissMoviePlayerViewControllerAnimated];
    MPMoviePlayerController *theMovie = [aNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
        name:MPMoviePlayerPlaybackDidFinishNotification object:theMovie];
}

The last thing to do is to add a handler for when the user taps “Cancel” instead of selecting a video. Add the following code right below imagePickerController:didFinishPickingMediaWithInfo::

// For responding to the user tapping Cancel.
-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissModalViewControllerAnimated:YES];
}

If the user cancels the operation, the image picker gets dismissed.

Compile and run your project. Press the “Select and Play Video” button, then the “Play Video” button, and finally choose a video from the list. You should be able to see the video playing in the media player.

Record and Save Video

Now that you have video playback working, it’s time to record a video using the device’s camera and save it to the media library.

Switch back to the storyboard and do the following:

  1. Add a new button titled “Record Video” to the Record Video View Controller.
  2. As before, switch to Assistant Editor mode and connect the “Record Video” button to an action named recordAndPlay:.

[Screenshot: connecting the “Record Video” button to the recordAndPlay: action]

Time to get coding! Replace the contents of RecordVideoViewController.h with the following:

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface RecordVideoViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>

-(IBAction)recordAndPlay:(id)sender;
-(BOOL)startCameraControllerFromViewController:(UIViewController*)controller
    usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate;
-(void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void*)contextInfo;

@end

You may have noticed that some of this looks similar to what you did in PlayVideoViewController. As for the bits that don’t:

The AssetsLibrary.h import provides access to the videos and photos under the control of the Photos application. As you want to save your video to the Saved Photos library, you need access to the AssetsLibrary framework.

The asset library includes media that is in the Saved Photos album, media coming from iTunes, and media that was directly imported onto the device. You use AssetsLibrary to retrieve a list of all asset groups and to save images and videos into the Saved Photos album.

The other new item is video:didFinishSavingWithError:contextInfo:. This method, as the name implies, is executed after a video is saved to the Asset/Photo Library.
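
If you’re curious what else the asset library offers, here’s a minimal hedged sketch (not part of this project’s code) that lists the asset groups on the device using ALAssetsLibrary’s block-based enumeration; the logging is purely illustrative:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupAll usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group) {
        // Each non-nil group is an album; nil signals the end of enumeration
        NSLog(@"Found group: %@", [group valueForProperty:ALAssetsGroupPropertyName]);
    }
} failureBlock:^(NSError *error) {
    // Called if the user denies the app access to the library
    NSLog(@"Enumeration failed: %@", error);
}];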

Switch to RecordVideoViewController.m and add the following to recordAndPlay::

[self startCameraControllerFromViewController:self usingDelegate:self];

You are again in familiar territory. The code simply calls startCameraControllerFromViewController:usingDelegate: when the “Record Video” button is tapped. Of course, this means you should add the implementation for the method next. Add the following code to the end of the file (but before the final @end):

-(BOOL)startCameraControllerFromViewController:(UIViewController*)controller
    usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate {
    // 1 - Validations
    if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
        || (delegate == nil)
        || (controller == nil)) {
        return NO;
    }
    // 2 - Get image picker
    UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
    cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
    // Displays a control that allows the user to choose movie capture
    cameraUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
    // Hides the controls for moving & scaling pictures, or for
    // trimming movies. To instead show the controls, use YES.
    cameraUI.allowsEditing = NO;
    cameraUI.delegate = delegate;
    // 3 - Display image picker
    [controller presentModalViewController:cameraUI animated:YES];
    return YES;
}

In the code above, you check for “UIImagePickerControllerSourceTypeCamera” instead of “UIImagePickerControllerSourceTypeSavedPhotosAlbum” because you want to use the camera. The rest of the code is mostly identical to what you used before.

Build and run your code to see what you’ve got so far.

Go to the Record screen and press the “Record Video” button. Instead of the Photo Gallery, the camera UI opens. Start recording a video by tapping the red record button at the bottom of the screen, and tap it again when you’re done recording.

This video is more exciting than it looks. I can’t post it here, but there’s a hot babe just offscreen! Just kidding, but it is more exciting when you see this running on your own device. :]

When you get to the next screen, you can opt to use the recorded video or re-take the video. If you select “Use,” you’ll notice that nothing happens – that’s because, you guessed it, there is no callback method implemented. You need the callback method to save the recorded video to the media library.

To implement the callback methods, add the following code to the end of RecordVideoViewController.m:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissModalViewControllerAnimated:NO];
    // Handle a movie capture
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        NSString *moviePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self,
                @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
}

-(void)video:(NSString*)videoPath didFinishSavingWithError:(NSError*)error contextInfo:(void*)contextInfo {
    if (error) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Video Saving Failed"
            delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    } else {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved" message:@"Saved To Photo Album"
            delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
}

In the above code, imagePickerController:didFinishPickingMediaWithInfo: gives you a moviePath. You verify that the movie can be saved to the device’s photo album, and save it if so.

UISaveVideoAtPathToSavedPhotosAlbum is the default method provided by the SDK to save videos to the Photos Album. As parameters, you pass both the path to the video to be saved, as well as a callback method that will inform you of the status of the save operation.

Build the code and run it. Record a video and select “Use.” If the “Video Saved” alert pops up, your video has been successfully saved to the photo library.

A Brief Intro to AVFoundation

Now that your video playback and recording is up and running, let’s move on to something a bit more complex: AVFoundation.

Since iOS 4.0, the iOS SDK provides a number of video editing APIs in the AVFoundation framework. With these APIs, you can apply any kind of CGAffineTransform to a video and merge multiple video and audio files together into a single video.

These last few sections of the tutorial will walk you through merging two videos into a single video and adding a background audio track.

Before diving into the code, let’s discuss some theory first.

AVAsset

This is an abstract class that represents timed audiovisual media such as video and audio. Each asset contains a collection of tracks intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles.

An AVAsset object defines the collective properties of the tracks that comprise the asset. A track is represented by an instance of AVAssetTrack.

In a typical simple case, one track represents the audio component and another represents the video component; in a complex composition, there may be multiple overlapping tracks of audio and video. You will represent the video and audio files you’ll merge together as AVAsset objects.
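
To make that concrete, here’s a short sketch of creating an AVAsset from a file URL and inspecting its tracks. The file path is a placeholder; assetWithURL: and tracksWithMediaType: are the actual APIs:

// Hypothetical URL - substitute a real movie file
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:@"/path/to/clip.mov"]];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
NSLog(@"%lu video track(s), %lu audio track(s), %f seconds",
    (unsigned long)[videoTracks count], (unsigned long)[audioTracks count],
    CMTimeGetSeconds(asset.duration));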

AVComposition

An AVComposition object combines media data from multiple file-based sources in a custom temporal arrangement in order to present or process it together. All file-based audiovisual assets are eligible to be combined, regardless of container type.

At its top level, an AVComposition is a collection of tracks, each presenting media of a specific type such as audio or video, according to a timeline. Each track is represented by an instance of AVCompositionTrack.

AVMutableComposition and AVMutableCompositionTrack

AVMutableComposition and AVMutableCompositionTrack provide a higher-level interface for constructing compositions. These objects offer insertion, removal, and scaling operations without direct manipulation of the trackSegment arrays of composition tracks.

AVMutableComposition and AVMutableCompositionTrack make use of higher-level constructs such as AVAsset and AVAssetTrack. This means the client can make use of the same references to candidate sources that it would have created in order to inspect or preview them prior to inclusion in a composition.

In short, you have an AVMutableComposition and you can add multiple AVMutableCompositionTrack instances to it. Each AVMutableCompositionTrack will have a separate media asset.
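
As a quick preview of how this looks in code (a sketch only; you’ll write the real version in mergeAndSave: later), you might build a one-track composition like this:

AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:@"/path/to/clip.mov"]]; // placeholder path
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
    preferredTrackID:kCMPersistentTrackID_Invalid];
// Insert the asset's entire first video track at the start of the composition track
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
    ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
    atTime:kCMTimeZero error:nil];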

And the Rest

In order to apply a CGAffineTransform to a track, you will make use of AVVideoCompositionInstruction and AVVideoComposition. An AVVideoCompositionInstruction object represents an operation to be performed by a compositor. The object contains multiple AVMutableVideoCompositionLayerInstruction objects.

You use an AVVideoCompositionLayerInstruction object to modify the transform and opacity ramps to apply to a given track in an AV composition. AVMutableVideoCompositionLayerInstruction is a mutable subclass of AVVideoCompositionLayerInstruction.

An AVVideoComposition object maintains an array of instructions to perform its composition, and an AVMutableVideoComposition object represents a mutable video composition.
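
For instance, a layer instruction can apply an opacity ramp to fade a track out. Continuing the composition sketch from the previous section, here’s a hedged example (the one-second fade duration is arbitrary):

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
// Fade from fully opaque to invisible over the final second of the composition
CMTime oneSecond = CMTimeMake(1, 1);
CMTime fadeStart = CMTimeSubtract(composition.duration, oneSecond);
[layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0
    timeRange:CMTimeRangeMake(fadeStart, oneSecond)];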

Conclusion

To sum it all up:

  • You have a main AVMutableComposition object that contains multiple AVMutableCompositionTrack instances. Each track represents an asset.
  • You have AVMutableVideoComposition objects that contain multiple AVMutableVideoCompositionInstructions.
  • Each AVMutableVideoCompositionInstruction contains multiple AVMutableVideoCompositionLayerInstruction instances.
  • Each layer instruction is used to apply a certain transform to a given track.

Got all that? There will be a test at the end before you can download the project sample code. ;]

Now you have at least heard of all the major objects you will use to merge your media. It may be a little confusing, but things will get clearer as you write some code. I promise!

Merge Video

Now to put that theory to use! Open MainStoryboard.storyboard and select the Merge Video View Controller. Add four buttons to the screen and name them as follows:

  1. Load Asset 1
  2. Load Asset 2
  3. Load Audio
  4. Merge and Save Video

Switch to the Assistant Editor mode and connect your four buttons to the following actions, as before:

  1. loadAssetOne:
  2. loadAssetTwo:
  3. loadAudio:
  4. mergeAndSave:

The final result should look something like this:

[Screenshot: the Merge Video screen with its four buttons]

Now switch to MergeVideoViewController.h and replace its contents with:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import <MediaPlayer/MediaPlayer.h>

@interface MergeVideoViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate, MPMediaPickerControllerDelegate> {
    BOOL isSelectingAssetOne;
}

@property (nonatomic, strong) AVAsset *firstAsset;
@property (nonatomic, strong) AVAsset *secondAsset;
@property (nonatomic, strong) AVAsset *audioAsset;
@property (weak, nonatomic) IBOutlet UIActivityIndicatorView *activityView;

-(IBAction)loadAssetOne:(id)sender;
-(IBAction)loadAssetTwo:(id)sender;
-(IBAction)loadAudio:(id)sender;
-(IBAction)mergeAndSave:(id)sender;
-(BOOL)startMediaBrowserFromViewController:(UIViewController*)controller usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate;
-(void)exportDidFinish:(AVAssetExportSession*)session;

@end

Most of the above should be familiar by now. There are a few new properties, but they are mostly there to hold references to the assets that you’ll combine into the final merged video. In addition to the assets, there’s an activity indicator that will display while the app is merging files, since the process can take some time.

To synthesize the properties you added above, switch to MergeVideoViewController.m and add the following at the top of the file, right below the @implementation line:

@synthesize firstAsset, secondAsset, audioAsset;
@synthesize activityView;

Then, add the following to loadAssetOne:

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum] == NO) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"No Saved Album Found"
        delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [alert show];
} else {
    isSelectingAssetOne = TRUE;
    [self startMediaBrowserFromViewController:self usingDelegate:self];
}

Add this code to loadAssetTwo:

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum] == NO) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"No Saved Album Found"
        delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [alert show];
} else {
    isSelectingAssetOne = FALSE;
    [self startMediaBrowserFromViewController:self usingDelegate:self];
}

Notice that the code in both the above instances is almost identical, except for the value assigned to isSelectingAssetOne. You use a UIImagePickerController to select the video files, as you did in the “Play Video” section. The isSelectingAssetOne variable identifies which asset is currently being selected.

Add the following code to the end of the file for the UIImagePickerController display and handling:

-(BOOL)startMediaBrowserFromViewController:(UIViewController*)controller usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate {
    // 1 - Validation
    if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum] == NO)
        || (delegate == nil)
        || (controller == nil)) {
        return NO;
    }
    // 2 - Create image picker
    UIImagePickerController *mediaUI = [[UIImagePickerController alloc] init];
    mediaUI.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    mediaUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
    // Shows the controls for moving & scaling pictures, or for
    // trimming movies. To hide the controls, use NO.
    mediaUI.allowsEditing = YES;
    mediaUI.delegate = delegate;
    // 3 - Display image picker
    [controller presentModalViewController:mediaUI animated:YES];
    return YES;
}

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // 1 - Get media type
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    // 2 - Dismiss image picker
    [self dismissModalViewControllerAnimated:NO];
    // 3 - Handle video selection
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        if (isSelectingAssetOne) {
            NSLog(@"Video one loaded");
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Asset Loaded" message:@"Video One Loaded"
                delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [alert show];
            firstAsset = [AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]];
        } else {
            NSLog(@"Video two loaded");
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Asset Loaded" message:@"Video Two Loaded"
                delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [alert show];
            secondAsset = [AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]];
        }
    }
}

Notice that in imagePickerController:didFinishPickingMediaWithInfo:, you initialize each asset variable using the media URL returned by the image picker. Also note how the isSelectingAssetOne variable is used to determine which asset variable is set.

At this point, you have the code in place to select the two video assets.

Compile and run, and make sure you have at least two videos in your library. Then tap the “Merge Video” button and select two videos. If everything works, you should see the “Asset Loaded” message upon selecting each video.

The next step is to add the functionality to select the audio file.

The UIImagePickerController only provides functionality to select video and images from the media library. To select audio files from your music library, you will use MPMediaPickerController. It works exactly like UIImagePickerController, but instead of images and video, it accesses audio files in the media library.

Add the following code to loadAudio:

MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeAny];
mediaPicker.delegate = self;
mediaPicker.prompt = @"Select Audio";
[self presentModalViewController:mediaPicker animated:YES];

The above code creates a new MPMediaPickerController instance and displays it as a modal view controller.

Build and run. Now when you tap the “Load Audio” button, you can access the audio library on your device. (Of course, you’ll need some audio files on your device. Otherwise, the list will be empty.)

If you select a song from the list, you’ll notice that nothing happens. That’s right, MPMediaPickerController needs delegate methods! Add the following two methods at the end of the file:

-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    NSArray *selectedSongs = [mediaItemCollection items];
    if ([selectedSongs count] > 0) {
        MPMediaItem *songItem = [selectedSongs objectAtIndex:0];
        NSURL *songURL = [songItem valueForProperty:MPMediaItemPropertyAssetURL];
        audioAsset = [AVAsset assetWithURL:songURL];
        NSLog(@"Audio loaded");
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Asset Loaded" message:@"Audio Loaded"
            delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
    [self dismissModalViewControllerAnimated:YES];
}

-(void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    [self dismissModalViewControllerAnimated:YES];
}

The code is very similar to the delegate methods for UIImagePickerController. You set the audio asset based on the media item selected via the MPMediaPickerController.

Build and run again. Go to the Merge Videos screen and select an audio file. If there are no errors, you should see the “Audio Loaded” message.

You now have all your video and audio assets loading correctly. It’s time to merge the various media files into one file.

But before you get into that code, you have to do a little bit of setup. Add the following code to mergeAndSave::

if (firstAsset != nil && secondAsset != nil) {
    [activityView startAnimating];
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
        preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
        ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:firstAsset.duration error:nil];
    // 3 - Audio track
    if (audioAsset != nil) {
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
            preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration))
            ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    }
    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
        [NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
        presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

Here’s a step-by-step breakdown of the above code:

  1. You create an AVMutableComposition object to hold your video and audio tracks and transform effects.
  2. Next, you create an AVMutableCompositionTrack for the video and add it to your AVMutableComposition object. Then you insert your two videos into the newly created AVMutableCompositionTrack.

    Note that the insertTimeRange method allows you to insert a part of a video into your main composition instead of the whole video. This way, you can trim the video to a time range of your choosing.

    In this instance, you want to insert the whole video, so you create a time range from kCMTimeZero to your video asset’s duration. The atTime parameter allows you to place your video/audio track wherever you want it in your composition. Notice how firstAsset is inserted at time zero, and secondAsset is inserted at the end of the first video. This tutorial assumes you want your video assets one after the other, but you can also overlap the assets by playing with the time ranges.

    For working with time ranges, you use CMTime structs. A CMTime is a non-opaque struct representing a time, where the time can be either a timestamp or a duration. (There’s a short CMTime sketch just after this list.)

  3. Similarly, you create a new AVMutableCompositionTrack for your audio and add it to the main composition. This time you set the audio time range to the sum of the durations of the first and second videos, since that will be the complete length of your video.
  4. Before you can save the final video, you need a path for the saved file. So create a random file name that points to a file in the documents folder.
  5. Finally, render and export the merged video. To do this, you create an AVAssetExportSession object that transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset.
  6. After you’ve initialized an export session with the asset that contains the source media, the export preset name (presetName), and the output file type (outputFileType), you start the export running by invoking exportAsynchronouslyWithCompletionHandler:.
  7. Because the export is performed asynchronously, this method returns immediately. The completion handler you supply to exportAsynchronouslyWithCompletionHandler: is called whether the export fails, completes, or is canceled. Upon completion, the exporter’s status property indicates whether the export has completed successfully. If it has failed, the value of the exporter’s error property supplies additional information about the reason for the failure.
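
Since CMTime values appear in nearly every call above, here’s a small standalone sketch of the arithmetic (the values are illustrative):

// A CMTime is a rational number: value/timescale seconds
CMTime twoSeconds = CMTimeMake(2, 1);              // 2/1 = 2.0 seconds
CMTime halfSecond = CMTimeMake(300, 600);          // 300/600 = 0.5 seconds
CMTime total = CMTimeAdd(twoSeconds, halfSecond);  // CMTimeAdd reconciles the timescales
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, total);
NSLog(@"Range duration: %f seconds", CMTimeGetSeconds(range.duration)); // 2.500000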

Notice that the completion handler calls exportDidFinish:, a method that needs implementation. Add the following code to the end of the file:

-(void)exportDidFinish:(AVAssetExportSession*)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Video Saving Failed"
                            delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
                        [alert show];
                    } else {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved" message:@"Saved To Photo Album"
                            delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
                        [alert show];
                    }
                });
            }];
        }
    }
    audioAsset = nil;
    firstAsset = nil;
    secondAsset = nil;
    [activityView stopAnimating];
}

Once the export completes successfully, the newly exported video is saved to the photo album. You don’t strictly need to do this – you could leave the final video in the documents folder and browse to it there. But it’s easier to copy the output video to the photo album so you can see the final output.

Go ahead, build and run your project!

Select the video and audio files and merge the selected files. If the merge was successful, you should see a “Video Saved” message. At this point, your new video should be present in the photo album.

Go to the photo album, or browse using your own “Select and Play Video” screen! You’ll notice that although the videos have been merged, there are some orientation issues. Portrait video is in landscape mode, and sometimes videos are turned upside down.

This is due to the default AVAsset orientation. All movie and image files recorded using the default iPhone camera application have the video frame set to landscape, and so the media is saved in landscape mode.

AVAsset has a preferredTransform property that contains the media orientation information, and this is applied to a media file whenever you view it using the Photos app or QuickTime. In the code above, you haven’t applied a transform to your AVAsset objects, hence the orientation issue.
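
To see what such a transform looks like numerically: a 90-degree rotation produces (up to floating-point rounding) the matrix values a = 0, b = 1, c = -1, d = 0, which is exactly the pattern the code below tests for when detecting portrait video. A throwaway sketch you could log from anywhere:

CGAffineTransform t = CGAffineTransformMakeRotation(M_PI_2); // 90-degree rotation
// Prints a=0.0 b=1.0 c=-1.0 d=0.0, the "portrait" pattern checked in the next code block
NSLog(@"a=%.1f b=%.1f c=%.1f d=%.1f", t.a, t.b, t.c, t.d);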

You can correct this easily by applying the necessary transforms to your AVAsset objects. But as your two video files can have different orientations, you’ll need to use two separate AVMutableCompositionTrack instances instead of one as you originally did.

Replace section #2 in mergeAndSave: with the following so that you have two AVMutableCompositionTrack instances instead of one:

// 2 - Create two video tracks
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
    preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
    preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
    ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:firstAsset.duration error:nil];

Since you now have two separate AVMutableCompositionTrack instances, you need to apply an AVMutableVideoCompositionLayerInstruction to each track in order to fix the orientation. So add the following code after the code you just replaced (and before section #3):

// 2.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration));
// 2.2 - Create an AVMutableVideoCompositionLayerInstruction for the first track
AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation firstAssetOrientation_ = UIImageOrientationUp;
BOOL isFirstAssetPortrait_ = NO;
CGAffineTransform firstTransform = firstAssetTrack.preferredTransform;
if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
    firstAssetOrientation_ = UIImageOrientationRight;
    isFirstAssetPortrait_ = YES;
}
if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
    firstAssetOrientation_ = UIImageOrientationLeft;
    isFirstAssetPortrait_ = YES;
}
if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
    firstAssetOrientation_ = UIImageOrientationUp;
}
if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
    firstAssetOrientation_ = UIImageOrientationDown;
}
[firstlayerInstruction setTransform:firstAssetTrack.preferredTransform atTime:kCMTimeZero];
[firstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];
// 2.3 - Create an AVMutableVideoCompositionLayerInstruction for the second track
AVMutableVideoCompositionLayerInstruction *secondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
AVAssetTrack *secondAssetTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation secondAssetOrientation_ = UIImageOrientationUp;
BOOL isSecondAssetPortrait_ = NO;
CGAffineTransform secondTransform = secondAssetTrack.preferredTransform;
if (secondTransform.a == 0 && secondTransform.b == 1.0 && secondTransform.c == -1.0 && secondTransform.d == 0) {
    secondAssetOrientation_ = UIImageOrientationRight;
    isSecondAssetPortrait_ = YES;
}
if (secondTransform.a == 0 && secondTransform.b == -1.0 && secondTransform.c == 1.0 && secondTransform.d == 0) {
    secondAssetOrientation_ = UIImageOrientationLeft;
    isSecondAssetPortrait_ = YES;
}
if (secondTransform.a == 1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == 1.0) {
    secondAssetOrientation_ = UIImageOrientationUp;
}
if (secondTransform.a == -1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == -1.0) {
    secondAssetOrientation_ = UIImageOrientationDown;
}
[secondlayerInstruction setTransform:secondAssetTrack.preferredTransform atTime:firstAsset.duration];

In section #2.1, you create an AVMutableVideoCompositionInstruction object that will hold your layer instructions.

Then in section #2.2, you add the orientation fix to your first track as follows:

  • You create an AVMutableVideoCompositionLayerInstruction and associate it with your firstTrack.
  • Next, you create an AVAssetTrack object from your AVAsset. An AVAssetTrack object provides the track-level inspection interface for all assets. You need this object in order to access the preferredTransform and dimensions of the asset.
  • Then, you determine the orientation of your AVAsset. This will be used later when determining the exported video size.
  • Next, you apply the preferredTransform to fix the orientation.
  • You also set the opacity of your first layer to zero at time firstAsset.duration. This is because you want your first track to disappear when it has finished playing. Otherwise, the last frame of the first track will remain on screen and overlap the video from the second track.

The code in section #2.3 is almost identical to that in section #2.2. It’s just the orientation fix applied to the second track.
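
Since the detection logic is duplicated for both tracks, you could optionally factor it into a helper method. A hedged sketch (this helper is hypothetical, not part of the tutorial’s code, but behavior-equivalent to the portrait checks above):

// Hypothetical helper: YES if a track's preferredTransform encodes a 90-degree rotation
-(BOOL)isTrackPortrait:(AVAssetTrack *)track {
    CGAffineTransform t = track.preferredTransform;
    return (t.a == 0 && t.d == 0 &&
            ((t.b == 1.0 && t.c == -1.0) || (t.b == -1.0 && t.c == 1.0)));
}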

Next, add the following code right after section #2.3 (and before section #3):

// 2.4 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, secondlayerInstruction, nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);

CGSize naturalSizeFirst, naturalSizeSecond;
if (isFirstAssetPortrait_) {
    naturalSizeFirst = CGSizeMake(firstAssetTrack.naturalSize.height, firstAssetTrack.naturalSize.width);
} else {
    naturalSizeFirst = firstAssetTrack.naturalSize;
}
if (isSecondAssetPortrait_) {
    naturalSizeSecond = CGSizeMake(secondAssetTrack.naturalSize.height, secondAssetTrack.naturalSize.width);
} else {
    naturalSizeSecond = secondAssetTrack.naturalSize;
}

float renderWidth, renderHeight;
if (naturalSizeFirst.width > naturalSizeSecond.width) {
    renderWidth = naturalSizeFirst.width;
} else {
    renderWidth = naturalSizeSecond.width;
}
if (naturalSizeFirst.height > naturalSizeSecond.height) {
    renderHeight = naturalSizeFirst.height;
} else {
    renderHeight = naturalSizeSecond.height;
}
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);

Now that you have your AVMutableVideoCompositionLayerInstruction instances for the first and second tracks, you just add them to the main AVMutableVideoCompositionInstruction object. Next, you add your mainInstruction object to the instructions property of an instance of AVMutableVideoComposition. You also set the frame rate for the composition to 30 frames/second.

Then you have to determine the final video’s export size. First, check whether the assets are portrait or landscape, using the isFirstAssetPortrait_ and isSecondAssetPortrait_ variables from earlier. If an asset is landscape, you can use its naturalSize property directly; if it’s portrait, you must flip the naturalSize so that the width becomes the height and vice-versa. You save each result to a variable.

Then you determine which of the two assets is wider and which is taller. This ensures the exported video is large enough to accommodate both videos in full. With some simple comparisons, you save these results to variables as well.

You can then set the renderSize of the export to the found renderWidth and renderHeight.

Now that you’ve got an AVMutableVideoComposition object configured, all you need to do is assign it to your exporter. In section #5, insert the following code after the line that sets shouldOptimizeForNetworkUse (just before the exportAsynchronouslyWithCompletionHandler: call):

exporter.videoComposition = mainCompositionInst;

Whew – that’s it!

Build and run your project. If you create a new video by combining two videos (and optionally an audio file), you will see that the orientation issues have disappeared when you play back the new merged video.

Where to Go From Here?

OK, I was bluffing about the quiz. Here is the sample project with all of the code from this tutorial. You’ve earned it.

If you followed along, you should now have a good understanding of how to play video, record video, and merge multiple videos and audio in your apps.

AVFoundation gives you a lot of flexibility when playing around with videos. You can also apply any kind of CGAffineTransform to merge, scale, or position videos.

I would recommend that you have a look at the WWDC 2010 AVFoundation session video if you want to go into a bit more detail. Also, check out the Apple AVFoundation Framework Programming Guide.

I hope this tutorial has been useful to get you started with video manipulation in iOS. If you have any questions, comments, or suggestions for improvement, please join the forum discussion below!

