MIDI Driven Animation using CoreMIDI in Objective C


Original article: http://www.deluge.co/?q=midi-driven-animation-core-audio-objective-c



So in this post I'm going to explain how to produce MIDI driven animation on OS X or iOS using the CoreMIDI and CoreAudio frameworks. When I first started trying to do this I thought it would be easy - just register a callback with the MIDI player that fires every time a MIDI message is played. Unfortunately that's not possible, and I ended up spending three long days figuring it out from the limited documentation available. Hopefully this post will save someone some time!

Project files

A fully working Xcode project can be downloaded here.

The Goal

In this guide I will explain how to do the following:

  • Load and play a MIDI sequence from a file using a MusicPlayer
  • Play the MIDI notes with an instrument effect (SoundFont) using an AUGraph
  • Create a virtual endpoint to intercept and display the MIDI messages in real time



Load and play a MIDI Sequence

The following tasks are needed to load and play a MIDI file:

  • Create a MusicSequence to hold the MIDI information
  • Get an NSURL holding the path to the MIDI file
  • Load the MIDI file into the sequence using MusicSequenceFileLoad
  • Create a new MusicPlayer, add the sequence, and play

Now here's the code. You will need to link the following frameworks: CoreAudio, CoreMIDI and AudioToolbox, and add the import AudioToolbox/MusicPlayer.h.
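
For reference, that import line looks like this:

    #import <AudioToolbox/MusicPlayer.h>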

    // Create a new music sequence
    MusicSequence s;
    // Initialise the music sequence
    NewMusicSequence(&s);

    // Get a string to the path of the MIDI file which
    // should be located in the Resources folder.
    // I'm using a simple test MIDI file which is included
    // in the download bundle at the end of this document
    NSString *midiFilePath = [[NSBundle mainBundle]
                              pathForResource:@"simpletest"
                              ofType:@"mid"];

    // Create a new URL which points to the MIDI file
    NSURL *midiFileURL = [NSURL fileURLWithPath:midiFilePath];

    // Load the file into the sequence (the NSURL must be bridged to a CFURLRef)
    MusicSequenceFileLoad(s, (__bridge CFURLRef) midiFileURL, 0, 0);

    // Create a new music player
    MusicPlayer p;
    // Initialise the music player
    NewMusicPlayer(&p);

    // Load the sequence into the music player
    MusicPlayerSetSequence(p, s);
    // Called to do some MusicPlayer setup. This just
    // reduces latency when MusicPlayerStart is called
    MusicPlayerPreroll(p);
    // Starts the music playing
    MusicPlayerStart(p);

    // Get length of track so that we know how long to kill time for
    MusicTrack t;
    MusicTimeStamp len;
    UInt32 sz = sizeof(MusicTimeStamp);
    MusicSequenceGetIndTrack(s, 1, &t);
    MusicTrackGetProperty(t, kSequenceTrackProperty_TrackLength, &len, &sz);

    while (1) { // kill time until the music is over
        usleep (3 * 1000 * 1000); // sleep for three seconds between checks
        MusicTimeStamp now = 0;
        MusicPlayerGetTime (p, &now);
        if (now >= len)
            break;
    }

    // Stop the player and dispose of the objects
    MusicPlayerStop(p);
    DisposeMusicSequence(s);
    DisposeMusicPlayer(p);

Hopefully you will have heard a rather mechanical scale followed by a chromatic scale. It's basic but at least it's a start. The next step is to create an AU graph so that we can play our MIDI file with an instrument effect.

Creating an AUGraph

When I first started reading about AU Graphs I thought it sounded horribly incomprehensible and opaque. In reality it's not too bad, just a bit fiddly to set up.

An AUGraph is a container that holds a collection of AUNodes. AUNodes wrap audio units - processing components supplied by Apple. Really it's just like chaining music hardware in real life. Say you have a MIDI keyboard and you want to output the sound as a trumpet with an echo effect: you would plug the keyboard into a box that translates MIDI messages into trumpet sounds, plug that box into an echo unit, and plug the echo unit into the speakers.
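
In the arrow notation used later in this guide, that hardware chain would be:

MIDI keyboard -> MusicDevice unit (trumpet sounds) -> Effect unit (echo) -> Output unit -> Speakers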

Choosing your AUNodes

In CoreAudio you choose the type of AUNode you need using three properties (defined by enums):

  • componentManufacturer: The author of the AUNode; in this case we will be using audio units from Apple - kAudioUnitManufacturer_Apple
  • componentType: The unit type
  • componentSubType: The unit sub-type

The unit type and sub-type can be found in the Apple documentation or in the header file AUComponent.h. Basically, to find the audio unit you need it's easiest to use Google. But say I want a high-pass filter: I look in the AUComponent.h header file and find kAudioUnitSubType_HighPassFilter - this is the sub-type - and see that it's listed among the effect sub-types, which tells me the type is kAudioUnitType_Effect. Now I have my manufacturer, type and sub-type and I can use the audio unit.
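
To make that concrete, here's a minimal sketch of how those three properties are filled into the AudioComponentDescription struct the code below uses, taking the high-pass filter example (the variable name is illustrative):

    // Describe Apple's high-pass filter effect unit
    AudioComponentDescription hpfDescription = {};
    hpfDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    hpfDescription.componentType         = kAudioUnitType_Effect;
    hpfDescription.componentSubType      = kAudioUnitSubType_HighPassFilter;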

For this example we will be using the following two Audio Units:

  • Sampler: This unit converts MIDI messages into sounds defined in a Sound Font or AUPreset file, and is available from iOS 5
  • RemoteIO: This unit allows us to output sound to the iPhone speakers (see the platform note below)
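
Note: kAudioUnitSubType_RemoteIO exists only on iOS. The introduction mentions OS X as well; if you're building there, a minimal sketch of the usual substitution (assuming the stock default output unit) is:

    cd.componentType = kAudioUnitType_Output;
    #if TARGET_OS_IPHONE
    cd.componentSubType = kAudioUnitSubType_RemoteIO;      // iOS hardware output
    #else
    cd.componentSubType = kAudioUnitSubType_DefaultOutput; // OS X default output unit
    #endif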

So here's the code - adapted from an example provided by Apple but with extra comments.

    - (BOOL) createAUGraph {

        // Each Core Audio call returns an OSStatus, so we
        // can see if there have been any errors in the setup
        OSStatus result = noErr;

        // Create 2 audio units: one sampler and one IO
        AUNode samplerNode, ioNode;

        // Specify the common portion of an audio unit's identity,
        // used for both audio units in the graph.
        // Set up the manufacturer - in this case Apple
        AudioComponentDescription cd = {};
        cd.componentManufacturer = kAudioUnitManufacturer_Apple;

        // Instantiate an audio processing graph
        result = NewAUGraph (&_processingGraph);
        NSCAssert (result == noErr, @"Unable to create an AUGraph object. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Specify the Sampler unit, to be used as the first node of the graph
        cd.componentType = kAudioUnitType_MusicDevice; // type - music device
        cd.componentSubType = kAudioUnitSubType_Sampler; // sub-type - sampler to convert our MIDI

        // Add the Sampler unit node to the graph
        result = AUGraphAddNode (self.processingGraph, &cd, &samplerNode);
        NSCAssert (result == noErr, @"Unable to add the Sampler unit to the audio processing graph. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Specify the Output unit, to be used as the second and final node of the graph
        cd.componentType = kAudioUnitType_Output; // Output
        cd.componentSubType = kAudioUnitSubType_RemoteIO; // Output to speakers

        // Add the Output unit node to the graph
        result = AUGraphAddNode (self.processingGraph, &cd, &ioNode);
        NSCAssert (result == noErr, @"Unable to add the Output unit to the audio processing graph. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Open the graph
        result = AUGraphOpen (self.processingGraph);
        NSCAssert (result == noErr, @"Unable to open the audio processing graph. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Connect the Sampler unit to the output unit
        result = AUGraphConnectNodeInput (self.processingGraph, samplerNode, 0, ioNode, 0);
        NSCAssert (result == noErr, @"Unable to interconnect the nodes in the audio processing graph. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Obtain a reference to the Sampler unit from its node
        result = AUGraphNodeInfo (self.processingGraph, samplerNode, 0, &_samplerUnit);
        NSCAssert (result == noErr, @"Unable to obtain a reference to the Sampler unit. Error code: %d '%.4s'", (int) result, (const char *)&result);

        // Obtain a reference to the I/O unit from its node
        result = AUGraphNodeInfo (self.processingGraph, ioNode, 0, &_ioUnit);
        NSCAssert (result == noErr, @"Unable to obtain a reference to the I/O unit. Error code: %d '%.4s'", (int) result, (const char *)&result);

        return YES;
    }

Next we need to create a function to start the AUGraph running. This is equivalent to turning on the physical devices.

    // Starting with the instantiated audio processing graph, configure its
    // audio units, initialize it, and start it.
    - (void) configureAndStartAudioProcessingGraph: (AUGraph) graph {

        OSStatus result = noErr;
        if (graph) {

            // Initialize the audio processing graph
            result = AUGraphInitialize (graph);
            NSAssert (result == noErr, @"Unable to initialize AUGraph object. Error code: %d '%.4s'", (int) result, (const char *)&result);

            // Start the graph
            result = AUGraphStart (graph);
            NSAssert (result == noErr, @"Unable to start audio processing graph. Error code: %d '%.4s'", (int) result, (const char *)&result);

            // Print out the graph to the console
            CAShow (graph);
        }
    }

So, now we've created a new audio graph with a sampler and an output unit. We've connected the sampler unit to the output unit and we've started the graph. Finally we need to set up the instrument effect, connect the music sequence and play.

Set up the sound effect

    // Load a sound effect from a SoundFont file
    -(OSStatus) loadFromDLSOrSoundFont: (NSURL *)bankURL withPatch: (int)presetNumber {

        OSStatus result = noErr;

        // Fill out a bank preset data structure
        AUSamplerBankPresetData bpdata;
        bpdata.bankURL = (__bridge CFURLRef) bankURL;
        bpdata.bankMSB = kAUSampler_DefaultMelodicBankMSB;
        bpdata.bankLSB = kAUSampler_DefaultBankLSB;
        bpdata.presetID = (UInt8) presetNumber;

        // Set the kAUSamplerProperty_LoadPresetFromBank property
        result = AudioUnitSetProperty(self.samplerUnit,
                                      kAUSamplerProperty_LoadPresetFromBank,
                                      kAudioUnitScope_Global,
                                      0,
                                      &bpdata,
                                      sizeof(bpdata));

        // Check for errors
        NSCAssert (result == noErr,
                   @"Unable to set the preset property on the Sampler. Error code:%d '%.4s'",
                   (int) result,
                   (const char *)&result);

        return result;
    }

This code takes a Sound Font NSURL and a preset number as input. The NSURL should point to the Sound Font file in your Resources directory. A Sound Font can hold a number of instrument effects, so the presetNumber defines which one should be used.

Now we just repeat what we did before but with a few added lines (marked by stars).

    // Create a new music player
    MusicPlayer p;
    // Initialise the music player
    NewMusicPlayer(&p);

    // ************* Tell the music sequence to output through our new AUGraph
    // (s is the MusicSequence we loaded earlier)
    MusicSequenceSetAUGraph(s, self.processingGraph);

    // ************* Load the sound font from file
    NSURL *presetURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Gorts_Filters" ofType:@"sf2"]];

    // ************* Initialise the sound font
    [self loadFromDLSOrSoundFont:presetURL withPatch:10];

    // Load the sequence into the music player
    MusicPlayerSetSequence(p, s);
    // Called to do some MusicPlayer setup. This just
    // reduces latency when MusicPlayerStart is called
    MusicPlayerPreroll(p);
    // Starts the music playing
    MusicPlayerStart(p);

    // Get length of track so that we know how long to kill time for
    MusicTrack t;
    MusicTimeStamp len;
    UInt32 sz = sizeof(MusicTimeStamp);
    MusicSequenceGetIndTrack(s, 1, &t);
    MusicTrackGetProperty(t, kSequenceTrackProperty_TrackLength, &len, &sz);

    while (1) { // kill time until the music is over
        usleep (3 * 1000 * 1000); // sleep for three seconds between checks
        MusicTimeStamp now = 0;
        MusicPlayerGetTime (p, &now);
        if (now >= len)
            break;
    }

    // Stop the player and dispose of the objects
    MusicPlayerStop(p);
    DisposeMusicSequence(s);
    DisposeMusicPlayer(p);

By now you should be able to play a MIDI file with a Sound Font effect. The final step is to get real-time access to the messages being parsed by the MusicPlayer. To do this we need to add an extra step to our chain. Currently it looks like this:

MIDI File -> Sequence -> Sampler -> IO Unit -> Speakers

We want it to look like this:

MIDI File -> Sequence -> callback function to read messages -> Sampler -> IO Unit -> Speakers

With this system we will receive the messages in real time before passing them on to the Sampler unit. This can be achieved by creating a new MIDI endpoint. A MIDI endpoint is a destination where MIDI messages can be sent. This could be another MIDI app on your iPhone, an external MIDI instrument or, in this case, a callback function.




Creating a new MIDI endpoint

In order to capture the MIDI messages we need a destination that they can be sent to. This can be done by creating a MIDI endpoint:

    // Create a client
    // This provides general information about the state of the MIDI engine to the callback MyMIDINotifyProc
    MIDIClientRef virtualMidi;
    result = MIDIClientCreate(CFSTR("Virtual Client"),
                              MyMIDINotifyProc,
                              NULL,
                              &virtualMidi);

    NSAssert( result == noErr, @"MIDIClientCreate failed. Error code: %d '%.4s'", (int) result, (const char *)&result);

    // Create an endpoint
    // Here we pass our client, a name: Virtual Destination,
    // a callback function which will receive the MIDI packets: MyMIDIReadProc,
    // a reference to the sampler unit for use within our callback,
    // and a pointer to our endpoint: virtualEndpoint
    MIDIEndpointRef virtualEndpoint;
    result = MIDIDestinationCreate(virtualMidi, CFSTR("Virtual Destination"), MyMIDIReadProc, self.samplerUnit, &virtualEndpoint);

    NSAssert( result == noErr, @"MIDIDestinationCreate failed. Error code: %d '%.4s'", (int) result, (const char *)&result);
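
One thing worth knowing before writing the callback: CoreMIDI delivers packets to read procs on its own high-priority thread, not the main thread. Since the whole point here is MIDI driven animation, any UI work triggered by a message needs to hop back to the main thread - a minimal sketch:

    // Inside the read proc: hand the note off to the main thread
    // before touching any views or animation state
    dispatch_async(dispatch_get_main_queue(), ^{
        // update your animation here
    });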

We also need to implement the callbacks in our code. This example will log each note as it's played. (A quick primer on the bit-twiddling below: in a MIDI channel message the high nibble of the status byte holds the command - 0x9 is note-on - and the low nibble holds the channel; for a note-on message the next two bytes are the note number and the velocity.)

    // Get general MIDI notifications
    void MyMIDINotifyProc (const MIDINotification *message, void *refCon) {
        printf("MIDI Notify, messageId=%d,", message->messageID);
    }

    // Get the MIDI messages as they're sent
    static void MyMIDIReadProc(const MIDIPacketList *pktlist,
                               void *refCon,
                               void *connRefCon) {

        // Cast our Sampler unit back to an audio unit
        // (the refCon is the AudioUnit itself, which is already a pointer type)
        AudioUnit player = (AudioUnit) refCon;

        MIDIPacket *packet = (MIDIPacket *)pktlist->packet;
        for (int i=0; i < pktlist->numPackets; i++) {
            Byte midiStatus = packet->data[0];
            Byte midiCommand = midiStatus >> 4;

            // If the command is note-on
            if (midiCommand == 0x09) {
                Byte note = packet->data[1] & 0x7F;
                Byte velocity = packet->data[2] & 0x7F;

                // Log the note letter in a readable format
                static NSString *noteNames[12] = { @"C", @"C#", @"D", @"D#", @"E", @"F",
                                                   @"F#", @"G", @"G#", @"A", @"Bb", @"B" };
                int noteNumber = ((int) note) % 12;
                NSString *noteType = noteNames[noteNumber];
                NSLog(@"%@: %i", noteType, noteNumber);

                // Use MusicDeviceMIDIEvent to send our MIDI message to the sampler to be played
                OSStatus result = noErr;
                result = MusicDeviceMIDIEvent (player, midiStatus, note, velocity, 0);
            }
            packet = MIDIPacketNext(packet);
        }
    }
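
If you also want the octave (so that middle C logs as C4, for example), a small optional addition inside the note-on branch - not part of the original project - derives it from the note number, assuming the common convention that MIDI note 60 is C4:

    // Hypothetical extension: with middle C (60) as C4,
    // the octave is note/12 - 1
    int octave = ((int) note / 12) - 1;
    NSLog(@"%@%i (velocity %i)", noteType, octave, velocity);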

The final step is to modify our main function to set the MusicSequence destination to our new endpoint:

    OSStatus result = noErr;

    self.graphSampleRate = 44100.0;

    [self createAUGraph];
    [self configureAndStartAudioProcessingGraph: self.processingGraph];

    // Create a client
    MIDIClientRef virtualMidi;
    result = MIDIClientCreate(CFSTR("Virtual Client"),
                              MyMIDINotifyProc,
                              NULL,
                              &virtualMidi);

    NSAssert( result == noErr, @"MIDIClientCreate failed. Error code: %d '%.4s'", (int) result, (const char *)&result);

    // Create an endpoint
    MIDIEndpointRef virtualEndpoint;
    result = MIDIDestinationCreate(virtualMidi, CFSTR("Virtual Destination"), MyMIDIReadProc, self.samplerUnit, &virtualEndpoint);

    NSAssert( result == noErr, @"MIDIDestinationCreate failed. Error code: %d '%.4s'", (int) result, (const char *)&result);

    // Create a new music sequence
    MusicSequence s;
    // Initialise the music sequence
    NewMusicSequence(&s);

    // Get a string to the path of the MIDI file which
    // should be located in the Resources folder
    NSString *midiFilePath = [[NSBundle mainBundle]
                              pathForResource:@"simpletest"
                              ofType:@"mid"];

    // Create a new URL which points to the MIDI file
    NSURL *midiFileURL = [NSURL fileURLWithPath:midiFilePath];

    // Load the file into the sequence
    MusicSequenceFileLoad(s, (__bridge CFURLRef) midiFileURL, 0, 0);

    // Create a new music player
    MusicPlayer p;
    // Initialise the music player
    NewMusicPlayer(&p);

    // ************* Set the endpoint of the sequence to be our virtual endpoint
    MusicSequenceSetMIDIEndpoint(s, virtualEndpoint);

    // Load the sound font from file
    NSURL *presetURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Gorts_Filters" ofType:@"sf2"]];

    // Initialise the sound font
    [self loadFromDLSOrSoundFont:presetURL withPatch:10];

    // Load the sequence into the music player
    MusicPlayerSetSequence(p, s);
    // Called to do some MusicPlayer setup. This just
    // reduces latency when MusicPlayerStart is called
    MusicPlayerPreroll(p);
    // Starts the music playing
    MusicPlayerStart(p);

    // Get length of track so that we know how long to kill time for
    MusicTrack t;
    MusicTimeStamp len;
    UInt32 sz = sizeof(MusicTimeStamp);
    MusicSequenceGetIndTrack(s, 1, &t);
    MusicTrackGetProperty(t, kSequenceTrackProperty_TrackLength, &len, &sz);

    while (1) { // kill time until the music is over
        usleep (3 * 1000 * 1000); // sleep for three seconds between checks
        MusicTimeStamp now = 0;
        MusicPlayerGetTime (p, &now);
        if (now >= len)
            break;
    }

    // Stop the player and dispose of the objects
    MusicPlayerStop(p);
    DisposeMusicSequence(s);
    DisposeMusicPlayer(p);

So there you have it! Play your MIDI file through a nice reedy SoundFont while collecting the messages to drive your animation! I hope this saves you the 3 days it took me to figure it out! Here's the link again to the project files in case you missed it at the top of the guide: Project Files.




Update:

It's been pointed out to me that several resource files are missing from the project - a MIDI file called simpletest.mid and a sound font file called Gorts_Filters.SF2. These files can be downloaded here. To add them to the project you need to right-click on the Resources folder in Xcode and click "Add Files". As a side note, this code should work with any MIDI file and any Sound Font file. The only thing to watch with Sound Font files is that the preset/patch you're requesting exists (the AudioUnitSetProperty call in loadFromDLSOrSoundFont should return an error if it doesn't).

If you want to ask a general question about CoreAudio or discuss your CoreAudio issue, please ask your questions in the CoreAudio section of the forum.
