MIDI Driven Animation using CoreMIDI in Objective C
So in this post I'm going to explain how to produce MIDI driven animation on OS X or iOS using the CoreMIDI and CoreAudio frameworks. When I first started trying to do this I thought it would be easy - just register a callback with the MIDI player that gets called every time a MIDI message is played. Unfortunately this is not possible, and I ended up spending three long days figuring it out from the limited documentation available. Hopefully this post will save someone some time!
Project files
A fully working Xcode project can be downloaded here.
The Goal
In this guide I will explain how to do the following:
- Load and play a MIDI sequence from a file using a MusicPlayer
- Play the MIDI notes with an instrument effect (SoundFont) using an AUGraph
- Create a virtual endpoint to intercept and display the MIDI messages in real time
Load and play a MIDI Sequence
The following tasks are needed to load and play a MIDI file:
- Create a MusicSequence to hold the MIDI information
- Get an NSURL to hold the path to the MIDI file
- Load the sequence file into the sequence using MusicSequenceFileLoad
- Create a new MusicPlayer, add the sequence and play the sequence
Now here's the code. You will need to add the CoreAudio, CoreMIDI and AudioToolbox frameworks to your project, and import AudioToolbox/MusicPlayer.h.
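Here's a minimal sketch of those four steps, assuming the simpletest.mid file from the project resources and with error handling omitted for brevity:

```objc
#import <AudioToolbox/MusicPlayer.h>

// Create a new MusicSequence to hold the MIDI information
MusicSequence sequence;
NewMusicSequence(&sequence);

// Get an NSURL pointing to the MIDI file in the app bundle
NSURL *midiFileURL = [[NSBundle mainBundle] URLForResource:@"simpletest"
                                             withExtension:@"mid"];

// Load the file into the sequence (0, 0 = any file type, default flags)
MusicSequenceFileLoad(sequence, (__bridge CFURLRef)midiFileURL, 0, 0);

// Create a player, attach the sequence and start playback
MusicPlayer player;
NewMusicPlayer(&player);
MusicPlayerSetSequence(player, sequence);
MusicPlayerPreroll(player);
MusicPlayerStart(player);
```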
Hopefully you will have heard a rather mechanical scale followed by a chromatic scale. It's basic but at least it's a start. The next step is to create an AU graph so that we can play our MIDI file with an instrument effect.
Creating an AUGraph
When I first started reading about AUGraphs I thought they sounded horribly incomprehensible and opaque. In reality it's not too bad, just a bit fiddly to set up.
An AUGraph is a container that holds a collection of AUNodes. AUNodes are audio units supplied by Apple. Really, it's just like chaining music equipment in real life. Say you have a MIDI keyboard and you want to output the sound as a trumpet with an echo effect. You would need to plug your keyboard into a box that translates MIDI messages into trumpet sounds. That box would need to be plugged into an echo unit, which is plugged into the speakers.
Choosing your AUNodes
In CoreAudio you choose the type of AUNode you need using three properties (defined by enums):
- componentManufacturer: The author of the AUNode; in this case we will be using audio units from Apple - kAudioUnitManufacturer_Apple
- componentType: The unit type
- componentSubType: The sub-unit type
The unit type and sub-unit type can be found in the Apple documentation or in the header file AUComponent.h. Basically, to find the audio unit you need, it's easiest to use Google. But say I want a high pass filter: I look in the AUComponent.h header file and find kAudioUnitSubType_HighPassFilter - this is the sub type. The sub types in that header are grouped under the unit type they belong to, and kAudioUnitSubType_HighPassFilter sits in the effect group, so the type is kAudioUnitType_Effect. Now I have my manufacturer, type and sub type and I can use the audio unit.
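Putting the three values together gives an AudioComponentDescription. For example, a description for the high pass filter above would look like this (purely illustrative - it isn't used in this project):

```objc
// Description for Apple's high pass filter effect unit
AudioComponentDescription filterDesc = {0};
filterDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
filterDesc.componentType         = kAudioUnitType_Effect;
filterDesc.componentSubType      = kAudioUnitSubType_HighPassFilter;
```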
For this example we will be using the following two Audio Units:
- Sampler: This unit converts MIDI messages into sounds defined in a Sound Font or AUPreset, and is available from iOS 5
- RemoteIO: This unit allows us to output sound to the iPhone's speakers
So here's the code - adapted from an example provided by Apple but with extra comments.
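A sketch of the setup, assuming instance variables _processingGraph, _samplerUnit and _ioUnit to hold the graph and its two units:

```objc
#import <AudioToolbox/AudioToolbox.h>

- (void)createAUGraph
{
    // Describe the Sampler unit: a music device that turns MIDI
    // messages into sound using a loaded Sound Font or AUPreset
    AudioComponentDescription samplerDesc = {0};
    samplerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    samplerDesc.componentType = kAudioUnitType_MusicDevice;
    samplerDesc.componentSubType = kAudioUnitSubType_Sampler;

    // Describe the RemoteIO unit, which sends audio to the speakers
    AudioComponentDescription ioDesc = {0};
    ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    ioDesc.componentType = kAudioUnitType_Output;
    ioDesc.componentSubType = kAudioUnitSubType_RemoteIO;

    // Create the graph, add both nodes and open it
    AUNode samplerNode, ioNode;
    NewAUGraph(&_processingGraph);
    AUGraphAddNode(_processingGraph, &samplerDesc, &samplerNode);
    AUGraphAddNode(_processingGraph, &ioDesc, &ioNode);
    AUGraphOpen(_processingGraph);

    // Plug the sampler's output into the IO unit's input
    AUGraphConnectNodeInput(_processingGraph, samplerNode, 0, ioNode, 0);

    // Keep references to the underlying audio units for later use
    AUGraphNodeInfo(_processingGraph, samplerNode, NULL, &_samplerUnit);
    AUGraphNodeInfo(_processingGraph, ioNode, NULL, &_ioUnit);
}
```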
Next we need to create a function to start the AUGraph running. This is equivalent to turning on the physical devices.
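A sketch, again assuming the _processingGraph variable from above:

```objc
- (void)startGraph
{
    // Initialize the graph if it hasn't been initialized yet
    Boolean isInitialized = false;
    AUGraphIsInitialized(_processingGraph, &isInitialized);
    if (!isInitialized) {
        AUGraphInitialize(_processingGraph);
    }

    // Start the graph - the equivalent of flipping the power switch
    Boolean isRunning = false;
    AUGraphIsRunning(_processingGraph, &isRunning);
    if (!isRunning) {
        AUGraphStart(_processingGraph);
    }
}
```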
So, now we've created a new audio graph with a sampler and an output unit. We've connected the sampler unit to the output unit and we've started the graph. Finally we need to set up the instrument effect, connect the music sequence and play.
Set up the sound effect
This code takes a Sound Font NSURL and a preset number as input. The NSURL should point to the Sound Font file in your Resources directory. Sound Fonts can hold a number of instrument effects, so the presetNumber defines which one should be used.
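A sketch of the function, loading the preset into the sampler unit via the kAUSamplerProperty_LoadPresetFromBank property (the method name here is my own):

```objc
- (void)loadSoundFont:(NSURL *)bankURL withPatch:(int)presetNumber
{
    // Describe which preset to pull out of the Sound Font bank
    AUSamplerBankPresetData presetData = {0};
    presetData.bankURL  = (__bridge CFURLRef)bankURL;
    presetData.bankMSB  = kAUSampler_DefaultMelodicBankMSB;
    presetData.bankLSB  = kAUSampler_DefaultBankLSB;
    presetData.presetID = (UInt8)presetNumber;

    // Hand the preset to the sampler unit created in createAUGraph
    AudioUnitSetProperty(_samplerUnit,
                         kAUSamplerProperty_LoadPresetFromBank,
                         kAudioUnitScope_Global,
                         0,
                         &presetData,
                         sizeof(presetData));
}
```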
Now we just repeat what we did before but with a few added lines (marked by stars).
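A sketch of the updated playback code, assuming the helper methods above and the Gorts_Filters.SF2 Sound Font from the project resources (preset 0 is an arbitrary choice):

```objc
MusicSequence sequence;
NewMusicSequence(&sequence);

NSURL *midiFileURL = [[NSBundle mainBundle] URLForResource:@"simpletest"
                                             withExtension:@"mid"];
MusicSequenceFileLoad(sequence, (__bridge CFURLRef)midiFileURL, 0, 0);

// * Build and start the AUGraph, then load the instrument effect
[self createAUGraph];                                                // *
[self startGraph];                                                   // *
NSURL *soundFontURL = [[NSBundle mainBundle] URLForResource:@"Gorts_Filters"
                                             withExtension:@"SF2"];  // *
[self loadSoundFont:soundFontURL withPatch:0];                       // *

MusicPlayer player;
NewMusicPlayer(&player);
MusicPlayerSetSequence(player, sequence);

// * Route the sequence's MIDI output through our graph
MusicSequenceSetAUGraph(sequence, _processingGraph);                 // *

MusicPlayerPreroll(player);
MusicPlayerStart(player);
```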
From the sample project you should understand how to play a MIDI file with a Sound Font effect. The final step is to get real time access to the messages being parsed by the MusicPlayer. To do this we need to add an extra step to our chain. Currently it looks like this:
MIDI File -> Sequence -> Sampler -> IO Unit -> Speakers
We want it to look like this:
MIDI File -> Sequence -> callback function to read messages -> Sampler -> IO Unit -> Speakers
With this system we will receive the messages in real time before passing them on to the Sampler unit. This can be achieved by creating a new MIDI endpoint. A MIDI endpoint is a destination where MIDI messages can be sent. This could be another MIDI app on your iPhone, an external MIDI instrument or, in this case, a callback function.
Creating a new MIDI endpoint
In order to capture the MIDI messages we need a destination that they can be sent to. This can be done by creating a MIDI endpoint:
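A sketch, assuming a C callback named MyMIDIReadProc (implemented below); the sampler unit is passed in as the refCon so the callback can forward messages to it:

```objc
#import <CoreMIDI/CoreMIDI.h>

// Create a MIDI client and a virtual destination attached to it
MIDIClientRef virtualMidi;
MIDIClientCreate(CFSTR("Virtual Client"), NULL, NULL, &virtualMidi);

MIDIEndpointRef virtualEndpoint;
MIDIDestinationCreate(virtualMidi, CFSTR("Virtual Destination"),
                      MyMIDIReadProc, _samplerUnit, &virtualEndpoint);
```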
We also need to implement the callbacks in our code. This example will log each note as it's played:
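A sketch of the callback: it logs note-on messages and forwards every message to the sampler unit so playback is unaffected:

```objc
#import <CoreMIDI/CoreMIDI.h>
#import <AudioToolbox/AudioToolbox.h>

static void MyMIDIReadProc(const MIDIPacketList *pktlist,
                           void *refCon, void *connRefCon)
{
    // The sampler unit was passed in as the refCon above
    AudioUnit samplerUnit = (AudioUnit)refCon;

    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; i++) {
        Byte status   = packet->data[0];
        Byte note     = packet->data[1];
        Byte velocity = packet->data[2];

        // 0x9n is a "note on" message on channel n
        if ((status & 0xF0) == 0x90) {
            NSLog(@"Note on: %d, velocity: %d", note, velocity);
        }

        // Pass the message on so the sampler still makes sound
        MusicDeviceMIDIEvent(samplerUnit, status, note, velocity, 0);

        packet = MIDIPacketNext(packet);
    }
}
```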
The final step is to modify our main function to set the MusicSequence destination to our new endpoint:
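A sketch of the change - a single line replacing the MusicSequenceSetAUGraph call from before:

```objc
// Send the sequence's MIDI messages to our virtual endpoint instead
// of straight into the graph; the callback forwards them on
MusicSequenceSetMIDIEndpoint(sequence, virtualEndpoint);
```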
So there you have it! Play your MIDI file through a nice reedy SoundFont while collecting the messages to drive your animation! I hope this saves you the three days it took me to figure it out! Here's the link again to the project files in case you missed it at the top of the guide: Project Files.
Update:
It's been pointed out to me that several resource files are missing from the project - a MIDI file called simpletest.mid and a Sound Font file called Gorts_Filters.SF2. These files can be downloaded here. To add them to the project you need to right click on the Resources folder in Xcode and click "Add Files". As a side note, this code should work with any MIDI file and any Sound Font file. The only thing to watch with Sound Font files is that the preset/patch that you're requesting exists.
If you want to ask a general question about CoreAudio or discuss your CoreAudio issue please ask your questions in the CoreAudio section of the forum.