Core Audio API Architectural Layers


The programming interfaces for Core Audio are arranged into three layers, as illustrated in Figure 2-1.

Figure 2-1  The three API layers of Core Audio

The lowest layer includes:

  • The I/O Kit, which interacts with drivers

  • The audio hardware abstraction layer (audio HAL), which provides a device-independent, driver-independent interface to hardware (see the sketch after this list)

  • Core MIDI, which provides software abstractions for working with MIDI streams and devices

  • Host Time Services, which provides access to the computer’s clock
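
As a small illustration of programming at this layer, the following Swift sketch asks the audio HAL for the system's default output device. It is a minimal sketch, assuming macOS (where the HAL's AudioObject interfaces are available), with error handling reduced to a zero (noErr) status check.

    import CoreAudio

    // Ask the audio HAL (macOS) for the system's default output device.
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)   // named kAudioObjectPropertyElementMain in newer SDKs

    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    let status = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                            &address, 0, nil, &size, &deviceID)
    if status == 0 {    // 0 == noErr
        print("Default output device ID: \(deviceID)")
    }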

Applications in Mac OS X can be written to use these technologies directly when they require the highest possible real-time performance. Many audio applications, however, don't access this layer. Indeed, Core Audio in iOS provides ways to achieve real-time audio using higher-level interfaces. OpenAL, for example, employs direct I/O for real-time audio in games. The result is a significantly smaller, tuned API set appropriate for a mobile platform.
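
Host Time Services exposes the computer's host clock; a minimal sketch of reading that clock and converting its ticks to nanoseconds with the mach timebase looks like this (on macOS, Core Audio also provides helpers such as AudioConvertHostTimeToNanos for the same conversion):

    import Darwin

    // Read the host clock and convert its ticks to nanoseconds using the mach timebase.
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)

    let hostTicks = mach_absolute_time()
    let nanoseconds = hostTicks * UInt64(timebase.numer) / UInt64(timebase.denom)
    print("Host time: \(hostTicks) ticks ≈ \(nanoseconds) ns")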

The middle layer in Core Audio includes services for data format conversion, reading and writing to disk, parsing streams, and working with plug-ins.

  • Audio Converter Services lets applications work with audio data format converters.

  • Audio File Services supports reading and writing audio data to and from disk-based files (see the sketch after this list).

  • Audio Unit Services and Audio Processing Graph Services let applications work with digital signal processing (DSP) plug-ins such as equalizers and mixers.

  • Audio File Stream Services lets you build applications that can parse streams, such as for playing files streamed over a network connection.

  • Core Audio Clock Services supports audio and MIDI synchronization as well as time-base conversions.

  • Audio Format Services (a small API, not shown in the figure) assists with managing audio data formats in your application.
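
To make the middle layer concrete, here is a minimal Swift sketch that uses Audio File Services to open a sound file and read its data format. The file path is hypothetical, and error handling is reduced to checking for a zero (noErr) status.

    import Foundation
    import AudioToolbox

    // Open a file with Audio File Services and print its AudioStreamBasicDescription.
    func printDataFormat(of url: URL) {
        var fileID: AudioFileID?
        guard AudioFileOpenURL(url as CFURL, .readPermission, 0, &fileID) == 0,
              let file = fileID else { return }
        defer { AudioFileClose(file) }

        var format = AudioStreamBasicDescription()
        var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
        if AudioFileGetProperty(file, kAudioFilePropertyDataFormat, &size, &format) == 0 {
            print("Sample rate: \(format.mSampleRate) Hz, channels: \(format.mChannelsPerFrame)")
        }
    }

    printDataFormat(of: URL(fileURLWithPath: "/path/to/loop.caf"))   // hypothetical path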

The highest layer in Core Audio includes streamlined interfaces that combine features from lower layers.

  • Audio Queue Services lets you record, play, pause, loop, and synchronize audio. It employs codecs as necessary to deal with compressed audio formats.

  • The AVAudioPlayer class provides a simple Objective-C interface for playing and looping audio in iOS applications. The class handles all audio formats supported in iOS, and provides a straightforward means to implement features such as rewind and fast-forward (see the sketch after this list).

  • Extended Audio File Services combines features from Audio File Services and Audio Converter Services. It gives you a unified interface for reading and writing uncompressed and compressed sound files.

  • OpenAL is the Core Audio implementation of the open-source OpenAL standard for positional audio. It is built on top of the system-supplied 3D Mixer audio unit. All applications can use OpenAL, although it is best suited for game development.
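
As an example of the highest layer, the following Swift sketch plays and loops a sound with AVAudioPlayer. The bundled resource name is hypothetical, and the returned player must be kept alive (for example, stored in a property) for playback to continue.

    import AVFoundation

    // Play a bundled sound in a loop with AVAudioPlayer.
    func makeLoopingPlayer() throws -> AVAudioPlayer? {
        guard let url = Bundle.main.url(forResource: "loop", withExtension: "m4a") else {
            return nil                              // hypothetical resource name
        }
        let player = try AVAudioPlayer(contentsOf: url)
        player.numberOfLoops = -1                   // -1 means loop until stopped
        player.prepareToPlay()
        player.play()
        return player                               // caller keeps a strong reference
    }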
