Core Audio API Architectural Layers
The programming interfaces for Core Audio are arranged into three layers, as illustrated in Figure 2-1.
Figure 2-1 The three API layers of Core Audio
The lowest layer includes:
- The I/O Kit, which interacts with drivers
- The audio hardware abstraction layer (audio HAL), which provides a device-independent, driver-independent interface to hardware
- Core MIDI, which provides software abstractions for working with MIDI streams and devices
- Host Time Services, which provides access to the computer’s clock
Applications in Mac OS X can be written to use these technologies directly when they require the highest possible, real-time performance. Many audio applications, however, don’t access this layer. Indeed, Core Audio in iOS provides ways to achieve real-time audio using higher level interfaces. OpenAL, for example, employs direct I/O for real-time audio in games. The result is a significantly smaller, tuned API set appropriate for a mobile platform.
The middle layer in Core Audio includes services for data format conversion, reading and writing to disk, parsing streams, and working with plug-ins.
Audio Converter Services lets applications work with audio data format converters.
Audio File Services supports reading and writing audio data to and from disk-based files.
Audio Unit Services and Audio Processing Graph Services let applications work with digital signal processing (DSP) plug-ins such as equalizers and mixers.
Audio File Stream Services lets you build applications that can parse streams, such as for playing files streamed over a network connection.
Core Audio Clock Services supports audio and MIDI synchronization as well as time-base conversions.
Audio Format Services (a small API, not shown in the figure) assists with managing audio data formats in your application.
The highest layer in Core Audio includes streamlined interfaces that combine features from lower layers.
Audio Queue Services lets you record, play, pause, loop, and synchronize audio. It employs codecs as necessary to deal with compressed audio formats.
The AVAudioPlayer class provides a simple Objective-C interface for playing and looping audio in iOS applications. The class handles all audio formats supported in iOS, and provides a straightforward means to implement features such as rewind and fast-forward.
Extended Audio File Services combines features from Audio File Services and Audio Converter Services. It gives you a unified interface for reading and writing uncompressed and compressed sound files.
OpenAL is the Core Audio implementation of the open-source OpenAL standard for positional audio. It is built on top of the system-supplied 3D Mixer audio unit. All applications can use OpenAL, although it is best suited for games development.