The Android Audio/Video Media Framework

The Android multimedia framework includes support for encoding and decoding a variety of common media types, so that you can easily integrate audio, video, and images into your applications. You can play audio or video from media files stored in your application's resources (raw resources), from standalone files in the filesystem, or from a data stream arriving over a network connection, all using the MediaPlayer APIs.

You can also record audio and video using the MediaRecorder APIs if supported by the device hardware. Note that the emulator doesn't have hardware to capture audio or video, but actual mobile devices are likely to provide these capabilities.

This document shows you how to write a media-playing application that interacts with the user and the system in order to obtain good performance and a pleasant user experience.

Note: You can play back the audio data only to the standard output device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound files in the conversation audio during a call.

Using MediaPlayer

One of the most important components of the media framework is the MediaPlayer class. An object of this class can fetch, decode, and play both audio and video with minimal setup. It supports several different media sources, such as:

  • Local resources
  • Internal URIs, such as one you might obtain from a Content Resolver
  • External URLs (streaming)

For a list of media formats that Android supports, see the Android Supported Media Formats document.

Here is an example of how to play audio that's available as a local raw resource (saved in your application's res/raw/ directory):

MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1);
mediaPlayer.start(); // no need to call prepare(); create() does that for you

In this case, a "raw" resource is a file that the system does not try to parse in any particular way. However, the content of this resource should not be raw audio. It should be a properly encoded and formatted media file in one of the supported formats.

And here is how you might play from a URI available locally in the system (that you obtained through a Content Resolver, for instance):

Uri myUri = ....; // initialize Uri here
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(getApplicationContext(), myUri);
mediaPlayer.prepare();
mediaPlayer.start();

Playing from a remote URL via HTTP streaming looks like this:

String url = "http://........"; // your URL here
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(url);
mediaPlayer.prepare(); // might take long! (for buffering, etc)
mediaPlayer.start();

Note: If you're passing a URL to stream an online media file, the file must be capable of progressive download.

Caution: You must either catch or propagate IllegalArgumentException and IOException when using setDataSource(), because the file you are referencing might not exist.
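
For example, a minimal sketch of handling these exceptions might look like the following (path is a hypothetical placeholder for your media location):

MediaPlayer mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(path); // may throw if the path is invalid
    mediaPlayer.prepare();           // may throw if the file can't be read
    mediaPlayer.start();
} catch (IllegalArgumentException e) {
    // the data source was malformed; inform the user instead of crashing
} catch (IOException e) {
    // the file may not exist or may not be a supported media format
}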

Asynchronous Preparation

Using MediaPlayer can be straightforward in principle. However, it's important to keep in mind that a few more things are necessary to integrate it correctly with a typical Android application. For example, the call to prepare() can take a long time to execute, because it might involve fetching and decoding media data. So, as is the case with any method that may take a long time to execute, you should never call it from your application's UI thread. Doing that will cause the UI to hang until the method returns, which is a very bad user experience and can cause an ANR (Application Not Responding) error. Even if you expect your resource to load quickly, remember that anything that takes more than a tenth of a second to respond in the UI will cause a noticeable pause and will give the user the impression that your application is slow.

To avoid hanging your UI thread, spawn another thread to prepare the MediaPlayer and notify the main thread when done. However, while you could write the threading logic yourself, this pattern is so common when using MediaPlayer that the framework supplies a convenient way to accomplish this task: the prepareAsync() method. This method starts preparing the media in the background and returns immediately. When the media is done preparing, the onPrepared() method of the MediaPlayer.OnPreparedListener, configured through setOnPreparedListener(), is called.
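
For instance, a minimal sketch of this pattern (outside of a service; url is a hypothetical placeholder) might look like this:

MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // called on the main thread once preparation finishes
    }
});
try {
    mediaPlayer.setDataSource(url);
    mediaPlayer.prepareAsync(); // returns immediately; prepares in the background
} catch (IOException e) {
    // the media source may be invalid; handle the error
}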

Managing State

Another aspect of a MediaPlayer that you should keep in mind is that it's state-based. That is, the MediaPlayer has an internal state that you must always be aware of when writing your code, because certain operations are only valid when the player is in specific states. If you perform an operation while in the wrong state, the system may throw an exception or cause other undesirable behaviors.

The documentation for the MediaPlayer class shows a complete state diagram that clarifies which methods move the MediaPlayer from one state to another. For example, when you create a new MediaPlayer, it is in the Idle state. At that point, you should initialize it by calling setDataSource(), bringing it to the Initialized state. After that, you have to prepare it using either the prepare() or prepareAsync() method. When the MediaPlayer is done preparing, it will then enter the Prepared state, which means you can call start() to make it play the media. At that point, as the diagram illustrates, you can move between the Started, Paused and PlaybackCompleted states by calling such methods as start(), pause(), and seekTo(), amongst others. When you call stop(), however, notice that you cannot call start() again until you prepare the MediaPlayer again.

Always keep the state diagram in mind when writing code that interacts with aMediaPlayer object, because calling its methods from the wrong state is acommon cause of bugs.
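
As a rough sketch of one path through the diagram (the comments note the state after each call; the full diagram in the MediaPlayer reference is authoritative):

mediaPlayer.start();   // Started
mediaPlayer.pause();   // Paused; start() is valid again
mediaPlayer.start();   // Started
mediaPlayer.stop();    // Stopped; calling start() here would be an error
mediaPlayer.prepare(); // Prepared again (or use prepareAsync())
mediaPlayer.start();   // Started; playback resumes from the beginning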

Releasing the MediaPlayer

A MediaPlayer can consume valuable system resources. Therefore, you should always take extra precautions to make sure you are not hanging on to a MediaPlayer instance longer than necessary. When you are done with it, you should always call release() to make sure any system resources allocated to it are properly released. For example, if you are using a MediaPlayer and your activity receives a call to onStop(), you must release the MediaPlayer, because it makes little sense to hold on to it while your activity is not interacting with the user (unless you are playing media in the background, which is discussed in the next section). When your activity is resumed or restarted, of course, you need to create a new MediaPlayer and prepare it again before resuming playback.

Here's how you should release and then nullify your MediaPlayer:

mediaPlayer.release();
mediaPlayer = null;

As an example, consider the problems that could happen if you forget to release the MediaPlayer when your activity is stopped, but create a new one when the activity starts again. As you may know, when the user changes the screen orientation (or changes the device configuration in another way), the system handles that by restarting the activity (by default), so you might quickly consume all of the system resources as the user rotates the device back and forth between portrait and landscape, because at each orientation change, you create a new MediaPlayer that you never release. (For more information about runtime restarts, see Handling Runtime Changes.)
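
As a sketch of this lifecycle handling in an activity (assuming a mediaPlayer field and that playback should happen only while the activity is visible):

@Override
protected void onStop() {
    super.onStop();
    if (mediaPlayer != null) {
        mediaPlayer.release(); // free the decoder and other system resources
        mediaPlayer = null;
    }
}

@Override
protected void onStart() {
    super.onStart();
    // Create and prepare a fresh MediaPlayer before resuming playback,
    // for example with MediaPlayer.create() or setDataSource()/prepareAsync().
}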

You may be wondering what happens if you want to continue playing "background media" even when the user leaves your activity, much in the same way that the built-in Music application behaves. In this case, what you need is a MediaPlayer controlled by a Service, as discussed in Using a Service with MediaPlayer.

Using a Service with MediaPlayer

If you want your media to play in the background even when your application is not onscreen—that is, you want it to continue playing while the user is interacting with other applications—then you must start a Service and control the MediaPlayer instance from there. You should be careful about this setup, because the user and the system have expectations about how an application running a background service should interact with the rest of the system. If your application does not fulfill those expectations, the user may have a bad experience. This section describes the main issues that you should be aware of and offers suggestions about how to approach them.
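
For example, an activity might hand playback off to such a service with an explicit intent (a sketch; MyService and the action string match the service example further below):

Intent intent = new Intent(this, MyService.class);
intent.setAction("com.example.action.PLAY"); // matches the service's ACTION_PLAY
startService(intent); // the service creates and prepares the MediaPlayer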

Running asynchronously

First of all, like an Activity, all work in a Service is done in a single thread by default—in fact, if you're running an activity and a service from the same application, they use the same thread (the "main thread") by default. Therefore, services need to process incoming intents quickly and never perform lengthy computations when responding to them. If any heavy work or blocking calls are expected, you must do those tasks asynchronously: either from another thread you implement yourself, or using the framework's many facilities for asynchronous processing.

For instance, when using a MediaPlayer from your main thread, you should call prepareAsync() rather than prepare(), and implement a MediaPlayer.OnPreparedListener in order to be notified when the preparation is complete and you can start playing. For example:

public class MyService extends Service implements MediaPlayer.OnPreparedListener {
    private static final String ACTION_PLAY = "com.example.action.PLAY";
    MediaPlayer mMediaPlayer = null;

    public int onStartCommand(Intent intent, int flags, int startId) {
        ...
        if (intent.getAction().equals(ACTION_PLAY)) {
            mMediaPlayer = ... // initialize it here
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.prepareAsync(); // prepare async to not block main thread
        }
    }

    /** Called when MediaPlayer is ready */
    public void onPrepared(MediaPlayer player) {
        player.start();
    }
}

Handling asynchronous errors

With synchronous operations, errors would normally be signaled with an exception or an error code, but whenever you use asynchronous resources, you should make sure your application is notified of errors appropriately. In the case of a MediaPlayer, you can accomplish this by implementing a MediaPlayer.OnErrorListener and setting it in your MediaPlayer instance:

public class MyService extends Service implements MediaPlayer.OnErrorListener {
    MediaPlayer mMediaPlayer;

    public void initMediaPlayer() {
        // ...initialize the MediaPlayer here...
        mMediaPlayer.setOnErrorListener(this);
    }

    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // ... react appropriately ...
        // The MediaPlayer has moved to the Error state, must be reset!
        return true; // true indicates the error was handled
    }
}

It's important to remember that when an error occurs, the MediaPlayer moves to the Error state (see the documentation for the MediaPlayer class for the full state diagram) and you must reset it before you can use it again.

Using wake locks

When designing applications that play media in the background, keep in mind that the device may go to sleep while your service is running. Because the Android system tries to conserve battery while the device is sleeping, it tries to shut off any of the phone's features that are not necessary, including the CPU and the WiFi hardware. However, if your service is playing or streaming music, you want to prevent the system from interfering with your playback.

In order to ensure that your service continues to run under those conditions, you have to use "wake locks." A wake lock is a way to signal to the system that your application is using some feature that should stay available even if the phone is idle.

Note: You should always use wake locks sparingly and hold them only for as long as truly necessary, because they significantly reduce the battery life of the device.

To ensure that the CPU continues running while your MediaPlayer is playing, call the setWakeMode() method when initializing your MediaPlayer. Once you do, the MediaPlayer holds the specified lock while playing and releases the lock when paused or stopped:

mMediaPlayer = new MediaPlayer();
// ... other initialization here ...
mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);

However, the wake lock acquired in this example guarantees only that the CPU remains awake. If you are streaming media over the network and you are using Wi-Fi, you probably want to hold a WifiLock as well, which you must acquire and release manually. So, when you start preparing the MediaPlayer with the remote URL, you should create and acquire the Wi-Fi lock. For example:

WifiLock wifiLock = ((WifiManager) getSystemService(Context.WIFI_SERVICE))
    .createWifiLock(WifiManager.WIFI_MODE_FULL, "mylock");
wifiLock.acquire();

When you pause or stop your media, or when you no longer need the network, you should release the lock:

wifiLock.release();

Running as a foreground service

Services are often used for performing background tasks, such as fetching emails, synchronizing data, downloading content, amongst other possibilities. In these cases, the user is not actively aware of the service's execution, and probably wouldn't even notice if some of these services were interrupted and later restarted.

But consider the case of a service that is playing music. Clearly this is a service that the user is actively aware of, and the experience would be severely affected by any interruptions. Additionally, it's a service that the user will likely wish to interact with during its execution. In this case, the service should run as a "foreground service." A foreground service holds a higher level of importance within the system—the system will almost never kill the service, because it is of immediate importance to the user. When running in the foreground, the service also must provide a status bar notification to ensure that users are aware of the running service and allow them to open an activity that can interact with the service.

In order to turn your service into a foreground service, you must create a Notification for the status bar and call startForeground() from the Service. For example:

String songName;
// assign the song name to songName
PendingIntent pi = PendingIntent.getActivity(getApplicationContext(), 0,
                new Intent(getApplicationContext(), MainActivity.class),
                PendingIntent.FLAG_UPDATE_CURRENT);
Notification notification = new Notification();
notification.tickerText = songName;
notification.icon = R.drawable.play0;
notification.flags |= Notification.FLAG_ONGOING_EVENT;
notification.setLatestEventInfo(getApplicationContext(), "MusicPlayerSample",
                "Playing: " + songName, pi);
startForeground(NOTIFICATION_ID, notification);

While your service is running in the foreground, the notification you configured is visible in the notification area of the device. If the user selects the notification, the system invokes the PendingIntent you supplied. In the example above, it opens an activity (MainActivity).


You should only hold on to the "foreground service" status while your service is actually performing something the user is actively aware of. Once that is no longer true, you should release it by calling stopForeground():

stopForeground(true);

For more information, see the documentation about Services and Status Bar Notifications.

Handling audio focus

Even though only one activity can run at any given time, Android is a multi-tasking environment. This poses a particular challenge to applications that use audio, because there is only one audio output and there may be several media services competing for its use. Before Android 2.2, there was no built-in mechanism to address this issue, which could in some cases lead to a bad user experience. For example, when a user is listening to music and another application needs to notify the user of something very important, the user might not hear the notification tone due to the loud music. Starting with Android 2.2, the platform offers a way for applications to negotiate their use of the device's audio output. This mechanism is called Audio Focus.

When your application needs to output audio such as music or a notification, you should always request audio focus. Once it has focus, it can use the sound output freely, but it should always listen for focus changes. If it is notified that it has lost audio focus, it should immediately either kill the audio or lower it to a quiet level (known as "ducking"—there is a flag that indicates which one is appropriate) and only resume loud playback after it receives focus again.

Audio Focus is cooperative in nature. That is, applications are expected (and highly encouraged) to comply with the audio focus guidelines, but the rules are not enforced by the system. If an application wants to play loud music even after losing audio focus, nothing in the system will prevent that. However, the user is more likely to have a bad experience and will be more likely to uninstall the misbehaving application.

To request audio focus, you must call requestAudioFocus() from the AudioManager, as the example below demonstrates:

AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
int result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
    AudioManager.AUDIOFOCUS_GAIN);

if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    // could not get audio focus.
}

The first parameter to requestAudioFocus() is an AudioManager.OnAudioFocusChangeListener, whose onAudioFocusChange() method is called whenever there is a change in audio focus. Therefore, you should also implement this interface on your service and activities. For example:

class MyService extends Service
                implements AudioManager.OnAudioFocusChangeListener {
    // ....
    public void onAudioFocusChange(int focusChange) {
        // Do something based on focus change...
    }
}

The focusChange parameter tells you how the audio focus has changed, and can be one of the following values (they are all constants defined in AudioManager):

  • AUDIOFOCUS_GAIN: You have gained the audio focus.
  • AUDIOFOCUS_LOSS: You have lost the audio focus for a presumably long time. You must stop all audio playback. Because you should not expect to have focus back for a long time, this would be a good place to clean up your resources as much as possible. For example, you should release the MediaPlayer.
  • AUDIOFOCUS_LOSS_TRANSIENT: You have temporarily lost audio focus, but should receive it back shortly. You must stop all audio playback, but you can keep your resources because you will probably get focus back shortly.
  • AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: You have temporarily lost audio focus, but you are allowed to continue to play audio quietly (at a low volume) instead of killing audio completely.

Here is an example implementation:

public void onAudioFocusChange(int focusChange) {
    switch (focusChange) {
        case AudioManager.AUDIOFOCUS_GAIN:
            // resume playback
            if (mMediaPlayer == null) initMediaPlayer();
            else if (!mMediaPlayer.isPlaying()) mMediaPlayer.start();
            mMediaPlayer.setVolume(1.0f, 1.0f);
            break;

        case AudioManager.AUDIOFOCUS_LOSS:
            // Lost focus for an unbounded amount of time: stop playback and release media player
            if (mMediaPlayer.isPlaying()) mMediaPlayer.stop();
            mMediaPlayer.release();
            mMediaPlayer = null;
            break;

        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
            // Lost focus for a short time, but we have to stop
            // playback. We don't release the media player because playback
            // is likely to resume
            if (mMediaPlayer.isPlaying()) mMediaPlayer.pause();
            break;

        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
            // Lost focus for a short time, but it's ok to keep playing
            // at an attenuated level
            if (mMediaPlayer.isPlaying()) mMediaPlayer.setVolume(0.1f, 0.1f);
            break;
    }
}

Keep in mind that the audio focus APIs are available only with API level 8 (Android 2.2) and above, so if you want to support previous versions of Android, you should adopt a backward compatibility strategy that allows you to use this feature if available, and fall back seamlessly if not.

You can achieve backward compatibility either by calling the audio focus methods by reflection or by implementing all the audio focus features in a separate class (say, AudioFocusHelper). Here is an example of such a class:

public class AudioFocusHelper implements AudioManager.OnAudioFocusChangeListener {
    AudioManager mAudioManager;
    Context mContext;

    // other fields here, you'll probably hold a reference to an interface
    // that you can use to communicate the focus changes to your Service

    public AudioFocusHelper(Context ctx /* , other arguments here */) {
        mContext = ctx;
        mAudioManager = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
        // ...
    }

    public boolean requestFocus() {
        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
            mAudioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
            AudioManager.AUDIOFOCUS_GAIN);
    }

    public boolean abandonFocus() {
        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
            mAudioManager.abandonAudioFocus(this);
    }

    @Override
    public void onAudioFocusChange(int focusChange) {
        // let your service know about the focus change
    }
}

You can create an instance of the AudioFocusHelper class only if you detect that the system is running API level 8 or above. For example:

if (android.os.Build.VERSION.SDK_INT >= 8) {
    mAudioFocusHelper = new AudioFocusHelper(getApplicationContext(), this);
} else {
    mAudioFocusHelper = null;
}

Performing cleanup

As mentioned earlier, a MediaPlayer object can consume a significant amount of system resources, so you should keep it only for as long as you need it and call release() when you are done. It's important to call this cleanup method explicitly rather than rely on system garbage collection, because it might take some time before the garbage collector reclaims the MediaPlayer, as it's only sensitive to memory needs and not to the shortage of other media-related resources. So, in the case when you're using a service, you should always override the onDestroy() method to make sure you are releasing the MediaPlayer:

public class MyService extends Service {
    MediaPlayer mMediaPlayer;
    // ...

    @Override
    public void onDestroy() {
        if (mMediaPlayer != null) mMediaPlayer.release();
    }
}

You should always look for other opportunities to release your MediaPlayer as well, apart from releasing it when being shut down. For example, if you expect not to be able to play media for an extended period of time (after losing audio focus, for example), you should definitely release your existing MediaPlayer and create it again later. On the other hand, if you only expect to stop playback for a very short time, you should probably hold on to your MediaPlayer to avoid the overhead of creating and preparing it again.

Handling the AUDIO_BECOMING_NOISY Intent

Many well-written applications that play audio automatically stop playback when an event occurs that causes the audio to become noisy (output through external speakers). For instance, this might happen when a user is listening to music through headphones and accidentally disconnects the headphones from the device. However, this behavior does not happen automatically. If you don't implement this feature, audio plays out of the device's external speakers, which might not be what the user wants.

You can ensure your app stops playing music in these situations by handling the ACTION_AUDIO_BECOMING_NOISY intent, for which you can register a receiver by adding the following to your manifest:

<receiver android:name=".MusicIntentReceiver">
   <intent-filter>
      <action android:name="android.media.AUDIO_BECOMING_NOISY" />
   </intent-filter>
</receiver>

This registers the MusicIntentReceiver class as a broadcast receiver for that intent. You should then implement this class:

public class MusicIntentReceiver extends android.content.BroadcastReceiver {
   @Override
   public void onReceive(Context ctx, Intent intent) {
      if (intent.getAction().equals(
                    android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) {
          // signal your service to stop playback
          // (via an Intent, for instance)
      }
   }
}
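
If you prefer the receiver to be active only while media is actually playing, you can register it at runtime from your service instead of in the manifest. A minimal sketch:

// when playback starts:
IntentFilter filter = new IntentFilter(AudioManager.ACTION_AUDIO_BECOMING_NOISY);
MusicIntentReceiver receiver = new MusicIntentReceiver();
registerReceiver(receiver, filter);

// when playback stops:
unregisterReceiver(receiver);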

Retrieving Media from a Content Resolver

Another feature that may be useful in a media player application is the ability to retrieve music that the user has on the device. You can do that by querying the ContentResolver for external media:

ContentResolver contentResolver = getContentResolver();
Uri uri = android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
Cursor cursor = contentResolver.query(uri, null, null, null, null);
if (cursor == null) {
    // query failed, handle error.
} else if (!cursor.moveToFirst()) {
    // no media on the device
} else {
    int titleColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media.TITLE);
    int idColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media._ID);
    do {
        long thisId = cursor.getLong(idColumn);
        String thisTitle = cursor.getString(titleColumn);
        // ...process entry...
    } while (cursor.moveToNext());
}

To use this with the MediaPlayer, you can do this:

long id = /* retrieve it from somewhere */;
Uri contentUri = ContentUris.withAppendedId(
        android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id);

mMediaPlayer = new MediaPlayer();
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setDataSource(getApplicationContext(), contentUri);
// ...prepare and start...

Playing JET content

The Android platform includes a JET engine that lets you add interactive playback of JET audio content in your applications. You can create JET content for interactive playback using the JetCreator authoring application that ships with the SDK. To play and manage JET content from your application, use the JetPlayer class.

For a description of JET concepts and instructions on how to use the JetCreator authoring tool, see the JetCreator User Manual. The tool is available on Windows, OS X, and Linux platforms (although, unlike the Windows and OS X versions, the Linux version does not support auditioning of imported assets).

Here's an example of how to set up JET playback from a .jet file stored on the SD card:

JetPlayer jetPlayer = JetPlayer.getJetPlayer();
jetPlayer.loadJetFile("/sdcard/level1.jet");
byte segmentId = 0;

// queue segment 5, repeat once, use General MIDI, transpose by -1 octave
jetPlayer.queueJetSegment(5, -1, 1, -1, 0, segmentId++);
// queue segment 2
jetPlayer.queueJetSegment(2, -1, 0, 0, 0, segmentId++);
jetPlayer.play();

The SDK includes an example application — JetBoy — that shows how to use JetPlayer to create an interactive music soundtrack in your game. It also illustrates how to use JET events to synchronize music and game logic. The application is located at <sdk>/platforms/android-1.5/samples/JetBoy.

Performing Audio Capture

Audio capture from the device is a bit more complicated than audio and video playback, but still fairly simple (a minimal sketch follows the list):

  1. Create a new instance of android.media.MediaRecorder.
  2. Set the audio source using MediaRecorder.setAudioSource(). You will probably want to use MediaRecorder.AudioSource.MIC.
  3. Set output file format using MediaRecorder.setOutputFormat().
  4. Set output file name using MediaRecorder.setOutputFile().
  5. Set the audio encoder using MediaRecorder.setAudioEncoder().
  6. Call MediaRecorder.prepare() on the MediaRecorder instance.
  7. To start audio capture, call MediaRecorder.start().
  8. To stop audio capture, call MediaRecorder.stop().
  9. When you are done with the MediaRecorder instance, call MediaRecorder.release() on it. Calling MediaRecorder.release() is always recommended to free the resource immediately.
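
Put together, those steps look roughly like this (a sketch; outputPath is a hypothetical file path, and error handling is omitted for brevity):

MediaRecorder recorder = new MediaRecorder();                    // step 1
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);          // step 2
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);  // step 3
recorder.setOutputFile(outputPath);                              // step 4
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);     // step 5
recorder.prepare();                                              // step 6 (throws IOException)
recorder.start();                                                // step 7
// ... record for a while ...
recorder.stop();                                                 // step 8
recorder.release();                                              // step 9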

Example: Record audio and play the recorded audio

The example class below illustrates how to set up, start, and stop audio capture, and how to play back the recorded audio file.

/*
 * The application needs to have the permission to write to external storage
 * if the output file is written to the external storage, and also the
 * permission to record audio. These permissions must be set in the
 * application's AndroidManifest.xml file, with something like:
 *
 * <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
 * <uses-permission android:name="android.permission.RECORD_AUDIO" />
 */
package com.android.audiorecordtest;

import android.app.Activity;
import android.widget.LinearLayout;
import android.os.Bundle;
import android.os.Environment;
import android.view.ViewGroup;
import android.widget.Button;
import android.view.View;
import android.view.View.OnClickListener;
import android.content.Context;
import android.util.Log;
import android.media.MediaRecorder;
import android.media.MediaPlayer;

import java.io.IOException;

public class AudioRecordTest extends Activity
{
    private static final String LOG_TAG = "AudioRecordTest";
    private static String mFileName = null;

    private RecordButton mRecordButton = null;
    private MediaRecorder mRecorder = null;

    private PlayButton   mPlayButton = null;
    private MediaPlayer   mPlayer = null;

    private void onRecord(boolean start) {
        if (start) {
            startRecording();
        } else {
            stopRecording();
        }
    }

    private void onPlay(boolean start) {
        if (start) {
            startPlaying();
        } else {
            stopPlaying();
        }
    }

    private void startPlaying() {
        mPlayer = new MediaPlayer();
        try {
            mPlayer.setDataSource(mFileName);
            mPlayer.prepare();
            mPlayer.start();
        } catch (IOException e) {
            Log.e(LOG_TAG, "prepare() failed");
        }
    }

    private void stopPlaying() {
        mPlayer.release();
        mPlayer = null;
    }

    private void startRecording() {
        mRecorder = new MediaRecorder();
        mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mRecorder.setOutputFile(mFileName);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

        try {
            mRecorder.prepare();
        } catch (IOException e) {
            Log.e(LOG_TAG, "prepare() failed");
        }

        mRecorder.start();
    }

    private void stopRecording() {
        mRecorder.stop();
        mRecorder.release();
        mRecorder = null;
    }

    class RecordButton extends Button {
        boolean mStartRecording = true;

        OnClickListener clicker = new OnClickListener() {
            public void onClick(View v) {
                onRecord(mStartRecording);
                if (mStartRecording) {
                    setText("Stop recording");
                } else {
                    setText("Start recording");
                }
                mStartRecording = !mStartRecording;
            }
        };

        public RecordButton(Context ctx) {
            super(ctx);
            setText("Start recording");
            setOnClickListener(clicker);
        }
    }

    class PlayButton extends Button {
        boolean mStartPlaying = true;

        OnClickListener clicker = new OnClickListener() {
            public void onClick(View v) {
                onPlay(mStartPlaying);
                if (mStartPlaying) {
                    setText("Stop playing");
                } else {
                    setText("Start playing");
                }
                mStartPlaying = !mStartPlaying;
            }
        };

        public PlayButton(Context ctx) {
            super(ctx);
            setText("Start playing");
            setOnClickListener(clicker);
        }
    }

    public AudioRecordTest() {
        mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
        mFileName += "/audiorecordtest.3gp";
    }

    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);

        LinearLayout ll = new LinearLayout(this);
        mRecordButton = new RecordButton(this);
        ll.addView(mRecordButton,
            new LinearLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT,
                ViewGroup.LayoutParams.WRAP_CONTENT,
                0));
        mPlayButton = new PlayButton(this);
        ll.addView(mPlayButton,
            new LinearLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT,
                ViewGroup.LayoutParams.WRAP_CONTENT,
                0));
        setContentView(ll);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mRecorder != null) {
            mRecorder.release();
            mRecorder = null;
        }

        if (mPlayer != null) {
            mPlayer.release();
            mPlayer = null;
        }
    }
}