Android Camera Framework Layer Analysis


 

Please credit the source when reposting. Thanks!

Camera Service Framework Classes

How does the camera service actually work? To answer that question, we first need to understand the classes that make up the Service Framework and how they are connected through Binder RPC. Figure 1 shows the relationship between these classes and Binder RPC across the three different parts.

(a) The Camera class inherits from the ICameraClient class and is responsible for passing Binder RPC data between the application and the camera service.

(b) The CameraService class inherits from the ICameraService class and is responsible for the connection between the application and the camera service.

(c) The CameraService::Client class inherits from the ICamera class and is responsible for configuring and controlling the camera device and for handling events coming from the camera device.

How each class relates to Binder RPC in detail:

(1) The native methods in android.hardware.Camera call the member functions of the native Camera class through JNI.

(2) When the application connects to the camera service, Camera performs Binder RPC operations through BpCameraService (the service proxy) and BnCameraService (the service stub); the Binder RPC exchange goes through the ICameraService interface.

(3) When the application requests the camera device or the preview function, Camera performs Binder RPC operations through BpCamera (the service proxy) and BnCamera; the Binder RPC exchange goes through the ICamera interface.

(4) When an event occurs on the camera device, CameraService::Client performs Binder RPC operations through BpCameraClient and BnCameraClient; the Binder RPC exchange goes through the ICameraClient interface.

Camera Service Initialization

/frameworks/base/media/mediaserver/main_mediaserver.cpp

 

int main(int argc, char** argv)

{

    sp<ProcessState> proc(ProcessState::self());

    sp<IServiceManager> sm = defaultServiceManager();

    LOGI("ServiceManager: %p", sm.get());

 

    // add for coredump; only enabled in debug (ro.debuggable=1) builds

    {

        char value[PROPERTY_VALUE_MAX];

        property_get("ro.debuggable", value, "0");

        if(value[0] == '1' )

        {

            struct rlimit rl;

            rl.rlim_cur = -1;

            rl.rlim_max = -1;

            setrlimit(4,&rl);

        }

    }

 

    VolumeManager::instantiate(); // volumemanager have to be started before audioflinger

    AudioFlinger::instantiate();

    MediaPlayerService::instantiate();

    CameraService::instantiate();

    AudioPolicyService::instantiate();

    ProcessState::self()->startThreadPool();

    IPCThreadState::self()->joinThreadPool();

}

A problem comes up when reading the source for CameraService::instantiate(): there is no instantiate() method in CameraService.cpp. So where is it?

The includes in frameworks/base/media/mediaserver/main_mediaserver.cpp give a clue:

#include <grp.h>

#include <binder/IPCThreadState.h>

#include <binder/ProcessState.h>

#include <binder/IServiceManager.h>

#include <utils/Log.h>

 

#include <AudioFlinger.h>

#include <CameraService.h>

#include <MediaPlayerService.h>

#include <AudioPolicyService.h>

#include <private/android_filesystem_config.h>

So let's look at frameworks/base/services/camera/libcameraservice/CameraService.h to find out.

 

class CameraService :

    public BinderService<CameraService>,

    public BnCameraService

{

    class Client;

    friend class BinderService<CameraService>;

public:

    static char const* getServiceName() { return "media.camera"; }

 

                        CameraService();

    virtual             ~CameraService();

 

    virtual int32_t     getNumberOfCameras();

    virtual status_t    getCameraInfo(int cameraId,

...

From the definition above we can see that CameraService inherits from BinderService<CameraService> and BnCameraService, so CameraService::instantiate() is probably inherited from a parent class. Looking into the parent class BinderService confirms it: BinderService provides an instantiate() method, and it is a static method.

 

frameworks/base/include/binder/BinderService.h

 

namespace android {

template<typename SERVICE>
class BinderService

{

public:

    static status_t publish() {

        sp<IServiceManager> sm(defaultServiceManager());

        return sm->addService(String16(SERVICE::getServiceName()), new SERVICE());

    }

 

    static void publishAndJoinThreadPool() {

        sp<ProcessState> proc(ProcessState::self());

        sp<IServiceManager> sm(defaultServiceManager());

        sm->addService(String16(SERVICE::getServiceName()), new SERVICE());

        ProcessState::self()->startThreadPool();

        IPCThreadState::self()->joinThreadPool();

    }

 

    static void instantiate() { publish(); }

 

    static status_t shutdown() {

        return NO_ERROR;

    }

};

 

 

}; // namespace android

We can see that it is in publish() that CameraService registers itself as a service. There is a SERVICE type here, and the source explains it:

template<typename SERVICE>

This shows that SERVICE is a template parameter. Here it is CameraService being registered, so SERVICE can be substituted with CameraService, and the line above becomes:

return sm->addService(String16(CameraService::getServiceName()), new CameraService());

With that, the camera service is registered with ServiceManager under the name "media.camera" and is available to clients at any time.
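As a complement (a minimal sketch of my own, not from the original post): once registered under the name "media.camera", any native client can look the service up through ServiceManager and wrap the returned binder in a BpCameraService proxy via interface_cast<>. This is essentially what Camera::getCameraService() does in the connection section below. The helper name lookupCameraService() is hypothetical.

#include <binder/IServiceManager.h>
#include <camera/ICameraService.h>

using namespace android;

// Hypothetical helper: fetch the "media.camera" service registered by mediaserver.
static sp<ICameraService> lookupCameraService()
{
    sp<IBinder> binder = defaultServiceManager()->getService(String16("media.camera"));
    return interface_cast<ICameraService>(binder); // returns a BpCameraService proxy
}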

 

After compilation, main_mediaserver.cpp produces out/target/product/sp8825ea/system/bin/mediaserver.

The main() function of mediaserver is invoked by init.rc at startup, so the camera service is registered for Binder communication as soon as the device boots:

service media /system/bin/mediaserver

    class main

    user media

    group system audio camera graphics inet net_bt net_bt_admin net_bw_acct drmrpc radio

    ioprio rt 4

 

Connecting to the Camera Service

Before using the camera service, an application must first connect to it (from the application's point of view, connecting simply means calling the open() method). During the connection process, the ICameraService Binder RPC acts as the bridge. Once the connection is complete, the corresponding Binder RPC relationships are established and can be used to configure the camera device, send it commands, and receive its events.

The connection process

(1) The application calls the open() method of android.hardware.Camera;

(2) open() calls native_setup();

(3) native_setup() calls the android_hardware_Camera_native_setup() function through JNI.

(4) android_hardware_Camera_native_setup() calls Camera's connect() member function.

(5) Camera's connect() method obtains the camera service information from the Context Manager (servicemanager), creates the service proxy (BpCameraService), and then connects to the camera service stub (BnCameraService) over Binder RPC.

(6) The actual connection is handled by CameraService's connect() method.

App layer:

As in my demo:

public void surfaceCreated(SurfaceHolder holder)

        {

                // int nCameras = Camera.getNumberOfCameras();

               mCamera = Camera.open(0);

                try {

                        Log.i(TAG, "SurfaceHolder.Callback:surface Created");

                        mCamera.setPreviewDisplay(mSurfaceHolder);// set the surface to be

                        // used for live preview

                        mCamera.setPreviewCallback(previewCallback);

                        mCamera.setErrorCallback(errorCallback);

                        mCamera.startPreview();

     .......................................

 

framewrok

frameworks/base/core/java/android/hardware/Camera.java

 public static Camera open() {

        int numberOfCameras = getNumberOfCameras();

        CameraInfo cameraInfo = new CameraInfo();

        for (int i = 0; i < numberOfCameras; i++) {

            getCameraInfo(i, cameraInfo);

            if (cameraInfo.facing == CameraInfo.CAMERA_FACING_BACK) {

               return new Camera(i);

            }

        }

        return null;

    }

 

    Camera(int cameraId) {

        mShutterCallback = null;

        mRawImageCallback = null;

        mJpegCallback = null;

        mPreviewCallback = null;

        mPostviewCallback = null;

        mZoomListener = null;

 

        Looper looper;

        if ((looper = Looper.myLooper()) != null) {

            mEventHandler = new EventHandler(this, looper);

        } else if ((looper = Looper.getMainLooper()) != null) {

            mEventHandler = new EventHandler(this, looper);

        } else {

            mEventHandler = null;

        }

 

        native_setup(new WeakReference<Camera>(this), cameraId);

    }

JNI

frameworks/base/core/jni/android_hardware_Camera.cpp

 

static JNINativeMethod camMethods[] = {

  { "getNumberOfCameras",

    "()I",

    (void *)android_hardware_Camera_getNumberOfCameras },

  { "getCameraInfo",

    "(ILandroid/hardware/Camera$CameraInfo;)V",

    (void*)android_hardware_Camera_getCameraInfo },

  { "native_setup",

   "(Ljava/lang/Object;I)V",

   (void*)android_hardware_Camera_native_setup },

  { "native_release",

    "()V",

    (void*)android_hardware_Camera_release },

  { "setPreviewDisplay",

    "(Landroid/view/Surface;)V",

    (void *)android_hardware_Camera_setPreviewDisplay },

The definition of android_hardware_Camera_native_setup():

static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,

    jobject weak_this, jint cameraId)

{

    sp<Camera> camera = Camera::connect(cameraId);

 

    if (camera == NULL) {

        jniThrowRuntimeException(env, "Fail to connect to camera service");

        return;

    }

 

    // make sure camera hardware is alive

    if (camera->getStatus() != NO_ERROR) {

        jniThrowRuntimeException(env, "Camera initialization failed");

        return;

    }

 

    jclass clazz = env->GetObjectClass(thiz);

    if (clazz == NULL) {

        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");

        return;

    }

 

    // We use a weak reference so the Camera object can be garbage collected.

    // The reference is only used as a proxy for callbacks.

    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);

    context->incStrong(thiz);

    camera->setListener(context);

 

    // save context in opaque field

    env->SetIntField(thiz, fields.context, (int)context.get());

}
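For reference, the fields.context used by SetIntField() above is resolved once when the JNI methods are registered; it refers to the Java Camera object's private mNativeContext int field, which get_native_camera() reads back later. A minimal sketch of that lookup (my own illustration; the helper name register_context_field is hypothetical):

// Cache the field ID of android.hardware.Camera.mNativeContext, the int slot
// where native_setup() stores the JNICameraContext pointer.
static int register_context_field(JNIEnv* env)
{
    jclass clazz = env->FindClass("android/hardware/Camera");
    if (clazz == NULL) return -1;
    fields.context = env->GetFieldID(clazz, "mNativeContext", "I");
    return (fields.context != NULL) ? 0 : -1;
}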

Native library

frameworks/base/libs/camera/Camera.cpp

 

sp<Camera> Camera::connect(int cameraId)

{

    LOGV("connect");

    sp<Camera> c = new Camera();

    const sp<ICameraService>& cs = getCameraService();

    if (cs != 0) {

        c->mCamera = cs->connect(c, cameraId);

    }

    if (c->mCamera != 0) {

        c->mCamera->asBinder()->linkToDeath(c);

        c->mStatus = NO_ERROR;

    } else {

        c.clear();

    }

    return c;

}
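The getCameraService() call above is not quoted in the post. As a simplified sketch (reconstructed from the AOSP Camera.cpp of this era, so treat the details as an assumption), it performs the ServiceManager lookup sketched earlier, retries until mediaserver has published the service, links a death notifier, and caches the resulting BpCameraService proxy:

// Simplified sketch of Camera::getCameraService() (assumption)
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.camera"));
            if (binder != 0) break;
            LOGW("CameraService not published, waiting...");
            usleep(500000); // retry every 0.5 s until mediaserver is up
        } while (true);
        if (mDeathNotifier == NULL) {
            mDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    return mCameraService;
}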

 

 

frameworks/base/libs/camera/ICameraService.cpp

 

    virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId)

    {

        Parcel data, reply;

        data.writeInterfaceToken(ICameraService::getInterfaceDescriptor());

        data.writeStrongBinder(cameraClient->asBinder());

        data.writeInt32(cameraId);

        remote()->transact(BnCameraService::CONNECT, data, &reply);

        return interface_cast<ICamera>(reply.readStrongBinder());

    }

 

frameworks/base/libs/camera/ICameraService.cpp

 

status_t BnCameraService::onTransact(

    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)

{

    switch(code) {

        case GET_NUMBER_OF_CAMERAS: {

            CHECK_INTERFACE(ICameraService, data, reply);

            reply->writeInt32(getNumberOfCameras());

            return NO_ERROR;

        } break;

        case GET_CAMERA_INFO: {

            CHECK_INTERFACE(ICameraService, data, reply);

            CameraInfo cameraInfo;

            memset(&cameraInfo, 0, sizeof(cameraInfo));

            status_t result = getCameraInfo(data.readInt32(), &cameraInfo);

            reply->writeInt32(cameraInfo.facing);

            reply->writeInt32(cameraInfo.orientation);

            reply->writeInt32(result);

            return NO_ERROR;

        } break;

        case CONNECT: {

           CHECK_INTERFACE(ICameraService, data, reply);

           sp<ICameraClient> cameraClient = interface_cast<ICameraClient>(data.readStrongBinder());

           sp<ICamera> camera = connect(cameraClient, data.readInt32());

           reply->writeStrongBinder(camera->asBinder());

           return NO_ERROR;

        } break;

        default:

            return BBinder::onTransact(code, data, reply, flags);

    }

}
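Both the proxy (BpCameraService) and the stub (BnCameraService) above implement the same ICameraService interface. Its declaration, abridged and reconstructed here from the transaction codes used above (so treat the exact signatures as an assumption), looks roughly like this:

// frameworks/base/include/camera/ICameraService.h (abridged sketch)
class ICameraService : public IInterface
{
public:
    enum {
        GET_NUMBER_OF_CAMERAS = IBinder::FIRST_CALL_TRANSACTION,
        GET_CAMERA_INFO,
        CONNECT
    };

    DECLARE_META_INTERFACE(CameraService);

    virtual int32_t     getNumberOfCameras() = 0;
    virtual status_t    getCameraInfo(int cameraId, struct CameraInfo* cameraInfo) = 0;
    virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId) = 0;
};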

 

frameworks/base/services/camera/libcameraservice/CameraService.cpp

 

sp<ICamera> CameraService::connect(

        const sp<ICameraClient>& cameraClient, int cameraId) {

    int callingPid = getCallingPid();

    sp<CameraHardwareInterface> hardware = NULL;

 

    LOG1("CameraService::connect E (pid %d, id %d)", callingPid, cameraId);

 

    if (!mModule) {

        LOGE("Camera HAL module not loaded");

        return NULL;

    }

 

    sp<Client> client;

    if (cameraId < 0 || cameraId >= mNumberOfCameras) {

        LOGE("CameraService::connect X (pid %d) rejected (invalid cameraId %d).",

            callingPid, cameraId);

        return NULL;

    }

 

    char value[PROPERTY_VALUE_MAX];

    property_get("sys.secpolicy.camera.disabled", value, "0");

    if (strcmp(value, "1") == 0) {

        // Camera is disabled by DevicePolicyManager.

        LOGI("Camera is disabled. connect X (pid %d) rejected", callingPid);

        return NULL;

    }

 

    Mutex::Autolock lock(mServiceLock);

    if (mClient[cameraId] != 0) {

        client = mClient[cameraId].promote();

        if (client != 0) {

            if (cameraClient->asBinder() == client->getCameraClient()->asBinder()) {

                LOG1("CameraService::connect X (pid %d) (the same client)",

                    callingPid);

                return client;

            } else {

                LOGW("CameraService::connect X (pid %d) rejected (existing client).",

                    callingPid);

                return NULL;

            }

        }

        mClient[cameraId].clear();

    }

 

    if (mBusy[cameraId]) {

        LOGW("CameraService::connect X (pid %d) rejected"

             " (camera %d is still busy).", callingPid, cameraId);

        return NULL;

    }

 

    struct camera_info info;

    if (mModule->get_camera_info(cameraId, &info) != OK) {

        LOGE("Invalid camera id %d", cameraId);

        return NULL;

    }

 

    char camera_device_name[10];

    snprintf(camera_device_name, sizeof(camera_device_name), "%d", cameraId);

 

    hardware = new CameraHardwareInterface(camera_device_name);

    if (hardware->initialize(&mModule->common) != OK) {

        hardware.clear();

        return NULL;

    }

 

    client = new Client(this, cameraClient, hardware, cameraId, info.facing, callingPid);

    mClient[cameraId] = client;

    LOG1("CameraService::connect X");

    return client;

}

mModule is instantiated the first time the CameraService object is referenced.

onFirstRef() overrides the parent class's method:

void CameraService::onFirstRef()

{

    BnCameraService::onFirstRef();

 

    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,

                (const hw_module_t **)&mModule) < 0) {

        LOGE("Could not load camera HAL module");

        mNumberOfCameras = 0;

    }

    else {

        mNumberOfCameras = mModule->get_number_of_cameras();

LOGE("CameraService::onFirstRef mNumberOfCameras=%d",mNumberOfCameras);

 

        if (mNumberOfCameras > MAX_CAMERAS) {

            LOGE("Number of cameras(%d) > MAX_CAMERAS(%d).",

                    mNumberOfCameras, MAX_CAMERAS);

            mNumberOfCameras = MAX_CAMERAS;

        }

        //for (int i = 0; i < mNumberOfCameras; i++) {

        for (int i = 0; i < MAX_CAMERAS; i++) {

            setCameraFree(i);

        }

    }

    // Read the system property to determine if we have to use the

    // AUDIO_STREAM_ENFORCED_AUDIBLE type.

    char value[PROPERTY_VALUE_MAX];

    property_get("ro.camera.sound.forced", value, "0");

    if (strcmp(value, "0") != 0) {

        mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;

    } else {

        mAudioStreamType = AUDIO_STREAM_MUSIC;

    }

    // set AudioStreamType as "AUDIO_STREAM_ENFORCED_AUDIBLE"

    mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;

}
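hw_get_module(CAMERA_HARDWARE_MODULE_ID, ...) resolves a vendor camera HAL library under /system/lib/hw/ and returns its camera_module_t descriptor, which is what mModule points at. The following is a minimal sketch of such a module's shape for the HAL1 interface of this era (my own illustration, not the platform's actual HAL; all stub_* names are hypothetical):

#include <errno.h>
#include <hardware/hardware.h>
#include <hardware/camera.h>

// Hypothetical stubs; a real HAL would open and drive the sensor here.
static int stub_open(const hw_module_t* module, const char* id, hw_device_t** device)
{
    return -ENODEV;
}

static int stub_get_number_of_cameras(void) { return 1; }

static int stub_get_camera_info(int camera_id, struct camera_info* info)
{
    info->facing = CAMERA_FACING_BACK;
    info->orientation = 90;
    return 0;
}

static struct hw_module_methods_t stub_module_methods = {
    stub_open,                      /* open */
};

camera_module_t HAL_MODULE_INFO_SYM = {
    {
        HARDWARE_MODULE_TAG,        /* tag */
        1, 0,                       /* version_major, version_minor */
        CAMERA_HARDWARE_MODULE_ID,  /* id */
        "Stub Camera HAL",          /* name */
        "example",                  /* author */
        &stub_module_methods,       /* methods */
    },
    stub_get_number_of_cameras,     /* called from CameraService::onFirstRef() */
    stub_get_camera_info,           /* called from CameraService::getCameraInfo() */
};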

 

frameworks/base/services/camera/libcameraservice/CameraService.cpp

 

CameraService::Client::Client(const sp<CameraService>& cameraService,

        const sp<ICameraClient>& cameraClient,

        const sp<CameraHardwareInterface>& hardware,

        int cameraId, int cameraFacing, int clientPid) {

    int callingPid = getCallingPid();

    LOG1("Client::Client E (pid %d)", callingPid);

 

    mCameraService = cameraService;

    mCameraClient = cameraClient;

    mHardware = hardware;

    mCameraId = cameraId;

    mCameraFacing = cameraFacing;

    mClientPid = clientPid;

    mMsgEnabled = 0;

    mSurface = 0;

    mPreviewWindow = 0;

    mHardware->setCallbacks(notifyCallback,

                            dataCallback,

                            dataCallbackTimestamp,

                            (void *)cameraId);

 

    // Enable zoom, error, focus, and metadata messages by default

    enableMsgType(CAMERA_MSG_ERROR | CAMERA_MSG_ZOOM | CAMERA_MSG_FOCUS |

                  CAMERA_MSG_PREVIEW_METADATA);

 

    // Callback is disabled by default

    mPreviewCallbackFlag = CAMERA_FRAME_CALLBACK_FLAG_NOOP;

    mOrientation = getOrientation(0, mCameraFacing == CAMERA_FACING_FRONT);

    mPlayShutterSound = true;

    cameraService->setCameraBusy(cameraId);

    cameraService->loadSound();

    LOG1("Client::Client X (pid %d)", callingPid);

}

(As an aside, the destructor's disconnect path is also worth a look; an abridged sketch follows.)
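An abridged sketch of that teardown path (reconstructed from the CameraService.cpp of this era; preview-window and callback cleanup are omitted, so treat it as an assumption):

void CameraService::Client::disconnect() {
    int callingPid = getCallingPid();
    LOG1("disconnect E (pid %d)", callingPid);
    Mutex::Autolock lock(mLock);

    if (checkPid() != NO_ERROR) return;   // only the owning process may disconnect
    if (mHardware == 0) return;           // already disconnected

    mHardware->stopPreview();             // stop streaming before teardown
    mHardware->cancelPicture();
    mHardware->release();                 // close the HAL camera device
    mHardware.clear();

    mCameraService->removeClient(mCameraClient);   // drop the service's weak reference
    mCameraService->setCameraFree(mCameraId);      // allow the next client to connect
    LOG1("disconnect X (pid %d)", callingPid);
}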

After the above steps, Camera and CameraService::Client are bound together in a client/service relationship!

 

Camera Settings and Control

The application uses the ICamera Binder RPC to request changes to the camera settings or to the preview. In the application, the setParameters() method is called, and the corresponding settings are passed to the camera device in the form of a Binder RPC.

Process analysis

(1) The application layer changes the camera settings through the setParameters() method of android.hardware.Camera.

(2) This calls the android_hardware_Camera_setParameters() function through JNI.

(3) android_hardware_Camera_setParameters() calls Camera's setParameters() member function.

(4) Camera's setParameters() member function sends the settings-change request from BpCamera to BnCamera in the form of a Binder RPC; the parameters travel as a single flattened key/value string (see the sketch after this list).

(5) CameraService::Client's setParameters() member function calls CameraHardwareInterface's setParameters() member function;

(6) CameraHardwareInterface applies the changed settings to the camera device.
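A minimal sketch (my own illustration, assuming the CameraParameters API of this era) of how that flattened key/value string is built on one side and parsed back on the other:

#include <camera/CameraParameters.h>

using namespace android;

static void parameters_round_trip()
{
    CameraParameters p;
    p.setPreviewSize(640, 480);
    p.set(CameraParameters::KEY_FOCUS_MODE, CameraParameters::FOCUS_MODE_AUTO);

    // This single string is what crosses the Binder boundary,
    // e.g. "preview-size=640x480;focus-mode=auto".
    String8 flat = p.flatten();

    // The receiving side rebuilds the key/value map from the string.
    CameraParameters received(flat);
}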

 

The app-layer code is the same as in the demo above, so it is not examined in detail here.

framewrok

frameworks/base/core/java/android/hardware/Camera.java

    public void setParameters(Parameters params) {

    if ("invalid".equals(params.getFocusMode())) {

       throw new RuntimeException("Throw exception for invalid parameters,Because FocusMode parameters was invalid.");

    }

 

        if ("invalid".equals(params.getFlashMode())) {

            throw new RuntimeException("Throw exception for invalid parameters,Because FlashMode parameters was invalid.");

        }

 

        native_setParameters(params.flatten());

    }

JNI

frameworks/base/core/jni/android_hardware_Camera.cpp

................

  { "native_takePicture",

    "(I)V",

    (void *)android_hardware_Camera_takePicture },

  { "native_setParameters",

   "(Ljava/lang/String;)V",

   (void *)android_hardware_Camera_setParameters },

  { "native_getParameters",

    "()Ljava/lang/String;",

    (void *)android_hardware_Camera_getParameters },

  { "reconnect",

....................

 

frameworks/base/core/jni/android_hardware_Camera.cpp

 

static void android_hardware_Camera_setParameters(JNIEnv *env, jobject thiz, jstring params)

{

    LOGV("setParameters");

    sp<Camera> camera = get_native_camera(env, thiz, NULL);

    if (camera == 0) return;

 

    const jchar* str = env->GetStringCritical(params, 0);

    String8 params8;

    if (params) {

        params8 = String8(str, env->GetStringLength(params));

        env->ReleaseStringCritical(params, str);

    }

    if (camera->setParameters(params8) != NO_ERROR) {

        jniThrowRuntimeException(env, "setParameters failed");

        return;

    }

}

 

sp<Camera> get_native_camera(JNIEnv *env, jobject thiz, JNICameraContext** pContext)

{

    sp<Camera> camera;

    Mutex::Autolock _l(sLock);

    JNICameraContext* context = reinterpret_cast<JNICameraContext*>(env->GetIntField(thiz, fields.context));

    if (context != NULL) {

        camera = context->getCamera();

    }

    LOGV("get_native_camera: context=%p, camera=%p", context, camera.get());

    if (camera == 0) {

        jniThrowRuntimeException(env, "Method called after release()");

    }

 

    if (pContext != NULL) *pContext = context;

    return camera;

}

Native library

frameworks/base/libs/camera/Camera.cpp

status_t Camera::setParameters(const String8& params)

{

    LOGV("setParameters");

    sp <ICamera> c = mCamera;

    if (c == 0) return NO_INIT;

    return c->setParameters(params);

}

 

frameworks/base/libs/camera/ICamera.cpp

 

   // set preview/capture parameters - key/value pairs

    status_t setParameters(const String8& params)

    {

        LOGV("setParameters");

        Parcel data, reply;

        data.writeInterfaceToken(ICamera::getInterfaceDescriptor());

        data.writeString8(params);

        remote()->transact(SET_PARAMETERS, data, &reply);

        return reply.readInt32();

    }

 

frameworks/base/libs/camera/ICamera.cpp

 

status_t BnCamera::onTransact(

    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)

{

    switch(code) {

// some code omitted here

        case SET_PARAMETERS: {

           LOGV("SET_PARAMETERS");

           CHECK_INTERFACE(ICamera, data, reply);

           String8 params(data.readString8());

           reply->writeInt32(setParameters(params));

           return NO_ERROR;

        } break;

// some code omitted here

 

frameworks/base/services/camera/libcameraservice/CameraService.cpp

 

status_t CameraService::Client::setParameters(const String8& params) {

    LOG1("setParameters (pid %d) (%s)", getCallingPid(), params.string());

    Mutex::Autolock lock(mLock);

    status_t result = checkPidAndHardware();

    if (result != NO_ERROR) return result;

 

    CameraParameters p(params);

    return mHardware->setParameters(p);

}

mHardware holds the CameraHardwareInterface instance; this call applies the settings to the device through mHardware.
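As a simplified sketch (reconstructed from the CameraHardwareInterface.h of this era, so treat it as an assumption), that wrapper flattens the parameters again and hands the string to the HAL1 device ops:

// frameworks/base/services/camera/libcameraservice/CameraHardwareInterface.h (sketch)
status_t CameraHardwareInterface::setParameters(const CameraParameters &params)
{
    LOGV("%s(%s)", __FUNCTION__, mName.string());
    if (mDevice->ops->set_parameters)
        return mDevice->ops->set_parameters(mDevice, params.flatten().string());
    return INVALID_OPERATION;
}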

Camera Event Handling

When an event occurs on the camera device, the camera service uses the ICameraClient Binder RPC to pass it on to the application. For example, when the application calls takePicture() to capture a still image, the camera device prepares the still image and then notifies the application asynchronously that it is ready. During this process a shutter event occurs first, followed by the events for the RAW image and the JPEG image.

When the application calls android.hardware.Camera's takePicture() method to request a still image, CameraService::Client::takePicture() is invoked (as in the settings-and-control section above, the call path from the application layer down to the native library is not repeated here).

Framework and native library

frameworks/base/services/camera/libcameraservice/CameraService.cpp

status_t CameraService::Client::takePicture(int msgType) {

    LOG1("takePicture (pid %d): 0x%x", getCallingPid(), msgType);

 

    Mutex::Autolock lock(mLock);

    status_t result = checkPidAndHardware();

    if (result != NO_ERROR) return result;

 

    if ((msgType & CAMERA_MSG_RAW_IMAGE) &&

        (msgType & CAMERA_MSG_RAW_IMAGE_NOTIFY)) {

        LOGE("CAMERA_MSG_RAW_IMAGE and CAMERA_MSG_RAW_IMAGE_NOTIFY"

                " cannot be both enabled");

        return BAD_VALUE;

    }

 

    // We only accept picture related message types

    // and ignore other types of messages for takePicture().

    int picMsgType = msgType

                       & (CAMERA_MSG_SHUTTER |

                          CAMERA_MSG_POSTVIEW_FRAME |

                          CAMERA_MSG_RAW_IMAGE |

                          CAMERA_MSG_RAW_IMAGE_NOTIFY |

                          CAMERA_MSG_COMPRESSED_IMAGE);

 

    enableMsgType(picMsgType);

 

    return mHardware->takePicture();

}

CameraService::Client's takePicture() first enables the corresponding message types. When events such as CAMERA_MSG_SHUTTER, CAMERA_MSG_POSTVIEW_FRAME, or CAMERA_MSG_RAW_IMAGE occur on the camera device they are captured, and the corresponding still-image handler functions are then called.
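The bridge between the HAL and those handlers is the static callback registered with mHardware->setCallbacks() in the Client constructor. A simplified sketch of it (reconstructed from the CameraService.cpp of this era; treat the helper names as an assumption):

void CameraService::Client::notifyCallback(int32_t msgType, int32_t ext1,
        int32_t ext2, void* user) {
    LOG2("notifyCallback(%d)", msgType);

    // 'user' is the cameraId cookie passed to setCallbacks(); map it back to the Client.
    Mutex* lock = getClientLockFromCookie(user);
    if (lock == NULL) return;
    Mutex::Autolock alock(*lock);

    Client* client = getClientFromCookie(user);
    if (client == NULL) return;

    switch (msgType) {
        case CAMERA_MSG_SHUTTER:
            client->handleShutter();
            break;
        default:
            client->handleGenericNotify(msgType, ext1, ext2);
            break;
    }
}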

Take CameraService::Client's handleShutter() function as an example:

 

frameworks/base/services/camera/libcameraservice/CameraService.cpp

 

void CameraService::Client::handleShutter(void) {

    if (mPlayShutterSound) {

        mCameraService->playSound(SOUND_SHUTTER);

    }

 

    sp<ICameraClient> c = mCameraClient;

    if (c != 0) {

        mLock.unlock();

        c->notifyCallback(CAMERA_MSG_SHUTTER, 0, 0);

        if (!lockIfMessageWanted(CAMERA_MSG_SHUTTER)) return;

    }

    disableMsgType(CAMERA_MSG_SHUTTER);

 

    mLock.unlock();

}

mCameraClient holds the BpCameraClient instance; through Binder RPC it calls Camera's notifyCallback() function to handle the event, and disableMsgType() then disables the CAMERA_MSG_SHUTTER message.
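On the wire, that call is a one-way Binder transaction. A simplified sketch of the proxy side (reconstructed from frameworks/base/libs/camera/ICameraClient.cpp of this era; treat it as an assumption):

// client side of the ICameraClient interface (sketch)
void BpCameraClient::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)
{
    LOGV("notifyCallback");
    Parcel data, reply;
    data.writeInterfaceToken(ICameraClient::getInterfaceDescriptor());
    data.writeInt32(msgType);
    data.writeInt32(ext1);
    data.writeInt32(ext2);
    // one-way: the camera service does not block on the app's handler
    remote()->transact(NOTIFY_CALLBACK, data, &reply, IBinder::FLAG_ONEWAY);
}

The matching BnCameraClient::onTransact() in the application process unpacks the parcel and calls Camera::notifyCallback(), shown next.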

frameworks/base/libs/camera/Camera.cpp

 

void Camera::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)

{

    sp<CameraListener> listener;

    {

        Mutex::Autolock _l(mLock);

        listener = mListener;

    }

    if (listener != NULL) {

        listener->notify(msgType, ext1, ext2);

    }

}

CameranotifyCallback()函数中,调用CameraListenernotify()函数,将以Binder RPC传递过来的事件发送给应用程序。在变量mListenerz中保存着JNICameraContext的实例对象,它通过JNI在引用程序与Server Framework件传递数据。

 

 

 
