camera2 (api2) Preview Opening Process (Part 2)


Next, we obtain a handle to CameraService and call its connectDevice function.

CameraManager.java

private CameraDevice openCameraDeviceUserAsync(String cameraId,
        CameraDevice.StateCallback callback, Handler handler, final int uid)
        throws CameraAccessException {
    // First obtain the CameraService handle.
    ICameraService cameraService = CameraManagerGlobal.get().getCameraService();
    // Send the connect request to cameraService.
    cameraUser = cameraService.connectDevice(callbacks, id,
            mContext.getOpPackageName(), uid);
}


First, look at how the CameraService handle is obtained:

CameraManagerGlobal.get().getCameraService(); →

public ICameraService getCameraService() {
    connectCameraServiceLocked();
}

private void connectCameraServiceLocked() {
    // Query the cameraservice handle from ServiceManager. The service name is
    // private static final String CAMERA_SERVICE_BINDER_NAME = "media.camera";
    // which is the same name used when cameraservice registered itself with servicemanager
    // during startup, so the IBinder returned here is the CameraService handle.
    IBinder cameraServiceBinder = ServiceManager.getService(CAMERA_SERVICE_BINDER_NAME);
    // Convert the queried IBinder into an ICameraService. When cameraservice was registered,
    // the ICameraService was stored as an IBinder; here the conversion goes the other way,
    // which implies that CameraService.cpp must ultimately inherit from IBinder.
    ICameraService cameraService = ICameraService.Stub.asInterface(cameraServiceBinder);
    // Register an ICameraServiceListener; its callbacks fire when a new camera becomes available.
    cameraService.addListener(this);
}


Next, check whether CameraService.cpp really inherits from IBinder.

CameraService.h

class CameraService : public ::android::hardware::BnCameraService, ... (other bases omitted)

So CameraService inherits from BnCameraService.

BnCameraService.h

// Namespace: android::hardware
namespace android {
namespace hardware {
class BnCameraService : public ::android::BnInterface<ICameraService> { ... };
}
}


The ICameraService here is auto-generated from ICameraService.aidl by the aidl tool. The ICameraService.aidl file is turned into ICameraService.java, ICameraService.h and ICameraService.cpp; in earlier Android versions only the .java file was generated. If a *.aidl file is added to an Android.mk whose build target is a library, for example include $(BUILD_SHARED_LIBRARY), the .h and .cpp files are generated automatically.
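To make the generated structure concrete, here is a heavily simplified sketch of the kind of Java the aidl tool produces. The interface IDemoService and its ping method are invented for illustration; only the Stub/Proxy shape mirrors what the real ICameraService.java contains.

public interface IDemoService extends android.os.IInterface {

    int ping(int value) throws android.os.RemoteException;

    public static abstract class Stub extends android.os.Binder implements IDemoService {
        private static final String DESCRIPTOR = "com.example.IDemoService";
        static final int TRANSACTION_ping = android.os.IBinder.FIRST_CALL_TRANSACTION + 0;

        public Stub() {
            attachInterface(this, DESCRIPTOR);
        }

        @Override
        public android.os.IBinder asBinder() {
            return this;
        }

        // The conversion used in connectCameraServiceLocked(): turn the IBinder obtained
        // from ServiceManager back into the interface type.
        public static IDemoService asInterface(android.os.IBinder obj) {
            if (obj == null) return null;
            android.os.IInterface iin = obj.queryLocalInterface(DESCRIPTOR);
            if (iin instanceof IDemoService) {
                return (IDemoService) iin;   // caller lives in the service's own process
            }
            return new Proxy(obj);           // remote caller: wrap the binder in a Proxy
        }

        private static class Proxy implements IDemoService {
            private final android.os.IBinder mRemote;

            Proxy(android.os.IBinder remote) {
                mRemote = remote;
            }

            @Override
            public android.os.IBinder asBinder() {
                return mRemote;
            }

            @Override
            public int ping(int value) throws android.os.RemoteException {
                // Marshal the arguments, cross the process boundary, unmarshal the reply.
                android.os.Parcel data = android.os.Parcel.obtain();
                android.os.Parcel reply = android.os.Parcel.obtain();
                try {
                    data.writeInterfaceToken(DESCRIPTOR);
                    data.writeInt(value);
                    mRemote.transact(TRANSACTION_ping, data, reply, 0);
                    reply.readException();
                    return reply.readInt();
                } finally {
                    reply.recycle();
                    data.recycle();
                }
            }
        }
    }
}

The real connectDevice in ICameraService.java follows the same shape: the Proxy writes its four arguments into _data, calls transact, and reads the returned ICameraDeviceUser binder out of _reply.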

Next, check how BnInterface is related to IBinder:

IInterface.h

template<typename INTERFACE>
class BnInterface : public INTERFACE, public BBinder

BnInterface is a template class; here INTERFACE is ICameraService, and BnInterface also inherits from BBinder.

Frameworks/native/include/binder/Binder.h

class BBinder : public IBinder { ... }

From this we can see that BBinder inherits from IBinder.

 

From this inheritance chain, the connectDevice call flow is:

CameraManager.java

cameraUser = cameraService.connectDevice(callbacks, id,
        mContext.getOpPackageName(), uid); →

ICameraService.java

This is the .java file auto-generated from ICameraService.aidl:

public interface ICameraService extends android.os.IInterface {
    public static abstract class Stub extends android.os.Binder implements android.hardware.ICameraService {
        private static class Proxy implements android.hardware.ICameraService {
            @Override
            public android.hardware.camera2.ICameraDeviceUser connectDevice(
                    android.hardware.camera2.ICameraDeviceCallbacks callbacks, int cameraId,
                    java.lang.String opPackageName, int clientUid) throws android.os.RemoteException {
                mRemote.transact(Stub.TRANSACTION_connectDevice, _data, _reply, 0); →
            }
        }
    }
}

mRemote.transact() starts the cross-process communication. The request travels through IBinder, BpBinder and IPCThreadState down to the Binder driver, which then delivers it to the cameraService server side. For the same request, the transaction code used on the client and on the server is identical.
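The server-side half of that pairing can be sketched as follows. This is an illustrative skeleton of the generic aidl pattern, not the real BnCameraService; IDemoService and ping are the same invented names used in the earlier sketch.

abstract class DemoServiceStub extends android.os.Binder {
    static final String DESCRIPTOR = "com.example.IDemoService";
    static final int TRANSACTION_ping = android.os.IBinder.FIRST_CALL_TRANSACTION + 0;

    // Implemented by the concrete service (the role CameraService plays in this article).
    protected abstract int ping(int value);

    @Override
    protected boolean onTransact(int code, android.os.Parcel data,
            android.os.Parcel reply, int flags) throws android.os.RemoteException {
        switch (code) {
            case TRANSACTION_ping: {
                data.enforceInterface(DESCRIPTOR);   // check the interface token written by the Proxy
                int arg = data.readInt();            // read arguments in the order the Proxy wrote them
                int result = ping(arg);              // dispatch into the service implementation
                reply.writeNoException();
                reply.writeInt(result);              // the Binder driver carries the reply back and wakes the client
                return true;
            }
            default:
                return super.onTransact(code, data, reply, flags);
        }
    }
}

BnCameraService::onTransact in ICameraService.cpp, shown below, is the generated C++ equivalent of this dispatch.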

In CameraService.cpp, the onTransact() method is then invoked:

status_t CameraService::onTransact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags) {
    return BnCameraService::onTransact(code, data, reply, flags); →
}

ICameraService.cpp

::android::status_t BnCameraService::onTransact(uint32_t _aidl_code, const ::android::Parcel& _aidl_data,
        ::android::Parcel* _aidl_reply, uint32_t _aidl_flags) {
    case Call::CONNECTDEVICE:
    ::android::sp<::android::hardware::camera2::ICameraDeviceUser> _aidl_return;
    // connectDevice was invoked from CameraManager.java with four arguments; here a fifth,
    // an ICameraDeviceUser out-parameter, is added and returned to the client as part of the
    // reply. This connectDevice call is what actually reaches CameraService.cpp's
    // connectDevice method.
    ::android::binder::Status _aidl_status(connectDevice(in_callbacks, in_cameraId, in_opPackageName,
            in_clientUid, &_aidl_return));
    // Write _aidl_return into the reply parcel.
    _aidl_ret_status = _aidl_reply->writeStrongBinder(
            ::android::hardware::camera2::ICameraDeviceUser::asBinder(_aidl_return));
}

Now look at how _aidl_return is returned to the client. cameraservice first writes the result into the _aidl_reply parcel, and the Binder driver then delivers it to the client. In more detail: after issuing the request, the client goes to sleep; once the server has produced a result and written it to the binder driver, the Binder driver wakes the client so it can read the reply. The client side that receives the result here is:

the Proxy class in ICameraService.java:

public interface ICameraService :: public static abstract class Stub :: private static class Proxy {
    @Override
    public android.hardware.camera2.ICameraDeviceUser connectDevice(
            android.hardware.camera2.ICameraDeviceCallbacks callbacks, int cameraId,
            java.lang.String opPackageName, int clientUid) throws android.os.RemoteException {
        // This line is where the request is sent.
        mRemote.transact(Stub.TRANSACTION_connectDevice, _data, _reply, 0);
        // This is the result returned after the server has processed the request: read it from
        // the parcel via _reply.readStrongBinder() and return it to cameraManager.
        _result = android.hardware.camera2.ICameraDeviceUser.Stub.asInterface(_reply.readStrongBinder());
        return _result;
    }
}

 

Next, look at what CameraService.cpp's connectDevice actually does:

CameraService.cpp

Status CameraService::connectDevice(
        const sp<hardware::camera2::ICameraDeviceCallbacks>& cameraCb,
        int cameraId, const String16& clientPackageName, int clientUid,
        /*out*/ sp<hardware::camera2::ICameraDeviceUser>* device) {
    // device is an out-parameter of type ICameraDeviceUser, also auto-generated from
    // ICameraDeviceUser.aidl. It corresponds to the CameraDeviceClient instance `client`:
    // CameraDeviceClient inherits BnCameraDeviceUser, which in turn inherits ICameraDeviceUser.
    sp<CameraDeviceClient> client = nullptr;
    // connectHelper is defined in CameraService.h.
    ret = connectHelper<hardware::camera2::ICameraDeviceCallbacks, CameraDeviceClient>
            (cameraCb, id, CAMERA_HAL_API_VERSION_UNSPECIFIED, clientPackageName,
            clientUid, USE_CALLING_PID, API_2, /*legacyMode*/ false, /*shimUpdateOnly*/ false,
            /*out*/ client);
    *device = client;
}

CameraService.h

// This is a template method: CALLBACK is hardware::camera2::ICameraDeviceCallbacks and
// CLIENT is CameraDeviceClient. Its main job is to create the client instance and call
// its initialize method.
template<class CALLBACK, class CLIENT>
binder::Status CameraService::connectHelper(const sp<CALLBACK>& cameraCb, const String8& cameraId,
        int halVersion, const String16& clientPackageName, int clientUid, int clientPid,
        apiLevel effectiveApiLevel, bool legacyMode, bool shimUpdateOnly, /*out*/ sp<CLIENT>& device) {
    ret = makeClient(this, cameraCb, clientPackageName, id, facing, clientPid,
            clientUid, getpid(), legacyMode, halVersion, deviceVersion, effectiveApiLevel,
            /*out*/ &tmp);
    client = static_cast<CLIENT*>(tmp.get());
    err = client->initialize(mModule);
}

CameraService.cpp

Status CameraService::makeClient(const sp<CameraService>& cameraService,
        const sp<IInterface>& cameraCb, const String16& packageName, int cameraId,
        int facing, int clientPid, uid_t clientUid, int servicePid, bool legacyMode,
        int halVersion, int deviceVersion, apiLevel effectiveApiLevel,
        /*out*/ sp<BasicClient>* client) {
    // Depending on the API version, a CameraDeviceClient instance is created here.
    *client = new CameraDeviceClient(cameraService, tmp, packageName, cameraId,
            facing, clientPid, clientUid, servicePid);
}

Now look at CameraDeviceClient's inheritance:

CameraDeviceClient.h

class CameraDeviceClient :
        public Camera2ClientBase<CameraDeviceClientBase>,
        public camera2::FrameProcessorBase::FilteredListener {};

struct CameraDeviceClientBase :
        public CameraService::BasicClient,
        public hardware::camera2::BnCameraDeviceUser {};

So CameraDeviceClient inherits CameraService::BasicClient, implements the ICameraDeviceUser Binder API, and also implements the listener for the frame-processing thread.

Next, the CameraDeviceClient constructor:

CameraDeviceClient.cpp

CameraDeviceClient::CameraDeviceClient(const sp<CameraService>& cameraService,
        const sp<hardware::camera2::ICameraDeviceCallbacks>& remoteCallback,
        const String16& clientPackageName, int cameraId, int cameraFacing, int clientPid,
        uid_t clientUid, int servicePid) :
        Camera2ClientBase(cameraService, remoteCallback, clientPackageName,
                cameraId, cameraFacing, clientPid, clientUid, servicePid),

Its main work is calling the parent class Camera2ClientBase's constructor from the initializer list.

Camera2ClientBase.cpp

Camera2ClientBase is a template class; TClientBase here is CameraDeviceClientBase, as CameraDeviceClient's inheritance shows. Besides calling the parent TClientBase (CameraDeviceClientBase) constructor, it also creates a Camera3Device instance.

template <typename TClientBase>
Camera2ClientBase<TClientBase>::Camera2ClientBase(
        const sp<CameraService>& cameraService, const sp<TCamCallbacks>& remoteCallback,
        const String16& clientPackageName, int cameraId, int cameraFacing, int clientPid,
        uid_t clientUid, int servicePid) :
        TClientBase(cameraService, remoteCallback, clientPackageName,
                cameraId, cameraFacing, clientPid, clientUid, servicePid) {
    mDevice = new Camera3Device(cameraId);
}

To finish the constructor chain: CameraDeviceClientBase in turn calls the constructor of its parent, CameraService::BasicClient. BasicClient's constructor is implemented in CameraService and mainly deals with application permissions; I am not very familiar with the details of that permission handling.

Across this whole chain of constructors, the most important steps are the creation of the Camera3Device instance in Camera2ClientBase and the call to initialize that follows.

Now the initialize call flow:

CameraDeviceClient.cpp

status_t CameraDeviceClient::initialize(CameraModule *module) {
    res = Camera2ClientBase::initialize(module);
    // A listener is registered here. mFrameProcessor is a Thread that processes output-frame
    // metadata and handles preview-callback related work. It waits for new frames from the
    // camera device and then calls the listener method onResultAvailable. That method,
    // CameraDeviceClient::onResultAvailable, in turn invokes the callback
    // remoteCb->onResultReceived(result.mMetadata, result.mResultExtras). remoteCb is of type
    // hardware::camera2::ICameraDeviceCallbacks; the callback instance is created in
    // CameraManager.java when the camera device is opened and is passed all the way down via
    // CameraService's connectDevice to CameraDeviceClient. So the actual implementation of this
    // callback is onResultReceived in CameraDeviceImpl.java's inner class CameraDeviceCallbacks.
    mFrameProcessor->registerListener(FRAME_PROCESSOR_LISTENER_MIN_ID,
            FRAME_PROCESSOR_LISTENER_MAX_ID, /*listener*/ this, /*sendPartials*/ true);
}

Camera2ClientBase.cpp

status_t Camera2ClientBase<TClientBase>::initialize(CameraModule *module) {
    // mDevice is the Camera3Device instance.
    res = mDevice->initialize(module);
}

Camera3Device.cpp

status_t Camera3Device::initialize(CameraModule *module) {
    // Call CameraModule's open method to open the HAL device; from here on we are in the HAL
    // layer. The HAL device's struct type is camera3_device_t. module is the CameraModule
    // instance that was created the first time CameraService was referenced, in
    // CameraService::onFirstRef(): mModule = new CameraModule(rawModule); that belongs to the
    // CameraService start-up sequence.
    res = module->open(deviceName.string(),
            reinterpret_cast<hw_device_t**>(&device));
    // Initialize the HAL device.
    res = device->ops->initialize(device, this);
    // Create the buffer manager.
    mBufferManager = new Camera3BufferManager();

    res = find_camera_metadata_ro_entry(info.static_camera_characteristics,
            ANDROID_CONTROL_AE_LOCK_AVAILABLE, &aeLockAvailableEntry);
    // Start the request-queue thread; once run() is called its threadLoop method executes.
    mRequestThread = new RequestThread(this, mStatusTracker, device, aeLockAvailable);
    res = mRequestThread->run(String8::format("C3Dev-%d-ReqQueue", mId).string());
    // Create the stream-preparer thread, but do not run it yet; it is started on demand when
    // Camera3Device's prepare method is called. When is prepare called? I did not trace it with
    // logs; one likely case is when a session is created and buffers are pre-allocated.
    mPreparerThread = new PreparerThread();
}

-------------------------------------------------------------------------------------------------------------------

At this point the camera device has been opened. The next step is to start the preview.
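For reference, the open flow walked through above is what an application triggers through the public camera2 API. A minimal sketch, assuming the CAMERA permission is already granted and cameraHandler is a Handler on a background thread, looks like this; CameraManager.openCamera is what ends up in openCameraDeviceUserAsync and connectDevice described earlier.

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

void openFirstCamera(Context context, Handler cameraHandler) throws CameraAccessException {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    String cameraId = manager.getCameraIdList()[0];   // assumes the device has at least one camera
    manager.openCamera(cameraId, new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            // The device is open; this plays the role of onCameraOpened() in CaptureModule.java
            // below. Next step: create a capture session and start the preview.
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            camera.close();
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            camera.close();
        }
    }, cameraHandler);
}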

Back at the application layer, in CaptureModule.java:

The camera-open flow above started from the open call in openCameraAndStartPreview. Once the camera opens successfully, onCameraOpened is called back, and inside that callback camera.startPreview starts the preview.

private void openCameraAndStartPreview() {
    mOneCameraOpener.open(cameraId, captureSetting, mCameraHandler, mainThread,
            imageRotationCalculator, mBurstController, mSoundPlayer,
            new OpenCallback() {
                @Override
                public void onCameraOpened(@Nonnull final OneCamera camera) {
                    mCamera = camera;
                    updatePreviewBufferDimension();
                    updatePreviewBufferSize();
                    camera.startPreview(new Surface(getPreviewSurfaceTexture()),
                            new CaptureReadyCallback() {
                                @Override
                                public void onReadyForCapture() {
                                    // Starting the preview requires creating a capture session
                                    // first. If the session is created successfully, this
                                    // callback fires, meaning the preview is ready and we can
                                    // prepare to take pictures.
                                    mMainThread.execute(new Runnable() {
                                        public void run() {
                                            onPreviewStarted();
                                            onReadyStateChanged(true);
                                        }
                                    });
                                }
                            }); →
                }
            }, ...);
}

OneCameraImpl.java

public void startPreview(Surface previewSurface, CaptureReadyCallback listener) {
    setupAsync(mPreviewSurface, listener);
}

Start the capture session asynchronously:

private void setupAsync(final Surface previewSurface, final CaptureReadyCallback listener) {

       mCameraHandler.post(new Runnable() {

           @Override

           public void run() {

                setup(previewSurface, listener);

           }

       });

}

private void setup(Surface previewSurface, final CaptureReadyCallback listener) {
    mDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
        public void onConfigured(CameraCaptureSession session) {
            mCaptureSession = session;
            boolean success = repeatingPreview(null);
            if (success) {
                listener.onReadyForCapture();
            }
        }
    });
}

Session creation calls into createCaptureSession in CameraDeviceImpl.java, which in turn calls configureStreamsChecked to configure the streams. Whether the session is "created successfully" means whether the inputs and outputs were configured successfully: on success the device blocks until it goes idle and StateCallbackKK.onIdle() is called back; the configuration may also fail, for example when the format or size is unsupported, in which case StateCallbackKK.onUnconfigured() is called back. Either way, a new CameraCaptureSessionImpl instance is created. If configuration succeeded, onConfigured in the CameraCaptureSession.StateCallback above is invoked with that CameraCaptureSessionImpl instance as its argument, which lands in OneCameraImpl.java as mCaptureSession = session. In other words, only when configuration succeeds does the code go on to issue the preview request, repeatingPreview.
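As an app-level reference for the paragraph above, here is a minimal sketch of the public createCaptureSession call. The names device, previewSurface and handler are assumed to be supplied by the caller; note that the failure path exposed to applications is onConfigureFailed, while the StateCallbackKK.onUnconfigured mentioned above is internal to the framework.

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.os.Handler;
import android.view.Surface;
import java.util.Arrays;

void createPreviewSession(CameraDevice device, Surface previewSurface, Handler handler)
        throws CameraAccessException {
    device.createCaptureSession(Arrays.asList(previewSurface),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    // Streams are configured: it is now safe to issue the repeating preview
                    // request (repeatingPreview() in the flow above).
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                    // Configuration failed, e.g. an unsupported output size/format combination.
                }
            }, handler);
}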

OneCameraImpl.java

private boolean repeatingPreview(Object tag) {
    CaptureRequest.Builder builder = mDevice.
            createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    mCaptureSession.setRepeatingRequest(builder.build(), mCaptureCallback,
            mCameraHandler);
}

If the request is built successfully, we are ready to capture.
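For completeness, the excerpt above is abridged; a preview request normally also names the output Surface it should be rendered into via addTarget. A minimal sketch of a full repeating preview request, with the arguments assumed to be supplied by the caller, might look like this:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

void startRepeatingPreview(CameraDevice device, CameraCaptureSession session, Surface previewSurface,
        CameraCaptureSession.CaptureCallback callback, Handler handler) throws CameraAccessException {
    CaptureRequest.Builder builder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);   // route preview frames to this Surface
    builder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);   // optional: continuous autofocus
    session.setRepeatingRequest(builder.build(), callback, handler);
}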

CameraCaptureSessionImpl.java

public synchronized int setRepeatingRequest(CaptureRequest request, CaptureCallback callback,
        Handler handler) throws CameraAccessException {
    // The submitted request and its requestId are queued here. Session creation has to configure
    // the camera device's internal pipeline and allocate memory buffers, which is time-consuming,
    // so the capture request is queued first and executed once the session is ready.
    return addPendingSequence(mDeviceImpl.setRepeatingRequest(request,
            createCaptureCallbackProxy(handler, callback), mDeviceHandler));
}

The argument createCaptureCallbackProxy(handler, callback) adapts the callback from CameraDeviceImpl.CaptureCallback to CameraCaptureSession.CaptureCallback; its most important method is onCaptureCompleted. The callback here is mCaptureCallback in OneCameraImpl.java.
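On the application side, that callback is an ordinary CameraCaptureSession.CaptureCallback. A minimal sketch of one is shown below; the class name and log tag are invented, and this only illustrates the role mCaptureCallback plays in OneCameraImpl.java, not its actual implementation.

import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.util.Log;

class PreviewCaptureCallback extends CameraCaptureSession.CaptureCallback {
    @Override
    public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request,
            CaptureResult partialResult) {
        // Partial metadata for an in-flight frame.
    }

    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
            TotalCaptureResult result) {
        // Full metadata for a completed frame, e.g. the auto-focus state.
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
        Log.d("PreviewDemo", "AF state: " + afState);
    }
}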

Continue with how the request is created and submitted.

CameraDeviceImpl.java

public int setRepeatingRequest(CaptureRequest request, CaptureCallback callback,
        Handler handler) throws CameraAccessException {
    return submitCaptureRequest(requestList, callback, handler, /*streaming*/ true);
}

private int submitCaptureRequest(List<CaptureRequest> requestList, CaptureCallback callback,
        Handler handler, boolean repeating) throws CameraAccessException {
    requestInfo = mRemoteDevice.submitRequestList(requestArray, repeating);
    if (callback != null) {
        mCaptureCallbackMap.put(requestInfo.getRequestId(),
                new CaptureCallbackHolder(
                        callback, requestList, handler, repeating, mNextSessionId - 1));
    }
}

The request is submitted through mRemoteDevice, an instance of ICameraDeviceUserWrapper:

ICameraDeviceUserWrapper.java

public SubmitInfo submitRequest(CaptureRequest request, boolean streaming) {
    return mRemoteDevice.submitRequest(request, streaming);
}

Here mRemoteDevice is of type ICameraDeviceUser, and the instance is the one returned by cameraService's connectDevice method.

As mentioned earlier, ICameraDeviceUser corresponds to CameraDeviceClient, and CameraDeviceClient corresponds to CameraService's inner class Client.

ICameraDeviceUser.java and ICameraDeviceUser.cpp are both auto-generated from the aidl file.

In this way the request crosses the process boundary via aidl, from ICameraDeviceUser.java over to CameraDeviceClient.cpp, which is how it reaches cameraservice.

Back to how submitCaptureRequest handles the callback: the callback is wrapped and stored in mCaptureCallbackMap, keyed by the requestId. So when is this callback actually invoked?

As noted when discussing CameraDeviceClient.cpp's initialization, mFrameProcessor is an output-frame metadata processing thread that handles preview-callback work. It waits for new frames from the camera device and then calls the listener method onResultAvailable. That method, CameraDeviceClient::onResultAvailable, in turn executes the callback remoteCb->onResultReceived(result.mMetadata, result.mResultExtras). remoteCb is of type hardware::camera2::ICameraDeviceCallbacks; the callback instance is created in CameraManager.java when the camera device is opened and is passed down via CameraService's connectDevice to CameraDeviceClient. So the actual implementation of this callback is the onResultReceived method of CameraDeviceImpl.java's inner class CameraDeviceCallbacks.

Let's look at the onResultReceived method in CameraDeviceImpl.java:

CameraDeviceImpl.java

public void onResultReceived(CameraMetadataNative result,
        CaptureResultExtras resultExtras) throws RemoteException {
    // Look up the holder by requestId.
    int requestId = resultExtras.getRequestId();
    final CaptureCallbackHolder holder =
            CameraDeviceImpl.this.mCaptureCallbackMap.get(requestId);
    final CaptureRequest request = holder.getRequest(resultExtras.getSubsequenceId());
    // Get the callback from the holder and invoke it, passing in the data resultAsCapture.
    holder.getCallback().onCaptureProgressed(CameraDeviceImpl.this,
            request, resultAsCapture);
    holder.getCallback().onCaptureCompleted(CameraDeviceImpl.this,
            request, resultAsCapture);
}

In this way the data produced by the lower layers reaches the framework and, from there, the application layer.


Capture data and its metadata can also be obtained through an ImageReader.

When the Camera instance is created, an ImageReader is obtained and a listener is set on it. When a new image becomes available, its onImageAvailable callback fires, and inside that callback the image data and metadata are read and stored.


OneCameraImpl.java

private final ImageReader mCaptureImageReader;

ImageReader.OnImageAvailableListener mCaptureImageListener =
        new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                // Add the image data to the latest in-flight capture.
                // If all the data for that capture is complete, store the
                // image data.
                InFlightCapture capture = null;
                synchronized (mCaptureQueue) {
                    if (mCaptureQueue.getFirst().setImage(reader.acquireLatestImage())
                            .isCaptureComplete()) {
                        capture = mCaptureQueue.removeFirst();
                    }
                }
                if (capture != null) {
                    onCaptureCompleted(capture);
                }
            }
        };

Obtain the ImageReader instance and set the listener:

OneCameraImpl(CameraDevice device, CameraCharacteristics characteristics, Size pictureSize) {
    mCaptureImageReader = ImageReader.newInstance(pictureSize.getWidth(),
            pictureSize.getHeight(),
            sCaptureImageFormat, 2);
    mCaptureImageReader.setOnImageAvailableListener(mCaptureImageListener, mCameraHandler);
}

When a capture completes, onCaptureCompleted is called back:


private void onCaptureCompleted(InFlightCapture capture) {
    // Experimental support for writing RAW. We do not have a usable JPEG
    // here, so we don't use the usual capture session mechanism and instead
    // just store the RAW file in its own directory.
    // TODO: If we make this a real feature we should probably put the DNGs
    // into the Camera directly.
    // The raw image together with its metadata can be stored as a DNG...
    if (sCaptureImageFormat == ImageFormat.RAW_SENSOR) {
        if (!RAW_DIRECTORY.exists()) {
            if (!RAW_DIRECTORY.mkdirs()) {
                throw new RuntimeException("Could not create RAW directory.");
            }
        }
        File dngFile = new File(RAW_DIRECTORY, capture.session.getTitle() + ".dng");
        writeDngBytesAndClose(capture.image, capture.totalCaptureResult,
                mCharacteristics, dngFile);
    } else {
        // ...or a JPEG can be stored instead.
        // Since this is not an HDR+ session, we will just save the
        // result.
        byte[] imageBytes = acquireJpegBytesAndClose(capture.image);
        saveJpegPicture(imageBytes, capture.parameters, capture.session,
                capture.totalCaptureResult);
    }
    broadcastReadyState(true);
    capture.parameters.callback.onPictureTaken(capture.session);
}

writeDngBytesAndClose is called to store the image together with its metadata:

private static void writeDngBytesAndClose(Image image, TotalCaptureResult captureResult,
        CameraCharacteristics characteristics, File dngFile) {
    try (DngCreator dngCreator = new DngCreator(characteristics, captureResult);
            FileOutputStream outputStream = new FileOutputStream(dngFile)) {
        // TODO: Add DngCreator#setThumbnail and add the DNG to the normal
        // filmstrip.
        dngCreator.writeImage(outputStream, image);
        outputStream.close();
        image.close();
    } catch (IOException e) {
        Log.e(TAG, "Could not store DNG file", e);
        return;
    }
    Log.i(TAG, "Successfully stored DNG file: " + dngFile.getAbsolutePath());
}
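The JPEG branch above calls acquireJpegBytesAndClose, which is not shown in the excerpt. As a rough sketch of what such a helper does for an ImageFormat.JPEG image (simplified, ignoring cropping and reprocessing cases), it would look something like this:

import android.media.Image;
import java.nio.ByteBuffer;

static byte[] acquireJpegBytes(Image image) {
    try {
        // For ImageFormat.JPEG the compressed data lives entirely in plane 0.
        ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes);
        return bytes;
    } finally {
        // Always close the Image so the ImageReader can reuse its buffer.
        image.close();
    }
}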


