Android Multimedia Framework Summary (14): A First Look at the Camera Framework, with Custom Camera Examples


If you reprint this article, please keep the source link at the top and the QR code at the bottom. This article is by 逆流的鱼yuiop: http://blog.csdn.net/hejjunlin/article/details/52738492

Preface: With the National Day holiday over, a new month begins. Last month centered on MediaPlayer; starting today we move on to the Camera module of the multimedia framework. Today's agenda:

  • Taking pictures with Camera
  • Recording video with Camera
  • The new android.hardware.camera2 API
  • Feature comparison of the old and new APIs
  • A custom camera built on Camera
  • A custom camera built on the new android.hardware.camera2 API

The Camera class is used to set image capture parameters, start/stop the preview, take snapshots, and retrieve frames for video encoding. Camera acts as a client of the camera service, which manages the camera hardware.
To use the device camera, the corresponding permissions must be declared in the manifest; for example, if auto-focus is needed, AndroidManifest.xml must contain:

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />

Taking Pictures with Camera

Taking a picture with Camera follows these steps (a minimal sketch of the flow appears after the list):

  • Obtain a Camera instance via open().
  • If necessary, modify some of the default parameters.
  • Pass a fully initialized SurfaceHolder to setPreviewDisplay(SurfaceHolder); without a Surface the camera cannot start previewing.
  • Call startPreview() to start updating the preview surface; the preview must be running before a picture can be taken.
  • When you want to take a picture, call takePicture(Camera.ShutterCallback,
    Camera.PictureCallback, Camera.PictureCallback, Camera.PictureCallback) and wait for the callbacks to deliver the actual image data.
  • After a picture is taken, the preview stops; to take more pictures, call startPreview() again.
  • Call stopPreview() to stop updating the preview surface.
  • Call release() to release the camera immediately.
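
A minimal sketch of this flow, assuming an Activity that already owns a SurfaceHolder (the helper class, method names, and error handling are illustrative, not from the original post):

import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.FileOutputStream;
import java.io.IOException;

public class SimpleCaptureHelper {
    private Camera mCamera;

    // Steps 1-4: open the camera, bind the preview surface, start the preview.
    public void startPreview(SurfaceHolder holder) throws IOException {
        mCamera = Camera.open();            // obtain a Camera instance
        mCamera.setPreviewDisplay(holder);  // preview needs a Surface
        mCamera.startPreview();             // must be running before takePicture()
    }

    // Step 5: take a picture; only the JPEG callback is used here.
    public void capture(final String path) {
        mCamera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                try (FileOutputStream out = new FileOutputStream(path)) {
                    out.write(data);        // save the compressed JPEG bytes
                } catch (IOException e) {
                    e.printStackTrace();
                }
                camera.startPreview();      // preview stops after a shot; restart it
            }
        });
    }

    // Steps 7-8: stop the preview and release the camera.
    public void release() {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }
}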

Recording Video with Camera

The above covers still capture. To switch to video recording mode, follow these steps instead (a minimal recording sketch appears after the list):

  • Obtain and initialize a Camera and start the preview, as described above.
  • Call unlock() to allow the media process to access the camera.
  • Pass the camera to MediaRecorder.setCamera(Camera); see MediaRecorder for details.
  • When recording is finished, call reconnect() to re-acquire and re-lock the camera.
  • Call stopPreview() and release() as described above.
  • The Camera class is not thread-safe and is meant to be used from a single event thread. Most long-running operations (preview, focus, photo capture, etc.) happen asynchronously and invoke callbacks as needed; callbacks are delivered on the event thread, and Camera methods must never be called from more than one thread at a time.
  • Different Android devices may have different hardware capabilities, such as megapixel ratings and auto-focus support.
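
A minimal recording sketch built on these steps, assuming the Camera is already open and previewing and that the output path is supplied by the caller (class and method names are illustrative):

import android.hardware.Camera;
import android.media.MediaRecorder;
import java.io.IOException;

public class SimpleRecorderHelper {
    private final Camera mCamera;
    private MediaRecorder mRecorder;

    public SimpleRecorderHelper(Camera camera) {
        mCamera = camera;                 // already opened and previewing
    }

    public void startRecording(String outputPath) throws IOException {
        mCamera.unlock();                 // let the media process access the camera
        mRecorder = new MediaRecorder();
        mRecorder.setCamera(mCamera);     // hand the camera to MediaRecorder
        mRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        mRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        mRecorder.setOutputFile(outputPath);
        mRecorder.prepare();
        mRecorder.start();
    }

    public void stopRecording() throws IOException {
        mRecorder.stop();
        mRecorder.reset();
        mRecorder.release();
        mRecorder = null;
        mCamera.reconnect();              // re-acquire and re-lock the camera
    }
}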

The New API: android.hardware.camera2

Starting with API level 21 (Android 5.0), a new android.hardware.camera2 package was added to replace the original Camera.java.

The Android platform supports taking pictures and recording video either through the android.hardware.camera2 APIs or through a camera Intent. The related classes are:

  • android.hardware.camera2
    The primary API for controlling camera devices; used for taking pictures and recording video when building a camera app.
  • Camera
    The class that controlled camera devices before Android 5.0.
  • SurfaceView
    Presents the live preview to the user.
  • MediaRecorder
    Used to record video.
  • Intent
    The intent actions MediaStore.ACTION_IMAGE_CAPTURE and MediaStore.ACTION_VIDEO_CAPTURE take a picture or record a video without using the camera classes directly (a third-party camera app may be launched instead); a minimal sketch follows this list.
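
A minimal sketch of the Intent route, assuming it is called from an Activity and that the result is handled in onActivityResult (the request code and null checks are illustrative):

import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.provider.MediaStore;

public class IntentCaptureActivity extends Activity {
    private static final int REQUEST_IMAGE_CAPTURE = 100;

    // Launch the system (or any installed) camera app to take a picture.
    private void dispatchTakePictureIntent() {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (intent.resolveActivity(getPackageManager()) != null) {
            startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
        }
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK && data != null) {
            // Without EXTRA_OUTPUT, the camera app returns a small thumbnail in the extras.
            Bitmap thumbnail = (Bitmap) data.getExtras().get("data");
            // ... use the thumbnail, or pass an EXTRA_OUTPUT Uri to get a full-size file.
        }
    }
}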


Feature Comparison of the Old and New APIs

Compared with the original Camera API, android.hardware.camera2 differs in the following ways (a manual-control sketch appears after the list):

  • Native support for RAW photo output and burst-capture mode.
  • Capture speed is limited by the hardware rather than the software. On a Nexus 5, for example, Android L can shoot bursts at up to 30 fps at full resolution.
  • Full manual control of shutter speed, ISO sensitivity, focus, metering, hardware video stabilization, and other parameters is integrated into the new API.
  • Manual controls added in the new API include: ISO sensitivity, manual focus / AF toggle, AE/AF/AWB modes, AE/AWB lock, hardware video stabilization, and burst frames.
  • Zooming can be done with a single finger.
  • QR code recognition is supported.
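
As an illustration of the new manual controls, a fragment that fixes exposure time, ISO, and focus distance on a camera2 capture request could look like the sketch below. This assumes a CameraDevice and a configured CameraCaptureSession already exist, as in the full example later in this post; the class and values are illustrative only.

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.view.Surface;

public class ManualControlSample {
    // Sketch: fully manual exposure/focus on an already-configured session.
    static void applyManualSettings(CameraDevice device,
                                    CameraCaptureSession session,
                                    Surface previewSurface) throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
        builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10000000L); // 10 ms, in nanoseconds
        builder.set(CaptureRequest.SENSOR_SENSITIVITY, 400);         // ISO 400
        builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
        builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 0.0f);       // 0 = focus at infinity
        session.setRepeatingRequest(builder.build(), null, null);
    }
}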

Example

There are two ways to use the camera: launch an existing camera app through an Intent, or build your own app with the Camera API. This article takes pictures through the Camera API.

Taking pictures through the Camera API involves a few key classes (the SurfacePreview sketch in the custom camera section below illustrates the callbacks):

  • Camera: the central class, used to manage the camera device. The methods used in this article are:

    • open(): obtains a Camera instance.
    • setPreviewDisplay(SurfaceHolder): sets the surface on which the preview is displayed.
    • startPreview(): starts the preview.
    • stopPreview(): stops the preview.
    • release(): releases the Camera instance.
    • takePicture(Camera.ShutterCallback shutter, Camera.PictureCallback raw, Camera.PictureCallback jpeg): the method that actually takes the picture, with three callback parameters. shutter is invoked at the moment the shutter fires, raw receives the raw image data, and jpeg receives the image data compressed to JPEG. This article implements only the last one; see below.
    • Camera.PictureCallback: this callback interface contains a single method, onPictureTaken(byte[] data, Camera camera), in which the image data can be saved.
  • SurfaceView: controls the preview surface.

  • SurfaceHolder.Callback: handles preview surface events; the following three methods must be implemented:

    • surfaceCreated(SurfaceHolder holder): called when the preview surface is created. The surface is recreated whenever it changes, so acquire the camera here and bind it to the SurfaceHolder.
    • surfaceChanged(SurfaceHolder holder, int format, int width, int height): called when the preview surface changes; restart the preview after every change.
    • surfaceDestroyed(SurfaceHolder holder): called when the preview surface is destroyed; stop the preview and release the resources.

Implementing Photo Capture with the Camera Class

Using the Camera class directly provides richer functionality than the Intent approach. Creating a customized camera usually involves the following steps:

(1) Obtain a Camera instance with Camera.open().
(2) Create a Preview class that extends SurfaceView and implements the SurfaceHolder.Callback interface.
(3) Set the Preview on the camera.
(4) Build a layout containing the Preview to display the camera feed.
(5) Register a listener for the shutter to receive the capture callback.
(6) Take the picture and save the file.
(7) Release the Camera.


A Custom Camera with the Camera API

The CameraSample code is as follows:
MainActivity.java
(code shown as screenshots in the original post)
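
Since the original code is only available as screenshots, here is a minimal sketch of what a MainActivity for this sample could look like; the package, class, resource, and file names are assumptions, and SurfacePreview is sketched in the next block.

package com.hejunlin.camerasample;

import android.app.Activity;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.Environment;
import android.view.View;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class MainActivity extends Activity {

    private SurfacePreview mPreview;   // the SurfaceView subclass sketched below

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mPreview = (SurfacePreview) findViewById(R.id.surface_preview);
        findViewById(R.id.btn_capture).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                takePicture();
            }
        });
    }

    private void takePicture() {
        // Only the JPEG callback is implemented; the shutter and raw callbacks are null.
        mPreview.getCamera().takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                File dir = Environment.getExternalStoragePublicDirectory(
                        Environment.DIRECTORY_PICTURES);
                File picture = new File(dir, "camera_sample_" + System.currentTimeMillis() + ".jpg");
                try (FileOutputStream out = new FileOutputStream(picture)) {
                    out.write(data);
                } catch (IOException e) {
                    e.printStackTrace();
                }
                camera.startPreview();  // resume the preview after the shot
            }
        });
    }
}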

SurfacePreview.java
(code shown as screenshots in the original post)
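
Again the screenshots are not reproduced here; below is a minimal sketch of a SurfaceView subclass implementing SurfaceHolder.Callback, following the three callback responsibilities described above (the details are assumptions, not taken from the screenshots).

package com.hejunlin.camerasample;

import android.content.Context;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

public class SurfacePreview extends SurfaceView implements SurfaceHolder.Callback {

    private Camera mCamera;

    public SurfacePreview(Context context, AttributeSet attrs) {
        super(context, attrs);
        getHolder().addCallback(this);  // receive surface lifecycle events
    }

    public Camera getCamera() {
        return mCamera;
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Acquire the camera and bind it to this surface when the surface is created.
        mCamera = Camera.open();
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // The surface may change (e.g. on rotation); restart the preview each time.
        mCamera.stopPreview();
        mCamera.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Stop the preview and release the camera when the surface goes away.
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
}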

activity_main.xml
(layout shown as a screenshot in the original post)
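
A matching layout sketch; the view IDs correspond to the MainActivity sketch above and are assumptions:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.hejunlin.camerasample.SurfacePreview
        android:id="@+id/surface_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <Button
        android:id="@+id/btn_capture"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_centerHorizontal="true"
        android:layout_marginBottom="16dp"
        android:text="Capture" />

</RelativeLayout>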

Result:

(screenshot in the original post)


A Custom Camera with the New android.hardware.camera2 API

Next, let's build a custom camera with the new API:
MainActivity.java

package com.hejunlin.camerasample2;

import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;

public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (null == savedInstanceState) {
            getFragmentManager().beginTransaction()
                    .replace(R.id.container, Camera2Fragment.newInstance())
                    .commit();
        }
    }
}

Camera2Fragment.java

package com.hejunlin.camerasample2;import android.Manifest;import android.app.Activity;import android.app.AlertDialog;import android.app.Dialog;import android.app.DialogFragment;import android.app.Fragment;import android.content.Context;import android.content.DialogInterface;import android.content.pm.PackageManager;import android.content.res.Configuration;import android.graphics.ImageFormat;import android.graphics.Matrix;import android.graphics.Point;import android.graphics.RectF;import android.graphics.SurfaceTexture;import android.hardware.camera2.CameraAccessException;import android.hardware.camera2.CameraCaptureSession;import android.hardware.camera2.CameraCharacteristics;import android.hardware.camera2.CameraDevice;import android.hardware.camera2.CameraManager;import android.hardware.camera2.CameraMetadata;import android.hardware.camera2.CaptureRequest;import android.hardware.camera2.CaptureResult;import android.hardware.camera2.TotalCaptureResult;import android.hardware.camera2.params.StreamConfigurationMap;import android.media.Image;import android.media.ImageReader;import android.os.Bundle;import android.os.Environment;import android.os.Handler;import android.os.HandlerThread;import android.support.annotation.NonNull;import android.support.v13.app.FragmentCompat;import android.support.v4.content.ContextCompat;import android.util.Log;import android.util.Size;import android.util.SparseIntArray;import android.view.LayoutInflater;import android.view.Surface;import android.view.TextureView;import android.view.View;import android.view.ViewGroup;import android.widget.Toast;import java.io.File;import java.io.FileNotFoundException;import java.io.FileOutputStream;import java.io.IOException;import java.nio.ByteBuffer;import java.text.SimpleDateFormat;import java.util.ArrayList;import java.util.Arrays;import java.util.Collections;import java.util.Comparator;import java.util.Date;import java.util.List;import java.util.concurrent.Semaphore;import java.util.concurrent.TimeUnit;public class Camera2Fragment extends Fragment        implements View.OnClickListener, FragmentCompat.OnRequestPermissionsResultCallback {    /**     * Conversion from screen rotation to JPEG orientation.     */    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();    private static final int REQUEST_CAMERA_PERMISSION = 1;    private static final String FRAGMENT_DIALOG = "dialog";    static {        ORIENTATIONS.append(Surface.ROTATION_0, 90);        ORIENTATIONS.append(Surface.ROTATION_90, 0);        ORIENTATIONS.append(Surface.ROTATION_180, 270);        ORIENTATIONS.append(Surface.ROTATION_270, 180);    }    /**     * Tag for the {@link Log}.     */    private static final String TAG = "Camera2Fragment";    /**     * Camera state: Showing camera preview.     */    private static final int STATE_PREVIEW = 0;    /**     * Camera state: Waiting for the focus to be locked.     */    private static final int STATE_WAITING_LOCK = 1;    /**     * Camera state: Waiting for the exposure to be precapture state.     */    private static final int STATE_WAITING_PRECAPTURE = 2;    /**     * Camera state: Waiting for the exposure state to be something other than precapture.     */    private static final int STATE_WAITING_NON_PRECAPTURE = 3;    /**     * Camera state: Picture was taken.     
*/    private static final int STATE_PICTURE_TAKEN = 4;    /**     * Max preview width that is guaranteed by Camera2 API     */    private static final int MAX_PREVIEW_WIDTH = 1920;    /**     * Max preview height that is guaranteed by Camera2 API     */    private static final int MAX_PREVIEW_HEIGHT = 1080;    /**     * {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a     * {@link TextureView}.     */    private final TextureView.SurfaceTextureListener mSurfaceTextureListener            = new TextureView.SurfaceTextureListener() {        @Override        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {            openCamera(width, height);        }        @Override        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {            configureTransform(width, height);        }        @Override        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {            return true;        }        @Override        public void onSurfaceTextureUpdated(SurfaceTexture texture) {        }    };    /**     * ID of the current {@link CameraDevice}.     */    private String mCameraId;    /**     * An {@link SuperTextureView} for camera preview.     */    private SuperTextureView mTextureView;    /**     * A {@link CameraCaptureSession } for camera preview.     */    private CameraCaptureSession mCaptureSession;    /**     * A reference to the opened {@link CameraDevice}.     */    private CameraDevice mCameraDevice;    /**     * The {@link Size} of camera preview.     */    private Size mPreviewSize;    /**     * {@link CameraDevice.StateCallback} is called when {@link CameraDevice} changes its state.     */    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {        @Override        public void onOpened(@NonNull CameraDevice cameraDevice) {            // This method is called when the camera is opened.  We start camera preview here.            mCameraOpenCloseLock.release();            mCameraDevice = cameraDevice;            createCameraPreviewSession();        }        @Override        public void onDisconnected(@NonNull CameraDevice cameraDevice) {            mCameraOpenCloseLock.release();            cameraDevice.close();            mCameraDevice = null;        }        @Override        public void onError(@NonNull CameraDevice cameraDevice, int error) {            mCameraOpenCloseLock.release();            cameraDevice.close();            mCameraDevice = null;            Activity activity = getActivity();            if (null != activity) {                activity.finish();            }        }    };    /**     * An additional thread for running tasks that shouldn't block the UI.     */    private HandlerThread mBackgroundThread;    /**     * A {@link Handler} for running tasks in the background.     */    private Handler mBackgroundHandler;    /**     * An {@link ImageReader} that handles still image capture.     */    private ImageReader mImageReader;    /**     * This is the output file for our picture.     */    private File mFile;    /**     * This a callback object for the {@link ImageReader}. "onImageAvailable" will be called when a     * still image is ready to be saved.     
*/    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener            = new ImageReader.OnImageAvailableListener() {        @Override        public void onImageAvailable(ImageReader reader) {            mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));        }    };    /**     * {@link CaptureRequest.Builder} for the camera preview     */    private CaptureRequest.Builder mPreviewRequestBuilder;    /**     * {@link CaptureRequest} generated by {@link #mPreviewRequestBuilder}     */    private CaptureRequest mPreviewRequest;    /**     * The current state of camera state for taking pictures.     *     * @see #mCaptureCallback     */    private int mState = STATE_PREVIEW;    /**     * A {@link Semaphore} to prevent the app from exiting before closing the camera.     */    private Semaphore mCameraOpenCloseLock = new Semaphore(1);    /**     * Whether the current camera device supports Flash or not.     */    private boolean mFlashSupported;    /**     * Orientation of the camera sensor     */    private int mSensorOrientation;    /**     * A {@link CameraCaptureSession.CaptureCallback} that handles events related to JPEG capture.     */    private CameraCaptureSession.CaptureCallback mCaptureCallback            = new CameraCaptureSession.CaptureCallback() {        private void process(CaptureResult result) {            switch (mState) {                case STATE_PREVIEW: {                    // We have nothing to do when the camera preview is working normally.                    break;                }                case STATE_WAITING_LOCK: {                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);                    if (afState == null) {                        captureStillPicture();                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {                        // CONTROL_AE_STATE can be null on some devices                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);                        if (aeState == null ||                                aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {                            mState = STATE_PICTURE_TAKEN;                            captureStillPicture();                        } else {                            runPrecaptureSequence();                        }                    }                    break;                }                case STATE_WAITING_PRECAPTURE: {                    // CONTROL_AE_STATE can be null on some devices                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);                    if (aeState == null ||                            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||                            aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {                        mState = STATE_WAITING_NON_PRECAPTURE;                    }                    break;                }                case STATE_WAITING_NON_PRECAPTURE: {                    // CONTROL_AE_STATE can be null on some devices                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {                        mState = STATE_PICTURE_TAKEN;                        captureStillPicture();                    }                    break;                }            }        }  
      @Override        public void onCaptureProgressed(@NonNull CameraCaptureSession session,                                        @NonNull CaptureRequest request,                                        @NonNull CaptureResult partialResult) {            process(partialResult);        }        @Override        public void onCaptureCompleted(@NonNull CameraCaptureSession session,                                       @NonNull CaptureRequest request,                                       @NonNull TotalCaptureResult result) {            process(result);        }    };    /**     * Shows a {@link Toast} on the UI thread.     *     * @param text The message to show     */    private void showToast(final String text) {        final Activity activity = getActivity();        if (activity != null) {            activity.runOnUiThread(new Runnable() {                @Override                public void run() {                    Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();                }            });        }    }    /**     * Given {@code choices} of {@code Size}s supported by a camera, choose the smallest one that     * is at least as large as the respective texture view size, and that is at most as large as the     * respective max size, and whose aspect ratio matches with the specified value. If such size     * doesn't exist, choose the largest one that is at most as large as the respective max size,     * and whose aspect ratio matches with the specified value.     *     * @param choices           The list of sizes that the camera supports for the intended output     *                          class     * @param textureViewWidth  The width of the texture view relative to sensor coordinate     * @param textureViewHeight The height of the texture view relative to sensor coordinate     * @param maxWidth          The maximum width that can be chosen     * @param maxHeight         The maximum height that can be chosen     * @param aspectRatio       The aspect ratio     * @return The optimal {@code Size}, or an arbitrary one if none were big enough     */    private static Size chooseOptimalSize(Size[] choices, int textureViewWidth,            int textureViewHeight, int maxWidth, int maxHeight, Size aspectRatio) {        // Collect the supported resolutions that are at least as big as the preview Surface        List<Size> bigEnough = new ArrayList<>();        // Collect the supported resolutions that are smaller than the preview Surface        List<Size> notBigEnough = new ArrayList<>();        int w = aspectRatio.getWidth();        int h = aspectRatio.getHeight();        for (Size option : choices) {            if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight &&                    option.getHeight() == option.getWidth() * h / w) {                if (option.getWidth() >= textureViewWidth &&                    option.getHeight() >= textureViewHeight) {                    bigEnough.add(option);                } else {                    notBigEnough.add(option);                }            }        }        // Pick the smallest of those big enough. If there is no one big enough, pick the        // largest of those not big enough.        
if (bigEnough.size() > 0) {            return Collections.min(bigEnough, new CompareSizesByArea());        } else if (notBigEnough.size() > 0) {            return Collections.max(notBigEnough, new CompareSizesByArea());        } else {            Log.e(TAG, "Couldn't find any suitable preview size");            return choices[0];        }    }    public static Camera2Fragment newInstance() {        return new Camera2Fragment();    }    @Override    public View onCreateView(LayoutInflater inflater, ViewGroup container,                             Bundle savedInstanceState) {        return inflater.inflate(R.layout.fragment_camera2, container, false);    }    @Override    public void onViewCreated(final View view, Bundle savedInstanceState) {        view.findViewById(R.id.picture).setOnClickListener(this);//        view.findViewById(R.id.info).setOnClickListener(this);        mTextureView = (SuperTextureView) view.findViewById(R.id.texture);    }    @Override    public void onActivityCreated(Bundle savedInstanceState) {        super.onActivityCreated(savedInstanceState);        mFile = getOutputMediaFile();    }    private File getOutputMediaFile(){        //get the mobile Pictures directory        File picDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);        //get the current time        String timeStamp = new SimpleDateFormat("yyyy-MMdd-HHmmss").format(new Date());        return new File(picDir.getPath() + File.separator + "hejunlin_camera2_"+ timeStamp + ".jpg");    }    @Override    public void onResume() {        super.onResume();        startBackgroundThread();        // When the screen is turned off and turned back on, the SurfaceTexture is already        // available, and "onSurfaceTextureAvailable" will not be called. In that case, we can open        // a camera and start preview from here (otherwise, we wait until the surface is ready in        // the SurfaceTextureListener).        if (mTextureView.isAvailable()) {            openCamera(mTextureView.getWidth(), mTextureView.getHeight());        } else {            mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);        }    }    @Override    public void onPause() {        closeCamera();        stopBackgroundThread();        super.onPause();    }    private void requestCameraPermission() {        if (FragmentCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) {            new ConfirmationDialog().show(getChildFragmentManager(), FRAGMENT_DIALOG);        } else {            FragmentCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA},                    REQUEST_CAMERA_PERMISSION);        }    }    @Override    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,                                           @NonNull int[] grantResults) {        if (requestCode == REQUEST_CAMERA_PERMISSION) {            if (grantResults.length != 1 || grantResults[0] != PackageManager.PERMISSION_GRANTED) {                ErrorDialog.newInstance("需要申请摄像头权限")                        .show(getChildFragmentManager(), FRAGMENT_DIALOG);            }        } else {            super.onRequestPermissionsResult(requestCode, permissions, grantResults);        }    }    /**     * Sets up member variables related to camera.     
*     * @param width  The width of available size for camera preview     * @param height The height of available size for camera preview     */    private void setUpCameraOutputs(int width, int height) {        Activity activity = getActivity();        CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);        try {            for (String cameraId : manager.getCameraIdList()) {                CameraCharacteristics characteristics                        = manager.getCameraCharacteristics(cameraId);                // We don't use a front facing camera in this sample.                Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);                if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {                    continue;                }                StreamConfigurationMap map = characteristics.get(                        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);                if (map == null) {                    continue;                }                // For still image captures, we use the largest available size.                Size largest = Collections.max(                        Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),                        new CompareSizesByArea());                mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),                        ImageFormat.JPEG, /*maxImages*/2);                mImageReader.setOnImageAvailableListener(                        mOnImageAvailableListener, mBackgroundHandler);                // Find out if we need to swap dimension to get the preview size relative to sensor                // coordinate.                int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();                //noinspection ConstantConditions                mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);                boolean swappedDimensions = false;                switch (displayRotation) {                    case Surface.ROTATION_0:                    case Surface.ROTATION_180:                        if (mSensorOrientation == 90 || mSensorOrientation == 270) {                            swappedDimensions = true;                        }                        break;                    case Surface.ROTATION_90:                    case Surface.ROTATION_270:                        if (mSensorOrientation == 0 || mSensorOrientation == 180) {                            swappedDimensions = true;                        }                        break;                    default:                        Log.e(TAG, "Display rotation is invalid: " + displayRotation);                }                Point displaySize = new Point();                activity.getWindowManager().getDefaultDisplay().getSize(displaySize);                int rotatedPreviewWidth = width;                int rotatedPreviewHeight = height;                int maxPreviewWidth = displaySize.x;                int maxPreviewHeight = displaySize.y;                if (swappedDimensions) {                    rotatedPreviewWidth = height;                    rotatedPreviewHeight = width;                    maxPreviewWidth = displaySize.y;                    maxPreviewHeight = displaySize.x;                }                if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {                    maxPreviewWidth = MAX_PREVIEW_WIDTH;                }                if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {             
       maxPreviewHeight = MAX_PREVIEW_HEIGHT;                }                // Danger, W.R.! Attempting to use too large a preview size could  exceed the camera                // bus' bandwidth limitation, resulting in gorgeous previews but the storage of                // garbage capture data.                mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),                        rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,                        maxPreviewHeight, largest);                // We fit the aspect ratio of TextureView to the size of preview we picked.                int orientation = getResources().getConfiguration().orientation;                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {//                    mTextureView.setAspectRatio(//                            mPreviewSize.getWidth(), mPreviewSize.getHeight());                } else {//                    mTextureView.setAspectRatio(//                            mPreviewSize.getHeight(), mPreviewSize.getWidth());                }                // Check if the flash is supported.                Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);                mFlashSupported = available == null ? false : available;                mCameraId = cameraId;                return;            }        } catch (CameraAccessException e) {            e.printStackTrace();        } catch (NullPointerException e) {            // Currently an NPE is thrown when the Camera2API is used but not supported on the            // device this code runs.            ErrorDialog.newInstance("该设备不支持Camera2API")                    .show(getChildFragmentManager(), FRAGMENT_DIALOG);        }    }    /**     * Opens the camera specified by {@link Camera2Fragment#mCameraId}.     */    private void openCamera(int width, int height) {        if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)                != PackageManager.PERMISSION_GRANTED) {            requestCameraPermission();            return;        }        setUpCameraOutputs(width, height);        configureTransform(width, height);        Activity activity = getActivity();        CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);        try {            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {                throw new RuntimeException("Time out waiting to lock camera opening.");            }            manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);        } catch (CameraAccessException e) {            e.printStackTrace();        } catch (InterruptedException e) {            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);        }    }    /**     * Closes the current {@link CameraDevice}.     
*/    private void closeCamera() {        try {            mCameraOpenCloseLock.acquire();            if (null != mCaptureSession) {                mCaptureSession.close();                mCaptureSession = null;            }            if (null != mCameraDevice) {                mCameraDevice.close();                mCameraDevice = null;            }            if (null != mImageReader) {                mImageReader.close();                mImageReader = null;            }        } catch (InterruptedException e) {            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);        } finally {            mCameraOpenCloseLock.release();        }    }    /**     * Starts a background thread and its {@link Handler}.     */    private void startBackgroundThread() {        mBackgroundThread = new HandlerThread("CameraBackground");        mBackgroundThread.start();        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());    }    /**     * Stops the background thread and its {@link Handler}.     */    private void stopBackgroundThread() {        mBackgroundThread.quitSafely();        try {            mBackgroundThread.join();            mBackgroundThread = null;            mBackgroundHandler = null;        } catch (InterruptedException e) {            e.printStackTrace();        }    }    /**     * Creates a new {@link CameraCaptureSession} for camera preview.     */    private void createCameraPreviewSession() {        try {            SurfaceTexture texture = mTextureView.getSurfaceTexture();            assert texture != null;            // We configure the size of default buffer to be the size of camera preview we want.            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());            // This is the output Surface we need to start preview.            Surface surface = new Surface(texture);            // We set up a CaptureRequest.Builder with the output Surface.            mPreviewRequestBuilder                    = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);            mPreviewRequestBuilder.addTarget(surface);            // Here, we create a CameraCaptureSession for camera preview.            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),                    new CameraCaptureSession.StateCallback() {                        @Override                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {                            // The camera is already closed                            if (null == mCameraDevice) {                                return;                            }                            // When the session is ready, we start displaying the preview.                            mCaptureSession = cameraCaptureSession;                            try {                                // Auto focus should be continuous for camera preview.                                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,                                        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);                                // Flash is automatically enabled when necessary.                                setAutoFlash(mPreviewRequestBuilder);                                // Finally, we start displaying the camera preview.                                
mPreviewRequest = mPreviewRequestBuilder.build();                                mCaptureSession.setRepeatingRequest(mPreviewRequest,                                        mCaptureCallback, mBackgroundHandler);                            } catch (CameraAccessException e) {                                e.printStackTrace();                            }                        }                        @Override                        public void onConfigureFailed(                                @NonNull CameraCaptureSession cameraCaptureSession) {                            showToast("Failed");                        }                    }, null            );        } catch (CameraAccessException e) {            e.printStackTrace();        }    }    /**     * Configures the necessary {@link Matrix} transformation to `mTextureView`.     * This method should be called after the camera preview size is determined in     * setUpCameraOutputs and also the size of `mTextureView` is fixed.     *     * @param viewWidth  The width of `mTextureView`     * @param viewHeight The height of `mTextureView`     */    private void configureTransform(int viewWidth, int viewHeight) {        Activity activity = getActivity();        if (null == mTextureView || null == mPreviewSize || null == activity) {            return;        }        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();        Matrix matrix = new Matrix();        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());        float centerX = viewRect.centerX();        float centerY = viewRect.centerY();        if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);            float scale = Math.max(                    (float) viewHeight / mPreviewSize.getHeight(),                    (float) viewWidth / mPreviewSize.getWidth());            matrix.postScale(scale, scale, centerX, centerY);            matrix.postRotate(90 * (rotation - 2), centerX, centerY);        } else if (Surface.ROTATION_180 == rotation) {            matrix.postRotate(180, centerX, centerY);        }        mTextureView.setTransform(matrix);    }    /**     * Initiate a still image capture.     */    private void takePicture() {        lockFocus();    }    /**     * Lock the focus as the first step for a still image capture.     */    private void lockFocus() {        try {            // This is how to tell the camera to lock focus.            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,                    CameraMetadata.CONTROL_AF_TRIGGER_START);            // Tell #mCaptureCallback to wait for the lock.            mState = STATE_WAITING_LOCK;            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,                    mBackgroundHandler);        } catch (CameraAccessException e) {            e.printStackTrace();        }    }    /**     * Run the precapture sequence for capturing a still image. This method should be called when     * we get a response in {@link #mCaptureCallback} from {@link #lockFocus()}.     */    private void runPrecaptureSequence() {        try {            // This is how to tell the camera to trigger.            
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,                    CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);            // Tell #mCaptureCallback to wait for the precapture sequence to be set.            mState = STATE_WAITING_PRECAPTURE;            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,                    mBackgroundHandler);        } catch (CameraAccessException e) {            e.printStackTrace();        }    }    /**     * Capture a still picture. This method should be called when we get a response in     * {@link #mCaptureCallback} from both {@link #lockFocus()}.     */    private void captureStillPicture() {        try {            final Activity activity = getActivity();            if (null == activity || null == mCameraDevice) {                return;            }            // This is the CaptureRequest.Builder that we use to take a picture.            final CaptureRequest.Builder captureBuilder =                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);            captureBuilder.addTarget(mImageReader.getSurface());            // Use the same AE and AF modes as the preview.            captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);            setAutoFlash(captureBuilder);            // Orientation            int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();            captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));            CameraCaptureSession.CaptureCallback CaptureCallback                    = new CameraCaptureSession.CaptureCallback() {                @Override                public void onCaptureCompleted(@NonNull CameraCaptureSession session,                                               @NonNull CaptureRequest request,                                               @NonNull TotalCaptureResult result) {                    showToast("拍照成功,图片保存为: " + mFile);                    Log.d(TAG, mFile.toString());                    unlockFocus();                }            };            mCaptureSession.stopRepeating();            mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);        } catch (CameraAccessException e) {            e.printStackTrace();        }    }    /**     * Retrieves the JPEG orientation from the specified screen rotation.     *     * @param rotation The screen rotation.     * @return The JPEG orientation (one of 0, 90, 270, and 360)     */    private int getOrientation(int rotation) {        // Sensor orientation is 90 for most devices, or 270 for some devices (eg. Nexus 5X)        // We have to take that into account and rotate JPEG properly.        // For devices with orientation of 90, we simply return our mapping from ORIENTATIONS.        // For devices with orientation of 270, we need to rotate the JPEG 180 degrees.        return (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;    }    /**     * Unlock the focus. This method should be called when still image capture sequence is     * finished.     
*/    private void unlockFocus() {        try {            // Reset the auto-focus trigger            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,                    CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);            setAutoFlash(mPreviewRequestBuilder);            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,                    mBackgroundHandler);            // After this, the camera will go back to the normal state of preview.            mState = STATE_PREVIEW;            mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,                    mBackgroundHandler);        } catch (CameraAccessException e) {            e.printStackTrace();        }    }    @Override    public void onClick(View view) {        switch (view.getId()) {            case R.id.picture: {                takePicture();                break;            }        }    }    private void setAutoFlash(CaptureRequest.Builder requestBuilder) {        if (mFlashSupported) {            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE,                    CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);        }    }    /**     * Saves a JPEG {@link Image} into the specified {@link File}.     */    private static class ImageSaver implements Runnable {        /**         * The JPEG image         */        private final Image mImage;        /**         * The file we save the image into.         */        private final File mFile;        public ImageSaver(Image image, File file) {            mImage = image;            mFile = file;        }        @Override        public void run() {            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();            byte[] bytes = new byte[buffer.remaining()];            buffer.get(bytes);            FileOutputStream output = null;            try {                output = new FileOutputStream(mFile);                output.write(bytes);            } catch (IOException e) {                e.printStackTrace();            } finally {                mImage.close();                if (null != output) {                    try {                        output.close();                    } catch (IOException e) {                        e.printStackTrace();                    }                }            }        }    }    /**     * Compares two {@code Size}s based on their areas.     */    static class CompareSizesByArea implements Comparator<Size> {        @Override        public int compare(Size lhs, Size rhs) {            // We cast here to ensure the multiplications won't overflow            return Long.signum((long) lhs.getWidth() * lhs.getHeight() -                    (long) rhs.getWidth() * rhs.getHeight());        }    }    /**     * Shows an error message dialog.     
*/    public static class ErrorDialog extends DialogFragment {        private static final String ARG_MESSAGE = "message";        public static ErrorDialog newInstance(String message) {            ErrorDialog dialog = new ErrorDialog();            Bundle args = new Bundle();            args.putString(ARG_MESSAGE, message);            dialog.setArguments(args);            return dialog;        }        @Override        public Dialog onCreateDialog(Bundle savedInstanceState) {            final Activity activity = getActivity();            return new AlertDialog.Builder(activity)                    .setMessage(getArguments().getString(ARG_MESSAGE))                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {                        @Override                        public void onClick(DialogInterface dialogInterface, int i) {                            activity.finish();                        }                    })                    .create();        }    }    /**     * Shows OK/Cancel confirmation dialog about camera permission.     */    public static class ConfirmationDialog extends DialogFragment {        @Override        public Dialog onCreateDialog(Bundle savedInstanceState) {            final Fragment parent = getParentFragment();            return new AlertDialog.Builder(getActivity())                    .setMessage("需要申请摄像头权限")                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {                        @Override                        public void onClick(DialogInterface dialog, int which) {                            FragmentCompat.requestPermissions(parent,                                    new String[]{Manifest.permission.CAMERA},                                    REQUEST_CAMERA_PERMISSION);                        }                    })                    .setNegativeButton(android.R.string.cancel,                            new DialogInterface.OnClickListener() {                                @Override                                public void onClick(DialogInterface dialog, int which) {                                    Activity activity = parent.getActivity();                                    if (activity != null) {                                        activity.finish();                                    }                                }                            })                    .create();        }    }}

Layout file (fragment_camera2.xml, referenced from Camera2Fragment)

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.hejunlin.camerasample2.SuperTextureView
        android:id="@+id/texture"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />

    <FrameLayout
        android:id="@+id/control"
        android:layout_width="match_parent"
        android:layout_height="112dp"
        android:layout_alignParentBottom="true"
        android:layout_alignParentStart="true"
        android:background="#000000">

        <ImageView
            android:id="@+id/picture"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_gravity="center"
            android:src="@mipmap/capture" />

    </FrameLayout>

</RelativeLayout>

Result:

(screenshot in the original post)

As the result screenshots show, the new API is more capable and produces better photos.
The complete source of both custom camera samples is available at:
https://github.com/hejunlin2013/MultiMediaSample

To be notified of new posts and get more Android material and source-code analysis, follow my WeChat public account by scanning or long-pressing the QR code below.

(QR code image in the original post)

If you found this helpful, a like or a share of the public account is appreciated; original writing is not easy.
