A Complete Walkthrough of the Android Camera Data Flow
16lz
2021-01-23
I spent quite a bit of time analyzing this data flow. I have not done much Android work, so this post records my own understanding; corrections are welcome for anything I got wrong, and I will keep fixing it over time.
The analysis starts from the app's onCreate: packages/apps/OMAPCamera/src/com/ti/omap4/android/camera/Camera.java
onCreate does a lot of initialization, but the statements we really care about are the following:
```java
// Don't set mSurfaceHolder here. We have it set ONLY within
// surfaceChanged / surfaceDestroyed; other parts of the code
// assume that when it is set, the surface is also set.
SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
SurfaceHolder holder = preview.getHolder();
holder.addCallback(this);
```
SurfaceView is defined at: frameworks/base/core/java/android/view/SurfaceView.java
SurfaceHolder is defined at: frameworks/base/core/java/android/view/SurfaceHolder.java
For background, this article explains things quite well: http://blog.chinaunix.net/uid-9863638-id-1996383.html
SurfaceFlinger is part of Android's multimedia stack. In the Android implementation it is a service that provides system-wide surface composition: it combines the 2D and 3D surfaces of all applications.
Before discussing SurfaceFlinger itself, let's cover some display basics.
Each application may own one or more graphical interfaces, and each such interface is called a surface, or window. In the figure in the linked article there are 4 surfaces: a home screen plus three surfaces in red, green, and blue, while the two buttons are actually content inside the home surface. From this we can see the problems a graphics display system has to solve:
a. Each surface has a position and size on screen, plus the content it displays; content, size, and position may all change as we switch applications. How should such changes be handled?
b. Surfaces may overlap. In that sketch, green covers blue, and red covers green, blue, and the home screen beneath, partly with some transparency. How should this layering be described?
Take the second problem first. Imagine a Z axis perpendicular to the screen plane: each surface's front-to-back position is determined by its coordinate on Z, which describes which surface covers which. This ordering along Z has a standard name in graphics: Z-order.
For the first problem we need a structure that records the interface's position and size, plus a buffer holding the content to display. That is exactly the surface concept: a surface can be thought of as a container holding the control information for an application interface, such as its size and position, together with a buffer dedicated to the content to display.
One problem remains: what to do when surfaces overlap, possibly with transparency. This is what SurfaceFlinger solves. It composes (merges) the surfaces into one main surface, then sends the main surface's content to the FB/V4L2 output, so the screen shows the intended result.
In practice this merge can be done in two ways: in software or in hardware. The software path is SurfaceFlinger; the hardware path is Overlay.
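The two problems above can be made concrete with a toy compositor. This is purely my own illustration, not SurfaceFlinger code: each surface carries a position, size, Z coordinate, alpha, and a buffer of grey-scale "pixels", and the compositor blends them back-to-front into one main buffer.

```cpp
#include <algorithm>
#include <vector>

// Toy model of surface composition. The compositor merges surfaces
// back-to-front (painter's algorithm) into one main buffer -- the essence
// of what SurfaceFlinger does before pushing the result to the display.
struct Surface {
    int x, y, w, h;          // position and size on screen
    int z;                   // Z-order: larger z is drawn on top
    float alpha;             // 1.0 = opaque
    std::vector<float> pix;  // w*h grey-scale pixel values
};

std::vector<float> compose(std::vector<Surface> surfaces, int screenW, int screenH) {
    std::vector<float> main_surface(screenW * screenH, 0.0f);
    // Sort by Z so lower surfaces are blended first.
    std::sort(surfaces.begin(), surfaces.end(),
              [](const Surface& a, const Surface& b) { return a.z < b.z; });
    for (const Surface& s : surfaces)
        for (int r = 0; r < s.h; ++r)
            for (int c = 0; c < s.w; ++c) {
                int X = s.x + c, Y = s.y + r;
                if (X < 0 || X >= screenW || Y < 0 || Y >= screenH) continue;
                float& dst = main_surface[Y * screenW + X];
                // Standard alpha blend: source over destination.
                dst = s.alpha * s.pix[r * s.w + c] + (1.0f - s.alpha) * dst;
            }
    return main_surface;
}
```

An opaque home surface (z=0) covered by a half-transparent surface (z=1) yields a 50/50 mix of the two, which is exactly the layered-with-transparency situation described above.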
First, subclass SurfaceView and implement the SurfaceHolder.Callback interface.
Why the interface is needed: the rule with SurfaceView is that all drawing must happen after the Surface has been created (a Surface can roughly be seen as a mapping of video memory: content written to it can be copied straight to video memory, which makes display very fast) and must finish before the Surface is destroyed. The Callback's surfaceCreated and surfaceDestroyed therefore bound the drawing code.
Methods to override:
(1) public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {} // fired when the surface's size changes
(2) public void surfaceCreated(SurfaceHolder holder) {} // fired on creation; the drawing thread is usually started here
(3) public void surfaceDestroyed(SurfaceHolder holder) {} // fired on destruction; the drawing thread is usually stopped and released here
All of these are reimplemented in the app; the one to focus on is surfaceChanged:
```java
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Make sure we have a surface in the holder before proceeding.
    if (holder.getSurface() == null) {
        Log.d(TAG, "holder.getSurface() == null");
        return;
    }
    Log.v(TAG, "surfaceChanged. w=" + w + ". h=" + h);
    // We need to save the holder for later use, even when the mCameraDevice
    // is null. This could happen if onResume() is invoked after this
    // function.
    mSurfaceHolder = holder;
    // The mCameraDevice will be null if it fails to connect to the camera
    // hardware. In this case we will show a dialog and then finish the
    // activity, so it's OK to ignore it.
    if (mCameraDevice == null) return;
    // Sometimes surfaceChanged is called after onPause or before onResume.
    // Ignore it.
    if (mPausing || isFinishing()) return;
    setSurfaceLayout();
    // Set preview display if the surface is being created. Preview was
    // already started. Also restart the preview if display rotation has
    // changed. Sometimes this happens when the device is held in portrait
    // and camera app is opened. Rotation animation takes some time and
    // display rotation in onCreate may not be what we want.
    if (mCameraState == PREVIEW_STOPPED) { // Check whether the camera is already up; a first launch and
                                           // re-entering with the camera already open take different paths
        startPreview(true);
        startFaceDetection();
    } else {
        if (Util.getDisplayRotation(this) != mDisplayRotation) {
            setDisplayOrientation();
        }
        if (holder.isCreating()) {
            // Set preview display if the surface is being created and preview
            // was already started. That means preview display was set to null
            // and we need to set it now.
            setPreviewDisplay(holder);
        }
    }
    // If first time initialization is not finished, send a message to do
    // it later. We want to finish surfaceChanged as soon as possible to let
    // user see preview first.
    if (!mFirstTimeInitialized) {
        mHandler.sendEmptyMessage(FIRST_TIME_INIT);
    } else {
        initializeSecondTime();
    }
    SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
    CameraInfo info = CameraHolder.instance().getCameraInfo()[mCameraId];
    boolean mirror = (info.facing == CameraInfo.CAMERA_FACING_FRONT);
    int displayRotation = Util.getDisplayRotation(this);
    int displayOrientation = Util.getDisplayOrientation(displayRotation, mCameraId);
    mTouchManager.initialize(preview.getHeight() / 3, preview.getHeight() / 3,
            preview, this, mirror, displayOrientation);
}

private void startPreview(boolean updateAll) {
    if (mPausing || isFinishing()) return;
    mFocusManager.resetTouchFocus();
    mCameraDevice.setErrorCallback(mErrorCallback);
    // If we're previewing already, stop the preview first (this will blank
    // the screen).
    if (mCameraState != PREVIEW_STOPPED) stopPreview();
    setPreviewDisplay(mSurfaceHolder);
    setDisplayOrientation();
    if (!mSnapshotOnIdle) {
        // If the focus mode is continuous autofocus, call cancelAutoFocus to
        // resume it because it may have been paused by an autoFocus call.
        if (Parameters.FOCUS_MODE_CONTINUOUS_PICTURE.equals(mFocusManager.getFocusMode())) {
            mCameraDevice.cancelAutoFocus();
        }
        mFocusManager.setAeAwbLock(false); // Unlock AE and AWB.
    }
    if (updateAll) {
        Log.v(TAG, "Updating all parameters!");
        setCameraParameters(UPDATE_PARAM_INITIALIZE | UPDATE_PARAM_ZOOM | UPDATE_PARAM_PREFERENCE);
    } else {
        setCameraParameters(UPDATE_PARAM_MODE);
    }
    // setCameraParameters(UPDATE_PARAM_ALL);
    // Inform the main thread to go on the UI initialization.
    if (mCameraPreviewThread != null) {
        synchronized (mCameraPreviewThread) {
            mCameraPreviewThread.notify();
        }
    }
    try {
        Log.v(TAG, "startPreview");
        mCameraDevice.startPreview();
    } catch (Throwable ex) {
        closeCamera();
        throw new RuntimeException("startPreview failed", ex);
    }
    mZoomState = ZOOM_STOPPED;
    setCameraState(IDLE);
    mFocusManager.onPreviewStarted();
    if (mTempBracketingEnabled) {
        mFocusManager.setTempBracketingState(FocusManager.TempBracketingStates.ACTIVE);
    }
    if (mSnapshotOnIdle) {
        mHandler.post(mDoSnapRunnable);
    }
}
```
Here I really have to dig in: I had long been looking for what decides whether overlay is used or not, and this setPreviewDisplay method is the "culprit".
The parameter passed into setPreviewDisplay is the SurfaceView. As it travels down toward the HAL layer its form changes, but in my understanding it is like a person changing clothes: Zhang San wears a different outfit today, yet he is the same person as yesterday in different clothes. By the time it reaches the HAL layer this parameter takes the form preview_stream_ops; you will see this below.
In the camerahal setPreviewDisplay method, whether overlay is used is decided by checking whether the preview_stream_ops passed down is null. This is important.
This article only mentions overlay here; in everything below it will not come up again, and the data flow is analyzed throughout assuming overlay is NOT used. Do not confuse the two.
The overlay data-return path will be analyzed in a separate chapter, along with a detailed look at what ultimately decides whether overlay is used or not.
The flow is: app --> frameworks --> (via JNI) --> camera client --> camera service --> (via the hardware interface) --> hal_module --> HAL
It is well worth looking at the call sequence in the camera service layer:
```cpp
// Set the Surface that the preview will use
status_t CameraService::Client::setPreviewDisplay(const sp<Surface>& surface) {
    LOG1("setPreviewDisplay(%p) (pid %d)", surface.get(), getCallingPid());
    sp<IBinder> binder(surface != 0 ? surface->asBinder() : 0);
    sp<ANativeWindow> window(surface);
    return setPreviewWindow(binder, window);
}

status_t CameraService::Client::setPreviewWindow(const sp<IBinder>& binder,
        const sp<ANativeWindow>& window) {
    Mutex::Autolock lock(mLock);
    status_t result = checkPidAndHardware();
    if (result != NO_ERROR) return result;
    // Return if no change in surface.
    if (binder == mSurface) {
        return NO_ERROR;
    }
    if (window != 0) {
        result = native_window_api_connect(window.get(), NATIVE_WINDOW_API_CAMERA);
        if (result != NO_ERROR) {
            LOGE("native_window_api_connect failed: %s (%d)", strerror(-result),
                    result);
            return result;
        }
    }
    // If preview has been already started, register preview buffers now.
    if (mHardware->previewEnabled()) {
        if (window != 0) {
            native_window_set_scaling_mode(window.get(),
                    NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
            native_window_set_buffers_transform(window.get(), mOrientation);
            result = mHardware->setPreviewWindow(window);
        }
    }
    if (result == NO_ERROR) {
        // Everything has succeeded. Disconnect the old window and remember the
        // new window.
        disconnectWindow(mPreviewWindow);
        mSurface = binder;
        mPreviewWindow = window;
    } else {
        // Something went wrong after we connected to the new window, so
        // disconnect here.
        disconnectWindow(window);
    }
    return result;
}

status_t setPreviewWindow(const sp<ANativeWindow>& buf)
{
    LOGV("%s(%s) buf %p", __FUNCTION__, mName.string(), buf.get());
    if (mDevice->ops->set_preview_window) {
        mPreviewWindow = buf;
#ifdef OMAP_ENHANCEMENT_CPCAM
        mHalPreviewWindow.user = mPreviewWindow.get();
#else
        mHalPreviewWindow.user = this;
#endif
        LOGV("%s &mHalPreviewWindow %p mHalPreviewWindow.user %p", __FUNCTION__,
                &mHalPreviewWindow, mHalPreviewWindow.user);
        return mDevice->ops->set_preview_window(mDevice,
                buf.get() ? &mHalPreviewWindow.nw : 0);
    }
    return INVALID_OPERATION;
}
```
When I spoke of an "essential change" above, that is really all it amounts to: dig deeper and this preview_stream_ops is arguably just the surface in another form.
From there the call goes through hardware to the hal module and on into the HAL layer:
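The "same person in a new outfit" idea can be sketched in isolation. The types below are hypothetical stand-ins, NOT the real Android types: a C++ "window" object and a C-style function-pointer table that wraps it, mirroring how the framework hands an ANativeWindow down to the HAL as a preview_stream_ops* whose callbacks forward back to the very same window via an opaque user pointer.

```cpp
#include <cstdio>

// Hypothetical window object standing in for ANativeWindow.
struct FakeWindow {
    int queued = 0;
    void enqueue() { ++queued; }
};

// Hypothetical ops table standing in for preview_stream_ops: plain C
// function pointers plus an opaque pointer back to the real object.
struct fake_stream_ops {
    void* user;
    int (*enqueue_buffer)(fake_stream_ops* ops);
};

static int fake_enqueue(fake_stream_ops* ops) {
    // Take the "outfit" off: recover the original window and call it.
    static_cast<FakeWindow*>(ops->user)->enqueue();
    return 0;
}
```

A HAL-side call through the ops table lands on the same window the app created, which is why checking the ops pointer for null at the HAL is equivalent to asking "did the client give us a surface at all?".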
```cpp
int camera_set_preview_window(struct camera_device* device,
        struct preview_stream_ops* window)
{
    int rv = -EINVAL;
    ti_camera_device_t* ti_dev = NULL;
    LOGV("%s", __FUNCTION__);
    if (!device)
        return rv;
    ti_dev = (ti_camera_device_t*) device;
    rv = gCameraHals[ti_dev->cameraid]->setPreviewWindow(window);
    return rv;
}

status_t CameraHal::setPreviewWindow(struct preview_stream_ops* window)
{
    status_t ret = NO_ERROR;
    CameraAdapter::BuffersDescriptor desc;
    LOG_FUNCTION_NAME;
    mSetPreviewWindowCalled = true;
    // If the Camera service passes a null window, we destroy the existing window and free the DisplayAdapter
    if (!window) {
        if (mDisplayAdapter.get() != NULL) {
            /// NULL window passed, destroy the display adapter if present
            CAMHAL_LOGD("NULL window passed, destroying display adapter");
            mDisplayAdapter.clear();
            /// @remarks If there was a window previously existing, we usually expect another valid window to be passed by the client
            /// @remarks so, we will wait until it passes a valid window to begin the preview again
            mSetPreviewWindowCalled = false;
        }
        CAMHAL_LOGD("NULL ANativeWindow passed to setPreviewWindow");
        return NO_ERROR;
    } else if (mDisplayAdapter.get() == NULL) {
        // Need to create the display adapter since it has not been created
        // Create display adapter
        mDisplayAdapter = new ANativeWindowDisplayAdapter();
        ret = NO_ERROR;
        if (!mDisplayAdapter.get() || ((ret = mDisplayAdapter->initialize()) != NO_ERROR)) {
            if (ret != NO_ERROR) {
                mDisplayAdapter.clear();
                CAMHAL_LOGEA("DisplayAdapter initialize failed");
                LOG_FUNCTION_NAME_EXIT;
                return ret;
            } else {
                CAMHAL_LOGEA("Couldn't create DisplayAdapter");
                LOG_FUNCTION_NAME_EXIT;
                return NO_MEMORY;
            }
        }
        // DisplayAdapter needs to know where to get the CameraFrames from in order to display
        // Since CameraAdapter is the one that provides the frames, set it as the frame provider for DisplayAdapter
        mDisplayAdapter->setFrameProvider(mCameraAdapter);
        // Any dynamic errors that happen during the camera use case have to be propagated back to the application
        // via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
        // Set it as the error handler for the DisplayAdapter
        mDisplayAdapter->setErrorHandler(mAppCallbackNotifier.get());
        // Update the display adapter with the new window that is passed from CameraService
        ret = mDisplayAdapter->setPreviewWindow(window);
        if (ret != NO_ERROR) {
            CAMHAL_LOGEB("DisplayAdapter setPreviewWindow returned error %d", ret);
        }
        if (mPreviewStartInProgress) {
            CAMHAL_LOGDA("setPreviewWindow called when preview running");
            // Start the preview since the window is now available
            ret = startPreview();
        }
    } else {
        // Update the display adapter with the new window that is passed from CameraService
        ret = mDisplayAdapter->setPreviewWindow(window);
        if ((NO_ERROR == ret) && previewEnabled()) {
            restartPreview();
        } else if (ret == ALREADY_EXISTS) {
            // ALREADY_EXISTS should be treated as a noop in this case
            ret = NO_ERROR;
        }
    }
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
```
```cpp
status_t CameraHal::startPreview() {
    LOG_FUNCTION_NAME;
    // When tunneling is enabled during VTC, startPreview happens in 2 steps:
    // When the application sends the command CAMERA_CMD_PREVIEW_INITIALIZATION,
    // cameraPreviewInitialization() is called, which in turn causes the CameraAdapter
    // to move from loaded to idle state. And when the application calls startPreview,
    // the CameraAdapter moves from idle to executing state.
    //
    // If the application calls startPreview() without sending the command
    // CAMERA_CMD_PREVIEW_INITIALIZATION, then the function cameraPreviewInitialization()
    // AND startPreview() are executed. In other words, if the application calls
    // startPreview() without sending the command CAMERA_CMD_PREVIEW_INITIALIZATION,
    // then the CameraAdapter moves from loaded to idle to executing state in one shot.
    status_t ret = cameraPreviewInitialization(); // This call is crucial; it is analyzed in detail below
    // The flag mPreviewInitializationDone is set to true at the end of the function
    // cameraPreviewInitialization(). Therefore, if everything goes alright, then the
    // flag will be set. Sometimes, the function cameraPreviewInitialization() may
    // return prematurely if all the resources are not available for starting preview.
    // For example, if the preview window is not set, then it would return NO_ERROR.
    // Under such circumstances, one should return from startPreview as well and should
    // not continue execution. That is why we check the flag and not the return value.
    if (!mPreviewInitializationDone) return ret;
    // Once startPreview is called, there is no need to continue to remember whether
    // the function cameraPreviewInitialization() was called earlier or not. And so
    // the flag mPreviewInitializationDone is reset here. Plus, this preserves the
    // current behavior of startPreview under the circumstances where the application
    // calls startPreview twice or more.
    mPreviewInitializationDone = false;
    // Enable the display adapter if present; the actual overlay enable happens when
    // we post the buffer. This "overlay enable" note is the spot I had been looking
    // for; a later chapter will cover it in detail.
    if (mDisplayAdapter.get() != NULL) {
        CAMHAL_LOGDA("Enabling display");
        int width, height;
        mParameters.getPreviewSize(&width, &height);
#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
        ret = mDisplayAdapter->enableDisplay(width, height, &mStartPreview);
#else
        ret = mDisplayAdapter->enableDisplay(width, height, NULL);
#endif
        if (ret != NO_ERROR) {
            CAMHAL_LOGEA("Couldn't enable display");
            // FIXME: At this stage mStateSwitchLock is locked and unlock is supposed to be called
            // only from mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW)
            // below. But this will never happen because of goto error. Thus at next
            // startPreview() call CameraHAL will be deadlocked.
            // Need to revisit mStateSwitch lock, for now just abort the process.
            CAMHAL_ASSERT_X(false,
                    "At this stage mCameraAdapter->mStateSwitchLock is still locked, "
                    "deadlock is guaranteed");
            goto error;
        }
    }
    CAMHAL_LOGDA("Starting CameraAdapter preview mode");
    // Send START_PREVIEW command to adapter
    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW); // From here the call reaches BaseCameraAdapter
    if (ret != NO_ERROR) {
        CAMHAL_LOGEA("Couldn't start preview w/ CameraAdapter");
        goto error;
    }
    CAMHAL_LOGDA("Started preview");
    mPreviewEnabled = true;
    mPreviewStartInProgress = false;
    return ret;

error:
    CAMHAL_LOGEA("Performing cleanup after error");
    // Do all the cleanup
    freePreviewBufs();
    mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
    if (mDisplayAdapter.get() != NULL) {
        mDisplayAdapter->disableDisplay(false);
    }
    mAppCallbackNotifier->stop();
    mPreviewStartInProgress = false;
    mPreviewEnabled = false;
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
```
```cpp
case CameraAdapter::CAMERA_START_PREVIEW:
{
    CAMHAL_LOGDA("Start Preview");
    if (ret == NO_ERROR) {
        ret = setState(operation);
    }
    if (ret == NO_ERROR) {
        ret = startPreview();
    }
    if (ret == NO_ERROR) {
        ret = commitState();
    } else {
        ret |= rollbackState();
    }
    break;
}
```
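The setState/commitState/rollbackState pattern above is a small transactional state machine: sendCommand first moves to a tentative next state, runs the operation, then either commits the transition or rolls it back on failure. The sketch below is my own toy rendering of that idea; the names are illustrative, not the real TI CameraAdapter API.

```cpp
// Toy transactional state machine in the style of BaseCameraAdapter's
// sendCommand handling: tentative transition, operation, commit or rollback.
enum class State { LOADED, PREVIEW };

class ToyAdapter {
    State mState = State::LOADED;
    State mNextState = State::LOADED;
public:
    bool startPreviewShouldFail = false;  // simulate an operation failure

    int sendCommand_startPreview() {
        mNextState = State::PREVIEW;      // setState(operation): tentative move
        if (startPreviewShouldFail) {     // startPreview() returned an error
            mNextState = mState;          // rollbackState(): forget the move
            return -1;
        }
        mState = mNextState;              // commitState(): make the move real
        return 0;
    }
    State state() const { return mState; }
};
```

The point of the split is that a failed operation leaves the adapter exactly where it was, so a retry sees a consistent state.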
```cpp
status_t V4LCameraAdapter::startPreview()
{
    status_t ret = NO_ERROR;
    LOG_FUNCTION_NAME;
    Mutex::Autolock lock(mPreviewBufsLock);
    if (mPreviewing) {
        ret = BAD_VALUE;
        goto EXIT;
    }
    for (int i = 0; i < mPreviewBufferCountQueueable; i++) {
        mVideoInfo->buf.index = i;
        mVideoInfo->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        mVideoInfo->buf.memory = V4L2_MEMORY_MMAP;
        ret = v4lIoctl(mCameraHandle, VIDIOC_QBUF, &mVideoInfo->buf); // queue each buffer to the driver
        if (ret < 0) {
            CAMHAL_LOGEA("VIDIOC_QBUF Failed");
            goto EXIT;
        }
        nQueued++;
    }
    ret = v4lStartStreaming();
    // Create and start preview thread for receiving buffers from V4L Camera
    if (!mCapturing) {
        mPreviewThread = new PreviewThread(this); // start the preview thread
        CAMHAL_LOGDA("Created preview thread");
    }
    // Update the flag to indicate we are previewing
    mPreviewing = true;
    mCapturing = false;

EXIT:
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
```
```cpp
status_t V4LCameraAdapter::v4lStartStreaming() {
    status_t ret = NO_ERROR;
    enum v4l2_buf_type bufType;
    if (!mVideoInfo->isStreaming) {
        bufType = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ret = v4lIoctl(mCameraHandle, VIDIOC_STREAMON, &bufType); // start streaming; preview frames begin to flow
        if (ret < 0) {
            CAMHAL_LOGEB("StartStreaming: Unable to start capture: %s", strerror(errno));
            return ret;
        }
        mVideoInfo->isStreaming = true;
    }
    return ret;
}
```
```cpp
int V4LCameraAdapter::previewThread()
{
    status_t ret = NO_ERROR;
    int width, height;
    CameraFrame frame;
    void* y_uv[2];
    int index = 0;
    int stride = 4096;
    char* fp = NULL;
    mParams.getPreviewSize(&width, &height);
    if (mPreviewing) {
        fp = this->GetFrame(index);
        if (!fp) {
            ret = BAD_VALUE;
            goto EXIT;
        }
        CameraBuffer* buffer = mPreviewBufs.keyAt(index);
        CameraFrame* lframe = (CameraFrame*) mFrameQueue.valueFor(buffer);
        if (!lframe) {
            ret = BAD_VALUE;
            goto EXIT;
        }
        debugShowFPS();
        if (mFrameSubscribers.size() == 0) {
            ret = BAD_VALUE;
            goto EXIT;
        }
        // From here on, as I understand it, the frame data is converted and, if needed, saved
        y_uv[0] = (void*) lframe->mYuv[0];
        // y_uv[1] = (void*) lframe->mYuv[1];
        // y_uv[1] = (void*) (lframe->mYuv[0] + height * stride);
        convertYUV422ToNV12Tiler((unsigned char*) fp, (unsigned char*) y_uv[0], width, height);
        CAMHAL_LOGVB("##...index= %d.;camera buffer= 0x%x; y= 0x%x; UV= 0x%x.", index, buffer, y_uv[0], y_uv[1]);
#ifdef SAVE_RAW_FRAMES
        unsigned char* nv12_buff = (unsigned char*) malloc(width * height * 3 / 2);
        // Convert yuv422i to yuv420sp (NV12) & dump the frame to a file
        convertYUV422ToNV12((unsigned char*) fp, nv12_buff, width, height);
        saveFile(nv12_buff, ((width * height) * 3 / 2));
        free(nv12_buff);
#endif
        frame.mFrameType = CameraFrame::PREVIEW_FRAME_SYNC;
        frame.mBuffer = buffer;
        frame.mLength = width * height * 3 / 2;
        frame.mAlignment = stride;
        frame.mOffset = 0;
        frame.mTimestamp = systemTime(SYSTEM_TIME_MONOTONIC);
        frame.mFrameMask = (unsigned int) CameraFrame::PREVIEW_FRAME_SYNC;
        if (mRecording) {
            frame.mFrameMask |= (unsigned int) CameraFrame::VIDEO_FRAME_SYNC;
            mFramesWithEncoder++;
        }
        ret = setInitFrameRefCount(frame.mBuffer, frame.mFrameMask);
        if (ret != NO_ERROR) {
            CAMHAL_LOGDB("Error in setInitFrameRefCount %d", ret);
        } else {
            ret = sendFrameToSubscribers(&frame);
        }
    }
EXIT:
    return ret;
}
```
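The conversion step in the thread above turns the driver's packed YUV422 into NV12. Below is a minimal sketch of what a convertYUV422ToNV12-style helper does, assuming the V4L2 source is packed YUYV (Y0 U0 Y1 V0 ...) and the target is NV12 (a full-resolution Y plane followed by an interleaved, 2x2-subsampled UV plane). This is my own illustration, not the TI implementation.

```cpp
#include <cstdint>

// Convert packed YUYV (YUV 4:2:2) to NV12 (YUV 4:2:0 semi-planar).
// dst must hold width*height bytes of Y plus width*height/2 bytes of UV.
void yuyv_to_nv12(const uint8_t* src, uint8_t* dst, int width, int height) {
    uint8_t* yPlane = dst;
    uint8_t* uvPlane = dst + width * height;
    for (int row = 0; row < height; ++row) {
        const uint8_t* line = src + row * width * 2; // YUYV = 2 bytes per pixel
        for (int col = 0; col < width; ++col)
            yPlane[row * width + col] = line[col * 2]; // keep every Y sample
        if ((row & 1) == 0) { // take chroma from even rows only (vertical subsample)
            uint8_t* uv = uvPlane + (row / 2) * width;
            for (int pair = 0; pair < width / 2; ++pair) {
                uv[pair * 2]     = line[pair * 4 + 1]; // U
                uv[pair * 2 + 1] = line[pair * 4 + 3]; // V
            }
        }
    }
}
```

This also explains the frame.mLength = width * height * 3 / 2 above: NV12 needs 1 byte of Y per pixel plus half a byte of chroma.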
What I did not quite understand at first is how the video data from the driver gets associated with mPreviewBufs and the index, such that buffer = mPreviewBufs.keyAt(index) yields the CameraBuffer. This is explored in detail shortly.
Continuing on: once the video data is obtained, it is converted and, if needed, saved to a file for later use.
Finally, the resulting CameraBuffer is used to fill a CameraFrame. This structure is crucial: as I understand it, the data ultimately flows back up through sendFrameToSubscribers(&frame).
So first, let's trace how the driver's video data becomes associated with mPreviewBufs and the index.
That brings us to a very important method already mentioned above; let's look at it first:
It is the first step of startPreview: cameraPreviewInitialization.
```cpp
status_t CameraHal::cameraPreviewInitialization()
{
    status_t ret = NO_ERROR;
    CameraAdapter::BuffersDescriptor desc;
    CameraFrame frame;
    unsigned int required_buffer_count;
    unsigned int max_queueble_buffers;
#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
    gettimeofday(&mStartPreview, NULL);
#endif
    LOG_FUNCTION_NAME;
    if (mPreviewInitializationDone) {
        return NO_ERROR;
    }
    if (mPreviewEnabled) {
        CAMHAL_LOGDA("Preview already running");
        LOG_FUNCTION_NAME_EXIT;
        return ALREADY_EXISTS;
    }
    if (NULL != mCameraAdapter) {
        ret = mCameraAdapter->setParameters(mParameters); // push the parameters down to the CameraAdapter
    }
    if ((mPreviewStartInProgress == false) && (mDisplayPaused == false)) {
        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW, (int) &frame); // query the preview resolution via this command
        if (NO_ERROR != ret) {
            CAMHAL_LOGEB("Error: CAMERA_QUERY_RESOLUTION_PREVIEW %d", ret);
            return ret;
        }
        /// Update the current preview width and height
        mPreviewWidth = frame.mWidth;   // initialize the width and height
        mPreviewHeight = frame.mHeight;
    }
    /// If we don't have the preview callback enabled and display adapter,
    if (!mSetPreviewWindowCalled || (mDisplayAdapter.get() == NULL)) {
        CAMHAL_LOGD("Preview not started. Preview in progress flag set");
        mPreviewStartInProgress = true;
        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);
        if (NO_ERROR != ret) {
            CAMHAL_LOGEB("Error: CAMERA_SWITCH_TO_EXECUTING %d", ret);
            return ret;
        }
        return NO_ERROR;
    }
    if ((mDisplayAdapter.get() != NULL) && (!mPreviewEnabled) && (mDisplayPaused)) {
        CAMHAL_LOGDA("Preview is in paused state");
        mDisplayPaused = false;
        mPreviewEnabled = true;
        if (NO_ERROR == ret) {
            ret = mDisplayAdapter->pauseDisplay(mDisplayPaused);
            if (NO_ERROR != ret) {
                CAMHAL_LOGEB("Display adapter resume failed %x", ret);
            }
        }
        // restart preview callbacks
        if (mMsgEnabled & CAMERA_MSG_PREVIEW_FRAME) {
            mAppCallbackNotifier->enableMsgType(CAMERA_MSG_PREVIEW_FRAME);
        }
        signalEndImageCapture();
        return ret;
    }
    required_buffer_count = atoi(mCameraProperties->get(CameraProperties::REQUIRED_PREVIEW_BUFS));
    /// Allocate the preview buffers
    ret = allocPreviewBufs(mPreviewWidth, mPreviewHeight, mParameters.getPreviewFormat(), required_buffer_count, max_queueble_buffers);
    if (NO_ERROR != ret) {
        CAMHAL_LOGEA("Couldn't allocate buffers for Preview");
        goto error;
    }
    if (mMeasurementEnabled) {
        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA,
                (int) &frame,
                required_buffer_count);
        if (NO_ERROR != ret) {
            return ret;
        }
        /// Allocate the preview data buffers
        ret = allocPreviewDataBufs(frame.mLength, required_buffer_count);
        if (NO_ERROR != ret) {
            CAMHAL_LOGEA("Couldn't allocate preview data buffers");
            goto error;
        }
        if (NO_ERROR == ret) {
            desc.mBuffers = mPreviewDataBuffers;
            desc.mOffsets = mPreviewDataOffsets;
            desc.mFd = mPreviewDataFd;
            desc.mLength = mPreviewDataLength;
            desc.mCount = (size_t) required_buffer_count;
            desc.mMaxQueueable = (size_t) required_buffer_count;
            mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA,
                    (int) &desc);
        }
    }
    /// Pass the buffers to Camera Adapter
    desc.mBuffers = mPreviewBuffers;
    desc.mOffsets = mPreviewOffsets;
    desc.mFd = mPreviewFd;
    desc.mLength = mPreviewLength;
    desc.mCount = (size_t) required_buffer_count;
    desc.mMaxQueueable = (size_t) max_queueble_buffers;
    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW, (int) &desc);
    if (NO_ERROR != ret) {
        CAMHAL_LOGEB("Failed to register preview buffers: 0x%x", ret);
        freePreviewBufs();
        return ret;
    }
    mAppCallbackNotifier->startPreviewCallbacks(mParameters, mPreviewBuffers, mPreviewOffsets, mPreviewFd, mPreviewLength, required_buffer_count);
    /// Start the callback notifier
    ret = mAppCallbackNotifier->start();
    if (ALREADY_EXISTS == ret) {
        // Already running, do nothing
        CAMHAL_LOGDA("AppCallbackNotifier already running");
        ret = NO_ERROR;
    } else if (NO_ERROR == ret) {
        CAMHAL_LOGDA("Started AppCallbackNotifier..");
        mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);
    } else {
        CAMHAL_LOGDA("Couldn't start AppCallbackNotifier");
        goto error;
    }
    if (ret == NO_ERROR) mPreviewInitializationDone = true;
    return ret;

error:
    CAMHAL_LOGEA("Performing cleanup after error");
    // Do all the cleanup
    freePreviewBufs();
    mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
    if (mDisplayAdapter.get() != NULL) {
        mDisplayAdapter->disableDisplay(false);
    }
    mAppCallbackNotifier->stop();
    mPreviewStartInProgress = false;
    mPreviewEnabled = false;
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
```
The corresponding handling in sendCommand looks like this:
```cpp
case CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW:
    CAMHAL_LOGDA("Use buffers for preview");
    desc = (BuffersDescriptor*) value1;
    if (NULL == desc) {
        CAMHAL_LOGEA("Invalid preview buffers!");
        return -EINVAL;
    }
    if (ret == NO_ERROR) {
        ret = setState(operation);
    }
    if (ret == NO_ERROR) {
        Mutex::Autolock lock(mPreviewBufferLock);
        mPreviewBuffers = desc->mBuffers;
        mPreviewBuffersLength = desc->mLength;
        mPreviewBuffersAvailable.clear();
        mSnapshotBuffersAvailable.clear();
        for (uint32_t i = 0; i < desc->mMaxQueueable; i++) {
            mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 0); // this ties mPreviewBuffersAvailable to mPreviewBuffers
        }
        // initial ref count for undequeued buffers is 1 since buffer provider
        // is still holding on to it
        for (uint32_t i = desc->mMaxQueueable; i < desc->mCount; i++) {
            mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 1);
        }
    }
    if (NULL != desc) {
        ret = useBuffers(CameraAdapter::CAMERA_PREVIEW,
                desc->mBuffers,
                desc->mCount,
                desc->mLength,
                desc->mMaxQueueable);
    }
    if (ret == NO_ERROR) {
        ret = commitState();
    } else {
        ret |= rollbackState();
    }
    break;
```
```cpp
status_t V4LCameraAdapter::UseBuffersPreview(CameraBuffer* bufArr, int num)
{
    int ret = NO_ERROR;
    LOG_FUNCTION_NAME;
    if (NULL == bufArr) {
        ret = BAD_VALUE;
        goto EXIT;
    }
    ret = v4lInitMmap(num);
    if (ret == NO_ERROR) {
        for (int i = 0; i < num; i++) {
            // Associate each Camera internal buffer with the one from Overlay
            mPreviewBufs.add(&bufArr[i], i); // this ties mPreviewBufs to desc->mBuffers
            CAMHAL_LOGDB("Preview- buff [%d] = 0x%x ", i, mPreviewBufs.keyAt(i));
        }
        // Update the preview buffer count
        mPreviewBufferCount = num;
    }
EXIT:
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
```
And where does this initialization happen? In initialize() in the camerahal file:
- /**
- @brief Initialize the Camera HAL
- Creates CameraAdapter,AppCallbackNotifier,DisplayAdapterandMemoryManager
- @param None
- @return NO_ERROR-Onsuccess
- NO_MEMORY-Onfailuretoallocate memoryforany of the objects
- @remarks Camera Hal internalfunction
- */
- status_t CameraHal::initialize(CameraProperties::Properties*properties)
- {
- LOG_FUNCTION_NAME;
- intsensor_index=0;
- constchar*sensor_name=NULL;
- ///Initialize the event mask usedforregistering an event providerforAppCallbackNotifier
- ///Currently,registering all events astobe coming from CameraAdapter
- int32_t eventMask=CameraHalEvent::ALL_EVENTS;
- //Getmy camera properties
- mCameraProperties=properties;
- if(!mCameraProperties)
- {
- goto fail_loop;
- }
- //Dump the properties of this Camera
- //will only printifDEBUG macroisdefined
- mCameraProperties->dump();
- if(strcmp(CameraProperties::DEFAULT_VALUE,mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX))!=0)
- {
- sensor_index=atoi(mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX));
- }
- if(strcmp(CameraProperties::DEFAULT_VALUE,mCameraProperties->get(CameraProperties::CAMERA_NAME))!=0){
- sensor_name=mCameraProperties->get(CameraProperties::CAMERA_NAME);
- }
- CAMHAL_LOGDB("Sensor index= %d; Sensor name= %s",sensor_index,sensor_name);
- if(strcmp(sensor_name,V4L_CAMERA_NAME_USB)==0){
- #ifdef V4L_CAMERA_ADAPTER
- mCameraAdapter=V4LCameraAdapter_Factory(sensor_index);
- #endif
- }
- else{
- #ifdef OMX_CAMERA_ADAPTER
- mCameraAdapter=OMXCameraAdapter_Factory(sensor_index);
- #endif
- }
- if((NULL==mCameraAdapter)||(mCameraAdapter->initialize(properties)!=NO_ERROR))
- {
- CAMHAL_LOGEA("Unable to create or initialize CameraAdapter");
- mCameraAdapter=NULL;
- goto fail_loop;
- }
- mCameraAdapter->incStrong(mCameraAdapter);
- mCameraAdapter->registerImageReleaseCallback(releaseImageBuffers,(void*)this);
- mCameraAdapter->registerEndCaptureCallback(endImageCapture,(void*)this);
- if(!mAppCallbackNotifier.get())
- {
- ///Create the callback notifier
- mAppCallbackNotifier=new AppCallbackNotifier();
- if((NULL==mAppCallbackNotifier.get())||(mAppCallbackNotifier->initialize()!=NO_ERROR))
- {
- CAMHAL_LOGEA("Unable to create or initialize AppCallbackNotifier");
- goto fail_loop;
- }
- }
- if(!mMemoryManager.get())
- {
- ///Create Memory Manager
- mMemoryManager=new MemoryManager();
- if((NULL==mMemoryManager.get())||(mMemoryManager->initialize()!=NO_ERROR))
- {
- CAMHAL_LOGEA("Unable to create or initialize MemoryManager");
- goto fail_loop;
- }
- }
- /// Setup the class dependencies...
- /// AppCallbackNotifier has to know where to get the Camera frames and the events like auto focus lock etc from.
- /// CameraAdapter is the one which provides those events
- /// Set it as the frame and event providers for AppCallbackNotifier
- ///@remarks setEventProvider API takes in a bit mask of events for registering a provider for the different events
- /// That way, if events can come from DisplayAdapter in future, we will be able to add it as provider
- /// for any event
- mAppCallbackNotifier->setEventProvider(eventMask, mCameraAdapter);
- mAppCallbackNotifier->setFrameProvider(mCameraAdapter);
- /// Any dynamic errors that happen during the camera use case have to be propagated back to the application
- /// via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
- /// Set it as the error handler for CameraAdapter
- mCameraAdapter->setErrorHandler(mAppCallbackNotifier.get());
- ///Start the callback notifier
- if(mAppCallbackNotifier->start()!=NO_ERROR)
- {
- CAMHAL_LOGEA("Couldn't start AppCallbackNotifier");
- goto fail_loop;
- }
- CAMHAL_LOGDA("Started AppCallbackNotifier..");
- mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);
- ///Initialize default parameters
- initDefaultParameters();
- if(setParameters(mParameters)!=NO_ERROR)
- {
- CAMHAL_LOGEA("Failed to set default parameters?!");
- }
- // register for sensor events
- mSensorListener=new SensorListener();
- if(mSensorListener.get()){
- if(mSensorListener->initialize()==NO_ERROR){
- mSensorListener->setCallbacks(orientation_cb,this);
- mSensorListener->enableSensor(SensorListener::SENSOR_ORIENTATION);
- }else{
- CAMHAL_LOGEA("Error initializing SensorListener. not fatal, continuing");
- mSensorListener.clear();
- mSensorListener=NULL;
- }
- }
- LOG_FUNCTION_NAME_EXIT;
- return NO_ERROR;
- fail_loop:
- ///Free up the resources because we failed somewhere up
- deinitialize();
- LOG_FUNCTION_NAME_EXIT;
- return NO_MEMORY;
- }
Let's look at what the setFrameProvider method actually does:
- void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
- {
- LOG_FUNCTION_NAME;
- ///@remarks There is no NULL check here. We will check
- /// for NULL when we get the start command from CameraAdapter
- mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
- if (NULL == mFrameProvider)
- {
- CAMHAL_LOGEA("Error in creating FrameProvider");
- }
- else
- {
- // Register only for captured images and RAW for now
- // TODO: Register for and handle all types of frames
- mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
- mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
- }
- LOG_FUNCTION_NAME_EXIT;
- }
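enableFrameNotification is how a consumer subscribes to particular frame types: the provider keeps one callback list per type and fans frames out only to the lists that asked for them. A rough sketch of the pattern (MiniFrameProvider and every name in it are illustrative stand-ins, not the real TI classes):

```cpp
#include <map>
#include <utility>
#include <vector>

// Illustrative stand-in for FrameProvider: subscribers enable
// notification per frame type; a delivered frame reaches only the
// callbacks registered for that type.
enum FrameType { IMAGE_FRAME = 0x1, RAW_FRAME = 0x2, PREVIEW_FRAME = 0x4 };

using FrameCallback = void (*)(int frameType, void *cookie);

class MiniFrameProvider {
public:
    // Comparable to mFrameProvider->enableFrameNotification(...)
    void enableFrameNotification(FrameType t, FrameCallback cb, void *cookie) {
        mSubscribers[t].push_back(std::make_pair(cb, cookie));
    }
    // Called when a frame of type t arrives from the adapter side.
    void deliver(FrameType t) {
        for (auto &s : mSubscribers[t]) s.first(t, s.second);
    }
private:
    std::map<FrameType, std::vector<std::pair<FrameCallback, void *>>> mSubscribers;
};
```

A subscriber that enabled only IMAGE_FRAME never sees PREVIEW_FRAME deliveries, which is why the notifier above registers separately for IMAGE_FRAME and RAW_FRAME.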
On the adapter side, a ready frame is then delivered to every registered subscriber through BaseCameraAdapter::__sendFrameToSubscribers:
- status_t BaseCameraAdapter::__sendFrameToSubscribers(CameraFrame *frame,
- KeyedVector<int, frame_callback> *subscribers,
- CameraFrame::FrameType frameType)
- {
- size_t refCount = 0;
- status_t ret = NO_ERROR;
- frame_callback callback = NULL;
- frame->mFrameType = frameType;
- if ((frameType == CameraFrame::PREVIEW_FRAME_SYNC) ||
- (frameType == CameraFrame::VIDEO_FRAME_SYNC) ||
- (frameType == CameraFrame::SNAPSHOT_FRAME)) {
- if (mFrameQueue.size() > 0) {
- CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(frame->mBuffer);
- frame->mYuv[0] = lframe->mYuv[0];
- frame->mYuv[1] = frame->mYuv[0] + (frame->mLength + frame->mOffset) * 2 / 3;
- }
- else {
- CAMHAL_LOGDA("Empty Frame Queue");
- return -EINVAL;
- }
- }
- if (NULL != subscribers) {
- refCount = getFrameRefCount(frame->mBuffer, frameType);
- if (refCount == 0) {
- CAMHAL_LOGDA("Invalid ref count of 0");
- return -EINVAL;
- }
- if (refCount > subscribers->size()) {
- CAMHAL_LOGEB("Invalid ref count for frame type: 0x%x", frameType);
- return -EINVAL;
- }
- CAMHAL_LOGVB("Type of Frame: 0x%x address: 0x%x refCount start %d",
- frame->mFrameType,
- (uint32_t)frame->mBuffer,
- refCount);
- for (unsigned int i = 0; i < refCount; i++) {
- frame->mCookie = (void *)subscribers->keyAt(i);
- callback = (frame_callback)subscribers->valueAt(i);
- if (!callback) {
- CAMHAL_LOGEB("callback not set for frame type: 0x%x", frameType);
- return -EINVAL;
- }
- callback(frame);
- }
- } else {
- CAMHAL_LOGEA("Subscribers is null??");
- return -EINVAL;
- }
- return ret;
- }
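The dispatch loop above leans on CameraFrame::mCookie: each subscriber registers a static function plus an object pointer (the cookie), the dispatcher stores that pointer in the frame just before each call, and the static relay casts it back to the object — frameCallbackRelay below is exactly this relay. A minimal sketch of the trick, with illustrative names:

```cpp
#include <map>

// Plays the role of CameraFrame: carries a cookie slot the dispatcher
// fills in before every callback invocation.
struct Frame {
    int id;
    void *cookie = nullptr;
};

using frame_callback = void (*)(Frame *);

struct Notifier {
    int received = 0;
    // static, so it can sit in a plain function-pointer table
    static void relay(Frame *f) {
        static_cast<Notifier *>(f->cookie)->onFrame(f);
    }
    void onFrame(Frame *) { ++received; }
};

// One callback invocation per registered subscriber, cookie set first,
// mirroring the for-loop in __sendFrameToSubscribers.
void sendFrameToSubscribers(Frame *frame,
                            std::map<void *, frame_callback> &subscribers) {
    for (auto &kv : subscribers) {
        frame->cookie = kv.first;   // tell the relay which object to target
        kv.second(frame);
    }
}
```

This is the standard C-style way to route a flat function-pointer callback back into a C++ member function.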
The callback fetched from the subscriber list here is the frameCallbackRelay function that was registered above in setFrameProvider. Let's look at its implementation:
- void AppCallbackNotifier::frameCallbackRelay(CameraFrame *caFrame)
- {
- LOG_FUNCTION_NAME;
- AppCallbackNotifier *appcbn = (AppCallbackNotifier *)(caFrame->mCookie);
- appcbn->frameCallback(caFrame);
- LOG_FUNCTION_NAME_EXIT;
- }
- void AppCallbackNotifier::frameCallback(CameraFrame *caFrame)
- {
- /// Post the event to the event queue of AppCallbackNotifier
- TIUTILS::Message msg;
- CameraFrame *frame;
- LOG_FUNCTION_NAME;
- if (NULL != caFrame)
- {
- frame = new CameraFrame(*caFrame);
- if (NULL != frame)
- {
- msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME;
- msg.arg1 = frame;
- mFrameQ.put(&msg);
- }
- else
- {
- CAMHAL_LOGEA("Not enough resources to allocate CameraFrame");
- }
- }
- LOG_FUNCTION_NAME_EXIT;
- }
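frameCallback shows the producer side of the notifier's queue: the call arrives on the adapter's thread, so the frame is deep-copied and only the copy is posted; the notification thread frees it after processing (the delete at the end of notifyFrame). A minimal sketch with std::queue standing in for TIUTILS::MessageQueue and all names illustrative:

```cpp
#include <memory>
#include <queue>

struct Frame { int id; };

struct Message {
    int command;                   // e.g. NOTIFIER_CMD_PROCESS_FRAME
    std::unique_ptr<Frame> frame;  // owning copy, released by the consumer
};

constexpr int CMD_PROCESS_FRAME = 1;

// Runs on the producer (adapter) thread: copy, then post. The original
// frame can be reused by the caller as soon as this returns.
void postFrame(std::queue<Message> &q, const Frame *caFrame) {
    if (caFrame == nullptr) return;                 // mirror the NULL guard
    Message msg;
    msg.command = CMD_PROCESS_FRAME;
    msg.frame = std::make_unique<Frame>(*caFrame);  // deep copy, not the original
    q.push(std::move(msg));
}
```

The copy is the important part: the adapter recycles its frame object after the callback returns, so queuing the original pointer would hand the consumer a frame that may already have been overwritten.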
Note that when AppCallbackNotifier is created, its initialize() method is called to perform the initial setup:
- /**
- * NotificationHandler class
- */
- /// Initialization function for AppCallbackNotifier
- status_t AppCallbackNotifier::initialize()
- {
- LOG_FUNCTION_NAME;
- mPreviewMemory = 0;
- mMeasurementEnabled = false;
- mNotifierState = NOTIFIER_STOPPED;
- /// Create the app notifier thread
- mNotificationThread = new NotificationThread(this);
- if (!mNotificationThread.get())
- {
- CAMHAL_LOGEA("Couldn't create Notification thread");
- return NO_MEMORY;
- }
- /// Start the display thread
- status_t ret = mNotificationThread->run("NotificationThread", PRIORITY_URGENT_DISPLAY);
- if (ret != NO_ERROR)
- {
- CAMHAL_LOGEA("Couldn't run NotificationThread");
- mNotificationThread.clear();
- return ret;
- }
- mUseMetaDataBufferMode = true;
- mRawAvailable = false;
- mRecording = false;
- mPreviewing = false;
- LOG_FUNCTION_NAME_EXIT;
- return ret;
- }
- bool AppCallbackNotifier::notificationThread()
- {
- bool shouldLive = true;
- status_t ret;
- LOG_FUNCTION_NAME;
- //CAMHAL_LOGDA("Notification Thread waiting for message");
- ret = TIUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
- &mEventQ,
- &mFrameQ,
- AppCallbackNotifier::NOTIFIER_TIMEOUT);
- //CAMHAL_LOGDA("Notification Thread received message");
- if (mNotificationThread->msgQ().hasMsg()) {
- /// Received a message from CameraHal, process it
- CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
- shouldLive = processMessage();
- if (!shouldLive) {
- CAMHAL_LOGDA("Notification Thread exiting.");
- return shouldLive;
- }
- }
- if (mEventQ.hasMsg()) {
- /// Received an event from one of the event providers
- CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
- notifyEvent();
- }
- if (mFrameQ.hasMsg()) {
- /// Received a frame from one of the frame providers
- //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
- notifyFrame();
- }
- LOG_FUNCTION_NAME_EXIT;
- return shouldLive;
- }
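notificationThread's structure reduces to: wake when any of three queues has a message, service the command queue first because it may carry an exit request (in which case events and frames are not touched this pass), then events, then frames. A sketch of one pass of such a loop, with the blocking waitForMsg elided and all names illustrative:

```cpp
#include <queue>
#include <string>
#include <vector>

// One pass of a notificationThread-style loop. Commands have priority:
// an exit command stops the thread before events or frames are serviced.
struct NotifierLoop {
    std::queue<std::string> cmdQ, eventQ, frameQ;
    std::vector<std::string> handled;   // records what this pass serviced

    // Returns shouldLive: false means the thread must stop.
    bool runOnce() {
        if (!cmdQ.empty()) {
            std::string cmd = cmdQ.front(); cmdQ.pop();
            if (cmd == "exit") return false;      // early return, like processMessage()
            handled.push_back("cmd:" + cmd);
        }
        if (!eventQ.empty()) { handled.push_back("event:" + eventQ.front()); eventQ.pop(); }
        if (!frameQ.empty()) { handled.push_back("frame:" + frameQ.front()); frameQ.pop(); }
        return true;
    }
};
```

The ordering matters: because the exit check returns immediately, pending events and frames are deliberately left unprocessed once shutdown has been requested.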
- void AppCallbackNotifier::notifyFrame()
- {
- /// Receive and send the frame notifications to app
- TIUTILS::Message msg;
- CameraFrame*frame;
- MemoryHeapBase*heap;
- MemoryBase*buffer=NULL;
- sp<MemoryBase>memBase;
- void*buf=NULL;
- LOG_FUNCTION_NAME;
- {
- Mutex::Autolock lock(mLock);
- if(!mFrameQ.isEmpty()){
- mFrameQ.get(&msg);
- }else{
- return;
- }
- }
- bool ret=true;
- frame=NULL;
- switch(msg.command)
- {
- case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME:
- frame=(CameraFrame*)msg.arg1;
- if(!frame)
- {
- break;
- }
- if((CameraFrame::RAW_FRAME==frame->mFrameType)&&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb)&&
- (NULL!=mNotifyCb))
- {
- if(mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE))
- {
- #ifdef COPY_IMAGE_BUFFER
- copyAndSendPictureFrame(frame,CAMERA_MSG_RAW_IMAGE);
- #else
- // TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
- #endif
- }
- else{
- if(mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY)){
- mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY,0,0,mCallbackCookie);
- }
- mFrameProvider->returnFrame(frame->mBuffer,
- (CameraFrame::FrameType)frame->mFrameType);
- }
- mRawAvailable=true;
- }
- else if ((CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
- (NULL != mCameraHal) &&
- (NULL != mDataCb) &&
- (CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks))
- {
- int encode_quality = 100, tn_quality = 100;
- int tn_width, tn_height;
- unsigned int current_snapshot = 0;
- Encoder_libjpeg::params *main_jpeg = NULL, *tn_jpeg = NULL;
- void *exif_data = NULL;
- const char *previewFormat = NULL;
- camera_memory_t*raw_picture=mRequestMemory(-1,frame->mLength,1,NULL);
- if(raw_picture){
- buf=raw_picture->data;
- }
- CameraParameters parameters;
- char*params=mCameraHal->getParameters();
- const String8 strParams(params);
- parameters.unflatten(strParams);
- encode_quality=parameters.getInt(CameraParameters::KEY_JPEG_QUALITY);
- if(encode_quality<0||encode_quality>100){
- encode_quality=100;
- }
- tn_quality=parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_QUALITY);
- if(tn_quality<0||tn_quality>100){
- tn_quality=100;
- }
- if(CameraFrame::HAS_EXIF_DATA&frame->mQuirks){
- exif_data=frame->mCookie2;
- }
- main_jpeg=(Encoder_libjpeg::params*)
- malloc(sizeof(Encoder_libjpeg::params));
- // Video snapshot with LDCNSF on adds a few bytes start offset
- // and a few bytes on every line. They must be skipped.
- int rightCrop = frame->mAlignment / 2 - frame->mWidth;
- CAMHAL_LOGDB("Video snapshot right crop = %d",rightCrop);
- CAMHAL_LOGDB("Video snapshot offset = %d",frame->mOffset);
- if(main_jpeg){
- main_jpeg->src=(uint8_t*)frame->mBuffer->mapped;
- main_jpeg->src_size=frame->mLength;
- main_jpeg->dst=(uint8_t*)buf;
- main_jpeg->dst_size=frame->mLength;
- main_jpeg->quality=encode_quality;
- main_jpeg->in_width=frame->mAlignment/2;//use stride here
- main_jpeg->in_height=frame->mHeight;
- main_jpeg->out_width=frame->mAlignment/2;
- main_jpeg->out_height=frame->mHeight;
- main_jpeg->right_crop=rightCrop;
- main_jpeg->start_offset=frame->mOffset;
- if(CameraFrame::FORMAT_YUV422I_UYVY&frame->mQuirks){
- main_jpeg->format=TICameraParameters::PIXEL_FORMAT_YUV422I_UYVY;
- }
- else { // if (CameraFrame::FORMAT_YUV422I_YUYV & frame->mQuirks)
- main_jpeg->format=CameraParameters::PIXEL_FORMAT_YUV422I;
- }
- }
- tn_width=parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_WIDTH);
- tn_height=parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_HEIGHT);
- previewFormat=parameters.getPreviewFormat();
- if((tn_width>0)&&(tn_height>0)&&(NULL!=previewFormat)){
- tn_jpeg=(Encoder_libjpeg::params*)
- malloc(sizeof(Encoder_libjpeg::params));
- // if malloc fails just keep going and encode main jpeg
- if(!tn_jpeg){
- tn_jpeg=NULL;
- }
- }
- if(tn_jpeg){
- int width, height;
- parameters.getPreviewSize(&width,&height);
- current_snapshot=(mPreviewBufCount+MAX_BUFFERS-1)%MAX_BUFFERS;
- tn_jpeg->src=(uint8_t*)mPreviewBuffers[current_snapshot].mapped;
- tn_jpeg->src_size=mPreviewMemory->size/MAX_BUFFERS;
- tn_jpeg->dst_size=calculateBufferSize(tn_width,
- tn_height,
- previewFormat);
- tn_jpeg->dst=(uint8_t*)malloc(tn_jpeg->dst_size);
- tn_jpeg->quality=tn_quality;
- tn_jpeg->in_width=width;
- tn_jpeg->in_height=height;
- tn_jpeg->out_width=tn_width;
- tn_jpeg->out_height=tn_height;
- tn_jpeg->right_crop=0;
- tn_jpeg->start_offset=0;
- tn_jpeg->format = CameraParameters::PIXEL_FORMAT_YUV420SP;
- }
- sp<Encoder_libjpeg>encoder=new Encoder_libjpeg(main_jpeg,
- tn_jpeg,
- AppCallbackNotifierEncoderCallback,
- (CameraFrame::FrameType)frame->mFrameType,
- this,
- raw_picture,
- exif_data,frame->mBuffer);
- gEncoderQueue.add(frame->mBuffer->mapped,encoder);
- encoder->run();
- encoder.clear();
- if(params!=NULL)
- {
- mCameraHal->putParameters(params);
- }
- }
- else if ((CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb))
- {
- // CTS, MTS requirements: Every 'takePicture()' call
- // that registers a raw callback should receive one
- // as well. This is not always the case with
- // CameraAdapters though.
- if(!mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE)){
- dummyRaw();
- }else{
- mRawAvailable=false;
- }
- #ifdef COPY_IMAGE_BUFFER
- {
- Mutex::Autolock lock(mBurstLock);
- #if defined(OMAP_ENHANCEMENT)
- if(mBurst)
- {
- copyAndSendPictureFrame(frame,CAMERA_MSG_COMPRESSED_BURST_IMAGE);
- }
- else
- #endif
- {
- copyAndSendPictureFrame(frame,CAMERA_MSG_COMPRESSED_IMAGE);
- }
- }
- #else
- // TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
- #endif
- }
- else if ((CameraFrame::VIDEO_FRAME_SYNC == frame->mFrameType) &&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb)&&
- (mCameraHal->msgTypeEnabled(CAMERA_MSG_VIDEO_FRAME)))
- {
- AutoMutex locker(mRecordingLock);
- if(mRecording)
- {
- if(mUseMetaDataBufferMode)
- {
- camera_memory_t*videoMedatadaBufferMemory=
- mVideoMetadataBufferMemoryMap.valueFor(frame->mBuffer->opaque);
- video_metadata_t*videoMetadataBuffer=(video_metadata_t*)videoMedatadaBufferMemory->data;
- if((NULL==videoMedatadaBufferMemory)||(NULL==videoMetadataBuffer)||(NULL==frame->mBuffer))
- {
- CAMHAL_LOGEA("Error! One of the video buffers is NULL");
- break;
- }
- if(mUseVideoBuffers)
- {
- CameraBuffer*vBuf=mVideoMap.valueFor(frame->mBuffer->opaque);
- GraphicBufferMapper&mapper=GraphicBufferMapper::get();
- Rect bounds;
- bounds.left=0;
- bounds.top=0;
- bounds.right=mVideoWidth;
- bounds.bottom=mVideoHeight;
- void*y_uv[2];
- mapper.lock((buffer_handle_t)vBuf,CAMHAL_GRALLOC_USAGE,bounds,y_uv);
- y_uv[1]=y_uv[0]+mVideoHeight*4096;
- struct ConvImage input = { frame->mWidth,
- frame->mHeight,
- 4096,
- IC_FORMAT_YCbCr420_lp,
- (mmByte*)frame->mYuv[0],
- (mmByte*)frame->mYuv[1],
- frame->mOffset};
- struct ConvImage output = { mVideoWidth,
- mVideoHeight,
- 4096,
- IC_FORMAT_YCbCr420_lp,
- (mmByte*)y_uv[0],
- (mmByte*)y_uv[1],
- 0};
- VT_resizeFrame_Video_opt2_lp(&input,&output,NULL,0);
- mapper.unlock((buffer_handle_t)vBuf->opaque);
- videoMetadataBuffer->metadataBufferType=(int)kMetadataBufferTypeCameraSource;
- /*FIXME remove cast*/
- videoMetadataBuffer->handle=(void*)vBuf->opaque;
- videoMetadataBuffer->offset=0;
- }
- else
- {
- videoMetadataBuffer->metadataBufferType=(int)kMetadataBufferTypeCameraSource;
- videoMetadataBuffer->handle=camera_buffer_get_omx_ptr(frame->mBuffer);
- videoMetadataBuffer->offset=frame->mOffset;
- }
- CAMHAL_LOGVB("mDataCbTimestamp : frame->mBuffer=0x%x, videoMetadataBuffer=0x%x, videoMedatadaBufferMemory=0x%x",
- frame->mBuffer->opaque,videoMetadataBuffer,videoMedatadaBufferMemory);
- mDataCbTimestamp(frame->mTimestamp,CAMERA_MSG_VIDEO_FRAME,
- videoMedatadaBufferMemory,0,mCallbackCookie);
- }
- else
- {
- // TODO: Need to revisit this, should ideally be mapping the TILER buffer using mRequestMemory
- camera_memory_t*fakebuf=mRequestMemory(-1,sizeof(buffer_handle_t),1,NULL);
- if((NULL==fakebuf)||(NULL==fakebuf->data)||(NULL==frame->mBuffer))
- {
- CAMHAL_LOGEA("Error! One of the video buffers is NULL");
- break;
- }
- *reinterpret_cast<buffer_handle_t*>(fakebuf->data)=reinterpret_cast<buffer_handle_t>(frame->mBuffer->mapped);
- mDataCbTimestamp(frame->mTimestamp,CAMERA_MSG_VIDEO_FRAME,fakebuf,0,mCallbackCookie);
- fakebuf->release(fakebuf);
- }
- }
- }
- else if ((CameraFrame::SNAPSHOT_FRAME == frame->mFrameType) &&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb)&&
- (NULL!=mNotifyCb)){
- // When enabled, measurement data is sent instead of video data
- if(!mMeasurementEnabled){
- copyAndSendPreviewFrame(frame,CAMERA_MSG_POSTVIEW_FRAME);
- }else{
- mFrameProvider->returnFrame(frame->mBuffer,
- (CameraFrame::FrameType)frame->mFrameType);
- }
- }
- else if ((CameraFrame::PREVIEW_FRAME_SYNC == frame->mFrameType) &&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb)&&
- (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME))){
- // When enabled, measurement data is sent instead of video data
- if(!mMeasurementEnabled){
- copyAndSendPreviewFrame(frame,CAMERA_MSG_PREVIEW_FRAME);
- }else{
- mFrameProvider->returnFrame(frame->mBuffer,
- (CameraFrame::FrameType)frame->mFrameType);
- }
- }
- else if ((CameraFrame::FRAME_DATA_SYNC == frame->mFrameType) &&
- (NULL!=mCameraHal)&&
- (NULL!=mDataCb)&&
- (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME))){
- copyAndSendPreviewFrame(frame,CAMERA_MSG_PREVIEW_FRAME);
- }else{
- mFrameProvider->returnFrame(frame->mBuffer,
- (CameraFrame::FrameType)frame->mFrameType);
- CAMHAL_LOGDB("Frame type 0x%x is still unsupported!",frame->mFrameType);
- }
- break;
- default:
- break;
- };
- exit:
- if(NULL!=frame)
- {
- delete frame;
- }
- LOG_FUNCTION_NAME_EXIT;
- }
- void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame*frame,int32_t msgType)
- {
- camera_memory_t*picture=NULL;
- CameraBuffer*dest=NULL;
- // scope for lock
- {
- Mutex::Autolock lock(mLock)