/********************************************************************************************
 * author: [email protected] 大钟
 * E-mail: [email protected]
 * site: http://www.idealpwr.com/
 * Shenzhen Dongli Siwei Technology Development Co., Ltd.
 * http://blog.csdn.net/conowen
 * Note: this article is original and is intended for learning and exchange only. Please credit the author and source when reposting.
 ********************************************************************************************/


1. Overview

If you compile the official, unmodified FFmpeg with the NDK, you still have to implement the JNI layer and the Java layer yourself, which is a fair amount of work. So when porting FFmpeg to Android, it is easier to port an open-source project that already has the JNI and Java layers in place; after all, the software industry has always built on the shoulders of giants.


2. Porting havlenapetr/FFMpeg

havlenapetr's project is one of the better-known open-source FFmpeg ports, and many Android multimedia projects are built on top of it.

Download address: https://github.com/havlenapetr/FFMpeg

You can grab the ZIP package directly: https://github.com/havlenapetr/FFMpeg/zipball/debug

Or download it with git: create a new directory and run the following in a Linux terminal (you need to have the git tools installed first):

git clone https://github.com/havlenapetr/FFMpeg.git

3. Building the .so libraries with the NDK

After downloading, run the following from the top-level havlenapetr-FFMpeg-7c27aa2 directory (where $ndk is the path to your NDK installation):

$ndk/ndk-build

This builds cleanly without reporting any errors.

For details on building with the NDK, see my earlier post: http://blog.csdn.net/conowen/article/details/7518870


4. Importing the Java project and playing video

Next, import the havlenapetr-FFMpeg-7c27aa2 project into Eclipse; you can then play video with it.

4.1. Note that this version of the havlenapetr FFmpeg project only runs on Android 2.2, because havlenapetr outputs the audio and video directly from the JNI layer. You will notice a prebuilt directory under havlenapetr-FFMpeg-7c27aa2; it contains the Android 2.2 builds of the two libraries libjniaudio.so and libjnivideo.so.


4.2. Playback breaks on other Android versions:

Audio/video output in havlenapetr's FFmpeg project works as follows:

Audio: output through Android's native AudioTrack.

Video: FFmpeg decodes the stream to YUV, the YUV data is converted to RGB, and the RGB frame is finally rendered through Android's native Surface (a sketch of the conversion step follows below).
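
To make the video path concrete, here is a minimal sketch of the YUV-to-RGB conversion step using FFmpeg's libswscale. It assumes a reasonably recent FFmpeg API; the function name yuv_to_rgb565 and the RGB565 target format are my own illustrative choices, not code taken from the havlenapetr project.

// Hedged sketch: convert one decoded YUV frame to RGB565 with libswscale.
// Assumes dec_ctx is the opened AVCodecContext and yuv_frame a decoded AVFrame.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
}

static AVFrame* yuv_to_rgb565(AVCodecContext *dec_ctx, AVFrame *yuv_frame) {
    AVFrame *rgb = av_frame_alloc();
    rgb->format = AV_PIX_FMT_RGB565;
    rgb->width  = dec_ctx->width;
    rgb->height = dec_ctx->height;
    if (av_frame_get_buffer(rgb, 0) < 0) {           // allocate the RGB pixel buffer
        av_frame_free(&rgb);
        return NULL;
    }
    SwsContext *sws = sws_getContext(dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
                                     dec_ctx->width, dec_ctx->height, AV_PIX_FMT_RGB565,
                                     SWS_BILINEAR, NULL, NULL, NULL);
    sws_scale(sws, yuv_frame->data, yuv_frame->linesize,
              0, dec_ctx->height, rgb->data, rgb->linesize);
    sws_freeContext(sws);
    return rgb;                                      // caller frees with av_frame_free()
}

The RGB565 buffer produced this way is what then gets handed to the Surface side (or to SDL, as suggested below).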


Tip: you can instead port the SDL library to handle audio/video output; SDL renders video through OpenGL, which makes it compatible with every Android platform.


The problem is that the framework differs from one Android version to the next, so if you want to push audio and video into Android's native AudioTrack and Surface from the native layer, you have to rebuild libjniaudio.so and libjnivideo.so inside the source tree of the matching Android version.


5. Building the Android 2.3 versions of libjniaudio.so and libjnivideo.so for the havlenapetr FFmpeg project

First, be aware that building the official Android source tree does not produce libjniaudio.so or libjnivideo.so. You have to add audiotrack.cpp, surface.cpp and their Android.mk files to the Android source tree yourself and build them there. (Every time you rebuild libjniaudio.so and libjnivideo.so you have to rebuild against this Android source tree, which takes quite a while.)


5.1. Downloading the audio and video folders

You can download audiotrack.cpp, surface.cpp and the Android.mk files from https://github.com/havlenapetr/android_frameworks_base; make sure you pick the correct branch:

froyo---->Android 2.2

gingerbread---->Android 2.3

ICS---->Android 4.0


(Supplementary notes on havlenapetr-FFMpeg under Android 4.0 (ICS) are given further below.)



5.2. Building the Android source tree

After downloading, locate the native folder inside it and copy its audio and video folders into the frameworks/base/native directory of the Android source tree.

[Figure 1: listing of frameworks/base/native; the newly added audio and video folders are highlighted in green.]

One thing to note:

The gingerbread branch does not actually contain the audio and video folders, but the froyo versions of those folders can be used instead. (In other words, downloading gingerbread turns out not to be of much use.)

With froyo's audio and video folders, the Android source tree builds successfully and ndk-build also passes, but the Android Java project then fails at runtime with the following error:

java.lang.NoSuchFieldError: no field with name='mSurface' signature='I' in class Landroid/view/Surface;

When the library is loaded, the mSurface field cannot be found in android.view.Surface.

The fix: change mSurface to mNativeSurface in surface.cpp and rebuild. Alternatively, use the ICS version of surface.cpp, which does not have this problem. A version-tolerant variant of the lookup is sketched below.
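
If you want a single surface.cpp to work against both the froyo field name (mSurface) and the gingerbread one (mNativeSurface), the lookup in getNativeSurface() can simply try both. The variant below is a hedged sketch of that idea, not code from the havlenapetr repository:

// Hypothetical variant of getNativeSurface(): look up "mNativeSurface" first
// (gingerbread and later) and fall back to "mSurface" (froyo).
// Same includes and Surface type as in surface.cpp further below.
static Surface* getNativeSurface(JNIEnv* env, jobject jsurface) {
    jclass clazz = env->FindClass("android/view/Surface");
    jfieldID field = env->GetFieldID(clazz, "mNativeSurface", "I");
    if (field == NULL) {
        env->ExceptionClear();                        // clear the pending NoSuchFieldError
        field = env->GetFieldID(clazz, "mSurface", "I");
    }
    if (field == NULL) {
        return NULL;                                  // unknown framework layout
    }
    return (Surface *) env->GetIntField(jsurface, field);
}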


Building the Android 4.0 versions of libjniaudio.so and libjnivideo.so for the havlenapetr FFmpeg project follows essentially the same steps as above.


/************************************************************************/

Below are the audio and video files I used (taken from havlenapetr's project).

video/jni/surface.cpp (note the directory structure):


/*
 * Copyright (C) 2009 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include <android/surface.h>
#include <surfaceflinger/Surface.h>
#include <utils/Log.h>
#include <SkBitmap.h>
#include <SkCanvas.h>

#define TAG "SurfaceWrapper"

using namespace android;

static Surface*  sSurface;
static SkBitmap  sBitmapClient;
static SkBitmap  sBitmapSurface;

static Surface* getNativeSurface(JNIEnv* env, jobject jsurface) {
    jclass clazz = env->FindClass("android/view/Surface");
    jfieldID field_surface = env->GetFieldID(clazz, "mNativeSurface", "I");
    if(field_surface == NULL) {
        return NULL;
    }
    return (Surface *) env->GetIntField(jsurface, field_surface);
}

static int initBitmap(SkBitmap *bitmap, int format, int width, int height, bool allocPixels) {
    switch (format) {
        case PIXEL_FORMAT_RGBA_8888:
            bitmap->setConfig(SkBitmap::kARGB_8888_Config, width, height);
            break;
        case PIXEL_FORMAT_RGBA_4444:
            bitmap->setConfig(SkBitmap::kARGB_4444_Config, width, height);
            break;
        case PIXEL_FORMAT_RGB_565:
            bitmap->setConfig(SkBitmap::kRGB_565_Config, width, height);
            break;
        case PIXEL_FORMAT_A_8:
            bitmap->setConfig(SkBitmap::kA8_Config, width, height);
            break;
        default:
            bitmap->setConfig(SkBitmap::kNo_Config, width, height);
            break;
    }
    if(allocPixels) {
        bitmap->setIsOpaque(true);
        //-- alloc array of pixels
        if(!bitmap->allocPixels()) {
            return -1;
        }
    }
    return 0;
}

int AndroidSurface_register(JNIEnv* env, jobject jsurface) {
    __android_log_print(ANDROID_LOG_INFO, TAG, "registering video surface");
    sSurface = getNativeSurface(env, jsurface);
    if(sSurface == NULL) {
        return ANDROID_SURFACE_RESULT_JNI_EXCEPTION;
    }
    __android_log_print(ANDROID_LOG_INFO, TAG, "registered");
    return ANDROID_SURFACE_RESULT_SUCCESS;
}

int AndroidSurface_getPixels(int width, int height, void** pixels) {
    __android_log_print(ANDROID_LOG_INFO, TAG, "getting surface's pixels %ix%i", width, height);
    if(sSurface == NULL) {
        return ANDROID_SURFACE_RESULT_JNI_EXCEPTION;
    }
    if(initBitmap(&sBitmapClient, PIXEL_FORMAT_RGB_565, width, height, true) < 0) {
        return ANDROID_SURFACE_RESULT_COULDNT_INIT_BITMAP_CLIENT;
    }
    *pixels = sBitmapClient.getPixels();
    __android_log_print(ANDROID_LOG_INFO, TAG, "getted");
    return ANDROID_SURFACE_RESULT_SUCCESS;
}

static void doUpdateSurface() {
    SkCanvas canvas(sBitmapSurface);
    SkRect   surface_sBitmapClient;
    SkRect   surface_sBitmapSurface;
    SkMatrix matrix;

    surface_sBitmapSurface.set(0, 0, sBitmapSurface.width(), sBitmapSurface.height());
    surface_sBitmapClient.set(0, 0, sBitmapClient.width(), sBitmapClient.height());
    matrix.setRectToRect(surface_sBitmapClient, surface_sBitmapSurface, SkMatrix::kFill_ScaleToFit);

    canvas.drawBitmapMatrix(sBitmapClient, matrix);
}

static int prepareSurfaceBitmap(Surface::SurfaceInfo* info) {
    if(initBitmap(&sBitmapSurface, info->format, info->w, info->h, false) < 0) {
        return -1;
    }
    sBitmapSurface.setPixels(info->bits);
    return 0;
}

int AndroidSurface_updateSurface() {
    static Surface::SurfaceInfo surfaceInfo;
    if(sSurface == NULL) {
        return ANDROID_SURFACE_RESULT_JNI_EXCEPTION;
    }
    if (!Surface::isValid(sSurface)) {
        return ANDROID_SURFACE_RESULT_NOT_VALID;
    }
    if (sSurface->lock(&surfaceInfo) < 0) {
        return ANDROID_SURFACE_RESULT_COULDNT_LOCK;
    }
    if(prepareSurfaceBitmap(&surfaceInfo) < 0) {
        return ANDROID_SURFACE_RESULT_COULDNT_INIT_BITMAP_SURFACE;
    }
    doUpdateSurface();
    if (sSurface->unlockAndPost() < 0) {
        return ANDROID_SURFACE_RESULT_COULDNT_UNLOCK_AND_POST;
    }
    return ANDROID_SURFACE_RESULT_SUCCESS;
}

int AndroidSurface_unregister() {
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistering video surface");
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistered");
    return ANDROID_SURFACE_RESULT_SUCCESS;
}
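
For orientation, here is a hedged sketch of the order in which a JNI player might call the wrapper functions above; decode_next_frame_into() is a placeholder of my own for the FFmpeg decode plus RGB conversion step, not a function from the project:

// Hypothetical calling sequence for the surface.cpp wrappers above.
#include <android/surface.h>

extern bool decode_next_frame_into(void *pixels, int w, int h);   // placeholder supplied by the player

int renderLoop(JNIEnv *env, jobject jsurface, int width, int height) {
    void *pixels = NULL;
    if (AndroidSurface_register(env, jsurface) != ANDROID_SURFACE_RESULT_SUCCESS)
        return -1;
    // pixels now points at the RGB565 client bitmap owned by the wrapper
    if (AndroidSurface_getPixels(width, height, &pixels) != ANDROID_SURFACE_RESULT_SUCCESS)
        return -1;
    while (decode_next_frame_into(pixels, width, height)) {
        AndroidSurface_updateSurface();               // blit the client bitmap to the Surface
    }
    return AndroidSurface_unregister();
}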


video/jni/Android.mk (note the directory structure):

LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

# our source files
#
LOCAL_SRC_FILES:= \
    surface.cpp

LOCAL_SHARED_LIBRARIES := \
    libskia \
    libsurfaceflinger_client \
    libutils \
    liblog

LOCAL_C_INCLUDES += \
    $(JNI_H_INCLUDE) \
    external/skia/src/core \
    external/skia/include/core \
    frameworks/base/include \
    frameworks/base/native/include

# Optional tag would mean it doesn't get installed by default
LOCAL_MODULE_TAGS := optional

LOCAL_PRELINK_MODULE := false

LOCAL_MODULE:= libjnivideo

include $(BUILD_SHARED_LIBRARY)

audio/jni/audiotrack.cpp (note the directory structure):

/*
 * Copyright (C) 2009 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include <android/audiotrack.h>
#include <utils/Log.h>
#include <media/AudioTrack.h>
#include <media/AudioSystem.h>
#include <utils/Errors.h>
#include <binder/MemoryHeapBase.h>
#include <binder/MemoryBase.h>

#define TAG "AudioTrackWrapper"

using namespace android;

//struct audiotrack_fields_t {
static AudioTrack*      track;
//sp<MemoryHeapBase>     memHeap;
//sp<MemoryBase>      memBase;
//};
//static struct audiotrack_fields_t audio;

static AudioTrack* getNativeAudioTrack(JNIEnv* env, jobject jaudioTrack) {
    jclass clazz = env->FindClass("android/media/AudioTrack");
    jfieldID field_track = env->GetFieldID(clazz, "mNativeTrackInJavaObj", "I");
    if(field_track == NULL) {
        return NULL;
    }
    return (AudioTrack *) env->GetIntField(jaudioTrack, field_track);
}

/*
static bool allocSharedMem(int sizeInBytes) {
    memHeap = new MemoryHeapBase(sizeInBytes);
    if (memHeap->getHeapID() < 0) {
        return false;
    }
    memBase = new MemoryBase(memHeap, 0, sizeInBytes);
    return true;
}
*/

int AndroidAudioTrack_register() {
    __android_log_print(ANDROID_LOG_INFO, TAG, "registering audio track");
    track = new AudioTrack();
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_JNI_EXCEPTION;
    }
    __android_log_print(ANDROID_LOG_INFO, TAG, "registered");
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_start() {
    //__android_log_print(ANDROID_LOG_INFO, TAG, "starting audio track");
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->start();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_set(int streamType,
                          uint32_t sampleRate,
                          int format,
                          int channels) {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    __android_log_print(ANDROID_LOG_INFO, TAG, "setting audio track");
    status_t ret = track->set(streamType,
                              sampleRate,
                              format,
                              channels,
                              0,
                              0,
                              0,
                              0,
                              false);
    if (ret != NO_ERROR) {
        return ANDROID_AUDIOTRACK_RESULT_ERRNO;
    }
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_flush() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->flush();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_stop() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    track->stop();
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_reload() {
    if(track == NULL) {
        return ANDROID_AUDIOTRACK_RESULT_ALLOCATION_FAILED;
    }
    if(track->reload() != NO_ERROR) {
        return ANDROID_AUDIOTRACK_RESULT_ERRNO;
    }
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_unregister() {
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistering audio track");
    if(!track->stopped()) {
        track->stop();
    }
    //memBase.clear();
    //memHeap.clear();
    free(track);
    //track = NULL;
    __android_log_print(ANDROID_LOG_INFO, TAG, "unregistered");
    return ANDROID_AUDIOTRACK_RESULT_SUCCESS;
}

int AndroidAudioTrack_write(void *buffer, int buffer_size) {
    // give the data to the native AudioTrack object (the data starts at the offset)
    ssize_t written = 0;
    // regular write() or copy the data to the AudioTrack's shared memory?
    if (track->sharedBuffer() == 0) {
        written = track->write(buffer, buffer_size);
    } else {
        // writing to shared memory, check for capacity
        if ((size_t)buffer_size > track->sharedBuffer()->size()) {
            __android_log_print(ANDROID_LOG_INFO, TAG, "buffer size was too small");
            buffer_size = track->sharedBuffer()->size();
        }
        memcpy(track->sharedBuffer()->pointer(), buffer, buffer_size);
        written = buffer_size;
    }
    return written;
}
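
Similarly, here is a hedged sketch of how the player side might drive these audio wrappers; the stream-type, format and channel values below are illustrative assumptions, not constants taken from the project:

// Hypothetical calling sequence for the audiotrack.cpp wrappers above.
// 3 = STREAM_MUSIC, 44100 Hz, 1 = PCM 16-bit, 12 = stereo channel mask: illustrative only;
// the real values come from the decoded stream and the AudioTrack headers.
#include <android/audiotrack.h>

int playPcmBuffer(void *pcm, int pcm_size) {
    if (AndroidAudioTrack_register() != ANDROID_AUDIOTRACK_RESULT_SUCCESS)
        return -1;
    if (AndroidAudioTrack_set(3, 44100, 1, 12) != ANDROID_AUDIOTRACK_RESULT_SUCCESS)
        return -1;
    AndroidAudioTrack_start();
    AndroidAudioTrack_write(pcm, pcm_size);           // queue the decoded PCM samples
    AndroidAudioTrack_stop();
    return AndroidAudioTrack_unregister();
}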

audio/jni/Android.mk (note the directory structure):

LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

# our source files
#
LOCAL_SRC_FILES:= \
    audiotrack.cpp

LOCAL_SHARED_LIBRARIES := \
    libbinder \
    libmedia \
    libutils \
    liblog

LOCAL_C_INCLUDES += \
    $(JNI_H_INCLUDE) \
    frameworks/base/include \
    frameworks/base/native/include

# Optional tag would mean it doesn't get installed by default
LOCAL_MODULE_TAGS := optional

LOCAL_PRELINK_MODULE := false

LOCAL_MODULE:= libjniaudio

include $(BUILD_SHARED_LIBRARY)





