Streaming media refers to the capability of playing media data while it is still being transferred from the server. The user doesn't need to wait until the full media content has been downloaded before playback starts. In media streaming, the content is split into small chunks that serve as the transport unit. Once the user's player has received enough chunks, it starts playing.
From the developer's perspective, media streaming consists of two tasks: transferring data and rendering data. Application developers usually concentrate more on transferring data than on rendering it, because codecs and media renderers are often already available.
On Android, streaming audio is somewhat easier than streaming video, because Android provides a friendlier API for rendering audio data in small chunks. No matter which transfer mechanism we use, whether RTP, raw UDP, or plain file reading, we need to feed the chunks we receive to the renderer. The AudioTrack.write method lets us do exactly that.
An AudioTrack object runs in one of two modes: static or stream. In static mode, we write the whole audio file to the audio hardware at once. In stream mode, audio data is written in small chunks. The static mode is more efficient because it avoids the overhead of copying data from the Java layer to the native layer on every write, but it isn't suitable if the audio file is too big to fit in memory. It's important to note that play is called at a different point in each mode. In static mode, we must call write first and then call play; otherwise AudioTrack throws an exception complaining that the AudioTrack object isn't properly initialized. In stream mode, the calls are made in the reverse order. Under the hood, the static and stream modes determine the memory model: in static mode, audio data is passed to the renderer via shared memory, which is why it is more efficient.
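To make the call order concrete, here is a minimal sketch of setting up an AudioTrack in stream mode and feeding it chunks as they arrive. The sample rate, channel configuration, chunk handling, and the helper names are assumptions for illustration; use whatever matches your actual PCM source.

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class StreamModeExample {
    // Create an AudioTrack in stream mode. In stream mode play() comes first,
    // then write() is called repeatedly; in static mode the order is reversed.
    static AudioTrack createStreamTrack() {
        int sampleRate = 8000;                       // assumed 8 kHz mono 16-bit PCM
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();                                // stream mode: play() before write()
        return track;
    }

    // Call this whenever a new chunk of PCM data has been received.
    static void onChunkReceived(AudioTrack track, byte[] chunk, int length) {
        track.write(chunk, 0, length);               // copies the chunk to the native layer
    }
}
```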

From a bird's-eye view, the architecture of a typical audio streaming application is:

Our application receives data from the network. The data is then passed to a Java-layer AudioTrack object, which internally calls through JNI into the native AudioTrack object. The native AudioTrack object in our application is a proxy that refers, through the binder IPC mechanism, to the implementation AudioTrack object residing in the audioflinger process. The audioflinger process then interacts with the audio hardware.
Since our application and audioflinger are separate processes, playback will not stop once our application has written its data to audioflinger, even if our application exits.

AudioTrack only accepts raw PCM audio data. In other words, we can't stream mp3 audio directly; we have to handle the decoding ourselves and feed the decoded PCM data to AudioTrack.
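As a rough illustration of that "decode it yourself" step, the sketch below uses MediaExtractor and MediaCodec (available on newer API levels than this article originally targeted, API 21+ for the buffer accessors used here) to decode a compressed file into PCM and push the decoded chunks into a stream-mode AudioTrack. The class name is hypothetical and error handling is omitted; treat it as a sketch of the approach, not a drop-in implementation.

```java
import java.io.IOException;
import java.nio.ByteBuffer;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class CompressedAudioPlayer {
    // Decode a compressed audio file (e.g. mp3) to PCM and feed the decoded
    // chunks to an AudioTrack running in stream mode.
    public static void play(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        MediaFormat format = extractor.getTrackFormat(0);   // assume track 0 is audio
        extractor.selectTrack(0);

        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
        int channels = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
        int channelConfig = (channels == 1)
                ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
        int minBuf = AudioTrack.getMinBufferSize(sampleRate, channelConfig,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                channelConfig, AudioFormat.ENCODING_PCM_16BIT, minBuf,
                AudioTrack.MODE_STREAM);
        track.play();                                        // stream mode: play() first

        MediaCodec codec = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null, null, 0);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        while (true) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = codec.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size < 0) {                          // no more compressed data
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
                byte[] pcm = new byte[info.size];
                outBuf.get(pcm);
                track.write(pcm, 0, pcm.length);             // feed decoded PCM chunk
                codec.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;                                   // decoding finished
                }
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
        track.stop();
        track.release();
    }
}
```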

Sample
http://code.google.com/p/rxwen-blog-stuff/source/browse/trunk/android/streaming_audio/
For demonstration purposes, this sample uses a very simple transfer mechanism: it reads data from a wav file on disk in chunks. We can treat the data as if it were delivered from a media server over the network; the idea is the same. A minimal sketch of that read loop is shown below.
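The following sketch assumes a canonical 44-byte wav header, an arbitrary 4096-byte chunk size, and a hypothetical class name; it simply reads chunks from disk and hands them to a stream-mode AudioTrack, just as a network receive loop would.

```java
import java.io.FileInputStream;
import java.io.IOException;

import android.media.AudioTrack;

public class WavFileFeeder {
    // Read a PCM wav file in small chunks and hand each chunk to an AudioTrack
    // in stream mode, mimicking data arriving from a network source.
    static void feed(String path, AudioTrack track) throws IOException {
        FileInputStream in = new FileInputStream(path);
        try {
            in.skip(44);                             // skip the wav header (assumed 44 bytes)
            byte[] chunk = new byte[4096];           // arbitrary chunk size
            int read;
            while ((read = in.read(chunk)) > 0) {
                track.write(chunk, 0, read);         // feed the chunk to the renderer
            }
        } finally {
            in.close();
        }
    }
}
```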
