(1) Before the data in mVideoBuffer can be drawn, mVideoRenderer must first be created:

void AwesomePlayer::onVideoEvent() {
    ...
    if (mVideoRenderer == NULL) {
        initRenderer_l();
    }
    ...
}

void AwesomePlayer::initRenderer_l() {
    if (!strncmp("OMX.", component, 4)) {
        mVideoRenderer = new AwesomeRemoteRenderer(
            mClient.interface()->createRenderer(
                mISurface, component, ...));      ..... (2)
    } else {
        mVideoRenderer = new AwesomeLocalRenderer(
            ..., component, mISurface);           ..... (3)
    }
}

(2) If the video decoder is an OMX component, an AwesomeRemoteRenderer is created as mVideoRenderer. As the code in (1) shows, AwesomeRemoteRenderer is in essence whatever OMX::createRenderer creates.
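For reference, AwesomeRemoteRenderer itself is only a thin wrapper around the IOMXRenderer Binder proxy. A sketch along the lines of the AwesomePlayer.cpp of this era (simplified; kKeyBufferID and the exact signatures may vary between releases, so treat this as an illustration rather than the verbatim source):

struct AwesomeRemoteRenderer : public AwesomeRenderer {
    AwesomeRemoteRenderer(const sp<IOMXRenderer> &target)
        : mTarget(target) {
    }

    virtual void render(MediaBuffer *buffer) {
        void *id;
        if (buffer->meta_data()->findPointer(kKeyBufferID, &id)) {
            // The decoded frame stays where the OMX component put it;
            // only its OMX buffer id crosses the Binder boundary.
            mTarget->render((IOMX::buffer_id)id);
        }
    }

private:
    sp<IOMXRenderer> mTarget;
};

Passing a buffer id instead of pixel data means render() never has to copy a decoded frame across the Binder interface.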
createRenderer first tries to build a hardware renderer -- SharedVideoRenderer (from libstagefrighthw.so); if that fails, it falls back to a software renderer -- SoftwareRenderer (which draws to the surface):

sp<IOMXRenderer> OMX::createRenderer(...) {
    VideoRenderer *impl = NULL;

    libHandle = dlopen("libstagefrighthw.so", RTLD_NOW);

    if (libHandle) {
        CreateRendererFunc func = dlsym(libHandle, ...);
        impl = (*func)(...);                  <----- Hardware Renderer
    }

    if (!impl) {
        impl = new SoftwareRenderer(...);     <----- Software Renderer
    }
}
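Both OMX::createRenderer above and AwesomeLocalRenderer::init below rely on the same dlopen/dlsym fallback idiom. The following is a minimal, self-contained sketch of that idiom, assuming a made-up factory symbol createVendorRenderer and stand-in types; the real libstagefrighthw.so factory has a longer (mangled) name and more parameters:

#include <dlfcn.h>
#include <cstdio>

// Stand-ins for the real Stagefright types -- hypothetical.
struct VideoRenderer {
    virtual ~VideoRenderer() {}
    virtual void render(const void *frame) = 0;
};

struct SoftwareRenderer : public VideoRenderer {   // software fallback
    virtual void render(const void *) { std::printf("software render\n"); }
};

typedef VideoRenderer *(*CreateRendererFunc)();    // simplified signature

VideoRenderer *createRenderer() {
    VideoRenderer *impl = NULL;

    // RTLD_NOW resolves all symbols at load time, as in the real code.
    void *libHandle = dlopen("libstagefrighthw.so", RTLD_NOW);
    if (libHandle) {
        // "createVendorRenderer" is a made-up symbol name for illustration.
        CreateRendererFunc func =
            (CreateRendererFunc)dlsym(libHandle, "createVendorRenderer");
        if (func) {
            impl = (*func)();                      // hardware renderer
        }
    }

    if (!impl) {
        impl = new SoftwareRenderer();             // software renderer
    }
    return impl;
}

The point of the pattern is that a vendor can drop in a hardware-accelerated renderer simply by shipping libstagefrighthw.so; when the library or the factory symbol is absent, playback still works through the software path.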
(3) If the video decoder is a software component, an AwesomeLocalRenderer is created as mVideoRenderer. The constructor of AwesomeLocalRenderer calls its own init function, which does exactly the same thing as OMX::createRenderer:

void AwesomeLocalRenderer::init(...) {
    mLibHandle = dlopen("libstagefrighthw.so", RTLD_NOW);

    if (mLibHandle) {
        CreateRendererFunc func = dlsym(...);
        mTarget = (*func)(...);                <----- Hardware Renderer
    }

    if (mTarget == NULL) {
        mTarget = new SoftwareRenderer(...);   <----- Software Renderer
    }
}

(4) Once mVideoRenderer has been created, decoded data can be passed to it:

void AwesomePlayer::onVideoEvent() {
    if (!mVideoBuffer) {
        mVideoSource->read(&mVideoBuffer, ...);
    }

    [Check Timestamp]

    if (mVideoRenderer == NULL) {
        initRenderer_l();
    }

    mVideoRenderer->render(mVideoBuffer);      <----- Render Data
}
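The [Check Timestamp] step elided above is where audio/video synchronization happens. Below is a hypothetical sketch of the idea, assuming a helper getClockUs() that returns the current playback clock in microseconds; kKeyTime, CHECK and postVideoEvent_l do appear in the AwesomePlayer sources, but the surrounding arithmetic here is simplified:

// Hypothetical sketch of the [Check Timestamp] logic, not verbatim AOSP code.
int64_t timeUs;                          // the frame's media timestamp
CHECK(mVideoBuffer->meta_data()->findInt64(kKeyTime, &timeUs));

int64_t nowUs = getClockUs();            // hypothetical playback-clock helper
int64_t latenessUs = nowUs - timeUs;

if (latenessUs > 40000) {
    // More than 40 ms late: drop this frame and fetch the next one.
    mVideoBuffer->release();
    mVideoBuffer = NULL;
    postVideoEvent_l();
    return;
}

if (latenessUs < -10000) {
    // More than 10 ms early: re-post the event and check again later.
    postVideoEvent_l(10000);
    return;
}

Frames that fall far behind the clock are dropped outright, while frames that arrive early cause the event to be re-posted; this is what keeps video locked to the audio clock during playback.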
Stagefright framework (6) - The Audio Playback flow

Up to this point we have concentrated solely on the video-processing side; audio has not been mentioned at all.