Adding Custom GPU Filter Support to IjkPlayer
Recently, for work, I needed to provide an example of plugging our own GPU filter into IjkPlayer, so I had no choice but to study the IjkPlayer code carefully. IjkPlayer does not expose an interface for setting a custom GPU filter, so in the end I had to roll up my sleeves and build one myself. I have to say that Bilibili's open-source IjkPlayer is a very capable player with a very clean code design; reading it closely, you can learn quite a lot.
Obtaining and Building the Source
The source repository and build steps follow the project README:
# Fetch the ijkplayer source
git clone https://github.com/Bilibili/ijkplayer.git ijkplayer-android
# Enter the source directory
cd ijkplayer-android
# Check out the latest release
git checkout -B latest k0.8.0
# Run this script; it downloads the sources ijkplayer depends on, such as ffmpeg
./init-android.sh
# Build ffmpeg; "all" can be replaced with a specific ABI, e.g. armv7a
cd android/contrib
./compile-ffmpeg.sh clean
./compile-ffmpeg.sh all
# Build ijkplayer; "all" can be replaced with a specific ABI, e.g. armv7a
cd ..
./compile-ijk.sh all
IjkPlayer Analysis and Modification
In the Android IjkPlayer example project, the playback screen is tv.danmaku.ijk.media.example.activities.VideoActivity. VideoActivity plays video through IjkVideoView, which lives under the tv.danmaku.ijk.media.example.widget.media package.
IjkVideoView is used much like Android's VideoView. The video source is set via setVideoURI, which in turn calls the private openVideo method. openVideo creates an IMediaPlayer through createPlayer, choosing the implementation based on the player type configured in the settings:
public IMediaPlayer createPlayer(int playerType) {
    IMediaPlayer mediaPlayer = null;

    switch (playerType) {
        case Settings.PV_PLAYER__IjkExoMediaPlayer: {
            IjkExoMediaPlayer IjkExoMediaPlayer = new IjkExoMediaPlayer(mAppContext);
            mediaPlayer = IjkExoMediaPlayer;
        }
        break;
        case Settings.PV_PLAYER__AndroidMediaPlayer: {
            AndroidMediaPlayer androidMediaPlayer = new AndroidMediaPlayer();
            mediaPlayer = androidMediaPlayer;
        }
        break;
        case Settings.PV_PLAYER__IjkMediaPlayer:
        default: {
            IjkMediaPlayer ijkMediaPlayer = null;
            if (mUri != null) {
                ijkMediaPlayer = new IjkMediaPlayer();
                ijkMediaPlayer.native_setLogLevel(IjkMediaPlayer.IJK_LOG_DEBUG);
                if (mSettings.getUsingMediaCodec()) {
                    ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec", 1);
                    if (mSettings.getUsingMediaCodecAutoRotate()) {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-auto-rotate", 1);
                    } else {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-auto-rotate", 0);
                    }
                    if (mSettings.getMediaCodecHandleResolutionChange()) {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-handle-resolution-change", 1);
                    } else {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-handle-resolution-change", 0);
                    }
                } else {
                    ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec", 0);
                }
                // Other option-setting code omitted
            }
            mediaPlayer = ijkMediaPlayer;
        }
        break;
    }

    if (mSettings.getEnableDetachedSurfaceTextureView()) {
        mediaPlayer = new TextureMediaPlayer(mediaPlayer);
    }

    return mediaPlayer;
}
As the code above shows, IjkVideoView uses mSettings in many places. mSettings holds user preferences persisted via SharedPreferences, covering audio/video decoder options, whether to use OpenSL ES, which render view to use, and so on. See the SettingsActivity screen for details.
createPlayer picks the implementation by playerType: the first two cases are Google's ExoPlayer and Android's MediaPlayer. Only the last case actually creates an IjkPlayer instance.
The remaining mSettings values are eventually converted and applied through IjkMediaPlayer's setOption method, and IjkMediaPlayer.setOption directly calls a native method. Looking inside IjkMediaPlayer, you will find that many of its methods either are native methods or delegate to native methods.
Adding the setGLFilter Interface
A global search in the ijkmedia folder for one of these methods, _setDataSource, turns up roughly the following:
F:\cres\C\ijkplayer-android\ijkmedia\ijkplayer\android\ijkplayer_jni.c:
static void
IjkMediaPlayer_setDataSourceAndHeaders(
    JNIEnv *env, jobject thiz, jstring path,
    jobjectArray keys, jobjectArray values)
...
static void
IjkMediaPlayer_setDataSourceFd(JNIEnv *env, jobject thiz, jint fd)
{
    MPTRACE("%s\n", __func__);
...
static void
IjkMediaPlayer_setDataSourceCallback(JNIEnv *env, jobject thiz, jobject callback)
{
    MPTRACE("%s\n", __func__);
...
static JNINativeMethod g_methods[] = {
    {
        "_setDataSource",
        "(Ljava/lang/String;[Ljava/lang/String;[Ljava/lang/String;)V",
        (void *) IjkMediaPlayer_setDataSourceAndHeaders
    },
    { "_setDataSourceFd", "(I)V", (void *) IjkMediaPlayer_setDataSourceFd },
    { "_setDataSource", "(Ltv/danmaku/ijk/media/player/misc/IMediaDataSource;)V", (void *)IjkMediaPlayer_setDataSourceCallback },
    { "_setAndroidIOCallback", "(Ltv/danmaku/ijk/media/player/misc/IAndroidIO;)V", (void *)IjkMediaPlayer_setAndroidIOCallback },
So for the Android build of ijkplayer, the JNI entry point is ijkmedia/ijkplayer/android/ijkplayer_jni.c.
Add a setGLFilter method in both IjkMediaPlayer.java and ijkplayer_jni.c:
// Add the setGLFilter method
static void IjkMediaPlayer_native_setGLFilter(JNIEnv *env, jclass clazz, jobject filter)
{
}

// ----------------------------------------------------------------------------
static JNINativeMethod g_methods[] = {
    {
        "_setDataSource",
        "(Ljava/lang/String;[Ljava/lang/String;[Ljava/lang/String;)V",
        (void *) IjkMediaPlayer_setDataSourceAndHeaders
    },
    { "_setDataSourceFd", "(I)V", (void *) IjkMediaPlayer_setDataSourceFd },
    { "_setDataSource", "(Ltv/danmaku/ijk/media/player/misc/IMediaDataSource;)V", (void *)IjkMediaPlayer_setDataSourceCallback },
    // ... other methods omitted
    { "_setGLFilter", "(Ltv/danmaku/ijk/media/player/IjkFilter;)V", (void *) IjkMediaPlayer_native_setGLFilter },
};
Invoking the User's GLFilter Methods During Rendering
In ijkmedia/ijksdl/gles2/internal.h, add the following members to the IJK_GLES2_Renderer struct:
typedef struct IJK_GLES2_Renderer
{
    GLuint frame_buffers[1];
    GLuint frame_textures[1];
    int hasFilter;
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    void (* func_onDrawFrame)(int textureId);
    //...
} IJK_GLES2_Renderer;
These three function pointers correspond to the three methods of IjkFilter; in the JNI layer, they will be wired up to the three methods of the IjkFilter object set from Java.
A global search for glDraw finds only one call, in the IJK_GLES2_Renderer_renderOverlay method of renderer.c, so this is where the actual rendering happens. Of course, when IjkPlayer renders with OpenGL ES it renders according to the pixel format decoded from the video, such as yuv420p, yuv420sp, rgb565 and many others; the implementations live under ijkmedia/ijksdl/gles2/, in renderer_rgb.c, renderer_yuv420p.c and so on.
When the user sets a GLFilter from the Java layer, its three methods should be called back from C at the appropriate moments. As the names suggest, these three methods are essentially the same as the three defined in the GLSurfaceView.Renderer interface.
The concrete changes to IJK_GLES2_Renderer_renderOverlay are as follows:
GLboolean IJK_GLES2_Renderer_renderOverlay(IJK_GLES2_Renderer *renderer, SDL_VoutOverlay *overlay)
{
    if (!renderer || !renderer->func_uploadTexture)
        return GL_FALSE;

    /* If the user set a filter and no framebuffer exists yet, create one.
       We keep IjkPlayer's original pipeline: render the yuv/rgb frame data
       into a texture, then hand that texture to the Java layer as the
       input for filtering. */
    if (renderer->hasFilter && !renderer->frame_buffers[0] && renderer->frame_width > 0 && renderer->frame_height > 0) {
        // Create a texture that will receive the video frame, whatever format it arrives in
        glGenTextures(1, renderer->frame_textures);
        glBindTexture(GL_TEXTURE_2D, renderer->frame_textures[0]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, renderer->frame_width, renderer->frame_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glBindTexture(GL_TEXTURE_2D, 0);
        // Create a framebuffer and attach the texture, so video frames render into it
        glGenFramebuffers(1, renderer->frame_buffers);
        glBindFramebuffer(GL_FRAMEBUFFER, renderer->frame_buffers[0]);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, renderer->frame_textures[0], 0);
        // int r;
        // if ((r = glCheckFramebufferStatus(GL_FRAMEBUFFER)) != GL_FRAMEBUFFER_COMPLETE)
        // {
        //     ALOGE("wuwang: Error in Framebuffer 0x%x", r);
        // } else {
        //     ALOGE("wuwang: glCheckFramebufferStatus 0x%x", r);
        // }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        // Call the onCreated and onSizeChanged methods of the filter passed in from Java
        renderer->func_onCreated();
        renderer->func_onSizeChanged(renderer->frame_width, renderer->frame_height);
        ALOGE("wuwang: create frame_buffers and textures %d,%d", renderer->frame_width, renderer->frame_height);
    }

    // If a filter is set, bind the framebuffer; otherwise render straight to screen as before
    if (renderer->hasFilter && renderer->frame_buffers[0]) {
        GLint bindFrame;
        glGetIntegerv(GL_FRAMEBUFFER_BINDING, &bindFrame);
        ALOGE("wuwang: default frame binding %d", bindFrame);
        glBindFramebuffer(GL_FRAMEBUFFER, renderer->frame_buffers[0]);
        // glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, renderer->frame_textures[0], 0);
        /* This call is required: once the filter has activated another GLProgram,
           we must switch back to the renderer's own program, or the original video
           can no longer be rendered into the texture. */
        IJK_GLES2_Renderer_use(renderer);
    }

    glClear(GL_COLOR_BUFFER_BIT);               IJK_GLES2_checkError_TRACE("glClear");
    ALOGE("wuwang: frame buffer id:%d", renderer->frame_buffers[0]);

    GLsizei visible_width  = renderer->frame_width;
    GLsizei visible_height = renderer->frame_height;
    if (overlay) {
        visible_width  = overlay->w;
        visible_height = overlay->h;
        if (renderer->frame_width   != visible_width    ||
            renderer->frame_height  != visible_height   ||
            renderer->frame_sar_num != overlay->sar_num ||
            renderer->frame_sar_den != overlay->sar_den) {
            renderer->frame_width   = visible_width;
            renderer->frame_height  = visible_height;
            renderer->frame_sar_num = overlay->sar_num;
            renderer->frame_sar_den = overlay->sar_den;
            renderer->vertices_changed = 1;
        }
        renderer->last_buffer_width = renderer->func_getBufferWidth(renderer, overlay);
        if (!renderer->func_uploadTexture(renderer, overlay)) {
            return GL_FALSE;
        }
    } else {
        // NULL overlay means force reload vertices
        renderer->vertices_changed = 1;
    }

    GLsizei buffer_width = renderer->last_buffer_width;
    if (renderer->vertices_changed ||
        (buffer_width > 0 &&
         buffer_width > visible_width &&
         buffer_width != renderer->buffer_width &&
         visible_width != renderer->visible_width)) {
        if (renderer->hasFilter && renderer->frame_buffers[0]) {
            renderer->func_onSizeChanged(renderer->frame_width, renderer->frame_height);
        }
        renderer->vertices_changed = 0;

        IJK_GLES2_Renderer_Vertices_apply(renderer);
        IJK_GLES2_Renderer_Vertices_reset(renderer);
        IJK_GLES2_Renderer_Vertices_reloadVertex(renderer);

        renderer->buffer_width  = buffer_width;
        renderer->visible_width = visible_width;

        GLsizei padding_pixels     = buffer_width - visible_width;
        GLfloat padding_normalized = ((GLfloat)padding_pixels) / buffer_width;

        IJK_GLES2_Renderer_TexCoords_reset(renderer);
        IJK_GLES2_Renderer_TexCoords_cropRight(renderer, padding_normalized);
        IJK_GLES2_Renderer_TexCoords_reloadVertex(renderer);
    }

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);      IJK_GLES2_checkError_TRACE("glDrawArrays");

    // If a filter is set, unbind the framebuffer and call the filter's onDrawFrame
    if (renderer->hasFilter && renderer->frame_buffers[0]) {
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        renderer->func_onDrawFrame(renderer->frame_textures[0]);
        // renderer->func_onDrawFrame(renderer->plane_textures[0]);
    }

    return GL_TRUE;
}
Tracing the GLFilter from Setup to Invocation
The interface has been written above, and so has the code that executes the callbacks. What remains is to connect the three methods of the GLFilter passed in through the interface to the three function pointers used during rendering, so that the user's filter actually takes effect.
IJK_GLES2_Renderer is created from an SDL_VoutOverlay. Tracing where SDL_VoutOverlay comes from leads to the func_create_overlay method in ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c:
static SDL_VoutOverlay *func_create_overlay(int width, int height, int frame_format, SDL_Vout *vout)
{
    SDL_LockMutex(vout->mutex);
    SDL_VoutOverlay *overlay = func_create_overlay_l(width, height, frame_format, vout);
    SDL_UnlockMutex(vout->mutex);
    return overlay;
}
The creation of SDL_VoutOverlay depends on SDL_Vout. Tracing SDL_Vout in turn leads to SDL_VoutAndroid_CreateForANativeWindow in ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c:
SDL_Vout *SDL_VoutAndroid_CreateForANativeWindow()
{
    SDL_Vout *vout = SDL_Vout_CreateInternal(sizeof(SDL_Vout_Opaque));
    if (!vout)
        return NULL;

    SDL_Vout_Opaque *opaque = vout->opaque;
    opaque->native_window = NULL;
    if (ISDL_Array__init(&opaque->overlay_manager, 32))
        goto fail;
    if (ISDL_Array__init(&opaque->overlay_pool, 32))
        goto fail;

    opaque->egl = IJK_EGL_create();
    if (!opaque->egl)
        goto fail;

    vout->opaque_class    = &g_nativewindow_class;
    vout->create_overlay  = func_create_overlay;
    vout->free_l          = func_free_l;
    vout->display_overlay = func_display_overlay;
    return vout;
fail:
    func_free_l(vout);
    return NULL;
}
Its initialization has no further dependencies to follow, so the next step is to look at its call sites. A search shows that SDL_VoutAndroid_CreateForANativeWindow is only called from the SDL_VoutAndroid_CreateForAndroidSurface method in ijkmedia/ijksdl/android/ijksdl_vout_android_surface.c:
SDL_Vout *SDL_VoutAndroid_CreateForAndroidSurface()
{
    return SDL_VoutAndroid_CreateForANativeWindow();
}
So it is just a thin wrapper. Continuing the search with SDL_VoutAndroid_CreateForAndroidSurface, its caller turns out to be the ijkmp_android_create method in ijkmedia/ijkplayer/android/ijkplayer_android.c:
IjkMediaPlayer *ijkmp_android_create(int (*msg_loop)(void *))
{
    IjkMediaPlayer *mp = ijkmp_create(msg_loop);
    if (!mp)
        goto fail;

    mp->ffplayer->vout = SDL_VoutAndroid_CreateForAndroidSurface();
    if (!mp->ffplayer->vout)
        goto fail;

    mp->ffplayer->pipeline = ffpipeline_create_from_android(mp->ffplayer);
    if (!mp->ffplayer->pipeline)
        goto fail;

    ffpipeline_set_vout(mp->ffplayer->pipeline, mp->ffplayer->vout);

    return mp;
fail:
    ijkmp_dec_ref_p(&mp);
    return NULL;
}
ijkmp_android_create returns an IjkMediaPlayer, and there is a class of the same name on the Java side, so the end is in sight.
Searching for ijkmp_android_create shows it is only called from IjkMediaPlayer_native_setup in ijkmedia/ijkplayer/android/ijkplayer_jni.c. From there, the IjkFilter can be passed all the way down.
To summarize the chain from renderer.c up to the JNI layer: the creation of IJK_GLES2_Renderer depends on SDL_VoutOverlay, SDL_VoutOverlay's creation depends on SDL_Vout, SDL_Vout is a member of FFPlayer, and FFPlayer in turn is a member of the native IjkMediaPlayer, which the Java-layer IjkMediaPlayer wraps.
From GLFilter to IJK_GLES2_Renderer
Based on the analysis above, in the setGLFilter method added to the JNI layer we can pass the GLFilter's methods down through IjkMediaPlayer->FFPlayer->SDL_Vout, then on to SDL_VoutOverlay, and from SDL_VoutOverlay to IJK_GLES2_Renderer. This adds the filter feature without disturbing IjkPlayer's existing flow, and should let iOS add GPU filter support just as quickly in the same way.
First, in the struct definitions of SDL_VoutOverlay and SDL_Vout (in ijkmedia/ijksdl/ijksdl_vout.h), add the same members that were added to IJK_GLES2_Renderer:
struct SDL_VoutOverlay {
    //...
    int hasFilter;
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    void (* func_onDrawFrame)(int textureId);
    //...
};

struct SDL_Vout {
    //...
    int hasFilter;
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    void (* func_onDrawFrame)(int textureId);
    //...
};