Android video playback with filters: rendering video with OpenGL ES + MediaPlayer plus filter effects
I've previously written about playing video with SurfaceView and TextureView + MediaPlayer, and about decoding AVI with ffmpeg and rendering to a SurfaceView. Today let's play video with OpenGL ES + MediaPlayer. I once spent nearly a year on a camera development team, but unfortunately I wasn't blogging back then, so I didn't leave much hands-on material to share.
Here's a screenshot of the effect:
(screenshot: the demo video rendered with the grayscale filter)
The black-and-white (grayscale) effect is implemented with an OpenGL shader,
using the 0.299, 0.587, 0.114 weights of the classic CRT grayscale model.
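For reference, here is that weighted sum in plain Java. Green gets the largest weight because the eye is most sensitive to it:

```java
// CRT / Rec. 601 grayscale: a weighted sum of the three channels,
// with green weighted highest (the eye is most sensitive to green).
static float luma(float r, float g, float b) {
    return 0.299f * r + 0.587f * g + 0.114f * b;
}
```

White stays white (the weights sum to 1), and a pure red pixel becomes a fairly dark gray of 0.299.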
Here's the implementation, step by step:
If you've ever done texture mapping with OpenGL, this will be much easier to follow. Unlike a static image, video has to be refreshed continuously: every time a new frame arrives, we update the texture and redraw. Playing video with OpenGL is essentially pasting the video onto the screen as a texture.
If OpenGL is new to you, start here: the graphics fundamentals you must know before learning OpenGL
1. First, write the vertex and fragment shaders (that's my habit; you can also write them later as needed)
Vertex shader:
attribute vec4 aPosition;//vertex position
attribute vec4 aTexCoord;//s, t texture coordinates
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;
void main() {
vTexCoord = (uSTMatrix * aTexCoord).xy;
gl_Position = uMatrix*aPosition;
}
Fragment shader:
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
gl_FragColor=texture2D(sTexture, vTexCoord);
}
samplerExternalOES takes the place of the sampler2D used for ordinary images; it cooperates with SurfaceTexture to update the texture and handle the pixel-format conversion.
2. MediaPlayer output
Initialize MediaPlayer in the GLVideoRenderer constructor:
mediaPlayer=new MediaPlayer();
try{
mediaPlayer.setDataSource(context, Uri.parse(videoPath));
}catch (IOException e){
e.printStackTrace();
}
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setLooping(true);
mediaPlayer.setOnVideoSizeChangedListener(this);
In onSurfaceCreated, set MediaPlayer's output using a SurfaceTexture:
we create a Surface from the SurfaceTexture and hand it to MediaPlayer as its output surface.
SurfaceTexture's job is to fetch new frames from a video or camera stream; the call that pulls in the latest frame is updateTexImage.
Note that MediaPlayer's output is usually not RGB (it's typically YUV), while GLSurfaceView needs RGB to display correctly.
So first, in onSurfaceCreated, change the texture-generation code to this:
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
textureId = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
ShaderUtils.checkGlError("glBindTexture mTextureID");
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
As mentioned, the decoder's output is YUV (most likely YUV420sp), and this external texture extension converts it to RGB automatically, so we don't have to write a YUV-to-RGB conversion ourselves.
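What the extension spares us is roughly the following per-pixel conversion, sketched here in plain Java. The coefficients assume full-range BT.601; real decoder output may use limited range or BT.709, so treat this as illustrative rather than exact:

```java
// Illustrative full-range BT.601 YUV -> RGB for one pixel (all values 0..255).
// This is the conversion samplerExternalOES lets us skip writing by hand.
static int clamp(float x) {
    return Math.max(0, Math.min(255, Math.round(x)));
}

static int[] yuvToRgb(int y, int u, int v) {
    float r = y + 1.402f * (v - 128);
    float g = y - 0.344136f * (u - 128) - 0.714136f * (v - 128);
    float b = y + 1.772f * (u - 128);
    return new int[]{clamp(r), clamp(g), clamp(b)};
}
```

Gray pixels (u = v = 128) map straight through: yuvToRgb(255, 128, 128) is white, yuvToRgb(0, 128, 128) is black.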
Then add the following at the end of onSurfaceCreated:
surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(this);//listen for new frames
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();
if (!playerPrepared){
try {
mediaPlayer.prepare();
playerPrepared=true;
mediaPlayer.start();
} catch (IOException t) {
Log.e(TAG, "media player prepare failed");
}
}
In onDrawFrame:
synchronized (this){
if (updateSurface){
surfaceTexture.updateTexImage();//fetch the latest frame
surfaceTexture.getTransformMatrix(mSTMatrix);//align the new frame with our texture coordinate system
updateSurface = false;
}
}
When a new frame is available, updateTexImage updates the texture. getTransformMatrix fills mSTMatrix so the new texture lines up correctly with the texture coordinate system; mSTMatrix is declared exactly like projectionMatrix (a float[16]).
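What the vertex shader then does with mSTMatrix, `(uSTMatrix * aTexCoord).xy`, is an ordinary column-major matrix-vector multiply. Here is a plain-Java sketch; the flip matrix in the test below is a typical value SurfaceTexture produces (a vertical flip, since the buffer's origin is top-left while GL's is bottom-left), but the real matrix always comes from getTransformMatrix:

```java
// Column-major 4x4 matrix times vec4, matching the GLSL expression
// (uSTMatrix * aTexCoord); android.opengl.Matrix uses the same layout.
static float[] multiply(float[] m, float[] v) {
    float[] out = new float[4];
    for (int row = 0; row < 4; row++) {
        out[row] = m[row] * v[0] + m[4 + row] * v[1]
                 + m[8 + row] * v[2] + m[12 + row] * v[3];
    }
    return out;
}
```

A vertical-flip matrix (identity with m[5] = -1 and m[13] = 1) maps a texture coordinate (s, t) to (s, 1 - t).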
private final float[] vertexData = {
1f,-1f,0f,
-1f,-1f,0f,
1f,1f,0f,
-1f,1f,0f
};
private final float[] textureVertexData = {
1f,0f,
0f,0f,
1f,1f,
0f,1f
};
vertexData holds the clip-space coordinates of the quad we draw into the viewport; textureVertexData holds the corresponding video texture coordinates that map the texture onto the screen.
Next we wire these arrays up to the shaders by looking up the attribute and uniform locations.
Get the locations in onSurfaceCreated:
aPositionLocation= GLES20.glGetAttribLocation(programId,"aPosition");
uMatrixLocation=GLES20.glGetUniformLocation(programId,"uMatrix");
uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
uTextureSamplerLocation=GLES20.glGetUniformLocation(programId,"sTexture");
aTextureCoordLocation=GLES20.glGetAttribLocation(programId,"aTexCoord");
Then feed everything in onDrawFrame:
GLES20.glUseProgram(programId);
GLES20.glUniformMatrix4fv(uMatrixLocation,1,false,projectionMatrix,0);
GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);
vertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aPositionLocation);
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
12, vertexBuffer);
textureVertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
GLES20.glVertexAttribPointer(aTextureCoordLocation,2,GLES20.GL_FLOAT,false,8,textureVertexBuffer);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,textureId);
GLES20.glUniform1i(uTextureSamplerLocation,0);
GLES20.glViewport(0,0,screenWidth,screenHeight);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
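The final glDrawArrays call with GL_TRIANGLE_STRIP turns our four vertices into the two triangles of the full-screen quad. As a quick plain-Java sketch of the indexing rule (illustrative only, not part of the renderer):

```java
// GL_TRIANGLE_STRIP: triangle i uses vertices i, i+1 and i+2, so four
// vertices yield exactly the two triangles of our quad.
static int[][] stripTriangles(int vertexCount) {
    int[][] tris = new int[vertexCount - 2][];
    for (int i = 0; i < vertexCount - 2; i++) {
        tris[i] = new int[]{i, i + 1, i + 2};
    }
    return tris;
}
```

(The winding order alternates per triangle; GL accounts for that internally.)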
Here is the complete GLVideoRenderer:
package com.ws.openglvideoplayer;
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.net.Uri;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
/**
* Created by Shuo.Wang on 2017/3/19.
*/
public class GLVideoRenderer implements GLSurfaceView.Renderer
, SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener {
private static final String TAG = "GLRenderer";
private Context context;
private int aPositionLocation;
private int programId;
private FloatBuffer vertexBuffer;
private final float[] vertexData = {
1f,-1f,0f,
-1f,-1f,0f,
1f,1f,0f,
-1f,1f,0f
};
private final float[] projectionMatrix=new float[16];
private int uMatrixLocation;
private final float[] textureVertexData = {
1f,0f,
0f,0f,
1f,1f,
0f,1f
};
private FloatBuffer textureVertexBuffer;
private int uTextureSamplerLocation;
private int aTextureCoordLocation;
private int textureId;
private SurfaceTexture surfaceTexture;
private MediaPlayer mediaPlayer;
private float[] mSTMatrix = new float[16];
private int uSTMMatrixHandle;
private boolean updateSurface;
private boolean playerPrepared;
private int screenWidth,screenHeight;
public GLVideoRenderer(Context context,String videoPath) {
this.context = context;
playerPrepared=false;
synchronized(this) {
updateSurface = false;
}
vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer()
.put(vertexData);
vertexBuffer.position(0);
textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer()
.put(textureVertexData);
textureVertexBuffer.position(0);
mediaPlayer=new MediaPlayer();
try{
mediaPlayer.setDataSource(context, Uri.parse(videoPath));
}catch (IOException e){
e.printStackTrace();
}
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setLooping(true);
mediaPlayer.setOnVideoSizeChangedListener(this);
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
String vertexShader = ShaderUtils.readRawTextFile(context, R.raw.simple_vertex_shader);
String fragmentShader= ShaderUtils.readRawTextFile(context, R.raw.simple_fragment_shader);
programId=ShaderUtils.createProgram(vertexShader,fragmentShader);
aPositionLocation= GLES20.glGetAttribLocation(programId,"aPosition");
uMatrixLocation=GLES20.glGetUniformLocation(programId,"uMatrix");
uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
uTextureSamplerLocation=GLES20.glGetUniformLocation(programId,"sTexture");
aTextureCoordLocation=GLES20.glGetAttribLocation(programId,"aTexCoord");
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
textureId = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
ShaderUtils.checkGlError("glBindTexture mTextureID");
/*What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
As mentioned, the decoder's output is YUV (a 4:2:0 format), and this external texture
extension converts it to RGB automatically, so we don't have to write that conversion ourselves.*/
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(this);//listen for new frames
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();
if (!playerPrepared){
try {
mediaPlayer.prepare();
playerPrepared=true;
mediaPlayer.start();
} catch (IOException t) {
Log.e(TAG, "media player prepare failed");
}
}
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
Log.d(TAG, "onSurfaceChanged: "+width+" "+height);
screenWidth=width; screenHeight=height;
}
@Override
public void onDrawFrame(GL10 gl) {
GLES20.glClear( GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
synchronized (this){
if (updateSurface){
surfaceTexture.updateTexImage();//fetch the latest frame
surfaceTexture.getTransformMatrix(mSTMatrix);//align the new frame with our texture coordinate system
updateSurface = false;
}
}
GLES20.glUseProgram(programId);
GLES20.glUniformMatrix4fv(uMatrixLocation,1,false,projectionMatrix,0);
GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);
vertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aPositionLocation);
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
12, vertexBuffer);
textureVertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
GLES20.glVertexAttribPointer(aTextureCoordLocation,2,GLES20.GL_FLOAT,false,8,textureVertexBuffer);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,textureId);
GLES20.glUniform1i(uTextureSamplerLocation,0);
GLES20.glViewport(0,0,screenWidth,screenHeight);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
@Override
synchronized public void onFrameAvailable(SurfaceTexture surface) {
updateSurface = true;
}
@Override
public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
Log.d(TAG, "onVideoSizeChanged: "+width+" "+height);
updateProjection(width,height);
}
private void updateProjection(int videoWidth, int videoHeight){
float screenRatio=(float)screenWidth/screenHeight;
float videoRatio=(float)videoWidth/videoHeight;
if (videoRatio>screenRatio){
Matrix.orthoM(projectionMatrix,0,-1f,1f,-videoRatio/screenRatio,videoRatio/screenRatio,-1f,1f);
}else Matrix.orthoM(projectionMatrix,0,-screenRatio/videoRatio,screenRatio/videoRatio,-1f,1f,-1f,1f);
}
public MediaPlayer getMediaPlayer() {
return mediaPlayer;
}
}
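The letterboxing math in updateProjection can be checked in isolation. This plain-Java sketch computes the bounds that the renderer passes to Matrix.orthoM (orthoM itself just builds an orthographic matrix from them):

```java
// The bounds updateProjection passes to Matrix.orthoM: stretch one axis
// of the [-1,1] clip square so the quad keeps the video's aspect ratio.
static float[] orthoBounds(int screenW, int screenH, int videoW, int videoH) {
    float screenRatio = (float) screenW / screenH;
    float videoRatio = (float) videoW / videoH;
    if (videoRatio > screenRatio) {
        // Video is wider than the screen: extend the vertical range
        // so the video fills the width and is letterboxed top/bottom.
        float v = videoRatio / screenRatio;
        return new float[]{-1f, 1f, -v, v};   // left, right, bottom, top
    } else {
        // Video is taller (or equal): extend the horizontal range instead.
        float h = screenRatio / videoRatio;
        return new float[]{-h, h, -1f, 1f};
    }
}
```

For a 1920x1080 video on a 1080x1920 portrait screen this gives a vertical range of about ±3.16: the quad fills the width with bars above and below.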
To get the filtered video shown above, we only need the 0.299, 0.587, 0.114 CRT grayscale model (you can find many more effects online; this is just a starting point).
Changing the fragment shader is all it takes:
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
//gl_FragColor=texture2D(sTexture, vTexCoord);
vec3 centralColor = texture2D(sTexture, vTexCoord).rgb;
float gray = 0.299*centralColor.r + 0.587*centralColor.g + 0.114*centralColor.b;
gl_FragColor = vec4(vec3(gray), 1.0);
}
That's it: we now have video playback rendered with OpenGL ES + MediaPlayer, plus a filter effect. A follow-up post will cover how panoramic (360°) video works and how to implement it; stay tuned.