
[Mobile Application Development] Implementing a Circular Camera Preview on Android

How do you implement a circular camera preview on Android? Many developers without prior experience find this tricky, so this article summarizes why the problem arises and how to solve it; hopefully it will get you through it.

I. Giving the preview view rounded corners

Set a ViewOutlineProvider on the view:

    public RoundTextureView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setOutlineProvider(new ViewOutlineProvider() {
            @Override
            public void getOutline(View view, Outline outline) {
                Rect rect = new Rect(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
                outline.setRoundRect(rect, radius);
            }
        });
        setClipToOutline(true);
    }

Change the corner radius and refresh when needed:

    public void setRadius(int radius) {
        this.radius = radius;
    }

    public void turnRound() {
        invalidateOutline();
    }
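The circle condition described next (a square view whose corner radius is half its side length) can be sketched as a small standalone computation; the view dimensions here are made-up sample values:

```java
public class RadiusDemo {
    // For a circular outline, the view must be square and the corner
    // radius must be half the side length.
    public static int circleRadius(int width, int height) {
        if (width != height) {
            throw new IllegalArgumentException("view must be square for a circular preview");
        }
        return width / 2;
    }

    public static void main(String[] args) {
        // A 300x300 preview view becomes a circle with radius 150.
        System.out.println(circleRadius(300, 300));
    }
}
```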

Calling turnRound() updates the view's displayed corners to the configured radius. When the view is square and the radius is half the side length, the result is a circle.

II. Implementing a square preview

1. The device supports a 1:1 preview size

First, a simple but rather limited approach: set both the camera preview size and the preview view to a 1:1 aspect ratio. Android devices generally support several preview sizes. Taking a Samsung Tab S3 as an example, the preview sizes it supports through the Camera API are:

    2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1920x1080
    2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1280x720
    2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1440x1080
    2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1088x1088
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1056x864
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 960x720
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 720x480
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 640x480
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 352x288
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 320x240
    2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 176x144

Among these, the 1:1 preview size is 1088x1088. Through the Camera2 API, the supported preview sizes (which actually include the picture sizes too) are:

    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x3096
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x2322
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x2448
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x1836
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3024x3024
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2976x2976
    2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2880x2160
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2592x1944
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1920
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1440
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1080
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2160x2160
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1536
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1152
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1936x1936
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1920x1080
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1440x1080
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x960
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x720
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 960x720
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 720x480
    2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 640x480
    2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 320x240
    2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 176x144

Here the 1:1 preview sizes are 3024x3024, 2976x2976, 2160x2160 and 1936x1936. By picking a 1:1 preview size and making the preview view square, we get a square preview; setting the view's corner radius to half its side length then gives a circular preview.

2. The device does not support a 1:1 preview size

Drawbacks of relying on a 1:1 preview size:

Limited resolution choice. As shown above, we can preview at 1:1, but the set of candidate sizes is small; if the camera does not support any 1:1 preview size at all, this approach simply does not work.

Resource consumption. On the Samsung Tab S3, the square preview sizes available through the Camera2 API are all very large, which costs considerable system resources during image processing and similar operations.

Handling devices without a 1:1 preview size: add a ViewGroup with a 1:1 aspect ratio, put the TextureView inside it, and set the TextureView's margins so that only its central square region is visible.

Sample code:

    // Keep the preview view's aspect ratio consistent with the preview size to avoid stretching
    {
        FrameLayout.LayoutParams textureViewLayoutParams = (FrameLayout.LayoutParams) textureView.getLayoutParams();
        int newHeight = 0;
        int newWidth = textureViewLayoutParams.width;
        // Landscape
        if (displayOrientation % 180 == 0) {
            newHeight = textureViewLayoutParams.width * previewSize.height / previewSize.width;
        }
        // Portrait
        else {
            newHeight = textureViewLayoutParams.width * previewSize.width / previewSize.height;
        }
        // When the preview is not square, add a ViewGroup to limit the view's visible region
        if (newHeight != textureViewLayoutParams.height) {
            insertFrameLayout = new RoundFrameLayout(CoverByParentCameraActivity.this);
            int sideLength = Math.min(newWidth, newHeight);
            FrameLayout.LayoutParams layoutParams = new FrameLayout.LayoutParams(sideLength, sideLength);
            insertFrameLayout.setLayoutParams(layoutParams);
            FrameLayout parentView = (FrameLayout) textureView.getParent();
            parentView.removeView(textureView);
            parentView.addView(insertFrameLayout);
            insertFrameLayout.addView(textureView);
            FrameLayout.LayoutParams newTextureViewLayoutParams = new FrameLayout.LayoutParams(newWidth, newHeight);
            // Landscape
            if (displayOrientation % 180 == 0) {
                newTextureViewLayoutParams.leftMargin = ((newHeight - newWidth) / 2);
            }
            // Portrait
            else {
                newTextureViewLayoutParams.topMargin = -(newHeight - newWidth) / 2;
            }
            textureView.setLayoutParams(newTextureViewLayoutParams);
        }
    }
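To see what the margin arithmetic above actually produces, here is the same computation extracted into a standalone sketch; the sample numbers (a 400px-wide view with a 1280x720 preview in portrait) are assumptions for illustration:

```java
public class PreviewLayoutDemo {
    // Mirrors the portrait branch above: scale the view height so the aspect
    // ratio matches the preview size, then offset the view upward so the
    // central square becomes the visible region.
    // Returns {newWidth, newHeight, sideLength, topMargin}.
    public static int[] portraitLayout(int viewWidth, int previewWidth, int previewHeight) {
        int newWidth = viewWidth;
        int newHeight = viewWidth * previewWidth / previewHeight;
        int sideLength = Math.min(newWidth, newHeight);
        int topMargin = -(newHeight - newWidth) / 2;
        return new int[]{newWidth, newHeight, sideLength, topMargin};
    }

    public static void main(String[] args) {
        int[] r = portraitLayout(400, 1280, 720);
        // newHeight, sideLength, topMargin
        System.out.println(r[1] + " " + r[2] + " " + r[3]);
    }
}
```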

III. A more customizable preview with GLSurfaceView

The approach above already gives us square and circular previews, but it only applies to the native camera. When the data source is not the native camera, how do we draw a circular preview? Next we use a GLSurfaceView to display NV21 data, implementing the drawing of the preview data entirely ourselves.

1. GLSurfaceView usage flow

(Figure: the OpenGL flow for rendering YUV data.)

The key step is writing the renderer (Renderer). The Renderer interface is documented as follows:

    /**
     * A generic renderer interface.
     * <p>
     * The renderer is responsible for making OpenGL calls to render a frame.
     * <p>
     * GLSurfaceView clients typically create their own classes that implement
     * this interface, and then call {@link GLSurfaceView#setRenderer} to
     * register the renderer with the GLSurfaceView.
     * <p>
     *
     * <div class="special reference">
     * <h4>Developer Guides</h4>
     * <p>For more information about how to use OpenGL, read the
     * <a href="{@docRoot}guide/topics/graphics/opengl.html" rel="external nofollow">OpenGL</a> developer guide.</p>
     * </div>
     *
     * <h4>Threading</h4>
     * The renderer will be called on a separate thread, so that rendering
     * performance is decoupled from the UI thread. Clients typically need to
     * communicate with the renderer from the UI thread, because that's where
     * input events are received. Clients can communicate using any of the
     * standard Java techniques for cross-thread communication, or they can
     * use the {@link GLSurfaceView#queueEvent(Runnable)} convenience method.
     * <p>
     * <h4>EGL Context Lost</h4>
     * There are situations where the EGL rendering context will be lost. This
     * typically happens when device wakes up after going to sleep. When
     * the EGL context is lost, all OpenGL resources (such as textures) that are
     * associated with that context will be automatically deleted. In order to
     * keep rendering correctly, a renderer must recreate any lost resources
     * that it still needs. The {@link #onSurfaceCreated(GL10, EGLConfig)} method
     * is a convenient place to do this.
     *
     * @see #setRenderer(Renderer)
     */
    public interface Renderer {
        /**
         * Called when the surface is created or recreated.
         * <p>
         * Called when the rendering thread starts and whenever the EGL context
         * is lost. The EGL context will typically be lost when the Android
         * device awakes after going to sleep.
         * <p>
         * Since this method is called at the beginning of rendering, as well as
         * every time the EGL context is lost, this method is a convenient place to put
         * code to create resources that need to be created when the rendering
         * starts, and that need to be recreated when the EGL context is lost.
         * Textures are an example of a resource that you might want to create
         * here.
         * <p>
         * Note that when the EGL context is lost, all OpenGL resources associated
         * with that context will be automatically deleted. You do not need to call
         * the corresponding "glDelete" methods such as glDeleteTextures to
         * manually delete these lost resources.
         * <p>
         * @param gl the GL interface. Use <code>instanceof</code> to
         * test if the interface supports GL11 or higher interfaces.
         * @param config the EGLConfig of the created surface. Can be used
         * to create matching pbuffers.
         */
        void onSurfaceCreated(GL10 gl, EGLConfig config);

        /**
         * Called when the surface changed size.
         * <p>
         * Called after the surface is created and whenever
         * the OpenGL ES surface size changes.
         * <p>
         * Typically you will set your viewport here. If your camera
         * is fixed then you could also set your projection matrix here:
         * <pre class="prettyprint">
         * void onSurfaceChanged(GL10 gl, int width, int height) {
         *     gl.glViewport(0, 0, width, height);
         *     // for a fixed camera, set the projection too
         *     float ratio = (float) width / height;
         *     gl.glMatrixMode(GL10.GL_PROJECTION);
         *     gl.glLoadIdentity();
         *     gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
         * }
         * </pre>
         * @param gl the GL interface. Use <code>instanceof</code> to
         * test if the interface supports GL11 or higher interfaces.
         * @param width
         * @param height
         */
        void onSurfaceChanged(GL10 gl, int width, int height);

        /**
         * Called to draw the current frame.
         * <p>
         * This method is responsible for drawing the current frame.
         * <p>
         * The implementation of this method typically looks like this:
         * <pre class="prettyprint">
         * void onDrawFrame(GL10 gl) {
         *     gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
         *     //... other gl calls to render the scene ...
         * }
         * </pre>
         * @param gl the GL interface. Use <code>instanceof</code> to
         * test if the interface supports GL11 or higher interfaces.
         */
        void onDrawFrame(GL10 gl);
    }

void onSurfaceCreated(GL10 gl, EGLConfig config): called when the Surface is created or recreated.

void onSurfaceChanged(GL10 gl, int width, int height): called when the Surface's size changes.

void onDrawFrame(GL10 gl): where the drawing is implemented. When renderMode is RENDERMODE_CONTINUOUSLY, this function runs continuously; when renderMode is RENDERMODE_WHEN_DIRTY, it runs only after creation and after each call to requestRender. We generally choose RENDERMODE_WHEN_DIRTY to avoid overdraw.

Normally we implement our own Renderer and set it on the GLSurfaceView; writing the Renderer is the core step of the whole flow. Below is the flow of the initialization done in onSurfaceCreated(GL10 gl, EGLConfig config) and the drawing done in onDrawFrame(GL10 gl):

(Figure: a Renderer for rendering YUV data.)

2. Implementation details

Coordinate systems

(Figures: the Android View coordinate system and the OpenGL world coordinate system.)

Unlike the Android View coordinate system, the OpenGL coordinate system is Cartesian. Android View coordinates have their origin at the top-left corner, with x increasing to the right and y increasing downward; OpenGL coordinates have their origin at the center, with x increasing to the right and y increasing upward.

Writing the shaders:

    /**
     * Vertex shader
     */
    private static String VERTEX_SHADER =
            "    attribute vec4 attr_position;\n" +
            "    attribute vec2 attr_tc;\n" +
            "    varying vec2 tc;\n" +
            "    void main() {\n" +
            "        gl_Position = attr_position;\n" +
            "        tc = attr_tc;\n" +
            "    }";

    /**
     * Fragment shader
     */
    private static String FRAG_SHADER =
            "    varying vec2 tc;\n" +
            "    uniform sampler2D ySampler;\n" +
            "    uniform sampler2D uSampler;\n" +
            "    uniform sampler2D vSampler;\n" +
            "    const mat3 convertMat = mat3(1.0, 1.0, 1.0, -0.001, -0.3441, 1.772, 1.402, -0.7141, -0.58060);\n" +
            "    void main()\n" +
            "    {\n" +
            "        vec3 yuv;\n" +
            "        yuv.x = texture2D(ySampler, tc).r;\n" +
            "        yuv.y = texture2D(uSampler, tc).r - 0.5;\n" +
            "        yuv.z = texture2D(vSampler, tc).r - 0.5;\n" +
            "        gl_FragColor = vec4(convertMat * yuv, 1.0);\n" +
            "    }";

Built-in variables:

gl_Position: in VERTEX_SHADER, gl_Position is the position being drawn. Since we draw in 2D, we directly pass the OpenGL 2D coordinates bottom-left (-1,-1), bottom-right (1,-1), top-left (-1,1), top-right (1,1), i.e. {-1,-1, 1,-1, -1,1, 1,1}.

gl_FragColor: in FRAG_SHADER, gl_FragColor is the color of a single fragment.

Other variables:

ySampler, uSampler, vSampler: the texture samplers for the Y, U and V planes respectively.

convertMat: derived from the following formulas:

    R = Y + 1.402 (V - 128)
    G = Y - 0.34414 (U - 128) - 0.71414 (V - 128)
    B = Y + 1.772 (U - 128)

we can obtain a YUV-to-RGB matrix:

    1.0,    1.0,    1.0,
    0,     -0.344,  1.77,
    1.403, -0.714,  0
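The formulas can be checked numerically on a single pixel; a minimal standalone sketch (the sample Y/U/V values are arbitrary, and the result is truncated to an int and clamped to 0..255 as a display pipeline would):

```java
public class YuvToRgbDemo {
    static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, v));
    }

    // Apply the YUV-to-RGB formulas above to one pixel.
    public static int[] yuvToRgb(int y, int u, int v) {
        int r = clamp(y + 1.402 * (v - 128));
        int g = clamp(y - 0.34414 * (u - 128) - 0.71414 * (v - 128));
        int b = clamp(y + 1.772 * (u - 128));
        return new int[]{r, g, b};
    }

    public static void main(String[] args) {
        // Neutral chroma (U = V = 128) leaves a gray pixel unchanged.
        int[] gray = yuvToRgb(128, 128, 128);
        System.out.println(gray[0] + " " + gray[1] + " " + gray[2]);
    }
}
```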

Some types and functions:

vec3, vec4: three-component and four-component vectors.

vec4 texture2D(sampler2D sampler, vec2 coord): samples the sampler's texture at the given coordinate and returns the color value; e.g. texture2D(ySampler, tc).r reads the Y data, texture2D(uSampler, tc).r reads the U data, and texture2D(vSampler, tc).r reads the V data.

Initialization in Java code

Create the ByteBuffers for the Y, U and V texture data according to the frame dimensions, and select the transform matrix according to whether the display is mirrored and the rotation angle:

    public void init(boolean isMirror, int rotateDegree, int frameWidth, int frameHeight) {
        if (this.frameWidth == frameWidth
                && this.frameHeight == frameHeight
                && this.rotateDegree == rotateDegree
                && this.isMirror == isMirror) {
            return;
        }
        dataInput = false;
        this.frameWidth = frameWidth;
        this.frameHeight = frameHeight;
        this.rotateDegree = rotateDegree;
        this.isMirror = isMirror;
        yArray = new byte[this.frameWidth * this.frameHeight];
        uArray = new byte[this.frameWidth * this.frameHeight / 4];
        vArray = new byte[this.frameWidth * this.frameHeight / 4];

        int yFrameSize = this.frameHeight * this.frameWidth;
        int uvFrameSize = yFrameSize >> 2;
        yBuf = ByteBuffer.allocateDirect(yFrameSize);
        yBuf.order(ByteOrder.nativeOrder()).position(0);
        uBuf = ByteBuffer.allocateDirect(uvFrameSize);
        uBuf.order(ByteOrder.nativeOrder()).position(0);
        vBuf = ByteBuffer.allocateDirect(uvFrameSize);
        vBuf.order(ByteOrder.nativeOrder()).position(0);

        // Vertex coordinates
        squareVertices = ByteBuffer
                .allocateDirect(GLUtil.SQUARE_VERTICES.length * FLOAT_SIZE_BYTES)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        squareVertices.put(GLUtil.SQUARE_VERTICES).position(0);

        // Texture coordinates
        if (isMirror) {
            switch (rotateDegree) {
                case 0:
                    coordVertice = GLUtil.MIRROR_COORD_VERTICES;
                    break;
                case 90:
                    coordVertice = GLUtil.ROTATE_90_MIRROR_COORD_VERTICES;
                    break;
                case 180:
                    coordVertice = GLUtil.ROTATE_180_MIRROR_COORD_VERTICES;
                    break;
                case 270:
                    coordVertice = GLUtil.ROTATE_270_MIRROR_COORD_VERTICES;
                    break;
                default:
                    break;
            }
        } else {
            switch (rotateDegree) {
                case 0:
                    coordVertice = GLUtil.COORD_VERTICES;
                    break;
                case 90:
                    coordVertice = GLUtil.ROTATE_90_COORD_VERTICES;
                    break;
                case 180:
                    coordVertice = GLUtil.ROTATE_180_COORD_VERTICES;
                    break;
                case 270:
                    coordVertice = GLUtil.ROTATE_270_COORD_VERTICES;
                    break;
                default:
                    break;
            }
        }
        coordVertices = ByteBuffer.allocateDirect(coordVertice.length * FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
        coordVertices.put(coordVertice).position(0);
    }

Initialize the Renderer once the Surface has been created:

    private void initRenderer() {
        rendererReady = false;
        createGLProgram();

        // Enable texturing
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);

        // Create the textures
        createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, yTexture);
        createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, uTexture);
        createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, vTexture);

        rendererReady = true;
    }
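The texture sizes above follow from the YUV420 layout: the Y plane is full resolution while the U and V planes are half the width and half the height, so a frame occupies width * height * 3 / 2 bytes in total. A small sketch of that arithmetic:

```java
public class Yuv420SizeDemo {
    // Byte counts of the three planes of a YUV420 (e.g. NV21) frame:
    // Y is full resolution, U and V are subsampled 2x in each dimension.
    public static int[] planeSizes(int width, int height) {
        int ySize = width * height;
        int uSize = (width / 2) * (height / 2);
        int vSize = uSize;
        return new int[]{ySize, uSize, vSize};
    }

    public static void main(String[] args) {
        int[] p = planeSizes(1280, 720);
        System.out.println(p[0] + " " + p[1] + " " + (p[0] + p[1] + p[2]));
    }
}
```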

createGLProgram creates the OpenGL program and wires up the variables in the shader code:

    private void createGLProgram() {
        int programHandleMain = GLUtil.createShaderProgram();
        if (programHandleMain != -1) {
            // Use the shader program
            GLES20.glUseProgram(programHandleMain);
            // Fetch the vertex shader variables
            int glPosition = GLES20.glGetAttribLocation(programHandleMain, "attr_position");
            int textureCoord = GLES20.glGetAttribLocation(programHandleMain, "attr_tc");
            // Fetch the fragment shader variables
            int ySampler = GLES20.glGetUniformLocation(programHandleMain, "ySampler");
            int uSampler = GLES20.glGetUniformLocation(programHandleMain, "uSampler");
            int vSampler = GLES20.glGetUniformLocation(programHandleMain, "vSampler");
            // Assign the variables
            /**
             * GLES20.GL_TEXTURE0 is bound to ySampler
             * GLES20.GL_TEXTURE1 is bound to uSampler
             * GLES20.GL_TEXTURE2 is bound to vSampler
             *
             * That is, the second argument of glUniform1i is the texture unit index
             */
            GLES20.glUniform1i(ySampler, 0);
            GLES20.glUniform1i(uSampler, 1);
            GLES20.glUniform1i(vSampler, 2);

            GLES20.glEnableVertexAttribArray(glPosition);
            GLES20.glEnableVertexAttribArray(textureCoord);

            /**
             * Set the vertex shader data
             */
            squareVertices.position(0);
            GLES20.glVertexAttribPointer(glPosition, GLUtil.COUNT_PER_SQUARE_VERTICE, GLES20.GL_FLOAT, false, 8, squareVertices);
            coordVertices.position(0);
            GLES20.glVertexAttribPointer(textureCoord, GLUtil.COUNT_PER_COORD_VERTICES, GLES20.GL_FLOAT, false, 8, coordVertices);
        }
    }

createTexture creates a texture from a width, height and format:

    private void createTexture(int width, int height, int format, int[] textureId) {
        // Create the texture
        GLES20.glGenTextures(1, textureId, 0);
        // Bind the texture
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
        /**
         * {@link GLES20#GL_TEXTURE_WRAP_S} is the horizontal texture wrap mode
         * {@link GLES20#GL_TEXTURE_WRAP_T} is the vertical texture wrap mode
         *
         * {@link GLES20#GL_REPEAT}: repeat
         * {@link GLES20#GL_MIRRORED_REPEAT}: mirrored repeat
         * {@link GLES20#GL_CLAMP_TO_EDGE}: clamp to the edge
         *
         * For example, using {@link GLES20#GL_REPEAT}:
         *
         *     squareVertices      coordVertices
         *     -1.0f, -1.0f,       1.0f, 1.0f,
         *      1.0f, -1.0f,       1.0f, 0.0f,    -> same as the TextureView preview
         *     -1.0f,  1.0f,       0.0f, 1.0f,
         *      1.0f,  1.0f        0.0f, 0.0f
         *
         *     squareVertices      coordVertices
         *     -1.0f, -1.0f,       2.0f, 2.0f,
         *      1.0f, -1.0f,       2.0f, 0.0f,    -> compared with the TextureView preview, split into 4 identical previews (bottom-left, bottom-right, top-left, top-right)
         *     -1.0f,  1.0f,       0.0f, 2.0f,
         *      1.0f,  1.0f        0.0f, 0.0f
         */
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
        /**
         * {@link GLES20#GL_TEXTURE_MIN_FILTER} applies when the displayed texture is smaller than the loaded texture
         * {@link GLES20#GL_TEXTURE_MAG_FILTER} applies when the displayed texture is larger than the loaded texture
         *
         * {@link GLES20#GL_NEAREST}: use the color of the texel closest to the coordinate as the pixel color
         * {@link GLES20#GL_LINEAR}: take the closest texels and compute the pixel color as a weighted average
         */
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0, format, GLES20.GL_UNSIGNED_BYTE, null);
    }

Driving the drawing from Java code

When a frame arrives from the data source, crop it and pass it in:

    @Override
    public void onPreview(final byte[] nv21, Camera camera) {
        // Crop the specified image region
        ImageUtil.cropNV21(nv21, this.squareNV21, previewSize.width, previewSize.height, cropRect);
        // Refresh the GLSurfaceView
        roundCameraGLSurfaceView.refreshFrameNV21(this.squareNV21);
    }

The NV21 cropping code:

    /**
     * Crop NV21 data
     *
     * @param originNV21 the original NV21 data
     * @param cropNV21   the cropped NV21 output; its memory must be allocated in advance
     * @param width      the width of the original data
     * @param height     the height of the original data
     * @param left       the left edge of the cropped region in the original data
     * @param top        the top edge of the cropped region in the original data
     * @param right      the right edge of the cropped region in the original data
     * @param bottom     the bottom edge of the cropped region in the original data
     */
    public static void cropNV21(byte[] originNV21, byte[] cropNV21, int width, int height, int left, int top, int right, int bottom) {
        int halfWidth = width / 2;
        int cropImageWidth = right - left;
        int cropImageHeight = bottom - top;

        // Top-left of the Y plane in the original data
        int originalYLineStart = top * width;
        int targetYIndex = 0;

        // Top-left of the UV plane in the original data
        int originalUVLineStart = width * height + top * halfWidth;

        // Start of the UV data in the output
        int targetUVIndex = cropImageWidth * cropImageHeight;

        for (int i = top; i < bottom; i++) {
            System.arraycopy(originNV21, originalYLineStart + left, cropNV21, targetYIndex, cropImageWidth);
            originalYLineStart += width;
            targetYIndex += cropImageWidth;
            if ((i & 1) == 0) {
                System.arraycopy(originNV21, originalUVLineStart + left, cropNV21, targetUVIndex, cropImageWidth);
                originalUVLineStart += width;
                targetUVIndex += cropImageWidth;
            }
        }
    }
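The index arithmetic of this cropping routine can be exercised on a tiny frame. Below is a self-contained sketch that restates the same copy logic locally and runs it on a made-up 4x4 NV21 buffer (bytes numbered 0..23) cropped to its top-left 2x2 corner:

```java
public class CropNv21Demo {
    // Same copy logic as cropNV21 above, restated locally so it can run
    // stand-alone: copy the Y rows, and the interleaved VU rows on every
    // other source line.
    public static byte[] crop(byte[] src, int width, int height,
                              int left, int top, int right, int bottom) {
        int halfWidth = width / 2;
        int cropW = right - left;
        int cropH = bottom - top;
        byte[] out = new byte[cropW * cropH * 3 / 2];
        int yLine = top * width;
        int yOut = 0;
        int uvLine = width * height + top * halfWidth;
        int uvOut = cropW * cropH;
        for (int i = top; i < bottom; i++) {
            System.arraycopy(src, yLine + left, out, yOut, cropW);
            yLine += width;
            yOut += cropW;
            if ((i & 1) == 0) {
                System.arraycopy(src, uvLine + left, out, uvOut, cropW);
                uvLine += width;
                uvOut += cropW;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] src = new byte[24]; // 4x4 NV21: 16 Y bytes + 8 interleaved VU bytes
        for (int i = 0; i < src.length; i++) src[i] = (byte) i;
        byte[] out = crop(src, 4, 4, 0, 0, 2, 2);
        System.out.println(java.util.Arrays.toString(out));
    }
}
```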

Pass the data to the GLSurfaceView and refresh the frame:

    /**
     * Refresh the frame with NV21 data
     *
     * @param data the NV21 data
     */
    public void refreshFrameNV21(byte[] data) {
        if (rendererReady) {
            yBuf.clear();
            uBuf.clear();
            vBuf.clear();
            putNV21(data, frameWidth, frameHeight);
            dataInput = true;
            requestRender();
        }
    }

putNV21 extracts the Y, U and V components from the NV21 data:

    /**
     * Extract the Y, U and V components of NV21 data
     *
     * @param src    the NV21 frame data
     * @param width  the width
     * @param height the height
     */
    private void putNV21(byte[] src, int width, int height) {
        int ySize = width * height;
        int frameSize = ySize * 3 / 2;

        // Extract the Y component
        System.arraycopy(src, 0, yArray, 0, ySize);

        int k = 0;

        // Extract the U and V components (stored interleaved as V,U,V,U,... after the Y plane)
        int index = ySize;
        while (index < frameSize) {
            vArray[k] = src[index++];
            uArray[k++] = src[index++];
        }
        yBuf.put(yArray).position(0);
        uBuf.put(uArray).position(0);
        vBuf.put(vArray).position(0);
    }
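Assuming the standard NV21 layout (a full-resolution Y plane followed by interleaved V,U bytes), the extraction loop above can be checked stand-alone on a tiny frame:

```java
public class Nv21SplitDemo {
    // Split the interleaved VU tail of an NV21 buffer into separate
    // V and U arrays, mirroring the loop in putNV21.
    public static byte[][] splitUv(byte[] src, int width, int height) {
        int ySize = width * height;
        int frameSize = ySize * 3 / 2;
        byte[] v = new byte[ySize / 4];
        byte[] u = new byte[ySize / 4];
        int k = 0;
        int index = ySize;
        while (index < frameSize) {
            v[k] = src[index++];   // V comes first in NV21
            u[k++] = src[index++];
        }
        return new byte[][]{v, u};
    }

    public static void main(String[] args) {
        byte[] src = new byte[24]; // 4x4 frame: 16 Y bytes + 8 VU bytes
        for (int i = 0; i < src.length; i++) src[i] = (byte) i;
        byte[][] vu = splitUv(src, 4, 4);
        System.out.println(java.util.Arrays.toString(vu[0]) + " " + java.util.Arrays.toString(vu[1]));
    }
}
```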
