
How do I apply custom filters to single frames of the camera output, and show them?

What I've tried so far:

mCamera.setPreviewCallback(new CameraGreenFilter());

public class CameraGreenFilter implements Camera.PreviewCallback {

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // naive attempt to "intensify" the frame by scaling every byte
        final int len = data.length;
        for (int i = 0; i < len; ++i) {
            data[i] *= 2;
        }
    }
}
  • Although its name contains "green", I actually just want to modify the values somehow (in this case, the colors would be intensified a bit). Long story short, it does not work.

  • I figured out that the byte array 'data' is a copy of the camera output; but this doesn't really help, because I need the 'real' buffer.

  • I've heard you could implement this with OpenGL. That sounds very complicated.

Is there an easier way? Otherwise, how would this OpenGL-to-SurfaceView mapping work?


1 Answer


OK, there are several ways to do this. But there is a significant problem with performance. The byte[] from the camera is in YUV format, which has to be converted to some sort of RGB format if you want to display it. This conversion is quite an expensive operation and significantly lowers the output fps.
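
Just to illustrate what that conversion involves, here is a rough sketch of a per-pixel NV21 (YUV420SP, the default preview format) to ARGB loop; it is only meant to show how much work has to happen for every single frame:

static void decodeYUV420SP(int[] rgb, byte[] yuv, int width, int height) {
    // yuv holds one NV21 frame, rgb must have width*height entries
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & yuv[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv[uvp++]) - 128;
                u = (0xff & yuv[uvp++]) - 128;
            }
            // fixed-point YUV -> RGB conversion
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}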

It depends on what you actually want to do with the camera preview, because the best solution is to draw the camera preview without any callback and render your effects over it. That is the usual way to do augmented reality stuff.
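
In that case you don't touch the frame data at all; you just let the camera render into the SurfaceView and put a transparent View on top of it for your effects. A rough sketch (assuming mCamera and surfaceHolder are already set up, and that the overlay View sits above the SurfaceView, e.g. in a FrameLayout):

try {
    mCamera.setPreviewDisplay(surfaceHolder); // normal, hardware-accelerated preview
} catch (IOException e) {
    e.printStackTrace();
}
mCamera.startPreview();
// the effects are then drawn by the transparent overlay View in its own onDraw(Canvas)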

But if you really need to display the output manually, there are several ways to do that. Your example does not work for several reasons. First, you are not displaying the image at all. If you call this:

mCamera.setPreviewCallback(new CameraGreenFilter());
mCamera.setPreviewDisplay(null);

then your camera is not displaying the preview at all, you have to display it manually. Also, you can't do any expensive operations in the onPreviewFrame method, because the lifetime of data is limited, it gets overwritten on the next frame. One hint: use setPreviewCallbackWithBuffer, it's faster, because it reuses one buffer and does not have to allocate new memory on each frame.
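
The setup for that would look roughly like this (a sketch, assuming the default NV21 preview format and a class that implements Camera.PreviewCallback):

Camera.Parameters params = mCamera.getParameters();
Camera.Size size = params.getPreviewSize();
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
buffer = new byte[bufferSize];
mCamera.addCallbackBuffer(buffer);          // hand the buffer to the camera once
mCamera.setPreviewCallbackWithBuffer(this); // 'this' implements Camera.PreviewCallback
mCamera.startPreview();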

So you have to do something like this:

private byte[] cameraFrame;
private byte[] buffer;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    cameraFrame = data;
    // return the buffer to the camera so it can be reused for the next frame;
    // note: addCallbackBuffer(buffer) also has to be called once somewhere
    // before you call mCamera.startPreview()
    camera.addCallbackBuffer(data);
}


private ByteArrayOutputStream baos;
private YuvImage yuvimage;
private byte[] jdata;
private Bitmap bmp;
private Paint paint;

@Override //from SurfaceView
public void onDraw(Canvas canvas) {
    baos = new ByteArrayOutputStream();
    // cameraFrame is the NV21 data saved in onPreviewFrame; prevX, prevY is the preview size
    yuvimage = new YuvImage(cameraFrame, ImageFormat.NV21, prevX, prevY, null);

    // compress the whole preview frame to JPEG with quality 80
    yuvimage.compressToJpeg(new Rect(0, 0, prevX, prevY), 80, baos);
    jdata = baos.toByteArray();

    bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);

    canvas.drawBitmap(bmp, 0, 0, paint);
    invalidate(); //to call onDraw again
}

To make this work, you need to call setWillNotDraw(false) in the class constructor or somewhere.

In onDraw you can, for example, apply paint.setColorFilter(filter) if you want to modify the colors. I can post a fuller example of that, if you want.
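
As a quick sketch of one such filter (just an illustration, boosting the green channel a bit, which is roughly what the question's CameraGreenFilter was after; set it once, e.g. in the constructor, on the Paint used in onDraw):

ColorMatrix cm = new ColorMatrix();
cm.setScale(1f, 1.5f, 1f, 1f); // scale factors for R, G, B, A -> boost green by 50%
paint.setColorFilter(new ColorMatrixColorFilter(cm));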

So this will work, but the performance will be low (less than 8 fps), because BitmapFactory.decodeByteArray is slow. You can try to convert the data from YUV to RGB with native code and the Android NDK, but that's quite complicated.

The other option is to use OpenGL ES. You need a GLSurfaceView, where you bind the camera frame as a texture (in the GLSurfaceView implement Camera.PreviewCallback, so you use onPreviewFrame the same way as with a regular surface). But there is the same problem: you need to convert the YUV data. There is one shortcut - you can display only the luminance data from the preview (a greyscale image) quite fast, because in NV21 the first width*height bytes of the array are pure luminance, without any color. So in onPreviewFrame you use arraycopy to copy that luminance part of the array, and then you bind the texture like this:

gl.glGenTextures(1, cameraTexture, 0);
int tex = cameraTexture[0];
gl.glBindTexture(GL10.GL_TEXTURE_2D, tex);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_LUMINANCE,
    this.prevX, this.prevY, 0, GL10.GL_LUMINANCE,
    GL10.GL_UNSIGNED_BYTE, ByteBuffer.wrap(this.cameraFrame)); //cameraFrame holds only the Y (luminance) plane copied from the byte[] in onPreviewFrame

gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
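
The luminance copy mentioned above would look something like this (a sketch, reusing the prevX, prevY and cameraFrame names from the snippets above):

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (cameraFrame == null) {
        cameraFrame = new byte[prevX * prevY]; // Y plane only
    }
    // in NV21 the first prevX*prevY bytes of data are the luminance (Y) plane
    System.arraycopy(data, 0, cameraFrame, 0, prevX * prevY);
    camera.addCallbackBuffer(data); // give the buffer back to the camera
}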

You can get about 16-18 fps this way and you can use OpenGL to make some filters. I can send you some more code for this if you want, but it's too long to put in here...

For some more info, you can see my similar question, but there is not a good solution there either...

  • You say the YUV data needs to be converted to RGB data to be displayable and this would give just around 8 fps. What is the Android framework doing to get such a high display rate (no filters applied)? Do you think they are wrapping everything to OpenGL ES to get around almost 20 fps? :) – poitroae Jan 02 '12 at 14:42
  • More likely, they're not using the Android API :) It might be written in native code or something like that (it's just a guess)... But Android is open source, so you can try to find out how their code actually works :) – Jaa-c Jan 02 '12 at 16:59
  • Jaa-c, I was wondering if it'd be at all possible for you to post an example using the code you've written above (not the OpenGL). I can't figure out how to add it all together to work correctly, and I understand this is an older question, but there isn't much I can find on Google about this. Or can you make a mini tutorial on what classes to make to implement this? – Justin Warner Apr 24 '13 at 02:46
  • @JustinWarner: **Basic** idea: https://gist.github.com/Jaa-c/5456278 but you should at least process the data in another thread. – Jaa-c Apr 24 '13 at 23:00
  • Note that it's possible to get color display with OpenGL, using a fragment shader, e.g. http://www.fourcc.org/source/YUV420P-OpenGL-GLSLang.c. This code must be modified to account for **NV21** preview format with **Cb** and **Cr** interleaved (instead of **YUV420P** in the sample). – Alex Cohn Mar 20 '14 at 08:42
  • @Jaa-c Hi, I know this is an old thread, but could you provide your code that does the fragment shader on the camera's preview? – Nativ Jul 31 '14 at 14:40
  • I am having an issue with blending on the camera preview, can anyone help? http://stackoverflow.com/questions/26828344/additive-blending-without-glclear – gitesh.tyagi Nov 10 '14 at 18:37
  • Also see my questions http://stackoverflow.com/questions/29645950/how-can-i-add-thermal-effect-to-yuv-image and http://stackoverflow.com/questions/29649137/how-to-modify-rgb-pixel-of-an-bitmap-to-look-different – Zar E Ahmer Apr 15 '15 at 15:04
  • @Jaa-c, have you seen my questions? I have created the night vision effect; only the thermal effect has to be implemented. – Zar E Ahmer Apr 20 '15 at 08:39