
I am trying to implement functionality that takes pictures while recording video, which is why I concluded I should use the screenshot approach on the SurfaceView.
However, when I try to take a screenshot of the SurfaceView, I always get a blank image.

Here is the code I am using to take the snapshot:

View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....

In case you think this is a duplicate question, let me assure you that I tried the following solutions posted on SO for the same problem before asking this one:

  1. https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
  2. Facing issue to take a screenshot while recording a video
  3. Take camera screenshot while recording - Like in Galaxy S3?
  4. Taking screen shot of a SurfaceView in android
  5. Get screenshot of surfaceView in Android (this is the closest answer, but only a partial one; I have already asked @sajar to expand on it)

Other resources on the Internet:

  1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
  2. http://www.phonesdevelopers.com/1795894/

None of these has worked for me so far. I also understand that I need to create some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.

Any help is highly appreciated.

Salman Khakwani

2 Answers


Here's another one: Take screenshot of SurfaceView.

SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.

The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV to RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.
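To make the cost concrete, here is what a software YUV-to-RGB conversion involves per pixel. This is a hand-rolled sketch using the standard BT.601 integer formulas (the class and method names are my own, not from Grafika); on a device this per-pixel arithmetic is exactly what the GPU path does for you in hardware:

```java
public class Nv21ToArgb {
    // Convert one NV21 (YCrCb) pixel to packed ARGB using BT.601 integer math.
    // A full frame repeats this width*height times, which is why the software
    // path is so much slower than letting GLES sample the YUV buffer directly.
    static int yuvToArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    static int clamp(int x) { return x < 0 ? 0 : (x > 255 ? 255 : x); }

    public static void main(String[] args) {
        // A neutral pixel (U = V = 128) must come out gray: equal R, G, B.
        int argb = yuvToArgb(128, 128, 128);
        int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
        System.out.println(r == g && g == b); // prints true
    }
}
```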

Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save as PNG.
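One detail to be aware of: glReadPixels() returns rows bottom-up, because GL's origin is the bottom-left corner, so the captured frame has to be flipped vertically before it matches Bitmap's top-down row order. A plain-Java sketch of that flip on a packed-pixel array (the method name here is my own, for illustration):

```java
public class FlipRows {
    // glReadPixels() delivers the bottom row of the frame first; Bitmap
    // expects the top row first, so swap rows end-for-end.
    static int[] flipVertically(int[] pixels, int width, int height) {
        int[] out = new int[pixels.length];
        for (int row = 0; row < height; row++) {
            System.arraycopy(pixels, row * width,
                             out, (height - 1 - row) * width, width);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] frame = {1, 2, 3, 4, 5, 6};        // 2x3 image: rows {1,2}, {3,4}, {5,6}
        int[] flipped = flipVertically(frame, 2, 3);
        System.out.println(java.util.Arrays.toString(flipped)); // [5, 6, 3, 4, 1, 2]
    }
}
```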

More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.

fadden
  • Thanks for this brilliant response =) Grafika is a great Project for learning. – Salman Khakwani Oct 13 '14 at 10:02
  • 1
    I am unable to find the method `glReadPixels()` is it in the same Activity 'Texture from Camera' ? – Salman Khakwani Oct 13 '14 at 10:05
  • 1
    `glReadPixels()` is called in `EglSurfaceBase#saveFrame()`. See https://github.com/google/grafika/blob/master/src/com/android/grafika/gles/EglSurfaceBase.java#L157 – fadden Oct 13 '14 at 14:34
  • Do i need to change the `SurfaceView` component to `GLSurfaceView` component and re-implement the Camera Previewing using Textures in `GLSurfaceView` ? – Salman Khakwani Nov 28 '14 at 08:53
  • 1
    GLSurfaceView is just a SurfaceView with some helper classes (which, in some cases, can be more of a bother than helpful). You can do anything in SurfaceView that you can in GLSurfaceView; it's mostly a matter of having to do your own EGL management, but the gles library in Grafika shows how to do that. If you want to show the camera preview and record it at the same time, you will need to send the preview to a SurfaceTexture and render it twice (note "continuous capture" may be more relevant than "show + capture camera" for SurfaceView). – fadden Nov 28 '14 at 16:53
  • 1
    @SalmanMuhammadAyub: I'm searching for a solution to the issue of capturing an image through SurfaceView content, & I can see your posts/comments at most of the questions I reach on StackOverflow on this. So, I wanted to know if you were able to achieve this? If yes, can you help me out as well? – Kiran Parmar Feb 01 '15 at 10:56
This approach does not read the SurfaceView itself; it keeps the latest camera preview frame from Camera.PreviewCallback and converts it to a Bitmap with YuvImage in getBitmap():
public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback  {

static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;

PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.camera);

    surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
    surfaceHolder = surfaceView.getHolder();
    Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
    imgScreen = (ImageView)findViewById(R.id.imgScreen);



    btnTakeScreen.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            Bitmap screen = Bitmap.createBitmap(getBitmap());
            imgScreen.setImageBitmap(screen);
        }
    });


    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    surfaceHolder.addCallback(this);

    // deprecated setting, but required on Android versions prior to 3.0
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

    jpegCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            FileOutputStream outStream = null;
            try {
                outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
                outStream.write(data);
                outStream.close();
                Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
            Toast.makeText(getApplicationContext(), "Picture Saved", Toast.LENGTH_LONG).show();
            refreshCamera();
        }
    };
}






public void refreshCamera() {
    if (surfaceHolder.getSurface() == null) {
        // preview surface does not exist
        return;
    }

    // stop preview before making changes
    try {
        camera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }

    // set preview size and make any resize, rotate or
    // reformatting changes here
    // start preview with new settings
    try {
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Now that the size is known, set up the camera parameters and begin
    // the preview.
    refreshCamera();
}

public void surfaceCreated(SurfaceHolder holder) {


    if (camera == null) {
        try {
            camera = Camera.open();
        } catch (RuntimeException ignored) {
        }
    }

    try {
        if (camera != null) {
            camera.setPreviewDisplay(surfaceHolder);
        }
    } catch (Exception e) {
        if (camera != null)
            camera.release();
        camera = null;
    }

    if (camera == null) {
        return;
    } else {
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] bytes, Camera camera) {
                if (param == null) {
                    return;
                }
                byteArray = bytes;
            }
        });
    }





    param = camera.getParameters();
    mPreviewSize = param.getSupportedPreviewSizes().get(0);

    param.setColorEffect(Camera.Parameters.EFFECT_NONE);

    //set antibanding to none
    if (param.getAntibanding() != null) {
        param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
    }

    // set white ballance
    if (param.getWhiteBalance() != null) {
        param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
    }

    //set flash
    if (param.getFlashMode() != null) {
        param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
    }

    //set zoom
    if (param.isZoomSupported()) {
        param.setZoom(0);
    }

    //set focus mode
    param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);


    // modify parameter
    camera.setParameters(param);
    try {
        // The Surface has been created, now tell the camera where to draw
        // the preview.
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        // check for exceptions
        System.err.println(e);
        return;
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    // stop preview and release camera
    camera.stopPreview();
    camera.release();
    camera = null;
}



public Bitmap getBitmap() {
    try {
        // Bail out until the first preview frame has arrived.
        if (param == null || mPreviewSize == null || byteArray == null)
            return null;

        int format = param.getPreviewFormat();
        YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();

        Log.i("myLog", "preview frame bytes: " + byteArray.length);



        Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);

        yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPurgeable = true;
        options.inInputShareable = true;
        mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);

        byteArrayOutputStream.flush();
        byteArrayOutputStream.close();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }

    return mBitmap;
}
}
mojtaba
  • Hello, please provide some description to your answer. Just posting code is not very helpful. – Hristo Aug 28 '18 at 12:30