I'm using a SurfaceTexture to run the camera of a device in the background and grab frames from it. I see that the onFrameAvailable callback doesn't deliver a frame, only the SurfaceTexture itself. I want to get the frame, or an image representation of it, but I'm not sure how to do that. When I searched I found that I would need to use GraphicBuffer, but that seems too complicated and it's not clear to me how to use it.
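To make it concrete, what I'd ideally end up with is a conversion like the sketch below from a raw preview frame to a Bitmap (nv21ToBitmap is just a name I made up, and it assumes the raw bytes are in the default NV21 preview format); the part I'm missing is how to get those raw bytes out of the SurfaceTexture in the first place:

// Sketch only: turn one raw NV21 preview frame into a Bitmap.
// Uses android.graphics.{YuvImage, ImageFormat, Rect, Bitmap, BitmapFactory}
// and java.io.ByteArrayOutputStream.
private Bitmap nv21ToBitmap(byte[] nv21, int width, int height) {
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 90, jpegStream);
    byte[] jpeg = jpegStream.toByteArray();
    return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
}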
I've also looked at solutions here:
Texture Image processing on the GPU?
Android SDK: Get raw preview camera image without displaying it
But it's still not clear to me how to translate them into code. Here is my code:
public class BackgroundService extends Service {

    private Camera camera = null;
    private static final int NOTIFICATION_ID = 1;
    private static final String TAG = "OCVSample::Activity";

    // Binder given to clients.
    private final IBinder mBinder = new LocalBinder();

    private WindowManager windowManager;

    // The texture name (10) is an arbitrary id; no GL context is created anywhere in this service.
    private SurfaceTexture mSurfaceTexture = new SurfaceTexture(10);

    Intent initializerIntent;

    public class LocalBinder extends Binder {
        BackgroundService getService() {
            // Return this instance of the service so clients can call public methods.
            return BackgroundService.this;
        }
    } // end inner class that returns an instance of the service.

    @Override
    public IBinder onBind(Intent intent) {
        initializerIntent = intent;
        return mBinder;
    } // end onBind.

    @Override
    public void onCreate() {
        Log.i(TAG, "onCreate is called");
        // Start as a foreground service to avoid being killed unexpectedly.
        startForeground(NOTIFICATION_ID, buildNotification());

        Thread thread = new Thread() {
            public void run() {
                // Camera id 1 is typically the front-facing camera.
                camera = Camera.open(1);
                mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
                    @Override
                    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                        Log.i(TAG, "frame captured from texture");
                        if (camera != null) {
                            //camera.setPreviewCallbackWithBuffer(null);
                            //camera.setPreviewCallback(null);
                            //camera.setOneShotPreviewCallback(null);
                            // If the preview is not restarted here, many frames are dropped.
                            camera.stopPreview();
                            camera.startPreview();
                        }
                    }
                });
                // Set the camera's preview texture to the SurfaceTexture created above.
                try {
                    camera.setPreviewTexture(mSurfaceTexture);
                } catch (IOException e) {
                    Log.e(TAG, "Error in setting the camera surface texture");
                }
                camera.startPreview();
            }
        };
        thread.start();
    }

    private Notification buildNotification() {
        NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder(this);
        notificationBuilder.setOngoing(true); // this notification should be ongoing
        notificationBuilder.setContentTitle(getString(R.string.notification_title))
                .setContentText(getString(R.string.notification_text_and_ticker))
                .setSmallIcon(R.drawable.vecsat_logo)
                .setTicker(getString(R.string.notification_text_and_ticker));
        return notificationBuilder.build();
    }

    @Override
    public void onDestroy() {
        Log.i(TAG, "onDestroy is called");
        camera.stopPreview();
        //camera.lock();
        camera.release();
        mSurfaceTexture.detachFromGLContext();
        mSurfaceTexture.release();
        stopService(initializerIntent);
        //windowManager.removeView(surfaceView);
    }
}
How can I get the frames and process them, whether on the GPU or the CPU? If there is a way to do the processing on the GPU and then read the results back from there, that would be even better, since it seems more efficient.
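For what it's worth, the only CPU-side route I can think of is to keep the SurfaceTexture as the preview target but also register a preview callback with a buffer, roughly like the sketch below (it assumes the default NV21 preview format and reuses the camera, mSurfaceTexture and TAG fields from the service above, so it would replace the body of the thread's run() method). I don't know whether this defeats the purpose of using the SurfaceTexture, or whether the GPU route is better:

// CPU-side sketch: keep the SurfaceTexture as the preview target, but also
// register a buffered preview callback so raw NV21 frames arrive on the CPU.
camera = Camera.open(1);
Camera.Size size = camera.getParameters().getPreviewSize();
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
camera.addCallbackBuffer(new byte[bufferSize]);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // data holds one NV21 frame; process it here (e.g. with the
        // nv21ToBitmap sketch above), then hand the buffer back.
        cam.addCallbackBuffer(data);
    }
});
try {
    camera.setPreviewTexture(mSurfaceTexture);
} catch (IOException e) {
    Log.e(TAG, "Error in setting the camera surface texture");
}
camera.startPreview();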
Thank you.