I own a Samsung Galaxy S3, which is capable of capturing pictures of roughly 3000 x 2000 pixels. I am currently developing an application that requires capturing pictures; I use my phone as the debugging device and set the largest picture size the device offers.
However, with this setting the onPictureTaken callback throws an OutOfMemoryError on its very first line, in the BitmapFactory.decodeByteArray call where I try to decode the captured bytes into a bitmap. If I set BitmapFactory.Options.inSampleSize = 2, no out-of-memory error occurs.
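For context, here is roughly what the callback looks like (a simplified sketch, not my exact code; variable names are placeholders):

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.hardware.Camera;

    Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            BitmapFactory.Options options = new BitmapFactory.Options();

            // options.inSampleSize = 1; // full resolution -> OutOfMemoryError here
            options.inSampleSize = 2;    // half width/height -> decodes fine

            Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length, options);

            // ... use or save the bitmap ...
            camera.startPreview();
        }
    };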
I want my application to capture at the best resolution the device offers. The device's own camera application manages this, but mine can't, and I don't understand why. How can I overcome this problem?