
Hi, I am trying to send a BufferedImage from my Java application through a TCP socket to an Android device. I currently get the raster as a byte[] from the BufferedImage and then ship it through a plain OutputStream to the device. This works fine and I get the same byte array on the Android side. When I call BitmapFactory.decodeByteArray(), however, I only get null.

Here is the code I use to send my picture in Java. The image type of the BufferedImage is TYPE_4BYTE_ABGR.

byte[] imgBytes = ((DataBufferByte) msg.getImage().getData().getDataBuffer()).getData();

byte[] lineBytes = (String.valueOf(imgBytes.length) + '\n').getBytes();
out.write(lineBytes);
out.write(imgBytes);
out.write((int) '\n');
out.flush();

The first thing I write out is the size of the image so I know how big to make the byte[] on Android.

Here's the code I'm trying to use to create the Android Bitmap.

currLine = readLine(in);
int imgSize = Integer.parseInt(currLine);
byte[] imgBytes = new byte[imgSize];
in.read(imgBytes);
BitmapFactory.Options imgOptions = new BitmapFactory.Options();
imgOptions.inPreferredConfig = Bitmap.Config.ARGB_4444;

Bitmap img = BitmapFactory.decodeByteArray(imgBytes, 0, imgSize, imgOptions);
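
For reference, readLine above is a helper that is not shown in the question; a minimal sketch of what such a helper might look like (purely illustrative, reading ASCII bytes up to '\n') is:

// Hypothetical helper, not the asker's actual code: reads single bytes until a
// newline and returns the accumulated ASCII characters as a String.
private static String readLine(InputStream in) throws IOException {
    StringBuilder line = new StringBuilder();
    int b;
    while ((b = in.read()) != -1 && b != '\n') {
        line.append((char) b);
    }
    return line.toString();
}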

The bytes arrive fine; they just don't work for the Bitmap.

Tyler Helmuth
  • You probably don't want to *decode* the array, as the bytes are not encoded in the first place. It's just "raw" pixels. You also need to pass height and width to properly reconstruct the image. Try one of the `Bitmap.createBitmap` methods. – Harald K Apr 06 '15 at 16:04

2 Answers


To elaborate on the suggestion I made in the comment:

On the Java/server side, send the image's width and height (if you know your image's type is always TYPE_4BYTE_ABGR, you don't need anything else):

BufferedImage image = msg.getImage();
byte[] imgBytes = ((DataBufferByte) image.getData().getDataBuffer()).getData();

// Using DataOutputStream for simplicity
DataOutputStream data = new DataOutputStream(out);

data.writeInt(image.getWidth());
data.writeInt(image.getHeight());
data.write(imgBytes);

data.flush();
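
Here out is assumed to be whatever stream you were already writing the image to; if you write straight to the socket, something like the following is implied (not part of the original answer, with socket a hypothetical java.net.Socket):

// Assumption: 'out' is the TCP socket's output stream
OutputStream out = socket.getOutputStream();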

Now you can convert the interleaved ABGR byte array to packed int ARGB on either the server side or the client side; it does not really matter. I'll show the conversion on the Android/client side for simplicity:

// Read image data
DataInputStream data = new DataInputStream(in);
int w = data.readInt();
int h = data.readInt();
byte[] imgBytes = new byte[w * h * 4]; // 4 byte ABGR
data.readFully(imgBytes);

// Convert 4 byte interleaved ABGR to int packed ARGB
int[] pixels = new int[w * h];
for (int i = 0; i < pixels.length; i++) {
    int byteIndex = i * 4;
    pixels[i] = 
            ((imgBytes[byteIndex    ] & 0xFF) << 24) 
          | ((imgBytes[byteIndex + 3] & 0xFF) << 16) 
          | ((imgBytes[byteIndex + 2] & 0xFF) <<  8) 
          |  (imgBytes[byteIndex + 1] & 0xFF);
} 

// Finally, create bitmap from packed int ARGB, using ARGB_8888
Bitmap bitmap = Bitmap.createBitmap(pixels, w, h, Bitmap.Config.ARGB_8888);
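
If you would rather do the conversion on the Java/server side instead (as noted above, either side works), a rough sketch is shown below. It is not part of the original answer and reuses imgBytes and data from the server snippet; the client would then read w * h packed ints rather than raw bytes:

// Server-side variant (sketch): convert interleaved ABGR to packed ARGB and send ints
for (int i = 0; i < imgBytes.length; i += 4) {
    int argb = ((imgBytes[i    ] & 0xFF) << 24)  // A
             | ((imgBytes[i + 3] & 0xFF) << 16)  // R
             | ((imgBytes[i + 2] & 0xFF) <<  8)  // G
             |  (imgBytes[i + 1] & 0xFF);        // B
    data.writeInt(argb); // DataOutputStream writes big-endian ints
}
data.flush();

On the Android side you would then read w * h ints with data.readInt() into an int[] and pass that array directly to Bitmap.createBitmap(pixels, w, h, Bitmap.Config.ARGB_8888), skipping the per-pixel loop there.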

If you really want ARGB_4444, you can convert the bitmap, but note that the constant is deprecated in all recent versions of the Android API.
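
For example, using Bitmap.copy, which returns a new bitmap in the requested config (a sketch, not from the original answer):

// Only do this if you really need the smaller, deprecated ARGB_4444 config
Bitmap argb4444 = bitmap.copy(Bitmap.Config.ARGB_4444, false); // false = immutable copy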

Harald K
  • Thank you for the answer; it cleared up some things I was confused about. This works better in the sense that it works the whole time. Unfortunately, the resulting image does not work at all. All of the colors are completely off. I can kind of see the resulting shape in the image, but I don't think the shifts are working correctly, although I sketched some stuff out and it definitely looks like they should be correct. – Tyler Helmuth Apr 06 '15 at 23:32
  • I got it to work for red and green by changing the index inside the pixel for loop to be imgBytes[(i*4)] for all of the indexes, since we need to get bytes 4-7 for pixels[1] instead of 1-4. But whenever I take a picture with blue in it, it gets completely corrupted to yellow. I'm doing some manual debugging of the byte values but not finding much. – Tyler Helmuth Apr 07 '15 at 00:17
  • Ok, it's working now. I just realized that I had to mask each of the bytes we were ORing down to its 8 significant bits by using & 0xFF on each byte. So the final loop statement is pixels[i] = ((imgBytes[byteIndex] & 0xFF) << 24) | ((imgBytes[byteIndex + 3] & 0xFF) << 16) | ((imgBytes[byteIndex + 2] & 0xFF) << 8) | (imgBytes[byteIndex + 1] & 0xFF); This works perfectly!! Thanks so much. – Tyler Helmuth Apr 07 '15 at 00:29
  • @TylerHelmuth Sorry about the bugs! Updated sample code now, to work as expected. Glad to hear it works! – Harald K Apr 07 '15 at 07:53
  • @haraldK Can you help me with [this](https://stackoverflow.com/questions/56513068/convert-android-bitmap-to-byte-array-not-producing-expected-results)? – Phenomenal One Jun 09 '19 at 08:35

imgSize should be the size of the image. Why not try imgBytes.length?

Ananth