EDIT 3:

    int rRows = result.length;
    int rColums = result[0].length;
    BufferedImage img = new BufferedImage(rColums, rRows, BufferedImage.TYPE_BYTE_GRAY);
    for (int r = 0; r < rRows; r++) {
        for (int t = 0; t < result[r].length; t++) {
            img.setRGB(t, r, result[r][t]);
        }
    }
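For comparison, here is a minimal sketch of writing the signed values back without going through setRGB at all (the 2x2 `result` array here is hypothetical stand-in data, not the real image). Masking with `& 0xFF` reinterprets each signed value as unsigned 0–255, and writing through the raster with setSample avoids the sRGB color conversion that setRGB performs on a TYPE_BYTE_GRAY image:

```java
import java.awt.image.BufferedImage;
import java.awt.image.WritableRaster;

public class GraySketch {
    public static void main(String[] args) {
        // Hypothetical 2x2 data standing in for the extracted int[][].
        int[][] result = { { -128, -1 }, { 0, 127 } };
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_BYTE_GRAY);
        WritableRaster raster = img.getRaster();
        for (int r = 0; r < result.length; r++) {
            for (int t = 0; t < result[r].length; t++) {
                // Mask to 0..255 and write the gray sample directly,
                // bypassing setRGB's sRGB-to-gray conversion.
                raster.setSample(t, r, 0, result[r][t] & 0xFF);
            }
        }
        System.out.println(raster.getSample(0, 0, 0)); // 128
        System.out.println(raster.getSample(1, 0, 0)); // 255
    }
}
```

Reading the samples back through the raster returns the same unsigned values that were written, so -128 round-trips as 128 and -1 as 255.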
EDIT 2:

Created the image like so:

    BufferedImage img = new BufferedImage(rColums, rRows, BufferedImage.TYPE_BYTE_GRAY);
    private static int[][] convertToArray(BufferedImage inputImage) {
        final byte[] pixels = ((DataBufferByte) inputImage.getRaster().getDataBuffer()).getData();
        final int width = inputImage.getWidth();
        final int height = inputImage.getHeight();
        System.out.println("height " + height + " width " + width);
        int[][] result = new int[height][width];
        for (int pixel = 0, row = 0, col = 0; pixel < pixels.length; pixel++) {
            int argb = (int) pixels[pixel];
            result[row][col] = argb;
            col++;
            if (col == width) {
                col = 0;
                row++;
            }
        }
        return result;
    }
Edit:

I've realized that what I'm really asking is how to go from signed grayscale to unsigned grayscale. As I said, adding 256 didn't work for me, and it would also still seem to leave the image too dark, since it won't raise the positive signed values (up to +127) any higher toward 255. (Hopefully I've expressed that correctly.)
As per the title, I have an int[][] extracted from a BufferedImage via

    ((DataBufferByte) inputImage.getRaster().getDataBuffer()).getData()

The array values range from -128 to 127. The problem is that when I attempt to reconstruct the image by passing the int[][] back to a BufferedImage, it comes out too dark, more like a black-and-white (mostly black) image.

I saw a suggestion to add 256 to each sub-zero value of the byte[] produced by the DataBufferByte while converting the byte[] to int[][], but this actually produces a totally black image, and I don't really get the logic of it. Wouldn't you want to shift the entire scale over by 128, rather than just the negative numbers?
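For what it's worth, the arithmetic behind that suggestion can be checked directly: adding 256 only to the negative values is exactly equivalent to masking with `& 0xFF`, i.e. reinterpreting the byte's two's-complement bits as an unsigned 0–255 value. It is not a brightness shift; a flat +128 shift of the whole scale would remap the gray levels instead. A small self-contained check:

```java
public class UnsignedByteSketch {
    public static void main(String[] args) {
        byte[] samples = { -128, -1, 0, 127 };
        for (byte b : samples) {
            int addToNegatives = b < 0 ? b + 256 : b; // the suggested fix
            int masked = b & 0xFF;                    // equivalent bit mask
            System.out.println(b + " -> " + addToNegatives + " / " + masked);
        }
    }
}
```

Both columns agree for every value: -128 becomes 128, -1 becomes 255, while 0 and 127 are unchanged. So if that conversion still produces a black image, the problem lies elsewhere (for instance in how the values are written back into the BufferedImage), not in the signed-to-unsigned mapping itself.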