
I have an Image that was loaded from camera roll or any other source, usually local one.

How can I access its pixel data map to perform some calculations or measurements?

cbr
zamuka
  • Did you ever find a solution to this? – James Feb 16 '17 at 23:40
  • I think you could draw the image to a canvas, then use `getImageData()` to get the data map: https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/getImageData. Although, it's not a React-based solution. – J. Titus Mar 24 '17 at 17:20

2 Answers


There is a way to get this information using native modules, but at the moment I only have an Android implementation. Tested on RN 0.42.3. First of all, you'll need to create a native module in your app. Assuming the application is initialized with the name SampleApp, create a new directory in your React Native project, android/app/src/main/java/com/sampleapp/bitmap, with two files in it:

android/app/src/main/java/com/sampleapp/bitmap/BitmapReactPackage.java

package com.sampleapp;

import com.facebook.react.ReactPackage;
import com.facebook.react.bridge.JavaScriptModule;
import com.facebook.react.bridge.NativeModule;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.uimanager.ViewManager;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class BitmapReactPackage implements ReactPackage {

  @Override
  public List<Class<? extends JavaScriptModule>> createJSModules() {
    return Collections.emptyList();
  }

  @Override
  public List<ViewManager> createViewManagers(ReactApplicationContext reactContext) {
    return Collections.emptyList();
  }

  @Override
  public List<NativeModule> createNativeModules(
                              ReactApplicationContext reactContext) {
    List<NativeModule> modules = new ArrayList<>();

    modules.add(new BitmapModule(reactContext));

    return modules;
  }

}

android/app/src/main/java/com/sampleapp/bitmap/BitmapModule.java

package com.sampleapp;

import com.facebook.react.bridge.NativeModule;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.ReactMethod;
import com.facebook.react.bridge.Promise;
import com.facebook.react.bridge.WritableNativeArray;
import com.facebook.react.bridge.WritableNativeMap;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

import java.io.IOException;

public class BitmapModule extends ReactContextBaseJavaModule {

  public BitmapModule(ReactApplicationContext reactContext) {
    super(reactContext);
  }

  @Override
  public String getName() {
    return "Bitmap";
  }

  @ReactMethod
  public void getPixels(String filePath, final Promise promise) {
    try {
      WritableNativeMap result = new WritableNativeMap();
      WritableNativeArray pixels = new WritableNativeArray();

      Bitmap bitmap = BitmapFactory.decodeFile(filePath);
      if (bitmap == null) {
        promise.reject("Failed to decode. Path is incorrect or image is corrupted");
        return;
      }

      int width = bitmap.getWidth();
      int height = bitmap.getHeight();

      boolean hasAlpha = bitmap.hasAlpha();

      // Iterate rows first so the array ends up in row-major order,
      // matching the `width * y + x` offset used on the JS side.
      for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
          int color = bitmap.getPixel(x, y);
          // Zero-pad to 8 hex digits so every pixel is a full AARRGGBB string.
          String hex = String.format("%08x", color);
          pixels.pushString(hex);
        }
      }

      result.putInt("width", width);
      result.putInt("height", height);
      result.putBoolean("hasAlpha", hasAlpha);
      result.putArray("pixels", pixels);

      promise.resolve(result);

    } catch (Exception e) {
      promise.reject(e);
    }

  }

}

As you can see in the second file, there is a getPixels method, which will be available from JS as part of the Bitmap native module. It accepts a path to an image file and decodes the image into an internal Bitmap type, which allows reading individual pixels. All image pixels are read one by one and saved to an array as hex strings (because React Native does not allow passing raw hex values through the bridge). Each hex string has 8 characters, 2 per ARGB channel: the first two characters are the alpha channel, the next two red, then green, and the last two blue. For example, ffffffff is white and ff0000ff is blue. For convenience, the image width, height, and presence of an alpha channel are returned along with the pixel array. The method returns a promise that resolves with an object like:

{
  width: 1200,
  height: 800,
  hasAlpha: false,
  pixels: ['ffffffff', 'ff00ffff', 'ffff00ff', ...]
}
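To work with the channels in JS, each hex string can be split back into its ARGB components. Here's a sketch; `parsePixel` is a hypothetical helper, not part of the module:

```javascript
// Split an 8-character AARRGGBB hex string into numeric channels.
function parsePixel(hex) {
  return {
    a: parseInt(hex.slice(0, 2), 16),
    r: parseInt(hex.slice(2, 4), 16),
    g: parseInt(hex.slice(4, 6), 16),
    b: parseInt(hex.slice(6, 8), 16),
  };
}

parsePixel('ff0000ff'); // → { a: 255, r: 0, g: 0, b: 255 } (blue)
```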

The native module also has to be registered in the app. Modify android/app/src/main/java/com/sampleapp/MainApplication.java and add the new package there:

@Override
protected List<ReactPackage> getPackages() {
  return Arrays.<ReactPackage>asList(
      new MainReactPackage(),
      new BitmapReactPackage() // <---
  );
}

How to use from JS:

import { NativeModules } from 'react-native';

const imagePath = '/storage/emulated/0/Pictures/blob.png';

NativeModules.Bitmap.getPixels(imagePath)
  .then((image) => {
    console.log(image.width);
    console.log(image.height);
    console.log(image.hasAlpha);


    for (let x = 0; x < image.width; x++) {
      for (let y = 0; y < image.height; y++) {
        // Pixels are stored row by row, so index with `width * y + x`
        const offset = image.width * y + x;
        const pixel = image.pixels[offset];
        // ... perform measurements with `pixel` here
      }
    }
  })
  .catch((err) => {
    console.error(err);
  });

I should mention that it performs pretty slowly, most likely because of transferring a huge array through the bridge.

Michael Radionov
  • Thanks for your answer! Looks like native modules are the way to go. By the way, the [Android API](https://developer.android.com/reference/android/graphics/Color.html) mentions that the colors are encoded in ARGB order, not RGBA. You mentioned that "hex values" are not supported. Are integer arrays *really* not supported? Not that it matters, any image processing should probably be done on the native side anyways. – cbr Mar 25 '17 at 22:50
  • @cubrr, you are correct, I did not pay enough attention to it. Colors are indeed ARGB. Regarding hex: I tried to pass the color value which is returned directly from the call `bitmap.getPixel(x, y)` through the bridge, but the int values I got were not very informative to me, just a bunch of negative ints. After executing `pixel.toString(16)` in JS it got better. For example: `(-4153434).toString(16) = "-3f605a"`. I guess it's up to you to decide which format you prefer. – Michael Radionov Mar 26 '17 at 10:06
  • Ah, right, Java doesn't have unsigned numbers so of course they might be negative. What might work is passing thru longs: `long color = bitmap.getPixel(x, y) & 0xFFFFFFFFL` – cbr Mar 26 '17 at 13:33
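As the comment thread suggests, if raw ints from `getPixel` were sent across the bridge they would arrive in JS as signed 32-bit values; they can be normalized back to an unsigned AARRGGBB string on the JS side. A sketch (`toArgbHex` is a hypothetical helper):

```javascript
// Reinterpret a signed 32-bit ARGB int as unsigned (>>> 0)
// and format it as an 8-digit hex string.
function toArgbHex(color) {
  return (color >>> 0).toString(16).padStart(8, '0');
}

toArgbHex(-4153434); // → 'ffc09fa6'
```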

Thanks Michael. It seems that today it's much easier:

import { RNCamera } from 'react-native-camera';

// Render the camera and keep a ref to it
<RNCamera ref={ref => { this.camera = ref; }} style={styles.preview} />

// Set the options for the camera
const options = {
    base64: true
};
// Get the base64 version of the image
const data = await this.camera.takePictureAsync(options);
zamuka