
I am trying to render the camera preview from a Java Android plugin onto a Unity texture. The Android plugin runs in the background, so there is no view or activity. The only way to grab camera data without a view is to use a SurfaceTexture (see this question).

For performance reasons, my plan is to render the SurfaceTexture directly onto a Unity texture from the Android code using OpenGL. I don't get any compile or runtime errors, but the texture inside Unity just stays white/black.

Here is my code:

Android library (CameraFeed.java):

import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

import java.io.IOException;

public class CameraFeed implements SurfaceTexture.OnFrameAvailableListener {

    private SurfaceTexture mTexture;
    private Camera mCamera;
    private int unityTextureID;

    public CameraFeed() {

        int[] textures = new int[1];
        // Reserve an ID for the GL texture
        GLES20.glGenTextures(1, textures, 0);
        unityTextureID = textures[0];

        // Bind the texture as an external (EXTERNAL_OES) texture
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, unityTextureID);

        // Create an Android SurfaceTexture backed by the Unity texture ID
        mTexture = new SurfaceTexture(unityTextureID);

        // Create the camera and open it
        mCamera = Camera.open();
        mCamera.startPreview();
        try {
            // Route the camera preview frames into the SurfaceTexture
            mCamera.setPreviewTexture(mTexture);
        } catch (IOException e) {
            e.printStackTrace();
        }

        // onFrameAvailable of this class is called whenever a new frame is ready
        mTexture.setOnFrameAvailableListener(this);
    }

    // Expose the handle to the texture to the Unity scripts
    public int getUnityTextureID() {
        return unityTextureID;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // Update the texture when a new frame arrives
        surfaceTexture.updateTexImage();
    }
}

Unity C# script (CameraFeed.cs):

using System;
using UnityEngine;

public class CameraFeed : MonoBehaviour {

    private Texture2D nativeTexture;
    private AndroidJavaObject javaCameraFeed;

    void Start () {
        javaCameraFeed = new AndroidJavaObject("com.oberloestern.grabhuegel.arlibs3.CameraFeed");

        // Ask the plugin for the GL texture ID it generated
        Int32 texPtr = javaCameraFeed.Call<Int32>("getUnityTextureID");
        Debug.Log("texture pointer? " + texPtr);

        // Wrap the native GL texture in a Unity Texture2D
        nativeTexture = Texture2D.CreateExternalTexture(128, 128, TextureFormat.RGBA32, false, false, new IntPtr(texPtr));
    }

    void Update () {

    }

    void OnGUI() {
        // Draw the texture from Android in the middle of the screen
        GUI.DrawTexture(new Rect(50, 50, Screen.width - 100, Screen.height - 100), nativeTexture);
    }
}

I already managed to send a bitmap texture to Unity using GLES20.GL_TEXTURE_2D, but a SurfaceTexture has to be bound via GLES11Ext.GL_TEXTURE_EXTERNAL_OES...
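
For reference, the bitmap path that already works looks roughly like this (a minimal sketch; someBitmap is just a placeholder for whichever Bitmap gets uploaded):

// Sketch of the GL_TEXTURE_2D path that already displays correctly in Unity
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int bitmapTextureID = textures[0];

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, bitmapTextureID);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

// GLUtils.texImage2D uploads the Bitmap pixels into the currently bound 2D texture
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, someBitmap, 0);

// bitmapTextureID is then passed to Unity and wrapped with CreateExternalTexture as above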

Does anybody know how to get this working?

  • I found similar approaches [in this question](http://stackoverflow.com/questions/35227222/rendering-surfacetexture-to-unity-texture2d) or in [this GitHub gist](https://gist.github.com/Paloghas/4037ff314751fca7e205?signup=true). Does anyone have a similar example, maybe? Or is there another performant approach to send the camera stream from an Android plugin to Unity? – Christoph Göttert Dec 28 '16 at 13:43
  • Did you manage to solve this problem? If yes, please share your knowledge :) – Dimitri Podborski Feb 21 '18 at 19:23
  • Sorry, I couldn't find a solution for this at all. I was trying to implement an AR application and finally ended up using Vuforia and Wikitude. – Christoph Göttert Mar 02 '18 at 08:57
  • Also tried and haven't solved it yet. A solution must exist, I've seen this stuff in AR Unity games. This similar question might help: https://stackoverflow.com/questions/35227222/rendering-surfacetexture-to-unity-texture2d – Troy Nov 27 '19 at 07:52
