I managed to make this work some time ago with a hack-ish solution; it may still work, since glium has not changed much lately.
But in my experience ASurfaceTexture yields unreliable results. Maybe I used it wrongly, or maybe Android manufacturers do not pay much attention to it, I don't know. Since I never saw a real program using it, I decided to use the well-tested Java GLSurfaceView
instead, plus a bit of JNI to connect everything.
class MyGLView extends GLSurfaceView
        implements GLSurfaceView.Renderer {

    public MyGLView(Context context) {
        super(context);
        setEGLContextClientVersion(2);
        setEGLConfigChooser(8, 8, 8, 0, 0, 0);
        setRenderer(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        GLJNILib.init();
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLJNILib.resize(width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLJNILib.render();
    }
}
Here com.example.myapp.GLJNILib
is the JNI binding to the Rust native library, where the magic happens. The interface is quite straightforward:
package com.example.myapp;

public class GLJNILib {
    static {
        System.loadLibrary("myrustlib");
    }

    public static native void init();
    public static native void resize(int width, int height);
    public static native void render();
}
Now, this Rust library can be designed in several ways. In my particular project, since it was a simple game with a single full-screen view, I just created the glium
context and stored it in a global variable. More sophisticated programs could store the Backend
in a Java object, but that complicates the lifetimes and I didn't need it.
struct Data {
    dsp: Rc<glium::backend::Context>,
    size: (u32, u32),
}

static mut DATA: Option<Data> = None;
But first we have to implement the trait glium::backend::Backend
, which happens to be surprisingly easy if we assume that the proper GL context is always current whenever one of the Rust functions is called:
use std::ffi::CString;
use std::os::raw::{c_char, c_void};

struct Backend;

extern "C" {
    fn eglGetProcAddress(procname: *const c_char) -> *const c_void;
}

unsafe impl glium::backend::Backend for Backend {
    fn swap_buffers(&self) -> Result<(), glium::SwapBuffersError> {
        // GLSurfaceView swaps the buffers after onDrawFrame returns,
        // so there is nothing to do here.
        Ok(())
    }
    unsafe fn get_proc_address(&self, symbol: &str) -> *const c_void {
        let cs = CString::new(symbol).unwrap();
        eglGetProcAddress(cs.as_ptr())
    }
    fn get_framebuffer_dimensions(&self) -> (u32, u32) {
        let data = unsafe { DATA.as_ref().unwrap() };
        data.size
    }
    fn is_current(&self) -> bool {
        // We assume the GLSurfaceView context is current whenever
        // Rust code runs.
        true
    }
    unsafe fn make_current(&self) {}
}
And now we can implement the JNI init
function:
use jni::{
    JNIEnv,
    objects::JClass,
    sys::jint,
};

#[no_mangle]
#[allow(non_snake_case)]
pub extern "system"
fn Java_com_example_myapp_GLJNILib_init(_env: JNIEnv, _class: JClass) {
    log_panic(|| {
        unsafe {
            DATA = None;
        }
        let backend = Backend;
        let dsp = unsafe {
            glium::backend::Context::new(backend, false, Default::default()).unwrap()
        };
        // Use dsp to create additional GL objects: programs, textures, buffers...
        // and store them inside `DATA` or another global.
        unsafe {
            DATA = Some(Data {
                dsp,
                size: (256, 256), // dummy size, updated in resize()
            });
        }
    });
}
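The log_panic used above is not a library function; it is my own small helper that keeps panics from unwinding across the FFI boundary into Java, which is undefined behavior. Its exact implementation is up to you; a minimal sketch could look like this, with eprintln! standing in for proper logcat output:

```rust
use std::panic::{catch_unwind, UnwindSafe};

// Runs the closure and logs any panic instead of letting it
// unwind across the JNI boundary (that would be UB in Rust).
fn log_panic<F: FnOnce() + UnwindSafe>(f: F) {
    if let Err(e) = catch_unwind(f) {
        // Try to recover the panic message from the boxed payload.
        let msg = if let Some(s) = e.downcast_ref::<&str>() {
            s.to_string()
        } else if let Some(s) = e.downcast_ref::<String>() {
            s.clone()
        } else {
            "unknown panic".to_string()
        };
        // On a real device you would forward this to logcat
        // (e.g. via the android_log-sys crate); eprintln! is a placeholder.
        eprintln!("panic in native code: {}", msg);
    }
}
```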
The size will be updated when the size of the view changes (not that glium
uses that value much):
#[no_mangle]
#[allow(non_snake_case)]
pub extern "system"
fn Java_com_example_myapp_GLJNILib_resize(_env: JNIEnv, _class: JClass, width: jint, height: jint) {
    let data = unsafe { DATA.as_mut().unwrap() };
    data.size = (width as u32, height as u32);
}
And similarly the render
function:
#[no_mangle]
#[allow(non_snake_case)]
pub extern "system"
fn Java_com_example_myapp_GLJNILib_render(_env: JNIEnv, _class: JClass) {
    let data = unsafe { DATA.as_ref().unwrap() };
    let dsp = &data.dsp;
    let mut target = glium::Frame::new(dsp.clone(), dsp.get_framebuffer_dimensions());
    // use dsp and target at will, for example:
    target.clear_color(0.0, 0.0, 1.0, 1.0);
    let (_width, _height) = target.get_dimensions();
    //...
    target.finish().unwrap();
}
Note that target.finish()
is still needed even though glium
is not actually doing the swap: it consumes the Frame and lets glium finish its per-frame bookkeeping, while GLSurfaceView performs the actual buffer swap after onDrawFrame returns.