
I am looking for an elegant and hopefully bevy-esque way of rendering to a wgpu::Texture. The reason is that I am implementing a WebXR library, and in immersive XR the WebXRFramebuffer is what must be rendered to.

let framebuffer = /* get framebuffer from web_sys */;
let texture: wgpu::Texture = unsafe {
    device.create_texture_from_hal::<wgpu_hal::gles::Api>(
        wgpu_hal::gles::Texture {
            inner: wgpu_hal::gles::TextureInner::ExternalFramebuffer {
                inner: framebuffer,
...

My question is: once I have created this wgpu::Texture, is there a way to either:

  1. Set it as the main pass texture of the Bevy engine, or
  2. Render the cameras to a bevy::Image and blit that to the wgpu::Texture (a rough sketch of what I mean is below)?

I've seen examples like the Superconductor engine doing a lot of low-level wgpu work to achieve this, but it feels like there should be a simpler way with recent Bevy features like the render graph and camera render targets.
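
To make option 2 concrete, here is roughly what I have in mind, loosely following Bevy's render_to_texture example. The function name, label, and 1024x1024 size are just placeholders, the exact API differs between Bevy versions, and the final per-frame copy into the texture created with create_texture_from_hal is the part I don't know how to do cleanly:

use bevy::prelude::*;
use bevy::render::camera::RenderTarget;
use bevy::render::render_resource::{
    Extent3d, TextureDescriptor, TextureDimension, TextureFormat, TextureUsages,
};

fn setup_xr_render_target(mut commands: Commands, mut images: ResMut<Assets<Image>>) {
    // Placeholder size; this would come from the XR framebuffer dimensions.
    let size = Extent3d {
        width: 1024,
        height: 1024,
        depth_or_array_layers: 1,
    };

    // An Image the camera can render into (RENDER_ATTACHMENT) and that can
    // later be copied toward the external framebuffer texture (COPY_SRC).
    let mut image = Image {
        texture_descriptor: TextureDescriptor {
            label: Some("xr_render_target"),
            size,
            dimension: TextureDimension::D2,
            format: TextureFormat::Rgba8UnormSrgb,
            mip_level_count: 1,
            sample_count: 1,
            usage: TextureUsages::TEXTURE_BINDING
                | TextureUsages::COPY_SRC
                | TextureUsages::RENDER_ATTACHMENT,
            view_formats: &[],
        },
        ..default()
    };
    // Allocate the image data to match the descriptor size.
    image.resize(size);
    let image_handle = images.add(image);

    // Point the camera at the Image instead of the window surface.
    commands.spawn(Camera3dBundle {
        camera: Camera {
            target: RenderTarget::Image(image_handle),
            ..default()
        },
        ..default()
    });
}

The missing step is then getting the GPU texture behind that Image copied (or blitted) into the wgpu::Texture from the WebXR framebuffer every frame, ideally via the render graph rather than hand-rolled wgpu code.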

