
So, I'm trying to render a cube with a 3D texture. The texture contains 3 slices of 3 different colors: red, green, and blue. Each slice consists of 4 pixels of the same color. This works fine. https://i.stack.imgur.com/ZoiUi.jpg

private func makeTexture() {
    let width = 2
    let height = 2
    let depth = 3
    let byteSize = 4                          // bytes per pixel (bgra8Unorm)
    let bytesPerRow = byteSize * width
    let bytesPerImage = bytesPerRow * height
    let blue: UInt32 = 0x000000FF
    let green: UInt32 = 0xFF00FF00
    let red: UInt32 = 0x00FF0000

    let textureDescriptor = MTLTextureDescriptor()
    textureDescriptor.pixelFormat = .bgra8Unorm
    textureDescriptor.width = width
    textureDescriptor.height = height
    textureDescriptor.depth = depth
    textureDescriptor.textureType = .type3D

    // 2 x 2 x 3 pixels, one UInt32 each.
    let image = UnsafeMutableRawPointer.allocate(byteCount: width * height * depth * byteSize,
                                                 alignment: MemoryLayout<UInt32>.alignment)
    defer { image.deallocate() }

    // Slice 0 (depth 0): red
    image.storeBytes(of: red, toByteOffset: 0, as: UInt32.self)
    image.storeBytes(of: red, toByteOffset: 4, as: UInt32.self)
    image.storeBytes(of: red, toByteOffset: 8, as: UInt32.self)
    image.storeBytes(of: red, toByteOffset: 12, as: UInt32.self)

    // Slice 1 (depth 1): green
    image.storeBytes(of: green, toByteOffset: 16, as: UInt32.self)
    image.storeBytes(of: green, toByteOffset: 20, as: UInt32.self)
    image.storeBytes(of: green, toByteOffset: 24, as: UInt32.self)
    image.storeBytes(of: green, toByteOffset: 28, as: UInt32.self)

    // Slice 2 (depth 2): blue
    image.storeBytes(of: blue, toByteOffset: 32, as: UInt32.self)
    image.storeBytes(of: blue, toByteOffset: 36, as: UInt32.self)
    image.storeBytes(of: blue, toByteOffset: 40, as: UInt32.self)
    image.storeBytes(of: blue, toByteOffset: 44, as: UInt32.self)

    texture = device?.makeTexture(descriptor: textureDescriptor)

    let region = MTLRegionMake3D(0, 0, 0, width, height, depth)
    texture?.replace(region: region,
                     mipmapLevel: 0,
                     slice: 0,
                     withBytes: image,
                     bytesPerRow: bytesPerRow,
                     bytesPerImage: bytesPerImage)
}
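The texture also has to be bound to the slot the fragment shader reads from ([[ texture(0) ]]) when encoding the draw call. A minimal sketch, assuming a MTLRenderCommandEncoder named renderEncoder (not shown in the question):

// Bind the 3D texture to the index the shader samples from.
renderEncoder.setFragmentTexture(texture, index: 0)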

Fragment shader code:

struct VertexOut {
    float4 position [[position]];
    float3 textureCoordinate;
};

fragment half4 basic_fragment(VertexOut in [[stage_in]],
                              texture3d<half> colorTexture [[ texture(0) ]]) {

    constexpr sampler textureSampler (mag_filter::nearest,
                                      min_filter::nearest);

    // Sample the texture to obtain a color
    const half4 colorSample = colorTexture.sample(textureSampler, in.textureCoordinate);

    // We return the color of the texture
    return colorSample;
}

Then I want to make the red and blue slices transparent, so I set their alphas to 0:

 let blue: UInt32 = 0x000000FF
 let green: UInt32 = 0xFF00FF00
 let red: UInt32 = 0x00FF0000

The fragment shader now contains:

const half4 colorSample = colorTexture.sample(textureSampler, in.textureCoordinate);
if (colorSample.a <= 0)
    discard_fragment();

I expect to see a green cross-section, but I see just green edges: https://i.stack.imgur.com/ISPKY.jpg

There is nothing inside the cube, and I don't even see the back edges because cullMode is set to .front.
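For reference, that culling setting is made on the render command encoder; a short sketch, again assuming an encoder named renderEncoder:

// .front culls front-facing triangles; .none would draw both sides of every triangle.
renderEncoder.setCullMode(.front)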

Can I draw and see the texture within the object, so I can see its insides? I haven't found a way so far. Isn't it the case that when I set the texture type to 3D, it should calculate the color for each pixel of the 3D object, not just the edges? Maybe it does, but doesn't display it?

1 Answer


No, 3D textures don't get you that.

There is no 3D object, there are just triangles (which you provide). Those are 2D objects, although they are positioned within 3D space. Metal does not try to figure out what solid object you're trying to draw by extrapolating from the triangles you tell it to draw. No common 3D-drawing API does that. It's not generally possible. Among other things, keep in mind that you don't even have to give all of the triangles to Metal together; they could be split across draw calls.

There is no "inside" to any object, as far as Metal knows, just points, lines, and triangles. If you want to render the inside of an object, you have to model that. For a slice of a cube, you have to compute the new surfaces of the "exposed inside" and pass triangles to Metal to draw that.

A 3D texture is just a texture that you can sample with a 3D coordinate. Note that the decision about what fragments to draw has already been made before your fragment shader is called and Metal doesn't even know you'll be using a 3D texture at the time it makes those decisions.

Ken Thomases
  • Thank you for the answer. Not what I expected, but of course useful! I'm actually working on visualization of CT results. It's provided as more than 400 slices, so I had thought I could make a 3D texture and then just put it on a cube. But if it's not possible to hide extra vertices, do you have any suggestions on how I can visualise hundreds of slices as a 3D object? – Yura Sorokin Mar 16 '18 at 19:48
  • @YuraSorokin To do so you need either volume rendering techniques or some kind of mapping procedure that takes the 3D texture and returns triangles, like marching cubes. The simplest volume renderer would be a stack of quads with 3D texture mapping (uvw). The first vertex of the first quad would have uvw = 0,0,0, the first vertex of the second quad would have uvw = 0,0,0.01, and so on. If rendered with blending, that would give you a crude 3D volume renderer. No idea if iOS can render a stack of hundreds of blended planes. – Geronimo Dec 17 '19 at 18:17
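A minimal Swift sketch of the quad-stack geometry described in the last comment; the vertex layout (float4 position followed by float3 uvw, as in the question's VertexOut), the centred unit-cube positions, and the function name are assumptions:

func sliceStackVertices(sliceCount: Int) -> [Float] {
    // Builds sliceCount parallel quads spanning the cube; the i-th quad keeps a
    // constant w so the stack samples successive depths of the 3D texture.
    // Assumes sliceCount >= 2.
    var vertices: [Float] = []
    let corners: [(x: Float, y: Float)] = [(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5),
                                           (-0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)]
    for i in 0..<sliceCount {
        let w = Float(i) / Float(sliceCount - 1)   // uvw depth in 0...1
        for c in corners {
            vertices += [c.x, c.y, w - 0.5, 1.0,   // position: quad placed at this depth
                         c.x + 0.5, c.y + 0.5, w]  // uvw texture coordinate
        }
    }
    return vertices
}

Drawn back to front with alpha blending enabled on the pipeline's color attachment, this gives the crude volume renderer the comment describes.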