I'm converting some OpenCL code to DirectCompute and need to process 8-bit character strings in a compute shader, but I can't find an HLSL data type for "byte" or "char". OpenCL supports a "char" type, so I was expecting an equivalent. What is the best way to define and access the data?

It looks like the data can be passed by treating it as a series of "uint" types and unpacking it with bit-shifting, AND-ing, and so on, but that seems like unnecessary overhead. What is the correct way?

1 Answer

I've found two ways to do this, although both require working with int/uint values on the HLSL side, since I haven't found an 8-bit data type:

Option 1 is to let the "view" handle the translation (a shader-side sketch follows the list):

  • Pass the original data as a byte/char buffer.
  • Set the Shader Resource View format (D3D11_SHADER_RESOURCE_VIEW_DESC.Format) to DXGI_FORMAT_R8_UINT.
  • Declare the HLSL data type as Buffer<uint>.
  • Reference each byte using its byte offset (i.e., treat it as a buffer of bytes, not a buffer of uints). Each character is automatically promoted to a uint value.
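
On the HLSL side nothing else is needed; each load returns one byte already widened to a uint. Here's a minimal compute-kernel sketch of that setup (the register assignments, thread-group size, and the lowercase-to-uppercase transform are illustrative, not from the original):

// Bound through an SRV whose Format is DXGI_FORMAT_R8_UINT; each element
// is a single byte of the original string, zero-extended to a uint (0..255).
Buffer<uint> chars : register(t0);
RWBuffer<uint> result : register(u0);

[numthreads(64, 1, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    uint c = chars[id.x];   // id.x is a byte index, not a uint index
    // Illustrative transform: fold ASCII lowercase to uppercase.
    if (c >= 0x61 && c <= 0x7A)
        c -= 0x20;
    result[id.x] = c;
}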

Option 2 is to treat each 4-byte sequence as a uint, using the format DXGI_FORMAT_R32_UINT, and manually extract each character using something like this:

Buffer<uint> buffer;
uint offset = ...;
uint ch1, ch2, ch3, ch4;
// Each uint packs four consecutive bytes of the string. On the
// little-endian platforms D3D runs on, the first character sits in
// the low-order byte, so unpack from the bottom up:
ch1 =  buffer[offset]        & 0x000000ff;
ch2 = (buffer[offset] >>  8) & 0x000000ff;
ch3 = (buffer[offset] >> 16) & 0x000000ff;
ch4 =  buffer[offset] >> 24;
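
If you need to fetch a single character at an arbitrary byte offset, the same unpacking can be wrapped in a helper over the buffer declared above. A sketch, assuming the bytes are packed little-endian as above (ReadByte is an illustrative name, not a built-in):

// Fetch the byte at byteOffset: pick the uint that holds it,
// then shift its byte into the low 8 bits and mask.
uint ReadByte(uint byteOffset)
{
    uint word32 = buffer[byteOffset >> 2];             // byteOffset / 4
    return (word32 >> ((byteOffset & 3) * 8)) & 0xff;  // select that byte
}

The shift and mask per character is cheap compared to the memory fetch itself, so the overhead the question worries about is usually small in practice.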

Either way you end up working with 32-bit values, but at least they correspond to individual characters.

  • This seems right to me. There are also ByteAddressBuffers but they have the same uint32 granularity restriction. http://developer.download.nvidia.com/compute/DevZone/docs/html/DirectCompute/doc/DirectCompute_Programming_Guide.pdf – fifoforlifo Jul 20 '19 at 16:08
  • It works, but not for structured buffers, which only support 2- and 4-byte access. – mopodafordeya Aug 24 '21 at 23:31