I've found two ways to do this, although they both require working with int/uint values in HLSL since I haven't found an 8-bit data type:
Option 1 is to let the "view" handle the translation:
- Pass the original data as a byte/char buffer.
- Set the Shader Resource View format (D3D11_SHADER_RESOURCE_VIEW_DESC.Format) to DXGI_FORMAT_R8_UINT (see the sketch after this list).
- Declare the buffer in HLSL as Buffer<uint>.
- Index each byte using its byte offset (i.e., treat it as a buffer of bytes, not a buffer of uints). Each character is automatically promoted to a uint value.
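On the C++ side the setup looks roughly like this. This is only a sketch (error checking omitted); byteData, byteCount and device are placeholders for whatever you already have:

#include <d3d11.h>

// Create a buffer holding the raw character data.
D3D11_BUFFER_DESC bufDesc = {};
bufDesc.ByteWidth = byteCount;                      // size of the char/byte data
bufDesc.Usage = D3D11_USAGE_IMMUTABLE;
bufDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = byteData;                        // your char/byte array

ID3D11Buffer* buffer = nullptr;
device->CreateBuffer(&bufDesc, &initData, &buffer);

// View the buffer as one uint element per byte.
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_R8_UINT;
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_BUFFER;
srvDesc.Buffer.FirstElement = 0;
srvDesc.Buffer.NumElements = byteCount;             // element count == byte count

ID3D11ShaderResourceView* srv = nullptr;
device->CreateShaderResourceView(buffer, &srvDesc, &srv);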
Option 2 is to treat each 4-byte sequence as a uint, using the format DXGI_FORMAT_R32_UINT, and manually extract each character using something like this:
Buffer<uint> buffer;

// Element index into the uint buffer (i.e., byte offset / 4).
uint offset = ...;
uint ch1, ch2, ch3, ch4;

// The buffer data is little-endian, so the low byte of each uint is the
// first character of the 4-byte group and the high byte is the last.
ch1 = (buffer[offset] & 0x000000ff);
ch2 = (buffer[offset] & 0x0000ff00) >> 8;
ch3 = (buffer[offset] & 0x00ff0000) >> 16;
ch4 = buffer[offset] >> 24;
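For Option 2 the only view-side difference is the format and the element count. Again just a sketch, reusing the placeholder names from above and assuming the data is padded to a multiple of 4 bytes:

D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_R32_UINT;              // one element per 4 bytes
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_BUFFER;
srvDesc.Buffer.FirstElement = 0;
srvDesc.Buffer.NumElements = byteCount / 4;         // element count is bytes / 4
device->CreateShaderResourceView(buffer, &srvDesc, &srv);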
Either way you end up working with 32-bit values, but at least they correspond to individual characters.