I have
const uint8_t longByteTable[16][256][16] = { { { 0x00, ... } } };
declared as a three-dimensional 16x256x16 array of hardcoded octet values.
For optimisation purposes and various other reasons I need this array to be interpreted as a three-dimensional 16x256x2 array of uint64_t values:
const uint64_t reinterpretedTable[16][256][2];
What I need is a valid way to cast longByteTable to reinterpretedTable within strict ISO/ANSI C. Is this:
const uint64_t (*reinterpretedTable)[256][2] =
(const uint64_t(*)[256][2])longByteTable;
a proper way to do that?
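For clarity, this is the kind of access I intend to perform through the cast pointer afterwards (a minimal sketch; the indices and the combining operation are placeholders, not my real code):

uint64_t lo = reinterpretedTable[3][0x5A][0]; /* first 8 octets of that 16-byte entry */
uint64_t hi = reinterpretedTable[3][0x5A][1]; /* remaining 8 octets */
uint64_t mixed = lo ^ hi;                     /* some endianness-invariant combination */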
P.S. I can't declare longByteTable with the latter type, because then it would not work properly across different endiannesses: I would either need to declare different tables for different byte orders, or perform runtime checks and rotations. And yes, all further transformations of the reinterpreted array are endianness-invariant.
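To illustrate the endianness issue (a minimal, self-contained sketch with made-up byte values, not taken from the real table):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const uint8_t bytes[8] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };
    uint64_t v;
    memcpy(&v, bytes, sizeof v); /* reinterpret the eight octets as one uint64_t */
    /* prints 0807060504030201 on a little-endian host,
       0102030405060708 on a big-endian one */
    printf("%016llx\n", (unsigned long long)v);
    return 0;
}

So a single table of hardcoded uint64_t literals could not match the intended byte layout on both kinds of hosts.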