I've been benchmarking an algorithm; the details aren't important here. Its main components are a buffer (a raw array of integers) and an indexer (an integer used to access the elements of the buffer).
The fastest types for the buffer seem to be unsigned char and both the signed and unsigned versions of short, int, and long. However, char/signed char was slower, by a factor of about 1.07x.
For the indexer there was no difference between signed and unsigned types, but int and long were about 1.21x faster than char and short.
Is there a type that should be used by default when considering performance and not memory consumption?
NOTE: The operations performed on the buffer elements and the indexer were assignment, increment, decrement, and comparison.
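
For context, here is a minimal sketch of the kind of kernel this question is about. It is illustrative only, not the actual benchmark: `ElemT` and `IdxT` are hypothetical placeholders for the buffer-element and indexer types, and the loop simply exercises the same operations (assignment, increment, decrement, comparison).

    #include <iostream>

    // Illustrative sketch only: ElemT stands for the buffer element type
    // (e.g. unsigned char, short, int, long), IdxT for the indexer type.
    template <typename ElemT, typename IdxT>
    ElemT run_pass(ElemT* buffer, IdxT size) {
        ElemT acc = 0;                     // assignment
        for (IdxT i = 0; i < size; ++i) {  // comparison and increment on the indexer
            buffer[i] = acc;               // assignment to a buffer element
            ++buffer[i];                   // increment
            --buffer[i];                   // decrement
            if (buffer[i] > acc) {         // comparison
                acc = buffer[i];
            }
        }
        return acc;
    }

    int main() {
        constexpr int kSize = 1 << 20;
        static int buffer[kSize];          // one concrete ElemT/IdxT combination
        std::cout << static_cast<long>(run_pass<int, int>(buffer, kSize)) << '\n';
        return 0;
    }

The question is which `ElemT` and `IdxT` to pick by default when only speed matters, given the timing differences described above.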