Is there any known hierarchy of C++ data types and containers? For example: a Byte is made up of Bits; an Int is made of Chars, each of which is a Byte; Strings are collections of Chars; and so on.
I am interested in putting together a DataTable, which will need to store many different data types in a DataRow. I would hate to choose a List if a List is made up of a Set and a Map (which would make it a multidimensional array of Vectors), when I could have just chosen Vectors to start with.
Performance is the main goal; my belief is that choosing the most basic container that will support the data leads to better performance.
I am using a JSON definition file such as this:
[
  {
    "Column": "Column1",
    "StartingPosition": 15,
    "ColumnWidth": 3,
    "DataType": "Int"
  },
  {
    "Column": "Column2",
    "StartingPosition": 19,
    "ColumnWidth": 15,
    "DataType": "String"
  },
  {
    "Column": "Column3",
    "StartingPosition": 35,
    "ColumnWidth": 15,
    "DataType": "String"
  },
  {
    "Column": "Column4",
    "StartingPosition": 51,
    "ColumnWidth": 4,
    "DataType": "Double"
  }
]
The JSON is used to parse fixed-width binary data files. It is read at runtime and determines which containers need to be created to store the data. Currently everything parses into a vector<string*>, which works, but if I want to preserve the original data types, I need to expand my storage to containers that support multiple data types. I have looked at std::any, tuples, and heterogeneous containers (https://gieseanw.wordpress.com/2017/05/03/a-true-heterogeneous-container-in-c/).
I was also thinking it might end up as an array of a custom struct holding a member for each data type, plus an extra field to indicate which one is in use. That seemed like it could waste memory, and since each cell of data would end up inside a nested multidimensional structure of multiple types, I felt it was important to choose the right container from the start.