By "increasingly" what I mean is that Add
is fast at the beginning when there is a low number of keys. After inserting 20% of the keys, it gets very slow. After 50% it gets unbearably slow.
I get that the fewer keys there are, the faster the "key collision search" is when adding new elements to the dictionary. But is there any way to avoid this downside while keeping the Dictionary? I know beforehand that the keys don't collide, so no check is needed, but I don't know whether there is any way to make use of that information in the code.
By the way, I am forced to use a dictionary because of architecture restrictions (the structure is consumed later by a DB exporter).
What my code does:
var keyList = GetKeyList();
// ResultValue is a placeholder for the actual value type; someResult is the value computed per key.
var resultDict = new Dictionary<ResultKey, ResultValue>();
foreach (var key in keyList)
{
    resultDict.Add(key, someResult);
}
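For completeness, the only tweak I know of is pre-sizing: Dictionary has a constructor that takes an initial capacity, which avoids the intermediate resize/rehash passes. A sketch (ResultValue and someResult are the same placeholders as above), though as far as I understand each Add still hashes the key and checks for an existing equal key:

// Sketch: reserving capacity up front avoids intermediate resizes/rehashes,
// but every Add still computes the hash and searches its bucket for duplicates.
var keyList = GetKeyList();
var resultDict = new Dictionary<ResultKey, ResultValue>(keyList.Count);
foreach (var key in keyList)
{
    resultDict.Add(key, someResult);
}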
Edit: since people are asking how the hash code is generated, I will try to clarify.
Theoretically I have no control over the hash code generation, because unfortunately it follows a convention shared between multiple systems that are connected through the same DB.
In practice, the piece of code that generates the hash code is indeed mine (disclaimer: I didn't choose the convention used in the generation).
The key generation is much more complicated than this, but it all boils down to the following:
private List<ResultKey> GetKeyList(string prefix, List<float> xCoordList, List<float> yCoordList)
{
    var keyList = new List<ResultKey>();
    var constantSensorName = "xxx";

    // One key per (x, y) coordinate pair; the sensor name never varies.
    foreach (float xCoord in xCoordList)
    {
        foreach (float yCoord in yCoordList)
        {
            string stationName = string.Format("{0}_E{1}N{2}", prefix, xCoord, yCoord);
            keyList.Add(new ResultKey(constantSensorName, stationName));
        }
    }
    return keyList;
}
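For illustration, with made-up inputs the keys come out like this (the coordinates are hypothetical, and float formatting is culture-dependent; shown here with '.' as the decimal separator):

// Hypothetical call, just to show the shape of the generated keys:
var keys = GetKeyList("ST", new List<float> { 1.5f, 2.0f }, new List<float> { 10.0f, 20.0f });
// StationNames: "ST_E1.5N10", "ST_E1.5N20", "ST_E2N10", "ST_E2N20"
// SensorName is always "xxx".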
public struct ResultKey
{
    // Value-type key with two string fields; there is no custom GetHashCode/Equals,
    // so the dictionary falls back to the default ValueType implementations.
    public string SensorName { get; set; }
    public string StationName { get; set; }

    public ResultKey(string sensorName, string stationName)
    {
        this.SensorName = sensorName;
        this.StationName = stationName;
    }
}
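In case it matters for an answer: since I construct the dictionary myself, I could pass a custom IEqualityComparer<ResultKey>. This is only a sketch of what I mean, not something currently in the code, and the hash combination below is my own guess rather than the cross-system convention:

// Sketch: an explicit comparer so the dictionary does not fall back to the
// default ValueType.GetHashCode/Equals for the struct.
public sealed class ResultKeyComparer : IEqualityComparer<ResultKey>
{
    public bool Equals(ResultKey a, ResultKey b)
    {
        return a.SensorName == b.SensorName && a.StationName == b.StationName;
    }

    public int GetHashCode(ResultKey key)
    {
        unchecked
        {
            // Combine both fields so keys differing only in StationName
            // do not all land in the same bucket.
            int hash = 17;
            hash = hash * 31 + (key.SensorName != null ? key.SensorName.GetHashCode() : 0);
            hash = hash * 31 + (key.StationName != null ? key.StationName.GetHashCode() : 0);
            return hash;
        }
    }
}

// Usage (ResultValue is a placeholder as above):
// var resultDict = new Dictionary<ResultKey, ResultValue>(keyList.Count, new ResultKeyComparer());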