2

I want to work with very large vectors, but these vectors allocate a large amount of memory and cause an error:

const std::size_t size = 10000;
// three 2D vectors of doubles
std::vector<std::vector<double>> vecA(size, std::vector<double>(size));
std::vector<std::vector<double>> vecB(size, std::vector<double>(size));
std::vector<std::vector<double>> vecC(size, std::vector<double>(size));

I want to use them throughout my program. What is the best solution?

chwarr
user2855778

2 Answers

1

Firstly, about your problem:

std::vector<std::vector<double>> v;
std::cout << v.max_size();

This piece of code gives me the output 268435455 in a 32-bit build and 768614336404564650 when compiled for 64-bit. Moreover, on my machine it does not throw any compilation error, but the program hangs (i.e. the allocation never happens) in 32-bit, while in 64-bit all three vectors are allocated with no error. So this may or may not be a bug in VS2012; it may simply be undefined behaviour, since the C++ standard does not guarantee anything about an allocation of this size.

Now about your solution: you may use an on-disk data structure, which will be much, much slower. There are many libraries that do this for you; you may check HERE to find one.


A similar bug in VS.

deeiip
  • I cannot repro a hang with either Visual Studio 2012 or 2013 (I'm not sure which you meant by "vs12"). With both versions (and using both the release and debug runtime libraries), the program runs until allocation fails, at which point `std::bad_alloc` is thrown and the program is terminated (because the exception is unhandled). This is the expected behavior and is the behavior mandated by the C++ Standard. I do note that with the Visual C++ 2012 debug libraries, it takes an excessively long time for the program to run, but it does not hang. – James McNellis Nov 27 '13 at 09:09
  • The Connect bug to which you link is entirely unrelated to the problem described in the question: that bug is a compile-time issue; it has no run-time effects since code that exposes the bug will not compile. – James McNellis Nov 27 '13 at 09:12
  • @JamesMcNellis In my case it hung in Visual Studio 2012; that's why I deduced that it may be a bug. About your second remark: the OP didn't mention whether it is a runtime exception or a compile-time error (which may happen in the case of arrays), so I thought the mentioned bug might be a similar one (obviously not this exact one). – deeiip Nov 27 '13 at 09:40
  • My question is: Do you actually observe a hang when using Visual Studio 2012, or does the program just take a long time to run to exception? My guess is that it's actually the latter (as this is the behavior I observe, and this is the sort of behavior that doesn't usually vary from system to system). Could you verify? A hang at runtime is a very serious bug. A performance issue is problematic, but is less serious (because at least the issue will resolve itself). – James McNellis Nov 27 '13 at 22:54
  • I understand that it should not hang, but yes, it hangs on my laptop (though it may not be a bug in the VC libraries, because I sometimes experiment with the runtime libraries; that's why I wrote "may be a bug" in my answer. I'll try it elsewhere shortly).

    Apart from that, the values returned by max_size() were 268435455 and 768614336404564650, both lower than required. Then why does the allocation fail in the first case (32-bit) but succeed in the second (64-bit)? If my system had less than 2.4 GB of memory, would the second allocation also fail?

    – deeiip Nov 28 '13 at 00:10
0

The tool you need depends on what you are trying to achieve. However, pre-allocated large vectors of pre-allocated large vectors are almost certainly not the right choice.
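One common alternative (my own sketch, not from this answer; the `Matrix` class name is illustrative) is a single contiguous buffer with row-major index arithmetic, which replaces 10,001 separate allocations with one:

```cpp
#include <cstddef>
#include <vector>

// Dense row-major matrix backed by one contiguous allocation,
// instead of std::vector<std::vector<double>> (one allocation per row).
class Matrix {
public:
    Matrix(std::size_t rows, std::size_t cols)
        : rows_(rows), cols_(cols), data_(rows * cols, 0.0) {}

    double&       operator()(std::size_t r, std::size_t c)       { return data_[r * cols_ + c]; }
    const double& operator()(std::size_t r, std::size_t c) const { return data_[r * cols_ + c]; }

    std::size_t rows() const { return rows_; }
    std::size_t cols() const { return cols_; }

private:
    std::size_t rows_, cols_;
    std::vector<double> data_;  // one block: better locality, less overhead
};
```

Usage is `Matrix m(10000, 10000); m(i, j) = x;`. This does not reduce the total payload, but it removes the per-row overhead and is far friendlier to the cache and the allocator.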

If the vector size remains fixed you may be creating a matrix-like thing, in which case you are better off using a matrix library such as the excellent Eigen.

If you are doing matrix calculations with large matrices, it is worth considering whether the performance would be better with sparse matrices (in other words, is the data sparse?).
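To make the sparse idea concrete, here is a minimal coordinate-style sparse matrix (my own sketch; real libraries such as Eigen's sparse module use compressed formats instead of a `std::map`):

```cpp
#include <cstddef>
#include <map>
#include <utility>

// Minimal sparse matrix: only non-zero entries are stored, so memory
// scales with the number of non-zeros rather than rows * cols.
class SparseMatrix {
public:
    void set(std::size_t r, std::size_t c, double v) {
        if (v == 0.0) entries_.erase({r, c});  // don't store explicit zeros
        else          entries_[{r, c}] = v;
    }
    double get(std::size_t r, std::size_t c) const {
        auto it = entries_.find({r, c});
        return it == entries_.end() ? 0.0 : it->second;
    }
    std::size_t nonzeros() const { return entries_.size(); }

private:
    std::map<std::pair<std::size_t, std::size_t>, double> entries_;
};
```

A 10000 x 10000 matrix with, say, one million non-zeros then costs memory proportional to one million entries rather than one hundred million.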

If you are doing maths with large data arrays you should probably also consider using a GPU library, because you can get speed-ups of 10x to 100x. I believe Eigen can be made to use the GPU, but I have never done so myself.

If you are building a large table that is not going to be used like a matrix, then you may need some other data structure, perhaps something on-disk and database-like. Please post some more details of what you are trying to do.

maninalift