4

Well, the question is not as silly as it sounds.

I am using C++11 <array> and want to declare an array like this:

array<int, MAX_ARR_SIZE> myArr;

MAX_ARR_SIZE is to be defined in a header file and could be very large, e.g. 10^13. Currently I am typing it out like a pre-school kid:

 #define MAX_ARR_SIZE 1000000000000000

I can live with it if there is no alternative. I can't use pow(10, 13) here since it cannot be evaluated at compile time; the array declaration would fail. I am not aware of any shorthand for typing this.

Dilawar
  • Well you could do `const size_t max_array_size = 10e15;`, but an array that size is likely too large for the stack. Having a `MAX_ARR_SIZE` indicates that you have some kind of dynamic sizing; is there a reason you aren't using `std::vector`? – TartanLlama Mar 31 '16 at 10:18
  • @TartanLlama or `constexpr size_t max_array_size = 10e15;` – Garf365 Mar 31 '16 at 10:19
  • This smells like an [XY problem](http://mywiki.wooledge.org/XyProblem)… – Biffen Mar 31 '16 at 10:26
  • @TartanLlama Benchmarking a custom random number generator ;-). I have already done it with vector; thought I'd give it a try with array (the plan is to have a concurrent version for GPU and OpenMPI) – Dilawar Mar 31 '16 at 10:31
  • @Garf365 I assumed that an `a e b` expression would be of type `float` or `double`. I got this impression from this page http://en.cppreference.com/w/cpp/language/types#Integer_types ; they did not use this either. – Dilawar Mar 31 '16 at 10:32
  • @Dilawar no, it's of type `size_t` (an unsigned integer type) – Garf365 Mar 31 '16 at 10:36
  • @Garf365 No, it's a [`double`](http://coliru.stacked-crooked.com/a/230e6febd866cab1) which we convert to an integral type. – TartanLlama Mar 31 '16 at 10:42
  • @TartanLlama right, apologies, I took a shortcut ;) – Garf365 Mar 31 '16 at 10:43
  • note: this will fail if your system doesn't have 4000 (or 8000) terabytes of RAM free .... – M.M Mar 31 '16 at 10:49

6 Answers

9

Using #define for constants is more of a C idiom than C++.

You can define your constant in this way:

const size_t MAX_ARR_SIZE(1e15); 
Tomáš Šíma
  • Or `constexpr` for C++11 (http://stackoverflow.com/questions/13346879/const-vs-constexpr-on-variables) – Garf365 Mar 31 '16 at 10:26
7

In this case, using a `const size_t` instead of `#define` is preferred.


I'd like to add that, since C++14, when writing integer literals, you can add single quotes as digit separators:

1'000'000'000'000'000

This looks much clearer.

Yu Hao
2

You can define a constexpr function:

constexpr size_t MAX_ARR_SIZE()
{
    return pow(10, 15); 
}

That way you can do even more complex calculations at compile time.

Then use it as `array<int, MAX_ARR_SIZE()> myArr;`; it will be evaluated at compile time.

Also, as already mentioned, you probably won't be able to allocate that much on the stack.

EDIT:

I made a mistake here: since pow itself is not constexpr, you can't use it. It is solvable, though; for example, use ipow as discussed here: c++11 fast constexpr integer powers

Here is the quoted function:

constexpr int64_t ipow(int64_t base, int exp, int64_t result = 1) {
  return exp < 1 ? result : ipow(base*base, exp/2, (exp % 2) ? result*base : result);
}

Simply change MAX_ARR_SIZE() to:

constexpr size_t MAX_ARR_SIZE()
{
    return ipow(10, 15); 
}
MoonBun
1
#define MAX_ARRAY_SIZE (1000ull * 1000 * 1000 * 1000 * 1000)
gnasher729
0

You can use:

#define MAX_ARR_SIZE 1e15

1e15 is huge, and an array that size probably could not be allocated anyway.

uSeemSurprised
0

You actually can evaluate pow(10, 15) and similar expressions at compile time in C++11 if you use a const instead of #define. Just make sure you pick a large enough primitive type.

user3026691