The function returns a reference to an array of three ints; the [3] after the function's parameter list is the size of the array that is returned by reference.
This obscure syntax comes from the weird syntax of array declaration, which you write as:
int arr[3]; //real but weird
The language would have been much simpler if it had this instead:
int[3] arr; //hypothetical but better and simpler
because the size 3 is part of the type of arr, so it would make much more sense if all parts of the type appeared on the left side of the variable name, just as when you write:
unsigned int a;
You don't write:
unsigned a int; //analogous to: int a [3];
So while the language does the right thing with unsigned int, it does a very weird thing with int[3].
Now coming back to the function declaration: the signature would have been much cleaner if it could be declared as:
int[3]& f(int[3]& arr); //hypothetical
if only the language kept all the parts of the type on the left side of the variable name. But since it doesn't (i.e. it requires you to write the size on the rightmost side, after the variable name), you end up with this weird signature:
int (&f(int (&arr)[3]))[3]; //real
Notice how even the parameter declaration becomes weird.
But you can simplify it with a typedef as:
typedef int array_type[3];
array_type& f(array_type& arr);
That looks much better. Now only the typedef looks weird.
With C++11, you can write an even better alias:
using array_type = int[3];
array_type& f(array_type& arr);
which comes about as close as you can get to the hypothetical syntax (if you mentally substitute int[3] for array_type):
int[3]& f(int[3]& arr); //hypothetical
Hope that helps.