
As my first template metaprogram I am trying to write a function that transforms an input vector to an output vector.

For instance, I want

vector<int> v={1,2,3};
auto w=v_transform(v,[](int x){return (float)(x*2);});

to set w to the vector of three floats, {2.0, 4.0, 6.0}.

I started with this Stack Overflow question, The std::transform-like function that returns transformed container, which addresses the harder question of transforming arbitrary containers.

I now have two solutions:

  1. A solution, v_transform_doesntwork, that doesn’t work, but I don’t know why (I wrote this one myself).

  2. A solution, v_transform, that works, but I don’t know why (based on Michael Urman's answer to the above question).

I am looking for simple explanations or pointers to literature that explains what is happening.

Here are the two solutions, v_transform_doesntwork and v_transform:

#include <type_traits>
#include <vector>

using namespace std;

template<typename T, typename Functor,
    typename U=typename std::result_of<Functor(T)>::type>
vector<U> v_transform(const std::vector<T> &v, Functor&& f){
    vector<U>ret;
    for(const auto & e:v)
        ret.push_back(f(e));
    return ret;
}

template<typename T, typename U> 
vector<U> v_transform_doesntwork(const std::vector<T> &v, U(*f)(const T &)){
    vector<U>ret;
    for(const auto & e:v)
        ret.push_back(f(e));
    return ret;
}

float foo(const int & i){
    return (float)(i+1);
}

int main(){
    vector<int>v{1,2,3,4,5};
    auto w=v_transform(v,foo);
    auto z=v_transform(v,[](const int &x){return (float)(x*2);});
    auto zz=v_transform(v,[](int x){return (float)(x*3);});
    auto zzz=v_transform_doesntwork(v,[](const int &x){return (float)(x*2);});
}

Question 1: why doesn’t the call to v_transform_doesntwork compile? (It gives a fail-to-match template error in C++11. I tried about 4 permutations of “const” and “&” and “*” in the argument list, but nothing seemed to help.)

I prefer the implementation of v_transform_doesntwork to that of v_transform, because it’s simpler, but it has the slight problem of not working.

Question 2: why does the call to v_transform work? I get the gist obviously of what is happening, but I don’t understand why all the typenames are needed in defining U, I don’t understand how this weird syntax of defining a template parameter that is relied on later in the same definition is even allowed, or where this is all specified. I tried looking up "dependent type names" in cppreference but saw nothing about this kind of syntax.

Further note: I am assuming that v_transform works, since it compiles. If it would fail or behave unexpectedly under some situations, please let me know.

kdog
  • Closures aren't function pointers. They may be *convertible* to them, but that's not the same thing. – Kerrek SB Oct 23 '14 at 01:28
  • then why isn't it converted? – kdog Oct 23 '14 at 01:30
  • If you can rely on C++11, take a look at trailing-return-type and `decltype`. They will make your life much easier. – Deduplicator Oct 23 '14 at 01:31
  • @kdog: To get converted you'd need to say which type to convert it to, but you don't have that situation, because instead you chose to ask for template argument deduction. Argument deduction doesn't then also consider user-defined conversions (that would never work if you think about it). – Kerrek SB Oct 23 '14 at 08:46

3 Answers


Your doesntwork function expects a function pointer and pattern matches on it.

A lambda is not a function pointer. A stateless lambda can be converted to a function pointer, but template pattern matching does not use conversions (other than a very limited subset -- Derived& to Base& and Derived* to Base*, reference-to-value and vice versa, etc -- never a constructor or conversion operator).

Pass foo to doesntwork and it should work, barring typos in your code.
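If you want to keep the doesntwork signature, here is a minimal sketch (assuming the capture-less lambda from the question): applying unary + forces the lambda-to-pointer conversion up front, so deduction sees an actual function pointer.

// Sketch: + converts the capture-less lambda to float(*)(const int&),
// so the function-pointer parameter can deduce T = int and U = float.
auto zzz = v_transform_doesntwork(v, +[](const int &x){ return (float)(x*2); });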

template<typename T, 
  typename Functor,
  typename U=typename std::result_of<Functor(T)>::type
>
vector<U> v_transform(const std::vector<T> &v, Functor&& f){
  vector<U>ret;
  for(const auto & e:v)
    ret.push_back(f(e));
  return ret;
}

so you call v_transform. It tries to deduce the template types.

It pattern matches the first argument. You pass a std::vector<int, blah> where blah is some allocator.

It sees that the first argument is std::vector<T>. It matches T to int. As you did not give a second parameter, the default allocator for std::vector<T> is used, which happens to match blah.

We then continue to the second parameter. You passed in a closure object, so it deduces the (unnamable) lambda type as Functor.

It is now out of arguments to pattern match. The remaining types use their defaulted types -- U is set to typename std::result_of<Functor(T)>::type. This does not result in a substitution failure, so SFINAE does not occur.

All types are determined, and the function is now slotted into the set of overloads to examine to determine which to call. As there are no other functions of the same name, and it is a valid overload, it is called.
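To make that concrete, here is a small check (a sketch, not part of the original code) of what U defaults to for the lambda passed in main:

#include <type_traits>

// The closure type stands in for Functor and int for T; the defaulted
// U = std::result_of<Functor(T)>::type comes out as float.
auto doubler = [](const int &x){ return (float)(x*2); };
static_assert(std::is_same<std::result_of<decltype(doubler)(int)>::type,
                           float>::value,
              "U defaults to float for this closure and element type int");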


Note that your code has a few minor errors:

template<typename T, 
  typename A,
  typename Functor,
  typename U=typename std::decay<typename std::result_of<Functor&(T const&)>::type>::type
>
std::vector<U> v_transform(const std::vector<T, A> &v, Functor&& f){
  std::vector<U> ret;
  ret.reserve(v.size());
  for(const auto & e:v)
    ret.push_back(f(e));
  return ret;
}

which cover some corner cases.
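For example, a sketch of one such corner case (assuming the improved v_transform above is in scope): a callable that returns a reference would otherwise make U a reference type, and a std::vector of references is ill-formed; std::decay strips the reference. The extra allocator parameter A likewise lets vectors with non-default allocators match the first argument.

#include <vector>

static int table[] = {7, 8, 9};

void decay_corner_case() {
    std::vector<int> idx{0, 1, 2};
    // The lambda returns int&; std::decay turns that into int, so U is int
    // and the result is a plain std::vector<int> holding copies {7, 8, 9}.
    auto picked = v_transform(idx, [](const int &i) -> int& { return table[i]; });
    (void)picked;
}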

Yakk - Adam Nevraumont
  • Wow that is very clear and helpful. How/where do you learn all this? It just seems like people just know it from the ether. I don't need to worry about custom allocs in my app, but can you explain what is the point of changing Functor to Functor& in the argument to result_of? – kdog Oct 23 '14 at 01:40
  • 1
    @kdog If you pass an rvalue functor (like you do in the OP) to the function, then `Functor` is the class type (with no reference), and `std::result_of` invokes the rvalue `operator()` overload. You, however, use it in an lvalue context within the transform. In an extreme corner case, this will screw up. Oh, that reminds me, another problem -- we need to `decay`! And where do I know it? More than a decade of coding in C++, playing with the language, reading the standard (in short bits!), learning from various sources, and the like. – Yakk - Adam Nevraumont Oct 23 '14 at 01:56
  • @Yakk wow -- I definitely wouldn't have caught the decay bug... impressive! – Alex Reinking Oct 23 '14 at 01:59
  • Very interesting, but I am curious what case would (a) fail if Functor& were changed to Functor in the result_of expression; (b) fail if the std::decay call were omitted? – kdog Oct 23 '14 at 02:14
  • @kdog if your function object returned a reference, you cannot store references in vectors, so the body will fail. Similarly if a function returned a `const` object. `decay` can be used to convert types to 'storable' types. The `Functor&` would basically require a function object with a different `operator()&&` and `operator()&` overload, at this point a ridiculous corner case: in addition (after said decay) it would have to be a very strangely different overload (`operator()&` returning a ref, while `operator()&&` returning a value, is ... more common). – Yakk - Adam Nevraumont Oct 23 '14 at 03:54
  • @kdog: You learn it by doing lots of C++, paying attention to the problems and surprises you encounter, reading explanations on Stack Overflow, checking standard references yourself, and then iterate on this whole thing and never stop :-) – Kerrek SB Oct 23 '14 at 08:47
  • @Yakk thanks again for the great answer. Extremely helpful and illuminating not just for this specific issue but for later work with template metaprogramming. – kdog Oct 23 '14 at 09:50

Question 1

Why doesn't the call to v_transform_doesntwork compile?

This is because you've passed it a C++11 lambda. The second parameter of v_transform_doesntwork is a function pointer. C++11 lambdas are, in fact, objects of an unnamed class type, not function pointers. So the declaration

template<typename T, typename U>
vector<U> v_transform_doesntwork(const std::vector<T> &v, U(*f)(const T &))

binds T to the parameter type of the function pointer f and U to its return type. But the second argument cannot accept a lambda for this reason! You can specify the template arguments explicitly to make it work with a non-capturing lambda, but the compiler will not attempt deduction through the lambda-to-function-pointer conversion.
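As a quick sketch of both workarounds (assuming the v, foo, and lambda from the question):

auto ok1 = v_transform_doesntwork(v, foo);        // foo decays to float(*)(const int&): T = int, U = float
auto ok2 = v_transform_doesntwork<int, float>(v,  // explicit T and U: no deduction needed,
    [](const int &x){ return (float)(x*2); });    // so the lambda is allowed to convert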

Question 2

Why does the call to v_transform work?

Let's look at the code you wrote:

template<typename T, 
typename Functor,
typename U=typename std::result_of<Functor(T)>::type>
vector<U> v_transform(const std::vector<T> &v, Functor&& f){

Again, T is a template parameter that represents the element type of the input vector. But now Functor is a parameter for whichever callable object you decide to pass in to v_transform (nothing special about the name). We set U to be the result of calling that Functor on a T. The std::result_of trait jumps through some hoops to figure out what that return type will be. You also might want to change the definition of U to

typename U=typename std::result_of<Functor&(T const &)>::type>

so that it can also accept callables whose parameter is a const reference.
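As an illustration (with a hypothetical functor of my own, not from the question), the Functor&(T const &) form asks result_of for the type of invoking an lvalue functor with a const reference to an element, which is exactly how the loop in v_transform uses it:

#include <type_traits>

// Hypothetical functor, used only to show how the result_of expression reads.
struct Scale {
    float factor;
    float operator()(const int &x) const { return factor * x; }
};

// Invoking an lvalue Scale with a const int& (as v_transform's loop does)
// yields float, so U is deduced as float.
static_assert(std::is_same<std::result_of<Scale&(const int&)>::type,
                           float>::value,
              "U is float for Scale applied to const int&");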

Alex Reinking
  • Why Functor&(T const &) and not Functor(T const &) ? – kdog Oct 23 '14 at 01:54
  • actually hold on...i don't even see why Functor(T) could have a different result than Functor(T const &). I mean, if the Functor parameter is declared as const T&, and a T argument is passed in, then the invocation will still return the type U. Maybe if Functor were overloaded to do something different if it gets a const T& versus a T actual parameter at the invoking site? – kdog Oct 23 '14 at 01:58
  • 1
    @kdog yes, if the function object did something different with a `T` or a `T const&` argument. – Yakk - Adam Nevraumont Oct 23 '14 at 01:59

For the doesntwork function, you need to explicitly specify the template parameters:

auto zzz=v_transform_doesntwork<int,float>(v,[](const int &x){return (float)(x*2);});

Then it does work. The compiler cannot deduce these parameters on its own, because template argument deduction does not consider the conversion from a lambda to a function pointer.

John Zwinck