To understand the reasoning behind the terms of MapReduce, we must go back to the roots of the elements that make up this particular programming model. That means talking (as precisely and as painlessly as possible) about functional programming.
In short, Wikipedia defines functional programming as:
a declarative programming paradigm in which function definitions are trees of expressions that map values to other values, rather than a sequence of imperative statements which update the running state of the program.
This basically means that the emphasis of this model is on the application of functions, rather than on imperative programming's focus on changes to state. So in functional code, a function in execution doesn't rely on or manipulate data outside of its own scope (as brilliantly put here).
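As a minimal sketch of that distinction (the function names here are made up purely for illustration), compare a function that leans on outside state with one that doesn't:

```python
counter = 0  # shared mutable state

def impure_increment(x):
    # Imperative style: depends on and mutates state
    # outside the function's own scope.
    global counter
    counter += x
    return counter

def pure_add(x, y):
    # Functional style: the result depends only on the arguments,
    # and nothing outside the function is touched.
    return x + y

print(pure_add(2, 3))  # always 5, no matter when or how often it's called
```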
"Ok, and what does that have to do with MapReduce, anyhow?"
Well, MapReduce is directly inspired by functional programming: Map and Reduce are two of the basic higher-order functions used in functional programming. Of course, MapReduce adds many other stages to an execution, such as Combine, Shuffle, and Sort, but the core of the model stems from the functional programming idea described above.
About mapping: in a functional sense, it is described as a function that receives two arguments, a function and a list of values. Map applies that function to each and every value of the list and returns an output list of results. You could indeed call this a kind of "filtering", but the data can be manipulated in many more ways than just being "filtered" out. The main goal of a Map function is to transform the input data into the form required by the calculations that come next, in the Reduce function.
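To make this concrete, here is a minimal Python sketch of that functional sense of mapping. The word-count flavour is just a classic teaching example, not something the model prescribes:

```python
words = ["deer", "bear", "river", "deer"]

# Python's built-in map takes a function and an iterable, and applies
# the function to every element, producing a new sequence of results.
# Here each word is transformed into a (word, 1) pair -- the shape a
# word-count Reduce step would expect as its input.
pairs = list(map(lambda word: (word, 1), words))
print(pairs)  # [('deer', 1), ('bear', 1), ('river', 1), ('deer', 1)]
```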
Reduce follows a similar approach. It also takes two arguments: a function, and a list of values to which the function is applied. Since the list of values here is the transformed collection of data output by the Map function, all that's left to do is work on it and reach the desired results. Knowing the abstract sense of that step of a MapReduce job, you have the right idea when you describe the Reduce function as aggregating the input data. The one thing "missing" from that picture, though, is how, and based on what, the input data will be aggregated. And that is the main essence of the Map function, as described above.
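Continuing the same toy sketch, here is Python's `functools.reduce` playing the aggregation role, with a plain dict standing in for the grouping that a real MapReduce Shuffle/Sort stage would perform between the two steps:

```python
from collections import defaultdict
from functools import reduce

words = ["deer", "bear", "river", "deer"]

# Map: transform each word into a (word, 1) pair.
pairs = map(lambda word: (word, 1), words)

def aggregate(acc, pair):
    # Fold one (word, count) pair into the running per-word totals.
    word, count = pair
    acc[word] += count
    return acc

# Reduce: fold the whole sequence of pairs down to a single result.
# In a real MapReduce job, the framework would group pairs by key
# before reducing; here the defaultdict approximates that grouping.
totals = reduce(aggregate, pairs, defaultdict(int))
print(dict(totals))  # {'deer': 2, 'bear': 1, 'river': 1}
```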
With all this, we can understand that the MapReduce model is named after the two basic functions of functional programming that it abstractly implements, and so the model essentially follows the semantic contracts of the latter.
You can go on a quest of your own about all of this and a lot more by starting from here, here, here, and here.