36

I am confused about the definition of left-to-right and right-to-left associativity. I have also seen them called left associativity and right associativity and would like to know which corresponds to which.

I know that it relates to the order in which operations with the same precedence are performed, as in whether a = x * y * z means a = x * (y * z) or a = (x * y) * z. I don't know which one is left-to-right associative and which is right-to-left associative.

I have tried Googling it, but all I have been able to find are tables of the associativity of different operators in C++. Looking at all the examples has just made me more confused.

What also confuses me further is that:

glm::vec4 transformedVector = translationMatrix * rotationMatrix * scaleMatrix * originalVector;

performs the scaling matrix multiplication first, followed by the rotation, followed by the translation. In this example the matrices are all of type glm::mat4 and the vectors are of type glm::vec4. Is this left-to-right or right-to-left associativity? Is this the same as normal multiplication, or is multiplication of glm types different?

Nicol Bolas
Francis

6 Answers

33

You normally read left to right. You normally do math left to right. This is left-to-right associativity, and it is the most common.

Most people will solve

x = 23 + 34 + 45

by grouping it

x = (23 + 34) + 45

This is left-to-right associativity. You can remember it because you read and do math left to right.

For addition in mathematics it doesn't matter too much. You always get the same result either way. This is because addition is associative: to say an operation is associative means that left-to-right and right-to-left association give the same result. For addition in programming it still matters because of overflows and floating-point arithmetic (though it won't for normal-sized integers in any reasonable language), so when you have a 2 AM bug with large numbers and flippant use of a+b and b+a, remember in what order the additions actually happened.
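
Here is a minimal sketch of that floating-point effect in C++ (the values are chosen only to make the rounding visible):

#include <cstdio>

int main() {
    // Grouped left to right, each small addend is rounded away against 1e8f.
    float leftToRight = (1e8f + 3.0f) + 3.0f;

    // Grouped right to left, the small addends combine first and survive.
    float rightToLeft = 1e8f + (3.0f + 3.0f);

    // Typically prints 100000000.0 and 100000008.0 with IEEE single precision.
    std::printf("%.1f\n%.1f\n", leftToRight, rightToLeft);
    return 0;
}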

In your example:

glm::vec4 transformedVector = translationMatrix * rotationMatrix * scaleMatrix * originalVector

You conceptually chomp through from the right side first, since that is where the thing you are acting on is. However, in C++, * is left-to-right associative and it is not possible to override this. glm may handle this in a number of ways: it may build up a cache of things to multiply, waiting for the final vector to arrive, and then multiply right to left. It may also (more likely) use the algebraic theorem that matrix multiplication is fully associative, just multiply out left to right, and assure the reader in the documentation that it's the same as thinking of it as right to left. However, you need to understand the implementation, because as previously discussed it matters which way the implementation chooses to multiply floating-point numbers together.
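
To make the left-to-right grouping concrete, the one-liner from the question is equivalent to this step-by-step version (a sketch only; the matrices are assumed to be built elsewhere, as in the question):

#include <glm/glm.hpp>

// Step-by-step equivalent of
// translationMatrix * rotationMatrix * scaleMatrix * originalVector,
// showing the order that left-to-right grouping imposes.
glm::vec4 transform(const glm::mat4& translationMatrix, const glm::mat4& rotationMatrix,
                    const glm::mat4& scaleMatrix, const glm::vec4& originalVector) {
    glm::mat4 tr  = translationMatrix * rotationMatrix;  // evaluated first
    glm::mat4 trs = tr * scaleMatrix;                     // then combined with the scale
    return trs * originalVector;                          // the vector enters last
}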

For completeness, consider subtraction. What is a - b - c? Here it really does matter whether it is left- or right-associative. Of course in math we define it to be (a - b) - c, but some strange programming language might prefer subtraction to be right-associative, and take a - b - c to always mean a - (b - c). This alien language had better have a documentation page specifying that - is right-associative, because it is part of the operator's specification, not something you can tell simply from looking at the operator's use.
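
A tiny illustration of that difference with ordinary ints:

#include <iostream>

int main() {
    int a = 10, b = 4, c = 3;

    // C++'s - is left-associative, so a - b - c parses as (a - b) - c.
    std::cout << (a - b - c) << '\n';    // (10 - 4) - 3 == 3

    // What a hypothetical right-associative - would mean instead.
    std::cout << (a - (b - c)) << '\n';  // 10 - (4 - 3) == 9
    return 0;
}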

djechlin
  • +1 for mentioning the algebraic associative property – Code-Apprentice Aug 31 '14 at 05:49
  • I'm guessing the DV is because I pretended C++ can override the associativity of `*`, I fixed this – djechlin Aug 31 '14 at 05:54
  • Credit to Yu Hao's now deleted answer, since I learned I got the most important detail wrong from it. – djechlin Aug 31 '14 at 05:55
  • You should also point out that the *mathematical* operations are associative, *not* the actual operations in programming languages (especially on floats). E.g. `100 + 1e-15 + 1e-15 + ... + 1e-15` evaluates to `100` if you group the operations left-to-right but may evaluate to a number `> 100` if you group the operations from right-to-left and there are enough operands (e.g. with 8 `1e-15` addends). Btw: [APL](http://en.wikipedia.org/wiki/APL_%28programming_language%29) has right-to-left as default associativity if you wanted an example for your last paragraph. – Bakuriu Aug 31 '14 at 05:58
  • I'm not the downvoter, but the reason that the transformations are seen "in reverse order" as interpreted by the OP is not a matter of associativity or commutativity of the + or * operator; It's a matter of the transformation matrices being applied inner-to-outer against the vector. – Iwillnotexist Idonotexist Aug 31 '14 at 05:59
  • @IwillnotexistIdonotexist What does "inner-to-outer" mean with the lack of parentheses in the expression? – Code-Apprentice Aug 31 '14 at 06:02
  • @Bakuriu good catch, done (didn't add bit about APL, but the first technicality was important) – djechlin Aug 31 '14 at 06:04
  • @Code-Apprentice it's understood that the vector you are acting on is on the inside. algebraically matrix multiplication and vector multiplication are not quite equals. matrix multiplication takes place in a ring and vector multiplication is a module action. – djechlin Aug 31 '14 at 06:05
  • @IwillnotexistIdonotexist I think I explained that well enough post-edit? Keep in mind that can't actually happen since an overloaded * is still left-to-right associative. glm has to hack around this either via a cache of objects to multiply or via just actually left to right multiplying. – djechlin Aug 31 '14 at 06:06
  • @Code-Apprentice By that I meant that the order you associate the matrix multiplications here mattered not; Whichever way you pick, mathematically speaking what happens is that the original vector is transformed first by scale (because it is the innermost w.r.t. the vector), producing a scaled vector; Thence the rotation matrix transforms the scaled vector because it now is innermost, then translation. This is independent of association: Association left-to-right still just makes a matrix that scales first, rotates second and translates third, it's just that they'd be applied in one shot. – Iwillnotexist Idonotexist Aug 31 '14 at 06:08
  • @djechlin It's getting there, but it's important to recognize that the reason scaling is seen first is purely mathematically related, and there is no C++ trick underneath. It relates purely to matrix math. Anyways I +1'd because this is the best solution so far. – Iwillnotexist Idonotexist Aug 31 '14 at 06:12
  • @djechlin @Code-Apprentice Think of it this way: `v' = T*R*S*v`. If association is LR, you get `v' = ((T*R)*S)*v`, where `T*R` is computed first. It's a matrix that, when it left-multiplies, rotates first then translates. `(T*R)*S` is then computed; That's now a matrix that, when left-multiplying, scales first, rotates second and translates third. It's this matrix that left-multiplies `v`. _Now_, if `*` were associating RL, you'd have `v' = T*(R*(S*v))`. What this does is it first scales the vector, then rotates it, then translates it. Two ways of doing the same thing, do you see? – Iwillnotexist Idonotexist Aug 31 '14 at 06:17
  • @IwillnotexistIdonotexist Mathematically, the two results are the same. Conceptually, the right-to-left version makes it simpler to understand what is going on. – Code-Apprentice Aug 31 '14 at 07:02
  • @Code-Apprentice I agree the RL version was easier to explain; But the point I was driving home was that it's the inner-to-outerness of matrix multiplication that is guilty for the transformations being applied in the order they are, and that it is independent of the associativity of matrix multiplication. You can associate matrix multiplication any way you want, but if your rotation matrix is "closer" to the vector it left-multiplies than the translation, the transformation it describes will be applied first, geometrically speaking. – Iwillnotexist Idonotexist Aug 31 '14 at 07:07
9

You can see that from the following words:

When we combine operators to form expressions, the order in which the operators are to be applied may not be obvious. For example, a + b + c can be interpreted as ((a + b) + c) or as (a + (b + c)). We say that + is left-associative if operands are grouped left to right as in ((a + b) + c). We say it is right-associative if it groups operands in the opposite direction, as in (a + (b + c)).

A.V. Aho & J.D. Ullman 1977, p. 47

user207421
Daniel
  • what's a good example of right associativity which is actually used and makes a difference mathematically (not operationally, so ignoring floating point type issues)? In Haskell `==`, `/=`, `<` etc. are non-associative. And `^`, `:`, `$` are right-associative operators, meaning the first implied parenthesis is on the right hand side. Quoting @Marc below: "Consequently, in a nested use without explicit parentheses, the leftmost operator takes its immediate neighbours as operands, while the other instances take as left operand the (result of) the expression formed by everything to their left." – wide_eyed_pupil May 26 '21 at 16:30
  • [cont] and in the case of non-associative operators, it doesn't matter if the first implied parenthesis is on the left-most or right-most non-associative operator in the sub-expression or expression. – wide_eyed_pupil May 26 '21 at 16:32
  • Example with logical operators: (1 || 0) && 0 = 0; 1 || (0 && 0) = 1 – misiu_mp Mar 28 '23 at 18:49
5

Simplest and most non-tl;dr answer I found:

In most programming languages, the addition, subtraction, multiplication, and division operators are left-associative, while the assignment, conditional, and exponentiation operators are right associative.

thanks to: http://www.computerhope.com/jargon/a/assooper.htm
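
As a quick sketch, chained assignment in C++ works precisely because = is right-associative:

#include <iostream>

int main() {
    int a = 0, b = 0, c = 5;

    // a = b = c parses as a = (b = c): c is assigned to b first,
    // and the result of that assignment (5) is then assigned to a.
    a = b = c;

    std::cout << a << ' ' << b << '\n';  // prints "5 5"
    return 0;
}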

killjoy
4

An infix operator (or, more generally, an expression type that has unclosed left and right sub-expressions) is left-associative if, in nested use of this operator (expression type) without explicit parentheses, the implicit parentheses are placed at the left. Since * is left-associative in C++, a*b*c means (a*b)*c. In case of deeper nesting, a cluster of implicit parentheses occurs on the left end: (((a*b)*c)*d)*e.

Equivalently this means the syntactic production rule for this operator is left-recursive (meaning the left sub-expression has the same syntactic category as the one this rule is a production for, so that the same rule (same operator) may be used directly to form that sub-expression; the sub-expression at the other end has a more restrictive syntactic category, and using the same operator there would require explicit parentheses). In C++, one production for multiplicative-expression (section 5.6 in the Standard) reads multiplicative-expression * pm-expression, with multiplicative-expression on the left.

Consequently, in a nested use without explicit parentheses, the leftmost operator takes its immediate neighbours as operands, while the other instances take as left operand the (result of) the expression formed by everything to their left.

I admit, I've been pushing this a bit (too far). My point is that nowhere above does the word "right" occur, nor is there any movement involved; associativity is a syntactic and therefore static matter. It is important where the implicit parentheses go, not in which order one writes them (in fact one does not write them at all, or else they would be explicit). Of course, for right-associativity, just replace each "left" by "right" above.

In conclusion, I can see no valid reason why one should call this left-to-right associativity (or grouping), but the fact is that people do (even the Standard does, though it is perfectly redundant given that explicit syntax rules are also given).

Confusion comes from explaining this, as is often done, by saying that (in the absence of explicit parentheses) the operators are performed from left to right (respectively from right to left for right-associative operators). This is misleading because it confuses syntax with semantics (execution), and it is only valid for operations with bottom-up evaluation (all operands are evaluated before the operator is). For operators with special evaluation rules it is simply wrong.

For the operators && (and) and || (or), the semantics is to evaluate the left operand first, then the operator itself (namely, decide whether the left or the right operand will produce the result), possibly followed by the evaluation of the right operand. This left-to-right evaluation is totally independent of the associativity: the operators happen to be left-associative, probably because all non-assignment binary operators are, but (c1 && c2) && c3 (with redundant parentheses where they would already implicitly be) has an execution equivalent to c1 && (c2 && c3) (namely, evaluate the conditions from left to right until one returns false and return that, or return true if none does), and I cannot imagine a reasonable compiler generating different code for the two cases. Actually I find the right grouping more suggestive of how the expression is evaluated, but it really makes no difference; the same goes for ||.
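
A small sketch of that point about && (the check helper exists only to make the evaluation order visible):

#include <iostream>

// Prints its name when evaluated, so the evaluation order can be observed.
bool check(const char* name, bool value) {
    std::cout << "evaluating " << name << '\n';
    return value;
}

int main() {
    // The grouping C++ actually uses for c1 && c2 && c3 ...
    bool left = (check("c1", true) && check("c2", false)) && check("c3", true);
    // ... and the explicit right grouping: both evaluate c1, then c2,
    // and both skip c3 once a condition has returned false.
    bool right = check("c1", true) && (check("c2", false) && check("c3", true));

    std::cout << left << ' ' << right << '\n';  // 0 0
    return 0;
}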

This is even clearer for the conditional (ternary) operator ? ... :. Here associativity applies, because there are open sub-expressions at both sides (cf. my opening sentence); the middle operand is enclosed in ? and : and never requires additional parentheses. Indeed this operator is declared to be right-associative, which means that c1 ? x : c2 ? y : z should be read as c1 ? x : (c2 ? y : z) rather than as (c1 ? x : c2) ? y : z (the implicit parentheses are to the right). However, even with the implicit parentheses, the two ternary operators are executed from left to right; the explanation is that the semantics of the ternary operator does not evaluate all sub-expressions first.
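
For instance, a hypothetical sign function written with two nested conditionals:

#include <iostream>

// ?: is right-associative, so this parses as x > 0 ? 1 : (x < 0 ? -1 : 0),
// yet the two conditions are still evaluated from left to right.
int sign(int x) {
    return x > 0 ? 1 : x < 0 ? -1 : 0;
}

int main() {
    std::cout << sign(7) << ' ' << sign(-3) << ' ' << sign(0) << '\n';  // 1 -1 0
    return 0;
}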


Back to the example from your question, left-associativity (or grouping left-to-right) means that your matrix-vector product is parsed as ((M1*M2)*M3)*v. Although mathematically equivalent, it is virtually impossible that this gets executed as M1*(M2*(M3*v)), even though that would be more efficient. The reason is that floating-point multiplication is not truly associative (only approximately), and therefore neither is floating-point matrix multiplication; the compiler therefore cannot transform one expression into the other. Note that in ((M1*M2)*M3)*v one cannot say which of the matrices gets applied to a vector first, because none of them is: the matrix of the composite linear map gets computed first, and then that matrix gets applied to the vector. The result will be approximately equal to that of M1*(M2*(M3*v)), in which M3 is applied, then M2, and finally M1. But if you want things to happen like that, you have to write those parentheses.
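
A sketch with the matrices from the question (their construction is assumed to happen elsewhere):

#include <glm/glm.hpp>

// Parses as ((T * R) * S) * v: two mat4*mat4 products, then one mat4*vec4 product.
glm::vec4 transformComposed(const glm::mat4& T, const glm::mat4& R,
                            const glm::mat4& S, const glm::vec4& v) {
    return T * R * S * v;
}

// Explicit parentheses force the right grouping: three mat4*vec4 products and
// no full matrix product; the result agrees with the above only up to rounding.
glm::vec4 transformVectorFirst(const glm::mat4& T, const glm::mat4& R,
                               const glm::mat4& S, const glm::vec4& v) {
    return T * (R * (S * v));
}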

Marc van Leeuwen
2

a = (x * y) * z is left-to-right and a = x * (y * z) is right-to-left.

glm's matrix multiplication associates left-to-right because it overloads C++'s * operator, which is left-associative. The issue here is about the meaning of the matrix multiplications in terms of geometrical transformations, rather than the mathematical associativity.
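
One way to see the left-to-right grouping is to overload * on a toy wrapper type (not glm) that just records how its operands were paired:

#include <iostream>
#include <string>

struct Expr {
    std::string text;
};

// Each call wraps its two operands in parentheses, exposing the grouping.
Expr operator*(const Expr& lhs, const Expr& rhs) {
    return Expr{"(" + lhs.text + " * " + rhs.text + ")"};
}

int main() {
    Expr x{"x"}, y{"y"}, z{"z"};
    std::cout << (x * y * z).text << '\n';  // prints ((x * y) * z)
    return 0;
}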

Code-Apprentice
-2

Left-to-right associativity of an operator means that the right side of the operator should not contain any operator of higher precedence (priority), though it can contain one of the same priority. If there is an operator of higher priority on the right side of our operator, we have to solve it first. Example:

x = 2 + 3 * 3;

Here, for the + operator (left-to-right associativity), the right side contains the operator *, which has higher priority than +, so we have to solve it first:

x = 2 + 9;
x = 11;
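
A short check of both rules in C++ (precedence first, then left-to-right grouping among operators of the same priority):

#include <iostream>

int main() {
    int x = 2 + 3 * 3;     // * binds tighter: 2 + (3 * 3) == 11
    int y = 100 / 10 / 2;  // same precedence, left to right: (100 / 10) / 2 == 5
    std::cout << x << ' ' << y << '\n';  // prints "11 5"
    return 0;
}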
