I wanted to get some numbers on how fast multiplication is compared to addition. I wrote a simple program that multiplies two numbers and records the time taken, then adds the two numbers in a loop and records the time taken. The results are a bit disturbing. Before I show the results, here is the code:
package com.np.fun;

import java.util.Scanner;

import org.apache.commons.lang3.time.StopWatch;

public class HowSlowIsMultiplication {
    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);
        int x = scanner.nextInt();
        int y = scanner.nextInt();
        scanner.close();
        long z;

        // Time a single multiplication.
        StopWatch stopWatchMultiply = new StopWatch();
        stopWatchMultiply.start();
        z = x * y;
        stopWatchMultiply.stop();
        System.out.println("Time taken for multiplication is : " + stopWatchMultiply.getNanoTime());

        // Time min(x, y) additions of max(x, y).
        StopWatch stopWatchAdd_1 = new StopWatch();
        stopWatchAdd_1.start();
        for (int i = 0; i < Math.min(x, y); i++) {
            z = z + Math.max(x, y);
        }
        stopWatchAdd_1.stop();
        System.out.println("Time taken for adding in less for loops is : " + stopWatchAdd_1.getNanoTime());

        // Time max(x, y) additions of min(x, y).
        StopWatch stopWatchAdd_2 = new StopWatch();
        stopWatchAdd_2.start();
        for (int i = 0; i < Math.max(x, y); i++) {
            z = z + Math.min(x, y);
        }
        stopWatchAdd_2.stop();
        System.out.println("Time taken for adding in more for loops is : " + stopWatchAdd_2.getNanoTime());
    }
}
I tried this with varying values of x and y. Here is the output for x=10000 and y=5000 (all times are in nanoseconds):
Time taken for multiplication is : 61593
Time taken for adding in less for loops is : 1622599
Time taken for adding in more for loops is : 1622599
As you can see, multiplication comes out more than an order of magnitude faster than the addition loops.
Any reasons for this?
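Note that the comparison above times one multiplication against thousands of loop additions (for x=10000 and y=5000 the first loop performs 5000 additions, plus the loop-control overhead and the repeated Math.min/Math.max calls), so it is not a one-operation-versus-one-operation measurement. A minimal sketch of a like-for-like measurement, using the JDK's own System.nanoTime() instead of Commons Lang (the class name, helper names, and hard-coded inputs here are illustrative, not from the original code):

```java
public class SingleOpTiming {
    // Each helper performs exactly one arithmetic operation,
    // widened to long so the product cannot overflow int.
    static long multiplyOnce(int a, int b) {
        return (long) a * b;
    }

    static long addOnce(int a, int b) {
        return (long) a + b;
    }

    public static void main(String[] args) {
        int x = 10000, y = 5000;

        long t0 = System.nanoTime();
        long product = multiplyOnce(x, y);
        long t1 = System.nanoTime();
        long sum = addOnce(x, y);
        long t2 = System.nanoTime();

        System.out.println("one multiply: " + (t1 - t0) + " ns (result " + product + ")");
        System.out.println("one add:      " + (t2 - t1) + " ns (result " + sum + ")");
    }
}
```

Even this is unreliable at the single-operation scale: both instructions complete in well under a nanosecond on modern hardware, so timer resolution, JIT compilation, and measurement overhead dominate the reading. A trustworthy comparison needs warmup and many repetitions, for example via a benchmarking harness such as JMH.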