I was wondering about:
- What aspects of a computer affect how fast a program runs?
- How can I calculate an estimate of the time it would take a computer with given specs (say, x MB of memory) to run a program to completion?
For example, how can I know how long it would take a computer to compute
2^44523432542352352342 - 1 mod 447324523
(just random numbers)
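
For what it's worth, I tried this exact computation in Python and it finished instantly, because the built-in three-argument `pow` uses square-and-multiply, so the work grows with the bit length of the exponent rather than with the exponent itself (this is just my own test script, not anything from mersenne.org):

```python
# Compute 2^44523432542352352342 - 1 (mod 447324523).
exponent = 44523432542352352342
modulus = 447324523

# Square-and-multiply needs roughly one squaring per bit of the
# exponent, so ~66 squarings here instead of ~4.5 * 10^19 multiplications.
print(exponent.bit_length())   # 66

result = (pow(2, exponent, modulus) - 1) % modulus
print(result)
```

So a small case like this is clearly not the problem, which makes me wonder where the much larger time estimates come from.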
According to mersenne.org, these kinds of computations can take up to weeks even on very fast computers. So how are they able to come up with such time estimates?
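
My naive guess is that such estimates come from counting the dominant operations and dividing by the machine's speed. For instance, testing whether 2^p - 1 is prime with the Lucas-Lehmer test takes p - 2 iterations, each dominated by squaring a p-bit number. Here is a back-of-envelope sketch of that idea (the cost constant and the ops/sec figure are placeholders I made up, not mersenne.org's actual model):

```python
import math

# Rough cost model for a Lucas-Lehmer test of 2^p - 1:
# p - 2 iterations, each dominated by one squaring of a p-bit number.
# Assume FFT-based squaring costs on the order of p * log2(p) operations
# (the hidden constant is taken to be 1, which is certainly not exact).
p = 82589933                          # exponent of a known Mersenne prime
ops_per_squaring = p * math.log2(p)   # assumed asymptotic cost
total_ops = (p - 2) * ops_per_squaring

machine_ops_per_sec = 1e10            # placeholder: ~10 billion ops/sec

seconds = total_ops / machine_ops_per_sec
print(f"~{seconds / 86400:.0f} days") # order-of-magnitude only
```

Even with made-up constants this lands in the weeks-to-months range, so I suspect the real estimates are a refined version of this kind of operation counting, but I'd like to know how it's actually done.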