So recently (today) I learned about CPU branch prediction. Basically, if the outcomes of your if statements are predictable, your code will run faster. Below is a C# console application I wrote that demonstrates the effect:
using System;
using System.Collections.Generic;
using System.Diagnostics;
/*
 * Below are my measured times for predicted / mispredicted branches, in seconds:
 *
 * Predictable: 0.91
 * Unpredictable: 1.61
 *
 * Summary: when the branch is predictable, the loop takes about 43% less time
 * (the unpredictable version is roughly 1.77x slower).
 */
namespace BranchPredictions
{
    class Program
    {
        static void Main(string[] args)
        {
            const int MAX = 100000000; // The number of branches to exercise
            bool predictable = true;   // When true, the list is filled in a predictable order
            var nums = new List<int>(MAX);
            var random = new Random();
            for (int i = 0; i < MAX; i++)
            {
                if (predictable)
                {
                    nums.Add(i);
                }
                else
                {
                    nums.Add(random.Next());
                }
            }

            int count = 0;
            var sw = Stopwatch.StartNew();
            foreach (var num in nums)
            {
                if (num % 2 == 0) // Here is the branch
                {
                    count++;
                }
            }
            sw.Stop();

            Console.WriteLine("Total count: {0}", count);
            Console.WriteLine("Time taken: {0}", sw.Elapsed);
            if (Debugger.IsAttached)
            {
                Console.Write("Press any key to continue..");
                Console.ReadKey(true);
            }
        }
    }
}
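As a side note: once I knew the misprediction was the cost, I also found you can sometimes remove the branch entirely, so there is nothing left to predict. Here is a minimal sketch of a branchless version of the counting loop (the `CountEvens` method name is my own, not part of the program above):

```csharp
using System;
using System.Linq;

class BranchlessDemo
{
    // Counts even numbers without a data-dependent branch:
    // (num & 1) is 0 for even values and 1 for odd values, so
    // 1 - (num & 1) adds exactly one per even element. The CPU
    // never has to guess which way an if statement will go.
    static int CountEvens(int[] nums)
    {
        int count = 0;
        foreach (var num in nums)
        {
            count += 1 - (num & 1);
        }
        return count;
    }

    static void Main()
    {
        var nums = Enumerable.Range(0, 10).ToArray();
        Console.WriteLine(CountEvens(nums)); // counts 0, 2, 4, 6, 8
    }
}
```

With this trick, the random and sequential lists should take roughly the same time, since the loop body no longer depends on a hard-to-predict condition.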
It opened my eyes: just by knowing a few hardware concepts, I can make certain code run much faster without any real code changes at all!
But it makes me wonder: what else does the hardware do that could also make my software run faster, if I'm aware of it?
I use Windows and C#, but these concepts should apply to most computers and languages.