I need to compare the performance of two algorithmic approaches to the same DSP function, written in C++ using VS2017.
Each algorithm takes the same signal data and processes it into the same output over the same period of time - the processing is sample-frequency-based, in this instance 48 kHz.
I really want to know the instruction count - or some valid proxy for it - so I can compare the two and draw a conclusion as to which is more efficient.