I want to write a function in C that takes seconds and nanoseconds as input, converts both to microseconds, and returns the total in microseconds.
unsigned long long get_microseconds(int seconds, unsigned long long nSeconds);
The conversion itself is pretty trivial; I can use the following formula:
uSeconds = seconds*1000000 + nSeconds/1000 (loss of precision in the nanosecond conversion is fine; my timer's minimum resolution is 100 microseconds anyway)
What would be the fastest way to implement this equation without using the multiplication and division operators, so as to get the best accuracy in the fewest CPU cycles?
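For reference, this is the straightforward implementation I would start from (a minimal sketch; the widening cast on seconds is my own assumption, added so the multiply does not overflow a 32-bit int before the addition):

/* Straightforward version of the formula above.
   The cast widens `seconds` to 64 bits before the multiply,
   so the intermediate result cannot overflow a 32-bit int. */
unsigned long long get_microseconds(int seconds, unsigned long long nSeconds)
{
    return (unsigned long long)seconds * 1000000ULL + nSeconds / 1000ULL;
}

The question is whether this can be done faster by avoiding the multiply and divide entirely.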
EDIT: I am running on a custom DSP with a GNU-based but custom-designed toolchain. I have not actually measured the performance of these arithmetic operations; I am simply curious whether they would affect performance and whether there is a way to improve them.