My assignment is to write a compiler that compiles a small programming language to a fake assembly language (registers and memory cells are of unbounded length).
What I have to do is convert an instruction such as:
a = b * c
into a sequence of INC, SHL i, and ADD i instructions (where i is a pointer to a memory cell; there is only one register available) that performs the multiplication. The problem is that the algorithm must do its work in O(log n) time.
Could anybody explain, step by step, an algorithm that solves this problem? A friend of mine mentioned the Booth-McSorley algorithm, but I hardly understand it.
EDIT: The fake assembly instructions are: READ, WRITE, LOAD i, STORE i, ADD i, SUB i, SHR i, SHL i, INC, RESET, JUMP k, JZERO k (jump to k if the register is zero), JODD k (jump to k if the register is odd).
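For context, the usual way to get the O(log n) bound is plain shift-and-add (binary, or "Russian peasant") multiplication: test the low bit of the multiplier (which is what JODD does), add the shifted multiplicand when that bit is set, then shift both operands. A minimal Python sketch of the idea follows; the variable names are mine, and a real compiler would of course keep these values in memory cells and drive them through the single register:

```python
def multiply(b: int, c: int) -> int:
    """Shift-and-add multiplication of b * c in O(log c) iterations.

    The comments note the rough fake-assembly counterpart of each step.
    """
    acc = 0
    while c > 0:        # JZERO: stop once the multiplier is exhausted
        if c & 1:       # JODD: is the low bit of c set?
            acc += b    # ADD: accumulate the shifted multiplicand
        b <<= 1         # SHL: double b
        c >>= 1         # SHR: halve c, dropping the bit just tested
    return acc
```

The loop runs once per bit of c, so for an n-bit multiplier it takes O(n) = O(log c) iterations, which is where the O(log n) requirement (n being the value being multiplied) comes from.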