I work as a programmer, but have no computer science background, so recently I've been following along with the excellent MIT OpenCourseWare intro to Computer Science and Programming, in the course of which the question is asked: "will any program written using only function definitions and calls, the basic arithmetic operators, assignment and conditionals run in constant time?"
I thought the answer was yes, since all of these operations seem pretty simple. But as you smart people probably already knew, the correct answer was no, apparently because of the potential for indefinite recursion.
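If I'm following the explanation correctly, I think the kind of counterexample they have in mind looks something like this (my own Python sketch, not taken from the course):

```python
def countdown(n):
    # Only a function definition, a conditional, arithmetic,
    # and a recursive call -- yet the number of calls grows with n,
    # so the total running time is not constant.
    if n <= 0:
        return 0
    return 1 + countdown(n - 1)
```

Calling countdown(10) makes 11 calls, while countdown(1000) makes 1001, so the time taken depends on the input rather than being some fixed bound.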
It seems like I just don't understand the implications of "in constant time". The way I pictured it, the phrase just meant that a simple operation would take a known, fixed amount of time to complete. I accept that recursion can lead to your program running forever, but aren't the individual operations each still taking a finite and measurable amount of time... even if there are now infinitely many of them? Or does the answer just mean that two infinitely recursive programs cannot validly be said to take the same amount of time to run?
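To make my confusion concrete, this is roughly the sort of program I'm picturing (again just my own sketch):

```python
def run_forever(x):
    # Each individual call does a tiny, fixed amount of work,
    # but the chain of calls never ends on its own
    # (in practice Python cuts it off with a RecursionError).
    return run_forever(x + 1)
```

Each call in that chain seems to take roughly the same small amount of time; it's just that the chain never finishes.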
If anyone can give me an authoritative definition of "in constant time", and the implications thereof, I'd be very grateful!