To second the answer given by Henry: the algorithm in the original question does in fact have a polynomially bounded running time, provided the input is given in unary encoding!
More precisely, the runtime bound depends not only on the algorithm itself but also on the encoding scheme used for the input. Consider the following algorithm, written in C:
#include <unistd.h>  /* for sleep() */

/* INPUT: integer n */
void wait_n_seconds(int n)
{
    for (int i = 0; i < n; i++)
    {
        sleep(1);  /* wait one second */
    }
}
Apparently, the algorithm takes $n$ seconds to terminate; the running time is linear in $n$. If the input is encoded in unary, the running time scales linearly in the encoding length of $n$. However, if $n$ is encoded in binary, the running time scales exponentially in the encoding length of $n$, since the encoding length of $n$ scales only logarithmically in the value of $n$: the binary encoding of $n$ uses $\lfloor \log_2 n \rfloor + 1$ bits.
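To make the contrast concrete, here is a small C sketch that computes both encoding lengths for a given $n$ (the helper names unary_length and binary_length are my own, for illustration only):

#include <stdio.h>

/* Length of the unary encoding of n: simply n symbols. */
unsigned unary_length(unsigned n) { return n; }

/* Length of the binary encoding of n (for n >= 1):
   floor(log2(n)) + 1 bits. */
unsigned binary_length(unsigned n)
{
    unsigned bits = 0;
    while (n > 0) { bits++; n >>= 1; }
    return bits;
}

int main(void)
{
    unsigned n = 1000000;
    /* The waiting algorithm above takes n = 1000000 seconds:
       linear in unary_length(n) = 1000000, but exponential in
       binary_length(n) = 20. */
    printf("unary: %u, binary: %u\n", unary_length(n), binary_length(n));
    return 0;
}

For $n = 1000000$ this prints "unary: 1000000, binary: 20", so the running time, which is linear in $n$, is exponential when measured against the 20-bit binary encoding of the input.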
In a nutshell, the claim that the algorithm in the question is not polynomial is, without additional information about the input encoding, not correct. However, it is indeed the convention that binary encoding (or some other positional notation) is assumed unless stated otherwise.
That being said, I admit that the dependence of the runtime bound on the encoding scheme tends to be taught somewhat imprecisely. The term pseudo-polynomial is also floating around: an algorithm is called pseudo-polynomial if its running time is polynomial in the numeric value of the input but not necessarily in its encoding length; the waiting algorithm above is a prime example.
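For a less artificial illustration, here is a classic textbook example of a pseudo-polynomial algorithm (my own sketch, not from the question): trial-division primality testing.

#include <stdbool.h>

/* Trial-division primality test: the loop runs O(n) times,
   which is polynomial in the VALUE of n but exponential in
   the binary encoding length of n. */
bool is_prime(unsigned n)
{
    if (n < 2) return false;
    for (unsigned d = 2; d < n; d++)
        if (n % d == 0) return false;
    return true;
}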