In Phrudhvi's elegant answer to the question "What is pseudopolynomial time? How does it differ from polynomial time?", it's stated, among many other important points, that under the real, formal definition of time complexity, the TC of their example algorithm is exponential in x.
I understood everything in the answer except this point. His example algorithm is O(n^4). He then states that the formal definition of TC is based on the number of bits required to represent n, so I expected him to say that the TC would be O(x), where x is the number of bits. Instead, he said it's O(2^(4x)).
I have an idea of why I got confused, and what I think is actually happening. Could you tell me if I'm now right?
Here's why I think I got confused: when Phrudhvi said that, formally, TC is based on the number of bits used to represent your input, I thought he meant that the time is now counted per bit, i.e. the same kind of bound people usually state in terms of n, but now linear in the number of bits instead.
However, what I now assume he means, and what I think is correct, is this: even under the formal definition of time complexity, the time taken (for his example and for algorithms in general) is indeed based on the size of the input n, and in his example it is O(n^4). The catch is that n itself grows exponentially with x: a value stored in x bits can be as large as 2^x.
Both expressions of the time complexity are accurate. But since the formal definition wants the O(n^4) running time expressed as a function of x, and n can be as large as 2^x, in terms of x the formal TC is O((2^x)^4) = O(2^(4x)).
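To convince myself of this, I also checked the relationship numerically. Here's a small Python sketch (my own illustration, not from Phrudhvi's answer) showing that any n with bit length x satisfies n < 2^x, so an O(n^4) operation count is bounded by 2^(4x):

```python
# My own illustration, not from the original answer:
# n's magnitude is exponential in x, the number of bits needed to write n down.
for n in [7, 100, 1000, 10**6]:
    x = n.bit_length()            # x = number of bits representing n
    assert n < 2 ** x             # n fits in x bits, so n < 2^x
    # hence an O(n^4) operation count is at most (2^x)^4 = 2^(4x)
    assert n ** 4 < 2 ** (4 * x)
```

So the same running time is polynomial when measured against n, yet exponential when measured against x.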
Is this right? Many thanks, indeed!