Short answer: No.
Slightly longer answer: As everyone mentioned in the comments, Big-O notation assumes unbounded resources, so implementation issues (hardware-related ones in particular) have no bearing on it. It's purely a description of the algorithm: how it would hypothetically scale as you give it arbitrarily large inputs.
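To make that concrete, here's a minimal sketch (the `steps` counter is just for illustration) of what Big-O actually measures: an operation count as a function of input size, not wall-clock time on any particular machine.

```python
def linear_search(items, target):
    """O(n): the number of steps grows linearly with len(items),
    no matter how fast the hardware executes each step."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            return steps
    return steps  # worst case: target absent, n steps taken

# Doubling the input doubles the worst-case step count. That
# growth ratio is what Big-O captures, not absolute runtime.
assert linear_search(list(range(1000)), -1) == 1000
assert linear_search(list(range(2000)), -1) == 2000
```

Faster hardware shrinks the constant factor on each step, which is exactly the part Big-O throws away.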
That said, if you WERE to include hardware constraints in the definition, the answer would still be no, because Big-O counts operations actually performed -- processor cycles, iterations, comparisons, that sort of measure of doing work. If you recurse so deep that you run out of stack, you don't keep performing algorithm-related operations forever; the program just stops, and you aren't "doing anything" anymore. There's nothing more to measure, and certainly nothing infinite.
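You can watch this happen in a short sketch (Python chosen for convenience; its interpreter enforces an artificial recursion limit standing in for a real stack): the recursive sum below is O(n) in theory, but past the stack limit it doesn't run infinitely -- it simply aborts.

```python
import sys

def recursive_sum(n):
    """Sum 1..n recursively: O(n) time, but also O(n) stack depth."""
    if n == 0:
        return 0
    return n + recursive_sum(n - 1)

# Within the stack limit, the O(n) behavior is observable as usual.
assert recursive_sum(100) == 5050

# Past the limit, the call aborts with RecursionError: zero further
# operations are performed, so there's nothing "infinite" to measure.
sys.setrecursionlimit(1000)
try:
    recursive_sum(100_000)
    crashed = False
except RecursionError:
    crashed = True
assert crashed
```

The crash is a property of this machine and runtime, not of the algorithm, which is exactly why it doesn't show up in the Big-O analysis.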
And anyway, the point of Big-O is to be useful: to help you understand and compare how algorithms scale. Saying "everything is Big-O of infinity", even if it were right, wouldn't accomplish any of that. It'd just be a loophole in the definition.