Most of the time, real (non-integer) numbers are represented on computers in a floating-point format. However, this format has some drawbacks, and for some applications other representations could be more useful.
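For instance, one well-known drawback is that decimal fractions such as 0.1 have no exact binary floating-point representation, so small rounding errors creep in. A minimal Python illustration:

```python
# 0.1 and 0.2 cannot be represented exactly in binary floating point,
# so the sum is not exactly 0.3.
print(0.1 + 0.2 == 0.3)        # False
print(f"{0.1 + 0.2:.20f}")     # 0.30000000000000004441
```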
I know that fixed-point arithmetic used to be common, but it was largely abandoned in favour of floating point for general-purpose computing. Apart from these two, though, I have not heard of any other ways to represent real numbers on a computer.
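(For context, here is a minimal sketch of what I mean by fixed point, assuming a simple scaled-integer representation; the number of fractional bits and the helper names are just illustrative choices:)

```python
# Fixed-point sketch: a real number is stored as an integer scaled by 2**FRAC_BITS.
FRAC_BITS = 16
SCALE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    # Convert a float to the nearest fixed-point value.
    return round(x * SCALE)

def fixed_mul(a: int, b: int) -> int:
    # The raw product carries 2*FRAC_BITS fractional bits,
    # so shift right to return to FRAC_BITS.
    return (a * b) >> FRAC_BITS

def to_float(a: int) -> float:
    return a / SCALE

a = to_fixed(1.5)
b = to_fixed(2.25)
print(to_float(fixed_mul(a, b)))  # 3.375
```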
My question is: what number formats have been proposed for representing real numbers, and for which applications might they be more useful than the standard floating-point format?
*I'm not sure if this is the appropriate site to post this question, so please suggest a better one if not.*