In computers, a serial decimal numeric representation is one in which ten bits are reserved for each decimal digit, and exactly one of those bits is turned on to indicate which of the ten possible digit values is intended (a 1-of-10, or one-hot, code). ENIAC and CALDIC used this representation.[1]
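The following is a minimal sketch of the digit coding described above, written in Python for illustration; the function names encode_digit and decode_digit are hypothetical and do not correspond to anything in ENIAC or CALDIC, which implemented the scheme in hardware rather than software.

    # Illustrative sketch of 1-of-10 ("serial decimal") digit coding:
    # each decimal digit occupies ten bit positions, and exactly one
    # position is set to indicate the digit's value.

    def encode_digit(d: int) -> list[int]:
        """Encode one decimal digit as ten bits with exactly one bit on."""
        if not 0 <= d <= 9:
            raise ValueError("digit must be 0-9")
        bits = [0] * 10
        bits[d] = 1          # bit position d represents digit value d
        return bits

    def decode_digit(bits: list[int]) -> int:
        """Recover the digit as the index of the single set bit."""
        if len(bits) != 10 or sum(bits) != 1:
            raise ValueError("expected ten bits with exactly one set")
        return bits.index(1)

    # Example: the two digits of 42, ten bits per digit.
    for digit in [4, 2]:
        print(digit, encode_digit(digit))
    # 4 [0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
    # 2 [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]

Note that this representation spends ten bits on a quantity that binary-coded decimal encodes in four; its appeal in early machines was that one-of-ten signals mapped directly onto decade ring counters.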
