From my understanding, octal digits in a number map to 3-bit segments, so octal can never exactly represent a binary string whose length is a power of 2.

Out of curiosity: when/why/how is this useful? :-)

If the base of the primary representation of a word in an architecture is a prime(!) power of 2, at least make sure it's even ;-).

user76568
  • I think it's more about history than it is about current usefulness. I've used at least one computer a long time ago (something called an Alpha Micro) that actually dealt in octal values. – mah Mar 11 '14 at 18:00
  • Basically if you don't want letters in your numbers. Octal commonly requires additional 0's to be implicitly added to the left, but that doesn't usually change the value. – Mooing Duck Mar 11 '14 at 18:00
  • If you have a 6-bit computer... – Rob Mar 11 '14 at 18:01
  • @Rob If you have a 36 bit computer. (They were very common once upon a time.) – James Kanze Mar 11 '14 at 18:06
  • Octal notation provides a handy shorthand for file permissions, which are grouped in 3's (read, write, execute) on Unix-ish systems (see the sketch after these comments). – Caleb Mar 11 '14 at 18:12
  • It is not useful, it is a virus. Which escaped from Maynard, MA in the early 1970s, home of the Digital Equipment Corporation (aka DEC). All of their documentation and tooling used octal notation. Which in turn infected C and Unix, PDP-11 was the principal carrier. Which infected many subsequent languages since their compilers and runtime libraries were often first implemented in C. – Hans Passant Mar 11 '14 at 18:19
  • Octal is a *very* useful notation in the documentation of several assembler opcodes -- from memory, 80x86, Z80, MC68K. (ARM RISC CPUs have more registers, operators, and flags, and so their assembly is based on nibbles instead.) – Jongware Mar 11 '14 at 18:25
  • @HansPassant Hah. I agree about the virus part, but fortunately it seems it's far enough from evolving properly and finding its way back into source code... ;-) – user76568 Mar 11 '14 at 18:27
  • 'chmod 777' - about the only thing I can think of. – Martin James Mar 11 '14 at 18:50
  • Shorter to type `'\0'` (an octal escape sequence) than `'\x0'`. – chux - Reinstate Monica Mar 11 '14 at 19:52
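
To make the permission and escape-sequence comments concrete, here is a minimal C sketch (not part of the original thread; `0754` is an illustrative mode value):

```c
#include <stdio.h>

int main(void)
{
    /* Octal literals start with a leading 0 in C.  Each octal digit maps
     * to one rwx triplet, so Unix permission bits read off directly.
     * 0754 (rwxr-xr--) is an illustrative value. */
    unsigned mode = 0754;

    printf("owner=%o group=%o other=%o\n",
           (mode >> 6) & 07, (mode >> 3) & 07, mode & 07);

    /* '\0' is itself an octal escape sequence with value 0. */
    printf("'\\0' has value %d\n", '\0');
    return 0;
}
```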

2 Answers


Octal can represent a binary string of any length exactly, although it's true that with 8-bit bytes and byte addressing, hexadecimal seems more natural. Historically, however...

  • a lot of machines had 36-bit words, where octal made a lot of sense, and

  • on the PDP-11 (on which the first C compilers ran), the machine instructions were divided into 3-bit groups: the high bit flagged whether the operation was on bytes or words, then came a 3-bit op-code, and two 6-bit operand fields, each with the first 3 bits the addressing mode and the second 3 the register involved (see the sketch after this list).
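
A minimal C sketch of that field layout (hypothetical code, not from the answer; `012701` is the PDP-11 encoding of an immediate-to-R1 MOV). Because every field lands on an octal digit boundary, the octal dump reads off directly:

```c
#include <stdio.h>

/* Every field of a PDP-11 double-operand instruction lands on an octal
 * digit boundary, so an octal dump is directly readable. */
static void decode(unsigned insn)
{
    unsigned byte_op  = (insn >> 15) & 01;  /* byte vs. word operation */
    unsigned opcode   = (insn >> 12) & 07;  /* 3-bit op-code           */
    unsigned src_mode = (insn >> 9)  & 07;  /* source addressing mode  */
    unsigned src_reg  = (insn >> 6)  & 07;  /* source register         */
    unsigned dst_mode = (insn >> 3)  & 07;  /* destination mode        */
    unsigned dst_reg  =  insn        & 07;  /* destination register    */

    printf("%06o: byte=%o op=%o src=%o%o dst=%o%o\n",
           insn, byte_op, opcode, src_mode, src_reg, dst_mode, dst_reg);
}

int main(void)
{
    decode(012701);  /* MOV: immediate source (mode 2, reg 7) to R1 */
    return 0;
}
```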

At the time when C was first being invented, octal was probably more frequently used than hexadecimal, and so the authors of the language provided for it. (I can't recall seeing it actually used in recent code for a very long time, however.)

James Kanze

At the time of Unix development there were machines (from DEC) with 36-bit words. A 36-bit word can be divided into four 9-bit bytes, each of them representable by three octal digits.
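
A short C sketch (illustrative, not from the answer) of that correspondence: masking out the 9-bit bytes of a 36-bit word yields exactly the word's octal digits, three at a time.

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* A 36-bit word, written as 12 octal digits (the value is made up). */
    uint64_t word = 0123456701234ULL;

    /* Each 9-bit byte is exactly three octal digits of the word. */
    for (int i = 3; i >= 0; i--)
        printf("%03o ", (unsigned)((word >> (9 * i)) & 0777));
    printf("\n");  /* prints: 123 456 701 234 */
    return 0;
}
```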

ouah
  • IBM also, and indeed principally, by market share. As a matter of fact the 18-digit COBOL rule comes from 36-bit IBM hardware and BCD. – user207421 Mar 11 '14 at 18:41