
In both languages the basic source character set includes every printable ASCII character except @, $ and `. I can understand not using the grave accent, because it is not always interpreted as a separate character and it also looks very similar to an apostrophe. But is there a specific reason why @ and $ have no use, or did the language designers just run out of ideas? :)

Timo
  • My guess is it probably has something to do with old keyboard layouts, but I'd be interested to hear the real reason. – Owen Aug 23 '11 at 04:44
  • 2
    `@` is used in objective-c which is strict superset of c and will be broken if c uses it. About $ I don't know. Why do you think they should be used? – Daniel Aug 23 '11 at 04:44
  • Probably to keep people that want to build custom C "preprocessors" sane, by giving them safe escape characters :) – Earlz Aug 23 '11 at 04:45
  • 13
    Should using every character be a goal in language design? – Benjamin Lindley Aug 23 '11 at 04:46
  • ¥ and € aren't used either; talk about discrimination. – laurent Aug 23 '11 at 05:01
  • 2
    Historically `vi` editor was found around the `C` language time frame. And in `vi` editor, actually `$` symbol internally notifies the meaning of end of line (though it is not used explicitly). However `@` can have some good usages; it should have been in the language. – iammilind Aug 23 '11 at 05:02
  • 1
    @Laurent: ¥ and € are not ASCII. – Thilo Aug 23 '11 at 05:19
  • 5
    If their goal was to find a meaning for every punctuation character on the keyboard, they'd have created Perl ;^) – Jeremy Friesner Aug 23 '11 at 05:22
  • 1
    @Thilo: Not to forget that C predates the Euro by a couple decades. – Dietrich Epp Aug 23 '11 at 06:14
  • DEC C allowed $ in identifiers. – SK-logic Aug 23 '11 at 06:22
  • 1
    Some of the answers are less good than the question, but I am unsure that this is the question's fault. The question looks like a good question to which no one knew the answer. Reopen. – thb Feb 26 '19 at 21:40

5 Answers


I can't imagine what role they would fill. Perhaps using @ to signify pointers...

But $ and @ are very busy-looking symbols, almost distracting, and if you throw them into the mix with an already diverse syntax just because they're there, you might end up with a language that reads like a Perl regex. Which is to say, it doesn't read at all. :P

Anne Quinn
  • Many Pascal implementations use "$" as a hex prefix. Vastly superior to the bulky and ugly 0x. – supercat Sep 01 '11 at 03:32
  • 1
    @supercat - It must be nice to type a `$` followed by eight digits. :P – Anne Quinn Sep 01 '11 at 05:34
  • More commonly two or four digits. Using "0x" as a hex prefix, comma-separated bytes take up five characters each, and comma-separated words take up seven. Even sixteen bytes or twelve words will wrap an 80-character line. Using "$" as a hex prefix, comma-separated bytes take four characters whether written in hex or decimal, and 16-bit words likewise take six. Sixteen comma-separated bytes will fit in 64 characters, and ten comma-separated words will fit in 70. BTW, I will grant that the "0x" for hex isn't as horrible as the "0" for octal. That's just evil. – supercat Sep 01 '11 at 19:24

@ was a bad idea because it was the kill character. If you were typing in a program and you accidentally hit @ then you erased the entire line of input up to that point.

# was more or less a bad idea because it was the erase character. If you were typing in a program and you accidentally hit # then you erased the most recent character.

When the preprocessor was added to the C language, # was accepted in the first column of a line, but not anywhere else. So maybe ed was modified to allow # to be input as the first character of a line, since there was nothing before it to be erased.

So why didn't the preprocessor use $ instead of #? There you go: I answered half of your question, but added a new question to the other half.

Newspaper articles historically didn't include the @ character. After the internet became common, some reporters or editors put the 4-character string "(at)" in newspaper articles because they couldn't or wouldn't use some escape sequence to put an actual @ in the article. Unix's definition of the kill character @ was copied from newspaper equipment.

http://en.wikipedia.org/wiki/Seventh_Edition_Unix_terminal_interface

Windows programmer
  • Pascal (slightly older than C, IIRC), used @ as address operator. ;-) – Rudy Velthuis Aug 23 '11 at 07:05
  • Pascal wasn't designed in tandem with Unix, didn't copy the kill character from newspaper equipment, and originally didn't use curly braces or other US-dependent characters. – Windows programmer Aug 23 '11 at 08:25
  • Yes, well, that may be true. Apparently for Wirth, @ was not a bad idea and not a "kill character". – Rudy Velthuis Aug 23 '11 at 08:35
  • Yes I agree, for Wirth @ was not a bad idea because he didn't develop Pascal in tandem with Unix, and Unix copied @ from newspaper equipment as a kill character when typing input not when being parsed by compilers. If C were not developed in tandem with Unix then I bet it would use @. If Unix originally used backspace and control-X the way it often does now, I bet @ wouldn't have been a bad idea in programming languages that were developed in tandem with Unix. – Windows programmer Aug 24 '11 at 02:23

The first question you should ask is, "Why are only certain characters allowed in C/C++ function and variable names"?

Even I am not quite old enough to answer that... But I would bet that many special characters (especially $) were not legal in external symbols in the original Unix. That is, the assembler and linker would choke on them.

So the only use for non-alphanumeric characters was in operators, like + or ->. The original designers presumably had all the operators they needed, so there was no reason to use $ or @ or whatever. (How do you do the back-tick markdown, anyway?)

With the advent of C++ and name mangling, most restrictions on identifier names could presumably be lifted. But even the C++ committee is not going to break with tradition for no reason at all.

Anyway, this is just my guess. I do know that to answer your question definitively, you will need to virtually transport yourself to 1973...

Nemo

I don't see any specific reason for them to be used.

I mean, maybe $ and @ could have been used to denote scalars and arrays as in Perl, but I see little benefit in adding a character to every variable name.

Also, in C, arrays are really just syntactic sugar for pointers, so they can be used in a scalar context of sorts.

Maybe they could have been permitted in variable names, like any other valid character.

Or maybe the reason is simply that they didn't think about it, because there wasn't really any reason to include them.

Go ask K&R :)

Federico klez Culloca
  • `$` is a valid part of an identifier in Scala, Java and JavaScript -- it doesn't strictly have to be for sigils. (An identifier which contains a $ in Java or Scala might be for an automatically generated classname.) –  Aug 23 '11 at 04:50
  • 2
    Arrays are [not](http://stackoverflow.com/questions/4810664/) just syntactic sugar for pointers. – fredoverflow Aug 23 '11 at 04:55
  • @Fred: You linked to a C++ faq. Is your assertion also true for C? – Robert Harvey Aug 23 '11 at 05:14
  • @Robert: in both languages, an array is an actual piece of storage, i.e. an array of 10 elements has a size of 10*sizeof(element), while a pointer is simply a variable that contains one address. That one can assign an array to a pointer of the same base type is a convenience. But an array is not just syntactic sugar for a pointer. – Rudy Velthuis Aug 23 '11 at 12:46

Perhaps the Standards committee left out these characters because there were plenty of characters to choose from and they simply found these odd. We may never know the rationale unless someone from the Standards committee actually answers this.

At least $ is supported as a valid identifier character in both MSVC and GCC through extensions.

The following code compiles in both:

struct $Y1$            // '$' accepted in identifiers as a compiler extension
{
   void $Test$() {}
};

int main()
{
   $Y1$ $x$;           // declare and use a type and variable named with '$'
   $x$.$Test$();
   return 0;
}
Alok Save