
This:

var foo = {
  🐶: true // Truly adorable
};

Gives me an Illegal Character error on Firefox and Chrome. However,

var foo = {
  '🐶': true
};

Works perfectly. Why?
(You can also answer for a wider set of Unicode characters, but I really want to know more about 🐶)
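
For reference, a small sketch of how the working string-key form is used, assuming the key really is the dog emoji 🐶 (U+1F436) as in the snippet above:

var foo = {
  '🐶': true // any string works as a quoted property key
};

console.log(foo['🐶']); // true: read it back with bracket notation
// foo.🐶 would throw the same Illegal Character error, because dot access
// requires a syntactically valid identifier after the dot.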

Kyll
  • Here's a great explanation of what's allowed and what's not: https://mathiasbynens.be/notes/javascript-identifiers – CodingIntrigue Jun 23 '15 at 14:02
  • Because it's not in a Unicode range of allowed identifier characters. The latter example is just a string containing the character. See [What characters are valid for JavaScript variable names?](http://stackoverflow.com/questions/1661197/what-characters-are-valid-for-javascript-variable-names) – Alex K. Jun 23 '15 at 14:03

1 Answer


As the ECMAScript standard defines it, a valid identifier must start with a Unicode code point that has the Unicode property ID_Start (or with `$`, `_`, or a Unicode escape sequence).

This is not the case for the poor dog. :(

You may use any of these code points as the first character of your identifier:

http://unicode.org/cldr/utility/list-unicodeset.jsp?a=[:ID_Start=Yes:]
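
You can also check this from JavaScript itself. A minimal sketch, assuming an engine with ES2018 Unicode property escapes (which are newer than this answer):

// \p{ID_Start} tests whether a single code point has the Unicode ID_Start property
const isIdStart = ch => /^\p{ID_Start}$/u.test(ch);

console.log(isIdStart('🐶')); // false: emoji cannot start an identifier
console.log(isIdStart('狗'));  // true: CJK ideographs have ID_Start
console.log(isIdStart('_'));   // false: _ and $ are allowed by a separate ECMAScript rule, not via ID_Start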

Remy Lebeau
Hauke P.
  • Okay I found the Hiragana (`ちえん`) but it's still 3 characters so meh. Update: One character! In traditional Chinese: `狗` (U+72D7), see the sketch after these comments. – Kyll Jun 23 '15 at 14:38
  • Try an [Egyptian hieroglyph](http://www.unicode.org/charts/PDF/U13000.pdf). Not sure if your IDE (or even your browser) will render them though. – Hauke P. Jun 23 '15 at 14:42
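
Following up on the comments, a quick sketch showing that 狗 (U+72D7) is accepted as an identifier while the emoji is not:

var 狗 = true;   // fine: 狗 has the ID_Start property
console.log(狗); // true
// var 🐶 = true; // SyntaxError: illegal character (🐶 lacks ID_Start)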