
According to this, JavaScript allows Unicode characters in identifiers. However, this is how Node handles it:

  > var ∆ = 6;
  > ...

I've also run it through the identifier validator, and it agrees that "∆" isn't a valid identifier.

I suppose my question is "What's so special about ∆?"

– ThatDarnPat, tshepang

2 Answers

That isn't U+0394 GREEK CAPITAL LETTER DELTA; it's U+2206 INCREMENT, which is a Math Symbol, not a Letter.
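A quick way to see the difference (a minimal sketch for Node or a browser console; new Function is used here only so the rejected character doesn't stop the whole script from parsing):

  // U+0394 (Δ) is in category Lu, so the parser accepts it as an identifier.
  console.log(new Function('var Δ = 1; return Δ;')());  // 1

  // U+2206 (∆) is in category Sm (Math Symbol), so the parser rejects it.
  try {
    new Function('var ∆ = 1;');
  } catch (e) {
    console.log(e instanceof SyntaxError);  // true
  }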

– SLaks

JavaScript identifiers can contain any "Unicode Letter", which means

any character in the Unicode categories “Uppercase letter (Lu)”, “Lowercase letter (Ll)”, “Titlecase letter (Lt)”, “Modifier letter (Lm)”, “Other letter (Lo)”, or “Letter number (Nl)”.
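(For what it's worth, here is a sketch of checking those categories directly, using Unicode property escapes in a regular expression; that's an ES2018 feature, so much newer than this answer.)

  // \p{L} covers Lu, Ll, Lt, Lm and Lo; \p{Nl} is "Letter number".
  var letterish = /^[\p{L}\p{Nl}]$/u;
  console.log(letterish.test('\u0394'));  // true:  Δ, GREEK CAPITAL LETTER DELTA (Lu)
  console.log(letterish.test('\u2206'));  // false: ∆, INCREMENT (Sm)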

Now, what you can do is this:

var \u0394 = 0;

and U+0394 is the code point for Δ (the Greek capital letter delta, not the ∆ from the question). Clearly, that's not quite as satisfying, but it is syntactically OK.

edit — as usual SLaks is correct; you can in fact do this:

var Δ = 0;

when you've got the right version of Δ. (In my current font the math version is prettier.)
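And just to tie the two snippets together: the escaped and literal spellings name the same identifier, so either form reads the same variable (a small sketch):

  var \u0394 = 0;       // declares Δ via the escape
  Δ += 1;               // the literal U+0394 is the same binding
  console.log(\u0394);  // 1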

– Pointy
  • If `var \u0394;` works that means the unescaped version of that identifier works too. Read [the post you’re quoting from](http://mathiasbynens.be/notes/javascript-identifiers) for more info, or [see this Stack Overflow answer for the TL;DR](http://stackoverflow.com/a/9337047/96656). – Mathias Bynens Jun 30 '14 at 07:37
  • @MathiasBynens yes that became clear once I saw what SLaks had figured out :) – Pointy Jun 30 '14 at 13:08