
The declaration below works:

var \u0061 = 2; // a = 2;

But the declaration below gives an error:

var \u00A5 = 2; // supposed to be ¥ = 2;

Code point 0xA5 is in the BMP (Basic Multilingual Plane), so why this error?
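
For comparison, the same escape sequence is accepted inside a string literal, which suggests the escape syntax itself is fine:

console.log("\u00A5");         // prints ¥
console.log("\u00A5" === "¥"); // true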

overexchange
  • See http://stackoverflow.com/questions/1661197/what-characters-are-valid-for-javascript-variable-names – rrowland Aug 07 '15 at 05:38

2 Answers


This has nothing to do with your escape sequence, which is fine. It's just that ¥ is not a valid identifier, in contrast to a. An identifier needs to start with $, _, "any Unicode code point with the Unicode property “ID_Start”", or an escape sequence for one of those. ¥, being a currency symbol, is not such a character.
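
In engines that support Unicode property escapes (an ES2018 addition, so newer than this answer), you can sketch a direct check against the ID_Start property:

// Rough check: may this single character begin an identifier?
// $ and _ are allowed explicitly; everything else must have ID_Start.
var isIdStart = function (ch) {
  return /^[$_\p{ID_Start}]$/u.test(ch);
};

console.log(isIdStart("a")); // true  - Latin letter
console.log(isIdStart("¥")); // false - currency symbol
console.log(isIdStart("ヴ")); // true  - Katakana letter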

Bergi
  • `var ヴァリアブル = "変量"` has a valid identifier, but what is wrong with the identifier in `var ¥ = 2;`? – overexchange Aug 07 '15 at 15:12
  • @overexchange: As I said, currency symbols are not a valid identifier start. [The Katakana ヴ](http://www.fileformat.info/info/unicode/char/30f4/index.htm), in contrast, is one. – Bergi Aug 07 '15 at 15:23

What you're doing is equivalent to:

var a = 2;
var ¥ = 2;

¥ is not a valid character for a variable in JavaScript. See What characters are valid for JavaScript variable names?
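
If you really need ¥ as a name, one workaround (a sketch, not part of the original answer) is to use it as a property key, since property names may be arbitrary strings even when they are not valid identifiers:

var rates = {};
rates["\u00A5"] = 2;     // same key as rates["¥"] = 2
console.log(rates["¥"]); // 2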

rrowland