
The ECMAScript standard indicates that JS variable names can contain a wide range of Unicode characters (Unicode letters, plus `$` and `_`, with digits and combining marks allowed after the first character). I have tested this out in this fiddle with the variable __$. It appears to pose no problems in any of my desktop browsers (down to IE 8, the oldest I tested), in Safari and Chrome on iOS 6, or on my own ancient Android smartphone. However, I want to look before I leap - is there any probability that a recent handheld device (I do not care about supporting the ark) might burp when it sees a variable such as

var __$,ǰ 

etc?
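If it helps anyone facing the same worry: rather than enumerating devices, support for a given identifier can be probed at runtime. Below is a minimal sketch (the helper name `supportsIdentifier` is my own invention, not from any library); it hands the candidate name to the `Function` constructor and treats a `SyntaxError` as "not supported", so a non-conforming engine degrades gracefully instead of breaking the whole script at parse time.

```javascript
// Probe whether this engine accepts `name` as a variable identifier.
// A non-conforming parser throws a SyntaxError from the Function
// constructor, which we catch and report as `false`.
function supportsIdentifier(name) {
  try {
    new Function("var " + name + ";");
    return true;
  } catch (e) {
    return false;
  }
}

console.log(supportsIdentifier("__$"));  // true on conforming engines
console.log(supportsIdentifier("ǰ"));    // true where Unicode letters are accepted
console.log(supportsIdentifier("1abc")); // false: identifiers cannot start with a digit
```

Note that this only checks the engine actually running the test, so it cannot answer the question for devices one has no access to - but shipped as a guard, it would at least fail loudly and early on a buggy implementation.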

DroidOS
  • .. see also [Why aren't ◎ܫ◎ and ☺ valid JavaScript variable names?](http://stackoverflow.com/questions/7451524/why-arent-and-valid-javascript-variable-names). – Jongware Jun 02 '14 at 12:29
  • You are asking if any browser that runs on any recent handheld device may have a buggy JS implementation that doesn't follow the spec. While there are infinitely many possible answers that prove your fear was founded, logically there cannot be any answer that puts them to rest. – Jon Jun 02 '14 at 12:30
  • Not quite sure this is a duplicate - I am aware of what ECMAScript allows. My question is - to what extent do modern browsers (particularly on mobile devices) implement what ECMA says. – DroidOS Jun 02 '14 at 12:32
  • "is there any probability" - of course there is, but it would be vanishingly small, particularly for recent implementations. Developing a test is trivial, why not do it? – RobG Jun 02 '14 at 13:03
  • ummm... I have developed the test and tested. I do not have access to all kinds of devices (e.g. BlackBerrys), which is why I asked the question. – DroidOS Jun 03 '14 at 03:36

0 Answers