
I want to use eval() to evaluate simple equations and logical expressions, e.g. 12*(4+3).

How safe is client-side eval when the (possibly untrusted) input gets sanitized and only allows digits, +-*/()<>|&!, and the words 'true' and 'false'?

Available JS parsers for equations are too big and feature-heavy for my needs. I threw one together myself, but it's a lot of code compared to eval'ing, and it's not yet perfect.

EDIT: So yeah, I guess what I'm specifically asking is: can somebody execute malicious code with nothing but digits and +-*/()<>|&! ? (I guess 'true' and 'false' are harmless.)
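
For example, these are the kinds of expressions I mean (the inputs below are just illustrations):

```js
eval('12*(4+3)');      // 84
eval('(3>2)&&true');   // true
eval('!(5<4)|false');  // 1 (a single | is bitwise OR, so the booleans get coerced)
```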

user3195878
  • eval is not good by default, but if you don't have other options, a regex will help you sanitize the input. – Eugene P. Feb 01 '14 at 18:23
  • What do you need the words "true" and "false" for? How are you sanitizing it - could there be a bug in that? Show us your code :-) – Bergi Feb 01 '14 at 18:40
  • and [Is using javascript eval() safe for simple calculations in inputs?](http://stackoverflow.com/q/14020780/218196) – Felix Kling Feb 01 '14 at 18:41
  • @FelixKling I don't think either of those answers my specific question. The first is a general pondering about eval(), the second is answered with "It's safe because the user enters the input". – user3195878 Feb 01 '14 at 18:50

1 Answer

I think it's completely safe; I don't think that eval is evil. Just use it judiciously, and double-check your sanitization function.

Since you're not allowing Unicode letters, _, or $ to pass sanitization, and a JavaScript identifier must contain at least one of those characters, it won't be possible to pollute the global scope, nor to call functions.

From the MDN page on identifiers:

> Starting with JavaScript 1.5, you can use ISO 8859-1 or Unicode letters such as å and ü in identifiers. You can also use the \uXXXX Unicode escape sequences as characters in identifiers.

Remember to catch exceptions thrown by eval calls, because it's always possible to enter a malformed expression, e.g. 4><5.
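
For instance, a small wrapper along these lines (the name tryEval is just illustrative) keeps a bad expression from throwing:

```js
function tryEval(expr) {
  try {
    return eval(expr);
  } catch (e) {
    // eval throws a SyntaxError on malformed input such as "4><5"
    return undefined;
  }
}

tryEval('12*(4+3)'); // 84
tryEval('4><5');     // undefined
```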

Also, be sure to check for the characters you allow, not for the ones you deny, so that any character you didn't think about is denied by default.
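
One way to do that is a whitelist test like the sketch below (the helper name and the exact character class are only illustrative; adjust them to the characters you actually accept):

```js
function isAllowed(expr) {
  // Strip the two allowed words first, then require that only
  // whitelisted characters remain.
  var rest = expr.replace(/true|false/g, '');
  return /^[0-9+\-*\/()<>|&!\s]*$/.test(rest);
}

isAllowed('12*(4+3)');        // true
isAllowed('(3>2)&&true');     // true
isAllowed('alert(document)'); // false, letters are rejected
```

Only when isAllowed(expr) returns true would you hand expr to eval (inside the try/catch above).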

Andrea Parodi
  • A JavaScript identifier doesn't have to contain a letter; it can be a bare _ symbol, for instance `var _ = 2;`. In this case the `_` character isn't allowed, but it's still a valid identifier. – Jack L. Feb 01 '14 at 18:31
  • You're right, I've edited my answer. – Andrea Parodi Feb 01 '14 at 18:35
  • That's a good thing... – Andrea Parodi Feb 01 '14 at 18:55
  • To add to @JackL., almost every character, except for a few, is acceptable in a variable name. See http://mathiasbynens.be/notes/javascript-identifiers. `var π = Math.PI;` works just fine. – Joeytje50 Feb 01 '14 at 19:37
  • This is why I suggest whitelisting the allowed characters... anyway, I've edited my answer to specify that any Unicode letter can be used as an identifier. – Andrea Parodi Feb 01 '14 at 19:41