
Say I have a string whose value is already a number, e.g. `var str = "1234"`. Now I want to convert it to a number.

I have seen two tricks on the internet so far:

  1. Use the unary `+`: `var num = +str`
  2. Use the multiplication operator `*`: `var num = str * 1`

I want to know which one is better in general.

As I saw in a comment on the accepted answer here: Converting Json Results to a Date, it seems that `*1` is best avoided. Is this true, and what is the reason behind it?

shole
  • For integer, parseInt() would be better – Eric So Apr 28 '16 at 03:22
  • The unary plus is by definition intended to convert a value to a number, so it's more semantically correct than multiplying by 1, which introduces another operand. @EricSo - parseInt() is better only if you specifically want to ignore any non-numeric data or decimal place value in the string; otherwise it is *worse* than the unary plus *because* it ignores that stuff. – nnnnnn Apr 28 '16 at 03:23
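
To illustrate the difference nnnnnn describes, a quick snippet (my own, not from the thread):

    parseInt("12px", 10); // 12  - trailing non-numeric characters are ignored
    +"12px";              // NaN - the whole string must be numeric
    parseInt("1.5", 10);  // 1   - the decimal part is dropped
    +"1.5";               // 1.5 - the full numeric value is preserved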

1 Answer


Fewer operations, basically.

The unary plus invokes the spec's internal ToNumber operation once, whereas the multiplication operator invokes ToNumber on both operands and then performs a math operation on the results.

Why do the extra steps?
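
Roughly, and as a sketch only (the comments paraphrase the spec steps):

    var str = "1234";

    // Unary plus: a single ToNumber conversion.
    var a = +str;     // ToNumber("1234") -> 1234

    // Multiplication: ToNumber on each operand, then a multiply.
    var b = str * 1;  // ToNumber("1234") * ToNumber(1) -> 1234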

http://www.ecma-international.org/ecma-262/6.0/#sec-unary-plus-operator

http://www.ecma-international.org/ecma-262/6.0/#sec-applying-the-mul-operator

Jeremy J Starcher
  • Thanks. Clean & clear answer, and it lets me know that `+` is not hacky at all; it is indeed the official intention to do the job – shole Apr 28 '16 at 03:32
  • I wouldn't say it is the 'official intention', as there are strongly typed languages that implement the unary `+`. However, when using a specific JavaScript subset called asm.js, the `+` is officially used to type-annotate a floating point number and ` | 0` is used to type-annotate 32-bit integers. – Jeremy J Starcher Apr 28 '16 at 03:35
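
A rough sketch of that asm.js convention (the module name and functions are my own illustration, not from the thread):

    function AsmSketch(stdlib) {
      "use asm";
      // In asm.js, unary + marks a value as a double,
      // and | 0 marks it as a 32-bit integer.
      function asDouble(x) {
        x = +x;        // parameter declared as double
        return +x;     // return value declared as double
      }
      function asInt(x) {
        x = x | 0;     // parameter declared as int
        return x | 0;  // return value declared as signed int
      }
      return { asDouble: asDouble, asInt: asInt };
    }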