
I'm currently trying to use AngelScript with some simple code, following the official website's examples.

But when I try to initialise a double variable like below in my script:

double x=1/2;

the variable x appears to be initialised with the value 0.

It only works when I write double x=1/2.0; or double x=1.0/2;

Is there a way to make AngelScript work in double precision when I type double x=1/2, without adding any more code to the script?

Thank you,

mat

3 Answers


Using some macro chicanery:

#include <stdio.h>

#define DIV * 1.0 / 

int main(void)
{
    double x = 1 DIV 2;  /* expands to: double x = 1 * 1.0 / 2; */

    printf("%f\n", x);
    return 0;
}

DIV can also be defined as:

#define DIV / (double)
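
For completeness, a minimal sketch of that second definition in use, with hypothetical int variables a and b; the cast binds tighter than the division, so converting the right-hand operand is enough:

#include <stdio.h>

#define DIV / (double)

int main(void)
{
    int a = 1, b = 2;      /* hypothetical int operands */
    double x = a DIV b;    /* expands to: a / (double) b */

    printf("%f\n", x);     /* prints 0.500000 */
    return 0;
}
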
David Ranieri

That's because 1 and 2 are integers, and:

int x = 1/2;

would evaluate to 0. If x is actually a double, the integer result is then implicitly converted, as if you had written:

double x = (double)(1/2);

which means 1/2 = 0 becomes 0.0. Notice that this is not the same as:

double x = (double)1/2;

which will do what you want.

Literals with a decimal point, like 1.0, are doubles, and dividing an int by a double produces a double. You can also do this by casting each number:

double x = (double)1/(double)2;

which is handy if 1 and 2 are actually int variables: by casting this way, their values are converted to double before the division, so the result is a double.
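
A minimal sketch of that last case, using hypothetical int variables a and b:

#include <stdio.h>

int main(void)
{
    int a = 1, b = 2;                   /* hypothetical int operands */
    double x = (double)a / (double)b;   /* both values become double before the division */

    printf("%f\n", x);                  /* prints 0.500000 */
    return 0;
}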

CodeClown42

When you divide an int by an int, the result is an int. The quotient is the result, the remainder is discarded. Here, 1 / 2 yields a quotient of zero and a remainder of 1. If you need a double, try 1.0 / 2.
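
A minimal sketch illustrating the quotient/remainder behaviour, and the effect of one double operand:

#include <stdio.h>

int main(void)
{
    printf("%d\n", 1 / 2);    /* quotient:  0 */
    printf("%d\n", 1 % 2);    /* remainder: 1 */
    printf("%f\n", 1.0 / 2);  /* 0.500000: one double operand forces double division */
    return 0;
}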

No, there is no way to get a double by dividing two ints without converting at least one of the operands; casting the result after the division is too late, because the integer division has already discarded the fractional part.

verbose