
How do I get the float result of the division?

Although I defined the average array as float, the results come out without a fractional part.

#include <iostream>
using namespace std;

int main()
{
    const int Number = 20;
    int Fibonacci[Number];
    float average[Number];
    for ( int i = 0; i <= Number; i++ )
    {
        if ( i == 0 )
            { Fibonacci[i] = 0; }
        else if ( i == 1 )
            { Fibonacci[i] = 1; }
        else
            {
                Fibonacci[i] = Fibonacci[i - 1] + Fibonacci[i - 2];
                //average[i] = (Fibonacci[i - 1] + Fibonacci[i - 2]) / 2;
            }
    }

    cout << "The first 20 Fibonacci series numbers are: \n";
    for ( int i = 1; i <= Number; i++ )
    {
        cout << Fibonacci[i] << endl;
    }

    cout << "The average adjacent array numbers are: \n";
    for ( int i = 3; i <= Number; i++ )
    {
        average[i] = (Fibonacci[i] / 2);
        //cout.precision(0);
        cout << average[i] << endl;     // <----- here the problem!!
    }

    return 0;
}

I appreciate any help. Thanks in advance.

Nadim
  • You are doing integer division. `Fibonacci` contains `int`, and `1` is also an int. You want to convert one of the values to a float first. For example: `average[i] = (Fibonacci[i]/2.f)` – ChrisMM Jun 10 '20 at 13:11
  • Does this answer your question? [Why does dividing two int not yield the right value when assigned to double?](https://stackoverflow.com/questions/7571326/why-does-dividing-two-int-not-yield-the-right-value-when-assigned-to-double) – ChrisMM Jun 10 '20 at 13:12
  • `for ( int i =0; i <= Number; i++ )` should be `for ( int i =0; i < Number; i++ )`. As written, the final pass through the loop goes off the end of the array. – Pete Becker Jun 10 '20 at 14:07

3 Answers


When you do the division, you are doing an integer division, so you won't get floating point results. A simple fix would be the following:

average[i] = Fibonacci[i] / 2.0f;

Note that if one of the operands to / is a float, then you get floating point division.

Also, note that your loops index too far into the array. You need to stop before Number, like this:

for ( int i = 0; i < Number; i++)

This is because valid array indexes go from 0 .. (Number - 1).

cigien

If Fibonacci[i] is of type int, then (Fibonacci[i]/2) is an integral division, resulting in an integral value (without any fractional part). Assigning this integral result to a float does not change the fact that you have performed an integral division.

You can enforce a floating point division by making one of the operands a floating point value.

So (1) use either a cast...

((float)Fibonacci[i])/2

or (2) divide by 2.0f (which is a float value):

Fibonacci[i]/2.0f
Stephan Lechner

Just typecast the Fibonacci element, like this:

average[i] = (float)Fibonacci[i]/2;

To get a float result, at least one of the two operands of / has to be a float. The cast binds more tightly than /, so Fibonacci[i] is converted to float before the division, and the division is performed in floating point.

Raju Ahmed