
I have a game where the user has a currency they can buy things with, and the balance starts out very small (10^-8). Each time they buy something, the cost is deducted from the total and the cost is then doubled. It's a pretty simple implementation and works as expected until we pass 1. Here's what the buying logic looks like:

    if (totalMoney >= itemCost) {
        items++;
        totalMoney = totalMoney - itemCost;
        itemCost = itemCost * 2;
    }
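
For reference, here is the same logic in a self-contained form, with made-up starting values in the spirit of the description above (the `buy` wrapper and the exact numbers are just for illustration):

    // Hypothetical starting state – the real game keeps these elsewhere
    let totalMoney = 1e-6;  // starting balance (made up for this example)
    let itemCost = 1e-8;    // first item costs 10^-8, as described above
    let items = 0;

    function buy() {
        if (totalMoney >= itemCost) {
            items++;
            totalMoney = totalMoney - itemCost;
            itemCost = itemCost * 2; // cost doubles after every purchase
        }
    }

    buy();
    console.log(items, totalMoney, itemCost); // 1, ~9.9e-7, 2e-8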

Pretty simple, right? That's what I thought too, but once `itemCost` gets past 1 it stops working, even though the corresponding labels show that `totalMoney` is larger than `itemCost`. Is there some behavior of JavaScript Numbers that I'm overlooking? I'm just not sure why it works with numbers between 0 and 1 but not anything above. I use `.toFixed(11)` on my numbers so that they all have the same number of decimal places. Right now my `totalMoney` is something like 22.23849234892, and the item I'm trying to buy costs something like 3.23438292828 or 4.23489234892, yet the check says `totalMoney` is not larger than the respective `itemCost`.
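
In case it's relevant, here's a quick console experiment with made-up values in the same range as mine. `.toFixed()` returns a string, not a Number, so if `totalMoney` or `itemCost` ever got reassigned from its result, the `>=` would be a string comparison, and that matches the symptom exactly (this is just my guess at a reproduction, not my actual code):

    // Hypothetical values in the same range as the ones above
    const total = (22.23849234892).toFixed(11); // "22.23849234892" – a string!
    const cost = (3.23438292828).toFixed(11);   // "3.23438292828"  – also a string

    console.log(total >= cost);                 // false – strings compare character by character, and "2" < "3"
    console.log(Number(total) >= Number(cost)); // true  – numeric comparison works as expected

That would also explain why it only misbehaves past 1: with `.toFixed(11)`, every value below 1 becomes an equal-length string starting with "0.", so string comparison happens to agree with numeric comparison there.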

Jacob F
  • If I understand you correctly, this sounds like the floating point problem; maybe [this](https://stackoverflow.com/q/1458633/5923666) will help you? – Raz Luvaton Sep 28 '21 at 22:51
  • `toFixed` is used only for visualization purposes. Can you show us an example of the `totalMoney` and `itemCost` values? Better if taken from the console. – Christian Vincenzo Traina Sep 28 '21 at 22:51

0 Answers