_screen.brightness = _screen.brightness - 0.1;

This line of code gives me an unexpected result.
When I call NSLog(@"%.2f", _screen.brightness - 0.1);, it prints -0.00.
And when I test if (_screen.brightness == 0), it gives me NO.
Why does this happen? Is there some conversion problem?

Here are the accessor methods in the class of _screen's object:

- (CGFloat)brightness {
    return 1 - _dimmingView.alpha;
}

- (void)setBrightness:(CGFloat)brightness {
    if (brightness < self.minValue || brightness > self.maxValue) {
        return;
    }
    _dimmingView.alpha = 1 - brightness;
}
Infinite Possibilities
    How is it possible that you've graduated to programming an iPhone, yet never learned about the imprecision of floating point? Methinks you perhaps skipped a few chapters in the textbook. – Hot Licks Sep 19 '11 at 16:23

1 Answer


Floating point arithmetic doesn't necessarily give you the precise answers you're looking for. Better men than I have explained it here: C# float bug? 0.1 - 0.1 = 1.490116E-08. That question is about a different language, but the point is the same.

jrturton