I'm developing a kind of button (really they are images) that, when disabled (not via enabled = false), has an opacity value of 0.1. So, with a simple check of the opacity in the tap event I can tell whether it's enabled or disabled.
But when I set the disabled opacity value (0.1), debug, and inspect the execution, I see strange behaviour: the opacity values are not exactly 0.1, but slightly bigger. So my check if ( btnLocateMeControl.Opacity == 0.1 ) always returns false because the value isn't exact.
Here is a screenshot taken while debugging:
Any idea why this is happening? Is there any elegant solution other than checking whether the opacity lies between 0.09 and 0.11?
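For context, this is roughly what I mean by a tolerance-based check. This is only a sketch: the ImageButton class, the IsCustomEnabled property, and the Epsilon constant are all made-up names for illustration, not part of any framework API.

```csharp
using System;

// Hypothetical stand-in for my image-based button; only the Opacity
// property mirrors the real control.
class ImageButton
{
    public double Opacity { get; set; } = 1.0;

    const double DisabledOpacity = 0.1;
    const double Epsilon = 0.001; // tolerance for the comparison

    // Compare with a tolerance instead of ==, because 0.1 has no exact
    // binary floating-point representation, so the stored value may be
    // slightly above or below 0.1.
    public bool IsCustomEnabled =>
        Math.Abs(Opacity - DisabledOpacity) > Epsilon;
}
```

This avoids the exact-equality problem, but it still feels like a workaround rather than an elegant solution, which is why I'm asking.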
Thanks!