I have an extension that normalizes a CGRect to a specified CGSize:
import CoreGraphics

extension CGRect {
    /// Normalizes the rectangle to another width and height, scaling its
    /// coordinates into the unit square.
    /// - Parameter size: The size you want to normalize to.
    /// - Returns: The normalized rect.
    public func normalize(toSize size: CGSize) -> CGRect {
        let scaleX = 1.0 / size.width
        let scaleY = 1.0 / size.height
        let scale = CGAffineTransform(scaleX: scaleX, y: scaleY)
        return self.applying(scale)
    }
}
Now I test it like this:
import XCTest

// Test against a 100 x 100 canvas first
let firstCanvas = CGSize(width: 100, height: 100)
let nonNormalRect = CGRect(x: 50, y: 50, width: 10, height: 10)
let normalizedRect = nonNormalRect.normalize(toSize: firstCanvas)
XCTAssert(normalizedRect.origin.x == 0.5) // Passes
XCTAssert(normalizedRect.origin.y == 0.5) // Passes
XCTAssert(normalizedRect.width == 0.1)    // Fails
XCTAssert(normalizedRect.height == 0.1)   // Fails
The resulting size fails the test because its width and height are long floating-point values close to, but not exactly, 0.1:
po normalizedRect.size
▿ (0.09999999999999998, 0.09999999999999998)
- width : 0.09999999999999998
- height : 0.09999999999999998
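Poking at it further, I can reproduce the exact value with plain Double arithmetic, no CoreGraphics involved. CGRect.applying(_:) is documented to return the smallest rectangle containing the four transformed corner points, so (if I understand it correctly) the width comes from the scaled corners rather than from scaling the width itself. A minimal sketch of that corner math:

// Reproducing the math with plain Doubles, assuming applying(_:)
// transforms the corners and rebuilds the rect from them
let s = 1.0 / 100.0   // nearest Double to 0.01, slightly above it

let minX = 50.0 * s   // 0.5 exactly
let maxX = 60.0 * s   // prints as 0.6, but is really 0.5999999999999999778...
print(maxX - minX)    // 0.09999999999999998 -- the exact value from po above

// Scaling the width directly would happen to round to exactly 0.1 here:
print(10.0 * s)       // 0.1
print(10.0 / 100.0)   // 0.1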
Why does applying the CGAffineTransform produce these "too precise" values in this case?