
I am trying to get the colour of a point along a CAGradientLayer. The function below works on a view with a solid background colour, but when I use it on a view backed by a gradient, it returns an incorrect result.

extension UIView {
    /// Renders the view's layer into a one-pixel bitmap and returns the colour
    /// of the pixel at `point` (in the view's own coordinate space).
    func colorOfPointView(point: CGPoint) -> CGColor {
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)

        // A single RGBA pixel.
        var pixelData: [UInt8] = [0, 0, 0, 0]

        let context = CGContext(data: &pixelData,
                                width: 1,
                                height: 1,
                                bitsPerComponent: 8,
                                bytesPerRow: 4,
                                space: colorSpace,
                                bitmapInfo: bitmapInfo.rawValue)!

        // Shift the context so that `point` lands on the single pixel at (0, 0),
        // then render the layer into it.
        context.translateBy(x: -point.x, y: -point.y)
        self.layer.render(in: context)

        // Convert the 0–255 channel values back into 0–1 components.
        let red   = CGFloat(pixelData[0]) / 255.0
        let green = CGFloat(pixelData[1]) / 255.0
        let blue  = CGFloat(pixelData[2]) / 255.0
        let alpha = CGFloat(pixelData[3]) / 255.0

        return UIColor(red: red, green: green, blue: blue, alpha: alpha).cgColor
    }
}
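
For context, the setup I am testing against looks roughly like this (the frame, gradient colours and sample point here are illustrative, not my exact values):

import UIKit

// Illustrative setup: a view backed by a horizontal red-to-green CAGradientLayer.
let gradientView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

let gradientLayer = CAGradientLayer()
gradientLayer.frame = gradientView.bounds
gradientLayer.colors = [UIColor.red.cgColor, UIColor.green.cgColor]
gradientLayer.startPoint = CGPoint(x: 0, y: 0.5)   // left edge
gradientLayer.endPoint = CGPoint(x: 1, y: 0.5)     // right edge
gradientView.layer.addSublayer(gradientLayer)

// Sample the colour a quarter of the way across the gradient.
let sampledColor = gradientView.colorOfPointView(point: CGPoint(x: 25, y: 50))
print(sampledColor)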

When I use the code on a view whose background colour is UIColor(red: 1.00, green: 0.00, blue: 0.00, alpha: 1.00), it returns the correct RGB values, but when I use it on a view backed by a CAGradientLayer, it returns: Red: 1, Green: 0.14902, Blue: 0, Alpha: 1.
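
To sanity-check the value I am getting back, I compared it against what a plain linear blend of the two gradient stops should give at that fraction along the layer. This is only a rough sketch: interpolatedColor is a hypothetical helper of mine, and it assumes a two-stop gradient interpolated linearly in RGB, which may not match exactly what CAGradientLayer does:

import UIKit

// Hypothetical helper: naively interpolate each RGBA channel between two
// gradient stops at fraction `t` (0 = start colour, 1 = end colour).
// Assumes a two-stop gradient and plain linear interpolation in RGB.
func interpolatedColor(from start: UIColor, to end: UIColor, fraction t: CGFloat) -> UIColor {
    var r1: CGFloat = 0, g1: CGFloat = 0, b1: CGFloat = 0, a1: CGFloat = 0
    var r2: CGFloat = 0, g2: CGFloat = 0, b2: CGFloat = 0, a2: CGFloat = 0
    _ = start.getRed(&r1, green: &g1, blue: &b1, alpha: &a1)
    _ = end.getRed(&r2, green: &g2, blue: &b2, alpha: &a2)

    return UIColor(red: r1 + (r2 - r1) * t,
                   green: g1 + (g2 - g1) * t,
                   blue: b1 + (b2 - b1) * t,
                   alpha: a1 + (a2 - a1) * t)
}

// What I would naively expect a quarter of the way along a red-to-green gradient.
let expectedColor = interpolatedColor(from: .red, to: .green, fraction: 0.25)
print(expectedColor)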
