I have a SwiftUI view displaying a UIImage. How can I determine the frame of the displayed view?
I want to determine the color at the point on the image tapped by the user. I know the size of the raw image, but I can't work out how to determine the actual size of the frame in which it is displayed.
For example, with a 3024 x 4032 image and this code:
struct PhotoImage: View {
    var image: UIImage
    @State private var gest: DragGesture = DragGesture(minimumDistance: 0, coordinateSpace: .local)

    var body: some View {
        GeometryReader { geometry in
            Image(uiImage: self.image)
                .resizable()
                .frame(width: 500, height: 400, alignment: .center)
                .aspectRatio(contentMode: .fit)
                .gesture(self.gest
                    .onEnded { endGesture in
                        let frame = geometry.frame(in: CoordinateSpace.local)
                        let size = geometry.size
                        print("Frame: \(frame)")
                        print("Geometry size: \(size)")
                        print("Image size: \(self.image.size)")
                        print("Location: \(endGesture.location)")
                    })
        }
    }
}
the debug output shows the frame and geometry size as 1194 x 745, while the gesture location indicates the image view actually has dimensions of 500 x 400.
If I don't set the frame size and use aspectFill instead, the geometry size is correct. However, that is no good for my needs, as the top and bottom of the image are clipped.
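For reference, this is the mapping I am ultimately after, sketched as plain geometry with no SwiftUI involved. The helper names here are my own invention; AVFoundation's real `AVMakeRect(aspectRatio:insideRect:)` computes the same fitted rect, so this is just the math spelled out:

```swift
import Foundation

// Rect an image occupies when aspect-fit inside a container
// (centered, letterboxed on the short axis).
func aspectFitRect(imageSize: CGSize, in container: CGSize) -> CGRect {
    let scale = min(container.width / imageSize.width,
                    container.height / imageSize.height)
    let fitted = CGSize(width: imageSize.width * scale,
                        height: imageSize.height * scale)
    return CGRect(x: (container.width - fitted.width) / 2,
                  y: (container.height - fitted.height) / 2,
                  width: fitted.width,
                  height: fitted.height)
}

// Maps a tap in container coordinates to pixel coordinates in the raw
// image; returns nil when the tap lands in the letterbox margin.
func pixelLocation(of tap: CGPoint,
                   imageSize: CGSize,
                   container: CGSize) -> CGPoint? {
    let rect = aspectFitRect(imageSize: imageSize, in: container)
    guard rect.contains(tap) else { return nil }
    let scale = imageSize.width / rect.width
    return CGPoint(x: (tap.x - rect.minX) * scale,
                   y: (tap.y - rect.minY) * scale)
}

// With my numbers: a 3024 x 4032 image aspect-fit into 500 x 400
// occupies a 300 x 400 rect offset 100 pt from the left, and the
// container's center maps to roughly the image's pixel center.
let image = CGSize(width: 3024, height: 4032)
let container = CGSize(width: 500, height: 400)
let rect = aspectFitRect(imageSize: image, in: container)
let center = pixelLocation(of: CGPoint(x: 250, y: 200),
                           imageSize: image, container: container)
print(rect, center as Any)
```

The missing piece is getting the correct `container` size out of SwiftUI, which is exactly what the GeometryReader above fails to give me when the frame and aspectRatio modifiers are both applied.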