
I have a SwiftUI view displaying a UIImage. How can I determine the frame of the displayed view?

I want to determine the color at the point the user taps on the image. I know the size of the raw image, but I can't work out the actual size of the frame as displayed.

For example, with a 3024 x 4032 image and this code:

struct PhotoImage: View {
    var image: UIImage
    @State private var gest: DragGesture = DragGesture(minimumDistance: 0, coordinateSpace: .local)
    var body: some View {
        GeometryReader {
            geometry in
                Image(uiImage: self.image )
                    .resizable()
                    .frame(width: 500, height: 400, alignment: .center)
                    .aspectRatio(contentMode: .fit)
                    .gesture(self.gest
                    .onEnded({ (endGesture) in
                        let frame = geometry.frame(in: CoordinateSpace.local)
                        let size = geometry.size
                        print("Frame: \(frame)")
                        print("Geometry size: \(size)")
                        print("Image size: \(self.image.size)")
                        print("Location: \(endGesture.location)")
                       }))
        }
    }
}  

the debug output shows the frame and geometry size as 1194 x 745, while the gesture location shows the image view to have dimensions of 500 x 400.

If I don't set the frame size and use aspectFill instead, the geometry size is correct. However, that is no good for my needs, as the top and bottom of the image are clipped.
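For context, with .fit the displayed size follows from the two sizes involved: the image is scaled by the smaller of the two width/height ratios and centred, with the remainder letterboxed. A minimal sketch of that arithmetic (the `fittedSize` helper is illustrative, not from the post):

```swift
import CoreGraphics

// Size at which an aspect-fit image is drawn inside a fixed frame.
func fittedSize(image: CGSize, frame: CGSize) -> CGSize {
    let scale = min(frame.width / image.width, frame.height / image.height)
    return CGSize(width: image.width * scale, height: image.height * scale)
}

// For a 3024 x 4032 image in a 500 x 400 frame, the fit scale is
// min(500/3024, 400/4032) ≈ 0.0992, so the image is drawn at roughly
// 300 x 400 points, letterboxed by about 100 points on each side.
let drawn = fittedSize(image: CGSize(width: 3024, height: 4032),
                       frame: CGSize(width: 500, height: 400))
```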

guinnessman
  • Possible duplicate of [Get Parent Size SwiftUI](https://stackoverflow.com/questions/56954325/get-parent-size-swiftui) – Alladinian Aug 02 '19 at 09:54
  • Thanks but I had looked at that already and it didn't seem to give the right results. I have added more detail to the question now. – guinnessman Aug 02 '19 at 12:29

1 Answer


Your GeometryReader is not reading the size of your image, but the whole space that is available to it (and which it therefore claims). To make it report the expected 500 x 400 frame and geometry size, attach it to the image's background or overlay layer instead.

Here is a modified version:

struct PhotoImage: View {
    var image: UIImage
    @State private var gest: DragGesture = DragGesture(minimumDistance: 0, coordinateSpace: .local)
    var body: some View {
        Image(uiImage: self.image )
            .resizable()
            .frame(width: 500, height: 400, alignment: .center)
            .aspectRatio(contentMode: .fit)
            .overlay(
                GeometryReader { geometry in
                    Color.clear
                        .contentShape(Rectangle())
                        .gesture(
                            self.gest
                                .onEnded({ (endGesture) in
                                    let frame = geometry.frame(in: CoordinateSpace.local)
                                    let size = geometry.size
                                    print("Frame: \(frame)")
                                    print("Geometry size: \(size)")
                                    print("Image size: \(self.image.size)")
                                    print("Location: \(endGesture.location)")
                                })
                        )
                }
            )
    }
}
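With the geometry size correct, the tap location still has to be mapped back to pixel coordinates in the raw image before a color can be read. A rough sketch, assuming the image is laid out aspect-fit inside the measured view size; the `pixelColor` helper and its redraw-into-a-1x1-context approach are one possible implementation, not part of the original answer:

```swift
import UIKit

// Map a tap point in the displayed view back to a pixel in the raw
// image, then read that pixel's color. Returns nil if the tap falls
// in the letterboxed area outside the drawn image.
func pixelColor(in image: UIImage, atViewPoint point: CGPoint, viewSize: CGSize) -> UIColor? {
    guard let cgImage = image.cgImage else { return nil }
    let imageSize = CGSize(width: cgImage.width, height: cgImage.height)

    // Reverse the aspect-fit transform: uniform scale plus centring offset.
    let scale = min(viewSize.width / imageSize.width, viewSize.height / imageSize.height)
    let drawn = CGSize(width: imageSize.width * scale, height: imageSize.height * scale)
    let origin = CGPoint(x: (viewSize.width - drawn.width) / 2,
                         y: (viewSize.height - drawn.height) / 2)
    let px = Int((point.x - origin.x) / scale)
    let py = Int((point.y - origin.y) / scale)
    guard px >= 0, py >= 0, px < cgImage.width, py < cgImage.height else { return nil }

    // Read a single RGBA pixel by drawing it into a 1x1 bitmap context.
    var rgba = [UInt8](repeating: 0, count: 4)
    guard let context = CGContext(data: &rgba, width: 1, height: 1,
                                  bitsPerComponent: 8, bytesPerRow: 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let pixel = cgImage.cropping(to: CGRect(x: px, y: py, width: 1, height: 1))
    else { return nil }
    context.draw(pixel, in: CGRect(x: 0, y: 0, width: 1, height: 1))
    return UIColor(red: CGFloat(rgba[0]) / 255, green: CGFloat(rgba[1]) / 255,
                   blue: CGFloat(rgba[2]) / 255, alpha: CGFloat(rgba[3]) / 255)
}
```

This would be called from the `onEnded` closure with `endGesture.location` and `geometry.size`. Note it assumes aspect-fit; if the layout stretches or fills instead, the scale and offset computation would need to change accordingly.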
pd95