I'm trying to determine whether a point is located inside a polygon or not. I use the following algorithm (adapted for Swift) from this website:
func contains(polygon: [Point], test: Point) -> Bool {
    let count = polygon.count
    var contains = false
    // Ray casting: walk every edge (polygon[j] -> polygon[i]) and toggle
    // the flag each time a horizontal ray from the test point crosses it.
    var j = count - 1
    for i in 0..<count {
        if (polygon[i].y > test.y) != (polygon[j].y > test.y) &&
            test.x < (polygon[j].x - polygon[i].x) * (test.y - polygon[i].y) /
                     (polygon[j].y - polygon[i].y) + polygon[i].x {
            contains = !contains
        }
        j = i
    }
    return contains
}
However, for a simple polygon with the coordinates (x: 0, y: 40), (x: 0, y: 0), (x: 20, y: 0), (x: 20, y: 20), (x: 40, y: 20), (x: 40, y: 40), a check for the point (x: 30, y: 20) returns true: the if-statement evaluates to true when i and j are 5 and 4 ((x: 40, y: 40) and (x: 40, y: 20)), even though the point only lies on the border of the polygon. The function should only return true if the point is really located inside the polygon.
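For reference, here is a minimal snippet that reproduces the case (assuming a plain Point struct with Double coordinates, since that type isn't shown above):

struct Point {
    var x: Double
    var y: Double
}

let polygon = [
    Point(x: 0, y: 40), Point(x: 0, y: 0), Point(x: 20, y: 0),
    Point(x: 20, y: 20), Point(x: 40, y: 20), Point(x: 40, y: 40)
]

// (30, 20) lies on the edge from (20, 20) to (40, 20), yet the check reports true.
print(contains(polygon: polygon, test: Point(x: 30, y: 20)))  // prints "true"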
Thanks for any help or improvements/adjustments of the algorithm!