A minigolf course is rendered through tiles; each tile has three or more vertices that define its shape. We are given a ball and its starting position. Over time, the ball is acted on by physics and should eventually reach the hole of the course.
If I have the following tile setup:
Tile 1: Vertices (-2.5, 0, 2.5) (-0.5, 0, 2.5) (-0.5, 0, 1.5) (-1.5, 0, 0.5) (-2.5, 0, 0.5)
Tile 2: Vertices (-0.5, 0, 2.5) ( 0.5, 0, 2.5) ( 0.5, 0, 1.5) (-0.5, 0, 1.5)
Ball: Vertex (-2.25, 0, 2)
My question is: how do I check whether the ball's position is within the boundaries of a tile?
This matters because some tiles in the course are sloped, so knowing which tile the ball is on is required for the appropriate physics calculations.
Note that the tile numbers are stored with each tile. I have a tile struct that holds all of its vertices and edges, and the ball's position is also given. I don't have code to demonstrate because I don't know how to get started with this.
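For reference, the data looks roughly like this (a simplified sketch; the names are illustrative, and my real tile struct also stores edges and other information):

```cpp
#include <vector>

// Simplified sketch of the data I have.
struct Vec3 {
    float x, y, z;
};

struct Tile {
    int id;                      // tile number
    std::vector<Vec3> vertices;  // 3 or more vertices, listed in order
};

// Example data matching the layout described above.
Tile tile1 { 1, { {-2.5f, 0.0f, 2.5f}, {-0.5f, 0.0f, 2.5f}, {-0.5f, 0.0f, 1.5f},
                  {-1.5f, 0.0f, 0.5f}, {-2.5f, 0.0f, 0.5f} } };
Tile tile2 { 2, { {-0.5f, 0.0f, 2.5f}, { 0.5f, 0.0f, 2.5f}, { 0.5f, 0.0f, 1.5f},
                  {-0.5f, 0.0f, 1.5f} } };
Vec3 ballPosition { -2.25f, 0.0f, 2.0f };
```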
EDIT: I have been following the answer here: How can I determine whether a 2D Point is within a Polygon?
It works fine as long as I only check the x and z coordinates. However, I forgot that certain golf holes in my project have overlapping tiles or bridges above one another: polygons that cover the same x and z coordinates but have different y values. This causes glitches in my ball's movement. How can I extend a ray-casting method like the one in the linked answer to also check a tile's height, given its vertices?
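This is roughly the test I adapted from the linked answer, run in the x/z plane (a sketch that assumes the Tile and Vec3 types from the snippet above; my actual code may differ slightly):

```cpp
// Ray-casting point-in-polygon test in the x/z plane, adapted from the
// linked answer. Returns true if (px, pz) lies inside the polygon formed
// by the tile's vertices; the y component is ignored entirely.
bool containsXZ(const Tile& tile, float px, float pz) {
    bool inside = false;
    const auto& v = tile.vertices;
    for (size_t i = 0, j = v.size() - 1; i < v.size(); j = i++) {
        // Does a horizontal ray from (px, pz) cross the edge (j, i)?
        if (((v[i].z > pz) != (v[j].z > pz)) &&
            (px < (v[j].x - v[i].x) * (pz - v[i].z) / (v[j].z - v[i].z) + v[i].x)) {
            inside = !inside;
        }
    }
    return inside;
}
```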
I was thinking about taking the y component of the ball's current position and comparing it to the average y of the vertices that form a tile; if the ball was under that average, it belonged to that tile instead of the other one. This doesn't work as intended. Please advise.
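For clarity, the approach I tried looks roughly like this (a sketch only; averageY and pickTile are illustrative names, and it reuses containsXZ from the snippet above):

```cpp
// Average the y values of a tile's vertices.
float averageY(const Tile& tile) {
    float sum = 0.0f;
    for (const auto& v : tile.vertices) sum += v.y;
    return sum / static_cast<float>(tile.vertices.size());
}

// Among tiles whose x/z footprint contains the ball, pick the one whose
// average vertex y the ball is under. This is the check that does NOT
// work as intended.
const Tile* pickTile(const std::vector<Tile>& tiles, const Vec3& ball) {
    for (const Tile& tile : tiles) {
        if (containsXZ(tile, ball.x, ball.z) && ball.y <= averageY(tile)) {
            return &tile;
        }
    }
    return nullptr;
}
```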