I have a rectangle like this...
... and I also know the rectangle's corner positions, as in the array below:
var corners = [[107, 0], [0, 21], [111, 24], [4, 45]]
Because I need to calculate the rectangle's rotation angle, as sketched in the image above, my idea was simply to use this code:
var rectangle_angle = ((180/Math.PI) * Math.atan((a[1] - b[1]) / (a[0] - b[0]))).toFixed(2)
console.log('angle : ' + rectangle_angle)
In the case above I'm using the first two corner points to calculate the angle:
var a = corners[0], b = corners[1]
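(With the corners array above, this works out to atan((0 − 21) / (107 − 0)) ≈ −11.10°.)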
But when I used, for example, a rectangle like the one below and tried to calculate its angle...
(corners: [[101, 0], [110, 22], [0, 38], [9, 60]])
... the result I got was angle : 67.75, which is definitely not the right rotation angle.
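That value comes from atan((0 − 22) / (101 − 110)) = atan(22/9) ≈ 67.75°: in this rectangle, corners[0] and corners[1] happen to span the short edge rather than the long one, so the formula measures the wrong side.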
Afterwards I fixed the problem by using corners[0] and corners[2] for the calculation instead of corners[0] and corners[1].
So the result I get now is -20.62°. Way better.
Now my question: how can I extract the right points from my corners array to use for this calculation?
You can try the code here:
var corners = [[107, 0], [0, 21], [111, 24], [4, 45]]
var a = corners[0], b = corners[1]
var rectangle_angle = ((180/Math.PI) * Math.atan((a[1] - b[1]) / (a[0] - b[0]))).toFixed(2)
console.log('angle : ' + rectangle_angle)
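One direction that seems to work for both examples is to classify the corners by their distance from corners[0]: in a rectangle, the diagonally opposite corner is always the farthest away, so sorting the other three corners by distance puts the short-edge neighbour first, the long-edge neighbour second, and the diagonal partner last. Below is a minimal sketch along these lines; the helper name longEdgeAngle is my own, and it assumes the four entries always form a proper rectangle whose long edge defines the rotation angle:

// A minimal sketch (the name longEdgeAngle is my own); it assumes the
// four entries in `corners` are the corners of a proper rectangle, in
// any order, and that the rotation angle is the angle of the long edge.
function longEdgeAngle(corners) {
  var a = corners[0]
  // Sort the other three corners by squared distance to `a`:
  // nearest -> short-edge neighbour, middle -> long-edge neighbour,
  // farthest -> diagonally opposite corner.
  var rest = corners.slice(1).sort(function (p, q) {
    var dp = Math.pow(p[0] - a[0], 2) + Math.pow(p[1] - a[1], 2)
    var dq = Math.pow(q[0] - a[0], 2) + Math.pow(q[1] - a[1], 2)
    return dp - dq
  })
  var b = rest[1] // the long-edge neighbour of corners[0]
  // Math.atan2 avoids the division, so vertical edges work as well.
  return ((180 / Math.PI) * Math.atan2(a[1] - b[1], a[0] - b[0])).toFixed(2)
}

console.log(longEdgeAngle([[107, 0], [0, 21], [111, 24], [4, 45]]))  // -11.10
console.log(longEdgeAngle([[101, 0], [110, 22], [0, 38], [9, 60]])) // -20.62

Using Math.atan2 instead of Math.atan also sidesteps the division by zero you would otherwise get for a perfectly vertical edge.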