
In my project (C++/UE4), I have a lever mesh sticking out of the floor. Holding the left mouse button on this lever and moving the mouse initiates a dragging operation. This dragging operation is responsible for calculating the 2D delta mouse movements, and utilizing this data to rotate the lever *in local space*, which can only rotate on a single axis (negative or positive, but still only one axis).

But what if, instead of being in front of the lever, I'm actually behind it? What if I'm on one of its sides? What if the lever is actually sticking out of a wall instead of the floor? How do I make it so that mouse movements rotate the lever appropriately for the angle it's viewed from, regardless of the lever's orientation?


To further explain myself...
Here's a list of scenarios, and how I'd like the mouse to control them:

When the lever's on the FLOOR and you are in FRONT of it:

  • If you move the mouse UP (-Y), it should rotate away from the camera
  • If you move the mouse DOWN (+Y), it should rotate toward the camera

When the lever's on the FLOOR and you are BEHIND it:

  • If you move the mouse UP (-Y), it should rotate away from the camera
    (which is the opposite world-space direction of when you are in front of it)
  • If you move the mouse DOWN (+Y), it should rotate toward the camera
    (which is the opposite world-space direction of when you are in front of it)

When the lever's on the FLOOR and you are BESIDE it:

  • If you move the mouse LEFT (-X), it should rotate to the LEFT of the camera
    (which is the opposite direction of when you are on the other side of it)
  • If you move the mouse RIGHT (+X), it should rotate to the RIGHT of the camera
    (which is the opposite direction of when you are on the other side of it)

When the lever's on a WALL and you are in FRONT of it:

  • If you move the mouse UP, it should rotate UP (toward the sky)
  • If you move the mouse DOWN, it should rotate DOWN (toward the floor)

When the lever's on the WALL and you are BESIDE it:

  • Same as when it's on the wall and you are in front of it

PLEASE NOTE, if it helps at all, that UE4 does have built-in 2D/3D vector math functions, as well as easy ways to project and deproject coordinates to/from the 3D world or 2D screen. Because of this, I always know the exact world-space and screen-space coordinates of the mouse location, the lever's pivot (base) location, and the lever's handle (top) location, as well as the amount (delta) the mouse has moved each frame.

RectangleEquals

2 Answers


Get the pivot of the lever (the point around which it rotates) and project it to screen coordinates. Then, when you first click, store the screen coordinates of the click.

Now, when you need to know which way to rotate, compute the dot product between the vector from the pivot to the first click and the vector from the pivot to the current mouse location (normalize both vectors before taking the dot product). This gives you cos(angle) of how far the mouse has swept, and you can use it (take arccos(value) to get the angle) to move the lever in 3D. It will be a bit wonky, since the angle on screen is not the same as the projected angle, but it's easier to control this way (if you move the mouse 90 degrees, the lever moves 90 degrees, even if they don't align properly). Play with the setup and see what works best for your project.
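
A minimal sketch of that screen-space version in UE4 C++, assuming the pivot has already been projected to the screen (for example with APlayerController::ProjectWorldLocationToScreen) and the placeholder variables below are filled in elsewhere. A 2D cross product is added to recover the rotation direction, since the dot product alone is unsigned:

    // Screen-space positions, assumed to be provided by your projection/input code.
    FVector2D pivotScreen;      // lever pivot projected onto the screen
    FVector2D firstClickScreen; // mouse position stored on the initial click
    FVector2D currentScreen;    // mouse position this frame

    // Normalized vectors from the pivot to the first click and to the current position.
    FVector2D toFirstClick = (firstClickScreen - pivotScreen).GetSafeNormal();
    FVector2D toCurrent = (currentScreen - pivotScreen).GetSafeNormal();

    // Dot product of unit vectors gives cos(angle); clamp before acos for safety.
    float cosAngle = FVector2D::DotProduct(toFirstClick, toCurrent);
    float angleRadians = FMath::Acos(FMath::Clamp(cosAngle, -1.f, 1.f));

    // The sign of the 2D cross product tells you which way the mouse swung around
    // the pivot, i.e. which direction to rotate the lever about its single axis.
    float cross = toFirstClick.X * toCurrent.Y - toFirstClick.Y * toCurrent.X;
    float signedAngleRadians = (cross >= 0.f) ? angleRadians : -angleRadians;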

Another way to do it is this: when you first click, store the point at the end of the lever (or even better, the point where you clicked on the lever) in 3D space. Use the camera's projection plane to move that point in 3D (you can use the camera up vector, after making it orthogonal to the camera view direction, then take the view direction cross the up vector to get the right direction). Apply the mouse delta movements to the point, then project it into the rotation plane and move the lever to align with the projected point (the math is similar to the one above, just using the 3D points instead of the screen projections).
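
A rough sketch of this world-space variant, with the grabbed point, camera vectors, the lever's single rotation axis, and a drag scale all as placeholder assumptions (the handedness of the "right" vector below follows UE4's Z-up convention and may need flipping in practice):

    FVector grabPointWorld;   // 3D point on the lever stored on the initial click
    FVector cameraForward;    // camera view direction, normalized
    FVector cameraUp;         // camera up vector
    FVector leverPivot;       // base of the lever
    FVector rotationAxis;     // the lever's single rotation axis in world space, normalized
    FVector2D mouseDelta;     // mouse movement this frame, in screen units
    float dragScale = 1.f;    // screen-units-to-world-units factor, tune to taste

    // Build an orthonormal basis for the camera's projection plane.
    FVector planeUp = (cameraUp - cameraForward * (cameraUp | cameraForward)).GetSafeNormal();
    FVector planeRight = FVector::CrossProduct(planeUp, cameraForward);

    // Move the grabbed point within that plane by the mouse delta
    // (screen Y usually grows downward, hence the minus sign).
    grabPointWorld += (planeRight * mouseDelta.X - planeUp * mouseDelta.Y) * dragScale;

    // Project the moved point into the rotation plane (the plane through the pivot
    // whose normal is the rotation axis) and get the direction to swing toward.
    FVector offset = grabPointWorld - leverPivot;
    FVector inPlane = offset - rotationAxis * (offset | rotationAxis);
    FVector targetDirection = inPlane.GetSafeNormal();

The angle between targetDirection and the lever's current handle direction (pivot to handle, also projected into the rotation plane) is then the rotation to apply around the lever's axis.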

Caution: this doesn't work well if the camera is very close to the plane of rotation, since it's not always clear whether the lever should move forward or backward.

I'm not an expert on UE4 (just learning myself), but all of this is basic vector math and should be well supported. Check out the dot product and cross product on Wikipedia, since they're super useful for these kinds of tricks.

Sorin
  • I've updated my question to be more thorough about the different kind of scenarios that I have in mind. Will either of your suggested algorithms work for all listed cases? Thanks! – RectangleEquals Jun 28 '16 at 10:18
  • @RectangleEquals Both algorithms work in all cases, including most angles for both the camera and the lever (say, a lever on an incline). The weird case is when the lever is on the floor in front of you and it's supposed to go away from and toward the camera. With the screen-space method, you move the mouse to the left and the lever goes away from you. With the world-space method, you move the mouse down and the lever can go back or forward depending on your exact alignment. Both are kinda strange from a control perspective, but without a mouse that you can push into the table it's not going to be an easy fix. – Sorin Jun 28 '16 at 12:02
  • @RectangleEquals I suggest you try these methods first and get a feeling for the math. Once you're comfortable with that you can try some different mappings, or a combination of solutions, depending on your relative position with respect to the lever. You may also find that some methods are more intuitive in your context, even if they aren't perfect for control. – Sorin Jun 28 '16 at 12:06
  • Just to be clear, I have the world space (XYZ) coordinates of: `Pivot (Base of lever)`, & `Handle (Tip of lever)`. And I have the screen space (XY) coordinates of the mouse. I also have the ability to convert screen-space coords into world-space coords, and vice-versa. But converting into world-space gives 2 vectors: `XYZ Location (Just in front of the camera)`, and `XYZ Direction` (which means you lose the actual length of the vector)... So using this information, what would I need to normalize? What do you mean by "First Click" and "Current Position"? Aren't they the same thing? – RectangleEquals Jun 29 '16 at 01:10
  • And as far as the UE4 API is concerned... For 3D Vectors: https://docs.unrealengine.com/latest/INT/API/Runtime/Core/Math/FVector/DotProduct/index.html ... For 2D Vectors: https://docs.unrealengine.com/latest/INT/API/Runtime/Core/Math/FVector2D/DotProduct/index.html – RectangleEquals Jun 29 '16 at 01:15
  • So both take in 2 vectors and return a float – RectangleEquals Jun 29 '16 at 01:15
  • I might be doing it wrong, but I'm not sure what to do with the values I'm getting from `FVector::DotProduct` and `FMath::Acos`. Both values always seem positive (between 0.f and ~1.5f), and don't appear to tie into the mouse movement at all. Are these angles? Perhaps in radians? How am I supposed to know what direction to rotate based upon these float values? By the way, I'm using `USceneComponent::AddRelativeRotation` on the lever: https://docs.unrealengine.com/latest/INT/API/Runtime/Engine/Components/USceneComponent/AddRelativeRotation/index.html – RectangleEquals Jun 29 '16 at 04:18
  • Did you get this solved? I'd really be interested in seeing what you ended up with. – ryandlf Aug 23 '16 at 01:39
  • @ryandlf Nope, the company I was working for ended up taking an entirely different approach. Sorry! – RectangleEquals Oct 23 '17 at 00:56

Here's one approach:

  1. When the user clicks on the lever, suppose there is a plane through the pivot of the lever whose normal is the direction from the camera to the pivot. Calculate the intersection point of the cursor's ray with that plane.

    // Assumed inputs: the cursor's world-space ray (available e.g. from
    // APlayerController::DeprojectMousePositionToWorld) plus the camera and pivot positions.
    FVector rayOrigin;
    FVector rayDirection;
    FVector cameraPosition;
    FVector leverPivotPosition;

    // Normal of the camera-facing plane through the lever's pivot.
    FVector planeNormal = (leverPivotPosition - cameraPosition).GetSafeNormal(0.0001f);

    // Ray/plane intersection: t is the distance along the ray to the plane.
    // (A | B) is UE4's dot product operator; the denominator approaches zero when
    // the ray is nearly parallel to the plane, so guard against that in real code.
    float t = ((leverPivotPosition - rayOrigin) | planeNormal) / (planeNormal | rayDirection);

    FVector planeHitPosition = rayOrigin + rayDirection * t;
    
  2. Do a scalar projection of that point onto the lever's local top/bottom axis. Let's assume it's the local up/down axis:

    // The lever's local up axis expressed in world space, normalized.
    FVector leverLocalUpDirectionNormalized;

    // Scalar projection: how far along that axis the plane hit lies.
    float scalarPosition = planeHitPosition | leverLocalUpDirectionNormalized;
    
  3. Then, in each subsequent frame while the lever is held down, recalculate scalarPosition. As scalarPosition increases between frames, the lever should rotate toward its up side; as it decreases, it should rotate toward its down side (see the sketch below).

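Putting step 3 into code: a small sketch of the per-frame update. planeHitPosition and leverLocalUpDirectionNormalized come from steps 1 and 2 (recomputed with the current cursor ray each frame); previousScalarPosition, leverComponent, the pitch axis, and the degreesPerUnit factor are placeholder assumptions to adapt to your setup (AddRelativeRotation is the USceneComponent call linked in the comments above):

    // Each frame while the button is held: repeat steps 1 and 2 with the current
    // cursor ray to get this frame's planeHitPosition, then its scalar position.
    float newScalarPosition = planeHitPosition | leverLocalUpDirectionNormalized;

    // Rotate by the change since last frame. previousScalarPosition starts as the
    // value computed on the initial click.
    float deltaScalar = newScalarPosition - previousScalarPosition;
    previousScalarPosition = newScalarPosition;

    // degreesPerUnit converts world-space movement into degrees of rotation; pick a
    // value that feels right, and flip the sign if the lever moves the wrong way.
    const float degreesPerUnit = 1.f;
    leverComponent->AddRelativeRotation(FRotator(deltaScalar * degreesPerUnit, 0.f, 0.f));
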
Ruzihm