
I'm trying out the new ARKit to replace another similar solution I have. It's pretty great! But I can't seem to figure out how to move an ARAnchor programmatically. I want to slowly move the anchor to the left of the user.

Creating the anchor to be 2 meters in front of the user:

        var translation = matrix_identity_float4x4
        translation.columns.3.z = -2.0
        let transform = simd_mul(currentFrame.camera.transform, translation)

        let anchor = ARAnchor(transform: transform)
        sceneView.session.add(anchor: anchor)

Later, moving the object to the left/right of the user (along the x-axis), repeated every 50 milliseconds or so:

    anchor.transform.columns.3.x = anchor.transform.columns.3.x + 0.1

The above does not work because transform is a get-only property.

I need a way to change the position of an AR object in space relative to the user in a way that keeps the AR experience intact - meaning, if you move your device, the AR object will be moving but also won't be "stuck" to the camera like it's simply painted on, but moves like you would see a person move while you were walking by - they are moving and you are moving and it looks natural.

Please note the scope of this question relates only to how to move an object in space in relation to the user (ARAnchor), not in relation to a plane (ARPlaneAnchor) or to another detected surface (ARHitTestResult).

Thanks!

Ryan Pfister

1 Answer


You don't need to move anchors. (hand wave) That's not the API you're looking for...

Adding ARAnchor objects to a session is effectively about "labeling" a point in real-world space so that you can refer to it later. The point (1,1,1) (for example) is always the point (1,1,1) — you can't move it someplace else because then it's not the point (1,1,1) anymore.
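For completeness: since ARAnchor's transform is read-only, the closest thing to "moving" an anchor is replacing it. ARSession does have add(anchor:) and remove(anchor:), so a sketch of that replacement (not a recommendation) would look like this:

```swift
import ARKit

// Sketch only: ARAnchor.transform is read-only, so the only way to "move"
// an anchor is to replace it -- remove the old one and add a new one at
// the shifted transform. This re-labels a point; it does not animate content.
func shiftAnchor(_ anchor: ARAnchor, byX deltaX: Float, in session: ARSession) -> ARAnchor {
    var transform = anchor.transform      // local copy; the property itself is get-only
    transform.columns.3.x += deltaX
    session.remove(anchor: anchor)
    let moved = ARAnchor(transform: transform)
    session.add(anchor: moved)
    return moved
}
```

Repeating that every 50 ms would churn the session's anchor list (and fire a remove/add delegate cycle per step), which is part of why it's the wrong tool for animation.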

To make a 2D analogy: anchors are reference points, sort of like the bounds of a view. The system (or another piece of your code) tells the view where its boundaries are, and the view draws its content relative to those boundaries. Anchors in AR give you reference points you can use for drawing content in 3D.

What you're asking is really about moving (and animating the movement of) virtual content between two points. And ARKit itself really isn't about displaying or animating virtual content — there are plenty of great graphics engines out there, so ARKit doesn't need to reinvent that wheel. What ARKit does is provide a real-world frame of reference for you to display or animate content using an existing graphics technology like SceneKit or SpriteKit (or Unity or Unreal, or a custom engine built with Metal or GL).
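To make "frame of reference" concrete, the transform math from the question can be factored out on its own: ARKit hands you the camera's pose, and composing it with a camera-space offset yields a world-space transform you then hand to whatever engine draws your content. A sketch, assuming only the simd types ARKit already uses:

```swift
import simd

// Compose the camera's pose with a camera-space offset to get a world-space
// transform. In ARKit's camera space, -z is "in front of" the camera, so
// offset = (0, 0, -2) means 2 m ahead, and offset.x moves left/right.
func worldTransform(cameraTransform: simd_float4x4,
                    cameraSpaceOffset offset: SIMD3<Float>) -> simd_float4x4 {
    var translation = matrix_identity_float4x4
    translation.columns.3 = SIMD4<Float>(offset.x, offset.y, offset.z, 1)
    return simd_mul(cameraTransform, translation)
}
```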


Since you mentioned trying to do this with SpriteKit... beware, it gets messy. SpriteKit is a 2D engine, and while ARSKView provides some ways to shoehorn a third dimension in there, those ways have their limits.

ARSKView automatically updates the xScale, yScale, and zRotation of each sprite associated with an ARAnchor, providing the illusion of 3D perspective. But that applies only to nodes attached to anchors, and as noted above, anchors are static.

You can, however, add other nodes to your scene, and use those same properties to make those nodes match the ARSKView-managed nodes. Here's some code you can add/replace in the ARKit/SpriteKit Xcode template project to do that. We'll start with some basic logic to run a bouncing animation on the third tap (after using the first two taps to place anchors).

var anchors: [ARAnchor] = []
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    // Start bouncing on touch after placing 2 anchors (don't allow more)
    if anchors.count > 1 {
        startBouncing(time: 1)
        return
    }
    // Create anchor using the camera's current position
    guard let sceneView = self.view as? ARSKView else { return }
    if let currentFrame = sceneView.session.currentFrame {

        // Create a transform with a translation of 30 cm in front of the camera
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.3
        let transform = simd_mul(currentFrame.camera.transform, translation)

        // Add a new anchor to the session
        let anchor = ARAnchor(transform: transform)
        sceneView.session.add(anchor: anchor)
        anchors.append(anchor)
    }
}

Then, some SpriteKit fun for making that animation happen:

var ballNode: SKLabelNode = {
    let labelNode = SKLabelNode(text: "🏀")
    labelNode.horizontalAlignmentMode = .center
    labelNode.verticalAlignmentMode = .center
    return labelNode
}()
func startBouncing(time: TimeInterval) {
    guard
        let sceneView = self.view as? ARSKView,
        let first = anchors.first, let start = sceneView.node(for: first),
        let last = anchors.last, let end = sceneView.node(for: last)
        else { return }

    if ballNode.parent == nil {
        addChild(ballNode)
    }
    ballNode.setScale(start.xScale)
    ballNode.zRotation = start.zRotation
    ballNode.position = start.position

    let scale = SKAction.scale(to: end.xScale, duration: time)
    let rotate = SKAction.rotate(toAngle: end.zRotation, duration: time)
    let move = SKAction.move(to: end.position, duration: time)

    let scaleBack = SKAction.scale(to: start.xScale, duration: time)
    let rotateBack = SKAction.rotate(toAngle: start.zRotation, duration: time)
    let moveBack = SKAction.move(to: start.position, duration: time)

    let action = SKAction.repeatForever(.sequence([
        .group([scale, rotate, move]),
        .group([scaleBack, rotateBack, moveBack])
        ]))
    ballNode.removeAllActions()
    ballNode.run(action)
}

Here's a video so you can see this code in action. You'll notice that the illusion only works as long as you don't move the camera — not so great for AR. When using SKAction, we can't adjust the start/end states of the animation while animating, so the ball keeps bouncing back and forth between its original (screen-space) positions/rotations/scales.

You could do better by animating the ball directly, but it's a lot of work. You'd need to, on every frame (or every view(_:didUpdate:for:) delegate callback):

  1. Save off the updated position, rotation, and scale values for the anchor-based nodes at each end of the animation. You'll need to do this twice per didUpdate callback, because you'll get one callback for each node.

  2. Work out position, rotation, and scale values for the node being animated, by interpolating between the two endpoint values based on the current time.

  3. Set the new attributes on the node. (Or maybe animate it to those attributes over a very short duration, so it doesn't jump too much in one frame?)
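A sketch of the interpolation in step 2, in plain Swift (the SpriteState struct and function names here are made up for illustration, not any SpriteKit or ARKit API):

```swift
// Linear interpolation for step 2: blend position, rotation, and scale
// between the two saved endpoint states by a 0...1 progress value.
// SpriteState and these names are illustrative, not part of any API.
struct SpriteState {
    var position: SIMD2<Float>
    var zRotation: Float
    var scale: Float
}

func lerp(_ a: Float, _ b: Float, _ t: Float) -> Float {
    return a + (b - a) * t
}

func interpolate(from start: SpriteState, to end: SpriteState, progress t: Float) -> SpriteState {
    SpriteState(
        position: SIMD2(lerp(start.position.x, end.position.x, t),
                        lerp(start.position.y, end.position.y, t)),
        zRotation: lerp(start.zRotation, end.zRotation, t),
        scale: lerp(start.scale, end.scale, t)
    )
}
```

You'd fill one SpriteState per endpoint in step 1, then derive a 0...1 progress from the current time; a triangle wave over that progress gives the back-and-forth bounce.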

That's kind of a lot of work to shoehorn a fake 3D illusion into a 2D graphics toolkit — hence my comments about SpriteKit not being a great first step into ARKit.


If you want 3D positioning and animation for your AR overlays, it's a lot easier to use a 3D graphics toolkit. Here's a repeat of the previous example, but using SceneKit instead. Start with the ARKit/SceneKit Xcode template, take the spaceship out, and paste the same touchesBegan function from above into the ViewController. (Change the as ARSKView casts to as ARSCNView, too.)

Then, some quick code for placing 2D billboarded sprites, matching via SceneKit the behavior of the ARKit/SpriteKit template:

// in global scope
func makeBillboardNode(image: UIImage) -> SCNNode {
    let plane = SCNPlane(width: 0.1, height: 0.1)
    plane.firstMaterial!.diffuse.contents = image
    let node = SCNNode(geometry: plane)
    node.constraints = [SCNBillboardConstraint()]
    return node
}

// inside ViewController
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // emoji to image based on https://stackoverflow.com/a/41021662/957768
    let billboard = makeBillboardNode(image: "⛹️".image())
    node.addChildNode(billboard)
}

Finally, adding the animation for the bouncing ball:

let ballNode = makeBillboardNode(image: "🏀".image())
func startBouncing(time: TimeInterval) {
    guard
        let sceneView = self.view as? ARSCNView,
        let first = anchors.first, let start = sceneView.node(for: first),
        let last = anchors.last, let end = sceneView.node(for: last)
        else { return }

    if ballNode.parent == nil {
        sceneView.scene.rootNode.addChildNode(ballNode)
    }

    let animation = CABasicAnimation(keyPath: #keyPath(SCNNode.transform))
    animation.fromValue = start.transform
    animation.toValue = end.transform
    animation.duration = time
    animation.autoreverses = true
    animation.repeatCount = .infinity
    ballNode.removeAllAnimations()
    ballNode.addAnimation(animation, forKey: nil)
}

This time the animation code is a lot shorter than the SpriteKit version. Here's how it looks in action.

Because we're working in 3D to start with, we're actually animating between two 3D positions — unlike in the SpriteKit version, the animation stays where it's supposed to. (And without the extra work for directly interpolating and animating attributes.)

rickster
  • Thanks for taking the time to reply. I am using the 2D version, SpriteKit scenes, just to keep things simple. If I move the SKSpriteNode to the left of the ARAnchor, when I move the iPhone around while it's moving it looks... unnatural. The SpriteNode moves with the phone, just X amount to the left. If there was a way for me to move the actual anchor however, it seems that would look more natural, because as you are moving the device the object is moving in relation to the device. That is essentially what AR frameworks like Wikitude do in order to make the AR elements look good. – Ryan Pfister Jun 07 '17 at 04:47
  • Ah. To move things in SpriteKit you'll need a different approach, but the core idea is the same: don't place content (SKNodes) at the anchors, and instead use the anchors to get position/scale/etc that you can apply to other nodes you position and animate directly. – rickster Jun 07 '17 at 05:01
  • By the way, I'm not sure SpriteKit is keeping things simple when it comes to getting yourself started with ARKit. AR is inherently 3D, so using a 2D graphics technology with it means some extra conceptual shenanigans to squeeze in the extra dimension. – rickster Jun 07 '17 at 05:19
  • I disagree that there is any difference between 2D and 3D objects. Moving a flat image left and right and up and down and further and closer doesn't have to be any different than a 3D object. With Wikitude (the framework that I currently use to place objects in an AR environment) it handles the two scenarios the same, and I can build a natural feeling user experience because I can move objects (2D or 3D) with ease by simply changing where they are relative to the user (2 SDUs away, 3 SDUs away) and it all just works. Moving the device while the object is in motion is handled properly. – Ryan Pfister Jun 07 '17 at 13:12
  • After re-reading my question and thinking about it a bit, I realized that my question was ambiguous and nonspecific enough to what my needs were. I edited the question to attempt to clarify what exactly I'm trying to do. – Ryan Pfister Jun 07 '17 at 13:50
  • 1
    @RyanPfister If you want to move things in 3D, regardless of whether those things are 3D models or 2D sprites, you'll have a much easier time with a 3D graphics toolkit. Edited the answer to expand and clarify. – rickster Jun 13 '17 at 19:22
  • 1
    Wow, truly amazing job and effort demonstrating the difference. And what a truly amazing framework ARKit is. Wow. – Ryan Pfister Jun 13 '17 at 23:48
  • Thanks, happy to help. BTW, kinda wish I could do video embeds on Stack sites... I've done animated GIFs for some of these before, but some demos really call for actual video. – rickster Jun 21 '17 at 18:32
  • Hi, thanks your answer is really cool and helps a lot to plan moves between ARAnchors. I have a small issue though, did you manage to extend some more complex animations between the anchors like curves : I am trying to setup a CAKeyframeAnimation for keyPath SCNNode.position (also tried SCNode.transform) with start/end control points offseted by y but I am not able to see anything, how do you apply a bezier path (which represent some kind of path in 2D space) on the SCNNode which moves in 3D space ? – michael-martinez Apr 03 '18 at 09:36
  • None of Apple's frameworks include types for describing a 3D Bézier curve. You can use `CAKeyframeAnimation` with a series of 3D points to move directly (linearly) from point to point, though, so you can move along a "curved" path by doing your own math to discretize it to a series of line segments. – rickster Apr 03 '18 at 19:01
  • @rickster hi, you're animation is amazing, great job! I'm working in ARSCNView and I need to animate the Anchors themselves to follow the device (like in the question). The reason being I'm working with the MultiUserDemo that is a shared session. I contacted Apple and they said that the only way shared experience users can see each others nodes is if they are attached to ARAnchors. Any advice? – Lance Samaria Feb 20 '20 at 09:35
  • @LanceSamaria The statement you got from Apple is at best an oversimplification. Yes, you need shared anchors, but only to set a shared frame of reference. Given that, you can work out camera pose relative to it, use any means you like to send it over the wire(less) to other devices, and use the same reference on the other device to reconstruct the pose and display something there. IIRC Apple’s own SwiftShot sample does this to show the opponent interacting with slingshots. – rickster Feb 20 '20 at 15:35
  • @rickster thks for responding. I didn't know you could reconstruct it on the other user's end, good to know. I'll post a question about that. Question about your code, it 100% works, I tried something different though. Instead of adding 2 ball players, I add only 1 and no ball. Then in renderer(willRenderScene:) I call your startBouncing(time:) method to move the one player to follow a node that is a child of the camera (sceneView.pointOfView.addChildNode(abcNode)) but it doesn't follow it, the player sits still. I switched last & end to abcNode but nothing happens? Where am I going wrong at? – Lance Samaria Feb 20 '20 at 16:12
  • @LanceSamaria that, too, sounds like too much for a comment. Post a question? – rickster Feb 20 '20 at 18:59
  • @rickster I added the question here: https://stackoverflow.com/q/60327523/4833705 In your answer above your animation code works flawlessly. I copied it and made some adjustments but nada. I can't figure out where I'm going wrong at. – Lance Samaria Feb 20 '20 at 19:53
  • @rickster I've been playing around with this for hours. I finally got it to work using your animation but it's working in reverse. When I turn the device up the image goes right and when I turn the device down the image goes left. When I turn the device left the image goes up and when I turn the device right the image goes down. It's following the image I have tied to the camera but it's following it incorrectly. – Lance Samaria Feb 21 '20 at 00:19
  • Hello @rickster! I was reading your post earlier about AVCameraCalibrationData on https://stackoverflow.com/questions/48093509/how-can-i-get-camera-calibration-data-on-ios-aka-avcameracalibrationdata. I am wondering whether you could give me some help on this: https://stackoverflow.com/questions/62927167/swift-get-the-truthdepth-camera-parameters-for-face-tracking-in-arkit. Thanks!! – swiftlearneer Jul 16 '20 at 22:26