In RealityKit 2.0, unlike AR Quick Look, only a single-finger drag gesture is implemented for moving a model (a two-finger gesture for vertical drag isn't implemented at the moment). A single-finger drag moves an entity along its anchoring plane – as a rule that's the XZ plane – so there's no Y-axis drag.
public static let translation: ARView.EntityGestures
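For example, installing just this translation gesture on an entity could look like the following sketch (the sphere here is an arbitrary stand-in; collision shapes are required for any of RealityKit's built-in gestures to work):
// Sketch: install only the built-in single-finger translation gesture.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))
sphere.generateCollisionShapes(recursive: true)    // gestures need collision
arView.installGestures(.translation, for: sphere)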
Despite this, you can additionally implement a 2D UIGestureRecognizer. In the same way you can implement a two-finger pan gesture (also known as the levitation gesture) like the one in AR Quick Look apps – see the sketch after the example below.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var box: ModelEntity? = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        // Create a box and give it collision shapes –
        // RealityKit's built-in gestures require collision.
        box = ModelEntity(mesh: .generateBox(size: 0.05))
        box!.generateCollisionShapes(recursive: true)
        arView.installGestures([.all], for: box! as (Entity & HasCollision))

        // Anchor the box 20 cm in front of the world origin.
        let anchor = AnchorEntity(world: [0, 0, -0.2])
        anchor.addChild(box!)
        arView.scene.anchors.append(anchor)

        // Register 2D swipe recognizers for vertical (Y-axis) movement.
        for swipe in [UISwipeGestureRecognizer.Direction.up,
                      UISwipeGestureRecognizer.Direction.down] {
            let sw = UISwipeGestureRecognizer(target: self,
                                              action: #selector(dragUpAndDown))
            sw.direction = swipe
            arView.addGestureRecognizer(sw)
        }
    }

    // Move the box up or down by 1 cm per swipe.
    @objc func dragUpAndDown(recognizer: UISwipeGestureRecognizer) {
        if recognizer.direction == .up {
            box!.position.y += 0.01
        }
        if recognizer.direction == .down {
            box!.position.y -= 0.01
        }
    }
}
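As for the two-finger levitation pan mentioned above, here is a hedged sketch of hypothetical additions to the ViewController above – the minimumNumberOfTouches setting and the 0.001 sensitivity factor are my own assumptions, not AR Quick Look's actual values:
// Sketch of a two-finger "levitation" pan – not one of RealityKit's
// built-in gestures; hypothetical methods for the ViewController above.
func installLevitationGesture() {
    let pan = UIPanGestureRecognizer(target: self, action: #selector(levitate))
    pan.minimumNumberOfTouches = 2
    arView.addGestureRecognizer(pan)
}

@objc func levitate(recognizer: UIPanGestureRecognizer) {
    // Screen Y grows downward, so invert the sign; 0.001 m per point
    // is an arbitrary sensitivity factor chosen for illustration.
    let translation = recognizer.translation(in: arView)
    box?.position.y -= Float(translation.y) * 0.001
    recognizer.setTranslation(.zero, in: arView)
}
Call installLevitationGesture() from viewDidLoad() to register the recognizer.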
P.S.
This post also shows how raycasting works in conjunction with RealityKit gestures.
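As a rough idea of that combination, a tap handler might raycast from the touch point and reposition the entity on a detected plane – a sketch assuming horizontal plane detection is enabled in the session configuration, with handleTap being a hypothetical handler name:
// Sketch: reposition the box on a detected horizontal plane via raycasting.
// Assumes the ARSession configuration has horizontal plane detection enabled.
@objc func handleTap(recognizer: UITapGestureRecognizer) {
    let point = recognizer.location(in: arView)
    guard let result = arView.raycast(from: point,
                                      allowing: .estimatedPlane,
                                      alignment: .horizontal).first
    else { return }
    // Extract the world position from the raycast result's transform.
    let worldPosition = SIMD3<Float>(result.worldTransform.columns.3.x,
                                     result.worldTransform.columns.3.y,
                                     result.worldTransform.columns.3.z)
    box?.setPosition(worldPosition, relativeTo: nil)
}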