I need to be able to tap on the view, get the X, Y tap position, and then have a SceneKit node appear at that position, including depth. So if I tap near the top of the screen, the node appears far away, and if I tap lower, it appears closer. It needs to take into account the camera's rotation and lateral position, and the node should appear exactly where I touched on screen. Snapchat does this very well with its newest augmented reality objects, as seen here.
I have heard about SceneKit's projected/unprojected points, and tried the answer to this question: How to use iOS (Swift) SceneKit SCNSceneRenderer unprojectPoint properly
let projectedOrigin = scnView.projectPoint(SCNVector3Zero)
let vpWithZ = SCNVector3(x: tapX, y: tapY, z: projectedOrigin.z)
let worldPoint = scnView.unprojectPoint(vpWithZ)
However, the node moved to worldPoint appears much too close wherever I tap (how do I scale it?), and even though it shows signs of working laterally, it is very inaccurate. I want the node to simply move along the xz plane, as in Snapchat. How exactly does the projection system work (it is not well explained elsewhere), and is there a better alternative?
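My current understanding is that unprojectPoint maps a screen point plus a normalized depth (z = 0 at the near clipping plane, z = 1 at the far plane) back into world space, so a single screen point really defines a ray. Here is a rough, untested sketch of what I have been experimenting with: unprojecting the tap at both depths to get that ray, then intersecting it with the ground plane y = 0 (scnView is my SCNView; the y = 0 plane is my assumption for where objects should sit):

```swift
import SceneKit

// Unproject the tap at the near (z = 0) and far (z = 1) depths to get a
// ray in world coordinates, then intersect that ray with the plane y = 0.
func worldPosition(for tap: CGPoint, in scnView: SCNView) -> SCNVector3? {
    let near = scnView.unprojectPoint(SCNVector3(x: Float(tap.x), y: Float(tap.y), z: 0))
    let far  = scnView.unprojectPoint(SCNVector3(x: Float(tap.x), y: Float(tap.y), z: 1))

    let direction = SCNVector3(x: far.x - near.x, y: far.y - near.y, z: far.z - near.z)
    // Ray is parallel to the plane: no usable intersection.
    guard abs(direction.y) > .ulpOfOne else { return nil }

    // Solve near.y + t * direction.y == 0 for t.
    let t = -near.y / direction.y
    guard t > 0 else { return nil } // intersection is behind the camera

    return SCNVector3(x: near.x + t * direction.x, y: 0, z: near.z + t * direction.z)
}
```

Is this ray-plane intersection the right way to think about it, or is there a built-in approach I am missing?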
Thank you.