
How can I obtain the colors of `ARFrame.rawFeaturePoints`? The task seems to require interacting with the iPhone's GPU through Metal, but I have no previous experience working with Metal (perhaps it could instead be done through the Vision framework?) and I have no idea where to start. If you happen to be a Metal guru and can throw together a brief example illustrating the concept, I would really appreciate it. Btw, I've tried mapping `ARFrame.rawFeaturePoints` from 3D to 2D and then sampling `ARFrame.capturedImage` at those points, based on these posts:

iOS11 ARKit: Can ARKit also capture the Texture of the user's face?

https://medium.com/@ugiacoman/arkit-tool-or-toy-bbaf8cd70338

but it hits the application's performance drastically, probably due to the amount of memory copying and the YCbCr-to-RGB transform. So the only option left for me is the GPU...
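For reference, here is a minimal CPU-side sketch of the approach I tried: project each raw feature point into `capturedImage` pixel coordinates with `ARCamera.projectPoint(_:orientation:viewportSize:)`, then read the Y and interleaved CbCr planes directly and convert to RGB. It assumes the buffer is in ARKit's usual `420YpCbCr8BiPlanarFullRange` format and a landscape-right orientation; the helper name and the full-range BT.601 coefficients are my own choices, and a faster version would move this loop into a Metal compute shader.

```swift
import ARKit

// Hypothetical helper: sample a per-feature-point color from ARFrame.capturedImage.
// Assumes kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (plane 0 = Y, plane 1 = CbCr).
func featurePointColors(for frame: ARFrame) -> [(point: vector_float3, rgb: SIMD3<Float>)] {
    guard let points = frame.rawFeaturePoints?.points else { return [] }
    let buffer = frame.capturedImage
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let width   = CVPixelBufferGetWidthOfPlane(buffer, 0)
    let height  = CVPixelBufferGetHeightOfPlane(buffer, 0)
    let yBase   = CVPixelBufferGetBaseAddressOfPlane(buffer, 0)!.assumingMemoryBound(to: UInt8.self)
    let yStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
    let cBase   = CVPixelBufferGetBaseAddressOfPlane(buffer, 1)!.assumingMemoryBound(to: UInt8.self)
    let cStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)

    var result: [(vector_float3, SIMD3<Float>)] = []
    for p in points {
        // Project the world-space point into capturedImage pixel coordinates.
        let px = frame.camera.projectPoint(p,
                                           orientation: .landscapeRight,
                                           viewportSize: CGSize(width: width, height: height))
        let x = Int(px.x), row = Int(px.y)
        guard x >= 0, x < width, row >= 0, row < height else { continue }

        let Y  = Float(yBase[row * yStride + x])
        // Chroma plane is half resolution with interleaved Cb/Cr bytes.
        let cb = Float(cBase[(row / 2) * cStride + (x / 2) * 2])
        let cr = Float(cBase[(row / 2) * cStride + (x / 2) * 2 + 1])

        // Full-range BT.601 YCbCr -> RGB conversion.
        let r = Y + 1.402 * (cr - 128)
        let g = Y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
        let b = Y + 1.772 * (cb - 128)
        result.append((p, SIMD3<Float>(r, g, b) / 255))
    }
    return result
}
```

Even with the buffer locked once per frame, touching hundreds of points per frame on the CPU (plus the per-point conversion) is roughly where the slowdown shows up, which is why I'm looking for a Metal version.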

jscs
Lu4
  • Have you made any progress? I am currently looking into this problem. – ɯɐɹʞ Sep 09 '19 at 20:38
  • @ɯɐɹʞ It turned out to be a very complex problem. The reason I decided not to proceed with my own solution is that I realized it might take a huge amount of effort to do it right (not just a proof-of-concept held together on a knee); taking into account proper support for all the devices I needed to cover, it's too small a fish to fry, although it would have been a great API if Apple ever published it. I also know that the AR tech is currently a bit hacky; in the future we should get real depth estimation as an image, we just need to wait for that moment... Unfortunately... – Lu4 Sep 09 '19 at 22:41

0 Answers