Disclaimer: This is a very theoretical idea that I'm experimenting with, and I understand there are hard limits from CPU hardware and physics; this is more about seeing how far I can get.
The Question: How do you measure the difference, in nanoseconds, between time as experienced by the host device and the standard time the device would normally be synced to?
According to this article, you experience nanoseconds of time dilation relative to a clock at sea level every time you fly on an airplane.
The idea is that you open the app before you get on a plane, and when you land you get a notification saying: "You've Time Traveled XXX Nanoseconds!"
I assume that any standard date/time utilities in Swift are subject to the system clock, which is automatically synced to an NTP server.
I'm wondering if there's a way to measure a time duration independently of any clock that's synced to a remote server or carrier.
If not, can this be accomplished with a lower-level language like Rust?
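For context, here's the closest thing I've found so far; a minimal Swift sketch assuming `CLOCK_MONOTONIC_RAW` is the right clock for this, since per the Darwin man page it isn't adjusted when NTP slews or steps the system clock (the usage pattern and whether this survives the app being backgrounded for a whole flight are my open questions):

```swift
import Foundation

// Read a monotonic tick counter that is unaffected by the frequency/time
// adjustments NTP applies to the wall clock. The trade-off: it runs off the
// device's own oscillator, whose parts-per-million drift is many orders of
// magnitude larger than the nanoseconds of dilation I'm trying to detect.
func rawTicksNanoseconds() -> UInt64 {
    clock_gettime_nsec_np(CLOCK_MONOTONIC_RAW)
}

// Hypothetical usage: record at takeoff, compare at landing.
let takeoff = rawTicksNanoseconds()
// ... flight happens; the app would have to keep this value around,
// and I'd still need to verify the counter's behavior across device sleep ...
let landing = rawTicksNanoseconds()
print("Elapsed on the device's own oscillator: \(landing - takeoff) ns")
```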
I've seen iOS APIs that can parse nanoseconds; I just can't find any resources on the other factors you would have to mitigate to actually measure time dilation.
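For reference, this is roughly the size of the effect I'm trying to detect, as a back-of-envelope estimate with made-up cruise numbers (the altitude, speed, and duration below are assumptions, not measurements):

```swift
import Foundation

// Gravitational term: a clock higher in the gravity well runs faster.
// Kinematic term: a moving clock runs slower (first-order approximation).
let c = 299_792_458.0            // speed of light, m/s
let g = 9.80665                  // standard gravity, m/s^2

let altitude = 11_000.0          // assumed cruise altitude above the reference clock, m
let speed = 250.0                // assumed ground speed, m/s
let duration = 6.0 * 3600.0      // assumed flight time, s

let gravitational = g * altitude / (c * c)       // ~1.2e-12 fractional rate gain
let kinematic = (speed * speed) / (2 * c * c)    // ~3.5e-13 fractional rate loss

let offsetNs = (gravitational - kinematic) * duration * 1e9
print(String(format: "Expected offset: about %+.0f ns", offsetNs))
// Roughly +18 ns for these numbers: the airborne clock comes back slightly ahead.
```

So the notification number itself is easy to compute from flight data; what I can't figure out is how you'd ever *measure* tens of nanoseconds against a consumer oscillator.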