
The idea is that Phone A sends a sound signal and a Bluetooth signal at the same time, and Phone B calculates the delay between the two signals.

In practice I am getting inconsistent results, with delays ranging from 90 ms to 160 ms. I have tried optimizing both ends as much as possible.

On the output end:
  • The tone is generated once.
  • Bluetooth and audio output each have their own thread.
  • Bluetooth only sends after AudioTrack.write, and AudioTrack is in streaming mode, so it should start outputting before the write even completes.

On the receiving end:
  • Again, two separate threads.
  • The system time is recorded before each AudioRecord.read.

Sampling specs:
  • 44.1 kHz
  • Reading the entire buffer
  • Sampling 100 samples at a time using an FFT
  • Taking into account how many samples have been transformed since the initial read()
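Sketched out, the receive-side timing math I'm relying on looks like this (hypothetical names; a Goertzel filter stands in here for the 100-sample FFT, since only one tone frequency matters):

```java
// Sketch of the receive-side bookkeeping: estimate the tone's arrival time
// from the system time recorded before AudioRecord.read() plus the sample
// offset at which the tone is first detected. Names are illustrative.
public class ToneTiming {
    static final int SAMPLE_RATE = 44100; // 44.1 kHz, as above

    // Goertzel magnitude of one target frequency over a block of samples --
    // cheaper than a full FFT when only one tone frequency matters.
    static double goertzel(short[] samples, int offset, int len, double freqHz) {
        double k = 2 * Math.cos(2 * Math.PI * freqHz / SAMPLE_RATE);
        double s0, s1 = 0, s2 = 0;
        for (int i = 0; i < len; i++) {
            s0 = samples[offset + i] + k * s1 - s2;
            s2 = s1;
            s1 = s0;
        }
        return Math.sqrt(s1 * s1 + s2 * s2 - k * s1 * s2);
    }

    // Convert the first detected sample offset into an absolute timestamp,
    // given the clock reading taken just before read() filled this buffer.
    static long arrivalTimeMillis(long readStartMillis, int sampleOffset) {
        return readStartMillis + Math.round(1000.0 * sampleOffset / SAMPLE_RATE);
    }
}
```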

user624056
  • You will probably get a better response if you clarify and expand upon your question. It is hard to tell what exactly you are asking. – FoamyGuy Mar 04 '13 at 22:31
  • Sound travels at 331 meters per 1000 ms, i.e. 3.31 meters per 10 ms. What makes you think you can time the arrival of a bluetooth signal and an audio signal with a commercial device? – Morrison Chang Mar 04 '13 at 22:42
  • The fact that this commercial device has a 1.4 GHz operating frequency and will run through four CPU cycles before light travels one foot. Cycles don't equal instructions, but I am timing sound latency, which is very slow (1 ms per foot), ~1,400,000 CPU cycles. – user624056 Mar 04 '13 at 23:05

2 Answers


Your method relies on essentially zero latency throughout the whole pipeline, which is realistically impossible. You just can't synchronize it to that degree of accuracy. If you could get the delays down to 5-6 ms it might be possible, but you'll beat your head against your keyboard before that happens. Even then, it could only be accurate to about 1.5 meters.

Consider the lower end of the delays you're receiving. In 90ms, sound can travel slightly over 30m. That's the very end of the marketed bluetooth range, without even considering that you'll likely be in non-ideal transmitting conditions.
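To put numbers on that (a trivial check; 343 m/s is the assumed speed of sound in air at ~20 °C):

```java
// Sanity check on the scale of the problem: how far sound travels during
// the observed latency window. 343 m/s assumes air at roughly 20 °C.
public class LatencyDistance {
    static double metersTraveled(double latencySeconds) {
        return 343.0 * latencySeconds;
    }
}
```

At the 90 ms lower bound this gives roughly 31 meters of ambiguity, which dwarfs any plausible indoor distance you'd want to measure.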

Here's a thread discussing low latency audio in Android. TL;DR is that it sucks, but is getting better. With the latest APIs and recent devices, you may be able to get it down to 30ms or so, assuming you run some hand-tuned audio functions. No simple AudioTrack here. Even then, that's still a good 10-meter circular error probability.

Edit:

A better approach, assuming you can synchronize the devices' clocks, would be to embed a timestamp in the audio signal, using simple AM/FM modulation or a pulse train. Then you could decode it at the other end and know when it was sent. You still have to deal with the latency problem, but it simplifies the whole thing nicely. There's no need for Bluetooth at all; it isn't a reliable clock anyway, since it has latency problems of its own.
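For instance, a minimal on-off-keying sketch (the carrier frequency, bit rate, and names here are only illustrative, not a recommended scheme):

```java
// Illustrative sketch: embed a 32-bit timestamp in audio via on-off keying
// of a 4 kHz carrier, one bit per 10 ms slot. Not a tuned or robust scheme.
public class TimestampOok {
    static final int SAMPLE_RATE = 44100;
    static final double CARRIER_HZ = 4000;
    static final int SAMPLES_PER_BIT = SAMPLE_RATE / 100; // 10 ms per bit

    // Encode the low 32 bits of a timestamp, MSB first: carrier on = 1, off = 0.
    static short[] encode(long timestampMillis) {
        short[] out = new short[32 * SAMPLES_PER_BIT];
        for (int bit = 0; bit < 32; bit++) {
            if (((timestampMillis >> (31 - bit)) & 1) == 0) continue; // slot stays silent
            for (int i = 0; i < SAMPLES_PER_BIT; i++) {
                double t = (double) i / SAMPLE_RATE;
                out[bit * SAMPLES_PER_BIT + i] =
                        (short) (Short.MAX_VALUE * 0.5 * Math.sin(2 * Math.PI * CARRIER_HZ * t));
            }
        }
        return out;
    }

    // Decode by thresholding per-slot signal energy.
    static long decode(short[] samples) {
        long value = 0;
        for (int bit = 0; bit < 32; bit++) {
            double energy = 0;
            for (int i = 0; i < SAMPLES_PER_BIT; i++) {
                double s = samples[bit * SAMPLES_PER_BIT + i];
                energy += s * s;
            }
            value = (value << 1) | (energy > 1e6 ? 1 : 0);
        }
        return value;
    }
}
```

In a real deployment you'd want slot synchronization and error detection on top, but the round trip shows the idea.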

Geobits
  • Thank you for the reply. The sound is recording constantly, and I timed just the sound record: it takes 44-48 ms to read 2048 samples, which is very close to the 46 ms it should take. The FFT takes 0-1 ms. I don't see where the 90 ms latency is coming from. I am currently sending bits through sound, so I could send a timestamp, but I want to avoid syncing the clocks. I think that approach would be necessary if the bluetooth hardware/overhead were too slow, but it is very fast. – user624056 Mar 04 '13 at 22:56
  • How long does it take between the time your code says "play" and sound physically comes out of the speaker? Same for the microphone's side. That's one of the biggest sources of delay in most audio applications, and since it's not normally very consistent, it's hard to compensate for. The fact that you record 2048 samples in ~46ms doesn't mean much if you're not sure *when* those samples got to the phone. Any approach is going to have to battle that, but audio latency is notoriously bad on Android. – Geobits Mar 04 '13 at 23:00
  • Using setPlaybackPositionUpdateListener, I get a delay of about 14 ms before the function is called. I have no way of measuring the delay besides that function. – user624056 Mar 05 '13 at 01:16
  • If you're getting 14ms, then I don't think that's a reliable way to measure latency. I've done quite a bit of research on this in relation to an app, and have never seen a report of latency that low on any consumer Android device. – Geobits Mar 05 '13 at 01:22
  • +1: Nice answer and well reasoned. (The embedded timestamp is particularly clever.) – tom10 Mar 22 '13 at 02:52

This paper gives you a pretty good approach: http://netscale.cse.nd.edu/twiki/pub/Main/Projects/Analyze_the_frequency_and_strength_of_sound_in_Android.pdf

You have to create a 1 kHz sound at some known amplitude (measured in dB) and try to measure the amplitude of the sound arriving at the other device. From the attenuation you might be able to estimate the distance.

As I remember: a0 = 20*log10(4*pi*distance/lambda), where a0 is the attenuation and lambda is the wavelength (you can compute it from the 1 kHz frequency). But in such a sensitive environment, noise might spoil the whole thing. Just an idea of how I would do it if I were you.
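Solving that formula for distance gives d = lambda * 10^(a0/20) / (4*pi); a sketch (assuming a base-10 log and ~343 m/s for the speed of sound):

```java
// Inverting the attenuation formula a0 = 20*log10(4*pi*d/lambda) above
// to estimate distance from a measured attenuation in dB.
public class AttenuationDistance {
    // Wavelength of a tone in air; 343 m/s assumes ~20 °C.
    static double wavelengthMeters(double freqHz) {
        return 343.0 / freqHz; // ~0.343 m for a 1 kHz tone
    }

    // d = lambda * 10^(a0/20) / (4*pi)
    static double distanceMeters(double attenuationDb, double lambda) {
        return lambda * Math.pow(10, attenuationDb / 20) / (4 * Math.PI);
    }
}
```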

Jani Bela
  • I think this would be as inaccurate as using bluetooth signal strength. – user624056 Mar 04 '13 at 22:58
  • Maybe, yes. But you don't have to worry about latency, since you just have to emit that 1 kHz tone constantly and measure the attenuation over a certain time. Averaging might be a good idea. – Jani Bela Mar 04 '13 at 23:57