I have written a Java application that records a .wav audio file from a microphone. If you look at a wave graph of the file (screenshot attached), it is essentially a flat line with just two spikes. My goal is to process the file and find the time (within the recording) at which each spike peaks.
Example: In the attached screenshot, the first peak occurs 1.52585 seconds into the recording, while the second occurs 1.52692 seconds in. Subtracting these gives the time difference between the spikes: 0.00107 seconds.
I'm sure all it takes is a relatively simple algorithm; however, I have no idea how to go about it. Should I be looking at decibels? Can someone please guide me? I'm quite confused (I know nothing about audio).
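In case it helps clarify what I'm after, here is the kind of thing I've been experimenting with. It's only a rough sketch: the file name, the 16-bit mono PCM assumption, and the threshold value are all placeholders I made up. It reads the raw samples with `javax.sound.sampled` and prints the time of every sample whose amplitude crosses a threshold, but I don't know how to turn "samples above a threshold" into the actual peak time of each spike, or whether this is even the right approach:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.File;

public class PeakFinder {
    public static void main(String[] args) throws Exception {
        // "recording.wav" is just a placeholder for my recorded file
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("recording.wav"));
        AudioFormat fmt = in.getFormat();

        // Assuming 16-bit mono PCM, which is what my recorder produces
        byte[] bytes = in.readAllBytes();
        int numSamples = bytes.length / 2;
        float sampleRate = fmt.getSampleRate();
        boolean bigEndian = fmt.isBigEndian();

        // Arbitrary threshold (out of 32767), picked by eye from the graph
        int threshold = 10000;

        for (int i = 0; i < numSamples; i++) {
            // Reassemble the signed 16-bit sample from its two bytes
            int lo = bytes[2 * i + (bigEndian ? 1 : 0)] & 0xFF;
            int hi = bytes[2 * i + (bigEndian ? 0 : 1)];
            int sample = (hi << 8) | lo;

            if (Math.abs(sample) > threshold) {
                // Sample index divided by sample rate gives the time in seconds
                double time = i / (double) sampleRate;
                System.out.printf("Sample %d above threshold at %.5f s%n", i, time);
            }
        }
    }
}
```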