I am trying to measure my internet connection speed using NSURLConnection. What I do is start downloading a file; in the NSURLConnection delegate I record a start time, and when the download finishes I take the elapsed time together with the amount of data received, then calculate MB/sec using the code below.
if let startTime = startTime {
    let elapsed = NSDate().timeIntervalSinceDate(startTime)  // seconds since the download started
    NSLog("\(length) -- \(elapsed)")
    let d = Double(length) / elapsed       // length is in bytes, so d is bytes per second
    var result = CGFloat(d / 1024)         // this is the division by 1024 I am asking about
    result = result * 0.0078125            // 0.0078125 == 1/128
    result = result * 0.0009765625         // 0.0009765625 == 1/1024
    return result
}
My question is: why am I dividing by 1024 here? If I don't, I get something in bits/bytes...
I am assuming I am getting seconds from NSDate().timeIntervalSinceDate(startTime), and bytes from the NSData length property.
I think I am getting the right value, but I am not sure. Can someone explain why it's necessary to divide by 1024?
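Here is a worked example with made-up numbers, just to show how I currently understand the units (please correct me if this is wrong):

let length = 2_097_152                          // bytes received (made-up: a 2 MB file)
let elapsed = 2.0                               // seconds
let bytesPerSec = Double(length) / elapsed      // 1,048,576 bytes/sec
let kbPerSec = bytesPerSec / 1024               // 1,024 KB/sec  (this is the /1024 step)
let mbPerSec = kbPerSec / 1024                  // 1 MB/sec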