I am on the first chapter of The Go Programming Language (Addison-Wesley Professional Computing Series), and the third exercise in the book asks me to measure code performance using the time package.
So, I came up with the following code.
package main

import (
	"fmt"
	"os"
	"strings"
	"time"
)

func main() {
	// Time the version that builds the string with += concatenation.
	start := time.Now()
	var s, sep string
	for i := 1; i < len(os.Args); i++ {
		s += sep + os.Args[i]
		sep = " "
	}
	fmt.Print(s)
	fmt.Printf("\nTook %.2fs \n", time.Since(start).Seconds())
	fmt.Println("------------------------------------------------")
	// Time the version that uses strings.Join.
	start2 := time.Now()
	fmt.Print(strings.Join(os.Args[1:], " "))
	fmt.Printf("\nTook %.2fs", time.Since(start2).Seconds())
}
When I run this code on Windows and Mac, it always prints 0.00 seconds. I added a pause in my code to check whether the timing code itself is correct, and it seems fine (see the sketch below). What I don't understand is why it always reports 0.00.
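This is roughly what I mean by the pause check, as a minimal sketch; the 2-second time.Sleep is just an arbitrary value I used for testing:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Deliberate pause wrapped with the same timing code as above.
	// If the measurement works, this prints roughly the sleep duration.
	start := time.Now()
	time.Sleep(2 * time.Second) // arbitrary pause for the check
	fmt.Printf("Took %.2fs\n", time.Since(start).Seconds())
}

That program prints something close to 2.00, so the timer and the Printf format seem to work; only the two joins above report 0.00.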