In the Go tour web crawler exercise (https://go.dev/tour/concurrency/10), I use a sync.Mutex to protect the visited map and a sync.WaitGroup to know when to end. It prints all five URLs but then hits a deadlock at the end, and I don't know why. These are my main and Crawl functions:
func main() {
	var mu sync.Mutex
	visited := make(map[string]struct{})
	var wg sync.WaitGroup
	wg.Add(1)
	Crawl("https://golang.org/", 4, fetcher, mu, visited, wg)
	wg.Wait()
}
// Crawl recursively fetches pages starting at url, up to the given depth,
// skipping URLs that are already in visitedMap.
func Crawl(url string, depth int, fetcher Fetcher, mu sync.Mutex, visitedMap map[string]struct{}, wg sync.WaitGroup) {
	defer wg.Done()
	if depth <= 0 {
		return
	}
	// Mark url as visited, or bail out if an earlier call already did.
	mu.Lock()
	if _, visited := visitedMap[url]; visited {
		mu.Unlock()
		return
	}
	visitedMap[url] = struct{}{}
	mu.Unlock()

	body, urls, err := fetcher.Fetch(url)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("found: %s %q\n", url, body)
	// Crawl the linked pages concurrently.
	for _, u := range urls {
		wg.Add(1)
		go Crawl(u, depth-1, fetcher, mu, visitedMap, wg)
	}
}
This is the output:
found: https://golang.org/ "The Go Programming Language"
found: https://golang.org/pkg/ "Packages"
found: https://golang.org/pkg/os/ "Package os"
not found: https://golang.org/cmd/
found: https://golang.org/pkg/fmt/ "Package fmt"
fatal error: all goroutines are asleep - deadlock!
goroutine 1 [semacquire]:
sync.runtime_Semacquire(0x488120?)
	/usr/local/go-faketime/src/runtime/sema.go:56 +0x25
sync.(*WaitGroup).Wait(0x100000000?)
	/usr/local/go-faketime/src/sync/waitgroup.go:136 +0x52
main.main()
	/tmp/sandbox1832384062/prog.go:53 +0x89
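From the trace, main is blocked in wg.Wait, so the counter apparently never reaches zero even though every Crawl call defers wg.Done. The only thing I can think of is that Crawl receives the sync.WaitGroup (and the sync.Mutex) by value, so each call works on its own copy. Here is a minimal sketch of that pattern, reduced by me from the code above (the function name work is mine, not from the exercise):

package main

import (
	"fmt"
	"sync"
)

// work receives the WaitGroup by value, the same way Crawl does,
// so Done decrements a copy rather than the WaitGroup main waits on.
func work(wg sync.WaitGroup) {
	defer wg.Done()
	fmt.Println("working")
}

func main() {
	var wg sync.WaitGroup
	wg.Add(1)
	go work(wg)
	wg.Wait() // never returns: the counter in main's wg stays at 1
}

As far as I can tell, this small version dies with the same "fatal error: all goroutines are asleep - deadlock!" message. Is the by-value WaitGroup (and Mutex) really the cause here, and is passing *sync.Mutex and *sync.WaitGroup the idiomatic fix?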