
I'm learning Go, following the online tutorial "A Tour of Go".

In this exercise: https://tour.golang.org/concurrency/10

Before going on to solve the problem, I wanted to try something simple:

func Crawl(url string, depth int, fetcher Fetcher) {
    fmt.Println("Hello from Crawl")
    if depth <= 0 {
        return
    }
    body, urls, err := fetcher.Fetch(url)
    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Printf("found: %s %q\n", url, body)
    for _, u := range urls {
        fmt.Println("in loop with u = %s", u)
        go Crawl(u, depth-1, fetcher) //I added "go" here
    }
}

The only thing I added is the go keyword right before the recursive call to Crawl. I expected it would not change the behavior much.

However the printout is:

Hello from Crawl
found: http://golang.org/ "The Go Programming Language"
in loop with u =  http://golang.org/pkg/
in loop with u =  http://golang.org/cmd/

I expected to see Hello from Crawl for each iteration of the loop. Why doesn't my Crawl subroutine get executed?

Nathan H

2 Answers


Your goroutines were started, but the program reached the end of main() and exited before they had a chance to run. Goroutines execute independently of the function that starts them, much like threads, but they are all terminated when the program exits. So you need a sync.WaitGroup to wait for the goroutines to finish their work, or you can simply call time.Sleep() to keep the main program alive for a while.
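
For example, a minimal sketch of the time.Sleep() approach (the main function and the one-second pause below are my own illustration, not code from the exercise):

package main

import (
    "fmt"
    "time"
)

func main() {
    go func() {
        fmt.Println("Hello from the goroutine")
    }()
    // Without this pause, main returns immediately and the process
    // exits before the goroutine ever gets a chance to run.
    time.Sleep(time.Second)
}

Sleeping is only good enough for experiments; a sync.WaitGroup is the reliable way to wait, as the other answer shows.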

jfly

There is nothing to make sure that all your goroutines end before your program finishes. I'd refactor your code to something like this:

func Crawl(url string, depth int, fetcher Fetcher) {
    fmt.Println("Hello from Crawl")
    if depth <= 0 {
        return
    }
    body, urls, err := fetcher.Fetch(url)
    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Printf("found: %s %q\n", url, body)
    // Add a WaitGroup to make sure the goroutines finish
    // before this call returns.
    wg := sync.WaitGroup{}
    wg.Add(len(urls))
    for _, u := range urls {
        fmt.Println("in loop with u =", u)
        // Pass u as an argument so each goroutine gets its own copy
        // instead of all of them sharing the loop variable.
        go func(u string) {
            defer wg.Done()
            Crawl(u, depth-1, fetcher)
        }(u)
    }
    wg.Wait()
}
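
For completeness, a rough sketch of how this gets driven from main in the Tour exercise (the fetcher variable and the URL/depth are the exercise's own defaults as far as I recall, not something defined here):

func main() {
    // Crawl now blocks until every goroutine it spawned, directly or
    // recursively, has called wg.Done(), so main no longer exits early.
    Crawl("https://golang.org/", 4, fetcher)
}

Note that u is passed to the closure as an argument; otherwise all the goroutines would share the same loop variable and could end up crawling only the last URL.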
Artem Barger