
I have a Go Colly crawler that I am trying to use to crawl many sites. On my terminal it prints a lot of:

2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached
2023/05/30 02:22:56 Max depth limit reached

It makes it hard for me to read some of the prints that I have placed. I wanted to know if there is any way to stop this from being printed to the terminal. Thanks.

Farshad

1 Answer


Max depth limit reached is colly.ErrMaxDepth. There must be code like this in your project:

c := colly.NewCollector(colly.MaxDepth(5))

// ...

if err := c.Visit("http://go-colly.org/"); err != nil {
    log.Println(err)
}

If you don't want to log this error, add a simple check to exclude it:

c := colly.NewCollector(colly.MaxDepth(5))

// ...

if err := c.Visit("http://go-colly.org/"); err != nil {
    // Log the error only when it is not ErrMaxDepth.
    // errors.Is also matches the sentinel if it was wrapped upstream.
    if !errors.Is(err, colly.ErrMaxDepth) {
        log.Println(err)
    }
}

Another option is to redirect the output to a file (note the order: the file redirection must come before 2>&1, otherwise standard error keeps going to the terminal):

go run . >log.txt 2>&1

or copy the output to a file and also to standard output with tee:

go run . 2>&1 | tee log.txt
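
Since Go's log package writes to standard error while your own fmt prints normally go to standard out, you can also split the two streams and file away only the noisy log. A quick illustration with a stand-in command (cmd is a fake crawler, not the real program):

```shell
# Stand-in for the crawler: one line to stdout, one log line to stderr.
cmd() {
    echo "my print"
    echo "Max depth limit reached" >&2
}

# Send only stderr (the noisy log) to a file; stdout stays on the terminal.
cmd 2>err.txt
```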
Zeke Lu