I have been practicing writing Ruby scrapers using Mechanize and Nokogiri, for instance here ( However, after making a certain number of requests (about 14,000 in this case), I start getting a connection timed out error:
/var/lib/gems/1.8/gems/net-http-persistent-2.5.1/lib/net/http/persistent/ssl_reuse.rb:90:in `initialize': Connection timed out - connect(2) (Errno::ETIMEDOUT)
I have Googled a lot, but the best answer I can find is that I am making too many requests to the server. Is there a way to fix this by throttling my requests, or by some other method?
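What I have in mind is something like the wrapper below: sleep between attempts and retry when `Errno::ETIMEDOUT` is raised. This is just an untested sketch (the helper name `fetch_with_retry` and the delay numbers are my own invention), so I'd like to know if this is the right approach:

```ruby
# Hypothetical throttle-and-retry helper: runs the given block,
# and on Errno::ETIMEDOUT sleeps, then retries up to max_retries times.
def fetch_with_retry(max_retries: 3, delay: 1)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue Errno::ETIMEDOUT
    raise if attempts > max_retries
    sleep delay * attempts # back off a little longer on each retry
    retry
  end
end

# Intended use with my Mechanize scraper (not run here):
#   agent = Mechanize.new
#   page = fetch_with_retry { agent.get(url) }
```

I would also add a small `sleep` between every request (not just on failure) to throttle the overall rate, but I'm not sure what a polite delay would be.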