I am using a good multi-cURL interface called Rolling CURL:
http://code.google.com/p/rolling-curl/issues/detail?id=20
It works fine; for example, it gets data from 20 sites in around 3 seconds. The problem is that I need it to work on 200 - 300 sites that are all on the SAME server, and at that point it takes about as long as making a single cURL request per site in a loop: around 10 minutes 47 seconds. So I am a bit stumped as to what to do. All I need from each site is its HTTP status code. I have also tried file_get_contents and the PHP FTP functions, but they are much slower.
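For reference, here is a stripped-down sketch of the kind of check I am doing, using plain curl_multi rather than Rolling CURL. The URLs are placeholders, and CURLOPT_NOBODY turns each transfer into a HEAD-style request so only the status code comes back:

    <?php
    // Sketch: parallel status-code checks with curl_multi (placeholder URLs).
    $urls = ['http://example.com/', 'http://example.org/'];

    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD-style: skip the body
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo anything
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // fail fast on dead hosts
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers to completion.
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        if (curl_multi_select($mh) === -1) {
            usleep(100000); // some PHP builds return -1 here; back off briefly
        }
    } while ($running > 0);

    foreach ($handles as $url => $ch) {
        echo $url, ' => ', curl_getinfo($ch, CURLINFO_HTTP_CODE), PHP_EOL;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);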
Another thing: when I run through a list of 12+ domains that are on the same server, the requests seem to get blocked and I get no data back at all for any of the sites. The problem does not occur with a list of fewer than 12. I am only fetching the header data from each site, so it shouldn't be this slow.
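For what it's worth, here is roughly how I am driving Rolling CURL; lowering window_size below 12 was going to be my next experiment. This is from memory of the library's README, so treat the window_size property and the callback signature as my best understanding of its interface rather than gospel:

    <?php
    // Sketch of my Rolling CURL usage (API details from the project's README,
    // as best I recall; treat them as assumptions).
    require 'RollingCurl.php';

    function request_callback($response, $info, $request) {
        // I only care about the status code for each URL.
        echo $info['url'], ' => ', $info['http_code'], PHP_EOL;
    }

    $urls = ['http://example.com/', 'http://example.org/']; // placeholder list

    $rc = new RollingCurl('request_callback');
    $rc->window_size = 10; // keep concurrency below the ~12 threshold I'm hitting

    foreach ($urls as $url) {
        $rc->get($url);
    }
    $rc->execute();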
If anyone can help me, or give me a detailed explanation of why this is happening along with pointers on how to overcome it, I will be incredibly thankful.