I use wget to fetch the links listed in a text file. An example link would be:
http://localhost:8888/data/test.php?value=ABC123456789
The PHP file returns a table of information, and the response is to be appended to another text file. As for the error, it is obvious that it currently cannot handle this many URLs because the character limit is exceeded. If I use only 2 URLs, it works perfectly fine.
The text file contains a total of 10 000 URLs. The command I am using is:
wget -i /Applications/MAMP/htdocs/data/URLs.txt -O - >> /Applications/MAMP/htdocs/data/append.txt
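For reference, URLs.txt simply contains one URL per line; with made-up values the first few lines look something like this:

http://localhost:8888/data/test.php?value=ABC123456789
http://localhost:8888/data/test.php?value=DEF987654321
http://localhost:8888/data/test.php?value=GHI112233445
...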
According to my research, a quick way to "fix" this is to change the LimitRequestLine directive, or to add it if it does not exist. Since I use MAMP (on macOS), what I did was:
Open /Applications/MAMP/conf/apache/httpd.conf
and insert the following under AccessFileName .htaccess:
LimitRequestLine 1000000000
LimitRequestFieldSize 1000000000
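So the relevant part of my httpd.conf now looks like this (the AccessFileName line was already there; the two Limit* lines are what I added):

AccessFileName .htaccess
LimitRequestLine 1000000000
LimitRequestFieldSize 1000000000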
But I still get the same error. I don't know why this happens.
Would it be easier to use cURL instead? If so, what would be an equivalent command?
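From what I have read so far, I imagine something along these lines might be roughly equivalent (just a sketch that reads URLs.txt line by line and appends each response; I am not sure it is correct):

# read one URL per line and append each response to the output file
while read -r url; do
  curl -s "$url" >> /Applications/MAMP/htdocs/data/append.txt
done < /Applications/MAMP/htdocs/data/URLs.txt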