I have a list of URLs in a text file, Test.txt, that I am trying to download from a server that requires logging in. I start by creating a .netrc file for curl in my Linux terminal with the following commands:
host-0800-a45e60ef38c9:~ myname$ touch .netrc
host-0800-a45e60ef38c9:~ myname$ echo "machine urs.earthdata.nasa.gov login mylogin password mypassword” >> .netrc
>
host-0800-a45e60ef38c9:~ myname$ chmod 0600 .netrc
host-0800-a45e60ef38c9:~ myname$ touch .urs_cookies
host-0800-a45e60ef38c9:~ myname$ cat Test.txt | tr -d '\r' | xargs -n 1 curl -LJO -n -c ~/.urs_cookies -b ~/.urs_cookies
Here host-0800-a45e60ef38c9:~ myname$ is my shell prompt, and ~ is my home directory.
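To spell out what I think that one-liner does: tr -d '\r' strips Windows-style carriage returns from each line before xargs hands the URLs to curl one at a time. A minimal sketch of just the stripping step, using made-up example URLs rather than my real ones:

```shell
# Carriage returns from a Windows-edited file would otherwise become part
# of each URL; tr -d '\r' removes them before xargs sees the lines.
printf 'https://example.com/a\r\nhttps://example.com/b\r\n' | tr -d '\r'
# prints the two URLs with the \r characters removed
```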
The Test.txt file contains around a thousand unique links such as this one:
I get the following errors over and over again:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1207k 100 1207k 0 0 1641k 0 --:--:-- --:--:-- --:--:-- 1641k
Warning: Failed to create the file
curl: (23) Failed writing body (0 != 27)
My guess is that it has something to do with the permissions on the .netrc credentials file, but those are set to owner read/write (0600). I am brand new to Linux. Can someone point out what's going wrong here?
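In case it helps, this is a quick way to double-check the permission bits, sketched on a throwaway demo.netrc file rather than my real one:

```shell
# Create a throwaway file and apply the same mode used for .netrc,
# then print its permission string; mode 0600 typically shows up
# as -rw------- (owner read/write only, no group/other access).
touch demo.netrc
chmod 0600 demo.netrc
ls -l demo.netrc | awk '{print $1}'
```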