I need some assistance with a bash script that seems like it should be pretty simple. I want to extract all of the links from a given website and print them to standard output, all through a script of my own. My goal is to run the command with the website I'm extracting links from passed as an argument. Here's what I have so far:
cat > extract_links
#!/bin/bash
# Fetch the page given as the first argument and filter for links
curl -s "$1" | grep href
I don't really have much programming experience, so sorry if this isn't much of a start. Is it necessary to use regular expressions? If anyone is willing to help, code that is as simple as possible would be much appreciated. Thanks!
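For context, here's a minimal sketch of the kind of pipeline I'm imagining (assuming `curl`, `grep -oE`, and `sed` are available, and that links appear as double-quoted `href` attributes; real HTML can be messier than this pattern handles):

```shell
#!/bin/bash
# extract_links: print every href target from the URL given as $1
# (a sketch, not a robust HTML parser)
extract_links() {
  curl -s "$1" | grep -oE 'href="[^"]*"' | sed 's/^href="//; s/"$//'
}

# The same pipeline applied to inline sample HTML, just to show what it matches:
echo '<a href="https://example.com/a">A</a> <a href="/b">B</a>' \
  | grep -oE 'href="[^"]*"' | sed 's/^href="//; s/"$//'
```

Here `grep -o` prints only the matching part of each line, one match per line, and `sed` strips the surrounding `href="…"` wrapper.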