Here is a sample website URL that shows a list of folders; I need to store these folder names using a shell/bash script.
try using `wget` with the recursive option? https://stackoverflow.com/a/273776/1681480 – beroe Jul 04 '19 at 15:53
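If you go the `wget` route suggested in that comment, a minimal recursive-fetch sketch could look like the following (the URL is a placeholder for the actual listing):

```bash
# Mirror the directory listing recursively without ascending to the
# parent directory; http://example.com/files/ is a placeholder URL.
wget -r --no-parent http://example.com/files/
```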
1 Answer
With GNU grep:
`curl <url> | grep -oP '<a href=".+?">\K.+?(?=<)'`
On macOS, whose grep lacks `-P`, the same pattern works via Perl:
`curl <url> | perl -nle 'print $& while m{<a href=".+?">\K.+?(?=<)}g'`
It might need some tweaking for your specific site.
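To actually store the names from a script, as the question asks, here is a minimal sketch along the same lines. It assumes an Apache/nginx-style index page, GNU grep for `-P`, and uses a placeholder URL:

```bash
#!/usr/bin/env bash
# Minimal sketch: collect the directory names from an index page into a
# bash array. Requires GNU grep (for -P) and curl.
url="http://example.com/files/"   # placeholder -- replace with the real listing URL

# Extract link text, keep only entries ending in "/" (directories), drop the slash.
mapfile -t folders < <(
  curl -s "$url" \
    | grep -oP '<a href=".+?">\K.+?(?=</a>)' \
    | grep '/$' \
    | sed 's:/$::'
)

printf 'Found %d folders\n' "${#folders[@]}"
printf '%s\n' "${folders[@]}"
```

Filtering on the trailing `/` is what separates directory links from file links on a typical auto-generated listing; drop that `grep '/$'` step if you want every link.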

krisz