I'm currently working on a bash script that applies a list of regex transformations to a file of links in order to clean it up. At the moment I do all of this manually in Kate with find/replace, but having it as a script would be much more convenient. Since I'm fairly new to bash scripting, I'm asking for help.
An example of the list of URLs:
0: "/suburl0"
1: "/suburl1"
2: "/suburl2"
3: "/suburl3"
4: "/suburl4"
The script I currently have:
#!/bin/bash
awk '{gsub(/[^\x00-\x7F]+/, "")} 1' "$1"    # there are non-ASCII chars in the file, so strip them out
awk 'NF' "$1"                               # drop blank lines
awk '{sub(/^[0-9]{1,3}: /, "")} 1' "$1"     # delete the numbering in front of the link
awk '{gsub(/"/, "")} 1' "$1"                # remove the quotation marks
awk '!seen[$0]++' "$1"                      # remove duplicate lines
awk '{print "http://example.com" $0}' "$1"  # prepend the base URL to the sub-URL
The goal is to apply all of those transformations to the file so that it ends up cleaned.
My guess is that I'm not redirecting the output of awk correctly: each command just prints to stdout, and when I tried to redirect the output back into the same file, the file ended up containing only empty lines.
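From what I understand so far, the separate awk calls would need to be chained with pipes and the result written to a temporary file first, because redirecting straight back into the input file truncates it before awk gets to read it. Something along these lines is what I'm aiming for (untested, and the temp-file handling is just my guess):

#!/bin/bash
# $1 is the file with the raw URL list
tmp="$(mktemp)"

awk '{gsub(/[^\x00-\x7F]+/, "")} 1' "$1" \
  | awk 'NF' \
  | awk '{sub(/^[0-9]{1,3}: /, "")} 1' \
  | awk '{gsub(/"/, "")} 1' \
  | awk '!seen[$0]++' \
  | awk '{print "http://example.com" $0}' > "$tmp"

mv "$tmp" "$1"   # replace the original file with the cleaned-up version

Is that the right way to do it, or is there a cleaner approach?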