Suppose I have a list of URLs. Some of them are repeated any number of times, while others appear only once. I need to get rid of the unique lines (which are useless to me) and keep the URLs that are repeated more than 4 times (which are the important ones I need to track).
How can I write an expression of some sort that deletes everything but the duplicate lines? Ideally I would like to whittle the list down to only the URLs that are repeated more than 4 times.
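To show the kind of filtering I mean, here is a rough sketch in Python, assuming the URLs sit one per line in a file called urls.txt (the filename and the threshold of 4 are just examples):

```python
from collections import Counter

# Read the URLs, one per line, skipping blank lines.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Count how many times each URL occurs.
counts = Counter(urls)

# Keep only the URLs that appear more than 4 times; everything else is dropped.
frequent = [url for url, n in counts.items() if n > 4]

for url in frequent:
    print(url)
```

That captures the behaviour I am after, but a single expression or command I could run over the list directly would be even better.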