
For example, I have a page that loads images and other files. Some of these images/files are not available in my directory, so the browser throws a 404 error when trying to retrieve them.

As such, my dev console spits out these errors:

GET http://www.google.com/does_not_exist.jpg 404 (Not Found)
GET http://www.google.com/also_does_not_exist.jpg 404 (Not Found)

I can copy one URL at a time, but that is time-consuming if there are 100 or more URLs.

How can I copy all the 404 URLs at once?
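One possible approach (a sketch, not a confirmed solution) is to collect the failing resource URLs in the page itself instead of scraping the console: resource load errors do not bubble, but a capturing error listener on window sees them, and Chrome's console copy() utility can then put the whole list on the clipboard. The variable names below are placeholders, and the listener catches any failed load, not only 404s:

    // Sketch: gather URLs of resources that fail to load (e.g. missing images).
    // Resource error events do not bubble, so listen in the capture phase.
    const missingUrls = [];

    window.addEventListener('error', (event) => {
      const target = event.target;
      // Only record element load failures that carry a URL (img src, script src, link href).
      if (target && target !== window && (target.src || target.href)) {
        missingUrls.push(target.src || target.href);
      }
    }, true); // true = capture phase

    // Later, in the Chrome dev console, copy everything at once:
    // copy(missingUrls.join('\n'));

Alternatively, without touching the page, the Network panel in Chrome DevTools can be filtered by status code (e.g. status-code:404) and the requests exported as a HAR file, which contains every request URL.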
