11

I use SIEGE to test my web server's performance. For a more realistic test, the best approach would be to have SIEGE hit the web page (website.com/our-company) and all of its static assets (.css, .js, .png, .jpg): everything you see in the Firefox/Chrome developer tools, except of course resources loaded from external servers (cdn.facebook, apis.google.com).

I am running several tests, so it is a pain to collect all the asset URLs manually. Is there a tool I can use to load a web page and export the URL of everything that was loaded?

This is the Firefox debugging view. If I could export this to txt or csv, it would be perfect.

I tried curl on the Debian CLI, but I am no expert. Any tool will help; it doesn't have to be a Firefox/Chrome plugin.

Best regards.

ddutra
  • 1,459
  • 1
  • 14
  • 17

7 Answers

6

In Chrome you can export this data to a HAR file (it's JSON-based) in one click. Go to the "Network" panel, right-click, and choose "Save as HAR with content".

Save as HAR with Content option in DevTools
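If what you need for SIEGE is a plain list of URLs, the HAR can be boiled down with a few lines of scripting. A minimal sketch, assuming Python 3 and placeholder file names (page.har, urls.txt):

```python
import json

# Load the HAR saved via "Save as HAR with content" in the Network panel.
with open("page.har") as f:
    har = json.load(f)

# Every request made while the page loaded is listed under log -> entries.
urls = [entry["request"]["url"] for entry in har["log"]["entries"]]

# One URL per line is the format siege's urls file expects.
with open("urls.txt", "w") as f:
    f.write("\n".join(urls) + "\n")
```

The resulting urls.txt can then be fed to SIEGE with its -f option.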

Konrad Dzwinel
  • 36,825
  • 12
  • 98
  • 105
  • Thanks, but that's not what I need. Maybe I did not understand you correctly. I need a list of all CSS, JS and images requested on a page, in order to simulate a full page load with SIEGE. I was able to get that with LiveHTTPHeaders + some Excel. Very easy. Best regards. – ddutra Oct 08 '13 at 13:09
  • 1
    HAR file contains all URLs that were loaded + all headers + all load times. Since it's a JSON file you can easily extract what you need. – Konrad Dzwinel Oct 08 '13 at 13:12
  • Konrad, thanks for your help, but that seems harder than the other tools (like LiveHTTPHeaders). Maybe I am not getting you. How do you suggest I extract the URLs from the HAR? Using which tool? Best regards. – ddutra Oct 08 '13 at 16:46
  • Perhaps a bit late, but in case someone else is looking for a solution, I found this. Haven't tried it yet, but give it a shot. http://www.yamamoto.com.ar/blog/?p=201 – Will Schoenberger Mar 09 '15 at 20:41
4

Here's a free command line application to convert HAR files to CSV. Hope it helps.

http://www.yamamoto.com.ar/blog/?p=201

EDIT: added the project to GitHub:

https://github.com/spcgh0st/HarTools

spcgh0st
  • 76
  • 4
1

On Windows you could use HttpWatch to do this with the free Basic Edition in IE or Firefox:

http://www.httpwatch.com/download/

The CSV export function will export the URLs and other fields to a CSV file.

** Disclaimer: This was posted by Simtec Limited, the makers of HttpWatch **

HttpWatchSupport
  • 2,804
  • 1
  • 17
  • 16
1

I had the same requirement: exporting HAR files from Chrome DevTools or Firebug to do load testing with siege. Additionally, I wanted to replay POST requests too.

Choose one of these solutions:

flexponsive
  • 6,060
  • 8
  • 26
  • 41
0

Never mind.

Just found the very nice LiveHttpHeaders extension for Firefox.

Best regards.

ddutra
  • 1,459
  • 1
  • 14
  • 17
  • 1
    Hmm, why did you accept the other answer then? Also, what does LiveHttpHeaders do that the Chrome export cannot? – Patrick M Jul 09 '14 at 19:32
0

As you may know, the HAR file format is just JSON. So I looked for a JSON-to-CSV converter and found this:

https://json-csv.com/

This worked for my HAR file that I got from GTmetrix.com. Enjoy!
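If you would rather not upload the HAR to a third-party site, roughly the same conversion can be done locally. A small sketch, assuming Python 3; the chosen columns and file names (page.har, requests.csv) are only examples:

```python
import csv
import json

# The HAR here could come from GTmetrix or the browser's Network panel.
with open("page.har") as f:
    entries = json.load(f)["log"]["entries"]

with open("requests.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status", "time_ms"])
    for entry in entries:
        writer.writerow([
            entry["request"]["url"],      # requested URL
            entry["response"]["status"],  # HTTP status code
            entry["time"],                # total request time in milliseconds
        ])
```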

milican
  • 26
  • 1
0

You can export all HTTP requests from the Chrome Developer console by going to the Network tab:

  • select one of the requests in the Network tab
  • right-click it
  • from the pop-up menu select Copy -> Copy all as HAR (or Copy as cURL, etc.)
  • paste it into a file (a sketch of processing that file follows below)

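Once the HAR is pasted into a file, the same kind of script shown earlier can pull the URLs out of it. Here is a variant that also keeps POST bodies, using siege's urls-file syntax (URL POST data); this is only a sketch, assuming Python 3 and placeholder file names:

```python
import json

# page.har / urls.txt are placeholder names for the pasted HAR and the output.
with open("page.har") as f:
    entries = json.load(f)["log"]["entries"]

lines = []
for entry in entries:
    req = entry["request"]
    if req["method"] == "POST":
        # HAR keeps the request body (if any) under request.postData.text.
        body = req.get("postData", {}).get("text", "")
        # siege accepts POST entries in its urls file as: URL POST data
        lines.append(f'{req["url"]} POST {body}')
    else:
        lines.append(req["url"])

with open("urls.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```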

pymen
  • 5,737
  • 44
  • 35