
I want to test some URLs on a small custom server I am working on. I have used WFetch on Windows and it's awesome.

My requirements for these tests are:

  • should be able to run on Linux (Ubuntu)
  • should be able to set all parameters manually
  • should support digest authentication

Can someone suggest a GUI tool or extension for this kind of work?

I have already tried RESTClient and Poster, but they do not support digest authentication.

Possible duplicate of How do I manually fire HTTP POST requests with Firefox or Chrome?

neeraj

2 Answers


wget may help you.

GET:

wget http://example.com

POST:

wget --post-data "username=Yarkee" http://example.com
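
Since the question specifically asks about digest authentication: wget negotiates whatever scheme the server offers in its 401 challenge, so supplying credentials with --user and --password should be enough when the server requests digest auth. A minimal sketch, assuming a hypothetical protected URL and placeholder credentials:

wget --user=alice --password=secret http://example.com/protected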
Yarkee

You can use wget for this. According to the manual, it supports digest authentication and can send POST requests.
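
A minimal sketch combining the two, assuming a hypothetical endpoint http://example.com/submit that answers with a digest challenge (the user name, password and form fields are only placeholders):

wget --user=alice --password=secret --post-data "field1=value1&field2=value2" http://example.com/submit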

There seems to be a GUI at wget::gui, but I don't know how reliable or complete it is.

Olaf Dietsche
  • Can this work with cookies? Many websites are protected by either authentication forms or vicious access rules like "you must first see some introductory or advertisement pages". Such sites can't be downloaded with the well-known WGET tool alone, that I know of. – Nathan Basanese Aug 06 '15 at 05:35
  • @NathanBasanese I haven't thought about it, but you might try [`wget --load-cookies`](http://www.gnu.org/software/wget/manual/wget.html#index-loading-cookies) and the following `--save-cookies` option. You can also download more than just one URL, so loading some entry page and then the actual page should be doable. If it becomes more complicated, you should rather look at a web scraping framework, e.g. [Scrapy](http://scrapy.org/). – Olaf Dietsche Aug 06 '15 at 13:17
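
For reference, a sketch of the cookie round trip mentioned in the comments above, assuming a hypothetical login form at http://example.com/login that sets a session cookie (the field names and paths are placeholders):

wget --save-cookies cookies.txt --keep-session-cookies --post-data "username=alice&password=secret" http://example.com/login
wget --load-cookies cookies.txt http://example.com/members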