
I'm trying to do user acceptance testing on an application that becomes unresponsive when a particular URL parameter is included in the GET request.

Steps

  1. I crafted the GET request, copied the curl syntax for Unix, and moved it to an Ubuntu server with some changes:

    'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0Ellc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1' 
    "https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202**'23548'**0-09-15|%2013:04:00"
    "https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202**'`23548`'**0-09-15|%2013:04:00"
    
  2. The ** asterisks are not part of the actual values; I use them to demarcate my injected value

  3. Using a small bash script I have generated thousands of (unique) payload combinations for curl.

    #!/bin/bash
    # Note the fixed shebang (#!) and the append redirection (>>):
    # a plain > inside the loop would truncate URL.txt on every pass,
    # leaving only the last payload. The leading newline inside the
    # quotes writes a blank line before each entry.
    for ((i=0; i<1000; ++i)); do
            echo "
    'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmZyaWVuZHMlMjIlM0ElMjIlMjIlMkMlMjJhdXRoJTIyJTNBJTIyZWQ0YjVhNDFkMzJlY2U4MzQ3Mzk0ZjlkZTU5YThjMWQlMjIlMkMlMjJyZWZlcmVyJTIyJTNBJTIyaXJpZGl1bS1wcmVwcm9kLmVtcGlyaWMuYWklMjIlMkMlMjJhY2Nlc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1' \"https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202'$((1 + RANDOM % 10000000))'0-09-15|%2013:04:00\"" >> URL.txt
    done
    
  4. The final command for testing (a one-liner) fails: `cat URL.txt | xargs -I{} -- curl -O {}`

Output:

     % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
      0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0

Expected output: when I run curl manually, copying the contents from the URL file, I get

[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00", "dashboard_config": null, "id": 5}]


<html>
<head><title>504 Gateway Time-out</title></head>
<body>
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.18.0</center>

To troubleshoot, I used `set -x` on the shell command line, but I can't see how the request is crafted and handled by the curl processes. The curl progress meter (above) shows 0 in every field, which would suggest a malformed request; yet that can't be the actual cause, since when I manually run the URL payload from URL.txt it works every time.
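
For reference, one diagnostic that shows the constructed command (a sketch, not something I have run above): xargs' `-t` flag echoes each command line to stderr before executing it.

    # -t prints each constructed command to stderr before running it,
    # so you can see exactly what xargs hands to curl
    cat URL.txt | xargs -t -I{} -- curl -O {}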

The layout of URL.txt looks like this (the leading newline in the `echo` writes a blank line before each command):

    EMPTY LINE
    CODE
    NEW-LINE

    CODE
    NEWLINE

    ...

I want to generate as many parallel requests as possible, without waiting for the first one to finish.
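
As a sketch of the shape I'm after: assuming URL.txt held one plain URL per line (no embedded `-H` options), GNU xargs could fire them all off concurrently:

    # -P 0: run as many processes in parallel as possible (GNU xargs)
    # -n 1: pass one URL to each curl invocation
    xargs -P 0 -n 1 curl -sS -O < URL.txt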

Debug

Running it with `-v` via the one-liner (showing only the important lines):

> GET /getMultiDashboard/demouser -H Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/ HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
>
{ [5 bytes data]
< HTTP/1.1 400 BAD_REQUEST
< Content-Length: 0
< Connection: Close

When I run it with curl alone, not via xargs, I get the correct output and no 400 Bad Request:

> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1
>
< HTTP/1.1 200 OK
< Content-Type: text/html; charset=utf-8
< Date: Mon, 05 Oct 2020 09:48:51 GMT
< ETag: W/"3b4-gP1vMAXMzUZy+pt7cwyOmQslPT8"
< Server: nginx/1.18.0
< Strict-Transport-Security: max-age=15552000; includeSubDomains
< Vary: Accept-Encoding
< X-Content-Type-Options: nosniff
< X-DNS-Prefetch-Control: off
< X-Download-Options: noopen
< X-Frame-Options: SAMEORIGIN
< X-XSS-Protection: 1; mode=block
< Content-Length: 948
< Connection: keep-alive
<
* Connection #0 to host abc.ai left intact
[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00",  "2020-08-12 09:08:00", "dashboard_config": {}, "sort_id": 4, "id": 2}, {"dashboard_id": 5}]* Found bundle for host abc.ai: 0x55836cf75a50 [can pipeline]
* Re-using existing connection! (#0) with host abc.ai
* Connected to abc.ai (52.86.136.249) port 443 (#0)
> GET /getTagTr/E1_CP/2020-9-12%2013:4:0/202'6368'0-09-15|%2013:04:00 HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1

1 Answer

Having multiple curl arguments and options in the same file adds a complication which probably isn't worth working around. Basically,

    echo "http://example.com -H 'X-Hello: Hello'" | xargs -I{} curl -O {}

passes the entire line from echo as a single string to curl, which interprets the whole thing, header and all, as the URL to fetch.
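
One way to see this for yourself (a sanity check, not part of the fix): substitute `printf` for curl so the argument boundaries become visible. With `-I`, quotes in the input are not special and each line becomes one replacement string:

    # each argument printf receives is printed inside <...>
    echo "http://example.com -H 'X-Hello: Hello'" | xargs -I{} printf '<%s>\n' {}
    # output: <http://example.com -H 'X-Hello: Hello'>   -- a single argument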

My suggestion would be to put the URL and any other arguments on the command line, and only store the -H option's argument in the file.

    # xm.cookiefile holds the -H argument (the big Cookie header) with a
    # %| marker; sed replaces the first %| with %<random>| on every pass
    for ((i=0; i<1000; ++i)); do
        curl -O http://example.com -H "$(sed "s/%|/%$((1 + RANDOM))|/" xm.cookiefile)"
    done

and run 400 (or whatever) of these jobs in parallel, perhaps just as regular background processes, or maybe with xargs if you think it adds value. (Maybe also look at GNU parallel which simplifies some aspects of this.)
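
A minimal sketch of the background-process variant (same hypothetical `xm.cookiefile` as above):

    # launch 400 curls concurrently, then wait for all of them to finish
    for ((i=0; i<400; ++i)); do
        curl -sS -O http://example.com -H "$(sed "s/%|/%$((1 + RANDOM))|/" xm.cookiefile)" &
    done
    wait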

I took out the big modulo because it's not doing anything; $RANDOM produces integers in the range 0-32767 so if you need a much bigger number, maybe paste together multiple $RANDOM numbers, or maybe use a different random source.
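
For example, one way to paste two draws together (a sketch; gives a uniform ~30-bit number):

    # $RANDOM is 15 bits (0..32767); combine two independent draws
    big=$(( RANDOM * 32768 + RANDOM ))   # uniform over 0..1073741823
    echo "$big"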

  • Your code is more elegant and simple. I want the `echo` to produce multiple GET requests, with the only change being the randomness added in one part of the URL. The whole URL request is given in the file `URL.txt`. – user316389 Oct 04 '20 at 17:33
  • Also, I want the `curl` command execution that is present in the file, so with `echo` I could track the number of times curl has executed, or mainly its progress? – user316389 Oct 04 '20 at 17:34
  • And yes, I want to run at least 400 parallel requests at the same time... this is why I was using `printf`. – user316389 Oct 04 '20 at 17:42
  • @user316389: please be careful about your terminology. I understand "400 parallel requests at the same time" to mean "400 curl processes hitting the same(ish) URL all at the same time". That sounds like a denial-of-service attack to me. Assuming you have a legitimate reason to do this, do you really have a machine with the memory and internet bandwidth to support 400 simultaneous processes? Most PCs won't be able to support that. It would be best if you could clarify your use case further in the body of your question above, rather than extending the discussion in comments. Good luck. – shellter Oct 04 '20 at 18:10
  • @shellter yes, it's more performance-based testing with goals similar to DoS; it's an application UAT. And yes, `at the same time`, OR with `randomness` in a particular URL section, I want these unique GET requests to produce `5xx` timeout errors. – user316389 Oct 04 '20 at 18:36
  • I don't mind in this particular case, but you want to be careful not to move the goal posts too much, in particular when you have already received answers. It's better to accept an answer and post a new, clearer question as your understanding of the problem space develops. – tripleee Oct 05 '20 at 06:15
  • For limiting concurrent jobs, see https://stackoverflow.com/questions/1537956/bash-limit-the-number-of-concurrent-jobs – tripleee Oct 05 '20 at 17:49
  • @user316389: Hopefully your acceptance of this answer indicates you have solved your problem. If you post a new question on this, feel free to add a comment with my userID. I'll look it over. Good luck! – shellter Oct 05 '20 at 20:31
  • https://unix.stackexchange.com/questions/613173/how-to-perform-concurrent-get-requests-for-web-resource-which-return-501-status @shellter – user316389 Oct 06 '20 at 15:21