47

I use my PHP back-end to detect AJAX requests by checking for a value in $_SERVER['HTTP_X_REQUESTED_WITH'].

This gives me reliable detection, ensuring the request was made using AJAX techniques.

How can I make sure the request came from my own domain, and not an external domain/robot?

www.example.com/ajax?true would let anyone make an AJAX call and grab the information.

I could create sessions for everyone who enters my website normally, and then allow AJAX calls... but that can be faked too.

Does it even matter these days?

Sᴀᴍ Onᴇᴌᴀ
Yossi

8 Answers

33

Let your Controller

  • generate an access token
  • store it in the session for later comparison

In your View

  • declare the access token as JS variable
  • send the token with each request

Back in your Controller

  • validate HTTP_X_REQUESTED_WITH
  • validate token

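A minimal sketch of those steps in plain PHP (the function and session-key names here are illustrative, not from the answer):

```php
<?php
// Controller: generate an access token and keep a copy in the session.
function issueToken() {
    $token = bin2hex(random_bytes(32));
    $_SESSION['ajax_token'] = $token;
    return $token;
}

// View: print the token into a <script> tag as a JS variable and send it
// as a "token" parameter with every AJAX request.

// Controller, on each AJAX call: validate both the header and the token.
function isValidAjaxRequest() {
    $isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
        && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';
    $tokenOk = isset($_POST['token'], $_SESSION['ajax_token'])
        && hash_equals($_SESSION['ajax_token'], $_POST['token']);
    return $isAjax && $tokenOk;
}
```

`hash_equals()` is used for the comparison so the token check is not vulnerable to timing attacks.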
Check these security guidelines from OpenAjax.
Also read the article on codinghorror.com that Annie linked.

Gordon
  • +1 Since browser-based AJAX requests send cookies with each request, all you need is to add a token check on each page. – Xeoncross Dec 23 '09 at 17:12
  • That won't stop a robot hitting the API directly. It can get an access token just like your JavaScript app can. – Quentin Sep 17 '14 at 09:18
23

You can check the HTTP_REFERER, but not all browsers set it. The best way is to write a wrapper for your AJAX calls on the JavaScript side which sends part of document.cookie back to the server; only your domain has access to the cookie. In PHP you can then compare the cookie in the request headers with the cookie value sent in the AJAX call.

In response to "does it even matter these days": YES, it does! Read this.
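Server-side, the comparison described above could look roughly like this (a sketch; the `sid_echo` parameter name is made up, and the JavaScript wrapper is assumed to have appended it from document.cookie):

```php
<?php
// Compare the session cookie the browser sent in its headers with the
// copy the JavaScript wrapper read from document.cookie and appended
// to the AJAX call. Only pages served from your own domain can read
// that cookie via JavaScript.
function cookieMatchesRequest(array $cookies, array $params) {
    if (!isset($cookies['PHPSESSID'], $params['sid_echo'])) {
        return false;
    }
    return hash_equals($cookies['PHPSESSID'], $params['sid_echo']);
}

// Typical call: cookieMatchesRequest($_COOKIE, $_POST);
```

As the comments point out, this only proves the caller could read the cookie, so it behaves like a session token rather than real authentication.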

Cœur
Annie
  • I'm not sure this method would accomplish anything except security through obscurity. It's trivial to craft a web request, which is all an Ajax request is. It's just slightly less trivial to examine the JavaScript wrapper and add whatever it would be adding. All this does is emulate a session token. – zombat Dec 23 '09 at 17:27
  • I guess it might keep robots at bay, if they didn't use JavaScript. But it wouldn't make the request *secure*, per se. – zombat Dec 23 '09 at 17:29
  • This method requires access to the session cookie. If a robot has access to the session cookie, then yes, the site is not secure. But you also have bigger problems than XSRF! – Annie Dec 23 '09 at 17:50
  • I read the blog article; it was extremely informative, since this subject is new to me. From what I've checked, CakePHP does that automatically and resets the cookie if the browser has been restarted. That should handle registered users. But what happens in public areas, with robots going around? e.g. example.com/blog/title1?ajax=true | example.com/blog/title2?ajax=true. About "does it matter nowadays", I guess I simply can't stop a robot from easily crawling through all of my public areas. – Yossi Dec 23 '09 at 20:59
  • About public areas, you can use spam prevention techniques like captchas: http://en.wikipedia.org/wiki/CAPTCHA – Annie Dec 23 '09 at 21:07
  • From my understanding, @zombat's point is still a valid one. We're not talking about preventing CSRF here, we're talking about preventing robots from accessing features which you only want available to your own Javascript, which is a different problem. The solution given here would still allow a robot to request the containing page, be given its own cookie, and use that cookie to fetch information from your app using those features you only intended your own Javascript to use. The cookie doesn't prevent this scenario. As far as I can fathom, there is no 100% reliable way of solving this. – thomasrutter Jul 22 '12 at 13:41
  • You simply have to assume that anything that you allow your Javascript to access for a non-authenticated user, any foreign script could also access. So you cannot design your internal AJAX API to give away any information or perform any actions for unauthenticated users that you wouldn't want an external robot also being able to view/perform. – thomasrutter Jul 22 '12 at 13:47
6

Regarding your last question, "Does it even matter these days?": it's a case-by-case question. If the AJAX request is doing something that does not require security (e.g. loading the latest stock quotes), then it really doesn't matter, IMHO. If the request is loading information that should be secured (e.g. returning identifying information, or doing something on the server), then you should treat it as such.

I personally don't use the server variables to know when something is an AJAX request. Instead I just add a query parameter to the AJAX call (e.g. http://domain.com/?ajax=true). If I need to secure the AJAX call, I use the same methods as for securing a regular page request (on both client and server). As Lucas Oman pointed out, anything on the client side can be faked. Bottom line: don't trust any request, even if you think it is coming from your own site or database. Always follow the mantra "filter input - escape output".
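As an illustration of that approach (the function name and the stand-in database value are mine), the same filtering and escaping applies whether or not ?ajax=true is present; only the output format changes:

```php
<?php
// "filter input - escape output", applied identically to AJAX and
// regular requests; ?ajax=true only switches the response format.
function renderTitle($rawId, $isAjax) {
    $id = filter_var($rawId, FILTER_VALIDATE_INT); // filter input
    if ($id === false) {
        return null;                               // reject bad input either way
    }
    $title = "Post #$id <script>";                 // stand-in for a DB lookup
    $safe  = htmlspecialchars($title);             // escape output
    return $isAjax ? json_encode(array('title' => $safe)) : "<h1>$safe</h1>";
}
```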

Jim
  • Yes, yes, yes. Jim is talking sense. Listen to him. – Quentin Dec 23 '09 at 17:54
  • This is the best answer. You can never be sure the source of a request to your site. Even if the request is authenticated, a user could be using a bot and valid credentials or even just pulling their cookie out of the browser. – frostymarvelous Nov 11 '15 at 12:58
3

David Walsh has a good solution:

/* decide what the content should be up here .... */
$content = get_content(); // generic function

/* AJAX check  */
if(!empty($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest') {
    /* special ajax here */
    die($content);
}

/* not ajax, do more.... */
Ben Shelock
2

Really, the most secure way to do this is to, as you suggested, use server-side sessions, as these cannot be crafted as cookies can.

Granted, someone can still hijack a session ID, but if you also store the user's IP address in their session and check it on each request, you can weed out a lot of hijacks. Only someone on the same LAN or proxy could hijack it.

Any other method mentioned (cookies, JavaScript, HTTP referer) depends on client-side data, which is insecure and should always be suspected of being fake, forged, hijacked or maliciously constructed.
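A sketch of that session-plus-IP idea (the session key name is illustrative):

```php
<?php
// At login: remember which IP address the session was created from.
function bindSessionToIp(array &$session, $ip) {
    $session['client_ip'] = $ip;
}

// On each request: a hijacked session ID presented from a different
// IP fails this check; only someone behind the same LAN or proxy
// would still pass, as the answer notes.
function sessionIpMatches(array $session, $ip) {
    return isset($session['client_ip']) && $session['client_ip'] === $ip;
}

// Typical call: sessionIpMatches($_SESSION, $_SERVER['REMOTE_ADDR']);
```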

Lucas Oman
1

Use POST requests secured by the session ID:

Inside the webpage (e.g. index.php) we need to expose the session ID:

<?php
// Create Session
$session = session_id();
if(empty($session)) session_start();
?>
<head>
...
<script type="text/javascript">
  var sid = '<?php echo session_id(); ?>';
</script>
<script type="text/javascript" src="ajaxrequest.js"></script>
...
</head>

The AJAX requests (ajaxrequest.js):

/* simple getAjax function
 * @param url      request URL
 * @param param    query parameters (without the leading "?")
 * @param callback function called on success
 */
var spinnerid = '#spinner'; // spinner shown while AJAX requests run
$(document).ajaxStart(function() { $(spinnerid).show(); });
$(document).ajaxStop(function() { $(spinnerid).hide(); });

function getAjax(url, param, callback) {
    url += "?sid=" + sid + "&" + param;
    $.ajax({
        url: url,
        method: "POST", // change to "GET" if needed; the sid check works either way
        cache: false,
        async: true,
        success: function(data) {
            callback(data);
        }
    });
}

getAjax('http://domain.com/', 'data=foo', function(data) {
    // do stuff with data
    var jsonobj = JSON.parse(data);
    var value = jsonobj[0]['data'];
});

The responding PHP side:

$client_sid = isset($_GET['sid']) ? $_GET['sid'] : null;

if (session_id() === '') session_start();

if (session_id() !== $client_sid) {
    // no ID or wrong ID: refuse the request
    ignore_user_abort(true);
    header("HTTP/1.1 403 Forbidden");
    header("Connection: close", true);
    exit;
} else {

    // get data
    if (isset($_GET['data'])) {
        $data = $_GET['data'];
    } else if (isset($_POST['data'])) {
        $data = $_POST['data'];
    } else {
        $data = null;
    }

    // do stuff with data

    // return data as JSON
    $resp[0]['data'] = $data;
    echo json_encode($resp);
}
v4d
  • I wouldn't use the session ID for security reasons. It's recommended to generate a random token for every request and save it in your session. Read this: https://docs.phalconphp.com/en/latest/reference/security.html#cross-site-request-forgery-csrf-protection – Yossi Feb 13 '16 at 23:01
  • While this would work because AJAX is posting the data (not using GET), anybody could still see the values by just looking at the JavaScript. They could then copy the data, create their own form and post it to the same location. A check on $_SERVER['HTTP_REFERER'] would help prevent that, as would a check that the data was posted via AJAX, but it's still not totally secure. Is there a better way? – Delmontee Sep 26 '17 at 10:41
0

Check the $_SERVER['HTTP_REFERER']. This will work in many cases, but shouldn't be confused for a completely-secure solution.
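A referer check might be sketched like this (the host-matching helper is mine); the header is trivial to spoof, so treat it as a coarse filter only:

```php
<?php
// Accept only requests whose Referer header names our own host.
// The browser controls this header, so this is a filter, not
// authentication.
function refererMatches($referer, $ourHost) {
    $host = parse_url((string) $referer, PHP_URL_HOST);
    return is_string($host) && strcasecmp($host, $ourHost) === 0;
}

// Typical call: refererMatches($_SERVER['HTTP_REFERER'], 'www.example.com');
```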

Sampson
  • This will give you what you need, but it is not reliable, as the browser can put anything it likes in this header. It is easy to fake, so the value cannot be truly trusted. Fundamentally, you cannot guarantee where the request came from, even if you issue tokens to the page and make your AJAX requests pass the token back to you. Really, you have to ask yourself: does it matter? – Rik Heywood Dec 23 '09 at 16:56
  • @Jonathan Sampson - I can catch the request in something like BurpSuite and change the HTTP_REFERER...so I dont think that's the solution. I can also do it in a script w/curl. – mr-sk Dec 23 '09 at 16:56
0

Use Google reCAPTCHA. It generates a token for your AJAX call, and on the server side you can verify that token.
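Server-side verification posts the client's token to Google's siteverify endpoint; a rough sketch (error handling omitted; $secret is your reCAPTCHA secret key, and the network call is an assumption about your environment):

```php
<?php
// Decode Google's JSON reply; the check passed only if "success" is true.
function recaptchaOk($json) {
    $result = json_decode($json, true);
    return isset($result['success']) && $result['success'] === true;
}

// Post the token the client submitted to Google's verification endpoint.
function verifyRecaptcha($secret, $token) {
    $context = stream_context_create(array('http' => array(
        'method'  => 'POST',
        'header'  => 'Content-Type: application/x-www-form-urlencoded',
        'content' => http_build_query(array('secret' => $secret, 'response' => $token)),
    )));
    $raw = file_get_contents('https://www.google.com/recaptcha/api/siteverify',
        false, $context);
    return $raw !== false && recaptchaOk($raw);
}
```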

Manomite