
I'm making a Chrome extension that injects an iframe into a webpage and shows some content in it.

The content loaded in the iframe is from https://example.com, and I have full control over it. I'm trying to access the cookies of https://example.com from inside the iframe (which I think should be available) via document.cookie. This does not let me access the HttpOnly-flagged cookies, and I don't know the reason for this. After all, this is not cross-domain. Is it?

Here is the code I'm using to get the cookies:

jQuery("#performAction").click(function(e) {
    e.preventDefault();
    console.log(document.domain); // https://example.com
    var cookies = document.cookie;
    console.log('cookies', cookies);
    var httpFlaggedCookie1 = getCookie("login_sess");
    var httpFlaggedCookie2 = getCookie("login_pass");
    console.log('httpFlaggedCookie1 ', httpFlaggedCookie1 ); // shows blank
    console.log('httpFlaggedCookie2 ', httpFlaggedCookie2 ); // shows blank
    if(httpFlaggedCookie2 != "" && httpFlaggedCookie2 != ""){
        doSomething();
    } else{
        somethingElse();
    }
});
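
getCookie here is just the standard document.cookie lookup by name, along these lines:

function getCookie(name) {
    // Standard document.cookie lookup; returns "" when the cookie is absent.
    // HttpOnly cookies never appear in document.cookie, so they always come back "".
    var match = document.cookie.match(new RegExp('(^|;\\s*)' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[2]) : "";
}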

Any suggestions on what can be done about this?

Tushar Shukla

2 Answers


In Chrome, as in other browsers, JavaScript is prevented from reading or writing HttpOnly cookies via document.cookie.

However, since you're writing a Chrome extension, you can use chrome.cookies.get and chrome.cookies.set to read/write them, with the cookies permission (plus a host permission for https://example.com) declared in manifest.json. Be aware that chrome.cookies can only be accessed from the background page, so you may need to do something with Message Passing to hand the values to your content script.
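
A minimal sketch of what that could look like (file names and the message shape are illustrative, not part of the API):

// manifest.json (excerpt)
// "permissions": ["cookies", "https://example.com/*"],
// "background": { "scripts": ["background.js"] }

// background.js: chrome.cookies is only available here
chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
    if (request.type === 'getCookie') {
        chrome.cookies.get({ url: 'https://example.com', name: request.name }, function (cookie) {
            sendResponse({ value: cookie ? cookie.value : null });
        });
        return true; // keep the channel open for the async sendResponse
    }
});

// content script: asks the background page for the HttpOnly cookie
chrome.runtime.sendMessage({ type: 'getCookie', name: 'login_sess' }, function (response) {
    console.log('login_sess', response.value);
});

Note that this delivers the value to the content script; the page loaded inside the iframe still cannot read the cookie itself, and if it needs the value the content script would have to relay it (e.g. via window.postMessage).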

Haibara Ai
  • Thanks for your response; can you please show me an example? – Tushar Shukla Sep 03 '16 at 05:11
  • @TusharShukla, http://stackoverflow.com/questions/5892176/getting-cookies-in-a-google-chrome-extension – Haibara Ai Sep 03 '16 at 05:13
  • All my code goes in the page I'm loading in the iframe; would `chrome.cookies.get` work there? And if not, how do I access the cookies in this situation? – Tushar Shukla Sep 03 '16 at 05:23
  • @TusharShukla, sure; to my knowledge, cookies are bound to URLs, not tabs (frames). – Haibara Ai Sep 03 '16 at 05:26
  • I asked this because I'm getting an `Uncaught TypeError: Cannot read property 'get' of undefined` error. – Tushar Shukla Sep 03 '16 at 05:27
  • And because of this, I assumed `chrome.cookies.get` would not work here. – Tushar Shukla Sep 03 '16 at 05:28
  • @TusharShukla, have you added `cookies` to `manifest.json`, as I mentioned in the answer? And also be aware that `chrome.cookies` can only be accessed in the background page. – Haibara Ai Sep 03 '16 at 05:29
  • Yes, I've added the permission. And I'll try message passing. – Tushar Shukla Sep 03 '16 at 05:31
  • I tried a lot of things, but none of them worked. My main objective is to grab **cookies (including the HttpOnly-flagged ones)** from the iframe. Using `chrome.cookies.get` in the background script is not helping in this case. Anything else you might suggest? – Tushar Shukla Sep 03 '16 at 07:51
  • @TusharShukla, could you please append what you have tried to the original post? That would help us debug. And if the URL is provided, that would be better. – Haibara Ai Sep 05 '16 at 00:44

Alright folks. I struggled mightily to make HttpOnly cookies show up in iframes after third-party cookies were deprecated. Eventually I was able to solve the issue.

Here is what I came up with:

  1. Install a service worker whose script is rendered by your application server (e.g. in PHP). In it, you can output the cookies into a closure, so no other scripts or even injected functions can read them. Attempts to load the same URL from other user agents will NOT get the cookies, so it's secure. (A sketch of the worker follows this list.)

  2. Yes, service workers are unloaded periodically, but every time one is loaded again it will have the latest cookies, thanks to #1.

  3. In your server-side response rendering, every time you add a Set-Cookie header, also add a Set-Cookie-JS header with the same content. Have the service worker intercept the response, read that cookie, and update the private object in the closure.

  4. In the "fetch" event, add a special request header such as Cookie-JS, and pass what would have been passed in the cookie. Add it to the request headers before sending the request to the server. This way you can send all "HttpOnly" cookies back to the server without the JavaScript being able to see them, even if actual cookies are blocked!

  5. On your server, process the Cookie-JS header and merge it into your usual cookie mechanism, then run the rest of your code as usual.
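
A stripped-down sketch of such a worker (the Set-Cookie-JS / Cookie-JS header names come from the steps above; everything else, including how the store is seeded, is illustrative):

// sw.js: illustrative only; it glosses over request bodies, redirects,
// multiple Set-Cookie-JS headers, and error handling.

// Step 1: the server renders this script and can embed the current cookie values
// here, scoped to the worker, so page scripts can never read them.
var jsCookies = { /* seeded by the server, e.g. login_sess: "..." */ };

self.addEventListener('fetch', function (event) {
    // Only touch requests going back to our own origin.
    if (new URL(event.request.url).origin !== self.location.origin) {
        return;
    }
    event.respondWith((async function () {
        // Step 4: forward the hidden cookie values in a custom request header.
        var headers = new Headers(event.request.headers);
        headers.set('Cookie-JS', Object.keys(jsCookies)
            .map(function (k) { return k + '=' + jsCookies[k]; })
            .join('; '));

        // A navigation request can't be copied with a modified init, so rebuild it.
        var request = event.request.mode === 'navigate'
            ? new Request(event.request.url, { headers: headers, credentials: 'include' })
            : new Request(event.request, { headers: headers });

        var response = await fetch(request);

        // Step 3: pick up Set-Cookie-JS from the response and update the private store.
        var setCookieJs = response.headers.get('Set-Cookie-JS');
        if (setCookieJs) {
            var eq = setCookieJs.indexOf('=');
            jsCookies[setCookieJs.slice(0, eq).trim()] = setCookieJs.slice(eq + 1);
        }
        return response;
    })());
});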

Although this seems secure to me (I'd appreciate it if anyone reported a security flaw!), there is a better mechanism than cookies.

Consider using non-extractable private keys such as ECDSA to sign hashes of payloads, also using a service worker. (In super-large payloads like videos, you may want your hash to sample only a part of the payload.) Let the client generate the key pair when a new session is established, and send the public key along with every request. On the server, store the public key in a session. You should also have a database table with the (publicKey, cookieName) as the primary key. You can then look up all the cookies for the user based on their public key — which is secure because the key is non-extractable.
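
On the client, the key handling could look roughly like this with the Web Crypto API (function names are illustrative; persisting the CryptoKey, e.g. in IndexedDB, and the server-side verification are omitted):

async function createSessionKeyPair() {
    // extractable: false means the private key can sign but can never be exported by script.
    return crypto.subtle.generateKey(
        { name: 'ECDSA', namedCurve: 'P-256' },
        false,
        ['sign', 'verify']
    );
}

async function signPayload(keyPair, payloadString) {
    var data = new TextEncoder().encode(payloadString);
    // ECDSA over SHA-256 of the payload; send the signature and public key with the request.
    var signature = await crypto.subtle.sign(
        { name: 'ECDSA', hash: { name: 'SHA-256' } },
        keyPair.privateKey,
        data
    );
    var publicKeyJwk = await crypto.subtle.exportKey('jwk', keyPair.publicKey);
    return { signature: new Uint8Array(signature), publicKeyJwk: publicKeyJwk };
}

Because the key pair is generated with extractable: false, scripts can call sign() with the private key but can never read the key material itself.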

This scheme is actually more secure than cookies, because cookies are bearer tokens and are sometimes subject to session fixation attacks, or man-in-the-middle attacks (even with https). Request payloads can be forged on the server and the end-user cannot prove they didn’t make that request. But with this second approach, the user’s service worker is signing everything on the client side.

A final note of caution: the way the Web works, you still have to trust the server that hosts the domain of the site you’re on. It could just as easily ship JS code to you one day to sign anything with the private key you generated. But it cannot steal the private key itself, so it can only sign things when you’ve loaded the page. So, technically, if your browser is set to cache a top-level page for “100 years”, and that page contains subresource integrity on each resource it loads, then you can be sure the code won’t change on you. I wish browsers would show some sort of green padlock under these conditions. Even better would be if auditors of websites could specify a hash of such a top-level page, and the browser’s green padlock would link to security reviews published under that hash (on, say, IPFS, or at a Web URL that also has a hash). In short — this way websites could finally ship code you could trust would be immutable for each URL (eg version of an app) and others could publish security audits and other evaluations of such code.

Maybe I should make a browser extension to do just that!

  • If anyone is wondering whether this somehow "brings back" third-party cookies: yes, it does, but it doesn't get around the purging of all client-side state after some time. After Safari's ITP, you should think of all your cookies and localStorage as having essentially the lifetime of sessionStorage. Every subsequent visit requires a login; thanks to WebAuthn or password managers it can be seamless (and requires user interaction and approval). Users might be able to opt into automatically logging in non-interactively on some websites, but it shouldn't be forced on them by the web servers! – Greg Magarshak Nov 08 '22 at 05:18