54

I'm using HTML5 Server-Sent Events. The server side is a Java servlet. I have a JSON array that I want to pass to the server.

var source = new EventSource("../GetPointVal?id=100&jsondata=" + JSON.stringify(data));

If the array is small, the server side can read the query string. But if the array is big (maybe over a thousand characters), the server can't get the query string. Is it possible to use the POST method with new EventSource(...) to pass the JSON array to the server and avoid the query-string length limitation?

Evan Carroll
Tom Cheng

4 Answers

48

No, the SSE standard does not allow POST.

(For no technical reason, as far as I've been able to tell - I think it was just that the designers never saw the use cases: it is not just large data, but if you want to do a custom authentication scheme there are security reasons not to put the password in GET data.)

XMLHttpRequest (i.e. AJAX) does allow POST, so one option is to go back to the older long-poll/comet methods. (My book, Data Push Apps with HTML5 SSE, goes into quite some detail about how to do this.)

Another approach is to POST all the data beforehand and store it in an HttpSession, and then call the SSE process, which can make use of that session data. (SSE does support cookies, so the JSESSIONID cookie should work fine.)
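
A sketch of that two-step flow from the browser side, assuming a hypothetical StorePointData servlet that saves the posted array into the HttpSession before ../GetPointVal (from the question) streams results; the endpoint name and function name are illustrative, not from the question:

```javascript
// Hypothetical two-step flow: POST the large payload first, then open the
// SSE stream; the server ties the two together via the session cookie.
function startPointStream(data) {
    // Step 1: send the JSON array with a normal POST (no URL-length limit).
    return fetch("../StorePointData", {       // hypothetical endpoint
        method: "POST",
        headers: { "Content-Type": "application/json" },
        credentials: "include",               // make sure JSESSIONID is sent
        body: JSON.stringify(data)
    }).then(() => {
        // Step 2: the SSE request carries the same session cookie, so the
        // servlet can read the stored array from the HttpSession.
        const source = new EventSource("../GetPointVal?id=100");
        source.onmessage = (event) => console.log(event.data);
        return source;
    });
}
```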

P.S. The standard doesn't explicitly say POST cannot be used. But, unlike XMLHttpRequest, there is no parameter to specify the http method to use, and no way to specify the data you want to post.

Sevle
Darren Cook
  • "I think it was just that the designers never saw the use cases" – another one: if you want to upload a file and get progress of the server-side processing. – Yaroslav Oct 15 '16 at 15:34
  • 90% of my use cases for SSE is POST: creating some sort of resource on the server which takes time to process. – Bryan Larsen Feb 09 '18 at 13:34
  • @BryanLarsen That looks like the wrong design. If you want to send some data to your server, you should use fetch or the AJAX API; with HTTP/2 it is even more efficient, because the connection is reused if you have an SSE stream open. Keep in mind that when SSE requests a connection, the cookies are sent too, so you can identify your client. – John Balvin Arias Jul 14 '18 at 00:48
  • @JohnBalvinArias I don't want to send data to the server; I want to process some data on the server and have it send progress notifications and results back to me. – Bryan Larsen Jul 16 '18 at 14:21
  • @BryanLarsen I do not understand your use case. If you want to send data from the server periodically, you should run an endless loop the first time the client connects to your SSE endpoint in order to send the data, and break the loop when the client disconnects. It is easy to do with Golang. – John Balvin Arias Jul 16 '18 at 18:35
  • @JohnBalvinArias As BryanLarsen wasn't the original questioner, and this is veering off-topic (i.e. design patterns for web apps), it should probably be taken to chat. – Darren Cook Jul 17 '18 at 07:42
  • There is no difference security-wise between putting a password in GET vs. putting it in POST. – rosemash Aug 18 '22 at 13:28
  • @rosemash GET URLs go into the server logs; POST data does not. That is the primary security risk. – Darren Cook Aug 18 '22 at 16:00
  • @DarrenCook That depends on the server implementation; not everyone is using Apache or nginx, but okay, I see your point. – rosemash Aug 18 '22 at 16:03
10

While you cannot use the EventSource API to do so, there is no technical reason why a server cannot accept a POST request for an event stream. The trick is getting the client to send the request. For instance, this answer discusses sse.js as a drop-in replacement for EventSource.
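
If you go the fetch()-with-POST route instead of a library, you have to parse the text/event-stream frames yourself. A minimal sketch of such a parser (the frame format is from the SSE spec; the function name is mine):

```javascript
// Parse a raw text/event-stream chunk into {event, data} objects.
// Frames are separated by a blank line; each line is "field: value".
function parseSseFrames(raw) {
    return raw.split("\n\n")
        .filter(frame => frame.trim() !== "")
        .map(frame => {
            const message = { event: "message", data: "" };
            for (const line of frame.split("\n")) {
                if (line.startsWith("event:")) {
                    message.event = line.slice(6).trim();
                } else if (line.startsWith("data:")) {
                    // Multiple data lines are joined with newlines, per the spec.
                    message.data += (message.data ? "\n" : "") + line.slice(5).trim();
                }
            }
            return message;
        });
}

// Example: two frames as a server might send them.
const frames = parseSseFrames("event: connected\ndata: {}\n\ndata: hello\n\n");
// frames[0] → { event: "connected", data: "{}" }
// frames[1] → { event: "message", data: "hello" }
```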

mjs
1

Alternatively, you can have the SSE script read its data from a file that you write with another PHP script:

http://..../command_receiver.php?command=blablabla

command_receiver.php

<?php
// Save the received command to a file so the SSE script can read it later.
$cmd = $_GET['command'];
$fh = fopen("command.txt", "w");
fwrite($fh, $cmd);
fclose($fh);
?>

demo2_sse.php

<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$time = microtime(true);

// Read back whatever command_receiver.php last wrote.
$fa = fopen("command.txt", "r");
$content = fread($fa, filesize("command.txt"));
fclose($fa);

echo "data: [{$content}][{$time}]\n\n";
flush();
?>

and the EventSource is used in an arbitrarily named HTML page as follows:

<!DOCTYPE html>
<html>
<body>
<h1>Getting server updates</h1>
<script>
var source = new EventSource("demo2_sse.php");
source.onmessage = function (event) {
    mycommand = event.data.substring(1, event.data.indexOf("]"));
    mytime = event.data.substring(event.data.lastIndexOf("[") + 1, event.data.lastIndexOf("]"));
};
</script>
</body>
</html>
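
The substring gymnastics inside onmessage can be factored out and checked on their own; a small helper that splits the [content][time] payload the PHP script emits (the helper name is mine):

```javascript
// Split a "[content][time]" SSE payload into its two parts,
// using the same substring logic as the onmessage handler above.
function parseBracketPayload(data) {
    const command = data.substring(1, data.indexOf("]"));
    const time = data.substring(data.lastIndexOf("[") + 1, data.lastIndexOf("]"));
    return { command, time };
}

const parsed = parseBracketPayload("[blablabla][1665842000.123]");
// parsed.command → "blablabla", parsed.time → "1665842000.123"
```
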
Dimitrios Ververidis
0

I have a novel approach to this problem for those that want to limit this to one request. Event Stream requests pass cookies to the server the same as any AJAX request. My approach is to set cookies containing the data before making the request, and then immediately discard the cookies upon the first response.

Now there are a few limitations with cookies to be aware of. First of all, a cookie's maximum length is 4096 bytes, including the name, plus potentially 3 bytes of overhead. Secondly, a safe number of cookies per browser is 50; different browsers have different maximums, ranging from Google Chrome's 180 down to Android's 50.
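
Under those limits you can work out up front whether a payload fits: with a 3584-byte chunk (the chunk size used below), a payload of n bytes needs ceil(n / 3584) cookies, and that plus the cookies already set must stay at or under 50. A small sketch of that check (function name is mine; both limits are tunable):

```javascript
// Check whether a JSON payload fits in the browser's cookie budget,
// assuming a 3584-byte chunk size and a 50-cookie cap.
function fitsInCookies(payloadBytes, existingCookies, chunkSize = 3584, maxCookies = 50) {
    const chunksNeeded = Math.ceil(payloadBytes / chunkSize);
    return chunksNeeded + existingCookies <= maxCookies;
}

// A 10 KB payload needs 3 chunks; with 5 cookies already set, 8 <= 50.
// fitsInCookies(10240, 5) → true
// 48 chunks worth of data plus 5 existing cookies: 53 > 50.
// fitsInCookies(48 * 3584, 5) → false
```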

I'll leave it up to you to determine how you'll implement logic around these limitations, but I'll provide an example of my implementation. Note that this implementation uses the js-cookie library to manipulate the cookies. We're also making use of the cookie-parser middleware on the server side.


Browser JS

const productIDs = ["b0708d2c-fe46-4251-96e0-1cfb3cd05eb0", "244d1e73-b5b4-4c59-8d4e-006fc1b190fe"];
// Measure the payload in bytes (not characters) so multi-byte UTF-8 is counted correctly.
const size = new TextEncoder().encode(JSON.stringify(productIDs)).length;
// 3584 bytes per cookie leaves headroom under the 4096-byte limit.
const chunkSize = Math.floor(1024 * 3.5);
const segments = Math.ceil(size / chunkSize);
const currentCookies = Object.keys(Cookies.get()).length;

if (segments + currentCookies <= 50) {
    // Distribute the IDs evenly across the required number of payloads.
    let payloads = [];
    for (let segment = 0; segment < segments; segment++) {
        const payloadLength = Math.ceil(productIDs.length / segments);
        const startPosition = segment * payloadLength;
        payloads[segment] = productIDs.slice(startPosition, startPosition + payloadLength);
    }

    // One cookie per chunk: qp[0], qp[1], ...
    for (const [index, payload] of Object.entries(payloads)) {
        Cookies.set(`qp[${index}]`, JSON.stringify(payload));
    }

    const source = new EventSource("start-processing");
    // As soon as the server acknowledges, the cookies have done their job.
    source.addEventListener("connected", () => {
        for (const [index, payload] of Object.entries(payloads)) {
            Cookies.remove(`qp[${index}]`);
        }
    });
} else {
    console.log(`ERROR: Cookie limit was reached.`);
}

As you can see, we set the chunkSize to 3584 bytes. This seems like a pretty safe amount of padding but you can tweak it to suit your needs. Keep in mind that this method will evenly distribute array values across all payload arrays. That means you're unlikely to be hitting that chunkSize value exactly.
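The even distribution described above can be seen in isolation: given 5 IDs split into 2 segments, each payload gets ceil(5/2) = 3 items, with the last payload coming up short. A standalone sketch of the same slicing logic (function name is mine):

```javascript
// Split an array into `segments` payloads of (near-)equal length,
// mirroring the slicing loop from the browser code above.
function splitIntoPayloads(items, segments) {
    const payloads = [];
    const payloadLength = Math.ceil(items.length / segments);
    for (let segment = 0; segment < segments; segment++) {
        const start = segment * payloadLength;
        payloads.push(items.slice(start, start + payloadLength));
    }
    return payloads;
}

const payloads = splitIntoPayloads(["a", "b", "c", "d", "e"], 2);
// payloads → [["a", "b", "c"], ["d", "e"]]
```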

You can adjust the maximum cookies from 50 to whatever you want. If you don't care about mobile or legacy and just want to support modern desktop browsers, 150 is the minimum to support Firefox. Check out this page for more information on browser limitations.

After we've split the productIDs into their own individual arrays, we generate cookies for each chunk. In this case, I've labelled it qp and I follow it with a [0] identifier indexed to the chunk.

Finally we initiate our event stream. You'll want to have an event that occurs as an immediate response so your browser knows to clear the newly created cookies. It would also be wise to have an independent function somewhere that clears these cookies in case a response never comes, or the browser is refreshed before the response comes through.

I recommend adding the following code to the start of the above code block in order to clean up any prior data that may be left over.

for (const cookie of Object.keys(Cookies.get()).filter(x => x.startsWith("qp"))){
    Cookies.remove(cookie);
}

Server-side JS (Express)

async (request, response) => {
    response.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Connection': 'keep-alive',
        'Cache-Control': 'no-cache'
    });

    // Collect every "qp[n]" cookie, parse each JSON chunk, and
    // flatten the chunks back into a single array.
    const productIDs = Object.entries(request.cookies)
        .filter(([name, value]) => name.startsWith("qp"))
        .map(([name, value]) => JSON.parse(value))
        .flat();

    const sendEvent = (event, data) => response.write(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`);

    sendEvent("connected", {});
}

Pretty straightforward stuff here. That rather long chain is a really simple way of obtaining all of the arrays and flattening them into a single one. Just make sure the startsWith("qp") part matches your cookie name, and make sure no other cookies on your site start with that string.

sendEvent() is a very simple function to respond to your client. Just make sure to respond immediately with the connected event to make sure the cookies are cleaned up properly.
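
The wire format sendEvent() produces can be pulled into a pure function and checked independently (same format as the handler above; standalone name is mine):

```javascript
// Build a single SSE frame: an event name line, a JSON data line,
// and the blank line that terminates the frame.
function buildSseFrame(event, data) {
    return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// buildSseFrame("connected", {}) → "event: connected\ndata: {}\n\n"
```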


And that's it! It really is such a simple approach, and while it has its flaws in terms of cookie limitations, I feel that a forewarned developer will be able to make the judgement as to whether to make use of the technique. Once made into an easily reusable function, it would be a piece of cake to implement anywhere you need.

Zei