
I am trying to insert a record into ElasticSearch using Node.js's http module (not using any 3rd-party modules).

Setup: a running instance of ElasticSearch on localhost, port 9200 (the default).

Node.js Code:

var querystring = require('querystring'); // to build our post string
var http = require('http');

// Build the post string from an object
var data = querystring.stringify({
  "text" :"hello world"
});

// An object of options to indicate where to post to
var post_options = {
    host: 'localhost',
    port: '9200',
    path: '/twitter/tweets/1',
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(data)
    }
};

// Set up the request
var post_req = http.request(post_options, function(res) {
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
        console.log('Response: ' + chunk);
    });
});

// post the data
post_req.write(data);
post_req.end();

I get the following error:

{"error":"MapperParsingException[failed to parse]; nested: 
ElasticsearchParseException[Failed to derive xcontent 
from (offset=0, length=18): [116, 101, 120, 116, 61, 104, 
101, 108, 108, 111, 37, 50, 48, 119, 111, 114, 108, 100]]; ",
"status":400}

But the following curl request works as expected:

curl -XPOST 'http://localhost:9200/twitter/tweet/1' -d '{"message" : "hello world"}'

I looked at the following StackOverflow questions, which have similar error messages:

Neither of these answered my question.

Any help much appreciated. Thanks!

Note: I am deliberately trying to use the ElasticSearch REST API using only Node.js core (no 3rd-party) modules. Please don't suggest using elasticsearch-js, es, request, etc.

nelsonic

2 Answers


It's obvious in retrospect: using the querystring module was the mistake.
ElasticSearch expects the data to be sent as JSON (i.e. JSON.stringify'd), not as a url-encoded form string.

So the code needs to be:

var http = require('http');

// ElasticSearch Expects JSON not Querystring!
var data = JSON.stringify({
  "text" :"everything is awesome"
});

// An object of options to indicate where to post to
var post_options = {
    host: 'localhost',
    port: '9200',
    path: '/twitter/tweets/1234',
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(data)
    }
};

// Set up the request
var post_req = http.request(post_options, function(res) {
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
        console.log('Response: ' + chunk);
    });
});

// post the data
post_req.write(data);
post_req.end();

This works as expected. Confirmed using:

curl -XGET 'http://localhost:9200/twitter/tweets/1234?pretty'

(Thanks @FelipeAlmeida for helping me realise this)

nelsonic

Although it may be possible to index data in Elasticsearch using form encoding (I'm not sure; I would have to research that), 99% of clients talk to Elasticsearch via pure JSON requests.

So try changing 'Content-Type': 'application/x-www-form-urlencoded' to 'Content-Type': 'application/json' and tell me if that works.

Felipe
  • I should have given the *full* list of things I tried before posting this question. Changing the **Content-Type** to `'application/json'` was one of them. Sadly it does not fix it on its own. :-( – nelsonic Dec 15 '14 at 10:29
  • Changing content-type to application/json was half the answer. The most important part was sending the data as JSON... I was url-encoding the data instead of doing a JSON.stringify() - thanks for pointing me in the right direction. – nelsonic Dec 15 '14 at 13:27
  • Ok then. Happy to have helped =) – Felipe Dec 15 '14 at 15:17