
I have installed the elasticsearch Python package and I have created an Elastic cluster. I am using the below Python code to send data to Elastic Cloud:

from elasticsearch import Elasticsearch, RequestsHttpConnection
import time
import datetime

es = Elasticsearch(['70.19.172.110:9200'],http_auth=('<username>','<password>'))

for x in range(0,5):
    es.index(index='test', doc_type='json', id=x, body={
        'data1': 'Hello World',
        'value': 325,
        'time': datetime.datetime.now()
    })

    print("Data sent {} ".format(x))
    time.sleep(60)

So as you can see in the code, I am sending the data at an interval of 1 minute (time.sleep(60)). This works fine and all 5 documents end up in Elasticsearch. Then I changed time.sleep(60) to time.sleep(300) and it gave me the below error:

elasticsearch.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host='70.19.172.110', port=9200): Read timed out. (read timeout=10))

Is there anything I am doing wrong? Is there any way I can stay connected to Elasticsearch so that I don't get these types of errors?

Thanks.

S Andrew
  • Because the default timeout of the Elasticsearch client is 10 seconds, your connection doesn't reach the host in time; you have 10 seconds to connect to the host before the client closes the connection with a timeout. – Skiller Dz May 07 '18 at 12:08
  • @SkillerDz Thanks, so you mean to say I have to do `es = Elasticsearch(['70.19.172.110:9200'],http_auth=('',''))` each time I send the data? But then how did it work when I used 60 sec? – S Andrew May 07 '18 at 12:16
  • es.index(index='test', doc_type='json', id=x, body={ 'data1': 'Hello World', 'value': 325, 'time': datetime.datetime.now(), 'timeout': 30 # the timeout value }) – Skiller Dz May 07 '18 at 12:20
  • You must add a new timeout value on es.index. – Skiller Dz May 07 '18 at 12:20
  • @SkillerDz I didn't get it. So you mean to say, for a 5 min timeout, I have to add 300 as the timeout value? Can you answer the question with an explanation, so that I can accept it? – S Andrew May 07 '18 at 12:23
  • @SAndrew What I think happens is that you create a connection in the very beginning and if it is inactive for >60sec the server cuts it off. If you query every 5 min it's ok to create a new connection every time, so yes, moving creation of `Elasticsearch` object (and hence the connection) inside the loop should solve your problem. – Nikolay Vasiliev May 07 '18 at 21:02
  • Also take a look at [this](https://stackoverflow.com/questions/25908484/how-to-fix-read-timed-out-in-elasticsearch) question, it looks similar – Nikolay Vasiliev May 07 '18 at 21:04
  • @NikolayVasiliev Thanks for your response, it's working fine now. – S Andrew May 10 '18 at 09:15
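
For reference, a minimal sketch of the fix suggested in the comments above: recreating the client inside the loop so each request gets a fresh connection. The host, credentials, and index name are the placeholders from the question.

from elasticsearch import Elasticsearch
import time
import datetime

for x in range(0, 5):
    # Create a fresh client (and connection) for every request, so a
    # connection the server dropped during the long sleep is never reused.
    es = Elasticsearch(['70.19.172.110:9200'], http_auth=('<username>', '<password>'))
    es.index(index='test', doc_type='json', id=x, body={
        'data1': 'Hello World',
        'value': 325,
        'time': datetime.datetime.now()
    })
    print("Data sent {}".format(x))
    time.sleep(300)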

1 Answer


Try raising the timeout of es.index, because the Elasticsearch client is limited to a 10 second timeout by default. If the host doesn't respond within 30 seconds, that means it isn't connected or isn't responding to the request.

from elasticsearch import Elasticsearch, RequestsHttpConnection
import time
import datetime

timenow = datetime.datetime.now()

es = Elasticsearch(['70.19.172.110:9200'],http_auth=('<username>','<password>'))

for x in range(0, 5):
    es.index(index='test', doc_type='json', id=x, body={
        'data1': 'Hello World',
        'value': 325,
        'time': timenow,
        'timeout': 30,  # the timeout value you want
    })
    print("Data sent {}".format(x))
    time.sleep(60)
Skiller Dz
    I added the `'timeout':300` but it still throws the same error – S Andrew May 07 '18 at 12:31
  • I tried the script with the IP of my router, 192.168.1.1:80, on port 80, and it worked fine for me. I think the problem is on the host you are trying to reach; maybe port 9200 is closed, try another port. – Skiller Dz May 07 '18 at 12:40
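
As the question linked in the comments above also suggests, the read timeout in the elasticsearch-py client is raised through the request_timeout parameter of the call (or timeout= on the client constructor), not through a 'timeout' field inside the document body, which is likely why the change in the comment above had no effect. A minimal sketch, reusing the placeholders from the question:

from elasticsearch import Elasticsearch

# Raise the default read timeout for every request made by this client.
es = Elasticsearch(['70.19.172.110:9200'],
                   http_auth=('<username>', '<password>'),
                   timeout=30)

# Or raise it for a single request only.
es.index(index='test', doc_type='json', id=0,
         body={'data1': 'Hello World', 'value': 325},
         request_timeout=30)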