My web server processes a huge file and then sends a response. I have tried the various nginx timeout parameters without luck, including the parameters recommended in this question, but I still get the timeout page, with this error in the nginx error log:
upstream prematurely closed connection while reading response header from upstream, client: 10.0.42.97, server:
Here is my nginx.conf:
http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    access_log /var/log/nginx/access.log;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;

    client_header_timeout 600;
    client_body_timeout 600;
    send_timeout 600;
    proxy_read_timeout 600;

    fastcgi_buffers 8 16k;
    fastcgi_buffer_size 32k;
    fastcgi_read_timeout 600;

    gzip on;
    gzip_http_version 1.0;
    gzip_comp_level 2;
    gzip_proxied any;
    gzip_types text/plain text/html text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript application/javascript application/json;

    server_names_hash_bucket_size 64;

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
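My understanding is that the proxy timeouts may also need to appear in the server/location block that actually proxies the request, since values set in a more specific context override the http-level ones. This is a minimal sketch of what I mean; the upstream address `127.0.0.1:8000` is a placeholder, not my real backend:

```nginx
location / {
    # placeholder upstream address; my real backend differs
    proxy_pass http://127.0.0.1:8000;

    # repeated here in case the http-level values are being overridden
    proxy_connect_timeout 600;
    proxy_send_timeout    600;
    proxy_read_timeout    600;
    send_timeout          600;
}
```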
I am still seeing the 502 Bad Gateway from time to time, with the above error. The input is a CSV file, if that helps. Any pointers on what could be wrong, and how can I increase the timeout so the long-running request completes?
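One thing I am wondering: since the error says the upstream *closed* the connection (rather than nginx timing out while waiting), could the backend's own worker timeout be the real limit? For example, if the app were served by gunicorn (just an assumption — my setup may differ), its default 30-second worker timeout would kill a long request no matter what nginx is configured with. A sketch of what raising it would look like:

```python
# gunicorn.conf.py -- assuming the backend is gunicorn (an assumption, not confirmed)
timeout = 600   # allow long-running requests; default is 30 seconds
workers = 2
```

Is checking the backend's own timeout the right direction here, or is this still an nginx-side setting I'm missing?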