
I am using ChefDK version 12. I have done the basic setup and uploaded many cookbooks, and I am currently using `remote_directory` in my default.rb. What I have observed is that whenever there are too many files or too deep a hierarchy in the directory, the upload fails with the exception below:

ERROR: SSL Validation failure connecting to host: xyz.com - SSL_write: cert already in hash table
ERROR: Could not establish a secure connection to the server.
Use `knife ssl check` to troubleshoot your SSL configuration.
If your Chef Server uses a self-signed certificate, you can use
`knife ssl fetch` to make knife trust the server's certificates. 
Original Exception: OpenSSL::SSL::SSLError: SSL_write: cert already in hash table 

As mentioned above, the connection to the server isn't the problem; the failure happens only when there are too many files or the hierarchy is deep. Can you please suggest what I can do? I have searched online for solutions but failed to find one.
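For reference, the kind of `remote_directory` usage that triggers this looks roughly like the following (the cookbook directory and target path are placeholders, not the real ones):

```ruby
# Recipe sketch (placeholder names): syncs every file under the cookbook's
# files/default/payload directory to the node. With hundreds of files,
# `knife cookbook upload` has to push each file to bookshelf individually.
remote_directory '/opt/app/payload' do
  source 'payload'     # maps to files/default/payload in the cookbook
  files_mode '0644'    # mode applied to each synced file
  mode '0755'          # mode applied to the directories
  recursive true       # recreate the full subdirectory hierarchy
  action :create
end
```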

I have checked the question here, but it doesn't solve my problem: Chef uses embedded Ruby and OpenSSL, for people not working with Chef.

Some updates based on Tensibai's suggestion: the exceptions have changed since adding the --concurrency 1 option. Initially I had received: INFO: HTTP Request Returned 403 Forbidden: ERROR: Failed to upload filepath\file (7a81e65b51f0d514ec645da49de6417d) to example.com:443/bookshelf/… 3088476d373416dfbaf187590b5d5687210a75&Expires=1435139052&Signature=SP/70MZP4C2UdUd9%2B5Ct1jEV1EQ%3D : 403 "Forbidden" <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code><Message>Access Denied</Message>

Then yesterday it changed to: INFO: HTTP Request Returned 413 Request Entity Too Large: ERROR: Request Entity Too Large Response: JSON must be no more than 1000000 bytes.

Should I decrease the number of files, or is there any other option?

`knife --version` returns: Chef: 12.3.0

  • Did you try `knife ssl check`/`knife ssl fetch` as suggested? – Aleksei Matiushkin Jun 24 '15 at 06:50
  • Yes, I have already done that, as said in the post. The connection is established but fails on a deeper directory structure or many files. The output for the check is: Connecting to host xyz:443 Successfully verified certificates from `xyz.com' – Rahul Jun 24 '15 at 06:52
  • Could you try with the option `--concurrency 1` (the default is 10)? I suspect a race condition between many threads uploading files; I think there's a bug about it that was fixed. If it works with 1, try increasing until it errors again. – Tensibai Jun 24 '15 at 07:16
  • Hi, I tried with concurrency 1. It took way too long but returned this: INFO: HTTP Request Returned 403 Forbidden: ERROR: Chef::Exceptions::ContentLengthMismatch: Response body length 0 does not match HTTP Content-Length header 206. Will try once more; with 5 and 2 I got the certificate error again both times. – Rahul Jun 24 '15 at 09:03
  • I tried again and now it gives the exception below: INFO: HTTP Request Returned 403 Forbidden: ERROR: Failed to upload filepath\file (7a81e65b51f0d514ec645da49de6417d) to https://example.com:443/bookshelf/organization-00000000000000000000000000000000/checksum-7a81e65b51f0d514ec645da49de6417d?AWSAccessKeyId=6d 3088476d373416dfbaf187590b5d5687210a75&Expires=1435139052&Signature=SP/70MZP4C2UdUd9%2B5Ct1jEV1EQ%3D : 403 "Forbidden" AccessDeniedAccess Deniedg2gCZAATYm9va3NoZWxmQDEyNy4wLjAuMWgDYgAABZtiAAIfLWIAAHNh – Rahul Jun 24 '15 at 09:51
  • Reading on a forum, I increased the erchef value of s3_ssl_ttl from 900 to 7200 for the "Response body length 0" error. – Rahul Jun 24 '15 at 09:54
  • Also, the access denied error came with: ERROR: You authenticated successfully to https://example.com:443 as node but you are not authorized for this action Response: AccessDeniedAccess Deniedg2gCZAATYm9va3NoZWxmQDEyNy4wLjAuMWgDYgAABZtiAAIfLWIAAHNh – Rahul Jun 24 '15 at 10:02
  • @Rahul Ping people with @ in comments (so there's a notification) and edit your question to include the new messages; comments are hard to format and unreadable. – Tensibai Jun 24 '15 at 15:08
  • So the problem with low concurrency is that the auth key for the request times out (15 mins max). Could you confirm the knife version with `knife --version`? (It should print `Chef: xx.yy.z`.) – Tensibai Jun 24 '15 at 15:10

2 Answers


Should I decrease the number of files, or is there any other option?

Usually the files inside a cookbook are not intended to be too large or too numerous; if you have a lot of files to distribute, it's a sign you should change the way you distribute those files.

One option could be to make a tarball, but this makes it harder to manage deleted files.
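A minimal sketch of the tarball approach, assuming the archive is hosted on an internal artifact server (the URL, checksum, and paths below are illustrative assumptions):

```ruby
# Fetch one archive instead of syncing many files with remote_directory.
remote_file '/var/cache/payload.tar.gz' do
  source 'https://artifacts.example.com/payload.tar.gz' # assumed artifact store
  checksum 'sha256-of-the-archive'                      # placeholder; pin the real digest
  notifies :run, 'execute[extract payload]', :immediately
end

execute 'extract payload' do
  command 'tar -xzf /var/cache/payload.tar.gz -C /opt/app'
  action :nothing # only runs when the archive actually changes
end
```

The trade-off is the one noted above: files removed from the archive stay on the node, whereas `remote_directory` can purge them for you.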

Another option, if you're on an internal Chef server, is to follow the advice here and change the `client_max_body_size 2M;` value for nginx, but I can't guarantee it will work.
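On an Omnibus Chef server, that nginx value is normally set through the server configuration file rather than by editing the generated nginx config directly (the value below is only an example, not a recommendation):

```ruby
# /etc/opscode/chef-server.rb -- tunable key per the Omnibus Chef server
# configuration; the value is an illustrative example.
nginx['client_max_body_size'] = '250m'
```

After changing it, run `chef-server-ctl reconfigure` so the nginx config is regenerated and the services restarted.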

  • Thanks, I will try that out. As far as creating a tarball goes, I cannot update a single file from the same cookbook if there is a minor change; I would have to create the tarball again, etc. – Rahul Jun 25 '15 at 09:17
  • @Rahul that's where continuous integration (CI) comes into play: building an artifact from the files when they change, and deploying the new artifact to servers (using Jenkins and a repository like Nexus or Artifactory, for example). – Tensibai Jun 25 '15 at 09:22
  • I changed `client_max_body_size 2M;` in the http section and restarted the nginx service, but it didn't help. As you already said you can't guarantee it will work, I will go ahead with reducing the size of the directories. Thanks for the suggestions. – Rahul Jun 25 '15 at 11:32

I had the same error. I ran chef-server-ctl reconfigure on the Chef server, then tried uploading the cookbook again, and everything started working fine.
