
We are following a pretty standard user ID / password check. We store the hashed password in the db. When the user enters credentials we hash the entered password and compare it to what the db has. If they match, the user is authenticated.

Now this login process is slowing down considerably under load testing, so I was asked to look at it. The VS 2013 profiler pointed out the hashing method as a hot path. Looking at the method in question, we are looping over the hashing process:

private const int totalHashCount = 1723;

public string CreateHash(string salt, string password, int securityIndex)
{
  string hashedPass = this.GenerateHashString(salt + password, securityIndex);
  for (int i = 1; i <= totalHashCount; i++)
  {
    hashedPass = this.GenerateHashString(hashedPass, securityIndex);
  }
  return hashedPass;
}

I went to the developer, and he stated that the client's security team wanted us to rehash the hash, doing so a prime number of times greater than 1000... and he provided the email as documentation.

Now, I am not a cryptography expert, and we have a good relationship with the client, so before I go to them and connect this rehash loop to their performance woes, I wanted to find out whether rehashing like this does indeed increase security.

To my understanding a single hash is practically impossible to invert, so why waste cycles repeating the process?

Thoughts?


Edit

Added GenerateHashString:

protected internal string GenerateHashString(string textToHash, int securityIndex = 0)
{
   UnicodeEncoding uEncode = new UnicodeEncoding();
   SHA512Managed sha = new SHA512Managed();

   byte[] bytVal = uEncode.GetBytes(textToHash + hashIntPool[securityIndex].ToString());
   byte[] hashVal = sha.ComputeHash(bytVal);

   return Convert.ToBase64String(hashVal);
 }
GPGVM
  • I have never seen this security measure, but it does protect you from dictionary attacks (although the salt should suffice in that case; see http://en.wikipedia.org/wiki/Dictionary_attack). I would question whether this improves security. – Moti Azu Jan 13 '15 at 14:38
  • This depends on what `GenerateHashString` is doing. You *want* "load testing of logons" to perform badly, because someone trying to hack in is doing exactly what your load test does. However, many built-in string hashing functions will [already do what your code is doing](http://msdn.microsoft.com/en-us/library/system.security.cryptography.rfc2898derivebytes.iterationcount(v=vs.110).aspx), so we need to see what GenerateHashString does to tell you whether you need to do it yourself. Modern computers can attempt billions of hashes per second when testing for a password match; doing iterations cuts that number down. – Scott Chamberlain Jan 13 '15 at 14:41
  • See this question on our security sister site: [How to securely hash passwords?](http://security.stackexchange.com/questions/211/how-to-securely-hash-passwords/31846#31846) The loop in your code is implementing the "Slowness" part of the answer. (From a random guess, I bet `SecurityIndex` is really a 32-bit salt.) – Scott Chamberlain Jan 13 '15 at 14:45
  • Added GenerateHashString – GPGVM Jan 13 '15 at 14:51
  • Did a security expert come up with that algorithm? Seems to me you could introduce vulnerabilities/flaws, but I'm not a cryptography expert ... Why aren't you using bcrypt? – Mitch Wheat Jan 13 '15 at 14:58
  • @MitchWheat The code existed before I saw it. Bcrypt is on my refactor radar, but I need to get to the heart of what the client wanted. If, as Scott said, they were going for slowness to reduce the number of brute-force attempts possible per second, then I want to retain that feature and modify the load test. – GPGVM Jan 13 '15 at 15:05
  • Getting the password from a hash is difficult, but there are databases which map hashes back to passwords. So when a hacker looks a hash up in such a database and finds it, the password is revealed (search 'MD5 database' on Google for an example). The above solution gets around that, but it seems like massive overkill to do it so many times. You also have the problem that if you change it, you'll need to reset all passwords. – McGaz Jan 13 '15 at 15:11
  • Password hashing needs to be slow *on the attacker's hardware*. – CodesInChaos Jan 13 '15 at 15:39
  • possible duplicate of [Is "double hashing" a password less secure than just hashing it once?](http://stackoverflow.com/questions/348109/is-double-hashing-a-password-less-secure-than-just-hashing-it-once) – erickson Jan 14 '15 at 18:25

2 Answers


The technique of repeated hashing, called "stretching", is used to make brute-force attacks more difficult. If it takes 0.1 seconds to hash a password (due to the repetitions), then an attacker can at best try 10 passwords a second to find a match. If you speed up the hashing process so it takes a microsecond, then the attacker can test a million passwords a second.

You need to balance speed against security. A user login only needs to be fast enough to satisfy the user, so 0.1 to 0.5 seconds is probably acceptable.
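
As a rough calibration sketch (assuming SHA-512, as in the question's GenerateHashString; the round counts here are illustrative only), you can measure the per-round cost on your own server and see how many rounds fit into a ~100 ms login budget. Remember that an attacker's dedicated hardware will be faster, so this only tells you what your server can afford, not what is "safe":

// Rough calibration sketch: time a batch of chained SHA-512 hashes to estimate
// the per-round cost, then derive how many rounds fit in a ~100 ms login budget.
using System;
using System.Diagnostics;
using System.Security.Cryptography;
using System.Text;

class HashCostProbe
{
  static void Main()
  {
    const int probeRounds = 100000;
    byte[] state = Encoding.Unicode.GetBytes("salt+password");

    using (SHA512 sha = SHA512.Create())
    {
      Stopwatch sw = Stopwatch.StartNew();
      for (int i = 0; i < probeRounds; i++)
      {
        state = sha.ComputeHash(state); // same "rehash the hash" shape as CreateHash
      }
      sw.Stop();

      double msPerRound = sw.Elapsed.TotalMilliseconds / probeRounds;
      Console.WriteLine("~" + msPerRound.ToString("F5") + " ms per round");
      Console.WriteLine("Rounds for a ~100 ms login: " + (int)(100.0 / msPerRound));
    }
  }
}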

If your server is overloaded then get a faster processor, or buy a dedicated hashing server. That will be a lot cheaper than the legal consequences of a data breach.

rossum

Repeating the hash operation is essential to secure password authentication, but you are doing it wrong and therefore indeed wasting CPU to achieve nothing.

You should use an algorithm like PBKDF2 that includes the password in each round of hashing in order to preserve all the unpredictability of the password. bcrypt and especially scrypt are good alternatives too.
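
To illustrate the difference (a hand-rolled sketch of my own, not the actual PBKDF2 construction, which uses HMAC with the password as the key; use a library implementation rather than this): the question's loop drops the password after the first round, whereas the password should stay in the mix on every round.

// Illustration only: the point is that the password is fed back into every
// round instead of being discarded after round 0 as in the question's CreateHash.
using System;
using System.Security.Cryptography;

static class StretchSketch
{
  public static byte[] Stretch(byte[] salt, byte[] password, int rounds)
  {
    using (SHA512 sha = SHA512.Create())
    {
      // round 0: H(salt || password)
      byte[] state = sha.ComputeHash(Concat(salt, password));
      for (int i = 1; i < rounds; i++)
      {
        // question's code: state = H(state)              -- password dropped
        // better:          state = H(state || password)  -- password kept
        state = sha.ComputeHash(Concat(state, password));
      }
      return state;
    }
  }

  private static byte[] Concat(byte[] a, byte[] b)
  {
    byte[] result = new byte[a.Length + b.Length];
    Buffer.BlockCopy(a, 0, result, 0, a.Length);
    Buffer.BlockCopy(b, 0, result, a.Length, b.Length);
    return result;
  }
}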

Also, one thousand rounds is not nearly enough; to be secure against offline dictionary attacks, you need the hashing operation to be relatively slow, even when performed on the attacker's dedicated password testing hardware. Picking a prime number of rounds is meaningless mumbo jumbo. The number of rounds will depend on the algorithm you select, but for PBKDF2 with SHA-256, somewhere between 10,000 and 100,000 rounds should provide a reasonable level of security.

A slow algorithm is necessary to prevent an attacker who obtains a hash from quickly trying many different passwords to see which produces the same hash. It's true that a secure hash is not feasible to invert, but it won't stop guessing, and attackers are good at prioritizing their guesses to try the most likely passwords first. Repetition of the hash is what provides this necessary slowness.

This has been discussed many times on StackOverflow. I refer you to a previous answer for more background.

In C#, you could use Rfc2898DeriveBytes to perform password hashing securely. You can encode the derived key in Base-64 to be stored as a string, or actually use it as an encryption key to encrypt a known plain text like the bcrypt algorithm does. You'll notice that Rfc2898DeriveBytes uses a "salt", which I discuss elsewhere; you'll need to store this value along with the hash value to perform authentication later.
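
A minimal sketch of that approach (the salt size, key size, and iteration count below are illustrative; measure the cost on your own hardware):

// Password hashing with PBKDF2 via Rfc2898DeriveBytes: store the Base-64 salt
// and hash together; re-derive and compare on login.
using System;
using System.Security.Cryptography;

public static class PasswordHasher
{
  private const int SaltSize = 16;      // 128-bit random salt
  private const int KeySize = 32;       // 256-bit derived key
  private const int Iterations = 50000; // example value; tune to your server

  public static void CreateHash(string password, out string saltB64, out string hashB64)
  {
    byte[] salt = new byte[SaltSize];
    using (var rng = new RNGCryptoServiceProvider())
    {
      rng.GetBytes(salt);
    }
    using (var kdf = new Rfc2898DeriveBytes(password, salt, Iterations))
    {
      saltB64 = Convert.ToBase64String(salt);
      hashB64 = Convert.ToBase64String(kdf.GetBytes(KeySize));
    }
  }

  public static bool Verify(string password, string saltB64, string hashB64)
  {
    byte[] salt = Convert.FromBase64String(saltB64);
    byte[] expected = Convert.FromBase64String(hashB64);
    using (var kdf = new Rfc2898DeriveBytes(password, salt, Iterations))
    {
      byte[] actual = kdf.GetBytes(expected.Length);
      // compare every byte so the comparison time doesn't leak where a mismatch occurs
      int diff = 0;
      for (int i = 0; i < actual.Length; i++) diff |= actual[i] ^ expected[i];
      return diff == 0;
    }
  }
}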

erickson