7

Does using multiple algorithms make passwords more secure? (Or less?)

Just to be clear, I'm NOT talking about doing anything like this:

key = Hash(Hash(salt + password))

I'm talking about using two separate algorithms and matching both:

key1 = Hash1(user_salt1 + password)
key2 = Hash2(user_salt2 + password)

Then requiring both to match when authenticating. I've seen this suggested as a way to eliminate collision matches, but I'm wondering about unintended consequences, such as creating a 'weakest link' scenario or providing information that makes the user database easier to crack, since this method exposes more data than a single key does. E.g. something like combining information from the two hashes to find the passwords more easily. Also, if collisions were truly eliminated, you could theoretically brute force the actual password, not just a matching password. In fact, you'd have to in order to brute force the system at all.

I'm not actually planning to implement this, but I'm curious whether it's genuinely an improvement over the standard practice of single key = Hash(user_salt + password).
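Concretely, the scheme I mean looks something like this (a Python sketch; sha256 and sha512 are just stand-ins for Hash1 and Hash2, and the per-user salts are random):

```python
import hashlib
import os

def register(password: str):
    """Store two independent salted hashes of the same password."""
    salt1, salt2 = os.urandom(16), os.urandom(16)
    key1 = hashlib.sha256(salt1 + password.encode()).hexdigest()
    key2 = hashlib.sha512(salt2 + password.encode()).hexdigest()
    return salt1, salt2, key1, key2

def authenticate(password: str, salt1, salt2, key1, key2) -> bool:
    """Require BOTH hashes to match."""
    return (hashlib.sha256(salt1 + password.encode()).hexdigest() == key1 and
            hashlib.sha512(salt2 + password.encode()).hexdigest() == key2)

record = register("hunter2")
print(authenticate("hunter2", *record))  # True
print(authenticate("wrong", *record))    # False
```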

EDIT:

Many good answers, so just to summarize here: this should have been obvious looking back, but you do create a weakest link by using both, because matches from the weaker of the two algorithms can be tried against the other. For example, if you used a weak (fast) MD5 and PBKDF2, I'd brute force the MD5 first, then try any match I found against the other, so by having the MD5 (or whatever) you actually make the situation worse. Also, even if both are among the more secure set (bcrypt + PBKDF2, for example), you double your exposure to one of them breaking.
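The attack described above can be sketched in a few lines of Python (md5 stands in for the weak hash, and the small candidate list stands in for a real brute-force keyspace):

```python
import hashlib

# Hypothetical leaked record: a fast MD5 hash stored alongside a slow second hash.
salt1 = b"s1"
leaked_md5 = hashlib.md5(salt1 + b"letmein").hexdigest()

# Step 1: brute-force ONLY the cheap MD5 -- millions of guesses/sec on a GPU.
candidates = [b"password", b"123456", b"letmein", b"qwerty"]
cracked = [pw for pw in candidates
           if hashlib.md5(salt1 + pw).hexdigest() == leaked_md5]

# Step 2: only the few survivors ever get checked against the slow hash,
# so its cost barely slows the attacker down at all.
print(cracked)  # [b'letmein']
```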

Paul
    You may have more luck on http://crypto.stackexchange.com with this. –  Jan 15 '13 at 15:55
    I believe some banks actually implement double key hashing, with double input method. – drum Jan 15 '13 at 15:56
    @drum I'd be interested in some elaboration. Or a link. – svidgen Jan 15 '13 at 16:07
    Saying that collisions aren't a big deal does not negate the need for salting. Salting prevents precomputations. Its purpose is not to prevent 2 unique passwords from hashing to the same value. The purpose of a salt is to prevent you from precalculating the hash of a large dictionary of passwords and then matching it up with a database of hashed passwords later. – Luke Jan 15 '13 at 19:21
  • @Luke - I'm not sure that's entirely true. If you did not salt passwords or used the same salt for each password, an attacker with access to the database could look for accounts with the same password hash (and therefore the same password). Those accounts could then be targeted for a common password / dictionary based attack. – Eric Petroelje Jan 15 '13 at 21:13
  • @EricPetroelje I think perhaps you misread my statement. I think you're saying the exact thing I'm saying. :) – Luke Jan 15 '13 at 21:45

4 Answers

7

The only thing this would help with would be reducing the possibility of collisions. As you mention, there are several drawbacks (weakest link being a big one).

If the goal is to reduce the possibility of collisions, the best solution would simply be to use a single secure algorithm (e.g. bcrypt) with a larger hash.
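For instance (a sketch, not Eric's code): a single key-stretching algorithm can already produce an arbitrarily long derived key, which shrinks the collision probability far below anything a second hash buys you.

```python
import hashlib
import os

salt = os.urandom(16)
# One algorithm, one (larger) output: dklen=64 gives 512 bits of derived key.
key = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 100_000, dklen=64)
print(len(key))  # 64
```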

Eric Petroelje
  • Could you elaborate on the weakest link case? I think I understand why this is so when you chain/combine the results into a single value (basically hashes reduce the available data, so the weakest hash exposes the whole key to brute-force) but if they're stored independently and both forced to match is that still the case? – Paul Jan 15 '13 at 16:03
  • @Paul - well, let's say hash1 was an MD5 hash. Assuming an 8 character password, and given the salt, an MD5 hash can be brute force cracked relatively easily. So no matter how good your second hash was, an attacker will be able to brute force the password from the MD5 hash. – Eric Petroelje Jan 15 '13 at 16:07
6

Collisions are not a concern with modern hashing algorithms. The point isn't to ensure that every hash in the database is unique. The real point is to ensure that, in the event your database is stolen or accidentally given away, the attacker has a tough time determining a user's actual password. And the chance of a modern hashing algorithm recognizing the wrong password as the right password is effectively zero -- which may be more what you're getting at here.

To be clear, there are two big reasons you might be concerned about collisions.

  1. A collision between the "right" password and a supplied "wrong" password could allow a user with the "wrong" password to authenticate.
  2. A collision between two users' passwords could "reveal" user A's password if user B's password is known.

Concern 1 is addressed by using a strong/modern hashing algorithm (and avoiding terribly anti-brilliant things, like looking for user records based solely on their password hash). Concern 2 is addressed with proper salting -- a "lengthy" unique salt for each password. Let me stress, proper salting is still necessary.
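The effect of per-user salts is easy to demonstrate (a Python sketch; sha256 stands in for whatever strong hash you actually use):

```python
import hashlib
import os

def hash_pw(password: bytes, salt: bytes) -> str:
    return hashlib.sha256(salt + password).hexdigest()

# Two users pick the identical password...
salt_a, salt_b = os.urandom(16), os.urandom(16)
digest_a = hash_pw(b"monkey123", salt_a)
digest_b = hash_pw(b"monkey123", salt_b)

# ...but unique salts keep the stored hashes from matching, so cracking
# one record reveals nothing about the other.
print(digest_a != digest_b)  # True
```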

But, if you add hashes to the mix, you're just giving potential attackers more information. I'm not sure there's currently any known way to "triangulate" message data (passwords) from a pair of hashes, but you're not making significant gains by including another hash. It's not worth the risk that there is a way to leverage the additional information.

svidgen
  • I think you miss a subtle point here. Ideally every hash in the database **is** unique. That way if your database is stolen and the nefarious party takes the time to crack User A's password, they don't automatically get User B's password if it happens to match User A's password. That is where salting comes in. Salting also prevents precomputation, but the scenario I describe is really just a short-term precomputation (I precomputed it just a moment ago). – Luke Jan 15 '13 at 19:19
  • @Luke Ideally, yes. But, not the point. Hash(saltX + passwordX) can be the same as Hash(saltY + passwordY) without significantly compromising either password *if at all*. I.e., not all collisions are meaningful. Proper salting is required in any case. – svidgen Jan 15 '13 at 19:29
    I agree. Reading the update from @Paul I think he got the wrong impression there though. Hopefully these comments help to clarify that. Specifically: `Apparently collisions aren't a big deal anymore. Still not sure about this one so I'll keep salting for now.` – Luke Jan 15 '13 at 19:37
  • OK, I think there was perhaps a little semantic sloppiness on my part. The more I think about it the less I'm sure exactly what "doesn't matter" anymore, because the way I'm interpreting this you'd have to go back **many many years** to find a point where these collisions *would* have been a problem. I'm going to remove the second point altogether. All that really matters is that the method described in the question is flawed--I just couldn't quite explain why when I first saw it. – Paul Jan 15 '13 at 20:29
  • @Paul Effectively what svidgen is saying here is that the method is unnecessary because it solves a problem that doesn't exist. It solves the problem of a hash algorithm taking 2 different inputs and producing the same output, which is statistically impossible with any reasonable hash algorithm. – Luke Jan 15 '13 at 20:44
4

To answer your question:

Having a unique salt is better than having a generic salt: H(S1 + PW1), H(S2 + PW2).

Using multiple algorithms may be better than using a single one: H1(X), H2(Y) (but probably not, as svidgen mentions).

However,

The spirit of this question is a bit wrong for two reasons:

  1. You should not be coming up with your own security protocol without guidance from a security expert. I know it's not your own algorithm, but most security problems start because existing algorithms were used incorrectly; the algorithms themselves are usually air-tight.

  2. You should not be using hash(salt+password) to store passwords in a database. This is because hashing was designed to be fast - not secure. It's somewhat easy with today's hardware (especially with GPU processing) to find hash collisions in older algorithms. You can of course use a newer secure hashing algorithm (SHA-256 or SHA-512) where collisions are not an issue - but why take chances?

You should be looking into password-based key derivation functions (e.g. PBKDF2), which are designed to be slow to thwart this type of attack. They usually combine salting with a secure hashing algorithm (such as SHA-256) iterated a couple hundred thousand times.

Making the function take about a second is no problem for a user logging in, who won't notice such a slowdown. But for an attacker this is a nightmare, since they have to perform these iterations for every attempt, significantly slowing down any brute-force attack.
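In Python, for example, the standard library already provides this (the iteration count here is an illustrative choice, not a tuned recommendation):

```python
import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)

# A couple hundred thousand iterations of HMAC-SHA-256 -- tune the count
# until one call takes roughly a second on your hardware; attackers pay
# the same cost for every single guess.
key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

# Store (salt, iteration count, key); recompute and compare on login.
login_key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
print(login_key == key)  # True
```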

Take a look at libraries supporting PBKDF-based password storage as a better way of doing this. Jasypt is one of my favorites for Java encryption.

See this related security question: How to securely hash passwords and this loosely related SO question

Justen
  • For future readers, a generic salt is often used in addition to a unique salt. The generic salt, humorously called a "pepper", is a secret whereas the salt is generally stored right alongside the password. This secret is not impossible to find out of course, but it's another difficulty for the password cracker. – Charles Burns May 14 '18 at 21:31
0

A salt is added to password hashes to prevent the use of generic pre-built hash tables. The attacker would be forced to generate new tables based on their word list combined with your random salt.

As mentioned, hashes were designed to be fast for a reason. To use them for password storage, you need to slow them down (large number of nested repetitions).

You can create your own password-specific hashing method. Essentially, nest your preferred hashes on the salt + password and repeat:

string MyAlgorithm(string data) {
  string temp = data;
  for i = 0 to X {
    temp = Hash3(Hash2(Hash1(temp)));
  }
  return temp;
}
result = MyAlgorithm(salt + password);

Where "X" is a large number of repetitions, enough so that the whole thing takes at least a second on decent hardware. As mentioned elsewhere, the point of this delay is that it's insignificant to the normal user (who knows the correct password and only waits once) but significant to the attacker (who must run the process for every combination). Of course, this is all for the sake of learning; it's probably simpler to just use proper existing APIs.
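A runnable Python version of that pseudocode might look like this (the three hash choices and the repetition count are arbitrary placeholders, not recommendations):

```python
import hashlib

X = 10_000  # in practice, raise this until one call takes about a second

def my_algorithm(data: bytes) -> bytes:
    temp = data
    for _ in range(X):
        # nest three different hashes per round, as in the pseudocode above
        temp = hashlib.sha512(
            hashlib.sha256(hashlib.md5(temp).digest()).digest()).digest()
    return temp

salt, password = b"random-salt", b"hunter2"
result = my_algorithm(salt + password)
print(len(result))  # 64 bytes, from the final SHA-512
```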

Ioan
  • PBKDF2 was standardized for this very reason - One should not implement a PBKDF function themselves - but take one from a well-tested library instead. – Justen Jan 15 '13 at 16:30
  • There's no reason (assuming you follow proper security procedures) to not implement your own. If you look at the source code for PBKDF2, it's very simple and does little more. In any case, it's interesting and using a larger/mixed set of HMACs for hashing is yet another hurdle for the attacker. – Ioan Jan 15 '13 at 17:09
    New code is almost always less secure than older code in this particular problem space. As an academic exercise it's simple enough, but there's no reason to implement it yourself in production. Especially since it: 1. already exists, 2. was written by experts, and 3. stood the test of time by having vulnerabilities discovered and patched. – Justen Jan 15 '13 at 17:28
  • +1 to the point made by @Justen. There are lots of subtle things to consider in this problem space. Some of which are completely non-obvious like timing attacks or even power fluctuation and usage attacks. Any potential information leaked about a password compromises its security (size, value of a bit or two, etc). – Luke Jan 15 '13 at 19:42
  • @Luke I haven't read the full standard, but looking at the source code of OpenBSD's implementation, it does little more than what I outlined. While I understand the topic of computer security is large, this particular element isn't the hardest. Also, notice I'm not implementing the underlying hash algorithms, which are more important. – Ioan Jan 15 '13 at 20:53