
I created a random generator for cryptographic purposes and I would like to know if it is secure enough. If it is, you're free to use it, of course. Thanks in advance for reading my wall of text :D

Explanation of functions

The $this->pref_hash_algo($bits_of_entropy=null, $inclusive=false) function gets the default hashing method (in my case sha256) or, if $bits_of_entropy is given, the optimal hashing algorithm, either inclusive or exclusive. For instance, 230 bits of entropy inclusive would return sha256, while exclusive would give sha224.

The algorithm returned from self::$HASH_PREFERENCES['128'][0] is ripemd128.
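
To make the lookup concrete, here is a simplified sketch of how such a preference table could drive the selection; the table contents and the method body are my illustration based on the description above, not the actual pastebin code:

```php
<?php
// Simplified sketch of the preference lookup; the actual table and
// tie-breaking in the pastebin code may differ.
class RandomGenerator
{
    // Assumed layout: output size in bits => algorithms by preference.
    private static $HASH_PREFERENCES = array(
        '128' => array('ripemd128'),
        '160' => array('ripemd160'),
        '224' => array('sha224'),
        '256' => array('sha256'),
        '384' => array('sha384'),
        '512' => array('sha512'),
    );

    public function pref_hash_algo($bits_of_entropy = null, $inclusive = false)
    {
        if ($bits_of_entropy === null) {
            return 'sha256'; // the default hashing method
        }
        $sizes = array_keys(self::$HASH_PREFERENCES); // PHP stores these keys as ints
        sort($sizes);
        if ($inclusive) {
            // Smallest digest that covers all requested bits.
            foreach ($sizes as $size) {
                if ($size >= $bits_of_entropy) {
                    return self::$HASH_PREFERENCES[$size][0];
                }
            }
        } else {
            // Largest digest that does not exceed the requested bits.
            foreach (array_reverse($sizes) as $size) {
                if ($size <= $bits_of_entropy) {
                    return self::$HASH_PREFERENCES[$size][0];
                }
            }
        }
        return 'sha512'; // nothing fits exactly: fall back to the largest
    }
}
```

With this sketch, $this->pref_hash_algo(230, true) returns sha256 and $this->pref_hash_algo(230, false) returns sha224, matching the example above.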

The $this->hash($input, $algorithm=null, $output_type=self::OUTPUT_HEX) function only does a PHP hash($algorithm, $input[, $binary]) call, with extra checking that the algorithm is considered secure, and support for more output types than binary and hexadecimal.
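
Roughly along these lines (again a sketch; the whitelist and the output-type constants are placeholders for whatever the pastebin code defines):

```php
<?php
// Sketch of the hash() wrapper; the whitelist and output-type
// constants are illustrative, not copied from the pastebin code.
class RandomGenerator
{
    const OUTPUT_BINARY = 0;
    const OUTPUT_HEX    = 1;
    const OUTPUT_BASE64 = 2;

    // Hypothetical whitelist of algorithms considered secure.
    private static $SECURE_ALGOS = array(
        'ripemd128', 'ripemd160', 'sha224', 'sha256', 'sha384', 'sha512',
    );

    public function hash($input, $algorithm = null, $output_type = self::OUTPUT_HEX)
    {
        $algorithm = $algorithm ?: 'sha256';
        if (!in_array($algorithm, self::$SECURE_ALGOS, true)) {
            throw new InvalidArgumentException("'$algorithm' is not considered secure.");
        }
        // Note PHP's argument order: hash($algorithm, $input, $binary).
        $raw = hash($algorithm, $input, true);
        switch ($output_type) {
            case self::OUTPUT_BINARY: return $raw;
            case self::OUTPUT_BASE64: return base64_encode($raw);
            default:                  return bin2hex($raw);
        }
    }
}
```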

Generation code

For easy access, I moved the first version to http://pastebin.com/YtJFvpah

Update

Based on your input, I have altered the code as follows: http://pastebin.com/bQ5tFDdh

Summary of edits:

  • No longer hashing the /dev/urandom output, only formatting it.
  • Added meantime output hashing, for when the hashing algorithm has too little output for the requested number of random bits (for example, sha512 when 4000 bits are requested); a simplified sketch follows below.
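
For that second point, the idea is roughly the following; this is my simplified illustration of the technique, not the pastebin code verbatim:

```php
<?php
// Simplified sketch of "meantime output hashing": when one digest is
// shorter than the requested number of bits, chain an internal state
// and concatenate output blocks until enough bytes are collected.
function extended_random_bits($bits, $algo = 'sha512')
{
    $bytes_needed = (int)ceil($bits / 8);
    $state   = hash($algo, uniqid(mt_rand(), true), true);
    $output  = '';
    $counter = 0;
    while (strlen($output) < $bytes_needed) {
        // Advance the internal state.
        $state = hash($algo, $state . $counter++, true);
        // Emit a hash *of* the state, not the state itself, so the
        // running state never appears directly in the output.
        $output .= hash($algo, 'output' . $state, true);
    }
    return substr($output, 0, $bytes_needed);
}
```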

Testing

Case 1

I ran php /my/path/to/file.php | ent to test the /dev/urandom method and the alternative method on a 2,000,000-byte sample.
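
The test script itself only needs to write raw bytes to stdout so ent can read them from the pipe. A hypothetical stand-in for file.php, shown for the /dev/urandom case only:

```php
<?php
// Write 2,000,000 raw random bytes to stdout so they can be
// piped into ent: php file.php | ent
$src = fopen('/dev/urandom', 'rb');
$out = fopen('php://stdout', 'wb');
$remaining = 2000000;
while ($remaining > 0) {
    $chunk = fread($src, min(8192, $remaining));
    fwrite($out, $chunk);
    $remaining -= strlen($chunk);
}
fclose($src);
fclose($out);
```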

Case 2

I created a 4.7 MB binary file using the alternative method (the one used when /dev/urandom is disabled or not available) and ran dieharder -a -f /home/beanow/random.input -t 10.

Case 3

The same as case 2, but with 20 MB binary files and without the -t argument, using the default setting.

Beanow
  • You're using predefined hashing algorithms and the uniqid and microtime functions, which you then combine. Your "algorithm" is as good as the hashing algorithm you used, no worse, no better. – N.B. Jan 27 '12 at 17:18
  • See http://stackoverflow.com/questions/1182584/secure-random-number-generation-in-php (but go past the 'accepted' answer, which is kinda wrong) – zneak Jan 27 '12 at 17:20
  • Where is the PRNG you have **created**? I see you're **using** one: `/dev/urandom` or whatever the OS throws at you, multiple times (implicitly in `uniqid` and `mt_rand`). Hardly the same thing. (also, "compressing" random data? HUGE red flag!). You have written the answer to your question yourself, IMNSHO: "Don't try to improve this, you will likely just ruin it - I did it anyway." – Piskvor left the building Jan 27 '12 at 17:33
  • @N.B.: never better, but almost invariably worse. – Marc B Jan 27 '12 at 17:35
  • Why are you reseeding on each step, instead of keeping state? – CodesInChaos Jan 27 '12 at 18:45
  • @Piskvor I created the part where /dev/urandom is not available. Can you explain or link why 'compressing' the random data is bad? – Beanow Jan 27 '12 at 18:57
  • @N.B. I do aim to get the maximum possible entropy from the hashing algorithms I use. microtime() and mt_rand() on their own don't generate 512 bits of random data by default. – Beanow Jan 27 '12 at 18:59
  • @CodesInChaos I keep a state and reuse it to alter the state. But I hash that seed to the $str var to ensure the state will not leak directly into the output and become predictable in any way. – Beanow Jan 27 '12 at 19:07
  • @Beanow - if you can't see that your "algorithm" depends on the properties of the hashing algorithms you use, then we can't talk about a hashing algorithm of your own. You just used those algorithms with more overhead. All you get is extended execution time with the same number of permutations. – N.B. Jan 27 '12 at 21:59
  • If you're asking the question, chances are the answer is no. – Adam Liss Jan 28 '12 at 02:53
  • @Beanow: You are combining several outputs of the same PRNG. This will *not* provide you with any significant increase in entropy. As for random data compression, start reading at http://en.wikipedia.org/wiki/Kolmogorov_complexity , continue to anything connected to Shannon's law, etc etc. – Piskvor left the building Jan 29 '12 at 18:21
  • @Piskvor thank you, I'll bookmark this for a read soon. – Beanow Jan 30 '12 at 12:49
  • @N.B. can you verify if the above update still indicates that the alternative method is not random enough but just more overhead for existing methods? – Beanow Jan 30 '12 at 12:51
  • @Beanow - again, you are using either `/dev/urandom`, which was written for *NIX by someone other than you, or microtime, which is also not written by you. Therefore, your "random" algorithm isn't yours; it's just expanding the idea of randomness. Since you rely on either /dev/urandom or microtime, how can we talk about true randomness? People have been doing random number generators for decades; I suggest checking how GUIDs are generated, that might give you some more insight. – N.B. Jan 30 '12 at 13:45

3 Answers


Rule #1 of cryptography is: don't design your own cryptography. Cryptography is very subtle and very easy to get wrong. There is no reason to avoid the OS's built-in PRNG; all major platforms (*nix, Windows, etc.) have strong, well-understood, well-studied PRNGs. So use those.
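
For PHP specifically, that can be as simple as the following sketch (openssl_random_pseudo_bytes() exists since PHP 5.3; the 32-byte length is arbitrary):

```php
<?php
// Use the platform's well-studied PRNG instead of rolling your own.
$bytes = openssl_random_pseudo_bytes(32, $strong); // 256 bits
if ($bytes === false || !$strong) {
    die("No cryptographically strong source available.\n");
}
echo bin2hex($bytes), "\n";
```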

  • Somebody has to do it. Besides, I do use the OS built-in one if it is available. I wrote the backup algorithm. – Beanow Jan 27 '12 at 18:55
  • `/dev/urandom` is the PRNG that comes with most *nix systems. – rook Jan 27 '12 at 19:56
  • And then hashing the results, which will not increase the entropy of the output. Don't try to add stuff or be cute with things, read from the system PRNG and use that. –  Jan 27 '12 at 20:57

This is not really making your own PRNG. I am not sure why you are hashing the output from /dev/urandom; if anything, this is going to make the system less secure. Just read out the bytes you need and convert them from base256 to base16 or whatever else you need.
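
In code, that suggestion amounts to something like this sketch (the byte count is arbitrary):

```php
<?php
// Read raw bytes from the system PRNG and only reformat them;
// no extra hashing on top.
$fh    = fopen('/dev/urandom', 'rb');
$bytes = fread($fh, 16); // 16 bytes = 128 bits
fclose($fh);

echo bin2hex($bytes), "\n"; // base256 -> base16 (hex)
```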

rook
  • Point taken for the case where /dev/urandom is available. But what about the part when it's not? – Beanow Jan 30 '12 at 07:11

There are a number of tests you can run against your output to determine how random it really is.

ent - A Pseudorandom Number Sequence Test Program

... [ent] applies various tests to sequences of bytes stored in files and reports the results of those tests. [ent] is useful for evaluating pseudorandom number generators for encryption and statistical sampling applications, compression algorithms, and other applications where the information density of a file is of interest.

The diehard test suite

... a battery of statistical tests for measuring the quality of a random number generator. – Wikipedia

The dieharder test suite

The primary point of dieharder (like diehard before it) is to make it easy to time and test (pseudo)random number generators, both software and hardware, for a variety of purposes in research and cryptography.

Passing both the diehard and dieharder test suites is quite difficult with a home-grown pseudo-random number generator.

Tails
  • Thank you for posting the most constructive answer and not presuming I'm a random script kiddie that made something weird. (I found out afterwards that my function looks a lot like Drupal's.) I'll give you the accept for it, since people closed the question before there was a chance at direct comments on how it works. – Beanow Jan 28 '12 at 23:23
  • I've used the feedback in the thread to improve the function and added test results. Can you see if I interpreted the results correctly and found an algorithm that is almost as good as `/dev/urandom`? (Besides that my way is much slower and could possibly have a predictability factor?) – Beanow Jan 30 '12 at 12:44
  • Those tests are nice if you just need a PRNG for simulations, but they are in no way sufficient to show cryptographic strength of a PRNG. – CodesInChaos Feb 13 '12 at 22:08