
For a project I'm working on, I needed a JavaScript function that returns a random number in a given range, without repeating itself until the whole range is 'depleted'. As I couldn't find an existing solution, I wrote one myself.

The function also requires an id to be passed. This way, if you need multiple random sequences, each with its own history, the id keeps track of them all.

The function works; however, I need some advice:

  1. Is this the 'proper' way to achieve what I want to achieve?
  2. How fast will $.inArray() perform with very large range (maxNum) values? I have a feeling that large numbers will slow the function down, as it keeps generating random numbers until it finds one that is still 'valid' (i.e. not in the history array), but I can't figure out another way to do this.
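To put rough numbers on question 2, here is a small sketch (assuming plain `Array#indexOf` in place of jQuery's `$.inArray`; the function name is illustrative) that counts how many random draws the retry approach needs to fill a whole range:

```javascript
// Count the total number of random draws the retry-until-unique
// approach needs to produce every value in [0, maxNum).
function countRetries(maxNum) {
    var history = [], totalDraws = 0;
    for (var n = 0; n < maxNum; n++) {
        var current;
        do {
            current = Math.floor(Math.random() * maxNum);
            totalDraws++;
        } while (history.indexOf(current) !== -1); // linear scan, like $.inArray
        history.push(current);
    }
    return totalDraws; // grows roughly like maxNum * ln(maxNum)
}

console.log(countRetries(100)); // always >= 100, typically several hundred
```

The last few values are the expensive ones: once only one number is missing, each draw has just a 1-in-maxNum chance of hitting it.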

The script:

var UniqueRandom = {
    NumHistory: [],
    generate: function (maxNum, id) {
        if (!this.NumHistory[id]) this.NumHistory[id] = [];
        if (maxNum >= 1) {
            var current = Math.round(Math.random() * (maxNum - 1)), x = 0;
            if (maxNum > 1 && this.NumHistory[id].length > 0) {
                if (this.NumHistory[id].length !== maxNum) {
                    while ($.inArray(current, this.NumHistory[id]) !== -1) {
                        current = Math.round(Math.random() * (maxNum - 1));
                        x = x + 1;
                    }
                    this.NumHistory[id].push(current);
                } else {
                    //reset
                    this.NumHistory[id] = [current];
                }
            } else {
                //first time only
                this.NumHistory[id].push(current);
            }
            return current;
        } else {
            return maxNum;
        }
    },
    clear: function (id) {
        this.NumHistory[id] = [];
    }
};

Usage would be (100 being the range (0–99) and the_id being, well, the id):

UniqueRandom.generate(100, 'the_id');

I have set up a Fiddle with a demo.

Klaas Leussink

5 Answers

4
  1. It's not best practice. IMO it would be better to instantiate an object per series of numbers that needs to be generated.
  2. I'd suggest generating an array of all possible values and shuffling it. Then you can just pop off it.
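A minimal sketch of that shuffle-then-pop idea (names are illustrative, not from the answer):

```javascript
// Build the pool 0..max-1, shuffle it once (Fisher-Yates), then
// pop() one unique random value per request in O(1).
function makePool(max) {
    var pool = [];
    for (var i = 0; i < max; i++) pool.push(i);
    for (var k = pool.length - 1; k > 0; k--) {
        var j = Math.floor(Math.random() * (k + 1));
        var tmp = pool[k];
        pool[k] = pool[j];
        pool[j] = tmp;
    }
    return pool;
}

var pool = makePool(5);
console.log(pool.pop()); // a unique random number from 0..4
```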
Alex
  • +1; I would also advise not to use `Math.round`: https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Math/random#Examples – Julien Royer Dec 11 '12 at 09:48
  • That array of possible values, then popping it, makes a lot more sense indeed. So if I understand you correctly, I would have to rewrite the function as suggested by Jack, with the 'history' variable being a pre-filled array (which is populated on the first instance)? – Klaas Leussink Dec 11 '12 at 10:00
3

I took Jack's code and adapted it to work with the shuffled-array popping method.

function fisherYates(myArray) {
    var i = myArray.length;
    if (i === 0) return false;
    while (--i) {
        var j = Math.floor(Math.random() * (i + 1));
        var temp = myArray[i];
        myArray[i] = myArray[j];
        myArray[j] = temp;
    }
}

function RandomGenerator(maxNum) {
    this.max = maxNum;
    this.initRandomArray();
}

RandomGenerator.prototype.generate = function() {

    // if no more numbers available generate new array
    if( this.left === 0 ) this.initRandomArray();

    this.last = this.arr.pop();
    this.left = this.arr.length;
    this.history.push( this.last );
    return this.last;
}

RandomGenerator.prototype.initRandomArray = function() {

    this.arr = [];
    this.history = [];
    this.last = null;
    this.left = this.max;

    for( var i = 0; i < this.max; i++ ) {
        this.arr.push( i );
    }

    fisherYates( this.arr );

}

var mygen = new RandomGenerator(100);
console.log( mygen.generate() );

I got the fisherYates algorithm from here.

The approach of generating a new random number if it is already found in a history object will result in unnecessary looping.

Fiddle here

Bruno
  • Thanks! This is really useful. The only problem is that I need the history to be accessible for other usage. In this case, the 'arr' would be fine, as it is in fact an 'inverted history', correct? The only thing I still need is for the function to reset itself once all the unique numbers have been returned, so I think I need a check on the array length, re-filling it when it reaches 0, correct? – Klaas Leussink Dec 11 '12 at 11:09
  • @c_kick Added a history object as well as a new method `initRandomArray` that allows you to re-initialise the array if necessary. – Bruno Dec 11 '12 at 11:33
  • Noticed your edits, really appreciate the effort! This is exactly what I need. Edit: scratch that comment on undefined, I made an error. Thanks a lot for your merging of all the great answers below! Thanks Alex, Jack and Bruno! – Klaas Leussink Dec 11 '12 at 11:35
  • @c_kick I have added a fiddle so you can see it in action. – Bruno Dec 11 '12 at 11:38
1

I tend to think it is indeed not the most efficient. I don't immediately get the `//first time only` branch.
Furthermore, you can make the code more readable by dropping the else branch and inverting the condition into an early return, e.g.:

if (maxNum >= 1) {
    //code
} else {
    return maxNum;
}

becomes

if (maxNum < 1) { // or maybe even if maxNum == 0
    return maxNum;
}

//code

Also, your x variable seems to be redundant: it is incremented but never read.

EricG
1

I would probably implement it like this, using actual instances of random generators. This keeps the history of each generator separated.

function RandomGenerator(maxNum)
{
    this.max = maxNum;
    this.history = {};
    this.histn = 0;
}

// generate random number in range [0..maxNum)
RandomGenerator.prototype.generate = function()
{
    var value;

    if (this.histn == this.max ) {
        return false;
    }

    do {
        value = Math.floor(Math.random() * this.max );
    } while (this.history[value]);

    this.history['' + value] = 1;
    ++this.histn;

    return value;
}

var mygen = new RandomGenerator(100);
console.log(mygen.generate());

In my implementation I'm choosing a plain object for the history instead of an array; testing whether a value has been generated before is done by testing a property instead of $.inArray().
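To illustrate the difference, a small sketch (with `Array#indexOf` standing in for `$.inArray`):

```javascript
// Membership testing: an object property lookup is O(1) on average,
// while scanning an array is O(n) in the number of stored values.
var seenObj = {};
var seenArr = [];

for (var i = 0; i < 1000; i++) {
    seenObj[i] = 1;  // hash-style insert
    seenArr.push(i); // array insert
}

var inObj = seenObj[999] === 1;          // constant-time property test
var inArr = seenArr.indexOf(999) !== -1; // scans up to 1000 elements

console.log(inObj, inArr); // true true
```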

Ja͢ck
  • This seems to work very well! I have combined it with my fiddle: http://jsfiddle.net/DMm5B/. Do you think rewriting it to conform to Alex's suggestion of pre-filled arrays with all possible values, popping one off at a time, will deliver better performance still? – Klaas Leussink Dec 11 '12 at 10:26
  • I think that `this.histn == maxNum` should be `this.histn == this.max`. – Bruno Dec 11 '12 at 10:27
  • Noticed that too, see my fiddle – Klaas Leussink Dec 11 '12 at 10:33
0

I agree with Alex that in most use cases, you'd want to generate an array of all values, shuffle them, and then pop them as you need them instead.

Here is an example:

var getShuffledUniqueRandoms = function(count, suffix) {
    var values = [];

    for (var i = 1; i < count+1; i++) {
        values.push(i + suffix);
    }

    // Shuffle function originally from:
    //+ Jonas Raoni Soares Silva
    //@ http://jsfromhell.com/array/shuffle [v1.0]

    return (function(o){ //v1.0
        for(var j, x, i = o.length; i; j = parseInt(Math.random() * i), x = o[--i], o[i] = o[j], o[j] = x);
        return o;
    })(values);
}

var values = getShuffledUniqueRandoms(10, "index");


$('button').click(function() {

    if (values.length == 0) {
        $('body').append('<p>Out of values.</p>');
    } else {
        $('body').append('<p>' + values.pop() + '</p>');
    }
});

FIDDLE

This algorithm has a bigger upfront cost, but its completion time is known (roughly O(n)).

With the algorithm where you are constantly checking to see if a random value is in an array, it'll get slower and slower with each new iteration.

Now if your data set is always relatively small, your algorithm could work a little better, but anything larger than 10 or so and it starts losing its edge.

samanime