15

Is there an Array method that can replace the following function (Something like a.splice(some, tricky, arguments))?

function resize(arr, newSize, defaultValue) {
  if (newSize > arr.length)
    while(newSize > arr.length)
      arr.push(defaultValue);
  else
    arr.length = newSize;
}

If not, is there a better / nicer / shorter implementation? The function should append a default value if the array should grow and remove values on shrink.

hansmaad
  • `arr.length = newSize;` it will be filled with `undefined` – crush Aug 17 '15 at 15:29
  • @crush this does not add default values. – hansmaad Aug 17 '15 at 15:29
  • Efficiency aside, your `while` loop will push too many values onto the array. Should be `while (newSize > arr.length)` – Paul Roub Aug 17 '15 at 15:30
  • if you are going for efficiency, you don't grow/shrink the array on every push/pop. It is best to have a large-enough array and shrink when necessary or grow only when realizing the array is not large enough – blurfus Aug 17 '15 at 15:31
  • Well, there is [Array.prototype.fill](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/fill) but it's not widely supported yet. – crush Aug 17 '15 at 15:31
  • @PaulRoub, Scimonster you're right. edited... – hansmaad Aug 17 '15 at 15:31
  • Can you explain why you need this behavior in the first place? – crush Aug 17 '15 at 15:31
  • https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/splice – Brian Glaz Aug 17 '15 at 15:32
  • Why are you using code length as an efficiency metric? The only time I could see length really mattering in that way is if things start to get out of hand and get very large. The few line implementation you have looks fine and readable to me. – loganhuskins Aug 17 '15 at 15:34
  • @crush I think it's a quite common task. Actually I resize an array of objects that is bound to a table in an AngularJS app. I just wonder if there is a built-in function for that. Array has some awesome features that surprised me often :) – hansmaad Aug 17 '15 at 15:35
  • @lgh I just wondered if there is some built-in method. Something like `splice`, which can be very confusing in the beginning. – hansmaad Aug 17 '15 at 15:37
  • What I'm getting at is: instead of iterating the entire array and setting each new element's value to some default, you could instead handle the scenario where the value is undefined and display the default value until the value is set. I use Knockout rather than Angular for data-binding, so I can't give you a direct example. I do this sort of thing in Knockout all the time. I assume it's similarly possible in Angular. – crush Aug 17 '15 at 15:37
  • @crush hm no, that does not sound more elegant to me :) the current implementation isn't that bad... – hansmaad Aug 17 '15 at 15:41
  • @hansmaad You said you wanted efficiency. Iterating the array (even just the newly added elements) and filling it with a default value isn't more efficient. You need to clearly define what you consider `elegant`. – crush Aug 17 '15 at 15:42
  • If you are dead set on this method, then you might consider feature detection on page load, and providing a different method depending on whether `Array.prototype.fill` exists natively or not (don't bother using a polyfill; it would be slower than what you already have). The only other thing you could do is create a huge lookup table of predefined arrays with default values in conjunction with `Array.prototype.concat`. Again, all this really depends on what `elegant` and `efficient` mean to you. Do you mean efficient in terms of processing power, memory, lines of code, etc.? – crush Aug 17 '15 at 15:51

8 Answers

22

In terms of elegance, I would say you could trim down your original solution to:

function resize(arr, newSize, defaultValue) {
    while(newSize > arr.length)
        arr.push(defaultValue);
    arr.length = newSize;
}
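For illustration, a quick usage sketch of this function (the sample values are made up, and the function is repeated so the snippet runs standalone):

```javascript
function resize(arr, newSize, defaultValue) {
    while (newSize > arr.length)
        arr.push(defaultValue);
    arr.length = newSize;
}

const nums = [1, 2, 3];
resize(nums, 5, 0);   // grow: pushes the default value until the length matches
// nums is now [1, 2, 3, 0, 0]
resize(nums, 2, 0);   // shrink: the length setter truncates
// nums is now [1, 2]
```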

Or use prototype:

Array.prototype.resize = function(newSize, defaultValue) {
    while(newSize > this.length)
        this.push(defaultValue);
    this.length = newSize;
}

Edit: ES2015 (ES6) approach:

function resize(arr, newSize, defaultValue) {
    // slice truncates when newSize is smaller; without it this version only grows
    return [ ...arr, ...Array(Math.max(newSize - arr.length, 0)).fill(defaultValue) ].slice(0, newSize);
}
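Worth noting: unlike the mutating versions above, this returns a new array. A usage sketch (with a `slice` included so shrinking truncates as well, which is one way to cover the shrink case):

```javascript
// Non-mutating resize; the trailing slice handles the shrink case.
function resize(arr, newSize, defaultValue) {
    return [ ...arr, ...Array(Math.max(newSize - arr.length, 0)).fill(defaultValue) ].slice(0, newSize);
}

const grown = resize([1, 2, 3], 5, 0);   // [1, 2, 3, 0, 0] — original array untouched
const shrunk = resize([1, 2, 3], 2, 0);  // [1, 2]
```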
James Brierley
  • This eliminates a comparison but ensures an assignment. So, whether this is more efficient or not depends on if comparison is more costly than assignment. – crush Aug 17 '15 at 16:26
  • NOTE: this answer could lead to undesired behavior if `defaultValue` is an object. Look at [my answer](https://stackoverflow.com/a/55600806/3235404) – Reza Apr 09 '19 at 20:28
  • I do not agree with this answer. Each push to the array resizes the array, so you should first set the length and then set the default value by iterating over the remaining items. – siOnzee Oct 09 '19 at 18:26
  • What's the point of resetting arr.length when you're already changing the length in the first example? It's redundant. – BigDreamz Mar 25 '20 at 04:21
7
function resize(arr, size, defval) {
    while (arr.length > size) { arr.pop(); }
    while (arr.length < size) { arr.push(defval); }
}

I think this would be more efficient though:

function resize(arr, size, defval) {
    var delta = arr.length - size;

    // Keep the decrement/increment inside the loop body: a post-decrement in the
    // test would still change delta when the test fails, skewing the second loop.
    while (delta > 0) { arr.pop(); delta--; }
    while (delta < 0) { arr.push(defval); delta++; }
}

And while not as elegant, this would probably be the most efficient:

function resize(arr, size, defval) {
    var delta = arr.length - size;

    if (delta > 0) {
        arr.length = size;
    }
    else {
        while (delta++ < 0) { arr.push(defval); }
    }
}
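A quick sanity check of the last variant (sample values are made up, and the function is repeated so the snippet runs standalone):

```javascript
function resize(arr, size, defval) {
    var delta = arr.length - size;

    if (delta > 0) {
        arr.length = size;     // shrink in one step via the length setter
    }
    else {
        while (delta++ < 0) { arr.push(defval); }   // grow by pushing the default
    }
}

const xs = [1, 2, 3];
resize(xs, 5, 9);   // xs is now [1, 2, 3, 9, 9]
resize(xs, 2, 9);   // xs is now [1, 2]
```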
wayofthefuture
  • Read the `while` as an `if` initially, my fault. So this would work, but it's less efficient than what the OP has already put forth. Simply setting the length to a smaller value is efficient for limiting any elements beyond a certain size. – crush Aug 17 '15 at 15:45
  • These two methods are virtually equivalent. They do the same thing. `arr.length = size` is faster than manually popping each element until desired size is reached. For adding new elements, your code in both functions is equivalent to what the OP has written. You've just expressed it differently. – crush Aug 17 '15 at 15:55
  • @crush The code is a bit more efficient for adding items because it's not re-evaluating `arr.length` with every iteration. – Thriggle Aug 17 '15 at 16:00
  • Right arr.length = would be better, added another... Thanks – wayofthefuture Aug 17 '15 at 16:05
  • @Thriggle That's not true. They are [virtually equivalent](http://jsperf.com/property-access-vs-cached-value-access). Maybe in some older browser, that statement is true. When you remove that false claim, this answer's third method is exactly what the OP provided. – crush Aug 17 '15 at 16:10
  • The OP method would have to recalculate the size of the array, otherwise it wouldn't work. arr.length changes on every push, so how can that be cached – wayofthefuture Aug 17 '15 at 16:17
  • @Dude2TheN `arr.length` is a property, not a method. It automatically gets updated when you call `Array.prototype.push`, and NOT when you explicitly reference. Your code is equivalent functionally; you have just expressed it differently. – crush Aug 17 '15 at 16:20
  • @crush Chrome's mysterious optimizations produce inconsistent jsperf results. Simply changing the order in which your tests are evaluated made the cached test more efficient. http://jsperf.com/property-access-vs-cached-value-access/2 – Thriggle Aug 17 '15 at 16:26
  • @Thriggle You omitted the teardown code. Take a look at [revision 3](http://jsperf.com/property-access-vs-cached-value-access/3) – crush Aug 17 '15 at 16:29
  • @Dude2TheN That's a good question. At a glance, I don't see why it should make any difference at all. It's purpose is obviously to make sure both tests are equal in every way. It could be something to do with JSPerfs benchmarking tools. Anyone have any idea why the teardown code would make such a difference? It's just setting the initial values to be the same before each test. – crush Aug 17 '15 at 16:36
  • Gotcha.... I think you're right though, now I see there is no difference. I have a huge application filled with: `(var i = 0, len = arr.length; i < len; i++)` .... oh well, all for nothing. – wayofthefuture Aug 17 '15 at 16:39
  • @Crush, very interesting! Would you be willing to describe what your teardown code is accomplishing and why that affects the test results? I've just always operated under the model of caching references to array.length, based on what I'd read [here](http://bonsaiden.github.io/JavaScript-Garden/#array.general). Edit: Ah, it seems you were mistaking it for the setup code? – Thriggle Aug 17 '15 at 16:39
  • i think they fixed that in ES5 or something – wayofthefuture Aug 17 '15 at 16:40
  • @Thriggle I believe that caching array.length used to be a worthy enhancement in older browsers. They finally realized that instead of calculating the length on every reference to length, they could just set the length and store it internally on each modification of the array size. I'm a bit stumped on why the teardown is making such a huge difference in this case. It must have something to do with how the JSPerf site runs the tests. The teardown is just setting all the initial values to be exactly the same between tests. Considering the values, though, I don't see why there is a difference. – crush Aug 17 '15 at 16:44
  • u never know what happens in mysterious browser-land! maybe better to cache :) – wayofthefuture Aug 17 '15 at 16:49
  • @Dude2TheN It's true, caching doesn't hurt anything, and the entire idea behind encapsulating this logic into a function is so that the function call itself will be clean. It may be safer for backwards compatibility to cache. Joe's Custom Browser might not implement arrays the way most modern browsers do. – crush Aug 17 '15 at 16:51
  • @Crush, curiously, after updating the teardown to avoid inadvertently introducing global variables, I still see a [big performance difference](http://jsperf.com/property-access-vs-cached-value-access/8) on the latest version of Firefox. I think Chrome is just more aggressive with its optimizations. – Thriggle Aug 17 '15 at 17:33
  • @Thriggle Interesting. I wonder if you attached them directly to `window` (explicitly global), and gave them fairly unique names, what the result would be. – crush Aug 17 '15 at 18:51
3

For one-liner-lovers

If you are a one-liner-lover like me, what you're asking for is what my resize_array_right does.


const resize_array_left = (array, length, fill_with) => (new Array(length)).fill(fill_with).concat(array).slice(-length);
// Pads on the left when growing an array.
// Drops left elements first when shrinking an array.

const resize_array_right = (array, length, fill_with) => array.concat((new Array(length)).fill(fill_with)).slice(0, length);
// Pads on the right when growing an array.
// Drops right elements first when shrinking an array.

You can find it on NPM as resize-array.
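For example (hypothetical values; both helpers return new arrays rather than mutating):

```javascript
const resize_array_left = (array, length, fill_with) => (new Array(length)).fill(fill_with).concat(array).slice(-length);
const resize_array_right = (array, length, fill_with) => array.concat((new Array(length)).fill(fill_with)).slice(0, length);

resize_array_right([1, 2], 4, 0);     // [1, 2, 0, 0]  (pads on the right)
resize_array_left([1, 2], 4, 0);      // [0, 0, 1, 2]  (pads on the left)
resize_array_right([1, 2, 3], 2, 0);  // [1, 2]        (drops right elements)
resize_array_left([1, 2, 3], 2, 0);   // [2, 3]        (drops left elements)
```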


Browser compatibility

  • Chrome: 45 and above.
  • Firefox: 31 and above.
  • Internet Explorer: ✘.
  • Edge: ✔.
2

You can use the following code for resizing arrays:

// resizing function for arrays
Array.prototype.resize = function( newSize, defaultValue )
{
  while( newSize > this.length )
  {
    // Guard against null: typeof null is also "object", but Object.create( null )
    // would silently turn a null default into an empty object.
    if ( defaultValue !== null && typeof( defaultValue ) === "object" )
      this.push( Object.create( defaultValue ) );
    else
      this.push( defaultValue );
  }
  this.length = newSize;
}

I checked the type of defaultValue because if it is an object and you just push it into the new elements, you will end up with an array whose new elements all point to the same object. That means if you change a property of that object through one of the array's elements, all the others will change too. But if your defaultValue is a primitive, you can safely push it into the new elements.
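To illustrate the difference (a small standalone sketch; Object.create gives each element its own object with defaultValue as its prototype, so an own-property assignment on one element no longer shows up on the others):

```javascript
// Pushing the same object reference: a change through one element is visible through all.
const shared = { x: 1 };
const plain = [shared, shared];
plain[0].x = 9;
// plain[1].x is now 9 too

// Pushing Object.create(defaultValue): each element shadows the prototype on assignment.
const defaults = { x: 1 };
const separated = [Object.create(defaults), Object.create(defaults)];
separated[0].x = 9;
// separated[1].x is still 1 (read from the prototype)
```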

Reza
1

This solution only works in newer browsers (not IE) because it uses Array.prototype.fill (see Browser compatibility at the bottom of the linked page), unless you use a polyfill.

function resize(arr, newSize, defaultValue) {
    var originLength = arr.length; // cache original length

    arr.length = newSize; // resize array to newSize

    (newSize > originLength) && arr.fill(defaultValue, originLength); // Use Array.prototype.fill to insert defaultValue from originLength to the new length
}
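A usage sketch with made-up values (the function is repeated so the snippet runs standalone; note that fill also assigns the holes created by growing the length):

```javascript
function resize(arr, newSize, defaultValue) {
    var originLength = arr.length;
    arr.length = newSize;
    (newSize > originLength) && arr.fill(defaultValue, originLength);
}

const a = [1, 2, 3];
resize(a, 5, 0);   // a is now [1, 2, 3, 0, 0]
resize(a, 2, 0);   // a is now [1, 2]
```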
benathon
Ori Drori
  • It's ES6, so it won't be compatible with all browsers for years. – wayofthefuture Aug 17 '15 at 16:21
  • It works in Firefox and Safari (comp table - can't check), and will work in the next version of Chrome. Maybe Edge too. However, I've started the answer with a disclaimer because it doesn't yet work in all browsers. – Ori Drori Aug 17 '15 at 16:25
1

Expanding on James's solution:

Array.prototype.resize = function(newSize, defaultValue) {
    while(newSize > this.length)
        this.push(defaultValue);
    this.length = newSize;
}

If you want to get even more efficient, you could do feature detection for Array.prototype.fill and use that instead of the while loop.

if (Array.prototype.fill) {
    Array.prototype.resize = function (size, defaultValue) {
        var len = this.length;

        this.length = size;

        if (this.length - len > 0)
            this.fill(defaultValue, len);
    };
} else {
    Array.prototype.resize = function (size, defaultValue) {
        while (size > this.length)
            this.push(defaultValue);

        this.length = size;
    };
}

If someone has included a polyfill for Array.prototype.fill, then you want them to use your non-fill version instead. A polyfill would cause the fill method to be slower than the non-fill version.

This StackOverflow Q&A deals with how to detect if a function is natively implemented. You could work that into the initial condition, but that is just additional speed lost.

I'd probably only use this solution if you could ensure that no Array.prototype.fill would exist.
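A common (if imperfect) heuristic for such a native check is to look for "[native code]" in the function's string form; a sketch, with the caveat that bound functions and some proxies can fool it:

```javascript
// Heuristic native check: most engines stringify built-ins as
// "function fill() { [native code] }".
function isNativeFunction(fn) {
    return typeof fn === 'function' &&
           /\[native code\]/.test(Function.prototype.toString.call(fn));
}

// e.g. install the fill-based resize only when fill is genuinely native:
var useFill = isNativeFunction(Array.prototype.fill);
```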

crush
0

It is much more efficient to resize an array once than to do a large number of `.push` calls. This can be accomplished through the setter on Array.prototype.length:

function resize(arr, newLength, defaultValue) {
  const oldLength = arr.length;
  arr.length = newLength;
  if (newLength > oldLength && typeof(defaultValue) !== 'undefined') {
    for (let i = oldLength; i < newLength; i++) {
      arr[i] = defaultValue;
      // Note: this will create many references to the same object, which
      // may not be what you want. See other answers to this question for 
      // ways to prevent this.
    }
  }
}
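One nuance of this version: if defaultValue is omitted, the new slots are left as holes rather than filled. A small sketch (function repeated so the snippet runs standalone):

```javascript
function resize(arr, newLength, defaultValue) {
  const oldLength = arr.length;
  arr.length = newLength;
  if (newLength > oldLength && typeof(defaultValue) !== 'undefined') {
    for (let i = oldLength; i < newLength; i++) {
      arr[i] = defaultValue;
    }
  }
}

const withDefault = [1];
resize(withDefault, 3, 7);   // [1, 7, 7]

const noDefault = [1];
resize(noDefault, 3);        // length 3, but indexes 1 and 2 are holes
// (1 in noDefault) === false
```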
MSmedberg
0

Suggestion:

let expectedLength = 500
myArray = myArray.filter((element, index) => index < expectedLength)
// return the boolean comparison itself; returning the element would also drop
// falsy values (0, '', null, false) that fall inside the first expectedLength items
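For comparison, Array.prototype.slice does the same truncation in one call (like filter it returns a new array, and neither variant grows the array or adds defaults):

```javascript
const expectedLength = 3;          // hypothetical target size
let myArray = [1, 2, 3, 4, 5];
myArray = myArray.slice(0, expectedLength);
// myArray is now [1, 2, 3]
```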
  • Your answer could be improved with additional supporting information. Please [edit] to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – mufazmi Jul 06 '22 at 23:23