134

Context: I'm building a little site that reads an RSS feed and updates/checks the feed in the background. I have one array to store data to display, and another which stores IDs of records that have been shown.

Question: How many items can an array hold in JavaScript before things start getting slow or sluggish? I'm not sorting the array, but I am using jQuery's inArray function to do a comparison.

The website will be left running and updating, and it's unlikely that the browser will be restarted or refreshed very often.

If I should think about clearing some records from the array, what is the best way to remove them once a limit is reached, say 100 items?
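
Roughly, this is the pattern I have in mind (a simplified sketch; items and seenIds are placeholder names, not my actual code):

var items = [];    // feed data to display
var seenIds = [];  // IDs of records that have already been shown

function addRecord(record) {
    // jQuery's $.inArray returns -1 when the value isn't in the array
    if ($.inArray(record.id, seenIds) === -1) {
        seenIds.push(record.id);
        items.push(record);
    }
}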

bren
addedlovely
  • 6
    You will probably run into more problems with the browser leaking memory from toolbars than from the JS code. :) Firefox 4 I point my finger at you. – epascarello May 27 '11 at 16:18
  • 2
    How often are you checking the array (ex 2s interval)? What constitutes sluggish (ex >500ms)? What order of magnitude is your array (ex thousands, millions, billions)? – zzzzBov May 27 '11 at 16:20
  • 2
    do benchmark testing with http://jsperf.com/ – VirtualTroll May 27 '11 at 16:27
  • I'll be checking and updating the array every minute. And yes, sluggish would be a performance hit that starts affecting that load and check, and other animations on the page - hard to define, sorry! – addedlovely May 29 '11 at 06:24
  • @Amine thanks for the link, looks like that website will be my new best friend :) – addedlovely May 29 '11 at 06:26

7 Answers

193

The maximum length until "it gets sluggish" is totally dependent on your target machine and your actual code, so you'll need to test on that (those) platform(s) to see what is acceptable.

However, the maximum length of an array according to the ECMA-262 5th Edition specification is bound by an unsigned 32-bit integer due to the ToUint32 abstract operation, so the longest possible array could have 2^32 − 1 = 4,294,967,295 ≈ 4.29 billion elements.
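
A quick sketch to see that boundary in practice (this only sets the length property, so nothing is actually allocated):

var a = [];
a.length = Math.pow(2, 32) - 1;   // 4,294,967,295 – the largest length an array may have
console.log(a.length);            // 4294967295 (the array is sparse; no elements are stored)

try {
  a.length = Math.pow(2, 32);     // one past the limit
} catch (e) {
  console.log(e.name);            // "RangeError" – invalid array length
}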

maerics
  • 15
    @Barkermn01: the ECMA-262 5th Edition specification uses the abstract operation *ToUint32* for checking the length of an array on any operation that modifies its length, so I think the underlying architecture of the machine (or web browser) is irrelevant. – maerics May 27 '11 at 16:31
  • 1
    Hrm, nice, just read that one. Awesome, 64-bit browsers are flaming pointless then. – Barkermn01 May 27 '11 at 16:41
  • 4
    @Barkermn01, 64-bit browsers still have a lot of other improvements. Remember that being a JavaScript interpreter isn't the only thing a browser does. – Razor Storm May 27 '11 at 17:04
  • 2
    Wowzer, wouldn't have expected it to be that high. OK, nice, I think I'll be fine! – addedlovely May 29 '11 at 06:25
  • 1
    Actually an array can have at most 4294967295 (2^32−1) elements. See http://stackoverflow.com/a/12766547/396458 – NullUserException Oct 07 '12 at 06:34
  • Just adding, `Float32Array` has a maximum length of `2^29-2` on my 32-bit machine. P.S. Does anyone have a clue why that number? – Robbie Wxyz Nov 27 '13 at 03:18
  • @SuperScript: The reason for this is very likely the same reason I'm putting in this restriction in my JavaScript implementation, but it doesn't have to do with the array. I store names of properties as an integer. Properties on JavaScript objects have three attributes: DontEnum, DontDelete and ReadOnly. By storing the names as integers, I can store these properties as part of the name. This is where the 29 available bits come from. Names are stored as negative numbers, and array indexes as positive, so the index is signed and the sign also takes a bit (the -2). – Pieter van Ginkel Dec 07 '13 at 13:23
  • @SuperScript I think there is no deeper justification for this. But you cannot allocate that much memory anyway. `ArrayBuffer` should have a maximum size of `2^32-1`, so `Float32Array` could have up to `2^30-1` elements. But the current spidermonkey restricts the length of the buffers to `2^31-1` and additionally it restricts the length of the views to one element less than the possible maximum. I suppose this is due to internal restraints. For example sometimes -1 is used when the length wasn't specified. And a value of `2^31-1` seems inconvenient since incrementing it results in an overflow. – Robert Jun 23 '14 at 17:09
  • It may be that the specification limits the max length to 2^32, but [this](http://jsfiddle.net/e2uvC/) bombs out with a too much memory error. I ran this on a system with >3 GB of free physical and oodles more of swap. – Allen Aug 04 '14 at 01:08
  • Note that V8 currently has [a bug](https://bugs.chromium.org/p/v8/issues/detail?id=3505) (due to some old garbage collector code that still needs to be updated) which puts an artificial upper limit on the size of arrays, maps etc. –  May 04 '17 at 04:54
32

No need to trim the array; simply address it as a circular buffer (index % maxlen). This ensures it never grows beyond the limit: once you reach the end of the buffer you wrap around to the beginning again, so it is not possible to overrun the end of the array.

For example:

var container = new Array ();
var maxlen = 100;
var index = 0;

// 'store' 1538 items (only the last 'maxlen' items are kept)
for (var i=0; i<1538; i++) {
   container [index++ % maxlen] = "storing" + i;
}

// get element at index 11 (you want the 11th item in the array)
eleventh = container [(index + 11) % maxlen];

// get element at index 35 (you want the 35th item in the array)
thirtyfifth = container [(index + 35) % maxlen];

// print out all 100 elements that we have left in the array, note
// that it doesn't matter if we address past 100 - circular buffer
// so we'll simply get back to the beginning if we do that.
for (i=0; i<200; i++) {
   document.write (container[(index + i) % maxlen] + "<br>\n");
}
Lelanthran
  • 5
    Clever idea, but by doing this you'll potentially overwrite data, confuse indexes, and possibly cause strange behavior. – john k Oct 23 '14 at 02:31
  • 12
    The idea is to implement a ring-buffer, so yes - you are intentionally "forgetting" old data (that's what a ring buffer is used for) and that was what the questioner asked for. – Lelanthran Oct 24 '14 at 06:26
  • 1
    I was just bored-clicking around SO and found this response. love the technique with overwriting indexes as needed. – Kyle Hotchkiss Feb 06 '16 at 02:56
17

Like @maerics said, your target machine and browser will determine performance.

But for some real-world numbers, on my 2017 enterprise Chromebook, running the operation:

console.time();
Array(x).fill(0).filter(x => x < 6).length
console.timeEnd();
  • x=5e4 takes 16ms, good enough for 60fps
  • x=4e6 takes 250ms, which is noticeable but not a big deal
  • x=3e7 takes 1300ms, which is pretty bad
  • x=4e7 takes 11000ms and allocates an extra 2.5GB of memory

So around 30 million elements is a hard upper limit, because the JavaScript VM falls off a cliff at 40 million elements and will probably crash the process.


EDIT: In the code above, I'm actually filling the array with elements and looping over them, simulating the minimum of what an app might want to do with an array. If you just run Array(2**32-1), you're creating a sparse array that's closer to an empty JavaScript object with a length, like {length: 4294967295}. If you actually tried to use all those 4 billion elements, you'd definitely crash the JavaScript process.
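
To make the sparse/dense difference concrete, here's a rough sketch (exact timings and behavior vary by engine and available memory):

var sparse = Array(1e8);          // returns immediately: just an object with length = 100000000
console.log(0 in sparse);         // false – index 0 is a hole, no value is stored

var dense = Array(1e7).fill(0);   // actually materializes ten million zeros
console.log(0 in dense);          // true
// Array(2**32 - 1).fill(0) would try to materialize ~4.3 billion elements
// and will almost certainly run the process out of memory.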

Carl Walsh
8

You could try something like this to test and trim the length:

http://jsfiddle.net/orolo/wJDXL/

var longArray = [1, 2, 3, 4, 5, 6, 7, 8];

if (longArray.length >= 6) {
  longArray.length = 3;
}

alert(longArray); //1, 2, 3
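
Note that assigning to length keeps the items at the start of the array. If you'd rather keep the most recent entries (as in the feed/ID use case), one option is to drop from the front instead, for example (a sketch; seenIds is a placeholder name):

var limit = 100;
if (seenIds.length > limit) {
  // remove the oldest entries in place so only the newest `limit` remain
  seenIds.splice(0, seenIds.length - limit);
}
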
Mohammad Faisal
orolo
3

I have built a performance framework that manipulates and graphs millions of datasets, and even then the JavaScript calculation latency was on the order of tens of milliseconds. Unless you're worried about going over the array size limit, I don't think you have much to worry about.

Razor Storm
-1

It will be very browser dependent. 100 items doesn't sound like a large number - I expect you could go a lot higher than that. Thousands shouldn't be a problem. What may be a problem is the total memory consumption.

rjmunro
-1

I have shamelessly pulled some pretty big datasets into memory, and although it did get sluggish, that was with maybe 15 MB of data and upwards, with pretty intense calculations on the dataset. I doubt you will run into problems with memory unless you have intense calculations on the data and many, many rows. Profiling and benchmarking with different mock result sets will be your best bet to evaluate performance.

Stephane Gosselin