"modern" because that definition may change over time (and specifically I mean desktop browsers)
"handle" because that may vary depending on machine configurations/memory, but specifically I mean a general use case.
This question came to mind over a particular problem I'm trying to solve involving large datasets.
Essentially, whenever a change is made to a particular dataset I get the full dataset back and I have to render this data in the browser.
So for example, over a WebSocket I get a push event telling me a dataset has changed, and then I have to render that dataset to HTML by grabbing an existing DOM element, duplicating it, populating the duplicate with data from the set (matching on class names or other element identifiers), and adding it back to the DOM.
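To make that concrete, here is roughly what the clone-and-populate step looks like. This is a minimal sketch; the `#row-template` and `#container` elements, the class names, and the data fields are all hypothetical stand-ins:

```javascript
// Minimal sketch of the clone-and-populate approach described above.
function renderItem(item) {
  const template = document.querySelector('#row-template');
  const node = template.cloneNode(true);            // deep copy, children included
  node.removeAttribute('id');                       // avoid duplicate ids
  node.querySelector('.name').textContent = item.name;
  node.querySelector('.value').textContent = item.value;
  return node;
}

function renderDataset(dataset) {
  const container = document.querySelector('#container');
  container.innerHTML = '';                         // throw away the old render
  for (const item of dataset) {
    container.appendChild(renderItem(item));        // one append per item
  }
}
```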
Keep in mind that any object (JSON) in this dataset may have as many as 1,000+ child objects, and there may be as many as 10,000+ parent objects, so as you can see the returned dataset may contain upwards of 1,000,000 to 10,000,000 data points or more.
Now the fun part comes when I have to render this stuff. For each data point there may be 3 or 4 tags used to render and style it, and there may be event listeners on any of these tags (or on the parent container, to lighten things up using delegation).
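The delegation I have in mind is the standard single-listener pattern, sketched here with hypothetical `#container`/`.row` markup:

```javascript
// One listener on the parent container instead of one per rendered element.
document.querySelector('#container').addEventListener('click', (event) => {
  const row = event.target.closest('.row');   // find the enclosing row, if any
  if (!row) return;                           // click landed outside any row
  console.log('clicked row', row.dataset.id); // handle all row clicks in one place
});
```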
To sum it all up, there can be a lot of incoming information that needs to be rendered and I'm trying to figure out the best way to handle this scenario.
Ideally, you'd just want to render the changes for the single data point that changed rather than re-rendering the whole set, but this may not be an option due to how the backend was designed.
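For what it's worth, if each object carried a stable id, I imagine the incremental update would look something like this (the `data-id` attribute and field names are hypothetical, and `renderItem` is the sketch from above):

```javascript
// Touch only the node for the changed object instead of re-rendering everything.
function applyChange(change) {
  const row = document.querySelector(`[data-id="${change.id}"]`);
  if (row) {
    row.querySelector('.value').textContent = change.value;                // update in place
  } else {
    document.querySelector('#container').appendChild(renderItem(change));  // brand-new item
  }
}
```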
My main concern here is understanding the limitations of the browser/DOM and looking at this problem through the lens of the frontend. There are some changes that should happen on the backend for sure (data design, caching, pagination), but that isn't the focus here.
This isn't a typical use case for HTML/the DOM, and I know there are limitations, but what exactly are they? Are we still capped out at about 3,000-4,000 elements?
I've got a number of related subquestions that I'm actively looking into, but I thought it'd be nice to share some thoughts with the rest of the Stack Overflow community and try to pool some information together about this issue.
What is "reasonable" amount of DOM elements that a modern browser can handle before it starts becoming slow/non-responsive?
How can I benchmark the number of DOM elements a browser can handle?
What are some strategies for handling large datasets that need to be rendered (besides pagination)?
Are templating frameworks like mustache and handlebars more performant for rendering html from data/json (on the frontend) than using jQuery or Regular Expressions?
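On the benchmarking question, this is the kind of measurement I have in mind: append N throwaway elements, force a layout, and time it. A rough sketch only; the element counts and markup are arbitrary, and absolute numbers will vary by machine and browser:

```javascript
// Rough benchmark: append `count` elements and time how long the render takes.
function benchmark(count) {
  const container = document.createElement('div');
  document.body.appendChild(container);
  const start = performance.now();
  const fragment = document.createDocumentFragment();
  for (let i = 0; i < count; i++) {
    const el = document.createElement('div');
    el.textContent = 'item ' + i;
    fragment.appendChild(el);
  }
  container.appendChild(fragment);  // single insertion into the live DOM
  void container.offsetHeight;      // force layout so rendering is included in the timing
  console.log(count + ' elements: ' + (performance.now() - start).toFixed(1) + ' ms');
  container.remove();
}

[1000, 10000, 100000].forEach(benchmark);
```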
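And on strategies besides pagination, one option seems to be rendering in chunks on animation frames so the page stays responsive (or, going further, virtualizing so that only the visible rows live in the DOM at all). A minimal sketch of the chunked approach, reusing the hypothetical `renderItem` from above; the chunk size is just a starting point to tune:

```javascript
// Build the DOM a chunk at a time, yielding to the browser between chunks.
function renderInChunks(dataset, container, chunkSize = 500) {
  let index = 0;
  function renderChunk() {
    const fragment = document.createDocumentFragment();
    const end = Math.min(index + chunkSize, dataset.length);
    for (; index < end; index++) {
      fragment.appendChild(renderItem(dataset[index]));
    }
    container.appendChild(fragment);        // one DOM insertion per chunk
    if (index < dataset.length) {
      requestAnimationFrame(renderChunk);   // let the browser paint before continuing
    }
  }
  renderChunk();
}
```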