The Goal:
Optimize a web page that needs to display a large amount of text data (25 MB+) in a textarea or contenteditable div, without losing too much performance.
Currently, loading a 10 MB file takes ~3 seconds in Chrome. I would like this to be 1 second max.
Ideas and Attempts:
I am loading local text files from a user's computer using <input type="file"> and have no problem loading large files directly into memory. However, as soon as I try to display this text data in a textarea, I naturally run into performance issues.
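For context, the loading step itself is roughly the following (a minimal sketch; the "file-input" id and the fileText variable are my own naming) and performs fine:

var fileInput = document.getElementById('file-input'); // assumes <input type="file" id="file-input">
var fileText = '';

fileInput.addEventListener('change', function () {
    var reader = new FileReader();
    reader.onload = function (e) {
        // The entire file contents end up in memory as one string
        fileText = e.target.result;
    };
    reader.readAsText(fileInput.files[0]);
});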
I have spellcheck, auto-capitalization, and autocomplete all disabled, which certainly helped; however, I would like to minimize the lag when rendering large files (greater than 10 MB; up to 100 MB would be ideal).
<textarea autocomplete="off" autocorrect="off" autocapitalize="off" spellcheck="false">
One idea I had was to only render, say, 100 lines before and 100 lines after the current line, and when the user scrolls the textarea, swap out what data is being displayed. I can swap a few hundred lines without any noticeable lag, but hundreds of thousands locks up the entire page.
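Pre-splitting the loaded text into fixed-size line chunks would at least make that lookup cheap (a sketch building on the fileText variable above; 300 lines per chunk matches the proposal at the end):

// Split the loaded text into 300-line chunks so the scroll
// handler only has to index into an array.
var chunkSize = 300;
var lines = fileText.split('\n');
var chunk = [];
for (var i = 0; i < lines.length; i += chunkSize) {
    chunk.push(lines.slice(i, i + chunkSize).join('\n'));
}
var totalChunks = chunk.length;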
I was also looking at projects such as CodeMirror, which is used in some JavaScript-based text editors and in Chrome DevTools. However, a quick test showed similar performance issues when initially loading large amounts of text.
Another idea was to use highlight.js to render the text as DOM elements, but I was also noticing large amounts of lag when dealing with thousands of DOM elements.
This site seems to tackle a similar problem by dynamically creating and displaying the DOM elements, instead of attempting to render everything at once:
I have found that if the number of records in the grid becomes more than just a few thousand, the grid gets very slow because the rendering speed is directly related to the number of nodes in the DOM. If the number of nodes in the DOM is more than 40-50k (depending on your computer configuration and amount of memory), your browser will crash or become unresponsive.
So, I decided to set out on a quest to do two things: (1) dynamically create records as the user scrolls, (2) optimize the grid to handle large data sets. After a few weeks of work, the grid was optimized and ready for testing.
I think this is similar to my first idea, but I have not tried this approach yet.
I'm hoping someone who's had experience with something similar might be able to offer some advice on which path to take, or suggest some additional ideas.
Before anyone asks: showing this data is not optional, it needs to be user-editable, and it does not need to highlight code or show line numbers. Lastly, the entire text file is being loaded with a FileReader into a variable.
I would like to avoid uploading the file to my web server if possible, for end-user privacy and NDA reasons.
Server config: Ubuntu 16.04 LAPP stack with Laravel 5.4, though I am open to Node.js solutions. Use of jQuery is allowed.
Proposed Solution(s):
Lazy Loading - only displaying, say, 300-line "chunks" at a time as the user scrolls. The scrollbar would need to be made the appropriate height ahead of time in this case, though. These "chunks" should also be unloaded as the user scrolls on, to reduce the total DOM rendering load.
Pseudocode:
// Index of the chunk the viewport is currently in
var c_chunk = Math.floor(scrollPos / scrollHeight * totalChunks);
// Previous + current + next chunk. (Note: chunk[c_chunk--] and
// chunk[c_chunk++] would mutate the index mid-expression.)
var chunk_block = chunk[c_chunk - 1] + chunk[c_chunk] + chunk[c_chunk + 1];
my_textarea.val(chunk_block);
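Fleshed out, the scroll handler could look something like the following. This is only a sketch under my own assumptions, not a tested implementation: it reuses chunk/totalChunks from the splitting sketch above, assumes a fixed line height, and uses hypothetical #scroller/#spacer/#editor elements, since a plain textarea owns its own scrollbar and can't simply be given a taller one.

// Assumed markup:
//   <div id="scroller" style="overflow-y: scroll; height: 400px;">
//     <div id="spacer"></div>  <!-- sized up front to the full document height -->
//   </div>
//   <textarea id="editor"></textarea>  <!-- only ever holds ~3 chunks -->
var lineHeight = 16; // px; assumes fixed-height lines
var scroller = $('#scroller');
var my_textarea = $('#editor');

// Size the spacer ahead of time so the scrollbar reflects the whole file
$('#spacer').height(lines.length * lineHeight);

var lastChunk = -1;
scroller.on('scroll', function () {
    var c_chunk = Math.floor(scroller.scrollTop() / scroller.prop('scrollHeight') * totalChunks);
    if (c_chunk === lastChunk) return; // still inside the same chunk, nothing to swap
    lastChunk = c_chunk;
    // Load previous + current + next chunk; everything else stays unloaded
    var chunk_block = [chunk[c_chunk - 1], chunk[c_chunk], chunk[c_chunk + 1]]
        .filter(function (c) { return c !== undefined; })
        .join('\n');
    my_textarea.val(chunk_block);
});

The swap itself should stay cheap, since the textarea never holds more than ~900 lines; the open question is wiring the spacer's scroll position to the textarea convincingly enough that it feels like one continuous document.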
Any thoughts on this method?
Thanks all.