I'm planning to write some code for encrypting files locally in JavaScript. For large files and large key sizes, the CPU usage is (naturally) pretty high. In a single-script design, this often hangs the browser until the task is complete.
To improve responsiveness and let users do other things in the meantime, I want to try to make the script 'friendlier' to the user's PC. The encryption process will read a file as a binary string and then encrypt the string in chunks (something like 1 KB per chunk - needs testing). I want to try using HTML5 Web Workers to make the whole thing as incremental as possible. Something like:
- Spawn worker
- Send worker a binary data chunk
- Worker completes encryption, passes back new chunk
- Worker dies.
This might also help on multicore processors, by keeping multiple workers alive at once.
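Roughly what I have in mind, as a minimal sketch - `encrypt-worker.js`, `encryptFile()` and the placeholder `encryptChunk()` are just stand-ins for whatever cipher and structure I end up using:

```javascript
// main.js - rough sketch of the spawn / send chunk / receive / terminate cycle.
var CHUNK_SIZE = 1024; // 1 KB per chunk - needs testing

function encryptFile(file, onDone) {
  var reader = new FileReader();
  reader.onload = function () {
    var data = new Uint8Array(reader.result);
    var encryptedChunks = [];
    var offset = 0;
    var worker = new Worker('encrypt-worker.js');

    worker.onmessage = function (e) {
      encryptedChunks.push(e.data);       // encrypted chunk comes back
      offset += CHUNK_SIZE;
      if (offset < data.length) {
        worker.postMessage(data.slice(offset, offset + CHUNK_SIZE));
      } else {
        worker.terminate();               // worker dies once the file is done
        onDone(encryptedChunks);
      }
    };

    worker.postMessage(data.slice(0, CHUNK_SIZE)); // kick off the first chunk
  };
  reader.readAsArrayBuffer(file);
}
```

```javascript
// encrypt-worker.js - encrypts one chunk per message and posts it back.
self.onmessage = function (e) {
  var chunk = new Uint8Array(e.data);
  self.postMessage(encryptChunk(chunk));
};

// Dummy stand-in so the sketch runs; the real cipher would go here.
function encryptChunk(chunk) {
  var out = new Uint8Array(chunk.length);
  for (var i = 0; i < chunk.length; i++) {
    out[i] = chunk[i] ^ 0xff;
  }
  return out;
}
```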
Anyway, has anybody looked at deliberately slowing down a script in order to reduce CPU usage? Something like splitting the worker's encryption task into single operations and introducing a delay between them.
For example, with an interval timer callback every 100 ms:
- Is the worker busy?
- Yes - wait for another interval.
- No - start encrypting the next letter.
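In main-thread terms, the idea would look something like this sketch (the 100 ms interval and the `pieces` queue are just examples; the worker script is the same placeholder as above):

```javascript
// Throttled feed: poll every 100 ms and only hand the worker another piece
// of data when it has finished the previous one.
var worker = new Worker('encrypt-worker.js');
var pieces = [];          // queue of data still to encrypt (placeholder)
var results = [];
var workerBusy = false;

worker.onmessage = function (e) {
  results.push(e.data);
  workerBusy = false;     // worker is idle again
};

var timer = setInterval(function () {
  if (workerBusy) {
    return;               // yes: wait for another interval
  }
  if (pieces.length === 0) {
    clearInterval(timer); // nothing left: stop polling
    worker.terminate();
    return;
  }
  workerBusy = true;      // no: start encrypting the next piece
  worker.postMessage(pieces.shift());
}, 100);
```

The interval would effectively cap the duty cycle, since the worker only gets new work at most ten times per second.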
Advice/thoughts?
Does anyone have experience using workers? If you separate the main UI from intensive work by moving that work into a worker, does responsiveness improve?