
I’m in the early design phases of an application and trying to reason about expectations and requirements, to make sure I don’t spend substantial time down a dead-end. I’ve never worked with V8 or node native add-ons, so bear with me.

Suppose I were building an Electron application. For performance reasons some functionality would be written in native code in an independent library, which would be invoked from a native Node add-on. The native library needs to execute JavaScript. So:

Electron App -> Native add-on -> Native library -> V8

First, is this feasible? For example, will trying to construct/run a V8 context fail due to its execution inside an Electron V8 context? I’m thinking deadlocks, aborts, etc.

  • For what it's worth, V8 is stunningly fast. You probably won't lose much performance if you just use JavaScript throughout, even for signal processing. It's possible that, by the time you get this multi-language solution built, a new version of V8 will have performance outstripping your solution. This is from bitter experience, doing expensive, highly optimized solutions because today's software/hardware is too slow. Tomorrow's might not be. – O. Jones Mar 20 '19 at 19:52
  • Assuming that were an option, I cannot seem to find any way of meeting the requirement of running external, arbitrary JavaScript from npm/electron. The files I will be processing contain embedded JS and I will need to be able to execute them. – Zac Mar 20 '19 at 20:23
  • I'm not sure if I understood the question correctly - are you saying you'd like to run some code written in native code and communicate values with the V8 isolate, or that you want to set up another V8 isolate and run JavaScript code there? – OJ Kwon Mar 20 '19 at 20:37
  • The application will process files containing JavaScript, and to properly process the files it needs to execute the JS in a context sandboxed to that file. As an example, consider PDF files which contain embedded JavaScript. In that example, the native library would be a PDF library (à la PDFium) which carries a V8 dependency. It would be loaded by a native add-on and executed from Electron, which itself runs in V8. – Zac Mar 20 '19 at 20:47
  • +1 to O.Jones. Your plan sounds technically *feasible*, but quite complicated. Note that cross-language function calls are comparatively heavy. Also it seems weird to have two copies of V8. Maybe you can restructure the flow so that the native library is an "endpoint" whose functions return control to its environment for any required JavaScript execution? – jmrk Mar 20 '19 at 20:58
  • As already explained above, having a separate V8 isolate in a native add-on is technically possible, but you'll encounter a number of edge cases to handle between the two isolates. Even in your example, building PDFium with V8 enabled as a native add-on and then matching the V8 version against Electron's Node is a non-trivial job. – OJ Kwon Mar 20 '19 at 22:34
  • `iframe` objects loaded inline (rather than from a URL) provide a decent sandbox setup. https://stackoverflow.com/questions/195149/is-it-possible-to-sandbox-javascript-running-in-the-browser Worker threads aren't bad either. – O. Jones Mar 21 '19 at 11:02

1 Answer


I've come up with my plan of action from the comments on the main question. Unfortunately that makes it hard to credit any specific user or comment for the answer. Specifically:

  • Restructure the data flow. Let the Native Add-on use its own copy of V8 to execute any necessary JavaScript, instead of introducing V8 as a dependency of the Native Library. (comment)
  • Entering a separate V8 Isolate within another is supported; see the sketch after this list. (comment) (docs)
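
As a rough illustration of the second bullet, here is a minimal sketch of entering a nested Isolate from inside a native add-on, assuming the add-on is built against the same V8 headers Electron's Node ships. The helper name is made up, error handling (`v8::TryCatch`) is omitted, and the exact `NewFromUtf8`/`Compile` signatures shift slightly between V8 versions:

```cpp
#include <v8.h>

// Hypothetical helper: run untrusted document JavaScript in its own Isolate,
// while Electron's Isolate is already entered on the calling thread.
void RunSandboxedScript(const char* source_utf8) {
  // A new Isolate needs its own ArrayBuffer allocator.
  v8::Isolate::CreateParams params;
  params.array_buffer_allocator =
      v8::ArrayBuffer::Allocator::NewDefaultAllocator();

  v8::Isolate* isolate = v8::Isolate::New(params);
  {
    // Isolate::Scope saves the currently entered Isolate (Electron's) and
    // restores it on exit, which is what makes the nesting legal.
    v8::Isolate::Scope isolate_scope(isolate);
    v8::HandleScope handle_scope(isolate);

    // A fresh Context inside the new Isolate sandboxes the document's JS.
    v8::Local<v8::Context> context = v8::Context::New(isolate);
    v8::Context::Scope context_scope(context);

    v8::Local<v8::String> source =
        v8::String::NewFromUtf8(isolate, source_utf8,
                                v8::NewStringType::kNormal)
            .ToLocalChecked();
    v8::Local<v8::Script> script =
        v8::Script::Compile(context, source).ToLocalChecked();
    script->Run(context).ToLocalChecked();  // result ignored in this sketch
  }
  isolate->Dispose();
  delete params.array_buffer_allocator;
}
```

The caveats from the comments still apply: the add-on must be compiled against the exact V8 version Electron ships, and values cannot be shared between Isolates directly, so anything crossing that boundary has to be copied.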

The Native Library component was conceived because the document type to be supported is large (in file size) and needs JavaScript processing and expensive rendering. Initially I had conceived of this as one large monolithic library supporting the document type, but in reality it can (and probably should) be broken up.

  • Parsing the binary file. Due to V8's speed, this can probably be done in my Node.js app itself, and may be faster than marshaling data across language barriers. If not, I may consider an N-API Native Add-on that parses the document and returns a JS object representing the parsed data (see the sketch after this list). (comment)
  • Executing document-level JavaScript. This should be doable in a Native Add-on by entering a separate V8 Isolate. (See above.)
  • Rendering the document to a canvas/bitmap. This is an unknown, as it depends on drawing performance (complex paths, etc.). I will try to do this directly in the Node.js app first, and if that isn't performant enough I will consider something like an N-API Native Add-on with e.g. Skia as a dependency to render the data.
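
For the parsing bullet above, this is roughly what an N-API add-on (via the node-addon-api wrapper) returning a plain JS object could look like. `parseDocument`, the result fields, and the elided native parsing are hypothetical names for illustration, not an existing API:

```cpp
#include <napi.h>

// Hypothetical entry point: take a Buffer holding the raw document, parse it
// natively, and hand back an ordinary JS object describing the result.
Napi::Object ParseDocument(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();
  Napi::Buffer<uint8_t> input = info[0].As<Napi::Buffer<uint8_t>>();

  // ... native parsing of input.Data() / input.Length() would go here ...

  // Marshaling the result back into JS values is the cross-language cost
  // weighed in the first bullet.
  Napi::Object result = Napi::Object::New(env);
  result.Set("byteLength",
             Napi::Number::New(env, static_cast<double>(input.Length())));
  result.Set("scripts", Napi::Array::New(env));  // embedded JS found in the file
  return result;
}

Napi::Object Init(Napi::Env env, Napi::Object exports) {
  exports.Set("parseDocument", Napi::Function::New(env, ParseDocument));
  return exports;
}

NODE_API_MODULE(parser, Init)
```

From the Node.js side this would be loaded with `require()` like any other add-on, and because it uses N-API rather than V8 directly it would not need to be rebuilt for every Electron upgrade.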