I am making a Node.js web server that handles large payloads, computations, and copies. For example, I need to work with a deep copy of a large object:
const largeObject = { bla: "bla" } // ...

class Example {
  constructor() {
    this.copy = JSON.parse(JSON.stringify(largeObject))
    this.copy.bla = "blo" // in reality the changes will differ per request
  }

  doStuff(args) {
    // do stuff with the deep copy
  }
}
Now this works fine: with every request context I can create one new deep copy and work with that in the class. But my class is becoming big and unstructured, so I want to split it up into different classes. I've thought of implementing a base class with a static deep copy, so that on every request I can change the copy on the base class and use that copy from my other classes.
const largeObject = { bla: "bla" } // ...

class Example {
  static copy;

  constructor() {
    Example.copy = JSON.parse(JSON.stringify(largeObject))
    Example.copy.bla = "blo" // in reality the changes will differ per request
  }
}

class DoWork {
  constructor(someValue) {
    this.someValue = someValue
  }

  doStuff(args) {
    // do stuff with Example.copy
  }
}
I want to deep copy the object only once per request for performance reasons; there is no reason to deep copy it on every class initialization. But I'm scared that by using a "global" variable that technically outlives the request context, I will get issues with race conditions and overlapping contexts. Is this a real problem, or is the single-threaded environment of Node.js safe enough to handle this?
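
To make the concern concrete, here is a minimal sketch of the interleaving I'm worried about (the handleRequest function and the setTimeout stand-in for real async work are made up just for illustration, they're not part of my actual server):

const largeObject = { bla: "bla" } // ...

class Example {
  static copy;

  constructor() {
    Example.copy = JSON.parse(JSON.stringify(largeObject))
  }
}

class DoWork {
  async doStuff() {
    // any await yields back to the event loop, so another request can run in between
    await new Promise(resolve => setTimeout(resolve, 10)) // stand-in for a db call / fetch
    // if another request ran `new Example()` meanwhile, this reads *its* copy
    return Example.copy.bla
  }
}

// hypothetical per-request handler
async function handleRequest(value) {
  new Example()
  Example.copy.bla = value // per-request change
  return new DoWork().doStuff()
}

// two "requests" overlapping
Promise.all([handleRequest("request A"), handleRequest("request B")])
  .then(results => console.log(results))

If I read it right, both calls end up seeing "request B" after the await, which is exactly the kind of overlap between request contexts I'm afraid of.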