
I have a question related to the node.js documentation on module caching:

Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.

Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

What is meant by may?

I want to know whether require will always return the same object. Suppose I require module A in app.js and modify the exports object there (the one that require returns), and afterwards require module B in app.js, which itself requires module A. Will I always get the modified version of that object, or a new one?

// app.js

var a = require('./a');
a.b = 2;
console.log(a.b); //2

var b = require('./b');
console.log(b.b); //2

// a.js

exports.a = 1;

// b.js

module.exports = require('./a');
mikemaccana
Xomby
    That sentence in the docs could have been better written. It seems to me that *may not* is the same as *not allowed to*, i.e., *multiple calls to require('foo') **cannot** cause the module code to be executed multiple times*. – Lucio Paiva Jul 23 '14 at 02:38
  • @LucioPaiva Created a PR to fix it: https://github.com/nodejs/node/pull/23143 – mikemaccana Sep 28 '18 at 11:04
  • They meant [ [may not] cause ], as opposed to [ [ may ] not cause ], in the same sense as "No, Timmy, you may not have any more chocolate". So in this context, I agree, that is written ambiguously. – Ciabaros May 27 '20 at 01:13

7 Answers


If both app.js and b.js reside in the same project (and in the same directory) then both of them will receive the same instance of A. From the node.js documentation:

... every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.


The situation is different when a.js, b.js and app.js are in different npm modules. For example:

[APP] --> [A], [B]
[B]   --> [A]

In that case the require('a') in app.js would resolve to a different copy of a.js than require('a') in b.js and therefore return a different instance of A. There is a blog post describing this behavior in more detail.

Petr Stodulka
    "and in the same directory". I received the same instance when [B] was in a subfolder of where [App] lived. – Bill Tarbell Mar 15 '16 at 21:56
    @BillTarbell When I wanted to get the same instance from different folders I got two different ones. – MustSeeMelons Sep 28 '16 at 18:48
    is this last caveat still true now that npm would install A,B, and APP in a flat directory structure? If not, how can a module be set up to cache its results between multiple other modules? – Michael Aug 07 '17 at 22:44
  • From the node.js documentation: "Modules are cached based on their resolved filename. Since modules may resolve to a different filename based on the location of the calling module (loading from node_modules folders), it is not a guarantee that require('foo') will always return the exact same object, if it would resolve to different files." So you can *not* use the require cache to implement singleton behavior, case insensitive file systems or symbolic links and such can cause the same module to be loaded multiple times. If you need global data you need to use the "global" object. – Tannin Feb 03 '20 at 16:01

node.js caches modules so that it does not have to re-read the same files from disk thousands of times while running a large server project.

This cache is exposed as the require.cache object. Note that it is both readable and writable, which makes it possible to remove entries from the cache without killing the process.

http://nodejs.org/docs/latest/api/globals.html#require.cache

Oh, I forgot to answer the question. Modifying the exported object does not affect the next module load; that would cause a lot of trouble... require always returns a new instance of the object, not a reference. Editing the file and deleting the cache entry does change the exported object.

After doing some tests: node.js does cache module.exports. Modifying require.cache[{module}].exports results in a new, modified object being returned.

moe
    The code I posted actually works. `b.b` is defined with the value `2`. So in this example it's the same object. – Xomby Jan 16 '12 at 23:37
  • That's a feature and in my eyes quite useful. The question is if I can depend on it. The documentation says `may` which makes it unclear. – Xomby Jan 16 '12 at 23:55
Executing a function on require in a.js doesn't change anything... It keeps caching the returned object. Weird, how do my apps work? The only way is, as described in the docs, executing a passed function after require. – moe Jan 17 '12 at 00:18
    *sometimes* the objects are the same. Do not rely on that, pass around a single module instance if you need to. – Ricardo Tomasi Jan 17 '12 at 04:02
  • is there a way to get back a brand new object on a require (not the cached object)? – Matt Jan 04 '13 at 12:52
  • Edit for my previous comment... I can always write delete(require.cache['/full/path/to/include.js']); which works as expected, but is this good practice, or just an ugly hack? Is there a better way? – Matt Jan 04 '13 at 13:08
  • Write a function within your exported object which will return a brand new object – moe Jan 22 '13 at 19:08

Since the question was posted, the documentation has been updated to make clear why "may" was originally used. It now answers the question itself by making things explicit (my emphasis to show what's changed):

Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.

*Provided require.cache is not modified*, multiple calls to require('foo') will not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

Reg Edit

From what I have seen, if the module name resolves to a previously loaded file, the cached module is returned; otherwise the new file is loaded separately.

That is, caching is based on the actual file name that gets resolved. This is because, in general, there can be different versions of the same package that are installed at different levels of the file hierarchy and that must be loaded accordingly.

What I am not sure about is whether there are cases of cache invalidation outside the programmer's control or awareness that might accidentally cause the very same package file to be reloaded multiple times.

Simone C.

If the reason you want require(x) to return a fresh object every time is simply that you modify that object directly - which is a case I ran into - just clone it, and modify and use only the clone, like this:

var a = require('./a');
a = JSON.parse(JSON.stringify(a));
Evgeniy Berezovsky

try drex: https://github.com/yuryb/drex

drex watches a module for updates and cleanly re-requires the module after each update. The new code is require()d as if it were a totally different module, so require.cache is not a problem.


When you require an object, you get a reference to it, and requiring it twice gives you the same reference. To have independent copies of the same object, you should copy (clone) it.

var obj = require('./obj');

var a = JSON.parse(JSON.stringify(obj));
var b = JSON.parse(JSON.stringify(obj));
var c = JSON.parse(JSON.stringify(obj));

Cloning can be done in multiple ways.

Amir Fo