
I am writing tests for my Node application right now. I have fixtures which I use to test my data, and I ran into the problem that when I alter any of them in a method, they are altered globally for all the other tests as well, which obviously has to do with referencing. I figured that if I write my fixtures into a JSON file and require that JSON in each test file, they would have unique references per file, but it turns out they don't. My question is: is there an easy way to handle fixtures in Node such that every file has its own instance of the fixtures which won't affect the other test files?

The way I currently import my fixtures in every test file:

const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
Alex Haller
  • Possible duplicate of [What is the most efficient way to deep clone an object in JavaScript?](https://stackoverflow.com/questions/122102/what-is-the-most-efficient-way-to-deep-clone-an-object-in-javascript) – wiomoc Mar 10 '19 at 12:21

2 Answers


`require` calls are cached, so once a module has been loaded, every subsequent `require` of the same path returns the same object. All your test files are therefore sharing (and mutating) a single instance of the fixtures.
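
For example, a minimal sketch of the shared reference (the `name` property is hypothetical, just to illustrate the mutation):

const a = require('../../../../../fixtures/keywords.json');
const b = require('../../../../../fixtures/keywords.json');

console.log(a === b); // true: both point at the cached module object

a.fixture1.name = 'mutated'; // hypothetical property, for illustration
console.log(b.fixture1.name); // 'mutated': the change is visible to every file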

You can do the following:

const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');

const fixtureCopy = JSON.parse(JSON.stringify(fixture1));
const someOtherFixtureCopy = JSON.parse(JSON.stringify(someOtherFixture));

or use a package:

const deepcopy = require('deepcopy');
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');

const fixtureCopy = deepcopy(fixture1);
const someOtherFixtureCopy = deepcopy(someOtherFixture);

Or change your module to export a function that returns a new copy every time. In my opinion, this is the recommended approach.

// fixture.js
const deepcopy = require('deepcopy');
const fixture = { /* the object you have */ };

module.exports = {
   get() {
      return deepcopy(fixture); // returns a new copy every time
   }
};

// in a test file
const fixture = require('./fixture');

const fixture1 = fixture.get(); // fresh, independent copy
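
Since `get()` returns a fresh deep copy on every call, each test file (and each individual test, if you call it in a `beforeEach`) works with its own instance, so mutations can no longer leak between tests.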
Marcos Casagrande
  • Yes I thought about doing that, it would add another step to each of my tests and would make them look a little bit less clean imo. Is there any way you know of I can bypass an extra step and already ensure that the require step makes a deep copy? Like using filesystem for example? – Alex Haller Mar 10 '19 at 12:21
  • 1
    No, `require` calls are cached. Check updated answer. – Marcos Casagrande Mar 10 '19 at 12:24
  • 1
    Very good idea using a method that makes a deep copy as proxy, will do that, thanks a lot! – Alex Haller Mar 10 '19 at 12:26

This isn't specific to JSON; it's not uncommon for modules to need re-evaluation in tests. `require.cache` can be modified in Node.js to control how modules are cached, either directly or with a helper like `decache`.

Depending on the case,

decache('../../../../../fixtures/keywords.json')

goes before the `require` in a test, or into an `afterEach` hook to clean up.
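
As a sketch, assuming a Mocha/Jest-style `afterEach` hook (the test body is only illustrative):

const decache = require('decache');

describe('keywords fixtures', () => {
  afterEach(() => {
    // Drop the cached JSON module so the next require re-reads the file.
    decache('../../../../../fixtures/keywords.json');
  });

  it('can mutate its own copy of the fixtures', () => {
    const { fixture1 } = require('../../../../../fixtures/keywords.json');
    fixture1.mutated = true; // affects only this freshly loaded copy
  });
});

The direct equivalent without the helper is `delete require.cache[require.resolve('../../../../../fixtures/keywords.json')]`.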

Estus Flask