
Using React, we are told never to modify this.state directly and to treat it as immutable, which makes sense. But consider this situation:

import React, { Component } from 'react';

export default class Example extends Component {
  constructor() {
    super();

    this.state = {
      largeArray: [/* values in here */],
    };
  }

  copyAndAddElement = () => {
    const newState = [ ...this.state.largeArray ];

    newState.push({ something: 'new' });

    this.setState({ largeArray: newState });
  }

  mutateAndSet = () => {
    this.state.largeArray.push({ something: 'new' });

    this.setState({ largeArray: this.state.largeArray });
  }

  render = () => ( JSON.stringify(this.state.largeArray) )
}

Maybe I'm just being naive, but why not just use mutateAndSet? If you have a lot of data in that array, isn't it processor-intensive to copy the whole thing just to modify it?

The only problem that I see could be with the asynchronous nature of setState: largeArray could be modified by a different function before the setState in mutateAndSet actually executes. In my case I'm just adding points to a graph, never removing or modifying them from anywhere else.

If I have a very large array, why should I not take the seemingly more performant route?

Tyler Miller
  • You break React's way of checking whether it needs to re-render, and can cause unexpected behavior as a result. There are many different directions you can look in if you want to speed up your app. – Andrew May 07 '18 at 23:52
  • From what I know, the only danger is that you can indirectly lose modifications that you have made. You're still correct, but immutability seems troublesome when working with a lot of data. I guess you've answered my question. – Tyler Miller May 07 '18 at 23:55
  • 1
    As an aside, if you're worried about performance, `this.setState({ largeArray: this.state.largeArray.concat([{something: 'new'}]) })` is going to be much more efficient than your current `copyAndAddElement` method – Hamms May 08 '18 at 00:04
  • 1
    That kind of scenario would probably be a use case for some immutable library like ImmutableJS, which provides optimizations for such cases and many more and doesn't risk breaking react's rendering – SrThompson May 08 '18 at 00:05
  • @Hamms I'm thinking I should have worded my question to suit your comment here. I'm worried about performance. What you're doing there, though, is almost exactly what my `mutateAndSet` is doing, which is against React standards. – Tyler Miller May 08 '18 at 00:06
  • In that case, I'd also point out that your most significant bottleneck when dealing with a lot of data is almost certainly going to be rendering it all out, rather than any simple manipulations – Hamms May 08 '18 at 00:09
  • I agree with @Hamms, you should not worry about big arrays. I don't know how React handles this internally, but it must be a really good algorithm that inserts data in a short period of time. With modern computers and modern algorithms, this will probably take only a few milliseconds. – Giovanni Klein Campigoto May 08 '18 at 00:21
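
A minimal sketch of the two suggestions above (the `concat` call and an ImmutableJS `List`), assuming `largeArray` holds plain point objects and that the new point is passed in as an argument; the method names here are illustrative, not from the original code:

// concat builds the new array in one pass, with no intermediate push.
addPointWithConcat = (point) => {
  this.setState(prevState => ({
    largeArray: prevState.largeArray.concat([point]),
  }));
};

// ImmutableJS uses structural sharing, so it avoids copying the whole
// array on every insert.
// import { List } from 'immutable';  // and initialize state with: largeArray: List()
addPointWithImmutable = (point) => {
  this.setState(prevState => ({
    largeArray: prevState.largeArray.push(point), // List.push returns a new List
  }));
};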

1 Answer


You should never mutate the state directly because of setState's asynchronous behavior. If you constantly insert data with mutateAndSet and fire two setStates at a time, React won't know which was the original array, and information will be lost.
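
As a hedged sketch of the safer pattern (the name `addPoint` is illustrative, not from the question): pass an updater function to setState, so each queued update builds its copy from the state React actually hands it rather than from a possibly stale this.state:

addPoint = (point) => {
  // React calls the updater with the latest pending state, so two queued
  // inserts each see the other's result instead of overwriting it.
  this.setState(prevState => ({
    largeArray: [...prevState.largeArray, point],
  }));
};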