I always thought that with React, we could write
data.payload.records[0].someField = "abcd1234";
setData({...data});
and it would trigger a re-render showing the updated table values.
The data and setData pair comes from
const [data, setData] = useState(null);
and an API call fills in data.
But to my surprise, the above does not update the table values. If I use a deep copy, such as lodash's _.cloneDeep(), to clone data, then the table's values do update.
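Concretely, that version looked roughly like this (a sketch of what I mean, assuming lodash is imported as _):
import _ from "lodash";

// clone the entire state object, then mutate the clone
const dataNew = _.cloneDeep(data);
dataNew.payload.records[0].someField = "abcd1234";
setData(dataNew); // every nested object reference is new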
So for experiment 2, I used
const payloadNew = deepCopy(payload);
payloadNew.records[0].someField = "abcd1234";
setData({...data, payload: payloadNew });
and it works too.
As experiment 3, I used
data.payload.records[0] = deepCopy(data.payload.records[0]);
data.payload.records[0].someField = "abcd1234";
setData({...data});
and it works too.
So:
- What is the rule? Why does it work sometimes and not other times? Could it be that, in some cases, the code passes data.payload down to the table component, so the table receives the same payload object, and React optimizes by skipping the whole table component's re-render? Since we don't know what exactly is passed down (data.payload or data.payload.records, and other programmers can change that too), does that mean we really should make a deep copy of everything? (See the sketch after this list.)
- Should we always make a deep copy from the topmost level so that there are no unexpected results?
The reason I may not want to make a deep copy every time is performance: what if there are 1,000, 2,000, 5,000 items, or more? Making a deep copy on every change could make the UI sluggish. And if a single value changes, do we really need to copy 250,000 values (assuming 5,000 items with 50 properties each)? That sounds mind-boggling. However, in the Redux world, I think they always replace everything with a new object, effectively a deep copy.
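For what it's worth, my understanding is that Redux-style updates usually don't deep copy the whole state; they copy only the objects along the changed path and reuse untouched objects by reference. A rough sketch of what that would look like here (just an illustration, not what my code currently does):
// Copy only data, data.payload, the records array, and the changed record;
// every other record keeps its existing reference.
setData({
  ...data,
  payload: {
    ...data.payload,
    records: data.payload.records.map((record, i) =>
      i === 0 ? { ...record, someField: "abcd1234" } : record
    ),
  },
});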