I am creating an image editor for SVG files. Some of the SVG files contain <image>
tags, which can make the file very large (around 25 MB or more); this makes those images slow both to load and to edit.
Thinking out loud, the best way I can see to tackle this is to have the editor load a "fake" version of that SVG that is only, say, 1 MB. Apart from the difference in size, the "fake" SVG is identical in every way: objects, positioning, aspect ratio, and so on.
This "fake" SVG is fast to load, and when the user clicks "Save" there is plenty of time to replicate those changes on the real SVG in the backend (yes, I want the changes to apply to the original version as well).
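For what it's worth, one way to build such a "fake" SVG is to rewrite every embedded base64 image into a tiny placeholder while leaving all geometry attributes untouched, so the proxy renders with an identical layout. A minimal sketch in plain JavaScript, assuming the images are inlined as data URIs (`makeProxySvg` and the 1x1-pixel PNG payload are my own names, not part of any library):

```javascript
// A hypothetical 1x1 transparent PNG used as a stand-in for the real image data.
const TINY_PNG =
  'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mP8z8BQDwAEhQGAhKmMIQAAAABJRU5ErkJggg==';

// Replace every base64 data URI inside <image> href/xlink:href attributes,
// keeping all other attributes (x, y, width, height) intact.
function makeProxySvg(svgText) {
  const dataUriAttr = /((?:xlink:)?href=")data:image\/[^;"]+;base64,[^"]*(")/g;
  return svgText.replace(dataUriAttr, `$1${TINY_PNG}$2`);
}
```

For real use you would probably downscale and re-encode the images instead of blanking them, but the attribute-rewriting mechanics stay the same.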
I have started implementing this solution, namely by storing every edit step:
const stepList = [...prevValue.steps.slice(0, currentStep), canvas.toJSON()]
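(For context, the slice-then-append pattern above is essentially a snapshot-based undo stack: each edit pushes a full snapshot and discards any redo branch past the current position. A standalone sketch of the same idea, with no Fabric dependency; the `HistoryStack` name is mine, and snapshots here are just plain values:)

```javascript
// Snapshot-based undo/redo history. Each push truncates the redo branch,
// mirroring the slice(0, currentStep) pattern from the question.
class HistoryStack {
  constructor(initial) {
    this.steps = [initial];
    this.pos = 0; // index of the current snapshot
  }
  push(snapshot) {
    this.steps = [...this.steps.slice(0, this.pos + 1), snapshot];
    this.pos = this.steps.length - 1;
  }
  undo() {
    if (this.pos > 0) this.pos -= 1;
    return this.steps[this.pos];
  }
  redo() {
    if (this.pos < this.steps.length - 1) this.pos += 1;
    return this.steps[this.pos];
  }
}
```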
and loading it into a new canvas:
canvas.loadFromJSON(json, () => {
  turnOnCanvasEvents()
  canvas.renderAll()
})
However, my problem is that nothing shows up, and this feels like a very cumbersome approach: the JSON strings also get abnormally large after just a few steps. I will attach the JSON structure for one of my "tamer" SVG images.
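Regarding the JSON growth, one mitigation I have been considering is to stop snapshotting the full canvas and instead store only what changed between steps. A rough sketch of a per-object diff, assuming each serialized object carries a stable `id` property (Fabric does not add one by default, so my editor would have to assign it; `diffStates` is a hypothetical helper):

```javascript
// Diff two serialized states (arrays of objects keyed by a custom `id`)
// and keep only the added/changed and removed entries for this step.
function diffStates(prevObjects, nextObjects) {
  const prevById = new Map(prevObjects.map(o => [o.id, o]));
  const nextIds = new Set(nextObjects.map(o => o.id));
  const changed = nextObjects.filter(o => {
    const before = prevById.get(o.id);
    return !before || JSON.stringify(before) !== JSON.stringify(o);
  });
  const removed = [...prevById.keys()].filter(id => !nextIds.has(id));
  return { changed, removed };
}
```

Storing `{ changed, removed }` per step instead of a full `canvas.toJSON()` would keep the history roughly proportional to the edits made, not to the canvas size.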
So I am wondering: am I making things more complicated than they need to be?