
Goal: extend the localStorage implementation already in my app so that it is, well, no longer local.


I like the implementation of saving simple user settings with the localStorage API. It works for my needs in a web app, but the one problem is that the data is local to the machine/browser where it was saved. I do not have access to a classic MySQL-style table for this. I would like to extend or adapt my localStorage so it carries over to other browsers, or store my user settings in per-user JS objects and object properties.

  • I like the idea of creating a JSON or JavaScript object for each user: whenever there is a new user, take the name, create an object (or object[key]) under that name, give its field properties default values at first, and have those values populated or overridden when the user saves them.
  • Or, if the above is frowned upon, I would like to keep my localStorage implementation, since it works so well, and find a plugin/library/extension of some sort that lets me also save this data and re-render it in different locations; surely this has been thought of before. I would love to keep it client-side, but I am open to a Node.js or Python solution; a simple dataframe of sorts would work well enough.
  • What about generating a file from my localStorage data? Perhaps a .csv file (this is non-sensitive data) that updates as my localStorage does?
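The per-user object idea in the first bullet can be sketched like this; all names here (defaultSettings, users, createUser, saveSettings) are illustrative, not from any existing code:

```javascript
// Default field values every new user starts with.
const defaultSettings = { theme: "light", fontSize: 14 };

const users = {}; // one settings object per user, keyed by name

function createUser(name) {
  users[name] = { ...defaultSettings }; // start from a copy of the defaults
  return users[name];
}

function saveSettings(name, changes) {
  // Saved values override the defaults; untouched fields keep them.
  users[name] = { ...(users[name] ?? defaultSettings), ...changes };
  return users[name];
}
```

Note that this only moves the problem around: these objects live in page memory, so they still need to be persisted somewhere (a file, a server) to survive across browsers.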
Lajos Arpad
Dr Upvote
  • You cannot do this client-side only. To persist user information across browsers you need a public server and store the information on it. Typically this is done using a database; if you don't want to use MySQL, there are other types of data storage. Firebase is pretty close to what you imagine: it allows you to store objects of arbitrary structure. (Also, JSON is a text format; there is no such thing as a "JSON object.") – Feb 12 '20 at 00:23
  • somewhat similar to that: https://stackoverflow.com/a/60279503/4845566 ? – deblocker Feb 26 '20 at 13:27

10 Answers

3

What about using SQLite?

Just one file on your server, like a CSV. After updating localStorage on the client side, send an HTTP request that updates the file with an SQL statement via knex or something like that.

I think it is at least better than CSV, because you can define several tables, which is more scalable and more efficient; it is, after all, a database.

kobako
3

I'm adding my two cents here.


Export/Import File (JSON, XML, CSV, TSV, etc.)

Export:

Serialize the settings and download them as a file.

Import:

Open the exported/downloaded serialized settings file.

Example Code:

<!DOCTYPE html>
<html lang="en">

<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <meta http-equiv="X-UA-Compatible" content="ie=edge">
    <title>Settings Export/Import Demo</title>
</head>

<body>
    <div id="display"></div> <br>
    <button onclick="exportSettings();">Export</button>

    <button onclick="resetSettings();">Reset</button> <br><br>
    File to import: <input id="file-input" type="file" accept="application/json"> <br>
    <button onclick="importSettings();">Import</button>
    <script>

        function exportSettings() {
            var json = getSettingsAsJSON();
            var blob = new Blob([json], { type: "application/json" });
            var linkElement = document.createElement("a");

            linkElement.href = URL.createObjectURL(blob);
            linkElement.download = "ThisIsMySettings";

            document.body.appendChild(linkElement);

            linkElement.click();

            document.body.removeChild(linkElement);
        }

        function importSettings() {
            var fileInput = document.getElementById("file-input");

            if (fileInput.files.length > 0) {
                var jsonFile = fileInput.files[0];

                var fileReader = new FileReader();

                fileReader.onload = function (e) {
                    var json = e.target.result;

                    try {
                        var settings = JSON.parse(json);

                        if (settings.hasOwnProperty("userId")) {
                            localStorage["myapp_user_id"] = settings.userId;
                        }

                        if (settings.hasOwnProperty("hello")) {
                            localStorage["myapp_hello"] = settings.hello;
                        }

                        if (settings.hasOwnProperty("data")) {
                            localStorage["myapp_data"] = settings.data;
                        }

                        displaySettings();
                    } catch (ex) {
                        console.error(ex);

                        alert("Error occurred while importing settings!");
                    }
                };

                fileReader.readAsText(jsonFile);
            }
        }

        function resetSettings() {
            localStorage["myapp_user_id"] = Math.floor(Math.random() * 100000) + 1;
            localStorage["myapp_hello"] = "Hello World!";
            localStorage["myapp_data"] = JSON.stringify([1, 3, 3, 7]);

            displaySettings();
        }

        function displaySettings() {
            var json = getSettingsAsJSON();

            document.getElementById("display").innerText = json;
        }

        function getSettingsAsJSON() {
            return JSON.stringify({
                userId: localStorage["myapp_user_id"],
                hello: localStorage["myapp_hello"],
                data: localStorage["myapp_data"]
            });
        }

        resetSettings();
    </script>
</body>

</html>

URL (Query String)

Export:

Encode the settings into a query string and combine it with the current URL to form a hyperlink.

Import:

Visit the hyperlink containing the query string with the encoded settings; JavaScript then detects and loads the settings from the query string.
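A minimal sketch of this approach; the `settings` parameter name and the base URL are illustrative:

```javascript
// Export: encode the settings into a shareable link.
function makeSettingsLink(settings, baseUrl) {
  const params = new URLSearchParams({ settings: JSON.stringify(settings) });
  return `${baseUrl}?${params.toString()}`;
}

// Import: on page load, detect the parameter and restore the settings.
function readSettingsFromLink(url) {
  const raw = new URL(url).searchParams.get("settings");
  return raw ? JSON.parse(raw) : null;
}
```

In the page itself you would call `readSettingsFromLink(location.href)` on load and write the result back into localStorage.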


Base64 Encoded Data

Export:

Serialize the settings, encode them as a Base64 string, and copy the string to the clipboard.

Import:

Paste the Base64 string from the clipboard into a textbox to decode, deserialize, and load the settings.
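A sketch of the Base64 round trip. The settings JSON is percent-escaped first, because `btoa`/`atob` only handle Latin-1 and would throw on non-ASCII values:

```javascript
// Export: settings object -> JSON -> percent-escaped ASCII -> Base64.
function exportSettingsBase64(settings) {
  return btoa(encodeURIComponent(JSON.stringify(settings)));
}

// Import: Base64 -> percent-escaped ASCII -> JSON -> settings object.
function importSettingsBase64(encoded) {
  return JSON.parse(decodeURIComponent(atob(encoded)));
}
```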


QR Code

Export:

Encode the settings into a query string and combine it with the current URL to form a hyperlink, then generate and display a QR code image for it.

Import:

Scan the generated QR code image to visit the hyperlink automatically.


HTTP Server (Node.js) / Cloud Storage (AWS S3)

Export:

HTTP POST to the endpoint automatically whenever values are updated, keyed by user id.

Import:

HTTP GET from the endpoint by user id.


Extra: PouchDB

The Database that Syncs!

PouchDB is an open-source JavaScript database inspired by Apache CouchDB that is designed to run well within the browser.

PouchDB was created to help web developers build applications that work as well offline as they do online. It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers when the application is back online, keeping the user's data in sync no matter where they next login.

DK Dhilip
  • Thanks for the response. Best one yet. With the first import/export option, could I make it so the user could simply open the file for the settings to load? How could that work? – Dr Upvote Feb 23 '20 at 23:05
  • Ah, I see. Thank you. I just think that is a bit too much to ask from the user. If there was a way I could generate a hyperlink and email it to them, perhaps. I think my data contains too many characters to put in a URL query string. How would the QR code option be scanned? – Dr Upvote Feb 24 '20 at 15:15
  • If your data is too large to fit in a query string, perhaps you could experiment with gzip/deflate plus Base64 to see if it can reduce the payload size enough to include in the query string? I don't know if this will work well enough for your case, just a random thought. For the QR code option, it could be scanned by any device with a camera, or the image file could be scanned directly (from an ordinary URL, data URL, or opened file). – DK Dhilip Feb 24 '20 at 16:29
  • To expand the data-size-reduction idea, you could also try to manually serialize your data into an ArrayBuffer to make it as small as possible and then apply Base64 encoding to it, depending on the transport method. – DK Dhilip Feb 24 '20 at 16:39
3

Instead of using localstorage, store your user's settings in the InterPlanetary File System (IPFS) https://ipfs.io/

Basically, you would put their settings in a data format like JSON, then write it to a file and push it to IPFS.

You will need a way to identify which data goes to which user. Maybe you could use a hash of the username and password to name your files, or something like that. Then your users would always be able to access their content on any device (as long as they didn't forget their password).

Michael Babcock
2

You can use the URL as storage if you zip your user params:
get the params you want to store > JSON > deflate > encode to Base64 > push into the URL

const urlParam = btoa(pako.deflate(JSON.stringify(getUser()), { to: 'string' }));

onload: get the params from the URL > decode from Base64 > inflate > parse JSON

const user = JSON.parse(pako.inflate(atob(urlParam), { to: 'string' }));

https://jsfiddle.net/chukanov/q4heL8gu/44/

The URL param will be pretty long, but still about ten times shorter than the maximum available length.

Anton Chukanov
  • This! However, when I console.log my localStorage data it is ~20k and ~8k in characters. Could I still do something like this? – Dr Upvote Feb 26 '20 at 18:36
  • I've tested with 160k characters; it became 1600 characters after deflate, so even IE will work without problems (the maximum URL length for IE is 2048). Modern browsers have no limitation on URL length. – Anton Chukanov Feb 26 '20 at 22:02
  • Thanks! Does pako require a library or something? I am getting pako undefined – Dr Upvote Feb 26 '20 at 22:13
  • "Modern browsers have no limitation on the URL length" is a false assumption, please refer to [this](https://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers). Also, pako is a JavaScript library which can be found here ([GitHub](https://github.com/nodeca/pako), [cdnjs](https://cdnjs.com/libraries/pako)). Lastly, please note that the compression ratio can vary depending on the data content. – DK Dhilip Feb 27 '20 at 01:50
1

You can use a library called localForage, which has basically the same API as localStorage except that it allows you to store more complex data structures (arrays, objects) and also supports Node.js-style callbacks, promises, and async/await.

Here is a link to the repository where you can find example usages and how to implement it in your project the way you like.

C.Gochev
1

The best way to implement this without using a database for sharing data is, I believe, a WebRTC-based solution. I thought of it as a way of doing this, but I don't have code for it (at least for now); with some searching I found that someone has already done it (not exactly, but with some tweaks it will be ready), here for example, as part of this article: webrtc without a signaling server.

Here is one more source: data channel basic example demo

And on GitHub: data channel basic example

WebRTC is not only for video/audio chat; it can also be used for text messaging and collaboration on text editing.

This solution is even mentioned in one of the other answers here.

ROOT
0

Use cookies, or have a downloadable file that users can take with them and load when they access another browser. You can do this with a text, JSON, or JavaScript file containing the object data.

0

You can use Redis. It is an in-memory data structure store, often used as a database. You can store your data as key-value pairs. It also makes your application fast and efficient.

Deeksha gupta
0

Obviously the best approach is to use a database. However, since you're not inclined to use a database, a possible approach is to use a combination of techniques which I believe you already touched on, so I'll help you connect the dots here.

Steps Needed:

  1. localStorage API (since it's already partially working for you).
  2. Build a Node or Python (or whatever you're comfortable with) endpoint to GET and POST settings data.
  3. Create a userSettings.JSON file on your API server.

Instructions:

You'd use your localStorage the same way as you're using it now (current working state).

In order to move or have user settings across different devices, a userSettings.JSON file (serving as a document database) will be used for storing and importing user settings.

Your API endpoint would be used to GET user settings whenever none exist in localStorage. When settings are updated, update your localStorage and then POST the new settings to your userSettings.JSON file through your endpoint.

Your API endpoint will only be used to maintain (read and write) the userSettings.JSON file. You'll need methods/functions to create, update, and perhaps delete settings in the file. As you may already know, a JSON file is not much different in format from a MongoDB database; in this case you're just creating the methods you need to manage your file.

I hope this helps!

-1

You can resolve this without a database, but I wouldn't recommend it. Basically you have (user, localStorage) pairs, and when a given user identifies himself/herself, his/her localStorage should be provided somehow. You can tell users to store their local storage on their own machine, but then they will have to copy it to other machines, which is labor-intensive and will never gain popularity. One could manually run a JavaScript chunk in the browser console to make sure that localStorage has their data, and having to copy the localStorage across machines is only slightly easier than doing the whole thing manually.

You can put the localStorage information encoded into a URL, but besides the problem of URL length, which might become an issue, and the ever-present encoding problems, your whole localStorage can be monitored by a third party with access to your router. I know you have said that the data is not sensitive, but I believe it's just not sensitive yet. Once users adopt this, if it's convenient, they will store sensitive data as well; or your clients might hand you such tasks; or even you might realize that you need to store data there which is not 100% public.

Besides these, in practice you will face very serious synchronization issues: it's all nice to make localStorage machine-agnostic, but then, which is the real version? If you regularly work in 10 different sessions, then synchronizing the localStorages becomes a difficult problem. This means that the localStorage needs to be timestamped.

So, you will need a central place, a server, to store the last saved version of the localStorage. If you are averse to databases for some reason, you can store localStorages inside files which identify the user, like

johndoe.json

and then you will need to implement an export feature, which sends the user's current JSON to the server and saves it into a file, and an import feature, which downloads the file stored for the user and ensures that localStorage gets updated accordingly. You can also do the two together, implementing synchronization.

This is simple so far, but what if the user already has some useful data inside his/her local localStorage and on the server as well? The simplest approach is to override one with the other, but which one? If we are importing, then the local one is overridden; if exporting, then the one on the server is overridden; if we synchronize, then the older one is overridden.

However, in some cases you want to merge two localStorages of the same user, so:

new elements

I believe that if an element is new, it should be recorded in some manner that it was created in this session. This is useful to know, because it means that the new item was not removed in the other session we are merging with, and hence it's intuitive to add it.

element changes

If the same element is different in the two cases, then the newer version should prevail.

removed elements

The interesting case is when an element was removed in one session and updated in the other. In this case I think the newer change should prevail.
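The merge rules above can be sketched like this, assuming each entry is stored as an object carrying a timestamp and a deletion flag (a "tombstone") rather than a bare string; all names are illustrative:

```javascript
// Merge two sessions' settings maps a and b, where each entry looks like
// { value, updatedAt, deleted }. Timestamps decide conflicts.
function mergeSettings(a, b) {
  const merged = {};
  for (const key of new Set([...Object.keys(a), ...Object.keys(b)])) {
    const ea = a[key], eb = b[key];
    if (!ea || !eb) {
      // New element: present in only one session, so keep it.
      merged[key] = ea ?? eb;
    } else {
      // Changed (or removed) in both: the newer change prevails,
      // including a newer deletion recorded as a tombstone entry.
      merged[key] = ea.updatedAt >= eb.updatedAt ? ea : eb;
    }
  }
  return merged;
}
```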


However, despite your best efforts, users (and your software too) might still mess things up, so backing up each session on your server makes sense.

Lajos Arpad