
My Setup:

What I'm doing is pulling a very large table from MySQL (40,000+ rows):

SpoolInfo.request.query("SELECT SpoolNumber AS a, DrawingNumber AS b, LineSize AS c, PipeSpec AS d, CU AS e, SheetNumber AS f, FluidCode AS g FROM Job" + JobNumber + ".SpoolInfo ORDER BY SpoolNumber ASC");

Then I do a simple operation on it to format it the way the application I have to send it to needs it:

if(SpoolInfo.response.data){
  SpoolInfoRowsObj = {"Cache":[]};
  SpoolInfo.response.data.forEach(function(d, di){
    // d is already the current row; re-indexing response.data is redundant
    SpoolInfoRowsObj.Cache.push(_.values(d));
  });
}

Then storing it back into a cache table for quicker access:
(This is needed because I can't find a way to make the .forEach loop run faster; it currently takes over 20 seconds.)

UpdateCacheTable1.request = new AP.MySQL.Request();
UpdateCacheTable1.request.execute("UPDATE `Job" + JobNumber + "`.`CacheTable_SpoolInfo` SET `SpoolInfoObj` = '" + JSON.stringify(SpoolInfoRowsObj.Cache) + "' WHERE ID = 1");
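One way to keep the stored text byte-for-byte identical to what stringify produced is to escape it yourself before splicing it into the SQL literal. A minimal sketch; the `mysqlEscape` helper below is hypothetical, not part of any library, and driver-level parameter binding (if your platform offers it) would be safer:

```javascript
// Hypothetical helper: double every backslash and escape single quotes
// so the SQL literal stores the JSON text exactly as stringify produced it.
function mysqlEscape(str) {
  return str.replace(/\\/g, "\\\\").replace(/'/g, "\\'");
}

var cache = [[" 1-AII22-84042S01 ", "1040\r"]];
var json = JSON.stringify(cache);  // the carriage return becomes a backslash-r pair here
var escaped = mysqlEscape(json);   // backslashes doubled for the SQL literal

// "UPDATE ... SET `SpoolInfoObj` = '" + escaped + "' WHERE ID = 1"
```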

The Problem:

When I retrieve that data back later:

CacheSpoolInfo.request.query("SELECT SpoolInfoObj FROM Job" + GetJobNumber.response.data[0].JobNumber + ".CacheTable_SpoolInfo ");
CommRef_.SpoolInfo = CacheSpoolInfo.response.data;

And try to use it like this:

....
        }else if (t == "SpoolInfo"){
          rowsObj = {"Rows":[]};
          rowsObj = {"Rows":JSON.parse(CommRef_.SpoolInfo[0].SpoolInfoObj)};
        }else{
....

It has a bunch of extra "\" stuff in it for all the special characters like:

      "Rows": [
        [
          "[[\" 1-AII22-84042S01 \",\"1040\r\"],[\"0-A102-27564S01 \",\"110\r\"],.....
        ]
      ]

So of course the output is not in the same JSON structure that I saved.

I kinda thought using JSON.stringify() and JSON.parse() as I did would handle this.
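The stringify/parse pair does round-trip on its own; it is the extra escaping layer added on the way into MySQL that breaks it. A small demonstration of the assumed mechanism (the replace() call below simulates what the driver/database does on save):

```javascript
var original = [[" 1-AII22-84042S01 ", "1040\r"]];
var stored = JSON.stringify(original);   // valid JSON text

// Simulate the escaping applied when the string is saved:
var mangled = stored.replace(/\\/g, "\\\\").replace(/"/g, '\\"');

var roundTripBroken = false;
try {
  JSON.parse(mangled);                   // the escapes are now literal text
} catch (e) {
  roundTripBroken = true;                // SyntaxError: no longer valid JSON
}
```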

My Question:

How can I handle this so the data I pull back can be used the same way it was before I sent it to the DB? In other words, how do I get rid of all the little slashes so JSON.parse() can read it again?

DarbyM

1 Answer


The problem is that MySQL is adding the escapes when you save the string back in, because MySQL has no idea what JSON is and is storing it as a VARCHAR. An easy way to get around this is to do the escaping yourself before you store the string in the database.

For example, if you have a PHP backend, to store you would do

$escapedString = mysqli_escape_string($myJSONString);
// Store $escapedString in db

For retrieval, you would then do

$escapedString = // query code
$myJSONString = stripslashes($escapedString);

How you handle the escape will vary based on your backend architecture. You might also consider using a database that has native support for JSON.

Another option would be to write a JavaScript function that does the stripping for you (there is no native function that does this). It should be doable with some regex. To see how you might go about this, see this existing SO question related to stripping slashes. Note that you'll need to change that a bit, since you only want to remove escape slashes while in that post the OP wants to remove all slashes (which is wrong for JSON since your data might actually contain slashes).
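For instance, a small function along these lines could remove exactly one layer of escaping; it strips the backslash from each escape pair rather than every backslash in the string, so data that legitimately contains a backslash survives. This is a sketch, not a tested library routine:

```javascript
// Undo one layer of escaping: every backslash-X pair collapses to X,
// so \" becomes " and a doubled backslash becomes a single one.
function stripEscapes(s) {
  return s.replace(/\\(.)/g, "$1");
}

var fromDb = '[[\\" 1-AII22-84042S01 \\",\\"1040\\\\r\\"]]';
var cleaned = stripEscapes(fromDb);  // valid JSON text again
var rows = JSON.parse(cleaned);
```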

Matthew Herbst
  • Well that definitely clears up my options a little. First, I'm doing this in JavaScript, so no easy-to-use 'escapedString'. And I'm using Google SQL, so I'm working with MySQL 5.6 (no JSON support until 5.7). :( And trying to put together a home-brew regex solution to remove escapes sounds like a good way for "ME" to really hork up some data.... Hmmm, what to do.... – DarbyM Apr 16 '16 at 14:41
  • Hmmm You think encodeURI() could escape and "Unescape" the string as needed? – DarbyM Apr 16 '16 at 14:47
  • Nevermind on the encodeURI(). Exact same result. The decodeURI() puts the escapes in. Damn. I may have to either figure out a Regex, or cache this someplace else. – DarbyM Apr 16 '16 at 15:26
  • 1
    I'll be honest when I say it does seem like poor system design to extract using sql, transform with JS, and then save again with sql. Why not just do the entire thing in the db, and add indexes where needed to make it performant? – Matthew Herbst Apr 17 '16 at 04:39
  • I would be surprised if there isn't an existing SO question related to regex for unescaping JSON. – Matthew Herbst Apr 17 '16 at 04:39
  • I agree, it is a poor design, but the issue lies in how I HAVE to format the data for the system I am sending this data to. All the data I'm putting into the JSON in the first place is already in MySQL. The problem is the time (and CPU) it takes to convert it from the key/value pairs I receive from my JDBC calls to MySQL into just an array of values. – DarbyM Apr 17 '16 at 16:07
  • With 40K+ rows it is pegging out a single CPU for ~20 seconds. Since it really isn't that important that this list of values is generated on every single request, I'm just making the array of values every so often and storing it as a LONGTEXT in the DB for far more rapid access. (Less than 2 seconds, and not CPU intensive at all.) – DarbyM Apr 17 '16 at 16:07
  • I took your advice and went the RegEx route (wasn't as bad as I thought it would be). But I did run into one unexpected issue: JSON.stringify("Text") > MySQL Insert > MySQL Get > JSON.parse(MySql.Response) yields text JSON.parse would not work on. I ended up having to remove all double quotes from my data before stringifying it. It seems I'm going to have to work this out a bit further, but in the meantime, I appreciate your assistance. – DarbyM Apr 17 '16 at 16:16
  • Hmm, you did bring up a point I could look at. Just do the transforming in MySQL and make the cache field without ever even pulling it out. While I have done all kinds of calculations in MySQL for responses, I'm not seeing right off the top of my head how to do this transformation in MySQL. But it sure seems doable. Effectively I need to reproduce the results of my .forEach loop IN MySQL is all. Thanks for the idea. – DarbyM Apr 17 '16 at 16:23
  • Happy to be of help. Feel free to ask another SO question if you have trouble converting your loop into SQL – Matthew Herbst Apr 18 '16 at 03:10