
I just had the idea of writing a function in MySQL that I can pass a subquery to for it to output the JSON representation of that subquery.

I have quite a lot of data that I often fetch from MySQL and then convert to JSON for API output. Could it be faster to write a MySQL function that does the conversion on the SQL server and just returns the JSON?

My imagination:

query('SELECT * FROM people');

// Output:
// +----+--------+-----+
// | id | name   | Age |
// +----+--------+-----+
// |  1 | Molly  | 24  |
// |  2 | Edward | 28  |
// +----+--------+-----+

query('JSON(SELECT * FROM people)');

// Output:
// [{"id":1,"name":"Molly","Age":24},{"id":2,"name":"Edward","Age":28}]

Possible? If yes, any clues to how I can start?

Hubro
    It is possible but will hardly be any speed increase. Also, for large resultsets, you can hit certain limits like `group_concat_max_len` and `max_allowed_packet` which would prevent the value to be returned. They are not always possible to change on shared hostings etc. – Quassnoi Jun 04 '12 at 11:28
  • I'm not sure how data is usually sent from MySQL to a client as a response to a query. Would this perhaps reduce bandwidth if the MySQL host was not local? – Hubro Jun 04 '12 at 11:31
  • `MySQL`'s binary protocol is denser than character-based `JSON`. Also, you can compress it with `MYSQL_CLIENT_COMPRESS` (should you need it). – Quassnoi Jun 04 '12 at 11:33
  • `SELECT '{"id":1}'` No need to thank me. – ta.speot.is Jun 04 '12 at 11:40

4 Answers


As of MySQL 5.7 you can build JSON values using functions such as JSON_OBJECT and JSON_ARRAY, and combine documents with JSON_MERGE.

It works like so:

SELECT JSON_OBJECT('id', 1, 'name', 'Molly', 'Age', 24);

You can also merge multiple documents (note that JSON_MERGE combines objects into one, turning the values of duplicate keys into arrays):

SELECT JSON_MERGE('{"id":1,"name":"Molly","Age":24}',
                  '{"id":2,"name":"Edward","Age":28}');
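To turn an entire result set into a single JSON array, which is what the question asks for, MySQL 5.7.22+ also offers JSON_ARRAYAGG. A sketch, assuming a `people` table shaped like the one in the question:

```sql
-- Aggregate every row of the result set into one JSON array
-- (JSON_ARRAYAGG requires MySQL 5.7.22 or later)
SELECT JSON_ARRAYAGG(JSON_OBJECT('id', id, 'name', name, 'Age', Age)) AS json
FROM people;
```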
llanato

First, look at this thread (SQL Server) on Stack Overflow.

You can also see here for PL/JSON, and here for sql2json (PHP).

Val

Given a table with username and email columns, you can construct the JSON in the following manner in a MySQL user-defined function:

SELECT CONCAT(
     "[",
     GROUP_CONCAT(
          CONCAT("{username:'", username, "',email:'", email, "'}")
     ),
     "]"
) AS json FROM users;

This MySQL query returns a JSON-like string:

[
     {username:'mike',email:'mike@mikesplace.com'},
     {username:'jane',email:'jane@bigcompany.com'},
     {username:'stan',email:'stan@stanford.com'}
]

There is no built-in method that converts a result set to JSON format, so you have to do it yourself in a UDF. Note that the keys above are unquoted, so the output is not strictly valid JSON; wrap the keys in escaped double quotes if a strict parser will consume it.
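As a comment above points out, GROUP_CONCAT output is silently truncated at group_concat_max_len (1024 bytes by default), so for larger result sets the limit has to be raised first. A sketch; the value shown is an arbitrary example:

```sql
-- Raise the GROUP_CONCAT truncation limit for this session only,
-- so the concatenated JSON string is not cut off at 1024 bytes
SET SESSION group_concat_max_len = 1000000;
```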

Romil Kumar Jain

I don't see the point in writing SQL to generate JSON. It's messy and ugly, it would be really hard to generalize into an all-purpose function, and it's not what SQL is designed for: SQL is designed to manage, store and retrieve records and to perform calculations on result sets, really fast and on the fly, and not much more. If you need JSON you have to spit the rows out to a server-side script anyway, so you might as well use the scripting language to create the JSON; that's what scripting languages are for.
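The client-side approach this answer recommends is usually a one-liner. A minimal sketch in Python (the commented-out cursor calls assume the `mysql-connector-python` driver with its `dictionary=True` cursor option; the hard-coded rows stand in for a real query result):

```python
import json

# In practice these rows would come from a cursor, e.g.:
#   cur = conn.cursor(dictionary=True)
#   cur.execute("SELECT * FROM people")
#   rows = cur.fetchall()
rows = [
    {"id": 1, "name": "Molly", "Age": 24},
    {"id": 2, "name": "Edward", "Age": 28},
]

# json.dumps serializes the whole result set in one call
print(json.dumps(rows))
# prints [{"id": 1, "name": "Molly", "Age": 24}, {"id": 2, "name": "Edward", "Age": 28}]
```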

matchdav
    Interesting emphasis of 'for' at the end there. – Gga Dec 04 '13 at 19:38
  • I really think that the question has an - at least - performance related point. Why should I need e.g 100kB data read into DB Server memory, sent over IP and then loaded again in PHP/JS/whatever memory, just to convert them to JSON format, when SQL can do it and probably a lot faster!! – Panais Feb 01 '16 at 09:34
  • at this time when tools like graphql gets hyped, this kind of question looks way more relevant — why need a tool to overlays mysql when it can actually do the job by itself ? – Ben Mar 30 '17 at 14:31