
Server B makes an HTTP POST request to Server A, and A returns the results of a MySQL query.

At first I simply fetched the results via PDO and encoded them as JSON; Server B then did a simple json_decode(). But sometimes millions of rows need to be transferred, so the JSON overhead blew up the response size.
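The size overhead comes from the row format: with associative fetching, every row's JSON object repeats the column names. A minimal sketch (the rows here are hypothetical stand-ins for the PDO result set):

```php
<?php
// With PDO::FETCH_ASSOC, every row repeats the column names
// in the encoded JSON output.
$rows = [
    ['id' => 1, 'name' => 'Alice'],
    ['id' => 2, 'name' => 'Bob'],
];

echo json_encode($rows);
// [{"id":1,"name":"Alice"},{"id":2,"name":"Bob"}]
// "id" and "name" appear once per row — over millions of rows,
// the repeated keys can dwarf the actual values.
```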

I'm about to format the query results as CSV, send them to B, and turn them back into an array of objects there.
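A sketch of that round trip, assuming the whole result set fits in memory (the function names are illustrative, not from any library):

```php
<?php
// Server A: render the result set as CSV, field names once in line 1.
function rowsToCsv(array $rows): string {
    $fh = fopen('php://temp', 'r+');
    fputcsv($fh, array_keys($rows[0]));        // header line
    foreach ($rows as $row) {
        fputcsv($fh, array_values($row));      // one line per row
    }
    rewind($fh);
    return stream_get_contents($fh);
}

// Server B: rebuild an array of objects from the CSV.
function csvToObjects(string $csv): array {
    $lines  = array_filter(explode("\n", trim($csv)));
    $header = str_getcsv(array_shift($lines));
    $out    = [];
    foreach ($lines as $line) {
        // Pair each value with its column name again.
        $out[] = (object) array_combine($header, str_getcsv($line));
    }
    return $out;
}
```

Note that everything comes back as strings on the CSV path, so B loses the numeric/string type distinction JSON preserves.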

My question: Is that a practical approach? Why does no one else seem to prefer a solution like this? I'm surely not the only one trying to solve this problem.

  • Are you trying to achieve database replication/backup? – bassxzero Mar 23 '17 at 11:51
  • Nope, just single queries that are pre-defined on Server A. So Server A controls the queries Server B is allowed to execute. – DoeTheFourth Mar 23 '17 at 11:53
  • If I understand correctly, you're saying that `json_decode()` and `json_encode()` are too much overhead? Won't parsing a CSV be the same, greater, or a negligible decrease in overhead? – bassxzero Mar 23 '17 at 11:58
  • I meant the overhead of the data itself, of transferring it over the internet. I haven't compared the time it takes to parse JSON vs. CSV so far. – DoeTheFourth Mar 23 '17 at 12:02
  • So you're talking about SFTP vs HTTP? http://stackoverflow.com/questions/717200/comparing-http-and-ftp-for-transferring-files – bassxzero Mar 23 '17 at 12:03
  • The protocol could be another way to tweak the performance. But independently of the protocol, I'm looking for the best way to format the data right now. And CSV uses the first line for the field names, whereas JSON holds them in every object (for every row). – DoeTheFourth Mar 23 '17 at 12:11
  • no one else seem to prefer a solution like this *obviously* because they don't have such an idea of transferring millions of rows to and fro. – Your Common Sense Mar 23 '17 at 12:19
  • "JSON holds them in every object" this is by your choice. This is a perfectly acceptable way to transfer the data with no duplicated headers. http://tinypic.com/r/25tdqis/9 – bassxzero Mar 23 '17 at 12:24
  • @YourCommonSense How is it done then? Via FTP? What would such a request/response process look like? @ bassxzero You're right. Is there a handy function to turn every row back into an object (attribute => value)? – DoeTheFourth Mar 23 '17 at 12:40
  • @DoeTheFourth http://php.net/manual/en/function.array-combine.php – bassxzero Mar 23 '17 at 12:46
  • It depends on the details. Maybe an SQL dump would be best. – Your Common Sense Mar 23 '17 at 12:54
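The "no duplicated headers" layout suggested in the comments keeps the payload as JSON but sends the column names only once, with `array_combine()` restoring per-row objects on the receiving side. A minimal sketch under those assumptions:

```php
<?php
// Server A: column names once, then plain value arrays per row.
$payload = json_encode([
    'columns' => ['id', 'name'],
    'rows'    => [[1, 'Alice'], [2, 'Bob']],
]);

// Server B: pair each value row with the column names again.
$data    = json_decode($payload, true);
$objects = array_map(
    fn ($row) => (object) array_combine($data['columns'], $row),
    $data['rows']
);
// $objects[0]->name === 'Alice', and numeric types survive,
// unlike on the CSV path.
```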

0 Answers