
I have a lot of data to export to a CSV file. My function loops over each field and executes a function to get data from an SQL table. Now I have a very big database and I want to export some data without changing the memory_limit config, because I don't want to block other users.

How can I execute my function?

For example: I have 100,000 persons and each person has a lot of versions of some data. Each day they save information like this:

Person Table
+-----------+-------------+-------------+
| id_person | name_person | city_person |
+-----------+-------------+-------------+
| 1         | Jack        | Paris       |
+-----------+-------------+-------------+
| 2         | John        | London      |
+-----------+-------------+-------------+
| ...       | ...         | ...         |
+-----------+-------------+-------------+
| 99999     | Rose        | Madrid      |
+-----------+-------------+-------------+
| 100000    | Jackie      | Rome        |
+-----------+-------------+-------------+

Field Table
+----------+------------+-------------------+
| id_field | name_field | label_field       |
+----------+------------+-------------------+
| 1        | Location   | Visited location  |
+----------+------------+-------------------+
| 2        | Article    | Count of articles |
+----------+------------+-------------------+
| ...      | ...        | ...               |
+----------+------------+-------------------+
| 289      | Distance   | Distance          |
+----------+------------+-------------------+
| 299      | Pause      | Time of pause     |
+----------+------------+-------------------+

Field Value Table
+----------+----------+-----------+----------------+------------+
| id_value | id_field | id_person | value          | Date       |
+----------+----------+-----------+----------------+------------+
| 1        | 1        | 148       | Hanover Street | 2015-05-10 |
+----------+----------+-----------+----------------+------------+
| 2        | 66       | 57962     | 20             | 2015-05-10 |
+----------+----------+-----------+----------------+------------+
| ...      | ...      | ...       | ...            |            |
+----------+----------+-----------+----------------+------------+
| 3475992  | 105      | 847       | 17,5           | 2018-02-01 |
+----------+----------+-----------+----------------+------------+
| 3475993  | 15       | 66359     | 44             | 2018-02-01 |
+----------+----------+-----------+----------------+------------+

Each field has a specific function to get its data.

How can I get all the data to export into a CSV file without changing the memory limit?

Thanks

prozbk
  • Please show us how you are assembling and writing out this data, and explain at what point you are running into a problem with this. – misorude Oct 22 '18 at 09:54
  • I have a function for each field in the Field table. For example: I have a function GetArticle($persons). This function searches the Field Value table for the last version of the value of the Article field for each person in the $persons array. When the list of $persons is big and the list of fields too, I get timeout and memory limit errors – prozbk Oct 22 '18 at 10:04
  • Sorry, I hit enter before I completed the comment! – prozbk Oct 22 '18 at 10:08

3 Answers


Use unbuffered queries, use implicit flushing, send the data directly to the output buffer (for downloads), or use the CLI (for file exports). Turn off/increase time limits (if needed) for just this script, not globally.

http://php.net/manual/en/mysqlinfo.concepts.buffering.php
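For example, with PDO and MySQL you can switch off result buffering per connection, so rows are streamed one at a time instead of being loaded into PHP memory all at once. A minimal sketch; the DSN, credentials and column list are assumptions, the table name comes from the question:

$pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass');
// Unbuffered: MySQL keeps the result set, PHP pulls one row at a time.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT id_person, name_person, city_person FROM person');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // Handle one row; memory use stays flat regardless of table size.
}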

http://php.net/manual/en/wrappers.php.php
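Writing the CSV straight to php://output with fputcsv() avoids building the whole file in a string first. A sketch, reusing an unbuffered $stmt like the one above; the file name and column names are assumptions:

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');          // write directly to the response body
fputcsv($out, ['id_person', 'name_person', 'city_person']);
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);                    // each row goes straight to the client
}
fclose($out);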

How to flush output after each `echo` call? (the answer by @Roger)
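The gist of that answer is to enable implicit flushing and close any nested output buffers, so each echo (or each fputcsv to php://output) is pushed to the client immediately instead of piling up in memory. A sketch:

ob_implicit_flush(true);      // flush automatically after every piece of output
while (ob_get_level() > 0) {  // close any buffers opened by the framework or php.ini
    ob_end_flush();
}
echo "id_person;name_person;city_person\n"; // sent right away, not held in a buffer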

http://php.net/manual/en/function.set-time-limit.php

It's too much code for me to write the whole thing, and there is too much unknown: which DB you use (MySQL, MSSQL, etc.), which DB classes (PDO or MySQLi), whether you are exporting to a file on the server or downloading it, and whether you want the data as CSV, SQL, etc.

  • Not buffering the query will cost more network-wise and take longer, but it manages memory better and handles larger tables better.
  • Implicit flushing keeps the output buffer small (manages memory).
  • Sending data to php://output gives better memory management and is more efficient.
  • The time limit should be obvious.

My function loops over each field and executes a function to get data from an SQL table.

Use JOINs instead of repeatedly calling the DB, and put proper indexes on your tables.
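As a hedged sketch of what a single joined query could look like on the question's schema (actual table and column names are assumed from the question's tables; the MAX(Date) subquery is just one way to pick the latest value per person and field):

$sql = "
    SELECT p.id_person, p.name_person, f.name_field, v.value, v.Date
    FROM person p
    JOIN field_value v ON v.id_person = p.id_person
    JOIN field f       ON f.id_field  = v.id_field
    WHERE v.Date = (
        SELECT MAX(v2.Date)
        FROM field_value v2
        WHERE v2.id_person = v.id_person
          AND v2.id_field  = v.id_field
    )
    ORDER BY p.id_person, f.id_field
";
$stmt = $pdo->query($sql); // one round trip instead of one query per field per person

The field-specific formatting (list, text, numbers, phone, currency...) can still be applied row by row in PHP while streaming this result to the CSV.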

It's OK to use ini_set('memory_limit', ...) and set_time_limit() because they only affect the current PHP process, not the global configuration. It's obviously best to avoid them if you can, but sometimes that's just not possible.
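For example, only inside the export script (the values here are arbitrary):

ini_set('memory_limit', '512M'); // affects only this PHP process
set_time_limit(0);               // 0 = no time limit, again only for this script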

The fastest way to export is mysqldump:

https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html

But it has limitations (for example)

https://dba.stackexchange.com/questions/4654/is-it-possible-to-mysqldump-a-subset-of-a-database-required-to-reproduce-a-query

You can't export using a JOIN, and complex queries become very difficult because I think you can only use a basic --where clause, no aggregation, etc.

ArtisticPhoenix
  • Can you give me more explanation, please? I'm a beginner with output buffering. Thanks – prozbk Oct 22 '18 at 10:15
  • Not without more information about what you are doing. – ArtisticPhoenix Oct 22 '18 at 10:20
  • I already explained this in another comment: I have a function for each field in the Field table. For example: I have a function GetArticle($persons). This function searches the Field Value table for the last version of the value of the Article field for each person in the $persons array. When the list of $persons is big and the list of fields too, I get timeout and memory limit errors. I can't use joins because each field has a specific behavior and a specific display format (list, text, numbers, phone, currency...) – prozbk Oct 22 '18 at 10:30
  • `I can't use joins because each field have a specific behavior` - you can use joins; this `specific behavior` is done in PHP? Nothing is stopping you from doing it after using a join. The query is more complicated, yes, but the PHP code will be simpler (in general). PS: that still doesn't tell me anything useful. – ArtisticPhoenix Oct 22 '18 at 10:32
  • Lastly, there are tons of examples on PHP.net; it's all explained better there than I can do here. – ArtisticPhoenix Oct 22 '18 at 10:37

There are two ways to read and export the big data:

  • Via a batch process - split the bulk data into chunks, sleep, then continue processing the next chunk.
  • Via queuing items in the database

SAMPLE CODE

$con = 'mysql:host=localhost;dbname=example';
$username = 'example';
$password = 'example';
$pdo = new PDO($con, $username, $password);

// Which pass (chunk) we are on; it is passed back to the script on each redirect.
$pass   = !empty($_GET['pass']) ? (int) $_GET['pass'] : 0;
$chunk  = 10;
$offset = $pass * $chunk;

$query = $pdo->prepare("SELECT * FROM users LIMIT $offset, $chunk");
$query->execute();
// Only one chunk is held in memory at a time.
$results = $query->fetchAll();

// Nothing left to fetch, we have finished.
if (!count($results)) {
  exit();
}

foreach ($results as $result) {
  // Perform the lengthy operation on this row.
  sleep(1);
}

// Send the request back to process the next chunk.
$pass++;
$redirect = $_SERVER['DOCUMENT_URI'] . '?pass=' . $pass;
header("Location: $redirect");
exit();
Vernit Gupta

Try this export via the command line:

mysqldump -p -u username database_name > dbname.csv
VinothRaja
  • He's trying to do it in PHP; exporting a MySQL dump will NOT create a CSV – delboy1978uk Oct 22 '18 at 09:56
  • I don't want to export an SQL file. I want to export a CSV file – prozbk Oct 22 '18 at 10:03
  • @VinothRaja I don't want to export the whole database. I have functions that get the data I want from several tables. When I loop over the list I get timeout & memory errors! mysqldump is not a solution – prozbk Oct 22 '18 at 10:17