
I am selecting a lot of data from my MySQL database:

 $sql = "SELECT * FROM data WHERE id = ?";  
 $q = $pdo->prepare($sql);
 $q->execute([$id]);
 $array = $q->fetchAll(PDO::FETCH_ASSOC);

 var_dump($array);

I want to store this data in an array and afterwards work with that array in a loop. My problem is that I have such an immense amount of data that building the array takes forever and my system is overwhelmed. Is there a way to build an array from a huge amount of data with better performance?

peace_love
  • You could always let your query return chunks and handle those (see the sketch after these comments). This link is Laravel specific, but it conveys the idea: http://laraveldaily.com/process-big-db-table-with-chunk-method/ – Loek May 08 '18 at 08:47
  • how many rows and columns are we talking about? – Lelio Faieta May 08 '18 at 08:47
  • @LelioFaieta around 500.000 – peace_love May 08 '18 at 08:48
  • @Jarla, you can use LIMIT to fetch records in batches and use pagination to display them – Peerbits - Backend Team May 08 '18 at 08:49
  • What is the array being used for after you've got it? Depending on that there are potentially other avenues you can take. i.e. by 'working with it' do you mean mutating the values? – Mikey May 08 '18 at 08:49
  • 500.000 records? My suggestion is to create specific queries where needed, with just the records you need. Querying the database is always faster than array manipulation (even if the array is a small one) – Lelio Faieta May 08 '18 at 08:50
  • and why would you need that many rows at once? and why would you need all 500k rows inside an array? maybe you should answer that first – Kevin May 08 '18 at 09:02
  • @Ghost Because each row is a folder path, and I am creating a folder tree from this array – peace_love May 08 '18 at 09:05
  • Just like the comment above: query only the rows that you need. I don't think it makes sense to select all rows at once. Do you need to show every folder path at once? – Kevin May 08 '18 at 09:07
  • Do you really need all the fields from the "data" table? – SpacePhoenix May 08 '18 at 09:08
  • @Ghost Well, the problem is that at some point I need all the folder paths at once, to be able to find out which is the highest folder in the tree and so on – peace_love May 08 '18 at 09:13
  • @Jarla - 500K rows could easily cost you a gigabyte of PHP space. – Rick James May 21 '18 at 04:49
  • @Jarla - What's the algorithm for "highest folder"? Perhaps it can be done with a _few_ SQL statements. – Rick James May 21 '18 at 04:51
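
Following the chunking and pagination suggestions in these comments, here is a minimal sketch of chunked processing with PDO, assuming the table can be paged with LIMIT/OFFSET; the chunk size of 1000 is an arbitrary example value.

// Sketch: process the result set in fixed-size chunks instead of
// loading one huge array. In practice, add an ORDER BY on an indexed
// column so the paging order is stable.
$q = $pdo->prepare("SELECT * FROM data WHERE id = ? LIMIT ? OFFSET ?");
$chunkSize = 1000;
$offset = 0;
do {
    $q->bindValue(1, $id);
    $q->bindValue(2, $chunkSize, PDO::PARAM_INT);
    $q->bindValue(3, $offset, PDO::PARAM_INT);
    $q->execute();
    $chunk = $q->fetchAll(PDO::FETCH_ASSOC);
    foreach ($chunk as $row) {
        // process one row at a time, e.g. collect its folder path
    }
    $offset += $chunkSize;
} while (count($chunk) === $chunkSize);

Note that OFFSET gets slower the larger it grows; if an indexed, sequentially increasing column is available, keyset pagination (WHERE that_column > last_seen_value) scales better on very large tables.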

4 Answers


The fetchAll() function loads the entire result set into the variable at once.

Fetching one row per iteration avoids building the huge array up front and is considerably lighter on your system:

$q->execute([$id]);

$i = 0;
while ($row = $q->fetch()) {
    // do something with $row
    $i++; // keep a count of processed rows
}
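
One caveat with the sketch above: the MySQL PDO driver buffers queries by default, so the driver still holds the full result set in memory even when you fetch row by row. If memory remains the bottleneck, here is a minimal sketch that disables buffering (this attribute is specific to the pdo_mysql driver):

// Unbuffered query: rows are streamed from the server instead of the
// driver holding the whole result set in memory (pdo_mysql only).
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$q = $pdo->prepare("SELECT * FROM data WHERE id = ?");
$q->execute([$id]);
while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
    // process $row; note that no other query can run on this
    // connection until the result set is fully consumed
}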
Miraris

Use the fetch() function instead of fetching everything at once, like below:

while ($arr = $stmt->fetch()){
   // do_other_stuff(); 
}

This will reduce the load on your system.
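
If the rest of your code still expects something it can loop over, a generator can wrap this fetch loop; fetchRows() below is a hypothetical helper, not a built-in PDO function:

// Hypothetical helper: yields rows one at a time, so callers can
// foreach over the result without building a 500k-element array.
function fetchRows(PDOStatement $stmt): Generator
{
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        yield $row;
    }
}

foreach (fetchRows($stmt) as $arr) {
    // do_other_stuff();
}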



I am currently processing 12,000 rows, encoding them into a JSON array, and in my opinion this works best for me.

// mysqli_query() does not support ? placeholders, so use a prepared
// statement to bind the id ('i' assumes an integer id column)
$stmt = mysqli_prepare($dbconn, "SELECT * FROM data WHERE id = ?");
mysqli_stmt_bind_param($stmt, 'i', $id);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

$rows = array();
while ($r = mysqli_fetch_assoc($result)) {
    $rows[] = $r;
}

Then echo it out (in my case, echo the json_encode of the result).
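
For example, assuming the rows are meant to be sent back to a client as JSON:

// Emit the collected rows as a single JSON document.
header('Content-Type: application/json');
echo json_encode($rows);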

Sanguinary

The main thing to consider is:

  1. Will this table grow even further? (It is already big :))

Looping over 500k records on every request is not advisable; it will definitely hurt speed and performance.

I suggest the following options:

  1. Add a field to the table that identifies the highest folder (something like a sort-order field) and update it correctly whenever needed, so that you can fetch the highest folder first. If there is also a field identifying the parent, you can easily build a tree.
  2. Alternatively, consider a separate table that stores the highest folders with a foreign key to the main table. Update it accordingly so that you can easily identify the highest folder (see the SQL sketch after this answer).

PS: you could use a cron job to process the data and update the sort order if needed.
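
As a third option, echoing the comment above about solving this with a few SQL statements: if each row's path is a string like "a/b/c", the database itself can find the highest (shallowest) folder by counting separators. A hedged sketch, assuming a hypothetical path column and '/' as the separator:

// Let MySQL find the shallowest path (fewest '/' separators) instead
// of loading all 500k rows into PHP. `path` is a hypothetical column.
$sql = "SELECT path
        FROM data
        WHERE id = ?
        ORDER BY LENGTH(path) - LENGTH(REPLACE(path, '/', '')) ASC
        LIMIT 1";
$q = $pdo->prepare($sql);
$q->execute([$id]);
$highest = $q->fetchColumn();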

Manojkumar B