I have a large data set, either from MySQL or from an array, with over 40,000 rows. I need to loop over it and then echo, insert, or update for each row, but the script keeps crashing because it runs out of memory. How should I handle data this large?
-
Read them in batches from DB using limit/offset and process (a rough sketch follows these comments). – Raj Dec 02 '16 at 01:49
-
Better yet, [skip `LIMIT` and `OFFSET` (which don't scale well) and paginate in your `WHERE` clause](http://use-the-index-luke.com/sql/partial-results/fetch-next-page). – ChrisGPT was on strike Dec 02 '16 at 02:08
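To make the two comments above concrete, here is a rough, untested sketch of batched processing that paginates with a keyset condition in the WHERE clause rather than a growing OFFSET. The table and column names (members, id, name, age) and the connection variable $link are assumptions for illustration only:

// batch through the table using the last seen id as the cursor
$lastId    = 0;      // highest id processed so far
$batchSize = 1000;   // rows per round trip

do {
    $sql = sprintf(
        'SELECT id, name, age FROM members WHERE id > %d ORDER BY id LIMIT %d',
        $lastId,
        $batchSize
    );
    $rows = 0;
    if ($res = mysqli_query($link, $sql)) {
        while ($r = mysqli_fetch_assoc($res)) {
            // per-row work goes here: echo, build an UPDATE/INSERT, etc.
            echo $r['name'] . ', ' . $r['age'] . "\n";
            $lastId = (int) $r['id'];
            $rows++;
        }
        mysqli_free_result($res);
    }
} while ($rows === $batchSize); // a short batch means we have reached the end

Because the condition is `id > $lastId` on an indexed column, each batch starts where the previous one ended instead of re-scanning the skipped rows, which is why this scales better than an ever-larger OFFSET.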
1 Answer
-1
If you can, show a code sample. The idea of picking up large data from a database just to "echo it" shouldn't be an issue. Don't add it to any array; just output each row as you pick it up, row by row. That way you never accumulate the whole data set in memory.
This is not pure/tested/real code, but just to show the idea in code format:

// assume $link is an open mysqli connection to your DB
$sql = "SELECT name, age FROM big_list_of_members";
if ($res = mysqli_query($link, $sql)) {
    // fetch and print one row at a time instead of loading them all into an array
    while ($r = mysqli_fetch_assoc($res)) {
        echo 'Name: ' . $r['name'] . ', Age: ' . $r['age'] . "\n";
    }
    mysqli_free_result($res);
}
No matter how big the table is, you are never going to allocate too much memory here.
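One caveat, offered as a sketch rather than part of the original answer: by default mysqli_query buffers the entire result set in PHP's memory before the loop starts. If that buffering is itself too large, mysqli supports an unbuffered mode via the MYSQLI_USE_RESULT flag, which streams rows from the server as you fetch them:

// variation on the loop above, streaming rows instead of buffering them all
$sql = "SELECT name, age FROM big_list_of_members";
if ($res = mysqli_query($link, $sql, MYSQLI_USE_RESULT)) {
    while ($r = mysqli_fetch_assoc($res)) {
        echo 'Name: ' . $r['name'] . ', Age: ' . $r['age'] . "\n";
    }
    mysqli_free_result($res);
}

The trade-off is that the connection cannot run another query until the unbuffered result has been fully read and freed.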

Shai Shprung
-
**_Please_** don't recommend that people use the old, broken, insecure `mysql_*` functions that were deprecated in PHP 5.5 (which is so old it no longer even receives security updates) and completely removed in PHP 7. _Nobody_ should be writing new code with these functions. Instead, talk about PDO and / or `mysqli`. See http://stackoverflow.com/q/12859942/354577 for details. – ChrisGPT was on strike Dec 02 '16 at 02:04
-
Downvoted for using the obsolete, not secure, soon-to-be-deprecated "mysql_*" functions. – Rav Dec 02 '16 at 02:17
-
Ok, agree. This is why I started with "this is not pure/tested/real code" - it was more of pseudo code. I edited it to reflect the use of mysqli. – Shai Shprung Dec 02 '16 at 02:55