
I'm trying to load a large amount of data from MySQL into an array, but the server's memory limit is too low for it. I could increase the limit, but I don't think that is the right solution. I need a better approach, some kind of loop or streaming.

My actual code:

    $sql = "SELECT * FROM $tables";

    $response = array();
    $result = mysql_unbuffered_query($sql);
    while ($row = mysql_fetch_array($result)) {
        $response[] = array(
            "id"   => $row['code'],
            "name" => $row['name']
        );
    }
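If the rows can be processed as they are fetched, the big array does not need to exist at all. A minimal sketch of that idea, with the deprecated `mysql_*` calls swapped for mysqli (here `$con` is an assumed `mysqli` connection and the `code`/`name` columns are taken from the question):

```php
<?php
// Stream rows to the output as they arrive instead of accumulating them.
// MYSQLI_USE_RESULT is mysqli's unbuffered mode, the counterpart of
// mysql_unbuffered_query(); $con is an assumed mysqli connection.
$result = $con->query("SELECT `code`, `name` FROM `$tables`", MYSQLI_USE_RESULT);

echo '[';
$first = true;
while ($row = $result->fetch_assoc()) {
    if (!$first) {
        echo ',';
    }
    // Only one row is ever held in memory at a time.
    echo json_encode(array('id' => $row['code'], 'name' => $row['name']));
    $first = false;
}
echo ']';
$result->free();
```

With an unbuffered result set you must consume (or free) it before issuing another query on the same connection.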

Using mysql_unbuffered_query() improved the situation, but not enough.

Currently I'm loading about 1,000 rows.

UPDATE

    $sql = "SELECT `code`, `name` FROM $tables";

    $response = array();
    $result = $this->con->prepare($sql);
    echo self::convert(memory_get_usage(true)); // 512 kb
    $result->execute();
    echo self::convert(memory_get_usage(true)); // 21.25 mb
    $array = $result->fetchAll();
    // here the script fails -> out of memory
    foreach ($array as $key => $row) {
        $response[] = array(
            "id"   => $row['code'],
            "name" => $row['name']
        );
    }
The server memory limit is set to 128 MB, yet fetching about 21 MB of data still fails ...
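Part of the problem in the update is `fetchAll()`: it materializes the entire result set at once, on top of the driver's own buffer. A sketch of the same PDO code that disables the MySQL driver's buffering and fetches one row at a time instead (`PDO::MYSQL_ATTR_USE_BUFFERED_QUERY` is specific to the pdo_mysql driver; `$this->con` is assumed to be a PDO instance as in the update above):

```php
<?php
// Disable pdo_mysql's client-side result buffering so rows are pulled
// from the server one at a time instead of all at once.
$this->con->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $this->con->prepare("SELECT `code`, `name` FROM $tables");
$stmt->execute();

$response = array();
// fetch() returns one row per call, so peak memory is one row plus
// whatever you choose to keep, rather than the whole result set twice.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $response[] = array('id' => $row['code'], 'name' => $row['name']);
}
```

Selecting only the two needed columns also matters: for ~1,000 rows of two short strings the resulting `$response` array should be far below the 128 MB limit.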

Dominik Vávra
  • If you can, you should [stop using `mysql_*` functions](http://stackoverflow.com/questions/12859942/why-shouldnt-i-use-mysql-functions-in-php). They are no longer maintained and are [officially deprecated](https://wiki.php.net/rfc/mysql_deprecation). Learn about [prepared](http://en.wikipedia.org/wiki/Prepared_statement) [statements](http://php.net/manual/en/pdo.prepared-statements.php) instead, and consider using PDO, [it's really not hard](http://jayblanchard.net/demystifying_php_pdo.html). – Jay Blanchard Jun 22 '15 at 16:43
  • Why are you creating an array to hold this data, can you not just run the code that is going to use this data in place of the array loading code? – RiggsFolly Jun 22 '15 at 16:46
  • What is your memory limit set to? 1000 rows is not a large number! – RiggsFolly Jun 22 '15 at 16:48
  • I have a 128M memory limit. I'm creating an array because afterwards I use json_encode($response); to write a file for typeahead search. – Dominik Vávra Jun 22 '15 at 16:51
  • You could also reduce the memory requirement if you changed your query to `SELECT \`code\`,\`name\` FROM $table`, fetching just the data you require rather than multiple unnecessary columns. – RiggsFolly Jun 22 '15 at 16:51
  • Also your memory usage depends on the whole script's memory usage. It may well be some other part of the script that is badly written rather than this database access – RiggsFolly Jun 22 '15 at 16:53
  • So I guess a 128M memory limit should be enough to load 1,000 rows. – Dominik Vávra Jun 22 '15 at 17:05
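The typeahead use case mentioned in the comments can also avoid holding the full array or the encoded JSON string in memory at once, by writing the file incrementally. A sketch assuming an already-executed PDO statement `$stmt` selecting `code` and `name`, and a writable path `$path` (both names are placeholders):

```php
<?php
// Write the typeahead JSON file one row at a time, so neither the full
// array nor the full json_encode() string ever exists in memory.
// $stmt is an assumed executed PDO statement; $path an assumed file path.
$fh = fopen($path, 'w');
fwrite($fh, '[');
$first = true;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if (!$first) {
        fwrite($fh, ',');
    }
    fwrite($fh, json_encode(array('id' => $row['code'], 'name' => $row['name'])));
    $first = false;
}
fwrite($fh, ']');
fclose($fh);
```

Each element is encoded and flushed individually, and the concatenated output is still a valid JSON array.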

0 Answers