
I hope you are well... I have a problem: when I try to export more than 500 rows from MySQL, the server hits its timeout limit. Is there a way to export in the background? I want to download, for example, 2,000 rows.

<?php
$con = mysqli_connect(DATA TO CONNECT) or die("could not connect to mysql");
mysqli_set_charset($con, 'utf8mb4');
session_start();
if (!$_SESSION['user']) {
    header('Location: pages/login.php');
}
if (isset($_GET['country'])) {
    $query = mysqli_query($con, "SELECT * FROM profiles WHERE username='{$_SESSION['user']}' AND status='0' AND country='{$_GET['country']}'");
    $token = '' . substr(md5("random" . mt_rand()), 0, 10);
    $file = $_GET['country'] . "_" . $token . '.txt';
    $counter = 0;
    while ($data = mysqli_fetch_array($query)) {
        $counter++;
        $current = file_get_contents($file);
        $current .= $data['name'] . ":" . $data['country'] . "\n";
        file_put_contents($file, $current);
        mysqli_query($con, "UPDATE profiles SET status=1 WHERE id='{$data['id']}'");
        if ($counter >= 200) {
            break;
        }
    }
    $content = file_get_contents($file);
    header('Content-Type: application/octet-stream');
    header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
    unlink($file);
    echo $content;
}
?>
  • Save the query output to a TXT file using `SELECT .. INTO OUTFILE`, not via PHP code (a sketch follows these comments). – Akina Nov 09 '21 at 05:52
  • You're doing a lot of file reading and writing to assemble the text. You don't need to do any of that: build the output in an array and send it. You're also running an update query on every iteration of the loop. Consider reworking that query to perform all the required updates with a single query. – Tangentially Perpendicular Nov 09 '21 at 07:44
  • **Warning:** You are wide open to [SQL Injections](https://php.net/manual/en/security.database.sql-injection.php) and should use parameterized **prepared statements** instead of manually building your queries. They are provided by [PDO](https://php.net/manual/pdo.prepared-statements.php) or by [MySQLi](https://php.net/manual/mysqli.quickstart.prepared-statements.php). Never trust any kind of input! Even when your queries are executed only by trusted users, [you are still in risk of corrupting your data](http://bobby-tables.com/). [Escaping is not enough!](https://stackoverflow.com/q/5741187) – Dharman Nov 09 '21 at 07:47
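
A minimal sketch of Akina's `SELECT .. INTO OUTFILE` suggestion, under some assumptions: `$con` is the mysqli connection from the question, the MySQL account holds the FILE privilege, and the path `/var/tmp/export.txt` is illustrative. Note that `INTO OUTFILE` writes the file on the *database server's* filesystem (not the web server's), and the target file must not already exist.

<?php
// Sketch only: the server writes the file itself, with no PHP loop at all.
$stmt = $con->prepare(
    "SELECT name, country
       INTO OUTFILE '/var/tmp/export.txt'  -- assumed, server-side path
       FIELDS TERMINATED BY ':'
       LINES TERMINATED BY '\\n'
       FROM profiles
      WHERE username = ? AND status = '0' AND country = ?"
);
$stmt->bind_param('ss', $_SESSION['user'], $_GET['country']);
$stmt->execute();
?>

The web application would then still need a way to hand that server-side file to the browser, which is why the answer below keeps everything in PHP instead.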

1 Answer


You're reading and rewriting the file on every iteration while building the output, and that is slowing everything down. You don't need to create a file at all: you can send the data to the browser straight from memory.

You're also performing an individual update query for every row. For 2000 rows that's 2000 update queries when only one is needed.

Lastly, there's a counter that breaks the loop after 200 rows. I imagine that's there for testing, but a better way to limit the data is to add ORDER BY and LIMIT clauses to your queries.

Here's a new version of your code that addresses these issues and also reworks the queries to use prepared statements. I've taken the precaution of selecting the records FOR UPDATE to prevent another session from updating them while this runs, and of wrapping the whole thing in a transaction so that it can be rolled back if anything goes wrong.

<?php
/**
 * exportText.php
 *
 */
error_reporting(E_ALL);
ini_set('display_errors',1);
session_start();

$_SESSION['user'] = 'Fred';  // for testing only

if (!$_SESSION['user']) {
    header('Location: pages/login.php');
    exit;   // without exit the rest of the script would still run
}

if (isset($_GET['country'])) {
    try {
        mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
        $con = new mysqli('host', 'user', 'password', 'schema'); // throws on failure under MYSQLI_REPORT_STRICT
        $con->set_charset('utf8mb4');
        $con->begin_transaction();
        $stmt = $con->prepare("SELECT name, country FROM profiles WHERE username=? AND status='0' AND country=? ORDER BY id LIMIT 200 FOR UPDATE");
        $stmt->bind_param('ss', $_SESSION['user'], $_GET['country']);
        $stmt->execute();
        $stmt->bind_result($name, $country);

        // Buffer the rows until we've read them all and completed the
        // database update. This avoids any error output being mixed into the download.
        $output = [];

        while ($stmt->fetch()) {
            $output[] = "$name:$country\n";
        }
        // Update the statuses using the same criteria as the SELECT above.
        $stmt = $con->prepare("UPDATE profiles SET status = 1 WHERE username=? AND status='0' AND country=? ORDER BY id LIMIT 200");
        $stmt->bind_param('ss', $_SESSION['user'], $_GET['country']);
        $stmt->execute();

        // Build a throw-away file name for the download; no actual file is created
        $token = '' . substr(md5("random" . mt_rand()), 0, 10);
        $file = $_GET['country'] . "_" . $token . '.txt';

        // Start the output
        header('Content-Type: application/octet-stream');
        header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
        echo implode('',$output);

        $con->commit();

    } catch (Exception $e) {
        if (isset($con)) {  // the connection attempt itself may have thrown
            $con->rollback();
        }
        echo "Exception: " . $e->getMessage();
    }
}
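
One optional hardening step, not in the original answer: before committing, you could check that the UPDATE touched exactly as many rows as were exported, and let the catch block roll everything back on a mismatch. A sketch using mysqli's `affected_rows`, meant to sit directly after the UPDATE's `execute()` call:

// Hypothetical consistency check: the FOR UPDATE lock means the UPDATE
// should hit exactly the rows we just read, so the counts must match.
if ($stmt->affected_rows !== count($output)) {
    throw new RuntimeException('Exported ' . count($output)
        . ' rows but updated ' . $stmt->affected_rows);
}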