I'm trying to work with a large amount of data using SQLite, but Apache consumes 100% CPU (Ubuntu server, PHP running as an Apache module).

Example PHP script:

$pdo = new PDO('sqlite:'); // empty path: a private temporary database
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE `members` (uid INTEGER PRIMARY KEY, f INTEGER DEFAULT 1)');

for ($i = 0; $i < 250000; $i++) {
    $uid = rand(1, 500000);
    // One transaction per row: insert the uid if it is new, then bump its counter.
    $pdo->beginTransaction();
    $pdo->exec('INSERT OR IGNORE INTO members VALUES ('.$uid.', 0)');
    $pdo->exec('UPDATE members SET f = f + 1 WHERE uid = '.$uid);
    $pdo->commit();
}

$one_row = $pdo->query('SELECT COUNT(*) AS `count` FROM `members`')->fetch();

echo 'COUNT: '.$one_row['count']."\n";

Is it possible to limit the CPU consumption, or have I done something wrong?
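
For reference, the comments below suggest prepared statements and batching. Here is a sketch of the same workload with a single transaction around the whole loop and reused prepared statements (the usual bulk-insert pattern for SQLite; not verified here as a fix for the CPU load):

$pdo = new PDO('sqlite:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE `members` (uid INTEGER PRIMARY KEY, f INTEGER DEFAULT 1)');

// Prepare once, execute many times.
$insert = $pdo->prepare('INSERT OR IGNORE INTO members VALUES (:uid, 0)');
$update = $pdo->prepare('UPDATE members SET f = f + 1 WHERE uid = :uid');

// One transaction for the whole batch instead of one per row.
$pdo->beginTransaction();
for ($i = 0; $i < 250000; $i++) {
    $uid = rand(1, 500000);
    $insert->execute([':uid' => $uid]);
    $update->execute([':uid' => $uid]);
}
$pdo->commit();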

  • SQLite is a poor choice for large amounts of data. Why are you not using prepared statements? Why are you updating every row one at a time, instead of declaring `f` in PHP, incrementing it, and inserting it? – Ian Kemp Oct 20 '15 at 07:50
  • An API returns data to me as arrays of IDs, and I need to count the overlapping IDs. At first I did it with plain PHP arrays, but an array of more than 500K entries pushed the script well past the 128 MB memory limit, and I can't allow that, since the server is expected to run about 250 scripts at once. So the choice fell on SQLite; memory-wise everything is fine, but now the problem is the CPU cost. – Leonid Chernyadyev Oct 20 '15 at 07:59
  • It sounds to me that your *actual* problem is "what is the most efficient way to do this task", not "how do I make this algorithm use less CPU". If that's correct, I would suggest that you post a new question with that information (and your current implementation) so that we can assist. – Ian Kemp Oct 20 '15 at 09:43
  • No, the question is how to optimize the long run of INSERTs so that SQLite does not load the CPU to 100% (a throttling sketch follows below). – Leonid Chernyadyev Oct 20 '15 at 10:16
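
On the literal question of capping CPU, a crude sketch (assuming the same loop as in the question): lower the process priority with proc_nice() and/or sleep periodically. This only spreads the work over more wall-clock time rather than making it cheaper, and the batch size and pause length here are arbitrary tuning values:

// Lower scheduling priority so other processes get the CPU first.
// Caveat: under Apache mod_php this renices the whole worker process.
if (function_exists('proc_nice')) {
    proc_nice(19);
}

for ($i = 0; $i < 250000; $i++) {
    // ... per-row INSERT/UPDATE work from the question ...
    if ($i % 1000 === 0) {
        usleep(10000); // pause 10 ms every 1000 iterations (arbitrary values)
    }
}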

0 Answers