
I have around 700-800 visitors on my home page at any given time (according to analytics), and a lot of hits in general. I want to show live statistics about my users and other data on my homepage, so I have this:

$stmt = $dbh->prepare("
    SELECT
        count(*) AS totalusers,
        sum(cashedout) AS cashedout,
        (SELECT sum(value) FROM xeon_stats_clicks
         WHERE typ='1') AS totalclicks
    FROM users
");
$stmt->execute();
$stats = $stmt->fetch();

Which I then use as $stats["totalusers"] etc.

Table `users` has 22,210 rows, with an index on `id, username, cashedout`. Table `xeon_stats_clicks` has indexes on `value` and `typ`.

However, whenever I enable the above query, my website instantly becomes very slow. As soon as I disable it, the load time drops drastically.

How else can this be done?

oliverbj

3 Answers


From what I can tell, there is nothing about this query that is specific to any user on the site. So if you have this query being executed for every user that makes a request, you are making thousands of identical queries.

You could do a sort of caching like so:

Create a table that basically looks like the output of this query.

Make a PHP script that just executes this query and updates the aforementioned table with the latest result.

Execute this PHP script as a cron job every minute to update the stats.

Then the query that gets run for every request can be real simple, like:

SELECT totalusers, cashedout, totalclicks FROM stats_table
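The three steps above could be sketched as follows. This uses an in-memory SQLite database with toy data so the example is self-contained; in production you would reuse the existing MySQL `$dbh` connection, and `stats_table` is an assumed name for the cache table.

```php
<?php
// Hypothetical refresh script, run from cron every minute.
// SQLite in memory stands in for the real MySQL connection here.
$dbh = new PDO('sqlite::memory:');
$dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Minimal schema standing in for the real tables, plus toy data.
$dbh->exec("CREATE TABLE users (id INTEGER, cashedout INTEGER)");
$dbh->exec("CREATE TABLE xeon_stats_clicks (typ TEXT, value INTEGER)");
$dbh->exec("CREATE TABLE stats_table (totalusers INTEGER, cashedout INTEGER, totalclicks INTEGER)");
$dbh->exec("INSERT INTO users VALUES (1, 5), (2, 10)");
$dbh->exec("INSERT INTO xeon_stats_clicks VALUES ('1', 3), ('1', 4)");

// Run the expensive aggregate once and overwrite the single cache row.
$stats = $dbh->query("
    SELECT count(*) AS totalusers,
           sum(cashedout) AS cashedout,
           (SELECT sum(value) FROM xeon_stats_clicks WHERE typ='1') AS totalclicks
    FROM users
")->fetch(PDO::FETCH_ASSOC);

$dbh->exec("DELETE FROM stats_table");
$ins = $dbh->prepare("INSERT INTO stats_table VALUES (?, ?, ?)");
$ins->execute([$stats['totalusers'], $stats['cashedout'], $stats['totalclicks']]);

// The per-request query is then a trivial single-row read.
$cached = $dbh->query("SELECT totalusers, cashedout, totalclicks FROM stats_table")
              ->fetch(PDO::FETCH_ASSOC);
```

With cron invoking this script (e.g. `* * * * * php /path/to/refresh_stats.php`), every page request only pays for the cheap single-row read.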
Dan

You should not do it that way. You will eventually exhaust your precious DB resources, as you are now experiencing. The good way is to run a separate cron job at a 30-second or 1-minute interval, and write the result to a file:

file_put_contents('stats.txt', $stats["totalusers"]);

and then on your mainpage

<span>current users :
   <b><?php echo file_get_contents('stats.txt'); ?></b>
</span>

The beauty is that the OS keeps such a small, frequently read file in its filesystem cache, so until stats.txt is changed, reads are effectively served from memory.


Example: saving and loading JSON via a file:

$test = array('test' => 'qwerty');
file_put_contents('test.txt', json_encode($test));
echo json_decode(file_get_contents('test.txt'))->test;

will output `qwerty`. Replace `$test` with `$stats`, as in the comment:

echo json_decode(file_get_contents('stats.txt'))->totalclicks;
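Putting the pieces together, the write and read halves could look like this. The `$stats` values here are made up for illustration; in the cron script they would come from the query in the question. The temp-file-then-rename step is an added safety measure so a page request never reads a half-written file.

```php
<?php
// Hypothetical $stats as produced by the cron query (values are made up).
$stats = ['totalusers' => 22210, 'cashedout' => 1234, 'totalclicks' => 56789];

// Cron side: write atomically by dumping to a temp file,
// then renaming it over stats.txt.
$file = sys_get_temp_dir() . '/stats.txt';
file_put_contents($file . '.tmp', json_encode($stats));
rename($file . '.tmp', $file);

// Page side: decode once, then echo whichever fields you need.
$cached = json_decode(file_get_contents($file), true);
echo $cached['totalusers'], "\n";
echo $cached['totalclicks'], "\n";
```

Decoding with the second argument `true` gives an associative array, so the page code can keep using the familiar `$stats["totalusers"]` style of access.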
davidkonrad
  • That's ... So simple. So obvious. Thank you for kicking my common sense! – oliverbj Oct 28 '15 at 19:44
  • Just a side question: can I write "totalcashedout" and "totalclicks" to the same file and still echo it? Or should it be 3 separate files? – oliverbj Oct 28 '15 at 19:45
  • @oliverbj, you could save it as JSON with `json_encode($stats)` and then parse the content after you load the file, something like `echo json_decode(file_get_contents('stats.txt'))->totalclicks;` – davidkonrad Oct 28 '15 at 19:48
  • as JSON? This is still very new to me. Could I get you to post a simple example? – oliverbj Oct 28 '15 at 19:49
  • @oliverbj, you are welcome! Thank you for accepting the answer. – davidkonrad Oct 28 '15 at 20:14

From the query, I can't see any real reason to use a subquery there, since it doesn't use any of the data in the users table, and that is likely what is slowing it down. If memory serves me right, it will query that xeon_stats_clicks table once for every row in your users table (which is a lot of rows by the looks of things).

Try doing it as two separate queries rather than one.
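Split out, the two queries could look like this. An in-memory SQLite database with toy data stands in for the real MySQL connection so the sketch is self-contained:

```php
<?php
// Self-contained demo: SQLite in memory with toy data in place of MySQL.
$dbh = new PDO('sqlite::memory:');
$dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$dbh->exec("CREATE TABLE users (id INTEGER, cashedout INTEGER)");
$dbh->exec("CREATE TABLE xeon_stats_clicks (typ TEXT, value INTEGER)");
$dbh->exec("INSERT INTO users VALUES (1, 5), (2, 10)");
$dbh->exec("INSERT INTO xeon_stats_clicks VALUES ('1', 3), ('2', 9)");

// Query 1: aggregates over users only.
$users = $dbh->query("SELECT count(*) AS totalusers, sum(cashedout) AS cashedout FROM users")
             ->fetch(PDO::FETCH_ASSOC);

// Query 2: the click total, no longer nested inside the users query.
$clicks = $dbh->query("SELECT sum(value) AS totalclicks FROM xeon_stats_clicks WHERE typ='1'")
              ->fetch(PDO::FETCH_ASSOC);

// Merge back into one array with the same keys as the original combined query.
$stats = array_merge($users, $clicks);
```

The rest of the page code can stay unchanged, since `$stats` ends up with the same `totalusers`, `cashedout`, and `totalclicks` keys as before.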

Octo