6

I am working with an app that uses phpActiveRecord and MySQL to pull in data from a sensor network and plot it onto a number of flot.js graphs on the client.

There are several timeframes the user can choose between to set the range of data the graphs display: 2 hrs, 24 hrs, 3 days and 1 week.

The sensors post to the database every 60 seconds, so when plotting the graphs the query pulls in all rows between now and DATE_SUB(CURDATE(), INTERVAL ? DAY), where ? is 1, 3, 7, etc.
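
Roughly, the query looks like this (the model and column names here are simplified stand-ins rather than the exact code):

$days = 7; // or 1 or 3, depending on the timeframe chosen
$readings = Reading::find('all', array(
    'conditions' => array('recorded_at >= DATE_SUB(CURDATE(), INTERVAL ? DAY)', $days),
    'order'      => 'recorded_at ASC'
));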

However, this results in a massive number of rows being returned (60,000+ for the full week!) and is causing huge delays and server errors.

I know I can just massively increase the memory limit in the php.ini file, but that is hardly a good solution, and it doesn't address the issue of speed.

My question is, is there a way I can easily select only every second or third row from the required date range depending on the length of the interval the user wishes to view?

In C or Java I would do something like a modulo select to return alternate rows, but I cannot think of a way to do this in the current framework.

Any ideas would be appreciated. Thanks.

Haji
  • 1,999
  • 1
  • 15
  • 21
bisslad
  • 85
  • 3
  • 3
    http://stackoverflow.com/questions/858746/how-do-you-select-every-n-th-row-from-mysql – Josh Harrison Nov 15 '13 at 11:12
  • Thanks Josh, these are great links, but unfortunately the way the app is structured and dynamically generated means that the use of user variables is essentially out of the question. The queries are managed by the slim.js routing framework and not directly called by the browser page. I appreciate it though! Thanks. – bisslad Nov 15 '13 at 11:33
  • 1
    If your id is incremental you could use that: add an "is odd" check, or "is a multiple of 4/8" and so on, to your WHERE statement (see the sketch after these comments). Also: have you considered using caching for your queries? Even if you simply use mysql_result_cache you would increase your performance a lot. – Andresch Serj Nov 15 '13 at 11:49
  • 1
    @bisslad If you don't make the selection in the SQL as Josh suggests, but in the PHP files like the answers below, you shouldn't make the selection at all, since all 60,000 rows will be fetched anyway. Unless you do some caching on your queries as Andresch suggests; in that case you can just use a modulus in your fetch loop. – douwe Nov 15 '13 at 12:26
  • Thanks guys, I will definitely be looking into caching results alright. The original developer of the app built a fine solution for a small database, but it's not very well optimised or built for scalability. – bisslad Nov 15 '13 at 12:44
  • Maybe you can increase your overall performance if you only save value+timestamp when they have changed; that way you won't necessarily have a new measurement point every 60 seconds – Thomas Nordquist Nov 15 '13 at 14:18
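
To illustrate the WHERE-clause suggestion from the comments, the filter might look roughly like this in phpActiveRecord (a sketch only; the MOD step and the model/column names are assumptions, not the app's actual code):

$step = 4; // e.g. keep every 4th reading for the 1-week view
$readings = Reading::find('all', array(
    'conditions' => array(
        'recorded_at >= DATE_SUB(CURDATE(), INTERVAL ? DAY) AND MOD(id, ?) = 0',
        $days, $step
    )
));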

4 Answers

0
<?php
// $result is assumed to be your query result (e.g. a mysqli result set).
$row = 1;
while ($data = $result->fetch_assoc()) {
    if ($row % 2 == 0) {
        echo "Yourstuff"; // only output every second row
    } else {
        // Nothing - skip this row
    }
    $row++;
}
?>

This should help you think about a solution. It may not be the perfect one for you, but I hope it helps. For every third row use $row % 3, and so on.

douwe
  • 1,305
  • 10
  • 12
Top Questions
  • 1,862
  • 4
  • 26
  • 38
  • 1
    It's like bringing 60,000 candies back home from the market (database) and then saying no, I only need every second one and not the rest. Why get them from the market (database) in the first place? Why not decide right there what to take back home (to PHP)? – Hanky Panky Nov 15 '13 at 11:59
  • I'll have to try this type of solution out I think. Was hoping not to have to generate thousands more smaller queries but that may be the only way I have of doing this. Will test and accept when I get this up. – bisslad Nov 15 '13 at 12:46
  • @Hanky Panky: This wasn't meant to be a complete solution, only something that might help him get on the right track. Another option would be to make a query for every row; for that you'd need continuous ids, but I don't know whether a query for every row would actually be faster. – Top Questions Nov 15 '13 at 12:51
0

Try:

$i = 0;
foreach($row as $data){
  if($i == 1){
    //Do whatever with your 2n record
    $i++;
  }
  if($i > 1){$i = 0;} //Reset the counter
}
ggdx
  • 3,024
  • 4
  • 32
  • 48
  • 3
    So where do you increment $i ;) – douwe Nov 15 '13 at 12:23
  • To be honest, by the time it's in PHP you've already missed half the problem. Yeah you avoid sending it to the client but you're still fetching it from disk on the database, transferring it across the network to PHP, storing it in memory, etc, etc... This is better than nothing but not the correct solution – Basic Dec 05 '13 at 00:17
0

Returning every second row is also not an ideal solution. You could use a cache instead.

The easiest way is to cache your table into another table, refreshed by an automatic cron job that runs periodically.

datas
datas_cache (only the results)
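
For example, the cron job could run a small PHP script along these lines (the table names, credentials and the MOD downsampling step are placeholders, not part of your app):

<?php
// refresh_cache.php - run from cron, e.g. every 5 minutes.
// Rebuilds datas_cache with a thinned-out copy of the last week of readings.
$db = new PDO('mysql:host=localhost;dbname=sensors', 'user', 'pass');
$db->exec('TRUNCATE datas_cache');
$db->exec(
    'INSERT INTO datas_cache
     SELECT * FROM datas
     WHERE recorded_at >= DATE_SUB(CURDATE(), INTERVAL 7 DAY)
       AND MOD(id, 4) = 0'
);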

If you want a more professional solution, you can cache your results into files, for example as JSON.
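
A minimal sketch of that file cache (the cache path, TTL and the fetch_graph_points() helper are placeholders only):

$cacheFile = '/tmp/graph_7day.json';
$ttl = 60; // sensors post every 60 seconds, so the cache can live that long

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    $points = json_decode(file_get_contents($cacheFile), true);
} else {
    $points = fetch_graph_points(7); // run the expensive MySQL query only on a cache miss
    file_put_contents($cacheFile, json_encode($points));
}
// $points can now be handed to flot.js without touching the database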

Fatih Alp
  • 209
  • 1
  • 3
  • 11
0

In the end I did manage to select the rows as described above.

However, this was not an acceptable solution to the problem I was having, so I have decided to generate my graphs server-side in order to avoid the issue altogether. It seems logical that charts with so many data points should be generated on the server, with just an image pushed down the wire to an HTML5 canvas or some such. I'm going to try pChart for this. Thanks to all responders in any case.
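
Something along these lines is what I am going to try (a rough, untested sketch based on the pChart 2.x examples; the include paths, font and data array are placeholders):

include("pChart/class/pData.class.php");
include("pChart/class/pDraw.class.php");
include("pChart/class/pImage.class.php");

$data = new pData();
$data->addPoints($sensorValues, "Sensor"); // downsampled readings from MySQL
$data->setAxisName(0, "Reading");

$chart = new pImage(700, 230, $data);
$chart->setFontProperties(array("FontName" => "pChart/fonts/verdana.ttf", "FontSize" => 8));
$chart->setGraphArea(60, 20, 680, 200);
$chart->drawScale();
$chart->drawLineChart();
$chart->render("charts/week.png"); // the page then just loads this image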

bisslad
  • 85
  • 3