
My first post, because I haven't found an answer to this problem anywhere! And I looked way beyond Google. :)

DESCRIPTION:

So I have a set-up where an Arduino device is connected to a laptop via a USB serial cable, and the laptop is connected to the internet.

Like this: http://postimg.org/image/cz1g0q2ib/


arduino ---USB---> laptop (transit.py) ---WWW---> server (insert.php) ---> MySQL DB

There is a Python script (transit.py) on the PC running continuously, listening to the COM port, analyzing the received data and forwarding it to a file (insert.php) on a remote server (a free hosting site). See the code below to learn how that works.

Then there is the insert.php script that receives this data (still almost every second), analyzes it and stores it in the MySQL database. This, however, is not the only file that requires a MySQL connection, so I include connect.php at the beginning of every such file.


PROBLEM:

Warning: mysqli::mysqli() [mysqli.mysqli]: (42000/1226): User 'user' has exceeded the 'max_connections_per_hour' resource (current value: 1500) in /server/connect.php on line 8

As a result of all this data travel and its frequency (and the cheapness of the hosting) I run into a "maximum connections per hour exceeded" error. The limit is 1500 per hour and I can't change it (it's a remote server). And no, I don't want to pay for hosting to get a bigger allowance - that's not the point - the issue is the inefficiency of my code. Can I have one persistent connection? Like a service?

Sending data from the Python script straight to the remote MySQL server is not an option, because I don't have access to that feature.


CODE:

transit.py:

    import serial    # pyserial
    import requests

    try:
        ser = serial.Serial('COM4', 9600, timeout=4)
    except serial.SerialException:
        print('=== COULD NOT CONNECT TO BOARD ===')
        raise

    while True:  # runs continuously, as described above
        value = ser.readline()              # one CSV line per reading: table,data,value
        strValue = value.decode("utf-8")

        if strValue:
            mylist = strValue.split(',')
            print(mylist[0] + '\t\t' + mylist[1] + '\t\t' + mylist[2])
            path = 'http://a-free-server.com/insert.php'
            dataLine = {"table": mylist[0], "data": mylist[1], "value": mylist[2]}
            toServer = requests.post(path, params=dataLine, timeout=2)

insert.php:

    <?php
    include 'connect.php';

    //some irrelevant code here...

    if (empty($_GET['type']) && isset($_GET['data'])) {
      $table = $_GET['table'];
      $data = $_GET['data'];
      $value = $_GET['value'];

      if ($mysqli->connect_errno > 0) {
        die('Unable to connect to database [' . $mysqli->connect_error . ']');
      }
      else
      {
        date_default_timezone_set("Asia/Hong_Kong");
        $clock = date(DATE_W3C);

        if (isset($_GET['time'])) {
          $time = $_GET['time'];
        }
        else {
          $time = $clock;
        }
        echo "Received: ";
        echo $table;
        echo ",";
        echo $data;
        echo ",";
        echo $value;
        echo ",";
        echo $time;

        if ($stmt = $mysqli->prepare("INSERT INTO ".$table." (`id`, `data`, `value`, `time`) VALUES (NULL, ?, ?, ?) ON DUPLICATE KEY UPDATE time='".$time."'"))
        {
          $stmt->bind_param('sss', $data, $value, $time);
          $stmt->execute();

          $stmt->free_result();
          $stmt->close();
        }
        else {
          echo "Prepare failed: (" . $mysqli->errno . ") " . $mysqli->error;
        }
      }
    }
    else {
      echo " | DATA NOT received!";
    }
    ?>

connect.php:

    <?php

    define("HOST", "p:a-free-host.com"); // notice the p: for persistence
    define("USER", "user");
    define("PASSWORD", "strongpassword1"); // my password. don't look!
    define("DATABASE", "databass");

    $GLOBALS["mysqli"] = new mysqli(HOST, USER, PASSWORD, DATABASE, 3306);

    $count = intval(file_get_contents('conns.txt'));
    file_put_contents('conns.txt', ++$count); // just something I added to monitor connections
    ?>

P.S. Everything works fine and all data is handled in a rather desirable manner, except for exceeding the limit and perhaps some other hidden caveats.

Any suggestions on how to decrease the connection count while still receiving data every second?

sheelis
2 Answers


If I have understood your issue correctly, your web host sucks. If you are limited to 1500 connections / hour, and each page requires a connection, that means you can never exceed 1500 page views per hour; that's not very much.

Many programming languages support connection pooling; in this model, the server opens one or more connections at start-up, and individual page requests borrow one of those connections when they need one. This reduces the overhead of opening and closing connections. See here for a discussion of connection pooling and PHP. You may be able to use one of the answers without too much trouble.
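
To make the pooling model concrete, here is a minimal, language-agnostic sketch (written in Python, with sqlite3 standing in for the real database, since the shape of the idea is the same in any language): connections are opened once at start-up, and each request borrows one and returns it instead of opening its own.

    import queue
    import sqlite3

    POOL_SIZE = 3
    pool = queue.Queue(maxsize=POOL_SIZE)

    # Open the connections once, at start-up.
    for _ in range(POOL_SIZE):
        pool.put(sqlite3.connect(":memory:"))

    def handle_request(reading):
        # Each "page request" borrows an already-open connection...
        conn = pool.get()
        try:
            conn.execute("CREATE TABLE IF NOT EXISTS t (data TEXT)")
            conn.execute("INSERT INTO t (data) VALUES (?)", (reading,))
            conn.commit()
        finally:
            pool.put(conn)  # ...and returns it when done, rather than closing it.

    for i in range(5):
        handle_request("reading-%d" % i)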

The alternative - and probably better - solution is to batch up data in your Python script so you don't have to connect to the web server so often. The classic way to do this for applications that aren't time-critical is to use a message bus. I'm not a Pythonist, but this should do the job...
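
Even without a full message bus, transit.py could simply buffer readings and POST once per batch, so one HTTP request (and therefore one MySQL connection on the PHP side) carries many rows. A rough sketch, assuming a hypothetical insert_batch.php endpoint that accepts a JSON array (it does not exist yet; you would have to write it alongside insert.php):

    import json
    import requests
    import serial    # pyserial

    PATH = 'http://a-free-server.com/insert_batch.php'  # hypothetical batch endpoint
    BATCH_SIZE = 10   # one POST (= one DB connection) per 10 readings

    ser = serial.Serial('COM4', 9600, timeout=4)
    batch = []

    while True:
        line = ser.readline().decode('utf-8').strip()
        parts = line.split(',')
        if len(parts) < 3:
            continue                      # skip empty or malformed lines
        batch.append({"table": parts[0], "data": parts[1], "value": parts[2]})

        if len(batch) >= BATCH_SIZE:
            try:
                # ~3600 single-row requests per hour become ~360 batched ones.
                requests.post(PATH, data=json.dumps(batch), timeout=5)
                batch = []
            except requests.RequestException:
                pass                      # keep buffering; retry when the next batch fills

On the server side, the matching script would decode the JSON array and insert all of its rows within a single connection.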

Neville Kuyt
  • Thanks! By the way, it's 1500 per user, but I only have 1 user - the one who sends all the data to the server. I read a lot about persistent connections, singletons and pooling, and all of them seem to have extensive lists of drawbacks; it's very rare for these methods to be useful. I guess I will have to try increasing the intervals between MySQL connections or - as you suggested - wait for more data (~10 entries), then JSON it and only then send it to the database. Or get a better server that would allow me to insert data from Python straight into MySQL. I would need SSH for that. – sheelis Apr 26 '14 at 15:21

Did you try to create a script (S1) that stays alive the whole time and makes the connection there, and then handle the rest separately?

In the script where you do the operations (S2), first check whether the connection is alive and, if it is not, reconnect.

Close the connection in S1 at the end of the script.
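
If I read this right, the idea is that one long-lived piece of code owns the connection, and before every operation you test whether it is still alive and reconnect only when it has dropped. A rough sketch of that check-and-reconnect pattern in Python (pymysql is used here purely to illustrate the idea; in the asker's setup the database is reachable only from the PHP side):

    import pymysql

    conn = None

    def get_connection():
        """Return a live connection, reconnecting only when the old one has died."""
        global conn
        if conn is not None:
            try:
                conn.ping(reconnect=True)   # cheap liveness check; reconnects in place if possible
                return conn
            except pymysql.MySQLError:
                conn = None                 # fall through and open a fresh connection
        conn = pymysql.connect(host='a-free-host.com', user='user',
                               password='strongpassword1', database='databass')
        return conn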

andy1786