I have the following code to write an entry to a database:
<?php
$database = 'abc';
$dbhost   = 'abc.com';
$dbuser   = 'abc';
$dbpass   = 'abc';
$conn = mysql_connect($dbhost, $dbuser, $dbpass);

$key      = 'abc';
$checkKey = $_GET['key'];

if ($key === $checkKey) {
    if (!$conn) {
        die('Could not connect: ' . mysql_error());
    }

    $a = mysql_real_escape_string($_GET['a']);
    $b = mysql_real_escape_string($_GET['b']);
    $c = mysql_real_escape_string($_GET['c']);
    $c = $c / 1000; // convert to seconds for FROM_UNIXTIME
    $d = mysql_real_escape_string($_GET['d']);

    $sql = 'INSERT INTO x ' .
           '(a, b, c, d) ' .
           'VALUES (' . $a . ',' . $b . ',FROM_UNIXTIME(' . $c . '),' . $d . ')';

    mysql_select_db($database);
    $retval = mysql_query($sql, $conn);
    if (!$retval) {
        die('Could not enter data: ' . mysql_error() . ': ' . $sql);
    }
    echo "Entered data successfully\n";
    mysql_close($conn);
} else {
    die('Key not valid!');
}
?>
This works fine, but it's quite slow, since I have to call the script once for every row I want to add. Would it be considerably faster to, for example, collect 1,000 rows and then call the URL once with those 1,000 rows, storing each row in the database? If yes, how can I achieve that?
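Something like the following is what I have in mind for the receiving script: the client would collect the 1,000 rows, encode them as a JSON array, and POST them to this URL in one request, and the script would build a single multi-row INSERT. This is just a rough sketch; reading the batch from the raw POST body via php://input, the JSON field names, and the batch layout are my own assumptions, while the table and columns are the same as above.

<?php
// Rough sketch of a batch endpoint: one request carries many rows.
// Assumed POST body: a JSON array such as
// [{"a":1,"b":2,"c":1428000000000,"d":3}, ...]
$database = 'abc';
$dbhost   = 'abc.com';
$dbuser   = 'abc';
$dbpass   = 'abc';
$key      = 'abc';

if (!isset($_GET['key']) || $_GET['key'] !== $key) {
    die('Key not valid!');
}

$conn = mysql_connect($dbhost, $dbuser, $dbpass);
if (!$conn) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db($database);

// Decode the batch from the raw request body
$rows = json_decode(file_get_contents('php://input'), true);
if (!is_array($rows) || count($rows) === 0) {
    die('No rows received');
}

// Build one multi-row INSERT: VALUES (...),(...),(...)
$values = array();
foreach ($rows as $row) {
    $a = mysql_real_escape_string($row['a']);
    $b = mysql_real_escape_string($row['b']);
    $c = mysql_real_escape_string($row['c']) / 1000; // convert to seconds for FROM_UNIXTIME
    $d = mysql_real_escape_string($row['d']);
    $values[] = '(' . $a . ',' . $b . ',FROM_UNIXTIME(' . $c . '),' . $d . ')';
}
$sql = 'INSERT INTO x (a, b, c, d) VALUES ' . implode(',', $values);

if (!mysql_query($sql, $conn)) {
    die('Could not enter data: ' . mysql_error());
}
echo 'Entered ' . count($rows) . " rows successfully\n";
mysql_close($conn);
?>

On the client side I would simply collect the rows into an array, json_encode it, and send it to this script in a single POST request. Is that a sensible way to do it, or is there a better approach?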