
I need to create a script that parses some data from a URL and then inserts it into a database. I was wondering whether a Perl CGI script or a PHP script is better suited for this purpose. I would have to run the script every ten minutes or so.

user1667307

3 Answers


Anything that gets the job done is "suited" to the purpose. Personally I'd go with PHP but that's just because I know PHP and not Perl. Ultimately it's up to you - if you can do it in Perl, do so. If you prefer PHP, do it in PHP.

Niet the Dark Absol

Either PHP or Perl will do the job just fine. I'd send you in the direction of PHP just because it's easier to use out of the box and has a lot more examples geared toward people just getting started.

Here is an example answer to this question
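As a rough illustration of the PHP route (the URL, the parsing rule, and the `items` table are all hypothetical, and an in-memory SQLite database stands in for the real one), the whole fetch-parse-insert cycle fits in a short script that cron can run:

```php
<?php
// Sketch of a fetch -> parse -> insert cycle. In the real script you would
// fetch the page with file_get_contents($url); a literal string stands in
// for the downloaded page here so the sketch is self-contained.
$html = '<ul><li>alpha</li><li>beta</li></ul>';

// Hypothetical parsing step: pull out every <li> item.
preg_match_all('#<li>(.*?)</li>#', $html, $matches);

// SQLite in memory stands in for the real database connection.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE items (value TEXT)');
$stmt = $pdo->prepare('INSERT INTO items (value) VALUES (?)');

foreach ($matches[1] as $value) {
    $stmt->execute([$value]); // prepared statement avoids SQL injection
}

$count = $pdo->query('SELECT COUNT(*) FROM items')->fetchColumn();
```

A crontab entry like `*/10 * * * * php /path/to/script.php` runs it on the ten-minute schedule; neither language needs to involve CGI for this.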

tdlm

Ten years too late, but this might help someone else. When parsing a query string with PHP's parse_str, unlike CGI, PHP will not allow duplicate keys in the resulting array. For example, this query:

parse_str('a=1&b=2&b=3&c=4', $result);

will result in this array:

// ['a' => 1, 'b' => 3, 'c' => 4]

even though the query is obviously polluted (HTTP parameter pollution, HPP). PHP under Apache keeps only the last occurrence of a repeated variable; other backend servers behave differently. The same applies when parsing a JSON string with json_decode (whether as object or array). So in PHP, to catch polluted queries before converting them to an array with parse_str, scan the raw string first, like:

$control_arr = array();
$pairs = explode('&', $query); // $query is the raw query string
$polluted = false;

foreach ($pairs as $pair) {

    $instances = explode('=', $pair, 2);
    $key = $instances[0];

    if (!in_array($key, $control_arr)) {

        $control_arr[] = $key;

    } else {

        // HTTP parameter pollution: this key repeats
        $polluted = true;
        break;

    }

}
// if $polluted is still false, continue with parse_str; the query string isn't polluted
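The same idea can be written more compactly by counting how often each key occurs; this is a sketch, with the query string hard-coded where the real code would read $_SERVER['QUERY_STRING']:

```php
<?php
// Sketch: detect HTTP parameter pollution by counting key occurrences.
// The literal query stands in for $_SERVER['QUERY_STRING'].
$query = 'a=1&b=2&b=3&c=4';

// Extract the key from each key=value pair.
$keys = array_map(
    function ($pair) { return explode('=', $pair, 2)[0]; },
    explode('&', $query)
);

// Any key appearing more than once means the query is polluted.
$polluted = max(array_count_values($keys)) > 1;
```

Only if the check passes would you go on to call parse_str on the query.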
Dbertovi
  • 51
  • 3