
I am looking for a function that gets the metadata of a .mp3 file from a URL (NOT local .mp3 file on my server).
Also, I don't want to install http://php.net/manual/en/id3.installation.php or anything similar to my server.
I am looking for a standalone function.

Right now I am using this function:

<?php
function getfileinfo($remoteFile)
{
    $url = $remoteFile;
    $uuid = uniqid("designaeon_", true);
    $file = "../temp/" . $uuid . ".mp3";
    $size = 0;
    $ch = curl_init($remoteFile);

    //============================== Get Size ==========================//
    $contentLength = 'unknown';
    $ch1 = curl_init($remoteFile);
    curl_setopt($ch1, CURLOPT_NOBODY, true);
    curl_setopt($ch1, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch1, CURLOPT_HEADER, true);
    curl_setopt($ch1, CURLOPT_FOLLOWLOCATION, true); // only needed if the URL redirects
    $data = curl_exec($ch1);
    curl_close($ch1);
    if (preg_match('/Content-Length: (\d+)/i', $data, $matches)) {
        $contentLength = (int) $matches[1];
        $size = $contentLength;
    }
    //============================== Get Size ==========================//

    if (!$fp = fopen($file, "wb")) {
        echo 'Error opening temp file for binary writing';
        return false;
    } else if (!$urlp = fopen($url, "r")) {
        echo 'Error opening URL for reading';
        return false;
    }

    try {
        $to_get = 65536;    // 64 KB
        $chunk_size = 4096; // Haven't bothered to tune this; other values may work better
        $got = 0;
        $data = '';

        // Grab the first 64 KB of the file
        while (!feof($urlp) && $got < $to_get) {
            $data .= fgets($urlp, $chunk_size);
            $got += $chunk_size;
        }
        fwrite($fp, $data);

        // Grab the last 64 KB of the file, if we know how big it is
        if ($size > 0) {
            curl_setopt($ch, CURLOPT_FILE, $fp);
            curl_setopt($ch, CURLOPT_HEADER, 0);
            curl_setopt($ch, CURLOPT_RESUME_FROM, $size - $to_get);
            curl_exec($ch);
            curl_close($ch);
        }

        // Now $fp should hold the first and last 64 KB of the file
        @fclose($fp);
        @fclose($urlp);
    } catch (Exception $e) {
        @fclose($fp);
        @fclose($urlp);
        echo 'Error transferring file using fopen and cURL!';
        return false;
    }

    $getID3 = new getID3;
    $ThisFileInfo = $getID3->analyze($file);
    getid3_lib::CopyTagsToComments($ThisFileInfo);
    unlink($file);
    return $ThisFileInfo;
}
?>

This function downloads 64 KB from the URL of an .mp3 file, returns the metadata array by using the getID3 library (which works on local .mp3 files only), and then deletes the 64 KB it downloaded. The problem with this function is that it is far too slow by nature (it downloads 64 KB per .mp3; imagine 1000 mp3 files).

To make my question clear: I need a fast standalone function that reads the metadata of a remote .mp3 file from a URL.
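One way to shrink the transfer, assuming the remote server honours HTTP Range requests and the tag sits at the start of the file (where ID3v2 tags live), is to fetch only the 10-byte ID3v2 header first, decode its synchsafe size field, and then download exactly the tag bytes instead of a fixed 64 KB. This is only a sketch; the helper names below are made up for illustration and are not part of getID3:

```php
// Sketch: read only the ID3v2 tag instead of a fixed 64 KB.
// Assumes the server supports Range requests; helper names are hypothetical.

// Decode the 4-byte synchsafe integer used in the ID3v2 header
// (7 significant bits per byte, most significant byte first).
function synchsafe_to_int(string $bytes): int
{
    return (ord($bytes[0]) << 21)
         | (ord($bytes[1]) << 14)
         | (ord($bytes[2]) << 7)
         |  ord($bytes[3]);
}

// Fetch an inclusive byte range of a URL with cURL.
function fetch_range(string $url, int $start, int $end): string
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RANGE, $start . '-' . $end);
    $data = curl_exec($ch);
    curl_close($ch);
    return (string) $data;
}

// Grab just the ID3v2 tag (header + body), or false if there is none.
function get_id3v2_tag(string $url)
{
    $header = fetch_range($url, 0, 9); // 10-byte ID3v2 header
    if (substr($header, 0, 3) !== 'ID3') {
        return false; // no ID3v2 tag at the start of the file
    }
    $tagSize = synchsafe_to_int(substr($header, 6, 4));
    return $header . fetch_range($url, 10, 9 + $tagSize);
}
```

The returned bytes can then be written to a temp file and passed to `$getID3->analyze()` just like the 64 KB chunk in the function above, but for a typical tag this downloads a few KB instead of 64 KB (or 128 KB with the tail fetch).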

dimitris93
    So what you're saying is you don't want to use the library built specifically for this and instead are hoping that some user somewhere has rebuilt the library without the requirement for additional package installations? – Ohgodwhy Sep 24 '14 at 18:58
  • Asking questions again??? http://stackoverflow.com/questions/25955180/php-get-metadata-of-remote-mp3-file-from-url – Sujit Agarwal Sep 24 '14 at 19:01
  • This library requires installation on each server, doesn't it? Which means that if I change servers I will have to install this library from scratch. Let's say I make a .php website that needs to use this library, and then I want to sell it. I would have to teach every single buyer individually how to install this library on their server so that my .php would work there? Do you see my point? – dimitris93 Sep 24 '14 at 19:01
  • @SujitAgarwal the question I made now is different, but similar – dimitris93 Sep 24 '14 at 19:02
  • But try making the question title more meaningful, so people can get an idea in first glance – Sujit Agarwal Sep 24 '14 at 19:03

1 Answer


This function downloads 64 KB from the URL of an .mp3 file, returns the metadata array by using the getID3 library (which works on local .mp3 files only), and then deletes the 64 KB it downloaded. The problem with this function is that it is far too slow by nature (it downloads 64 KB per .mp3; imagine 1000 mp3 files).

Yeah, well what do you propose? How do you expect to get data if you don't get data? There is no way to have a generic remote HTTP server send you that ID3 data. Really, there is no magic. Think about it.

What you're doing now is already pretty solid, except that it doesn't handle all versions of ID3 and won't work for files with more than 64 KB of ID3 tags. What I would do to improve it is to use multi-cURL.

There are several PHP classes available that make this easier:

https://github.com/jmathai/php-multi-curl

$mc = EpiCurl::getInstance();
$results[] = $mc->addUrl(/* your stream URL here */); // run this in a loop, 10 at a time or so

foreach ($results as $result) {
    // Do something with the data.
}
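If installing even a small library is off the table, the same parallel-download idea can be sketched with PHP's built-in `curl_multi_*` functions, which need nothing beyond the standard cURL extension. The helper name and the 64 KB default below are illustrative, not from EpiCurl:

```php
// Sketch: fetch the first $bytes of many URLs in parallel using the
// built-in curl_multi API. Helper name and defaults are illustrative.
function fetch_first_chunks(array $urls, int $bytes = 65536): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_RANGE, '0-' . ($bytes - 1)); // first chunk only
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    // Collect the downloaded chunks and clean up.
    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

Each returned chunk can then be written to a temp file and handed to `$getID3->analyze()` exactly as in the question, so ten files download in roughly the time one took before.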
Brad
  • I see what you mean. I didn't understand the use of the EpiCurl class. Is it to handle different versions of ID3? Or do you mean that $results[] is an array that contains all the URLs I am trying to get the metadata from? – dimitris93 Sep 24 '14 at 19:39
  • Will this download the metadata of 10 URLs at one time? – dimitris93 Sep 24 '14 at 19:47
  • @Shiro Yes, the point of that class is to make multiple HTTP requests simultaneously. This will allow you to get 10 or so files at a time. – Brad Sep 24 '14 at 19:49