
I am looking for a way to read multiple (over 50) plain text websites and parse only certain information into an HTML table, or as a CSV file. When I say "plain text" I mean that while it is a web address, it does not have any HTML associated with it. This would be an example of the source. I am pretty new to this, and was looking for help in seeing how this could be done.

update-token:179999210
vessel-name:Name Here
vessel-length:57.30
vessel-beam:14.63
vessel-draft:3.35
vessel-airdraft:0.00
time:20140104T040648.259Z
position:25.04876667 -75.57001667 GPS
river-mile:sd 178.71
rate-of-turn:0.0
course-over-ground:58.5
speed-over-ground:0.0
ais-367000000 {
    pos:45.943912 -87.384763 DGPS
    cog:249.8
    sog:0.0
    name:name here
    call:1113391
    imo:8856857
    type:31
    dim:10 20 4 5
    draft:3.8
    destination:
}
ais-367000000 {
    pos:25.949652 -86.384535 DGPS
    cog:105.6
    sog:0.0
    name:CHRISTINE
    call:5452438
    type:52
    status:0
    dim:1 2 3 4
    draft:3.0
    destination:IMTT ST.ROSE
    eta:06:00
}

Thanks for any suggestions you guys might have.

cadillacace
  • This looks a lot like JSON - have you tried a JSON parser? You may need to add some commas instead of newlines. – Floris Jan 04 '14 at 17:19
  • If you're going for Python I'd suggest using the JSON lib with `import json`, or if you really just want to grab some lines and turn them into variables you can also `import ast` and do `ast.literal_eval(line.rstrip())` – Luiz Berti Jan 06 '14 at 23:01

2 Answers


I may be completely missing the point here - but here is how you could take the contents (assuming you had them as a string) and put them into a PHP key/value array. I "hard-coded" the string you had, and changed one value (the key ais-367000000 was repeated, and that makes the second object overwrite the first).

This is a very basic parser that assumes a format like you described above. I give the output below the code:

<?php
echo "<html>";
$s="update-token:179999210
vessel-name:Name Here
vessel-length:57.30
vessel-beam:14.63
vessel-draft:3.35
vessel-airdraft:0.00
time:20140104T040648.259Z
position:25.04876667 -75.57001667 GPS
river-mile:sd 178.71
rate-of-turn:0.0
course-over-ground:58.5
speed-over-ground:0.0
ais-367000000 {
    pos:45.943912 -87.384763 DGPS
    cog:249.8
    sog:0.0
    name:name here
    call:1113391
    imo:8856857
    type:31
    dim:10 20 4 5
    draft:3.8
    destination:
}
ais-367000001 {
    pos:25.949652 -86.384535 DGPS
    cog:105.6
    sog:0.0
    name:CHRISTINE
    call:5452438
    type:52
    status:0
    dim:1 2 3 4
    draft:3.0
    destination:IMTT ST.ROSE
    eta:06:00
}";
$lines = explode("\n", $s);
$output = Array();
$thisElement = & $output;   // reference to the array we are currently filling
foreach($lines as $line) {
  $elements = explode(":", $line);
  if (count($elements) > 1) {
    // a "key:value" line - store it in whatever array we are currently filling
    // (note: only the part before a second colon survives, which is why eta:06:00 shows up as 06)
    $thisElement[trim($elements[0])] = $elements[1];
  }
  if(strstr($line, "{")) {
      // a line like "ais-367000000 {" opens a nested block: create a sub-array for it
      $elements = explode("{", $line);
      $key = trim($elements[0]);
      $output[$key] = Array();
      $thisElement = & $output[$key];
  }
  if(strstr($line, "}")) {
      // end of the nested block - go back to filling the top-level array
      $thisElement = & $output;
  }
}
echo '<pre>';
print_r($output);
echo '</pre>';
echo '</html>';
?>

Output of the above (can be seen working at http://www.floris.us/SO/ships.php):

Array
(
    [update-token] => 179999210
    [vessel-name] => Name Here
    [vessel-length] => 57.30
    [vessel-beam] => 14.63
    [vessel-draft] => 3.35
    [vessel-airdraft] => 0.00
    [time] => 20140104T040648.259Z
    [position] => 25.04876667 -75.57001667 GPS
    [river-mile] => sd 178.71
    [rate-of-turn] => 0.0
    [course-over-ground] => 58.5
    [speed-over-ground] => 0.0
    [ais-367000000] => Array
        (
            [pos] => 45.943912 -87.384763 DGPS
            [cog] => 249.8
            [sog] => 0.0
            [name] => name here
            [call] => 1113391
            [imo] => 8856857
            [type] => 31
            [dim] => 10 20 4 5
            [draft] => 3.8
            [destination] => 
        )

    [ais-367000001] => Array
        (
            [pos] => 25.949652 -86.384535 DGPS
            [cog] => 105.6
            [sog] => 0.0
            [name] => CHRISTINE
            [call] => 5452438
            [type] => 52
            [status] => 0
            [dim] => 1 2 3 4
            [draft] => 3.0
            [destination] => IMTT ST.ROSE
            [eta] => 06
        )

)

A better approach would be to turn the string into "properly formed JSON", then use json_decode. That might look like the following:

<?php
echo "<html>";
$s="update-token:179999210
vessel-name:Name Here
vessel-length:57.30
vessel-beam:14.63
vessel-draft:3.35
vessel-airdraft:0.00
time:20140104T040648.259Z
position:25.04876667 -75.57001667 GPS
river-mile:sd 178.71
rate-of-turn:0.0
course-over-ground:58.5
speed-over-ground:0.0
ais-367000000 {
    pos:45.943912 -87.384763 DGPS
    cog:249.8
    sog:0.0
    name:name here
    call:1113391
    imo:8856857
    type:31
    dim:10 20 4 5
    draft:3.8
    destination:
}
ais-367000001 {
    pos:25.949652 -86.384535 DGPS
    cog:105.6
    sog:0.0
    name:CHRISTINE
    call:5452438
    type:52
    status:0
    dim:1 2 3 4
    draft:3.0
    destination:IMTT ST.ROSE
    eta:06:00
}";

echo '<pre>';
print_r(parseString($s));
echo '</pre>';

function parseString($s) {
  $lines = explode("\n", $s);
  $jstring = "{ ";
  $comma = "";
  foreach($lines as $line) {
    $elements = explode(":", $line);
    if (count($elements) > 1) {
      $jstring = $jstring . $comma . '"' . trim($elements[0]) . '" : "' . $elements[1] .'"';
      $comma = ",";
    }
    if(strstr($line, "{")) {
      $elements = explode("{", $line);
      $key = trim($elements[0]);
      $jstring = $jstring . $comma . '"' . $key .'" : {';
      $comma = "";
    }
    if(strstr($line, "}")) {
      $jstring = $jstring . '} ';
      $comma = ",";
    }
  }
  $jstring = $jstring ."}";
  return json_decode($jstring);
}
echo '</html>';
?>

Demo at http://www.floris.us/SO/ships2.php; note that I use the variable $comma to make sure that commas are either included, or not included, at various points in the string.

Output of this code looks similar to what we had before:

stdClass Object
(
    [update-token] => 179999210
    [vessel-name] => Name Here
    [vessel-length] => 57.30
    [vessel-beam] => 14.63
    [vessel-draft] => 3.35
    [vessel-airdraft] => 0.00
    [time] => 20140104T040648.259Z
    [position] => 25.04876667 -75.57001667 GPS
    [river-mile] => sd 178.71
    [rate-of-turn] => 0.0
    [course-over-ground] => 58.5
    [speed-over-ground] => 0.0
    [ais-367000000] => stdClass Object
        (
            [pos] => 45.943912 -87.384763 DGPS
            [cog] => 249.8
            [sog] => 0.0
            [name] => name here
            [call] => 1113391
            [imo] => 8856857
            [type] => 31
            [dim] => 10 20 4 5
            [draft] => 3.8
            [destination] => 
        )

    [ais-367000001] => stdClass Object
        (
            [pos] => 25.949652 -86.384535 DGPS
            [cog] => 105.6
            [sog] => 0.0
            [name] => CHRISTINE
            [call] => 5452438
            [type] => 52
            [status] => 0
            [dim] => 1 2 3 4
            [draft] => 3.0
            [destination] => IMTT ST.ROSE
            [eta] => 06
        )

)

But maybe your question is "how do I get the text into PHP in the first place". In that case, you might look at something like this:

<?php
$urlstring = file_get_contents('/path/to/urlFile.csv');
$urls = explode("\n", $urlstring); // one url per line

$responses = Array();

// loop over the urls, and get the information
// then parse it into the $responses array
$i = 0;
foreach($urls as $url) {
  $responses[$i] = parseString(file_get_contents($url));
  $i = $i + 1;
}


function parseString($s) {
  $lines = explode("\n", $s);
  $jstring = "{ ";
  $comma = "";
  foreach($lines as $line) {
    $elements = explode(":", $line);
    if (count($elements) > 1) {
      $jstring = $jstring . $comma . '"' . trim($elements[0]) . '" : "' . $elements[1] .'"';
      $comma = ",";
    }
    if(strstr($line, "{")) {
      $elements = explode("{", $line);
      $key = trim($elements[0]);
      $jstring = $jstring . $comma . '"' . $key .'" : {';
      $comma = "";
    }
    if(strstr($line, "}")) {
      $jstring = $jstring . '} ';
      $comma = ",";
    }
  }
  $jstring = $jstring ."}";
  return json_decode($jstring);
}
?>

I include the same parsing function as before; it's possible to make it much better, or leave it out altogether. Hard to know from your question.

Questions welcome.

UPDATE

Based on the comments I have added a function that will perform the curl request for each file resource; let me know if this works for you. I have created a file http://www.floris.us/SO/ships.txt that is an exact copy of the file you showed above, and http://www.floris.us/SO/ships3.php, which contains the following source code - you can run it and see that it works. (Note - in this version I don't read anything from a .csv file; you already know how to do that. This just takes the array, uses it to obtain a text file, then converts it into a data structure you can use - display, whatever):

<?php
$urls = Array();
$urls[0] = "http://www.floris.us/SO/ships.txt";

$responses = Array();

// loop over the urls, and get the information
// then parse it into the $responses array
$i = 0;
foreach($urls as $url) {
//  $responses[$i] = parseString(file_get_contents($url));
  $responses[$i] = parseString(myCurl($url));
  $i = $i + 1;
}
echo '<html><body><pre>';
print_r($responses);
echo '</pre></body></html>';

function parseString($s) {
  $lines = explode("\n", $s);
  $jstring = "{ ";
  $comma = "";
  foreach($lines as $line) {
    $elements = explode(":", $line);
    if (count($elements) > 1) {
      $jstring = $jstring . $comma . '"' . trim($elements[0]) . '" : "' . $elements[1] .'"';
      $comma = ",";
    }
    if(strstr($line, "{")) {
      $elements = explode("{", $line);
      $key = trim($elements[0]);
      $jstring = $jstring . $comma . '"' . $key .'" : {';
      $comma = "";
    }
    if(strstr($line, "}")) {
      $jstring = $jstring . '} ';
      $comma = ",";
    }
  }
  $jstring = $jstring ."}";
  return json_decode($jstring);
}

function myCurl($f) {
// create curl resource 
   $ch = curl_init();
// set url 
   curl_setopt($ch, CURLOPT_URL, $f); 

//return the transfer as a string 
   curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); 

// $output contains the output string 
   $output = curl_exec($ch); 

// close curl resource to free up system resources 
   curl_close($ch);    
   return $output;
}
?>

Note - because two entries have the same "tag", the second one overwrites the first when using the original source data. If that is a problem, let me know. Also, if you have ideas on how you actually want to display the data, try to mock up something and I can help you get it right.
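
Since the question also mentions a csv file as a possible output format, here is a minimal sketch (not part of the code above; the output filename and the choice of columns are only examples) of how the parsed $responses array could be written out with fputcsv:

<?php
// sketch: dump a few top-level fields of each parsed response to a CSV file
// (assumes $responses was filled by the loop above, where parseString returns stdClass objects)
$fp = fopen('ships-output.csv', 'w');   // example filename
fputcsv($fp, Array('vessel-name', 'time', 'position', 'speed-over-ground'));   // header row
foreach ($responses as $r) {
  if (isset($r->{'vessel-name'})) {
    fputcsv($fp, Array(
      $r->{'vessel-name'},
      $r->{'time'},
      $r->{'position'},
      $r->{'speed-over-ground'}
    ));
  }
}
fclose($fp);
?>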

On the topic of time-outs

There are several possible timeout mechanisms that can be causing you problems; depending on which it is, one of the following solutions may help you:

  1. If the browser doesn't get any response from the server, it will eventually time out. This is almost certainly not your problem right now, but it might become your issue once you fix the other problems.

  2. PHP scripts typically have a built-in "maximum time to run" before they decide you sent them into an infinite loop. If you know you will be making lots of requests, and these requests will take a lot of time, you may want to set the time-out higher. See http://www.php.net/manual/en/function.set-time-limit.php for details on how to do this. I would recommend setting the limit to a "reasonable" value inside the curl loop, so the counter gets reset for every new request (see the sketch after the code below).

  3. Your attempt to connect to the server may take too long (this is the most likely problem as you said). You can set the value (time you expect to wait to make the connection) to something "vaguely reasonable" like 10 seconds; this means you won't wait forever for the servers that are offline. Use

    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);

for a 10-second wait; see "Setting Curl's Timeout in PHP". Finally, you will want to handle the errors gracefully - if the connection did not succeed, you don't want to process the response. Putting all this together gets you something like this:

$i = 0;
foreach($urls as $url) {
  $temp = myCurl($url);
  if (strlen($temp) == 0) {
    echo 'no response from '.$url.'<br>';
  }
  else {
    $responses[$i] = parseString($temp);   // re-use the response fetched above instead of requesting it twice
    $i = $i + 1;
  }
}

echo '<html><body><pre>';
print_r($responses);
echo '</pre></body></html>';

function myCurl($f) {
// create curl resource 
   $ch = curl_init();
// set url 
   curl_setopt($ch, CURLOPT_URL, $f); 

//return the transfer as a string 
   curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); 
   curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
   curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // try for 10 seconds to get a connection
   curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // try for 30 seconds to complete the transaction

// $output contains the output string 
   $output = curl_exec($ch); 

// see if any error was set:
   $curl_errno = curl_errno($ch);
 
// close curl resource to free up system resources 
   curl_close($ch);    

// make response depending on whether there was an error
   if($curl_errno > 0) {
      return '';
   }
   else {
      return $output;
  }
}
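
Tying in point 2 from the list above, here is a sketch of the same loop with the per-request execution-time limit reset before each fetch; the 60-second value is just an example:

$i = 0;
foreach($urls as $url) {
  set_time_limit(60);   // reset the script's execution-time budget for this request (see point 2 above)
  $temp = myCurl($url);
  if (strlen($temp) == 0) {
    echo 'no response from '.$url.'<br>';
  }
  else {
    $responses[$i] = parseString($temp);
    $i = $i + 1;
  }
}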

Last update? I have updated the code one more time. It now:

  1. Reads a list of URLs from a file (one URL per line - fully formed)
  2. Tries to fetch the contents from each file in turn, handling time-outs and echoing progress to the screen
  3. Creates tables with some of the information from the files (including a reformatted time stamp)

To make this work, I had the following files:

www.floris.us/SO/ships.csv, containing three lines:

http://www.floris.us/SO/ships.txt
http://floris.dnsalias.com/noSuchFile.html
http://www.floris.us/SO/ships2.txt

Files ships.txt and ships2.txt at the same location (almost identical copies, except for the name of the ship) - these are like your plain text files.

File ships3.php in the same location. This contains the following source code, which performs the various steps described earlier and strings them all together:

<?php
$urlstring = file_get_contents('http://www.floris.us/SO/ships.csv');
$urls = explode("\n", $urlstring); // one url per line

$responses = Array();

// loop over the urls, and get the information
// then parse it into the $responses array
$i = 0;
foreach($urls as $url) {
  $temp = myCurl($url);
  if(strlen($temp) > 0) {
    $responses[$i] = parseString($temp);
    $i = $i + 1;
  }
  else {
    echo "URL ".$url." did not repond<br>";
  }
}

// produce the actual output table:
echo '<html><body>';
writeTable($responses);
echo '</body></html>';

// ------------ support functions -------------
function parseString($s) {
  $lines = explode("\n", $s);
  $jstring = "{ ";
  $comma = "";
  foreach($lines as $line) {
    $elements = explode(":", $line);
    if (count($elements) > 1) {
      $jstring = $jstring . $comma . '"' . trim($elements[0]) . '" : "' . $elements[1] .'"';
      $comma = ",";
    }
    if(strstr($line, "{")) {
      $elements = explode("{", $line);
      $key = trim($elements[0]);
      $jstring = $jstring . $comma . '"' . $key .'" : {';
      $comma = "";
    }
    if(strstr($line, "}")) {
      $jstring = $jstring . '} ';
      $comma = ",";
    }
  }
  $jstring = $jstring ."}";
  return json_decode($jstring, true);
}

function myCurl($f) {
// create curl resource 

   $ch = curl_init();
// set url 
   curl_setopt($ch, CURLOPT_URL, $f); 

//return the transfer as a string 
   curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); 
   curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
   curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // try for 10 seconds to get a connection
   curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // try for 30 seconds to complete the transaction

// $output contains the output string 
   $output = curl_exec($ch); 

// see if any error was set:
   $curl_errno = curl_errno($ch);
   $curl_error = curl_error($ch);
   
// close curl resource to free up system resources 
   curl_close($ch);    

// make response depending on whether there was an error
   if($curl_errno > 0) {
      echo 'Curl reported error '.$curl_error.'<br>';
      return '';
   }
   else {
      echo 'Successfully fetched '.$f.'<br>';
      return $output;
  }
}

function writeTable($r) {
  echo 'The following ships reported: <br>';
  echo '<table border=1>';
  foreach($r as $value) {
    if (strlen($value["vessel-name"]) > 0) {
      echo '<tr><table border=1><tr>';
      echo '<td>Vessel Name</td><td>'.$value["vessel-name"].'</td></tr>';
      echo '<tr><td>Time:</td><td>'.dateFormat($value["time"]).'</td></tr>';
      echo '<tr><td>Position:</td><td>'.$value["position"].'</td></tr>';
      echo '</table></tr>';
    }
  }
  echo '</table>';  // close the outer table once, after all ships have been added
}

function dateFormat($d) {
  // with input like 20140104T040648.259Z (i.e. yyyymmddThhmmss.sssZ)
  // return dd/mm/yy hh:mm
  $date = substr($d, 6, 2) ."/". substr($d, 4, 2) ."/". substr($d, 2, 2) ." ". substr($d, 9, 2) . ":" . substr($d, 11, 2);
  return $date;
}
?>

Output of this is:

(screenshot of the resulting HTML tables)

You can obviously make this prettier, and include other fields etc. I think this should get you a long way there, though. You might consider (if you can) having a script run in the background to create these tables every 30 minutes or so, and saving the resulting html tables to a local file on your server; then, when people want to see the result, they would not have to wait for the (slow) responses of the different remote servers, but get an "almost instant" result.
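
To sketch that last idea (just an outline, with an example output path): capture the generated HTML with output buffering and write it to a static file, then serve that file to your visitors:

<?php
// sketch: generate the table as above, but save it to a static file instead of sending it to the browser
ob_start();                          // capture everything echoed below
echo '<html><body>';
writeTable($responses);              // $responses filled by the main loop, as before
echo '</body></html>';
file_put_contents('/var/www/html/ships-cache.html', ob_get_clean());   // example path
?>

A cron entry would then regenerate the cached file on a schedule - for example, every 30 minutes (the path is of course yours to fill in):

*/30 * * * * php /path/to/ships3.php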

But that's somewhat far removed from the original question. If you are able to implement all this in a workable fashion, and then want to come back and ask a follow-up question (if you're still stuck / not happy with the outcome), that is probably the way to go. I think we've pretty much beaten this one to death now.

Floris
  • Floris, thanks for all of your hard work coming up with this. After trying your last code snippet, when I run it from the CLI it gives me this for every URL: `PHP Warning: file_get_contents(foo.com/NavData): failed to open stream: No such file or directory in /var/www/test.php on line 11` – cadillacace Jan 05 '14 at 03:04
  • So the `.csv` file contents load correctly but the other files don't? What is the **absolute** location of the file you are trying to load? Is it on your server, or somewhere else? What is the _exact_ content of the .csv file: is it of the form `http://www.foo.com/NavData/file1` or is it `foo.com/Navdata`? You need the whole `http://` in the string - if it's not there, then do `file_get_contents('http://www.' . 'foo.com/NavData/file1');` and see if that works better... – Floris Jan 05 '14 at 04:37
  • Floris, after placing the concatenation of http:// and the address it still isn't working. All of the hosts in the .csv are also on cellular routers, where speed can drastically change. I believe that it could be timing out. – cadillacace Jan 06 '14 at 16:01
  • Are you getting the same error message? If your browser can see it, your php should be able to see it. Do you know the IP address of one of the routers, and can you get _any response at all_ from these servers from inside php? Have you tried using `curl`? Without having access to your environment it is difficult to help you figure this out, so we may need to have an extended discussion to get anywhere... and while I enjoy these things I'm traveling today and my time online is limited. – Floris Jan 06 '14 at 16:34
  • Sure, I can give an IP address for you to test. I have done some testing with curl, and it is pulling in the data beautifully. Again, you're awesome, Floris, and thanks for all your help. – cadillacace Jan 06 '14 at 19:44
  • If it's working with curl, then I believe your problem is solved? Let me know if it's not. And you're welcome. – Floris Jan 06 '14 at 20:39
  • I'll have to find some time later this week to try and implement curl into your php above, sorry about being a complete novice when it comes to this. Any tips would be appreciated. – cadillacace Jan 06 '14 at 22:23
  • My bad luck being stuck at an airport is your good luck to get an update to this answer. Let me know if this works for you! – Floris Jan 06 '14 at 22:55
  • Floris, I have gotten this working partially. I can't get the .csv to work, but if I add multiple addresses in the `Array();` in line 2 I am able to view the formatted text. The issue I am seeing is that if I add quite a few to the array I am getting this message: `fatal error maximum execution time of 30 seconds exceeded curl_exec` After doing some research, do you think that `curl_multi_exec` could fix this issue? After reading the documentation, I am lost on how to implement the do-while loop in the example here: http://www.php.net/manual/en/function.curl-multi-exec.php – cadillacace Jan 07 '14 at 22:58
  • Do you think that some of the servers might take longer to respond, and that that is why you have the issue? Or is it "after a number of requests"? Does the problem occur even if you request the same resource say 50 times? – Floris Jan 08 '14 at 03:03
  • There is an almost certain possibility that some servers' ping could be around 3-5k ms, with some being offline because they travel in and out of signal. What would be the best way for this to "skip over" the ones with high latency? After testing ten of the same address, the script works as intended. I do believe it is the inconsistent connection speed causing the issues. – cadillacace Jan 08 '14 at 04:07
  • OK, so I have got all the addresses to be checked without a time-out error. I added the `set_time_limit(60);` into the curl loop. I also changed the `max_execution_time=60` in the `php.ini`. All in all, it takes about 4-5 minutes for the script to finish. Now, the last part of the question: how to get this into a user-readable format, either as a .csv download or as an html table. Also, during the loading of the page, is there a way to show the progress of the script? I have learned so much with this, thanks Floris! – cadillacace Jan 08 '14 at 16:58
  • Let's do one thing at a time. For "user readable format", how do you want to show things? Do you want a single giant table? Do you want to see all the information, or just some lines? What about this problem of having the repeated `ais-367000000` block - do you want to show it twice? Give me some idea of the layout you're after and I will knock up a sample - it will be a few hours before I have a chance to do it though. As for "progress update" - it would require a little bit of a change to the code. You could take a look at http://stackoverflow.com/a/2996126/1967396 for now. Back later. – Floris Jan 08 '14 at 22:42
  • The `ais-367000000` block is actually a unique number called an MMSI. No two vessels have the same number; I added a generic number for testing purposes. The format, as for rows, would need to be the keys without hyphens and brackets, with `[speed-over-ground]` displayed as `current speed` and the `time` field converted to `dd/mm/yy hh:mm`. Besides that, I believe that would do it for format. Also, is there a way to update the table in the background that just populates the table? Progress would not need to be shown. Thanks again for being so helpful. – cadillacace Jan 09 '14 at 01:17
  • OK - see whether the last update I made gave you enough of a push to implement your solution now... – Floris Jan 09 '14 at 07:32
  • Floris, I have gotten everything to work, but have a question about format. How could it all be put into one table? Would the `writeTable` function be the only thing that needs to be changed? – cadillacace Jan 09 '14 at 19:30
  • Yes - `writeTable` was deliberately made to be "the only thing that you edit to change the formatting". As it is, I put "many tables inside one table" - really just to show how things can be done. Clearly there is a lot of information there - only you know how you want it to look, but I thought the combination of examples of "status updates" as individual servers are queried, and "building a final table" should be useful. There is so much more you could do... but start simple, get this working first. Glad it's coming together! – Floris Jan 09 '14 at 20:41

First combine the websites into a csv or hard-coded array, then run file_get_contents() / file_put_contents() on each. Essentially:

$file = 'dataFile.csv';
foreach($arrayOfSites as $site){

    $data = file_get_contents($site);
    file_put_contents($file, $data . "\n", FILE_APPEND);

}
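
If the list of sites already lives in a csv with one URL per line (as asked in the comments below), a one-line sketch to build $arrayOfSites from it could look like this - sites.csv is just a placeholder name:

// sketch: read one URL per line from the csv and drop blank lines
$arrayOfSites = array_filter(array_map('trim', file('sites.csv')));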

Edit: Sorry, I was trying to do this fast. Here is the full

ZombieBunnies
  • Just good ole fashioned PHP: http://us3.php.net/manual/en/function.file-get-contents.php – ZombieBunnies Jan 04 '14 at 17:24
  • Part of the question is how to combine these into a csv or html table. Like I said, I'm just starting out. I understand the basics, just never had to do anything like this. – cadillacace Jan 04 '14 at 17:25
  • Zombie, if I have the list of websites in a .csv, how do I use that instead of an array? Or, what is the best way to put these URLs into an array? – cadillacace Jan 04 '14 at 18:06
  • Sorry for the delay. Got busy with work. If you already have them in a csv, that's great. Just to be clear: One. You have the list of websites already in a csv? Two. You want to get the contents of each and put the contents into another csv? Three. You only have to do this once, not an ongoing process where memory management becomes an issue? – ZombieBunnies Jan 04 '14 at 22:49
  • One. Yes, I have them in a csv. Two. Either that, or display current values in a table. Three. This would need to be done about once or twice an hour. Hope this helps, thanks Zombie. – cadillacace Jan 05 '14 at 03:07