
I am using the YouTube API to upload videos chunk by chunk (see the code below). However, the upload sometimes (but not always) fails with larger files (1 GB and up). The upload is shown to be complete, but only a couple of minutes can be played and the rest is truncated. I did some research, but with no apparent success. My questions now:

  • Is there a possibility to contact YouTube directly (seeing logs of what's really going on)?
  • Is this some encoding issue?
  • Can the error be caught/detected via the API (at the moment, no exception is thrown)?
  • Can this happen if you are uploading different videos at once (in parallel, that is)?
  • Has anyone else encountered this issue?

Any help/lead in the right direction is much appreciated. I'd even put a bounty of 500 points on this, as it is driving me crazy (just did that...).

Appendix: The script is run from the command line, through a Gearman server, with set_time_limit(0) set (a minimal sketch of that worker setup follows the function below). The code/function is just an extract (and runs fine with smaller files, sometimes even up to 10 GB).

EDIT: According to aergistal's and GeorgeQ's comments above, I have changed the while loop to read chunks directly (no feof() anymore) and save the status to the database.

/*
    Uploads one file to youtube chunk by chunk
*/
function uploadFile($dbfile) {
    $client = $this->client;
    $db = $this->db; // assumed: the DB handle used by saveLog() below lives on this class ($db was otherwise undefined)
    $youtube = new Google_Service_YouTube($client);
    $htmlBody = "";

    try {
        // Create a snippet with title, description, tags and category ID
        // Create an asset resource and set its snippet metadata and type.
        // This example sets the video's title, description, keyword tags, and
        // video category.
        $snippet = new Google_Service_YouTube_VideoSnippet();
        $snippet->setTitle("SO Example");

        // Numeric video category. See
        // https://developers.google.com/youtube/v3/docs/videoCategories/list 
        $snippet->setCategoryId("22");

        // Set the video's status to "public". Valid statuses are "public",
        // "private" and "unlisted".
        $status = new Google_Service_YouTube_VideoStatus();
        $status->privacyStatus = "private";

        // Associate the snippet and status objects with a new video resource.
        $video = new Google_Service_YouTube_Video();
        $video->setSnippet($snippet);
        $video->setStatus($status);

        // Specify the size of each chunk of data, in bytes. Set a higher value for
        // reliable connection as fewer chunks lead to faster uploads. Set a lower
        // value for better recovery on less reliable connections.
        $chunkSizeBytes = 1 * 1024 * 1024;

        // Setting the defer flag to true tells the client to return a request which can be called
        // with ->execute(); instead of making the API call immediately.
        $client->setDefer(true);

        // Create a request for the API's videos.insert method to create and upload the video.
        $insertRequest = $youtube->videos->insert("status,snippet", $video);

        // Create a MediaFileUpload object for resumable uploads.
        $media = new Google_Http_MediaFileUpload(
                 $client,
                 $insertRequest,
                 'video/*',
                 null,
                 true,
                 $chunkSizeBytes);
        $media->setFileSize(filesize($dbfile->localfile));

        // Read the media file and upload it chunk by chunk.
        $status = false;
        $handle = fopen($dbfile->localfile, "rb");

        // fread() returns "" at EOF and false on error, so stop on either;
        // otherwise keep sending chunks until nextChunk() reports completion.
        while (!$status && ($chunk = fread($handle, $chunkSizeBytes)) !== false && $chunk !== "") {
            $status = $media->nextChunk($chunk);
            $data = array("filename" => $dbfile->localfile, "status" => print_r($status, true));
            $db->saveLog($data);
        }

        /* the old code
        while (!$status && !feof($handle)) {
            $chunk = fread($handle, $chunkSizeBytes);
            $status = $media->nextChunk($chunk);
        }
        */

        fclose($handle);

        // If you want to make other calls after the file upload, set setDefer back to false
        $client->setDefer(false);

        $log = array("success" => true, "snippet_id" => $status["id"]);
    } catch (Google_ServiceException $e) {
        $log = array("success" => false, "errormsg" => $e->getMessage());
    } catch (Google_Exception $e) {
        $log = array("success" => false, "errormsg" => $e->getMessage());
    }

    return $log;
}
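
For context, a stripped-down sketch of the worker setup mentioned in the appendix. It assumes the pecl/gearman extension; "Uploader", the job name and the payload format are placeholders, not the actual production code:

// Minimal CLI worker: no execution time limit, one registered job that hands
// the decoded payload to the uploader class containing uploadFile() above.
set_time_limit(0);

$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730); // default Gearman host/port

$worker->addFunction('youtube_upload', function (GearmanJob $job) {
    $dbfile   = json_decode($job->workload()); // placeholder payload: JSON object with a "localfile" property
    $uploader = new Uploader();                // hypothetical class holding uploadFile()
    return json_encode($uploader->uploadFile($dbfile));
});

while ($worker->work());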
– Jan

3 Answers


Is there a possibility to contact YouTube directly (seeing logs of what's really going on)?

Well, this is mission impossible. You need to send them numerous emails to (maybe) get an answer. I've tried a few times, but got no response from them.

Is this some encoding issue?

Yes, this is an encoding issue. If you are trying to upload an HD video and it is getting truncated or shortened, it's an encoding issue. YouTube has them periodically.

Can the error be caught/detected via the API (at the moment, no exception is thrown)?

No, it cannot. You need to watch the video after it's uploaded to see the error. No exception is thrown in the middle of the process or at any point of the upload.

Can this happen if you are uploading different videos at once (in parallel, that is)?

It doesn't matter whether you are uploading one video or two, three, or five videos simultaneously. It's just an upload; the only bad thing that could happen in the process is a loss of connection. Every video goes its own way. It's like copying multiple files from one folder to another.

Has anyone else encountered this issue?

Yes: you, me, and a whole bunch of other uploaders. It's a YouTube problem, their bug in encoding / rendering / transcoding or whatever problem they have. It's all down to YouTube's processing.

When it happened to me, my solution was to use HTTPS/SSL when uploading the video, and it worked. There was no cutting, trimming, transcoding or encoding/rendering problem.
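
Regarding doing that from the SDK: the Data API endpoints the PHP client talks to are HTTPS already, so the main thing you can enforce programmatically is that certificate verification stays on. A sketch for the v1 google-api-php-client used in the question (passing raw cURL options through Google_Client::setClassConfig() is an assumption about that client version, so treat it as a starting point):

// Enforce verified TLS connections in the v1 client (Google_IO_Curl).
$client = new Google_Client();
$client->setClassConfig('Google_IO_Curl', 'options', array(
    CURLOPT_SSL_VERIFYPEER => true, // verify the server certificate
    CURLOPT_SSL_VERIFYHOST => 2,    // verify the certificate matches the host name
));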

– Josip Ivic
  • This smells like it will win the bounty :-) Good and detailed answer. As I am uploading videos programmatically, how do I ensure HTTPS is used via the SDK? – Jan Nov 27 '15 at 10:21

It looks like the script is timing out. Try this code on the first line: set_time_limit(0);

– rain

The upload is shown to be complete, but only a couple of minutes can be played and the rest is truncated. I did some research, but with no apparent success.

Is this some encoding issue?

OK, you are using "chunked" uploading. In other words: it's a "resumable" upload as described in the YouTube upload API.

My first guess: it's a Content-Range header issue (in one of the requests). All parts have to align perfectly byte-wise on the YouTube server side, or else you end up with only the first part of the binary. See "Upload Chunks" in the resumable upload docs and please note the blue box on the Content-Range header.

The google-api-php-client should handle this correctly, but bug-wise it could be anything: the API out of sync with the client, a cURL configuration issue, a header not set, a range not correctly calculated.
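
To make "align byte-wise" concrete, here is a rough illustration of how the Content-Range header of each chunk is derived; the actual PUT requests are omitted and the file path is a placeholder:

// Illustration only: consecutive chunks must cover contiguous byte ranges,
// e.g. "bytes 0-1048575/3000000", then "bytes 1048576-2097151/3000000", ...
// If one range is off, the server ends up with a truncated file.
$file      = '/path/to/video.mp4';   // placeholder
$total     = filesize($file);
$chunkSize = 1 * 1024 * 1024;        // 1 MB, as in the question

$handle = fopen($file, 'rb');
$start  = 0;

while ($start < $total && ($chunk = fread($handle, $chunkSize)) !== false && $chunk !== '') {
    $end          = $start + strlen($chunk) - 1;
    $contentRange = sprintf('bytes %d-%d/%d', $start, $end, $total);

    // ... send the chunk in a PUT request with "Content-Range: $contentRange" ...

    $start = $end + 1;
}
fclose($handle);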

Can the error be caught/detected via the API (at the moment, no exception is thrown)?

Debugging the client is not your job. One would have to enable GuzzleHttp\RequestOptions::DEBUG to see whether all headers are correct. Then you could try polling the status of the upload in parallel to the upload itself (a second Guzzle request).
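
Note that GuzzleHttp\RequestOptions::DEBUG applies to the v2 google-api-php-client, which is built on Guzzle; the v1 client used in the question talks to cURL directly. Assuming v2, the debug output could be switched on roughly like this:

// Sketch for the v2 client: inject a Guzzle client with wire-level debugging,
// so the headers of every chunk request (including Content-Range) are dumped.
require 'vendor/autoload.php';

$httpClient = new GuzzleHttp\Client(array(
    GuzzleHttp\RequestOptions::DEBUG => true,
));

$client = new Google_Client();
$client->setHttpClient($httpClient); // the v2 client accepts a custom Guzzle client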

Is there a possibility to contact YouTube directly (seeing logs of what's really going on)?

Yes: you are using Google_Http_MediaFileUpload, and that's part of the Google API PHP client.

Just open an issue over at their Github repo: https://github.com/google/google-api-php-client/issues


My suggestion is:

  • leave the PHP onion: PHP(ext_curl(libcurl)) + your script(google-api-client(guzzle))
    • "PHP onion" means: your script uses the google-api-client, which uses Guzzle, which uses php ext-curl, which uses libcurl internally
    • you have multiple layers, and errors can happen at any of them
    • bottom line: simply bypass the PHP stack and test from the CLI
  • try to reproduce the chunked upload issue on the CLI using cURL
  • use a second console to request the status of the active upload between uploaded chunks (see the sketch after this list)
  • then, in case the upload from the CLI
    • fails: that would indicate a YouTube server problem
    • succeeds: compare the headers from the CLI against those of the PHP script (Guzzle in debug mode) to get closer to the problem
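
Whether you do the status check with curl on the console or, as sketched below, with raw ext-curl in PHP (bypassing the client library), it boils down to an empty PUT with a "bytes */total" Content-Range header; a 308 response carries a Range header telling you how many bytes YouTube has actually stored. The upload URI, token and size below are placeholders:

// Poll the status of an active resumable upload session.
$uploadUri = 'https://www.googleapis.com/upload/youtube/v3/videos?uploadType=resumable&upload_id=PLACEHOLDER';
$token     = 'ACCESS_TOKEN_PLACEHOLDER';
$totalSize = 1234567890; // full size of the local file in bytes

$ch = curl_init($uploadUri);
curl_setopt_array($ch, array(
    CURLOPT_CUSTOMREQUEST  => 'PUT',
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $token,
        'Content-Length: 0',
        // "*/total" means: send no data, just report how much has arrived so far.
        'Content-Range: bytes */' . $totalSize,
    ),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HEADER         => true, // keep the response headers; "Range:" is the answer
));

$response = curl_exec($ch);
$httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

// 308 = upload incomplete, "Range: bytes=0-N" shows the last persisted byte;
// 200/201 = the upload already completed.
echo "HTTP $httpCode\n", $response;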
– Jens A. Koch
  • Thanks for looking into this. However, I do not completely understand what you mean by your first point (leave the PHP onion) – what is that part meant to say? – Jan Dec 01 '15 at 18:58
  • Ok, I've updated my answer to explain it a bit. (boxcon... cool project!) – Jens A. Koch Dec 01 '15 at 19:08