I'm creating a Greasemonkey userscript that displays additional content near comments on YouTube. I want it to support both Classic mode and the New Look (based on Polymer; it can be activated at https://www.youtube.com/new ), and I've already implemented most of the functionality I wanted.
But a major problem I've recently found relates to YouTube's apparently complicated mechanics around the following two processes:
(1) Loading the first batch of comments from cache.
(2) Loading any further batches of comments from cache.
To process each comment, I use waitForKeyElements to catch it as it appears.
This works well as long as I stay on the current page. But when I open a different video (YouTube loads it in the same page, without a reload), or when I open any other YouTube page and then press Back, things break.
After such a navigation, comments that the script should process appear to be already processed. Of course, that's how YouTube's no-reload design works: the existing comment elements get reused.
At some point I managed to make the script recognize the yt-navigate-finish navigation event, and that solved problem (1). The handler is roughly this (a simplified sketch; the real script also removes the content it added previously):
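window.addEventListener('yt-navigate-finish', function () {
    // Clear the flags that waitForKeyElements stores via jQuery .data(),
    // so the recycled comment elements get parsed again after navigation.
    $('#main .style-scope ytd-comment-renderer').removeData('alreadyFound');
});
But problem (2) remains: further batches of comments also come from the cache and get re-inserted and re-filled, and for that I couldn't find a solution.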
To demonstrate the problem, I've created a small Greasemonkey script (MCVE):
// ==UserScript==
// @name TEST2
// @include https://*youtube.com/*
// @require https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js
// @grant none
// @run-at document-start
// ==/UserScript==
waitForKeyElements('#main .style-scope ytd-comment-renderer', ParseItemTest);

function ParseItemTest(jNode) {
    // Read the author's name from the comment and append a red copy of it.
    var aNode = $(jNode).find("#author-text")[0];
    var newspan = document.createElement('span');
    newspan.innerHTML = ' <font color="red">===</font> ' +
        $(aNode).find("span.style-scope.ytd-comment-renderer").html();
    $(aNode).append(newspan);
}
// Standard waitForKeyElements, inlined for this MCVE: it polls for nodes
// matching the selector and runs actionFunction once per node, marking
// processed nodes with a jQuery data flag ('alreadyFound').
function waitForKeyElements(selectorTxt, actionFunction, bWaitOnce, iframeSelector) {
    var targetNodes, btargetsFound;

    if (typeof iframeSelector == "undefined")
        targetNodes = $(selectorTxt);
    else
        targetNodes = $(iframeSelector).contents().find(selectorTxt);

    if (targetNodes && targetNodes.length > 0) {
        btargetsFound = true;
        targetNodes.each(function() {
            var jThis = $(this);
            var alreadyFound = jThis.data('alreadyFound') || false;
            if (!alreadyFound) {
                var cancelFound = actionFunction(jThis);
                if (cancelFound)
                    btargetsFound = false;
                else
                    // This flag survives YouTube's element recycling.
                    jThis.data('alreadyFound', true);
            }
        });
    } else {
        btargetsFound = false;
    }

    // Keep (or stop) the 300 ms polling timer for this selector.
    var controlObj = waitForKeyElements.controlObj || {};
    var controlKey = selectorTxt.replace(/[^\w]/g, "_");
    var timeControl = controlObj[controlKey];

    if (btargetsFound && bWaitOnce && timeControl) {
        clearInterval(timeControl);
        delete controlObj[controlKey];
    } else if (!timeControl) {
        timeControl = setInterval(function() {
            waitForKeyElements(selectorTxt, actionFunction, bWaitOnce, iframeSelector);
        }, 300);
        controlObj[controlKey] = timeControl;
    }

    waitForKeyElements.controlObj = controlObj;
}
With this script active, enable YouTube's New Look and go to any video with comments. Let them load and note how each comment gets its author's name appended, e.g.
Alice === Alice
John === John
Grace === Grace
After you open any other page and press Back, here is what you'll see:
Alice === Grace
John === John
Grace === Alice
And after you go to another video page and check the comments, you'll see:
Jane === Grace
Dorothy === John
Brad === Alice
I hope this is enough to explain the problem. I just need to catch the moment when YouTube fills the existing comment elements with new data, so that I can parse them again with that new data.
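In other words, I imagine a hook along these lines. This is only a guess at the mechanism (a MutationObserver on the author elements, using the selectors from the MCVE above), and this naive version doesn't work: it only covers comments that exist when it runs, and I don't know which mutations to actually watch for. But it shows the kind of moment I'm trying to catch:
// Guess: watch each existing comment's author element for text changes,
// which should indicate that YouTube has re-filled a recycled element.
var refillObserver = new MutationObserver(function (mutations) {
    mutations.forEach(function (mutation) {
        // Walk up from the changed node to the enclosing comment...
        var comment = $(mutation.target).closest('ytd-comment-renderer');
        if (comment.length) {
            // ...and clear the flag so waitForKeyElements re-parses it.
            comment.removeData('alreadyFound');
        }
    });
});
$('#main .style-scope ytd-comment-renderer #author-text').each(function () {
    refillObserver.observe(this, { childList: true, subtree: true, characterData: true });
});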