
I'm attempting to write a Google Chrome extension that effectively crawls a page, since Chrome extensions allow cross-origin XHR requests.

However, when it does, it also tries to load EVERY SINGLE IMAGE on the page. The images don't actually load, since the paths are all relative, but the console becomes clogged with errors.

My question is: can I do a jQuery.get() to request a webpage without accidentally trying to preload all of its images?

EDIT

Code looks like this:

$.get(
    url,
    function parseData(data) {
        console.log("Images are automatically preloaded once " +
                    "this function exits, for some reason");
    },
    'html'
);
Eric

2 Answers


Since you don't show your code, my guess is that you load the result into the DOM? Once the images are detected by the DOM, the browser will probably try to load them.

So perhaps load the result into a simple variable and replace/remove all the images first?
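A minimal sketch of that approach (plain string manipulation before anything touches the DOM). The regex is only a rough heuristic, not a full HTML parser, and the `stripImages` helper name is my own:

```javascript
// Strip <img> tags from a raw HTML string before it ever reaches the DOM,
// so the browser never sees them and never issues the failing requests.
// A regex will miss pathological markup, but it's enough for this purpose.
function stripImages(html) {
  return html.replace(/<img\b[^>]*>/gi, '');
}

// Hypothetical usage inside the $.get success callback:
// $.get(url, function (data) {
//   var clean = stripImages(data);
//   $('#junk').html(clean); // no <img> tags left to trigger requests
// }, 'html');
```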

BerggreenDK
  • How do I replace the images without loading it into a DOM? I mean, I'd have to [parse it with regex](http://stackoverflow.com/q/1732348/102441). I've added my code. – Eric Apr 20 '11 at 20:54
  • Not sure, but the last 'html' — does that mean you replace your entire HTML content? If so, that's the DOM element. I believe you need to load into a string and then do a string replace on it before putting the result into the HTML element. – BerggreenDK Apr 20 '11 at 21:50

Put the pulled data into a hidden div, remove the imgs, and show the hidden div.

HTML

<div id="junk"></div>


Script

$(document).ready(function(){
   $('#junk').hide();
});

function parseData(data){
  $('#junk').html(data).children('img').remove();
  $('#junk').show();
}

Example: http://jsfiddle.net/E69UR/

g19fanatic