When I tested this myself, it found no elements because the request returned no data. I thought I was surely missing something, given the comments about invalid encoding... but then I downloaded the XML document, saved it next to the HTML file you provided, and made the following change:
url: 'full.xml',
Having done this, the paragraph element is now filled (for me) with the two entries that can be seen in the XML you were targeting. This is almost certainly down to the browser's cross-domain (same-origin) policy.
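For clarity, the working same-origin request looks roughly like this. It's a sketch: I'm assuming jQuery is loaded, that full.xml sits next to the HTML file, and the 'entry' selector and target <p> are placeholders for whatever your actual markup uses.

```javascript
// Hedged sketch: same-origin AJAX request for the locally saved XML.
// 'entry' and the <p> target are assumed names, not taken from your page.
var ajaxOptions = {
  url: 'full.xml',   // relative URL => same origin, so no cross-domain block
  dataType: 'xml',
  success: function (xml) {
    // Append the text of each assumed <entry> element to the paragraph.
    $(xml).find('entry').each(function () {
      $('p').append($(this).text() + ' ');
    });
  }
};
// In the browser: $.ajax(ajaxOptions);
```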
You have a few options for grabbing the XML properly:
- Server-side PHP file_get_contents() / cURL
>>The file can now be AJAXed by your page, as you're on the same domain.
>>This server-side request could also be part of the page itself, i.e. when the page loads, PHP (or whatever you use) could echo the XML into a JavaScript variable, or into a hidden element.
- Modified AJAX Request
>>A modified AJAX request that works on cross-domain requests. Useful because the server then isn't the one performing the request, which may be what you want. If so, I should mention I have a file named jquery.xdomainajax.js
which enables cross-domain AJAX requests. I'm looking for its source at the moment, but I figured I'd post this quickly so you can search for it yourself in the meantime.
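The first (server-side) option above can be sketched as a tiny proxy script, roughly like this. The feed URL is a placeholder for whatever your original request targeted, and cURL would work equally well if file_get_contents() is disabled for URLs on your host:

```php
<?php
// proxy.php - hedged sketch: fetch the remote XML on the server, so the
// browser can AJAX 'proxy.php' as a same-origin URL instead of the
// cross-domain feed. Replace the placeholder with the real feed URL.
header('Content-Type: text/xml; charset=utf-8');
echo file_get_contents('http://example.com/full.xml'); // placeholder URL
```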
Edit: Check this out.
Edit 2: After some brief (and admittedly careless) testing, I can't get the jQuery workaround plugin above to work... so my recommendation would be to scrape it server-side.
Here are some more links:
- How to load a certain div of an external webpage on another webpage that is on a different domain
- Get Websites Title
- JQuery ajax cross domain call and permission issue
FYI: I'm using Firefox 12.0.