
I have a link in a textbox. When I click a button, I want to get the title of the page the link points to. How can I do it with JavaScript or jQuery?

Pointy
Erdinç Özdemir

5 Answers


This post can give you a start:

http://forum.jquery.com/topic/get-external-page-and-fetch-title-googled-a-lot-didn-t-find-any-solution

rahul

If the page is in the same domain, I'd say use an ajax request and get the title from the returned DOM object.

If it's a different domain, you could set a hidden iframe to the location and, once it has loaded, read the title with something like:

document.getElementById('MyIframe').contentDocument.title

Note, however, that the same-origin policy also blocks script access to a cross-origin iframe's document, so in practice this too only works for pages on your own domain.
Rob Hardy

This is almost always done by a backend script/crawler: it fetches the web page for you on the server side and returns the parsed data via AJAX.
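A minimal sketch of the server-side part, assuming Node 18+ (for the built-in `fetch`); the function names are made up for illustration, and the regex parsing is rough but enough for a quick title lookup:

```javascript
// Pull the <title> text out of an HTML string (null if none found).
function extractTitle(html) {
  const match = /<title[^>]*>([^<]*)<\/title>/i.exec(html);
  return match ? match[1].trim() : null;
}

// Server-side handler: fetch the page and return its title.
// The browser would call an endpoint backed by this via AJAX,
// instead of fetching the cross-origin page directly.
async function fetchTitle(url) {
  const res = await fetch(url);
  const html = await res.text();
  return extractTitle(html);
}
```

Because the request happens on your own server, the browser's same-origin restriction never comes into play; the front end only talks to your domain.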

yakxxx

Try something like this:

<a href="http://www.google.com" id="googleLink">Google</a>
<span id="titleGoesHere"></span>

--

$(document).ready(function () {
    $('#googleLink').click(function (e) {
        e.preventDefault(); // stay on the page instead of following the link
        // note: $.get is subject to the same-origin policy
        $.get($(this).prop('href'), function (data) {
            var $temp = $('<div id="tempData" />');
            $temp.append(data);
            var title = $('title', $temp);
            $('#titleGoesHere').text(title.text());
        });
    });
});
kavun

For security reasons, browsers restrict cross-origin HTTP requests initiated from within scripts, so scraping from client-side JavaScript on the front end can run into CORS errors.

...

Staying firmly within our front end script, we can use cross-domain tools such as Any Origin, Whatever Origin, All Origins, crossorigin and probably a lot more. I have found that you often need to test a few of these to find the one that will work on the site you are trying to scrape.

From this post, I wrote this working and self-contained fiddle:

function replaceAll(str, find, replace) {
    return str.replace(new RegExp(find, 'g'), replace);
}

const url = "https://www.facebook.com/"

$.getJSON('https://allorigins.me/get?url=' + encodeURIComponent(url) + '&callback=?', function(data){
    const content = replaceAll(data.contents, "<script", "<meta");
    $("#content").append(content);
    const d = $("#content");
    $('#title').text(d.find('title').text());
    $('#description').text(d.find('meta[name=description]').attr("content") || "None");
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>

<div id="content" style="display: none;">
</div>

<h3 id="title">Waiting...</h3>

<br/>

<p id="description">Waiting...</p>

A few comments:

  • Use a cross-domain tool through https
  • Don't forget to encodeURIComponent your url
  • I replaced script tags with meta tags so that none of those are executed when appended to the DOM (replace function from this question)
  • To query the parsed markup with jQuery, it must first be added to the DOM (see this question)
ted