
I'd like to check whether any pages on the domain https://example.com contain links like https://my-link.com.

I know I can use the following Google search: site:https://example.com text to find, although it only works for visible text.

Any idea whether it's possible to find links in href attributes that way, or some other way?

engray
  • It sounds like you want to search for "backlinks" (links that point back to your site). If so, and this is a one-time thing, there are lots of backlink checker tools out there, e.g. https://ahrefs.com/backlink-checker. If you want to do this via Google, I'm not sure whether they expose this as an option. – scunliffe Sep 06 '21 at 13:14
  • I'm voting to close this question because it is not a programming question. [What topics can I ask about here?](https://stackoverflow.com/help/on-topic) – Rob Sep 06 '21 at 13:16
  • @scunliffe thanks for the tool! This is exactly what I want to do, though not for checking backlinks to my website; I want to be sure some links I had on a WordPress page are not hardcoded in the WP template. I have WP admin access but no FTP rights. To be precise, my question is about finding such links via Google search. – engray Sep 06 '21 at 13:31
  • @Rob my question is about Google search, and the page you've linked includes "software tools commonly used by programmers". Is Google not a tool commonly used by programmers? ;) – engray Sep 06 '21 at 13:32

2 Answers


You can inspect the page by pressing CTRL + SHIFT + I (or CMD + OPTION + I on a Mac), or by right-clicking and choosing Inspect.

Click the "Elements" tab, press CTRL + F (or CMD + F), and paste the link you want to check, as in this example:

[screenshot: searching the Elements panel for a link]

Hope it helps.

Tomas Mota
  • Thanks for the reply. I know I could do it this way, but what I'm looking for is finding all hrefs on all pages of a domain. I've edited my question; reading it back now, it really wasn't specific :D – engray Sep 06 '21 at 13:26
  • You could try using Beautiful Soup with Python; that would let you get all of the hrefs within a page (see the sketch below). – Tomas Mota Sep 06 '21 at 13:29
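Following up on the Beautiful Soup suggestion above, here is a minimal sketch of the single-page case, assuming requests and beautifulsoup4 are installed; the URLs (including the page path) are placeholders taken from the question.

import requests
from bs4 import BeautifulSoup

# Fetch one page and collect the href of every <a> tag on it.
page = requests.get("https://example.com/some-page")
soup = BeautifulSoup(page.text, "html.parser")
hrefs = [a["href"] for a in soup.find_all("a", href=True)]

# True if any link on the page points at the URL we're checking for.
print(any("my-link.com" in href for href in hrefs))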

If there is a link in a web page, it will usually be in the href attribute of an <a> tag. So it is a matter of opening the console, getting all the <a> tags, and collecting their hrefs.

// Grab every <a> element on the current page and collect its href.
var allLinkElements = document.querySelectorAll("a");
var allLinks = [];

for (var i = 0; i < allLinkElements.length; i++) {
    allLinks.push(allLinkElements[i].href);
}

// Note: includes() requires an exact match; check substrings instead
// if the URL may carry a trailing path or query string.
console.log(allLinks.includes('http://www.what-im-looking-for.com'));
Charlie
  • Thanks for the reply. I know I could do it this way, but what I'm looking for is finding all hrefs on all pages of a domain. I've edited my question; reading it back now, it really wasn't specific :D – engray Sep 06 '21 at 13:26
  • Look, it's not specific, that's correct. It is universally understood that you need the source code in a database to extract a term such as 'href' from each source file. How you do it is up to you: grab the sitemap file and fetch the source code of each link; you could use plain old PHP or Python and run a regex to extract the terms into an array (see the sketch below). No one will hand it to you for free; that's what the big business is all about. – Syed Sep 06 '21 at 14:01
  • Here is a previous answer on the subject that might help: https://stackoverflow.com/questions/1439326/how-to-find-all-links-pages-on-a-website – Syed Sep 06 '21 at 14:03
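To cover every page on the domain, as suggested in the comments above, one option is to walk the site's sitemap and check each listed page in turn. A minimal sketch, assuming the site exposes a flat sitemap.xml (not a sitemap index) and that requests, beautifulsoup4, and lxml (for the XML parser) are installed; the URLs are the question's placeholders.

import requests
from bs4 import BeautifulSoup

# Read the sitemap to get the list of pages on the domain.
sitemap = requests.get("https://example.com/sitemap.xml")
urls = [loc.text for loc in BeautifulSoup(sitemap.text, "xml").find_all("loc")]

# Fetch each page and remember the ones that link to the target URL.
matches = []
for url in urls:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    if any("my-link.com" in a["href"] for a in soup.find_all("a", href=True)):
        matches.append(url)

print(matches)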