
URL in question: https://www.halleonard.com/viewpressreleasedetail.action?releaseid=10261

When you view the source in a browser and in Developer Tools, you can see all of the meta tags for Open Graph and Twitter. I have checked the Facebook Debugger and, aside from a few canonical issues, I'm fairly happy with the results.
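For reference, the tags in question look roughly like this; the values below are illustrative placeholders rather than the page's exact markup:

<!-- Illustrative sketch only; not copied from the live page -->
<meta property="og:title"       content="Press Release Title">
<meta property="og:description" content="Short summary of the press release.">
<meta property="og:image"       content="https://www.halleonard.com/path/to/image.jpg">
<meta name="twitter:card"       content="summary_large_image">
<meta name="twitter:site"       content="@publisherhandle">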

I also plugged the above URL into a third-party Open Graph debugger, http://debug.iframely.com/, and all of the tags for Open Graph, Twitter, and everything else come back positive.

Why is it that Twitter's Card Validator comes back with:

INFO:  Page fetched successfully
INFO:  9 metatags were found
ERROR: No card found (Card error)


Any insight on how I can get Twitter to display the card properly?

Murphy1976

3 Answers


I think you may be either blocking or redirecting the Twitterbot user agent.

I faked a Twitterbot agent with curl:

curl -A 'Twitterbot' 'https://www.halleonard.com/viewpressreleasedetail.action?releaseid=10261' -o ~/Desktop/what-twitterbot-sees.html

And this is what your server returns:

[screenshot of the HTML returned to the Twitterbot user agent]

As you can see, there are 10 meta tags (9 if you exclude the <meta charset> one, which matches what the card validator reports), and there are no <meta name="twitter:*"> tags at all.

You can reproduce this in your browser if you can set a custom user agent string, which is possible in Google Chrome:

[screenshot: setting a custom user agent string in Google Chrome]

I'm pretty sure there's some sort of redirection rule going on either at your web server level or in your application code.


According to developer.twitter.com, the user agent string I have used is correct:

Twitter uses the User-Agent of Twitterbot (with version, such as Twitterbot/1.0)

customcommander
  • We have already modified our robots.txt and Twitterbot has access to every page on the site. We added this to the file: "User-agent: Twitterbot Disallow: " – Murphy1976 Aug 20 '19 at 12:38
  • Hi, I checked your robots.txt file before posting; it looks fine to me too. What I am suggesting is that perhaps the issue lives on another layer of your stack. Nginx maybe? The point being, when I visit your site with my browser I do see your Twitter card; when I visit using curl and a fake agent, I get a different response. I'd definitely check that. – customcommander Aug 20 '19 at 12:51
  • We made some changes, and I can see the 9 (10) that appear before the Google Tag Manager script. Are you saying that ALL of these OG/Twitter tags need to be before the Google Tag Manager is initialized? – Murphy1976 Aug 20 '19 at 13:23
  • I doubt that GTM is the culprit here (although I can't rule it out 100% to be honest). If I were you I'd check your web server configuration or your application code. Perhaps you're actively redirecting traffic based on some conditions. Your issue is quite difficult to solve as is without additional context. At this stage I can only give you some hints. – customcommander Aug 20 '19 at 13:30
  • @Murphy1976 I can reproduce the same error by setting a custom user agent string in Chrome. See updated answer – customcommander Aug 21 '19 at 07:18
  • I can verify what you are seeing. I just don't know how to resolve it myself. – Murphy1976 Aug 21 '19 at 16:44
  • @Murphy1976 like customcommander suggested, I think the problem lies in your web server. So validate the configuration of the Apache server that hosts your web page, and also check if there is any user agent whitelisting configured. Maybe all you have to do is just remove it from the list. – karthick Aug 21 '19 at 20:28
  • We found the true culprit, but your research led the way, so you get the bounty. We had a Bot filter in our web.xml file that should have been blocking bots from crawling all pages behind the account login portions of our site, but instead, it was blocking bots from ALL pages (see the sketch after these comments). – Murphy1976 Aug 22 '19 at 15:08
  • In case it's not clear to others, this works perfectly with a locally run app. This was way helpful, thanks! curl -A 'Twitterbot' 127.0.0.1:3000 -o ~/Desktop/what-twitterbot-sees.html – Zack Biernat Aug 26 '21 at 13:53
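For anyone else landing here: a minimal sketch of the kind of web.xml change described in the comment above, scoping the bot filter to the login-protected area instead of the whole site. The filter name, class, and URL pattern below are placeholders, not the actual configuration:

<!-- Sketch only; names and paths are placeholders -->
<filter>
    <filter-name>BotFilter</filter-name>
    <filter-class>com.example.web.BotFilter</filter-class>
</filter>

<!-- Was effectively <url-pattern>/*</url-pattern>, which blocked bots on every page -->
<filter-mapping>
    <filter-name>BotFilter</filter-name>
    <url-pattern>/myaccount/*</url-pattern>
</filter-mapping>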

The error is coming from the twitter:card type you have selected: you forgot to add the twitter:image content. Both of these should be added:

<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:image" content="ADD IMAGE URL">
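For completeness, the suggested minimum set for a Summary Card with Large Image looks roughly like this (placeholder values; Twitter also falls back to og:title, og:description, and og:image when the twitter:* equivalents are missing):

<meta name="twitter:card"        content="summary_large_image">
<meta name="twitter:title"       content="Page title">
<meta name="twitter:description" content="One or two sentence description of the page.">
<meta name="twitter:image"       content="https://example.com/path/to/large-image.jpg">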

Reference Link

Akansh
  • On the documentation page that you have linked, it says that `twitter:image` is not required. Also, Twitter looks for Open Graph tags, which exist on that page. – customcommander Aug 21 '19 at 07:22
  • I said "should", not "must", so try to add the `twitter:image` tag and check if that works for you. The same page also says, `Below are the suggested minimum properties for the Summary Card with Large Image including title, description, and image.` – Akansh Aug 21 '19 at 08:29
  • Well, from the OP the issue doesn't seem to be about the tags themselves. As they mentioned, this works fine in all validation tools except the one from Twitter. So there must be an issue with how Twitter sees the site. Also note that the error isn't about the image not being found; it is the card itself that isn't. Surely Twitter would be more specific about the error if only the image was missing. – customcommander Aug 21 '19 at 08:52

I checked your source code with the W3C validator:

https://validator.w3.org/

It seems that Google Tag Manager is not installed properly: you have to move the Google Tag Manager (noscript) snippet (lines 10-13 of your source) to the top of the body section and remove it entirely from the head section.
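A rough sketch of the layout the Tag Manager docs describe, with a placeholder container ID:

<head>
  <!-- Google Tag Manager <script> snippet stays in the head -->
</head>
<body>
  <!-- Google Tag Manager (noscript) goes immediately after the opening body tag -->
  <noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXXXXX"
                    height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
  <!-- rest of the page -->
</body>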

You can find some information here: https://support.google.com/tagmanager/answer/6103696?hl=en

and also (more specifically) on your Tag Manager page (like the attached screenshot):

[screenshot: Tag Manager install instructions]

Claudio
  • I moved the Google Tag Manager noscript snippet to just inside the <body> tag and Twitter's Card Validator is still coming up with "ERROR: No card found (Card error)" – Murphy1976 Aug 20 '19 at 13:05