2

I read somewhere that professionals output only one line of HTML and one line of JavaScript per page, with the rest of the rendering done by the client. I found this very promising, so I thought I'd use the following structure to render pages:

<html>
  <head>
    {{Title}}
    {{Meta tags}}
    {{CSS via CDN}}
  </head>
  <body>
    {{Dynamic JSON array containing the data of the current page}}
    {{Javascript libraries via CDN}}
    {{JS files that contain HTML templates via CDN}}
  </body>
</html>
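To make the idea concrete, here is a minimal sketch of what the client-side step might look like. The `PAGE_DATA` variable and the template functions are purely illustrative assumptions, not from any particular library; a real site would likely use a templating library such as Mustache or Handlebars loaded from a CDN.

```javascript
// Hypothetical page data, as the server would inline it in the page:
// <script>var PAGE_DATA = [...]</script>
var PAGE_DATA = [
  { title: "First post", body: "Hello" },
  { title: "Second post", body: "World" }
];

// A tiny hand-rolled template function standing in for a real
// template engine loaded from a CDN.
function renderItem(item) {
  return "<article><h2>" + item.title + "</h2><p>" + item.body + "</p></article>";
}

// Render the whole page body from the JSON array.
function renderPage(data) {
  return data.map(renderItem).join("");
}

// In the browser you would then do something like:
// document.body.innerHTML = renderPage(PAGE_DATA);
```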

So the questions are:

  • Is it really a good practice?
  • Is it worth it to load the HTML templates via CDN?

SEO is secondary, but of course I'd render some necessary meta tags.

Thanks for your answers!

martintrapp
  • 769
  • 6
  • 15
  • It all depends on your requirements – Chethan N Feb 17 '14 at 09:54
  • It's 2014, I'm just looking for new techniques... Increasing performance etc... – martintrapp Feb 17 '14 at 09:58
  • 1
    While google does parse JS, I'm not sure how well does it index JS generated content (might be worth researching). I understand you don't really care about SEO, but if no one can find your site, it's another problem. – jahu Feb 17 '14 at 10:13
  • Read here (might be outdated though): http://moz.com/ugc/can-google-really-access-content-in-javascript-really. According to this article, Google only indexes JS generated content if the JS is contained in the html output and ignores content generated through external JS files (except for AJAX requests, but having your AJAXes indexed separately is usually not desired). While it is good practice to include meta tags on all your pages, other than title, their role in indexing varies from non-existent to marginal (depending on search engine). You want your content to be visible to search engines. – jahu Feb 19 '14 at 09:04

4 Answers

3

Is it really a good practice?

That's rather subjective. It depends on how much you value reliability, performance and cost.

You can get a performance boost, but you either:

  • Have a very fragile system that will completely break if a JS file fails to load for any reason, trips over a browser bug, etc.; or
  • Have to start by building all your logic server side and then duplicate all the work client side and use pushState and friends to have workable URIs.
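As a rough sketch of that second approach, client-side navigation usually pairs `history.pushState` with a fetch of the page's data. The `.json` endpoint convention below is an assumption for illustration, not a standard:

```javascript
// Map a page URL to a hypothetical JSON endpoint backing it
// (the ".json" suffix convention here is made up for the example).
function dataUrlFor(path) {
  return path.replace(/\/$/, "") + ".json";
}

// In the browser, navigation would then look roughly like:
// function navigate(path) {
//   history.pushState({ path: path }, "", path);
//   fetch(dataUrlFor(path))
//     .then(function (res) { return res.json(); })
//     .then(function (data) { /* re-render the page from data */ });
// }
// The server must still answer a full request for the same path,
// or direct visits (and crawlers) get an empty page.
```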

SEO is secondary, but of course I'd render some necessary meta tags.

Leaving aside questions of meta tags being necessary for SEO… rendering them with client side JavaScript is pointless. If the content is rendered only with client side JS then search engines won't see it at all.

Is it worth it to load the HTML templates via CDN?

Again, it depends. Using a CDN can cause your static files to be delivered faster, but they are an added expense and require a more complex build system for your site (since you have to deploy to multiple servers and make sure the published URIs match up).

Quentin
  • 914,110
  • 126
  • 1,211
  • 1,335
  • Thank you for your answer! I thought it's obvious but I meant I want to render meta and title tags on the server side. That's mostly necessary for open graph contents. – martintrapp Feb 17 '14 at 10:27
  • @martincpt — What about for pages that are not the homepage? – Quentin Feb 17 '14 at 10:30
  • Metas and title tags would change page by page of course. – martintrapp Feb 17 '14 at 10:33
  • @martincpt — They won't if your pages are built only by client side code. – Quentin Feb 17 '14 at 11:18
  • I mean changing them server side. So the head would always be rendered by the server itself. – martintrapp Feb 17 '14 at 11:29
  • @martincpt — If you are loading a new HTML document for each page, then using JavaScript to render all the content on it makes no sense. The benefits from using JS for that derive from *not* loading a whole new page each time. – Quentin Feb 17 '14 at 11:32
  • I'd not load the whole page again for users who visit another link. That makes no sense as you said. But if the request isn't ajax based, I must render the head on server side, so bots will see it. – martintrapp Feb 17 '14 at 11:41
  • Then search engines will see a page with meta data but no content and, *at best*, give the page a very low rank (and that's assuming they find the page at all - you won't have any links in the homepage for them to follow). – Quentin Feb 17 '14 at 11:46
  • So what would you recommend? Keep it old school? I mostly develop with django by the way. – martintrapp Feb 17 '14 at 11:57
  • That you start out by building a website that works without JavaScript, and then decide if you want to do the work to duplicate the server side view logic on the client using templates and `pushState` to map the client side changes onto the URLs of the existing pages. – Quentin Feb 17 '14 at 12:13
  • It isn't necessarily true that search engines don't see content generated by JavaScript. It used to be true, but since a couple of years ago they can see it in a lot of cases. They may penalize you for it compared to normal HTML, though. (I've heard they do, but I've also heard that it's mainly because of the additional delay in fetching and rendering with JavaScript versus just serving the HTML inline.) – Chuck Feb 17 '14 at 18:48
1

Of course, this is good practice (if SEO really is of secondary importance):

  • Dynamically loading a JSON array containing the data of the current page
  • Loading JavaScript libraries via CDN
  • Loading JS files that contain HTML templates via CDN

Besides, you can minify your JavaScript and gzip it. Client-side scripting is much faster than server-side scripting as far as performance is concerned.

  • SEO is still an option with SPA http://stackoverflow.com/questions/18530258/how-to-make-a-spa-seo-crawlable – axelduch Feb 17 '14 at 10:13
  • SEO on Javascript applications can also be facilitated using services like SnapSearch https://snapsearch.io/ – CMCDragonkai Apr 24 '14 at 17:09
0

There are of course pros and cons of rendering website in the client.

Pros:

  • You can reuse a given template. There's no need to ask the server to render a given UI element, so it's faster.
  • When using tools like Meteor.js you can go even further: when rerendering a template, only the parts that changed are replaced.
  • You can include a given module/subpage on demand, so you can still avoid loading all the data on the first load.
  • When the server doesn't render the website, it can handle more requests.
  • Websites are more dynamic. The user gets a feeling of immediacy when switching pages.

Cons:

  • It's not SEO friendly, but there are easy-to-use tools that help deal with it (there is one for Meteor.js).

The calculation is easy :). Use dynamic JS website rendering :).

Łukasz Jagodziński
  • 3,009
  • 2
  • 25
  • 33
  • Thanks for your answer! I mostly use django to develop sites, so my plan is to use a django-like JS template engine like Swig. https://github.com/paularmstrong/swig – martintrapp Feb 17 '14 at 10:31
  • Of course you can use whatever template engine you want. However, try to minimize the template size if it's often rerendered, especially when it needs to rerender the whole template instead of only the parts that changed. – Łukasz Jagodziński Feb 17 '14 at 11:40
0

It makes your initial render slower (browsers are extremely well optimized for rendering HTML), which can potentially affect your search rankings, and it is somewhat less amenable to caching. Twitter tried a JavaScript-and-JSON-only architecture and ended up going back to serving a prerendered page along with the JavaScript app because it gave better perceived response times. (Again, the actual response times aren't necessarily better, but the user sees the response sooner.)

Chuck
  • 234,037
  • 30
  • 302
  • 389
  • But if you take a look at google they have 90% JS in their sources. – martintrapp Feb 17 '14 at 11:23
  • @martincpt: Which is fine. You can serve lots of JS. Heck, you can even serve only JS, but there are drawbacks and that isn't what Google does outside of highly dynamic pages. – Chuck Feb 17 '14 at 16:42