
I have a site with multiple subdomains and I want the named subdomains' robots.txt to be different from the www one.

I tried to use .htaccess, but FastCGI doesn't look at it.

So I was trying to set up routes, but it doesn't seem that you can do a direct rewrite, since every route needs a controller:

map.connect '/robots.txt', :controller => ?, :path => '/robots.www.txt', :conditions => { :subdomain => 'www' }
map.connect '/robots.txt', :controller => ?,  :path => '/robots.club.txt'

What would be the best way to approach this problem?

(I am using the request_routing plugin for subdomains)

– Christopher

6 Answers


Actually, you probably want to set a mime type in mime_types.rb and do it in a respond_to block so it doesn't return it as 'text/html':

Mime::Type.register "text/plain", :txt

Then, your routes would look like this:

map.robots '/robots.txt', :controller => 'robots', :action => 'robots'

For Rails 3:

match '/robots.txt' => 'robots#robots'

and the controller would look something like this (put the file(s) wherever you like):

class RobotsController < ApplicationController
  def robots
    subdomain = request.subdomains.first.to_s # get subdomain; sanitize it before using it in a path
    robots = File.read(RAILS_ROOT + "/config/robots.#{subdomain}.txt")
    respond_to do |format|
      format.txt { render :text => robots, :layout => false }
    end
  end
end

At the risk of overengineering it, I might even be tempted to cache the file read operation...
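A minimal sketch of that caching idea, assuming a simple per-process hash is enough (the ROBOTS_CACHE constant is illustrative, not from the original answer):

class RobotsController < ApplicationController
  # Illustrative only: read each robots file from disk once per process
  # and reuse the cached string on subsequent requests.
  ROBOTS_CACHE = {}

  def robots
    subdomain = request.subdomains.first.to_s
    robots = ROBOTS_CACHE[subdomain] ||= File.read(RAILS_ROOT + "/config/robots.#{subdomain}.txt")
    respond_to do |format|
      format.txt { render :text => robots, :layout => false }
    end
  end
end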

Oh, yeah, you'll almost certainly have to remove/move the existing 'public/robots.txt' file.

Astute readers will notice that you can easily substitute RAILS_ENV for subdomain...
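In other words, to vary the file by environment instead of subdomain, the read becomes:

# e.g. config/robots.production.txt, config/robots.development.txt
robots = File.read(RAILS_ROOT + "/config/robots.#{RAILS_ENV}.txt")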

– TA Tyree
  • Thanks for posting this; it inspired me. I simplified this technique a little bit and updated it for Rails 3: http://www.timbabwe.com/2011/08/rails-robots-txt-customized-by-environment-automatically/ – tkrajcar Aug 30 '11 at 19:14
  • Thanks, I liked your solution. I've posted a Rails 3.x solution with some slight changes. – luis.madrigal Dec 20 '12 at 20:18

Why not use Rails' built-in views?

In your controller add this method:

class StaticPagesController < ApplicationController
  def robots
    render :layout => false, :content_type => "text/plain", :formats => :txt
  end
end

In the views, create a file app/views/static_pages/robots.txt.erb with the robots.txt content.
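For example, the view body is just robots.txt text, and because it is ERB you can interpolate values such as the host (the directives here are illustrative):

<%# app/views/static_pages/robots.txt.erb %>
User-agent: *
Disallow: /admin/
Sitemap: http://<%= request.host %>/sitemap.xml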

In routes.rb place:

get '/robots.txt' => 'static_pages#robots'

Delete the file /public/robots.txt

You can add specific business logic as needed, but this way you don't read any custom files.

– ramigg

As of Rails 6.0 this has been greatly simplified.

By default, if you use the :plain option, the text is rendered without using the current layout. If you want Rails to put the text into the current layout, you need to add the layout: true option and use the .text.erb extension for the layout file. Source

class RobotsController < ApplicationController 
  def robots
    subdomain = request.subdomain # Whatever logic you need
    robots = File.read( "#{Rails.root}/config/robots.#{subdomain}.txt")
    render plain: robots
  end
end

In routes.rb

get '/robots.txt', to: 'robots#robots'
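And if you ever did want the surrounding layout for plain text, the quoted docs mean you would create a plain-text layout such as app/views/layouts/application.text.erb containing <%= yield %> and opt in explicitly (a sketch, not needed for robots.txt itself):

render plain: robots, layout: true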
– Ryan Romanchuk

For Rails 3:

Create a controller RobotsController:

class RobotsController < ApplicationController
  # This controller renders the correct 'robots' view depending on your subdomain.
  def robots
    subdomain = request.subdomain # you should also check for emptiness
    render "robots.#{subdomain}"
  end
end

Create robots views (1 per subdomain):

  • views/robots/robots.subdomain1.txt
  • views/robots/robots.subdomain2.txt
  • etc...

Add a new route in config/routes.rb: (note the :txt format option)

match '/robots.txt' => 'robots#robots', :format => :txt

And of course, you should declare the :txt format in config/initializers/mime_types.rb:

Mime::Type.register "text/plain", :txt

Hope it helps.

– levandch

I liked TA Tyree's solution, but it is very Rails 2.x centric, so here is what I came up with for Rails 3.1.x.

mime_types.rb

Mime::Type.register "text/plain", :txt

By adding the format in the routes you don't have to worry about using a respond_to block in the controller.

routes.rb

match '/robots.txt'   => 'robots#robots',   :format => "text"

I added a little something extra on this one. The SEO people were complaining about duplicated content both on subdomains and on SSL pages, so I created two robots files: one for production and one for non-production, which is also served for any SSL/HTTPS requests in production.
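The files themselves are just ordinary robots.txt content; a sketch of what they might hold (the exact directives are illustrative, not from the original answer):

config/robots-production.txt

User-agent: *
Disallow: /admin/

config/robots-nonproduction.txt

User-agent: *
Disallow: /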

robots_controller.rb

class RobotsController < ApplicationController 
  def robots
    site = request.host
    protocol = request.protocol
    # Serve the production rules only for plain-HTTP requests to the main domain.
    domain = (site.eql?("mysite.com") || site.eql?("www.mysite.com")) && protocol.eql?("http://") ? "production" : "nonproduction"
    robots = File.read("#{Rails.root}/config/robots-#{domain}.txt")
    render :text => robots, :layout => false
  end
end
– luis.madrigal

If you can't configure your HTTP server to do this before the request is sent to Rails, I would just set up a 'robots' controller that renders a template like:

def show_robot
  subdomain = request.subdomains.first.to_s # get subdomain; sanitize it before using it in a path
  render :text => File.read("robots.#{subdomain}.txt"), :layout => false
end

Depending on what you're trying to accomplish, you could also use a single template instead of a bunch of different files.
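A sketch of that single-template idea, assuming a view like app/views/robots/show_robot.txt.erb (the name is illustrative) that branches on the subdomain:

<% if request.subdomains.first == 'www' %>
User-agent: *
Disallow:
<% else %>
User-agent: *
Disallow: /
<% end %>

The action then just renders that view with :layout => false instead of reading a file.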

– jdeseno