You can create a robots.txt file at the root of your site (www.mysite.com/robots.txt) and use the Disallow directive.
For example, since you mentioned these three URLs:
www.mysite.com/example.html?start=10
www.mysite.com/example.html?start=20
www.mysite.com/example.html?limitstart=0
you should use this:
Disallow: /*?start=
Disallow: /*?limitstart=
Each Disallow value is matched against the URL as a prefix, so a plain Disallow: /?start= would only block URLs directly under the root, not /example.html?start=10. The * wildcard, supported by Google, Bing, and most other major crawlers (though not part of the original standard), matches any sequence of characters, so /*?start= blocks any URL whose query string begins with start=. Rules can target specific files or folders the same way.
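If you want to sanity-check the patterns before deploying them, here is a minimal sketch of that matching logic in Python (an illustration of the documented wildcard behavior, not Google's actual implementation):

import re

def rule_matches(rule, url):
    # Google-style matching: the rule is matched as a prefix, '*'
    # matches any sequence of characters, and a trailing '$'
    # anchors the end of the URL.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url) is not None

rules = ["/*?start=", "/*?limitstart="]
urls = [
    "/example.html?start=10",
    "/example.html?start=20",
    "/example.html?limitstart=0",
]
for url in urls:
    blocked = any(rule_matches(r, url) for r in rules)
    print(url, "-> blocked" if blocked else "-> allowed")

Running this reports all three URLs as blocked, which is what you want here.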
You can also specify which bots you want to hide the files or folders from, by using the User-agent directive:
User-agent: *
Disallow: /*?start=
Disallow: /*?limitstart=
The block above applies to any bot or crawling engine.
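If you prefer to test against a real parser, Python's standard urllib.robotparser shows how a User-agent group gets applied. Note that it implements the original prefix-only matching and treats * in paths literally, so this sketch uses a plain folder rule (the /private/ folder is just an illustration):

from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
"""

# Parse the rules directly; robotparser can also fetch a live file
# with set_url(...) followed by read().
rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("SomeBot", "http://www.mysite.com/private/page.html"))  # False
print(rp.can_fetch("SomeBot", "http://www.mysite.com/example.html"))       # True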
User-agent: Googlebot
Disallow: /*?start=
Disallow: /*?limitstart=
This block, on the other hand, applies only to Google's crawler.
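Keep in mind that when a robots.txt contains several groups, a crawler obeys only the single group whose User-agent line matches it most specifically, not the union of all groups. So in a combined file like the sketch below, Googlebot would follow its own group and ignore the * group entirely:

User-agent: *
Disallow: /*?start=

User-agent: Googlebot
Disallow: /*?start=
Disallow: /*?limitstart=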
For reference, you can read the material on www.robotstxt.org; Wikipedia also has a reasonably good page: http://en.wikipedia.org/wiki/Robots.txt
Another, more detailed reference is Google's documentation: https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt