I have code on my server which works very well. It must crawl a few pages on remote sites to work properly. I know some users may want to abuse my site, so instead of running the code (which uses WebClient and HttpRequest) on my server, I would like to run it on the client side, so that if it is abused, the user may have his IP blacklisted instead of my server's. How might I run this code client side? I am thinking Silverlight may be a solution, but I know nothing about it.
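For context, the crawl is along these lines (a simplified sketch only; the URL and method names are illustrative, not my actual code):

```csharp
using System;
using System.Net;

class Crawler
{
    // Simplified stand-in for the server-side crawl: fetch one remote page.
    static string FetchPage(string url)
    {
        using (var client = new WebClient())
        {
            // The remote site sees the server's IP, which is the problem here.
            return client.DownloadString(url);
        }
    }

    static void Main()
    {
        string html = FetchPage("http://example.com/some-page"); // placeholder URL
        Console.WriteLine(html.Length);
    }
}
```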
- By symmetry, is it not very easy for your code to abuse your client's machine? Especially if you get (as you want) access to the client machine's file system. – Tarydon Jan 07 '10 at 08:14
- Silverlight runs inside a "sandbox", meaning there are a lot of restrictions placed on it for security reasons (even in v4, you can save a file to the user's drive, but the only information you are given is a file name without the path and the FileStream instance). It's highly unlikely that you will be able to execute a web crawl from Silverlight, especially with its cross-domain security in place. – Rory Jan 07 '10 at 08:16
- @Rory: +1 Quite so. Pity you didn't enter this as an answer. – AnthonyWJones Jan 07 '10 at 09:14
3 Answers
Yes, Silverlight is the solution that lets you run a limited subset of .NET code on the client's machine. Just google for "Silverlight limitations" to get more information about what's not available.
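One concrete limitation worth knowing up front: Silverlight's WebClient is asynchronous only, so synchronous crawl code has to be restructured around callbacks. A minimal sketch (the URI is a placeholder, and the target site must still opt in via a cross-domain policy file):

```csharp
using System;
using System.Net;

public class SilverlightFetcher
{
    public void Fetch(Uri pageUri)
    {
        var client = new WebClient();
        client.DownloadStringCompleted += (s, e) =>
        {
            if (e.Error != null)
            {
                // A missing clientaccesspolicy.xml/crossdomain.xml on the
                // target site typically surfaces as an error here.
                return;
            }
            ProcessHtml(e.Result);
        };
        // There is no synchronous DownloadString in Silverlight.
        client.DownloadStringAsync(pageUri);
    }

    private void ProcessHtml(string html)
    {
        // Parsing/crawling logic would go here.
    }
}
```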
I don't know what scenario you're trying to implement, or whether you need real-time results, but I'd guess caching the crawl results could be a good idea.
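To illustrate the caching idea, even a plain in-memory dictionary keyed by URL with an expiry window would cut the number of remote requests (a sketch only; `fetchPage` stands in for the existing crawl code, and the class is not thread-safe as written):

```csharp
using System;
using System.Collections.Generic;

public class CrawlCache
{
    private readonly Dictionary<string, KeyValuePair<DateTime, string>> _cache =
        new Dictionary<string, KeyValuePair<DateTime, string>>();
    private readonly TimeSpan _ttl = TimeSpan.FromMinutes(30); // arbitrary window

    public string Get(string url, Func<string, string> fetchPage)
    {
        KeyValuePair<DateTime, string> entry;
        if (_cache.TryGetValue(url, out entry) &&
            DateTime.UtcNow - entry.Key < _ttl)
        {
            return entry.Value; // fresh enough: no remote request made
        }

        string html = fetchPage(url);                // the real crawl
        _cache[url] = new KeyValuePair<DateTime, string>(DateTime.UtcNow, html);
        return html;
    }
}
```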
In case you're after web scraping, you should be able to find a couple of JavaScript frameworks that will do that for you.

- Looks like I could do some of the things I need, but I'd have to change my code enough to make it not worth it. The answer is correct and accepted; it looks like I'll write/copy-paste it into a desktop app instead. – Jan 07 '10 at 09:00
I think your options here are Silverlight or some sort of desktop app.
Unless maybe there is a jQuery library or some other client-side scripting approach that can do the same things.

That's an interesting request (no pun intended). If you do use Silverlight, then maybe instead of porting your logic to it, create a simple Proxy class in it that receives requests from your server app and shuttles them forward for the dirty work. Same with the incoming responses: have your Silverlight proxy send them back to the server app.
This way you have the option of running your server app through the Silverlight proxy in some instances, and on its own (with no proxy) in others. The Silverlight plugin should provide a consistent API to program against no matter which browser it's running in.
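A rough sketch of that proxy idea (all names and the server endpoint here are hypothetical; the point is only the shape of the request/response shuttle):

```csharp
using System;
using System.Net;

// Hypothetical Silverlight-side proxy: the server app tells it which page to
// fetch, the fetch happens from the client's IP, and the body is posted back.
public class CrawlProxy
{
    private readonly Uri _serverEndpoint; // your server app's collection URL

    public CrawlProxy(Uri serverEndpoint)
    {
        _serverEndpoint = serverEndpoint;
    }

    public void HandleCrawlRequest(Uri target)
    {
        var client = new WebClient();
        client.DownloadStringCompleted += (s, e) =>
        {
            if (e.Error == null)
                SendBack(e.Result);
        };
        client.DownloadStringAsync(target); // runs from the client's machine
    }

    private void SendBack(string html)
    {
        var uploader = new WebClient();
        // Ship the fetched page back to the server app for processing.
        uploader.UploadStringAsync(_serverEndpoint, "POST", html);
    }
}
```

Keep in mind that Silverlight's cross-domain policy still applies to the outbound fetch: the crawled site would have to serve a clientaccesspolicy.xml or crossdomain.xml for this to work at all.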
If using a proxy solution in the web browser, you might even be able to skip Silverlight altogether and use JavaScript/AJAX calls. Of course, this kind of thing is usually fraught with browser-compatibility issues, and it would be an obscure push/pull implementation for sure, but I think JavaScript can, in some usage scenarios, access domains and URLs other than the one it originated from.
If Silverlight security stands in the way, you might look into other kinds of programmable (Turing-complete) browser plugins like Java, Flash, etc. If memory serves correctly, the Java plugin can only communicate over the network with the domain it originated from; that kind of security is too restrictive for your crawling needs.
