I'm the founder of a fairly large "toplist" site. Users can submit their website to the toplist and gain a higher position by collecting votes for their entry.
The average visitor is young, mostly between 13 and 20 years old. That brings benefits, but also drawbacks. Over the past 5 years I've been actively fighting "cheaters" who use robots to obtain votes. These robots used proxies and varying user-agents, and even managed to solve several kinds of CAPTCHA challenges (reCAPTCHA, SolveMedia and custom CAPTCHAs). The use of these robots has dropped dramatically since I introduced a new system that randomizes the page layout and loads 1 of 15 different CAPTCHA systems on each page load. It no longer seems to be a problem.
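For context, this is roughly what the rotation looks like. It's a minimal sketch in Python/Flask, not my actual stack, and the renderer names in the list are placeholders rather than the real 15 systems:

```python
# Sketch of per-pageload CAPTCHA rotation: pick one of several CAPTCHA
# systems at random and remember the choice so the answer can be validated.
import random
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-me"  # needed for Flask sessions

# Hypothetical renderer identifiers; the real site rotates 15 systems.
CAPTCHA_RENDERERS = ["recaptcha", "solvemedia", "custom_math", "custom_image"]

@app.route("/vote/<int:entry_id>")
def vote_page(entry_id):
    chosen = random.choice(CAPTCHA_RENDERERS)
    session["captcha_system"] = chosen  # validated later on form submit
    return f"Rendering vote form for entry {entry_id} with CAPTCHA: {chosen}"
```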
People have now moved on to manual cheating. They're using browser plugins that change their apparent IP address on pretty much every page load (e.g. https://addons.mozilla.org/en-us/firefox/addon/ipflood/). I can't find a way to fight this, and it's a big issue. Hard as it is to believe, these kids are manually solving 5,000 CAPTCHA challenges, which takes ages.
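One check I've been toying with, assuming such add-ons only inject spoofed forwarding headers while the TCP connection still exposes the real address (which may not hold for genuine proxies), is to compare what the server sees against what the browser claims. Again a Flask sketch, and it assumes the site is not itself behind a reverse proxy:

```python
# Sketch: compare the TCP-level peer address with the client-supplied
# X-Forwarded-For header, which a browser plugin can fake freely.
from flask import Flask, request

app = Flask(__name__)

@app.route("/vote-debug")
def vote_debug():
    real_ip = request.remote_addr                     # address of the TCP connection
    claimed = request.headers.get("X-Forwarded-For")  # header a plugin can spoof
    mismatch = claimed is not None and real_ip not in claimed
    return {
        "remote_addr": real_ip,
        "x_forwarded_for": claimed,
        "header_mismatch": mismatch,
    }
```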
My question is: can anybody help me think of a way to solve this? I've been setting cookies and session flags, but they've started to notice and remove them. I'm going to introduce user accounts and make voting through an account more attractive, but I don't want to require accounts. I doubt there is one, but is there any other way left to fight the cheating, perhaps something like a Java web app that bypasses browser-set proxies and passes the real IP to the page, if that's even possible? Or should I just give up and hire people to do daily checks on whether it's plausible for a site to gain that many votes?
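For reference, the cookie marking I'm relying on now is roughly like the sketch below (Python/Flask for illustration only; `record_vote` is a hypothetical persistence call). This is exactly the kind of client-side marker the cheaters are deleting by hand:

```python
# Sketch of cookie-based vote deduplication: set a per-entry "voted" cookie
# after a successful vote and refuse repeat votes while it is present.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/vote/<int:entry_id>", methods=["POST"])
def cast_vote(entry_id):
    cookie_name = f"voted_{entry_id}"
    if request.cookies.get(cookie_name):
        return "You already voted for this entry today.", 429
    # record_vote(entry_id)  # hypothetical: persist the vote server-side
    resp = make_response("Vote counted, thanks!")
    # Trivially removable on the client, which is the weakness described above.
    resp.set_cookie(cookie_name, "1", max_age=24 * 3600)
    return resp
```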