I've used the honeypot captcha on three forms since about 2010, and it's been stunningly effective with no modifications until very recently. We've just made some changes that we think will stop most of the spambots, at least until they get more sophisticated. In broad strokes, here's the way we've set it up:
One input field on each form is hidden (via a CSS class that sets display:none) and has a default value of "". For screen readers and such, the hidden input's label makes it clear that the field must be left empty. Because the field is empty by default, we use server-side code (ColdFusion in our case, but it could be any language) to stop the form submission if anything at all shows up in that field. When we interrupt a submission that way, we give the same user feedback as if it had succeeded ("Thank you for your comment" or something similar), so there is no outward indication of failure.
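For anyone who wants to see the shape of it, here's a minimal sketch of that setup. The field name ("website"), the CSS class name, and the thank-you wording are just placeholders, not our actual values, and both pieces sit in ordinary .cfm templates:

```cfm
<!--- In the stylesheet: .hp-field { display: none; } --->

<!--- The honeypot field: hidden from sighted users, labeled for screen readers --->
<div class="hp-field">
    <label for="website">Leave this field empty</label>
    <input type="text" id="website" name="website" value="" tabindex="-1" autocomplete="off">
</div>

<!--- On the action page: any content in the hidden field means a bot filled it in.
      Pretend the submission succeeded, then stop processing. --->
<cfif structKeyExists(form, "website") AND len(trim(form.website)) GT 0>
    <cfoutput><p>Thank you for your comment.</p></cfoutput>
    <cfabort>
</cfif>
```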
But over time, the bots wised up and the simplest of our forms was getting hammered with spam. The forms with front-end validation held up well, and I suppose that's because they also don't accept just any old text input, but require an email address to be structured like an email address, and so on. The one form that proved vulnerable had only a text input for comments and two optional inputs for contact information (phone number and email); importantly, I think, none of those inputs included front-end validation.
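For what it's worth, the front-end validation on the forms that held up is nothing fancier than standard HTML5 attributes in the form markup, something along these lines (field names are illustrative):

```cfm
<!--- Markup in the .cfm form template. type="email" rejects anything not shaped
      like an address even when the field is optional; the pattern on the phone
      field is a deliberately loose example. --->
<input type="email" name="email">
<input type="tel" name="phone" pattern="[0-9()+\- ]{7,20}">
<textarea name="comments" required></textarea>
```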
It will be easy enough to add that validation, and we'll do that soon. For now, though, we've added what others have suggested in the way of a "time trap." We set a time variable when the page loads and compare that timestamp to the time the form is submitted. At the moment we reject anything submitted less than 10 seconds after the page loads, though some people have suggested three seconds. We'll make adjustments as needed. I want to see what effect this alone has on the spam traffic before adding the front-end validation.
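Here's roughly how a time trap like that can be wired up. This is one way to do it, not necessarily how every implementation works: the session variable name and the 10-second threshold are just examples, and it assumes session management is enabled in Application.cfc. As with the honeypot, a too-fast submission gets the same friendly response as a real one:

```cfm
<!--- On the page that renders the form: record when it was served --->
<cfset session.formRenderedAt = now()>

<!--- On the action page: a submission arriving within 10 seconds of the
      page load is almost certainly a bot --->
<cfif NOT structKeyExists(session, "formRenderedAt")
      OR dateDiff("s", session.formRenderedAt, now()) LT 10>
    <!--- Same outward response as a successful submission, then stop --->
    <cfoutput><p>Thank you for your comment.</p></cfoutput>
    <cfabort>
</cfif>
```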
So the quick summary of my experience is this: The honeypot works pretty well as it was originally conceived. (I don't recall where I found it first, but this post is very similar to the first I saw about it more than a decade ago.) It seems even more effective with the addition of client-side validation enabled by HTML5. And we think it will be even better with the server-side limits we've now imposed on those too-hasty submissions.
Lastly, I'll mention that solutions like reCAPTCHA are off the table for us. We spent significant time developing a web app on Google's Maps API, and it worked great until Google changed the API without warning and without transition advice. We won't marry the same abusive spouse twice.