I've done a lot of development in C#/.NET, and the asynchronous story has been there from day one (admittedly the APIs have changed significantly over the years, from begin/end, to events, to `Task<T>` with `async`/`await`). For the last year or so I've been developing with Node.js, which does all I/O asynchronously and uses a single-threaded event-loop model. Recently I was working on a project where we were using Ruby, and for one part of the application I felt it made sense to fire off a whole bunch of web requests asynchronously. I was surprised to find that the asynchronous story in Ruby is vastly different: the only way to do any asynchronous I/O seems to be through the EventMachine gem.
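To make the Ruby side concrete, this is roughly the shape of what I ended up writing (just a sketch using the em-http-request gem on top of EventMachine, not our actual code; the URLs are placeholders):

```ruby
require 'eventmachine'
require 'em-http-request' # HTTP client built on top of EventMachine

urls = %w[http://example.com/a http://example.com/b http://example.com/c]

# Everything has to happen inside the reactor loop.
EM.run do
  pending = urls.size

  urls.each do |url|
    http = EventMachine::HttpRequest.new(url).get

    http.callback do
      puts "#{url} -> #{http.response_header.status}"
      pending -= 1
      EM.stop if pending.zero? # the loop keeps running until we explicitly stop it
    end

    http.errback do
      puts "#{url} -> failed"
      pending -= 1
      EM.stop if pending.zero?
    end
  end
end
```

The requests do run concurrently, but only because I wrapped the whole thing in `EM.run`, which is exactly the part that surprised me.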
My question comes down to this: why is it that in .NET (and, from what I can tell, in Java/the JVM as well) there's no need for an event loop and I can fire off an asynchronous request at any time, yet in languages like Ruby and Python I have to resort to EventMachine or Twisted respectively? I feel like there's something fundamental about how asynchronous I/O works that I'm not understanding.