My background is primarily as a server dev and manager, but I have to get in and do work in front-end and iOS dev. I'm asking this question not to be argumentative, but I really don't get what is meant by the single-threaded nature of JavaScript in terms of network requests.
I understand how events get registered in JavaScript and in iOS at the userspace level. My point of confusion is that in iOS the UI will block if a network request is not put on a background thread, but in a browser / JavaScript runtime it won't. JavaScript (at least in current implementations, pre-web-workers) is ALWAYS described as single-threaded. I understand (I think, but this could be the problem) the way setInterval is used to poll for completion, but how can a single-threaded JavaScript runtime provide async functions that don't block the UI (especially given that iOS doesn't)? For example, in this answer it would seem that 5 threads would need to be created: Parallel asynchronous Ajax requests using jQuery
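To make it concrete, here is a rough sketch of the kind of thing I mean (the URLs are made up, and I'm assuming jQuery's $.ajax and $.when as in the linked answer):

```javascript
// All five requests appear to be "in flight" at the same time, yet my code
// never creates a thread (hypothetical URLs, just to illustrate the question).
var urls = ["/api/a", "/api/b", "/api/c", "/api/d", "/api/e"];

var requests = urls.map(function (url) {
    return $.ajax({ url: url });   // returns immediately; the UI does not block
});

// Runs later, on the same single thread, once every response has arrived.
$.when.apply($, requests).done(function () {
    console.log("all five requests completed");
});
```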
In fact, one could have 6 outbound network requests at once. When the JavaScript runtime is described as single-threaded, does this mean something fundamentally different from the iOS notion of multithreading (or, probably more accurately, the POSIX notion of a loop that calls pthread_create to handle each socket descriptor)?
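For contrast, this is roughly what I would expect "single-threaded" to imply, and what I can't square with the example above (again just a sketch, with a made-up URL):

```javascript
// My understanding of "single-threaded": while this loop spins, the one thread
// is occupied, so the callback below cannot fire even if the response has
// already come back over the network.
$.ajax({ url: "/api/slow" }).done(function () {
    console.log("this cannot run until the busy loop below has finished");
});

var start = Date.now();
while (Date.now() - start < 5000) {
    // busy-wait for 5 seconds; the UI and all callbacks are blocked
}
```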
I'm probably just not getting something, but I think most of the examples provided don't explain how this is actually done in a single-threaded environment (unless the network request is handled at the OS level rather than at the JavaScript runtime level).
Thanks for any help on this.