Two comments - one micro, one macro.
First, it seems like the TPL will fit your stated goals. Using Task.Factory.StartNew(), you'll be able to spawn an arbitrary number of operations that will run on background threads. Further, using the overloads that take a CancellationToken, you can cancel operations to as fine a degree as you wish. Note that a Task is not equivalent to a Thread - by default, Tasks run on ThreadPool threads (as do BackgroundWorkers, by the way). It's entirely possible that your tasks will end up in a queue waiting for a pooled thread to become available.
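Here's a minimal sketch of what that looks like - the task count, the sleep, and the "work" are all placeholders, but the StartNew/CancellationToken pattern is the real API:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class TaskCancellationSketch
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var token = cts.Token;

        var tasks = new Task[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            int id = i; // capture the loop variable for the closure
            tasks[i] = Task.Factory.StartNew(() =>
            {
                while (true)
                {
                    // Cooperative cancellation: the task checks the token itself.
                    token.ThrowIfCancellationRequested();
                    Console.WriteLine("Task {0} doing work", id);
                    Thread.Sleep(500); // placeholder for real work
                }
            }, token);
        }

        Thread.Sleep(2000);
        cts.Cancel(); // request cancellation of every task sharing this token

        try { Task.WaitAll(tasks); }
        catch (AggregateException) { /* cancelled tasks surface here */ }
    }
}
```

If you want finer-grained control, hand out a separate CancellationTokenSource per logical group of operations and cancel each group independently.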
Second, the bigger comment.
You claim that you will have an unspecified and indefinite number of work items. The idea of spawning a thread per item makes me a bit edgy - threads aren't cheap, and if you have a large number of work items you can starve your process by naively starting new threads.
Can you elaborate on what you need to do with these threads? I don't know what constitutes a work item for your project, but it's entirely possible that a producer/consumer setup with a fixed number of threads would meet your parallelization needs. .NET 4.0 makes it easy with the System.Collections.Concurrent classes.
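As a rough illustration (the item type, queue capacity, and consumer count are arbitrary here), a fixed pool of consumers draining a BlockingCollection<T> keeps thread usage bounded no matter how many items arrive:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerSketch
{
    static void Main()
    {
        // Bounded queue: producers block if consumers fall too far behind.
        var queue = new BlockingCollection<string>(100);

        // Fixed number of consumers, regardless of how many items arrive.
        var consumers = new Task[4];
        for (int i = 0; i < consumers.Length; i++)
        {
            consumers[i] = Task.Factory.StartNew(() =>
            {
                // GetConsumingEnumerable blocks until items are available and
                // finishes once CompleteAdding has been called and the queue drains.
                foreach (var item in queue.GetConsumingEnumerable())
                {
                    Console.WriteLine("Processing {0}", item);
                }
            });
        }

        // Producer: add an arbitrary number of work items.
        for (int i = 0; i < 1000; i++)
        {
            queue.Add("work item " + i);
        }
        queue.CompleteAdding(); // signal that no more items are coming

        Task.WaitAll(consumers);
    }
}
```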
EDIT in response to comment
The case of TCP connections is very nearly what I was thinking about when I wrote the second comment above. In this case, you are considering a potentially infinite consumption of a limited OS resource, namely open sockets. I hope that you will investigate non-blocking I/O. This MSDN page documents asynchronous reading from sockets, and this one talks about the larger asynchronous programming model in .NET.
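To make that concrete, here's a minimal sketch of the Begin/End (APM) pattern for receiving from a socket - it assumes an already-connected Socket and omits error handling, so treat it as an outline rather than production code:

```csharp
using System;
using System.Net.Sockets;
using System.Text;

class AsyncReadSketch
{
    private readonly Socket _socket;
    private readonly byte[] _buffer = new byte[4096];

    public AsyncReadSketch(Socket connectedSocket)
    {
        _socket = connectedSocket;
    }

    public void StartReceiving()
    {
        // Returns immediately; no thread blocks waiting on the socket.
        _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None,
                             OnReceive, null);
    }

    private void OnReceive(IAsyncResult ar)
    {
        int bytesRead = _socket.EndReceive(ar);
        if (bytesRead > 0)
        {
            Console.WriteLine(Encoding.UTF8.GetString(_buffer, 0, bytesRead));
            StartReceiving(); // queue the next asynchronous read
        }
        else
        {
            _socket.Close(); // zero bytes means the remote side closed
        }
    }
}
```

The point is that a handful of I/O completion threads can service a large number of connections, instead of one blocked thread per socket.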
For scalable TCP serving, a good discussion already exists on SO: peruse at your pleasure.