I started learning Spark a few months back, and while going through the architecture I ran into the following doubt. When the Spark driver requests resources (cores and memory) from YARN, does YARN hand out actual physical cores or threads? Is there any relationship between the number of cores and the number of threads in Spark (to my understanding, none in general)? YARN and the OS provide an abstraction layer over the CPU and its cores, so my understanding is that when the driver requests resources (cores), what it actually gets are threads. In that case we can effectively have more threads than physical CPU cores. Is my understanding correct? Also, is there any way to identify the physical cores (not threads) used to perform a task?
I went through the linked question (Apache Spark: The number of cores vs. the number of executors), but it explains the relationship between cores and executors, not between cores and threads. Say an executor has 4 cores and each core can run 10 threads; can that executor then run 10 * 4 = 40 tasks in parallel, or only 4 tasks?
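To make the arithmetic I have in mind concrete, here is a small plain-Python sketch (the numbers are only example values I made up). My current understanding is that Spark's parallelism comes out to executors times executor-cores divided by `spark.task.cpus` (which defaults to 1), with no extra threads-per-core factor, and this is the part I want confirmed:

```python
# Sketch of the task-slot arithmetic as I understand it (example values only).
# Spark schedules one task per "core" it was granted; spark.task.cpus
# (default 1) is how many of those cores each task reserves.

def task_slots(num_executors: int, executor_cores: int, task_cpus: int = 1) -> int:
    """Total tasks the application can run in parallel."""
    return num_executors * (executor_cores // task_cpus)

# 2 executors with 4 cores each and the default spark.task.cpus=1:
print(task_slots(2, 4))     # 8 tasks in parallel, not 4 * 10 = 40
# If spark.task.cpus were raised to 2, each task would reserve 2 cores:
print(task_slots(2, 4, 2))  # 4 tasks in parallel
```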
If we can have more than one thread per core, is there a way to tell Spark to spin up, say, 10 threads per core?
So the question in one line is: when I ask for 2 executors with 4 cores each, do I get 8 cores in total or 8 threads?
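For concreteness, this is roughly the kind of request I mean, expressed in PySpark (the values are illustrative, and this needs an actual YARN cluster to run):

```python
# Illustrative only: 2 executors with 4 "cores" each, submitted to YARN.
# My question is whether spark.executor.cores here means 4 physical cores
# per executor, or simply 4 concurrent task threads inside the executor JVM.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("yarn")
    .config("spark.executor.instances", "2")
    .config("spark.executor.cores", "4")
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)
```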