What is the complete flow of events when touching the screen in Android?
As per my understanding, when the user touches the screen:
- The touch driver will read the coordinates and pass them to the kernel
- The kernel will pass them to the framework
- The framework will determine how much to zoom, then ask the graphics library to perform the zoom and render the result
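
To make the question concrete, the only part of this pipeline I can see directly is the very end, where the framework delivers a `MotionEvent` to app code. A minimal sketch of that last stop (the class name `TouchLoggingView` is just for illustration):

```java
import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Hypothetical view, only to show where touch data finally surfaces in app code.
public class TouchLoggingView extends View {

    public TouchLoggingView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // By this point the framework has already turned the raw driver/kernel
        // input data into a MotionEvent with view-local pixel coordinates.
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                Log.d("Touch", "down at " + event.getX() + ", " + event.getY());
                return true; // claim the gesture so we keep receiving MOVE/UP
            case MotionEvent.ACTION_MOVE:
                Log.d("Touch", "move to " + event.getX() + ", " + event.getY());
                return true;
            case MotionEvent.ACTION_UP:
                Log.d("Touch", "up");
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```

What I want to understand is everything that happens before this callback fires.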
How do the drivers, kernel, native libraries, framework, and application interact to achieve the desired action? It'd be great to have some light shed on this.