While the question is interesting, it cannot be answered in the way you might hope.
To answer the title independently of the question:
Working with the data inside an app is not what Android Studio does.
Android Studio is used to build apps; those apps then run on Android, independently of Android Studio.
Inside an app, which can use the functionality provided by the Android system, detecting the color of a pixel in an image is certainly possible. That would be a good question to ask at https://android.stackexchange.com/
The problem as you describe it may be very difficult, because you state very few constraints. For example, if the detection has to work with any background, the background itself could contain black lines.
I will simplify it to what I think you mean:
The app contains an image that contains a black circle that is much smaller than the image.
It also contains a second smaller circle, called the ball, in a different color.
The rest of the image has a third color, uniformly.
The problem you want to solve is: move the ball, and determine whether the smaller circle is completely inside the larger one.
This is not easy, and quite interesting.
For this, you need to read pixel colors for parts of the image.
But it is unclear what you mean by "detecting pixels", and that may be an important point.
Reading pixels from an image that is part of your application should be simple.
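On Android that single read would be `Bitmap.getPixel(x, y)`. As a runnable sketch of the same idea on the plain JVM (using `java.awt.image.BufferedImage` as a stand-in, since `Bitmap` only exists on a device; the class name and the per-channel threshold of 32 are my own choices for illustration):

```java
import java.awt.image.BufferedImage;

public class PixelProbe {
    // Returns true if the pixel at (x, y) is approximately black.
    // The threshold of 32 per channel is an assumption; tune as needed.
    static boolean isBlack(BufferedImage image, int x, int y) {
        int argb = image.getRGB(x, y);
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return r < 32 && g < 32 && b < 32;
    }

    public static void main(String[] args) {
        // Tiny test image: white background with one black pixel.
        BufferedImage img = new BufferedImage(4, 4, BufferedImage.TYPE_INT_ARGB);
        for (int y = 0; y < 4; y++)
            for (int x = 0; x < 4; x++)
                img.setRGB(x, y, 0xFFFFFFFF); // opaque white
        img.setRGB(2, 1, 0xFF000000);         // opaque black

        System.out.println(isBlack(img, 2, 1)); // prints "true"
        System.out.println(isBlack(img, 0, 0)); // prints "false"
    }
}
```

The bit-shifting to unpack ARGB channels is the same on Android, so the method body carries over almost unchanged.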
Note that what I described does not use collision detection; whether the circles' outlines touch is irrelevant, only containment matters.
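Once the centers and radii of both circles are known (however you recover them, e.g. from scanning pixel colors), the "completely inside" test is plain geometry: the ball lies entirely inside the big circle exactly when the distance between the centers plus the ball's radius does not exceed the big circle's radius. A minimal sketch under that assumption (the class and parameter names are mine, not part of any Android API):

```java
public class CircleContainment {
    // True if the ball (center bx,by, radius br) lies completely
    // inside the big circle (center cx,cy, radius cr).
    static boolean ballInside(double cx, double cy, double cr,
                              double bx, double by, double br) {
        double dist = Math.hypot(bx - cx, by - cy);
        return dist + br <= cr;
    }

    public static void main(String[] args) {
        System.out.println(ballInside(0, 0, 10, 3, 0, 5)); // prints "true"
        System.out.println(ballInside(0, 0, 10, 6, 0, 5)); // prints "false"
    }
}
```

This is why no collision detection is needed: one distance comparison per ball movement answers the question.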