I have to apply a spherical filter to an image in Android; I have attached the input and expected output images. The output image is produced by taking the square centered region of the input image and mapping it onto a sphere. Any idea how to do this in Android? Will I have to use OpenGL, or will a 2D transformation alone do the task?
4 Answers
I just got an implementation of this working using OpenGL ES 2.0 on iOS:
While this is on iOS, the fragment shader I used can be brought straight across to Android. The spherical refraction portion is accomplished using the following fragment shader:
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

uniform highp vec2 center;
uniform highp float radius;
uniform highp float aspectRatio;
uniform highp float refractiveIndex;

void main()
{
    highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
    highp float distanceFromCenter = distance(center, textureCoordinateToUse);
    lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

    distanceFromCenter = distanceFromCenter / radius;

    highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
    highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

    highp vec3 refractedVector = refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);

    gl_FragColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5) * checkForPresenceWithinSphere;
}
The `center` is a normalized coordinate for the center of the sphere (in a 0.0 - 1.0 space in both dimensions), the `radius` is the normalized radius, the `refractiveIndex` is the air / material index of your sphere, and the `aspectRatio` is the aspect ratio of the image (for making sure the sphere is round and not elliptical in the normalized coordinate space).
This calculates the surface normals for a sphere with the supplied center and radius, and uses the GLSL `refract()` function to refract an incoming vector and provide lookup coordinates in the image texture.
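As a rough illustration of what the shader computes per fragment, here is the same lookup math transcribed to plain Java on the CPU. This is my own sketch, not part of GPUImage; the class and method names are mine, and the aspect-ratio correction is omitted for brevity:

```java
// Hypothetical CPU transcription of the shader's per-fragment lookup math.
public class SphereRefractionDemo {
    // Returns the normalized texture coordinate the shader would sample,
    // or null when (x, y) falls outside the sphere.
    static double[] refractedLookup(double x, double y,
                                    double cx, double cy,
                                    double radius, double refractiveIndex) {
        double dx = x - cx, dy = y - cy;
        double dist = Math.sqrt(dx * dx + dy * dy);
        if (dist > radius) return null;            // step(distanceFromCenter, radius)
        double nd = dist / radius;                 // normalized distance from center
        double depth = radius * Math.sqrt(1.0 - nd * nd); // sphere depth at this texel
        // Surface normal of the sphere at this point
        double nx = dx, ny = dy, nz = depth;
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        nx /= len; ny /= len; nz /= len;
        // GLSL refract(I, N, eta) with incident vector I = (0, 0, -1)
        double eta = refractiveIndex;
        double nDotI = -nz;                        // dot(N, I)
        double k = 1.0 - eta * eta * (1.0 - nDotI * nDotI);
        if (k < 0.0) return null;                  // total internal reflection
        double f = eta * nDotI + Math.sqrt(k);
        double rx = -f * nx;                       // eta * I.x is 0
        double ry = -f * ny;                       // eta * I.y is 0
        // Map the refracted x/y from [-1, 1] back to [0, 1] texture space
        return new double[] { (rx + 1.0) * 0.5, (ry + 1.0) * 0.5 };
    }

    public static void main(String[] args) {
        // The exact center of the sphere looks straight through: lookup is (0.5, 0.5)
        double[] c = refractedLookup(0.5, 0.5, 0.5, 0.5, 0.25, 0.71);
        System.out.printf("%.3f %.3f%n", c[0], c[1]);
    }
}
```

At the sphere's center the normal points straight at the viewer, so the refracted ray passes through undeflected and the lookup lands back on the same texel; away from the center the ray bends toward the rim, which is what produces the magnified, curved look.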
The background is blurred using a separable Gaussian blur that I describe in this answer.
This filter is fast enough to filter live video in real time on an iPhone, so it should be fairly performant on most Android devices. The source code for it can be found within the GPUImageSphereRefractionFilter in my open source GPUImage framework.
- I used this in OpenGL and it worked perfectly. However, I am having trouble understanding the math. While calculating the y textureCoordinateToUse, I understand why aspectRatio is multiplied, but why is the rest of the calculation needed? – tanvi Nov 22 '17 at 06:11
Use the following Fish Eye lens code for creating the sphere, and apply some modifications for scaling the sphere and generating the background. It will work mostly for square images.
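For reference, one common fish-eye radial mapping looks like this in Java. This is a sketch of the general technique, not the answerer's original code; the class and method names are mine:

```java
// One common fish-eye radial mapping: the destination radius r in [0, 1]
// samples the source closer to the center than it sits, which magnifies
// the middle of the circle like a sphere.
public class FishEyeMapping {
    static double fishEyeRadius(double r) {
        // r = 0 stays at the center and r = 1 stays on the rim;
        // everything in between is pulled inward, producing the bulge.
        return (r + (1.0 - Math.sqrt(1.0 - r * r))) / 2.0;
    }

    public static void main(String[] args) {
        // A point halfway out samples noticeably nearer the center than 0.5
        System.out.println(fishEyeRadius(0.5));
    }
}
```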
This is a ray-tracing problem. OpenGL will most likely not even be a help to you here, as OpenGL doesn't provide ray-based 3D. However, this may be what you are looking for:
http://www1.cs.columbia.edu/CAVE/publications/pdfs/Garg_TR04.pdf

- I don't think it's necessarily a difficult ray-tracing problem, provided that the background is just a texture. You can just map the texture to the sphere with some kind of spherical coordinate system or cubemap. If he were reflecting a dynamic 3D scene I would agree with you, but just reflecting a texture is not too difficult. – Tim Jun 08 '12 at 19:01
- The non-raytracing example is what the paper I provided is about (I'm pretty sure)! Another way to do it (even with the background being a texture) could still be ray-tracing, though. – trumpetlicks Jun 08 '12 at 19:03
- "OpenGL doesnt provide ray based 3D" – it most definitely can when using shaders. I've done a distortion that was close to this using an OpenGL ES 2.0 shader: http://stackoverflow.com/a/9896856/19679 and GLSL even defines a `refract()` function for light refraction through objects. Let me see if I can create a shader to replicate this specific effect. – Brad Larson Jun 08 '12 at 22:31
- Very good information :-) I don't think this is doing actual ray-tracing, though. It is fairly well known that graphics cards can't yet do ray tracing; that is where they would like to go, and they are indeed close to live ray-tracing, but haven't gotten there yet. These distortion filters are close mathematical approximations of an effect. Great link, looks like fun stuff, and perhaps something close is exactly what this user needs :-) – trumpetlicks Jun 09 '12 at 00:02
- I should correct myself. Today's graphics cards do do ray-tracing, but not in the conventional OpenGL sense. At this point they have been doing it from the OpenCL or CUDA perspective. – trumpetlicks Jun 09 '12 at 00:10
- OK, I finally put together an OpenGL ES 2.0 fragment shader to do this. The results can be seen in my answer, and they're very close to the above sample image. There might be a bit of translucency that I don't have in there, but that could be added with a simple tweak. The `refract()` GLSL function handles most of the heavy lifting here. – Brad Larson Jul 09 '12 at 00:12
- @BradLarson – That is REALLY COOL. These shader engines are super powerful. Thanks for leaving me a quick comment. +1 – trumpetlicks Jul 09 '12 at 00:45
I agree with Tim. Transforming one bitmap into another doesn't require 3D points or ray tracing; forget that entirely, it's simply 2D. I don't know if OpenGL has something built in, but I have enough 3D experience to point you in the right direction. You have to iterate over all points inside the circle region you choose (this is the key) and find each color using the fish-eye transformation. There is plenty on the net. Hope this helps.
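A minimal Java sketch of that loop: iterate every pixel inside a chosen circle and look up its color via a fish-eye mapping. All names and the particular radial formula here are illustrative assumptions, not a known-good implementation:

```java
// Sketch: apply a fish-eye transform to the pixels inside the centered circle
// of a square bitmap; pixels outside the circle are copied through unchanged.
public class FishEyeFilter {
    // src is a square bitmap flattened as int[size * size] ARGB pixels.
    static int[] apply(int[] src, int size) {
        int[] dst = src.clone();               // outside-the-circle pixels stay as-is
        double cx = size / 2.0, cy = size / 2.0;
        double radius = size / 2.0;
        for (int y = 0; y < size; y++) {
            for (int x = 0; x < size; x++) {
                double dx = (x - cx) / radius, dy = (y - cy) / radius;
                double r = Math.sqrt(dx * dx + dy * dy);
                if (r > 1.0) continue;         // outside the circle region
                // Fish-eye: pull the sample point toward the center
                double nr = (r + (1.0 - Math.sqrt(1.0 - r * r))) / 2.0;
                double scale = (r == 0.0) ? 0.0 : nr / r;
                int sx = (int) Math.round(cx + dx * scale * radius);
                int sy = (int) Math.round(cy + dy * scale * radius);
                sx = Math.min(Math.max(sx, 0), size - 1);
                sy = Math.min(Math.max(sy, 0), size - 1);
                dst[y * size + x] = src[sy * size + sx];
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        int size = 8;
        int[] src = new int[size * size];
        for (int i = 0; i < src.length; i++) src[i] = i;
        int[] dst = apply(src, size);
        System.out.println(dst[4 * size + 4]); // the exact center maps to itself
    }
}
```

On Android you would get `src` from `Bitmap.getPixels()` and write `dst` back with `Bitmap.setPixels()`; nearest-neighbor sampling is used here for brevity, and bilinear interpolation would give smoother results.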
- The asker's example image shows specular reflection, refraction, and a true circular boundary to the water / glass etc. Even in the link provided by Brad Larson (great link, by the way, +1), the edges at the circular boundary transition smoothly back into the image itself. It is an effect, not NECESSARILY what this user is asking for. If the above image is truly what they are looking for, it may be a bit more complex than the simpler shader-type functions. I haven't gotten much word back from the asker about their interpretation, however. – trumpetlicks Jun 09 '12 at 00:07