
I have some images that I want to "put inside a bubble". The bubbles kind of float around the screen with these images trapped inside them.

Ideally I'd like a way to combine the inside image with the bubble image, warping the inside image so it looks like it is reflected on the inside of the bubble.

Does anyone know how to achieve this effect without using textures and meshes? Perhaps someone remembers an old project that did something similar?

Here is an example of what I mean:

(example image of the desired bubble effect)

Paul de Lange
  • According to the `CoreImage` reference, the `CIGlassLozenge` filter is not available on iOS, so it does not make your life easier: you have to find a third-party library or code fragment to achieve this effect. – holex Jul 23 '12 at 14:18

2 Answers


You can do this using the GPUImageSphereRefractionFilter from my open source GPUImage framework:

Spherical refraction example

I describe in detail how this works in this answer to a question about a similar effect on Android. Basically, I use a fragment shader to refract the light that passes through an imaginary sphere, then use that refracted direction to do a lookup into a texture containing the source image. The background is blurred using a simple Gaussian blur.

If you want to achieve the exact look of the image you show, you might need to tweak this fragment shader to add some grazing-angle color to the sphere, but this should get you fairly close.
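If you just want a single processed UIImage, the filter drops into a standard GPUImage still-image pipeline. The following is a minimal, untested sketch using a hypothetical source image and the imageByFilteringImage: convenience method on GPUImageOutput; the property values shown should be close to the framework defaults:

UIImage *insideImage = [UIImage imageNamed:@"inside.png"]; // hypothetical asset name

GPUImageSphereRefractionFilter *sphereFilter = [[GPUImageSphereRefractionFilter alloc] init];
sphereFilter.center = CGPointMake(0.5, 0.5);  // normalized 0-1 texture coordinates
sphereFilter.radius = 0.25;                   // sphere radius as a fraction of the image
sphereFilter.refractiveIndex = 0.71;          // values further from 1.0 bend the image more

UIImage *sphereImage = [sphereFilter imageByFilteringImage:insideImage];

The blurred background in the screenshot is not produced by this filter; as noted in the comments below, it comes from running the source through a separate Gaussian blur and blending the two results.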

For the fun of it, I decided to try to replicate the glass sphere above more closely. I added grazing-angle lighting and a specular reflection on the sphere, as well as not inverting the refracted texture coordinates, leading to this result:

Grazing angle lighting sphere

I used the following fragment shader for this newer version:

 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;

 uniform highp vec2 center;
 uniform highp float radius;
 uniform highp float aspectRatio;
 uniform highp float refractiveIndex;
// uniform vec3 lightPosition;
 const highp vec3 lightPosition = vec3(-0.5, 0.5, 1.0);
 const highp vec3 ambientLightPosition = vec3(0.0, 0.0, 1.0);

 void main()
 {
     highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
     highp float distanceFromCenter = distance(center, textureCoordinateToUse);
     lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

     distanceFromCenter = distanceFromCenter / radius;

     highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
     highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

     // Refract a straight-on view ray through the imaginary sphere, then flip the
     // refracted offset before using it as a texture lookup coordinate
     highp vec3 refractedVector = 2.0 * refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);
     refractedVector.xy = -refractedVector.xy;

     highp vec3 finalSphereColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5).rgb;

     // Grazing angle lighting
     highp float lightingIntensity = 2.5 * (1.0 - pow(clamp(dot(ambientLightPosition, sphereNormal), 0.0, 1.0), 0.25));
     finalSphereColor += lightingIntensity;

     // Specular lighting
     lightingIntensity  = clamp(dot(normalize(lightPosition), sphereNormal), 0.0, 1.0);
     lightingIntensity  = pow(lightingIntensity, 15.0);
     finalSphereColor += vec3(0.8, 0.8, 0.8) * lightingIntensity;

     gl_FragColor = vec4(finalSphereColor, 1.0) * checkForPresenceWithinSphere;
 }

This filter can be run using a GPUImageGlassSphereFilter.
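Because the bubbles in the question are supposed to float around the screen, a live pipeline is probably more useful than one-shot processing. Here is a rough, untested sketch that assumes it runs inside a view controller, renders into a GPUImageView, and moves the bubble by updating the normalized center point:

GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"inside.png"]];

GPUImageGlassSphereFilter *glassSphere = [[GPUImageGlassSphereFilter alloc] init];
glassSphere.radius = 0.25;
glassSphere.refractiveIndex = 0.71;

GPUImageView *preview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:preview];

[source addTarget:glassSphere];
[glassSphere addTarget:preview];

// Move the bubble by updating the center, then re-render the source image
glassSphere.center = CGPointMake(0.3, 0.6);
[source processImage];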

Brad Larson
  • I was literally just looking at this filter. Perfect, thanks Brad. – Paul de Lange Jul 23 '12 at 14:54
  • Is there a way to turn off the background image, so I only have the reflected sphere image? – Paul de Lange Jul 23 '12 at 15:01
  • 1
    The blurred background is not part of the default filter, which just generates the sphere. In the FilterShowcase example (where I drew the above from) I use a Gaussian blur filter and a blend to combine the two filtered images. You can just use the sphere refraction filter to get only the central sphere image. – Brad Larson Jul 23 '12 at 15:03
  • Brad, this seems to always show the pixels from the center of the image, regardless of where center is set. For example, if I move center around, the pixels from the center of the image move along with the glass sphere. Does that make sense? Is there a way to create the glass sphere using the pixels around the current center rather than the center of the image? – jjxtra Nov 09 '12 at 02:55

For the record, I ended up using GPUImage as @BradLarson suggested, but I had to write a custom filter, shown below. The filter takes an "inside" image and a bubble texture and blends the two, performing the refraction calculation but not inverting the image coordinates. The effect:

(resulting bubble effect)

.h

@interface GPUImageBubbleFilter : GPUImageTwoInputFilter

@property (readwrite, nonatomic) CGFloat refractiveIndex;   
@property (readwrite, nonatomic) CGFloat radius;            

@end

.m

#import "GPUImageBubbleFilter.h"

NSString *const kGPUImageBubbleShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;
 varying highp vec2 textureCoordinate2;

 uniform sampler2D inputImageTexture;
 uniform sampler2D inputImageTexture2;

 uniform highp vec2 center;
 uniform highp float radius;
 uniform highp float aspectRatio;
 uniform highp float refractiveIndex;

 void main()
 {
     highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
     highp float distanceFromCenter = distance(center, textureCoordinateToUse);
     lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

     distanceFromCenter = distanceFromCenter / radius;

     highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
     highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

     // Same refraction lookup as the glass sphere shader above, but the refracted
     // coordinates are left unflipped
     highp vec3 refractedVector = refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);

     lowp vec4 textureColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5) * checkForPresenceWithinSphere; 
     lowp vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2) * checkForPresenceWithinSphere;

     // Composite the bubble texture over the refracted inside image, weighted by
     // the bubble texture's alpha
     gl_FragColor = mix(textureColor, textureColor2, textureColor2.a);
 }

 );


@interface GPUImageBubbleFilter () {
    GLint radiusUniform, centerUniform, aspectRatioUniform, refractiveIndexUniform;
}

@property (readwrite, nonatomic) CGFloat aspectRatio;

@end

@implementation GPUImageBubbleFilter
@synthesize radius = _radius, refractiveIndex = _refractiveIndex, aspectRatio = _aspectRatio;

- (id) init {
    self = [super initWithFragmentShaderFromString: kGPUImageBubbleShaderString];
    if( self ) {
        radiusUniform = [filterProgram uniformIndex: @"radius"];
        aspectRatioUniform = [filterProgram uniformIndex: @"aspectRatio"];
        centerUniform = [filterProgram uniformIndex: @"center"];
        refractiveIndexUniform = [filterProgram uniformIndex: @"refractiveIndex"];

        self.radius = 0.5;
        self.refractiveIndex = 0.5;
        self.aspectRatio = 1.0;

        // Keep the bubble centered in the output; radius, aspect ratio and
        // refractive index are set through the accessors below
        GLfloat center[2] = {0.5, 0.5};
        [GPUImageOpenGLESContext useImageProcessingContext];
        [filterProgram use];
        glUniform2fv(centerUniform, 1, center);

        [self setBackgroundColorRed: 0 green: 0 blue: 0 alpha: 0];
    }

    return self;
}

#pragma mark - Accessors
- (void) setRadius:(CGFloat)radius {
    _radius = radius;

    [GPUImageOpenGLESContext useImageProcessingContext];
    [filterProgram use];
    glUniform1f(radiusUniform, _radius);
}

- (void) setAspectRatio:(CGFloat)aspectRatio {
    _aspectRatio = aspectRatio;

    [GPUImageOpenGLESContext useImageProcessingContext];
    [filterProgram use];
    glUniform1f(aspectRatioUniform, _aspectRatio);
}

- (void)setRefractiveIndex:(CGFloat)newValue;
{
    _refractiveIndex = newValue;

    [GPUImageOpenGLESContext useImageProcessingContext];
    [filterProgram use];
    glUniform1f(refractiveIndexUniform, _refractiveIndex);
}
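To hook the filter up, both inputs need to be supplied as separate sources. Below is a rough, untested sketch with hypothetical image names; it relies on the usual GPUImageTwoInputFilter behaviour where the first source added feeds inputImageTexture (the inside image) and the second feeds inputImageTexture2 (the bubble texture):

GPUImagePicture *insidePicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"inside.png"]];
GPUImagePicture *bubblePicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"bubble.png"]];

GPUImageBubbleFilter *bubbleFilter = [[GPUImageBubbleFilter alloc] init];
bubbleFilter.radius = 0.5;
bubbleFilter.refractiveIndex = 0.5;

// The first target added becomes inputImageTexture, the second inputImageTexture2
[insidePicture addTarget:bubbleFilter];
[bubblePicture addTarget:bubbleFilter];

GPUImageView *bubbleView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 256, 256)];
[self.view addSubview:bubbleView];
[bubbleFilter addTarget:bubbleView];

// A two-input filter renders once both sources have provided a frame
[insidePicture processImage];
[bubblePicture processImage];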
Paul de Lange
  • I wanted to see if I could get closer to your bubble effect and the glass sphere above using some lighting calculations alone, and I've updated my answer above. Your bubble texture might still produce more aesthetically pleasing results, but you can try out the GPUImageGlassSphereFilter to see if that might work as well in your application. – Brad Larson Jul 24 '12 at 18:59
  • Go on admit it, it's fun isn't it :) – Paul de Lange Jul 25 '12 at 06:35