
It seems like this should be easy, but I'm having a lot of difficulty using part of a texture with a point sprite. I have googled around extensively, and while I've turned up various answers, none of them deal with the specific issue I'm having.

What I've learned so far:

  1. Basics of point sprite drawing
  2. How to deal with point sprites rendering as solid squares
  3. How to alter orientation of a point sprite
  4. How to use multiple textures with a point sprite, getting closer here..
  5. That combining point sprites with sprite sheets has been done before, but is only possible in OpenGL ES 2.0 (not 1.0)

Here is a diagram of what I'm trying to achieve

Point sprite diagram

Where I'm at:

  • I have a set of working point sprites that all use the same single square image. E.g. a 16x16 image of a circle works great.
  • I have an Objective-C method which generates a 600x600 image containing a sprite sheet with multiple images. I have verified this works by applying the entire sprite-sheet image to a quad drawn with GL_TRIANGLES.
  • I have used the above method successfully to draw parts of a sprite sheet onto quads. I just can't get it to work with point sprites.
  • Currently I'm generating texture coordinates pointing to the center of the target sprite on the sprite sheet. E.g., using the image at the bottom: star: 0.166,0.5; cloud: 0.5,0.5; heart: 0.833,0.5.
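For concreteness, those center values fall out of a simple grid computation. Here is a Python sketch (my own illustration, not code from the app) assuming the sprites sit in a horizontal strip of three equal square cells:

```python
def sprite_center(index, columns=3, rows=1):
    """Normalized (u, v) center of sprite `index` in a grid atlas.

    Assumes equally sized cells laid out left-to-right, top-to-bottom.
    """
    col = index % columns
    row = index // columns
    u = (col + 0.5) / columns
    v = (row + 0.5) / rows
    return u, v

# star, cloud, heart from the diagram above:
for name, i in [("star", 0), ("cloud", 1), ("heart", 2)]:
    print(name, sprite_center(i))
```

This reproduces the 0.166 / 0.5 / 0.833 horizontal centers quoted above (0.166 ≈ 1/6, 0.833 ≈ 5/6).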

Code:

Vertex Shader

uniform mat4 Projection;
uniform mat4 Modelview;
uniform float PointSize;

attribute vec4 Position;
attribute vec2 TextureCoordIn;

varying vec2 TextureCoord;

void main(void)
{
    gl_Position = Projection * Modelview * Position;
    TextureCoord = TextureCoordIn;
    gl_PointSize = PointSize;
}

Fragment Shader

varying mediump vec2 TextureCoord;
uniform sampler2D Sampler;

void main(void)
{
    // Using my TextureCoord just draws a grey square, so
    // I'm likely generating texture coords that texture2D doesn't like.
    gl_FragColor = texture2D(Sampler, TextureCoord);

    // Using gl_PointCoord just draws my whole sprite map
    // gl_FragColor = texture2D(Sampler, gl_PointCoord);
}

What I'm stuck on:

  1. I don't understand how to use the gl_PointCoord variable in the fragment shader. What does gl_PointCoord contain initially? Why? Where does it get its data?
  2. I don't understand what texture coordinates to pass in. For example, how does the point sprite choose what part of my sprite sheet to use based on the texture coordinates? I'm used to drawing quads which have effectively 4 sets of texture coordinates (one for each vertex), how is this different (clearly it is)?
James Andres

2 Answers


A colleague of mine helped with the answer. It turns out the trick is to utilize both the size of the point (in OpenGL units) and the size of the sprite (in texture units, (0..1)) in combination with a little vector math to render only part of the sprite-sheet onto each point.

Vertex Shader

uniform mat4 Projection;
uniform mat4 Modelview;
// The radius of the point in OpenGL units, eg: "20.0"
uniform float PointSize;
// The size of the sprite being rendered. My sprites are square
// so I'm just passing in a float.  For non-square sprites pass in
// the width and height as a vec2.
uniform float TextureCoordPointSize;

attribute vec4 Position;
attribute vec4 ObjectCenter;
// The top left corner of a given sprite in the sprite-sheet
attribute vec2 TextureCoordIn;

varying vec2 TextureCoord;
varying vec2 TextureSize;

void main(void)
{
    gl_Position = Projection * Modelview * Position;
    TextureCoord = TextureCoordIn;
    TextureSize = vec2(TextureCoordPointSize, TextureCoordPointSize);

    // This is optional: a quick and dirty way to keep the points roughly the
    // same size on screen regardless of distance. Note the division is by the
    // clip-space w (gl_Position.w), not the input position's w.
    gl_PointSize = PointSize / gl_Position.w;
}

Fragment Shader

varying mediump vec2 TextureCoord;
varying mediump vec2 TextureSize;
uniform sampler2D Sampler;

void main(void)
{
    // This is where the magic happens.  Combine all three factors to render
    // just a portion of the sprite-sheet for this point
    mediump vec2 realTexCoord = TextureCoord + (gl_PointCoord * TextureSize);
    mediump vec4 fragColor = texture2D(Sampler, realTexCoord);

    // Optional, emulate GL_ALPHA_TEST to use transparent images with
    // point sprites without worrying about z-order.
    // see: http://stackoverflow.com/a/5985195/806988
    if(fragColor.a == 0.0){
        discard;
    }

    gl_FragColor = fragColor;
}
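On the CPU side, the `TextureCoordIn` attribute (top-left corner) and the `TextureCoordPointSize` uniform can be derived from the atlas layout. A Python sketch of that math, assuming a square sheet divided into a grid of equal square sprites (the helper name is mine, not from the app):

```python
def atlas_params(index, sheet_px, sprite_px):
    """Return (top_left_uv, size_uv) for sprite `index` in a grid atlas.

    sheet_px:  edge length of the square sprite sheet, in pixels.
    sprite_px: edge length of each square sprite, in pixels.
    """
    columns = sheet_px // sprite_px
    col = index % columns
    row = index // columns
    size = sprite_px / sheet_px           # -> TextureCoordPointSize
    top_left = (col * size, row * size)   # -> TextureCoordIn
    return top_left, size

# e.g. sprite 4 in a 600x600 sheet of 200x200 sprites
# sits at row 1, column 1:
print(atlas_params(4, 600, 200))
```

The fragment shader then offsets `top_left` by `gl_PointCoord * size` to land inside that sprite's cell.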
James Andres
  • I'm having some trouble with this and wondering if someone could help me out. My sprite sheet's total size is 640 x 640, with 5 sprites per row by 5 rows, so each sprite is 128 x 128. My PointSize is 50. I'm just adding this code to my fragment shader: `vec2 TextureCoord = vec2(0, 0); // I guess this is the top left of the sprite sheet` `vec2 TextureSize = vec2(128, 128); // I'm not sure if this is PointSize (50) or sprite size (128)` `mediump vec2 realTexCoord = TextureCoord + (gl_PointCoord * TextureSize);` `vec4 rotatedTexture = texture2D(texture, realTexCoord);` – Kahless Mar 11 '15 at 21:41
  • @JamesAndres great Q&A, exactly what I'm looking for. At first I didn't believe this would be possible given the Khronos official documentation ! Only problem I can foresee is point size, which seems to be maximum of 1024 on my system, so can't be used for general purpose sprite blitting of arbitrary sizes... e.g. full sized backgrounds etc. –  Aug 16 '19 at 11:22
  • @JamesAndres another problem encountered was with rotations. The actual square point sprites don't themselves get rotated on screen. Only the uv texture coordinates get rotated, so unfortunately any rotation seems to get clipped to the static square which is the point sprite. A way round this would be to make sure that my sprite textures are bounded by a rectangle in which they can rotate 360 degrees without going over their boundary. –  Aug 18 '19 at 17:15
  • @JamesAndres and a final caveat I noticed is that if you want a rectangular sprite, e.g. `100x50`, then you'd need a sprite texture of `100x100` where we pad out the top and bottom of the sprite with `25` transparent pixels. Otherwise the `100x50` texture would just get mapped onto the **square** point sprite, effectively stretching it to fit the square. I've learned a lot about Point Sprites from your answer. For these reasons I think I'll stick to Point Sprites for blitting the tiled backgrounds in my game, and use a regular `GL_TRIANGLES` sprite batcher for the rest ! :-D –  Aug 19 '19 at 05:07
  • PS also for anyone interested, my experiments show that on (my) Android phone, 3000 sprites using optimised `GL_TRIANGLES` **is faster than** 3000 sprites using optimised `GL_POINTS` (with fragment rotations). For relatively small number of sprites I would choose the `GL_POINTS` approach since the CPU side code is much cleaner, but for efficiency, probably best to stick with the `GL_TRIANGLES`, although as I mentioned I think `GL_POINTS` might be good for tiled backgrounds that don't require rotations ! –  Aug 19 '19 at 05:26

Point sprites are composed of a single position. Therefore any "varying" values will not actually vary, because there's nothing to interpolate between.

gl_PointCoord is a vec2 whose XY values range over [0, 1]. They represent the location of the current fragment within the point: (0, 0) is the bottom-left of the point, and (1, 1) is the top-right.

So you want to map (0, 0) to the bottom-left of your sprite, and (1, 1) to the top-right. To do that, you need to know certain things: the size of the sprites (assuming they're all the same size), the size of the texture (because the texture fetch functions take normalized texture coordinates, not pixel locations), and which sprite is currently being rendered.

The latter can be set via a varying. It can just be a value that's passed as per-vertex data into the varying in the vertex shader.

You use that plus the size of the sprites to determine where in the texture you want to pull data for this sprite. Once you have the texel coordinates you want to use, you divide them by the texture size to produce normalized texture coordinates.
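That conversion can be sketched in a few lines of Python (an illustration of the idea, not code from either answer; all names are mine). It maps a `gl_PointCoord`-style value in [0, 1]² into one sprite's cell of the atlas:

```python
def real_tex_coord(point_coord, sprite_origin_px, sprite_px, sheet_px):
    """Map point_coord in [0,1]^2 into one sprite's cell of the atlas.

    sprite_origin_px: (x, y) pixel corner of the sprite in the sheet.
    sprite_px:        edge length of the (square) sprite, in pixels.
    sheet_px:         edge length of the (square) sheet, in pixels.
    """
    # Normalize the sprite's pixel rectangle...
    u0 = sprite_origin_px[0] / sheet_px
    v0 = sprite_origin_px[1] / sheet_px
    s = sprite_px / sheet_px
    # ...then offset by the within-point coordinate.
    return (u0 + point_coord[0] * s, v0 + point_coord[1] * s)

# Fragment at the center of a point whose sprite starts at
# pixel (128, 0) in a 640px sheet:
print(real_tex_coord((0.5, 0.5), (128, 0), 128, 640))
```

This is exactly the `TextureCoord + gl_PointCoord * TextureSize` expression from the accepted answer's fragment shader, just with the normalization done explicitly.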

In any case, point sprites, despite the name, aren't really meant for sprite rendering. It would be easier to use quads/triangles for that, as you have more control over exactly where everything is positioned.

Nicol Bolas
  • Thanks for your response Nicol. So it seems that I'm nearly there then. The TextureCoordIn is basically an XY (0..1) point indicating which sprite to render. I have the size of the sprites, the size of the sprite-sheet/texture, etc. Where I'm confused, however, is how to put this all together. I have seen examples floating around the web where they do math against the `gl_PointCoord` to determine which part of the texture to render, eg: `gl_FragColor = texture2D(Sampler, gl_PointCoord + SomeOtherVec2);`. Can you give an example shader to point me in the right direction? – James Andres Mar 07 '12 at 23:21
  • I should also mention, I'm quite new to GLSL (and OpenGL in general). What do you mean when you say "So you want to map (0, 0) to the bottom-left of your sprite" and "pull data for this sprite"? Thanks. – James Andres Mar 08 '12 at 00:01