
I originally asked this question on gamedev, but none of the answers helped to solve the problem, and I still have no clue what the real cause is. I didn't see anything in the FAQs about re-posting questions across SE sites, so I can only hope this is okay. Moreover, in retrospect the question is probably more related to graphics programming in general than to game development specifically.

Edit 1 begins

The behaviour described in the original post applies only to Windows XP and Windows 7, in both Firefox and Chrome. On Ubuntu there is no such distortion, but instead the textures "shake" while the camera is being rotated. When the rotation stops, the shaking stops as well, but the textures may not end up in exactly the correct position.

Edit 1 ends

Edit 3 begins

The program has been tested on 4 different computers and hasn't worked as intended on any of them.

Edit 3 ends

I have a large voxel in WebGL that I want to cover with a tiled texture; each tile has a side length of 1 in vertex space. In this test scenario the camera points in the negative z direction and the sides of the voxel lie in the x-y, x-z and y-z planes.
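
For concreteness, the texture coordinates on a face simply span the face extents in world units, so a 2000-unit face runs from 0 to 2000 in u and v. A minimal sketch of the idea (the constant and the vertex layout are illustrative, not my actual buffers):

// Illustrative only: one face of the voxel, two floats (u, v) per vertex.
// With a tile size of 1 world unit and gl.REPEAT wrapping, the UVs simply
// equal the face extents, so a 2000-unit face is covered by 2000 x 2000 repeats.
const SIZE = 2000;
const faceUVs = new Float32Array([
  0,    0,
  SIZE, 0,
  SIZE, SIZE,
  0,    SIZE,
]);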

Smaller voxels (i.e. fewer repeats) work quite well, but at around 2000 x and y repeats per face (i.e. a voxel size of 2000*2000*2000) the textures start to look really ugly. When the camera points perpendicularly at a face, the textures look correct regardless of the size/number of repeats, but for voxels of the size mentioned above, rotating the camera by even a couple of degrees causes a visible problem. Increasing the voxel size increases the distortion; the inverse is also true, as with small voxels the textures look correct regardless of camera rotation. There seems to be no hard threshold for the size; rather, the effect grows gradually from zero as the voxel is enlarged beyond roughly 2000 units per side.

See https://i.stack.imgur.com/zN2tn.jpg for a visualization. The first image shows how it should look, but when the camera is rotated, the lines start to become distorted as in the second image. The effect gets worse as the voxel size increases and as the camera is rotated further. https://i.stack.imgur.com/70kSU.jpg contains two additional images with a more severe effect.

The textures (one X per texture) are originally 512*512 px. The screenshots have not been scaled in any way.

My first guess was float inaccuracies, but that's quite hard to believe since the object's dimensions are only of the order of 1000.

My second guess was some kind of weird int/float rounding error, but since everything is always handled in floats, I don't see how this could happen.

The third possibility I could think of is that this simply isn't feasible and that textures should not be repeated that many times. However, this seems quite unlikely to me. My guess (and hope) is that the problem is something quite elementary.

Any suggestions about what can cause, or commonly does cause, this kind of thing are much appreciated, because I'm having a very hard time narrowing down the source of this problem on my own.

I'm using:

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
gl.generateMipmap(gl.TEXTURE_2D);

and in the shader:

#ifdef GL_ES
precision highp float;
#endif

I also performed some very elementary calculations: a 32-bit float has a 24-bit significand, so its relative precision is about 1.2e-7. For a voxel side length of 2000 (which I noticed was the level where the effect becomes visible), one could therefore naively expect a rounding error of roughly 0.00025 in the coordinates. Assuming each repeat covers around 100 pixels on screen, the error should be significantly less than 1 pixel, which is not what I observe. Unless my calculation is wrong, I would therefore conclude that Float32 is not to blame and that the reason must lie elsewhere.
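
For reference, here is a small stand-alone JavaScript sketch (not part of the engine) that measures the actual spacing between adjacent 32-bit floats near these coordinate values; it agrees with the estimate above:

// Spacing (ulp) between adjacent float32 values near x -- i.e. the worst-case
// rounding error of a texture coordinate stored as a 32-bit float.
function ulp32(x) {
  const view = new DataView(new ArrayBuffer(4));
  view.setFloat32(0, x);
  const bits = view.getUint32(0);
  view.setUint32(0, bits + 1);               // next representable float32 above x
  return view.getFloat32(0) - Math.fround(x);
}

console.log(ulp32(2000));    // ~0.00012 -> well under a pixel if one repeat covers ~100 px
console.log(ulp32(100000));  // ~0.0078  -> still under a pixel per repeat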

The line texture is used only for visualizing the problem; the problem also persists with other (more natural) kinds of textures.

Edit 2 begins

Enabling or disabling antialiasing makes no visible difference.

Edit 2 ends

Asta
  • Have you tried it on different computers/browsers? Or the "webgl.prefer-native-gl" setting? – Ville Krumlinde Jan 09 '12 at 11:38
  • Excellent point, I almost forgot to do so. I have tested it on Win XP on both the newest Firefox and Chrome, and on Ubuntu on Chrome. The behaviour on Windows is approximately the same regardless of the browser. On Ubuntu the textures shake when rotating the camera, but do not get distorted. I will edit the question with this update. I haven't tried the setting you mentioned, but will do it later when I get to the Win XP computer. – Asta Jan 09 '12 at 11:50
  • That option set to true caused the page to not work at all :( – Asta Jan 09 '12 at 17:00
  • Are all your tests done on the same computer? Perhaps if you make a public page we could help test it on other hardware. – Ville Krumlinde Jan 09 '12 at 18:20
  • I hate to throw this out, but seeing as how you say it's working everywhere but the one OS, I suspect that there may be some faulty drivers at play. Not to say you couldn't work around it by twiddling options (you may have hit on some obscure and rarely tested edge case state), but I don't know of any legitimate reason for getting the behavior you are seeing. – Toji Jan 09 '12 at 18:28
  • The Windows and the Ubuntu tests have been made on different hardware. @Toji, I wouldn't say it is working on Ubuntu, since the shaking is clearly not correct behaviour. There are probably still some driver aspects to this, given the very different kinds of erratic behaviour. I'm a bit reluctant to make a public page, since the engine is already quite developed (i.e. not just a test case -- also not obfuscated or copyrighted yet), and will hopefully be a commercial product in the future. – Asta Jan 09 '12 at 18:58
  • Update: tests have also been performed on 2 other machines (including Win7), and the same thing happens. – Asta Jan 09 '12 at 19:05
  • Idea: Can you set up a scenario where the texture repetition slides up and down in realtime? Something as simple as starting at 1 and linearly growing to 2000 over the course of 10 seconds. I've often found that watching transitions such as that gives me a better idea of how the renderer got to a particular point, especially with texture issues. I'm curious if the effect just snaps into being or if it slowly begins corrupting the image after a certain threshold. – Toji Jan 09 '12 at 23:37
  • I didn't do an automated system like that, but manual iteration of the size shows that the effect gradually increases starting from about 1000 repeats when it becomes visible if one is carefully watching. Double that, and it's approximately at the level of the second image of the first set. The second set was taken probably with 50k or 100k repeats. – Asta Jan 09 '12 at 23:48

2 Answers


I believe that what you are seeing may indeed be caused by precision. You correctly calculated that the floating-point coordinates should be precise enough; the problem is that the hardware does not use floats to look up the textures. The texture interpolator units have considerably lower precision (I don't know how it is today, but it used to be as low as 16 bits on older GeForce cards).

So ... how can one exceed the interpolator precision? By using large texture coordinates (many repeats) on large geometry, which is exactly what you are doing. The remedy? Subdivide your geometry so that it uses smaller texture coordinates (you can shift texture coordinates in integer steps so that they are closer to 0).
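
A rough sketch of the integer shift in JavaScript (the per-patch UV array layout and the function name are just for illustration, not your engine's actual data):

// Shift a patch's texcoords toward zero by a whole number of repeats.
// With gl.REPEAT wrapping, an integer shift is visually a no-op, but the
// absolute values the interpolator has to work with become small.
function rebaseTexCoords(uvs) {              // uvs: Float32Array [u0, v0, u1, v1, ...]
  let minU = Infinity, minV = Infinity;
  for (let i = 0; i < uvs.length; i += 2) {
    minU = Math.min(minU, uvs[i]);
    minV = Math.min(minV, uvs[i + 1]);
  }
  const shiftU = Math.floor(minU), shiftV = Math.floor(minV);
  const out = new Float32Array(uvs.length);
  for (let i = 0; i < uvs.length; i += 2) {
    out[i]     = uvs[i]     - shiftU;
    out[i + 1] = uvs[i + 1] - shiftV;
  }
  return out;
}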

Here is a screenshot of how extreme it can look (I wasn't able to reproduce the error on my GeForce 260, but it is clearly visible on my Tegra 2 tablet, as shown in the image below).

the swine
  • The more I think about it, the more I believe this is correct, especially since you mention that you're doing 50-100K repeats on some of your geometry. If your textures are 512x512, then you need a precision of ~0.0019 to accurately sample every pixel. If your card is forcing you down to 16 bits, then any values over 1024 will only yield a precision of 0.5(!) Even with full 32-bit floats, though, precision degrades rapidly outside the 0-1 range. That can easily mess with your output in the way you've shown. – Toji Jan 10 '12 at 19:02
  • This seems like a very plausible explanation to me as well. Please forgive my noobness, but what does "subdivide your geometry so that it uses smaller texture coordinates" mean? Split the large voxel into smaller ones? In addition, would using a larger or smaller texture size help? – Asta Jan 10 '12 at 19:26
  • I take it you're rendering the voxels using triangles or quads. The problem is that once the difference in texture coordinates between two vertices is too high, or the absolute value of any of the texture coordinates is too high, the correct values don't even make it to the fragment shader before any texture sampling function is called. So the idea is to use smaller polygons; that way you can make the difference in texture coordinates smaller. Also, provided that you are repeating the texture, you can shift the texcoords closer to zero to make the absolute values smaller. – the swine Jan 10 '12 at 20:13
  • That means, for example, that if I have a triangle with texture coordinates (10, 20, 30) (only 1D for simplicity), I can shift them to (-10, 0, 10) and nothing will (visually) change, since the texture wraps. But the shift must be in integer steps. – the swine Jan 10 '12 at 20:14
  • To reply to your question more directly: yes, that means smaller ones. And using a smaller texture would help in some cases, but not in all. It depends on whether the rasterizer is able to interpolate the texcoords (because then the error originates during sampling, which might be solved by using smaller textures -- but I guess this is not your case). – the swine Jan 10 '12 at 20:20
  • I am marking this as the accepted answer, because even though it is quite difficult to actually validate, there seem to be no alternative explanations. – Asta Jan 11 '12 at 23:51

I ran into a similar problem under iOS. After repeating the texture 127 times, bad things started to happen.

The solution that worked was this:

I used GL_TRIANGLE_STRIP with some degenerate triangles. The texture is aligned to the vertices, so at the edge of the texture there is an invisible (degenerate) triangle in which the texture is "displayed" mirrored, because I reset the texture coordinate to the origin there. Thus the next visible triangle shows the texture starting from coordinate (0.0, 0.0), and the coordinates never exceed (x, 127.0).
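
Roughly, the vertex generation looks like the following sketch (translated to WebGL/JavaScript terms for this question; the names and the block size are illustrative):

// Build a long strip of 1x1 tiles along x whose u coordinate is reset every
// BLOCK repeats. The duplicated vertices at each reset form zero-area
// (degenerate) triangles, so the jump from u = BLOCK back to u = 0 is never drawn.
const BLOCK = 127;
function buildStrip(lengthInTiles) {
  const pos = [], uv = [];
  const emit = (x, y, u, v) => { pos.push(x, y, 0); uv.push(u, v); };
  for (let x = 0; x <= lengthInTiles; x++) {
    const u = x % BLOCK;
    if (x > 0 && u === 0) {
      emit(x, 0, BLOCK, 0);                // close the previous block at u = BLOCK...
      emit(x, 1, BLOCK, 1);                // ...with two extra vertices at the same
    }                                      //    positions as the next two (degenerate)
    emit(x, 0, u, 0);
    emit(x, 1, u, 1);
  }
  return { positions: new Float32Array(pos), uvs: new Float32Array(uv) };
}
// drawn with gl.drawArrays(gl.TRIANGLE_STRIP, 0, positions.length / 3)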

There is a blog post explaining this with some examples and pictures.

Vili