
I am trying to implement a parametric curve plotter on a fragment shader in GLSL.

I managed to make a rudimentary one by iterating the parameter and drawing a circle for each computed position.

You can find the (working and commented) code here.

Here are my questions:

  • I would like to plot it with a line. From what I understand, there are two different ways I could do it:

    1. by computing, inside the loop, the distance between the current pixel and the computed position, accumulating the result in a float variable, and then drawing the curve using a step function.

      I tried doing this, and it seems as if the distance only gets computed for a single t position:

      #define PI 3.14
      
      void mainImage( out vec4 fragColor, in vec2 fragCoord )
      {
          // Map fragCoord to [-1, 1] and correct for the aspect ratio
          vec2 uv = -1. + 2.*fragCoord.xy / iResolution.xy;
          uv.x *= iResolution.x/iResolution.y;
      
          vec2 pos;
          float dist = 0.;  // must be initialized; otherwise its value is undefined
      
          for(float t = -2.; t < 2.02; t += .02){
      
              // Parametric position for this value of t
              pos.x = sin(4.*t + iGlobalTime);
              pos.y = cos(6.*t);
      
              // Accumulate a value based on the distance from this pixel to pos
              dist += 1. - length(uv - pos);
      
          }//for
      
          fragColor = mix(vec4(0.0), vec4(1.0), dist);
      
      }
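
      (As a hedged sketch of what I think the accumulation should look like instead — this is my own guess, not code from the question: rather than summing a value for every sample, one could track the minimum distance over all samples and threshold it at the end, assuming the same Shadertoy uniforms iResolution and iGlobalTime.)

      void mainImage( out vec4 fragColor, in vec2 fragCoord )
      {
          vec2 uv = -1. + 2.*fragCoord.xy / iResolution.xy;
          uv.x *= iResolution.x/iResolution.y;
      
          // Keep the minimum distance to any sampled curve point
          float minDist = 1e6;  // large initial value
      
          for(float t = -2.; t < 2.02; t += .02){
              vec2 pos = vec2(sin(4.*t + iGlobalTime), cos(6.*t));
              minDist = min(minDist, length(uv - pos));
          }
      
          // 0.01 is the curve's half-thickness; smoothstep antialiases the edge
          float line = 1. - smoothstep(0.0, 0.01, minDist);
          fragColor = vec4(vec3(line), 1.0);
      }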
      
    2. by drawing a line segment between each pair of consecutive positions of the parameter.

      I found this implementation that seems to do what I am trying to achieve, but I don't quite understand why it uses the previous position of the parameter.
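
      (My tentative reading of why the previous position is kept: with a finite number of samples, the curve is approximated as a polyline, so each iteration draws the segment from the previous sample to the current one. A sketch of a standard point-to-segment distance function — the names below are my own, not from the linked code:)

      // Distance from point p to the segment [a, b]
      float segmentDistance(vec2 p, vec2 a, vec2 b)
      {
          vec2 ab = b - a;
          // Project p onto the segment, clamped to its endpoints
          float h = clamp(dot(p - a, ab) / dot(ab, ab), 0.0, 1.0);
          return length(p - a - h*ab);
      }

      // Inside the loop, something like:
      //     minDist = min(minDist, segmentDistance(uv, prevPos, pos));
      //     prevPos = pos;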

  • Is there a way to do such a function without a for loop managing the parameter?

One last note: I'm only using a fragment shader because my goal is to upload it to VJing software that manages fragment shaders.

genpfault
apazat
  • take a look at [Draw Quadratic Curve on GPU](http://stackoverflow.com/a/31423105/2521214) – Spektre Apr 07 '17 at 07:25
  • Either evaluate the curve on the CPU, or in a compute shader, then use your vert and frag shaders to draw it. Your current approach performs a ton of redundant work which defeats the purpose of using the GPU. – 3Dave Sep 04 '19 at 20:39

0 Answers