
I'm trying to implement a light effect on a sphere: the idea is that the sphere appears to emit light volumetrically, in the direction of the normal vectors, but I don't know exactly how to do it.

My fragment shader is below. I implement a "radial" blur, and it works fine, but only when the camera is facing the sphere head-on:

[screenshot: the effect working while the camera faces the sphere head-on]

When it is not, this happens:

[screenshot: the effect breaking once the camera moves off-axis]

I'm trying to follow this example, but with no success:

https://medium.com/@andrew_b_berg/volumetric-light-scattering-in-three-js-6e1850680a41

This is what I'm trying to get:

[screenshot: the target volumetric light effect]

This is my code:

import controlP5.*;
import peasy.*;

ControlP5 cp5;
PeasyCam cam;

PGraphics canvas;           // scene render target
PGraphics verticalBlurPass; // post-processing target
PShader blurFilter;

void setup()
{
  size(1000, 1000, P3D);
  cam = new PeasyCam(this, 1400);

  canvas = createGraphics(width, height, P3D);

  verticalBlurPass = createGraphics(width, height, P3D);
  verticalBlurPass.noSmooth();

  // loadShader(fragFilename, vertFilename)
  blurFilter = loadShader("bloomFrag.glsl", "blurVert.glsl");
}

void draw(){
  background(0);

  canvas.beginDraw();
  render(canvas);
  canvas.endDraw();

  // blur vertical pass
  verticalBlurPass.beginDraw();
  verticalBlurPass.shader(blurFilter);
  verticalBlurPass.image(canvas, 0, 0);
  //render(verticalBlurPass);
  verticalBlurPass.endDraw();

  // draw the post-processed result in screen space (HUD) so PeasyCam does not transform it
  cam.beginHUD();
  image(verticalBlurPass, 0, 0);
  cam.endHUD();

  println(frameRate);
}

void render(PGraphics pg)
{
  // apply the current PeasyCam state to this off-screen target
  cam.getState().apply(pg);

  pg.background(0, 50);

  // the emitting sphere
  pg.noStroke();
  pg.fill(255);
  pg.sphere(100);

  // stroke settings left over; nothing else is drawn in this pass
  pg.noFill();
  pg.stroke(255);
  pg.strokeWeight(10);
}

Vertex shader:

#version 150

uniform mat4 transform;
uniform mat4 texMatrix;
uniform float u_time;

in vec4 position;
in vec3 normal;
in vec2 texCoord;
in vec3 color;

out vec4 TexCoord;
out vec3 Color;

void main() {
  Color = color;
  TexCoord = texMatrix * vec4(texCoord, 1.0, 1.0);
  gl_Position = transform * position;
}

Fragment shader:

#version 150
#define PROCESSING_TEXTURE_SHADER
#ifdef GL_ES
precision mediump float;
#endif
in vec4 TexCoord;
in vec3 Color;
uniform sampler2D texture;
uniform float u_time;
uniform float amt;
uniform float intensity;
uniform float x;
uniform float y;
uniform float noiseAmt;
uniform float u_time2;
out vec4 fragColor;

// light position is expected in the same 0..1 texture space that TexCoord is in
uniform vec2 lightPosition = vec2(0.0, 0.0);
uniform float exposure = 0.09;
uniform float decay = 0.95;
uniform float density = 1.0;
uniform float weight = 1.0;

uniform int samples = 100;   // number of samples actually taken
const int MAX_SAMPLES = 100; // compile-time upper bound for the loop


uniform vec2 resolution; // screen resolution

void main(){

  vec2 texCoord = TexCoord.xy;
  // Calculate vector from pixel to light source in screen space
  vec2 deltaTextCoord = texCoord - lightPosition;
  // Divide by number of samples and scale by control factor
  deltaTextCoord *= 1.0 / float(samples) * density;
  // Store initial sample
  vec4 color = texture(texture, texCoord);
  // set up illumination decay factor
  float illuminationDecay = 1.0;

  // evaluate the summation for samples number of iterations up to 100
  for(int i=0; i < MAX_SAMPLES; i++){
    // work around for dynamic number of loop iterations
    if(i == samples){
      break;
    }

    // step sample location along ray
    texCoord -= deltaTextCoord;
    // retrieve sample at new location
    vec4 color2 = texture(texture, texCoord);
    // apply sample attenuation scale/decay factors
    color2 *= illuminationDecay * weight;
    // accumulate combined color
    color += color2;
    // update exponential decay factor
    illuminationDecay *= decay;

  }
  // output final color with a further scale control factor
  fragColor = color * exposure;
}
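
For completeness: the sketch above never sets these uniforms from the Processing side, so the shader only runs with its default initializers. If they were driven from the sketch, I assume it would look something like the following untested sketch; the uniform names come from the fragment shader above, and the values are just the defaults, not tuned numbers:

// Hypothetical uniform setup, called before drawing the blur pass each frame.
void setBlurUniforms(PShader sh) {
  sh.set("lightPosition", 0.5, 0.5); // light position in the 0..1 texture space the shader samples in
  sh.set("exposure", 0.09);
  sh.set("decay", 0.95);
  sh.set("density", 1.0);
  sh.set("weight", 1.0);
  sh.set("samples", 100);
}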

Any ideas? What am I doing wrong?

  • Thanks, fixed the image order, but I can't embed images until level 10 :/ – fas Jan 24 '19 at 10:55
  • Thanks for the repair, I added the image formatting... I was doing it originally, but I noticed the discrepancies in the links, so I cancelled the edit and notified you instead... At first look at the images and description (without looking at the code), it looks like you're blurring in a different coordinate system than the one your screen is in... – Spektre Jan 24 '19 at 11:16
  • I just read the same thing in another forum, but I don't know where to start fixing it! Do you know how? – fas Jan 24 '19 at 11:20
  • My educated guess is that this line in your fragment shader is the problem: `deltaTextCoord = texCoord - lightPosition;`. I assume `texCoord` is parallel to the screen but `lightPosition` is in global coordinates... You should transform `lightPosition` in the vertex shader the same way as `position` and pass it to the fragment... – Spektre Jan 24 '19 at 11:21
  • However, your `lightPosition` is a `vec2`, so either it is already transformed and I am wrong, or it is transformed incorrectly, or you only have a 2D position instead of a 3D one... – Spektre Jan 24 '19 at 11:24
  • Right now my `lightPosition` is set to (0,0) because that is where the sphere is, so I don't know what transformation to apply... I read this in another forum: "So, I managed to get the effect working. The problem that I had (and you're most likely having, Asheh) is that the range of the texture coordinates and the screen-space light position in the fragment/pixel shader is different between OGL/DX. My issue was caused by having the sun position in the -1 to 1 range, and the texture coords in the 0 to 1 range." – fas Jan 24 '19 at 11:35
  • https://www.gamedev.net/forums/topic/533694-god-rays/ – fas Jan 24 '19 at 11:36
  • 1
  • Volumetric means 3D, so your light should be a `vec3` and transformed like `lightpos = transform * lightPosition;` in the vertex shader and passed to the fragment. There you use either just its `xy` or the whole `xyz` coordinates, depending on what/how exactly the effect is implemented. – Spektre Jan 24 '19 at 12:42
  • Maybe I explained it wrong. I don't have a "light" in this example; the light function is only used to create shading on other objects. I'm only interested in the radial blur effect, which is generated as post-processing. The `lightPosition` in this case corresponds to the sphere position, which is at (0,0). I tried passing these coordinates to the vertex shader and multiplying by `transform`, but the result is the same. Maybe I don't understand the process. Your logic seems to be right, but maybe I'm doing something wrong. – fas Jan 24 '19 at 18:26
  • I've been studying and I'm a little closer to the answer, but without results yet. It is not a problem with this particular effect, because with any other texture (feedback, for example), when I want to calculate the "radial" effect for each object, the texture does the calculations but always around the same centre: the centre of the screen. So I need to transform those coordinates for each object, and I don't know exactly how. If I have an object at (-100, 0, 0), I need to treat it as if it were the centre of the screen. – fas Jan 25 '19 at 04:40
  • Maybe you should describe what the effect is and how it works for us, as we are just guessing... that way we might spot what exactly is wrong... – Spektre Jan 25 '19 at 12:55
  • Please check this link: https://medium.com/@andrew_b_berg/volumetric-light-scattering-in-three-js-6e1850680a41. This is what I'm trying to do. Notice what happens when the effect changes its position. – fas Jan 26 '19 at 00:42
  • So it's volumetric light scattering, not just volumetric light... In that case you're not doing a radial blur... that should be concentric spheres in my opinion, not discs enlarging in some direction (cone). Also take a look at my [simple Atmospheric scattering in GLSL](https://stackoverflow.com/a/19659648/2521214) and [2D raycasting light effect in GLSL](https://stackoverflow.com/a/34708022/2521214), which are similar effects. Interesting topic though, maybe I will implement it... – Spektre Jan 26 '19 at 07:39
  • It is very interesting, I will read it. But have you looked at my shader? Because I think there is a big problem with the technique and I don't know where it is. Let's suppose that all I want is to implement this shader: why do I see the samples when I move the camera? How can I fix it? – fas Jan 27 '19 at 01:05
  • I did the first time, and I commented with my conclusions. Without implementing it myself I can only guess... As you did not describe which variable is which, how exactly your implementation works, nor give any sketch of the coordinate systems and such, for any of us this is just foreign code; we have some notion of what it should do, but that's it... As `PGraphics` is a foreign class to me I can only guess... How many passes do you have, and what does each of them do? Its API/parameters make not much sense to me: `sphere(100)`? I assume radius, but where is the centre?... Also I would expect more input textures – Spektre Jan 27 '19 at 09:09
  • ...lights (depth + intensity) and obstacles (depth + optional transparency). Yes, you can encode them into a single RGBA, but in that case you should describe which channel is what. Do you do the blurring with multiple passes or by raycasting in the fragment shader? And those are just the simple details that pop up in my head without even starting to code... As we do not have your CPU-side code (not all of the code is in the same environment), you should at least add the input texture to the shader so we can bypass that part of the code. – Spektre Jan 27 '19 at 09:20
  • The code that keeps the effect centred on the sphere as the camera moves, in the linked example, is this: `var p = lightSphere.position.clone(), vector = p.project(camera), x = (vector.x + 1) / 2, y = (vector.y + 1) / 2; volumetericLightShaderUniforms.lightPosition.value.set(x, y);` and this is the "project" function, according to three.js: `project: function (camera) { return this.applyMatrix4(camera.matrixWorldInverse).applyMatrix4(camera.projectionMatrix); }` (a rough Processing translation of this is sketched after these comments). – fas Jan 29 '19 at 02:26
  • You're right, @spe – fas Jan 31 '19 at 07:44
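
Following the last comments: the three.js snippet above projects the sphere's world position into screen space every frame and hands it to the shader in the same 0..1 range as the texture coordinates. A rough, untested Processing-style equivalent is sketched below, assuming the sphere sits at the world origin and using `screenX()`/`screenY()` on the canvas; the y axis may need flipping depending on how the texture is oriented, so treat it as a sketch, not a verified fix:

// Hypothetical per-frame update of the light position in screen/texture space.
void updateLightPosition() {
  canvas.beginDraw();
  cam.getState().apply(canvas);        // make the current view/projection active on the canvas
  float sx = canvas.screenX(0, 0, 0);  // project world (0,0,0) to window coordinates
  float sy = canvas.screenY(0, 0, 0);
  canvas.endDraw();

  // convert window coordinates to the 0..1 range the blur shader samples in;
  // flip y (1.0 - sy / height) if the effect appears mirrored vertically
  blurFilter.set("lightPosition", sx / width, 1.0 - sy / height);
}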
