When I calculate gl_PointSize the same way as in the vertex shader, I get a value "in pixels" (according to http://www.opengl.org/sdk/docs/manglsl/xhtml/gl_PointSize.xml). Yet this value doesn't match the measured width and height of the point on screen, and the difference between the calculated and measured size doesn't appear to be constant.
Calculated values range from 1 (very far away) to 4 (very near).
Current code (using three.js, but nothing magical), trying to calculate the on-screen size of a point:
var projector = new THREE.Projector();
var width = window.innerWidth, height = window.innerHeight;
var widthHalf = width / 2, heightHalf = height / 2;
var vector = new THREE.Vector3();
var matrixWorld = new THREE.Matrix4();
matrixWorld.setPosition(focusedArtCluster.object3D.localToWorld(position));
var modelViewMatrix = camera.matrixWorldInverse.clone().multiply( matrixWorld );
var mvPosition = (new THREE.Vector4( position.x, position.y, position.z, 1.0 )).applyMatrix4(modelViewMatrix);
var gl_PointSize = zoomLevels.options.zoom * ( 180.0 / Math.sqrt( mvPosition.x * mvPosition.x + mvPosition.y * mvPosition.y + mvPosition.z * mvPosition.z ) );
projector.projectVector( vector.getPositionFromMatrix( matrixWorld ), camera );
vector.x = ( vector.x * widthHalf ) + widthHalf;
vector.y = - ( vector.y * heightHalf ) + heightHalf;
console.log(vector.x, vector.y, gl_PointSize);
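The last two coordinate lines, which map the projected NDC coordinates (range -1..1) to window pixels, can be checked in isolation without three.js. The viewport size and projected position below are made-up sample values, not taken from my scene:

```javascript
// NDC (-1..1) to window pixel coordinates, same math as the snippet above.
// Hypothetical viewport and projected position:
var width = 1024, height = 768;
var widthHalf = width / 2, heightHalf = height / 2;

var ndc = { x: 0.5, y: -0.25 }; // stand-in for the output of projectVector

var px = ndc.x * widthHalf + widthHalf;    // 768
var py = -ndc.y * heightHalf + heightHalf; // 480 (y is flipped: NDC y up, screen y down)

console.log(px, py);
```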
Let me clarify: The goal is to get the screen size of a point, in pixels.
My vertex shader:
vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
gl_PointSize = zoom * ( 180.0 / length( mvPosition.xyz ) );
gl_Position = projectionMatrix * mvPosition;
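To rule out the matrix math, the shader's `modelViewMatrix * vec4( position, 1.0 )` step can be replayed on the CPU with a plain 4×4 multiply. The matrix below is a hypothetical view transform (camera pulled back 45 units along z), not my actual modelViewMatrix:

```javascript
// Minimal sketch: apply a 4x4 matrix (row-major) to a point, mimicking
// modelViewMatrix * vec4(position, 1.0) from the vertex shader.
function applyMatrix4(m, v) {
  return {
    x: m[0][0] * v.x + m[0][1] * v.y + m[0][2] * v.z + m[0][3],
    y: m[1][0] * v.x + m[1][1] * v.y + m[1][2] * v.z + m[1][3],
    z: m[2][0] * v.x + m[2][1] * v.y + m[2][2] * v.z + m[2][3]
  };
}

// Hypothetical modelView matrix: translate by -45 along z.
var modelView = [
  [1, 0, 0, 0],
  [0, 1, 0, 0],
  [0, 0, 1, -45],
  [0, 0, 0, 1]
];

var mv = applyMatrix4(modelView, { x: 0, y: 0, z: 0 }); // { x: 0, y: 0, z: -45 }
var len = Math.sqrt(mv.x * mv.x + mv.y * mv.y + mv.z * mv.z);
var gl_PointSize = 1.0 * (180.0 / len); // zoom = 1 gives 4, the "very near" case
```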