
I've finished building a solar system using OpenGL and C++. One of its features is a set of camera positions for each planet, pointing north, that move with the planet's transformation: one on top of the planet, one a little behind it, and one far away from it. There are some other features, but I don't have any issues with them.

The issue I am having is that some planets seem to tremble while rotating around their own centers. If I increase the spin speed, the planet stops trembling or the trembling becomes unnoticeable. The entire solar system is based on real textures and proportional space calculations, and it has multiple camera positions as mentioned earlier.

Here is some code that might help in understanding what I am trying to achieve:

//Calculate the Uranus position
GLfloat UranusPos[3] = {Uranus_distance*DistanceScaler * cos(-uranus * M_PI / 180), 0, Uranus_distance*DistanceScaler * sin(-uranus * M_PI / 180)};
//Calculate the camera position
GLfloat cameraPos[3] = {Uranus_distance*DistanceScaler * cos(-uranus * M_PI / 180), (5*SizeScaler), Uranus_distance*DistanceScaler * sin(-uranus * M_PI / 180)};
//Set up the camera on top of the planet, pointing north
gluLookAt(cameraPos[0], cameraPos[1], cameraPos[2], UranusPos[0], UranusPos[1], UranusPos[2]-(6*SizeScaler), 0, 0, -1);

SetPointLight(GL_LIGHT1,0.0,0.0,0.0,1,1,.9);
//SetMaterial(1,1,1,.2);
// Uranus Planet
UranusObject(  UranusSize * SizeScaler,   Uranus_distance*DistanceScaler,   uranusumbrielmoonSize*SizeScaler,   uranusumbrielmoonDistance*DistanceScaler,   uranustitaniamoonSize*SizeScaler,   uranustitaniamoonDistance*DistanceScaler,   uranusoberonmoonSize*SizeScaler,   uranusoberonmoonDistance*DistanceScaler);

The following is the planet function that I call inside the display function to draw the object:

void UranusObject(float UranusSize, float UranusLocation, float UmbrielSize, float UmbrielLocation, float TitaniaSize, float TitaniaLocation, float OberonSize, float OberonLocation)
{
    glEnable(GL_TEXTURE_2D);
    glPushMatrix();

    glBindTexture( GL_TEXTURE_2D, Uranus_Tex);
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
    glRotatef( uranus, 0.0, 1.0, 0.0 );
    glTranslatef( UranusLocation, 0.0, 0.0 );
    glDisable( GL_LIGHTING );
    glColor3f( 0.58, 0.29, 0.04 );
    DoRasterString( 0., 5., 0., "     Uranus" );
    glEnable( GL_LIGHTING );
    glPushMatrix();
    // Uranus spinning
    glRotatef( uranusSpin, 0., 1.0, 0.0 );
    MjbSphere(UranusSize,50,50);

    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
    glEnable(GL_TEXTURE_2D);
    glPushMatrix();

    glBindTexture( GL_TEXTURE_2D, Umbriel_Tex);
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
    glDisable(GL_LIGHTING);
    if (LinesEnabled)
    {
        glPushMatrix();
        gluLookAt( 0.0000001, 0., 0.,     0., 0., 0.,     0., 0., .000000001 );
        DrawCircle(0.0, 0.0, UmbrielLocation, 1000);
        glPopMatrix();
    }
    glEnable( GL_LIGHTING );
    glColor3f(1.,1.,1.);
    glRotatef( uranusumbrielmoon, 0.0, 1.0, 0.0 );
    glTranslatef( UmbrielLocation, 0.0, 0.0 );
    MjbSphere(UmbrielSize,50,50);

    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
    glEnable(GL_TEXTURE_2D);
    glPushMatrix();

    glBindTexture( GL_TEXTURE_2D, Titania_Tex);
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
    glDisable(GL_LIGHTING);
    if (LinesEnabled)
    {
        glPushMatrix();
        gluLookAt( 0.0000001, 0., 0.,     0., 0., 0.,     0., 0., .000000001 );
        DrawCircle(0.0, 0.0, TitaniaLocation, 1000);
        glPopMatrix();
    }
    glEnable( GL_LIGHTING );
    glColor3f(1.,1.,1.);
    glRotatef( uranustitaniamoon, 0.0, 1.0, 0.0 );
    glTranslatef( TitaniaLocation, 0.0, 0.0 );
    MjbSphere(TitaniaSize,50,50);

    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
    glEnable(GL_TEXTURE_2D);
    glPushMatrix();

    glBindTexture( GL_TEXTURE_2D, Oberon_Tex);
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
    glDisable(GL_LIGHTING);
    if (LinesEnabled)
    {
        glPushMatrix();
        gluLookAt( 0.0000001, 0., 0.,     0., 0., 0.,     0., 0., .000000001 );
        DrawCircle(0.0, 0.0, OberonLocation, 1000);
        glPopMatrix();
    }
    glEnable( GL_LIGHTING );
    glColor3f(1.,1.,1.);
    glRotatef( uranusoberonmoon, 0.0, 1.0, 0.0 );
    glTranslatef( OberonLocation, 0.0, 0.0 );
    MjbSphere(OberonSize,50,50);

    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
}

Finally, the following code performs the transformation calculations used for the solar animation:

uranus += 0.0119 * TimeControl;

if( uranus > 360.0 )
    uranus -= 360.0;

// Clockwise Rotation
uranusSpin -= 2.39 * TimeControl;
if( uranusSpin <= -360.0 )
    uranusSpin = 0.0;

Note: the problem happens with only 4 of the planets.

I really appreciate any idea that could solve the problem.

Hafed
  • Why not do the wrapping for uranusSpin at -360 correctly, like for uranus? Or don't do it at all, since glRotatef() handles [-inf, inf]. – patraulea Apr 10 '16 at 19:40
  • Thanks for the reply. The reason I do it this way is that Uranus rotates clockwise, unlike the other planets. But this is not the issue, as the problem also exists on other planets like Neptune, which goes counterclockwise. – Hafed Apr 10 '16 at 20:10
  • FYI that version of OpenGL has been deprecated for a decade. Unless you're forced to use it for some very niche reason, do yourself a favor and learn modern OpenGL. – GraphicsMuncher Apr 10 '16 at 21:03
  • If you're using correct distances, the problem may be limited precision. – geometrian Apr 10 '16 at 21:07
  • @GraphicsMuncher Could also try out Vulkan for a blast! – Poriferous Apr 10 '16 at 21:59
  • I really appreciate your answers. Coding in Vulkan is harder than OpenGL, and it is meant only for gaining performance and securing the source code. – Hafed Apr 11 '16 at 00:30
  • Had this assignment too at uni! – Mihai-Daniel Virna Apr 11 '16 at 07:31

2 Answers


First of all, take a look at Understanding 4x4 homogenous transform matrices.

Now to your problem. I did not dig through all of your code, but you most likely hit the floating-point accuracy barrier and/or have accumulating errors from the rotations. My bet is that the error is bigger further away from the Sun (outer planets) and mostly in the ecliptic plane. If that is so, then the cause is clear. So how to remedy it?

  1. floating point accuracy

    While rendering, you transform vertices by the transform matrix. For reasonable ranges this is OK, but if your vertex is very far from (0,0,0), the matrix multiplication is no longer precise. This means that when you convert back to camera space, the vertex coordinates jump around. To avoid this you need to translate your vertices to the camera origin before feeding them to OpenGL.

    So just subtract the camera position from each vertex (before loading it into OpenGL!) and then render with the camera at position (0,0,0). This way you get rid of the jumping and even of wrong interpolation of primitives.

    If your object is still too big (which is not the case for a solar system), you can stack several frustums together and render each with a separate camera shifted by some step so that you stay in range.

    Use 64-bit floats where you can, but be aware that GPU implementations do not support 64-bit interpolators, so the fragment shader is fed 32-bit floats instead.

  2. accumulating errors in the transform matrix

    If you have some "static" matrix and apply countless operations to it (rotations, translations), it will lose precision after a while. This shows up as a changing scale (the axes are no longer unit length) and added skew (the axes are no longer perpendicular to each other), and it gets worse and worse over time.

    To remedy that, you can keep a counter of operations per matrix and, once it hits some threshold, perform a matrix normalization. This is simple: extract the axis vectors, make them perpendicular to each other again, set them back to their original size and write them back into your matrix. With basic vector math this is easy; just exploit the cross product (which gives you a perpendicular vector). I use the Z axis as the view direction, so I keep the Z axis direction as is and correct the X and Y axis directions. The size is easy: divide each vector by its length and it is unit again (or very close to it). A sketch of such a normalization is shown right after this list.
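
For point 2, here is a minimal sketch of such a normalization, assuming a column-major OpenGL-style matrix stored in a plain double[16] with the X, Y, Z axes in the first three columns; the helper names are illustrative and not part of any particular library:

// Minimal sketch: re-orthonormalize the rotation part of a column-major
// 4x4 matrix (m[0..2] = X axis, m[4..6] = Y axis, m[8..10] = Z axis),
// keeping the Z axis as the view direction.
#include <cmath>

static void normalize3(double v[3])
{
    double l = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    if (l > 0.0) { v[0] /= l; v[1] /= l; v[2] /= l; }
}

static void cross3(const double a[3], const double b[3], double out[3])
{
    out[0] = a[1]*b[2] - a[2]*b[1];
    out[1] = a[2]*b[0] - a[0]*b[2];
    out[2] = a[0]*b[1] - a[1]*b[0];
}

void reorthonormalize(double m[16])
{
    double x[3] = { m[0], m[1], m[2]  };
    double y[3] = { m[4], m[5], m[6]  };
    double z[3] = { m[8], m[9], m[10] };

    normalize3(z);        // keep the view direction, just restore unit length
    cross3(y, z, x);      // X = Y x Z -> perpendicular to Y and Z
    normalize3(x);
    cross3(z, x, y);      // Y = Z x X -> rebuild Y perpendicular to Z and X
    normalize3(y);

    m[0] = x[0]; m[1] = x[1]; m[2]  = x[2];
    m[4] = y[0]; m[5] = y[1]; m[6]  = y[2];
    m[8] = z[0]; m[9] = z[1]; m[10] = z[2];
}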

[Edit1] What is going on

Have a look at your code for a single planet, without the rendering stuff:

// you are missing a glMatrixMode(GL_????) call here !!! what if it has been changed?
glPushMatrix(); 
// rotate to match the daily rotation axis?
glRotatef( uranus, 0.0, 1.0, 0.0 );
// translate to Uranus' average year-rotation (orbit) radius
glTranslatef( UranusLocation, 0.0, 0.0 );
glPushMatrix();
// rotate Uranus to actual position (year rotation)
glRotatef( uranusSpin, 0., 1.0, 0.0 );
// render sphere
MjbSphere(UranusSize,50,50);
glPopMatrix();

// moons

glPopMatrix();

So what you are doing is this. Let's assume you are using the ModelView matrix; you are instructing OpenGL to do this operation on it:

ModelView = ModelView * glRotatef(uranus,0.0,1.0,0.0) * glTranslatef(UranusLocation,0.0,0.0) * glRotatef(uranusSpin,0.,1.0, 0.0);

So what is wrong with this? For small scenes, nothing, but you are using proportional sizes, so:

UranusLocation=2870480859811.71 [m]
UranusSize    =     25559000    [m]

So that means the glVertex magnitudes are ~25559000, and after applying the transforms ~2870480859811.71+25559000. Now there are a few problems with these values.

First, any glRotate call applies sin and cos coefficients to the 2870480859811.71. Let's assume the error of sin, cos is around 0.000001; that means the final position has an error of:

error=2870480859811.71*0.000001=2870480.85981171

The OpenGL sin, cos implementation probably has higher precision, but not by much. Anyway, if you compare that to the planet radius:

2870480.85981171/25559000=0.112308 -> 11%

You get that the jumping error is around 11% of the planet size. That is huge. The implication is that the jumping gets bigger the further you go from the Sun, and is more visible for smaller planets (as our perception is usually relative, not absolute).

You can try to improve this by using double precision (glRotated), but that does not mean it will solve the problem (some drivers do not have a double-precision implementation of sin, cos).

If you want to get rid of these problems, you have to follow bullet #1 or do the rotations on your own, in at least double precision, and feed only the final matrix to OpenGL. So first the #1 approach. Translation in a matrix is just a +/- operation (also encoded as a multiplication), but no imprecise coefficients are involved, so you keep the full precision of the variable used. Anyway, I would use glTranslated just to be sure. So we need to make sure the rotations do not involve big values inside OpenGL. Try this:

// compute planet position
double x,y,z;
x=UranusLocation*cos(uranusSpin*M_PI/180.0);
y=0.0;
z=UranusLocation*sin(uranusSpin*M_PI/180.0);

// rotate to match the daily rotation axis?
glRotated( uranus, 0.0, 1.0, 0.0 );
// translate Uranus to its actual position (year rotation)
glTranslated(x,y,z);
// render sphere
MjbSphere(UranusSize,50,50);

This affects the daily rotation speed, as the daily and year rotation angles no longer add up, but you are not implementing the daily rotation yet anyway. If this does not help, then we need to use camera-local coordinates to avoid sending big values to OpenGL:

// compute planet position
double x,y,z;
x=UranusLocation*cos(uranusSpin*M_PI/180.0);
y=0.0;
z=UranusLocation*sin(uranusSpin*M_PI/180.0);
// here change/compute camera position to (original_camera_position-(x,y,z))

// rotate to match the daily rotation axis?
glRotated( uranus, 0.0, 1.0, 0.0 );
// render sphere
MjbSphere(UranusSize,50,50);

I hope I matched your coordinate system; if not, just swap the axes or negate them (x, y, z).

It is much better to have your own precise matrix math at your disposal, compute the glTranslate/glRotate on the CPU side with high precision, and use only the resulting matrix in OpenGL. See the Understanding 4x4 homogenous transform matrices link above for how to do that.
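
For illustration, here is a minimal sketch of that idea under the legacy fixed-function pipeline (the function and variable names are mine, not from the question): the spin and a camera-relative translation are computed in double precision on the CPU, and only the resulting small-magnitude matrix is handed to OpenGL. Note that it folds the spin and the position into a single matrix, so the composition differs slightly from the Rotate-then-Translate order used above:

#include <cmath>
#include <GL/gl.h>   // or your platform's OpenGL header

// Build "spin about Y, then place at the camera-relative position" in doubles
// and multiply it onto the current ModelView, which should hold only the
// camera orientation (the camera itself is treated as sitting at the origin).
void applyPlanetTransform(double spinDeg, double orbitDeg, double orbitRadius,
                          const double camPos[3])
{
    // orbital position of the planet, computed in doubles
    double oa = orbitDeg * M_PI / 180.0;
    double px = orbitRadius * std::cos(oa);
    double pz = orbitRadius * std::sin(oa);

    // spin about the Y axis, also in doubles
    double sa = spinDeg * M_PI / 180.0;
    double c = std::cos(sa), s = std::sin(sa);

    // column-major: rotation in columns 0-2, camera-relative translation in column 3
    double m[16] = {
          c, 0.0,  -s, 0.0,
        0.0, 1.0, 0.0, 0.0,
          s, 0.0,   c, 0.0,
        px - camPos[0], 0.0 - camPos[1], pz - camPos[2], 1.0
    };

    glMatrixMode(GL_MODELVIEW);
    glMultMatrixd(m);   // only small numbers ever reach OpenGL
}

The planet mesh (MjbSphere) would then be drawn right after this call, with the camera kept at the origin as in point #1.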

Spektre
  • Thanks for the reply... I have the problem on Mars too, which is very near to the Sun. To clarify further, the problem occurs on Mars, Uranus, Neptune and Pluto; the rest work just fine and have no issues. Moreover, the Mars sphere is very small compared to the other planets; would that be another reason for getting this error on Mars too? – Hafed Apr 11 '16 at 09:47
  • @Hafed no, the size itself should not matter for the error (only the magnitude of the vertex values at any stage of OpenGL rendering), but on big planets like Jupiter or Saturn you could overlook it: because they are big, you place the camera further away, which in a perspective projection lowers the jumping. You can try rendering just a cross instead of the planets and looking very closely at them (near the planet centers); you should see the jumping. Anyway, if you translate the vertices to camera-local position you should be fine. – Spektre Apr 11 '16 at 09:51
  • Thanks, I get the idea now. But the last statement is not clear to me, as I think I am already doing that: I calculate the planet position using cosine and sine based on the planet's current location and put it into a vector[3], then calculate the camera position using sine and cosine based on the planet's current location and put it into another vector[3]. – Hafed Apr 11 '16 at 10:11
  • With these two vectors I am able to use the gluLookAt function to put the camera on top of each planet, near the planet, or far away from the planet. Is this a good way of positioning the planets' cameras, or is there a better way of doing it? I wish you could have a look at the code and figure out what the issue is, as I have run out of solutions to this problem :( – Hafed Apr 11 '16 at 10:11
  • @Hafed I use my own matrix class and never really used `gluLookAt`, so it is hard to say. I position my cameras directly by setting their position and view direction. But it is possible this is part of your problem, as I have seen glu matrix precision errors before; for example `gluPerspective` has an imprecise `tangent`. People use the `glm` lib for this stuff, and usually Euler angles (which I do not like/use as they have issues and are too limiting for my needs). Anyway, try to translate the vertices and camera first (on the **CPU** side) as I suggested in **#1**. It should be enough. – Spektre Apr 11 '16 at 10:17
  • I really appreciate your patience with me, as I am new to this field; I think I've done a great job and spent a lot of time building the system. So, for the floating-point accuracy fix, where and how should that be done in my code? Should it be done in the planet function or in the camera position calculations? Some code would be great... – Hafed Apr 11 '16 at 10:30
  • @Hafed for example you can use `gluLookAt(cameraPos[0]-UranusPos[0], cameraPos[1]-UranusPos[1], cameraPos[2]-UranusPos[2], 0.0,0.0,0.0-(6*SizeScaler), 0, 0, -1);` and add `glTranslatef` with `-UranusLocation` – Spektre Apr 11 '16 at 11:24
  • Thanks Spektre. I've got the camera in the right position using your method `gluLookAt(cameraPos[0]-UranusPos[0], cameraPos[1]-UranusPos[1], cameraPos[2]-UranusPos[2], 0.0,0.0,0.0-(6*SizeScaler), 0, 0, -1);` `glTranslatef(-Uranus_distance*DistanceScaler,0,0);`. However, the camera position does not move with the planet. How can I make it follow the planet? – Hafed Apr 11 '16 at 17:20
  • @Hafed if you subtract the actual position of the planet from both the camera and the mesh, then the result should be the same. The camera is OK, so render the planet at position `(0,0,0)`. The problem is that you are using a single matrix for this and OpenGL can accumulate the errors in the process; a better choice would be to use your own matrix math... Also, I just noticed you are missing some safety `glMatrixMode` calls. – Spektre Apr 11 '16 at 19:43
  • Thanks. I will try this and let you know if this solves the problem. – Hafed Apr 11 '16 at 23:47
  • Spektre, what do you mean by "a better choice would be to use your own matrix math... you are missing some safety glMatrixMode calls"? What kind of matrix should I create, and how? – Hafed Apr 12 '16 at 02:17
  • @Hafed added [Edit1] – Spektre Apr 12 '16 at 06:47
  • Thanks, Spektre, for the additions to your post. I tried to implement your suggestions and things got worse. At least I know what the problem is now, and I will have to figure out a way to solve it. – Hafed Apr 13 '16 at 18:27

I cannot give you an exact answer based on what you have shown; I cannot compile and run your current code and test it with numbers and the debugger. What I can do is give you some advice that ought to help you out.

From what I can tell from the code you did supply, you are creating your planet objects via a function call, so I assume you are doing this for every planet. If that is the case, then if you look at your full code base you will see that every one of these functions, apart from its name and the numbers used, is basically a copy of the others: duplicate code. This is where you need a more versatile structure.
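
As a quick illustration of what a more versatile structure could look like before going to a full scene graph, here is a minimal, data-driven sketch based on the parameters of the question's UranusObject; the struct and function names are mine, and the MjbSphere signature is assumed from its call sites:

#include <vector>
#include <GL/gl.h>   // or your platform's OpenGL header

void MjbSphere(float radius, int slices, int stacks);   // from the question's code (assumed signature)

// Hypothetical parameter bundles: one per planet, one per moon.
struct MoonParams   { GLuint tex; float size, distance, orbitAngle; };
struct PlanetParams
{
    GLuint tex;                    // e.g. Uranus_Tex
    float  size, distance;         // already multiplied by SizeScaler / DistanceScaler
    float  orbitAngle, spin;       // e.g. uranus, uranusSpin
    std::vector<MoonParams> moons; // Umbriel, Titania, Oberon, ...
};

void DrawPlanet(const PlanetParams& p)
{
    glPushMatrix();
    glBindTexture(GL_TEXTURE_2D, p.tex);
    glRotatef(p.orbitAngle, 0.f, 1.f, 0.f);
    glTranslatef(p.distance, 0.f, 0.f);

    glPushMatrix();
    glRotatef(p.spin, 0.f, 1.f, 0.f);    // spin about the planet's own axis
    MjbSphere(p.size, 50, 50);
    glPopMatrix();

    for (const MoonParams& m : p.moons)  // every moon follows the same pattern
    {
        glPushMatrix();
        glBindTexture(GL_TEXTURE_2D, m.tex);
        glRotatef(m.orbitAngle, 0.f, 1.f, 0.f);
        glTranslatef(m.distance, 0.f, 0.f);
        MjbSphere(m.size, 50, 50);
        glPopMatrix();
    }
    glPopMatrix();
}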

Here are some important things to consider in your construct.

  • Create a full 3D Motion camera class & a separate Player Class

    • The Camera class will allow you to look up, down, left & right by restrained angles, via mouse controls.
    • The Player class will have the camera object attached to it at camera_eye_level, with a lookAtDirection that is perpendicular to the upVector. The Player class will be able to move freely forward, backward, up and down, and turn left & right via keyboard controls. This gives you flexibility.
    • Note: If you are working to scale and your planets are considerably far apart, make your player's linear speed higher so that you move toward an object faster, covering more distance.
  • Create a base Geometry class

    • Derived Geometry Classes: Box, Flat Grid, Cylinder, Sphere, Pyramid, Cone
    • These classes will hold a finite container of Vector3s (vertices), a container of Vector3s (indices), a container of Vector2s (texture coords), a container of Vector4s (material - color with alpha), and a container of Vector3s (normals).
    • The base class will probably only hold an unsigned int (ID value) and/or a std::string (name identifier) and an enum value for the type of geometry being created.
    • Each derived class will have a constructor taking the required parameters, such as the sizes of its 3 dimensions (no need to worry about texture & color information here; that comes later).
  • Create a Material Class
  • Create a base Light class
    • Derived Types: Directional, Point & Spot
  • Create A Texture Class
  • Create a Texture Transform class - this will allow you to apply transformations directly to the texture attached to your geometry, to give the effect that an object is moving when only the texture transform is.
    • Example: a Geometry of type Box(2,1,4) with the texture "conveyor_belt1.png" applied. This way you don't have to change the box's transform and move all of the vertices, texcoords and normals on each render pass; you can simply make the texture move in place. Less computationally expensive.
  • Create Node Class - Base class for all nodes that will belong to a Scene Graph
    • Types of Nodes for SceneGraph - ShapeNode(Geometry), LightNode, TransformNode(contains vector & matrix information for Translation, Rotation & Scaling),
    • The combination of the Node Classes and The SceneGraph will create a tree like structure such as this:


// This is an example of a text file that you would read in and parse
// to construct the scene graph for you, as well as render it.
// Amb = Ambient, Dif = Diffuse - the materials work with lighting & shading and will blend with whichever texture is applied

// Items needed to construct
// Objects
Grid geoid_1 Name_plane Wid_5 Dep_5 divx_10 divz_10
Sphere geoid_2 name_uranus radius_20 
Sphere geoid_3 name_earth radius_1
Sphere geoid_4 name_earth_moon radius_0.3

// Materials
Material matID_1 Amb_1,1,1,1 Dif_1,1,1,1 // (white fully opaque)

// Textures   
Texture texID_1 fileFromTexture_"assets\textures\Uranus.png"
Texture texID_2 fileFromTexture_"assets\textures\Earth.png"
Texture texID_3 fileFromTexture "assets\textures\EarthMoon.png"

// Lights (Directional, Point, & Spot Types)

Transform Trans_0,0,0 // This is the center of the World Space and has to be the root node
+Shape geoID_1 matID_1 // Applies flat grid to root node to create your horizontal plane
+Transform Trans_10,2,10000 // Nest a transform node that is relative to the root; this will allow you to rotate, translate and scale every object that is attached and nested below this node. Any Nodes Higher in the branch will not be affected
++Shape geoID_2 matID_1 texID_1
+Transform Trans_10,1.5,200 // This node has 1 '+' so it is nested under the root and not Uranus
++Shape geoID_3 matID_1 texID_2
+++Transform Trans_10,1.5,201 // This node is nested under the Earth's Transform Node and will belong to the Earth's Moon.
+++Shape geoID_4 matID_1 texID_3

END // End Of File To Parse

With this kind of construct, you are able to translate, rotate and scale objects independently of each other, or through the hierarchy. As for the moons of Uranus, you can apply the same technique as I showed with the Earth and its moon: each moon would have its own transform, but those transforms would be nested under their planet's transform, and the planets' transforms would be nested under the root, or even the Sun's transform if you added one in (Sun = light source; it would have several light nodes attached). I've already done this before with the Sun, Earth & Moon and had the objects rotate accordingly.
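
Here is a minimal C++ sketch of that nesting idea (class and member names are illustrative, and glm is used only for the matrix type): each node composes its local transform with its parent's, so a moon's node automatically follows its planet's node, which in turn follows the root (or the Sun's) node.

#include <memory>
#include <vector>
#include <glm/glm.hpp>

struct Node
{
    glm::mat4 local = glm::mat4(1.0f);             // this node's own translate/rotate/scale
    std::vector<std::unique_ptr<Node>> children;

    virtual ~Node() = default;
    virtual void draw(const glm::mat4& /*world*/) const {}   // a ShapeNode would override this

    void render(const glm::mat4& parentWorld) const
    {
        glm::mat4 world = parentWorld * local;     // relative to the parent node
        draw(world);                               // render this node's geometry, if any
        for (const auto& c : children)
            c->render(world);                      // children (e.g. moons) inherit the transform
    }
};

// Usage idea: root -> uranusTransform -> { uranusShape, moonTransform -> moonShape };
// calling root.render(glm::mat4(1.0f)) once per frame walks the whole tree.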

As another example, say you have a model of a Jeep, but you have 4 different models to render to make the full object in your game: 1) body, 2) front wheels, 3) back wheels & 4) steering wheel. With this construct, the graph might look like this:

Model modID_1 modelFromFile_"Assets/Models/jeep_base.mod"
Model modID_2 modelFromFile_"Assets/Models/jeep_front_wheel.mod"
Model modID_3 modelFromFile_"Assets/Models/jeep_rear_wheel.mod"
Model modID_4 modelFromFile_"Assets/Models/jeep_steering.mod"

Material matID_1 Amb_1_1_1_1 Diff_1_1_1_1
Texture texID_1 textureFromFile_"Assets/Textures/jeep1_.png"    

TextureTransform texTransID_1 name_front
TextureTransform texTransID_2 name_back
TextureTransform texTransID_3 name_steering

+Transform_0,0,0 name_root
++Transform_0,0.1,1 name_jeep
+++Shape modID_1 matID_1 texID_1  texCoord_0,0, size_75,40
+++Transform_0.1,0.101,1 name_jeepFrontWheel
+++Shape modID_2 matID_1 texID_1  texCoord_80,45 size_8,8
+++Transform_-0.1,-0.101,1 name_jeepBackWheel
+++Shape modID_3 matID_1 texID_1  texCoord_80,45 size_8,8
+++Transform_0.07,0.05,-0.02 name_jeepSteering
+++Shape modID_4 matID_1 texID_1  texCoord_80,55 size_10,10

END

Then in your code you can grab the transform nodes that belong to the jeep, and when you translate it across the screen all of the jeep's parts move together as one object. Independently of that, all of the tires can roll forward & back using the texture transforms, the front wheels can turn left and right through their own node transform by a constrained angle, and the steering wheel can turn in the same direction as the wheels using its texture transform, but may rotate less or more depending on the characteristics of this particular vehicle and how it handles within the scene or game.

With this type of system the process is a bit harder to construct, but once it is working it allows for simple automation of generating a scene graph. The downside is that while the text file is easy to read at the human level, parsing a text file is much harder than parsing a binary file, mainly because of the function in your Scene class that has to parse the file and create all of these different nodes (all of the Geometry (Shape), Light, Material, Texture and Transform nodes).
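
As a small illustration of that parsing step, here is a hypothetical helper (not part of the answer's framework) that turns the leading '+' characters of a line from the format above into a nesting depth; a real parser would then attach each parsed node as a child of the last node seen one level higher:

#include <string>

// Split a scene-graph line like "++Shape geoID_2 matID_1 texID_1" into its
// nesting depth (number of leading '+') and the remaining payload.
struct ParsedLine
{
    int depth;            // 0 = root level, 1 = child of root, ...
    std::string payload;  // e.g. "Shape geoID_2 matID_1 texID_1"
};

ParsedLine parseNodeLine(const std::string& line)
{
    ParsedLine out{0, std::string()};
    std::size_t i = 0;
    while (i < line.size() && line[i] == '+') { ++out.depth; ++i; }
    out.payload = line.substr(i);
    return out;
}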

The reason for giving the geometries and lights a base class is that when you create a ShapeNode or a LightNode attached or nested under a TransformNode, it can accept any type derived from that base class. This way you don't have to code your scene graph construction and parser to accept every specific type of geometry and light; you can have it accept any geometry or light and simply tell it which type it is.

You can make this a bit easier by creating a parser that reads binary files. The plus side is that the parser is easier to write, as long as you know the structure of the file, how much data to read, and the expected type of data at the current read location. The downside is that you cannot read or set the file up manually as I demonstrated above with the human-readable text file. Going this route, you would want a decent hex editor that supports file templates, such as 010 Editor; that way you can add your own template for your file structure, and when you open the binary file with the template applied, you can check whether the values are correct and the fields have the appropriate data types and values.

In truth, the best advice I can give you is this: the construct above is still good to follow, and while it may not be the best, it is a good starting point. But what you are learning right now appears to be OpenGL 1.x, which is basically outdated and deprecated. You should move on from this relic of an API and begin to learn modern OpenGL, any version higher than 3.3. From there you can learn to build and write shaders in GLSL, which will simplify a lot of things once you have your framework up and working; doing the heavy lifting on the GPU instead of pushing everything from the CPU is much more efficient.

Once you have that in place, consider looking into batch rendering. Batch rendering lets you control how many buckets there are and how many vertices each bucket contains; this batching prevents the bottleneck in I/O from the CPU over the bus to the GPU, since the GPU does its computations much faster than the CPU can feed it. With your scene graph you then no longer have to worry about creating lights and materials and applying them by hand, because all of that is done in your fragment shader in GLSL (what DirectX calls a pixel shader), while all of your vertex and index information is passed into the vertex shader. Then it is just a matter of linking your shaders into an OpenGL program.

If you would like to see and learn about everything I have described above, then visit the community I've been a member of since 2007-08 at www.MarekKnows.com and join us. He has several video tutorial series to learn from.

Francis Cugler
  • When I get some time I can take a look at it. But I do highly suggest making the transition from OpenGL 1.0 to Modern OpenGL & GLSL – Francis Cugler Apr 11 '16 at 02:34
  • @Hafed a 407 MB zip is crazy (I did not even try to download it); you should fit into 10 MB unpacked. My ephemerides are ~25/110/310 MB unpacked depending on the DTM model used, but you do not use DTM, only textures and maybe a stellar catalog. Are you using unpacked textures? It must take forever to load the app ... – Spektre Apr 11 '16 at 07:48
  • Thanks for the comment. Yes, I am trying to make this as real as possible using real textures, and it is not as bad as you think; it takes 5-8 seconds max to load everything... – Hafed Apr 11 '16 at 09:55