This project involved a number of components. The first to get in place was the terrain mesh, which is used twice - once for the terrain itself, and once for the water. For everything to work as intended, the planes that make up the water and terrain had to be made of many vertices rather than one big quadrilateral, so that the height of each vertex could be adjusted individually to create the intended effects. To implement this, I generated all of the vertex positions and then built a list of indices so that, when rendered as a triangle strip, the vertices would come in the right order. To join every row into one long triangle strip, degenerate triangles - triangles with an area of 0 - are needed to effectively reset the strip at the end of each row so it can start over one row down. This creates an effect like this:
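The index generation with degenerate triangles can be sketched like this - a minimal example with placeholder grid dimensions, not the project's actual code:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build a single triangle-strip index list for a (cols x rows) vertex grid.
// At each row change, the last index of one row and the first index of the
// next are repeated; the resulting zero-area (degenerate) triangles let the
// strip "jump" down a row without drawing anything visible.
std::vector<uint32_t> buildStripIndices(uint32_t cols, uint32_t rows) {
    std::vector<uint32_t> indices;
    for (uint32_t z = 0; z + 1 < rows; ++z) {
        if (z > 0)                        // degenerate: repeat first index of this row
            indices.push_back(z * cols);
        for (uint32_t x = 0; x < cols; ++x) {
            indices.push_back(z * cols + x);        // vertex on the upper row
            indices.push_back((z + 1) * cols + x);  // vertex on the lower row
        }
        if (z + 2 < rows)                 // degenerate: repeat last index of this row
            indices.push_back((z + 1) * cols + cols - 1);
    }
    return indices;
}
```

For a 3x3 grid this yields `0,3,1,4,2,5, 5,3, 3,6,4,7,5,8` - the repeated `5,3` pair in the middle produces the degenerate triangles that bridge the two rows.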
This pattern is repeated for the required width and height, which are determined by the dimensions of the terrain's heightmap - one pixel on the heightmap corresponds to one vertex. In this video, the heightmap I used was this one:
Using SDL, I loaded the image and read the RGB value of each pixel, using the average of the three channels (for a greyscale image, all three are the same number) to set the Y value of the corresponding vertex, which produced the terrain. However, for lighting to work properly, it was also important to recalculate the normal of each vertex, since it would no longer point straight up as it did when the plane was flat. To do this, I took the vectors from the vertex to two of its neighbors (in the image of the triangle strip above, vectors 0-1 and 0-4, for example) and computed their cross product, which gives a vector perpendicular to both - the normal of that vertex. The last aspect of the terrain that needed to be implemented was the texture blending between the sand, grass, and rock, which I based on the height of the vertex. In the fragment shader for the terrain, I use the height of each vertex and, depending on its value, blend the grass texture with either the sand or the rock texture, then calculate lighting from that interpolated value.
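The neighbor-based normal calculation can be sketched like this - the vector type and helper names are my own, not the project's:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                     a.z * b.x - a.x * b.z,
                                     a.x * b.y - a.y * b.x}; }
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Approximate the normal at vertex p from its neighbor along +X (e.g. vertex 1)
// and its neighbor one row down along +Z (e.g. vertex 4). Crossing the two
// edge vectors in this order gives a normal with a positive Y component on
// an upward-facing surface.
Vec3 vertexNormal(Vec3 p, Vec3 rightNeighbor, Vec3 downNeighbor) {
    Vec3 toX = sub(rightNeighbor, p);
    Vec3 toZ = sub(downNeighbor, p);
    return normalize(cross(toZ, toX));
}
```

On a flat patch - say `p = (0,0,0)`, right neighbor `(1,0,0)`, down neighbor `(0,0,1)` - this returns `(0,1,0)`, straight up, as expected.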
To make the textures look better from a distance, I used OpenGL's built-in mipmap generation. By generating mipmaps for each of the terrain textures and using GL_LINEAR_MIPMAP_LINEAR as the minification filter, the textures are automatically sampled at an appropriate resolution based on their on-screen size at runtime. This greatly reduces the graininess of the textures when viewed from a distance. For comparison, this is what the world I created looks like without mipmapping:
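For reference, the full mipmap chain that glGenerateMipmap builds halves the texture at each level until it reaches 1x1, so an NxM texture gets floor(log2(max(N, M))) + 1 levels - the levels that GL_LINEAR_MIPMAP_LINEAR then interpolates between. A small sketch of the level count:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// Number of levels in a full mipmap chain for a w x h texture: each level
// halves both dimensions (rounding down) until the 1x1 level is reached,
// so the count is floor(log2(max(w, h))) + 1.
uint32_t mipLevelCount(uint32_t w, uint32_t h) {
    uint32_t levels = 1;
    for (uint32_t size = std::max(w, h); size > 1; size /= 2)
        ++levels;
    return levels;
}
```

A 1024x1024 terrain texture, for example, gets 11 levels, down to a single averaged pixel.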
Once mipmapping is enabled, the same scene looks smoother, like this:
The water was generated similarly, but without a heightmap - the Y values of all of the vertices are set to the same value to create a flat plane, and the vertex shader then uses a sinusoidal function to offset each height and create a wave effect. The water sits at a fixed level, and wherever the terrain dips below that level, the water shows through. Because the vertex heights change every frame, the normals have to be recalculated as well, and since this happens in the vertex shader after the heights are changed, it has to be done in a slightly different way: I take the partial derivative of the height function with respect to X and with respect to Z, and take the cross product of the resulting tangent vectors to find the normal.
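The shader-side calculation could look roughly like this, shown here on the CPU and assuming an illustrative single-sine wave function - the actual function and constants in the project may differ:

```cpp
#include <cassert>
#include <cmath>

// Illustrative wave: h(x, z, t) = A * sin(KX * x + KZ * z + W * t).
// (The exact function and constants are assumptions, not the project's.)
const float A = 0.5f, KX = 1.2f, KZ = 0.8f, W = 2.0f;

float dhdx(float x, float z, float t) { return A * KX * std::cos(KX * x + KZ * z + W * t); }
float dhdz(float x, float z, float t) { return A * KZ * std::cos(KX * x + KZ * z + W * t); }

// The surface tangents are (1, dh/dx, 0) along X and (0, dh/dz, 1) along Z;
// their cross product (Z tangent crossed with X tangent) works out to
// (-dh/dx, 1, -dh/dz), an unnormalized normal that always points upward.
void waveNormal(float x, float z, float t, float n[3]) {
    n[0] = -dhdx(x, z, t);
    n[1] = 1.0f;
    n[2] = -dhdz(x, z, t);
    float len = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    n[0] /= len; n[1] /= len; n[2] /= len;
}
```

At a wave crest the derivatives vanish and the normal is straight up; on the slopes it tilts against the gradient, which is what makes the lighting ripple with the waves.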