Python Modern Opengl Texturing Rotating Cube

In this Python Modern OpenGL article I am going to talk about texturing a rotating cube. A texture is a 2D image (1D and 3D textures also exist) used to add detail to an object. Think of a texture as a piece of paper with a nice brick image (for example) on it, neatly folded over your 3D house so it looks like your house has a stone exterior. Because we can put a lot of detail into a single image, we can give the illusion that the object is extremely detailed without having to specify extra vertices.

You need a texture image, ideally 512 x 512, in your project directory. You also need to install the Pillow library: pip install pillow

I am using this image for texturing:

Python Modern Opengl Texturing

You can check my previous articles on Python Modern OpenGL Programming:

1:  Python Opengl Introduction And Creating Window

2: Python Opengl Drawing Teapot

3: Python Modern Opengl Drawing Rectangle

4: Python Modern Opengl GLFW Window

5: Python Modern Opengl Triangle With GLFW Window

6: Python Modern Opengl Coloring Triangle

7: Python Modern Opengl Drawing Rectangle

8: Python Modern Opengl Rotating Cube

9: Python Modern Opengl Texturing Rectangle

So now let's go through the complete code for Python Modern OpenGL Texturing Rotating Cube, step by step.

If you have followed my previous articles on Python Modern OpenGL Programming, most of this code will be familiar to you, but I will explain some of it again. If you need to know more about any of it, you can check my previous articles; I have given the links at the top.

These are the cube vertex values, with a different color and texture coordinates for each vertex; we also convert the values to 32-bit floats.
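
Here is a sketch of what that vertex data might look like (the exact numbers and colors are illustrative, assuming 4 vertices per face so every face can map the full texture):

import numpy as np

# One row per vertex: position (x, y, z), color (r, g, b), texture coords (u, v).
cube = [
    # front
    -0.5, -0.5,  0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
     0.5, -0.5,  0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
     0.5,  0.5,  0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
    -0.5,  0.5,  0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
    # back
    -0.5, -0.5, -0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
     0.5, -0.5, -0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
     0.5,  0.5, -0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
    -0.5,  0.5, -0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
    # right
     0.5, -0.5, -0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
     0.5,  0.5, -0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
     0.5,  0.5,  0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
     0.5, -0.5,  0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
    # left
    -0.5,  0.5, -0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
    -0.5, -0.5, -0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
    -0.5, -0.5,  0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
    -0.5,  0.5,  0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
    # top
    -0.5,  0.5,  0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
     0.5,  0.5,  0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
     0.5,  0.5, -0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
    -0.5,  0.5, -0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
    # bottom
    -0.5, -0.5, -0.5,  1.0, 0.0, 0.0,  0.0, 0.0,
     0.5, -0.5, -0.5,  0.0, 1.0, 0.0,  1.0, 0.0,
     0.5, -0.5,  0.5,  0.0, 0.0, 1.0,  1.0, 1.0,
    -0.5, -0.5,  0.5,  1.0, 1.0, 1.0,  0.0, 1.0,
]

# OpenGL expects a flat buffer of 32-bit floats.
cube = np.array(cube, dtype=np.float32)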

Because we are working with an EBO (Element Buffer Object), we need to create indices for our cube; we also convert them to 32-bit integers.
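
A sketch of the index array, matching the 24 vertices in the sketch above: two triangles per face, stored as 32-bit unsigned integers.

indices = [ 0,  1,  2,   2,  3,  0,    # front
            4,  5,  6,   6,  7,  4,    # back
            8,  9, 10,  10, 11,  8,    # right
           12, 13, 14,  14, 15, 12,    # left
           16, 17, 18,  18, 19, 16,    # top
           20, 21, 22,  22, 23, 20]    # bottom

indices = np.array(indices, dtype=np.uint32)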

These are the vertex and fragment shaders

At the top of the vertex shader we have the version directive for the shader, then three input values for our position, color and texture coordinates, and two output values for our color and texture coordinates. In the fragment shader we have two input values, for the new color and the texture coordinates, as well as an output value for the color and a uniform variable.
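
Here is a sketch of the two shaders as Python strings. The attribute names ('position', 'color', 'InTexCoords'), the varying names, and the 'transform' and 'samplerTex' uniforms are assumptions chosen to match the rest of this walkthrough.

vertex_shader = """
#version 330
in vec3 position;
in vec3 color;
in vec2 InTexCoords;

out vec3 newColor;
out vec2 OutTexCoords;

uniform mat4 transform;

void main()
{
    gl_Position = transform * vec4(position, 1.0);
    newColor = color;
    OutTexCoords = InTexCoords;
}
"""

fragment_shader = """
#version 330
in vec3 newColor;
in vec2 OutTexCoords;

out vec4 outColor;
uniform sampler2D samplerTex;

void main()
{
    outColor = texture(samplerTex, OutTexCoords);
}
"""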

What is a Uniform Variable?

So a uniform is a global Shader variable declared with the “uniform” storage qualifier. These act as parameters that the user of a shader program can pass to that program. Their values are stored in a program object. Uniforms are so named because they do not change from one shader invocation to the next within a particular rendering call. This makes them unlike shader stage inputs and outputs, which are often different for each invocation of a shader stage.
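
For example (a minimal sketch, assuming the 'transform' uniform from the shader sketch above and a linked program object called shader, which is created a bit later in this article), a uniform is set from the Python side like this:

# Look up the uniform's location in the linked program, then upload a value;
# the same value is used for every vertex and fragment of the draw call.
transform_loc = glGetUniformLocation(shader, "transform")
glUniformMatrix4fv(transform_loc, 1, GL_FALSE, np.identity(4, dtype=np.float32))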

What Are Shaders?

Shaders are little programs that rest on the GPU. These programs are run for each specific section of the graphics pipeline. So In a basic sense, shaders are nothing more than programs transforming inputs to outputs. Shaders are also very isolated programs.

Vertex Shader

The vertex shader is a program on the graphics card that processes each vertex and its attributes as they appear in the vertex array.

Its duty is to output the final vertex position in device coordinates and to output any data the fragment shader requires.

That’s why the 3D transformation should take place here. The fragment shader depends on attributes like the color and texture coordinates, which will usually be passed from input to output without any calculations.

In our case the vertex shader applies the rotation transform to the position and simply passes the color and texture coordinates through to the fragment shader.

Fragment Shader 

The output from the vertex shader is interpolated over all the pixels on the screen covered by a primitive. These pixels are called fragments, and this is what the fragment shader operates on. Just like the vertex shader, it has one mandatory output: the final color of a fragment. It's up to you to write the code for computing this color from vertex colors, texture coordinates and any other data coming from the vertex shader.

Compile the program and shaders
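
A minimal sketch of compiling and linking the two shader strings with PyOpenGL's helper functions, assuming a GLFW window has already been created and made current as in the earlier articles:

from OpenGL.GL import *
from OpenGL.GL.shaders import compileProgram, compileShader

# Compile each stage, link them into one program object and make it active.
shader = compileProgram(compileShader(vertex_shader, GL_VERTEX_SHADER),
                        compileShader(fragment_shader, GL_FRAGMENT_SHADER))
glUseProgram(shader)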

The next step is to upload this vertex data to the graphics card. This is important because the memory on your graphics card is much faster and you won’t have to send the data again every time your scene needs to be rendered (about 60 times per second).

This is done by creating a Vertex Buffer Object (VBO):
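
A sketch of the VBO step, uploading the cube array from above into GPU memory:

VBO = glGenBuffers(1)
glBindBuffer(GL_ARRAY_BUFFER, VBO)
# cube.itemsize is 4 bytes per float32, so this is the buffer size in bytes.
glBufferData(GL_ARRAY_BUFFER, cube.itemsize * len(cube), cube, GL_STATIC_DRAW)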

Now it is time to create the EBO:
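
A sketch of the EBO step for the index data:

EBO = glGenBuffers(1)
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, EBO)
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indices.itemsize * len(indices), indices, GL_STATIC_DRAW)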

Next, get the position attribute from the shader:
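
A sketch of wiring up the 'position' attribute; each vertex in the buffer is 8 floats wide, with the position stored in the first 3 floats:

import ctypes

position = glGetAttribLocation(shader, 'position')
# 3 floats per position, stride of 8 floats per vertex, starting at byte offset 0.
glVertexAttribPointer(position, 3, GL_FLOAT, GL_FALSE, cube.itemsize * 8, ctypes.c_void_p(0))
glEnableVertexAttribArray(position)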

We also get the color and texture coordinate attributes from the shader:
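
And the same for 'color' and 'InTexCoords', offset by 3 and 6 floats (12 and 24 bytes) into each vertex:

color = glGetAttribLocation(shader, 'color')
glVertexAttribPointer(color, 3, GL_FLOAT, GL_FALSE, cube.itemsize * 8, ctypes.c_void_p(12))
glEnableVertexAttribArray(color)

texCoords = glGetAttribLocation(shader, 'InTexCoords')
glVertexAttribPointer(texCoords, 2, GL_FLOAT, GL_FALSE, cube.itemsize * 8, ctypes.c_void_p(24))
glEnableVertexAttribArray(texCoords)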

So now we create the texture:
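
A sketch of generating a texture object and binding it as the current 2D texture:

texture = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, texture)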

Next, set the texture wrapping parameters:
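
Something like this, repeating the texture on both the S (horizontal) and T (vertical) axes:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT)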

Then set the texture filtering parameters:
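
For example, linear filtering for both minification and magnification:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)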

Texture Wrapping

Texture coordinates usually range from (0,0) to (1,1) but what happens if we specify coordinates outside this range?  The default behavior of OpenGL is to repeat the texture images (we basically ignore the integer part of the floating point texture coordinate),

but there are more options OpenGL offers:

  • GL_REPEAT: The default behavior for textures. Repeats the texture image.
  • GL_MIRRORED_REPEAT: Same as GL_REPEAT but mirrors the image with each repeat.
  • GL_CLAMP_TO_EDGE: Clamps the coordinates between 0 and 1. The result is that higher coordinates become clamped to the edge, resulting in a stretched edge pattern.
  • GL_CLAMP_TO_BORDER: Coordinates outside the range are now given a user-specified border color.
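
For example, if you wanted mirrored repeating instead of the plain GL_REPEAT used for this cube (just an illustrative variation, not part of the tutorial's code), you would set:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT)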

Texture Filtering

So texture coordinates do not depend on resolution but can be any floating point value, thus OpenGL has to figure out which texture pixel (also known as a texel ) to map the texture coordinate to. This becomes especially important if you have a very large object and a low resolution texture.

You probably guessed by now that OpenGL has options for this texture filtering as well. There are several options available but for now we’ll discuss the most important options: GL_NEAREST and GL_LINEAR.

GL_NEAREST (also known as nearest neighbor filtering) is the default texture filtering method of OpenGL. When set to GL_NEAREST, OpenGL selects the texel whose center is closest to the texture coordinate.

Picture four texels with a cross marking the exact texture coordinate: the texel whose center is closest to that cross is the one chosen as the sampled color.

Here we load our texture image using the Pillow library:
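
A sketch of the loading step; the filename "crate.jpg" is just a placeholder for whatever 512 x 512 image you placed in your project directory:

from PIL import Image

image = Image.open("crate.jpg")             # placeholder filename
img_data = image.convert("RGBA").tobytes()  # raw RGBA bytes for OpenGL
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image.width, image.height,
             0, GL_RGBA, GL_UNSIGNED_BYTE, img_data)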

These lines of code rotate the cube around the x and y axes:
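
A sketch of the per-frame rotation inside the render loop, assuming pyrr and glfw are imported at the top of the file and that the vertex shader has the 'transform' uniform from the shader sketch above:

rot_x = pyrr.Matrix44.from_x_rotation(0.5 * glfw.get_time())
rot_y = pyrr.Matrix44.from_y_rotation(0.8 * glfw.get_time())

# Combine the two rotations and upload the result to the vertex shader every frame.
transform_loc = glGetUniformLocation(shader, "transform")
glUniformMatrix4fv(transform_loc, 1, GL_FALSE, rot_x * rot_y)

# Draw the cube using the index buffer bound earlier.
glDrawElements(GL_TRIANGLES, len(indices), GL_UNSIGNED_INT, None)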

Run the complete code and you will see a textured rotating cube.

Python Modern Opengl Texturing Rotating Cube


3 thoughts on “Python Modern Opengl Texturing Rotating Cube”

  1. Why have you declared the vertices for all faces separately instead of utilizing the indices to call them?

  2. This example did not run for me. I figured out why, based on this Stack Overflow answer:
    https://stackoverflow.com/questions/15639957/glgetattriblocation-returns-1-when-retrieving-existing-shader-attribute

    So I removed:
    position = glGetAttribLocation(shader, 'position')
    color = glGetAttribLocation(shader, 'color')
    texCoords = glGetAttribLocation(shader, "InTexCoords")

    and used this instead:
    position = 0
    color = 1
    texCoords = 2
    glBindAttribLocation(shader, position, 'position')
    glBindAttribLocation(shader, color, 'color')
    glBindAttribLocation(shader, texCoords, 'InTexCoords')

