A Preliminary Study of OpenGL Textures

1. Overview

First a question: what is a texture?

A texture is composed of large chunks of image data that can be painted onto the surface of an object to enhance its realism. (From the Objectives of Chapter 6 of the Red Book.)

One more question: what can it do?

As noted above, a texture is a large block of image data, so it can be used during the fragment processing stage: instead of supplying an RGBA color for every vertex, fragments are colored from texture data. This avoids the overhead and the difficulty of specifying the huge number of per-vertex RGBA values that complex coloring would otherwise require.

A texture is actually a 2D image (it can also be 1D or 3D) composed of texels, which usually contain color information.

The concept of texture mapping
A texture is like a piece of wallpaper printed with bricks: glue it onto a 3D house and the house looks like it has a brick wall. This is the concept of texture mapping.

To map a texture to a triangle, you need to specify which part of the texture corresponds to each vertex of the triangle. Each vertex is associated with a texture coordinate that indicates which part of the texture image to sample from (i.e. where to collect the fragment color);
the fragment shader then interpolates these coordinates across the triangle.

Texture coordinates are on the x and y axes, ranging from 0 to 1. Using texture coordinates to obtain texture colors is called sampling.

(Image: https://i.loli.net/2020/07/01/EX9DsMvnxO4uUPB.png)

float texCoords[] = {
	0.0f, 0.0f,
	1.0f, 0.0f,
	0.5f, 1.0f
};

2. Introduction to texture wrapping

The range of texture coordinates is from (0,0) to (1,1). What happens if texture coordinates are set outside that range?
OpenGL's default behavior is to repeat the texture image (for example, points such as (0,2) and (1,2) are textured by repeating the image).
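For illustration (these coordinate values are assumed, not from the original), texture coordinates above 1.0 trigger the wrapping behavior; with GL_REPEAT this triangle samples the texture twice along each axis where the coordinate reaches 2.0:

float repeatTexCoords[] = {
	0.0f, 0.0f,
	2.0f, 0.0f,  // > 1.0: with GL_REPEAT the texture repeats along the s axis
	1.0f, 2.0f   // > 1.0: with GL_REPEAT the texture repeats along the t axis
};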

Wrap mode           Description
GL_REPEAT           Default behavior; repeats the texture image
GL_MIRRORED_REPEAT  Same as GL_REPEAT, but each repetition is mirrored
GL_CLAMP_TO_EDGE    Clamps texture coordinates between 0 and 1; out-of-range coordinates repeat the edge texels, producing a stretched-edge effect
GL_CLAMP_TO_BORDER  Out-of-range coordinates take a user-specified border color

When texture coordinates are outside the default range, the visual effect is as follows:
(Image: https://i.loli.net/2020/07/01/jDdCV8bhp2lw6kA.png)

Each of these wrapping modes can be set per coordinate axis with the glTexParameter* family of functions. A 2D texture's axes are called s and t (analogous to x and y), so the wrap mode can be set independently for the s axis and the t axis:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT);

  1. The first parameter specifies the texture target; here it is a 2D texture (GL_TEXTURE_2D).

  2. The second parameter specifies which option to set and for which texture axis; here we set the wrapping for the S or T axis.

  3. The third parameter sets the wrap mode.

If the GL_CLAMP_TO_BORDER option is used, out-of-range coordinates take a user-specified border color, so the fv variant of glTexParameter is used to pass that color:

float borderColor[] = {1.0f, 1.0f, 1.0f, 0.0f};
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, borderColor);

3. Texture filtering

Texture coordinates do not depend on resolution and can be any floating-point value, so OpenGL needs to know how to map texels to texture coordinates.

Texture coordinates are specified per model vertex, i.e. they are bound to the model's vertex data.

A texel is a single pixel of the texture image.

OpenGL uses the vertices' texture coordinates to find the corresponding texels and extracts their colors by sampling; this completes the mapping from texture coordinates to texels.

When an object is large but its texture resolution is low, OpenGL has to decide which texel to use for a given texture coordinate, for example the texel whose center is closest to that coordinate. This decision is texture filtering.
The filtering methods are:

  1. GL_NEAREST (nearest-neighbor filtering) is OpenGL's default texture filtering method. With this setting, OpenGL selects the texel whose center is closest to the texture coordinate.
  2. GL_LINEAR (linear filtering) computes an interpolated value from the texels near the texture coordinate, approximating a color between those texels. This makes the image look smoother and less blocky.

The circles in the figure below represent texture coordinate points, and the squares on the right are the returned color values.

(Image: https://i.loli.net/2020/07/02/d1YBtwuQhsbKfe2.png)

When is each filtering method used?
Filtering can be set separately for magnification and minification. Nearest-neighbor filtering can be used when the texture is minified;
when the texture is magnified, linear filtering is the better choice, because blowing the image up with no interpolation makes it look blocky, while linear filtering keeps it smoother.

Use the glTexParameter* functions to specify the filtering for minification and magnification:

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);

Mipmaps (multi-level fade-out textures)

Distant objects use the same high-resolution texture as nearby objects, but they generate far fewer fragments. It is hard for OpenGL to pick the right color from a high-resolution texture for a distant fragment, because that fragment has to cover a large part of the texture with a single sample, which produces unrealistic results on small, distant objects. Using high-resolution textures for distant fragments also wastes memory.

To solve this sampling problem for distant objects, OpenGL uses the concept of mipmaps (multi-level fade-out textures).
The details are as follows:
A mipmap is a series of texture images in which each image is half the size of the previous one.
Once the distance from the observer exceeds a certain threshold, OpenGL switches to a different mipmap level; in other words, the distance decides how large a texture image is used. The farther away, the smaller the texture image (each level is half the previous one), as the picture below shows:

(Image: https://i.loli.net/2020/07/02/dqG34AQ6NWVtRhj.png)

Creating such a series of mipmap images by hand, as in the picture above, would be tedious. OpenGL provides the glGenerateMipmap function: after a texture has been created, calling it makes OpenGL generate all of the mipmap levels for us.
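As a small worked example (not in the original text): the halving rule determines how many mipmap levels a texture has. A 256x256 base image yields levels of 256, 128, 64, 32, 16, 8, 4, 2 and 1 texels per side, i.e. 9 levels in total. A minimal C++ sketch of that count:

#include <algorithm>
#include <cmath>

// Number of mipmap levels for a base image, following the halving rule:
// e.g. mipLevelCount(256, 256) == 9.
int mipLevelCount(int width, int height) {
	return 1 + static_cast<int>(std::floor(std::log2(std::max(width, height))));
}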

Switching between mipmap levels during rendering can produce visible, unrealistic seams between the levels.
Therefore the filtering methods introduced in Section 3 (Texture filtering) are extended with filtering between mipmap levels, using constants of the form

GL_XXX01_MIPMAP_XXX02

where XXX01 is the filter used to sample the texture within a level,
and XXX02 is the filter used between mipmap levels;

// pseudocode for what the XXX02 part means
if (XXX02 == NEAREST)
{
	cout << "use the nearest mipmap level" << endl;
} else {
	cout << "linearly interpolate between the two nearest mipmap levels" << endl;
}

Filter method              Description
GL_NEAREST_MIPMAP_NEAREST  Use the mipmap level that best matches the pixel size, and nearest-neighbor interpolation for texture sampling
GL_LINEAR_MIPMAP_NEAREST   Use the nearest mipmap level, and linear interpolation for sampling
GL_NEAREST_MIPMAP_LINEAR   Linearly interpolate between the two mipmap levels that best match the pixel size, and sample with nearest-neighbor interpolation
GL_LINEAR_MIPMAP_LINEAR    Linearly interpolate between the two nearest mipmap levels, and use linear interpolation for sampling

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_LINEAR);

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);

A common mistake is to set the magnification filter option to one of the mipmap filtering options. This is conceptually wrong, because mipmaps are mainly used when the texture is minified.
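For illustration (not from the original text): such a call is rejected by OpenGL with a GL_INVALID_ENUM error, since the magnification filter only accepts GL_NEAREST or GL_LINEAR.

// WRONG: mipmap filtering has no meaning for magnification; generates GL_INVALID_ENUM
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR_MIPMAP_LINEAR);

// Correct: use a plain filter for magnification
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);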

4. Load and create textures

Textures need to be loaded into an application before they can be used. Texture images can be stored in a variety of formats, each with its own data structure and arrangement, so how can these images be loaded into the application?
General solution:
Choose a desired file format such as .PNG, and then write an image loader to convert the image to a byte sequence;

But there are too many file formats to write a loader for every single one of them.

A better option:
Use an image-loading library that supports many popular formats,
for example the stb_image.h library.

#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

///......

Defining STB_IMAGE_IMPLEMENTATION makes the preprocessor include the relevant function definitions from the header, effectively turning stb_image.h into a .cpp file. You only need to include stb_image.h in your program and compile it.

To load images using stb_image.h, you need to use the stbi_load function:

int width, height, nrChannels;

unsigned char* data = stbi_load("xxx.jpg", &width, &height, &nrChannels, 0); // load the image into a byte array
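As a side note not in the original text: stb_image also provides a flag to flip images vertically at load time, since OpenGL expects the 0.0 texture coordinate on the y axis at the bottom of the image; if used, it should be called before stbi_load.

stbi_set_flip_vertically_on_load(true); // optional: flip loaded images on the y axis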

5. Generate textures

  1. Textures are also referenced by IDs; first create a texture object:

unsigned int texture;
glGenTextures(1, &texture);

  2. Bind the texture so that subsequent texture commands configure this texture object:

glBindTexture(GL_TEXTURE_2D, texture);

  3. With the texture bound, generate the texture from the previously loaded image data:
glTexImage2D(GL_TEXTURE_2D,//1
	0,//2
	GL_RGB,//3
	width,//4
	height,//5
	0,//6
	GL_RGB,//7
	GL_UNSIGNED_BYTE,//8
	data//9
	);
glGenerateMipmap(GL_TEXTURE_2D);

  1. The first parameter specifies the texture target. Setting it to GL_TEXTURE_2D means the texture is generated on the same target as the currently bound texture object (note the binding step above).

  2. The second parameter specifies the level of the multi-level fade-out texture for the texture, 0 is the base level

  3. The third parameter tells OpenGL which format to store the texture in. The image only has RGB values, so the texture is stored as RGB values

  4. The fourth and fifth parameters set the width and height of the final texture. The width and height were stored when the image was loaded before, so use the corresponding variables.

  5. The sixth parameter is a legacy setting and should always be 0.

  6. The seventh and eighth parameters define the format and data type of the source image; the image was loaded with RGB values and stored as an array of chars (bytes).

  7. The last parameter is the actual image data.

After the above call, the currently bound texture object has a texture image attached to it.
If you want to use mipmaps, call glGenerateMipmap, which automatically generates all of the required mipmap levels for the current texture.
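A hedged sketch (the channel-based selection is an assumption, not part of the original code): since stbi_load reports the channel count, the format passed to glTexImage2D can be chosen to match it, e.g. for PNGs with an alpha channel:

// assumption: the loaded image has either 3 (RGB) or 4 (RGBA) channels
GLenum format = (nrChannels == 4) ? GL_RGBA : GL_RGB;
glTexImage2D(GL_TEXTURE_2D, 0, format, width, height, 0, format, GL_UNSIGNED_BYTE, data);
glGenerateMipmap(GL_TEXTURE_2D);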

Texture generation process:

unsigned int textureId;
glGenTextures(1, &textureId);
glBindTexture(GL_TEXTURE_2D, textureId);

// configure texture wrapping and filtering
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// load and generate the texture
int width, height, channels;

unsigned char* data = stbi_load("cc.jpe", &width, &height, &channels, 0);

if(data){
	glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,
		width,height,0,GL_RGB,GL_UNSIGNED_BYTE,data);
	glGenerateMipmap(GL_TEXTURE_2D);
}else{
	std::cout<<"Failed to load texture"<<std::endl;
}
stbi_image_free(data);

6. Apply Textures

Adding a set of texture coordinates makes the vertex data grow further. Here an EBO and glDrawElements are used for drawing (see the sketch after the vertex array below), which avoids duplicating vertices and saves memory. For OpenGL to sample the texture, texture coordinates must be added to the vertex data, i.e. a mapping between texture coordinates and vertices:

float vertices[] = {
	// -- position --       -- color --        -- texture coords --
	 0.5f,  0.5f, 0.0f,     1.0f, 0.0f, 0.0f,   1.0f, 1.0f, // top right
	-0.5f,  0.5f, 0.0f,     0.0f, 1.0f, 0.0f,   0.0f, 1.0f, // top left
	 0.5f, -0.5f, 0.0f,     0.0f, 0.0f, 1.0f,   1.0f, 0.0f, // bottom right
	-0.5f, -0.5f, 0.0f,     1.0f, 1.0f, 0.0f,   0.0f, 0.0f  // bottom left
};
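A minimal sketch of the index data and indexed draw call mentioned above (the index order is an assumption based on the four vertices listed; the EBO setup itself is not shown):

unsigned int indices[] = {
	0, 1, 2,   // first triangle: top right, top left, bottom right
	1, 3, 2    // second triangle: top left, bottom left, bottom right
};
// ... after the indices have been uploaded into an element buffer object (EBO):
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, 0);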


An additional vertex attribute has been added, so the vertex format and the stride of each attribute need to be updated to match, as shown below:

glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, 8 * sizeof(float),
	(void*)(6 * sizeof(float)));
glEnableVertexAttribArray(2);
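For context, a sketch of the complete attribute layout with the 8-float stride (the attribute locations are assumed to match the vertex shader below):

// position: location 0, 3 floats, offset 0
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 8 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
// color: location 1, 3 floats, offset 3 floats
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 8 * sizeof(float), (void*)(3 * sizeof(float)));
glEnableVertexAttribArray(1);
// texture coordinates: location 2, 2 floats, offset 6 floats
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, 8 * sizeof(float), (void*)(6 * sizeof(float)));
glEnableVertexAttribArray(2);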

The corresponding vertex shader is as follows:

#version 330 core
layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aColor;
layout(location = 2) in vec2 aTexCoord;

out vec3 shareColor;
out vec2 texCoord;

void main(){
	gl_Position = vec4(aPos,1.0);
	shareColor = aColor;
	texCoord = aTexCoord;
}

The corresponding fragment shader is as follows:

#version 330 core
out vec4 FragColor;

in vec3 shareColor;
in vec2 texCoord;

uniform sampler2D shareTexture;

void main(){
	FragColor = texture(shareTexture,texCoord)*vec4(shareColor,1.0);
}

The main point to note above: sampler2D is a built-in GLSL data type for texture objects, called a sampler; it is how a texture is made available to the fragment shader.

GLSL also provides a built-in texture function for sampling a color from the texture: the first argument is the texture sampler and the second is the corresponding texture coordinates.
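For completeness, a hedged sketch of using the texture when drawing (shaderProgram, VAO, and texture are placeholder names; the VAO/VBO/EBO setup is assumed, not shown in the original):

glBindTexture(GL_TEXTURE_2D, texture);  // bind the texture to be sampled
glUseProgram(shaderProgram);            // use the shader pair shown above
glBindVertexArray(VAO);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, 0);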

7. Texture units

A texture unit is the location value of a texture. A location value can be assigned to a texture sampler with the glUniform1i function, so that multiple textures can be used within one fragment shader.
By assigning texture units to samplers, several textures can be bound at the same time; a texture is selected by activating its texture unit before binding. Combining multiple textures makes some interesting effects possible.

Activating and binding textures:

glActiveTexture(GL_TEXTURE0); // GL_TEXTURE0 is active by default, e.g. when only one texture is used
glBindTexture(GL_TEXTURE_2D, texture0);


// second texture unit
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, texture1);

// ... up to 16 units, GL_TEXTURE0 through GL_TEXTURE15
glActiveTexture(GL_TEXTURE15);
glBindTexture(GL_TEXTURE_2D, texture15);
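A minimal sketch of assigning texture units to sampler uniforms with glUniform1i (the uniform names and shaderProgram are assumptions, not from the original):

glUseProgram(shaderProgram);
// tell each sampler uniform which texture unit it reads from
glUniform1i(glGetUniformLocation(shaderProgram, "texture0"), 0); // unit GL_TEXTURE0
glUniform1i(glGetUniformLocation(shaderProgram, "texture1"), 1); // unit GL_TEXTURE1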

Reference

learnopengl-cn
