Welcome to the OpenGL ES 2.0 Primer. The purpose of this article is to give a solid overview of the core concepts of OpenGL and its shading language (GLSL), and to provide a foundation for further learning. While there are many tutorials covering the various versions and subsystems of OpenGL, it still has a reputation for being complex and unintuitive. OpenGL is actually a fairly simple API once you wrap your head around its overall design; by looking at it from a higher level you can gain greater understanding of, and confidence in, its lower-level details.

What the primer is

A look at the core concepts of OpenGL ES, broken into groups of functionality. Some common functions are listed, and OpenGL ES 2.0's shader system is explained, with example GLSL source code provided to demonstrate real-world usage.

What the primer isn't

A step-by-step "how to set up OpenGL" tutorial with full source code. There are plenty of these online already, and most development environments ship with an OpenGL template that can just as easily be followed alongside an article such as this.


In the beginning:

OpenGL ES 1.1 (fixed function)

OpenGL ES is a subset of the OpenGL 3D API designed specifically for embedded systems. OpenGL ES 1.1 is available on all iOS devices, most Android devices, the Nintendo 3DS and many more. OpenGL provides a common programming interface for rendering hardware-accelerated 2D and 3D graphics, along with numerous capabilities such as blending, multi-texturing and lighting. OpenGL is neither a game engine nor a GUI system; however, it can be used to build both of these things. It is scalable enough in its complexity that it can be used to build sophisticated engines as well as quick prototypes.

On most platforms OpenGL ES is offered as a C API. This in itself can throw some people off, especially those coming from a strictly OOP background. However, as the following primer will hopefully illustrate, once you look past the seemingly never-ending function/constant list, its architecture can be broken down into just a handful of sub-groups.

Two types of data

Generally speaking, OpenGL deals with only two types of data: geometry data in the form of vertices, and pixel data in the form of textures and render buffers. OpenGL takes a collection of vertices, each defining a position as well as other optional attributes such as color and texture coordinates, then renders them in one form or another to a frame buffer which can be displayed onscreen.

Behind the scenes, OpenGL manages its own memory, and data is copied into it from the application by pointer. For example, to create a texture you tell OpenGL to generate a new texture, then tell it to allocate memory of a given size and format, passing a pointer to your image data which OpenGL copies into its internal texture memory. With vertex data you can take a similar approach: tell OpenGL to allocate a new VBO (vertex buffer object) and pass a pointer to your vertex data, which OpenGL will then copy into the VBO. Alternatively, you can tell OpenGL to copy vertex data on render, avoiding the need for VBOs. In this case you still pass your vertex data pointer to OpenGL, but it waits until a render function is called to copy the data into its internal memory.

The following listing shows a very simple way of defining geometry data. First, 2D and 3D vector types are defined; then a vertex type is defined, using the 3D vector and 2D vector for its position and texture coordinate attributes respectively. A vertex array is then declared: since a cube has six faces and each face is composed of two triangles, the array needs 12 x 3 elements (36 vertices).

				typedef struct {
					float x;
					float y;
				} vector2;
				typedef struct {
					float x;
					float y;
					float z;
				} vector3;
				typedef struct {
					vector3 position;
					vector2 textureCoordinate;
				} vertex;
				#define CUBE_TRIANGLE_COUNT	12
				vertex cube[CUBE_TRIANGLE_COUNT * 3];

Once the vertices in the array have been initialized accordingly, i.e. had their position and texture coordinates set to represent the geometry of the cube, this data would be ready to pass to OpenGL for rendering.
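To make this concrete, the following sketch initializes the front face of a unit cube, i.e. the first two triangles of the array (the structs are repeated here so the snippet stands alone; the exact positions and texture coordinates are illustrative, and the remaining five faces would be filled in the same way):

```c
typedef struct { float x, y; } vector2;
typedef struct { float x, y, z; } vector3;
typedef struct { vector3 position; vector2 textureCoordinate; } vertex;

#define CUBE_TRIANGLE_COUNT 12
vertex cube[CUBE_TRIANGLE_COUNT * 3];

/* Initialize the front face (z = +1) of a unit cube as two triangles. */
void initFrontFace(vertex *v)
{
	const vector3 positions[6] = {
		{ -1.0f, -1.0f, 1.0f }, {  1.0f, -1.0f, 1.0f }, {  1.0f,  1.0f, 1.0f },  /* triangle 1 */
		{ -1.0f, -1.0f, 1.0f }, {  1.0f,  1.0f, 1.0f }, { -1.0f,  1.0f, 1.0f }   /* triangle 2 */
	};
	const vector2 texCoords[6] = {
		{ 0.0f, 0.0f }, { 1.0f, 0.0f }, { 1.0f, 1.0f },
		{ 0.0f, 0.0f }, { 1.0f, 1.0f }, { 0.0f, 1.0f }
	};
	for (int i = 0; i < 6; i++) {
		v[i].position = positions[i];
		v[i].textureCoordinate = texCoords[i];
	}
}
```

Note the counter-clockwise winding order, which OpenGL treats as front-facing by default.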

Three types of function

The functions that OpenGL ES 1.1 provides can be roughly split into three categories: data handling, state settings and rendering. The data handling functions tell OpenGL to allocate/copy/free internal memory. The settings functions get/set OpenGL state. Lastly, the rendering functions tell OpenGL to render using the current data and settings.

The following listings show a few commonly used functions in each category. While these lists are in no way extensive they should give you a good idea of basic rendering requirements and how they are managed in OpenGL ES 1.1.

Data handling functions

Persistent OpenGL data such as frame buffers, textures and VBOs require basic memory management in the form of their provided creation/deletion functions, while vertex data is passed by enabling the relevant client-state arrays and setting their data pointers.

				glEnableClientState( arrayName);					// enables a vertex attribute array e.g. GL_VERTEX_ARRAY
				glDisableClientState( arrayName);					// disables a vertex attribute array
				glVertexPointer( size, type, stride, pointer);		// setup the vertex positions pointer
				glColorPointer( size, type, stride, pointer);		// setup the vertex colors pointer
				glTexCoordPointer( size, type, stride, pointer);	// setup the vertex texture coordinates pointer
				glGenTextures( size, names);		// generates one or more textures, size is number of textures
													// and names is an array in which to store the texture names (numeric IDs)
				glDeleteTextures( size, names);		// deletes one or more textures, size is number of textures
													// and names is an array of existing texture names (numeric IDs)
				// texture image data can be set by binding a generated texture with glBindTexture() then using
				// the glTexImage2D() function to inform OpenGL of the image data to copy
				glTexImage2D( target, level, internalFormat, width, height, border, format, type, data);

OpenGL settings functions

OpenGL ES 1.1 provides lots of built-in capabilities that can be enabled/disabled as needed. Nothing comes for free: if you don't need the lighting system, disable it; if you don't need blending, disable it. Aside from simply enabling/disabling capabilities, most also expose settings through specific functions; for example, to customize the individual lights' settings you can use the glLightf()/glLightfv() functions.

				glEnable( capability);			// enable an OpenGL capability
				glDisable( capability);			// disable an OpenGL capability
												// for example GL_BLEND, GL_DEPTH_TEST, GL_LIGHTING
				glMatrixMode( mode);			// sets the current matrix, all matrix functions will then operate on this matrix
				glLoadIdentity();				// set the current matrix to identity
				glTranslatef( x, y, z);			// translates (offsets) the current matrix by x/y/z
				glRotatef( angle, x, y, z);		// rotates the current matrix by angle around the x/y/z axis
				glActiveTexture( textureUnit);				// sets the current texture unit, all texture functions will
															// then operate on this texture unit
				glBindTexture( target, name);				// binds the texture 'name' to 'target' of the current texture unit
															// (for OpenGL ES 1.1 target must be GL_TEXTURE_2D)
				glTexParameteri( target, pName, pValue);	// sets a texture parameter for the current texture unit
				glLightfv( light, pName, pValue);	// sets a parameter on one of OpenGL's lights
													// light is the light to affect, pName is a parameter name constant
													// and pValue is the actual value you wish to assign to the parameter
													// e.g. glLightfv( GL_LIGHT0, GL_DIFFUSE, diffuseColor);
				glFogfv( pName, pValue);			// sets a fog parameter. pName is a parameter name constant and
													// pValue is the actual value you wish to assign to the parameter
													// e.g. glFogfv( GL_FOG_COLOR, fogColor);

Rendering functions

There are only two functions that perform actual rendering in OpenGL ES 1.1; both render the current vertex data in a variety of modes using the current OpenGL settings. The difference between the two is that one expects vertex data to be explicitly ordered, while the other uses an array of indices into the vertex data to specify order. The benefit of using indices is that geometry will often have many shared vertices; by defining only the unique vertices and then using indices to reference them, you can minimize the amount of data copied and stored by OpenGL.

				// both render functions can specify a mode that defines how vertices are to be rendered, the various modes
				// expect vertices to be ordered in a specific way. For example:
				//    GL_TRIANGLES - every 3 vertices represents a single triangle
				//    GL_TRIANGLE_STRIP - a strip of connected triangles, only unique vertices are specified
				//    GL_POINTS - each vertex represents a single point sprite
				glDrawArrays( mode, first, count);					// renders the current vertex data using array vertex order
				glDrawElements( mode, count, type, indices);		// renders the current vertex data using indices into
																	// the vertex arrays rather than array vertex order
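To illustrate the saving, here is a minimal sketch of a single quad defined the indexed way (the vector3 type matches the one defined earlier; the commented draw call shows how the index array would be used). Four unique vertices plus six 16-bit indices take less memory than six full vertices:

```c
/* A quad rendered with GL_TRIANGLES needs 6 vertices, but only 4 are
   unique -- the two triangles share an edge. With glDrawElements() the
   4 unique vertices are stored once and a 6-entry index array picks
   them out in triangle order. */
typedef struct { float x, y, z; } vector3;

vector3 quad[4] = {
	{ -1.0f, -1.0f, 0.0f },  /* 0: bottom-left  */
	{  1.0f, -1.0f, 0.0f },  /* 1: bottom-right */
	{  1.0f,  1.0f, 0.0f },  /* 2: top-right    */
	{ -1.0f,  1.0f, 0.0f }   /* 3: top-left     */
};

/* two triangles: 0-1-2 and 0-2-3 */
unsigned short quadIndices[6] = { 0, 1, 2, 0, 2, 3 };

/* the equivalent draw call would be:
   glDrawElements( GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, quadIndices); */
```

For a lone quad the saving is small, but for dense meshes where each vertex is shared by several triangles it adds up quickly.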

Rendering the cube

The following listing shows how to use a few previously described functions from the three groups to render our cube vertex data. The code tells OpenGL ES 1.1 to use a specific texture for rendering, then sets up the vertex data, then sets up the modelview and projection matrices and finally tells it to render.

				// setup the first texture unit to use the texture
				glActiveTexture( GL_TEXTURE0);
				glEnable( GL_TEXTURE_2D);
				glBindTexture( GL_TEXTURE_2D, textureName);
				// enable and send vertex position attribute data
				glEnableClientState( GL_VERTEX_ARRAY);
				glVertexPointer( 3, GL_FLOAT, sizeof(vertex), &cube[0].position);
				// enable and send vertex texture coordinate attribute data
				glEnableClientState( GL_TEXTURE_COORD_ARRAY);
				glTexCoordPointer( 2, GL_FLOAT, sizeof(vertex), &cube[0].textureCoordinate);
				// setup projection matrix
				glMatrixMode( GL_PROJECTION);
				glLoadIdentity();
				glFrustumf( left, right, bottom, top, zNear, zFar);
				// setup modelview matrix
				glMatrixMode( GL_MODELVIEW);
				glLoadIdentity();
				glTranslatef( 0.0f, 0.0f, -10.0f);
				// render
				glDrawArrays( GL_TRIANGLES, 0, CUBE_TRIANGLE_COUNT * 3);

And then there was light:

OpenGL ES 2.0 (programmable pipeline)

OpenGL ES 2.0 is the next (and current) version of OpenGL for embedded systems; aside from introducing many new capabilities, it also removes a significant chunk of the OpenGL ES 1.1 pipeline in favor of programmer customization. OpenGL ES 2.0 is available on all iOS devices from the iPhone 3GS and iPad up, most Android devices from 2.0 up and many more. OpenGL ES 2.0 is also the API behind WebGL, enabling high quality 3D and 2D graphics via JavaScript in the web browser. With support on such a wide range of platforms, OpenGL ES 2.0 is both an exciting and highly valuable skillset to possess.

The topics covered in the first half of this primer still exist and are relevant to OpenGL ES 2.0; however, anything that can now be produced using the programmable pipeline has had its fixed functionality removed. Capabilities such as matrix operations, the lighting system and fog no longer exist in OpenGL ES 2.0, since they can easily be reproduced by the programmer if required, and since they are no longer fixed they can be customized in ways that were simply not possible with OpenGL ES 1.1.
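As an example of reproducing the removed functionality, here is a minimal sketch of two matrix helpers that stand in for glLoadIdentity() and glTranslatef() (the function names are my own). OpenGL expects 4x4 matrices as 16 floats in column-major order, so the translation lives in elements 12-14, and the result would be uploaded to a shader with glUniformMatrix4fv():

```c
/* Replacement for the removed fixed-function matrix operations:
   build a 4x4 matrix as 16 floats in OpenGL's column-major layout. */
void matrixIdentity(float *m)
{
	for (int i = 0; i < 16; i++) {
		m[i] = (i % 5 == 0) ? 1.0f : 0.0f;  /* indices 0,5,10,15 are the diagonal */
	}
}

void matrixTranslate(float *m, float x, float y, float z)
{
	matrixIdentity(m);
	m[12] = x;  /* column 3 holds the translation */
	m[13] = y;
	m[14] = z;
}
```

A full matrix library would add rotation, scaling, multiplication and a frustum/perspective builder, but they all follow this same pattern of filling a 16-float array.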

One program, two shaders

With the core functionality removed, nothing in OpenGL ES 2.0 can be rendered without using a custom program. A program is made up of two shaders: the vertex shader and the fragment shader. First, the vertex shader processes every vertex that has been passed to OpenGL, transforming positions into clip space and setting up any values that should be interpolated across the primitive's surface, such as colors or texture coordinates. OpenGL then performs some magic (rasterization) and calls the fragment shader for every visible fragment, providing the interpolated values for that fragment.

All these steps are of course highly customizable. Vertex data is defined by declaring attributes in the vertex shader, such as position, color and texture coordinate. Interpolated values are defined by declaring and writing varyings in the vertex shader, then declaring and reading them in the fragment shader. Program properties can also be defined by declaring uniforms, which can be set from the application level and read by both the vertex and fragment shaders. For example, a tint uniform could be used to control a global tint color from the application.

The following diagram illustrates the roles of the vertex and fragment shaders, what they can read and what they should write.

This may appear complex at first but as you will see shortly the code for a basic transform, projection and texturing program is very simple with the majority of code spent declaring the attributes, uniforms and varyings.

The vertex shader

In the basic example below the vertex position is transformed and projected using the modelview and projection matrices which are passed to the shader from the application via its uniforms. The texture coordinate attribute is simply assigned to the varying so that the interpolated coordinate can be read by the fragment shader.

				attribute vec4 position;				// vertex position attribute
				attribute vec2 texCoord;				// vertex texture coordinate attribute
				uniform mat4 modelView;					// shader modelview matrix uniform
				uniform mat4 projection;				// shader projection matrix uniform
				varying vec2 texCoordVar;				// vertex texture coordinate varying
				void main()
				{
					vec4 p = modelView * position;		// transform vertex position with the modelview matrix
					gl_Position = projection * p;		// project the transformed position and write it to gl_Position
					texCoordVar = texCoord;				// assign the texture coordinate attribute to its varying
				}

The fragment shader

Continuing the basic vertex shader example this accompanying fragment shader simply uses the interpolated texture coordinate to sample a texture and write the output to gl_FragColor. The texture2D() function takes two parameters, a texture and a coordinate and returns the sampled pixel value at the coordinate.

				precision mediump float;		// set default precision for floats to medium
				uniform sampler2D texture;		// shader texture uniform
				varying vec2 texCoordVar;		// fragment texture coordinate varying
				void main()
				{
					// sample the texture at the interpolated texture coordinate
					// and write it to gl_FragColor
					gl_FragColor = texture2D( texture, texCoordVar);
				}

Program, meet application

Communication between the application and the program is a simple process and introduces just a few new functions that share similarities with their OpenGL ES 1.1 fixed function counterparts.

				glUseProgram( program);		// bind the program, all program functions will then operate on
											// this program and it will be used for any following rendering
				// uniforms are set using the glUniform*() group of functions, all of which take a location parameter
				// that specifies the uniform to set. Uniform locations can be retrieved with glGetUniformLocation()
				glUniform4f( location, x, y, z, w);							// set uniform as a vec4 (4D vector)
				glUniformMatrix4fv( location, count, transpose, values);	// set uniform as a mat4 (4x4 matrix)
				// just like OpenGL ES 1.1 you pass vertex data by enabling the specific vertex attribute and giving
				// OpenGL a pointer to the data. Since the vertex attributes a program accepts are customizable 
				// you use the following two functions to enable and pass attribute data.
				// index is the number the attribute was bound to at startup using glBindAttribLocation()
				glEnableVertexAttribArray( index);
				glVertexAttribPointer( index, size, type, normalized, stride, data);

To illustrate just how similar the process is, compare the OpenGL ES 1.1 rendering example at the beginning of this article with the following listing. The main steps are the same except we enable/pass the vertex data using the generic glEnableVertexAttribArray() and glVertexAttribPointer() functions and the fixed function matrix operations have been replaced with our own matrices passed to the program through its uniforms.

				// setup the first texture unit to use the texture
				glActiveTexture( GL_TEXTURE0);
				glBindTexture( GL_TEXTURE_2D, textureName);
				// enable the program
				glUseProgram( basicProgram);
				// enable and send vertex position attribute data
				glEnableVertexAttribArray( positionIndex);
				glVertexAttribPointer( positionIndex, 3, GL_FLOAT, GL_FALSE, sizeof(vertex), &cube[0].position);
				// enable and send vertex texture coordinate attribute data
				glEnableVertexAttribArray( textureCoordIndex);
				glVertexAttribPointer( textureCoordIndex, 2, GL_FLOAT, GL_FALSE, sizeof(vertex), &cube[0].textureCoordinate);
				// set uniforms
				glUniformMatrix4fv( modelViewLocation, 1, GL_FALSE, modelViewMatrix);		// set modelView matrix
				glUniformMatrix4fv( projectionLocation, 1, GL_FALSE, projectionMatrix);		// set projection matrix
				glUniform1i( textureLocation, 0);											// set texture unit to sample
				// render
				glDrawArrays( GL_TRIANGLES, 0, CUBE_TRIANGLE_COUNT * 3);

Next steps

This primer has hopefully given you a good overview of how OpenGL ES operates; however, many things have been skipped for the sake of clarity. Your next steps should be to investigate your platform's OpenGL ES template and/or a simple example application. Look for functions and steps that you recognize from this primer, and use the OpenGL ES 2.0 reference card (mentioned in the following quick tips section) to explain anything you don't recognize.

Nearly everything in OpenGL uses the same naming conventions and approaches that you have read so far and once you have your basic example application up and running you can learn a great deal through experimentation. Change OpenGL state settings, play around with your basic vertex and fragment shader code and see how it all affects rendered output. Above all else have fun!

If you have any further questions or suggestions regarding this primer or OpenGL ES in general I can be reached at: ben@kode80.com

Quick tips

Using normalized coordinates

When rendering a fullscreen quad there is no need for the vertex shader to perform position transformation or projection. By providing the quad's vertex positions in normalized device coordinates (-1 to 1), the vertex shader can simply assign the position attribute to gl_Position. This can be particularly useful when starting out, as it removes your matrix library from the equation and lets you focus on basic OpenGL setup and rendering; once you know the basics are set up and you have a program displaying something, you can move on to adding matrices.
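As a sketch (the names are illustrative), such a quad can be defined as a four-vertex triangle strip in normalized device coordinates, with the matching pass-through vertex shader stored alongside it:

```c
/* A fullscreen quad as a 4-vertex GL_TRIANGLE_STRIP. The positions are
   already in normalized device coordinates (-1 to 1), so the vertex
   shader can assign them straight to gl_Position with no matrices. */
typedef struct { float x, y; } vector2;

vector2 fullscreenQuad[4] = {
	{ -1.0f, -1.0f },  /* bottom-left  */
	{  1.0f, -1.0f },  /* bottom-right */
	{ -1.0f,  1.0f },  /* top-left     */
	{  1.0f,  1.0f }   /* top-right    */
};

/* matching pass-through vertex shader source */
const char *passThroughVS =
	"attribute vec2 position;\n"
	"void main()\n"
	"{\n"
	"    gl_Position = vec4( position, 0.0, 1.0);\n"
	"}\n";
```

Rendered with glDrawArrays( GL_TRIANGLE_STRIP, 0, 4), this covers the entire viewport regardless of resolution.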

Use varyings to minimize fragment calculations

In most cases the vertex shader will run significantly less often than the fragment shader, since there are usually far more pixels to process than vertices. Many calculations can be moved from the fragment shader to the vertex shader and interpolated using varyings, rather than calculated per-fragment, with minimal loss in visual fidelity.
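As an illustration, this sketch of a vertex shader computes a simple per-vertex diffuse term once and hands it to the fragment shader through a varying; the normalMatrix and lightDirection uniforms are assumed to be supplied by the application (GLSL ES 1.00 has no mat3-from-mat4 constructor, hence the separate normal matrix):

```glsl
// per-vertex diffuse lighting computed once in the vertex shader,
// then interpolated across the triangle via a varying -- instead of
// re-evaluating the dot product for every fragment
attribute vec4 position;
attribute vec3 normal;
uniform mat4 modelView;
uniform mat4 projection;
uniform mat3 normalMatrix;			// upper 3x3 of the modelview, supplied by the application
uniform vec3 lightDirection;		// normalized light direction in view space
varying float diffuseVar;
void main()
{
	gl_Position = projection * (modelView * position);
	vec3 n = normalize( normalMatrix * normal);
	diffuseVar = max( dot( n, lightDirection), 0.0);
}
```

The fragment shader then simply multiplies its sampled color by diffuseVar rather than evaluating the lighting per-fragment.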

Avoid calculating texture coordinates in the fragment shader

Calculating texture coordinates in the vertex shader allows OpenGL to optimize texture sampling in the fragment shader. Sampling a texture in the fragment shader using anything other than a varying as the coordinate (a so-called dependent texture read) will result in a loss of performance on many GPUs.

Know your hardware

While OpenGL's purpose is to hide hardware details from the programmer through a common programming interface, thought should still be given to the platform your application will run on. The platform-specific implementation will make certain things faster or slower than on other platforms, and research should be done into these inconsistencies. For example, Apple provides detailed 'best practices' documentation for OpenGL ES on its iOS devices.

Download the OpenGL ES 2.0 reference card

This is an indispensable four-page cheat-sheet that lists every OpenGL ES 2.0 constant and function, including full GLSL coverage. Download it by visiting the Khronos Group's site linked at the bottom of this article and clicking the Reference Card button.

Learn by creating wrappers

A great way of getting comfortable with OpenGL ES 2.0 is creating wrappers in your favorite language for various common tasks. For example creating a Program class that manages compilation/linking/usage of a program will not only teach you about the process but also greatly cut down on the amount of setup time and code in future projects.

Learn with GLSL Studio

I designed GLSL Studio to make learning GLSL and creating OpenGL ES 2.0 programs as user friendly as possible. All the things covered in this primer are handled through the app's GUI, leaving you to focus on shader development. For example, a variety of customizable geometry can be generated in-app for use with your programs, and common 3D file formats can be imported. Many example programs are bundled with the app, such as 3D lighting, geometry deformation and realtime camera effects, all of which can be edited, exported and learnt from.

To learn more about GLSL Studio, check out the main page.

System requirements: iOS 4.0 or higher. Compatible with: iPhone 3GS, iPod 4th Gen, iPad or higher.