Graphics Assignment 2


For this graphics assignment we were tasked with building an application that featured Cameras, Loading Texture Files, Loading 3D Models, and Shaders. My approach was to first create a simple model in Blender, add a texture to that model and then export it in the Wavefront .obj file format.

Setting up OpenGL

To set up OpenGL I used two libraries. The first was GLEW (2017), which provides function pointers to OpenGL functions so that I can call them without declaring a function pointer each time. The second was SFML (2017), which I used to open an OpenGL context (a window to render to). I could also have used GLUT (Khronos, 2017), SDL (2017), GLFW (2017) and many others, but I chose SFML because I have been using it in my other projects, so I was accustomed to setting it up.

int main()
{
	const int WindowWidth = 1600, WindowHeight = 900;
	sf::RenderWindow window(sf::VideoMode(WindowWidth, WindowHeight), "Graphics Assignment - Elliot");

I also initialised GLEW straight afterwards which allowed me to use OpenGL functions.

GLenum glewErr = glewInit();
if (GLEW_OK != glewErr)
{
	std::cout << "Failed to init GLEW: " << glewGetErrorString(glewErr) << std::endl;
}

I also needed a loop that ran whilst the program was open so that I could render on each iteration of that loop.

// Main loop
while (window.isOpen())
{
	// Event loop
	sf::Event event;
	while (window.pollEvent(event))
	{
		if (event.type == sf::Event::Closed)
		{
			window.close();
		}
	}

	GLfloat radius = 10.0f;
	GLfloat camX = sin(clock.getElapsedTime().asSeconds()) * radius;
	GLfloat camZ = cos(clock.getElapsedTime().asSeconds()) * radius;
	view = glm::lookAt(glm::vec3(camX, 0.0, camZ), glm::vec3(0.0, 0.0, 0.0), glm::vec3(0.0, 1.0, 0.0));

	// Clear the back buffer with the supplied colour
	glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	// OpenGL rendering
	Render(VAOHandle, textureBufferHandle, shaderProgramHandle, monkey.glVertices.size());

	// Swap the buffers
	window.display();
}

If you look at the first code block inside the loop you can see the event-handling loop. I didn't make much use of events because they weren't related to the assignment, but the one event I do handle is the closing of the window; without it, many of the resources in use wouldn't be freed.

The next code block is for some camera operations that I will get to later on.

After that, you can see me clearing the window, which effectively replaces the back buffer with the data of the supplied colour. I then call a function named Render that I use to draw my model; that effectively writes to the back buffer. Finally I call display, which swaps the buffers so that the back buffer becomes the front buffer, ready to be displayed, and the front buffer becomes the back buffer, ready to be written to.

After that I simply clean up some resources and close the program.

	// Flag the shaders to be deleted when detached from the program
	glDeleteShader(vertexShaderHandle);
	glDeleteShader(fragmentShaderHandle);

	// Detach shaders and free up any memory taken by the program.
	glDetachShader(shaderProgramHandle, vertexShaderHandle);
	glDetachShader(shaderProgramHandle, fragmentShaderHandle);
	glDeleteProgram(shaderProgramHandle);

This provides the basic structure for my program.

Loading the .obj Model

I wrote an OBJ class that provided the functionality needed to load an .obj file from a filepath supplied and convert the data to a form usable by OpenGL.

#pragma once
#include <string>
#include <vector>
#include <fstream>
#include <iostream>
#include <sstream>

#include <glm/vec2.hpp>
#include <glm/vec3.hpp>

#include "Face.h"
#include "StringUtils.h"

class OBJ
{
public:
	static OBJ LoadOBJ(std::string filepath);

	std::vector<glm::vec3> glVertices;
	std::vector<glm::vec2> glUVs;
	std::vector<glm::vec3> glNormals;

	void Convert();

private:
	std::vector<glm::vec3> vertices;
	std::vector<glm::vec2> uvs;
	std::vector<glm::vec3> normals;
	std::vector<Face> faces;
};

The main structure of the class was that it contains a public set of vertices, UV coordinates and normals that can be used by my shaders to produce the model. You can see there are two types of these sets in the class. One prepended with ‘gl’ and one without. The ones prepended with ‘gl’ are intended to be used by the shaders and so they’re publicly accessible. The others act as storage for the raw data loaded in from the .obj file.

Below is the implementation of my function to load an .obj model.

OBJ OBJ::LoadOBJ(std::string filepath)
{
	OBJ output;

	std::ifstream in(filepath, std::ios_base::in);
	if (in.is_open())
	{
		std::string line = "";
		while (getline(in, line))
		{
			if (line.substr(0, 2) == "v ")
			{
				std::istringstream stream(line.substr(2));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.vertices.push_back(vector);
			}
			else if (line.substr(0, 2) == "vt")
			{
				std::istringstream stream(line.substr(3));
				glm::vec2 vector;
				stream >> vector.x;
				stream >> vector.y;
				output.uvs.push_back(vector);
			}
			else if (line.substr(0, 2) == "vn")
			{
				std::istringstream stream(line.substr(3));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.normals.push_back(vector);
			}
			else if (line.substr(0, 2) == "f ")
			{
				std::istringstream stream(line.substr(2));
				Face face;
				for (int i = 0; i < 3; i++)
				{
					std::string temp;
					stream >> temp;

					std::vector<std::string> splitString = StringUtils::split(temp, '/');

					face.vertexIndices[i] = std::stoi(splitString[0]);
					face.textureIndices[i] = std::stoi(splitString[1]);
					face.normalIndices[i] = std::stoi(splitString[2]);
				}
				output.faces.push_back(face);
			}
		}
	}
	else
	{
		std::cerr << "Failed to load file: " << filepath << std::endl;
	}

	return output;
}

Initially I construct an output OBJ object named output. I then use std::ifstream (input file stream) to attempt to load a file from the supplied filepath. The second parameter of the std::ifstream is optional but specifies that only reading the data is necessary and not writing.

Next, I check that the file has opened successfully. If it has I create a std::string named line that will hold the data of each line read from the file so that it can be processed. I used getline, which takes a line from the file stream and stores it in the line variable. This happens inside a while loop which ends once the file has no more lines to read.

After that, I have all the functionality for processing the line. After reading a Wikipedia article (Wikipedia, 2017) on Wavefront .obj files, I learned that each line inside an .obj file represents something based on the first word of the line. Once I opened my model up in a text editor I was able to see the data it contained.

The first piece of data was labelled mtllib. This contains the filename of the 'material' file associated with my model. I'll talk about that more later.

mtllib monkey.mtl

It also contained hundreds of vertices such as this one. The first number represents the x coordinate of the vertex, the second number the y, and the third number the z.

v 0.437500 -0.765625 0.164062

After the list of vertices, there are the lines that represent the vertex texture coordinates. The first number is the u texture coordinate, and the second number is the v texture coordinate.

vt 0.0000 0.0000

After that list, we get the vertex normal data. The same rules apply here.

vn 0.9693 -0.2456 -0.0118

Finally, there is the face data. Each face contains 3 sets of data, each with 3 numbers separated by the '/' character. The first number is the vertex index, the second is the texture index and the third is the normal index. These are 1-based indices into the corresponding lists earlier in the file, so 47 means that the face uses the 47th vertex in the list.

f 47/1/1 3/2/2 45/3/3

So when loading the model, I simply looped through each line, looking for either 'v', 'vt', 'vn' or 'f', and added the data to a std::vector so that I could process it.
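The StringUtils::split helper used in the face-parsing code isn't listed above; a minimal sketch of how such a helper could look (the actual implementation may differ) is:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical sketch of StringUtils::split: breaks a string such as
// "47/1/1" into {"47", "1", "1"} on the given delimiter character.
namespace StringUtils
{
	std::vector<std::string> split(const std::string &input, char delimiter)
	{
		std::vector<std::string> tokens;
		std::istringstream stream(input);
		std::string token;
		// getline with a delimiter reads up to (but not including) it
		while (std::getline(stream, token, delimiter))
		{
			tokens.push_back(token);
		}
		return tokens;
	}
}
```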

All that was left to do was to fill up the std::vectors that OpenGL was going to use. I did this by looping through the faces and putting the data at the corresponding indices in to the vectors that are to be used by OpenGL.
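The de-indexing that Convert performs can be sketched roughly as follows, using plain structs in place of the glm types and only the vertex indices for brevity, and assuming the 1-based .obj indices described above:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };              // stand-in for glm::vec3
struct FaceSketch { int vertexIndices[3]; }; // vertex indices only, for brevity

// For each face, look up the 1-based index in the raw vertex list and
// append the resolved vertex to the list the shaders will consume.
std::vector<Vec3> Deindex(const std::vector<Vec3> &vertices,
                          const std::vector<FaceSketch> &faces)
{
	std::vector<Vec3> glVertices;
	for (const FaceSketch &face : faces)
	{
		for (int i = 0; i < 3; i++)
		{
			// .obj indices start at 1, so subtract 1 for C++ containers
			glVertices.push_back(vertices[face.vertexIndices[i] - 1]);
		}
	}
	return glVertices;
}
```

The same lookup is repeated for the UV and normal lists, so that the three 'gl' vectors stay in step element-by-element.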


Now that the data from the model was loaded, I needed to set up my shaders to use that data. To do so, I wrote my shaders and then loaded and compiled them with OpenGL. Once they were compiled I could link them into the main shader program to send to the GPU.
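The ShaderUtils class isn't listed here; compiling a single shader with raw OpenGL typically looks something like this sketch (a current OpenGL context is assumed, and the helper name is illustrative, not the one from my ShaderUtils class):

```cpp
// Sketch of what a helper like ShaderUtils::CompileVertShader might do
// internally. "source" holds the contents of the shader file.
GLuint CompileShader(GLenum type, const std::string &source)
{
	GLuint handle = glCreateShader(type);   // e.g. GL_VERTEX_SHADER
	const char *src = source.c_str();
	glShaderSource(handle, 1, &src, nullptr);
	glCompileShader(handle);

	// Check the compile status and print the info log on failure
	GLint status = 0;
	glGetShaderiv(handle, GL_COMPILE_STATUS, &status);
	if (status != GL_TRUE)
	{
		char log[512];
		glGetShaderInfoLog(handle, sizeof(log), nullptr, log);
		std::cerr << "Shader compile failed: " << log << std::endl;
	}
	return handle;
}
```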

GLuint VAOHandle = 0;
GLuint VBOHandles[2];
GLuint vertexShaderHandle = 0, fragmentShaderHandle = 0, shaderProgramHandle = 0;

vertexShaderHandle = ShaderUtils::getInstance().CompileVertShader("vertShader.vert");
fragmentShaderHandle = ShaderUtils::getInstance().CompileFragShader("fragShader.frag");

shaderProgramHandle = ShaderUtils::getInstance().CreateShaderProgram();
ShaderUtils::getInstance().AttachShadersToProgram(shaderProgramHandle, vertexShaderHandle, fragmentShaderHandle);

// Map the indicies of the attributes to the shader program BEFORE LINKING IT.
glBindAttribLocation(shaderProgramHandle, 0, "VertexPosition");
glBindAttribLocation(shaderProgramHandle, 1, "VertexUV");
glBindFragDataLocation(shaderProgramHandle, 0, "UV");


// Create buffers for each attribute
glGenBuffers(2, VBOHandles);
GLuint positonBufferHandle = VBOHandles[0];
GLuint uvBufferHandle = VBOHandles[1];

// Populate the position buffer
glBindBuffer(GL_ARRAY_BUFFER, positonBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glVertices.size() * sizeof(glm::vec3), &monkey.glVertices.front(), GL_STATIC_DRAW);

// Populate the uv buffer
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glUVs.size() * sizeof(glm::vec2), &monkey.glUVs.front(), GL_STATIC_DRAW);

// Create and bind the VAO, which stores the relationships between the buffers and the attributes
glGenVertexArrays(1, &VAOHandle);
glBindVertexArray(VAOHandle);

// Enable the vertex attribute arrays
glEnableVertexAttribArray(0); // Vertex Position
glEnableVertexAttribArray(1); // Vertex UV

// Map index 0 to the position buffer, tell it the buffer will have 3 elements (because of vec3 for position) in of size GL_FLOAT,
// We don't want to normalise the data, there is no stride (byte offset between consecutive attributes),
// and there is no offset from the beginning of the buffer.
glBindBuffer(GL_ARRAY_BUFFER, positonBufferHandle);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);

// Map index 1 to the uv buffer. UV has 2 coordinates instead of 3.
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, nullptr);

// If the shader program was linked properly
if (shaderProgramHandle != 0)
{
    // Install the program in to the OpenGL pipeline
    glUseProgram(shaderProgramHandle);
}

I also set up the buffers to hold the vertex (VBO) and texture UV data (UVBO) from the model. Then I set up a VAO to link the data in those buffers with my shader input variables.

After filling up the VBO and UVBO I could draw my model. The screenshot shows the model as if you were looking down on to it, but as you can see there are no textures applied to it yet.


When you load the model from the .obj file, it contains a mtllib line giving the filepath of the associated .mtl (material) file. This file is laid out similarly to the .obj.

Instead of vertex and normal data, it holds data associated with the material and texture to load. The important data is Ka, which is the ambient colour, Kd, which is the diffuse colour, and Ks, which is the specular colour. There is also map_Kd, which gives the file path of the texture.

# Blender MTL File: 'None'
# Material Count: 1

newmtl None
Ns 0
Ka 0.000000 0.000000 0.000000
Kd 0.8 0.8 0.8
Ks 0.8 0.8 0.8
d 1
illum 2
map_Kd tiles.bmp
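Parsing the lines above follows the same first-word dispatch pattern as the .obj loader; a minimal sketch (the struct and member names here are illustrative, not the ones from my MTL class):

```cpp
#include <sstream>
#include <string>

// Illustrative structure for the material values described above
struct MaterialSketch
{
	float Ka[3];             // ambient colour
	float Kd[3];             // diffuse colour
	float Ks[3];             // specular colour
	std::string texturePath; // from map_Kd
};

// Parse a single .mtl line into the material, dispatching on the
// first word of the line just like the .obj loader does.
void ParseMTLLine(const std::string &line, MaterialSketch &mat)
{
	std::istringstream stream(line);
	std::string keyword;
	stream >> keyword;
	if (keyword == "Ka")
		stream >> mat.Ka[0] >> mat.Ka[1] >> mat.Ka[2];
	else if (keyword == "Kd")
		stream >> mat.Kd[0] >> mat.Kd[1] >> mat.Kd[2];
	else if (keyword == "Ks")
		stream >> mat.Ks[0] >> mat.Ks[1] >> mat.Ks[2];
	else if (keyword == "map_Kd")
		stream >> mat.texturePath;
}
```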

After loading that data in a similar way to the way I loaded the .obj, I created a buffer to hold the texture data and then filled that buffer.

// Generate texture buffer
GLuint textureBufferHandle;
glGenTextures(1, &textureBufferHandle);

// Populate texture buffer
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, monkeyMTL.dimensions.x,
	monkeyMTL.dimensions.y, 0, GL_RGBA, GL_UNSIGNED_BYTE,

I verified that the data was loaded correctly from the MTL, and then tried to set that texture to active and bind it in the render function. However this didn't work and I was unsure of where I was going wrong.
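One common cause of exactly this symptom (texture bound, but the model renders black or untextured) is that GL_TEXTURE_MIN_FILTER defaults to a mipmap filter, so a texture without generated mipmaps is "incomplete" and sampling it fails. A sketch of what could be tried, assuming the texture upload above and a sampler uniform whose name I have made up here:

```cpp
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);

// Without these, the default minifying filter expects mipmap levels
// that were never uploaded, making the texture incomplete.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Before drawing: select texture unit 0, bind the texture, and point
// the sampler uniform (name assumed) at that unit.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glUniform1i(glGetUniformLocation(shaderProgramHandle, "textureSampler"), 0);
```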

Unfortunately I ran out of time before implementing cameras fully.


References

Khronos (2017). GLUT – The OpenGL Utility Toolkit. [online] [Accessed 24 Apr. 2017].
SDL (2017). Simple DirectMedia Layer – Homepage. [online] [Accessed 24 Apr. 2017].
GLEW (2017). GLEW: The OpenGL Extension Wrangler Library. [online] [Accessed 24 Apr. 2017].
GLFW (2017). GLFW – An OpenGL library. [online] [Accessed 24 Apr. 2017].
SFML (2017). SFML. [online] [Accessed 24 Apr. 2017].
Wikipedia (2017). Wavefront .obj file. [online] [Accessed 24 Apr. 2017].

