Graphics Assignment 2


For this graphics assignment we were tasked with building an application that featured cameras, texture loading, 3D model loading and shaders. My approach was to first create a simple model in Blender, add a texture to that model and then export it in the Wavefront .obj file format.

Setting up OpenGL

To set up OpenGL I used two libraries. One of them was GLEW (2017), which provides function pointers to OpenGL functions so that I can use them without declaring a function pointer each time. I also used SFML (2017) to open an OpenGL context (a window to render to). I could instead have used GLUT (Khronos, 2017), SDL (2017), GLFW (2017) or many others; I chose SFML because I have been using it in my other projects, so I was accustomed to setting it up.

int main()
{
	const int WindowWidth = 1600, WindowHeight = 900;
	sf::RenderWindow window(sf::VideoMode(WindowWidth, WindowHeight), "Graphics Assignment - Elliot");

I also initialised GLEW straight afterwards which allowed me to use OpenGL functions.

GLenum glewErr = glewInit();
if (GLEW_OK != glewErr)
	std::cout << "Failed to init GLEW: " << glewGetErrorString(glewErr) << std::endl;

I also needed a loop that ran whilst the program was open so that I could render on each iteration of that loop.

// Main loop
while (window.isOpen())
{
	// Event loop
	sf::Event event;
	while (window.pollEvent(event))
	{
		if (event.type == sf::Event::Closed)
			window.close();
	}

	GLfloat radius = 10.0f;
	GLfloat camX = sin(clock.getElapsedTime().asSeconds()) * radius;
	GLfloat camZ = cos(clock.getElapsedTime().asSeconds()) * radius;
	view = glm::lookAt(glm::vec3(camX, 0.0, camZ), glm::vec3(0.0, 0.0, 0.0), glm::vec3(0.0, 1.0, 0.0));

	// Clear the back buffer to the supplied colour (values here are placeholders)
	glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	// OpenGL rendering
	Render(VAOHandle, textureBufferHandle, shaderProgramHandle, monkey.glVertices.size());

	// Swap the buffers
	window.display();
}

If you look at the first code block inside the loop you can see the event-handling loop. I didn't make much use of events because they weren't related to the assignment, but the one event that is handled is the closing of the window; without it, lots of the resources in use would never be freed.

The next code block is for some camera operations that I will get to later on.

After that, you can see me clearing the window, which effectively replaces the back buffer with the supplied colour. I then call a function named Render that I use to draw my model; that effectively writes to the back buffer. I then call display, which swaps the buffers so that the back buffer becomes the front buffer, ready to be displayed, and the front buffer becomes the back buffer, ready to be written to.

After that I simply clean up some resources and close the program.

	// Flag the shaders to be deleted when detached from the program
	glDeleteShader(vertexShaderHandle);
	glDeleteShader(fragmentShaderHandle);

	// Detach shaders and free up any memory taken by the program.
	glDetachShader(shaderProgramHandle, vertexShaderHandle);
	glDetachShader(shaderProgramHandle, fragmentShaderHandle);
	glDeleteProgram(shaderProgramHandle);

This provides the basic structure for my program.

Loading the .obj Model

I wrote an OBJ class that provided the functionality needed to load an .obj file from a filepath supplied and convert the data to a form usable by OpenGL.

#pragma once
#include <string>
#include <vector>
#include <fstream>
#include <iostream>
#include <sstream>

#include <glm/vec3.hpp>

#include "Face.h"
#include "StringUtils.h"

class OBJ
{
public:
	static OBJ LoadOBJ(std::string filepath);
	std::vector<glm::vec3> glVertices;
	std::vector<glm::vec2> glUVs;
	std::vector<glm::vec3> glNormals;

	void Convert();

private:
	std::vector<glm::vec3> vertices;
	std::vector<glm::vec2> uvs;
	std::vector<glm::vec3> normals;
	std::vector<Face> faces;
};

The class contains a public set of vertices, UV coordinates and normals that can be used by my shaders to produce the model. You can see there are two versions of these sets in the class: one prefixed with 'gl' and one without. The ones prefixed with 'gl' are intended to be used by the shaders, so they're publicly accessible. The others act as private storage for the raw data loaded in from the .obj file.
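The Face type pulled in from "Face.h" isn't shown here; a minimal version consistent with how it is used later (three vertex/texture/normal indices per triangulated face) might look like this:

```cpp
// A triangulated face: three indices each into the vertex, UV and normal lists.
// These are the raw 1-based indices exactly as they appear in the .obj file.
struct Face
{
	int vertexIndices[3];
	int textureIndices[3];
	int normalIndices[3];
};
```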

Below is the implementation of my function to load an .obj model.

OBJ OBJ::LoadOBJ(std::string filepath)
{
	OBJ output;

	std::ifstream in(filepath, std::ios_base::in);
	if (in.is_open())
	{
		std::string line = "";
		while (getline(in, line))
		{
			if (line.substr(0, 2) == "v ")
			{
				std::istringstream stream(line.substr(2));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.vertices.push_back(vector);
			}
			else if (line.substr(0, 2) == "vt")
			{
				std::istringstream stream(line.substr(3));
				glm::vec2 vector;
				stream >> vector.x;
				stream >> vector.y;
				output.uvs.push_back(vector);
			}
			else if (line.substr(0, 2) == "vn")
			{
				std::istringstream stream(line.substr(3));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.normals.push_back(vector);
			}
			else if (line.substr(0, 2) == "f ")
			{
				std::istringstream stream(line.substr(2));
				Face face;
				for (int i = 0; i < 3; i++)
				{
					std::string temp;
					stream >> temp;

					std::vector<std::string> splitString = StringUtils::split(temp, '/');

					face.vertexIndices[i] = std::stoi(splitString[0]);
					face.textureIndices[i] = std::stoi(splitString[1]);
					face.normalIndices[i] = std::stoi(splitString[2]);
				}
				output.faces.push_back(face);
			}
		}
	}
	else
	{
		std::cerr << "Failed to load file: " << filepath << std::endl;
	}

	return output;
}
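The StringUtils::split helper used above is my own utility rather than part of the standard library; a simple implementation (assuming it just tokenises on a single delimiter) could be:

```cpp
#include <sstream>
#include <string>
#include <vector>

namespace StringUtils
{
	// Split a string into tokens on the given delimiter,
	// e.g. split("47/1/1", '/') -> {"47", "1", "1"}
	std::vector<std::string> split(const std::string& input, char delimiter)
	{
		std::vector<std::string> tokens;
		std::istringstream stream(input);
		std::string token;
		while (std::getline(stream, token, delimiter))
			tokens.push_back(token);
		return tokens;
	}
}
```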

Initially I construct an output OBJ object named output. I then use std::ifstream (input file stream) to attempt to open the file at the supplied filepath. The second parameter to std::ifstream is optional; it specifies that the file only needs to be read, not written to.

Next, I check that the file has opened successfully. If it has I create a std::string named line that will hold the data of each line read from the file so that it can be processed. I used getline, which takes a line from the file stream and stores it in the line variable. This happens inside a while loop which ends once the file has no more lines to read.

After that, I have all the functionality for processing the line. After reading the Wikipedia article on Wavefront .obj files (Wikipedia, 2017), I learned that each line inside an .obj file represents something based on the first word of the line. Once I opened my model up in a text editor I was able to see the data it contained.

The first piece of data was labelled mtllib. This contains the filename of the 'material' file associated with my model. I'll talk about that more later.

mtllib monkey.mtl

It also contained hundreds of vertices, such as the one below. The first number represents the x coordinate of the vertex, the second the y, and the third the z.

v 0.437500 -0.765625 0.164062

After the list of vertices, there are the lines to represent the vertex texture coordinates. The first number represents the x UV-coordinate, and the second number represents the y UV-coordinate.

vt 0.0000 0.0000

After that list, we get the vertex normal data. The same rules apply here.

vn 0.9693 -0.2456 -0.0118

Finally there is the face data. Each face contains 3 sets of data, each with 3 numbers separated by the '/' character. The first number is the vertex index, the second is the texture index and the third is the normal index. These indices refer back to the lists earlier in the file, and they are 1-based: 47 means that the face uses the 47th vertex in the vertex list.

f 47/1/1 3/2/2 45/3/3

So when loading the model, I simply looped through each line, looking for either 'v', 'vt', 'vn' or 'f', and added the data to a std::vector so that I could process it.

All that was left to do was to fill up the std::vectors that OpenGL was going to use. I did this by looping through the faces and putting the data at the corresponding indices in to the vectors that are to be used by OpenGL.
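My Convert function isn't reproduced here, but the idea can be sketched as follows (using plain structs in place of glm types, and remembering that .obj indices are 1-based):

```cpp
#include <vector>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Same shape as the Face used by LoadOBJ: raw 1-based indices per corner.
struct Face
{
	int vertexIndices[3];
	int textureIndices[3];
	int normalIndices[3];
};

// For every corner of every face, look up the raw data the face indexes
// and append it to the flat arrays OpenGL will read from.
void Convert(const std::vector<Face>& faces,
	const std::vector<Vec3>& vertices,
	const std::vector<Vec2>& uvs,
	const std::vector<Vec3>& normals,
	std::vector<Vec3>& glVertices,
	std::vector<Vec2>& glUVs,
	std::vector<Vec3>& glNormals)
{
	for (const Face& face : faces)
	{
		for (int i = 0; i < 3; i++)
		{
			// .obj indices start at 1, so subtract 1 for C++ containers
			glVertices.push_back(vertices[face.vertexIndices[i] - 1]);
			glUVs.push_back(uvs[face.textureIndices[i] - 1]);
			glNormals.push_back(normals[face.normalIndices[i] - 1]);
		}
	}
}
```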


Now that the data from the model was loaded, I needed to set up my shaders to use that data. To do so, I wrote my shaders, then loaded and compiled them with OpenGL. Once they were compiled I could link them into the main shader program to send to the GPU.

GLuint VAOHandle = 0;
GLuint VBOHandles[2];
GLuint vertexShaderHandle = 0, fragmentShaderHandle = 0, shaderProgramHandle = 0;

vertexShaderHandle = ShaderUtils::getInstance().CompileVertShader("vertShader.vert");
fragmentShaderHandle = ShaderUtils::getInstance().CompileFragShader("fragShader.frag");

shaderProgramHandle = ShaderUtils::getInstance().CreateShaderProgram();
ShaderUtils::getInstance().AttachShadersToProgram(shaderProgramHandle, vertexShaderHandle, fragmentShaderHandle);

// Map the indices of the attributes to the shader program BEFORE LINKING IT.
glBindAttribLocation(shaderProgramHandle, 0, "VertexPosition");
glBindAttribLocation(shaderProgramHandle, 1, "VertexUV");
glBindFragDataLocation(shaderProgramHandle, 0, "UV");


// Create buffers for each attribute
glGenBuffers(2, VBOHandles);
GLuint positionBufferHandle = VBOHandles[0];
GLuint uvBufferHandle = VBOHandles[1];

// Populate the position buffer
glBindBuffer(GL_ARRAY_BUFFER, positionBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glVertices.size() * sizeof(glm::vec3), &monkey.glVertices.front(), GL_STATIC_DRAW);

// Populate the uv buffer
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glUVs.size() * sizeof(glm::vec2), &monkey.glUVs.front(), GL_STATIC_DRAW);

// Create and bind the VAO, which stores the relationships between the buffers and the attributes
glGenVertexArrays(1, &VAOHandle);
glBindVertexArray(VAOHandle);

// Enable the vertex attribute arrays
glEnableVertexAttribArray(0); // Vertex Position
glEnableVertexAttribArray(1); // Vertex UV

// Map index 0 to the position buffer. The attribute has 3 components (a vec3 position) of type GL_FLOAT,
// we don't want the data normalised, there is no stride (byte offset between consecutive attributes),
// and there is no offset from the beginning of the buffer.
glBindBuffer(GL_ARRAY_BUFFER, positionBufferHandle);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);

// Map index 1 to the uv buffer. UV has 2 coordinates instead of 3.
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, nullptr);

// If the shader program was linked properly
if (shaderProgramHandle != 0)
	// Install the program into the OpenGL pipeline
	glUseProgram(shaderProgramHandle);

I also set up the buffers to hold the vertex data (VBO) and texture UV data (UVBO) from the model. Then I set up a VAO to link the data in those buffers with my shader input variables.

After filling up the VBO and UVBO I could draw my model. You can see the model as if you were looking down onto it, but there are no textures applied to it yet.


When you load the model from the .obj file, it contains a mtllib line which gives the filepath of the associated .mtl (material) file. This file is laid out similarly to the .obj.

Instead of vertex and normal data, it holds data describing the material and its texture. The important values are Ka, which is the ambient colour, Kd, which is the diffuse colour, and Ks, which is the specular colour. There is also map_Kd, which gives the file path of the texture.

# Blender MTL File: 'None'
# Material Count: 1

newmtl None
Ns 0
Ka 0.000000 0.000000 0.000000
Kd 0.8 0.8 0.8
Ks 0.8 0.8 0.8
d 1
illum 2
map_Kd tiles.bmp
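Parsing the .mtl file follows the same line-by-line pattern as the .obj loader; a minimal sketch for pulling out the Kd colour and the map_Kd texture path (the Material struct and its member names here are illustrative, not my actual class) might be:

```cpp
#include <sstream>
#include <string>

// Illustrative holder for the material values we care about
struct Material
{
	float kd[3] = { 0.0f, 0.0f, 0.0f }; // diffuse colour (Kd)
	std::string texturePath;            // texture file path (map_Kd)
};

// Parse a single line of a .mtl file into the material
void ParseMTLLine(const std::string& line, Material& material)
{
	std::istringstream stream(line);
	std::string keyword;
	stream >> keyword;

	if (keyword == "Kd")
		stream >> material.kd[0] >> material.kd[1] >> material.kd[2];
	else if (keyword == "map_Kd")
		stream >> material.texturePath;
}
```

The Ka and Ks lines can be handled the same way as Kd.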

After loading that data in a similar way to the way I loaded the .obj, I created a buffer to hold the texture data and then filled that buffer.

// Generate texture buffer
GLuint textureBufferHandle;
glGenTextures(1, &textureBufferHandle);

// Populate texture buffer
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, monkeyMTL.dimensions.x,
	monkeyMTL.dimensions.y, 0, GL_RGBA, GL_UNSIGNED_BYTE,
	monkeyMTL.pixels.data()); // final argument is the raw pixel data (member name illustrative)

I verified that the data was loaded correctly from the MTL, and then tried to set that texture to active and bind it in the render function. However, this didn't work and I was unsure where I was going wrong.
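In hindsight, one likely culprit (a guess, not something I verified at the time) is texture completeness: a newly created texture's GL_TEXTURE_MIN_FILTER defaults to a mipmapped mode, so a texture without mipmaps is incomplete and samples as black. Setting the filtering parameters after glTexImage2D usually fixes this:

```cpp
// Without these, the default GL_TEXTURE_MIN_FILTER expects mipmaps,
// and a texture without them is incomplete (it samples as black).
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```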

Unfortunately I ran out of time before implementing cameras fully.


Khronos. (2017). GLUT – The OpenGL Utility Toolkit. [online] Available at: [Accessed 24 Apr. 2017].

(2017). Simple DirectMedia Layer – Homepage. [online] Available at: [Accessed 24 Apr. 2017].

(2017). GLEW: The OpenGL Extension Wrangler Library. [online] Available at: [Accessed 24 Apr. 2017].

(2017). GLFW – An OpenGL library. [online] Available at: [Accessed 24 Apr. 2017].

(2017). SFML. [online] Available at: [Accessed 24 Apr. 2017].

Wikipedia. (2017). Wavefront .obj file. [online] Available at: [Accessed 24 Apr. 2017].


Final Iteration Update and Conclusion

Now that the academic year is coming to a close, we will end any further development of the game for now. Overall, I am quite happy with how the game turned out, considering some of the problems we faced along the way.

Hero Engine turned out to be a very capable engine; however, it also had a steep learning curve, and I think a lot of the team, myself included, had problems learning how to use it and its Hero Scripting Language.

We put together our final level using assets supplied with Hero Engine and also some other custom assets made by ourselves. I think the final level looks quite nice and a lot better than I initially expected.

Oliver also used SpeedTree (Speedtree, 2017) to create a tree that we used in the level:


Overall though, we didn't quite get every aspect of the scripting side of things ready, due to a lot of time spent on the previous game idea and a lot of time spent learning HSL and the server-client architecture Hero Engine uses (HEWIKI, 2012).

We started to overcome some of the more difficult tasks towards the end of the project, though, such as splitting characters into different teams and getting the round-start timer working.

I think we would have been able to do a lot more if we didn’t spend a lot of time on a project which wasn’t feasible in the first place. However I think it was quite hard to tell whether or not a project was feasible due to our lack of knowledge of the engine starting off.

We also lacked project management for the first half of the project, which set us back quite a lot due to not having a clear direction and goal in our heads. Once we switched to using Trello (Trello, 2017), we were all producing a lot more work.

Despite this, it was interesting to use a young game engine like Hero Engine instead of one of the more mature ones such as Unity (Unity, 2017) or Unreal (Unreal Engine, 2017). The cloud-based editing (Hero Engine, 2017), especially, proved to be very useful when world building. Although I am sure we could have got the game into a much more finished state had we used Unity, since we are all far more experienced with it.

The biggest downside to Hero Engine for us was definitely the learning side of it. While the wiki did provide many articles to help us learn, it wasn't very easy to find the information we were looking for, and we had to ask on the forums a few times because of this. Hero Scripting Language also added to the list of things we had to learn, which wasn't great. I would have been much happier if it used an already established scripting language such as Lua (Lua, 2017) or Python (Python, 2017).


Hero Engine. (2017). Real-time Collaborative World Building. [online] Available at: [Accessed 21 Apr. 2017].

HEWIKI. (2012). Engine infrastructure – HEWIKI. [online] Available at: [Accessed 21 Apr. 2017].

Lua. (2017). Lua: about. [online] Available at: [Accessed 21 Apr. 2017].

Speedtree. (2017). SpeedTree Animated Trees & Plants Modeling & Render Software. [online] Available at: [Accessed 21 Apr. 2017].

Trello. (2017). Trello. [online] Available at: [Accessed 21 Apr. 2017].

Unity. (2017). Unity – Game Engine. [online] Available at: [Accessed 21 Apr. 2017].

Python. (2017). Welcome to [online] Available at: [Accessed 21 Apr. 2017].

Unreal Engine. (2017). What is Unreal Engine 4. [online] Available at: [Accessed 21 Apr. 2017].

System Nodes in Hero Engine

Today I did some more research on splitting players' characters into two teams. After reading the Adapting Clean Engine article on the wiki, I learnt that there are multiple default HeroEngine scripts that should be replaced with your own scripts. This is so that you can have your own custom behaviour for system nodes, which is important for our game.

Some examples of the system nodes that are overridden in the tutorial are below:

Some example classes that can be overridden to implement custom behaviour in Hero Engine.

For example, you can see that the $WORLD system node has a default class called 'HE_World' and a default script for that class called 'HE_WorldClassMethods'. The tutorial then creates a class called 'GameIDWorld', where GameID is just a prefix you want to use for your game. The methods attached to that class then live in 'GameIDWorldClassMethods'. These methods extend/override the methods found in 'HE_WorldClassMethods' and implement custom behaviour for event callbacks for the World Server.

This is just one of many different classes that can be overridden to implement custom behaviour in Hero Engine.


Using the HeroEngine repository browser

Adding files to a repository associated with a project is very straightforward with HeroEngine. First of all you need to download the repository browser from the HeroEngine dashboard after logging in to the HeroEngine website (see step 3 below).

Hint: Step 3

Now that you have the repository browser, open the downloaded file and it will fetch the assets for your project. Once this has completed, you can log in and it will open the HeroEngine Repository Browser window.


From here, you can use the local file browser which is located on the upper side of the window to drag files to the remote, which is the lower side of the window. And that’s it, very simple.

Hero Engine Texturing

This week I focused on learning how to give textures to the assets we use in our level. It's actually surprisingly simple. All you have to do is select the asset in HeroBlade:


Then, in the properties panel that should appear, you can simply select the _MaterialOverrides property, then click the button with the three dots (‘…’). This will bring up the Material Instance Properties window, which lets you choose which texture you want to apply to the object.

You can also specify normal/specular maps, macro maps and custom shaders. Since I didn't have access to a lot of textures and I don't know how to make them myself, I just used the default ones supplied with Hero Engine. This worked well enough for our prototype though.

One caveat I had was that it is only able to load DDS textures. It would be really helpful if I could load in PNG files and have them mapped to textures. Other than that, I thought texturing was really well done in Hero Engine.

Project Management

Last week we took a visit to Derivco, a software house that develops online gaming systems. We showed them our progress so far and discussed where we were heading, along with our goal for the end of the semester: a simple prototype. We also discussed how we could make sure we achieve that goal, and one of the things I brought up was how we should be using project management tools more effectively.

Trello has been our main PM tool, but we hadn't really been using it to its full extent until today. Sprint check-ups happened inconsistently and development velocity wasn't being recorded properly.

As of today, my role has been changed to focus more on the PM side of the project so that we can achieve a good amount of progress every week by setting goals for everyone.

This week we achieved a lot more progress in comparison to previous weeks, mainly due to the tasks being set for people.

Since everyone is used to Jira from their group projects, Trello isn't too different: it also has cards, lists and boards. Cards are similar to issues/stories/tasks; lists can be created to hold cards, similar to the To Do, In Progress, Verify and Done sections in Jira; and boards are the same as in Jira.

A Trello Board, showing off Lists and Cards.

We currently use a layout consisting of a Backlog list, containing all of the tasks we foresee being completed at some point in the future; a Documentation list that encompasses things like design document amendments and other write-ups; a Design list which covers level design, modelling and asset management; and a Scripting list which contains anything to do with programming.

An overview of a Card in Trello

Creating cards is really easy, and there are keyboard shortcuts for all the common tasks which makes setting up our weekly sprints super fast.

So far I can recommend Trello to anyone that wants to incorporate Agile development into a small project like ours.

Hero Scripting Language Learning – 1

After completing some of the basic Hero Scripting Language tutorials from the wiki, I feel I have a much better understanding of how the engine works in terms of scripting.

I now understand that the client and server each have their own DOM (Data Object Model) and GOM (Game Object Model), and in order to create a script to interact with these, you must first specify if the script is server-side or client-side.

The Client and Server DOM Editor found in the HeroBlade Editor.

A DOM stores information such as fields, classes and enums, and using the DOM Editor we can create new ones. When you create a field you can specify its type and give it a description and various other properties. When you create a class you can do the same, but also specify which fields you want the class to reference and use.

Then, using scripts, you can instantiate these classes as 'Nodes' and use them as you would in any other programming language. It's worth noting, however, that once you instantiate a class, the node you create is added to the GOM.

In order to create methods for these classes, 'you need to create a separate script titled in the format of <ClassName> + ClassMethods. For example, a class Foo would look for a script named FooClassMethods.' ("HSL For Programmers – HEWIKI")

Most of the information is from the ‘HSL For Programmers’ article on the wiki.


“HSL For Programmers – HEWIKI”. N.p., 2017. Web. 6 Feb. 2017.