Graphics Assignment 2

Introduction

For this graphics assignment we were tasked with building an application that featured cameras, texture loading, 3D model loading, and shaders. My approach was to first create a simple model in Blender, add a texture to it, and then export it in the Wavefront .obj file format.

Setting up OpenGL

To set up OpenGL I used two libraries. The first was GLEW (Glew.sourceforge.net, 2017), which provides function pointers to OpenGL functions so that I can call them without declaring a function pointer each time. The second was SFML, which I used to open an OpenGL context (a window to render to). I could also have used GLUT (Khronos, 2017), SDL (Libsdl.org, 2017), GLFW (Glfw.org, 2017) or many others, but I chose SFML (Sfml-dev.org, 2017) because I have been using it in my other projects and am accustomed to setting it up.

int main()
{
	const int WindowWidth = 1600, WindowHeight = 900;
	sf::RenderWindow window(sf::VideoMode(WindowWidth, WindowHeight), "Graphics Assignment - Elliot");

I initialised GLEW straight afterwards, which allowed me to call OpenGL functions.

GLenum glewErr = glewInit();
if (GLEW_OK != glewErr)
{
	std::cout << "Failed to init GLEW: " << glewGetErrorString(glewErr) << std::endl;
}

I also needed a loop that ran whilst the program was open so that I could render on each iteration.

// Main loop
while (window.isOpen())
{
	// Event loop
	sf::Event event;
	while (window.pollEvent(event))
	{
		if (event.type == sf::Event::Closed)
			window.close();
	}

	// 'clock' is an sf::Clock started before the loop; the camera circles the origin over time
	GLfloat radius = 10.0f;
	GLfloat camX = sin(clock.getElapsedTime().asSeconds()) * radius;
	GLfloat camZ = cos(clock.getElapsedTime().asSeconds()) * radius;
	view = glm::lookAt(glm::vec3(camX, 0.0, camZ), glm::vec3(0.0, 0.0, 0.0), glm::vec3(0.0, 1.0, 0.0));

	// back buffer colour
	window.clear(sf::Color::Magenta);
	// OpenGL rendering
	Render(VAOHandle, textureBufferHandle, shaderProgramHandle, monkey.glVertices.size());
	// swap the buffers
	window.display();
}

If you look at the first code block inside the loop you can see the event-handling loop. I didn't make much use of events because they weren't related to the assignment, but the one event I do handle is the window closing; without it the window could never be shut down cleanly and a lot of the resources in use would never be freed.

The next code block is for some camera operations that I will get to later on.

After that, you can see me clearing the window, which effectively fills the back buffer with the supplied colour. I then call a function called Render that I use to draw my model; this writes to the back buffer. Finally, I call display, which swaps the buffers so that the back buffer becomes the front buffer, ready to be displayed, and the front buffer becomes the back buffer, ready to be written to.
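The body of Render isn't shown in this post, but here is a sketch of what a render function matching that call would typically contain (illustrative, not my exact implementation):

// Render (sketch): install the shader program, bind the texture and VAO,
// then draw the model's triangles.
void Render(GLuint vao, GLuint texture, GLuint program, std::size_t vertexCount)
{
	glUseProgram(program);
	glBindTexture(GL_TEXTURE_2D, texture);
	glBindVertexArray(vao);
	glDrawArrays(GL_TRIANGLES, 0, static_cast<GLsizei>(vertexCount));
	glBindVertexArray(0);
}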

Once the window closes, I simply clean up some resources and close the program.

	// Flag the shaders to be deleted when detached from the program
	glDeleteShader(vertexShaderHandle);
	glDeleteShader(fragmentShaderHandle);
	// Detach shaders and free up any memory taken by the program.
	glDeleteProgram(shaderProgramHandle);
	std::cin.get();
	return EXIT_SUCCESS;
}

This provides the basic structure for my program.

Loading the .obj Model

I wrote an OBJ class that provided the functionality needed to load an .obj file from a filepath supplied and convert the data to a form usable by OpenGL.

#pragma once
#include <string>
#include <vector>
#include <fstream>
#include <iostream>
#include <sstream>

#include <glm/vec2.hpp>
#include <glm/vec3.hpp>

#include "Face.h"
#include "StringUtils.h"

class OBJ
{
public:
	OBJ();
	~OBJ();

	static OBJ LoadOBJ(std::string filepath);
	std::vector<glm::vec3> glVertices;
	std::vector<glm::vec2> glUVs;
	std::vector<glm::vec3> glNormals;

	void Convert();

private:
	std::vector<glm::vec3> vertices;
	std::vector<glm::vec2> uvs;
	std::vector<glm::vec3> normals;
	std::vector<Face> faces;
};

The main idea of the class is that it contains a public set of vertices, UV coordinates and normals that my shaders can use to produce the model. You can see there are two versions of each set in the class: one prefixed with 'gl' and one without. The ones prefixed with 'gl' are intended for the shaders, so they're publicly accessible. The others act as storage for the raw data loaded in from the .obj file.
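The Face type referenced in the header isn't reproduced in this post. A minimal sketch of what it would need to hold, based on how LoadOBJ (below) uses it; the exact class is an assumption:

// Face.h (sketch): one triangular face, holding 1-based indices into the
// vertex, UV and normal lists, exactly as read from the file.
#pragma once

struct Face
{
	int vertexIndices[3];
	int textureIndices[3];
	int normalIndices[3];
};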

Below is the implementation of my function to load an .obj model.

OBJ OBJ::LoadOBJ(std::string filepath)
{
	OBJ output;

	std::ifstream in(filepath, std::ios_base::in);
	if (in.is_open())
	{
		std::string line = "";
		while (getline(in, line))
		{
			if (line.substr(0, 2) == "v ")
			{
				std::istringstream stream(line.substr(2));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.vertices.push_back(vector);
			}
			else if (line.substr(0, 2) == "vt")
			{
				std::istringstream stream(line.substr(3));
				glm::vec2 vector;
				stream >> vector.x;
				stream >> vector.y;
				output.uvs.push_back(vector);
			}
			else if (line.substr(0, 2) == "vn")
			{
				std::istringstream stream(line.substr(3));
				glm::vec3 vector;
				stream >> vector.x;
				stream >> vector.y;
				stream >> vector.z;
				output.normals.push_back(vector);
			}
			else if (line.substr(0, 2) == "f ")
			{
				std::istringstream stream(line.substr(2));
				Face face;
				for (int i = 0; i < 3; i++)
				{
					std::string temp;
					stream >> temp;

					std::vector<std::string> splitString = StringUtils::split(temp, '/');

					face.vertexIndices[i] = std::stoi(splitString[0]);
					face.textureIndices[i] = std::stoi(splitString[1]);
					face.normalIndices[i] = std::stoi(splitString[2]);
				}
				output.faces.push_back(face);
			}
		}
	}
	else {
		std::cerr << "Failed to load file: " << filepath << std::endl;
	}

	return output;
}

Initially I construct an OBJ object named output. I then use std::ifstream (input file stream) to attempt to open the file at the supplied filepath. The second parameter to std::ifstream is optional; it specifies that the file only needs to be read, not written.

Next, I check that the file opened successfully. If it did, I create a std::string named line that holds the data of each line read from the file so it can be processed. I use getline, which takes a line from the file stream and stores it in the line variable, inside a while loop that ends once the file has no more lines to read.

After that comes all the functionality for processing the line. After reading a Wikipedia article (En.wikipedia.org, 2017) on Wavefront .obj files, I learned that what each line inside an .obj file represents is determined by the first word of the line. Opening my model in a text editor let me see the data it contained.

The first piece of data was labelled mtllib. This contains the filename of the 'material' file associated with my model; I'll talk about that more later.

mtllib monkey.mtl

It also contained hundreds of vertices such as this one. The first number represents the x coordinate of the vertex, the second number the y, and the third number the z.

v 0.437500 -0.765625 0.164062

After the list of vertices come the lines representing the vertex texture coordinates. The first number is the u coordinate and the second the v coordinate.

vt 0.0000 0.0000

After that list, we get the vertex normal data; the three numbers are the x, y and z components of the normal.

vn 0.9693 -0.2456 -0.0118

Finally there is the face data. Each face contains three sets of data, each with three numbers separated by the '/' character. The first number is the vertex index, the second the texture index and the third the normal index. These indices refer to entries in the lists earlier in the file and are 1-based, so 47 means the face uses the 47th vertex in the list.

f 47/1/1 3/2/2 45/3/3

So when loading the model, I simply looped through each line, looking for 'v', 'vt', 'vn' or 'f', and added the data to a std::vector so that I could process it.
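The StringUtils::split helper used when parsing faces isn't shown either; a sketch of the obvious implementation:

// Sketch of StringUtils::split: breaks "47/1/1" into {"47", "1", "1"}.
std::vector<std::string> StringUtils::split(const std::string& input, char delimiter)
{
	std::vector<std::string> output;
	std::istringstream stream(input);
	std::string token;
	while (std::getline(stream, token, delimiter))
		output.push_back(token);
	return output;
}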

All that was left was to fill the std::vectors that OpenGL would use. I did this by looping through the faces and pushing the data at the corresponding indices into the OpenGL-facing vectors.
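A minimal sketch of what that conversion might look like, assuming Face stores the three 1-based indices per attribute exactly as parsed above:

// OBJ::Convert (sketch): expand each face's indices into flat, draw-ready
// arrays, remembering that .obj indices start at 1.
void OBJ::Convert()
{
	for (const Face& face : faces)
	{
		for (int i = 0; i < 3; i++)
		{
			glVertices.push_back(vertices[face.vertexIndices[i] - 1]);
			glUVs.push_back(uvs[face.textureIndices[i] - 1]);
			glNormals.push_back(normals[face.normalIndices[i] - 1]);
		}
	}
}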

Shaders

Now that the data from the model was loaded, I needed to set up my shaders to use it. To do so, I wrote my shaders, loaded them with OpenGL and compiled them. Once they were compiled, I could link them into the main shader program to be sent to the GPU.
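The ShaderUtils class isn't reproduced in this post; compiling a single shader with raw OpenGL follows a standard pattern, roughly like this (the helper name and error handling here are illustrative):

// Sketch of what a compile helper like ShaderUtils presumably wraps.
GLuint CompileShader(GLenum type, const std::string& filepath)
{
	// Read the whole shader source file into a string
	std::ifstream in(filepath);
	std::stringstream buffer;
	buffer << in.rdbuf();
	std::string source = buffer.str();

	GLuint handle = glCreateShader(type); // e.g. GL_VERTEX_SHADER
	const GLchar* src = source.c_str();
	glShaderSource(handle, 1, &src, nullptr);
	glCompileShader(handle);

	// Print the info log if compilation failed
	GLint status = 0;
	glGetShaderiv(handle, GL_COMPILE_STATUS, &status);
	if (status != GL_TRUE)
	{
		GLchar log[512];
		glGetShaderInfoLog(handle, sizeof(log), nullptr, log);
		std::cerr << "Shader compile failed: " << log << std::endl;
	}
	return handle;
}

With those helpers in place, my setup code looks like this: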

GLuint VAOHandle = 0;
GLuint VBOHandles[2];
GLuint vertexShaderHandle = 0, fragmentShaderHandle = 0, shaderProgramHandle = 0;

vertexShaderHandle = ShaderUtils::getInstance().CompileVertShader("vertShader.vert");
fragmentShaderHandle = ShaderUtils::getInstance().CompileFragShader("fragShader.frag");

shaderProgramHandle = ShaderUtils::getInstance().CreateShaderProgram();
ShaderUtils::getInstance().AttachShadersToProgram(shaderProgramHandle, vertexShaderHandle, fragmentShaderHandle);

// Map the indices of the attributes to the shader program BEFORE LINKING IT.
glBindAttribLocation(shaderProgramHandle, 0, "VertexPosition");
glBindAttribLocation(shaderProgramHandle, 1, "VertexUV");
glBindFragDataLocation(shaderProgramHandle, 0, "UV");

ShaderUtils::getInstance().LinkShaderProgram(shaderProgramHandle);

// Create buffers for each attribute
glGenBuffers(2, VBOHandles);
GLuint positionBufferHandle = VBOHandles[0];
GLuint uvBufferHandle = VBOHandles[1];

// Populate the position buffer
glBindBuffer(GL_ARRAY_BUFFER, positionBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glVertices.size() * sizeof(glm::vec3), &monkey.glVertices.front(), GL_STATIC_DRAW);

// Populate the uv buffer
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glBufferData(GL_ARRAY_BUFFER, monkey.glUVs.size() * sizeof(glm::vec2), &monkey.glUVs.front(), GL_STATIC_DRAW);

// Create and bind the VAO, which stores the relationships between the buffers and the attributes
glGenVertexArrays(1, &VAOHandle);
glBindVertexArray(VAOHandle);

// Enable the vertex attribute arrays
glEnableVertexAttribArray(0); // Vertex Position
glEnableVertexAttribArray(1); // Vertex UV

// Map index 0 to the position buffer. Each attribute has 3 components (a vec3 position),
// each of type GL_FLOAT; we don't want to normalise the data, there is no stride (byte offset
// between consecutive attributes), and there is no offset from the beginning of the buffer.
glBindBuffer(GL_ARRAY_BUFFER, positionBufferHandle);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);

// Map index 1 to the uv buffer. UV has 2 coordinates instead of 3.
glBindBuffer(GL_ARRAY_BUFFER, uvBufferHandle);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, nullptr);

// If the shader programs were linked properly
if (shaderProgramHandle != 0)
{
    // Install the program in to the OpenGL pipeline
    glUseProgram(shaderProgramHandle);
}
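My actual shader files aren't reproduced in this post, but a minimal pair consistent with the names bound above might look like the following. TexCoord and textureSampler are illustrative names; the fragment output is called UV only to match the glBindFragDataLocation call, though a name like FragColour would be more conventional for a colour output.

// vertShader.vert (sketch)
#version 330

in vec3 VertexPosition; // attribute index 0
in vec2 VertexUV;       // attribute index 1

out vec2 TexCoord;

void main()
{
	TexCoord = VertexUV;
	// A full camera would multiply by projection and view matrices here
	gl_Position = vec4(VertexPosition, 1.0);
}

// fragShader.frag (sketch)
#version 330

in vec2 TexCoord;

out vec4 UV; // the output bound to colour attachment 0 above

uniform sampler2D textureSampler;

void main()
{
	UV = texture(textureSampler, TexCoord);
}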

I also set up the buffers to hold the vertex (VBO) and texture UV (UVBO) data from the model, then set up a VAO to link the data in those buffers with my shader input variables.

After filling up the VBO and UVBO I could draw my model. Below you can see the model as if you were looking down onto it, but as you can see there is no texture applied yet.

[Image: the loaded model viewed from above, with no texture applied]

When you load the model from the .obj file, it contains a mtllib line which holds the filepath of the associated .mtl (material) file. This file is laid out similarly to the .obj.

Instead of vertex and normal data, it holds data associated with the material and texture to load. The important entries are Ka, the ambient colour; Kd, the diffuse colour; and Ks, the specular colour. There is also map_Kd, which gives the file path of the texture.

# Blender MTL File: 'None'
# Material Count: 1

newmtl None
Ns 0
Ka 0.000000 0.000000 0.000000
Kd 0.8 0.8 0.8
Ks 0.8 0.8 0.8
d 1
illum 2
map_Kd tiles.bmp
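Parsing this file follows the same pattern as LoadOBJ. A sketch of the part that matters for texturing is below; the member names (image, dimensions) mirror how monkeyMTL is used later, but the MTL class itself isn't shown in this post, so treat the details as assumptions:

// Inside a hypothetical MTL::LoadMTL, after opening the file as before:
while (getline(in, line))
{
	if (line.substr(0, 6) == "map_Kd")
	{
		// Load the referenced image with sf::Image; this is where
		// getPixelsPtr() and the dimensions used below come from.
		std::string texturePath = line.substr(7);
		if (output.image.loadFromFile(texturePath))
			output.dimensions = glm::ivec2(output.image.getSize().x, output.image.getSize().y);
	}
}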

After loading that data in much the same way as the .obj, I created a buffer to hold the texture data and then filled it.

// Generate texture buffer
GLuint textureBufferHandle;
glGenTextures(1, &textureBufferHandle);

// Populate texture buffer
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, monkeyMTL.dimensions.x,
	monkeyMTL.dimensions.y, 0, GL_RGBA, GL_UNSIGNED_BYTE,
        monkeyMTL.image.getPixelsPtr());

I verified that the data was loaded correctly from the MTL, and then tried to set that texture to active and bind it in the render function. However, this didn't work and I was unsure where I was going wrong.
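For reference, the usual pattern for using the texture in the render function looks like the sketch below. One common cause of a texture silently failing to appear is the default minification filter, which expects mipmaps that were never generated, so that would be my first suspect; this is a guess, not a confirmed diagnosis of my bug.

// Sketch: "textureSampler" is the illustrative uniform name from the
// fragment shader sketch above.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureBufferHandle);
glUniform1i(glGetUniformLocation(shaderProgramHandle, "textureSampler"), 0);

// The default GL_TEXTURE_MIN_FILTER (GL_NEAREST_MIPMAP_LINEAR) samples
// mipmaps; without them the texture can come out black.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);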

Unfortunately I ran out of time before implementing cameras fully.
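Had there been time, the remaining step would most likely have been uploading the view matrix computed in the main loop to the vertex shader each frame. A sketch, assuming the vertex shader declares a mat4 uniform named "view" (which mine, as sketched earlier, does not yet):

// Requires <glm/gtc/type_ptr.hpp> for glm::value_ptr.
GLint viewLocation = glGetUniformLocation(shaderProgramHandle, "view");
glUniformMatrix4fv(viewLocation, 1, GL_FALSE, glm::value_ptr(view));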

Bibliography

Khronos. (2017). GLUT – The OpenGL Utility Toolkit. [online] Opengl.org. Available at: https://www.opengl.org/resources/libraries/glut/ [Accessed 24 Apr. 2017].

Libsdl.org. (2017). Simple DirectMedia Layer – Homepage. [online] Available at: https://www.libsdl.org/ [Accessed 24 Apr. 2017].

Glew.sourceforge.net. (2017). GLEW: The OpenGL Extension Wrangler Library. [online] Available at: http://glew.sourceforge.net/ [Accessed 24 Apr. 2017].

Glfw.org. (2017). GLFW – An OpenGL library. [online] Available at: http://www.glfw.org/ [Accessed 24 Apr. 2017].

Sfml-dev.org. (2017). SFML. [online] Available at: https://www.sfml-dev.org/ [Accessed 24 Apr. 2017].

En.wikipedia.org. (2017). Wavefront .obj file. [online] Available at: https://en.wikipedia.org/wiki/Wavefront_.obj_file [Accessed 24 Apr. 2017].


Final Iteration Update and Conclusion

Now that the academic year is coming to a close, we will be ending any further development of the game for now. Overall I am quite happy with how the game turned out, considering some of the problems we faced along the way.

Hero Engine turned out to be a very capable engine; however, it also had a steep learning curve, and I think a lot of the team, including myself, had problems learning how to use it and the Hero Scripting Language.

We put together our final level using assets supplied with Hero Engine as well as some custom assets we made ourselves. I think the final level looks quite nice, and a lot better than I initially expected.

Oliver also used SpeedTree (Speedtree.com, 2017) to create a tree that we used in the level:

[Screenshot: Oliver's SpeedTree tree placed in our level in HeroBlade]

Overall, though, we didn't quite get every aspect of the scripting side ready, due to the time spent on the previous game idea and the time spent learning HSL and the server-client architecture (Hewiki.heroengine.com, 2012) that Hero Engine uses.

We did start to overcome some of the more difficult tasks towards the end of the project, though, such as splitting characters up into different teams and getting the round-start timer working.

I think we would have been able to do a lot more had we not spent so long on a project that wasn't feasible in the first place. However, it was quite hard to tell whether a project was feasible, given our lack of knowledge of the engine starting out.

We also lacked project management for the first half of the project, which set us back quite a lot because we didn't have a clear direction or goal. Once we switched to using Trello (Trello.com, 2017), we were all producing a lot more work.

Despite this, it was interesting to use a young game engine like Hero Engine instead of one of the more mature ones such as Unity (Unity, 2017) or Unreal (Unrealengine.com, 2017). The cloud-based editing (Heroengine.com, 2017), especially, proved very useful when world building. That said, I am sure we could have got the game into a much more finished state had we used Unity, simply because we are all far more experienced with it.

The biggest downside to Hero Engine for us was definitely the learning curve. While the wiki did provide many articles to help us learn, it wasn't easy to find the information we were looking for, and we had to ask on the forums a few times as a result. Hero Scripting Language also added to the list of things we had to learn, which wasn't great; I would have been much happier if it used an already established scripting language such as Lua (Lua.org, 2017) or Python (Python.org, 2017).

Bibliography

Heroengine.com (2017). Real-time Collaborative World Building. [online] Available at: http://heroengine.com/heroengine/heroengine-details/world-building [Accessed 21 Apr. 2017].

Hewiki.heroengine.com (2012). Engine infrastructure – HEWIKI. [online] Available at: http://hewiki.heroengine.com/wiki/Engine_infrastructure [Accessed 21 Apr. 2017].

Lua.org (2017). Lua: about. [online] Available at: https://www.lua.org/about.html [Accessed 21 Apr. 2017].

Speedtree.com (2017). SpeedTree Animated Trees & Plants Modeling & Render Software. [online] Available at: http://www.speedtree.com/ [Accessed 21 Apr. 2017].

Trello.com (2017). Trello. [online] Available at: https://trello.com/ [Accessed 21 Apr. 2017].

Unity (2017). Unity – Game Engine. [online] Available at: https://unity3d.com/ [Accessed 21 Apr. 2017].

Python.org (2017). Welcome to Python.org. [online] Available at: https://www.python.org/ [Accessed 21 Apr. 2017].

Unrealengine.com (2017). What is Unreal Engine 4. [online] Available at: https://www.unrealengine.com/what-is-unreal-engine-4 [Accessed 21 Apr. 2017].

System Nodes in Hero Engine

Today I did some more research on splitting players' characters into two teams. After reading the Adapting Clean Engine article on the wiki, I learnt that there are several default Hero Engine scripts that should be replaced with your own. This lets you implement custom behaviour for system nodes, which is important for our game.

Some examples of the system nodes that are overridden in the tutorial are below:

[Image: some example classes that can be overridden to implement custom behaviour in Hero Engine]

For example, you can see that the $WORLD system node has a default class called 'HE_World' and a default script for that class called 'HE_WorldClassMethods'. The tutorial then creates a class called 'GameIDWorld', where GameID is simply a prefix you choose for your game. The methods attached to that class live in 'GameIDWorldClassMethods'; they extend/override the methods found in 'HE_WorldClassMethods' and implement custom behaviour for the World Server's event callbacks.

This is just one of many different classes that can be overridden to implement custom behaviour in Hero Engine.


Using the HeroEngine repository browser

Adding files to a repository associated with a project is very straightforward in HeroEngine. First of all, you need to download the repository browser from the HeroEngine dashboard after logging in to the HeroEngine website (see step 3 below).

[Screenshot: the HeroEngine dashboard. Hint: Step 3]

Now that you have the repository browser, open the downloaded file and it will fetch the assets for your project. Once this has completed, you can log in and it will open the HeroEngine Repository Browser window.

[Screenshot: the HeroEngine Repository Browser window]

From here, you can use the local file browser, located in the upper half of the window, to drag files to the remote repository in the lower half. And that's it, very simple.

Hero Engine Texturing

This week I focused on learning how to apply textures to the assets we use in our level. It's actually surprisingly simple: all you have to do is select the asset in HeroBlade:

[Screenshot: selecting an asset in HeroBlade]

Then, in the properties panel that appears, select the _MaterialOverrides property and click the button with the three dots ('…'). This brings up the Material Instance Properties window, which lets you choose which texture to apply to the object.

You can also specify normal and specular maps, macro maps and custom shaders. Since I didn't have access to many textures and don't know how to make them myself, I just used the default ones supplied with Hero Engine. This worked well enough for our prototype.

One caveat I ran into was that it can only load DDS textures. It would be really helpful if I could load PNG files and have them mapped to textures automatically. Other than that, I thought texturing was really well done in Hero Engine.

Project Management

Last week we visited Derivco, a software house that develops online gaming systems. We showed them our progress so far and discussed where we were heading and our goal for the end of the semester, which is a simple prototype. We also discussed how we could make sure we achieve that goal, and one of the things I brought up was that we should be using project management tools more effectively.

Trello has been our main PM tool, but we haven't really been using it to its full extent until today. Sprint check-ups happened inconsistently and development velocity wasn't being recorded properly.

As of today, my role has been changed to focus more on the PM side of the project so that we can achieve a good amount of progress every week by setting goals for everyone.

This week we made a lot more progress than in previous weeks, mainly because tasks were being set for everyone.

Since everyone is used to Jira from their group projects, Trello isn't too different: there are cards, lists and boards. Cards are similar to issues/stories/tasks; lists can be created to hold cards, similar to the 'to do', 'in progress', 'verify' and 'done' sections in Jira; and boards work the same as in Jira.

[Image: a Trello board, showing off lists and cards]

We currently use a layout consisting of a Backlog list, containing all of the tasks we foresee completing at some point in the future; Documentation, which encompasses things like design document amendments and other write-ups; Design, which covers level design, modelling and asset management; and Scripting, which contains anything to do with programming.

[Image: an overview of a card in Trello]

Creating cards is really easy, and there are keyboard shortcuts for all the common tasks, which makes setting up our weekly sprints super fast.

So far I can recommend Trello to anyone who wants to incorporate Agile development into a small project like ours.

Hero Scripting Language Learning – 1

After completing some of the basic Hero Scripting Language tutorials from the wiki, I feel I have a much better understanding of how the engine works in terms of scripting.

I now understand that the client and server each have their own DOM (Data Object Model) and GOM (Game Object Model), and that in order to create a script to interact with these, you must first specify whether the script is server-side or client-side.

[Screenshot: the Client and Server DOM Editor found in the HeroBlade editor]

A DOM stores information such as fields, classes and enums, and using the DOM Editor we can create new ones. When you create a field you can specify its type and give it a description and various other properties. When you create a class you can do the same, but also specify which fields you want the class to reference and use.

Then, using scripts, you can instantiate these classes as 'Nodes' and use them as you would objects in any other programming language. It's worth noting, however, that once you instantiate a class, the node you create is added to the GOM.

In order to create methods for these classes, you need to create a separate script named in the format '<class name> + ClassMethods'. 'For example, a class Foo would look for a script named FooClassMethods.' ('HSL For Programmers – HEWIKI')

Most of the information is from the ‘HSL For Programmers’ article on the wiki.

Bibliography

“HSL For Programmers – HEWIKI”. Hewiki.heroengine.com. N.p., 2017. Web. 6 Feb. 2017.