Am I really ready to Graduate?

I don’t think my aims have changed since starting this course.  I wanted to marry my animation skills with programming and either create my own games and apps, or use those skills to launch a late career in the industry.

C++ nearly broke me, and perhaps that is a sign that I am too old to be switching between languages.  What the last two studio subjects have taught me is that I want to be a Unity developer when I graduate.

If I am reading the job market correctly, the main concerns left for me are to gain an understanding of the Mecanim system and animation trees, and to get experience with a range of version control tools (my only experience to date has been Git).

Next trimester I am working on the Final Project, which means I will be looking at animation trees over the holidays and implementing a few of my previous animations, so that I understand how they work and how best to leverage them.  That way I will be ready to implement animations as soon as we return from the holiday break.

Also next trimester, I will be working with a group of ex-QANTM students who have their own studio and are trying to get their game “Hands Off” greenlit through Steam.  I will be doing rapid prototyping for them: new game ideas, plus improvements and new concepts for the “Hands Off” title.  They will expect me to be comfortable using Bitbucket and TortoiseHg, which I will also be looking into over the holiday break.  As I am already conversant with Git, I’m sure that moving to Bitbucket and TortoiseHg will not be a culture shock.

I think it will be a very busy lead-up to the end of the year, but I am looking forward to it.  Perhaps that is because I am finished with C++, but I can’t just leave it there.  Some of the advertisers looking for Unity developers also recommend Unreal experience.  While I have used Unreal to show off modelling assets and have used the Kismet system, I doubt that this would be enough to land a job, so I will need to continue with my C++ training and try to become a lot more proficient in it.  Who knows, I might even begin to like it.


Postmortem on Pulse Monitoring Plug-in for Designer’s Game

I found this an unusual process.  To date I have been deeply immersed in most of the games that I have collaborated on.  The closest comparison was my involvement with “Valour”, where the programmers went in hard at the start, set a lot of things up, and then had much less contact with the designers as the designers advanced their ideas through the gameplay.

Even mentioning “Valour” sends shivers down my spine, and it is no comparison to the final product of “Be Brave”, created by Savik Fraguella.  My involvement here was even smaller, in the scheme of things.  I created the software to get Unity to accept an incoming pulse beat from the wearer and feed that pulse rate into a system that would emit a “Fear Factor” rating, which Savik could then use to influence the content of his game.

Over the space of about 2-3 weeks, I completed this task and handed it off to Savik, then waited for any problems that might emerge.  There was one situation where the program would quit and then restart as the level changed, but this was quickly overcome by making it a persistent singleton that starts during the main menu, remains through the levels, and destroys itself on quitting or completing the level(s).
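For anyone curious, the persistent-singleton pattern I used looks roughly like this.  This is a minimal sketch; the class and method names are illustrative, not the actual project code:

```csharp
using UnityEngine;

// Hypothetical sketch of the persistent singleton described above.
public class PulseReader : MonoBehaviour
{
    public static PulseReader Instance { get; private set; }

    void Awake()
    {
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);           // a copy already survived the scene load
            return;
        }
        Instance = this;
        DontDestroyOnLoad(gameObject);     // survive level changes
    }

    public void Shutdown()
    {
        Destroy(gameObject);               // called on quit or level completion
    }
}
```

The key call is DontDestroyOnLoad, which keeps the object alive across scene loads; the Awake check stops a second copy spawning when the menu scene is revisited.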

Savik, at one stage, did borrow the equipment needed to test the game, but he either didn’t install the FTDI drivers, or there was a problem installing them.  I referred him to my blog post regarding the project for installing them.

As we got closer to Open Day, where the game would be displayed, I regularly asked Savik whether he wanted the Arduino to test the game, but he wasn’t interested, focusing instead on the artwork.  It wasn’t until the day before Open Day that I realised that there were some problems.

The pulse rate wasn’t getting through to the Unity build.  (The editor still processes the information quicker than the build, but that is another story.)  To stop the incoming data from clogging the build’s ability to run, I narrowed what the Arduino sends down to only the information I could use, which was the actual beat rate.

Video showing the Pulse rate through a simple GUI and the Fear Rating being influenced by the pulse rate.  (I must apologise for spelling “Serial” wrong)

With some help from my tutor, who performed some Unity black magic, I was able to get the build to work properly.  When I find out what black magic was performed, I will amend this post with the information.

Savik showed some quite advanced scripting skills that impressed me, both in knowing that these techniques existed in the first place and in applying them in a game setting.  Using Prim’s algorithm to randomly generate a level so that the exit was never close to the entrance was among the concepts I found interesting.  I would do well to sit with Savik and discuss how he came to implement them, along with other ideas such as, apparently, using the fear state to assist in the generation of the level.

What I have learned through this project is not to adopt a “set and forget” mentality.  Had I remained closer to Savik as he progressed through making the game, I would have been aware of the problems with my component much earlier, and I would have gained a better understanding of how he utilised and adapted those scripts that left me with such a positive impression.


Flocking Agent and the optimisation thereof

We were given a flocking simulation that needed to be optimised and customised for others (designers) to be able to use.

My first thought was to build a custom user interface screen where the different variables could be set, but this is not Unity, it is C++, and that is a hell of a task for me.

To start, I did set it up so that the number of prey and predators could be entered on the main screen before the simulation processed the information and ran.

I decided that a CSV file would be the best option for a budding developer to input the ranges of the variables affected by this simulation.  Unfortunately, I could not find a way to hard-code limits for the variables to fall between.  I could have made the limits part of the CSV file, but then the designer could change them, by design or by accident.
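The project itself was C++, but sketched in C# (the language I am most at home in), the idea of keeping the limits in code rather than in the file would look something like this.  The file format, variable names and ranges here are all invented for illustration:

```csharp
using System;
using System.IO;

// Illustrative only: assumes a one-line CSV such as "500,12"
// meaning preyCount,predatorCount.  The clamp ranges live in code,
// so a designer cannot change them from the file.
static class FlockConfig
{
    public static (int prey, int predators) Load(string path)
    {
        string[] parts = File.ReadAllText(path).Trim().Split(',');
        int prey      = Math.Clamp(int.Parse(parts[0]), 10, 2000);
        int predators = Math.Clamp(int.Parse(parts[1]), 1, 50);
        return (prey, predators);
    }
}
```

Anything the designer types outside the range is silently pulled back inside it, whether the out-of-range value was deliberate or an accident.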

I also wanted to set up pack behaviours for the predators so they would form their own groups and chase down the prey together.  As part of this, I would need a quadtree for all the flocking agents, in part to minimise the number of checks each agent makes against every other agent in the simulation.
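With hindsight, the shape of the thing I was trying to build looks something like this.  A minimal point-quadtree sketch in C# (since C++ and I do not get along), using Unity’s Rect and Vector2 types; my own illustration, not working project code:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Each node covers a square region and splits into four children once it
// holds more than 'capacity' agents; neighbour queries then only visit
// nodes whose region overlaps the search area.
public class Quadtree
{
    readonly Rect bounds;
    readonly int capacity;
    readonly List<Vector2> points = new List<Vector2>();
    Quadtree[] children;   // null until subdivided

    public Quadtree(Rect bounds, int capacity = 4)
    {
        this.bounds = bounds;
        this.capacity = capacity;
    }

    public bool Insert(Vector2 p)
    {
        if (!bounds.Contains(p)) return false;
        if (children == null && points.Count < capacity)
        {
            points.Add(p);
            return true;
        }
        if (children == null) Subdivide();
        foreach (var child in children)
            if (child.Insert(p)) return true;
        return false;
    }

    void Subdivide()
    {
        float w = bounds.width / 2f, h = bounds.height / 2f;
        children = new[]
        {
            new Quadtree(new Rect(bounds.xMin,     bounds.yMin,     w, h), capacity),
            new Quadtree(new Rect(bounds.xMin + w, bounds.yMin,     w, h), capacity),
            new Quadtree(new Rect(bounds.xMin,     bounds.yMin + h, w, h), capacity),
            new Quadtree(new Rect(bounds.xMin + w, bounds.yMin + h, w, h), capacity)
        };
        foreach (var p in points)               // push existing points down
            foreach (var child in children)
                if (child.Insert(p)) break;
        points.Clear();
    }

    // Collect all points inside 'area' without touching unrelated branches.
    public void Query(Rect area, List<Vector2> results)
    {
        if (!bounds.Overlaps(area)) return;
        foreach (var p in points)
            if (area.Contains(p)) results.Add(p);
        if (children != null)
            foreach (var child in children)
                child.Query(area, results);
    }
}
```

Each agent would then query only a small rectangle around itself instead of looping over every other agent in the simulation.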

This represented the first major problem of this challenge.  I could find plenty of assorted code pieces and pseudo code to do the job, but I had no idea how to implement them into my scene.

The other major problem was introducing collisions, whereby the predators would eat their prey once they caught up with them.  I struggled with these for the better part of two weeks.  I was able to implement the collisions but had to drop the quadtree because time was nearly up for delivery of this project.

My only optimisation was standard OpenMP threading applied to a for loop that went through every prey and then looped through every prey again, checking positions and group tendencies, but this had to be removed because it was no longer thread safe once the predators consumed any prey they caught.
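Sketched in C# (with Parallel.For standing in for the OpenMP pragma I actually used), the safe version of that nested loop looks something like this.  The names and data are invented for illustration:

```csharp
using System;
using System.Threading.Tasks;

class NeighbourPass
{
    // Each prey's squared distance to its nearest neighbour.
    static float[] NearestSquared(float[,] pos)
    {
        int n = pos.GetLength(0);
        float[] nearest = new float[n];

        // Parallel outer loop: safe because every iteration only *reads*
        // 'pos' and writes to its own element of 'nearest'.
        Parallel.For(0, n, i =>
        {
            float best = float.MaxValue;
            for (int j = 0; j < n; ++j)
            {
                if (i == j) continue;
                float dx = pos[i, 0] - pos[j, 0];
                float dy = pos[i, 1] - pos[j, 1];
                best = Math.Min(best, dx * dx + dy * dy); // squared distance, no sqrt
            }
            nearest[i] = best;
        });
        return nearest;
    }
}
```

The moment an iteration mutates shared state, for example removing an eaten prey from the collection mid-loop, that guarantee disappears, which is why I had to pull the threading out.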

The full features of my flocking simulation can be seen in the above video: the CSV file that initialises the simulation, the simulation itself, the CSV file showing the five-number summary of important information on closing the simulation, and the heat map generated by the simulation.

The feedback showed that this problem went into far more depth than I thought we needed to go.  There were many more options for optimisation than I thought possible.  I could have gone through the code and replaced division, where applicable, with multiplication by a precomputed reciprocal, for example rewriting x/3 as x * 0.3333f (one third, precomputed).  Multiplication happens quicker than division.
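A tiny sketch of what that looks like in practice.  Note that the constant has to be the actual reciprocal (x * 0.3f is not x / 3f), and the two forms can differ in the last bit of floating point rounding:

```csharp
static class FastMaths
{
    // Divide once at compile time, multiply at run time.
    const float inv3 = 1f / 3f;   // ~0.33333f, the reciprocal of 3

    public static float Slow(float x) { return x / 3f; }    // division in the hot loop
    public static float Fast(float x) { return x * inv3; }  // same result, cheaper op
}
```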

I could have ordered my if statements so that they ran true in most cases.  A branch that goes the predicted way lets the CPU carry on with the instructions it has already fetched, whereas a mispredicted branch means new instructions need to be loaded before the program can continue.

I could also have changed the code for checking distances.  Instead of checking the distance between two vectors, I could have taken the sqrMagnitude of the difference between the two vectors and checked whether that float was less than a certainDistance * certainDistance.

For example:

float distanceApart = 5f;
float offset = (other.position - transform.position).sqrMagnitude;

if (offset < distanceApart * distanceApart)
{
    //do stuff
}


The reason this is quicker is that computing the actual distance involves a square root operation, which is slower than using sqrMagnitude, where everything stays as multiplication and comparison.

It is unfortunate that I didn’t have another three weeks to work on this, but by the same token, I didn’t understand how much depth was required to “solve” most of this problem.  I thought implementing a quadtree (or similar) and some form of threading would suffice.  If I had it to do again, I would optimise the hell out of it, but without decent C++ skills, I would still struggle to implement a quadtree.

Why blogging can help organise your mind and open up unexplored avenues

This is yet another blog post that has changed since I started writing it.  It was to be “Unity 5 and my failed attempt to access facial tracking using OpenCVSharp”.  I will now try to show how blogging can help to organise your mind and leave you open to possible solutions.  I will still tag this post with Unity 5, OpenCVSharp, webcam and facial tracking, just in case someone else is having problems getting it to work in Unity 5.  The first part will briefly explain how far I got before facial tracking worked in Unity, and then explain how this process helped me find the solution.

Firstly, if you have a webcam attached to your computer and aren’t sure it is working, this Windows trick will save you a heap of time.  Go to your start bar, type in “Camera”, and click on the Camera button.  If it asks you to connect a camera, return to the main screen and press Fn-F6.  This will turn your camera back on and you should be right to go.  Now if you click on the camera, your ugly mug should come up on the screen and you can take a quick photo of your self-satisfied look after saving some time on Google.  Please do it, because that will be the last time you will see that look for some time.

Let me explain why this is the case.  Unity 5 is a 64 bit program and needs 64 bit versions of all the relevant DLLs.  More commonly these are 32 bit, and they will not work in Unity 5.  The 64 bit versions can be found at this link:

My search for a working version of facial tracking led me to this site: and from there I was able to download the relevant Unity package.  It quickly became obvious that it did not include facial tracking, and that without more research I wouldn’t get it to work.

I probably spent more time than it deserved, but finally found a working script that tracked faces from this obscure site:

It worked perfectly in Unity 4.x, but importing it directly into Unity 5 gave me errors.  I figured this was due to the 32 bit DLLs in the original project.  I changed the “OpenCvSharp” and “OpenCvSharp.MachineLearning” DLLs to the 64 bit versions, but ended up with an error, something along the lines of there being an error reading from the “haarcascade_frontalface_alt.xml” file.

I gave up at this point, because there was so much other work expected from me, and this was a prototype for a game that was using a pulse monitor that seemed to be working quite well.  My lecturer convinced me that I should do a blog about this failure so that it would 1. show the work I had done on researching this problem and 2. give anyone else the option to see what I had done and perhaps save them some work on researching and getting to the same point.

Because some time has gone by, I really can’t remember exactly what steps I went through to get to this point, but the ones that I do remember are listed above.

As I was trying to recreate this, I was working out what I had and hadn’t done, so I made a copy of the Unity 4.x version and re-imported it into Unity 5, then replaced the DLLs with the 64 bit versions (as mentioned above).  Then I wondered whether there were any other sneaky DLLs that might need replacing, and I did find some.  There were five DLLs that I tried to find 64 bit versions of, but they were only available in 32 bit, and the fact sheets seemed to say they were fine to use in a 64 bit situation.  Then I found “OpenCvSharpExtern.dll”, which I knew could be a 64 bit DLL but was still 32 bit.  I replaced it with the 64 bit version and tried the face tracking again.  Face tracking is now operational in Unity 5.

Face detection working in Unity 5


There is a slight bug, however.  It will work the first time you play a scene, but stopping it and trying to run it again will cause Unity to freeze, or become unresponsive.  This is likely due to the one red error that it produces at the start of the run.

The error itself reads:

MissingFieldException: Field 'OpenCvSharp.DisposableCvObject._ptr' not found.
FaceDetectScript.Start () (at Assets/Scripts/FaceDetectScript.cs:43)

The error relates to this line of code found in the FaceDetect script:

CvSVM svm = new CvSVM ();

The main things I found trying to research this error are and .

They both appear to be part of shimat’s source code for creating the DLLs.  I am horrible with pointers in C++ and didn’t think they were supported in C#, so I will try to find a solution to this problem and will edit this post within two weeks.  NOTE: it will appear below this with some sort of edited heading.

*****EDITED 21/07/15*****

The solution seems to be to comment out the following lines of code:

CvSVM svm = new CvSVM ();
CvTermCriteria criteria = new CvTermCriteria (CriteriaType.Epsilon, 1000, double.Epsilon);
CvSVMParams param = new CvSVMParams (CvSVM.C_SVC, CvSVM.RBF, 10.0, 8.0, 1.0, 10.0, 0.5, 0.1, null, criteria);

They are not used elsewhere in the program and seem somewhat incomplete in their use.  CvSVM is a Support Vector Machine, which is given labelled training data; an algorithm then outputs an optimal hyperplane that can categorise any new examples.

CvTermCriteria is the termination criteria for the iterative algorithm that the CvSVM uses, and CvSVMParams are the parameters used to train the CvSVM.

Usually “Train” and “Predict” methods are called on the SVM to get it to function and adequately predict something.

I have no idea why this code was left in the project but it is obviously incomplete and the facial tracking works without them being included.


This project frustrated me in several ways.  Firstly, for so long I thought I was so close to having it working.  Secondly, I spent too long on it without making accurate notes and recording my results, which makes it harder to prove what research you have done on a subject.  And thirdly, if I had done the above, I could have stumbled on the solution a lot sooner.

What I aim to do in the future, and I highly recommend to others, especially any students, is to record what is happening with your research, even if you save it as a draft.  Actively copy links and what you are looking at, or for, as you go.  In this way, it is all laid out and you might be able to see holes in your logic or thinking as you progress.  At worst, leave it for a week or two and approach it with fresh eyes.

For something completely different.

For a recent side project in Unity, I was trying to find out if certain game objects were seen by the main camera.

To sum up what I was trying to do, I wanted enemies to be highlighted (using a basic outline shader) whenever they were on screen with the player.  This meant finding whether they were on screen, collecting their game objects into a list, and cycling through that list to change which enemy is the current target for an attack.

My initial research led me to OnBecameVisible() and its opposite, OnBecameInvisible().  These are supposed to fire when any part of the object, including its shadow, comes on screen and needs to be rendered.  What sounded good on paper failed to work in reality.  One drawback is that it is hard to test in the Unity editor, because the editor’s scene-view camera also affects the outcome (allegedly).  I say allegedly because I could not get it to work regardless.
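For reference, the callback pair I tried looks like this.  A minimal sketch (the class name is mine); the component has to live on the same GameObject as a Renderer, and in the editor the scene-view camera counts as “seeing” the object, which is what made testing so awkward:

```csharp
using UnityEngine;

// Unity calls these when the attached renderer starts/stops being
// visible to *any* camera (scene view included, in the editor).
public class VisibilityProbe : MonoBehaviour
{
    public bool OnScreen { get; private set; }

    void OnBecameVisible()   { OnScreen = true;  }
    void OnBecameInvisible() { OnScreen = false; }
}
```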

What I ended up relying on were two GeometryUtility functions: CalculateFrustumPlanes and TestPlanesAABB.

On the script for the object I wanted to test, I used these variables:

private Plane[] planes;
public Camera cam;
private Collider coll;

The main camera is dragged onto the Camera slot in the inspector, although with a prefab it might need to be assigned in code.  In the Start() function, I used this piece of code to access the collider:

coll = GetComponent<Collider>();

In Update(), because the camera moves with the player, I need to constantly tell the script what its planes[] are, using:

planes = GeometryUtility.CalculateFrustumPlanes(cam);

The script would then announce that it was visible by calling a small bool function:

if (IsVisible()) { /* insert mad code here */ }

This was a very simple function that returned true when the collider could be “seen” by the main camera and false when it couldn’t.

bool IsVisible()
{
    return GeometryUtility.TestPlanesAABB(planes, coll.bounds);
}
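Assembled into one place, the pieces above look like this (my cleaned-up version of the prototype script; the class name and the Camera.main fallback are my additions):

```csharp
using UnityEngine;

public class FrustumVisibility : MonoBehaviour
{
    public Camera cam;          // drag the main camera in via the inspector
    private Plane[] planes;
    private Collider coll;

    void Start()
    {
        coll = GetComponent<Collider>();
        if (cam == null) cam = Camera.main;   // fallback for prefabs
    }

    void Update()
    {
        // the camera moves with the player, so refresh the planes every frame
        planes = GeometryUtility.CalculateFrustumPlanes(cam);
        if (IsVisible())
        {
            // ..insert mad code here..
        }
    }

    bool IsVisible()
    {
        return GeometryUtility.TestPlanesAABB(planes, coll.bounds);
    }
}
```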

This code was used for a prototype to prove that I could implement the mechanic, although after implementing it, I realised it was rather boring for the purpose I had envisaged and it will likely be scrapped.  One of the rules of game design is that you have to be prepared to “kill” your children.  As a father, I find this a horrible metaphor, but it is true all the same.

My main concern with this approach was whether, in the long run, the processing and garbage generated would be worth knowing what was visible.  My thinking is that it can’t be worse than raycasting every frame to tell whether the player is grounded.

I am posting this in case I have to revisit the concept in the future (3 strokes plays havoc with the short term memory) and if anyone else needs to know a working way to detect if something is currently on screen.

C++ pseudo code, header implementation and how-to videos are the reasons why I am not liking C++

While I am sure that C++ is a great language for knuckling down into memory allocation and stack organisation, if you have no idea how to implement the language into headers and use the resulting functions correctly, it is all but useless.

That is my situation at the moment.  Over the last few weeks, I have been subtly expressing my rage with comments that C++ makes me want to be a Unity developer, and I have finally understood why.

My latest two inquiries into the realms of supposed knowledge in the C++ world have been about how to create and implement a quadtree and collisions using Box2D contact information.

There is a wealth of information out there about problems encountered with quadtrees, if you already know how to set them up.  There is ample pseudo code detailing what is happening with them.  There are several complete pieces of code that outline the headers and .cpp files but alas, they don’t quite fit the bill for me.

Wikipedia has a great section on quadtrees:

It includes some pseudo code outlining how a quadtree is made, created, called and accessed.  The problem for me is that I haven’t the foggiest idea how to turn this pseudo code into headers and then into viable functions.

Box2D seems to be amazing, but again, only if you know how to use it in the first place.  While there are “tutorials”, they give no clue how to implement the header file, and there are no solutions for the errors that arise from the inevitable mistakes in writing the code.  I know that people aren’t born with this remarkable understanding, but I am buggered if I can find out how they came across this knowledge.  It seems to be on some secret internet that I have no way to access.

Any attempt to locate tutorial videos only reveals the startling results that various programmers have achieved with whatever it is you are looking for.

My problem is that tutorials can be found for Unity that step you through the process so that you know what you are doing and why.  This has led me to believe that, on the internet, tutorials and forums are set up for a vastly different user experience when it comes to Unity versus C++.

Unity and C#/JS videos go the extra mile to make sure the viewer knows what is being done and why.  C++ forums are set up for experienced programmers encountering unusual problems and troubleshooting them.  C++ “how-to” videos are almost non-existent, and C++ forums do not seem to cater for inexperienced C++ programmers.  When they do try to help beginners, the language is hard to understand and their expectations of your knowledge are beyond my abilities.

Over the last 5-6 weeks, I have spent countless hours researching quadtrees and other spatial partitioning methods, Box2D collisions, header bloat, making a GUI interface, networking, installing libraries into Visual Studio, and installing OpenMP into Visual Studio.

From those countless hours, I have achieved installing OpenMP and libraries into VS, plus bits and pieces of code and pseudo code that I can’t implement.

The reason I am not liking C++ is because it feels like I am learning to code with heavy blinkers on my eyes and both hands tied behind my back.

If I should stumble across any user friendly C++ tutorials and sites, I will edit this post with their addresses, but I don’t expect you to hold your breath waiting for them.

Unity 5 and the recording of pulse rate with the Arduino.

One of my tasks this trimester is to take feedback from something other than standard game controllers or the mouse, and make use of it.

I have teamed up with designer Savik Fraguela (yet again), and I am providing bio-feedback for his game.  To keep it inexpensive at this stage, I am using feedback obtained from an Arduino board and the software that comes with the “Pulse Sensor” from

This meant that I needed to download and install several software suites and the FTDI drivers.  The drivers were a pain in my case because I am using Windows 8, and this operating system hates unverified, or unsigned, drivers.  I had to get past that by following the directions on this site:

Installing the rest of the software was uneventful as was attaching the monitor to the Arduino board.

I was able to get the software (called a sketch in Arduino Studio) running and pushed onto the board.  I was also able to get the serial monitor showing what feedback was coming through the port.

My first problem came when writing my Unity script.  I kept getting errors saying that the serial port was not open.  By this stage I had returned the Arduino hardware to my lecturer, but I suspected it was because Arduino Studio was using the port to read the information.

This was confirmed over the weekend.  I ran Arduino Studio again but didn’t display the serial port in that suite, and Unity had no problems with the setup.

Now the fun part.  I am used to manipulating GameObjects, not so much strings, byte[] and char[].

I was trying to put the data from the serial port into a char array with a specified length of 5, because the data being sent was a letter indicating what sort of information was coming, then an int of 2 or 3 characters, then a carriage return.  Using serialPort.ReadByte(), I was able to display only the numbers and nothing else.

I eventually found (thanks Google) that I should be reading using serialPort.ReadLine() and copying that to a string.  This allowed me to read the whole line of data coming over the serial port in the format of S203, etc.

The incoming data was split into three categories: data with an S prefix, which gives the raw reading; data with a Q prefix, which is apparently some sort of time measurement since the last beat; and data with a B prefix, which is the one I wanted, as it is the beats per minute.  This is taken over 10 beats and averaged out for a minute.

C# strings are so easy to work with.  I thought I would have to split the string down into a char array and work with that, but as it turns out, a string can already be indexed like a char array: each element can be accessed as string[0], string[1], etc.  That was fine until I tried to use this to check the start of the string with if(string[0] == "B").  I was told that Unity couldn’t compare a string with a char.

More searching found that you could do if(string.StartsWith("B")), and Unity accepted that and it worked.  It had me beat at the time, but the difference is that the string indexer returns a char, while "B" is a string literal; comparing against the char literal 'B', as in if(string[0] == 'B'), would also have compiled.  I was now getting all of the data that had the B prefix.  I did notice that the readings fluctuated fairly steadily, and occasionally wildly.

Because there was no guarantee as to how long the ints were (they could be 2, 3 or 4 digits, although in the case of the B prefix it is fair to assume only 2 or 3), I made use of string.Length; strings are arrays, after all.

As only the B data was coming through at this point, I then created a new temp string, appended the contents of my original string minus the prefix, and converted that temp string into an int.

In order to settle the fluctuations down, I created a List of ints; once its count reaches 5, I take the average of those BPM readings (after removing the first member of the List) and return that average as the new BPM.

I know that Savik will be using an enum to gauge how scared the player is, and using that enum to ramp up the fear-inducing elements of his game.  If this list isn’t stable enough, I will look at making it longer, and even consider taking the highest and lowest values out, averaging the rest, and using that as the new BPM.
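If it comes to that, the trimmed average I have in mind would look something like this (a sketch, not code that is in the project yet):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class BpmSmoother
{
    // Drop the single highest and lowest readings, average the rest.
    public static int TrimmedBpm(List<int> readings)
    {
        if (readings.Count <= 2)
            return (int)readings.Average();   // too few samples to trim
        var sorted = readings.OrderBy(b => b).ToList();
        return (int)sorted.Skip(1).Take(sorted.Count - 2).Average();
    }
}

// e.g. readings { 58, 60, 62, 64, 90 } -> averages 60, 62, 64 -> 62
```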

In conjunction with this, we were considering using OpenCV and OpenCVSharp (and even some other frameworks) for facial tracking from a webcam, to supplement the BPM information.  I will write another blog about this regardless of whether I can get it working in Unity 5, so that someone can see the effort I have put into it and perhaps show me if and where I went wrong.  I did find a project that works in Unity 4, but there seems to be a problem with Unity 5 reading the haarcascade frontal-face .xml file.

In the meantime, here is the file that I used to get the pulse monitor working within Unity:

using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System.IO.Ports;
using System.IO;

public class SerialListener : MonoBehaviour
{
    public SerialPort serial = new SerialPort("COM3", 115200); // create new serial port
    private string beat = "";                       // string to hold the data in
    private List<int> beatPM = new List<int>();     // List to hold the beats

    void Start()
    {
        // open a new connection
        OpenConnection();
    }

    void Update()
    {
        beat = "";
        // Read data input ... should be a letter to start (char)
        // then 3 numbers and then carriage return
        beat = serial.ReadLine();
        //        print (beat);
        // Create an int that can be used by the Client
        int somethingToSend = AnalyseBeats(beat);
        print(somethingToSend);
    }

    void OnApplicationQuit()
    {
        serial.Close();
    }

    // Function connecting to Arduino
    public void OpenConnection()
    {
        if (serial != null)
        {
            if (serial.IsOpen)
            {
                serial.Close();
                print("Closing port, because it was already open!");
            }
            else
            {
                serial.Open();              // opens the connection
                serial.ReadTimeout = 1000;  // sets the timeout value before reporting error
                print("Port Opened!");
            }
        }
        else
        {
            if (serial.IsOpen)
            {
                print("Port is already open");
            }
            else
            {
                print("Port == null");
            }
        }
    }

    int AnalyseBeats(string str)
    {
        // Data comes in with an "S" header, a "Q" header or a "B" header.
        // S = raw data, "Q" is (allegedly) the time since the last beat was detected
        // and B is the one that we want: the beats per minute averaged over 10 beats.
        if (str.StartsWith("B"))
        {
            //            print ("Original data." + str);
            // create a new string
            string t = null;
            // append to t everything except the starting letter
            for (int j = 1; j < str.Length; ++j)
                t += str[j];
            // Convert that string into an int
            int temp = int.Parse(t);
            //            print ("Modified data." + temp);
            // add that int to the list
            beatPM.Add(temp);

            //        print ("BPM.count." + beatPM.Count);

            // Create an average of the BPM to give a more stable reading of the current pulse rate
            if (beatPM.Count >= 5)
            {
                int bAv = 0;
                for (int i = 0; i < beatPM.Count; ++i)
                    bAv += beatPM[i];
                bAv = bAv / beatPM.Count;
                // remove the first (oldest) member of the List
                beatPM.RemoveAt(0);
                //            Debug.Log("Average over last 5 beats = " + bAv);
                return bAv;
            }
        }
        return 0;
    }
}