Tag Archives: Unity 5

Postmortem on Pulse Monitoring Plug-in for Designer’s Game

I found this an unusual process.  To date I have been deeply immersed in most of the games I have collaborated on.  The closest comparison was my involvement with “Valour”, where the programmers went in hard at the start, set a lot of things up, and then had much less contact with the designers as they advanced their ideas through the gameplay.

Even mentioning “Valour” sends shivers down my spine, and it is no comparison to the final product of “Be Brave”, created by Savik Fraguella.  My involvement was even smaller, in the scheme of things.  I created the software to get Unity to accept an incoming pulse beat from the wearer and feed that pulse rate into a system that would emit a “Fear Factor” rating, which Savik could then use to influence the content of his game.
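The core idea of that mapping can be sketched roughly as follows.  This is only an illustration, not the real code: the class name, the baseline and the range are all assumptions, and the actual tuning would have been calibrated for the wearer.

```csharp
// Hypothetical sketch: map the wearer's beat rate onto a normalised
// "Fear Factor" that the designer can read. The names and numbers here
// are assumptions, not the values used in the real plug-in.
public static class FearFactor
{
    // Resting and maximum rates would really be calibrated per wearer.
    public const float RestingBpm = 70f;
    public const float MaxBpm = 140f;

    // Returns 0 (calm) .. 1 (terrified), clamped to that range.
    public static float FromPulse(float bpm)
    {
        float t = (bpm - RestingBpm) / (MaxBpm - RestingBpm);
        return t < 0f ? 0f : (t > 1f ? 1f : t);
    }
}
```

The designer then only ever sees a single 0–1 number, regardless of how noisy the raw sensor data is.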

I completed this task quickly, over the space of about 2-3 weeks, and handed it off to Savik to await any problems that might emerge.  There was one situation where the program would quit and then restart as the level was changed, but this was quickly overcome by making it a persistent singleton that could start during the main menu and remain through the levels, destroying itself when quitting or completing the level(s).
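The persistent-singleton fix can be sketched like this.  The component name is hypothetical; the pattern is the standard Unity one of a duplicate check in `Awake` plus `DontDestroyOnLoad`.

```csharp
using UnityEngine;

// Sketch of the persistent-singleton pattern described above.
// "PulseManager" and its methods are assumptions, not the real class.
public class PulseManager : MonoBehaviour
{
    public static PulseManager Instance { get; private set; }

    void Awake()
    {
        // If another copy already exists (e.g. after a level change
        // re-creates the scene), discard this one.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
        // Survive scene loads so the connection to the sensor
        // isn't torn down and restarted between levels.
        DontDestroyOnLoad(gameObject);
    }

    // Called when quitting, or when the level(s) are completed.
    public void Shutdown()
    {
        Instance = null;
        Destroy(gameObject);
    }
}
```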

Savik, at one stage, did borrow the equipment needed to test the game, but he either didn’t install the FTDI drivers, or there was a problem installing them.  I referred him to my blog post regarding the project for installing them.

As we got closer to Open Day, where the game would be displayed, I regularly asked Savik whether he wanted the Arduino to test the game, but he wasn’t interested, focusing instead on the artwork of the game.  It wasn’t until the day before Open Day that I realised that there were some problems.

The pulse rate wasn’t getting through to the build of Unity.  (The editor still processes the information quicker than the build does, but that is another story.)  I narrowed the information coming into Unity so that it would not clog the build’s ability to run.  Instead of collecting everything the sensor produced, I changed the Arduino to send only the information I could use, which was the actual beat rate.
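The Unity side of that narrowed stream can be sketched as below.  The port name and the “B72”-style line format are assumptions (some common Arduino pulse-sensor sketches prefix BPM lines with ‘B’), and the real firmware may differ.  Note that `System.IO.Ports` requires the full .NET API compatibility level in Unity.

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch: read only beat-rate lines from the Arduino, ignoring the rest
// of the sensor stream. Port name and "B<rate>" format are assumptions.
public class PulseReader : MonoBehaviour
{
    SerialPort port;
    public int BeatsPerMinute { get; private set; }

    void Start()
    {
        port = new SerialPort("COM3", 115200); // "COM3" is an assumption
        port.ReadTimeout = 10;                 // don't block the frame
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine();
            // Keep only beat-rate lines, e.g. "B72"; drop everything else.
            if (line.Length > 1 && line[0] == 'B')
            {
                int bpm;
                if (int.TryParse(line.Substring(1), out bpm))
                    BeatsPerMinute = bpm;
            }
        }
        catch (System.TimeoutException) { /* no data this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```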

Video showing the Pulse rate through a simple GUI and the Fear Rating being influenced by the pulse rate.  (I must apologise for spelling “Serial” wrong)

With some help from my tutor, who performed some Unity black magic, I was able to get the build to work properly.  When I find out what black magic was performed, I will amend this post with the information.

Savik showed some quite advanced scripting skills that impressed me, both in knowing that they existed in the first place and in applying them in a game setting.  Using Prim’s algorithm to randomly generate a level so that the exit was not close to the entrance was among the concepts I found interesting.  I would do well to sit with Savik and discuss how he came to implement it, along with other ideas such as, apparently, using the fear state to assist in the generation of the level.
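I don’t know the details of Savik’s implementation, but the general idea of randomised Prim’s algorithm for level generation can be sketched like this: grow the level outward from a start cell by repeatedly carving a random frontier cell, which naturally produces long, winding paths between entrance and exit.

```csharp
using System;
using System.Collections.Generic;

// Generic sketch of randomised Prim's algorithm for maze-style level
// generation -- an illustration of the idea only, not Savik's code.
// true = carved cell/passage, false = wall; cells sit on odd coordinates.
public static class PrimMaze
{
    public static bool[,] Generate(int width, int height, int seed)
    {
        var maze = new bool[width, height];
        var rng = new Random(seed);
        var frontier = new List<(int x, int y, int px, int py)>();

        // Carve a fixed start cell and seed the frontier around it.
        maze[1, 1] = true;
        AddFrontier(1, 1, width, height, frontier);

        while (frontier.Count > 0)
        {
            // Pick a random frontier entry, Prim-style.
            int i = rng.Next(frontier.Count);
            var (x, y, px, py) = frontier[i];
            frontier.RemoveAt(i);

            if (!maze[x, y])
            {
                // Carve the cell and the wall between it and its parent.
                maze[x, y] = true;
                maze[(x + px) / 2, (y + py) / 2] = true;
                AddFrontier(x, y, width, height, frontier);
            }
        }
        return maze;
    }

    static void AddFrontier(int x, int y, int w, int h,
                            List<(int, int, int, int)> f)
    {
        // Neighbouring cells two steps away, recording the current
        // cell as their parent so the connecting wall can be carved.
        if (x + 2 < w) f.Add((x + 2, y, x, y));
        if (x - 2 > 0) f.Add((x - 2, y, x, y));
        if (y + 2 < h) f.Add((x, y + 2, x, y));
        if (y - 2 > 0) f.Add((x, y - 2, x, y));
    }
}
```

Placing the exit at the last-carved cell (or the carved cell furthest from the start) is one simple way to guarantee it is not close to the entrance.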

What I have learned through this project is not to adopt a “set and forget” mentality.  Had I remained closer to Savik as he progressed through making the game, I would have been aware of the problems with my component much earlier, and I would also have gained a better understanding of how he utilised and adapted those scripts that left me with a positive impression.


Why blogging can help organise your mind and open up unexplored avenues

This is yet another blog post that has changed since I started writing it.  It was to be “Unity 5 and my failed attempt to access facial tracking using OpenCVSharp”.  I will now try to show how blogging can help organise your mind and leave you open to possible solutions.  I will still tag this post with Unity 5, OpenCVSharp, Webcam and facial tracking, just in case someone else is having problems getting it to work in Unity 5.  The first part will briefly explain how I got to the stage before facial tracking worked in Unity, and then I will explain how this process helped me find the solution.

Firstly, if you have a webcam attached to your computer and aren’t sure whether it is working, this Windows trick will save you a heap of time.  Go to your start bar and type in “Camera”.  You can then click on the Camera button.  If it asks you to connect a camera, return to the main screen and press Fn-F6 (or whatever your laptop’s camera toggle key is).  This will turn your camera back on and you should be right to go.  Now if you click on the camera, your ugly mug should come up on the screen and you can take a quick photo of your self-satisfied look after saving some time on Google.  Please do it, because that will be the last time you will see that look for some time.

Let me explain why this is the case.  Unity 5 is a 64-bit program and needs 64-bit versions of all the relevant DLLs.  More commonly, these are 32-bit, and those will not work in Unity 5.  The 64-bit versions can be found at this link: https://github.com/shimat/opencvsharp/releases

My search for a working version of facial tracking led me to this site: http://forum.unity3d.com/threads/opencvsharp-for-unity.278033/ and from there I was able to download the relevant Unity package.  It was obvious that it did not include facial tracking, and without more research I wouldn’t get it to work.

I probably spent more time than it deserved, but finally found a working script that tracked faces from this obscure site: http://ux.getuploader.com/aimino/edit/3

It worked perfectly in Unity 4.x, but importing it directly into Unity 5 gave me errors.  I figured this would be due to the 32-bit DLLs that the original project held.  I changed the “OpenCvSharp” and “OpenCvSharp.MachineLearning” DLLs to the 64-bit versions, but ended up with an error, something along the lines of a failure to read from the “haarcascade_frontalface_alt.xml” file.

I gave up at this point, because there was so much other work expected of me, and this was a prototype for a game using a pulse monitor that seemed to be working quite well.  My lecturer convinced me that I should write a blog post about this failure so that it would 1. show the work I had done researching this problem and 2. give anyone else the chance to see what I had done, and perhaps save them some work in researching and getting to the same point.

Because some time has gone by, I really can’t remember exactly what steps I went through to get to this point, but the ones that I do remember are listed above.

As I was trying to recreate this, I was working out what I had and hadn’t done, so I made a copy of the Unity 4.x version and re-imported it into Unity 5.  I then replaced the DLLs with the 64-bit versions (as mentioned above).  Then I wondered whether there were any other sneaky DLLs that might need replacing, and I did find some more.  There were five DLLs I tried to find 64-bit versions of, but they were only available in 32-bit, and their fact sheets seemed to say they were fine to use in a 64-bit situation.  Then I found “OpenCvSharpExtern.dll”, which I knew could be a 64-bit DLL but was still 32-bit.  I replaced it with the 64-bit version and tried to run the face tracking again.  Face tracking is now operational in Unity 5.

Face detection working in Unity 5

There is a slight bug, however.  It will work the first time you play a scene, but stopping it and trying to run it again will cause Unity to freeze, or become unresponsive.  This is likely due to the one red error that it produces at the start of the run.

The error itself reads:

MissingFieldException: Field ‘OpenCvSharp.DisposableCvObject._ptr’ not found.
FaceDetectScript.Start () (at Assets/Scripts/FaceDetectScript.cs:43)

The error relates to this line of code found in the FaceDetect script:

CvSVM svm = new CvSVM ();

The main things I found trying to research this error are https://github.com/shimat/opencvsharp/blob/master/src/OpenCvSharp.CPlusPlus/modules/core/Ptr.cs and https://github.com/shimat/opencvsharp/blob/master/src/OpenCvSharp/Src/DisposableCvObject.cs .

They both appear to be part of shimat’s source code for creating the DLLs.  I am horrible with pointers in C++ and didn’t think they were supported in C# (outside unsafe code), so I will try to find a solution to this problem and will edit this post within two weeks.  NOTE: It will be below this with some sort of edited heading.

*****EDITED 21/07/15*****

The solution seems to be to comment out the following lines of code:

CvSVM svm = new CvSVM ();
CvTermCriteria criteria = new CvTermCriteria (CriteriaType.Epsilon, 1000, double.Epsilon);
CvSVMParams param = new CvSVMParams (CvSVM.C_SVC, CvSVM.RBF, 10.0, 8.0, 1.0, 10.0, 0.5, 0.1, null, criteria);

They are not used anywhere else in the program and seem somewhat incomplete.  A CvSVM is a Support Vector Machine: given labelled training data, the algorithm outputs an optimal hyperplane which can categorise new examples.

CvTermCriteria specifies the termination criteria for the iterative algorithm that the CvSVM uses, and CvSVMParams holds the parameters used to train the CvSVM.

Usually “Train” and “Predict” methods are called on the SVM to get it to function and adequately predict something.

I have no idea why this code was left in the project, but it is obviously incomplete, and the facial tracking works without it.

**********

This project frustrated me in several ways.  Firstly, for so long I thought I was so close to having it working.  Secondly, I spent too long on it without making accurate notes and recording my results, which makes it harder to prove what you have done in researching the subject.  Thirdly, if I had done the above, I could have stumbled on the solution a lot sooner.

What I aim to do in the future, and highly recommend to others, especially students, is to record what is happening with your research, even if you only save it as a draft.  Actively copy links and note what you are looking at, or for, as you go.  That way it is all laid out, and you might be able to see holes in your logic or thinking as you progress.  At worst, leave it for a week or two and approach it with fresh eyes.