

One of my favorite aspects of level design is learning new tools, figuring out what features they offer, and understanding their strengths and weaknesses. What follows are a few examples of my excursions into different engines and level editors, and some of the things I learned about them.


HPL:

HPL is the proprietary engine of Frictional Games, the studio behind Amnesia: The Dark Descent. For a project, I used HPL2, the version of the engine behind Amnesia as well as its sequel, A Machine for Pigs.

Before this, I had been working in Hammer, the level editor for Valve’s Source Engine. The biggest difference is that Hammer is BSP-based, while HPL almost exclusively uses meshes. In practice, this means that in Hammer I could apply textures to any shape I could make in the editor, whereas in HPL I was limited to the modular assets provided with Amnesia. The levels I was creating in HPL immediately looked nicer, but I was constrained to the exact dimensions those assets allowed.

amontilladoInEditorScreenshot

A screenshot of one of my levels in the HPL level editor.

A big plus HPL had, though, was the ability to script events in the level. Using a high-level language called AngelScript, I was able to directly implement gameplay logic as well as create scripted sequences. The most common method for the latter was to have a single function repeatedly re-schedule itself with a timer; the sequence logic sits in a switch statement inside that function, stepping through its cases each time the function is called again. Here’s a truncated example:

AngelScriptCodeExample

A truncated example of how a Scripted Sequence I wrote is handled in AngelScript

Most of the logic here depends on the AddTimer() function, which takes three parameters: a name for the timer so that it can be terminated later, the amount of time in seconds it will last, and the name of the function to call when that time has elapsed. Provided the scripter keeps an int that they increment every time the function is called, the sequence plays out step by step using this method. The other function above, AddEntityCollideCallback(), sets up a trigger volume that starts the sequence at case 1.
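
The pattern boils down to something like the sketch below; the entity, sound, and timer names here are placeholders rather than the ones from my actual level:

    int sequenceStep = 0;

    void OnStart()
    {
        // When the player walks into the "AreaSequenceStart" trigger volume,
        // StartSequence() is called, which kicks the sequence off at case 1.
        AddEntityCollideCallback("Player", "AreaSequenceStart", "StartSequence", true, 1);
    }

    void StartSequence(string &in asParent, string &in asChild, int alState)
    {
        sequenceStep = 1;
        SequenceStep("");
    }

    void SequenceStep(string &in asTimer)
    {
        switch(sequenceStep)
        {
            case 1:
                // Slam a door shut, then wait two seconds before the next step.
                SetSwingDoorClosed("sequence_door", true, true);
                AddTimer("seqTimer", 2.0f, "SequenceStep");
                break;
            case 2:
                // Play a sound at the player, then wait again.
                PlaySoundAtEntity("seqSound", "scare_creak.snt", "Player", 0.5f, false);
                AddTimer("seqTimer", 3.0f, "SequenceStep");
                break;
            case 3:
                // Final step: activate a monster and stop scheduling the timer.
                SetEntityActive("sequence_grunt", true);
                break;
        }
        sequenceStep++;
    }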

The sequence from the example code above working in-game.


idTech 4:

idTech 4, a BSP-based engine, was made by id Software for their game Doom 3. For a project, I decided to make a short level for The Dark Mod, a game attempting to recreate Thief: The Dark Project in the engine.

idTech 4 has its own unnamed scripting language, which is very similar in syntax to C++. Using the $ symbol as an operator, scripts can access objects in the level directly (e.g. “$guard1.activate();”), as opposed to searching for them with strings as in AngelScript. Unfortunately, every function called needs to be a member of some sort of class, even if it isn’t class-specific. For these cases, the “sys” class (short for “system”) is used (e.g. “sys.waitFrame();”). Here’s some example code:

convScript

Some code I wrote to check that two guards are in place before running a conversation sequence.
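
The shape of that check is roughly the following; the entity names and distance threshold are placeholders, not the ones from my level:

    void start_hall_conversation()
    {
        // Poll every frame until both guards are close enough to their markers.
        while ( $guard1.distanceTo( $guard1_mark ) > 64 || $guard2.distanceTo( $guard2_mark ) > 64 )
        {
            sys.waitFrame();
        }

        // Both guards are in place, so fire the entity that runs the conversation.
        sys.trigger( $hall_conversation );
    }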

A lot of logic is also handled in the level itself through logic nodes: entities that each perform a specific operation. The Source engine has these too, though it presents a much cleaner way of hooking them up; many of idTech 4’s nodes are position-specific, which means that NPCs that play a lot of animations in one spot require a jumbled mess of nodes there. Scripting greatly condenses that, but nodes are still required for some things, such as communicating positions to NPCs. On the other hand, scripting can do things that nodes can’t, such as accessing certain member functions of different objects.

nodeMess

An in-editor example of logic nodes (purple) adding to clutter. The four yellow entities are each responsible for calling a function in the script, reducing the number of nodes needed.

Another feature of idTech 4 I was interested in was the GUI scripting that was created for Doom 3. This is actually separate from the level scripts I showed above, using a completely different syntax and set of commands, but it allows the creation of diegetic screens in a level that the player can interact with.

guiScripting

A truncated example of the GUI scripting syntax in idTech 4. This logic, attached to a GUI element called door1, checks the amount of power remaining when door1 is clicked, and updates the visuals of some GUI elements to reflect it.
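
Reconstructed from memory, the logic looks roughly like this; the window names, parm number, and command string are placeholders:

    windowDef Desktop {
        rect 0, 0, 640, 480

        windowDef powerReadout {
            rect 40, 40, 560, 40
            // gui_parm1 is fed in from the entity this GUI sits on.
            text "gui::gui_parm1"
        }

        windowDef door1 {
            rect 40, 120, 560, 320
            text "OPEN DOOR"

            onAction {
                // Only respond if there is power remaining.
                if ( "gui::gui_parm1" > 0 ) {
                    set "door1::text" "DOOR UNLOCKED";
                    // "cmd" is passed back to the entity the GUI is attached to.
                    set "cmd" "unlock_door1";
                }
            }
        }
    }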

As it turns out, Raven Software made a GUI editor for Quake IV that helps with creating these GUIs, including a built-in test program to make sure all the scripting and events are working properly. To include a GUI in a level, an object with a GUI texture in idTech 4 must have its gui attribute assigned to the appropriate GUI file. The assigned GUI can then access that object’s gui_parm attributes, which call a function in the level script when run.
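
On the level side, that hookup is just a couple of key/value pairs on the entity, something along these lines (the names, model path, and parm value are placeholders):

    {
        "classname"  "func_static"
        "name"       "door1_panel"
        "model"      "models/placeholder/panel.lwo"
        "gui"        "guis/mymap/door1_power.gui"
        // gui_parm1 shows up in the GUI as "gui::gui_parm1"
        "gui_parm1"  "3"
    }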

guiEditor

A screenshot of Raven’s Quake IV GUI Editor

A video showing my GUI system prototype working


Razer Hydras in Unity:

Something I got to experience recently was developing with a hardware peripheral. In this case, it was the now-obsolete Razer Hydra, which consists of two handheld motion controllers and a central base that tracks their position. I decided to use it to implement a prototype in Unity, since I wanted an engine I was already familiar with for this project, and because a plugin for the engine was available.

razer-hydra-portal2-gallery-3

A picture of the Razer Hydra setup, with controllers and a central base.

Developing with the plugin was surprisingly easy: all that’s required is placing an instance of the “SixenseInput” prefab into the scene. After that, all information about the input, position, and rotation of the controllers is accessed through static members of the attached “SixenseInput” script component. Differentiating between the left and right controller is done through the two-item container “SixenseInput.Controllers”, with Controllers[0] being the left controller and Controllers[1] the right. Using this, a controller’s position can be read as a Vector3 and its rotation as a Quaternion:

updateHands

A function I call from Update(), getting the current position and rotation of the Hydra controllers.
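
In rough form it looks like the sketch below; the hand transforms and the position scale are placeholders rather than my actual setup:

    using UnityEngine;

    public class HandTracker : MonoBehaviour
    {
        // Scene objects representing the player's hands (placeholder names).
        public Transform leftHand;
        public Transform rightHand;

        // Scales the Hydra's raw positional data down into world units.
        public float positionScale = 0.01f;

        void Update()
        {
            UpdateHands();
        }

        void UpdateHands()
        {
            // Controllers[0] is the left Hydra, Controllers[1] is the right.
            SixenseInput.Controller left = SixenseInput.Controllers[0];
            SixenseInput.Controller right = SixenseInput.Controllers[1];

            if (left != null)
            {
                leftHand.localPosition = left.Position * positionScale;
                leftHand.localRotation = left.Rotation;
            }
            if (right != null)
            {
                rightHand.localPosition = right.Position * positionScale;
                rightHand.localRotation = right.Rotation;
            }
        }
    }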

For the prototype I was making, I wanted the player to feel like they could pick up objects using “psychic powers”, so first I needed to calculate where each of the player’s controllers was pointing. Raycasting was the natural way to handle this: I could use the hand position as the starting point of a ray and the “handPosition.forward” Vector3 as its direction, with an arbitrary maximum length.

getHitPosition

The function called for each hand from UpdateHands(), casting a ray in order to assign a location for the hitPoint object at the end of it.
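
Continuing the sketch above, the per-hand raycast reduces to something like this; the 50-unit range is an arbitrary stand-in for whatever maximum length the prototype uses:

    // Casts a ray from a hand and parks the hitPoint marker where it lands.
    void GetHitPosition(Transform hand, Transform hitPoint)
    {
        const float maxGrabDistance = 50f;
        Ray ray = new Ray(hand.position, hand.forward);
        RaycastHit hit;

        if (Physics.Raycast(ray, out hit, maxGrabDistance))
        {
            hitPoint.position = hit.point;
        }
        else
        {
            // Nothing hit: leave the marker at the ray's maximum length.
            hitPoint.position = ray.GetPoint(maxGrabDistance);
        }
    }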

When the trigger is held on either controller, that hand is treated as grabbing an object, which lerps its position and rotation toward the end point of the raycast described above. This lets the player move the object around with their hand, creating the feeling that they are actually manipulating it with their mind. To find an object to grab, a spherecast is performed when the trigger is first pressed (the OnTriggerDown event) on either controller, searching for an object with the “Grabbable” script component.

initGrab

The code for grabbing an object. A spherecast is used to mitigate issues with precise aiming that the Hydras have.
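
The grab handling for one hand comes down to something like the following; I’ve expressed the trigger checks with the plugin’s GetButtonDown()/GetButton()/GetButtonUp() calls rather than my OnTriggerDown wrapper, and the sphere radius, range, and lerp speeds are arbitrary stand-ins:

    Grabbable heldObject;

    // Called each frame per hand, after UpdateHands() and GetHitPosition().
    void HandleGrab(SixenseInput.Controller controller, Transform hand, Transform hitPoint)
    {
        // Trigger just pressed: spherecast along the hand's aim for something grabbable.
        if (controller.GetButtonDown(SixenseButtons.TRIGGER))
        {
            RaycastHit hit;
            if (Physics.SphereCast(hand.position, 0.5f, hand.forward, out hit, 50f))
            {
                heldObject = hit.collider.GetComponent<Grabbable>();
            }
        }

        // Trigger held: pull the object toward the end point of the raycast.
        if (heldObject != null && controller.GetButton(SixenseButtons.TRIGGER))
        {
            heldObject.transform.position = Vector3.Lerp(
                heldObject.transform.position, hitPoint.position, Time.deltaTime * 10f);
            heldObject.transform.rotation = Quaternion.Slerp(
                heldObject.transform.rotation, hand.rotation, Time.deltaTime * 10f);
        }

        // Trigger released: drop the object.
        if (controller.GetButtonUp(SixenseButtons.TRIGGER))
        {
            heldObject = null;
        }
    }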

The resulting prototype is actually fairly robust, and I feel it goes a long way toward proving out the experience I initially set out to create. What astonishes me, though, is how simple the Sixense team has made their library to use. While it took some time at the start to learn how it was set up, I soon found development for the Hydras to be very intuitive, and was able to focus more on the logistics of implementing my prototype.

A video of my prototype showing off hand tracking and rotation, as well as manipulating objects with the Hydras.
