Posted by & filed under article.

Back in late March, Enigma arranged a hackathon in the new facilities of NyVekst AS across the road from HiØ Remmen. Everyone was encouraged to work on something they really wanted to do; if that meant working on a school project, that was just fine. Many fellow students showed up to pull an extra eight-hour run after ordinary school hours.


I teamed up with two students from computer engineering, Jon Kjennbakken and Adrian Edlund. We were highly motivated to utilize the Kinect tracker in some way, which spurred the idea of making info-screens more interactive. Info-screens typically display lots of information, and the content is swapped in short time cycles to allow more of it to be presented. Sometimes this cycle is too short and you are unable to read the whole content within the given time; you are then forced to wait for the other content to cycle through before the entry you were interested in returns.

Not to waste too much time developing a new info-screen, we went with a project I had fiddled with earlier. The project exposes functions in JavaScript for easily navigating back and forth through the content cycle. To handle the Kinect we built a C# application, since the Kinect for Windows SDK is easy to integrate there. The challenge lay primarily in how we should interpret the data and how we should interact with the info-screen. It felt reasonable to use the arms to slide between content, similar to how we swipe through images on a smartphone, so gathering the x, y, z coordinates of the hands from the Kinect seemed the way to go. However, the user's intent was quite hard to determine from the continuous movement. Let's say we acted on the coordinates of the right hand. Using the x coordinate, we could determine the direction of the hand along a horizontal line: by moving the arm to the right, we could check whether the last x was less than or greater than the newest x. We also added a threshold, requiring Abs(oldX-newX) > 20, so that it would not react spuriously to random movement.


We thought we had it all figured out, but it was still not working as intended: even with this approach we got errors on the return path. Measuring the speed of the hand moving to the right and acting on that would still be confounded by the speed of retracting the arm back to its resting position. The final approach we ended up with was to interpret the hand direction only while the hand was above the elbow, exploiting the fact that during the return gesture the hand relaxes and falls beneath the elbow's y coordinate. While far from ideal, it works decently!
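The approach described above can be sketched roughly as follows. This is an illustrative Java sketch, not the original C# application; the class and method names, the coordinate convention (y growing downward, as in Kinect screen space), and the 20-unit threshold are assumptions based on the description.

```java
// Illustrative sketch of the swipe heuristic described above.
public class SwipeDetector {
    private static final double MIN_MOVE = 20; // ignore small, random movement
    private double lastX = Double.NaN;         // no previous sample yet

    /** Feed successive hand/elbow samples; returns true on a right-swipe. */
    public boolean update(double handX, double handY, double elbowY) {
        // Only interpret movement while the hand is raised above the elbow;
        // during the return gesture the hand drops below it and is ignored.
        // (Assuming y grows downward, "above" means handY < elbowY.)
        if (handY >= elbowY) {
            lastX = Double.NaN;
            return false;
        }
        if (Double.isNaN(lastX)) {
            lastX = handX; // first raised sample: just remember the position
            return false;
        }
        boolean swipe = handX - lastX > MIN_MOVE; // moved right past threshold?
        lastX = handX;
        return swipe;
    }
}
```

A left-swipe would mirror the comparison with `lastX - handX > MIN_MOVE`.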

We used Socket.IO to facilitate the communication between the C# app and the web app.

Before the evening was over we managed to squeeze in a little easter egg, triggered by two people stretching their arms in the air. Ready? GO!

Jon and Adrian playing pong.

 

Posted by & filed under article.

This year I got my hands on an eye tracker from The Eye Tribe. It is the world's first $99 eye tracker and a giant step towards new thinking and new ways of working. Eye trackers have long been used in research, in marketing, and as communication aids for severely disabled people. However, the price of eye tracking equipment has in recent years been a staggering $2,000 to $5,000, making it a huge investment and almost impossible to justify if you don't have the means or the necessity to buy one. The steep price reduction from The Eye Tribe has also forced the currently largest eye tracking company, Tobii, to respond with a partnership with SteelSeries to produce an eye tracker for $195 (pre-order price).

This is the motivation for creating new and exciting eye tracking applications for the consumer market, and it led to the creation of a prototype: DesktopEye.

DesktopEye (for lack of name creativity) is a prototype for quickly accessing and switching between window processes. It uses the eye tracker from The Eye Tribe to record the eye position and triggers a menu when you look in the bottom left corner. The menu builds itself from the running processes that are present as windows on the desktop. Looking at a menu entry brings the corresponding window to the front; looking upwards, away from the menu, automatically closes it.
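The corner trigger boils down to a simple hit test on the gaze coordinates. Here is a minimal Java sketch; the screen resolution and the size of the trigger area are assumptions for illustration, not values from the actual prototype.

```java
// Minimal sketch of the bottom-left gaze trigger area check.
public class GazeTrigger {
    // Assumed screen resolution and trigger-area size; the real prototype
    // would read these from the display settings.
    static final int SCREEN_H = 1080;
    static final int TRIGGER_W = 200; // width of the corner area, in pixels
    static final int TRIGGER_H = 150; // height of the corner area, in pixels

    /** True when the gaze point lies inside the bottom-left corner area. */
    public static boolean inTriggerArea(double gazeX, double gazeY) {
        // Screen coordinates: x grows rightward, y grows downward,
        // so the bottom-left corner is small x and large y.
        return gazeX <= TRIGGER_W && gazeY >= SCREEN_H - TRIGGER_H;
    }
}
```

The menu would be shown while this returns true, and hidden again once the gaze moves up and away from the menu region.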

Eye tracking to switch between windows.

The image is an illustration of the menu while looking at Visual Studio. The red dotted area illustrates the trigger area that shows the menu. The project can be downloaded here as DesktopEye. Again, this is a prototype: it lacks some functionality in how windows are handled, and Explorer windows are not supported either, as they are all part of the explorer process. However, I hope this project will motivate further development and work as inspiration for useful eye tracking applications.

The eye tracker has arrived!

Posted by & filed under article.

While trying to figure out if my computer's WiFi could be used as a hotspot, I did some searching and found this article on how to achieve it! The article shows a step-by-step process for figuring out whether your computer supports ad-hoc networks, and further, how to enable one.

If you read the article, you will know by now that netsh needs to be run in a command line with administrative privileges, and that each time you restart the computer you are required to start the network again from the command line. To simplify this procedure I created a GUI wrapper application that still uses the same commands, but wraps them nicely with forms and buttons, as illustrated in the image below.
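For reference, the underlying commands look roughly like this; the SSID and key are placeholders, and everything must be run in an elevated (administrator) command prompt.

```shell
rem Check whether the WiFi adapter supports the hosted network feature;
rem look for "Hosted network supported : Yes" in the output.
netsh wlan show drivers

rem Configure the network name and passphrase (placeholder values):
netsh wlan set hostednetwork mode=allow ssid=MySSID key=MyPassword123

rem Start the network. This step must be repeated after every reboot,
rem which is exactly what the GUI wrapper automates:
netsh wlan start hostednetwork
```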

WinWiFi

You can download the source files here. If you only want the executable, it's located in the source folder ‘\WinWiFi\WinWiFi\bin\Debug\WinWiFi.exe’. This file can be copied to the desktop or any other location.

Enjoy!

Posted by & filed under article.

This semester I have taken the course Computer Graphics at Østfold University College. The course consists of two projects: the first is about understanding 3D space, and for it I made a helicopter in OpenGL. In the second project we are encouraged to think bigger and utilize the jMonkeyEngine to create some sort of application, and yes, we get to make games.

Here is a video of how my project turned out:

Posted by & filed under article.

Many games let you customize the key bindings for movement, abilities, pause, exit and so forth. Some games use many key bindings, and sometimes they are not placed in a way that suits everyone. Therefore the key bindings for movement might be set to the W, A, S, D keys by default, with the option to swap to different bindings, like the arrow keys instead. In this tutorial I will illustrate one way of creating a smooth options panel with Nifty GUI, so the user can easily assign new key bindings.

Set up the basic Nifty GUI boilerplate template.

Create a new XML file Interface/keyBindings.xml

<?xml version="1.0" encoding="UTF-8"?>
<nifty xmlns="http://nifty-gui.sourceforge.net/nifty-1.3.xsd"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://nifty-gui.sourceforge.net/nifty-1.3.xsd http://nifty-gui.sourceforge.net/nifty-1.3.xsd">
    <screen id="screenKeyBindings" controller="mygame.OptionsAppState">

    </screen>
</nifty>

Here I have already added a screen with the id screenKeyBindings and set the controller to target OptionsAppState. The controller is necessary so that the buttons added later know which class to interact with. The screen id is needed as a reference when we later want to update the content.

Utilizing Nifty GUI buttons

Since I want to use the custom buttons that come with Nifty GUI, I am required to add some style guidelines and the default controls. You can load these by adding the following two lines above the screen tag.

<useStyles filename="nifty-default-styles.xml" />
<useControls filename="nifty-default-controls.xml" />

I can now setup my button with the following XML markup:

<panel childLayout="horizontal" backgroundColor="#ffffff33" padding="5px">
  <text align="left" width="80%" textHAlign="left" font="aurulent-sans-16.fnt" color="#ffff" text="Move forward" />
  <control width="20%" id="forward" name="button" label="">
    <interact onClick="setKeyBinding(forward)"/>
  </control>
</panel>


The button is wrapped in a surrounding panel, which contains a text description followed by the button. The button nests an interact element that will trigger the function setKeyBinding in mygame.OptionsAppState. Notice that the button ID is set to forward and that setKeyBinding takes that exact ID as its parameter. This is important for identifying which button was clicked and for later updating the button label.

OptionsAppState

To organize the game we create a new AppState named OptionsAppState to handle the key bindings. First we set up the Nifty GUI; this is straightforward, like any Nifty GUI implementation. In this case it's wrapped nicely in the method initNifty(). It uses the file keyBindings.xml and sets the screen with id screenKeyBindings.

private Nifty nifty;
private NiftyJmeDisplay niftyDisplay;
private void initNifty() {
  niftyDisplay = new NiftyJmeDisplay(assetManager,
									   inputManager,
									   audioRenderer,
									   guiViewPort);
  nifty = niftyDisplay.getNifty();
  nifty.fromXml("Interface/keyBindings.xml", "screenKeyBindings", this);

  guiViewPort.addProcessor(niftyDisplay);
}

What happens when we click the button? At the moment we have set up the Nifty GUI with a button that calls setKeyBinding. When this action is triggered we want to start listening for keyboard events. When and if a key is pressed, we want to bind the pressed key to the button we clicked, update the button label with the key we pressed, and stop listening to the keyboard.

To achieve this we create a new class named SettingsInputHandler that implements RawInputListener. The class overrides onKeyEvent to grab the key event and send it back to the OptionsAppState. We pass the eventId along the way; this is the ID of the button that was clicked.

public class SettingsInputHandler implements RawInputListener {
  private final OptionsAppState appState;
  private final String eventId;

  public SettingsInputHandler(OptionsAppState appState, String eventId) {
    this.appState = appState;
    this.eventId = eventId;
  }

  public String getEventId() { return eventId; }

  public void onKeyEvent(KeyInputEvent evt) {
    try {
      appState.keyBindCallBack(evt, eventId);
    } catch (Exception ex) {
      // Nothing sensible to do if the callback fails; ignore it.
    }
  }

  // The remaining RawInputListener methods can be left empty.
}

This will become even clearer when we set up the interact method setKeyBinding.

private SettingsInputHandler sih;
public void setKeyBinding(String eventId) {
  Screen screen = nifty.getScreen("screenKeyBindings");
  if(sih != null) {
    Button button = screen.findNiftyControl(sih.getEventId(), Button.class);
    button.setText("");
    inputManager.removeRawInputListener(sih);
  }
  Button button = screen.findNiftyControl(eventId, Button.class);
  button.setText("<press any key>");
  sih = new SettingsInputHandler(this, eventId);
  inputManager.addRawInputListener(sih);
}

We set up the SettingsInputHandler as a private variable outside of the method. When the button is clicked, the button label is set to “<press any key>”. We initialize the SettingsInputHandler with the OptionsAppState and the button ID as parameters, then add it to the inputManager as a raw input listener. Notice that if sih is not null, we first clear the previously clicked button's label and remove its listener. We need to handle this in case multiple buttons are clicked without any keyboard interaction in between.

Back to the flow: a button is clicked and the SettingsInputHandler is added as an input listener. When a key is pressed, a callback is sent to the OptionsAppState method keyBindCallBack.

public void keyBindCallBack(KeyInputEvent evt, String eventId) {
  /* Callback, triggered from settingsKeyHandler */
  Screen screen = nifty.getScreen("screenKeyBindings");
  Button button = screen.findNiftyControl(eventId, Button.class);
  button.setText("" + KeyBindings.getKeyName(evt.getKeyCode()));

  /* Perform mappings from Nifty GUI to KeyBindings */
  mapNiftyBindings(eventId, keyBindings, evt.getKeyCode());

  inputManager.removeRawInputListener(sih);
  sih = null;
}

The method keyBindCallBack uses the eventId to find the clicked button and updates its label with the key that was pressed. Then we need to assign the pressed key to the variable that is referenced throughout the game. This assignment is handled in the method mapNiftyBindings.
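The tutorial does not list mapNiftyBindings itself, but a minimal sketch could look like the following. The button ids ("forward", "backward", "pause") and the stand-in KeyBindings class are assumptions that mirror the XML markup and the field names used elsewhere in this article.

```java
// Hypothetical sketch of mapNiftyBindings: route a Nifty button id
// to the matching field on the shared KeyBindings instance.
public class BindingMapper {
    // Minimal stand-in for the KeyBindings class described in this article.
    public static class KeyBindings {
        public int FORWARD;
        public int BACKWARD;
        public int PAUSE;
    }

    public static void mapNiftyBindings(String eventId, KeyBindings bindings, int keyCode) {
        switch (eventId) {
            case "forward":  bindings.FORWARD = keyCode;  break;
            case "backward": bindings.BACKWARD = keyCode; break;
            case "pause":    bindings.PAUSE = keyCode;    break;
            default:         break; // unknown button id: ignore
        }
    }
}
```

A map from id to a setter would scale better than a switch as the number of bindings grows, but the switch keeps the idea plain.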

KeyBindings

The KeyBindings class contains our key references and should be accessible to any AppState that handles input. The keys defined can be assigned a default value, but this is not required.

// Default bindings
public int FORWARD = KeyInput.KEY_UP;
public int BACKWARD = KeyInput.KEY_DOWN;
public int EXIT_APPLICATION = KeyInput.KEY_ESCAPE;
public int PAUSE;

This class has a static method getKeyName that traverses the fields of the JME3 class KeyInput and returns the reference name for the key value. So when the spacebar is pressed, KEY_SPACE is returned rather than an empty space.

public static String getKeyName(int keyCode) {
 Class<KeyInput> keyClass = KeyInput.class;
 for (Field field : keyClass.getFields()) {
   try {
     if (keyCode == field.getInt(null)) {
       return field.getName();
     }
   } catch (Exception ex) {
     // Field could not be read; skip it.
   }
 }
 return null;
}

The final result:


You can download the source files here.