Gesture Catalog

This section lists the gestures and actions that can currently be performed with uBox technology. Some of these are built into the JavaScript code itself, whereas others were implemented afterwards and can be added to the code to take full advantage of the technology's capabilities. Each gesture is presented with a brief description; if you click on the gesture's name link you will be taken to another page with a more detailed explanation and an example for better understanding.

Keep in mind that most of these are examples meant to show how the gestures work, so you can grasp them and hit the ground running; they are not the only way of doing things. In fact, the community is growing and users are creating clever new ways of using the sensors, so feel free to explore and combine them to create amazing new experiences!

BUILT-IN GESTURES

newUser

The gesture or action is activated when a person enters the region captured by the sensors. The maximum number of users allowed depends on the version of Kinect used. 

noUser

The gesture or action is activated when a person leaves the region captured by the sensors. The maximum number of users allowed depends on the version of Kinect used, as mentioned above.

grip

This gesture is activated when the active hand is closed. It resembles the action of pressing the left mouse button. Requires KinectV2.

release

This gesture is activated when the hand is reopened after grip has been triggered. It resembles the action of releasing the left mouse button.
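
To make the mouse-button analogy concrete, here is a minimal sketch of a pair of handlers that keep a simple dragging flag. The handler names and the way they are wired up are illustrative assumptions; how grip and release actually reach your code depends on your uBox setup.

```javascript
// Illustrative handlers: grip starts a "drag", release ends it.
// The registration mechanism is assumed, not part of uBox itself.
let dragging = false;

function onGrip() {
  dragging = true;   // analogous to pressing the left mouse button
  console.log('grip: drag started');
}

function onRelease() {
  dragging = false;  // analogous to releasing the left mouse button
  console.log('release: drag ended');
}

// Driving the handlers by hand, e.g. for a quick test without the sensor:
onGrip();
onRelease();
```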

getSkeletons

This is not a gesture as such but a way of retrieving a .json file with the values obtained by the Kinect sensor from the body, as seen in the image below, so that actions can be triggered with them. In simple words, it is a list with the positions of the joints captured by the sensor.
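
The exact shape of the data depends on the Kinect version; the sketch below uses an assumed structure (the joint names and fields are illustrative, not the actual uBox schema) just to show how a joint position could be read out and used to trigger an action.

```javascript
// Illustrative skeleton data; the real getSkeletons output may use
// different field names and contain more joints per tracked user.
const skeletons = [
  {
    userId: 1,
    joints: {
      head:        { x:  0.02, y: 0.85, z: 2.1 },
      spineCenter: { x:  0.01, y: 0.40, z: 2.1 },
      handRight:   { x:  0.30, y: 0.55, z: 1.9 },
      handLeft:    { x: -0.28, y: 0.20, z: 2.0 }
    }
  }
];

// Pull out one joint position and use it to trigger actions.
const spine = skeletons[0].joints.spineCenter;
console.log(`spine center height: ${spine.y}`);
```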

next

This gesture is activated when you swipe your arm to the right. It resembles clicking the mouse and dragging to the right, and can be used, for instance, to select the next image in a slideshow.

prior

This gesture is activated when you swipe your arm to the left. It resembles clicking the mouse and dragging to the left, and can be used, for instance, to select the previous image in a slideshow.
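
The slideshow use case mentioned for next and prior could look roughly like the sketch below: an index into an array of slides moves forward on next and backward on prior. The handler names and slide list are illustrative; how the gestures reach the handlers depends on your uBox setup.

```javascript
// Minimal slideshow driven by the next / prior gestures (illustrative).
const slides = ['intro.jpg', 'demo.jpg', 'credits.jpg'];
let current = 0;

function onNext() {                    // arm swiped to the right
  current = Math.min(current + 1, slides.length - 1);
  showSlide();
}

function onPrior() {                   // arm swiped to the left
  current = Math.max(current - 1, 0);
  showSlide();
}

function showSlide() {
  console.log(`showing ${slides[current]}`);
}
```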

changeHandNone

This gesture is triggered when no hand is raised, or alternatively when the state changes from at least one hand being captured to none.

changeHandRight

This gesture is triggered when the right hand is moved.

changeHandLeft

This gesture is triggered when the left hand is moved.

msgFromArduino (Message from serial sensor)

When a serial message is received, it can be captured and used to trigger other actions.
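
As a rough illustration, the handler below reacts to the content of an incoming serial message; the message format ('BUTTON_PRESSED') and the handler name are assumptions made for the example.

```javascript
// Illustrative handler for msgFromArduino: react to the content of
// the serial message, e.g. a physical button wired to an Arduino.
function onMsgFromArduino(message) {
  if (message.trim() === 'BUTTON_PRESSED') {
    console.log('physical button pressed, triggering an action');
  } else {
    console.log(`unhandled serial message: ${message}`);
  }
}

onMsgFromArduino('BUTTON_PRESSED');  // quick manual test
```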

All Built-in Gestures

Here an example with all the built-in gestures is shown, available for testing on the web (except for msgFromArduino).

OTHER GESTURES:

Mouse control

As its name suggests, this allows controlling the application as if with a mouse, except that instead of the mouse it is driven by joints of the body as obtained from getSkeletons.

Be aware that, for testing purposes, the app can also be controlled with an actual mouse; that is not the same as this scenario, since it does not require the sensors and uses a different method, which is covered in the tutorials (see the advanced tutorial).
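
A rough sketch of the idea: take a hand joint from getSkeletons, map its coordinates onto the window size, and use the result as a cursor position. The assumption that the joint coordinates lie roughly in the range -1 to 1 is made up for this example; the real range depends on the sensor and on uBox.

```javascript
// Map a hand joint (assumed to lie roughly in -1..1 on x and y)
// onto window coordinates to use as a cursor position.
function jointToScreen(joint, width, height) {
  const x = (joint.x + 1) / 2 * width;         // -1..1  ->  0..width
  const y = (1 - (joint.y + 1) / 2) * height;  // flip y: up on the body is up on screen
  return { x, y };
}

const cursor = jointToScreen({ x: 0.3, y: 0.55 }, 1920, 1080);
console.log(`cursor at ${cursor.x.toFixed(0)}, ${cursor.y.toFixed(0)}`);
```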

Interactive area

Refers to an area defined in the code which can be used for triggering actions.
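
In its simplest form an interactive area is a rectangle, and the trigger is a point-in-rectangle test against a joint position from getSkeletons. The area bounds and joint values below are illustrative only.

```javascript
// Trigger an action when a joint enters a defined rectangular area.
const area = { xMin: 0.2, xMax: 0.6, yMin: 0.4, yMax: 0.8 };  // illustrative bounds

function insideArea(joint, a) {
  return joint.x >= a.xMin && joint.x <= a.xMax &&
         joint.y >= a.yMin && joint.y <= a.yMax;
}

const handRight = { x: 0.3, y: 0.55 };
if (insideArea(handRight, area)) {
  console.log('hand entered the interactive area');
}
```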

Jump

This gesture, as the name suggests, allows the user to jump. To do so, it measures the change over a given time interval in the spinecenter of the skeleton (therefore it needs to use getSkeletons in order to work).

If the user does not move (i.e. stays still), the value stays at zero; when the user goes up it becomes positive, and when the user goes down it becomes negative. When an obstacle is present, the 'jump' is considered successful only if the value exceeds a given threshold representing the height of that obstacle.
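
A minimal sketch of that logic, assuming the spinecenter height is sampled once per sensor frame; the threshold value is illustrative, not taken from the uBox code.

```javascript
// Detect a jump from the change in spine-center height between samples.
const OBSTACLE_THRESHOLD = 0.15;  // illustrative: how much the spine center must rise

let previousY = null;

function checkJump(spineCenterY) {
  let delta = 0;
  if (previousY !== null) {
    delta = spineCenterY - previousY;  // 0 while still, >0 going up, <0 going down
  }
  previousY = spineCenterY;

  if (delta > OBSTACLE_THRESHOLD) {
    console.log('jump cleared the obstacle');
  }
  // A symmetric check (delta < -threshold) is the basis of the squat/crunch gesture below.
  return delta;
}

// Feed it successive samples, e.g. one per frame:
checkJump(0.40);
checkJump(0.58);  // rose by 0.18 > threshold, so the jump is detected
```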

Run

This allows the user to run when certain criteria are met. In a nutshell, it measures the height of the spinecenter (therefore it needs getSkeletons to work) and is triggered when this value changes from either 0 or 1.

Walk

This allows the user to walk when certain criteria are met. It measures the height of the spinecenter (therefore it needs getSkeletons to work) and is triggered when this value changes within a certain threshold between 0 and 1. Its functionality is similar to Run (see above); however, it is slightly modified in order to capture the nuances of movement from the sensors and allow a more detailed representation.
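
One possible reading of the two descriptions is sketched below: the spinecenter height is treated as a normalised 0 to 1 value, a partial change counts as walking and a change spanning (almost) the full range counts as running. The thresholds and the interpretation itself are assumptions made for illustration, not values from the uBox code.

```javascript
// Classify movement from the change in normalised spine-center height (0..1).
const WALK_THRESHOLD = 0.2;  // illustrative: partial change -> walk
const RUN_THRESHOLD  = 0.9;  // illustrative: near full 0 <-> 1 change -> run

function classifyMovement(previousHeight, currentHeight) {
  const change = Math.abs(currentHeight - previousHeight);
  if (change >= RUN_THRESHOLD)  return 'run';
  if (change >= WALK_THRESHOLD) return 'walk';
  return 'still';
}

console.log(classifyMovement(0.5, 0.55));  // 'still'
console.log(classifyMovement(0.2, 0.5));   // 'walk'
console.log(classifyMovement(0.0, 1.0));   // 'run'
```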

Squat/crunch

This gesture works by measuring the change over time of a given joint* of your choice (therefore it needs getSkeletons to work).

*Usually "spinecenter" is used, as it is the joint which offers the best stability.

Swipe down/up

This gesture works out the difference in position over a certain time interval for a given joint of your choice (therefore it needs getSkeletons to work). It works with both hands by design; the one which is higher takes over.

Note that at present swipe down and up cannot be used at the same time.
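
As with jump, the core is a difference in position over a time interval, here applied to whichever hand is currently higher. The sketch below is illustrative: the hand data layout and the threshold are assumptions for the example.

```javascript
// Swipe up/down: vertical travel of the higher hand between two samples.
const SWIPE_THRESHOLD = 0.25;  // illustrative minimum vertical travel

function detectVerticalSwipe(previousHands, currentHands) {
  // The hand that is currently higher takes over.
  const hand = currentHands.right.y >= currentHands.left.y ? 'right' : 'left';
  const delta = currentHands[hand].y - previousHands[hand].y;

  if (delta > SWIPE_THRESHOLD)  return 'swipeUp';
  if (delta < -SWIPE_THRESHOLD) return 'swipeDown';
  return null;
}

const before = { right: { y: 0.40 }, left: { y: 0.20 } };
const after  = { right: { y: 0.75 }, left: { y: 0.20 } };
console.log(detectVerticalSwipe(before, after));  // 'swipeUp'
```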

Pull the lever down/up

This works as a combination of grip + swipe up/down.

The hand which is higher takes over.
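
Since it is just grip combined with a vertical swipe, one way to sketch it is to gate the swipe check on a gripping flag, reusing the detectVerticalSwipe function from the previous sketch; as before, this is only an illustration.

```javascript
// Pull the lever: a vertical swipe that only counts while the hand is gripping.
let gripping = false;  // set to true on grip and back to false on release

function detectLeverPull(previousHands, currentHands) {
  if (!gripping) return null;                               // no grip, no lever
  return detectVerticalSwipe(previousHands, currentHands);  // 'swipeUp', 'swipeDown' or null
}
```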

oneHandUp/Down

This gesture determines whether one of the hands is above or below the head.

BothHandsUp

This gesture determines whether both hands are over the head.
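
Both of these reduce to comparing hand heights against the head joint. A minimal sketch, assuming joints from getSkeletons with a y field (the joint names are illustrative):

```javascript
// Compare hand heights against the head to detect the hands-up gestures.
function oneHandUp(joints) {
  return joints.handRight.y > joints.head.y || joints.handLeft.y > joints.head.y;
}

function bothHandsUp(joints) {
  return joints.handRight.y > joints.head.y && joints.handLeft.y > joints.head.y;
}

const joints = {
  head:      { y: 0.85 },
  handRight: { y: 1.05 },
  handLeft:  { y: 0.30 }
};
console.log(oneHandUp(joints));    // true
console.log(bothHandsUp(joints));  // false
```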

Message to external sensor (serial)

To be implemented