How do I control the Raspberry Pi?

You can either connect a screen and keyboard or use VNC for remote control. Either way, a small GUI application lets you easily connect the sensors. It shows the battery and radio levels of the sensors and starts gesture recognition as soon as a sensor is successfully connected.

How do I receive the gestures?

Gesture events are published on your local network as JSON-encoded messages over ZMQ (zeromq.org). Multiple clients can subscribe to the Kinemic GesturePublisher service. ZMQ is available for all major (and also obscure) programming languages and operating systems. Writing a client usually takes only a few lines of code. We provide you with a minimal implementation in Python.
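A subscriber along these lines can be sketched in a few lines of Python using pyzmq. The endpoint address, port, and message schema below are assumptions for illustration only; consult the GesturePublisher documentation for the actual values.

```python
import json

# ASSUMED endpoint and message format -- check the GesturePublisher docs.
ENDPOINT = "tcp://raspberrypi.local:5555"

def parse_event(raw: bytes) -> str:
    """Extract the gesture name from a JSON-encoded event message.
    Assumes a payload like {"gesture": "swipe_left"}."""
    event = json.loads(raw.decode("utf-8"))
    return event.get("gesture", "")

def main() -> None:
    # pyzmq is imported here so the parsing helper above can be used
    # even where pyzmq is not installed (pip install pyzmq).
    import zmq

    ctx = zmq.Context()
    sub = ctx.socket(zmq.SUB)
    sub.connect(ENDPOINT)
    sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all messages
    while True:
        print("received:", parse_event(sub.recv()))

if __name__ == "__main__":
    main()
```

The subscriber pattern means the publisher does not need to know its clients: any number of machines on the network can listen to the same gesture stream.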

Which gestures can I use?

We provide a set of twelve gestures:

Swipe left
Swipe right
Swipe up
Swipe down
Rotation left-right (rotate to the left, then to the right)
Rotation right-left (rotate to the right, then to the left)
Circle counterclockwise
Circle clockwise
Eartouch left (touch the left ear with the right hand)
Eartouch right (touch the right ear with the right hand)
Checkmark ("check" gesture, ✓)
X ("X" gesture)

What does AirMouse mean? (will be available soon)

In AirMouse mode, you can use your hand just like a mouse. Scroll over maps, move sliders or a pointer just by moving your hand. You will receive the AirMouse movements as relative dx/dy coordinates over ZMQ, just like the gestures.
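Since the movements arrive as relative dx/dy values, a client typically accumulates them into an absolute pointer position. The field names in the JSON payload below are assumptions for illustration; the actual AirMouse message format may differ.

```python
import json

class Pointer:
    """Accumulates relative AirMouse movements into an absolute
    position, clamped to a screen of the given size.
    Assumes messages like {"dx": 10, "dy": -20} (hypothetical schema)."""

    def __init__(self, width: int, height: int) -> None:
        self.width, self.height = width, height
        self.x, self.y = width // 2, height // 2  # start at screen center

    def apply(self, raw: bytes):
        move = json.loads(raw)
        # Clamp so the pointer never leaves the screen.
        self.x = max(0, min(self.width, self.x + move["dx"]))
        self.y = max(0, min(self.height, self.y + move["dy"]))
        return self.x, self.y
```

In a real client, `apply` would be called for every message received on the ZMQ subscriber socket, and the returned position fed to the UI toolkit.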

How can I use AirWriting? (will be available soon)

You can send a JSON-encoded list of up to 10 words to the Raspberry Pi, which will then be recognized when written in the air. Again, once a word is recognized, you will receive a JSON message over ZMQ. You can choose arbitrary words in capital letters from the English alphabet.

If I move around, how will the system know whether I intend to perform or not perform a gesture?

Our world-class algorithms are specifically designed to understand your motion and will ignore your everyday movement. Of course, if a user performs a motion that is very similar or even identical to a gesture, we cannot know whether it was a gesture or a random movement. However, we have years of experience in building real-world gesture interfaces and will help you design a system to cope with that. Additionally, there is a mechanical switch on the sensor to turn gesture recognition on and off.

How long will the sensor last (runtime)?

You can expect a sensor to run for eight hours.  

What is the deal with this “interactive workshop”?

Gestures are a new modality, and paradigms from desktop or mobile graphical user interfaces cannot be transferred directly to gesture interfaces. We want to make sure that you get the best out of our evaluation package and will share our extensive experience in building gesture interfaces that actually *work*. In the future, we will of course release our software directly, together with best-practice guidelines. However, at this early stage, we want to make sure our customers get the most out of the evaluation package, so we sit down together and share our knowledge.

How many sensors can be connected to a Raspberry Pi at the same time?

With the evaluation package, only one sensor can be connected at a time. If you wish to switch to the other included sensor, a new connection will be established and the prior one terminated. Should you have other requirements, please contact us!

What happens when I move out of the receiver’s reach and then re-enter the receiving range?

The connection will automatically be re-established.  

What happens when I run or otherwise move while at the same time performing gestures?

With a normal or moderate walking pace, there are usually no problems. Generally speaking, gestures work best when started from a resting position. This means the more enthusiastic your other movements are – especially those of your arm or wrist – the harder it will be to successfully detect gestures.  


© 2018 Kinemic GmbH. All logos, signs and trademarks are the property of their respective owners.