You can either connect a screen and keyboard or use VNC for remote control. Either way, you will have a small GUI application to easily connect the sensors. It will show you the battery and radio signal levels of the sensors and will start gesture recognition right after you have successfully connected a sensor.
Gesture events are published in your local network as JSON encoded messages over ZMQ (zeromq.org). Multiple clients can subscribe to the Kinemic GesturePublisher service. ZMQ is available for all major (and also obscure) programming languages and operating systems. Writing a client usually only takes a few lines of code. We provide you with a minimal implementation in Python.
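A subscriber can be sketched as follows. The address, port, and JSON field names below are assumptions for illustration only; the actual values ship with the evaluation package.

```python
# Minimal sketch of a ZMQ subscriber for gesture events.
# Address, port, and JSON field names are assumptions for illustration;
# check the values documented in your evaluation package.
import json


def handle_event(raw):
    """Decode one JSON-encoded event message into (type, name)."""
    event = json.loads(raw)
    return event.get("type"), event.get("name")


def receive_loop():
    import zmq  # pyzmq; imported here so handle_event() works without it

    context = zmq.Context()
    socket = context.socket(zmq.SUB)
    socket.connect("tcp://raspberrypi.local:9999")  # assumed address/port
    socket.setsockopt_string(zmq.SUBSCRIBE, "")     # receive all topics
    while True:
        print(handle_event(socket.recv_string()))


# Start listening with: receive_loop()
```

Because ZMQ uses a publish/subscribe pattern here, any number of such clients can run in parallel without coordinating with each other.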
We provide a set of eight gestures, which are depicted below:
In AirMouse mode, you can use your hand just like a mouse. Scroll over maps, move sliders or a pointer just by moving your hand. You will receive the AirMouse movements as dx/dy coordinates over ZMQ just like the gestures.
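One way to use the dx/dy deltas is to accumulate them into an on-screen pointer position, clamped to the screen bounds. The field names `"dx"` and `"dy"` and the screen dimensions below are assumptions for illustration.

```python
# Sketch: accumulating AirMouse dx/dy deltas into a pointer position.
# The JSON field names "dx"/"dy" are assumptions for illustration.
import json


class Pointer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.x, self.y = width // 2, height // 2  # start at screen centre

    def apply(self, raw):
        """Apply one JSON-encoded AirMouse event, clamping to the screen."""
        event = json.loads(raw)
        self.x = min(max(self.x + event["dx"], 0), self.width - 1)
        self.y = min(max(self.y + event["dy"], 0), self.height - 1)
        return self.x, self.y


# Example: a 1920x1080 screen, pointer starting at (960, 540)
pointer = Pointer(1920, 1080)
pointer.apply('{"dx": 10, "dy": -5}')  # pointer moves to (970, 535)
```

Clamping keeps the pointer on screen even when large deltas arrive in a burst.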
You can send a JSON-encoded list of 10 words to the Raspberry Pi, which will then be recognized when written in the air. Again, once a word is recognized, you will receive a JSON message over ZMQ. You can choose arbitrary words in capital letters from the English alphabet.
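Preparing such a word list might look like the sketch below. The message format (a `"words"` field) is an assumption for illustration; the actual transport and schema are documented in the evaluation package.

```python
# Sketch: validating and JSON-encoding a word list for air-writing
# recognition. The {"words": [...]} message format is an assumption
# for illustration; see the evaluation package for the real schema.
import json

ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ")


def encode_word_list(words):
    """Validate and JSON-encode up to 10 capital-letter words."""
    if len(words) > 10:
        raise ValueError("at most 10 words are supported")
    for word in words:
        if not word or not set(word) <= ALLOWED:
            raise ValueError(f"words must use capital letters A-Z: {word!r}")
    return json.dumps({"words": words})


payload = encode_word_list(["START", "STOP", "LEFT", "RIGHT"])
```

Validating before sending catches unsupported characters early, instead of silently producing words the recognizer can never match.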
Our world-class algorithms are specifically designed to understand your motion and will ignore your everyday movement. Of course, if a user performs a motion that is very similar or even identical to a gesture, we cannot know whether it was a gesture or a random movement. However, we have years of experience in building real-world gesture interfaces and will help you design a system that copes with this. Additionally, there is a mechanical switch on the sensor to turn gesture recognition on and off.
You can expect a sensor to run for eight hours.
Gestures are a new modality, and paradigms from desktop or mobile graphical user interfaces cannot be transferred directly to gesture interfaces. We want to make sure that you get the best out of our evaluation package and will provide our extensive experience in building gesture interfaces that actually *work*. In the future, we will of course release our software directly, together with best-practice guidelines. At this early stage, however, we want to make sure our customers get the most out of the evaluation package, so we sit down together and share our knowledge.
With the evaluation package, only one sensor can be connected at a time. If you connect the other included sensor, a new connection will be established and the prior connection will be terminated. Should you have other requirements, please contact us!
If the connection is lost, it will automatically be re-established.
At a normal or moderate walking pace, there are usually no problems. Generally speaking, gestures work best when started from a resting position. This means the more enthusiastic your other movements are, especially those of your arm or wrist, the harder it will be to detect gestures successfully.