I built a gesture recognition library that I think would be perfect to integrate into this. It uses a detection algorithm that I think could be tuned to run at 10-15 FPS on the RPi, at a range of 3-5 meters. It currently recognizes 3 gestures and can do tracking, i.e. following your palm around.
It’s a bit messy right now and has a dependency on OpenCV, but is anyone interested in having this integrated? Could be cool to have a touchless component.
The beta is up on gesture.ai
Yes, I’m really interested! Don’t know why nobody replied.
I also built a gesture detection module using infrared sensors:
It requires an Arduino and two infrared sensors, and detects presence as well as up, down, left, right, close and far hand gestures.
It can be used to put the mirror into sleep mode when it’s not in use, show compliments when you stand in front of it, and let you scroll through the news ticker and view more news details, including the full news article.
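The firmware itself isn’t posted here, but the direction logic can be sketched roughly like this (a guess at the approach, not thobach’s actual code — the sensor layout, field names and thresholds are all my assumptions, in Node since that’s what MM runs on):

```javascript
// Rough sketch, NOT the actual firmware: classify a gesture from two IR
// proximity sensors mounted side by side. Each sample is { left, right },
// where a higher reading means the hand is closer. Thresholds are made up.
const PRESENCE = 200; // hypothetical "hand detected" level
const NEAR = 700;     // hypothetical "hand very close" level

// Index of the first sample where the given sensor sees the hand (-1 = never).
function firstIndexAbove(samples, key, threshold) {
  return samples.findIndex((s) => s[key] > threshold);
}

function classifyGesture(samples) {
  const left = firstIndexAbove(samples, 'left', PRESENCE);
  const right = firstIndexAbove(samples, 'right', PRESENCE);
  if (left === -1 && right === -1) return 'NONE';
  if (left === right) {
    // Hand appeared over both sensors at once: held still, so judge distance.
    const peak = Math.max(...samples.map((s) => Math.max(s.left, s.right)));
    return peak > NEAR ? 'CLOSE' : 'FAR';
  }
  // Otherwise it's a swipe: whichever sensor saw the hand first sets the direction.
  if (right === -1 || (left !== -1 && left < right)) return 'RIGHT'; // left -> right
  return 'LEFT';
}
```

With the direction reduced to a string like this, the Arduino (or whatever reads the sensors) only has to ship one event per gesture to the mirror.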
@thobach have you tried skipping the Arduino and connecting the SparkFun APDS gesture breakout board directly to the I2C pins of the Raspberry Pi?
I have it working with WiringPi, but I’m thinking of having it talk to Node directly via the i2c package (https://www.npmjs.com/package/i2c) and then adapting this module for Node: https://github.com/jiahuang/apds-gesture
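Whatever ends up doing the I2C reads, the decoding step is the same idea: the APDS-9960’s gesture engine streams FIFO datasets of four photodiode values (up/down/left/right), and the swipe direction falls out of which diode sees the hand first. A sketch of just that logic in plain Node — the threshold, the direction conventions and the dataset shape are my assumptions, and the actual register reads are left out:

```javascript
// Sketch of the direction decoding only (my assumptions, not the linked
// module's code). Each entry is one gesture FIFO dataset: { u, d, l, r }
// photodiode values. A hand entering from the top covers the "up" diode
// first, so comparing first-crossing times per diode gives the direction.
const THRESHOLD = 30; // hypothetical "diode sees the hand" level

// First dataset index where a diode crosses the threshold;
// "never" is mapped to datasets.length so all values stay finite.
function firstCrossing(datasets, key) {
  const i = datasets.findIndex((d) => d[key] > THRESHOLD);
  return i === -1 ? datasets.length : i;
}

function decodeGesture(datasets) {
  const u = firstCrossing(datasets, 'u');
  const d = firstCrossing(datasets, 'd');
  const l = firstCrossing(datasets, 'l');
  const r = firstCrossing(datasets, 'r');
  if ([u, d, l, r].every((t) => t === datasets.length)) return 'NONE';
  // Pick the axis with the clearer leader.
  const vertical = Math.abs(u - d);
  const horizontal = Math.abs(l - r);
  if (vertical >= horizontal) return u < d ? 'DOWN' : 'UP';
  return l < r ? 'RIGHT' : 'LEFT';
}
```

If this part lives in pure JS, only the thin byte-reading layer depends on whether WiringPi, the i2c package, or something else talks to the bus, which should also make the Electron question easier to test in isolation.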
I have no clue how to do it, or whether it will even work with MM and Electron.
What do you think?
Have you used the MMM-Swipe module? It uses two ultrasonic sensors to detect right and left motions. I found it quite easy and straightforward.
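The idea behind the ultrasonic approach is easy to sketch (this is not MMM-Swipe’s actual source, just the principle — trigger distance, window size and names are mine): each sensor reports a distance, a swipe is the hand dipping below a trigger distance over one sensor and then the other, and the order gives the direction.

```javascript
// Sketch of two-ultrasonic-sensor swipe detection (my assumptions, not the
// MMM-Swipe code). leftCm / rightCm are arrays of distance readings in cm,
// sampled at the same fixed rate.
const TRIGGER_CM = 20;     // hand closer than this "covers" a sensor
const MAX_GAP_SAMPLES = 5; // both sensors must trigger within this window

// Index of the first reading below the trigger distance (-1 = never).
function triggerIndex(distances) {
  return distances.findIndex((cm) => cm < TRIGGER_CM);
}

function detectSwipe(leftCm, rightCm) {
  const l = triggerIndex(leftCm);
  const r = triggerIndex(rightCm);
  if (l === -1 || r === -1) return 'NONE';              // need both sensors
  if (Math.abs(l - r) > MAX_GAP_SAMPLES) return 'NONE'; // too slow, ignore
  if (l === r) return 'PRESS';                          // covered both at once
  return l < r ? 'RIGHT' : 'LEFT';                      // left first = moving right
}
```

Compared to the IR and APDS options above, this trades gesture variety (only left/right plus a "hover" over both) for very cheap parts and simple wiring.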