Voice/motion control
-
electron-speech, alexa-voice-service & pocketsphinx might be worth a look :)
-
@Simon Have you posted your code anywhere? Would love to see it!
-
@coolbotic No it’s not very pretty :) I’ll see what I can do
-
What do you think about using wit.ai to voice-control MM?
It's a free, unlimited API that could be implemented. -
@coolbotic I just made a motion control module. It's nowhere near as intricate as Leap Motion, though. It just detects hand movements using two ultrasonic sensors and gives you swipe left, swipe right, and press notifications. If you have any ideas on how to improve it, please let me know.
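The two-sensor idea above can be sketched roughly like this. This is not the poster's actual module; `classifyGesture` and the threshold values are made up for illustration, assuming each sensor reports the time at which a hand passed in front of it:

```javascript
// Hypothetical sketch of two-sensor gesture detection:
// whichever ultrasonic sensor triggers first gives the swipe
// direction; near-simultaneous triggers are treated as a press.
const PRESS_WINDOW_MS = 80;   // both sensors within this window => press
const SWIPE_WINDOW_MS = 600;  // max delay between sensors for a swipe

function classifyGesture(leftTriggerMs, rightTriggerMs) {
  const delta = rightTriggerMs - leftTriggerMs;
  if (Math.abs(delta) <= PRESS_WINDOW_MS) return "PRESS";
  if (Math.abs(delta) <= SWIPE_WINDOW_MS) {
    // left sensor fired first => hand moving toward the right
    return delta > 0 ? "SWIPE_RIGHT" : "SWIPE_LEFT";
  }
  return "NONE"; // triggers too far apart: probably unrelated
}
```

In a MagicMirror module, the result would presumably be forwarded to other modules via `this.sendNotification(...)`, but that wiring depends on how the module is structured.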
-
@MichMich This guy has done it, using your MM2
https://howchoo.com/g/yti5mmq0ntu/add-voice-controls-to-your-raspberry-pi-using-jasper
-
@MichMich Well, what about annyang? I got it working on my RPi 3. See https://github.com/TalAter/annyang-electron-demo. I tried to make a module using your README file, but it didn't work. (I'm not really (really not) good at coding.)
The demo needs some extra dependencies, so it's not simple to write a module. Can you maybe give it a try? As for the demo: I didn't get the actual response in my browser, but when I opened the console I saw it recognized my words really clearly and quickly. (And that for a Dutch guy ^^)
I think if you get this running with a basic setup (hello, show map, show picture), it could even be expanded to run shell commands with your voice to turn your monitor on and off. (And other crazy ideas :P)
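For reference, a minimal annyang command map for that basic setup could look like the sketch below. `showModule` is a hypothetical helper, not a real MagicMirror API; the `typeof` guard just keeps the snippet from crashing outside a browser, where the annyang global doesn't exist:

```javascript
// Hedged sketch: wiring annyang's addCommands/start API
// to a few basic voice commands.
const commands = {
  "hello": () => console.log("Hello!"),
  "show map": () => showModule("map"),       // hypothetical helper
  "show picture": () => showModule("picture")
};

function showModule(name) {
  // Placeholder: a real module would show/hide DOM content here.
  console.log("Showing module: " + name);
}

// annyang is a browser global provided by the annyang library.
if (typeof annyang !== "undefined") {
  annyang.addCommands(commands);
  annyang.start();
}
```

Under the hood annyang rides on the browser's SpeechRecognition API, which is why it works in Electron/Chromium but not plain Node.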
-
You can just install the extra dependencies using `npm install`. The problem with using annyang is that it has an API limit, AFAIK, so you can't use it indefinitely. -
@MichMich Is it possible to restart annyang when the onend event is called?
-
@tyho It probably is. I did not look into the Annyang API.
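For what it's worth, a restart-on-end wrapper could be sketched like this. `keepAlive` is a made-up name; it assumes annyang's `addCallback('end', ...)` and `start()` calls, and is written against that shape so it can also be exercised with a stub. Note that annyang's `start()` reportedly accepts an `autoRestart` option that may already cover this:

```javascript
// Hedged sketch: restart speech recognition whenever the
// 'end' callback fires, so listening continues indefinitely.
function keepAlive(speech) {
  speech.addCallback("end", function () {
    speech.start(); // resume listening as soon as recognition ends
  });
  speech.start();
}

// Real browser usage (annyang is a browser global):
if (typeof annyang !== "undefined") {
  keepAlive(annyang);
}
```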