Solved. It was actually the GPIO library for the ASUS Tinker Board, its RPi.GPIO implementation specifically. It had a stray printf that emitted that annoying message and corrupted the JSON output. I removed that line, rebuilt from source, and now it's working flawlessly. Thanks.
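For anyone hitting the same error: python-shell in json mode expects every stdout line from the Python child to be valid JSON, so a single stray print from any library poisons the stream. A minimal sketch of the failure (the library lines here are hypothetical, not the actual Tinker Board output):

```python
import json

# Lines a node helper might read from the Python child's stdout.
# The middle line stands in for a debug print leaked by a GPIO
# library; it is not valid JSON, so a json-mode parser rejects it.
lines = [
    '{"status": "Flick has started..."}',
    'setting gpio mode',  # hypothetical stray library output
    '{"gesture": "PAGE_INCREMENT"}',
]

messages = []
for line in lines:
    try:
        messages.append(json.loads(line))
    except ValueError:
        # This is the kind of line that triggers the syntax error
        # in python-shell's json mode.
        print("not JSON, would break json mode:", repr(line))

print(messages)
```

Removing the stray print (or switching the library to a proper logger that writes to stderr) leaves only clean JSON on stdout.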
Best posts made by Radu_Stancu
-
RE: Syntax error in python-shell
-
RE: MMM-MovieListings
@mykle1 Thanks, it's getting closer to what I want. I will check the module you mentioned. I am not that familiar with JS, so I didn't recognize what I needed to change to accomplish what I was looking for. I will try again and see if I can make the image a bit bigger and the title white. Thanks a lot for your patience.
-
RE: MMM-MirrorMirrorOnTheWall installation issue
@oceank Yeah, that worked to some extent. When I say "show me images", it replies that it's not supported:
Amazon Photos is not supported on this device. View your photos using the Amazon Photos app or website, or on devices like Fire TV or Echo Show.
When I say "Alexa, start magic mirror", it promptly replies: "Hello my Queen, what can I do for you?", so that part is working (although I would like to change that message later).
-
Flick Large gesture control
Hello all,
I have tried to use the flick Large sensor together with my Raspberry Pi 2 to control the interface of the MagicMirror, as I have made some pages and wanted to scroll through them.
What I have managed to do is to control MMM-pages, MMM-page-indicator and the default news module via my MMM-flick module.
The controls are as follows:
Swipe Left - Decrement Page
Swipe Right - Increment Page
Swipe Up - Show news description
Swipe Up again - Open News page
Swipe Up as the news page is opened - Scroll down the news page
Swipe Down - Close News Page
Swipe Down again - Close News Description
I'm currently working on implementing the touch functionality. I also want to add mouse control.
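The swipe mapping above can be sketched as a small dispatch table. The notification names match the ones used in MMM-flick.py below; the table itself is just an illustration of the dispatch, not the actual implementation:

```python
# Map a (start, finish) swipe pair reported by the flick board to
# the MagicMirror notification it should trigger. Direction pairs
# and notification names are taken from MMM-flick.py.
GESTURES = {
    ("north", "south"): "ARTICLE_LESS_DETAILS",  # swipe down
    ("south", "north"): "ARTICLE_MORE_DETAILS",  # swipe up
    ("west", "east"): "PAGE_DECREMENT",
    ("east", "west"): "PAGE_INCREMENT",
}

def on_flick(start, finish):
    """Return the notification for a swipe, or None if unmapped."""
    return GESTURES.get((start, finish))

print(on_flick("south", "north"))
```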
The code is not perfect; I'm still working on it.
Here is the code (note that I have changed some libraries to make it work with the ASUS Tinker Board; as far as I know, I'm the only one with working flick libs on the Tinker Board. If someone is interested in getting the flick sensor to work with the Tinker Board, PM me):
MMM-flick.py - used to read sensor data
#!/usr/bin/env python
import sys
import json
import time
import signal
import flicklib
import RPi.GPIO as GPIO
#import autopy

GPIO.setmode(GPIO.BOARD)
GPIO.setup(15, GPIO.OUT)
GPIO.setup(7, GPIO.OUT)

# Turn on both LEDs for orange color in stand-by
#GPIO.output(7, True)   # Turn on RED LED
#GPIO.output(15, True)  # Turn on GREEN LED

# Airwheel data
some_value = 5000
last_airwheel = 0
delay = 5000

# Get display size
#width, height = autopy.screen.get_size()

def to_node(type, message):
    # Convert to JSON and print (the node helper reads from stdout)
    try:
        print(json.dumps({type: message}))
    except Exception:
        pass
    # stdout has to be flushed manually to prevent delays
    # in the node helper communication
    sys.stdout.flush()

to_node("status", 'Flick has started...')

@flicklib.flick()
def flick(start, finish):
    # Slide down the newsfeed (DOWN GESTURE)
    if start == "north" and finish == "south":
        to_node("gesture", "ARTICLE_LESS_DETAILS")
        GPIO.output(7, True)
        time.sleep(0.5)
        GPIO.output(7, False)
    # Slide up the newsfeed (UP GESTURE)
    elif start == "south" and finish == "north":
        to_node("gesture", "ARTICLE_MORE_DETAILS")
        GPIO.output(15, True)
        time.sleep(0.5)
        GPIO.output(15, False)
    # Next page (RIGHT GESTURE)
    elif start == "west" and finish == "east":
        to_node("gesture", "PAGE_DECREMENT")
        GPIO.output(7, True)
        time.sleep(0.5)
        GPIO.output(7, False)
    # Previous page (LEFT GESTURE)
    elif start == "east" and finish == "west":
        to_node("gesture", "PAGE_INCREMENT")
        GPIO.output(15, True)
        time.sleep(0.5)
        GPIO.output(15, False)

@flicklib.airwheel()
def spinny(delta):
    global some_value
    global last_airwheel
    global delay
    some_value += delta
    if some_value < 0:
        some_value = 0
    if some_value > 10000:
        some_value = 10000
    now = int(round(time.time() * 1000))
    if now - last_airwheel > delay:
        #to_node()
        last_airwheel = now

# Mouse control via flick board
#@flicklib.move()
#def move(x, y, z):
#    x = int(x * width)
#    y = height - int(y * height)
#    if y > 799:
#        y = 799
#    autopy.mouse.move(x, y)

# Double tap gesture
#@flicklib.double_tap()
#def doubletap(position):
#    ...

# Tap gesture
@flicklib.tap()
def tap(position):
    if position == 'center':
        GPIO.output(15, True)   # Turn on GREEN LED
        time.sleep(0.5)
        GPIO.output(15, False)  # Turn off GREEN LED

# Touch gesture
#@flicklib.touch()
#def touch(position):
#    ...

signal.pause()
node_helper.js
'use strict';

const NodeHelper = require('node_helper');
const {PythonShell} = require('python-shell');

var pythonStarted = false;

module.exports = NodeHelper.create({
    python_start: function () {
        const self = this;
        const pyshell = new PythonShell('modules/' + this.name + '/MMM-flick.py', {
            mode: 'json',
            args: [JSON.stringify(this.config)]
        });

        pyshell.on('message', function (message) {
            if (message.hasOwnProperty('status')) {
                console.log("node_helper_[" + self.name + "] " + message.status);
            }
            if (message.hasOwnProperty('gesture')) {
                console.log("node_helper_[" + self.name + "] " + message.gesture);
                self.sendSocketNotification("gesture_observed", message.gesture);
            }
        });

        pyshell.end(function (err) {
            if (err) throw err;
            console.log("node_helper_[" + self.name + "] " + 'finished running...');
        });
    },

    // Subclass socketNotificationReceived.
    socketNotificationReceived: function (notification, payload) {
        if (notification === 'CONFIG') {
            this.config = payload;
            if (!pythonStarted) {
                pythonStarted = true;
                this.python_start();
            }
        }
    }
});
MMM-flick.js
Module.register("MMM-flick", {
    gesture_up: 0,
    gesture_right: 0,

    // Override socket notification handler.
    socketNotificationReceived: function (notification, payload) {
        if (notification === "gesture_observed") {
            const self = this;
            self.sendNotification(payload);
            if (payload === "up") {
                MM.getModules().withClass(this.config.defaultClass).exceptWithClass(this.config.everyoneClass).enumerate(function (module) {
                    module.hide(1000, function () {
                        Log.log(module.name + ' is hidden.');
                    });
                });
                MM.getModules().withClass("class_up_1_show").enumerate(function (module) {
                    module.show(1000, function () {
                        Log.log(module.name + ' is shown.');
                    });
                });
            } else if (payload === "down") {
                MM.getModules().withClass("class_up_1_show").enumerate(function (module) {
                    module.hide(1000, function () {
                        Log.log(module.name + ' is hidden by gesture.');
                    });
                });
            }
        }
    },

    start: function () {
        this.current_user = null;
        this.sendSocketNotification('CONFIG', this.config);
        Log.info('Starting module: ' + this.name);
    }
});
-
RE: MMM-awesome-alexa snowboy issue
@mattsharp Thanks for trying. I will give it a go soon enough; I have tried the Google Assistant module, but I'll probably return to Alexa soon.
-
RE: Flick Large gesture control
Updated the code, and also added support for the Asus Tinker Board.
-
RE: Flick Large gesture control
That is exactly what I'm planning to do. In theory it works behind the mirror, but you will more than likely lose the touch gestures. Unfortunately the mirror has not been delivered yet, but I'll post pictures/results when I get it.
So far I wrote the code this way:
Left/right swipes go through MMM-pages & MMM-pages_indicator
Up/down swipes open/close newsfeed details
Each gesture also triggers the LED on the sensor board.
I’m still thinking what other things I can do with the remaining gestures.
I’ll update the correct code after I clean it up a bit.
-
RE: Flick Large gesture control
@steffenschmidt, I was successful to some extent. I have managed to use the gestures from the board to control the interface. Up, down, left and right are working and more will follow when I know what I want them to do. The only problem is that the flick sensor is NOT WORKING behind the mirror, probably because of the aluminum/silver layer that makes a mirror a mirror.
I will work on the project in the winter holidays and post some more details.
-
RE: Flick Large gesture control
@xne0n, I have made some progress: it runs perfectly smoothly on the Asus Tinker Board, and I have managed to get mouse input working decently. It works by putting your finger close to the flick; the orange LED turns on, and then the mouse follows your finger. It is not mouse-pad precision, but it gets the job done. I will update the source code, or maybe add it as a module on GitHub, after I finish writing my thesis.
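The mouse control boils down to the coordinate mapping in the commented-out move() handler: the flick board reports normalized x/y in [0, 1], which gets scaled to screen pixels, flipped vertically, and clamped. A runnable sketch of just that math (the 1280x800 resolution is an assumption based on the "y > 799" clamp in the original code, and the actual autopy.mouse.move call is left out so this runs anywhere):

```python
# Assumed display size; the original code clamps y at 799, which
# suggests an 800-pixel-high screen. Adjust for your panel.
WIDTH, HEIGHT = 1280, 800

def flick_to_screen(x, y):
    """Map normalized flick coordinates (0..1) to screen pixels."""
    px = int(x * WIDTH)
    # The flick's y axis points up while the screen's points down,
    # so flip it.
    py = HEIGHT - int(y * HEIGHT)
    # Clamp to the visible area.
    px = max(0, min(px, WIDTH - 1))
    py = max(0, min(py, HEIGHT - 1))
    return px, py

print(flick_to_screen(0.5, 0.5))  # (640, 400)
print(flick_to_screen(1.0, 0.0))  # (1279, 799)
```

In the real module the returned pair would be fed to something like autopy.mouse.move(px, py) inside the flicklib move callback.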