ChatGPT integration
-
@SILLEN-0 as I said, add console.log statements to the node_helper so you can see the flow in the console output…
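for example, something along these lines at the start of socketNotificationReceived (the exact message text doesn't matter, you just want to see which steps actually run in the npm start / pm2 output):

socketNotificationReceived: function (notification, payload) {
    // log every notification that arrives so the flow shows up in the console output
    console.log(this.name + " got notification: " + notification + " with payload: ", payload);
    // ... rest of the handler unchanged
},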
-
Yeah, I just did that and I get this error in the console:
0|MagicMirror | [14.01.2023 13:01.03.714] [LOG] Received QUESTION notification with payload: What is the weather like today?
0|MagicMirror | [14.01.2023 13:01.03.724] [LOG] Error in getResponse: TypeError: Cannot set properties of undefined (setting 'apiKey')
0|MagicMirror |     at /home/pi/MagicMirror/modules/chatgpt/node_helper.js:12:27
0|MagicMirror |     at new Promise (<anonymous>)
0|MagicMirror |     at Class.getResponse (/home/pi/MagicMirror/modules/chatgpt/node_helper.js:11:16)
0|MagicMirror |     at Class.socketNotificationReceived (/home/pi/MagicMirror/modules/chatgpt/node_helper.js:31:18)
0|MagicMirror |     at Socket.<anonymous> (/home/pi/MagicMirror/js/node_helper.js:108:11)
0|MagicMirror |     at Socket.emit (node:events:513:28)
0|MagicMirror |     at Socket.emitUntyped (/home/pi/MagicMirror/node_modules/socket.io/dist/typed-events.js:69:22)
0|MagicMirror |     at /home/pi/MagicMirror/node_modules/socket.io/dist/socket.js:614:39
0|MagicMirror |     at process.processTicksAndRejections (node:internal/process/task_queues:78:11)
and this is my node_helper file:
const NodeHelper = require("node_helper");
const openai = require("openai").default;
console.log(openai);

module.exports = NodeHelper.create({
    start: function () {
        console.log("Starting node helper for: " + this.name);
    },

    // Send a message to the chatGPT API and receive a response
    getResponse: function (question) {
        return new Promise((resolve, reject) => {
            openai.apiKey = "XXXXXXXXXXXXXX";
            openai.Completion.create(
                {
                    prompt: question,
                    temperature: 0.7
                },
                (error, response) => {
                    if (error) {
                        reject(error);
                    } else {
                        console.log("Received response from API: ", response);
                        resolve(response.choices[0].text);
                    }
                }
            );
        });
    },

    // Handle socket notifications
    socketNotificationReceived: function (notification, payload) {
        if (notification === "QUESTION") {
            console.log("Received QUESTION notification with payload: ", payload);
            this.getResponse(payload)
                .then((response) => {
                    console.log("Sending RESPONSE notification with payload: ", response);
                    this.sendSocketNotification("RESPONSE", response);
                })
                .catch((error) => {
                    console.log("Error in getResponse: ", error);
                });
        }
    }
});
-
@SILLEN-0 was openai in context before the return new Promise inside getResponse()?
-
@sdetweil If I'm honest, I have no idea what you mean by that. As I said, I am very bad at any sort of coding; this whole thing is held together with duct tape. Could you try to explain what you mean in more detail?
-
@SILLEN-0 the error says that at the time of trying to assign the apiKey to the openai object, the openai object is undefined
TypeError: Cannot set properties of undefined (setting 'apiKey')
0|MagicMirror |     at /home/pi/MagicMirror/modules/chatgpt/node_helper.js:12:27
‘undefined’ here is openai,
so, I see you print the object just after the require() at the top
const openai = require("openai").default;
console.log(openai);
and I 'assume' that you looked at the output of npm start (where console.log messages go) and are satisfied that the require("openai") worked
so now, to check again later, I would add another console.log(openai)
after the getResponse:

getResponse: function (question) {
    // here .. is the openai object good (not null) here?
    return new Promise((resolve, reject) => {
        openai.apiKey = "XXXXXXXXXXXXXX"; // this is the statement that failed
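so concretely, the top of getResponse with the extra check would look something like this (the key is still just your placeholder):

getResponse: function (question) {
    // check whether openai is still a real object (not undefined) at this point
    console.log("openai inside getResponse: ", openai);
    return new Promise((resolve, reject) => {
        openai.apiKey = "XXXXXXXXXXXXXX"; // the statement that currently fails
        // ... rest of the function unchanged
    });
},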
-
HOLY SHIT, I GOT IT TO WORK! OK, I need to calm down, I am so happy right now. So it turns out I was just trying to call everything with openai.apiKey instead of just apiKey, and then I had to do something else, I don't even remember what, but now it works and it displays on the MagicMirror. But now comes the hard part. If you look in the code there is a variable named question that the prompt uses and sends to the API. I am no expert, but can you have the variable question be defined with some kind of speech-to-text thing?
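For reference, with the openai package that was current at the time (v3), a working getResponse usually looks roughly like the sketch below; the model name and key are placeholders, and I honestly can't say this is exactly what I ended up with:

const NodeHelper = require("node_helper");
const { Configuration, OpenAIApi } = require("openai");

// build a client once with the API key instead of setting openai.apiKey later
const configuration = new Configuration({ apiKey: "XXXXXXXXXXXXXX" });
const openaiClient = new OpenAIApi(configuration);

module.exports = NodeHelper.create({
    // ... start and socketNotificationReceived unchanged ...

    getResponse: function (question) {
        // createCompletion already returns a promise, so no manual new Promise is needed
        return openaiClient
            .createCompletion({
                model: "text-davinci-003", // placeholder; use whichever completion model you have access to
                prompt: question,
                temperature: 0.7,
                max_tokens: 250
            })
            .then((response) => response.data.choices[0].text);
    }
});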
-
@SILLEN-0 said in ChatGPT integration:
I am no expert, but can you have the variable question be defined with some kind of speech-to-text thing?
I don't know what you mean…
unless you want to capture speech and convert that to text for openai
welcome to the problem I have highlighted since the beginning… there is no GOOD speech capture library, and nothing built in.
this is why MMM-GoogleAssistant provides mechanisms (recipes) to use the captured text for non-Google uses.
MMM-Voice uses the pocketsphinx lib, from Carnegie Mellon, I think. This one is terrible for me, <60% accurate, and I have to keep saying things over and over.
GA and most others use cloud-based services (none free),
but they put a filter (a hotword, like "ok google" or "alexa") in front to keep the cost down rather than converting everything. The other mirror platform I support, smart-mirror, is voice based, so your 'plugin' can get the text from the speech. It is not compatible with MM.
-
I have MMM-GoogleAssistant and I have used the recipes for some other stuff, but could you get that to work with ChatGPT? Saying something like "ok google, chatgpt" followed by the question you want to ask, and then it takes the question, makes it a variable or something the prompt can read, and then refreshes the answer on the MagicMirror. Also, what do you mean by (none free)? MMM-GoogleAssistant is free. But I think it wouldn't be a big issue having to pay for ChatGPT, as in all of my testing and API calls I have only paid 0.09 USD.
-
@SILLEN-0 so to use a GA recipe: it would send a notification, your module side would get that and send that text to node_helper… the rest is the same
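roughly like this on the module side, assuming the recipe sends a notification with the recognized text as the payload (the notification name below is only an example, match it to whatever your recipe actually sends):

// in the chatgpt module file (front end), not node_helper.js
notificationReceived: function (notification, payload, sender) {
    // "CHATGPT_QUESTION" is a made-up example name; use the one from your GA recipe
    if (notification === "CHATGPT_QUESTION") {
        // forward the recognized speech text to node_helper, same path as before
        this.sendSocketNotification("QUESTION", payload);
    }
},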
-
@SILLEN-0 meaning of life. I assume the service has blocked killer questions.