Normal_gaussian, 2 days ago:
I've not had any issues with the audio pickup, but it's in the living room rather than the kitchen. I have Alexas in most rooms. I don't play music through it; that I still do through the Alexas. Tbh I think the mic and the speakers will be fine once the rest of the 'product' is sorted. I failed to mention I have Claude connected to it rather than their default assistant. For us, this just beats Alexa hands down. I have the default assistant on another wake word and Mistral on the last; they're about as good as Alexa, but I rarely use them.
joshstrange, a day ago:
Interesting, well I'm glad it's working well for you all. I tested with local, HA Cloud, and ChatGPT/Claude, and that wasn't the sticking point; it was getting the hardware to hear me, or for me to hear it. I will say, while it was too slow (today) with my local inference hardware (CPU, an older computer, and a little on my newer MBP), it was magical to talk to HA and hear back from it all locally. I look forward to a future where I can do that at the same speed/quality as the cloud models. Yes, I know cloud models will continue to get better, but turning on/off my fans/lights/etc doesn't need the best model available; it just needs to be reliable and fast. I'm even fine with it "shelling out" to the cloud if I ask for something outside of the basics, though I doubt I'll care to do that.
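[Editor's note: the hybrid setup described above — a fast local path for basic device commands, with a cloud model as fallback — can be sketched roughly as below. This is a purely hypothetical illustration, not Home Assistant's actual API; the handler names and the regex are invented for the example.]

```python
import re

# Matches simple on/off commands for a few device types (illustrative only).
BASIC_COMMAND = re.compile(r"\bturn (on|off)\b.*\b(fan|light|lights)\b")

def handle_locally(utterance: str) -> str:
    # Fast, reliable local path: no network round-trip needed.
    action, device = BASIC_COMMAND.search(utterance).groups()
    return f"ok: {device} {action}"

def handle_in_cloud(utterance: str) -> str:
    # Placeholder for a call out to a hosted model (e.g. Claude via an API).
    return f"cloud: {utterance}"

def route(utterance: str) -> str:
    """Route basic on/off commands locally; 'shell out' for everything else."""
    text = utterance.lower()
    if BASIC_COMMAND.search(text):
        return handle_locally(text)
    return handle_in_cloud(utterance)
```

For example, `route("Turn off the kitchen lights")` stays local, while an open-ended question falls through to the cloud handler.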