▲ How a dancer with ALS used brainwaves to perform live (electronicspecifier.com)
33 points by 1659447091 5 hours ago | 3 comments
▲ usui 7 minutes ago | parent | next [-]
The featured video does not explain which signals produce which outcomes; they basically just say "we use machine learning" while outputting a dance. At 07:10 it looks like the person is choosing between two binary options, "sad" or "relieved". Unfortunately, I doubt the person has as much real-time input to the live performance as is being claimed. Dentsu is also an advertising company in Japan, so this seems like more marketing than technology. Human dances are always choreographed beforehand, but a live performance shows physical motion that can deviate at any time. I have a hard time believing this person's brainwaves are producing the 3D hologram, beyond instructing it which mood preset to use at a given time.
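A minimal sketch of the kind of binary control the commenter suspects: classify a short EEG window by comparing alpha-band and beta-band power, and map the result to one of two mood presets. All names, band choices, the signal model, and the preset labels are hypothetical illustrations, not details from the article.

```python
# Hypothetical sketch: pick one of two mood presets from a single
# EEG channel by comparing alpha (8-12 Hz) vs beta (13-30 Hz) power.
import numpy as np

FS = 256  # assumed sample rate in Hz

def band_power(window: np.ndarray, lo: float, hi: float, fs: int = FS) -> float:
    """Mean spectral power of `window` between `lo` and `hi` Hz."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].mean())

def choose_preset(window: np.ndarray) -> str:
    """Binary choice: alpha-dominant -> 'relieved', otherwise 'sad'."""
    alpha = band_power(window, 8, 12)
    beta = band_power(window, 13, 30)
    return "relieved" if alpha > beta else "sad"

# Synthetic one-second windows standing in for real EEG.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
noise = rng.normal(scale=0.1, size=FS)
calm = np.sin(2 * np.pi * 10 * t) + noise   # strong 10 Hz alpha rhythm
tense = np.sin(2 * np.pi * 25 * t) + noise  # beta-dominated activity

print(choose_preset(calm))   # relieved
print(choose_preset(tense))  # sad
```

A real system would add filtering, artifact rejection, and a trained classifier, but even then the output is still one bit per decision window, which is consistent with the "choose a mood preset" reading rather than continuous control of the hologram.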
▲ MajorTakeaway 4 hours ago | parent | prev [-]
Now is a really good time to contribute to https://openeeg.sourceforge.net/doc/ as far as EEG concerns go. There are myriad things that can be observed with EEG, and it would honestly be a decent project to see grow over time.