Meta Unveils Wristband for Controlling Computers with Hand Gestures(nytimes.com)
43 points by Anonboxis 6 days ago | 23 comments
nebben64 6 days ago | parent | next [-]

https://archive.vn/9aDER

mrbigbob 6 days ago | parent | prev | next [-]

For those curious, Meta actually bought out the company that originally pioneered this idea (a wrist controller), CTRL-Labs, in 2019. Here is a Verge article with some photos of the CTRL-Labs prototype. https://www.theverge.com/2018/6/6/17433516/ctrl-labs-brain-c...

tomhow 3 days ago | parent [-]

https://archive.md/j0d78

Another company doing this (mentioned in the Verge article) was Thalmic Labs, a YC company from 2013, which was acquired by Google in 2020. I remember seeing their presentation at YC Demo Day and it was jaw-dropping stuff; one of the only demos I still remember, 12 years on.

It's sad to see they didn't make it as a commercial success, and is a grim reminder that brilliant innovation doesn't assure a successful outcome.

Caddickbrown 3 days ago | parent [-]

Pretty sure Thalmic sold the tech to CTRL+. I’ve still got one of the bands knocking around somewhere. It was cool tech, but really wasn’t ready for a product.

Thalmic then became North to make smart glasses and then got sold to Google

aitacobell 6 days ago | parent | prev | next [-]

Was skeptical (especially because it's Meta) until it said it's designed for accessibility. Reminds me of the Xbox Adaptive Controller. A lot of devices designed for accessibility end up leading to cool user design discoveries.

oc1 2 days ago | parent | prev | next [-]

I would say such inventions are as old as 30 years; that's when I first heard of startups and inventors trying to do this kind of thing. Obviously the tech must be much more mature now. Still, it never took off back then because typing was orders of magnitude faster than whatever you could do with your hands alone. Learning such hand motions had a similar fate as alt keyboard layouts, which always stayed niche: most people have no patience to learn something that complicated when they've already learned something early on that works.

LorenDB 6 days ago | parent | prev | next [-]

Why is this just now news? They already built a similar device for their Project Orion glasses. As far as I can tell, this is just the same thing but with a PC driver.

the-rc 6 days ago | parent | next [-]

The paper was just published in Nature https://www.nature.com/articles/s41586-025-09255-w (the preprint was out almost 18 months ago)

nebben64 6 days ago | parent | prev | next [-]

Having tried prototypes at neuroscience conferences where their team attended, I can tell you that the device was incredibly brittle (e.g. damp wrist, interference from even the metal table or a nearby computer).

As the article says, the device now seems more robust and close to market-ready, after using ML to tune the decoding model on EMG data contributed by many participants.

gopher_space 2 days ago | parent [-]

Having tried prototypes of similar ideas since the 90s it seems like something wipes our memory of gorilla arm once a decade.

the-rc a day ago | parent [-]

Why gorilla arm? This doesn't necessarily require lifting it. There's an old video around with Zuck doing gestures while walking and he starts with his arm mostly at rest. Even in the worst case, how is it more tiring than a phone?

etrautmann 3 days ago | parent | prev [-]

You’re correct that this was publicly announced last fall along with Orion. This is back in the news now because of the recent Nature paper demonstrating the performance of general models on new participants without additional training data. It has nothing to do with PC drivers.
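The headline result described here, a generic decoder working on participants it never saw during training, is typically measured with a held-out-participant (leave-one-subject-out) evaluation. A minimal sketch of that split logic, with entirely synthetic participant data (names, shapes, and labels here are illustrative, not from the paper):

```python
import numpy as np

# Synthetic stand-in for per-participant data: 5 participants,
# each with 100 trials of 8-dimensional EMG features and 4 gesture labels.
rng = np.random.default_rng(0)
data = {p: (rng.standard_normal((100, 8)), rng.integers(0, 4, size=100))
        for p in ["p1", "p2", "p3", "p4", "p5"]}

def leave_one_out_splits(participants):
    """Yield (train_participants, held_out) pairs, one per participant.

    The held-out participant contributes no training data, so accuracy
    on them estimates zero-shot generalization to a new user.
    """
    for held_out in participants:
        train = [p for p in participants if p != held_out]
        yield train, held_out

splits = list(leave_one_out_splits(list(data)))
print(len(splits))  # one evaluation fold per held-out participant
```

Per-user fine-tuning experiments would then add some of the held-out participant's own data back into training, which is exactly what the "without additional training data" claim rules out.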

falcor84 2 days ago | parent | prev | next [-]

I'm a bit surprised that Meta didn't choose to announce a brand name for it yet, so the article just refers to it throughout as "Meta’s wristband".

eviks 2 days ago | parent | prev | next [-]

Very little info about its capability. How many distinct gestures does it have? Is it sophisticated enough to allow typing?

the-rc a day ago | parent | next [-]

None of that is announced yet, but there are two open source datasets for gestures and typing:

https://github.com/facebookresearch/emg2pose

https://github.com/facebookresearch/emg2qwerty

Infer what you will.

(I helped with their release and last month gave a presentation on the project's original research infrastructure, but I'm no longer on the team and I definitely never was allowed to talk about final products.)
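Both datasets are continuous multichannel surface-EMG recordings, and a first preprocessing step in most EMG decoding pipelines is slicing the signal into fixed-length overlapping windows for the model. A generic sketch of that step (the sampling rate, channel count, and window sizes below are illustrative, not the repos' actual schema):

```python
import numpy as np

def window_emg(emg, win_len, hop):
    """Slice a (samples, channels) sEMG recording into overlapping
    windows, returning shape (n_windows, win_len, channels)."""
    n_samples, _ = emg.shape
    starts = range(0, n_samples - win_len + 1, hop)
    return np.stack([emg[s:s + win_len] for s in starts])

# Toy stand-in for a recording: 2 seconds of 16-channel sEMG at 2 kHz.
rng = np.random.default_rng(0)
emg = rng.standard_normal((4000, 16))

# 200 ms windows with a 100 ms hop (50% overlap).
windows = window_emg(emg, win_len=400, hop=200)
print(windows.shape)  # (19, 400, 16)
```

The overlap trades latency against temporal resolution of the decoded output, which matters for something like emg2qwerty, where keystrokes have to be localized in time.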

garyfirestorm 2 days ago | parent | prev [-]

If it were, that would be the headline.

dartharva 3 days ago | parent | prev | next [-]

So I guess the Kinect has vanished from everyone's memory

oc1 2 days ago | parent [-]

From mine, certainly. Funny how fast we forget about tech that was pretty common for years and then completely disappeared once it turned out to be a fad.

walterbell 2 days ago | parent [-]

> completely disappeared

Became iPhone FaceID.

HWR_14 2 days ago | parent [-]

The Kinect v1 sensor did. The Kinect v2 used different tech. But as a control it disappeared.

physarum_salad 2 days ago | parent | prev | next [-]

It's not the 1980s. Dropping this in the NYT seems very underwhelming.

ReptileMan 2 days ago | parent | prev | next [-]

Can I map the middle finger, or is it already built in?

rkagerer 3 days ago | parent | prev [-]

As if I don't have bad enough RSI already. (Although a diverse repertoire of gestures might actually be better than repetitive taps)