Scientists Make Smartphone App That Performs Full-Body Motion Capture(petapixel.com)
15 points by PaulHoule 7 days ago | 10 comments
freehorse 4 hours ago | parent | next [-]

The university's press article is better written [0]. The original publication is at [1]. The GitHub repo, including model weights, is at [2].

[0] https://news.northwestern.edu/stories/2024/10/app-performs-m...

[1] https://dl.acm.org/doi/10.1145/3654777.3676461

[2] https://github.com/SPICExLAB/MobilePoser

divan 5 hours ago | parent | prev | next [-]

In my experience, pose estimation is a relatively solved problem. The biggest issue is estimating camera movement, especially across cuts. There are a few commercial solutions for AI mocap (move.ai, deepmotion, wonderdynamics, cascadeur and others), and it's still far from solved. Accurate motion estimation requires lens correction, plus the lens focal length and sensor size. I don't know if any of these solutions support zoom level changes.

There are tons of practical applications for AI mocap, especially in sports. But the requirement to calibrate users (standing in a T-pose) and to use only a calibrated, fixed camera limits these AI mocap solutions to lab settings.
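
Roughly the kind of lens-correction step I mean, assuming the intrinsics and distortion coefficients are already known from a prior (e.g. checkerboard) calibration; the numbers below are placeholders, not real calibration values:

    import cv2
    import numpy as np

    # Intrinsics and distortion from a prior calibration (placeholder values):
    # fx, fy = focal length in pixels; cx, cy = principal point;
    # dist = (k1, k2, p1, p2, k3).
    K = np.array([[1400.0,    0.0, 960.0],
                  [   0.0, 1400.0, 540.0],
                  [   0.0,    0.0,   1.0]])
    dist = np.array([-0.12, 0.03, 0.0005, -0.0003, 0.0])

    frame = cv2.imread("frame.png")
    h, w = frame.shape[:2]

    # New camera matrix that keeps all valid pixels, then undistort the frame.
    new_K, roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 1.0)
    undistorted = cv2.undistort(frame, K, dist, None, new_K)

    # 2D keypoints from a pose estimator can be undistorted directly as well.
    pts = np.array([[[500.0, 300.0]], [[820.0, 410.0]]], dtype=np.float32)
    pts_undist = cv2.undistortPoints(pts, K, dist, P=new_K)

Zoom changes are exactly what breaks this: fx/fy and the distortion coefficients are only valid for one focal length.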

freehorse 4 hours ago | parent | next [-]

They use IMU sensors, not cameras. The linked article is a bit bad in that it does not make this clear, and it has a quite misleading hero image (in the tradition of useless hero images). They try to estimate pose in detail from IMU signals. While pose estimation from IMUs is not new, I am not sure whether this level of detail (beyond a general activity classification) has really been achieved before.
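
For a rough idea of what "pose from IMU signals" means in practice, a toy sequence-to-pose regressor could look like the sketch below. This is not MobilePoser's actual architecture; the device count, feature layout and layer sizes are made up for illustration:

    import torch
    import torch.nn as nn

    class IMUPoseRegressor(nn.Module):
        """Toy model: per-frame IMU features -> per-frame joint rotations."""

        def __init__(self, n_devices=2, n_joints=24, hidden=256):
            super().__init__()
            # Per device: 3 accel values + 9 values of a flattened rotation
            # matrix from the device's orientation estimate = 12 features.
            in_dim = n_devices * 12
            self.rnn = nn.LSTM(in_dim, hidden, num_layers=2, batch_first=True)
            # Predict a 6D rotation representation per joint.
            self.head = nn.Linear(hidden, n_joints * 6)

        def forward(self, imu_seq):
            # imu_seq: (batch, time, n_devices * 12)
            h, _ = self.rnn(imu_seq)
            return self.head(h)  # (batch, time, n_joints * 6)

    model = IMUPoseRegressor()
    dummy = torch.randn(1, 120, 24)  # 1 clip, 120 frames, 2 devices x 12 feats
    print(model(dummy).shape)        # torch.Size([1, 120, 144])

The hard part is that two or three body-worn IMUs heavily underdetermine the full pose, which is why a learned prior over human motion is needed at all.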

rubyfan 3 hours ago | parent [-]

For us normies, IMU = Inertial measurement unit

https://en.m.wikipedia.org/wiki/Inertial_measurement_unit

HeatrayEnjoyer 3 hours ago | parent | prev [-]

Lens adjustment, sensor adjustment, this is a problem ripe for the deep learning "bitter lesson"

voidUpdate 4 hours ago | parent | prev | next [-]

I mean, this has been done before by someone who I'm pretty sure is just an enthusiast, not a scientist: https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker
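
As far as I understand, that project detects printed AprilTags attached to the body with a webcam and recovers each tag's pose. A minimal sketch of that detection step with the pupil-apriltags package (intrinsics and tag size below are placeholders):

    import cv2
    from pupil_apriltags import Detector

    detector = Detector(families="tag36h11")

    # Camera intrinsics (fx, fy, cx, cy) in pixels and printed tag edge length
    # in meters -- placeholders; real values come from a one-off calibration.
    camera_params = (1400.0, 1400.0, 960.0, 540.0)
    tag_size_m = 0.08

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        tags = detector.detect(gray, estimate_tag_pose=True,
                               camera_params=camera_params,
                               tag_size=tag_size_m)
        for tag in tags:
            # pose_R / pose_t are the tag's rotation and translation relative
            # to the camera, which is what drives the body trackers.
            print(tag.tag_id, tag.pose_t.ravel())
        cv2.imshow("tags", frame)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()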

freehorse 4 hours ago | parent | next [-]

This is not at all what they did here. Here they use the IMU sensors in the phones, not the cameras. Using cameras is too restrictive if one wants to apply this to more real-life situations, due to visual obstruction etc. They are by no means the first to estimate pose and movement from IMU sensors, but I guess it is still a valuable contribution to have an app that works out of the box, and they also seem to estimate more detail in the pose than just a general classification.
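
For context on what a phone's IMU actually provides before any learned model: orientation is typically obtained by fusing the gyroscope and accelerometer, e.g. with a simple complementary filter. A rough sketch (real pipelines use the OS's fused rotation vector or a Madgwick/Kalman filter instead):

    import numpy as np

    def complementary_filter(gyro, accel, dt, alpha=0.98):
        """Fuse gyro rates (rad/s) and accel (m/s^2) into roll/pitch angles.

        gyro, accel: arrays of shape (T, 3); dt: sample period in seconds.
        Returns an array of shape (T, 2) with [roll, pitch] in radians.
        """
        roll, pitch = 0.0, 0.0
        out = np.zeros((len(gyro), 2))
        for t, (g, a) in enumerate(zip(gyro, accel)):
            # Accelerometer gives an absolute (but noisy) gravity direction.
            acc_roll = np.arctan2(a[1], a[2])
            acc_pitch = np.arctan2(-a[0], np.hypot(a[1], a[2]))
            # Gyro integration is smooth but drifts; blend the two estimates.
            roll = alpha * (roll + g[0] * dt) + (1 - alpha) * acc_roll
            pitch = alpha * (pitch + g[1] * dt) + (1 - alpha) * acc_pitch
            out[t] = roll, pitch
        return out

    # 2 seconds of fake 100 Hz data: no rotation, gravity along +z.
    T = 200
    angles = complementary_filter(np.zeros((T, 3)),
                                  np.tile([0.0, 0.0, 9.81], (T, 1)), dt=0.01)
    print(angles[-1])  # ~[0, 0]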

leonheld 4 hours ago | parent | prev [-]

> just an enthusiast, not a scientist

Considering the anime character, it's 50/50. /s

rspoerri 3 hours ago | parent | prev [-]

Cool, now every smartphone can track not only your position but also your exact movements. :-) /s