| ▲ | bilsbie 6 days ago |
| I’m waiting like crazy for one of these to show up in VR. |
|
| ▲ | kridsdale1 6 days ago | parent | next [-] |
| Check out visionOS 26’s Immersive Photo mode. Any photo in your iCloud library gets converted by an on-device model into (I assume) a Gaussian-splat 3D scene that you can pan and dolly around in. It’s the killer feature that justifies the whole cost of the Vision Pro. The better the source data, the better it works. I can literally walk into scenes I shot on my Nikon D70 in 2007, and they, and the people, look real. |
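(Apple hasn’t documented the representation, so the splat part is a guess, but assuming it’s right, the core rendering idea is simple enough to sketch. A toy sketch with numpy: isotropic splats, pinhole projection, back-to-front alpha compositing. All names and parameters here are illustrative, not Apple’s pipeline.)

```python
import numpy as np

def splat_render(gaussians, width=64, height=64, f=50.0):
    """Render isotropic 3D Gaussians into an image by painter's-order
    alpha compositing. Each gaussian is (x, y, z, radius, r, g, b, alpha)
    in camera space, +z pointing away from the camera."""
    img = np.zeros((height, width, 3))
    # Sort far-to-near so nearer splats composite on top of farther ones.
    for x, y, z, radius, r, g, b, a in sorted(gaussians, key=lambda g: -g[2]):
        if z <= 0:
            continue  # behind the camera
        # Pinhole projection to pixel coordinates.
        px = f * x / z + width / 2
        py = f * y / z + height / 2
        sigma = f * radius / z  # screen-space footprint shrinks with depth
        ys, xs = np.mgrid[0:height, 0:width]
        w = a * np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2 * sigma ** 2))
        img = img * (1 - w[..., None]) + w[..., None] * np.array([r, g, b])
    return img

# One red splat straight ahead: lands in the middle of the frame.
img = splat_render([(0.0, 0.0, 5.0, 0.5, 1.0, 0.0, 0.0, 1.0)])
```

Because each splat’s projected position and size depend on the camera, the same baked set of gaussians supports the pan/dolly motion described above; real implementations do this on the GPU with anisotropic covariances rather than a per-splat full-frame pass.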
| |
| ▲ | bee_rider 6 days ago | parent [-] | | That is neat. Although, I can think of some old family photos where half the people in them are dead by now (nothing catastrophic, just time). I wonder how it would feel to walk around in that sort of photo. |
|
|
| ▲ | jsheard 6 days ago | parent | prev | next [-] |
| Please don't hold your breath, they're still pretty far from high-res 120fps with consistent stereo and milliseconds of latency. |
| |
| ▲ | geokon 6 days ago | parent | next [-] | | Isn't it picture-to-3D-model? You'd generate the environment/model ahead of time and then "dive in" to the photo | | |
| ▲ | jsheard 6 days ago | parent [-] | | I suppose that's an option yeah, but when people envision turning this kind of thing into a VR holodeck I think they're expecting unbounded exploration and interactivity, which precludes pre-baking everything. Flattening the scene into a diorama kind of defeats the point. | | |
| ▲ | throwmeaway222 6 days ago | parent | next [-] | | I actually would rather it be a 3D model, so that I don't need to believe they're microwaving a goddamn full-size whale for 45 minutes' worth of electricity | |
| ▲ | andoando 6 days ago | parent | prev [-] | | You just need to prerender the 3D world. If it's truly exportable as a 3D model, re-rendering it in real time based on input is trivial |
|
| |
| ▲ | jimmySixDOF 6 days ago | parent | prev [-] | | While discussing Google Genie v3 and AndroidXR, Bilawal Sidhu said: "to create an even faster, lower latency pipeline to go from like 24 fps to like 100 fps. I could see that being more of an engineering problem than a research one at this point." https://youtu.be/VslvofY16I0?t=886 | | |
| ▲ | x187463 6 days ago | parent [-] | | Based on just about every Two Minute Papers video, engineering and research attack the latency from both sides: hardware delivers steady improvements, and an occasional paper introduces a new or improved approach that decimates the compute required. |
|
|
|
| ▲ | dannersy 6 days ago | parent | prev [-] |
| That would be the most motion-sickness-inducing thing you could possibly do in its current state. The FOV on these videos is super wonky. |