modeless 12 hours ago

Foveated streaming! That's a great idea. Foveated rendering is complicated to implement with current rendering APIs in a way that actually improves performance, but foveated streaming seems like a much easier win that applies to all content automatically. And the dedicated 6 GHz dongle should do a much better job at streaming than typical wifi routers.

> Just like any SteamOS device, install your own apps, open a browser, do what you want: It's your PC.

It's an ARM Linux PC that presumably gives you root access, in addition to being a VR headset. And it has an SD card slot for storage expansion. Very cool, should be very hackable. Very unlike every other standalone VR headset.

> 2160 x 2160 LCD (per eye) 72-144Hz refresh rate

Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use. But the price is hopefully low. I'd love to see a high-end option with higher resolution displays in the future, good enough for monitor replacement.

> Monochrome passthrough

So AR is not a focus here, which makes sense. However:

> User accessible front expansion port w/ Dual high speed camera interface (8 lanes @ 2.5Gbps MIPI) / PCIe Gen 4 interface (1-lane)

Full color AR could be done as an optional expansion pack. And I can imagine people might come up with other fun things to put in there. Mouth tracking?

One thing I don't see here is optional tracking pucks for tracking objects or full body tracking. That's something the SteamVR Lighthouse tracking ecosystem had, and the Pico standalone headset also has it.

More detail from the LTT video: Apparently it can run Android APKs too? Quest compatibility layer maybe? There's an optional accessory kit that adds a top strap (I'm surprised it isn't standard) and palm straps that enable using the controllers in the style of the Valve Index's "knuckles" controllers.

bigiain 8 hours ago | parent | next [-]

> Foveated streaming! That's a great idea.

Back when I was in Uni, so late 80s or early 90s, my dad was Project Manager on an Air Force project for a new F-111 flight simulator, when Australia upgraded the avionics on their F-111 fighter/bombers.

The sim cockpit had a spherical dome screen and a pair of Silicon Graphics Reality Engines. One of them projected an image across the entire screen at a relatively low resolution. The other projector was on a turret that panned and tilted with the pilot's helmet, and projected a high resolution image, but only in a perhaps 1.5m circle directly in front of where the helmet was aimed.

It was super fun being the project manager's kid, and getting to "play with it" on weekends sometimes. You could see what was happening while wearing the helmet and sitting in the seat if you tried - mostly by intentionally pointing your eyes in a different direction to your head - but when you were "flying around" it was totally believable, and it _looked_ like everything was high resolution. It was also fun watching other people fly it, and being able to see where they were looking - and where they weren't looking when the enemy was sneaking up on them.

zeroq 5 hours ago | parent | next [-]

I'll share a childhood story as well.

Somewhere between '93 and '95 my father took me abroad to Germany and we visited a gaming venue. It was packed with typical arcade machines - games where you sit in a cart holding a pistol and shoot things on the screen while the cart moves all over the place simulating a bumpy ride, etc.

But the highlight was a full 3D experience shooter. You got yourself into a tiny ring, put on a 3D headset, and held a single puck in your hand. Rotate the puck and you move. Push the button and you shoot. Look around with your head. Most memorable part - you could duck to avoid shots! The game itself, as I remember it, was full wireframe, akin to Q3DM17 (The Longest Yard) minus the jump pads, but the layout was kind of similar. The player held a dart gun - you had a single shot and had to wait until the projectile decayed or connected with another player.

I'm not entirely sure if the game was multiplayer or not.

I often come back to that memory because shortly after, within that same time frame, my father took me to a computer fair where I had the opportunity to play Doom/Hexen with a VFX1 (or whatever it was called), and it was supposed to revolutionize the world the way AI is supposed to now.

Then there was the P5 glove, with jaw-dropping demo videos of endless possibilities: 3D modelling with your hands, navigating a mech like you were actually inside one, etc.

It never came.

somenameforme an hour ago | parent | next [-]

That sounds like you're describing Dactyl Nightmare. [1] I played a version where you were attacking pterodactyls instead of other players, but it was more or less identical. That experience is what led me to believe that VR would eventually take over. I still, more or less, believe it, even though it has yet to happen.

I think the big barriers remain price, and experiences that focus more on visual fidelity than on gameplay. An even bigger problem is that high-end visual fidelity tends to cause motion sickness and other side effects in a substantial chunk of people. But I'm sticking to my guns - one day VR will win.

[1] - https://www.youtube.com/watch?v=hBkP2to1P_c

zeroq an hour ago | parent [-]

It is precisely that! My version was wireframe and I can't recall the dragon, but everything else is exactly like I remembered it!

For me this serves as an example.

A few years later VFX1 was the hype, years later Oculus, etc.

But 3D graphics in general - as seen in video games - are similar: recent Lumen aside, it's still stuff from the Graphics Gems of the 80s-90s, just on faster silicon.

Same thing is happening now to some degree with AI.

m463 4 hours ago | parent | prev | next [-]

Maybe something like this?

https://en.wikipedia.org/wiki/Virtuality_(product)

I think I played with the 1000CS or similar in a bar or arcade at some point in early 90's

zeroq 4 hours ago | parent [-]

Yes!

The booth depicted in the 1000CS image looks exactly how I recall it, and the screenshot looks very similar to how I remember the game (minus the dragon - mine was fully wireframe), and the map layout is close too. It has that Q3DM17 vibe I was talking about.

Isn't it crazy that we had this tech in ~'91 and it's still not quite there yet?

On a similar note - around that time, mid 90s, my father also took me to CeBIT. One building was almost fully occupied by Intel or IBM, and they had different sections dedicated to all sorts of cool stuff. One I won't forget was straight out of Minority Report, only many years earlier.

They had a whole section dedicated to showcasing a "smart watch". Imagine a Casio G-Shock, but with Linux. You could navigate options by twisting your wrist (up or down the menu) and press the screen or a button to select an option.

They had different scenarios built in the form of an amusement park - for instance, a restaurant where you would walk in with your watch, it would talk to the relay at the door and download the menu for you, and you could twist your wrist to select your meal and order it without any human interaction... and leave without interaction as well, because the relay at the door would charge you based on your prior selection.

Or - and this was straight out of Minority Report - an airport scenario, where you would disembark at your destination and walk past a big screen that would talk to your watch and display travel information for you, asking whether you'd like to order a taxi to your destination, based on your data.

m463 an hour ago | parent | next [-]

I remember a guy I knew went to Japan/Asia around 1985-ish and came back with a watch. It had hands, but also a small LCD display. You could draw numbers on the face with your finger - like 6, then x, then 3, then = - and the LCD would show the values, and finally 18.

This is completely uninteresting now, but this was 40 years ago

EDIT: I think Casio AT-552

https://www.youtube.com/watch?v=0aQHnyZdgF4

somenameforme an hour ago | parent [-]

It was a really interesting and weird time growing up when Japan was the king of tech. I had a friend whose dad was often over there, bringing all sorts of weird stuff back. There was this NES/Famicom game you played with a sort of gyroscope. I have no idea how you were supposed to play the game, but I found the gyroscope endlessly fascinating. Then of course there were the pirated cartridges with 100-in-1 type games. Oh, and then we found the box full of his dad's "special" games. Ah, good times.

vardump 38 minutes ago | parent [-]

Special games? I thought NES was controlled by Nintendo?

intrasight 3 hours ago | parent | prev [-]

> Isn't it crazy that we had this tech in ~'91 and it's still not quite there yet?

Not really, because feeding us ads and AI slop attracted all the talent.

amypetrik8 3 hours ago | parent | prev [-]

I'll share a childhood story as well. I worked with a number of peer children with laudable parents. There was Jimmy, whose father ran a used car dealership and had a lot of sway, often threatening people with his father's ownership of that dealership. There was Steve, whose father gave him early access to a user-agent LLM known as "Microsoft Bob". There was Stephano, who had SGI's 4D Chartreuse hardware, never publicly released. Oh, how they would brag and gloat, one-upping one another. Inevitably there would be a pause, and a lull, and they would all knowingly turn their heads to me -- "My dad... my dad works for Nintendo." Oh, the jealousy. I knew everything, seeing as my dad worked for Nintendo. The next President. Tomorrow's stock market prices. Whether next winter would be mild or severe. They looked to me. "Did you know I can play Donkey Kong - no - a new one with SGI-rendered graphics by Square." "Oh, virtual reality - yeah, I have the successor to the Game Boy, it's virtual reality, LOL, good luck with the SGI crap." It was great. The one time in my life I felt seen, I felt valued. Truly a blessing. Currently my vocation is cleaning the leavings from proctoscopic examinations.

usefulcat an hour ago | parent | prev | next [-]

That's really cool. My first job out of college was implementing an image generator for the landing signal officer simulator for the USS Nimitz, also using SGI hardware. I would have loved to see the final product in person, but sadly I never had the chance.

m463 4 hours ago | parent | prev [-]

I remember there was a flight simulator project that had something like that - or maybe it even was that one.

It was called ESPRIT, which I believe stood for "eye slaved programmed retinal insertion technique".

dagmx 11 hours ago | parent | prev | next [-]

Foveated streaming is cool. FWIW the Vision Pro does that for their Mac virtual display as well, and it works really well to pump a lot more pixels through.

anvuong 7 hours ago | parent | next [-]

It's the same number of pixels though, just with reduced bitrate for unfocused regions, so you save time in encoding, transmitting, and decoding, essentially reducing latency.

With foveated rendering, the number of rendered pixels is actually reduced.
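To put rough numbers on that difference, here's a back-of-envelope sketch (the inset size and periphery scale below are assumed values for illustration, not anything Valve has published):

```python
# Back-of-envelope numbers only -- the fractions below are assumptions.

EYE_W = EYE_H = 2160      # per-eye panel resolution from the spec sheet
FOVEA_FRAC = 0.25         # assumed: foveal inset spans 25% of each axis
PERIPH_SCALE = 0.25       # assumed: periphery rendered at 1/4 linear res

full_pixels = EYE_W * EYE_H

# Foveated rendering: the periphery is rendered at reduced resolution,
# so far fewer pixels are shaded per frame. Foveated streaming renders
# the full frame and instead concentrates encoder bitrate on the fovea.
fovea_pixels = int(EYE_W * FOVEA_FRAC) * int(EYE_H * FOVEA_FRAC)
periph_pixels = int(EYE_W * PERIPH_SCALE) * int(EYE_H * PERIPH_SCALE)
rendered = fovea_pixels + periph_pixels

print(f"full frame:      {full_pixels:,} px")
print(f"foveated render: {rendered:,} px ({rendered / full_pixels:.1%})")
```

With these made-up fractions, foveated rendering shades only about an eighth of the pixels per frame, which is why it's so attractive despite being hard to wire into current rendering APIs.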

dagmx 4 hours ago | parent | next [-]

It's the same number of pixels rendered, but it lets you reduce the amount of data sent, thereby allowing you to send more pixels than you would otherwise have been able to.

entropicdrifter 6 hours ago | parent | prev [-]

That depends on the specifics of the encode/decode pipeline for the streamed frames. Could be the blurry part actually is lower res and lower bitrate until it's decoded, then upscaled and put together with the high res part. I'm not saying they do that, but it's an option.

eptcyka 9 hours ago | parent | prev [-]

I think it works really well to pump the same number of pixels, just focusing them on the more important parts.

Psillisp 9 hours ago | parent [-]

Always PIP, Pump Important Pixels

xeonmc 12 hours ago | parent | prev | next [-]

> Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use. But the price is hopefully low.

Question: what are the criteria for deciding this to be the case? Couldn't you just move your face closer to the virtual screen to see finer details?

potatolicious 12 hours ago | parent | next [-]

There are no precise criteria, but the usual measure is ppd (pixels per degree), and it needs to be high enough that detailed content (such as text) displayed at a reasonable size is clearly legible without eye strain.

> "Could you not just move your face closer to the virtual screen to see finer details?"

Sure, but then you have the problem of, say, using an IMAX screen as your computer monitor. The level of head motion required to consume screen content (i.e., a ton of large head movements) would make the device very uncomfortable quite quickly.

The Vision Pro has roughly 35 ppd, and people generally seem to think it hits the bar for monitor replacement. The Meta Quest 3 has ~25 ppd, and people generally seem to think it does not. The Steam Frame is, specs-wise, much closer to the Quest 3 than to the Vision Pro.

There are some software things you can do to increase legibility of details like text, but ultimately you do need physical pixels.
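As a rough illustration of how ppd falls out of the specs: average ppd is per-eye horizontal resolution divided by horizontal FOV. That's only an average - headset lenses concentrate pixels toward the center, so the center ppd usually quoted (like the ~25/~35 above) comes out higher. The FOV below is an assumed value, not a published spec:

```python
def avg_ppd(horizontal_px: int, horizontal_fov_deg: float) -> float:
    """Average pixels per degree across the horizontal field of view.

    Center ppd on a real headset is higher than this average, because
    lens distortion packs more pixels into the middle of the view.
    """
    return horizontal_px / horizontal_fov_deg

# Steam Frame: 2160 px per eye over an assumed ~110 degree FOV
print(f"{avg_ppd(2160, 110):.1f} ppd (average)")
```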

giobox 11 hours ago | parent [-]

Even the Vision Pro at 35 ppd simply isn't close to the ppd you can get from a good desktop monitor (we can calculate ppd for desktop monitors too, using size and viewing distance).

Apple's "retina" HiDPI monitors typically have ppd well beyond 35 at ordinary viewing distances; even a 1080p 24-inch monitor on your desk can exceed this.

For me personally, 35ppd feels about the minimum I would accept for emulating a monitor for text work in a VR headset, but it's still not good enough for me to even begin thinking about using it to replace any of my monitors.

https://phrogz.net/tmp/ScreenDensityCalculator.html

numpad0 7 hours ago | parent | next [-]

I think there's a missing number here: the angular resolution of human eyeballs is believed to be ~60 ppd (some believe it's more like 90).

potatolicious 11 hours ago | parent | prev | next [-]

Oh yeah for sure. Most people seem to accept that 35ppd is "good enough" but not actually at-par with a high quality high-dpi monitor.

I agree with you - I would personally consider 35ppd to be the floor for usability for this purpose. It's good in a pinch (need a nice workstation setup in a hotel room?) but I would not currently consider any extant hardware as full-time replacements for a good monitor.

andybak 10 hours ago | parent [-]

Most people in what age group?

I'm 53 and the Quest 3 is perfectly good as a monitor replacement.

gruturo 9 hours ago | parent | next [-]

I'm in the same boat. Due to my vision not being perfect even after correction, a Quest 3 is entirely sufficient.

pdpi 8 hours ago | parent [-]

I keep hearing this argument, and it baffles me. I find that, as I age and my vision gets worse, I need progressively finer text rendering. Using same-size displays (27") at the same distance, with text the same physical size on screen, 1440p gives me a much worse reading experience than 4k with 2x scaling.

froggit 10 hours ago | parent | prev [-]

Are you saying ppd requirements for comfortable usage vary with age?

whycome 9 hours ago | parent | prev [-]

We get by with lower resolution monitors with lower pixel density all the time.

big_toast 9 hours ago | parent | next [-]

I think part of getting by with a lower ppd is that IRL pixels are fixed and have hard boundaries that OS affordances have co-evolved with.

(pixel alignment via lots of rectangular things - windows, buttons; text rendering w/ that in mind; "pixel perfect" historical design philosophy)

The VR ppd is at arbitrary orientations, which leads to more aliasing. macOS kind of killed their low-dpi experience via bad aliasing as they moved to the hi-dpi regime. Now we have SVG-like rendering instead of screen-pixel-aligned baked rasterized UIs.

giobox 9 hours ago | parent | prev [-]

I'm not sure most of us do anymore - see my 1080p/24 inch example.

No one who has bought almost any MacBook in the last 10 years or so has had PPD this low either.

One can get by with almost anything in a pinch; that doesn't mean it's desirable.

Pixel density != PPD either, although increasing it can certainly help PPD. Lower density desktop displays routinely have higher PPD than most VR headsets - viewing distance matters!

modeless 12 hours ago | parent | prev | next [-]

Not only would it be a chore to constantly lean in closer to different parts of your monitor to see full detail, but looking at close-up objects in VR exacerbates the vergence-accommodation mismatch issue, which causes eye strain. You would need varifocal lenses to fix this, which have only been demonstrated in prototypes so far.

whycome 9 hours ago | parent | next [-]

This all sounds a bit like the "better horse" framing. Maybe richer content shouldn't be consumed primarily as a virtualized page. Maybe mixing font sizes and oversized text can be a standard in itself.

Fernicia 12 hours ago | parent | prev [-]

Couldn't you get around that by having a "zoom" feature on a very large but distant monitor?

wongarsu 11 hours ago | parent | next [-]

Yes. You can present a low-resolution monitor (like 800x600 px - once upon a time a usable resolution) and/or provide zoom and panning controls.

I've tried that combination in an earlier iteration of Lenovo's smart glasses, and it technically works. But the experience you get is not fun or productive. If you need to do it (say, to work on confidential documents in public) you can, but it's not something you'd do in a normal setup.

potatolicious 10 hours ago | parent | prev | next [-]

Yes, but that can create major motion sickness issues - motion that does not correspond to the user's actual physical movements creates a dissonance that is expressed as motion sickness in a large portion of the population.

This is the main reason many VR games don't let you just walk around and opt for teleportation-based movement systems - your avatar moving while your body doesn't can be quite physically uncomfortable.

There are ways of minimizing this - for example, some VR games give you "tunnel vision" by blacking out peripheral vision while the movement is happening. But overall there are a lot of ergonomic considerations here and no perfect solution. The equivalent for a virtual desktop might be to limit the size of the window while the user is zooming/panning.

rtkwe 10 hours ago | parent | prev [-]

For a small taste of what using that might be like, turn on screen magnification on your existing computer. It's technically usable, but not particularly productive or pleasant if you don't /have/ to use it.

jayd16 11 hours ago | parent | prev [-]

It's just about what pixels-per-degree will get you close to a modern IRL setup. It's obviously enough for 80-character consoles, but you'd need to dip into large fonts for a desktop.

rtkwe 10 hours ago | parent [-]

I did the math on this site, and I'd have to hunch less than a foot from the screen to hit 35 ppd on my work-provided ThinkPad X1 Carbon with a 14" 1920x1200 screen. My usual distance is nearly double that, so my ppd is normally more like 70, roughly.

https://phrogz.net/tmp/ScreenDensityCalculator.html#find:dis...
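The math behind those figures is simple enough to sketch (16:10 panel geometry, distances in inches; this mirrors what such calculators compute):

```python
import math

def monitor_ppd(h_px, diag_in, aspect_w, aspect_h, dist_in):
    """Average ppd over the horizontal angle a flat screen subtends."""
    width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return h_px / fov_deg

# 14" 1920x1200 (16:10) laptop panel: up close vs. a typical distance
print(f'{monitor_ppd(1920, 14, 16, 10, 12):.0f} ppd at 12"')
print(f'{monitor_ppd(1920, 14, 16, 10, 23):.0f} ppd at 23"')
```

This reproduces roughly the same numbers: ~35 ppd at just under a foot, climbing toward ~70 ppd at a typical laptop viewing distance.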

monocasa 12 hours ago | parent | prev | next [-]

Foveated streaming is wild to me. Saccades are commonly as quick as 20-30 ms when reading text, so guaranteeing that latency over 2.4 GHz seems Sisyphean.

I wonder if they have an ML model doing partial upscaling until the eye-tracking state is propagated and the full-resolution image under the new fovea position is available. It also makes me wonder if there's some way to do neural compression of the peripheral vision, optimized for a nice balance between peripheral vision and hints in the embedding to allow for nicer upscaling.

rebeccaskinner 10 hours ago | parent | next [-]

I worked on a foveated video streaming system for 3D video back in 2008. We used eye tracking, extrapolated a pretty simple motion vector for the eyes, and ignored saccades entirely. It worked well - you really don't notice the lower detail in the periphery, and with a slightly oversized high-resolution focal area you can detect a change in gaze direction before the user's focus exits the high-resolution area.

Anyway that was ages ago and we did it with like three people, some duct tape and a GPU, so I expect that it should work really well on modern equipment if they've put the effort into it.
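A minimal sketch of that kind of motion-vector extrapolation, assuming constant gaze velocity (all numbers illustrative):

```python
def predict_gaze(gaze, prev_gaze, dt_s, latency_s):
    """Extrapolate gaze direction (in degrees) at constant velocity,
    so the high-res region can be centered where the eye will be when
    the frame arrives, rather than where it was when it was tracked."""
    vx = (gaze[0] - prev_gaze[0]) / dt_s
    vy = (gaze[1] - prev_gaze[1]) / dt_s
    return (gaze[0] + vx * latency_s, gaze[1] + vy * latency_s)

# eye moved 1 degree right over a 10 ms sample; predict 20 ms ahead
print(predict_gaze((1.0, 0.0), (0.0, 0.0), 0.010, 0.020))
```

Saccades break the constant-velocity assumption, which is exactly why the oversized focal area described above matters - the inset absorbs the prediction error.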

monocasa 10 hours ago | parent [-]

Foveated rendering very clearly works well with a dedicated connection, with predictable latency. My question was more about the latency spikes inherent in a general-use ISM band, which would make the effects of those spikes even worse when combined with foveated rendering.

cube2222 12 hours ago | parent | prev | next [-]

They're doing it over 6GHz, if I understand correctly, which with a dedicated router gets you to a reasonable latency with reasonable quality even without foveated rendering (with e.g. a Quest 3).

With foveated rendering I expect this to be a breeze.

monocasa 11 hours ago | parent [-]

Even 5.8 GHz is getting congested. There's a dedicated router in this case (a USB fob), but you still have to share spectrum with the other devices. And at the 160 MHz symbol rate mode on WiFi 6, you only have one channel in the 5.8 GHz spectrum that needs to be shared.

zamadatix 10 hours ago | parent | next [-]

You're talking about "Wi-Fi 6" not "6 GHz Wi-Fi".

"6 GHz Wi-Fi" means Wi-Fi 6E (or newer) with a frequency range of 5.925–7.125 GHz, giving 7 non-overlapping 160 MHz channels (which is not the same thing as the symbol rate, it's just the channel bandwidth component of that). As another bonus, these frequencies penetrate walls even less than 5 GHz does.

I live on the 3rd floor of a large apartment complex. 5 GHz Wi-Fi is so congested that I can get better performance on 2.4 in a rural area, especially accounting for DFS troubles in 5 GHz. 6 GHz is open enough I have a non-conflicting 160 MHz channel assigned to my AP (and has no DFS troubles).

Interestingly, the headset supports Wi-Fi 7 but the adapter only supports Wi-Fi 6E.
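The channel count follows directly from the band's width (this back-of-envelope version ignores guard bands, so the real channel plan has slightly fewer narrow channels than the naive division suggests):

```python
band_mhz = 7125 - 5925   # US 6 GHz allocation: 5.925-7.125 GHz

# How many non-overlapping channels fit at each channel width?
for width in (20, 40, 80, 160, 320):
    print(f"{width:>3} MHz wide: {band_mhz // width} channels")
```

That's where the 7 non-overlapping 160 MHz channels come from - versus only one usable 160 MHz channel in the crowded 5 GHz band.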

esseph 11 hours ago | parent | prev [-]

Not so much of an issue when neighbors through paper-thin walls see that 6 GHz as a -87 dBm signal.

That said, in the US it is 1200 MHz, i.e. 5.925 GHz to 7.125 GHz.

monocasa 11 hours ago | parent | next [-]

More of an issue when your phone's WiFi, or your partner watching a show while you game, eats into that one channel in bursts - particularly since the dedicated fob means it's essentially another network conflicting with the regular WiFi rather than deeply collaborating for better real-time guarantees (not that arbitrary WiFi routers would even support real-time scheduling).

MIMO helps here to separate the spectrum use by targeted physical location, but it's not perfect by any means.

cube2222 11 hours ago | parent | next [-]

IMO there is not much reason to use the 6 GHz band for almost anything else. I have a WiFi 6 router set up for my Quest 3 for PC streaming, and everything else sits on its 5 GHz network. And since 6 GHz doesn't really go through walls, I think this is a non-issue?

The Frame itself here is a good example actually - using 6GHz for video streaming and 5GHz for wifi, on separate radios.

My main issue with the Quest in practice was that when I started moving my head quickly (which happens when playing faster-paced games) I would get lag spikes. I did some tuning on the bitrate / beam-forming / router positioning to get to an acceptable place, but I expect / hope that here the foveated streaming will solve these issues easily.

monocasa 11 hours ago | parent [-]

The thing is, I'd expect foveated streaming to worsen latency issues, not help them the way it helps bandwidth concerns. During a lag spike you're now looking at an extremely downsampled image, instead of what, without foveation, would have been just as high quality.

Now I also wonder if an ML model could help predict fovea location based on screen content and recent eye-tracking data. If the eyes are reading a paragraph, you have a pretty good idea where they're going to go next, for instance. That way a latency spike that delays eye-tracking updates could be hidden too.

cube2222 10 hours ago | parent | next [-]

My understanding is that the foveated streaming reduces bandwidth requirements enough that latency spikes become effectively non-existent.

We'll see in practice - so far all hands-on reviewers said the foveated streaming worked great, with one trying to break it (moving his eyes quickly left, right, up, down, from edge to edge) and not being able to - the foveated streaming was always faster.

I agree latency spikes would be really annoying if they end up being like you suggest.

monocasa 9 hours ago | parent [-]

Enough bandwidth to absolve any latency issues over a wireless connection is not really a thing for a low-latency use case like foveated streaming.

What do you do when another device on the main WiFi network decides to eat 50 ms of airtime in the channel you use for the eye-tracking data return path?

cube2222 9 hours ago | parent [-]

I believe all communication with the dongle is on 6GHz - both the video and the return metadata.

So again, you just make sure the 6GHz band in the room is dedicated to the Frame and its dongle.

The 5GHz is for WiFi.

ncallaway 3 hours ago | parent [-]

In the LTT video he also said that Valve claimed to have tested with a small number of devices in the same room, but hadn't tried larger scenarios like tens of devices.

My guess based on that is that you likely don't need to totally clear 6 GHz in the room the Frame is in, just make sure it's relatively clear.

We'll know more once it ships and people can try it out and abuse the radio a bit.

entropicdrifter 9 hours ago | parent | prev [-]

Pretty funny to me that you're backseat engineering Valve on this one. If it didn't have a net benefit they wouldn't have announced it as a feature yet lmao

monocasa 3 hours ago | parent [-]

I'm not saying it doesn't work; I'm asking what special sauce they've added to make it work, and noting that despite the replies I've gotten, foveated streaming doesn't help latency, and in fact makes the effects of latency spikes worse.

esseph 9 hours ago | parent | prev [-]

MU-MIMO is very nice.

cyberax 11 hours ago | parent | prev [-]

The One Big Beautiful Bill fixed that. Now a large part of this spectrum will be sold out for non-WiFi use.

brian-armstrong 10 hours ago | parent | next [-]

Oh goody! I hope some of it can be used for DRM encrypted TV broadcasts too.

esseph 9 hours ago | parent | prev [-]

Different spectrum - they're grabbing old radar ranges.

There's also talk of adding more spectrum to the existing unlicensed 6 GHz band.

cyberax 4 hours ago | parent [-]

Here's the overview: https://arstechnica.com/tech-policy/2025/06/senate-gop-budge...

rtkwe 10 hours ago | parent | prev | next [-]

The real trick is not overcomplicating things. The goal is high-fidelity rendering where the eye is currently focused, so to handle saccades you just build a small buffer area around the idealized minimum high-res center, and saccades will safely stay inside that area while the system reacts to the larger overall movements.

Picture demonstrating the large area that foveated rendering actually covers as high or mid res: https://www.reddit.com/r/oculus/comments/66nfap/made_a_pic_t...
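Sizing that buffer is roughly a product of eye speed and system latency. A sketch with ballpark numbers (peak saccade velocities of a few hundred degrees per second are commonly cited; the fovea size and latency figures are assumptions for illustration):

```python
def inset_radius_deg(fovea_radius_deg, eye_speed_dps, latency_s):
    """High-res inset radius: the foveal region plus however far the
    eye can travel before the system catches up. Ballpark inputs only."""
    return fovea_radius_deg + eye_speed_dps * latency_s

# ~5 deg foveal region, 300 deg/s eye movement, 20 ms tracking-to-photon
print(inset_radius_deg(5.0, 300.0, 0.020))
```

So even generous assumptions give an inset on the order of ~10 degrees in radius - still a small fraction of a ~110 degree field of view, which is why the bandwidth savings survive the padding.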

adgjlsfhk1 10 hours ago | parent | prev | next [-]

At 100 fps (the midpoint of the refresh range), you need to deliver a new frame every 10 ms anyway, so a 20 ms saccade doesn't seem like it would be a problem. If you can't get new frames to users within 30 ms, blur will be the least of your problems: when they turn their head, they'll be on the floor vomiting.
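The per-frame budgets across the relevant refresh rates are easy to tabulate:

```python
# Frame budget (ms) at each refresh rate in and around the 72-144 Hz range
for hz in (60, 72, 90, 100, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```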

omneity 12 hours ago | parent | prev | next [-]

It was hard for me to believe as well, but streaming games wirelessly to a Quest 2 was totally possible and surprisingly latency-free once I upgraded to WiFi 6 (a few years ago).

It works a lot better than you’d expect at face value.

LarsDu88 2 hours ago | parent | prev [-]

They use a 6 GHz dongle.

nabakin 10 hours ago | parent | prev | next [-]

And foveated streaming has a 1-2ms wireless latency on modern GPUs according to LTT. Insane.

tshaddox 8 hours ago | parent | next [-]

That's pretty quick. I've heard that in ideal circumstances Wi-Fi 6 can get close to 5ms and Wi-Fi 7 can get down to 2ms.

It's impressive if they're really able to get below 2 ms motion-to-photon latency, given that modern consumer headsets with on-device compute are also right at that same 2 ms mark.

CobrastanJorji 9 hours ago | parent | prev [-]

Wow, that's just 1 frame of latency at 60 fps.

Edit: Nevermind, I'm dumb. 1/60th of a second is 16 milliseconds, not 1.6 milliseconds.

redrblackr 9 hours ago | parent | next [-]

No, that's between 0.06 and 0.12 frames of latency at 60 fps. It's not even a full frame at 144 Hz (1 s/144 ≈ 7 ms).

bspammer 9 hours ago | parent | prev | next [-]

Much less than - 1 frame is 16 ms.

sph 9 hours ago | parent | prev [-]

60 fps is 16.67 ms per frame.

cedws 11 hours ago | parent | prev | next [-]

Why hasn't Meta tried this, given the huge amount of R&D they've put into VR - and they literally had John Carmack on the team in the past?

modeless 11 hours ago | parent | next [-]

They prioritized cost, so they omitted eye tracking hardware. They've also bet more on standalone apps rather than streaming from a PC. These are reasonable tradeoffs. The next Quest may add eye tracking, who knows. Quest Pro had it but was discontinued for being too expensive.

We'll have to wait on pricing for Steam Frame, but I don't expect them to match Meta's subsidies, so I'm betting on this being more expensive than Quest. I also think that streaming from a gaming PC will remain more of a niche thing despite Valve's focus on it here, and people will find a lot of use for the x86/Windows emulation feature to play games from their Steam library directly on the headset.

robotnikman 10 hours ago | parent [-]

It will be interesting to see how the X86 emulation plays out. In the Verge review of the headset they mentioned stutters when playing on the headset due to having to 'recompile x86 game code on the fly', but they may offer precompiled versions which can be downloaded ahead of time, similar to the precompiled shaders the Steam Deck downloads.

If they get everything working well I'm guessing we could see an ARM powered Steam Deck in the future.

Despite the fact it uses a Qualcomm chip, I'm curious whether it retains the ability to load alternative OSes like other Steam hardware.

girvo 9 hours ago | parent [-]

> Despite the fact it uses a Qualcomm chip, I'm curious whether it retains the ability to load alternative OSes like other Steam hardware.

I think it should: we have Linux support/custom operating systems on Snapdragon 8 Gen 2 devices today, and the 8 Gen 3 already has upstream support, AFAIK.

https://rocknix.org/devices/ayn/odin2/

cube2222 11 hours ago | parent | prev | next [-]

If you mean foveated streaming - It’s available on the Quest Pro with Steam Link.

jayd16 11 hours ago | parent | prev [-]

What do you mean? What part have they not tried?

nixpulvis 6 hours ago | parent | prev | next [-]

I once lived in a place that had a bathroom with mirrors facing each other. I think I convinced myself that not only is my attention to detail more concentrated at the center, but my response time is also fastest there (can anyone confirm that?).

So this gets me thinking. What would it feel like to correct for that effect? Could you use the same technique to essentially play the further parts early, so it all comes in at once?

Kind of a harebrained idea, I know, but we have the technology, and I'm curious.

TheOtherHobbes 5 hours ago | parent [-]

Peripheral vision is extremely good at spotting movement at low resolution and moving the eye to look at it.

I don't know if it's faster, but it's a non-trivial part of the experience.

consp 22 minutes ago | parent | next [-]

It's good enough that some people can see flickering on CRT monitors at 50-60 Hz.

nixpulvis 4 hours ago | parent | prev [-]

Yeah, I've heard and noticed that as well (I thought about adding a note about it to my original comment). But what I'm curious about is the timing. What I suspect is that peripheral vision is more sensitive to motion but still lags slightly behind the center of focus. I'm not sure whether it depends on how actively you're trying to focus. I'd love to learn more about this, but I didn't find anything when I looked online a bit.

MetaWhirledPeas 11 hours ago | parent | prev | next [-]

> Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use.

The real limiting factor is more likely to be having a large headset on your face for an extended period of time, combined with a battery that isn't meant for all-day use. The resolution is fine. We went decades with low resolution monitors. Just zoom in or bring it closer.

bluescrn 11 hours ago | parent | next [-]

VR does need a lot of resolution when trying to display text.

You can get away with less in games where text is minimized (or very large).

wat10000 8 hours ago | parent | prev | next [-]

The battery isn't an issue if you're stationary; you can just plug it in.

The resolution is a major problem. Old-school monitors used old-school OSes that did rendering suitable for the displays of the time. For example, anti-aliased text was not typically used for a long time. This meant that text on screen was blocky, but sharp. Very readable. You can't do this on a VR headset, because the pixels on your virtual screen don't precisely correspond with the pixels in the headset's displays. It's inevitably scaled and shifted, making it blurry.
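That blurring is just resampling. A toy sketch (illustrative values, simple bilinear filtering) of what happens to a perfectly sharp 1-px stripe pattern when the virtual screen lands half a pixel off the headset's pixel grid:

```python
# A sharp pattern of alternating black/white columns, one pixel wide each.
src = [0, 255] * 8

def bilinear(src, x):
    """Sample src at fractional position x with linear interpolation."""
    i = int(x)
    f = x - i
    a = src[min(i, len(src) - 1)]
    b = src[min(i + 1, len(src) - 1)]
    return a * (1 - f) + b * f

# Sampling exactly on the source grid keeps the edges sharp...
aligned = [bilinear(src, x) for x in range(8)]
# ...but a 0.5-px shift (inevitable for a floating virtual screen) blends
# every black pixel with its white neighbor: uniform 50% gray, edges gone.
shifted = [bilinear(src, x + 0.5) for x in range(8)]
```

Real compositors use fancier filters than this, but the underlying problem is the same: once source and display pixels don't align, high-contrast single-pixel detail like text gets averaged away.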

There's also the issue that these things have to compete with what's available now. I use my Vision Pro as a monitor replacement sometimes. But it'll never be a full-time replacement, because the modern 4k displays I have are substantially clearer. And that's a headset with ~2x the resolution of this one.

cesarb 7 hours ago | parent [-]

> There's also the issue that these things have to compete with what's available now. [...] But it'll never be a full-time replacement, because the modern 4k displays I have are substantially clearer.

What's available now might vary from person to person. I'm using a normal-sized 1080p monitor, and this desk doesn't have space for a second monitor. That's what a VR headset would have to compete against for me; just having several virtual monitors might be enough of an advantage, even if their resolution is slightly lower.

(Also, I have used old-school VGA CRT monitors; as could be easily seen when switching to a LCD monitor with digital DVI input, text on a VGA CRT was not exactly sharp.)

refulgentis 11 hours ago | parent | prev | next [-]

Whether or not we used to walk to school uphill both ways, that won't make the resolution fine.

To your point, I'd use my Vision Pro plugged in all day if it were half the weight. As it stands, it's just too much nonsense when I have an ultrawide. If I were 20-year-old me I'd never get a monitor (20-year-old me also told his gf iPad 1 would be a good laptop for school, so,)

MetaWhirledPeas 7 hours ago | parent [-]

One problem is that in most settings a real monitor is just a better experience for multiple reasons. And in a tight setting like an airplane where VR monitors might be nice, the touch controls become more problematic. "Pardon me! I was trying to drag my screen around!"

krzyk 11 hours ago | parent | prev [-]

2K x 2K doesn't sound low-res; it's like full HD but with twice the vertical resolution. My monitor is 1080p.

I've never tried a VR headset, so I don't know if that translates similarly.

potatolicious 10 hours ago | parent | next [-]

Your 2K monitor occupies something like a 20-degree field of view from a normal sitting position/distance. The 2K resolution in a VR headset covers the entire field of view.

So effectively your 1080p monitor has ~6x the pixel density of the VR headset.
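As a sanity check on that figure, here's the arithmetic with assumed numbers (a ~20 degree monitor FOV and a ~110 degree horizontal headset FOV; both are rough guesses, and the exact ratio moves with them):

```python
def ppd(pixels, fov_deg):
    """Average pixels per degree across a field of view (linear approximation)."""
    return pixels / fov_deg

monitor = ppd(1920, 20)   # 1080p monitor filling ~20 degrees of view: 96 ppd
headset = ppd(2160, 110)  # 2160-px-wide eye display over ~110 degrees: ~20 ppd
ratio = monitor / headset  # roughly 5x with these assumptions
```

With these assumed FOVs it comes out closer to ~5x, and stretching the assumptions a bit gets you to ~6x either way; the point stands that the monitor packs several times more pixels into each degree you actually look at.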

rtkwe 10 hours ago | parent | prev | next [-]

The problem is that those 2K x 2K pixels are spread across the headset's whole FOV, so when it's replicating a monitor, a lot of them are 'wasted' compared to a physical monitor with similar specs, unless the virtual screen is ridiculously close to your face.

MetaWhirledPeas 7 hours ago | parent [-]

Totally true, but unlike a real monitor you can drag a virtual monitor close to your face without changing the focal distance, meaning it's no harder on your eyes. (Although it is harder on your neck.)

rtkwe 6 hours ago | parent [-]

To get the same pixels per degree as my work laptop, I'd have to put its virtual replacement screen 11 (virtual) inches from my face, and that's probably the lowest-PPD screen in my normal life unless I get a bad desk at work that day. Just pasting screens inches from your nose is not a great solution; you can already do that with a good set of monitor arms, and there's a reason almost no one does it.

11 hours ago | parent | prev [-]
[deleted]
regularfry 7 hours ago | parent | prev | next [-]

I use a 1920x1080 headset as a monitor replacement. It's absolutely fine. 2160x2160 will be more than workable as long as the tracking is on point.

archon810 10 hours ago | parent | prev | next [-]

Have a look at this video by Dave2D. In his hands-on, he was very impressed with foveated streaming: https://youtu.be/356rZ8IBCps

JeremyNT 9 hours ago | parent | prev | next [-]

I guess there's a market for this, but I'm personally disappointed that they've gone the "cram a computer into the headset" route. I'd much rather have a simpler, more compact dumb device like the Bigscreen Beyond 2, which in exchange should prove much lighter and more comfortable to wear for long periods.

The bulk and added component cost of the "all in one" PC/headset models is just unnecessary if you already have a gaming PC.

pteraspidomorph an hour ago | parent | next [-]

As a current and frequent user of this form factor (Pico 4, with the top strap, which the Steam Frame will also have as an option, over Virtual Desktop) I can assure you that it's quite comfortable over long periods of time (several hours). Of course it will ultimately depend on the specific design decisions made for this headset, but this all looks really good to me.

Full color passthrough would have been nice though. Not necessarily for XR, but because it's actually quite useful to be able to switch to a view of the world around you with very low friction when using the headset.

entropicdrifter 9 hours ago | parent | prev | next [-]

I'm personally quite hyped to see the first commercially available Linux-based standalone VR headset announced. This thing is quite a bit lighter than any of the existing "cram a computer in" solutions.

rpdillon 4 hours ago | parent [-]

Yeah, this is exactly what I've been waiting for for quite a long time. I'm very excited.

numpad0 2 hours ago | parent | prev | next [-]

It's nice to have some local processing for tracking and latency mitigation. The cost from there to a full computer on the headset is marginal, so you might as well do it.

tfyoung 5 hours ago | parent | prev | next [-]

There's always going to be a computer in it to drive it. It's just a matter of how generalised it is and how much weight/power consumption it's adding.

modeless 9 hours ago | parent | prev | next [-]

You can get a Beyond if that's what you want. It's an amazing device, and will be far more comfortable and higher resolution than this one. Valve has supported Bigscreen in integrating Lighthouse tracking, and I hope that they continue that support by somehow allowing them to integrate the inside-out tracking they've developed for this device in the next version of the Beyond.

preisschild 9 hours ago | parent [-]

That would probably add a lot of extra weight and it would need to make the device bigger.

modeless 9 hours ago | parent [-]

I don't think it would be too bad. Cameras are tiny. The processing would still happen on the PC, and you could delete the lighthouse tracking sensors. I guess the hardest part would be sending that much camera data back to the PC over the cable.

LarsDu88 9 hours ago | parent | prev | next [-]

They crammed a computer into the headset, but UNLIKE Meta's offerings, this is indeed an actual computer you can run Linux on. Perhaps you could even do standard computer stuff inside the headset, like text editing, Blender modeling, or more.

rbits 8 hours ago | parent | prev | next [-]

I was worried about the built-in computer as well, but then I found out it's only 185g. That's 78g more than the Bigscreen Beyond 2, but still pretty light.

5 hours ago | parent [-]
[deleted]
preisschild 9 hours ago | parent | prev [-]

I agree. Hopefully Bigscreen continues making hardware. I still have the original Bigscreen Beyond and I'm very happy with it, besides the glare.

dyauspitr an hour ago | parent | prev | next [-]

How the hell would foveated streaming even work? It seems physically impossible: tracking where your eye is looking, sending that information to a server, having it process the frame, and then streaming it back all seems too slow to keep up.
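It doesn't need a remote server: the stream comes from a local PC over the dedicated dongle, so the round trip is one radio hop plus encode/decode. A back-of-envelope budget (every number below is an assumption for illustration, not a measured figure):

```python
# Rough latency budget for foveated streaming from a local gaming PC
# over a dedicated 6 GHz link. All values are assumed, not measured.
eye_tracking_ms = 5   # camera capture + gaze estimation on the headset
uplink_ms = 1         # gaze coordinates are only a few bytes
encode_ms = 5         # re-center the high-detail region while encoding
downlink_ms = 5       # one radio hop on an uncongested channel
decode_ms = 5
total_ms = eye_tracking_ms + uplink_ms + encode_ms + downlink_ms + decode_ms

# Vision is suppressed during a saccade and takes tens of milliseconds
# to resettle; that's the window the system has to move the sharp region.
saccade_window_ms = 50
feasible = total_ms < saccade_window_ms
```

Under these assumptions the whole loop fits comfortably inside the window where your eyes can't resolve detail anyway, and even a late update just means a brief moment of lower peripheral quality, not a broken image.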

ch4s3 10 hours ago | parent | prev [-]

> Mouth tracking?

What a vile thought in the context of the steam… catalogue.

SchemaLoad 7 hours ago | parent | next [-]

I'm guessing its main use case will be VRChat syncing mouths to avatars.

riskable 5 hours ago | parent [-]

The porn industry disagrees.

rtkwe 10 hours ago | parent | prev | next [-]

They're probably thinking of it in comparison to the Apple Vision Pro, which attempts to track the bottom of your face to inform its 'Personas'. Notably, it still fails quite badly on bearded people, where it can't see the bottom half of the face well.

ch4s3 9 hours ago | parent [-]

I gathered as much, but still.

willis936 6 hours ago | parent | prev [-]

Funny enough, the Digital Foundry folks put a Gabe quote about tongue input in their most recent podcast.

https://www.youtube.com/watch?v=c9zfExb5vCU&t=1h32m44s