> Just like any SteamOS device, install your own apps, open a browser, do what you want: It's your PC.
It's an ARM Linux PC that presumably gives you root access, in addition to being a VR headset. And it has an SD card slot for storage expansion. Very cool, should be very hackable. Very unlike every other standalone VR headset.
> 2160 x 2160 LCD (per eye) 72-144Hz refresh rate
Roughly equivalent resolution to Quest 3 and less than Vision Pro. This won't be suitable as a monitor replacement for general desktop use. But the price is hopefully low. I'd love to see a high-end option with higher resolution displays in the future, good enough for monitor replacement.
> Monochrome passthrough
So AR is not a focus here, which makes sense. However:
> User accessible front expansion port w/ Dual high speed camera interface (8 lanes @ 2.5Gbps MIPI) / PCIe Gen 4 interface (1-lane)
Full color AR could be done as an optional expansion pack. And I can imagine people might come up with other fun things to put in there. Mouth tracking?
One thing I don't see here is optional tracking pucks for tracking objects or full body tracking. That's something the SteamVR Lighthouse tracking ecosystem had, and the Pico standalone headset also has it.
More detail from the LTT video: Apparently it can run Android APKs too? Quest compatibility layer maybe? There's an optional accessory kit that adds a top strap (I'm surprised it isn't standard) and palm straps that enable using the controllers in the style of the Valve Index's "knuckles" controllers.
Back when I was in Uni, so late 80s or early 90s, my dad was Project Manager on an Air Force project for a new F-111 flight simulator, when Australia upgraded the avionics on their F-111 fighter/bombers.
The sim cockpit had a spherical dome screen and a pair of Silicon Graphics Reality Engines. One of them projected an image across the entire screen at a relatively low resolution. The other projector was on a turret that pan/tilted with the pilot's helmet, and projected a high resolution image, but only in a perhaps 1.5m circle directly in front of where the helmet was aimed.
It was super fun being the project manager's kid, and getting to "play with it" on weekends sometimes. You could see what was happening while wearing the helmet and sitting in the seat if you tried - mostly by intentionally pointing your eyes in a different direction to your head - but when you were "flying around" it was totally believable, and it _looked_ like everything was high resolution. It was also fun watching other people fly it, and being able to see where they were looking, and where they weren't looking and the enemy was sneaking up on them.
Somewhere between '93 and '95 my father took me abroad to Germany and we visited a gaming venue. It was packed with typical arcade machines, games where you sit in a cart holding a pistol and shoot things on the screen while the cart moves all over the place simulating a bumpy ride, etc.
But the highlight was a full 3D experience shooter. You got yourself into a tiny ring, a 3D headset on your head and a single puck held in your hand. Rotate the puck and you move. Push the button and you shoot. Look around with your head. Most memorable part - you could duck to avoid shots! The game itself, as I remember it, was full wireframe, akin to Q3DM17 (The Longest Yard) minus the jump pads, but the layout was kind of similar. The player was holding a dart gun - you had a single shot and you had to wait until the projectile decayed or connected with the other player.
I'm not entirely sure if the game was multiplayer or not.
I often come back to that memory because shortly after, within that same time frame, my father took me to a computer fair where I had the opportunity to play Doom/Hexen with a VFX1 (or whatever it was called), and it was supposed to revolutionize the world the way AI is supposed to now.
Then there was a P5 glove with jaw dropping demo videos of endless possibilities of 3D modelling with your hands, navigating a mech like you were actually inside, etc.
It never came.
I think the big barrier remains price, and experiences that focus more on visual fidelity than gameplay. An even bigger problem is that high-end visual fidelity tends to result in motion sickness and other side effects in a substantial chunk of people. But I'm sticking to my guns there - one day VR will win.
For me this serves as an example.
A few years later the VFX1 was the hype, years later Oculus, etc.
But 3D graphics in general - as seen in video games - are similar: minus the recent Lumen, it's still stuff from the Graphics Gems books of the 80s-90s, just in silicon.
Same thing is happening now to some degree with AI.
As long as the headsets are heavy, I won't get one, no matter how great the graphics are or how good the game is
Didn't stop me from getting two different Oculus headsets (and some custom corrective lens inserts), but ultimately, comfort is what made me give up.
That's it. No idea why something like this doesn't exist. ( Or it exists and I don't know it?)
That's more than 1.1 billion pixels per second. At 24 bits a pixel that's something like 26Gb/s of raw data. And that's just the bandwidth - you also need to hit the 120Hz frame deadline, in an environment where hiccups or input lag can cause physical discomfort for the user. And then even if you remote everything, you need the headset to have enough juice to decompress and render all of this and hit these desired throughputs.
I'm napkin mathing all of this, and so I'm sure there have been lots of breakthroughs to help along these lines, but it's definitely not a straightforward problem to solve. Of course it's arguable I'm also just falling victim to the contemporary trappings of fidelity > experience, that I was just criticizing.
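For anyone who wants to redo the napkin math, here's a minimal sketch (assuming 2160x2160 per eye, two eyes, 120 Hz and 24 bits per pixel, all taken from the spec quoted above):

```python
# Back-of-the-envelope bandwidth for uncompressed stereo video at the
# quoted resolution. All inputs are assumptions from the spec sheet.
width, height = 2160, 2160   # pixels per eye
eyes = 2
fps = 120                    # one of the supported refresh rates
bits_per_pixel = 24          # 8 bits per RGB channel

pixels_per_second = width * height * eyes * fps
raw_gbps = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e9:.2f} Gpixels/s")  # ~1.12 billion pixels per second
print(f"{raw_gbps:.1f} Gb/s uncompressed")         # ~26.9 Gb/s before any compression
```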
Later, I found out that it was a game called "Dactyl Nightmare" that ran on Amiga hardware:
https://en.wikipedia.org/wiki/Virtuality_(product)
I think I played with the 1000CS or similar in a bar or arcade at some point in early 90's
The booth depicted in the 1000CS image looks exactly how I recall it, and the screenshot looks very similar to how I remember the game (minus the dragon, and mine was fully wireframe); the map layout in particular looks very similar. It has that Q3DM17 vibe I was talking about.
Isn't this crazy, that we had this tech in ~'91 and it's still not just there yet?
On a similar note - around that time, mid 90s, my father also took me to CeBIT. One building was almost fully occupied by Intel or IBM and they had different sections dedicated to all sorts of cool stuff. One I won't forget was straight out of Minority Report, only many years earlier.
They had a whole section dedicated to showcasing a "smart watch". Imagine Casio G-Shock but with Linux. You could navigate options by twisting your wrist (up or down the menu) and you would press the screen or button to select an option.
They had different scenarios built in the form of an amusement park - from a restaurant where you would walk in with your watch, it would talk to the relay at the door and download a menu for you, and you could twist your wrist to select your meal and order it without any human interaction... and leave without interaction as well, because the relay at the door would charge you based on your prior selection.
Or - and that was straight out of Minority Report - an airport scenario, where you would disembark at your destination and walk past a big screen that would talk to your watch and display travel information for you, asking whether you'd like to order a taxi to your destination, based on your data.
This is completely uninteresting now, but that was roughly 30 years ago.
EDIT: I think Casio AT-552
I somehow suspect in modern times they'd have lost.
Not really, because feeding us ads and AI slop attracted all the talent.
I remember the game was a commercially available shooter though, but the machine was exactly the same, with the blue highlights.
Everything you described and more is available from modern home Vr devices you can purchase right now.
Mecha, planes, skyrim, cinema screens. In VR, with custom controllers or a regular controller if you want that. Go try it! It’s out and it’s cheap and it’s awesome. Set IPD FIRST.
It was called ESPRIT, which I believe stood for eye slaved programmed retinal insertion technique.
I question that we could not create a special purpose video codec that handles this without trickery. The "per eye" part sounds spooky at first, but how much information is typically different between these frames? The mutual information is probably 90%+ in most VR games.
If we were to enhance something like x264 to encode the 2nd display as a residual of the 1st display, this could become much more feasible from a channel capacity standpoint. Video codecs already employ a lot of tricks to make adjacent frames that are nearly identical occupy negligible space.
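A toy illustration of that inter-view redundancy idea (synthetic frames and NumPy only; real multi-view codecs like MV-HEVC do disparity-compensated prediction rather than a raw subtraction):

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 1080, 1080

# Fake "left eye" frame: a smooth gradient plus texture noise.
left = np.linspace(0, 255, w, dtype=np.float32)[None, :].repeat(h, axis=0)
left += rng.normal(0, 8, size=(h, w)).astype(np.float32)

# "Right eye" is mostly the same content shifted a few pixels (parallax),
# plus a little view-dependent difference.
right = np.roll(left, shift=6, axis=1)
right += rng.normal(0, 2, size=(h, w)).astype(np.float32)

# What an inter-view predictor would be left to encode:
residual = right - left

print("mean |signal|   :", float(np.abs(left - left.mean()).mean()))
print("mean |residual| :", float(np.abs(residual).mean()))  # much smaller
```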
This seems very similar (identical?) to the problem of efficiently encoding a 3d movie:
Is the current state of VR rendering really just rendering and transporting two video streams independently of each other? Surely there has to be at least some academic prior art on the subject, no?
For foveated rendering, the number of rendered pixels is actually reduced.
Foveated streaming is presumably the next iteration of this, where the eye tracking gives you better information about where to apply this distortion. I'm genuinely curious how they manage to make this work well - eye tracking is generally high latency but the eye moves very, very quickly. (Maybe HW and SW have improved, but they allude to this problem, so I'm curious whether their argument about using it at a low frequency really improves things meaningfully vs more static techniques.)
[1] https://developers.meta.com/horizon/blog/how-does-oculus-lin...
People are conflating rendering (which is not what I’m talking about) with transmission (which is what I’m talking about).
Lowering the quality outside the in focus sections lets them reduce the encoding time and bandwidth required to transmit the frame over.
I wonder if they have an ML model doing partial upscaling until the eyetracking state is propagated and the full resolution image under the new fovea position is available. It also makes me wonder if there's some way to do neural compression of the peripheral vision optimized for a nice balance between peripheral vision and hints in the embedding to allow for nicer upscaling.
Anyway that was ages ago and we did it with like three people, some duct tape and a GPU, so I expect that it should work really well on modern equipment if they've put the effort into it.
With foveated rendering I expect this to be a breeze.
"6 GHz Wi-Fi" means Wi-Fi 6E (or newer) with a frequency range of 5.925–7.125 GHz, giving 7 non-overlapping 160 MHz channels (which is not the same thing as the symbol rate, it's just the channel bandwidth component of that). As another bonus, these frequencies penetrate walls even less than 5 GHz does.
I live on the 3rd floor of a large apartment complex. 5 GHz Wi-Fi is so congested that I can get better performance on 2.4 in a rural area, especially accounting for DFS troubles in 5 GHz. 6 GHz is open enough I have a non-conflicting 160 MHz channel assigned to my AP (and has no DFS troubles).
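The channel count falls straight out of the band math (assuming the full U.S. 6 GHz allocation):

```python
band_mhz = 7125 - 5925       # U.S. 6 GHz unlicensed band: 1200 MHz wide
channel_mhz = 160
print(band_mhz // channel_mhz)  # -> 7 non-overlapping 160 MHz channels
```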
Interestingly, the headset supports Wi-Fi 7 but the adapter only supports Wi-Fi 6E.
That said, in the US it is 1200MHz aka 5.925 GHz to 7.125 GHz.
They're also talking about adding more spectrum to the existing unlicensed 6GHz band.
I communicate with the FCC and NTIA fairly often at this point.
You need to pay attention to Arielle Roth, Assistant Secretary of Commerce for Communications and Information Administrator, National Telecommunications and Information Administration (NTIA).
https://policy.charter.com/2025-ntia-spectrum-policy-symposi...
From the article, about the November event:
"... administration’s investment in unlicensed access in 6 GHz ensures the benefits of the entire spectrum band are delivered directly to American families and businesses in the form of more innovation and faster and more reliable connectivity at home and on the go, which will continue to transform and deliver long-lasting impact for communities of all sizes across the country.
Charter applauds Administrator Roth's leadership, and her recognition of the critical role unlicensed spectrum plays today and in the future, both in the U.S. and across the globe."
---
Now here: https://www.ntia.gov/speech/testimony/2025/remarks-assistant...
"... To identify the remainder, NTIA plans to assess four targeted spectrum bands in the range set by Congress: 7125-7400 MHz; 1680-1695 MHz; 2700-2900 MHz; and 4400-4940 MHz."
"On the topic of on-the-ground realities, let’s also not forget what powers our networks today. While licensed spectrum is critical, the majority of mobile traffic is actually offloaded onto Wi-Fi. Born in America, led by America, Wi-Fi remains an area where we dominate, and we must continue to invest in this important technology. With Wi-Fi, the race has already been won. China knows it cannot compete and for that reason looks for ways to sabotage the very ingenuity that made Wi-Fi a global standard."
Roth is not going to take away 6GHz from current ISM allocation.
Depending on the spectrum and technology there can be a small slice of guard band between usable portions, which is what we have today.
Nothing there today as provisioned is going to change.
https://www.reddit.com/r/sdr/comments/1ow80n5/help_needed_ho...
MIMO helps here to separate the spectrum use by targeted physical location, but it's not perfect by any means.
The Frame itself here is a good example actually - using 6GHz for video streaming and 5GHz for wifi, on separate radios.
My main issue with the Quest in practice was that when I started moving my head quickly (which happens when playing faster-paced games) I would get lag spikes. I did some tuning on the bitrate / beam-forming / router positioning to get to an acceptable place, but I expect / hope that here the foveated streaming will solve these issues easily.
Now I also wonder if an ML model could help predict fovea location based on screen content and recent eye tracking data. If the eyes are reading a paragraph, you have a pretty good idea where they're going to go next, for instance. That way a latency spike that delays eye tracking updates can be hidden too.
We’ll see in practice - so far all hands-on reviewers said the foveated rendering worked great, with one trying to break it (move eyes quickly left right up down from edge to edge) and not being able to - the foveated rendering always being faster.
I agree latency spikes would be really annoying if they end up being like you suggest.
What do you do when another device on the main wifi network decides to eat 50ms of time in the channel you use for the eye tracking data return path?
So again, you just make sure the 6GHz band in the room is dedicated to the Frame and its dongle.
The 5GHz is for WiFi.
My guess based on that is you likely don't need to totally clear the 6GHz band in the room the Frame is in, but rather just make sure it's relatively clear.
We’ll know more once it ships and we can see people try it out and try and abuse the radio a bit.
Picture demonstrating the large area that foveated rendering actually covers as high or mid res: https://www.reddit.com/r/oculus/comments/66nfap/made_a_pic_t...
It works a lot better than you’d expect at face value.
What sort of resolution are one's eyes actually resolving during saccades? I seem to recall that there is at the very least a frequency-reduction mechanism in play during saccades.
Are you really sure overrendering the fovea region would really work?
I’m not sure what you mean by “look through the entire image to reacquire the iris”? You’re talking about the image from the eye tracking camera?
Yes. A normal trick is to search just a bit outside the last known position to make eye tracking cheap computationally and to reduce latency in the common case.
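A minimal sketch of that pattern (toy "darkest pixel" detector with made-up window size and threshold, not how any shipping eye tracker actually works):

```python
import numpy as np

def find_pupil(frame: np.ndarray, last_xy=None, window: int = 40, dark_thresh: int = 30):
    """Toy pupil finder on a grayscale frame: 'pupil' = darkest pixel.

    Searches a small window around the last known position first, and only
    falls back to a full-frame scan if nothing dark enough is found there
    (e.g. after a blink or a large saccade). Real trackers fit glints and
    ellipses instead; this just shows the cheap-path/fallback structure.
    """
    h, w = frame.shape
    if last_xy is not None:
        x, y = last_xy
        x0, x1 = max(0, x - window), min(w, x + window)
        y0, y1 = max(0, y - window), min(h, y + window)
        roi = frame[y0:y1, x0:x1]
        iy, ix = np.unravel_index(np.argmin(roi), roi.shape)
        if roi[iy, ix] < dark_thresh:      # plausible pupil found locally: cheap path
            return (x0 + ix, y0 + iy)
    # Expensive fallback: scan the whole frame to reacquire.
    iy, ix = np.unravel_index(np.argmin(frame), frame.shape)
    return (ix, iy)
```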
Question, what is the criteria for deciding this to be the case? Could you not just move your face closer to the virtual screen to see finer details?
> "Could you not just move your face closer to the virtual screen to see finer details?"
Sure, but then you have the problem of, say, using an IMAX screen as your computer monitor. The level of head motion required to consume screen content (i.e., a ton of large head movements) would make the device very uncomfortable quite quickly.
The Vision Pro has about ~35ppd and generally people seems to think it hits the bar for monitor replacement. Meta Quest 3 has ~25ppd and generally people seem to think it does not. The Steam Frame is specs-wise much closer to Quest 3 than Vision Pro.
There are some software things you can do to increase legibility of details like text, but ultimately you do need physical pixels.
Apple's "retina" HiDPI monitors typically have PPD well beyond 35 at ordinary viewing distances, even a 1080p 24 inch monitor on your desk can exceed this.
For me personally, 35ppd feels about the minimum I would accept for emulating a monitor for text work in a VR headset, but it's still not good enough for me to even begin thinking about using it to replace any of my monitors.
I agree with you - I would personally consider 35ppd to be the floor for usability for this purpose. It's good in a pinch (need a nice workstation setup in a hotel room?) but I would not currently consider any extant hardware as full-time replacements for a good monitor.
I'm 53 and the Quest 3 is perfectly good as a monitor replacement.
(pixel alignment via lots of rectangular things - windows, buttons; text rendering w/ that in mind; "pixel perfect" historical design philosophy)
The VR PPD is in arbitrary orientations which will lead to more aliasing. MacOS kinda killed their low-dpi experience via bad aliasing as they moved to the hi-dpi regime. Now we have svg-like rendering instead of screen-pixel-aligned baked rasterized UIs.
No one who has bought almost any MacBook in the last 10 years or so has had PPD this low either.
One can get by with almost anything in a pinch, it doesn't mean its desirable.
Pixel density != PPD either, although increasing it can certainly help PPD. Lower density desktop displays routinely have higher PPD than most VR headsets - viewing distance matters!
I've tried that combination in an earlier iteration of Lenovo's smart glasses, and it technically works. But the experience you get is not fun or productive. If you need to do it (say to work on confidential documents in public) you can do it, but it's not something you'd do in a normal setup
This is the main reason many VR games don't let you just walk around and opt for teleportation-based movement systems - your avatar moving while your body doesn't can be quite physically uncomfortable.
There are ways of minimizing this - for example some VR games give you "tunnel vision" by blacking out peripheral vision while the movement is happening. But overall there's a lot of ergo considerations here and no perfect solution. The equivalent for a virtual desktop might be to limit the size of the window while the user is zooming/panning.
https://phrogz.net/tmp/ScreenDensityCalculator.html#find:dis...
It's impressive if they're really able to get below 2ms motion-to-photon latency, given that modern consumer headsets with on-device compute are also right at that same 2ms mark.
Edit: Nevermind, I'm dumb. 1/60th of a second is 16 milliseconds, not 1.6 milliseconds.
The real limiting factor is more likely to be having a large headset on your face for an extended period of time, combined with a battery that isn't meant for all-day use. The resolution is fine. We went decades with low resolution monitors. Just zoom in or bring it closer.
The resolution is a major problem. Old-school monitors used old-school OSes that did rendering suitable for the displays of the time. For example, anti-aliased text was not typically used for a long time. This meant that text on screen was blocky, but sharp. Very readable. You can't do this on a VR headset, because the pixels on your virtual screen don't precisely correspond with the pixels in the headset's displays. It's inevitably scaled and shifted, making it blurry.
There's also the issue that these things have to compete with what's available now. I use my Vision Pro as a monitor replacement sometimes. But it'll never be a full-time replacement, because the modern 4k displays I have are substantially clearer. And that's a headset with ~2x the resolution of this one.
What's available now might vary from person to person. I'm using a normal-sized 1080p monitor, and this desk doesn't have space for a second monitor. That's what a VR headset would have to compete against for me; just having several virtual monitors might be enough of an advantage, even if their resolution is slightly lower.
(Also, I have used old-school VGA CRT monitors; as could be easily seen when switching to a LCD monitor with digital DVI input, text on a VGA CRT was not exactly sharp.)
Can get away with less for games where text is minimized (or very large)
To your point, I'd use my Vision Pro plugged in all day if it was half the weight. As it stands, its just too much nonsense when I have an ultrawide. If I were 20 year old me I'd never get a monitor (20 year old me also told his gf iPad 1 would be a good laptop for school, so,)
Yikes. How'd that relationship end up? Haha.
Never tried VR set, so I don't know if that translates similarly.
So effectively your 1080p monitor has ~6x the pixel density of the VR headset.
We'll have to wait on pricing for Steam Frame, but I don't expect them to match Meta's subsidies, so I'm betting on this being more expensive than Quest. I also think that streaming from a gaming PC will remain more of a niche thing despite Valve's focus on it here, and people will find a lot of use for the x86/Windows emulation feature to play games from their Steam library directly on the headset.
If they get everything working well I'm guessing we could see an ARM powered Steam Deck in the future.
Despite the fact it uses a Qualcomm chip, I'm curious on whether it retains the ability to load alternative OS's like other Steam hardware.
I think it should: we have Linux support/custom operating systems on Snapdragon 8 Gen 2 devices right now today, and the 8 Gen 3 has upstream support already AFAIK
The main value of Meta VR and AR products is the massive price subsidy which is needed because the brand has been destroyed for all generations older than Alpha.
The current price estimate for the Steam Frame is $1200 vs the Quest 3 at $600, which is still a very reasonable price given the technology, tariffs, and lack of ads invading privacy.
The bulk and added component cost of the "all in one" PC/headset models is just unnecessary if you already have a gaming PC.
Full color passthrough would have been nice though. Not necessarily for XR, but because it's actually quite useful to be able to switch to a view of the world around you with very low friction when using the headset.
And once you have the pipeline and computation power to enable inside out tracking all on device, adding an OS is essentially free.
As for sending data over a cable, there's nothing inherently laggy about it. After all, the display signal already travels over the cable, and the cable transfer is by far not the limiting factor in latency. The camera data is lower bandwidth than the display signal, too.
> Q (Nikos): Linux Desktop support?
> A: Hi, Linux is not officially supported but can absolutely work with the Beyond 2. I'd suggest joining the Bigscreen Beyond Discord server for more information. Thanks - Bigscreen Support Team
---
Rant: they have disabled selected text for the reviews for some inexplicable reason.
I wish Valve every bit of success, if they deliver an open platform people can own and hack.
On the beyond 2, only by 2 degrees horizontally. I don't think that would even be noticeable.
So this gets me thinking. What would it feel like to correct for that effect? Could you use the same technique to essentially play the further parts early, so it all comes in at once?
Kind of a harebrained idea, I know, but we have the technology, and I'm curious.
I don't know if it's faster, but it's a non-trivial part of the experience.
It would be interesting to see⁰ how that behaves when presented with weird eyes like mine, or worse. Mine don't always point the same way, and which one I'm actually looking through can be somewhat arbitrary from one moment to the next…
Though the flapping between eyes is usually in the presence of changes, however minor, in required focal distance, so maybe it wouldn't happen as much inside a VR headset.
----
[0] Sorry not sorry.
It actually makes a lot of sense!
Sure eyes move very VERY fast but if you do relatively small compute on dedicated hardware it can also go quite fast while remaining affordable.
What a vile thought in the context of the steam… catalogue.
Guess we have to get rid of physical home media.
And the internet.
It was a good run I guess.
Meta Quests & Apple Visions require developer verification to run your own software, and provide no root access, which slowed down innovation significantly.
What about the Lynx XR1? Running Android sure but officially rooted (details https://lynx.miraheze.org/wiki/Rooting_Process ) and with Linux proper (details https://wiki.postmarketos.org/wiki/Lynx_R1_(lynx-r1) ) even though experimental.
This has a serious impact on the developer ecosystem - there are still a few people who got their devices and are doing interesting work, but with so few users actually having devices the community is too small for much progress to be expected.
It's kinda similar to the old Jolla Tablet - it was a very interesting device (an x86 tablet running an open Linux distro in 2013!) but it ended up in too few hands due to funding issues & the amount of Sailfish OS apps actually supporting the tablet (eg. big screen, native x86 builds, etc.) reflected that.
Sucks, sorry to hear that :(
I guess I can't complain too much given that I got it for free.
The Go is not the best headset of course, but the games are a different style because of the 3DoF tracking without cameras. Somewhat slower paced and sitting down. A style I personally like more.
You can also unlock the device to get root on it [3], which is quite neat, although there doesn't seem to be any homebrew scene at all. Not even the most bare-bones launcher that doesn't require a Meta login.
[1] That doesn't even seem intentional, but it does mean that once the old version of the app can't communicate with Meta servers anymore, any uninitialized Go turns into a brick.
[2] https://archive.org/details/gear-vr-oculus-go
[3] https://developers.meta.com/horizon/blog/unlocking-oculus-go...
I'm sure he put it to good use. Like 500ms worth of upkeep for one of his yachts.
From the link we don't know if the OS can be changed (it might be locked like many Android phones) or if a connected machine is required to run their DRM/Steam. The drivers may also not be open source.
Unless the lenses/displays are bad, but I figure we would have heard by now?
I wouldn't characterize this as an "open ecosystem" though.
1) The stack is mature now, we know what features can exist.
2) For me it's about having the same stack as on a 3588 SBC, so I don't need to download many GB of Android software just to build/run the game.
The distance to getting a open-source driver stack will probably be shorter because of these 2 things, meaning OpenVR/SteamVR being closed is less of a long term issue.
It's possible that you can have a full open source stack some day on these goggles.. but I don't think that's something that's obviously going to happen. SteamVR sounds like their version of GooglePlay Services
All mainstream headsets get open-source drivers eventually: https://github.com/collabora/libsurvive
Also, about cross compiling: that is meaningless, as you need hardware to test on, and then you should be able to compile on the device you are using to test. At least that is what I want - make devices that cannot compile illegal.
Yes, there technically is a Linux kernel, but if it's "just Linux" then macOS is "just FreeBSD", because grep -V tells you so, because it has dtrace, because you run (ran?) Docker with effectively FreeBSD's bhyve, etc.
If you want to spin it even further, neither Safari nor Chrome nor any other WebKit browser is just Konqueror because they took the layout engine code from KDE (KHTML).
And you can totally install Debian and even OpenBSD, etc. on a Steam Deck and at least the advertisement seems to indicate it won't be all that different for the VR headset.
Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project.
There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!
They do hint that you can install a different OS on it:
> Just like any SteamOS device, install your own apps, open a browser, do what you want: It's your PC.
Every other SteamOS device does allow you to install whatever OS on the device, so seems Frame will be the same, judging by that landing page blurb.
They're being a little vague about it, but this collaboration to improve Arch's build service/infrastructure is being done in part to facilitate support of multiple architectures.
iirc it was in Tested coverage that Valve said the hardware supports other OSes. It'd be out of character for Valve not to allow for this.
I don't see why they wouldn't unlock the bootloader, it wouldn't be the first Qualcomm-based product to allow it and in press interviews they have pressed, quite hard, that the Frame is still a PC.
That's it.
I don't need 3D, I don't need VR, I don't need weirdass controllers trying to be special. Just give me a damn simple monitor the size of my eyes.
Fuck off with your XR OSes and "vision" for XR, not even Apple could get it fully right, the people in charge everywhere are too out of touch and have no clue where the fuck to go after smartphones.
The best portable private display for your laptop will inevitably be a 6DOF tracked headset with an XR native desktop.
Apple's visionOS comes close but it's crippled by the trademark Apple overcontrolling.
There is a lot going on to render the desktop in a tracked 3D space, all that has to happen somewhere. If you're expecting to plug a HDMI cable into a headset and have a good time then I think you're underestimating how much work is being done.
OpenVR and OpenXR are really great software layers that help that all work out.
You should check out the xreal one!
https://store.steampowered.com/sale/steammachine
https://store.steampowered.com/sale/steamcontroller
No prices listed for any of them yet, as far as I can tell.
6x as powerful as the Steam Deck (which I use plugged in anyway 98% of the time - I'd have bought a Steam Deck 2, but I'm glad I get the option to put money toward more performance instead of battery and screen that I don't use) is great. Not a lot of games I want to play won't run well at least at 1080p with specs like that.
I've had no driver or compatibility issues in longer than I can remember. Maybe Vista?
I also rarely upgrade because playing at console level settings means I can easily get effectively the same lifetime out of my hardware. Though I do tend to upgrade a little earlier than console users still leaning a bit more towards the enthusiast side.
- Randomly BSODs because of (I think) a buggy Focusrite audio interface driver (that I can't fix and Focusrite refuses to)
- Regularly 'forgets' I have an RX 5600 XT GPU and defaults to the integrated graphics, forcing me to go into the 1995 'Device Manager' to reset it
- Occasionally just... stops playing audio?
- Occasionally has its icons disappear from the taskbar
- Regularly refuses to close applications, making me go into the Task Manager to force-quit them.
These are just the issues I can think of off the top of my head. I've been playing PC games for like 15 years and this is just par for the course for me. Linux is still quite far behind in terms of desktop stability in my experience. But I guess if Valve fully controls the hardware they can avoid janky driver issues (it sounds like suspend will work reliably!), so this might actually make a good desktop Linux option.
I'm wondering when and with what hardware they had that bad experience.
But it's trivial to run into some .NET or Visual C++ redistributable hell where you just get a cryptic error on startup and that's it - just check the internet. I have roughly 20 of them installed currently (why the heck?), and earlier versions would happily get installed over an already-installed copy of the same one, for example as part of a game's installation process - not stellar workmanship on MS's side. What's wrong with having the latest version be backward compatible with all previous ones, like e.g. Java achieved 25 years ago?
Talking about fully updated windows 10 and say official steam distros of the games.
> it's trivial to run into some .NET or Visual C++ redistributable hell where you just get a cryptic error on startup and that's it - just check the internet.
Thanks for making my point for me.
There may be a connection here with age and the type of games I play too. I'm in my mid-30s now and am not interested in competitive twitch shooters like Call of Duty. In many cases, the games I've been interested in have actually been PS5 exclusives or were a mostly equivalent experience on PS5 Pro vs. PC or were actually arguably better on PS5 Pro (e.g., Jedi Survivor). In some cases, like with Doom: The Dark Ages, I've been surprised at how much I enjoyed something I previously would've only considered playing on PC -- the PS5 Pro version still manages to offer both 60 FPS and ray tracing. In other cases, like Diablo IV, I started playing on PC but gradually over time my playtime naturally transitioned almost entirely to PS5 Pro. The last time I played Diablo IV on my PC, which has a 4090, I was shocked at how unstable and stutter-filled the game was with ray tracing enabled, whereas it's comparatively much more stable on PS5 Pro while still offering ray tracing (albeit at 30 FPS -- but I've come to prefer stability > raw FPS in all but the most latency-sensitive games).
One benefit of this approach if you live with someone else or have a family, etc., is that investments in your setup can be experienced by everyone, even non-gamers. For instance, rather than spending thousands of dollars on a gaming PC that only I would use, I've instead been in the market for an upgraded and larger TV for the "home theater", which everyone can use both for gaming and non-gaming purposes.
Something else very cool but still quite niche and poorly understood, even amongst tech circles, is that it's possible to stream PS5 games into the Vision Pro. There are a few ways of doing this, but my preferred method has been using an app called Portal. This is a truly unique experience because of the Vision Pro's combination of high-end displays and quality full-color passthrough / mixed reality. You can essentially get a 4K 120"+ curved screen floating in space in the middle of your room at perfect eye level, with zero glare regardless of any lighting conditions in the room, while still using your surround sound system for audio. The only downside is that streaming does introduce some input latency. I wouldn't play Doom this way, but something like Astro Bot is just phenomenal. This all works flawlessly out of the box with no configuration.
It's apparently small, quiet, capable, and easy.
I'll keep building my own, but most people don't, and the value of saved time and reduced hassle should not be underestimated.
If comparing this device to other pre-built systems, consider that this one is likely to be a first class target for game developers, while others are not.
Don't get me wrong, this looks like a very nice product, but it's nothing revolutionary.
But I think the biggest feature might be the quick suspend and resume. Every modern console has that, but not PCs. You can try to put a computer to sleep, but many games won't like that.
Not to mention windows laptops waking up in bags or backpacks in the middle of the night seemingly for the only purpose of burning themselves up.
This steam machine here is a PC with steam preinstalled for a console-like setup and direct boot to your game library - but it’s still a pc.
The point is, computers are computers I guess ;)
there's plenty of people who just want to play games without researching what CPU and video card to buy.
The best experience you can get atm is to use Steam's Big Picture mode, and that doesn't give you pause/resume, and you will sometimes need to use keyboard & mouse to solve issues, plus you need to manage the whole OS yourself, etc.
Valve's SteamOS, which already runs on the Steam Deck, gives you all the QoL that you expect out of a console: pause/resume with a power button press, complete control via controller, a fully managed OS.
What's missing are "in experience" native apps like Netflix/AppleTV/etc. as well as support for certain games which are blocked on anti-cheat.
My wife is a research scientist who uses linux with her day job, but she isn't interested in dealing with any nonsense when she's relaxing at the end of the day. The Steam Deck has been a wonder for her - suddenly she's playing the same games as me with none of the hassle. The Steam Machine will suddenly open a bunch of my friends and family up to PC games as well.
It won't be long until you can put SteamOS on any machine you make yourself, but the Steam Machine will serve as reference and "default" hardware for the majority.
SteamOS is a super controller-friendly desktop that would be right at home in a living room. Like the Deck, the Steam Machine could become a target profile for developers.
SteamOS's core functionality leans heavily on Mesa and there's been a lot of commits for the Adreno 750 lately, mostly coming from Linaro.
Hoping the next Apple TV will do it.
Edit - updated specs claim it can do this, but it’s limited to HDMI 2.0
Looks like it can do 4k 120hz, but since it's limited to HDMI 2.0 it will have to rely on 4:2:0 chroma subsampling to get there. Unfortunately the lack of HDMI 2.1 might be down to politics, the RDNA3 GPU they're using should support it in hardware, but the HDMI Forum has blocked AMD from releasing an open source HDMI 2.1 implementation.
https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...
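The rough arithmetic behind why 4:2:0 is needed (assuming 8-bit color and HDMI 2.0's ~14.4 Gbps of usable data rate after 8b/10b overhead; blanking intervals are ignored here):

```python
def video_gbps(width, height, fps, bits_per_pixel):
    # Raw pixel data rate in Gbit/s, ignoring blanking overhead.
    return width * height * fps * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 120, 24))  # 4:4:4, 8-bit -> ~23.9 Gb/s, over HDMI 2.0's limit
print(video_gbps(3840, 2160, 120, 12))  # 4:2:0, 8-bit -> ~11.9 Gb/s, fits
```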
There are two kinds of DP to HDMI adapters. The passive ones are like you said, they need special support on the GPU (these ports are usually labelled as DP++), IIRC they only do some voltage level shifting. The active ones work on any DP port (they don't need AFAIK any special support on the GPU), and they do the full protocol conversion.
Club 3D active adapter: https://www.amazon.com/Club-3D-DisplayPort1-4-Adapter-CAC-10...
I’m using the Club3D active adapter, which is the only one I found in reviews to reliably work. And it does, 0 problems whatsoever.
It seems to me the wireless is pretty important. I have an MQ3 and I have the Link cable. For software development I pretty much have to plug the MQ3 into my PC, and it's not so bad to wander around the living room looking at a Mars boulder from all sides and such.
For games and apps that involve moving around, particularly things like Beat Saber or Supernatural the standalone headset has a huge advantage of having no cable. If I have a choice between buying a game on Steam or the MQ3 store I'm likely to buy the MQ3 game because of the convenience and freedom of standalone. A really good wireless link changes that.
I'm talking about the Steam Machine here. In theory you could pipe 4k120 to the headset assuming there's enough wireless bandwidth, yeah.
I reckon it can probably stream at 4K@120 if it can game at half that.
HDMI 2.0
Up to 4K @ 120Hz
Supports HDR, FreeSync, and CEC
I have zero doubts the device can do 4K @ 120Hz streaming hardware-wise. In the end it is just a normal Linux desktop.
Or that's what I think - I may be completely wrong.
When they cancelled production I bought 8.
Mac Mini m4: 127 x 127 x 50 mm = 0.8 L
Steam Machine: 156 x 162 x 152 mm = 3.8 L
That's 4.76 times more volume.
Or is it “comparing apples to steam engines”?
9.5 x 19.7 x 19.7 cm = 3,687 cm³
and half the size of my SFFPC @ 8.3L
Why? VR headsets are a dying fad of the 2020s. Way more excited for SteamOS on ARM.
The fact that this can run standalone, doesn't have a bunch of wires dangling from it, and is pretty much a fully working Linux box makes this an almost no-brainer for me.
I do _hope_ the price is reasonable though, if it ends up being like Apple VR I might not buy into it immediately, but I'm hoping for a reasonable $1000 max price.
There are, of course, the issues with lootboxes but even there they've kept their hands much cleaner than any other game developer.
It's a very well oiled machine, I had another VR headset ordered for sim racing, immediately canceled it when saw the Frame announcement because even if specs-wise it's a bit of a downgrade, I want to buy what Valve is selling.
They do seem to get a pretty big pass on that. Wonder what it is about.
Almost every other aspect of the company I find great, and I do wish they would release more games. Maybe Alyx 2 will come out with the headset? Could be what HLX has been this whole time, where people think it is HL3.
On sim racing in VR, absolute game changer. I would never go back to screens, it's the perfect application for VR.
Back 4 Blood was just another "live service" game that stirred up hype, released in an extremely buggy state, poorly balanced, with terrible AI that was never fixed, without mod support or community servers. They cashed in on the initial surge of popularity, cashed in on the DLCs, and then it quickly died off because it didn't have any of the charm or replay value of the games they claimed to improve upon.
See "cheaper than index": https://www.uploadvr.com/valve-steam-frame-official-announce...
> Unlike the Index controllers, Steam Frame Controllers don't have built-in hand grip straps. But Valve says it will sell them as an optional accessory for people who want them, a similar strategy to Meta.
I was disappointed seeing no hand grip straps. I've never used a Valve Index but they seemed very useful. Very glad that they will still be available.
If as I currently intend I end up purchasing this device, I will definitely endeavour to obtain the controller straps as well as the top strap for the headset at the same time, and I recommend others do the same.
https://vr-compare.com/compare?h1=0jLuwg808-j&h2=w8xCM-oPA
The 1000hz tracking frequency is from the Lighthouse tracking system, which the Frame loses. For that and other reasons, I am not convinced the controllers are better than the Index controllers. Personally I think it's likely I will keep using the Index controllers, since I have the whole lighthouse setup and I own trackers as well.
But this headset solves the ecosystem aspect and brings that visual experience with it.
Still hoping that you’re right, though.
I don't think I'm the norm, but probably neither an exception
Only question is if 2160px is enough.
The Quest 3 is already close to good enough to spend decent chunks of time in reading text. Just have breaks every 30m to avoid mild strain.
To me, the sweaty face issue is the main annoyance with working in these types of headsets.
Clarity has been totally fine for work reading text on, if I were inclined to code in VR that would totally work for me.
Having the headset also be a PC (and not essentially a phone OS) is worth a premium of >$250 at least. You can build desktop apps/games on this thing, it can (hopefully) do just about anything a normal PC can.
The Quest is impressive in many ways, but it's a much narrower-use device. I don't think Valve's pricing needs to be in that same bracket to still sell.
Just make sure to wait for reviews on this front - it almost certainly can't run AAA games at the native resolution + fps. Likely it'll only be able to run lower req games on device.
This. The combination of this being from Valve and the fact that it's highly likely to be an open Linux machine you can strap to your face means I'm looking to finally bite the bullet on a headset. The one thing I need to know is: can I use it for productivity? I'm used to working on 27"+ 4k monitors - _how much_ clarity am I going to sacrifice with this?
It has pretty important benefits - lowest possible latency & being able to just pick the headset any play anywhere.
In comparison Meta might have cut down too much in Quest 3 by omitting eye tracking.
In my opinion, VR gaming will never become more than a gimmick. It adds a questionable improvement in graphics and immersion at the incredibly high cost of excluding yourself from the real world. Right now it's not worth it, and I don't think it ever will be, no matter how good the graphics get. And that's assuming they even solve the motion sickness problem, which doesn't seem solvable to me at this point.
The motion controls in VR will also always be severely limited by the fact that you can’t see your surroundings. You can’t meaningfully move around or swing your arms fast in any realistic home environment when you’re in full VR. You’re constantly at risk of punching something or breaking something, or both. So the controls have to become really stiff and avoid requiring wide movement, at which point you might as well just push buttons on a gamepad.
But AR is a completely different thing. No motion sickness, no risk in any movement, you can move around without silly treadmills, and no exclusion from the world. It's truly amazing. The AR boxing, pickleball, ping pong and golf are so much closer to the real thing than to a videogame adaptation; even the shitty Quest graphics don't ruin the magic. Those AR experiences don't work on videogame rules and really deserve their own name and category - they're as different from gaming as books are from movies. If VR headsets don't die out, AR is going to be the thing that brings them to the mainstream. I just wish it had more attention, more apps, and more non-Meta mainstream platforms. Not this time, sadly.
The Steam Deck was wildly popular for a non-Nintendo device. It's got Linux up to 3% of total Steam playtime. If this has a similar draw (play every game on Steam without having to buy a TV), maybe the install base of VR will grow to a point where it's more feasible to make games that support it.
It also makes SteamVR relevant again in a world where Oculus has been eating a lot of the mindshare by releasing affordable headsets and buying the most successful game studios.
The big difference seems to be that this headset doesn't have AR cameras at all, but reuses the mapping cameras for some light passthrough duty.
The real reason the Frame is monochrome AR is because the cameras are also used for IR tracking which is better in monochrome. You can use the Frame in the dark or a dimly lit room - Quest 3 you can't. For real VR users the trade off is worth it.
You clear the area within the boundaries, leave a little buffer space to the walls, and respect the boundary warnings in game. No problems. You do need a few square meters without any furniture to do this.
Boxing and ping pong feel just as great in VR as they do in AR. It's more a matter of the level of immersion: AR works well for table tennis, but fantasy games are severely limited in what they can do. The most impressive experiences are always in VR - "flying in space" doesn't work while looking at your living room walls.
That's a feature for a good number of games, if not most. For example, Resident Evil 4/8 in VR are by far the best horror experiences I've had, and part of it is that you stop seeing your living room while playing.
> The motion controls in VR will also always be severely limited by the fact that you can’t see your surroundings.
There is zero chance that aiming with a controller is more intuitive than point-and-shoot. What I get from your comment is that the movement can be awkward which is absolutely true, but plenty of games have neat ways around that. And then there are games that require no actual movement, like racing games with a sim setup.
I'd really like to know what the experience is like of using it, both for games and something like video.
Linus the shrill/yappy poodle and his channel are less than worthless IMO.
(If I move my head closer it gets larger, further and it gets smaller)
I would be curious to see a similar thing that includes flashing. Anecdotally, my peripheral vision seems to be highly sensitive to flashing/strobing even if it is evidently poor at seeing fine details. Make me think compression in the time domain (e.g. reducing frame rate) will be less effective. But I wonder if the flashing would "wake up" the peripheral vision to changes it can't normally detect.
Not sure what the random jab at Linus is about.
It could really push the boundaries of detail and efficiency, if we could somehow do it real-time for something that complex. (Streaming video sounds a lot easier)
They are complementary things. Foveated rendering means your GPU has to do less work which means higher frame rates for the same resolution/quality settings. Foveated streaming is more about just being able get video data across from the rendering device to the headset. You need both things to get great results as either rendering or video transport could be a bottleneck.
Foveated streaming is just a bandwidth hack and doesn't reduce the graphic requirements on the host computer the same way foveated rendering does.
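To get a feel for the rendering-side savings, here's a toy two-zone estimate (the zone size and periphery scale are made-up numbers, just to show the order of magnitude):

```python
def foveated_pixel_fraction(fov_deg=100.0, fovea_deg=40.0, periphery_scale=0.25):
    """Rough fraction of full-res pixels shaded with a two-zone foveated scheme.

    Assumes a square viewport, a square full-resolution inset of `fovea_deg`,
    and the rest shaded at `periphery_scale` resolution per axis.
    """
    fovea_area = (fovea_deg / fov_deg) ** 2
    periphery_area = 1.0 - fovea_area
    return fovea_area + periphery_area * periphery_scale ** 2

print(round(foveated_pixel_fraction(), 3))  # ~0.21 -> roughly 5x fewer shaded pixels
```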
While there are some recent'ish extensions to do variable-rate shading in rasterisation[0], this isn't variable-rate visibility determination (well, you can do stochastic rasterisation[1], but it's not implemented in hardware), and with ray tracing you can do as fine-grained distribution of rays as you like.
TL;DR for foveated rendering, ray tracing is the efficiency king, not rasterisation. But don't worry, ray tracing will eventually replace all rasterisation anyway :)
[0] https://developer.nvidia.com/vrworks/graphics/variableratesh...
[1] https://research.nvidia.com/sites/default/files/pubs/2010-06...
Linus says he cannot tell it is actually foveated streaming.
It's close to imperceptible in normal usage.
Right now getting a fast enough and reliable wireless connection means either tweaking one's setup to death or spending car money on the entire setup. In particular, normal people usually don't realize how crappy their Wi-Fi is and assume it's all the same, which would end in them blaming the poor performance on the headset.
A while ago I bought the Quest 3 and set it up with WiFi 6 for streaming games. It's a decent setup, but I only bought it cause I was tired of waiting for the "rumored new headset by Valve".
And it seems everything on my wishlist is here:
- foveated rendering based on eye tracking - this is excellent, and was I think only available in the Quest Pro until now
- a dedicated wireless streaming dongle, with multiple radios on the headset - awesome, tuning WiFi 6 got me to a good-enough state, but I'm looking forward to a dedicated out-of-the-box solution
- pancake lenses
- inside-out tracking
In general, having had the Valve Index previously, and then using the Quest 3, it's a night-and-day difference to play something like Alyx wireless. Much better clarity with pancake lenses, too.
Main surprise here is their usage of a Snapdragon chip and not AMD, didn't expect this. I thought it would effectively be a steam deck hardware wise. Curious to see how well that works, esp. for standalone gaming. In practice though you'll likely want to be streaming any "pc-first" titles anyway.
I'm curious how Meta responds. IMO the only way to compete is on price/ease of use, but I'm not interested in another Quest - the 'social features' are just an excuse to collect data.
But Meta basically having access to my room in 3D, full audio, is not ideal. The very last company I want to invite into my home.
The pass-through video is monochrome and the screens have about 40% of the pixels compared to the Vision Pro.
The Samsung Galaxy XR is much closer to being a Vision Pro competitor.
The Steam Frame is very focused on playing games locally and streamed from a PC.
neither is the Apple Vision Pro
I also trust the Steam ecosystem far more than I probably should...
I mean, I have a Quest 2 and it'd be a step up, but not a huge one. I've seen the Apple Vision and that did wow me. The Vision is just in a weird corner inside a closed ecosystem, and a tech demo for Apple. No thanks. Valve will absolutely do that ten times better. But will it be visually so much better than a Quest 2? I doubt it.
Guess they have yet another translation layer to run these APKs?