
I tried this with the Viture Pro XR glasses last year and it sucks. Can't use them with Linux, except in dumb monitor mode. No head tracking unless you're on a supported OS. The Android app sucks because you can't use it with any old app, e.g. productivity apps (their app is like a demo of head tracking that only supports stuff like YouTube and local media). Maybe I should have purchased the Xreals?


The first gen XReal glasses are similar in that you need software running on the host to get anything other than dumb monitor mode. With these newer models they've moved a bunch of the functionality into hardware on the glasses themselves, so you get a virtual monitor and wider device support out of the box.

There are a couple of projects that are trying to get better open source support for the Airs on Linux; I've not kept up with their progress.


I really liked the new Xreal glasses because of the built-in head tracking. But I was getting a lot of drift over time, such that after anywhere between a few minutes and maybe half an hour, the center of the screen wouldn't be straight ahead anymore, meaning I had to reset the display location far too often. I ended up returning them because of it.


Oh ick. That's not so good. Hope they can firmware-update their way out of that. It's odd that they've got that problem because my experience with the tracking on the Airs with the desktop software driving it has been pretty much fine.


Waiting for the XReal One Pro to vibe code using Aider and Samsung DeX in a Termux terminal while walking my country's national parks (just something I would like to try)


I tried something like that two decades ago with a 640x480 monocular strapped to my sunglasses; with interpolation I could use a 1024x768 resolution, in combination with an ARM-based Pocket PC with host USB and CompactFlash video-out. I used it for reading and 'fast' offline Wikipedia/TR3 database search with both a FrogPad and a Twiddler2, plus some voice commands.

You can actually see the foreground and judge depth OK, as the monocular screen on my left eye 'merged' semi-transparently due to brain processing. I assume this is a bit worse on the XReal. The main issue was that when walking you bob up and down in a slight sine wave relative to the foreground. You usually don't notice this, but with a paragraph of text or code positioned in front of your eyes it becomes very distracting.

One solution is a mode of transport that doesn't involve that slight up-and-down motion, for example a bicycle or car. In both cases, the latter being the most problematic, it's not advisable safety-wise (or even legal), and a screen reader solution is better. I had the idea of using Emacspeak for this, or doing a smart Speakup echo from the command line.

Another solution is RSVP (rapid serial visual presentation), or RSVP combined with text-to-speech. Samsung DeX is great though; Motorola, Huawei, and recent stock Android support a desktop mode too (if the phone supports video-out).
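
RSVP is easy to prototype, by the way. A minimal sketch of the idea in Python (the words-per-minute rate, line width, and sample text are just illustrative assumptions):

    # Minimal RSVP (Rapid Serial Visual Presentation) sketch: flash one
    # word at a time at a fixed rate, so the reader's eyes never move.
    import sys
    import time

    def rsvp(text: str, wpm: int = 300) -> None:
        delay = 60.0 / wpm  # seconds per word
        for word in text.split():
            # \r rewrites the same terminal line, keeping the word in place
            sys.stdout.write("\r" + word.center(40))
            sys.stdout.flush()
            time.sleep(delay)
        sys.stdout.write("\n")

    rsvp("RSVP flashes one word at a time so your eyes stay still while you walk.")

Piping the same word stream through a TTS engine would give you the combined RSVP+TTS mode.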


The latest Xreal glasses provide 3DoF tracking natively in hardware. In this case your eyes perceive the screen as stationary even as you (and it) inevitably "bounce" during motion.

I still don't recommend walking around with them on while reading.

I'll also plus-one Samsung DeX. It really is amazing to have a desktop-like experience with just glasses on, and it feels properly cyberpunk.


That's so cool, thanks for the detailed reply. I agree walking around with a big bouncy screen in front is probably not going to be very ergonomic. Still, I can't let the idea go completely. Going RSVP+TTS might indeed be a more viable approach, and certainly something to test with the current wave of AI/LLM agents.


I do remember a website a long time ago about a guy walking around with a setup like this. Seemed like it involved lead-acid batteries and a lot of weight.

It was really interesting back then, when desktops were as big as large pizzas.



Oh my gosh, I remember Wikipedia on TR3. That's a piece of software I haven't thought of in a while.


Will you be going all voice, or will you need some chording keyboards?

https://www.charachorder.com/


I was thinking the same while I was walking my dog.


> I’ve found that simply converting a standard, normalized PostgreSQL database doesn’t work well in practice. Instead, you need to design your database with YugabyteDB’s distributed architecture in mind.

I wanted to reply to your previous post, but HN doesn't allow comments on older posts.

To put it simply, we spent almost 3 weeks trying out YB for our own use case and ran into massive performance issues.

The main issue seems to be that anything sharded (which it is by default) will perform sorts, filters, and other operations only after retrieving sufficient data from all the shards.

The result is that this massively impacts performance. The YB team somewhat hides this by spending massive amounts of resources on seq scans, or by requiring you to explicitly design pre-sorted indexes.
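
For anyone who hasn't hit this: the difference shows up right in the DDL. A hedged sketch of what I mean (table and column names are made up; YB speaks the Postgres wire protocol, so a standard Python driver works):

    # Hash vs. range sharding in YugabyteDB's YSQL. Table/column names are
    # illustrative; YB's YSQL listens on port 5433 with the Postgres protocol.
    import psycopg2

    conn = psycopg2.connect("host=localhost port=5433 dbname=yugabyte user=yugabyte")
    cur = conn.cursor()

    # Default: HASH sharding on the first PK column. Rows scatter across
    # tablets, so ORDER BY ts must pull from every shard before sorting.
    cur.execute("""
        CREATE TABLE events_hash (
            id bigint, ts timestamptz, payload text,
            PRIMARY KEY (id HASH))""")

    # Range sharding: keys stay ordered, so time-ordered scans read a
    # contiguous key range instead of gathering and re-sorting.
    cur.execute("""
        CREATE TABLE events_range (
            ts timestamptz, id bigint, payload text,
            PRIMARY KEY (ts ASC, id ASC))""")
    conn.commit()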

In our experience, a basic 3-node YB cluster barely does 7,000 inserts/s, whereas a Pg instance easily does 100,000+ inserts/second (with replication). Of course there is the cheating, like "look how we insert 1 million/second" (3x), with 100 heavy AWS nodes, ignoring that they are just inserting rows with no generated IDs ;) If you put the same workload in front of Pg, you're easily hitting 150,000 inserts/second on a single node.
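
If you want to sanity-check numbers like these yourself, a rough benchmark sketch (connection string, table, and batch size are assumptions; the point is that the server still has to assign every generated ID, which the marketing benchmarks skip):

    # Rough insert-throughput check with server-generated IDs (bigserial).
    import time
    import psycopg2
    from psycopg2.extras import execute_values

    conn = psycopg2.connect("host=localhost dbname=bench user=postgres")
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS bench (id bigserial PRIMARY KEY, v text)")
    conn.commit()

    rows = [("x" * 64,) for _ in range(100_000)]
    start = time.monotonic()
    # execute_values batches the insert; the server still assigns every id.
    execute_values(cur, "INSERT INTO bench (v) VALUES %s", rows, page_size=1000)
    conn.commit()
    print(f"{len(rows) / (time.monotonic() - start):,.0f} inserts/second")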

There are issues like trigrams taking up 8x the space compared to Pg trigrams (no row-level packing). Pre-sorted materialized joins are not respected, as inserts are done by hash distribution. So item 1 can be on node 3, item 2 on node 1, ... which then needs to be fetched and re-sorted, killing the performance benefits.

An optimized materialized join pulls out data in 0.2ms to 10ms, where we have seen YB take between 6ms and minutes. On the exact same dataset, inserted the same way, ... on only 100 million rows of data, which you'd expect to be these databases' playground.

Plugins that are broken in the new 2.25 version.

We lost YB-Master and YB-TServer nodes multiple times. Ironically, the loss of the master was not even reported in the main UI, which can only be described as amateurish.

CockroachDB has a similar series of issues with latency, insert speed, and more ... combined with the horrible new "we spy on you" free license that they may or may not extend every year. We found CRDB more mature in a sense, and less a Frankenstein's monster of parts, but lacking Postgres features (or they work differently).

In essence, as long as you use them as MongoDB-like denormalized databases that can run SQL, great. The issues really start when you're combining normalization with the expectation of performance.

And the resources that both use mean you need to scale to a 20x cluster just to reach the equivalent of a single Pg node. And each of those YB/CRDB nodes needs twice the resources of the Pg node.

In general, my advice is to just run Pg with replication/Patroni, and maybe scale out to more Pg clusters where you separate tables across clusters. Use the built-in postgres_fdw to create materialized read nodes to offload pressure and load-balance. Unless you are running Reddit at scale, the disadvantages of YB/CRDB totally outweigh the benefits.
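
To make the postgres_fdw suggestion concrete, a sketch of such a "materialized read node" (hostnames, credentials, and the orders table are made-up illustrations):

    # A read node pulls tables from the primary over postgres_fdw and
    # snapshots them locally; REFRESH on a schedule offloads the primary.
    import psycopg2

    conn = psycopg2.connect("host=read-node dbname=reports user=postgres")
    conn.autocommit = True
    cur = conn.cursor()

    cur.execute("CREATE EXTENSION IF NOT EXISTS postgres_fdw")
    cur.execute("""
        CREATE SERVER IF NOT EXISTS primary_db
            FOREIGN DATA WRAPPER postgres_fdw
            OPTIONS (host 'primary', port '5432', dbname 'app')""")
    cur.execute("""
        CREATE USER MAPPING IF NOT EXISTS FOR postgres SERVER primary_db
            OPTIONS (user 'reader', password 'secret')""")
    cur.execute("""
        CREATE FOREIGN TABLE IF NOT EXISTS orders_remote (
            id bigint, customer_id bigint, total numeric)
            SERVER primary_db
            OPTIONS (schema_name 'public', table_name 'orders')""")
    # Local pre-computed snapshot that read queries hit instead of the primary.
    cur.execute("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS orders_by_customer AS
            SELECT customer_id, count(*) AS n, sum(total) AS total
            FROM orders_remote GROUP BY customer_id""")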

The uptime, ease of upgrading, and geo-distribution are handy, not going to lie. But it's software that really only benefits companies with special demands or at high scale, and even then. I remember that Reddit ran (still runs?) on barely a few DB servers.

What is even worse is that, as you stated, you need to design your database so much around YB/CRDB that you might as well use the above-mentioned Pg solutions and get way more gains.

I hope this response is of use.


I have Xreal Air 2s that get zero use. I don't recommend them that much; working on a laptop is better, and since they're constantly making newer versions, it's worth not buying anything current and waiting for the next one, which will be better. I had buyer's regret, wishing I had waited longer for the newer version, but unless I buy a Steam Deck to play games I'll probably never use them.


Ditto. I bought them because I travel a lot for work and hotel TVs are always a mess when it comes to HDMI in. I use them on maybe 1 in 4 flights (usually on a Steam Deck with the XR plugin in Decky), and rarely at the hotels.

The only time they were really helpful for me was one flight when I had to work on some sensitive content and the person next to me was obviously trying to shoulder surf me.

Maybe it would be more productive if I had a newer iPhone with USB-C or the XReal One glasses. But having to use all the dongles to go from iPhone to Lightning-to-HDMI adapter, to HDMI-to-USB-C cable, to the original XReal Beam (which is way underpowered), to USB-C to the glasses, that's a ton of cables. And don't forget, you've now got to pair your AirPods with the Beam! Ughh. It's a little better with the Beam Pro. It's faster and generally a good experience, but it's still another device and still doesn't have a good way to stream 3D content from Plex/NAS. Ugh.

That being said, when I put in the effort, the picture is very nice.


The iPhone sucks with most if not all AR glasses solutions. Most flagship Android devices just need a single cord. Apple made their choices, and it sucks.


Newer iPhones and the XReal One glasses are just fine, and on par with the admittedly mediocre experience of Android devices.

Let’s not pretend that Nebula on Android is good. I’ve got a Beam Pro. It also is not great.

SteamDeck is really the best experience I’ve found.


The Air 2 sadly went backwards in edge clarity, which makes them harder to use for work.


SimulaVR[1][2] is releasing our standalone (belt-strappable) compute packs this year, which will (i) come pre-installed with our FOSS Linux VR Desktop compositor and (ii) work with AR headsets like the Rokid Max series (and potentially the XReal headsets). So basically: you'll get full Linux Desktop apps in AR (not just Android ones) with actual VR window management (not just "dumb monitor mode").

[1] I know we're taking forever D: But we intend for this to be a way to release an intermediate product (which we've been making anyway for our full headsets).

[2] Our next blog update will be about this. Here's a video preview: https://youtube.com/shorts/Y67D8DkqScU?si=LpdSpjmfGn2k2rxP


I'd like to try this kind of setup (coding from a lounge chair with just a keyboard tray & trackball, yay!); "dumb monitor" would be sufficient. But since switching to high-DPI displays in 2016, I really need this to be 4K.


I did this with Nreal Air glasses (now Xreal Air), specifically for coding. Most use cases for these types of glasses are around media consumption, so I was taking a bit of a leap using them for coding/heavy text work.

There are two modes. One is fixed: the virtual monitor stays in one spot on the lenses, so the single virtual monitor is always directly in front of you. The other is floating, which keeps the virtual monitor in one spot in space, so you can turn your head to look away. This mode also lets you set up three virtual monitors side by side and turn to look at each. It uses head tracking to shift the image in the opposite direction to your head turn.
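
In case it helps to picture it, the floating-mode compensation is essentially this (toy sketch; the field of view and panel width are assumed numbers, not Xreal specs I've verified):

    # Toy model of "floating" mode: shift the image opposite to head yaw
    # so the virtual monitor appears pinned in space.
    PANEL_WIDTH_PX = 1920
    FOV_DEG = 46.0                        # assumed horizontal field of view
    PX_PER_DEG = PANEL_WIDTH_PX / FOV_DEG

    def screen_offset_px(head_yaw_deg: float) -> float:
        """Pixels to shift the monitor so it seems stationary."""
        return -head_yaw_deg * PX_PER_DEG

    # Turn your head 5 degrees right -> image shifts ~209 px left.
    print(screen_offset_px(5.0))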

In both cases the screen does move, and this is super relevant when reading text or glancing down at status bars. The fixed mode is better because it moves with your head, but both have some amount of jitter. I found the best-case coding scenario is the fixed monitor (no head tracking) in a seat with a headrest that you can press your head back into. This minimizes your head movement, which minimizes how much the text moves about. The downside is that we're used to looking up and down at a screen, so you want to set the monitor at a distance where you can look up and down with just your eyes. You really want to shrink the monitor to a size close to that of a laptop screen.

I ultimately ended up not liking the experience very much. No matter what, you're gonna end up with some amount of text movement. There is also a bit of light saturation bleed-through (old CRT style). Putting the blackout blinders on helps a lot, but the projected nature remains an issue. It's essentially only usable long-term in a recliner or a car seat with a headrest. Unlike the author, I use a work-provided laptop and have it with me anyway. There was coolness to leaving the laptop in my backpack and just bringing the glasses up via wire, but to actually do anything I need a keyboard, which means taking along Lenovo's ThinkPad TrackPoint keyboard (really great backpack keyboard) or pulling out the laptop.

The newer ones, like the Xreal One from the article, might be a better experience. A coworker had the Air 2 Pros and used them for travel. He said he didn't really notice the things I did, so maybe the experience improved even with that version. But he mostly worked in office documents, with only occasional terminal work; when traveling with the glasses it was almost all "office docs", and only for short periods of time. For me, I am going to wait and be a slow adopter before moving to a new version.


Thank you for your detailed insights. I was on the fence, now I think I’ll wait.


If your monitor keeps following your head movements, even teeny tiny ones, it gets annoying very fast.


Yep, can't get past it


The drivers here https://github.com/wheaney/XRLinuxDriver mark Viture as "recommended" with the best support. I do see some mention that head tracking is a desktop responsibility, but I presume that means some support in the driver... do you have more information on this?


The open-source Breezy GNOME is worth a try. It has head tracking, and multi-monitor support is in beta on the GNOME DE.

https://github.com/wheaney/breezy-desktop


Open source that requires a commercial license, and the drivers link to a repo that 404s. Think I will pass.


It gets even worse: it looks like their driver makes calls to Google Analytics. [1] I'd stay away. The README doesn't even mention it, and the promise that "your personal data will never be shared, sold, or distributed in any form" certainly sounds misleading when you consider this.

[1] https://github.com/wheaney/XRLinuxDriver/blob/main/src/plugi...



Which is particularly odd for a publicly available GPL repo. For this project, it looks like anyone could just comment out the sections of code that check for a license and redistribute it. Or is there another binary somehow required that wasn't clear in my 2 minutes of searching?


Which link? All the links I could find worked.


No Linux? Full stop.


If only there were a credit-card-sized, LiPo-battery-powered 'puter with built-in wireless networking and a GPU-accelerated remote streaming app that output HDMI, all made and distributed by a Five Eyes alliance country for less than $15 each. If only... /s

The choice of a trusted HMD is a little more complex, but very solvable ;)



