It was at CVPR 2019, a computer vision conference. I may be biased since I used to work at Ouster, but cost notwithstanding, I would definitely pick the OS1 again for its unparalleled number of points per second combined with low weight and decent accuracy.
Anecdotally, using Darktable, I could never get as good of a demosaicing result as using the straight-out-of-camera JPEGs from my Fujifilm GFX 100S. In challenging scenarios such as fine diagonal lines, Darktable's algorithms such as LMMSE would add a lot of false colour to the image.
However, modern deep learning-based joint demosaicing and denoising algorithms handily outperform Darktable's classical algorithms.
However, Fujifilm lossless compressed raw actually does a decent job keeping the file sizes down (about 50% to 60% the file size of uncompressed) while maintaining decent write speed during burst shooting.
It's really strange to me that a lossy compressed format could be called "raw". Does that just mean that it hasn't been e.g. gamma-corrected before the compression was applied? (Is it even a good idea to do lossy compression before such correction?)
All raw means is scene-referred data. The idea that raw means "raw" data from the sensor is an often-repeated one, but it is unfortunately complete nonsense. Modern sensors do on-chip noise reduction, and they can be programmed to give data in all kinds of formats and with different processing applied. The same sensor used in different cameras can have different ISO. The same sensor used in different cameras can produce different RAW files even at the same ISO. Not just in the sense of a different file format, but in the sense of different data in the file, from the exact same sensor programmed differently.
I once did a project to do multilateration of bats (the flying mammal) using an array of 4 microphones arranged in a big Y shape on the ground. Using the time difference of arrival at the four microphones, we could find the positions of each bat that flew over the array, as well as identify the species. It was used for an environmental study to determine the impact of installing wind turbines. Fun times.
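For anyone curious what the delay-estimation step looks like, here is a minimal Python sketch of measuring one time difference of arrival by cross-correlating two microphone channels. The sample rate and the synthetic call are made-up illustration values, not recordings from the actual study:

```python
# Minimal sketch: estimate the time difference of arrival (TDOA) between two
# microphones by cross-correlation. Sample rate and signals are illustrative.
import numpy as np

FS = 250_000  # samples per second, enough headroom for 20-120 kHz bat calls

def tdoa(ref: np.ndarray, other: np.ndarray, fs: float = FS) -> float:
    """Delay (in seconds) of `other` relative to `ref`, from the correlation peak."""
    corr = np.correlate(other, ref, mode="full")
    lag = int(np.argmax(corr)) - (len(ref) - 1)  # lag in samples
    return lag / fs

# Fake call: a 2 ms sweep from 30 kHz to 80 kHz, arriving 40 samples later on mic 2
t = np.arange(0, 0.002, 1 / FS)
call = np.sin(2 * np.pi * (30_000 * t + 12.5e6 * t**2))
mic1 = np.concatenate([np.zeros(100), call, np.zeros(100)])
mic2 = np.concatenate([np.zeros(140), call, np.zeros(60)])
print(tdoa(mic1, mic2))  # ~1.6e-4 s, i.e. 40 samples at 250 kHz
```

With four microphones you get three independent delays, each of which pins the source to a hyperboloid, which is enough to solve for a 3D position.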
Reminds me of Intellectual Ventures' Photonic Fence, developed to track and kill mosquitoes with short laser pulses.
As a side-effect of the precision needed to spatially locate the mosquitoes, they could detect different wing beat frequencies that allowed target discrimination by sex and species.
This laser mosquito killer is, and always has been, a PR whitewashing campaign for Intellectual Ventures' reputation.
This device has never been built, never been purchasable, and it is ALWAYS brought up whenever IV wants to talk about how cool they are.
And I say this as someone who loosely knew and was friends with a few people that worked there. They brought up this same invention when they were talking about their work. They eventually soured on the company once they saw the actual sausage being made.
IV is a patent troll, shaking down people doing the real work of developing products.
They trot out this invention, and a handful of others, to appear to be a public benefit. Never mind that most of these inventions don't really exist and have never been manufactured.
They hide the extent of their holdings, they hide the byzantine network of shell companies they use to mask their holdings, and they spend a significant amount of their money lobbying (bribing).
Why do they need to hide all of this?
Look at their front page, prominently featuring the "Autoscope", for fighting malaria. Fighting malaria sounds great, they're the good guys, right?
Now do a bit of web searching to try to find out what the Autoscope is and where it's being used. It's vaporware press release articles going back 8 years.
Look at their "spinouts" page, and try to find any real substance at all on these companies. It is all gossamer, marketing speak with nothing behind it when you actually go looking for it.
Meanwhile, they hold a portfolio of more than 40,000 patents, and they siphon off billions from the real economy.
Part of their "licensing agreement" is that you can't talk badly about them after they shake you down, or else the price goes up.
I did a similar project at 18. Needless to say, I didn't have enough HW and SW skills to do much, since I implemented the most naive form of the TDOA algorithms as well as the most inefficient way of estimating the time differences through cross-correlation. I still learnt a lot, and it eventually led me to a PhD in SAR systems, which are actually beamformers that use the movement of the platform instead of an array.
What were the results of your study? I’ve heard that bat lungs are so sensitive that when they fly across the pressure differential of large turbines their capillaries basically explode
Yes basically. Bird lungs are relatively rigid, open at both ends like a tube, and have a one-way flow of air, so they are less prone to pressure-related injuries. Bat lungs are mammalian lungs that expand and contract as they breathe just like us, so they are particularly vulnerable to barotrauma near wind turbines.
After writing a bunch of MATLAB code to find the bats, I handed it off and haven't heard back about whether they actually built the wind turbines or not.
I would love to do something like that to track the bats in my garden, how feasible would it be for an amateur to do as a personal project?
Any good references on where to start?
Also worth mentioning is the outstanding and quiet work of Cosys-Lab at the University of Antwerp. They once put a microphone array below a scorpion and showed how bats steered their ultrasonic beam to scan for it. Incredible stuff [0].
Here's the report [1], written when I was a second year undergrad in 2010.
It's very basic. The species identification is based on matching contours of the spectrogram against some template contour. The multilateration was, embarrassingly, done by brute force by generating a dense 3D grid. At the time, I didn't have any knowledge of Kalman filters or anything that could have been helpful for actually tracking the bats.
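In case it helps anyone picture it, the brute-force version really is just "try every point on a grid and keep the one whose predicted delays best match the measured ones". A rough Python sketch, with made-up microphone coordinates, grid extent, and the speed of sound rounded to 343 m/s (not the code from the report):

```python
# Rough sketch of brute-force grid multilateration: evaluate every point on a
# dense 3D grid and keep the one whose predicted TDOAs best match the measured
# ones. Microphone layout, grid extent, and the example source are made up.
import numpy as np

C = 343.0  # speed of sound in air, m/s

# Four microphones in a Y shape on the ground, (x, y, z) in metres
mics = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                 [-5.0, 8.66, 0.0], [-5.0, -8.66, 0.0]])

def predicted_tdoas(point, mics=mics, c=C):
    """TDOAs of mics 1..3 relative to mic 0 for a source at `point` (seconds)."""
    d = np.linalg.norm(mics - point, axis=1)
    return (d[1:] - d[0]) / c

def locate(measured, grid_step=0.5):
    """Exhaustive search over a dense grid; returns the best-matching position."""
    xs = np.arange(-30, 30, grid_step)
    ys = np.arange(-30, 30, grid_step)
    zs = np.arange(1, 40, grid_step)  # bats fly above the array
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
    pts = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)                 # (N, 3)
    d = np.linalg.norm(pts[:, None, :] - mics[None, :, :], axis=-1)   # (N, 4)
    tdoas = (d[:, 1:] - d[:, :1]) / C                                 # (N, 3)
    err = np.sum((tdoas - measured) ** 2, axis=1)
    return pts[np.argmin(err)]

# Example: simulate a bat at (4, -2, 15) m and recover it from its own TDOAs
print(locate(predicted_tdoas(np.array([4.0, -2.0, 15.0]))))  # ~[4, -2, 15]
```

A least-squares solver (or a Kalman filter over successive calls) would be the less embarrassing way to do it, but the grid version is hard to get wrong.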
Honestly, that sounds like amazing work. I wish I could afford to get out of enterprise software engineering and just do academic software development like that.
> There's another, more subtle critique of this system: it lacks path independence. This means that if you start and end a drag with your mouse at a particular location, the rotation will depend on the path that your mouse took.
Actually, when I accidentally tumble models with that kind of UI, I just drag it in a circle until it's right side up.
I accidentally found out how to do this while using solid modeling software and use it quite a bit. I don’t see a formal name for this maneuver, or anyone teaching it. ‘Gimbal tumbling’ is the closest description.
Yeah I will fully admit that "draw circles to yaw" is not intuitive or discoverable, but it is quite convenient when you have learnt it. I'd definitely prefer tumbler over trackball because of this.
But turntable is clearly far superior to the others. Stick with that unless you really need to yaw the model.
3D printing technology is amazing now. I used to struggle with my ABS prints warping 12 years ago with a PP3DP --- I couldn't even print a giant 3x3x3 rubik's cube that worked. Now there are lots of 3D printers that are essentially just zero configuration and everything works out of the box. I even printed a lens mount for my camera and it came out quite well aligned. So it is very nice to see some regular consumer 3D printers being good enough for a functional 34 x 34 x 34 cube.
Author here: I use a Logitech G Pro X Superlight but also I use the i3 window manager and rely on keyboard shortcuts for a lot of the navigation. I have the mouse sensitivity set so that the cursor can traverse the width of the screen when moving the mouse about 13 cm, without any acceleration. This is still precise enough that I can move the mouse pixel by pixel if needed.
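For a rough sense of what that sensitivity means in numbers (taking the 8K width of 7680 px and my ~13 cm figure as given, so these are approximate):

```python
# Back-of-the-envelope numbers for that mouse sensitivity (figures are rough).
width_px = 7680           # horizontal resolution of the 8K desktop
travel_cm = 13            # mouse travel to cross the full screen width

px_per_cm = width_px / travel_cm            # ~590 px per cm of mouse movement
effective_cpi = px_per_cm * 2.54            # ~1500 counts per inch
um_per_px = travel_cm * 10_000 / width_px   # ~17 micrometres of travel per pixel
print(round(px_per_cm), round(effective_cpi), round(um_per_px, 1))
```

So one pixel corresponds to roughly 17 µm of hand movement, which modern gaming sensors can resolve without needing acceleration.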
Apart from programming, one of the motivations for getting the 8K display is to look at lidar point clouds. For example the desktop background in my post is a lidar map of Bernal Hill in San Francisco, which I've here downsampled to only 13006 x 7991 px for your convenience [1].
Admittedly, when I bought it at first, I didn't realize there would be so many random issues, as manufacturers all advertised their gear as "8K Ready" even in 2021. As I incrementally fixed the problems, I decided to document my journey in this blog post.
btw I posted this in the past but it got caught by the spam filter and disappeared [2], not sure how to appeal that when it happens. Thanks ingve for posting it again!
I had the Dell 8K monitor you mentioned. The picture quality was great, but it died after a few years, not long after the warranty expired (a gut punch at the purchase price), and they said too bad, so sad... OK, that's fine, but I will never buy another Dell product again. It was released too early to have proper DisplayPort support, and I had to use a custom nvidia-driver X11 config to make it mostly work as two monitors. And there is basically no way to use that kind of DPI without scaling.
I replaced it with an LG 43UN700, a 43" 4K display that I use unscaled, and although the LCD panel is vastly inferior I love the thing, especially at the price point (under $700). I hope manufacturers continue to support this niche of large singular flat displays because they are fantastic for coding and data viewing/visualization, and can pinch-hit at content consumption as your article states, although this one would be no good for gaming. And getting a "monitor" or "professional display" firmware load means a lot fewer problems than a Smart TV load.
I had a similar experience with Dell after they wanted the price of a new laptop for a replacement laptop battery. This was for the Dell Studio back when battery packs were made to be swappable by simply sliding a latch.
After that phone call to customer support, I made a similar vow to never buy another Dell product. These days, I use a Framework laptop.
If Sir is buying his lithium batteries and/or power transformers from the likes of eBay, Alibaba and Amazon, then Sir may wish to check his fire insurance is up to date.
I have bought many third-party rechargeable batteries from those sites over the years. Yes, slightly lower charge capacity compared to the originals, but no fires. And, yes, I know my sample size is small!
I also had similarly good experiences buying batteries on AliExpress. The issue with those typically isn't intrinsic quality, since the batteries are good most of the time, but a lack of quality control. Bad batteries will reach the market, and this is especially dangerous with packs that have many cells, like e-bike packs.
I did in fact buy a knock-off battery from eBay, but it kept its charge for hilariously little time. Had to run it off mains power permanently (ran it as a little server for a while).
Don't know your exact timing, but I've run basically on Dell Latitude laptops for the past 2 decades. Since it's just for travel and not my primary workhorse, I buy used corporate ones for a pittance (circa $300 for models worth $1,500 a few years earlier) and swap the battery myself for another original OEM one; they cost less than $100 from the original manufacturer. It's just 2-3 Phillips screws and 1 cable, anybody can do it. They last just as long as advertised on new ones and don't degrade much even after a few years.
Batteries (and e.g. chargers) are one of the things that it's utterly idiotic to shop around for on Chinese portals. You literally always get what you pay for (or worse) and can't punch above this threshold.
This was the mid 2010s and the laptop has long since bitten the dust. IIRC this was a Dell Studio 15, and I recall checking eBay for new old stock with no luck, but it doesn't surprise me that the Dell Latitudes have lots of stock floating around eBay.
FWIW, it was the same (even at the enterprise level).
We had a commodity (local cloud) computing Dell infra in the mid 2010s and were constantly replacing/returning “simple stuff” (fans, support flanges, memory, NICs).
“Dude, you’re gettin’ a Dell” became: nope, never again.
I feel like there are no good-quality hardware companies nowadays.
Dell: the land of motherboard dying and dog shit trackpads.
Asus: dead soldered RAM.
Most BIOSes: too long to boot. It's fucking 2024, what is your BIOS doing that it needs more than 2 s to boot? It was taking the same time to boot 30 years ago.
Every mouse: double click problem due to wrong use of the actuators.
And every hardware company has to try to cram some badly designed software and require you to create an account.
Your trackpad comment brought back a memory of 6 of us in a conference room.
We all had the same OS (NT) to the same patch level, same trackpad config, and same model of Dell laptop and every _single_ trackpad felt different. They weren't strictly "defective", but just wildly disparate physical feels and responsiveness.
I will give shout-outs to: 4th gen Kindles (has physical buttons and lasts forever), first gen iPhone SE, and Microsoft Mobile Mouse 3600.
Why does it take that long to POST? I've had multiple Ryzen 300-series motherboards, and none of them take anywhere near that long to boot, outside of using something like a server-grade HBA that has its own boot step.
I have no idea, but it's a known issue, memory training maybe? It's a gaming PC so nothing special going on, ROG HERO motherboard, 32GB DDR4 (4x8GB), GTX 1080Ti.
I haven't used it much in recent years. I built it for gaming but had kids a couple of years later; now I game on whatever is convenient in the small bursts I get, which is also the reason I haven't bothered upgrading it.
I would tepidly recommend Lenovo; they support the firmware for a long time and most things work. Warranty is what you decide to buy. Designs tend to be pretty serviceable, but it varies between models and over the years.
I stupidly updated the firmware on my ThinkPad 14 running Linux, and that removed the perfectly working S3 sleep and gave me a non-working, ridiculous S0ix instead.
You may want to look at the Samsung 5K monitor. It can often be had for $700. The sharpness of text is beautiful, especially if you're using a mac since it's optimized at 218ppi to avoid scaling. But, it might be smaller than you want. Apple also makes one that is nearly identical, except for the price.
PS - I have seen Dell go downhill as well. I returned the last Dell laptop I bought. My wife was sitting next to me on the couch and her macbook had full wifi bars while the dell had one bar. I did some research and they were using a pretty cheap wifi controller and maybe also had poor antenna design. I ordered a ThinkPad for the same price and it was great.
Which Samsung are you talking about? We got some new screens at work, Samsung S9 something or other. 27", 5K, thunderbolt 4. As you say, the text is very sharp, and colors seem fine enough. But that's about it, and I would not recommend them at all.
The worst issue is that the viewing angles are ridiculously bad. I'm sitting at arm's length, and the borders are very dark and somewhat blurry. They're of course OK if I move and look at them straight on, but my 32", fairly old LG doesn't have this problem.
Another pain point is the fact that it cuts off the power supply and the USB peripherals plugged in when it goes to sleep. I couldn't figure out any way of disabling this behavior. But if you leave your PC running and expect to connect to it over a USB network adaptor or similar, you're gonna have a bad time.
Yes, that's the one (Samsung only has one 5K monitor). Most reviews I've seen have been pretty positive about it, but it's good to hear a dissenting opinion as well.
I believe the viewing angle problem you're talking about is due to the anti-glare finish. It's a trade off for sure (one that some people would not want). I assume that's why Apple offers their consumer 5K monitor with or without that finish.
Apple also has a Pro 5K monitor. LG also has one older 5K, and those are the only 5K monitors currently on the market.
It's true that the anti-glare works OK. By that, I mean that I've never thought about it, and now that you mention it, I realize it's a good thing since I never felt the need to complain (I don't usually hold back). The screen is also very sharp and doesn't exhibit the weird texture some anti-glare coatings used to have.
However, in that particular office, there are no strong sources of light that would shine directly on the screens, so it's hard to say how good it actually is, especially when comparing to other models. The screen can also get pretty bright, so it should be able to handle most lighting situations in an office.
I type this on a Dell U3223QE, on a black background, with two lamps right behind me. The lamps aren't very bright, but the room is fairly dark (it's still night here). I can see the glare if I pay attention to it (didn't notice it before reading your comment). This is a 32" screen, sitting at roughly the same distance as the Samsung, yet it doesn't exhibit the viewing angle issue at all.
I do know that having a brightish window behind me with this screen requires upping the brightness, or the glare would be a pain. Never tried the Samsung in that configuration.
Have had a similar experience with Dell. I had exclusively used new and second-hand Dell laptops (Inspiron & Latitude) for the past fifteen years with no problems. Then I purchased an XPS 15 directly from Dell five months ago, and the battery charging circuitry has fried itself. The support ticket has been open for 40+ days awaiting parts...
Can't answer for them, but: lots of us are older and need glasses. To really benefit from your preferred resolution would mean tiny fonts that give me eyestrain. It makes sense for a phone or tablet held close up; for a monitor a meter away it mainly just increases the expense at every level (including video ram and bandwidth). OTOH more area is worth spending more on.
While I would prefer to have a large and HiDPI display in the future, unscaled 4k was more economical and has fringe benefits of not needing special setup/handling. I lost $5-6k with my failed Dell and am hesitant to spend a lot again since it was supposed to be a decade purchase.
Thinking back, the only other monitor I've ever lost was a Dell as well. Across 30 years of CRTs and CCFL LCDs, I never had any issues with other brands :(
I never owned a Dell screen, but I once had a Dell laptop and it was built like a tank. My brother had a 27" LG monitor from back when 27" was the biggest and best you could get, wasn't even 4k back then. It just died one day, probably the back light. I had a 24" CCFL monitor that actually never died, just got dimmer and dimmer every year, after about 5 years it was about half as bright as it was new.
Today I mostly use my 16" Macbook, which is quite close to being 4K. I really enjoy the HiDPI and the 120Hz refresh rate, makes it hard to use an external monitor since you can rarely get a HiDPI and high refresh montiro.
That's surprising, given that Dell usually offers a very good warranty on their monitors, at least to consumers. Was this a business (B2B) purchase, perhaps?
Not my experience, but maybe for some. The big problem is that the quality has gone way downhill in the past 10+ years, and the warranty periods are ridiculously short. TVs and monitors are all built (and warrantied) now like they should be replaced every 3-5 years.
In the early 4k era, whenever I saw a TV used as a monitor, the eye strain was high. It was too bright, too contrasty, and generally, the picture was not great for using for things like programming. In addition, many TVs would do not-so-great things to the picture, the worst being digitally sharpening the image (which resulted in e.g. a halo effect around small text).
This might not bother some people, but it bothers me a lot.
How are you finding the display compares to a real monitor? How do I buy TVs which I know won't do this sort of thing?
Many (most?) current TVs have either a game or PC mode which can be set per HDMI input to disable these "improvements".
I think this is primarily driven by console gamers, to the benefit of PC users. Our needs align here.
If you check rtings.com they usually evaluate how good the TVs are as a monitor.
You might still have issues with local dimming etc. But that is the price of cheap. Better models work really well today.
I am using a really cheapo LG 43" 4K as a monitor. Properly adjusted, it is usable. Would I like better? Yes. But it is worth the trade-off. And there are only a few options for a "proper" 4K monitor at 43". I find that a little strange, as it hits the sweet spot around 110 ppi. I used to use dual monitors, but I much prefer a (I know: comically) large screen.
Only real annoyance I have is that it does not turn off automatically like a real monitor using DPMS. This means that I have to turn it on using a button. It will turn off after 15 min if there is no signal. Like in the olden days.
Fortunately, modern TVs can disable the sharpening and contrast/saturation enhancements. Since I do a lot of photography and image processing, I am also extremely sensitive to oversharpened halos, so I was a bit worried about that at first --- but fortunately, that is 100% nonexistent once I applied the appropriate settings on my Samsung QN800A. See pics: [1] [2]
I also detest the mini-LED HDR that they have going on, which can cause bright things to glow, so I disabled that. Unfortunately my QN800A still has a bit of "full screen HDR", namely that the whole screen may uniformly dim if it detects that the scene is dark. This means that sometimes when you have a black screen with a single cursor on it, it gets dark and the cursor becomes hard to see. This doesn't affect normal usage though, when the screen is at a constant brightness.
On an LG TV you can disable this by buying a ‘service remote’ from Amazon and accessing a special menu (look for TPC or GSR settings). I don’t know about Samsung though.
I am using a 43 inch 4K monitor, so I am all in on big-screen real estate. But I find that even with a quarter of your screen area, I struggle to read the corners of the screen, the bottom is often obstructed by whatever is lying on my desk, and I had to make the mouse cursor bigger as I kept losing it. I doubt that an even bigger screen would be practical. I do have two 43" monitors side by side, but the other one is more like a secondary screen for playing movies or "storing windows"; it's too far from the eye to be useful as a primary monitor for reading and writing.
32" being (IMHO) too small and 43" too large, I have invested in a rare (and relatively expensive) 38" 4K monitor (Asus/ROG Swift PG38UQ, ~1000€ when I bought it, hasn't gone down much since), and until now I can only say good things about it. It's big enough to use without scaling (except for a few websites with tiny font size), but small enough so you can have a reasonable distance between the bottom and the desk and still see all four corners without craning your head around too much. It has a fixed foot (only the vertical angle is adjustable), and I originally thought I would have to buy an extra fully adjustable monitor stand, but so far I'm happy with the "default" settings. I'm not getting another one because of space limitations (and spouse tolerance issues), but compared to the two WUXGA monitors I had before, it's already almost four times as much screen space, so that should be enough for the foreseeable future.
Spouse tolerance issues are a significant factor - I have 2 x 32" and my spouse got used to it only after grudging about it for about a year :) If I brought a 43" one, I'm afraid a new apartment with a separate work room would have to come too.
I had a similar experience using a 43" 4K TV as my monitor, it was an OLED so the picture was absolutely beautiful but I'd end up only using the 32" in the middle of the display. I'm now using a 32" 4K display on my desk which is about the sweet spot for me, lots of real estate, and I can see all of it.
I also have a 43" 4K monitor, and I find myself being in the same position as you. The left/right edges are difficult to see. I don't have the issue with the mouse, but I also doubt whether a larger screen would be useful to me. As it is there's a corner that gets unused because it's just out of "eye shot" if that's a phrase. It is now, I guess :D
I use glasses (myopia) and can kind of tolerate the edges of my 32" 4k monitor, but I can't fathom craning my neck all the way up to the edges of a 55"+ display. Not to mention font sizes.
I have fairly bad eyesight with both myopia and astigmatism (-5 sph, -2 cyl) and I wear glasses. I got glasses with 1.71 index lenses, which I greatly prefer over the more common 1.74 index lenses due to the higher Abbe number, resulting in less chromatic aberration.
Anyway, I use browsers at 150% scaling usually, although the text is finer on my terminals. I don't use any scaling for UI elements and terminals. Using the i3 tiling window manager, I put more commonly used terminals on the bottom half of the screen since I find that the top half does require more neck craning.
FWIW there are lenses that are high index while still having a higher Abbe number, but they're expensive and made of pretty specific materials. Interesting that 1.74 is more common where you are; where I am, lower-index polycarb is the standard (sadly).
I had a 55" TV as my main display in 2022. Had it about a foot away from my face. It takes a few days, but your brain and body get used to the size.
I just bought a 39" ultrawide and for the first few days I thought "oh dear, I have to keep turning to see the whole thing," but I've not even thought about it for a couple of weeks now, so I guess I'm acclimated.
I have been using a 32" monitor for the last 10 years. I have found that I am using mostly the center of the monitor. The peripheral edges remain unused.
If I sit far from the monitor, then the FOV could be reduced, but then I have to increase the font size defeating the very purpose of maximizing screen real estate.
This is pretty much what I concluded as well after using my 43" 4K LG monitor for about 3 years. Lately I've been trying out my wife's 27" Apple Studio Display. It's smaller but the PPI is amazing...
You don't maximize windows except to watch videos at that size. It's more like having multiple monitors with fluid borders. You focus as needed, leaving the rest in your peripheral vision. That said I did miss maximizing windows to focus on tasks.
I use a combination of Aquasnap's magnetic border feature with MS Power Toys hotkeys and it has been a treat. Still room for improvement tho', esp. if I can force specific browser tabs into particular windows based on purpose.
Nice to see other people doing the same thing I do, albeit with a 4k OLED instead. I am waiting for an 8k OLED at an affordable price but it seems I will have to continue waiting.
What brand and model of desk do you have? I have a 48" TV but I sit rather close so it probably takes up the same field of view as your 65".
As to your last paragraph, if you email hn@ycombinator.com and explain the situation, they'll sort you out and sometimes put you into a second chance pool, as it's called.
I wish deep desks were more common! Modern ultrawide curved monitors sit way too close for comfort for me, due to the way their legs have to be angled further back for center of gravity. Custom desks end up being so expensive.
That's where a desk-mount monitor arm comes in. Even a high-end model capable of holding those 49" 32:9 monsters will likely be significantly cheaper than a custom desk.
I'm using a nice sheet of 4x8 finished plywood from the hardware store. I trimmed the depth down a bit, but not much. Put some edge banding on it, and stick it on top of a Flexispot or whatever other 4-legged desk frame you want to use.
It's a dense hardwood, near the top of those attributes on wood scales.
> Your suggestion seems oddly specific.
Hardwoods make great table tops, I've always had jarrah workbenches and general desktops and used other woods for 'fancy' tables .. but then I'm in W.Australia and used to recover treefall for slabbing in sheds and using in Brady Drums etc. (I knew Chris Brady back in the day https://www.youtube.com/watch?v=55SXxWz0Vpg)
> Is this available outside of Australia?
Significant tonnages of it were blocked, shipped to England as ballast and used to pave the streets - as a consequence quantities are still kicking about the UK after being recovered and repurposed.
How did you get it in a custom dimension? I'm almost tempted to just put two of my current desk back to back to make it deeper, would probably be much cheaper than 2k, but then again, they're not standing desks.
In case you're wondering whether this works on a Mac, like I did, I found this source[0]. In short, you need an M2 Pro or better, and may need to edit a plist file to get it to work.
A tad surprised that curvature isn't discussed? With such a massive screen, the distance from your eye to the middle of the screen, and eye to the corners, are very different - unless you sit far away. Your eyes thus need to change focus all the time. That's AFAIK why those ultra wide screens are curved - and I find that the more curve they have (smaller radius), the better it is. With such a massive screen, I guess it would be best if it was part of a sphere! (Curved both ways)
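To put rough numbers on the focus-distance point (assuming a flat 65" 16:9 panel viewed from about 1 m dead centre, which may not match the author's exact setup):

```python
# Rough eye-to-screen distances for a flat 65" 16:9 panel viewed from 1 m.
import math

diag_in, view_m = 65, 1.0
w = diag_in * 16 / math.hypot(16, 9) * 0.0254    # panel width  ~1.44 m
h = diag_in * 9 / math.hypot(16, 9) * 0.0254     # panel height ~0.81 m
half_diag = math.hypot(w / 2, h / 2)             # centre-to-corner ~0.83 m
to_corner = math.hypot(view_m, half_diag)        # eye-to-corner ~1.30 m
print(f"centre {view_m:.2f} m, corner {to_corner:.2f} m "
      f"({(to_corner / view_m - 1) * 100:.0f}% farther)")
```

That works out to roughly a quarter of a dioptre of accommodation difference between the centre and the corners, which seems consistent with the "changing focus" complaint.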
I recently acquired a 43" 4K monitor for programming - a very boring Philips monitor, used at 100% scale. I hated it at first, but after a month I loved it.
A 2160p actual 'workspace' resolution at this distance (2 feet?) and size (43") seems close to a practical limit for typical use, I thought; even with this measly 43" it still requires a little occasional head movement to see the top right corner. I noticed a tendency to sit slightly to the left of centre on this monitor, to avoid distortion and maintain clarity with what I'm focusing on (e.g. code/windows, not reference materials). Because of this I suspect that at this distance a 43" with a slight curve would be optimal, at least for me.
What I wanted to ask you:
- What is your 'workspace' resolution? Is it something like 6K? I'm guessing your scaling is either 125% or 150%? Your PPI should be around 135, mine 102.
- Are you actually sat perfectly centre? I was wondering this because I keep noticing I tend to gradually shift my keyboard to the left over a day. Maybe this is years of 1440p + side portrait monitor use, I'm not sure, but eventually I accepted that I prefer slightly to the left (odd because my side portrait was on the left...)
- Do you think a curved monitor at this size/distance would improve the ergonomics? I imagine you must get a bit of a neck workout.
After getting this monitor, I'm pretty much sold on single screens again - but I had to switch my window management from keyboard-based tiling shortcuts to 'hold CTRL and move mouse' window management (BetterTouchTool on MacOS), with a tendency to stack up windows messily. I tried custom resize snap zones with BetterSnapTool - but I don't use them. I think that was the biggest challenge to switch from multi monitor to large format. It's a huge benefit to have everything in your context on one screen, but had to rethink how windows get moved around. Now I'm used to it, I want CTRL/SHIFT + mousemove modifiers on every system to deal with windows.
Also related, I bought a 4K tv last weekend for another system to use as a monitor, but found that the gaps between the pixels were unexpectedly large, creating a strange optical effect at close distance, making it unusable (but so close). There might be something different about the screen outer layer (on most TVs?) that polarizes light in a way better suited for distance viewing, but clearly not all TVs have this issue.
The power is not so bad, especially compared to the graphics cards you would want to use (and I use my GPU as a toe warmer). Samsung 8K models specifically come with low-power presets which are probably usable in this scenario. Of course, with so many more pixels in 8K than in 4K there is a need for more power, but the EU regulation allows selling them if they also support an eco mode.
I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
> I am old enough to recall 100W as the typical single light bulb
I'm regularly in a museum where they showcase some of the 1800s/1900s wares of households in my area. One is a kerosene/gasoline powered clothes iron. Just because something was once common doesn't mean it was good.
> I still use an electric tea kettle that touches the multi kW range daily
How many hours a day is your tea kettle actually using multi-kW? The more useful comparison is how many kWh/day are these appliances using.
Fair enough, though I didn't live during kerosene times. My tea kettle uses 0.06 kWh per session, so it's one to two weeks of tea for me to reach the energy of a day on such a monitor (see my other comment for best guesses on the energy use of this monitor). On the other hand, a typical pasta recipe on an electric stove would be 2 kWh, so several days of use of such a monitor.
I realize my previous comment might have come across as more adversarial than intended. Sorry if you took it that way.
And yeah as your comment shows it's really kind of an odd comparison to make in the end. Ultimately I'm of the mind that if the 8K screen really gives you a lot of value then it's probably worth it. You're dealing with that energy cost, and ultimately it's up to society to properly price externalities into the energy costs. You can make the decision whether the energy costs are really offset by the extra value you get.
But like, an 8K screen does use a considerable amount more energy than say a 4K. For a bit back in the day people really started to care about energy use of CRTs as they kept getting bigger and fancier. Then LCDs came out and slashed that energy usage massively compared to an equivalent size. Practically negligible compared to what a decent workstation would use. Now we're finally back to the screen itself using a pretty big fraction of energy use, and IMO consumers should pay attention to it. It's not nothing, it's probably not the single biggest energy use in their home, but it might be around the same as other subscriptions they're questioning in a given month.
And yeah, in the end I think that energy metric should be based on how many kWh you end up using on it in a month or whatever your billing cycle is. Compare it to the value you get for other things. Is it worth a week of tea to run it for a day, cost-wise?
I had a period of time where I bought a car for $3k. I then valued every big ticket thing to the utility I got from a whole car. "That's like .75 Accords, is that worth it?" Kind of an odd way of looking at things but really put a lot of value into perspective.
The eco mode is not usable, it's the manufacturer's way around a ban of 8k monitors. These monitors use at least twice what other monitors of the same size use, sometimes it's four times as much. And these measurements are probably in eco mode, so it could be worse.
> I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
Not sure why you mention this here? Just because we had horribly inefficient light bulbs our monitors can use twice as much?
I’m guilty of this as well. Folks of a certain age will always tend to measure energy consumption in “light bulbs.”
Sort of like how Americans always measure size in “football (gridiron) fields.”
The energy consumption of a traditional incandescent bulb, while obviously inexact, is nonetheless a somewhat useful rough relative measurement. It is a power draw that is insignificant enough that we don’t mind running a few of them simultaneously when needed, yet significant enough that we recognize they ought to be turned off when not needed.
I always turn my monitors to the lowest possible brightness for long work sessions, so I assumed (perhaps mistakenly) that this eco mode would already be close to my settings out of the box, and if anything, too bright. Assuming 20c per kWh (California rates, mostly solar during the day) and one kWh per day (8 h at ~130 W average draw), much higher than the allowed EU limit and the eco mode, the monetary cost comes down to about 4 USD per month. So definitely not negligible, but also not a reason to avoid being able to tile 64 terminals if one wanted to do that.
[edit: the above estimate is almost certainly an upper bound on the energy I would ever use myself with such an item; I would be curious to measure it properly if/when I upgrade to one, and curious if the OP has a measure of their own usage. My best guess is that in practice I would average between 2 and 3 kWh every week (2 USD/month) rather than 5 kWh, because I tend to work in low light.]
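Spelling the guess out (every input below is an assumption from my comment above, not a measurement; counting only workdays is one way to land near the $4 figure):

```python
# The guesstimate above, spelled out. All inputs are assumptions, not measurements.
rate_usd_per_kwh = 0.20      # rough California daytime rate
avg_draw_w = 130             # assumed average panel draw
hours_per_day = 8
workdays_per_month = 21      # counting workdays only

kwh_per_day = avg_draw_w * hours_per_day / 1000              # ~1.0 kWh
usd_per_month = kwh_per_day * workdays_per_month * rate_usd_per_kwh
print(f"{kwh_per_day:.2f} kWh/day, about ${usd_per_month:.2f}/month")  # ~$4.4
```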
Yeah it does emit a bit of heat. I think around one or two hundred watts? I haven't measured it directly. I have a mini split air conditioner in my home office.
The comment above has very wrong numbers, by the way; typical consumption for the whole device should be around or less than what that poster claims is drawn just by the CPU!
What zoom (if any) do you typically run at? For instance, a 200% zoom would give you an effective resolution of 4K, but with much sharper and smoother text and rendered graphics.
I tried this a couple of years ago and had to ditch the TV because of too much input lag.
You mention input lag only once where you say:
> Although this post is mostly focused on productivity, most if not all 8K TVs can be run in 4K at 120 Hz. Modern TVs have decent input lag in the ballpark of 10 ms and may support FreeSync.
Have you measured this, or where did you get this number from?
The TV I bought was also advertised as low-latency, but I found it too high in practice (when typing in code, or scrolling, and waiting for the screen to update).
I don't know many Linux users doing 4k+ at 144hz. I am wondering if you do any screen capture or desktop recording, and if so what software you use and what your experience is like? I cannot reliably capture 4k/144hz with my setup but my desktop environment is still on X11. I tried KDE/Wayland and had a better experience, but run into other bugs based on their integration.
Just curious how your experience with sway has been. I installed it but wasn't expecting it to come with no config at all, and didn't really want to be bothered setting it up just to test screen recording.
The issue with X11 is that even if you record (using any software), it causes the display refresh rate to artificially drop, and it's a very bad experience overall when you run at 4K 144 Hz. Ultimately, the future is Wayland, but I am a little surprised how slow it has been for everyone to integrate it into their software.
Yes. It makes the experience much better when anything is moving. Hard to convince by words; it's a "try it and then go back to 60" thing to see what you're missing.
Similar to hard drive vs SSD. Before I used a machine with a SSD for the first time hard drives were fine, then my normal was conditioned to that of SSD speeds. Going back to hard drive speeds is painful, just like 60hz even for things like moving windows around the desktop.
Seems to be a rather subjective thing. Going from 4k to 1080p literally causes me headaches, going from 240Hz to 60Hz feels normal after a minute or two. Yes, it feels nicer, but that's it for the most part. Not something that makes me want to update the screen right now.
> Not something that makes me want to update the screen right now
That about sums it up.
I alternate between 120hz and 60hz monitors depending on where I’m working.
For software engineers: 120hz is “nicer”, and if you are buying a monitor today I’d say it’s well worth it to pay another $100 or whatever to go for 120hz+ versus 60hz. Certainly not worth upgrading for this alone however.
For designers: Same as above, but it perhaps leans a little closer to being worth the upgrade. The mouse cursor is noticeably much smoother, and if you’re doing digital painting or something all day, 120hz+ might really be worth the upgrade all by itself if budget allows. Working with the 120hz (or is it 240hz now?) stylus on iPad Pros is revelatory for that kind of work.
For gaming: For any fast action gaming (for games and platforms that support high frame rates) it really is worth the immediate upgrade. Your world now looks and feels fluid. It feels real. Input lag is usually halved as well.
They are useful for the same reason response rate is important -- motion blur and judder. Things look more crisp and move more fluidly across the screen.
It's slow because there is no singular Wayland, just 12 different Waylands that diverge because the primary standard is underspecified; it took 16 years for people to agree on functionality that everyone agreed was needed in 1999.
What you really need to match is the angular resolution in microradians from your eye. You can make any screen smaller by sitting farther back. That said, I do wish my TV was only 42". I guess if you really want the ppi to be exactly the same as a 27" 5K screen, then 27 * 7680 / 5120 = 40.5".
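The matching arithmetic, in case anyone wants to play with other sizes (resolutions are the published panel resolutions; the 218 ppi figure is the one usually quoted for the 27" 5K):

```python
# Pixel-density matching: what 8K diagonal gives the same ppi as a 27" 5K?
import math

def ppi(diag_in, w_px, h_px):
    return math.hypot(w_px, h_px) / diag_in

print(ppi(27, 5120, 2880))   # ~218 ppi on the 27" 5K
print(27 * 7680 / 5120)      # 40.5" -> same ppi at 7680 x 4320
print(ppi(65, 7680, 4320))   # ~136 ppi on a 65" 8K panel
```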
This is exactly the reason I intend to stick with 4k for now: I don't want a display that large. I currently have a 48" 4k display, and I'd prefer to have a 42" or 36" one. (Good choices are hard to find, though, particularly if you actually want 4k rather than ultrawide, want OLED, and don't want to just use a TV.)
I bought the Philips Evnia which fits perfectly into that category at 42". Despite being a gaming monitor it's not garish and I've grown to love the ambilight.
Do you use macOS with this, and if yes, do you often share your screen? I find large monitors unusable for screen sharing on macOS in general, as it will share a lot of blank space along with the window you want to share, making the window minuscule for anyone who does not have an 8K monitor like you.
Interesting. Time to buy a new TV or monitor for programming. Wonder which resolution and size to go for. I use a 4K 27" for programming and a super-wide for my FS2020.
Btw, I would use two different pairs of glasses: one when I use it as a TV or for playing FS2020/2024, and another when I sit close to use it as a programming station.
Hey, I asked you on the other thread as well (the iMac one), but this was my question:
Hey, I have a similar setup (https://kayg.org/uses) where I use an LG C1 48" as my primary TV and monitor. I do all work on it; however, I am unable to use tiling window managers as you recommend, because I always struggle to see windows / text placed above my eye level.
For that reason, I prefer to use manual window management solutions instead.
I am curious how you deal with that problem, one big-TV user to another? Or do you not have that problem at all?
thanks!
I did this same thing with a 50" 4k TV ... I get and it does work .... My biggest issue is the tv brightness levels even at low were waaaaaaay too bright... I was using LCD ... Is oled better for this????
> There is also a Dell UP3218K, but it costs the same as an 8K TV and is much smaller and has many problems. So I do not recommend it unless you really don’t have the desk space. Sitting further back from a bigger screen provides the same field of view as sitting close to a smaller display, and may have less eye strain.
I've recently swapped out my dumb TV with a smart TV. The choice to go smart after clinging on to my dumb TV + old-school Chromecast was only motivated by advances in display tech. In retrospect the smart TV is a considerably worse experience UX-wise than the dumb TV + Chromecast. The built-in Chromecast in the new TV requires the TV to be logged into accounts for all the "apps" that the TV has. I can no longer just cast something from any device connected to my network and have it "just work" like it did before.
I know in this case you're working with HDMI and hopefully have managed to set the TV up to just display an HDMI output on bootup, but did you run into any of these infuriating "smart" TV things?
I was previously working at a lidar company and now I am working at a robotics company providing calibration and localization software to customers using a combination of lidars, cameras, and other sensors.
You COMPLETELY missed the elephant in the room: 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!) Think 8 cores of the fastest arm 64-bit processors available plus extra hardware accelerators! They need this extra processing power to handle the 8K television load, such as upscaling and color transforms - which never happen when you are using them as a monitor!
So, 8K TVs are a big energy-suck! There's a reason why European regulations banned 100% of 8K TVs until the manufacturers undoubtedly paid for a loophole, and now 8K TVs in Europe are shipped in a super-power-saver mode where they consume just barely below the maximum standard amount of power (90w) ... but nobody leaves them in this mode because they look horrible and dim!
If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
Anecdotally my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W total --- maybe 100-200 W.
I run my screen on a brightness setting of 21 (out of 50) which is still quite legible during the day next to a window.
Also, I have solar panels for my house (which is why I'm able to see the total power usage of my house).
The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.
It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's into the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.
I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable. I also only run a single 4k monitor so haven't thought about driving 4x the pixels recently.
That's a facially absurd statement. Just on the numbers:
The US consumes 500 gigawatts on average, or 5000 watts per household.
So if every household bought an 8K TV, turned it on literally 100% of the time, and didn't reduce their use of their old TV, it would represent a 10% increase in power consumption.
The carbon emissions from residential power generation have approximately halved in the past 20 years. So even with the wildest assumptions, it doesn't "throw away all the progress we've made on Global Warming for the past 20 years ...".
To put it in perspective, an electric car might need 350 Watt-hours per mile. A 10-mile drive would use 3.5 kWh. That's equivalent to about 24 hours of using that monitor at normal settings, or about 8 hours at maximum brightness.
The comparison doesn't make sense, though, because if you drove to the office you'd still be using a monitor somewhere. A single 4K monitor might take around 30-40W. Using four of them to equal this 8K display would come in right around the 139W typical power consumption of the 8K 65" monitor.
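Putting those numbers side by side (the 139 W / 429 W figures are the RTINGS measurements quoted elsewhere in this thread; the EV and 4K-monitor numbers are the rough assumptions above):

```python
# The comparison above, worked through. All figures are the rough numbers
# quoted in this thread, not my own measurements.
ev_wh_per_mile = 350
drive_miles = 10
monitor_typical_w = 139       # RTINGS "typical" figure for the QN800A
monitor_peak_w = 429          # RTINGS peak figure
four_4k_w = 4 * 35            # four ~35 W 4K monitors covering the same pixels

drive_kwh = ev_wh_per_mile * drive_miles / 1000                  # 3.5 kWh
print(drive_kwh / (monitor_typical_w / 1000))   # ~25 h of typical use per drive
print(drive_kwh / (monitor_peak_w / 1000))      # ~8 h at peak brightness
print(four_4k_w)                                # 140 W, about one 8K panel
```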
There's no "fixed budget" of energy that is ethically ok to use. The parents point was that these devices are woefully inefficent no matter which way you look at them.
The "best" thing to do would be neither, and is usually to just use the device you have - particularly for low power electronics as the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365
> There's no "fixed budget" of energy that is ethically ok to use.
Not even 0.00001 W? How is it ethical to live in the first place in such case?
> The parent's point was that these devices are woefully inefficient no matter which way you look at them.
It's always a trade-off of productivity and enjoyment vs energy efficiency, isn't it?
If I find a setup that allows me to be more productive and enjoy my work more, certainly I would need to balance it with how much potential waste there is in terms of efficiency.
> The "best" thing to do would be neither, and is usually to just use the device you have
That's quite a generic statement. If my device is a budget android phone, do you expect me to keep coding on it, not buying better tools?
> You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!)
RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Your numbers aren't even close to accurate. 8K TVs do not have 200W CPUs inside. The entire Samsung QN800A uses less power during normal operation than you're claiming the CPU does. You do not need as much power as a mid-range GPU to move pixels from HDMI to a display.
> There's a reason why European regulations banned 100% of 8K TVs
This is also incorrect. European regulations required the default settings, out of the box, to hit a certain energy target.
So large TVs in Europe (8K or otherwise) need to come with their brightness turned down by default. You open the box, set it up, and then turn the brightness to the setting you want.
> until the manufacturers undoubtedly paid for a loophole
This is unfounded conspiracy theory that is also incorrect. Nobody paid for a loophole. The original law was written for out-of-the-box settings. Manufacturers complied with the law. No bribes or conspiracies.
> If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
The Samsung QN800A 8K TV the author uses, even on high settings, uses incrementally more power than other big screen TVs. The difference is about equal to an old incandescent lightbulb or two. Even if everyone on Earth swapped their TV for a 65" 8K TV tomorrow (lol) it would not set back 20 years of global warming.
This comment is so full of incorrect information and exaggerations that I can't believe it's one of the more upvoted comments here.
> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Can you explain why a TV's power fluctuates so much? What does peak load look like for a TV? Does watching the NFL draw more power than playing Factorio?
Power consumption varies significantly based on what's being displayed, on top of brightness settings.
I have a 42" 4k LG OLED. With a pure black background and just a taskbar visible (5% of screen), the TV draws ~40W because OLED pixels use no power when displaying black.
Opening Chrome to Google's homepage in light mode pulls ~150W since each pixel's RGB components need power to produce white across most of the screen.
Video content causes continuous power fluctuation as each frame is rendered. Dark frames use less power (more pixels off/dim), bright frames use more (more pixels on/bright).
Modern OLEDs use Pulse Width Modulation (PWM) for brightness control - pixels switch rapidly between fully on and off states. Lower brightness means pixels spend more time in their off state during each cycle.
The QN800A's local dimming helps reduce power in dark scenes by dimming zones of the LED backlight array, though power consumption still varies significantly with content brightness. It's similar to OLED but the backlight zones are not specific to each pixel.
Dark mode UIs and lower brightness settings will reduce power draw on both QLED and OLED displays.
Traditional LCDs without local dimming work quite differently - their constant backlight means only brightness settings affect power, not the content being displayed.
This explains those power fluctuations in the QN800A measurements. Peak power (429W) likely occurs during bright, high-contrast scenes - think NFL games during a sunny day game, or HDR content with bright highlights. For gaming, power draw is largely influenced by the content being displayed - so a game like Factorio, with its darker UI and industrial scenes, would typically draw less power than games with bright, sunny environments.
I was under the incorrect impression that the power consumption would be related to the rendering of the image (a la CPU/GPU work). Having it related to brightness makes much more sense.
To be fair it's not the energy that you're concerned with; it's the source of that energy.
Private jets can't run off nuclear power grids. Also the real problem-child of emissions is not America. China has a billion more people, what are their TVs like?
Good points. I would go further and say it is the integral of emissions over time that we would be most concerned with. From that perspective, over the last 200 years, there are long-standing problem children as well as rising ones.
> The average American household uses about 29 kilowatts of power per day (29,000 megawatts).
Ignoring the megawatts error that the sibling pointed out, it's 29 kilowatt hours per day. Watts are a unit of power consumption -- joules (energy) per second.
One kilowatt hour is the energy used by running something at 1,000 Watts for one hour.
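Or, as a quick sanity check with the corrected figure:

```python
# 29 kWh per day, expressed as an average power draw and in joules.
kwh_per_day = 29
avg_kw = kwh_per_day / 24            # ~1.2 kW average household draw
megajoules = kwh_per_day * 3.6       # 1 kWh = 3.6 MJ, so ~104 MJ per day
print(f"{avg_kw:.2f} kW average, {megajoules:.0f} MJ per day")
```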