
I've probably said this a bunch of times already, but based on my past experience, any analysis built on month-to-month changes in the Steam Hardware Survey should be taken with a very large grain of salt, if not considered outright useless for any serious conclusions.

The clue is already in the article itself. The author notes that "part of the jump at least appears to be explained by Valve correcting again the Steam China numbers." If you actually think about what that implies, it raises more questions than answers. A 31.85% monthly drop is obviously not organic, so yes, it makes sense to call it a "correction." But then why was the previous month's data so far off in the first place? Is there something fundamentally flawed in the survey methodology, like sampling bias, non-uniform distribution, regional skew, or something else?

And if this kind of correction happens this month, what's stopping it from happening in previous months? The reality is: it does happen all the time. You can usually spot at least one clearly unrealistic data point in almost every release.

At that point, it's hard to argue there's any real value in trying to analyze these results in a rigorous way.


The explanation I've heard is simply: Chinese New Year happened, which means a lot more Chinese gamers were online in February during the week-long national holiday.

It happened in last year's March stats too: https://web.archive.org/web/20250404061527/https://store.ste... -25%


How does a Jan/Feb holiday affect this year’s March number (that was reported in early April)?

I’m not talking about the Feb number that is reported in March.


The number being discussed is not March alone, but the percent change. So March's number relative to February's number.

A holiday that warps numbers in February will no longer be warping things in March.
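To make the arithmetic concrete, here's a toy sketch with invented numbers showing how a one-month distortion shows up in two consecutive month-over-month deltas:

```python
# Hypothetical numbers, purely illustrative: a one-month sampling anomaly
# distorts the month-over-month delta twice, once going in and once going out.
jan_share = 3.5   # reported Linux % in January (made up)
feb_share = 2.23  # February, depressed by an oversampled cohort (made up)
mar_share = 3.5   # March, back to baseline after the anomaly passes

feb_delta = feb_share - jan_share   # looks like a collapse
mar_delta = mar_share - feb_share   # looks like a surge, with zero organic growth

print(f"Feb change: {feb_delta:+.2f} pts, Mar change: {mar_delta:+.2f} pts")
```

So a single warped month manufactures both a "drop" and a "jump" headline.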

Of the publicly available sources, I think Cloudflare's Radar is one of the better ones. Silver linings of having such a wide dragnet on the internet. It puts Linux market share at 3-4%, with some regional variance.

https://radar.cloudflare.com/explorer?dataSet=http&groupBy=o...

Fun tidbits: Finland is at ~10% (!), and Germany at 6.3%.


This was probably a lot more true in the past, but Linux users tend to be more privacy-conscious and do things like spoof their user agent, so this is almost certainly an undercount. You basically used to have to do this to browse the web before Firefox became one of the dominant browsers.

I don't know anyone who goes through the trouble to spoof their user agent, and I know plenty of Linux users.

Unfortunately I have to use some government websites which refuse to work when my user agent contains "Linux x86_64". So I just always spoof it.
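For what it's worth, a spoofed UA is nothing exotic: it's just a request header the client chooses to send. A minimal Python sketch (the Windows UA string and the URL are made-up but typical values, and nothing here touches the network):

```python
# The User-Agent is just a header the client picks; servers have to take
# your word for it. Constructing the request does not send anything.
from urllib.request import Request

# A plausible (hypothetical) Windows Firefox UA string:
WINDOWS_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:140.0) Gecko/20100101 Firefox/140.0"

req = Request("https://example.gov/form", headers={"User-Agent": WINDOWS_UA})
# The server would never learn you run "Linux x86_64":
print(req.get_header("User-agent"))
```

In Firefox itself the equivalent is setting `general.useragent.override` in about:config.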

This is the reality - most people won't spoof until they figure out it's the way to make a specific site work; and then they'll likely spoof for everything.

I'd also like to add that we forget that we're doing it, or at least I do. Once you set something up like that, there's never any reason to get rid of it; nobody is positively discriminating towards Linux.

I love when a ruleset (firewall, for example) has a "comments" field because I inevitably forget why I added something and then Chesterton's fence means I leave it forever, lest I spend hours a year later wondering why something broke.

Every time I try to change my user agent with a FF extension I get hit with brutal cloudflare captcha loops. How are you changing your user agent in a way that this is not a problem?

The archwiki Firefox privacy guide comes to mind, which mentions UA spoofing:

https://wiki.archlinux.org/title/Firefox/Privacy


Actual reason: SBC retro handheld consoles now run Linux and people are using them to play Steam indie games. The China holiday had some blowout pricing.

Non-primary devices are more likely to run Linux. Primary is still Windows.


I do, to access YouTube TV on my Ubuntu HTPC.

Tons of people did and do this to get higher resolution on a certain streaming site.

If you spoof user agent, you will get more captchas because it won't match their other fingerprinting.

You also get more captchas because you are on Linux; I see the Cloudflare one on my computer every time.

Used to be worse. Something happened in the last year and I'm seeing way, way fewer random captchas for regular use from a residential IP. In '22-'24 it used to be extremely common; now it's an event when it happens. Also, I went from Mint to plain Ubuntu, so that might have something to do with it?

It's a good thing too, because when I see the Cloudflare captcha I try it once and if that doesn't work then I just close the tab and add it to the list of non-functioning websites.

Cloudflare captcha = infinite loop of captchas (if it doesn't work on the first try). You can give up the moment that happens, because you will never get to the website itself.


Privacy-minded Linux users probably also know that spoofing your user agent is likely to increase fingerprint entropy and actually decrease privacy. It may have been true in the past, but I don't think anyone even recommends it anymore.

There are still plenty of websites that check the OS, and if it's not macOS, Windows, or Android, it's no service for you. Faking your UA is not always about privacy; it's about defeating stupidity.

You should only do this on websites that actually require it otherwise you're almost certainly going to cause more problems than you'll solve.

Messing with the UA header is going to get you flagged by every bot detection tool because when you change your header from "Firefox on Linux" to "Chrome on Windows" your fingerprints don't add up anymore and you look exactly like a poorly written bot. You're likely going to see more captchas, you might get blocked or rate limited more often, and get placed under increased scrutiny, orders held for verification, silently filtered or shadow banned, etc.


The only websites that really do this anymore are ones that are delivering native code for those platforms or those that require DRM that only work on those platforms.

Even when that is the case (which is a minority of the time), just because I'm using Linux doesn't mean that I don't want to download some Windows software.

But well, I haven't had to spoof my browser's UA for a few years. If some site refuses it, I'll just move on. (Including some that started doing it after I bought thousands of dollars' worth of stuff from them.)


I'm sure there are some, but having used Linux for 32 years, it's been at least 20 years since I needed to do that.

Browser, yes, but OS? Rarely. I have issues with Firefox, but I've never had Chromium not work either.

In any case, it would be silly to assume services measuring OS popularity would put up such limitations. And more likely than not, people change their UA as a workaround on a case-by-case basis rather than making it a default, since that's gonna cause trouble.

In the last decade, the only time I actually had to touch the UA was when breaking ToS with curl :D


Actually that sounds like exactly the sort of nuanced reality that “privacy-conscious Linux users” aren’t that likely to know at all.

The EFF's "Panopticlick" paper was published in 2010 [1]; together with the Firefox/Tor research, that knowledge became mainstream. That's why privacy guides don't recommend it. The Arch wiki linked above has this warning in bright red:

> "Changing the user agent without changing to a corresponding platform will make your browser nearly unique."

Sorry, I am not sure, if arguing about nuanced reality is the battleground, where I see you thriving.

[1] https://coveryourtracks.eff.org/ (browser test since 2014)
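The arithmetic behind that warning is simple surprisal: a trait shared by a fraction p of users contributes about -log2(p) bits of identifying information, so a UA/platform mismatch that almost nobody else has makes you stand out. A sketch with made-up fractions:

```python
import math

# Panopticlick-style back-of-envelope: a trait seen in a fraction p of
# browsers contributes -log2(p) bits toward identifying you uniquely.
def bits(p: float) -> float:
    return -math.log2(p)

common_ua = bits(0.10)     # a UA string 1 in 10 browsers send (made up)
rare_combo = bits(0.0001)  # a mismatched UA/platform combo, 1 in 10,000 (made up)

print(f"common UA: {common_ua:.1f} bits, rare combo: {rare_combo:.1f} bits")
```

The rarer the combination, the more bits it leaks, which is exactly why a spoofed UA on an otherwise-Linux fingerprint backfires.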


Overall agreed. I think a more interesting look at this is the tracker which GamingOnLinux keeps (not yet updated with the new numbers as of writing), where they also have one graph that shows usage among only English speaking users. Overall it is trending upwards, and English Linux Steam users are approaching 9%.

https://www.gamingonlinux.com/steam-tracker/


Yes, this is the key. You can account for anomalies such as Lunar New Year, do rolling averages, and apply other statistical modelling to show trends.

Saying that one source of data should be discarded because it contains nuance is... a take.
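For instance, a rolling mean is a one-liner that damps a one-off anomaly like the holiday dip (the monthly shares below are invented for illustration):

```python
# A simple trailing rolling mean: each output is the average of the
# current month and the (window - 1) months before it.
def rolling_mean(xs, window=3):
    return [sum(xs[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(xs))]

shares = [3.3, 3.4, 3.5, 2.23, 3.5, 3.6]  # hypothetical monthly Linux %
smooth = rolling_mean(shares)
print([round(s, 2) for s in smooth])
```

The single-month outlier still dents the smoothed series, but the trend survives instead of being dominated by one bad data point.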


The key word in the article is "again"

'Valve correcting again the Steam China numbers.'

This is seemingly a common problem with the Steam Hardware Report, with Chinese users being erroneously represented. It constantly gets fixed, although it takes a bit. It could be that the hardware surveys are sent out at a different time compared to the rest of the world, then combined in the following month.

This is proven by "Ended 2025 at around a 3.5% marketshare, dipped a bit in January, and fell to 2.23% in February."


The other aspect I find interesting is the February spike in Win10 usage, presumably from Chinese users. Where will they migrate over the coming years as support goes away? They seem to be resisting both Win11 and Linux, perhaps because it's not suitable for the (online?) games they play, or not great for Chinese users, or perhaps, along with the Nvidia spike, because they get more out of those GPUs on Windows.

Why are Chinese users such big Windows fans in the first place?

It's because of the Chinese user influx during their holiday season. Valve is not correcting anything; they are just showing the data. As usual, Phoronix is misinterpreting what they're looking at.

It's about oversampling. Due to how the survey is sent, a massive influx of machines coming online all at once will be more likely to trigger the survey. They know the general composition of their users, so they need the survey to be around the ballpark of that.

They are still only reporting the data they see. They are not correcting or manipulating the data like phoronix implies in their article.
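A toy model of the oversampling effect (all rates and counts are invented; the survey's real mechanics aren't public): if survey invitations land roughly in proportion to machines online, a holiday influx in a low-Linux region drags the reported share down even though no Linux user went anywhere.

```python
# Assumed model: the survey samples in proportion to machines online,
# so one region's influx dilutes every other region's share.
def simulate(china_online: int) -> float:
    """Return the Linux share (%) the survey would report."""
    rest_online = 100_000          # machines online elsewhere (made up)
    linux_rate_rest = 0.04         # 4% Linux outside China (made up)
    linux_rate_china = 0.001       # 0.1% Linux in China (made up)
    total = china_online + rest_online
    linux = china_online * linux_rate_china + rest_online * linux_rate_rest
    return 100 * linux / total

normal = simulate(20_000)    # ordinary month
holiday = simulate(120_000)  # holiday influx triggers far more surveys
print(f"{normal:.2f}% vs {holiday:.2f}%")
```

Under these made-up numbers the reported Linux share drops by well over a point purely from the sampling mix, no actual migration required.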

unpopular opinion: this can be explained by the social and monetary economics of the gaming ecosystem as a whole.

- Microsoft has worked tirelessly to make the Windows compute experience an ever more intrusive and soul-crushing experience for the average gamer. Artificially outmoded hardware at a time of GPU scarcity means consumers can't comply with Redmond's increasingly arbitrary hardware edicts even if they wanted to. At the same time, Linux has become ever easier to install and use as an alternative. There is likely an inflection point for a lot of gamers who are just looking to access their library.

- Console gaming has become hideously overpriced. Mandatory tie-ins with PlayStation Network, high costs for all consoles, and the potential for console stock to simply not be available at time of release make for a frictional and frustrating experience. Microslop is embracing the same PlayStation-style enshittification that routinely brings Sony to its knees. Neither juggernaut seems genuinely interested in the end user, with the exception of Nintendo, whose quality control issues and Switch hardware pricing make it a nonstarter for anyone but the most diehard Zelda fan.

- Steam + Linux offers a largely seamless experience for the casual gamer. Steam sales are fun and engaging. The community is generally well rounded. Gabe Newell is generally well respected by gamers and visibly interested in gaming and the community. Valve has contributed significantly to Linux since their push to obliterate the Windows Store and shows no sign of retreating anytime soon. Steam + Linux is free and works with your existing hardware in a time of high prices, inflation, and scarcity in the western world.


This time it's different.

Linux was already stable enough 10 years ago as a daily driver; I used Arch.

Everything worked just fine; I remember only having issues with graphics drivers and glitches.

I never really wanted anything more from it, but when I moved to Mac, I saw how it prevents me from opening apps I downloaded from a trusted site; every now and then I need to set xattr to open the files and go through a bunch of lockdowns.

Now FreeCAD has improved so much; with all the AI coding, all open source will improve DRASTICALLY and very fast.

Using AI, which stole everyone's code, to develop open source is the morally right thing to do vs. using it at private companies. It will attract more devs.


I have tried Arch btw

> any analysis built on month-to-month changes […] should be taken with a very large grain of salt

Agreed.

January and February are school vacations in South America. The whole month. Kids have a lot more free hours to tinker and play video games. That might not be the cause of the spike in this particular case, but there's probably dozens of similar random facts that can affect statistics on any month in unexpected ways.


I filled out the survey yesterday and it didn't detect my dGPU. There's no way to correct the entries, either.

Agree the numbers are not set in stone, but there is absolutely no denying that the Linux userbase has increased.

Proton's updates are a game changer; Windows 11's absolutely garbage buggy slop is frustrating more and more people. OSes like CachyOS, Bazzite, etc. are making the transition far more approachable than ever.

The future is bright.


Really happy to see this kind of analysis on HN. The news you want to hear the most must also be looked at critically, and as much as I love Linux gaming we want to be sober in our expectations.

Even if it wasn't for corrections, one has to look at the longer trends and not just single months.

Loads of people switch to Linux, but I do wonder how many are still there a year later? I say this as someone who's been a Linux daily driver since about 2010.


> Even if it wasn't for corrections

> Loads of people

This is all fine (and might even be true) but not having to fill in the gaps with anecdotal data and wishful thinking is precisely what good statistics are for. Bad statistics, on the other hand, make for a bad conversation starter because everyone is confused and it gets worse from there.


> Loads of people switch to Linux but I do wonder how many are still there a year later?

Everyone who bought a gaming PC last year, only to be told it has to be scrapped now because Windows 11 doesn't like the colour of the power cable.


I mean you make good points and all, but on the other hand I really want this to be the year of the Linux desktop, so I'm gonna go with the other interpretation anyway!

Well, if it's any indication, my sister, who is very much not a tech person, randomly asked me to help her install Linux Mint a month ago, and has been using it successfully since without needing to ask for help once (at least not from me, I suspect ChatGPT is getting a workout).

That felt like an indicator to me. I only switched to Linux a year or two ago and haven't mentioned it to her once, so she got the idea from somewhere else, and had enough impetus from whatever she disliked about Windows to actually go through with the change. If I was in marketing at Microsoft I'd be shitting myself over that, assuming Windows even still fits into their long term plans somehow. It's one thing for 100,000 techies to preach Linux across the web, but if random normies start using it without fanfare, that's real change.


There's a tipping point and we may be getting close. A few of my friends' kids & nephews have recently switched. Now that Valve seems to have solved the gaming compatibility issue, what Windows only software is left for teenagers to use that OS?

Also, regardless of what you think of LLMs, it makes tech support for Linux a whole lot more accessible to the average person. There is going to be less of an expectation now that you need to have a Linux guru on speed dial for the occasional weird edge case situation.

> but if random normies start using it without fanfare

From the normies I know, they only vaguely know what chatgpt is and sure don't use it.


To give you a single data point, I've finally committed to Linux on my desktop machine at home (I posted in another comment on this thread regarding my sim setup; that's another issue). On the desktop machine I installed Steam and Proton, downloaded a few games from my library, and they just worked on install, no stuffing around at all, no searching the web for fixes to get things going. It's probably been 6 years since I last tried it, and back then pretty much every game needed _something_ to be done to get it working. The level of technical knowledge required now is minimal, so maybe 2026 is the year of Linux.

The one caveat was that Ubuntu 24.04 LTS still didn't recognise my Xbox wireless controller out of the box, and I needed to get xone, compile it, and install the driver. A minor inconvenience, but something that would be beyond my daughters or wife. I've since moved back to Debian, but already armed with that knowledge, so it wasn't any kind of surprise.

The next step will be to migrate my work machine, but that one is more difficult because the primary dev is in Delphi, so it'll probably be a case of Linux on the hardware and VirtualBox running a Win10 VM to do compilations. The other parts of the job are basically all OS-independent Python dev, so no problem there... although I will miss Toad for Oracle.


There is value in the gaming-specific distros since they already include stuff like controller drivers. I installed Bazzite on my desktop, which I have plugged into the TV, and it's been every bit as seamless as the Steam Deck. It boots directly into Steam Big Picture mode and I can do everything with my Xbox controller.

Bazzite is an immutable OS, which is absolutely the future of Linux. Your install will never break on updates: rather than a normal update migration process, it simply boots the next version of the OS image, and if that doesn't work it just reverts to the old image, where you can wait for the bug to be fixed before updating again.


One of the straight-up benefits of TV gaming that Bazzite (and presumably any KDE environment, but it's been a bit since I used another) has over Windows is that you can label your Bluetooth devices. I have blue controller, pink controller, white controller, damaged white controller. 90% of my gaming is local multiplayer games and I switch between an actual PS5 and PC, so this is super useful.

Can't do it in Windows 11 for some reason. No option to label them in the new settings app and the option to label them in the old control panel does not work. They all got saved as "Dualsense Controller" and you just had to guess which one you were reconnecting.


I've been using an immutable Linux for the last year or so, and it's gone quite well, but not without pain points.

There's a lot of stuff that I do which does not have a flatpak or package baked in. To get around this, I've been using distrobox to run these things in Ubuntu containers. So I will do "distrobox enter sdr" to have a terminal open up in that environment. You can export applications so that they show up in the applications list. It really takes some experience to shift your mindset, but it was worth it for me.


I agree that development sometimes takes extra steps, but honestly setting up dev environments almost always takes too many steps anyway lol. Overall it's worth it for the stability.

> I needed to get xone and compile it and install the driver, a minor inconvenience,

Call me nitpicky, but this is why the Linux desktop is not ready yet. If anything, I'm a firm believer that SteamOS will be the Linux desktop.


I agree, but I'm sure Bazzite/CachySteamOS all have support for them on boot.

Bazzite KDE picked up my 8BitDo controller immediately, with no prior configuration. I didn't even have to manually pair the Bluetooth. I was very impressed.

> CachySteamOS

Where can I find more information on that? I use CachyOS but never heard of that. Googling didn't find a single result (surprisingly, not even your comment)


I think OP missed a slash

yeah, sorry

Yeah, I think for the not-so-tech-savvy gamer there are better distros than Ubuntu. Ubuntu (and Debian) tend to lag behind the cutting edge a bit too. For such users I'd probably recommend Fedora (or one of its variants), or just straight-up SteamOS.

As a Fedora user, I would actually recommend Ubuntu for gamers new to Linux, just because companies that offer Linux builds tend to only support Ubuntu. It's a bit more work comparatively to get to smooth sailing on Fedora. I think that work is worth it, of course, but new users might beg to differ.

I tried cachy, but I decided I hate the kde plasma environment, I should have chosen some other window manager but wanted to try the recommended one

there is also something to be said, negatively, for the number of distros now, cambrian explosion since the good old days of slack, deb, redhat, suse lol


I honestly believe one of the main, highly supported Distros like:

Debian, Arch, Fedora, Gentoo, Ubuntu, Nix, etc. are all better choices than Cachy, Manjaro, Bazzite, or whatever other niche distro exists.

I commonly find myself running into weird issues that I would never have run into otherwise. Bazzite, for example, opens Steam on boot by default. This caused my game drives to not be mapped in Steam. (I assume Steam somehow started before my drives were properly mounted.) I helped my friend for hours troubleshooting his fstab config, rebooting, etc., but then realized it was just a default that he never set.

He quit Linux because of this (and some other minor gripes) and I don't think the gaming distros do much to properly help.


Doesn't Cachy support all of the DEs? Use it to try them all. (I don't know how CachyOS handles it; EndeavourOS lets you pick the DE on login.)

yeah, on install you select a window manager, I didn't bother trying any others, just opted to go to Debian instead

It has been achieved with WSL on Windows, and Virtualisation Framework on macOS.

Other than that, I am still waiting for the day I can buy a Dell, Asus, or HP laptop at Media Markt or FNAC with GNU/Linux pre-installed and 100% of the hardware supported.


A lot of people don't even have a computer and do everything on their phone. Given Android's market share, one could argue Linux is already present on most desk tops and therefore has already won.

I always thought "opt-in" (not "opt in") meant something you have to actively choose to enable; otherwise, it stays off. So calling something "opt-in by default" sounds like a misnomer to me.

But English is not my first language so please correct me if I'm wrong.


You are correct

The CSS doesn't even load here:

https://trustcompliance.xyz/_next/static/chunks/17psh0.nytnh...: 404 Not Found


Just curious: why does your otherwise neatly formatted table use a different format for the last row?


This question should be addressed to Wikipedia, from which I copied the data.


The majority of laptops work "pretty well out of the box".


Not with Linux, typically. If you don't have drivers included in the kernel, it requires a lot of effort to get things working. I've done it many times, so now I will generally only buy laptops that have decent Linux support. [1]

I've had the laptop for about two years now and it still runs just as well as the day I bought it. I'm very happy with it.

[1] No I will not stick with Windows. Please feel free to read through my comment history to see why, but TL;DR I just don't like it.


I've had linux on every laptop I've owned for years, and I haven't really had a problem with any of them running linux, except for display port support on a dell xps.

Aside from that one dell laptop, though, I generally avoid HP and dell entirely, so perhaps that's why.


In 2013 I bought a laptop that I kept five years that had an Nvidia Optimus.

I never really figured out how to get the discrete card working consistently, and since then I haven't bought a laptop with an Nvidia card.

I've had issues with wifi cards and sound drivers and the like as well, though it's going a lot better now than it was a decade ago.


Weird. I must have uncommonly good fortune, as I don't think I've had Wi-Fi or sound issues for longer than that. I remember having some sound issues when I first tried out swaywm, because I was also moving from PulseAudio to PipeWire at the time, but nothing from an out-of-the-box install of a decent distro.


I urge you to try HP.


^ This comment is more relevant than people might think. HP regularly deploys broken BIOS updates and literally bricks your laptops. In 2023 it happened, I think, 7 times, once even in the very next week after the previous update. Our IT got so fed up that they ditched all HP laptops because of it.


Never update your BIOS unless you have a specific bug that needs fixed.

I remember a Thinkpad BIOS update ended up destroying both undervolting and overclocking, and required a "chip-clip" programmer to revert.


That advice doesn't hold up very well when in recent years we've had multiple instances of a BIOS update being necessary to deal with the problem of "the CPU gets fed too high a voltage and dies prematurely". That's happened to both Intel and AMD desktop CPUs.

It's a real problem that BIOS updates for consumer systems never come with a meaningful changelog, so evaluating whether a particular update is a good idea or not is basically impossible.


I would strongly advise against buying HP laptops if you want to install Linux. MX Linux worked well on my pre-owned HP, and Zorin OS worked well too, but somehow I could not install antiX, and HP's Secure Boot troubled me too much. I could install OpenBSD on it, but each time I restarted it would kernel panic and I would have to reinstall. Combined with a long holiday when I left it at home, my HP is now practically bricked. It won't even start.


That advice holds up very well when taken along with "don't buy the very first major release".


I built a tower several years ago and it had CPU temp issues from the start. I RMA’d the cooler, reapplied the thermal paste a couple times, reassembled the whole build, etc. It wasn’t my main machine, but every time I sat down to use it the CPU would run hot and thermal-throttle. It’s an i9 with P/E cores, so I just chalked it up to Linux power management woes. A couple months ago I was on the brink of selling it for parts, but updated the BIOS as a Hail Mary. Totally fixed it.

I guess I did “have a specific bug that needs fixed”; I just didn’t know it!


People don't have a choice to update their BIOS, as updates like this are automatically installed, by both Windows and the underlying Intel ME tools.

(And I'm trying to avoid talking about microcode updates, which is a whole other story of fuckups)

Regarding Thinkpad BIOS: I have a Raspberry Pi Zero and a self soldered RP2040 programmer [1] in my travel kit for a reason. When travelling, a lot of the Cellebrite rootkits rely on an OEM BIOS, so they typically reflash your BIOS in the "we gonna check your laptop" phase.

[1] would totally recommend serprog, it's awesome: https://codeberg.org/Riku_V/pico-serprog


Most laptop BIOS updates are now for CVEs and other security fixes, in my experience. You don't have much choice but to upgrade.


These are for "security" against the user, to be fair.


Something related to this article, but not related to AI:

As someone who loves coding pet projects but is not a software engineer by profession, I find the paradigm of maintaining all these config files and environment variables exhausting, and there seem to be more and more of them for any non-trivial projects.

Not only do I find it hard to remember which is which or to locate any specific setting, their mechanisms often feel mysterious too: I often have to manually test them to see if they actually work or how exactly. This is not the case for actual code, where I can understand the logic just by reading it, since it has a clearer flow.

And I just can’t make myself blindly copy other people's config/env files without knowing what each switch is doing. This makes building projects, and especially copying or imitating other people's projects, a frustrating experience.

How do you deal with this better, my fellow professionals?
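Not a professional prescription, just one lightweight pattern: declare every environment variable your project reads in a single module, each with a default and a comment, so the whole config surface is greppable in one place (all names below are invented):

```python
import os

# One place to see every knob the project reads, what it defaults to,
# and what it means. Grep for "MYAPP_" and you have the full config surface.
def env(name: str, default: str) -> str:
    return os.environ.get(name, default)

DB_PATH = env("MYAPP_DB_PATH", "data/app.db")      # SQLite file location
LOG_LEVEL = env("MYAPP_LOG_LEVEL", "INFO")         # DEBUG / INFO / WARNING
CACHE_SIZE = int(env("MYAPP_CACHE_SIZE", "128"))   # entries, not bytes

print(DB_PATH, LOG_LEVEL, CACHE_SIZE)
```

It doesn't remove the config, but it answers "which is which, and where does it live?" with a single file you wrote and commented yourself.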


Software folks love over-engineering things. If you look at the web coding craze of a few years ago, people started piling up tooling on top of tooling (frameworks, build pipelines, linting, generators etc.) for something that could also be zero-config, and just a handful of files for simple projects.

I guess this happens when you're too deep in a topic and forget that eventually the overhead of maintaining the tooling outweighs the benefits. It's a curse of our profession. We build and automate things, so we naturally want to build and automate tooling for doing the things we do.


I don’t think those web tooling piles are over-engineered per se; they address huge challenges at Google and Facebook. But the profession is way too driven by hype and fashion, and the result is a lot of unquestioning cargo culting of stuff from the Big Dogs. The wrong tooling for the job creates that bubble of overcomplicated app development.

Inventing GraphQL and React and making your own PHP compiler are absolutely insane and obviously wrong decisions — for everyone who isn’t Facebook. With Facebook's revenue and Facebook's army of resume-obsessed PHP monkeys, they strike me as elegant technological solutions to otherwise intractable organizational issues. Insane, but highly profitable and fast moving. Outside of that context, using React should address clear pain points, not be a dogmatic default.

We’re seeing some active pushback on it now online, but so much damage has been done. Embracing progressive complexity of web apps/sites should leave the majority as barebones with minimal if any JavaScript.

Facebook solutions for Facebook problems. Most of us can be deeply happy our 99 problems don’t include theirs, and live a simpler easier life.


Not sure why you lumped React in there. Hack is loopy, and GraphQL was overhyped but conditionally useful, but React was legitimately useful and a real improvement over other ways of doing things at the time. Compare React to contemporary stuff like jQuery, Backbone, Knockout, Angular 1.x, etc.


I agree with you very much, if what you are building actually benefits from that much client side interactivity. I think the counterpoint is that most products could be server rendered html templates with a tiny amount of plain js rather than complex frontend applications.


First of all, I read the documentation for the tools I'm trying to configure.

I know this is very 20th century, but it helps a lot to understand how everything fits together and to remember what each tool does in a complex stack.

Documentation is not always perfect or complete, but it makes it much easier to find parameters in config files and know which ones to tweak.

And when the documentation falls short, the old adage applies: "Use the source, Luke."


Don't fall for the "JS ecosystem" trap and use sane tools. If a floobergloob requires you to add a floobergloob.config.js to your project root that's a very good indicator floobergloob is not worth your time.

The only boilerplate files you need in a JS repo root are .gitignore, package.json, package-lock.json, and optionally tsconfig.json if you're using TS.

A Node.js project shouldn't require a build step, and most websites can get away with a single build.js that calls your bundler (esbuild) and copies some static files to dist/.


> As someone who loves coding pet projects but is not a software engineer by profession, I find the paradigm of maintaining all these config files and environment variables exhausting

Then don’t.

> How do you deal with this better, my fellow professionals?

By not doing it.

Look, it’s your project. Why are you frustrating yourself? What you do is you set up your environment, your configuration, what you need/understand/prefer and that’s it. You’ll find out what those are as you go along. If you need, document each line as you add it. Don’t complicate it.


Honestly... ask an AI agent to update them for you.

They do an excellent job of reading documentation and searching to pick and choose and filter config that you might care about.

After decades of maintaining them myself, this was a huge breath of fresh air for me.


Simplify your tools and build processes to as few as possible, and pick tools with fewer (or no) config files.

It could depend on what you're doing, but if it's not for work the config hell is probably optional.


You start with the cleanest most minimal config you can get away with, but over the years you keep adding small additions and tweaks until it becomes a massive behemoth that only you will ever understand the reasoning behind.


Right, and then when you don't work on it for 6 or 12 months, you come back and find that now you don't understand it either.


Part of doing it well is adding comments as you add options. When I used vim, every line or block in the config had an accompanying comment explaining what it did, except if the config’s name was so obvious that a comment would just repeat it.


That's a good call. It's a big problem for JSON configs given pure JSON's strict no-comments policy. I like tools that let you use .js or better yet .ts files for config.


Or consider JSONC (JSON with comments), or JWCC (JSON with commas and comments), which additionally allows trailing commas to make life a little easier.

https://jsonc.org/

https://nigeltao.github.io/blog/2021/json-with-commas-commen...

There are a lot of implementations of all of these, such as https://github.com/tailscale/hujson
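For a sense of how small the delta from plain JSON is, here's a naive JSONC reader sketch (a hypothetical helper, not taken from any of the linked implementations — note it doesn't handle `//` or `/*` inside string literals, which the real parsers do):

```javascript
// Naive JSONC reader: strip comments, then hand off to JSON.parse.
// Sketch only -- breaks on strings that contain "//" or "/*".
function parseJsonc(text) {
  const stripped = text
    .replace(/\/\*[\s\S]*?\*\//g, "") // block comments
    .replace(/^\s*\/\/.*$/gm, "");    // whole-line comments
  return JSON.parse(stripped);
}

const config = parseJsonc(`{
  // port the dev server listens on
  "port": 8080,
  /* enable verbose logging */
  "verbose": true
}`);
console.log(config.port); // 8080
```

The linked implementations also normalize trailing commas, which a regex pass like this can't do safely.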


I like this idea a lot, and pushed for json5 at a previous job, but I think there are a few snags:

- it's weird and unfamiliar, most people prefer plain JSON

- there are too many competing standards to choose from

- most existing tools just use plain JSON (sometimes with support for non-standard features, like tsconfig allowing trailing commas, but usually poorly documented and unreliable)

Much easier just to make the leap to .ts files, which are ergonomically better in almost every way anyway.


A lot of json parsers will permit comments even though it isn't meant to be valid. Worth trying it, see if a comment breaks the config, and if not then use comments and don't worry about it.
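The "try it and see" check is a one-liner — though note that the standard `JSON.parse` in JS runtimes is one of the strict ones:

```javascript
// Quick probe: does this runtime's JSON parser tolerate comments?
// Standard JSON.parse does not, so don't assume without testing.
let allowsComments = true;
try {
  JSON.parse('{ /* comment */ "a": 1 }');
} catch {
  allowsComments = false;
}
console.log(allowsComments); // false in Node and every major browser
```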


For reference, jq and python don't allow comments.


Sorry, but this sounds more like a myth, or at least heavily exaggerated. Similar to how Japan often gets romanticized.

Organizing the entire chain geographically at the scale you described (inter-city) doesn't bring huge cost advantages by itself. In China labor has historically been cheap, so the transport cost between regions was never the dominant factor anyway.

Most industrial clusters in China formed organically over time, just like in the rest of the world. Aside from some exceptions like mining, there isn't some master plan laying out entire cities as linear supply chains to the ocean. It's not SimCity.

One thing you're right about is that there is less bureaucratic friction or 'lawyers' in the way when it comes to economic development. For the former, it's because economic growth is THE metric for the government, especially at the local level, so they do whatever it takes to make it happen. For the latter, it's because… well, in China no one sues the government, period. I'm not sure it's a good thing.

Disclaimer: I'm Chinese living in China.


As a Chinese living in China, you must know the layout of the city does make logical sense. I've only been once, and I buy stuff from factories fairly often. When I went there I basically went to a mall district where all the furniture was sold, then I went to the tile district to review tiles, then to several other "districts" that were nothing but that single item.

I went to the window factory, which was directly beside more window factories, and directly beside that was the place that extruded the aluminum they used. That aluminum was produced up the road in what they called the metal district.

You even said "industrial clusters in China", so there is clearly some amount of planning involved. There are obvious benefits to having all of the aluminum factories beside an aluminum producer, having the shipping/packaging warehouses by the docks, etc.

There is some amount of government work at play here, on a small scale or a larger one, to give all these places a reason to set up together.

I've also seen things that just are not possible in North America. Asked for samples of aluminum extrusions and had the die made and extrusion done in a day. Locally it would take months before a sample is at my door.

I've sent designs for quotes and gotten quotes back in hours; half the time a factory in NA doesn't even reply. And even when it does, it's more of a "go away" than anything else.

I've seen live video of robotic factories building entire cabinets for housing.

There is some amount of rose coloured glasses in this thread. But we cannot deny that China wants business and can get stuff done fast and efficiently. That cannot be said for modern day factories in US or Canada. The work ethic and desire for business are just completely different.


You seem to assume that just because similar industries exist near each other in China, that it must have been government intervention. Which maybe it was, I don't know. But this same trend exists in the USA too.

You have areas with lots of oil refineries, Houston and Baton Rouge for example. You have areas with lots of steel mills, like in Northwest Indiana. These are examples I personally know of. Obviously a lot of big tech factories exist close to each other in Silicon Valley and in Austin, Texas too.

There are "industrial clusters" in America too, simply put. It is natural for large chemical plants or industrial sites to build up near where their source is. Hence all the oil refineries around the gulf. This is not a uniquely China thing at all. Lots of major US cities are known for specific types of industries.


Is the labor cheap in China, or are you comparing it to US salaries?

Can a person working in a Chinese tech factory for a major US company afford a reasonable place to live a reasonable distance, food, some entertainment, and have savings?


I'm not comparing it to US anything, I'm comparing it to other cost components like raw materials and parts, whose prices are often global.

The point is that transportation within China isn't a dominant factor in industrial cost or efficiency. So the idea that major manufacturing cities are laid out like giant assembly lines isn't nearly as important as OP suggests.

China still has many advantages over the US in manufacturing. I just don't think this is a major one, even if there's a grain of truth to it.


For strategic industries, i.e. the five-year-plan ones, local governments will absolutely master plan the complete industrial chain in excruciating detail. For less strategic industries, local governments get a few anchor industries to take root and the rest is organic. Inter-city proximity also brought huge advantages in transportation speed, especially in the 90s-00s. The other consideration is scale: a bumfuck tier-3 Chinese city specializing in xyz will have millions of people, which naturally enables greater levels/depths of industrial agglomeration, and that is what makes the PRC exceptional. Think the old Detroit motor-city hub that dominated 90% of US car production; the PRC has hundreds of such cities for different industries. It's not myth or exaggeration, it's a consequence of PRC scale: industrial clusters that were historically exceptional aberrations in other countries are the PRC's baseline template, hundreds of times over.


I'd argue Ladybird itself is a "hype" project.


Anything trying to break the browser monopolies in a meaningful way deserves the hype, IMO.


  > monopolies
Sorry to be pedantic, but do you mean oligopolies? And by that, do you mean market share or technology share? I'm just curious what people are looking for in Ladybird (just better tech, or better governance?)


Fair point. What does Ladybird need to achieve in your opinion to shake the "hype" label? Honestly, I, myself, don't have a good answer!


To me, a project's "hype-ness" is the ratio of how much attention it gets over how useful it actually is to users.

As a browser, Ladybird's usefulness is currently quite limited for obvious reasons. This is not meant to dismiss its achievements, nor to overlook the fact that building a truly useful browser for everyday users is something few open source teams can accomplish without the backing of a billion dollar company. Still, in its present state, its practical utility remains limited.


> As a browser, Ladybird usefulness is currently quite limited for obvious reasons

By this definition all basic research is hype.


No, not necessarily. It's the ratio and for most basic research the numerator of the fraction is also approaching zero.


Yes, necessarily. Basic research is currently useless in the same way building something that hasn’t been built is currently useless.


Good verdict. I agree.

Ladybird will have to deliver eventually; I think many people agree on that part.


> What does Ladybird need to achieve in your opinion to shake the "hype" label?

A release (?)


Somehow people manage to run it without this magical release


I mean you can build and try Ladybird for yourself. I posted on HN from it a while back.


This URL just returns empty content (while being HTTP 200) for me.


I don't get how it works.

> Encoding: Files are chunked, encoded with fountain codes, and embedded into video frames

Wouldn't YouTube just compress/re-encode your video and ruin your data (assuming you want bit-by-bit accurate recovery)?

If you have some redundancy to counter this, wouldn't it be super inefficient?

(Admittedly, I've never heard of "fountain codes", which is probably crucial to understanding how it works.)
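For what it's worth, the core fountain-code idea can be sketched in a few lines (illustrative only, not this project's actual scheme): each encoded symbol is the XOR of some subset of source chunks, and as long as *enough* symbols arrive intact, in any order, the decoder can "peel" the chunks back out. That's the redundancy mechanism that tolerates losing or corrupting some symbols.

```javascript
// Fountain-code intuition in miniature (hypothetical sketch).
// A symbol covering exactly one unknown chunk reveals that chunk,
// which can then be XORed out of other symbols, and so on.

const xor = (a, b) => a.map((x, i) => x ^ b[i]);

const chunk0 = [0x01, 0x02, 0x03, 0x04];
const chunk1 = [0x50, 0x60, 0x70, 0x80];

// The encoder emits many symbols; suppose only these two survived:
const symbols = [
  { covers: [0],    data: chunk0 },              // degree-1 symbol
  { covers: [0, 1], data: xor(chunk0, chunk1) }, // degree-2 symbol
];

// Peeling decoder.
const recovered = {};
let progress = true;
while (progress) {
  progress = false;
  for (const s of symbols) {
    const unknown = s.covers.filter(i => !(i in recovered));
    if (unknown.length === 1) {
      // XOR out every already-known chunk, leaving the unknown one.
      let data = s.data;
      for (const i of s.covers) if (i in recovered) data = xor(data, recovered[i]);
      recovered[unknown[0]] = data;
      progress = true;
    }
  }
}
console.log(recovered[1]); // chunk1, recovered from the degree-2 symbol
```

Real LT/Raptor codes choose the subsets pseudo-randomly with a degree distribution tuned so peeling almost always completes, but the peeling step is the same.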


Yes, it is inefficient. But YouTube pays for the storage ;-). (There is probably a limit on free accounts, and it is probably not allowed by the ToS.)


Right, you just pay daily in worry about when, not if, YouTube will terminate your account and delete your "videos".


I think it's just meant to be a fun experiment, not your next enterprise backup site


Steganographic backup with crappy AI-transmogrified reaction videos. Free backup for openclaw agents so they can take over the internet lol


Hey there, Brandon here (developer). I've uploaded an explanation video here, which might be useful to watch :D

https://youtu.be/l03Os5uwWmk?si=nJDwz4s7_E4WFOwC


He encodes bits as signs of DCT coefficients. I do feel like this is not as optimal as it could be. A better approach IMO would be to just ignore the AC coefficients altogether and instead encode several bits per block into the DC. Not using the chrominance also feels like a waste.


This actually won't work against YouTube's compression. The DC coefficient is always quantized, rounded, scaled, and otherwise transformed. That means these bits are pretty much guaranteed to be destroyed immediately. If that happens for every single block, the data is unrecoverable. Also, chrominance is not used on purpose, because chrominance is compressed much more aggressively than luminance.


I meant choosing multiple values, e.g. 4 levels to represent 2 bits. Say, 0.25, 0.5, 0.75, and 1. Then when decoding you would pick the closest valid value, so for example 0.20 would decode as 0.25. Not using AC coefficients would mean that, in theory, you'd get more bitrate for the DC ones.
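A minimal sketch of that multi-level idea, with simulated compression drift (hypothetical numbers; whether real codec quantization stays within tolerance is exactly the point of contention here):

```javascript
// Encode 2 bits per coefficient as one of 4 target levels; decode by
// snapping the (possibly perturbed) value back to the nearest level.
const LEVELS = [0.25, 0.5, 0.75, 1.0]; // 4 levels => 2 bits

const encode = bits => LEVELS[bits]; // bits in 0..3
const decode = value =>
  LEVELS.reduce((best, lvl, i) =>
    Math.abs(lvl - value) < Math.abs(LEVELS[best] - value) ? i : best, 0);

// Small perturbation: survives, snaps back to the right level.
console.log(decode(encode(2) + 0.04)); // 2

// Large perturbation: flips to a neighboring level, corrupting the bits.
console.log(decode(encode(2) + 0.2)); // 3
```

So the scheme works iff compression keeps each coefficient within half the level spacing, which is what the reply disputes.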


I've been told this many times in the comments, but this again is not reliable. Simply put, compression doesn't necessarily follow a pattern, so specifying "ranges" or rounding to a specific place will not work. Compression optimizes for the eye and doesn't do the same thing to every value. It will round some values down, others more, others less. Giving a range is simply not enough.


Yeah, I would assume that transcodes kill this eventually...

