Depends on your requirements. For example, if you have any values you want to keep secret in your config files, then using a config manager can help you to not expose them in a Git repository. Also, if you work across multiple operating systems, you can use config managers to alter your config files based on the current OS.
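As a concrete sketch (assuming chezmoi, one popular dotfile manager; the aliases are just placeholders), a template like this renders differently per OS:

    # ~/.local/share/chezmoi/dot_aliases.tmpl -> becomes ~/.aliases
    {{ if eq .chezmoi.os "darwin" -}}
    alias ls='ls -G'            # BSD ls on macOS
    {{ else if eq .chezmoi.os "linux" -}}
    alias ls='ls --color=auto'  # GNU ls on Linux
    {{ end -}}

It can also pull secrets from a password manager at apply time via template functions, so they never land in the repo.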
I use Tomato too, but I wouldn't say it offers many benefits over OpenWrt. The main thing is that routers based on Broadcom chipsets often only work with very old Linux kernels (such as 2.6.x), because the drivers are closed source. For these routers, Tomato is the primary third-party OS choice.
"Although the power module includes a Saturn-style DIN connector, I don’t own a compatible cable, and prefer to use VGA cables. So for my install, I decided to add a SNES multiout port."
OTOH it's instance #3563229642 this week of an HN post title that tells you absolutely nothing about what the link is about. For all the care that goes into HN and the repeated editing of titles, I can't understand why this is regarded as fine :D
LavaRGB is the title of the post. Adding any more text to the title would be editorializing.
Do you buy a book based on its cover/title alone? Why would you not click the link to see what it was about? Something making it to the front page is going to be interesting to a decent-sized audience, so why not click the link? It takes no time to ctrl-w if it's not your cup of tea, and it's much more productive than whining about titles.
“Don’t judge a book by its cover” is a quaint phrase probably from when all books looked the same. But if the title and cover don’t convey what the contents are about, someone who may have liked it won’t even open it.
This is an indictment of the publisher given a bookstore full of books, not of a “lazy” reader.
> Do you buy a book based on its cover/title alone?
Not a good analogy, and I would counter with: what is the purpose of the subject line in an email? It's neither new nor rocket science, and pretty obvious IMO.
> I wonder, does residential solar with export count towards this sort of thing?
It isn't displayed on this website, or at least not all of it is. I checked by looking for an installation that I know exists and is on a feed-in tariff, and it wasn't shown.
To be fair to the creators of this website, it's probably quite hard to gather and display this type of residential solar information.
I agree the display part is probably challenging: the exact location likely isn't recorded, and the density could make it an issue to show on the map. Regarding collecting it, I'd be surprised if there weren't some public-ish record of it somewhere; we do all have to register with the DNO etc as micro-generators, so it'd be interesting to know if someone makes an API with that information available.
Aside from that, my real wonder was whether residential solar is factored into national renewables or if it's simply too little to be worth it.
It's not nostalgia for a game, it's wanting to get back to plug-and-play gaming experiences.
With consoles in the 70s/80s/90s, when you put a game into the console and turned it on, you launched directly into the game. That immediacy is lost when you end up with endless software updates and having to launch games from a menu. If you didn't live through that time I can understand why you aren't nostalgic for it.
Most modern physical games don't give the full benefits I'm talking about, as you still have to install and update them. I'm talking about games where the only thing you need to do is plug them into a console and start gaming.
Other than LTT continuing with LTT Labs simply because they enjoy deeper dives into tech, I'd imagine it also serves a practical purpose: if it's a slow tech news day and they can't think of a video concept they want to try, they have a ready-made set of Labs content they can base a video on.
"Like its predecessor, OFS (Old Be File System, written by Benoit Schillings - formerly BFS), it includes support for extended file attributes (metadata), with indexing and querying characteristics to provide functionality similar to that of a relational database."
What BFS did is very cool, and I hope to add that to bcachefs someday.
But I'm talking more about the internals than external database functionality; the inner workings are much more fundamental.
bcachefs internally is structured more like a relational database than a traditional Unix filesystem, where everything hangs off the inode. In bcachefs, there's an extents btree (read: table), an inodes btree, a dirents btree, and a whole bunch of others - we're up to 20 (!).
There are transactions, where you can do arbitrary lookups and updates and then commit, with all the database locking hidden from you; lookups within a transaction see that transaction's uncommitted updates. There are triggers, which are used heavily.
We don't have the full relational model - no SELECT or JOIN, and no indices on arbitrary fields like in SQL (but you can do effectively the same thing with triggers; I do it all the time).
All the database/transactional primitives make the rest of the codebase much smaller and cleaner, and make feature development a lot easier than what you'd expect in other filesystems.
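To make the shape of that concrete, here's a purely illustrative C sketch - none of these identifiers are the real bcachefs API, just the pattern of a transactional btree update with retry-on-conflict:

    /* Hypothetical sketch - NOT real bcachefs symbols. */
    struct trans *trans = trans_begin(fs);
    int ret;
    do {
        struct inode_val inode;

        /* lookups here see this transaction's uncommitted updates */
        ret = btree_lookup(trans, BTREE_INODES, inum, &inode);
        if (ret)
            break;

        inode.size = new_size;
        ret = btree_update(trans, BTREE_INODES, inum, &inode);
        if (ret)
            break;

        /* commit takes the locks; -EAGAIN means a conflict, so retry */
        ret = trans_commit(trans);
    } while (ret == -EAGAIN);
    trans_exit(trans);

Every subsystem goes through the same lookup/update/commit path instead of inventing its own locking, which is where the smaller, cleaner codebase comes from.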
This article is full of nonsense. The Linux desktop push isn't failing because it has experiences and apps that are similar to Windows and macOS. Being able to run Windows apps on Linux is a benefit, not a failure. As for religious wars over init systems, desktop environments and package managers, competition is making the options stronger, not weaker. Competition is a reason why package management on Linux is far better than equivalents on Windows and macOS.
The main reason for Linux not taking off on the desktop is because most users don't care about what OS they run, they just want a computer that works. If the PC they buy comes with Windows out of the box, they're going to stick with that. Until you get manufacturers shipping PCs with Linux as the default OS, you're mainly going to see desktop Linux as an enthusiast-only option. It's no accident that one of the devices helping to spread Linux (the Steam Deck) comes with Linux as the default option.
> As for religious wars over init systems, desktop environments and package managers, competition is making the options stronger, not weaker.
Competition can definitely improve things, but it's not universally positive. In particular, endless competition in parts of the operating system makes it hard to build anything on top of them. E.g. if you want to distribute an application for Linux, do you build a Flatpak, or a Snap? Or take a more traditionalist approach and make RPMs, DEBs, etc.? You either pick your favourite and leave out a large fraction of Linux users who disagree, or you have to do more than one of these. This is definitely a drag on the ecosystem.
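To illustrate the cost, a minimal flatpak-builder manifest for a hypothetical app looks roughly like this - and reaching Snap/RPM/DEB users still means writing a separate snapcraft.yaml, spec file and debian/ directory on top:

    # org.example.MyApp.yaml (app id and commands made up)
    app-id: org.example.MyApp
    runtime: org.freedesktop.Platform
    runtime-version: '23.08'
    sdk: org.freedesktop.Sdk
    command: myapp
    modules:
      - name: myapp
        buildsystem: simple
        build-commands:
          - install -Dm755 myapp /app/bin/myapp
        sources:
          - type: dir
            path: .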
I agree that most users don't care about the OS, though.
Generally, most Linux distributions are literally the same thing underneath. I recently did an LFS build (using version 12.3 of the book), and for the most part the same files were in the same directories in Debian, Arch and LFS.
I even had a look at the source code for makepkg in Arch, and it's literally the same commands that the book has you manually type in.
The packaging critique comes up over the years, but it's a bit overblown.
Building packages for different distributions isn't super difficult. I've built Arch packages using the ABS, as well as DEBs and RPMs, and they are all conceptually the same. Having a quick skim of the Flatpak instructions, it doesn't look that different either.
If you don't want to bother with any of that, you can just have a script that drops the installation into /opt or ~/.local/. I'm pretty sure JetBrains Toolbox does that, but I would need to check my other machine to confirm and I don't have access currently.
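For illustration, such a script can be tiny (the app name and tarball path here are made up):

    #!/bin/sh
    # hypothetical installer: unpack a release tarball into ~/.local
    set -e
    PREFIX="${PREFIX:-$HOME/.local}"
    mkdir -p "$PREFIX/opt/myapp" "$PREFIX/bin"
    tar -xzf myapp-1.0-linux-x86_64.tar.gz -C "$PREFIX/opt/myapp" --strip-components=1
    ln -sf "$PREFIX/opt/myapp/bin/myapp" "$PREFIX/bin/myapp"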
If you build an application, The Right Way™ has always been, and probably always will be, a tarball. Leave the hassle of distributing your software to the distributions.
This is absolutely not a solution. It more or less works for a few big widely used applications like Firefox & Libreoffice, but more niche applications can't realistically get someone in every major Linux distro interested in packaging them. And if someone does take an interest once, there's no guarantee they stay interested to package updates. Distro maintainers need to be highly trusted (give or take AUR), so it's not easy to add many more of them.
On top of that, some of the biggest Linux distros will only release application updates with their own OS releases, on a cadence of months or years. Software developers expect to be able to deliver updates in hours to days (at least in the consumer space - highly regulated things like banking are different).
There are good reasons why Red Hat and Canonical, the companies behind some of the biggest Linux distros, are pushing app distribution systems (Flatpak & Snap) which aim to be cross-distro and have upstream developers directly involved in packaging. There are absolutely downsides to this approach as well, but it would be nice if we could stop pretending that traditional distro packaging is a marvellous system.
This is a complete and utter non-starter for most software developers.
On pretty much every other operating system out there, I as the application author have control over the release cadence of my software. On some platforms, I can simply release an installer and be done. On others, I send the app through some form of certification process, and when it's complete it appears on the App Store.
Linux is rather unique in that it has tasked the distro maintainers to be tastemakers and packagers of the software that appears in their repositories. They control what version is used, what libraries it's linked against and their specific versions, how often it's updated, which additional features are added or removed before distribution, et cetera.
From the outside, this is complete and utter insanity. In the old days, you had to either static link the universe and pray or create a third-party repository per specific distro. Thank goodness these days we have numerous ways to cut distros out of the loop - Docker, Flatpak, AppImage and of course Steam if you're a gamer.
> Its not about competition. RedHat employees pushed their idea of things and the volunteers either ate it up or left.
This is a narrow view about how innovation happens in Linux and related software. Yes, Linux-focused companies are driving many of the changes, but there is plenty of exploration of ideas that happens outside of those companies.
I was thinking about maintaining and keeping things running (like you would do with cars and houses and anything else except software) and less about innovation and change. I doubt there is a shortage of ideas that are being explored.
Like, innovative and fresh stuff is cool, but at the end of the day you need to keep your business running and not breaking down.
Dunno about not caring about the OS. My mum, who's not techy, got persuaded to get a MacBook after Windows and is finding it a big learning curve. I remember when Walmart sold Linux machines; they gave up because buyers returned them when they found their Windows stuff didn't run. I'm a fairly normal user and certainly care whether it's Mac, Windows or Linux. I wouldn't run Linux as my main OS as I spend a lot of time in Excel.
People definitely care about applications they use a lot, and MS Office is a big one - even if LibreOffice would work just as well for many use cases, people are hesitant to give up what they know works.
The OS does make some difference, but I think that if all the same applications were available, a lot of people could switch without much difficulty. In some ways going from Windows to Linux might be easier than Windows to Mac, because many Linux distros are happy enough to conform to UI conventions that are familiar from Windows.
AppImage is, in fairness, a clever idea. But it's also yet another option. It only solves the mess of competing formats if it wins, if it becomes the normal way to publish user-facing software on Linux.
It's because the software they use is not available. Put aside whether there are alternatives; for some software there are none. I would use Linux if Autodesk made Revit and AutoCAD for Linux.
I would venture to guess that kids know the difference between Mac and Windows and probably prefer one over the other.
> Being able to run Windows apps on Linux is a benefit, not a failure.
It is a massive moral failure though. It shows that after two decades of work, the Linux community has been unable to build a simple sane functional stable development environment better than Win32.
Sane here is bearing a lot of weight. Developing on Linux is far easier than developing on Windows. I've never seen a Windows project as simple as nq[0] or dwm[1].
A random macOS binary is more likely to run on another macOS install from anytime in the last half decade than a Linux binary on the same distribution.
Even Apple’s famously fast deprecation is rock-steady by comparison.
I'm not sure why you think this is a good metric; the space of "random Mac binaries" is far smaller. There's probably something to be said for this "curation," but you pay for it, both literally with money and in limited selection.
I don’t know; you don’t think having Win32 be the unofficial API is a problem?
It literally means Windows will always exist - as the preferred development environment and reference spec for the Linux desktop. It also means all evolution of Linux will, ironically, be constrained by Win32 compatibility requirements.
Meh. It shows a good part of software (namely, games) is written for Windows, because the userbase is Windows, because Windows is the lion's share. And it shows people on Linux want that software to run. It's an admission, but not a moral failure.