Hacker News | alimbada's comments

I've been using Amethyst for a couple of years now and it's been working quite well for me.

Do you also post "Take it away from $OWNER" every time your open source software breaks?

If he posted every time GitHub broke, he would certainly have posted a bunch of times.

What antitrust issue does my open source software have?

What does antitrust have to do with the GitHub services downtime?

The more stable/secure a monopoly is in its position the less incentive it has to deliver high quality services.

If a company can build a monopoly (or oligopoly) in multiple markets, it can then use these monopolies to build stability for them all. For example, Google uses ads on the Google Search homepage to build a browser near-monopoly and uses Chrome to push people to use Google Search homepage. Both markets have to be attacked simultaneously by competitors to have a fighting chance.


It regularly breaks the workflow for thousands of FLOSS projects.

It does? You do know Git is a DVCS, right? So you can continue working without an internet connection or without the service provider being up. It delays the code review process but doesn't break it.

I get that you want it to be 100% up, but let's be serious: your FLOSS projects probably break more stuff than GitHub being down does.


How about Issue management?

How often do issues break? And how is waiting a few minutes breaking the process?

Dude, you need to stay on script! Over here you're saying it's an anti-trust issue [1]. It's literally the thread above this.

[1] https://news.ycombinator.com/item?id=46946827#46946914


> Now, the reason why it won't work on Linux is that the Linux kernel and Linux distros both leave that unified memory capability up to the GPU driver to implement. Which Nvidia hasn't done yet. You can code it somewhat into source code, but it's still super unstable and flaky from what I've read.

So it should work with an AMD GPU?


> the Linux kernel and Linux distros both leave that unified memory capability up to the GPU driver to implement

Depends on whether AMD (or Intel, since the Arc drivers are supposedly OSS as well) took the time to implement that, or whether a Linux-based OS/distro implements an equivalent to the Windows Display Driver Model (which needs code outside of the kernel, specific to the OS/distro being developed).
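
If you want to see what a given driver actually exposes, the capability is queryable at runtime. Here's a minimal sketch using the CUDA runtime API (HIP/ROCm has near-identical attribute queries on the AMD side), assuming the toolkit is installed; it's illustrative only, not a claim about any particular driver:

    // Probe what the installed driver exposes for "unified" memory.
    // Build with (hypothetical file name): nvcc probe_unified.cu -o probe_unified
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        int dev = 0, managed = 0, concurrent = 0, pageable = 0;

        // Can this device allocate managed (unified) memory at all?
        cudaDeviceGetAttribute(&managed, cudaDevAttrManagedMemory, dev);
        // Can CPU and GPU access managed memory concurrently (HMM-style sharing)?
        cudaDeviceGetAttribute(&concurrent, cudaDevAttrConcurrentManagedAccess, dev);
        // Can the GPU access ordinary pageable host memory?
        cudaDeviceGetAttribute(&pageable, cudaDevAttrPageableMemoryAccess, dev);

        printf("managed=%d concurrentManagedAccess=%d pageableMemoryAccess=%d\n",
               managed, concurrent, pageable);

        // Even when managed=1, how cudaMallocManaged() behaves on Linux depends on
        // what the kernel driver implements (page-fault migration vs. pinning).
        float *p = NULL;
        if (cudaMallocManaged((void **)&p, 1024 * sizeof(float),
                              cudaMemAttachGlobal) == cudaSuccess) {
            p[0] = 1.0f;   // touch it from the CPU side
            cudaFree(p);
        }
        return 0;
    }

Whether those attributes come back as 1 or 0 on your machine is exactly the part that's "left up to the GPU driver".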

So far, though, it seems like people are more interested in pointing fingers and sucking up the water of small town America than actually building efficient AI/graphics tech.


I would expect the M4 options to be cheaper than their M5 counterparts. Given how expensive Apple products are, I appreciate them offering both options at the same time. Just because it doesn't make sense to you doesn't mean it doesn't make sense for anyone; you're obviously in the minuscule [nit-picking] minority, since Apple will have done enough market research to justify the options they offer.


> It’s not uncommon to have a one bed flat with 2 phone sockets in the living room and 2 phone sockets in the bedroom and a master socket in the technical room. It’s ridiculous.

This sounds a bit far-fetched to me. I'm 40+ and have lived in the UK all my life. Growing up, we only had one phone socket in the house for the first few years, until my dad got an extension put in upstairs. I've lived in multiple cities since then and no flat or house I've lived in has had more than one phone socket, including the house I eventually bought and live in now (which is not small by most UK standards).


I’m a similar age and have also lived in a few houses over the years. I’ve never lived in any place that didn’t have more than one phone socket.

Though I have noticed multiple sockets are less common in really old houses which haven’t seen much modernisation, and less common in really new ones too (since builders expect most people will just use the master socket for broadband and people use mobiles for calls).


Old houses should have one extra socket in the master bedroom at the very least, because the master of the house was expected to plug a phone in there back in the day (my parents and grandparents all have one).

Incidentally, this is likely to be the furthest room on the furthest floor, so it can be a good place to add a wifi access point for coverage.


Depends on what you class as “old”. Remember that a great many British homes are 50+ years old. You certainly wouldn’t have considered having multiple phones in a house when they were built. So the extra socket was added after it was built.

Adding extension sockets was a very easy job. So easy that many homeowners did it themselves.

So it’s very likely your parents and grandparents bedroom phone wasn’t part of the original wiring.


I’ve lived in two apartments with the setup OP described, and they were both built 2003-2006. But I’ve not had it anywhere else, so it does seem constrained to a specific window of apartment developments.


Interesting. Every place I've lived in has been older than that so that makes sense.


It's a setup seen in a lot of new-build flats from the 2000s and 2010s, which make up a very large share of London's housing stock for flats (there has been so much construction!).


Yes, the author's assertion here is nonsense. A case of someone with a very small window of experience being certain that what he's seen couldn't possibly be an outlier - must instead be normal for everyone.

The article also has a constant theme of putting people down because of something he doesn't understand. The Helldivers 2 developers are "idiots" because he doesn't understand the reasons for asset duplication in games. Simple daisy-chaining of slave sockets off the master is "incomprehensible", "pointless", "arbitrary" and "a mess"; the person who did the wiring is an "idiot". It all comes across as unfortunately quite arrogant.


I dual booted Fedora back when it was still called Fedora Core from version 6 until 11-ish. I had it installed on a laptop and had a lot of driver issues with it and eventually didn't bother with dual booting when I moved to a new laptop.

I'm now looking to get off Windows permanently before security updates stop for Win 10 as I have no intention of upgrading to Win 11 since Linux gaming is now a lot more viable and was the only remaining thing holding me back from switching earlier. I've been considering either Bazzite (a Fedora derivative with a focus on gaming) or Mint but after reading your comment I may give vanilla Fedora a try too.

So far I've tried out the Bazzite Live ISO but it wouldn't detect my wireless Xbox controller though that may be a quirk of the Live ISO. I'm going to try a full install on a flash drive next and see if that fixes things.


Give it a try! Although I do all my gaming on a PlayStation. In Fedora, the Steam and NVIDIA (RPM Fusion) repos come preinstalled and can be enabled during installation, or later in GNOME's 'Software' or the package manager, but I can't speak to that. The open-source AMD drivers are in the main repo, no action needed. ROCm too, but that can be messy and is a work in progress on AMD's side. Can't vouch for the controller, but people claim they work. I guess that's down to the live image. I've heard games with kernel-level anti-cheat categorically don't work on Linux, but this may change at some point. In that case, or if you want "console mode", a specific gaming distro may be worth considering; otherwise I would stick to vanilla. Good luck! Hope I didn't promise too much ;)


So I cleared out one of my SSDs and installed Fedora yesterday.

I still had the issue of no gamepad detection. I had to install xone, which took some trial and error. Firstly, I didn't have dkms installed. Secondly, soon after installing Fedora the kernel was updated in the background, and on reboot my display resolution was stuck at 1024x768 or something for some reason (that's gonna be another issue I'll have to look into). I rebooted into the previous kernel, and then dkms complained the kernel headers were missing: they were installed for the latest kernel but not for the older version I had booted into. I'm not used to Fedora or dnf (I run Proxmox+Debian in my homelab), so after a quick search to figure out how to install a specific version of a package (it's not as simple as <package>@<version> but rather <package>-<version>.fc$FEDORA_VERSION.$ARCHITECTURE), I got kernel-devel installed, was finally able to run the xone install script successfully, and had my gamepad detected.

The most frustrating thing is that the xone install script doesn't fail despite errors from dkms, so after the first install (where I almost gave up because I thought something was wrong with my setup) I had to run the uninstall script each time there was a problem and then run the install again. The xone docs also mention running a secondary script which doesn't actually exist until the first script runs successfully, so that added a lot of confusion.


Lol. Well, that does sound terrible!

My understanding is that you only need xone for the special adapter, right? Have you tried a cable or plain Bluetooth before? Also, Steam seems to bundle its own drivers for it, so the controller may just work within games in Steam regardless.

I feel a bit bad, but honestly gaming on Linux is not my thing. From a quick glance, messing with the kernel like that may cause problems with secure boot and maybe that's causing your issues. Maybe you need to sign your modules or disable secure boot.

Have you tried the Copr repo? https://copr.fedorainfracloud.org/coprs/jackgreiner/xone-git...

And of course Bazzite seems to have addressed this out-of-the-box... :D

Quite frankly, if you want to do anything but gaming on that machine, manually installing kernel modules from GitHub would be a deal breaker for me, since that seems rather unstable and prone to cause nasty problems down the line.


I'd rather use the 2.4GHz adapter than Bluetooth, as the connection is supposedly more reliable (and less prone to latency issues) from what I've read. Anyway, after jumping through all those hoops I did get it working, so I'm happy with xone for now. I even managed to boot into the newer version of the kernel without the degraded display resolution issue after that.

I have a new issue though: after updating 900+ packages using KDE Discover, the GUI login doesn't work. The screen goes blank after I enter my credentials and nothing happens unless I switch to another TTY, at which point I get thrown back to the login screen on TTY1. As a workaround, I can log in on another TTY and then use startplasma in order to use KDE. I've learnt my lesson not to use KDE Discover for updates though, because it doesn't get logged in dnf history, so you can't use dnf rollback.


Obligatory Zed takes money from bigots: https://github.com/zed-industries/zed/discussions/36604


Thanks for pointing it out. Ignore the vice signalers.


Nice. Just signed up for their pro plan.


I guess I'm in the minority. I haven't reinstalled on my desktop machine since 2014 according to the install dates of some of my apps. According to the Windows Registry I've gone from 7 Pro -> 8.1 Pro -> 10 Pro. Both upgrades happened in 2015 and since then I've just stayed up to date with the latest 10 Pro build.

I will be switching to Linux before the ESU program expires though. I use my desktop mostly for gaming and have been planning to evaluate a few distros and desktop environments. I have my own Proxmox/TrueNAS/Debian homelab and use macOS daily for work so I'm fine with the CLI and tinkering but I'd rather everything Just Works™ for my gaming machine. I did a lot of dual booting back in the Fedora[ Core] 6-12 days but ultimately it got too tedious.


I would say that the reason Windows issues are commonly treated as "reinstall it" is because most Windows installs are on corporate PCs. Most of the time, it's not worth spending the time trying to troubleshoot someone's gnarly OS issue when you can fix it in an hour by reimaging. There are exceptions, of course, but most of the time the business just wants that employee back to work ASAP, rather than doing the troubleshooting work.


Yes, even before corporate IT had a widespread imaging approach, you could see this was the way to go.

For decades my approach has been to clean install Windows onto a new PC once, and that's about it for true "installs".

Then I don't get in a hurry; it's my personal computer and I plan to be using it smoothly for a number of years to come.

So spend "a few" hours tweaking and adjusting settings, and this always takes ridiculously longer with each Windows version, but that's table stakes if you want to participate in a mainstream way without all the mainstream drawbacks.

Ideally of course without ever going on the internet, and then comprehensively back up the system before doing anything else.

Any valuable data is also never allowed to be routinely stored on the C: volume; that's what other partitions are for, besides merely multibooting.

What's on C: should always be a minimal number of gigabytes; you have to take some kind of action or the defaults will work against you, massively. People can be misled into thinking that no attention is required and C: will be fine.

C: is best restricted to a highly replaceable OS, plus any programs you decide to install afterward, but none of the user data, which is well worth the effort to carefully direct elsewhere at every opportunity.

So after I finish installing and configuring the desired programs, then another comprehensive backup is made.

Before it has even handled any valuable user data yet.

This is Windows, you can't take any chances :\

Then later, in situations where others would be best off re-installing (but hesitate as usual), I boot to a different partition, zero the volume formerly known as C: while it's dormant, then recover the (tweaked) bare OS backup, or the image from when the apps and settings were also completely like I wanted them.

Obviously, programs that are not robust enough to withstand offline recovery from backup are too garbagey to include in a well-crafted backup image. You usually can't find this out without testing your backups in advance, ideally before the backups are desperately needed in an emergency.

With basically minimal disaster preparation (but careful hours by necessity), you may never actually need to do a true "re-install" ever again; just recover from backup instead, with hardly any hesitation at all. Sometimes more than once a day, in minutes. Rather than hours, which with Windows 11 can now really add up and are sometimes best spread over more than one session :\

In that case it would be nice if it didn't take calendar days to manually get it the way you want, like you could with Windows 95 in minutes. I'm not even talking about gaming.

After all that effort I know how tiring it can be. Even more reason to back up your work before doing anything else, and test the backups routinely. Which is another whole session or two. I know it's the complete opposite of mainstream behavior, but it can really allow you to participate a lot more effectively in the long run.

I know, I know, who tests their backups anyway and why would they start now?

For consumers, routine rapid recovery has been effectively de-emphasized for decades since plenty of users respond to Windows failure by purchasing a new PC, which is crafted to be a more simplified and familiar procedure as long as they can afford it.

And it's really not the worst tragedy if they think they screwed up their own computer so bad they needed a new one, if it makes them be more careful next time ;)


It’s a shame the Framework founder refuses to back down on his stance of “neutrality” since the Omarchy/DHH incident.


I was about to comment saying that unless Valve is prepared to invest significant effort into an x86 -> ARM translation layer, that's not going to happen, but a quick search for "linux x86 to arm translation" led me to an XDA article[1] proving me wrong. The recently announced Steam Frame runs on ARM and can run x86 games directly using something called FEX.

Now it just needs to be as good as (or better than) Apple's Rosetta.

[1] https://www.xda-developers.com/arm-translation-layer-steam-f...



Apple Silicon actually has microarchitectural quirks implementing certain x86-isms in hardware for Rosetta 2 to use. I doubt any other ARM SoC would do such a thing, so I doubt third-party translation will ever get quite as efficient.

