I'm still on 21.02.7 because they switched to nftables syntax for custom rules ("firewall.user") in 22.03, and I have a pile of them and haven't had the time to rewrite them all. This release means 21.02 will have no more security updates.
It wasn't a pleasant task, but when I ripped the band-aid off and converted my rules, I took it as an opportunity to reevaluate them and was able to optimize a lot of them: converting lists to IP sets, collapsing them into CIDR ranges, pruning dead rules, etc., and got a decent speed boost. Maybe that isn't applicable to you, but something to keep in mind.
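For what it's worth, a lot of that collapsing can be scripted. Here's a minimal sketch using Python's ipaddress module to merge a flat list of addresses into minimal CIDR blocks and emit nft commands; the table and set names ("inet fw4", "blocklist") are placeholders I picked for the example, not something I'm claiming OpenWrt ships by default.

```python
# Minimal sketch: collapse a pile of individual addresses/networks into the
# smallest covering set of CIDR blocks, then emit "nft add element" commands.
# The addresses below are documentation ranges; "inet fw4" / "blocklist" are
# placeholder table/set names.
import ipaddress

blocked = [
    "192.0.2.1", "192.0.2.2", "192.0.2.3",      # three individual hosts
    "198.51.100.0/25", "198.51.100.128/25",     # two halves of a /24
]

nets = [ipaddress.ip_network(a, strict=False) for a in blocked]
collapsed = ipaddress.collapse_addresses(nets)

for net in collapsed:
    print(f"nft add element inet fw4 blocklist {{ {net} }}")
```

The two /25s come out as a single /24 and the three hosts become a /32 plus a /31, which is the same kind of pruning I ended up doing by hand.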
I'm still on 18. I'll probably switch to 23 sometime.
The device is stable and I'm the only one using it so I didn't feel any pressing rush to deal with the hassle since a lot of "security" issues are often from the inside not the outside.
Unfortunately there are some RCE vulnerabilities and other nasty bugs between 18 and 23 that can be exploited remotely. I wouldn't feel safe running an unpatched version if it can be helped.
I don't advocate staying on an older version, but that depends on the services you have running. If you don't have any external service, you should be OK.
I tend to minimize internal services as well. I don't use OpenWRT as a general purpose server.
Still, an LTS release from time to time would be good (the effort would be the same, as there would still be only two maintained releases: the current stable and the LTS).
Syntax is the least of the problems. The main problem is the ruleset structure they've chosen, which, for example, no longer includes user-customizable chains. This means I'll have to rethink my rules to make sure they play nice with the default ruleset.
They seem to be forcing the use of the UI for creating rules, but that doesn't cover some that I have (part of which are generated dynamically).
Worse, I'll have to go through 22.03 because they don't support upgrading from 21.02 to 23.05.
> I'd like to offer a more critical perspective as a Portuguese living in Portugal.
I’d like to offer some context for comments such as these, as a Portuguese living in Portugal.
The Portuguese have very low self-esteem as a people. They very often criticize their own country in very harsh terms, with a strong bias towards “the grass is always greener on the other side”, and tend to steer conversations away from any positive aspects (most often by trying to present them as negative).
Foreigners are expected to passively agree with negative comments about Portugal. But beware: contributing additional negative aspects is frowned upon and will trigger a sudden patriotic episode.
Over the years, I think most people came to understand "open source" as something closer to "free software". However, that's clearly not the case for projects controlled by a single entity that require copyright assignments from contributors.
Copyright assignments are put in place for exactly this (allowing a single entity to relicense the whole codebase unilaterally based on their own interests), and we should maybe come up with a better term than "open source" for projects in this situation.
IF they have a license that meets the Open Source Definition, the fact that someone (whether it's a single owner who is the only committer to the main project, or a person or entity that requires copyright assignment before merging outside submissions) has the right to subsequently issue versions under a different license does not change that.
So, no, I don’t see a CLA for an otherwise open source project transforming it into something other than open source.
And, if it did, we’d have to have a serious conversation about the “or any later version” clause of the GPL and how it makes all software using it not open source.
> And, if it did, we’d have to have a serious conversation about the “or any later version” clause of the GPL and how it makes all software using it not open source.
I was referring to people's understanding of what open source stands for, not what it actually means.
However, I can say that I personally dislike the fact that GNU projects require copyright assignments too, and that they were able to relicense all their projects from "GPLv2 or later" to "GPLv3 or later" unilaterally. I'm more willing to give the FSF that blank check than commercial entities, though.
Also, the "or any later version" means the recipient (user) chooses, not any single controlling entity.
"Business Source License" is not Open Source. You don't have to release your software as Open Source if you don't want to, I certainly write a lot of non-open-source software for a living. But people/companies want to take advantage of the good-will/reputation that comes from calling their software Open Source, and associating it with really Open Source software, without really making it Open Source. That's sleazy.
I don't care what a California-based "Open Source Initiative" group tries to define as the "Open Source Definition". That is all lobbying to me.
If I can see the source, then it's open source. The rest is just a play on words whose only purpose is to entertain the sterile debates of zealot groups attempting vocabulary appropriation in a power struggle.
I don't want to fuel these groups' debates around "free software" vs "open source", Linux vs GNU/Linux or whatnot.
Are you unable to get out of the hole of vocabulary appropriation and lobbying created by some activist groups?
You are perpetuating the mind-washing game of zealots who try to convince you they have the right to define what is and is not acceptable according to their self-defined standards.
You keep quoting articles and pages defining what _a specific group_ with a specific agenda has chosen to appropriate as "Open source".
All your sources and Wikipedia pages are ultimately linked to GNU publications. Read the references.
"an open source software license must also meet the GNU Free Software Definition"
That is plain ridiculous.
> it's also been a thing for 20+ years
No, it has never been "a thing". Especially not 20 years ago. Not even the sources you quote date back more than 5 years.
I don't know about you, but I was there 20+ years ago, developing and using open source software, and that distinction did not exist. If you had the source, it was open source, whatever the limitations of the license.
1999. That's when the first version of this document was published. The original 1999 press release is linked from Wikipedia. There were some conferences and debates that happened before that, and some settling in of the OSI in the early 2000s. 20+ years ago.
> I don't know about you, but I was there 20+ years ago, developing and using open source software, and that distinction did not exist.
That's really pretty strange ... it was a pretty hot topic back then ... are you sure you used open source software? like linux, freebsd, apache httpd, gcc, bash, samba, mozilla...
The right to use software commercially even if the copyright holder doesn't like you is an important qualification of open source. GPL, LGPL, BSD, MIT, MPL, Apache etc all include this right. It's important. It did take people some time and debate to figure it out ... 20+ years ago.
The word "open" has pretty broad implications and only one of those implications is "visible". It's pretty reasonable to expect more from "open source" than just the source being visible.
This is United Linux 21 years later. But this time, instead of building a distribution to compete with Red Hat, it's based on Red Hat (in practice, it is Red Hat).
> "No subscriptions. No passwords. No barriers. Freeloaders welcome."
I'm sure they're trying to be cheeky. But it also comes off as confirming Red Hat's position. OpenELA isn't about community, it's about having a base (i.e. bug-for-bug RHEL clone) upon which to sell support contracts.
If you really want to stay in the Red Hat ecosystem, I'd suggest going with AlmaLinux instead. They seem to have a more honest understanding of what "community" means.
Some people are throwing the word "freeloaders" around. It seems clear that "freeloaders" are not people running RHEL clones, but indeed there are some "freeloaders" in the community.
Most of the businesses that want to run Red Hat compatible Linux probably don't care about any of that stuff. They just want to run Linux and have it be no more difficult to maintain than their landscaping or electricity.
If you're unsure which enterprise linux fork to go with, here is a panel discussion between AlmaLinux, RockyLinux, and Oracle - https://www.youtube.com/watch?v=PFMPjt_RgXA. They all lay out their philosophy and discuss the changes going forward. Unfortunately a SUSE representative couldn't make it, but you can fill in the gaps from their press material.
I tend to respect the AlmaLinux way of doing things, but I respect where the others are coming from.
> which is more reflective of real world video conferencing use cases
What's more reflective of real world use cases is a hot discussion topic. :)
The "networkQuality" tool on macOS and other implementations(1) of the "RPM" algorithm(2) are very good to see how an Internet connection behaves under high stress. But the real world is seldom high stress (100Mbps+ connections are rarely pushed to the limit, honestly).
In your example, video conferencing indeed does simultaneous upload and download, but it's also low bandwidth (relatively speaking) and doesn't usually come close to saturating a connection. It can be impacted by other traffic on the same connection, though.
Which can come from a single user, or from multiple users sharing the same connection. And neither networkQuality nor Cloudflare's speed test measures how a connection behaves with multiple users stressing it.
In the end, which tool you use depends on what you're trying to measure. For me, the Cloudflare speed test is closer to measuring what I experience in real world use.
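If anyone is curious what the RPM number actually represents: it's essentially round trips per minute at the latency measured while the link is saturated. A rough sketch of the basic conversion (the real algorithm aggregates several probe types and percentiles, which I'm glossing over here):

```python
# Rough sketch of the core unit behind "RPM" (round trips per minute):
# how many request/response round trips fit in a minute at the latency
# observed under load. The real tool aggregates multiple probe types;
# this only shows the basic conversion.

def rpm(loaded_rtt_ms: float) -> float:
    return 60_000.0 / loaded_rtt_ms

print(rpm(20))    # 3000 RPM: latency stays low even when the link is saturated
print(rpm(400))   # 150 RPM: heavy bufferbloat under load
```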
Yes, fair enough. I know my link is great - I pay for 300 up and down and that's what I get. I was running measurements to track the degradation in wifi performance due to distance from the router - out in the yard - and how much difference one of these 8-antenna monster routers made. My use case is 100% video conferencing, which I want to do reliably from the yard.
Ah, sorry. I thought you were asking for something different (more summarized). It does indeed plot latency during downloads and uploads in the same way as it plots unloaded latency (and I'd call those a "graph" too).
> 100KB and 1MB down/up load tests show unrealistically low values, perhaps due to a high latency (ping).
That's related to TCP congestion control. TCP starts slow (slow start) and ramps up speed as the download progresses, so a short transfer ends before it ever reaches full speed.
But you're right that latency also has an impact on smaller downloads, and that's why latency is a critical measurement of connection quality for everyone, not just gamers.
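A back-of-the-envelope sketch of that effect, under assumed parameters (50 ms RTT, the common 10-segment initial window, no loss): during slow start the amount of data in flight roughly doubles every round trip, so a 100KB or 1MB test spends most of its life ramping up and reports a number far below what the link can actually do.

```python
# Toy model of TCP slow start: data in flight roughly doubles each RTT,
# so short transfers never reach line rate. Assumes a 10-segment (~14.6 KB)
# initial window and ignores loss, pacing and the receive window.

def effective_mbps(size_bytes: int, rtt_s: float, init_cwnd: float = 10 * 1460) -> float:
    sent, cwnd, rtts = 0.0, init_cwnd, 0
    while sent < size_bytes:
        sent += cwnd
        cwnd *= 2
        rtts += 1
    return size_bytes * 8 / (rtts * rtt_s) / 1e6

print(effective_mbps(100_000, 0.050))    # ~5 Mbit/s, no matter how fast the link is
print(effective_mbps(1_000_000, 0.050))  # ~23 Mbit/s
```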
Honestly, the more reactions I see from certain parts of the community, the more I agree with Red Hat’s decision. And I started out thinking it was a bad move.
Also, see Jon “maddog” Hall’s take from a few days ago:
"I see too many articles saying that Red Hat only have to provide source code to its customers - and the blog only implies that - but the Linux license requires more."
The GPL only applies when there is distribution (either in source or binary form) of a work covered by it. And it only applies between the parties involved in that distribution transaction. The GPL doesn't mandate that a covered work be distributed. The combination of these three aspects is critical.
Red Hat only distributes RHEL to its customers (it isn't available for anonymous download), therefore the GPL only mandates that Red Hat provide corresponding source code to their customers. The GPL also mandates that Red Hat allow its customers to redistribute any received binaries or source to anyone. Red Hat is free to decide who can be a customer, and to refuse to distribute some future version/modification of a GPL'ed work to an existing customer.
The meaning of "all third parties" in the GPL is confusing, but it's important to understand that the two parties of the license are the original author of the code and a recipient. In this context, the third parties are anyone to whom the recipient later distributes the work. Not anyone random that happens to approach the recipient asking for distribution to happen.
The term "any third parties" means the recipient cannot violate the terms of the GPL upon redistribution, no matter who it is distributing to. For example, it cannot ask that "third party" to forfeit its rights to redistribute the work it received.
In summary:
1. Red Hat can refuse to distribute a GPL'ed work to anyone - i.e. Red Hat can prevent the GPL from applying between them and anyone by avoiding any action that would trigger the GPL.
2. Once Red Hat distributes a GPL'ed work to someone, that someone can come back at any time in the future asking for the corresponding sources.
3. Red Hat does not prevent customers from redistributing the RHEL source code (they can't).
4. Having distributed some version of a GPL'ed work to someone does not force Red Hat to distribute a different version to that someone. See 1.