They have a perpetual pricing option available alongside the subscription one. It's still worth thinking about the economics of this.
How are they supposed to fund development? It's important to differentiate independent devs and the goliaths. When you release an app like this, it is "good enough" for more people. Your incentive to build the thing is that you can make a living off of it and continue crafting it with the intent of building other great things. The biggest software companies have many revenue streams and ways to cover costs that are very different from independent devs.
When you sell perpetual-use software, you have the incentive to release yearly versions (or whatever cadence works best). You are incentivized to ship bug fixes only in next year's version to force upgrades. Users lose out because they don't get fixes, and the developer is put in a spot where they have to look for more devious ways of making a profit.
To make money, the price of perpetual software also has to be very high. Devs make terrible compromises here to seem reasonable, but you need to move a lot of units before the price can come down to the per-user level that subscription software achieves.
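The trade-off above can be made concrete with some back-of-envelope arithmetic. All the numbers here are hypothetical, chosen only to show the shape of the problem:

```python
# Back-of-envelope comparison of subscription vs perpetual pricing.
# Every number below is hypothetical, for illustration only.

dev_cost_per_year = 120_000   # what the developer needs to cover a year
subscribers = 2_000           # paying users under a subscription model
monthly_sub = 5.00            # dollars per month

# Subscriptions cover the year with a modest per-user price.
subscription_revenue = subscribers * monthly_sub * 12   # 120,000

# A perpetual license has to fund all future maintenance for that user.
# If buyers upgrade roughly every 3 years, each sale must carry ~3 years
# of per-user cost, so the sticker price triples.
per_user_yearly = dev_cost_per_year / subscribers       # $60/user/year
perpetual_price = per_user_yearly * 3                   # $180 up front

# Alternatively, keep the perpetual price "reasonable" at $60 -- but then
# you must find this many brand-new buyers every single year:
units_needed = dev_cost_per_year / 60                   # 2,000 new sales/year
```

The same user base that sustains a $5/month subscription would need a $180 perpetual price, or a steady stream of new buyers, to cover the same costs.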
Subscriptions are far from perfect, but they bring some balance. Next time you complain, it would be an interesting exercise to state what you would be prepared to pay and how often.
SQL Server 2000 was well received as a challenger in the segments that mattered. Oracle was in first place, running on Unix. However, it was viewed as expensive and the procurement experience was regarded as unpleasant. People wanted competition even if they didn't think SQL Server, or another alternative, could unseat Oracle for the most important stuff.
Windows was really picking up steam and there was a move to web development in the Windows developer space. Visual Basic and Delphi were popular, but desktop development had peaked. ASP was the tool for building your web apps and SQL Server was the natural backend. SQL Server fed off this wave. It wasn't dislodging Oracle; rather than every app being built on Oracle, more apps started to use SQL Server as the backend.
Then ASP.NET appeared on the scene and demand grew even more. It was a well-integrated combo that appealed to a lot of shops. I started my career in a global pharma where the tech budget was split. IT was a Windows shop for many reasons and ran as much on SQL Server as possible. R&D was Unix/Linux with Oracle. There was a real battle going on between .NET and Java (how about some EJB 1), and the databases followed the growth curves of both rather than competing against each other.
The SQL Slammer worm brought a lot of attention to the product. There were instances running everywhere and IT didn't expect so much adoption. Back then you had a lot more servers running inside offices than you do today. My office was much like my homelab today. This validated the need, so the patches got applied, IT got involved in the upkeep, and adoption continued to grow.
Oracle's sales folk and lawyers were horrible to deal with. I had some experience of this directly as they tried pushing Java-related products and my boss dragged me into the evals. One of my in-laws was outside counsel in the IT space doing work with enterprise-sized companies. He claims they are the worst company he's ever had to deal with; they wouldn't delegate any decision-making locally, which endlessly dragged out deals. They had a good product but felt they could get away with anything. Over time he saw customers run lots of taskforces to chip away at Oracle usage. This accelerated with SaaS because you could eliminate the app AND Oracle in one fell swoop.
>> And "the community" isn't moving to Codeberg because Codeberg can't support "the community" without a massive scale up.
People have a superficial knowledge of the space (I think this extends beyond Codeberg) but feel strongly that they need to advocate for something. Codeberg themselves seem to have opinions about what they want to do but people are suggesting they can do more simply because it gives them an outlet.
The constraints that Codeberg set seem to, on the surface at least, ensure they can scale based on their needs and protect them from external threats. Hosting random sites comes with a range of liabilities they probably understand and want to avoid right now. There are EU regulations which can be challenging to handle.
This is an interesting point when the question is "how do I build a Windows app?" and a decision needs to be made. React is definitely one of the options that some consider when this question arises.
I think you miss the more common reasoning though. This starts with "can we build a Windows app?" The answer to that was "no" for many more people until relatively recently. The .NET Framework wasn't reliably available by default until the second half of the 2000s, which caused some Windows app devs to hold off, on top of the performance concerns and the WinForms vs WPF split. Electron and React go hand-in-hand here because they made a (crappy) Windows app easy.
What I feel popularized this was the webview approach on mobile. In 2010, there were a ton of frameworks popping up for hybrid mobile development. This was carried forward to desktop although some of us had been embedding IE webviews much earlier. This let people say "yes" and it went from one thing to the next with diversions into React Native.
One issue is that 95% of the integrations will be fine with the default configuration. The others, including some with high profit potential, will have weird configs that will frustrate your customers the first time they try them if not well tested/documented. It's better to take the time and get it right. Enterprise customers love piloting and spending time, so best to approach that the right way too. Going with less complex options, which arguably have better APIs, also makes it easier to develop your core product and get real feedback from users.
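One way to handle this split is to make the 95% default path trivial and fail fast, with a clear message, on the weird configs, instead of letting a customer discover problems mid-pilot. A minimal sketch; all names and values here are illustrative, not from any real product:

```python
# Hypothetical sketch: sane defaults for the common case, plus up-front
# validation that returns readable problems for unusual configs rather
# than failing deep inside a sync job. Names are illustrative only.
from dataclasses import dataclass


@dataclass
class IntegrationConfig:
    endpoint: str = "https://api.example.com"   # default covers most users
    auth_mode: str = "oauth2"                   # the common path
    page_size: int = 100

    def validate(self) -> list:
        """Return human-readable problems instead of raising mid-run."""
        problems = []
        if self.auth_mode not in {"oauth2", "api_key", "mtls"}:
            problems.append(f"unknown auth_mode: {self.auth_mode!r}")
        if not (1 <= self.page_size <= 1000):
            problems.append(f"page_size must be 1-1000, got {self.page_size}")
        return problems


# Defaults just work; the odd enterprise config gets caught up front.
ok = IntegrationConfig().validate()                               # []
weird = IntegrationConfig(auth_mode="kerberos", page_size=5000)
errors = weird.validate()                                         # 2 problems
```

Validation like this doubles as documentation of the supported envelope, which is exactly what the unusual-config customers need before a pilot.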
There can be different cohorts of students. If a student is at the point where they can start exploring iOS development they can perhaps have a swing at it with this machine. In reality, they'll have been using this machine, know enough about the limitations, and be thinking of upgrading.
Kids are already well aware of iPhone upgrades. Parents will get them this machine. They'll get going and soon enough be badgering their parents for an upgrade to a more competent machine. That is all by design, while still being an affordance for people who can only get in at the cheap end.
I found mixed results, given underlying anxiety that hadn't been diagnosed at the point I was trying this. Talking to new people at work, while out pursuing hobbies, and around town all added up to more and better conversations.
It was a much bigger struggle with conversations where I was putting extra pressure on myself. Being able to have those other conversations was helpful though. Eventually, I found a therapist and am in a better place with this.
Letting curiosity be the motivator behind starting these conversations and cultivating curiosity more broadly can help -- or at least I have found it to be helpful in making initiating feel less forced. I wonder about people's jobs or the reasons they are visiting a place or what they think about what's happening nearby, or just generally who they are.
One antipattern I've encountered with this approach, though, is that anxious people will sometimes pepper their conversation partners with a battery of questions. Even if thoughtful, this can exhaust your partner and tends to keep the conversation steered away from actual connection. YMMV, but either way be mindful and make it a point to share about yourself too.
We delegate power already. Is unleashing AI in some place different from unleashing JSOC on an insurgency in a particular place? One is code and the other is a bunch of humans.
You expect the humans to follow laws, follow orders, apply ethics, look for opportunities, etc. That said, you very quickly have people circling the wagons and protecting the autonomy of JSOC when there is some problem. In my mind it's similar with AI because the point is serving someone's interests. As soon as that power is undermined, its backers start to push back. Similarly, they aren't motivated to constrain their power on their own. That requires external forces.
A big part of the problem is being permitted to teach this stuff. As a UK CS grad from the early 2000s, my observation was that academic staff recognized the need for these skills. They weren't permitted to teach them due to the time available and the view that they weren't academic. Thankfully, my university's CS department offered courses in these kinds of topics taught by the support staff (read: sysadmins). These courses existed to help other departments with skills but were open to students.
Fast forward twelve years and my wife did the MCIT at UPenn (https://catalog.upenn.edu/graduate/programs/computer-informa...) where git and other such topics were woven into the curriculum. Even then, that was perhaps a novelty because the program's focus was bringing non-CS undergrads into a CS master's program. So-called "conversion" master's degrees were the norm in the UK in 2002.
I think it's reasonable to distinguish which side drove this. RAM prices are going up but it's not engineered primarily by RAM manufacturers. They are naturally jumping on the bandwagon and responding, but they aren't the drivers. Of course, how they respond matters. They could make other choices. Over time we'll see how this goes because AI could cool and then RAM manufacturers end up in a spot where they choose to manipulate prices to keep them higher.