Before the modern Cloud took shape, developers (or at least, the software they created) used to be more trustworthy.
I love those classic tools from the likes of Sysinternals or Nirsoft. I didn't hesitate to give them full access to my machine, because I was confident they'd (mostly) work as expected. Although I couldn't inspect their source, I could reason about how they should behave, and the prevailing culture of the time was one where I knew the developers and I shared a common set of expectations.
Their creators didn't tend to pull stunts like quietly vacuuming all your data up to themselves. When they did want feedback, they asked you for it first.
There wasn't such a potent "extract value" anti-culture, and successful companies recognized that enduring value came from working in the user's best interest (e.g. early Google resisted cluttering their search results).
Although silos existed (like proprietary data formats), there was at least an implicit acknowledgement and expectation that you retained ownership and control over the data itself.
Distribution wasn't locked behind app stores. Heck, license enforcement in early Office and Windows was based on the honour system - talk about an ecosystem of trust.
One way to work toward a healthier zeitgeist is to advocate tirelessly for the user at every opportunity you get, and stand by your gut feeling of what is right - even when faced with opposing headwinds.
Correct me if I'm wrong, but I think Russinovich may now be the CTO of Azure because, once upon a time, he wrote Sysinternals as an independent developer. It's one way to show your talent.
100% agreed. Those small independent developers also had more on the line, and I'd trust them far more than a Big Tech company that cares almost exclusively about $$$ and is made up of employees who largely dilute responsibility among themselves, many of whom probably aren't even there because of how well they can program.
Standing in the wind won't stop the storm. You only get blown over.
"If everyone just" stood in front of the storm, they'd all get blown over, and the storm would go on.
No one wants to hear that individual heroics aren't the answer, but they aren't. Our moral intuition fails on problems bigger than it's meant for.
One person, out of hundreds, working in a factory that makes a slightly cheaper widget, that changes a BOM, that drives a different industry to swap one process for another, that forces other factories to move, that changes water use and heat flow in a region, that nudges weather systems out of a regime that favors storms, is doing more to stop the storm than any hundred people standing in the wind.
The person washing windows at a lab that figures out how any of those steps might be connected doesn't get to be a hero, and doesn't feel righteous for fighting, but they're fighting, while the people who choose feeling righteous aren't.
Or, 30% of society could build up distrust in tech, AI, and scraping on the Internet, and quietly start sabotaging the flow of data and/or cash. They're not going to disclose the decision criteria behind that behavior, even while imprinting it on the people around them.
I think such an "immune response" from human society would be a more realistic model of the storm blowing. The Leviathan won't listen to a few of its scales screaming that data abuse is technically legal or whatever, if it deems that unreasonable.
People want to think in terms of people and their actions, not in terms of systems and their structures. That's the problem.
Show a programmer a system that requires constant on-call that burns out the team, and they'll tell you (if they've been around) that heroics are a sign it's broken, not that you should find more heroes. Definitely not to pull the person working on the systemic issues off that work and onto a shift.
Show them an equivalently structured broken system in a different field and they'll demand heroics.
I can't give you an example of an action, because I don't know, and I'm not talking about actions. I'm saying the approach itself is wrong and any action it suggests won't be effective.
Moralizing has a brutal, self-perpetuating metaproblem that makes people more and more convinced it works the more it doesn't. If suggested, actions based on moralizing will be chosen, and they will not work; so any moralizing framing needs to be rejected right away, even if you can't suggest a better action.
The liberal attitude is that the moral character of individuals does not matter for social order so long as the right rules and institutions are in place. Part of Confucius’s point, and that of any conservatism worthy of the name, is that rules and institutions are ineffectual without individuals willing to subordinate their desires to them. And individuals who do not seek the good (so as to “rectify their hearts”) and the true (thus pursuing the “investigation of things”) can neither curb bad desires nor cultivate good ones. The brute force of legal coercion cannot substitute for this missing moral fiber.
Moral fiber is important. Choosing ineffective, superficial actions that feel righteous over unrewarding slog that leads to moral good seems like the opposite of that.
Perhaps the distinction between "developers" and "users" is illusory.
"Developers" are themselves "users" and they, too, must trust other "developers".
I compile kernel and userland myself, sort of like "Linux from Scratch" but more personalised. I am not a "developer". I read through source code, I make decisions based on personal tastes, e.g., size, speed, language, static compilation, etc. Regardless, I still have to "trust". To be truthful, I read, write, edit and compile software not for "trust" reasons but for control reasons, as well as for educational purposes. I am not a fan of "binary packages" except as a bootstrap.
It seems that most self-proclaimed "developers" prefer binary packages. In many cases it appears the only software they compile is software they write themselves. These "developers" often muse about "trust", as in this submission, but at the same time they use other people's software as pre-compiled binaries. They trust someone else to read the source code and alert them of problems.
They almost certainly use software written by someone else either at work or at home. For example, they likely use an IDE written by someone else. As such they are "users".
It is quite common for self-proclaimed "developers" to write software for other "developers". In other words, developers who are "users". We see this on HN every day.
I’m actually looking back at the past, and realizing why app stores took over.
For the developers, it was indeed an ecosystem of trust.
For regular users, it was hell. The app stores, on a basic level, were absolutely in their best interest.
Why does the phone, even now, have more apps than the desktop? I answer that it was because users could, for the first time, try software and afford to take risks, knowing with certainty it wouldn't steal their bank account. Users on the "free" platforms were implicitly trained to trust no one and take no risks: that .exe (or .deb) could ruin your life.
For the average user, there has never been such an ecosystem of trust as there is now. That’s a sobering indictment about how a “free” or “open” platform can simultaneously be, for most people, user-hostile.
Or another example: We like owning our data, knowing that Docx is ours, as you also complain above.
But talk to many families: so many of them have horror stories of digital loss. Lost photos, lost tax documents, lost memories. Apple charges $2.99/mo. and they'll never lose it (or at least, the odds are lower than a self-inflicted disaster)? For them, the cloud has never felt so freeing.
The phone having more apps is just objectively untrue, isn't it? In the last 50 years of personal computing, the software produced has been so diverse that you couldn't even list everything that was built in a book you could hold. There pretty much wasn't a productivity task, entertainment task, or part of your computing experience that you couldn't somehow customize and speed up with some tool someone built.
If anything, a *huge part* of that software cannot be replicated on your phone because the golden cage owner decided that you're not allowed to have it until they monetize it on you.
Objectively, the phone has had more software. Google Play even now lists 1.6 million apps; Apple has 1.8 million. This does not include delisted apps or LOB apps, so the only relevant comparison is publicly available Windows and Mac apps currently on the market. For context, Steam has around 0.1M. And if you go by sales volume from app stores, Steam had $10B in revenue; Apple had $85B. Apple makes about 4x as much profit from the gaming market as Steam does. (Yes, Steam is actually a gaming-market minority.)
> If anything, a huge part of that software cannot be replicated on your phone because the golden cage owner decided that you're not allowed to have it until they monetize it on you.
Objectively, most people have no desire for old software. Nobody wants a 10-year-old video editor. Even retro video games, the heaviest market for retro software, are a drop in the bucket.
Terrible example. Many professionals, hobbyists, and casuals do, actually. The only reason I still have a Mac is to run ancient versions of CS and Premiere.
The only things I'm really missing are codecs and, well, running it on a modern OS. Still prefer it over the cloud crap. I guess you think I'm a "nobody".
A law firm I worked for had some elderly senior partners who still used typewriters for documents they submitted to court. While they could have had a paralegal type everything in for them, they'd been using their typewriters for probably close to half a century. Muscle memory and a well-established workflow was more important to them than having their documents in a CMS.
> Objectively, most people have no desire for old software. Nobody wants a 10-year-old video editor. Even retro video games, the heaviest market for retro software, are a drop in the bucket.
I'm not talking about old software (yet another weasel thing you dragged into the conversation). I'm talking about software that you're not allowed to have at all - software that automates your phone, software that processes your data, software that modifies how your phone looks or works, software that modifies other software on your phone (e.g. Outlook alone had a massive ecosystem of productivity plugins to serve the user).
The fact that such whole businesses existed, and have existed for decades, directly contradicts your "users don't want it" bull, which comes straight from corporate white-knighting.
There are multiple game stores for PC. Next to Steam you have, e.g., GOG and Itch. How many app stores are there on iOS? And the phone app stores have every category of application, not only games. I doubt you'll find, say, Borland Turbo Pascal on GOG. And that's without going into so-called 'legacy' software. There was so much shovelware made for DOS that it almost makes the Apple App Store look like it's not a heinous garbage dump.
Actually, now I wonder how many text editors have been made for the PC - it must be in the thousands.