
Shouldn't you be doing the opposite, then? Keep the baseline amount so you know what it's like for people without a large budget, and stop patronizing applications that don't have an acceptable footprint in that circumstance?

I'm not a developer of native Mac apps. If I were, I would definitely have a baseline machine for testing.

The Stable Diffusion XL model is ~13GB

That's not a baseline consumer application. See my other reply (re: grandma and little Billy). If you're developing a native Mac app for grandma and little Billy, Apple probably doesn't want you shipping a 13GB model with it. This is an example of the point I'm trying to make: find a way to compress the model so the end user doesn't have to deal with that kind of bloat (or host it in the cloud).
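For the "host it in the cloud" route, here's a minimal sketch (the URL and storage location are hypothetical, not any particular vendor's API) of fetching a large model on first launch instead of bundling it with the app:

  import Foundation

  // Hypothetical URL and destination -- substitute your own hosting and storage paths.
  let modelURL = URL(string: "https://example.com/models/sdxl-compressed.bin")!

  func fetchModelIfNeeded(to destination: URL) async throws -> URL {
      let fm = FileManager.default

      // A previous launch may already have fetched the model.
      if fm.fileExists(atPath: destination.path) { return destination }

      // Stream the file to disk instead of shipping ~13GB inside the app bundle.
      let (tempFile, response) = try await URLSession.shared.download(from: modelURL)
      guard (response as? HTTPURLResponse)?.statusCode == 200 else {
          throw URLError(.badServerResponse)
      }

      try fm.createDirectory(at: destination.deletingLastPathComponent(),
                             withIntermediateDirectories: true)
      try fm.moveItem(at: tempFile, to: destination)
      return destination
  }

That keeps the initial download small and lets you swap in a smaller or better-compressed model later without shipping an app update.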



> I'm not a developer of native Mac apps. If I were, I would definitely have a baseline machine for testing.

You expect every developer to buy a second Mac? They're all doing what you're doing: paying more for the machine with more memory because other applications need it. Then their own application runs fine on that machine, so they never notice the problem for the people with 8GB.

> That's not a baseline consumer application.

It will be before any of these new machines go out of support.


> You expect every developer to buy a second Mac?

It's very easy to check the memory usage of your project (it's right in your face in Xcode). If you test for a few minutes, or leave the app open for a few hours, and usage has ballooned to a few GB, that usually means you have a leak somewhere to fix.
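As a concrete illustration, here's a minimal sketch (hypothetical class names) of the kind of retain cycle that makes memory balloon over a few hours, and that Xcode's memory gauge or the Leaks instrument will flag:

  import Foundation

  final class ImageCache {
      var onEviction: (() -> Void)?
  }

  final class GalleryController {
      let cache = ImageCache()

      init() {
          // BUG: `self` retains `cache`, and `cache.onEviction` captures `self`
          // strongly, so neither object is ever deallocated. Create and discard
          // a few of these per minute and memory climbs steadily.
          cache.onEviction = {
              self.reload()
          }

          // FIX: break the cycle with a weak capture:
          // cache.onEviction = { [weak self] in self?.reload() }
      }

      func reload() { /* refresh the UI */ }
  }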


But the developer of exactly the app that's using all your memory is writing it in Electron on Windows and has never used Xcode.


And is it any wonder Electron has such a terrible reputation?

The thing that really bothers me is that if you look at what regular home and office users were doing on their computers in the '90s, it's almost identical to what those users are doing right now, except that in the '90s they had orders of magnitude slower computers with orders of magnitude less memory. Yet in many cases those '90s computers were MORE RESPONSIVE than what they have today.

All of those countless billions invested in technology haven't done a damn thing for the productivity of Sally the office worker or Billy the 6th grader. Arguably, it's made their lives worse (viz. social media's deleterious effects on mental health). Now everyone's pushing the heck out of AI, and all I see is high schoolers using ChatGPT to cheat themselves out of an education. They can't read (critically), they can't write, they can't even spell!

So in light of all that, why should we be pushing more and more computing power (and memory, the original issue) on regular users who aren't getting any benefit (broadly, to their way of life) out of it?

Gosh, now I sound like a luddite!


> And is it any wonder Electron has such a terrible reputation?

Certainly not, but you can't fix it by putting less RAM in the machines of people with budget constraints. The developers will just pay for more themselves and then not care about those people because people who can't afford RAM generally aren't lucrative customers.

And it's also worth considering what actually causes this.

Developers want their code to work on every platform. They don't want to write different code for each platform. But each platform wants them to have to, because that makes it more likely there will be software that only works on their platform, or that doesn't work on some new competing platform. So they refuse to develop or implement cross-platform standards.

Then someone else has to do it, but that's rather a lot of work, and it turns out the easiest way to do it is to piggyback on the work already done for browsers to make them work on every platform. That's Electron. It's terribly inefficient but it saves the developer a lot of porting work, so it's widely used.

If Apple doesn't like this, they should provide cross-platform native APIs for developing applications.


> If Apple doesn't like this, they should provide cross-platform native APIs for developing applications.

It’s not in Apple’s interest to do that. It would cost a lot of money to develop and only benefit the competition. It would also slow down Apple’s own ability to innovate on the APIs until the competitors catch up.

Or are you saying Apple should develop the APIs for Windows and Linux as well? Why would they do that?


> Or are you saying Apple should develop the APIs for Windows and Linux as well?

What they should do is provide open source implementations of their APIs for Windows and Linux, i.e. make them standards.

> Why would they do that?

Because then people would use them instead of using Electron.


There are a lot of profiling tools. There’s even one in the browser’s inspector. It’s not something exclusive to Xcode. You can even use Task Manager/Activity Monitor/System Monitor in a pinch.
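And if you want numbers from inside the app itself, here's a minimal sketch using the Mach task-info call on macOS (this reports resident size, which only roughly matches the "Memory" figure Activity Monitor displays):

  import Foundation

  /// Returns the current process's resident memory in bytes, or nil on failure.
  func residentMemoryBytes() -> UInt64? {
      var info = mach_task_basic_info()
      var count = mach_msg_type_number_t(
          MemoryLayout<mach_task_basic_info>.size / MemoryLayout<natural_t>.size)

      let kr = withUnsafeMutablePointer(to: &info) { infoPtr in
          infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { rebound in
              task_info(mach_task_self_, task_flavor_t(MACH_TASK_BASIC_INFO), rebound, &count)
          }
      }
      return kr == KERN_SUCCESS ? info.resident_size : nil
  }

  if let bytes = residentMemoryBytes() {
      print("Resident memory: \(bytes / 1_048_576) MB")
  }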



