What remains is a sea of Gen Z designers who weren't yet alive when the foggy glass of Windows Vista seemed like a good idea.
Meanwhile, the talent wars are raging, with every AI company offering 7-figure salaries to the best of Apple's prodigies.
Apple is now the old guard. They're no longer cool, and as a public company their cost controls are too stringent; they can't pay as much. What is Apple to do?
They can give the designers a sense of ownership. It's not a question of how (un)qualified the team is; it's a retention play.
Is the design good? The A and B squads would say no. But this is the best Apple can do these days to keep critical talent engaged.
They'll burn a cycle re-learning fundamental lessons in accessibility, retain talent, and cling to the hope that next year they'll have a midwit Siri that can book a flight with a decent-looking UI.
Alan Dye is the interface design lead at Apple; he's been there since 2006.
One of the lead designers on Liquid Glass is Chan Karunamuni, who's been at Apple since the early 2010s. If you search for more of the names of the design presenters at this WWDC, you'll find a lot of people with similarly long tenure.
So the theory that it's all Gen Z designers with no experience or talent seems pretty weak.
Yeah, sure. But it's more fun to talk in hypotheticals and point fingers at straw people and those young kids that make a fetish of old Nokia phones and dumb tech.
So I'm sure there are 3 Gen Z folks in a trench coat approving the work of those other Gen Z designers.
All this is just delegating blame to whatever "higher powers" are currently in vogue for the domain instead of trying to grapple with the complexity of reality.
We just have to wait for Gen Alpha to bring back flat design 10 or so years from today.
And to think this is the same field that has an issue with ageism, as indicated by this post yesterday. I take serious issue with people over 40 being protected while discrimination against young people "just doesn't exist". It's a clear case of the law being constructed to advantage the already advantaged. It's politically expedient because old people have wealth and influence and young people don't. Would you hire someone who can't demonstrate competence in an interview for the job? Why does it matter if they're 20 or 100? Yet the two cases are treated very differently. You can say you won't hire a 20-year-old because they don't know what they're doing, but can you refuse to hire the 100-year-old because their mental faculties have deteriorated?
Edit: this appears to be a hot take, so I challenge others to take a step back and consider other protected classes and anti-discrimination laws. They don't call out one race or sex, they say they're all protected and the very act of discriminating is not allowed during hiring. They don't say "you can't discriminate against white people or men but others are fine". That's what the ADEA does.
A lot of "old and senior people" also make big mistakes a lot of the time. They're not all-perfect gods. In reality, most successful people are one-trick ponies. They caught lightning in a bottle once early on, which boosted their careers, but that doesn't mean they're still relevant and correct in their decision making today.
Look at John Romero: he knocked it out of the park with Doom 1, 2 and some of Quake, but all his projects since have been flops of catastrophic proportions. Look at Jony Ive's late-career design mistakes at Apple compared to the early successes that were near-perfect in every respect.
Most people can't pull off success after success forever; they bottom out at some point and then decline, some sooner than others, especially in a fast-changing field like tech. So there's a high chance those senior higher-ups at Apple are now dated and out of touch, but still carry the big egos and influence from a bygone era. It happens at virtually every company.
> They're not all-perfect gods. In reality, most successful people are one-trick ponies. They caught lightning in a bottle once early on, which boosted their careers, but that doesn't mean they're still relevant and correct in their decision making today.
I don't think that characterization is quite right either. I'm a big fan of Brian Eno's "scenius" phrasing:
> A few years ago I came up with a new word. I was fed up with the old art-history idea of genius - the notion that gifted individuals turn up out of nowhere and light the way for all the rest of us dummies to follow. I became (and still am) more and more convinced that the important changes in cultural history were actually the product of very large numbers of people and circumstances conspiring to make something new. I call this ‘scenius’ - it means ‘the intelligence and intuition of a whole cultural scene’.
Extremely successful people benefit from the scenius within which they get to operate. But as that context changes and evolves over time, they fail to recreate their earlier wild successes - not because they lost any of their skills (although that can also happen), but because the skills aren't sufficient, and the deep, layered conditions that enabled those wild successes just aren't there anymore.
I think there is something in that. Certainly the world of work does seem to pivot between rewarding people that "do the work" and those that "do the work around the work" but separate themselves from actual execution. 2021-2 was peak middle-manager froth, and we're on a swing toward more operator-led now. Usually the middle-management "present the work upwards" types dominate, though.
I could believe that this happens at Apple if it wasn't for the executive veto that pushed stuff like the Touch Bar and Butterfly Keyboard to consumers. It sounds less like "very large numbers of people" conspiring, and more like a select few conspirators hand-picking the contributions they think would sell well.
> Look at John Romero: he knocked it out of the park with Doom 1, 2 and some of Quake, but all his projects since have been flops of catastrophic proportions.
And the other guys from id haven't exactly recaptured the same magic either. It's a shame they broke up; it turns out the team was way stronger together than any of them has been on their own.
I like to observe how organization affects how a company operates. As soon as you create a department, that department will start to generate reasons why it should remain a department, as a sort of self-preservation instinct. If you establish a design department, they will start planning complete redesigns sooner or later -- they need to have something going on to justify their existence. When I see this type of redesign, I can't help but wonder whether it was cooked up so that the design department could have a place at the table.
As a tangent, HR departments are very often affected by this as well. As soon as you have a large enough HR department, they will start generating ideas about how to waste other teams' time. They have to justify their existence by organizing events, trainings, and activities, even if these actively harm the bottom line.
“Pournelle's Iron Law of Bureaucracy states that in any bureaucratic organization there will be two kinds of people:
First, there will be those who are devoted to the goals of the organization. Examples are dedicated classroom teachers in an educational bureaucracy, many of the engineers and launch technicians and scientists at NASA, even some agricultural scientists and advisors in the former Soviet Union collective farming administration.
Secondly, there will be those dedicated to the organization itself. Examples are many of the administrators in the education system, many professors of education, many teachers union officials, much of the NASA headquarters staff, etc.
The Iron Law states that in every case the second group will gain and keep control of the organization. It will write the rules, and control promotions within the organization.”
I see this daily in our banking megacorp. We have IT security teams which permeate all other IT activities like ink on paper. On its own it's a good approach, obviously; we haven't, for example, ever been hacked or scammed in any high-profile case.
But there is no limit to how much additional security you can bring, so they bring all of it. Recently we had to get a new Tomcat distribution deployed via the Chef tool, our own package of it of course. Now it runs under 2 unix users, each owning various parts of Tomcat. The main startup config (options.sh) is owned by root, to which we will never ever get access; all changes have to go through a complex approval and build process via Chef. Servers disconnect you after 2-3 minutes of inactivity, so if you deal with a small cluster you need literally 16 putty sessions open which constantly try to log out. And similar stuff everywhere: in all apps, laptops, the network, etc.
All this means that previously simple debugging now becomes a small circus and a fight with the ecosystem. Deliveries take longer; everything takes longer. Nobody relevant dares to speak up (or even understands the situation), so as not to be branded a fool who doesn't want the most security for the bank.
I would be mad if this were my company, but I go there to collect paychecks and fund an actual life for me and my family, so I can handle it. For now, at least.
Alternative approach, also from a financial services world: VMs are created with a DSL on top of qemu/firecracker, containers with Dockerfiles. Cyber are part of an image review group alongside other engineers that validates the base images.
But: no interactive access to any of these VMs at all. There are hypervisors running on bare metal, but the SRE teams have that scripted pretty well, to the point that a physical server can be added in a day or so. It does mean you have to be serious about logging, monitoring and health.
This is one instance where we got it right (I think). We do have some legacy servers we’re trying to get rid of. But we’ve learnt we can run even complex vendor apps this way.
Conway’s Law comes to bite us in other ways though! Like I said, it’s a bear.
In large companies, each project is approved at each stage by a steering committee, and then, as appropriate, by more senior committees, senior leaders, and eventually the CEO and the board.
The poster above is right that if you create a design team they will want to justify their existence, but it's the controls above and around it that are responsible for keeping them in check.
This view is very "hackernewsy" and reveals a lot more about the mindset around here than about what is going on with Apple. Firstly, I don't think there is much turnover on the Apple design team, except when Ive left, but I guess that was mainly due to the CEO change.
Secondly, I remember a time when Microsoft came around the corner with flat design on their phones and the iPhone all of a sudden looked outdated. Apple adopted a flat look shortly after, and they did it pretty well.
Thirdly, and most important: no one does gaussian blurs and macro and micro transitions better than Apple, and it's a key part of their success. They are taking it one step further now. Even if it doesn't improve the experience for users, it could help distinguish them visually. And there is nothing wrong with that.
Aero Glass in Windows Vista and 7 worked quite well. Virtually no applications had the glass everywhere. Many stayed with the default of only having a glass title bar and window border. Some apps extended it a little to cover a toolbar or two. Also, the glass effect was simpler, and had enough contrast by default (and the colour and transparency were customizable), whereas Apple has the glass everywhere and often with unreadable text.
What Apple demonstrated in their first OS demo is not yet finished, and I'm sure they'll add some more frosted glass effects for legibility and such. What they show off in the video looks fine to me, and the explanation that comes with the visuals shows that, at least from a designer's point of view, all of the weird stuff that jumps out in the macOS demo was violating the design principles.
I loved Aero, and I bet once Apple adds the diffuse glass to the places it needs to for legibility, this will look great too.
Siri to book a flight? I just want it to reliably tell me what time a specific meeting is tomorrow, know that when I ask where Mount Etna is, I don’t mean a city in the USA, and stop randomly ignoring me when I talk to it.
Apple are much further behind with Siri than they realise.
> Apple are much further behind with Siri than they realise.
I think Apple realises it far better than you’re giving them credit for. They simply haven’t been able to do anything about it yet, even though they’re clearly trying.
It seems like they are trying to unify the UX for vision OS and other devices and have them finally morph with the AR interfaces that are to come. There is probably a bigger vision behind this than just shiny visuals.
I'm not an Apple fanboy, but Apple has been at the forefront of a lot of design decisions that other companies later follow. So whilst I don’t agree with the liquid design, I suspect there’s more to it than meets the eye.
I get the impression that most (myself included) think there is nothing more than meets the eye - which is why some say that Steve Jobs is rolling in his grave.
They have been doing this slowly over the past several years. I decided to move from macOS to Linux the day settings turned into a scrolling iOS-style list rather than an actual settings menu.
I think this too. Microsoft thought something similar when they tried to unify Windows, Xbox, Windows Phone, and Windows RT in one design language.
With how badly Apple's VR headset actually sold, I don't think they're going to go for a unified AR-first approach just yet. Then again, Apple did think their VR headset was a good idea, so maybe they're just high on their own supply.
Liquid Glass looks really good, so I'm not sure what you're talking about with their A team being gone. All these other companies wish they had Apple's design team.
Does the A-squad include Jony Ive, who gave us butterfly keyboards and the Touch Bar (the initial revision of which, IIRC, did not have a separate physical Esc key)? Though Ive did get rid of skeuomorphism.
By replacing skeuomorphism with minimalism, Ive's anti-skeu was a cure nearly as bad as the disease. They were right to move away from skeuomorphism, but they did so recklessly, giving us a UX where almost all cues that an element is "clickable" were stripped away.
Ive hasn't done a single impressive thing after Jobs' departure. To the extent that Ive did anything noteworthy, it was with Jobs as visionary, product director and tastemaker. Outside of that relationship, his work has been derivative of prior Apple design success, or embarrassingly wrong-footed. Factoring in the lag time of product cycles, it's astonishing how rapidly Apple improved after Ive's departure.
Perhaps people can argue with me: I claim skeuomorphism jumped the shark with the pseudo reel-to-reel playback UI in ... was it the Podcasts app? Or maybe people think it was Notes with the torn edge along the top margin.
Regardless, skeuomorphism seems to have gone too far at some point. Perhaps it became overly cute, overly precious, pretty-pretty.
Skeuomorphism was said to have been the thing in early GUI computers, metaphors for real objects, that helped users new to those interfaces understand them. Dragging a file icon that looked like a dog-eared piece of paper to a trash can icon on the screen (to delete the file) is the most obvious example.
I suspect that by the time the Web came around, users had to become more comfortable with being bombarded with all manner of wild UI paradigms, and they learned to more or less cope. Skeuomorphism, like training wheels, was perhaps not really needed as much as it had been a decade earlier.
What's wrong with skeuomorphism? It is wrong if it is done wrong, like everything, but done right, it looks good and feels familiar. It is pretty much the standard in music production software and people don't seem to complain about it.
I think, more accurately, Apple's A-tier editor, imperfect as he was, passed away in 2011, and no one replaced him.
It has been a downward slope since then, once the momentum from his tenure dissipated after his death.
Turns out, I didn’t like the operating system Apple made. I liked the OS Apple made while being curated and directed by Steve Jobs. His taste matched mine in a lot of important ways.
I don’t really believe the narrative that this is the C-team running things now. A complete redesign like this would require approval from numerous executive stakeholders. My guess is that it’s connected to the Apple Vision project - possibly they’re working on a new device at a more consumer-friendly price point.
Aero is leaps and bounds more aesthetically pleasing and easier to work with than flat crap. Sooo glad we don't have to suffer more of that after a decade+.
That’s a BS take. iOS design is one of the most coveted roles, if not the most important role, you can get as a designer. It reaches billions and influences everything else. Just because we are not impressed with Apple’s direction doesn’t mean these roles at Apple are not highly sought after. People would work for free to have that on their CV. Not everyone is motivated by pay, and this is especially true among people with actual talent.
"The difference between the most recent FMV (409A) valuation and your exercise price."
This will almost never be the case. This doesn't account for different share classes, liquidation preferences, preferred stock, all of which get exercised before common shares.
A better description would be "the difference between the most recent 409A valuation, minus preferred treatment, and your exercise price."
All of that is moot though, as an employee wouldn't have access to the cap table or liquidation stack. The short answer is you'll have no idea how much your equity is worth until you get the wire transfer into your bank account.
Equity as an incentive truly favors the employer. With vesting, equity rarely works out to be better than having a market rate salary, unless the company becomes a household name.
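For anyone who hasn't seen a liquidation waterfall before, here's a minimal Python sketch of the mechanic being described, assuming a single class of 1x non-participating preferred; every number is hypothetical, and real cap tables have more layers than this.

```python
# Toy liquidation waterfall: one class of 1x non-participating preferred.
# All numbers are hypothetical; real cap tables have more layers.

def common_per_share(exit_proceeds, pref_invested, pref_shares, common_shares):
    """What one common share receives when the company is sold."""
    total_shares = pref_shares + common_shares
    as_converted = exit_proceeds * pref_shares / total_shares
    if as_converted >= pref_invested:
        # Preferred does better converting to common; everyone shares pro rata.
        return exit_proceeds / total_shares
    # Otherwise preferred takes its 1x preference off the top first.
    return max(exit_proceeds - pref_invested, 0) / common_shares

# $8M exit against $10M of invested preferred: common (and thus options) gets $0.
print(common_per_share(8e6, 10e6, 10e6, 5e6))   # 0.0
# $30M exit: preferred converts and common is worth $2.00/share.
print(common_per_share(30e6, 10e6, 10e6, 5e6))  # 2.0
```

The point is just that the headline valuation tells you little about the common payout until the deal actually closes.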
>This will almost never be the case. This doesn't account for different share classes, liquidation preferences, preferred stock, all of which get exercised before common shares.
>Equity as an incentive truly favors the employer. With vesting, equity rarely works out to be better than having a market rate salary, unless the company becomes a household name
I've worked at 4 different startups. Two were acquired and two are still going: one makes a small profit and is a lifestyle business for the founder, and the other has a great product and is still growing.
For the two acquisitions, one of which I held 1% equity in, the value of my options was $0, which was very disappointing. In that case, I did get a cash bonus as the VP Engineering and an offer from the acquiring company that was 3X my cash comp, but the stock was worthless.
At this point in my career, I value stock in private companies at exactly $0 and treat it like a nice bonus should it ever amount to anything.
409a valuations explicitly take into account share classes/liquidation preferences. That's kind of the point. If the Preferred last sold for $1.00, the 409a might value the Common at $0.10 per share, which would then typically be the FMV strike price set in the next round of issued options.
If the Common FMV has been steadily increasing from when you received your options, that would typically be a positive sign. Of course, 409a valuations are based on mathematical models. Since Common shares are so illiquid in a private start-up, you don't "really" know what they're worth until a liquidity event.
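As a rough illustration of that relationship: the $1.00 preferred price and $0.10 Common FMV are from the comment above, while the later FMV and option count are made-up assumptions.

```python
# Hypothetical illustration of the "FMV minus strike" paper spread.
preferred_price = 1.00   # what the Preferred last sold for (from the comment above)
strike = 0.10            # Common FMV (409A) when the options were granted
later_fmv = 0.40         # Common FMV at a later 409A (assumed)
vested_options = 10_000  # assumed vested option count

discount = strike / preferred_price           # Common valued at 10% of Preferred
paper_spread = (later_fmv - strike) * vested_options
print(f"Common at grant was {discount:.0%} of the preferred price")
print(f"Paper spread today: ${paper_spread:,.0f}")   # $3,000

# The cash actually received at exit still depends on the waterfall,
# not on this paper number.
```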
1. People were creating quirky, whimsical, odd corners of the internet for nobody but themselves. Art.
2. Entrepreneurs were starting to build sophisticated web applications for other people, i.e. customers.
Nielsen's dogma was excellent for the latter, and disastrous for the former.
History has been kind to Nielsen in the sense that the modern web has lost most or all of its charm for the sake of answering the question "but how does it make money?"
There are generational policy and societal shifts that need to be addressed somewhere around true Competent AGI (50% of knowledge work tasks automatable). Just as with climate change, we need a shared lexicon to refer to this continuum. You can argue for different values of X, but the crucial point is that if X% of knowledge work is automated within a decade, there are obvious risks we need to think about.
So much of the discourse is stuck at “we will never get to X=99” when we could agree to disagree on that and move on to considering the X=25 case. Or predict our timelines for X and then actually be held accountable for our falsifiable predictions, instead of the current vibe-based discussions.
It’s a good point. For epistemic hygiene I think it’s critical to actually have models of the growth rate and what it implies. E.g. we are seeing exponential growth on many capability metrics (some with doubling times of 7 months), but haven’t joined this up with economic growth numbers. In models where the growth continues, you could imagine things getting crazy quickly: e.g. one year AI contributes 0.5% of GDP, only measurable in retrospect; the next year 2%; the year after 8%.
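To make the compounding concrete, here is a back-of-the-envelope sketch: the 7-month doubling time and the 0.5% / 2% / 8% sequence are the figures above, and everything else is an illustrative assumption, not a forecast.

```python
# Back-of-the-envelope only; illustrative, not a forecast.
doubling_months = 7
annual_multiplier = 2 ** (12 / doubling_months)
print(f"A {doubling_months}-month doubling time is ~{annual_multiplier:.1f}x per year on the metric")

# The 0.5% -> 2% -> 8% GDP-contribution example compounds at 4x per year:
contribution_pct = 0.5
for year in range(1, 4):
    print(f"Year {year}: ~{contribution_pct:g}% of GDP from AI")
    contribution_pct *= 4
```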
Personally I don’t think politicians are capable of adapting fast enough to this extreme scenario. So they need to start thinking about it (and building and debating legislation) long before it’s truly needed.
Of course, if it turns out that we are living in one of the possible worlds where true, economically meaningful capabilities are growing more slowly, or bottlenecks happen to appear at this critical phase of the growth curve, then this line of preparation isn’t needed. But I’m more concerned about downside tail risk than about the real but bounded costs of delaying progress by a couple of years. (Though of course, we must ensure we don’t do to AI what we did to nuclear.)
Finally, I’ll note in agreement with your point that there is a whole class of solutions that are mostly incomprehensible or inconceivable to most people at this time (i.e. currently fully outside the Overton window). E.g. radical abundance -> UBI might just solve the potential inequities of the tech, and therefore make premature job-protection legislation vastly harmful on net. I mostly say “just full send it” when it comes to these mundane harms; it’s the existential ones (including non-death “loss of control” scenarios) that I feel warrant some careful thought. For that reason, while I see where you are coming from, I somewhat disagree with your conclusion; I think we can meaningfully start acting on this as a society now.
I like your idea of developing a new economic model as a proxy for possible futures; that at least can serve as a thinking platform.
Your comment inspired me to look at historical examples of this happening. Two trends emerged:
1. Rapid change always precedes policy. I couldn't find any examples of the reverse. That doesn't discount what you're saying at all; it reiterates that we probably need to be as vigilant and proactive as possible.
and related:
2. Things that seem impossible become normative. Electricity. The Industrial Revolution. Massive change turns into furniture. We adapt quickly as individuals even if societies collectively struggle to keep up. There will be many people that get caught in the margins, though.
This is some excellent first-principles thinking. Sort of like resisting the next team hire until it's unbearable, or keeping the user out of the browser until it's unbearable.
Rolex makes 1,000,000 new watches per year, and the wait lists are years-long.
There is definitely enough demand for all of those paintings to be sold for more than you'd think.