> Knowing a new language is a lot more than the ability to use google translate.
Machine translation has a poor reputation for the mistakes it makes; but if it ever gets good, then I wonder whether learning foreign languages will be any more popular an activity than learning Latin or Classical Greek is today. Of course reading Homer or Virgil in the original must be much more satisfying than reading them in translation; but the number of people who truly care about that is vanishingly small.
Learning new languages has a lot of benefits; ask ChatGPT about it. One example, connected to recent events, is that it helps people become more cognitively flexible and less tribal. If your favorite burger joint starts serving contaminated food, it's nice to be able to seamlessly switch to Indian or Chinese.
I had a similar reaction, though somewhat weaker, and had to do a double take at the images. On the one hand, at first glance, they aren't as mindlessly hopeless as most other AI-generated imagery. They even make some kind of vague and superficial sense. But of course, if you look closely and try to decipher the details, it all falls apart.
Why do authors think that images like these are better than no images at all?
My point here being: the images are synthetic. I'm not questioning their utility to the article, their quality, or other aesthetics. It's a challenge to the intent of using synthetic imagery while writing against getting too used to synthetic text (and the lack of personal craft in it).
Does the author fail to recognize his own actions? Is this a failure on his part, or a reinforcement of his fears...? Perhaps not a complete contradiction of his general thesis.
I don't personally like the images. I think he could've put together some sort of collage that would have gone along with the text better.
> I have to talk about this one role, in detail, multiple times with every company.
Ok; so how do _you_ talk in interviews about this past job that you regret? If you've had to talk in detail about that one role multiple times, haven't you yet come up with a way to talk through it? Haven't you yet developed, either deliberately or spontaneously, just through the sheer fact of repetition, some kind of story around it?
I've switched completely to audiobooks for any kind of books (fiction primarily, but also non-fiction) that do not have code in them, or any other material that requires visual consumption. I made the switch more than 10 years ago. I am doubtful I'll ever be able to consume heavy prose requiring concentration and re-reading of the same or previous paragraphs via audiobooks, or poetry; so probably no Gravity's Rainbow, or Infinite Jest, or the Bible for me; but for light reading, it's perfect.
I've skimmed through the comments and seen that most people have commented on the cog-in-the-machine thing, or on layoffs in general and how they suck.
To me, the shock from this blog post was about seeing a Chrome developer relations engineer, whom I have grown to admire and who has been doing a stellar job educating web developers on new HTML and CSS features, get the sack. He was one of the best remaining speakers on web topics on the Chrome team (I am still sad about the departure of Paul Lewis and Jake Archibald), and produced a lot of top-notch educational materials (the CSS podcast, the conference talks, the demos).
What does this say about Google's attitude to the web and to Chrome? What does this say about Google's commitment to developer excellence?
I understand that this is a personal tragedy for Adam; but for me personally, this is also a huge disillusionment with Google.
Agreed, Adam really is one of the best at what he does. His talks and demos were always so interesting. My guess is that he'll be at Microsoft shortly.
What Google is saying with this layoff is that they no longer care about web developer relations. Chrome has not been well funded for years.
Firefox did the same thing five years ago, when they fired David Baron, who was one of the top 5 engineers in the world who understood how HTML layout works. He got instantly hired by Chrome.
It is kind of crazy that the core group that moves web standards forward is around 150 people. And most of them did not get rich off it, and have been doing it for decades.
>What does this say about Google's commitment to developer excellence?
Look inside the tensorflow code base for your answer.
I had the Kafkaesque experience of reporting a bug, being told there is no bug by a help desk employee, while the bug was throwing up errors in the official docs.
To top it off, I got a message from one of the onshore team months later saying they were going to solve it, only for that person to be fired within a week.
I've mostly moved to JAX for daily experiments. Hopefully the fact that the codebase is small and modular will mean that when Google Googles all over the project, there will be enough know-how to maintain a community fork.
In the instances where I've needed to run inference in the browser, I have used ONNX Runtime Web to run weights that were exported from PyTorch training. You have to convert your browser data to ONNX tensors, but it's not that bad.
I don’t do actual training in the browser though, so maybe someone else can answer that part.
While possibly a traumatic experience for Adam, I fail to see the significance of this beyond the anecdotal level. And I find it rather odd to argue, after all Google did and didn't do, that this is what causes disillusionment with Google. By now Chrome is basically just a Trojan horse, with advertising and surveillance hidden inside it.
I think the OP explained the broader significance very well: if Google is firing one of the most successful and active web developer relations people they have, it suggests a strategic downgrade of Chrome, the web, and engagement with human developers. That's bad news for anyone who builds for the web or who relies on it as an open platform for the dissemination of information and software.
I think the position you take re. Google and Chrome is an extreme one. It always surprises me that such black-and-white opinions about big tech companies are commonplace even on HN. Yes, Google has done things around privacy that I strongly disagree with, but the idea that Chrome is simply a Trojan horse for advertising/surveillance is absurdly reductive and ignores the history of Google as a company.
Google was, originally, a web-first company. Their business success relied on the web being an open, competitive platform. And, at a time when Microsoft was still trying to maintain monopoly control of personal computing, Google's development of Chrome did a huge amount of good in maintaining and enhancing the web as an open alternative. And they employed a lot of people who genuinely believed in that mission, such as Adam.
Make no mistake, the death or spin-off of Chrome will not be a win for privacy or openness. Building a web browser is a hugely expensive and difficult endeavor, and it has to be paid for somehow. Yes, Google has leveraged Chrome in some ways to collect data, but far less than they could have done, and far less than any successor will have to do, just to keep the lights on. Look at what has happened to Mozilla and Firefox if you need proof.
The fact that Manifest V3 went through and fundamentally nerfed all extensions that just so happen to block ads and offer privacy means these people failed, regardless of their intentions.
> If Google is firing one of the most successful and active web developer relations people they have, it suggests a strategic downgrade of Chrome, the web, and engagement with human developers.
A layoff is not a firing. If Google is doing layoffs, they'll intentionally choose good performers so they can demonstrate it was done for purely economic reasons. Otherwise they get legal issues.
Besides that, Google may not trust its own performance metrics well enough to use them. The VP might assume the director is lying about who's important etc.
Hadn't thought of it this way, but if there is (say) a 50% chance of being forced to divest Chrome, then the EV of your investments in the future is substantially lower.
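To make that concrete (the 50% figure is from above; the valuations are purely illustrative placeholders), the expected value of continued investment in Chrome is roughly

\[
\mathrm{EV} = p \cdot V_{\text{keep Chrome}} + (1 - p) \cdot V_{\text{forced divestiture}}, \qquad p = 0.5,
\]

so if the divestiture outcome is worth far less to Google than keeping Chrome, the expected payoff of investing today drops roughly in proportion.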
The history of Chrome and Google is interesting but not very relevant for assessing their status quo. If anything you'd have to factor in the trajectory (which I did: "by now") and given its direction it certainly wouldn't improve a valuation.
Regarding your "reductive" opinion of Firefox and Mozilla, all I can say is I use Firefox and I'm quite satisfied with it. Ironically, the worst part about Mozilla and its business decisions can be traced back to it being funded by ... Google.
The worst part about Mozilla and its business decisions is that making a browser isn't something you can build a business on because you're competing with free.
Anyone who thought Mozilla wasn't going to eventually turn to evil to maintain their liquidity has no stones to throw at Mr. Argyle regarding naïveté. Is Mozilla Corp (not Mozilla Foundation) a non-profit? No? Then they need to turn a profit, and I don't see a price-tag attached to that browser they make.
No, not unless there is a large American buyer already in place. The DOJ did not force Microsoft to sell IE in 2001 either, and back then there was no US software company anywhere near Microsoft's size that would have purchased it. The DOJ is not in the business of eliminating a global monopoly if a US company holds said monopoly.
I am listening to a podcast that Adam Argyle is speaking on. Hearing what he is passionate about, and then that he got axed by Google, is painful, as it is now clear that Google is not passionate about those things (anymore). It is also painful personally, because it is what I am passionate about (and my job).
Link: https://dev.to/whiskey-web-and-whatnot/leveraging-css-web-de...
This certainly isn't new. I know someone who worked at Google who mentioned the company culture has been souring since the start of the pandemic. I suspect Google will have a slow death akin to Yahoo in the coming years.
It says they are getting ready for the future when some govt agency splits them up, and they are shedding the load now (the areas they will have to sell).
That's a good question. I don't know about the author's credentials, but they seem very high, and indeed I'm sure they would find a job anywhere they apply.
But from the PoV of a higher-up making layoff decisions, the sooner they sever the organization from the rest of the company, the less traumatic it will be when the time comes. I think at this point they realize it's better to start treating Chrome et al. more like Waymo, which seems to have some isolation from the borg collective.
It probably just didn't have enough economic value for the company; from your explanation of the role, I'm not sure I see the value either. The guy probably earned more money in a few years than I would in 15 years of work, so I'm not sure I'd call this a "personal tragedy".
Completely agree. What is a tragedy, though, is that if Google treats their most hardworking engineers like this, they are creating a culture of minimal effort. If this is "just a job" where you can expect to be laid off at a moment's notice with no care for the value of your contributions, then what is the point in doing anything more than what the job description entails? It just incentivizes people to treat their job the same way the company treats its employees. A culture of distrust and minimum effort. It's very sad to see.
> if Google treats their most hardworking engineers like this they are creating a culture of minimal effort
This is bizarre to me because my impression from three years ago was that they were trying to correct for that, and it sounds like they overcorrected right back to it. I spoke with a Google engineer, I think in 2022; he had heard rumors there were going to be layoffs on his team, and he had sent a message to a more senior team member whom he hadn't heard from in months to let her know that she should, to put it delicately, maybe manage her visibility better. And she responded that she had lost interest in the job anyway, hadn't done anything except respond to emails and messages in over a year, and had 99% transitioned to managing a collection of properties she had been accumulating over the years, so if he heard she got laid off, he shouldn't feel bad for her. I'm pretty sure that was in 2022.
To see Google go from tolerating being ghosted by highly compensated senior+ engineers in 2022 to laying off people who were doing excellent and high-profile work in 2025 must be surreal for people inside Google. If this is all accurate, they swung the pendulum from one zone of encouraging laxity and disloyalty right through the healthy zone and into another zone of encouraging laxity and disloyalty with dizzying speed.
> If this is "just a job" where you can expect to be laid off at a moment's notice with no care for the value of your contributions, then what is the point in doing anything more than what the job description entails
I guess the half million dollars yearly (which I'm assuming he made) and the fact that there aren't tons of other places where he can get that kind of money and prestige for doing that kind of job.
I'm not saying I'm loving any of this, but yeah the system we've built treats all of us like replaceable cogs. During good economic times we don't really feel it, but we are now in a rough patch and we see the capitalist economic reality for what it is.
> from your explanation of the role I'm not sure I see the value either
Google has traditionally had a fair number of developer relations engineers. Chrome team alone has several. The current devrels include Una Kravets, Bramus van Damme, Rachel Andrew, possibly Jecelyn Yeen, Oliver Dunk, Matthias Rohmer, probably some others... They help prioritise new browser features through developer feedback, document new features, maintain documentation at web.dev, spec up new features and represent Google at various standardizing bodies, write walkthroughs and tutorials, build demos to showcase new browser features, make explanatory videos, give conference talks, and generally keep us, web developers, up to date with modern browser best practices.
Their value to web developers is immense. Their value to Google is possibly in that good devrels are a living advertisement of web technologies in general, and Chromium-based web browsers in particular. The better developers know browser features, the more attractive and capable UIs they can build for the web, the more consumers will be attracted to the web (including Chrome), and the more money Google will ultimately make via advertisements.
Adam has been so great in this role that it does not make sense to me that Google decided to cut specifically his position.
CSS already has @layer, and @scope is coming (waiting for Firefox).
There are naming conventions like BEM. There are also tools to guarantee unique class names (CSS Modules, 'single-file components' in various JS frameworks).
There are design systems to standardize and reduce the amount of CSS.
There are web components, with their styles encapsulated in the shadow DOM.
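As a rough sketch of that last approach, here is what shadow-DOM style encapsulation can look like; the element name and the styles are made up for illustration (TypeScript, browser-only APIs):

```typescript
// A custom element whose styles live inside its shadow root, so the
// ".label" class below cannot clash with (or be overridden by) page-level CSS.
class FancyButton extends HTMLElement {
  constructor() {
    super();
    const root = this.attachShadow({ mode: 'open' });
    root.innerHTML = `
      <style>
        .label { color: rebeccapurple; font-weight: bold; }
      </style>
      <button class="label"><slot></slot></button>
    `;
  }
}

customElements.define('fancy-button', FancyButton);
// Usage in markup: <fancy-button>Click me</fancy-button>
```

The trade-off is that outside CSS also can't easily reach in, which is exactly the encapsulation being described.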
> The web is the only platform where people routinely chide other people for wanting to use their favorite programming languages.
Oh, I've been plenty chided for wanting to use Node on the server, and not something faster like Go. Although you may have included the web backend in 'the web'.
> Most everyone else tried to find a use for Twitter but couldn’t. I know many early users that either abandoned or deleted their accounts before 2010.
So what changed? Why did twitter eventually become so popular?
Twitter was almost immediately popular, and it stayed popular; it's a revision of history to claim that it wasn't, or that most people abandoned it in 2010. Twitter famously had scaling issues that resulted from demand for its use, and when the server was overloaded, it would display an image of a whale being carried by birds, the infamous "Twitter fail whale" (https://business.time.com/2013/11/06/how-twitter-slayed-the-...).
You can see in the article above that even in 2013 they were talking about Twitter's rise to prominence beginning in 2008.
Twitter was/is a fantastic resource for one-to-many social media communication. Celebrities flocked to it. Media publications analyzed it and ran stories on the platform. The API used to be quite open and basically free, so it plugged into countless apps and was often used in hackathon projects. Hashtags became a signal for trending topics. Even the public '@' tag (don't 'at' me bro) basically came from Twitter (or was, at least, popularized by it). It was a phenomenon. Reaching 10% of the reach of Facebook (which had hit 1 billion users around the same time) is hardly anything to scoff at, and dwarfed the population of nearly every nation on earth. Twitter had outsized influence on the public conversation because you could get a message out to millions from a single account, which wasn't possible with Facebook due to friend requests (at the time, Facebook was more purely a friend-to-friend network, and pretty sure you were restricted to at most 5K friends).
Twitter didn't even require a login to view Tweets. Embedded views in other apps helped to cement its virality.
There's "popular", then there's "every conference talk has @name in it instead of an email" and then there's "heads of state publish stuff there first instead of POSSE".
I'm not saying it wasn't popular, but it was not ubiquitous.
It's more ubiquitous than Facebook among people that matter in public discourse.
Basically anyone with a professional presence that involves talking to the public (publishing papers, blogs, open source projects, etc.) still uses Twitter to talk to the public. A lot of these people have a hidden or deactivated Facebook, but a public Twitter.
This is what I call a failure to measure, or what smart people call bias. To elevate your opinion from silly to valid, you only have to quantify two things:
How much more ubiquitous in this regard is Twitter than Facebook (as a percentage) and what real world impact does that number have?
People, in general, tend to invent their own reality. There are smart terms to describe that behavior from a variety of causes but dummies like me just tend to call it bullshit.
> it's a revision of history to claim that it wasn't or that most people abandoned it in 2010.
Whether this is true or not depends a lot on the social circle you are talking about. I am aware of quite a lot of people who abandoned Twitter after it became more closed with respect to the API, but I am also aware of quite a lot of people who nevertheless did stay.
It took many years for Twitter to become valuable, and it has since lost most of that value. It did not become profitable until 2018 and then went negative again in 2021. https://www.businessofapps.com/data/twitter-statistics/
Twitter usage is also way down, but most analysts stopped treating things like account numbers, message quantity, and visitor counts as meaningful measures years ago, because most of it was determined to come from bots.
Twitter's popularity is illusory. It's a broadcast system that the majority of its users, whether people or bots, sought solely to exploit for offsite metrics.
> Twitter's popularity is illusory. It's a broadcast system that the majority of its users, whether people or bots, sought solely to exploit for offsite metrics.
Regardless of what you think people used Twitter for, there are real-world consequences from people using Twitter to communicate with each other. The Arab Spring is probably the biggest example of that, where people used it for activism, while governments tried to ban it and survive the uprisings happening all around the Arab world.
I'm talking about popularity, not profitability. These are two vastly different things. That Twitter failed to convert its platform into the kind of advertising behemoth that Google and Facebook are is a failure of its business strategy and execution. The social network itself remained extremely popular, and from your article:
> Twitter has 421 million monthly active users, adding 20 million in 2023
It's as popular as it's ever been; however, there's been some rotation in demographics, I suspect.
Consumer surplus isn't captured. It was very amusing to see the accidental live-tweeting of the OBL operation by some guy who heard helicopters. Twitter-crowdsourced analysis of ISIS propaganda led to at least one airstrike. Facebook can't say that.
I think it got popular because of people who noticed they could earn a lot of money by tricking others into thinking they have interesting things to say (advertisement via influencers / trends / bots, etc.). Making it more popular would increase the $$ from these things.
I wasn't among the super-early users, but this is what I get from having used it. Like any other social media, really: trick people into thinking it's cool somehow, and start shoving crap down all the senses you can reach.
Platforms which don't do this don't get big, because they are kept small.
(Maybe a bit of a cynical post, but I don't think it's wrong.)
Hashtags became links around 2009, but I think it was just critical mass. Instead of yelling into the void, it became very easy to stumble upon a community or discussion around a hobby or event, and follows didn’t require approval like friending on Facebook. Because Twitter lacked structure, you didn’t have to find the right place to be or the right people to speak to, you’d just overlap due to retweets and hashtags. So it inverted in some ways the traditional structure of social networks, allowing for emergent and ephemeral events and places (and thereby main characters) to bubble up and recede. You could be part of something without ever having to be admitted. This was somewhat true of the blogosphere but the currency of trackbacks and comments there wasn’t quite as freewheeling and expansive.
If Facebook is the minimum for top-tier then there is only one top-tier social network. Being an order of magnitude off Facebook still makes the network one of the most popular social networks of all time.
The top tier is, essentially, Facebook, Instagram, Youtube and TikTok. These are used by literally billions of people; with the exception of China, virtually everyone on earth is exposed to them fairly directly.
(Also Telegram and WhatsApp are a borderline case; they have the users, and they have social-network-like features, but most of the users are likely not _using_ the social-network-like features; they just use them as messaging apps).
The second tier is things like Snapchat, Twitter, LinkedIn, Reddit, Pinterest, Quora; these are in the 300-600 million user range, so they're big, but you don't have the same sort of universal exposure.
It's a stretch to call YouTube a social network, along with the messaging apps. Facebook & Instagram are two products from the same company, with social identity literally shared across them. If your top tier is Meta + Chinese apps, then we're just going to have to disagree.
"Popular" isn't quite the right word—"significant" might be a bit closer? In my country, at least, Twitter was adopted by the political, celebrity, and media class far more than Facebook ever was.