
I remember back in the before times... when escape analysis debuted for the JVM - which lets it scalar-replace or stack-allocate sufficiently small, short-lived objects that never escape local scope, bypassing GC altogether - our Spring servers spent something like 60% less time in garbage collection. Saying enterprise software allocates a ton of short-lived objects is quite an understatement.
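For anyone who hasn't seen it in action, here's a minimal sketch of the kind of allocation HotSpot's escape analysis can eliminate - class and method names are illustrative, not from any real service:

    // Tiny value-like class of the sort enterprise code allocates constantly.
    final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    class EscapeDemo {
        // 'p' never escapes: it isn't returned, stored in a field, or passed on.
        // Once this method is JIT-compiled, HotSpot can scalar-replace it - x and
        // y live in registers, no Point is ever allocated on the heap, and the GC
        // has nothing to trace or collect.
        static long distSq(int x, int y) {
            Point p = new Point(x, y);
            return (long) p.x * p.x + (long) p.y * p.y;
        }

        public static void main(String[] args) {
            long sum = 0;
            // With escape analysis enabled (the default; see -XX:+DoEscapeAnalysis),
            // none of these million Points should ever hit the heap.
            for (int i = 0; i < 1_000_000; i++) sum += distSq(i, i + 1);
            System.out.println(sum);
        }
    }

Strictly speaking HotSpot does scalar replacement rather than true stack allocation, but the effect on GC pressure is the same: the allocation just disappears.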


You don't even need an Apple TV. My Roku and Amazon TVs were both zero-configuration AirPlay targets. And generally speaking, there isn't any issue with streaming video over wifi - there's a noticeable input lag, but it doesn't desync audio, and videos tend not to be interactive.


Yup. I'm introducing my sister to the masterpiece that is Chrono Trigger by playing an emulated version on my Mac streamed to our Roku TV. Works great. Video is even easier.


I don't think it's necessarily any larger of a leap than any of the other big breakthroughs in the space. Does writing safe C++ with an LLM matter more than choosing Rust? Does writing a jQuery-style Gmail with an LLM matter more than choosing a declarative UI tool? Does adding an LLM to Java 6 matter more than letting the devs switch to Kotlin?

Individual developer productivity will be expected to rise. Timelines will shorten. I don't think we've reached Peak Software, where the limiting factor on software being written is demand for it; I think the bottlenecks are expense and time. AI tools can decrease both of those, which _should_ increase demand. You might be expected to deliver in a month a project that would previously have taken four people that month, but I think we'll have more than enough demand growth to cover the difference. How many business models from the last twenty years that weren't viable would have been, if the engineering department could have floated the company to Series B with only a half-dozen employees?

What IS larger than before, IMO, is the talent gap we're creating at the top of the industry funnel. Fewer juniors are getting hired than ever before, so as seniors leave the industry through standard attrition, there are going to be fewer candidates to replace them. If you're currently a software engineer with 10+ YoE, I don't think there's much to worry about - in fact, I'd be surprised if "was a successful Software Engineer before the AI revolution" doesn't become a key resume bullet point in the next several years. I also think that if you're in a position of leadership and have the creativity and skill to make it work, juniors and mid-level engineers are going to be incredibly cost-effective, because most middle managers won't have those things. And companies will absolutely succeed or fail on that in the coming years.


Tailwind might not be a perfect fit, but it's "just" CSS.


And Tailwind v4 is notably better than v3 in terms of being "CSS first": https://tailwindcss.com/blog/tailwindcss-v4


Where was this line of thinking when it was Obama ordering the DEA to not enforce marijuana laws? Where is this line of thinking when it's a city that chooses not to enforce dog breed restrictions?

The enforcement of law being separate from the passage of law is a key plank in a functioning democracy; it's one of the safety valves against tyranny.


I doubt those events made it to HN, and the questions are obviously from people outside the US who thought that 'Supreme' means 'Supreme'.


Trump has a history of accepting bribes, and that history is very relevant here. Let me know if the mayor of Cleveland is accepting bribes over pit bulls.


While I find it entirely plausible that Trump's character is such that he might accept bribes, I am aware of no credible evidence that he has ever done so.


Companies spending a lot of money at a Trump property then being granted contracts or favorable legislation is a bribe in my eyes.


And for every video of quality on the platform, there's one that's blatant political propaganda, one that's blatant conspiratorial misinformation, one that's sexualizing children, etc.

It's a mixed bag. It has no more to offer than any other social network. Less, some might argue, because of how easy it is to crosspost to the other video networks.

The only way this is different from the loss of other social networks - Vine being the closest parallel - is that the government is shutting down the site and collapsing the ecosystem, rather than private equity.


I think you'll find most people in leadership positions at most companies are not that forward-thinking, proactive, or frankly intelligent. I thought cost-benefit and risk were analyzed for most big company decisions, until I sat in rooms at a Fortune 500 where those decisions were getting made. If you assume that everyone everywhere is doing just barely the minimum to not get fired, you're right more often than not.


Career risk is also a very real motivation. If you are an executive at a company whose competitors are jumping on the AI bandwagon, but you are not, you will have to justify that decision to your superiors or the board and investors. They might decide that you are making a huge strategic blunder and need to be replaced. Being proven right years later doesn't do much for you when you no longer have a job. And if you were wrong, then things look even worse for you. On the other hand, if you do get on the bandwagon yourself and things go sideways, you can always point to the fact that everyone else was making the same mistake.


C4 still smokes them both, doesn't it?


Hard to smoke sub-millisecond pauses, but there may be other axes where it's better. It used to be that people thought Azul was better because it was generational, but now ZGC is as well. My guess is that C4 doesn't have enough of an edge at this point, but I'm happy to see benchmarks that prove otherwise.
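For context, generational ZGC shipped behind a flag in JDK 21 and is the default ZGC mode on newer JDKs; a minimal sketch of running with it (heap size and jar name are placeholders, not anyone's real setup):

    java -XX:+UseZGC -XX:+ZGenerational -Xmx16g -jar my-service.jar

On a JDK where generational mode is already the default, -XX:+ZGenerational is unnecessary and plain -XX:+UseZGC is enough.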


There are three buckets of performance in interactive software: so fast it doesn't matter to the user, slow enough the user notices but doesn't lose focus, and slow enough the user has time to lose focus. The lines are obviously different for each person, which is why some people feel that software is "fast enough" well before others do.

The jump from an i9 to an M1 moved a lot of tasks from group 3 into group 2, some tasks from group 2 into group 1, and was the biggest perceived single-machine performance leap of my professional career. I have an M1 Max or Ultra in my work machine and an M3 Ultra in my personal machine - after two years of the work machine being so visibly faster, I caved and upgraded my personal. The M3 Ultra moves a handful of things from group 2 to group 1, but it's not enough to move anything out of group 3.


I am genuinely curious if those barriers have technical justifications. There's a pretty stark difference (to me, at least) between ignoring standards in order to reinvent better wheels and intentionally diverging from standards to prevent compatibility.

It's a question of whether they're _not_ investing resources to maintain standard behavior, or actively investing resources to diverge from it. If it's the former, I don't find any fault in it, personally speaking.

