According to the article's graph of the fresh grad unemployment rate, the present climate is about as bad as in 2003 but less than a third as bad as it was in 2010. Unemployment during the pandemic spiked well above 2010, but only briefly, before returning to pre-pandemic levels.
Krouse points to a great article by Simon Willison, who proposes that the killer role for vibe coding will (hopefully) be to make code better, not just faster.
Generating prototypes based on different design models lets each end product be assessed against specific criteria, like code readability, reliability, or fault tolerance, and then quickly and repeatedly revised to serve those ends better. No longer would the victory dance of vibe coding be simply "It ran!" or "Look how quickly I built it!"
This is my hope as well. We now have time to write things a bit better: comment on the PR with a quick improvement and it can just happen. But I'm failing to convince people at work. The majority seem happy just for the code to go away and for us to never think about it again.
I abhor small talk. It's physically painful. I've heard that's common in some cultures, especially among northern Europeans.
I think the trick to conversing on a more engaging level is to introduce topics that invite deeper thought. Somehow you need to intrigue the other person: compel them, through curiosity, to leave their comfort zone and join you where you'd prefer to be.
IMO, even disagreement can be agreeable if it's not confrontational, and if you genuinely express curiosity about what they think and what they care about.
I don't think so. A decent C programmer could pretty much imagine how each line of C was translated into assembly and, with certainty, how every byte of data moved through the machine. That's been lost with the rise of higher-level languages, interpreters and their bytecode, the explosion of libraries, and especially the rise of cut-and-paste coding. IMO, 90% of today's developers have never thought about how their code connects to the metal. Starting with CS101 in Java, they've lived entirely within an abstract level of source code. Coding with AI just abstracts that world a couple of steps higher, not unlike what templates in 4GLs attempted but failed to achieve, though of course the abstraction has now climbed far beyond that level. Software craftsmanship has indeed left the building; only the product matters now.
The problem for software artisans is that unlike other handmade craftwork, nobody else ever sees your code. There's no way to differentiate your work from that which is factory-made or LLM-generated.
Therefore I think artisan coders will need to rely on a combination of customisation and customer service. Their specialty will need to be very specific features that aren't catered for by the usual mass code-generation market, paired with swift and helpful support.
If fluid intelligence is based on the ability to recognize new patterns (unsupervised learning) and crystallized intelligence on recognizing known patterns (supervised learning), then age alone, more than physiology, may be what differentiates the two.
Youngsters know no patterns, so they can't match new events to known ones. Oldsters know that most seemingly new stuff is not really new; it's just the same old stuff. So they reduce the cost of thinking and reject the noise by adding the new unlabeled event to an existing cluster rather than creating a new, noisy one. That's wisdom. But it's also a behavior that will inevitably increase as we age, as our clusters establish themselves and prove their worth.
So aren't those two forms of intelligence less about a difference in brain physiology and more about having learned to employ common sense?
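The cluster-reuse idea above can be sketched as a toy "leader" clustering loop: a new event is absorbed into an existing cluster if it's close enough to a known centroid (the "same old stuff" response), and only spawns a fresh cluster when nothing familiar matches (the genuinely-new-pattern response). The threshold and the 2-D events are illustrative assumptions, not a model of actual cognition.

```python
import math

def assign(event, centroids, threshold=1.0):
    """Return the index of a matching cluster, or create a new one."""
    for i, c in enumerate(centroids):
        if math.dist(event, c) <= threshold:
            return i  # recognized: reuse the existing cluster
    centroids.append(list(event))
    return len(centroids) - 1  # novel: pay the cost of a new cluster

centroids = []
events = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (0.1, 0.0)]
labels = [assign(e, centroids) for e in events]
print(labels)          # similar events end up sharing a label
print(len(centroids))  # only genuinely new events added clusters
```

The "wisdom" knob here is `threshold`: raise it and almost everything gets filed under an existing cluster, which is cheap but blind to real novelty.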
More than any socio-economic factor, the chief thing that advances US political candidates is, simply, fame. These days fame is achieved by somehow becoming an outlier: loud extremism, incessant self-promotion, and spending truly insane amounts of money. Intelligence of any kind is irrelevant.
Yeah. The right hasn't been able to repeat Trump; other candidates following his playbook have usually failed. And I think it's because they don't have his three-plus decades of lowest-common-denominator fame, nor the money that let him buy his way out of repeated business failure and corruption. It's a perfect storm.
At the very least, every school, subject, and teacher should be obliged to conduct experiments during the school year, A/B/C trials exploring various forms of note-taking: handwritten, computer-typed, and none.
Then see how each affects the kids' learning speed and retention across the various subjects, and have teachers compare notes with one another to learn what they did differently and what did or didn't work.
Ideally they'd also assess how this worked for different types of students: those with good vs. bad reading skills, with good vs. bad grades, and especially those underperforming their potential.
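The kind of summary such a trial would produce can be sketched minimally: retention scores grouped by note-taking condition and broken down by one student attribute. All the numbers and the `strong_reader` attribute are made-up placeholders, not data from any real study.

```python
from statistics import mean

# Hypothetical rows: (condition, strong_reader, retention_score)
scores = [
    ("handwritten", True, 82), ("handwritten", False, 70),
    ("typed",       True, 78), ("typed",       False, 64),
    ("none",        True, 75), ("none",        False, 58),
    ("handwritten", True, 88), ("typed",       False, 61),
]

def summarize(rows, condition, strong=None):
    """Mean retention for one condition, optionally filtered by reader strength."""
    picked = [s for c, r, s in rows
              if c == condition and (strong is None or r == strong)]
    return round(mean(picked), 1)

for cond in ("handwritten", "typed", "none"):
    print(cond, summarize(scores, cond),
          "strong:", summarize(scores, cond, True),
          "weak:", summarize(scores, cond, False))
```

With real classes you'd also need sample sizes large enough per subgroup for the comparison to mean anything, which is exactly where per-classroom trials get hard.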
The idea that we would A/B test handwritten vs. typed notes to see what would improve retention focuses on the wrong thing. It's like A/B testing mayo or no mayo on your Big Mac to see which version is a healthier meal. No part of the school system is optimized for retention. It's common for students to take a biology class in 9th grade and then never study biology again for the rest of their lives. Everyone knows they won't remember any biology by the time they graduate, and no one cares.
We know what increases retention: active recall and (spaced) repetition. These are basic principles of cognitive science that have been empirically demonstrated many times. Please try to implement those before demanding that teachers run A/B tests over what font to write the homework assignments in.
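Spaced repetition with active recall is simple enough to sketch as a Leitner-style scheduler: each successful recall promotes a card to a box with a longer review interval, and a miss sends it back to the start. The box intervals below are an illustrative assumption, not a prescribed schedule.

```python
import datetime

INTERVALS = [1, 2, 4, 8, 16]  # days until next review, per box

def review(card, correct, today):
    """Update a card dict after one active-recall attempt."""
    if correct:
        card["box"] = min(card["box"] + 1, len(INTERVALS) - 1)
    else:
        card["box"] = 0  # forgotten: restart the ladder
    card["due"] = today + datetime.timedelta(days=INTERVALS[card["box"]])
    return card

today = datetime.date(2024, 9, 1)
card = {"box": 0, "due": today}
card = review(card, correct=True, today=today)         # box 1, due in 2 days
card = review(card, correct=True, today=card["due"])   # box 2, due in 4 days
card = review(card, correct=False, today=card["due"])  # back to box 0
print(card["box"], card["due"])
```

The point is that the payoff comes from *when* you test yourself, not from the medium you took the original notes in.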