> Well, if everyone uses a calculator, how do we learn math?
Calculators have made most people a lot worse at arithmetic. Many people, for instance, don't even grasp what a "30%" discount is. I mean, other than "it's a discount" and "it's a bigger discount than 20% and a smaller one than 40%". I have seen people who don't grasp that 30% is roughly one third. It's just a discount, and they trust it.
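To make the 30% point concrete, here is the kind of mental shortcut I mean. A throwaway sketch (the $60 price is made up for illustration):

    # "30% off" is roughly "a third off": three lots of 30% cover 90% of the total.
    price = 60.0                   # hypothetical sticker price
    exact_discount = 0.30 * price  # 18.0
    rough_discount = price / 3     # 20.0 -- close enough for a sanity check
    print(exact_discount, rough_discount)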
GPS navigation has made most people a lot worse at reading maps or generally knowing where they are. I have multiple examples where I would say something like "well we need to go west, it's late in the day so the sun will show us west" and people would just not believe me. Or where someone would follow their GPS on their smartphone around a building to come back 10m behind where they started, without even realising that the GPS was making them walk the long way around the building.
Not sure the calculator is a good example for the claim that "tools don't make people worse at the core knowledge".
GPS has also ruined our city-level spatial awareness.
Before, you had the map. So you were aware that Fitzroy was to the west of Collingwood and both were south of Clifton Hill and so on. I had dozens of these suburbs roughly mapped out in my mind.
Driving down an unfamiliar road, one could use signs to these suburbs as a guide. I might not know exactly where I was, but I had enough of an idea to point me in the right direction.
>Before, you had the map. So you were aware that Fitzroy was to the west of Collingwood and both were south of Clifton Hill and so on. I had dozens of these suburbs roughly mapped out in my mind.
>Driving down an unfamiliar road, one could use signs to these suburbs as a guide. I might not know exactly where I was, but I had enough of an idea to point me in the right direction.
Reading those sentences feels like I am dreaming.
The exploration...
The possibilities...
Serendipitously finding your way through and getting temporarily lost at night in a big friendly suburban area with trees and in summer...
But how important is the core knowledge if it isn't necessary to achieve the outcomes people actually value? People only cared about map reading skills to the extent that they got them where they wanted to go. Once GPS became a thing, especially GPS on mobile phones, map reading as a way of getting there became irrelevant. Yes, there are corner cases where map reading or general direction-finding skills are useful, but GPS does a vastly better and quicker job in the large majority of cases, so our general way-finding experience has improved.
This is especially true because, in a lot of cases, the past alternative to using GPS to find some new, unfamiliar place wasn't "read a map", it was "don't go there in favor of going some place you already knew". I remember the pre-GPS era, and my experience of finding new stuff is significantly better today than it was back then.
Using map reading skills as a proxy for this is a bit of a strawman. People who use GPS habitually have worse navigational and spatial awareness skills.
If you habitually use a calculator for all arithmetic, could the result not be similar? What if you reach for an LLM for all your coding, general research, etc.? These tools may vastly speed up some workflows, but your brain is a muscle.
I think you're missing the point, which is to say "those tools make us more productive, but less knowledgeable".
And you answer by saying "it's okay to be less knowledgeable (and hence depend on the tool), as long as you are more productive". Which is a different question.
But to me it's obviously not desirable: if AI allows people to completely lose all sense of critical thinking, I think it's extremely dangerous. Because whoever controls the AI controls those people. And right now, look at the techbros who control the AIs.
So the original question is: is it the case that AI reduces the skills of the people who use it? The calculator and the GPS are examples given to suggest that it isn't unlikely.
At the end of the day, it's the average productivity across a population that matters.
So GPS makes people worse at orienteering -- on average, does it get everyone where they need to go, better / faster / easier?
Sometimes, the answer is admittedly no. Google + Facebook + TikTok certainly made us less informed when they cannibalized reporting (news media origination) without creating a replacement.
But on average, I'd say calculators did make the population more mathematically productive.
After all, lots of people sucked at math before them too.
> After all, lots of people sucked at math before them too.
A calculator doesn't do maths, it does arithmetic. People sucked at maths, but I'm pretty sure they were better at arithmetic.
> At the end of the day, it's the average productivity across a population that matters.
You're stretching my example. My point is that AI may actually make the average developer worse. Sure, more productive too. So it will reinforce a trend that has been at work in the software industry for more than a decade: produce more, but worse, software.
Productivity explains why we do it. It doesn't mean it is desirable.
> And no, that's not some sleight of verbal hand in measuring "productive" -- they are able to ship more value, faster.
"Ship more value faster" is exactly a verbal sleight of hand. That's the statement used by every bad product manager and finance asshole to advocate for shipping broken code faster. It's "more value" because more code is more content, but without some form of quality guard rails you run into situations where everything breaks. I've been on teams just like that, where suddenly everything collapses and people get mad.
Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?
At the end of the day, coders are being paid money to produce something.
It's not art -- it's a machine that works and does a thing.
We can do that in ways that create a greater or lesser maintenance burden, but it's still functional.
Detractors of LLM coding tools are manufacturing reasons to avoid using another tool that helps them write code.
They need to get over the misconception of what the job is. As another comment previously quipped 'If you want to write artisanal, hand-tuned assembly that's beautiful, do that on your own time for a hobby project.'
> Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?
I'm tired of engaging with this false equivalence so I won't. Deterministic systems are not the same.
> It's not art -- it's a machine that works and does a thing.
That's right. But what you need to understand is that the machines we create can and do actively harm people: leaking sensitive information, breaking systems, taking down critical infrastructure. We are engineers first and foremost and artists second. And that means designing systems to be robust and safe. If you can't understand that, then you shouldn't be an engineer and should kindly fuck off.
There is a big difference with compilers. With compilers, the developer still needs to write every single line of code. There is a clear and unambiguous contract between the source code and what gets executed (if it's ambiguous, it's a bug).
The thread here was talking about:
> Well, if everyone uses a calculator, how do we learn math?
The question being whether or not AI will make developers worse at understanding what their code is doing. You can say that "it's okay if a website fails one time in a hundred, the user will just refresh and we're still more profitable". But wouldn't you agree that such a website is objectively of worse quality? It's cheaper, for sure.
Said differently: would you fly in a plane for which the autopilot was vibe coded? If not, it tells you something about the quality of the code.
Do we always want better code? I don't know. What I see is that the trend is enshittification: more profit, worse products. I don't want that.
> [With compilers] There is a clear and unambiguous contract between the source code and what gets executed
Debatable in practice. You can't tell me you believe most developers understand what their compiler is doing to anything like that level of unambiguity.
Whether something gets unrolled, vectorized, or NOP-padded is mysterious. Hell, even memory management is mysterious in VM-based languages now.
And yes (to the inevitable follow-up), it's still deterministic. But those are things that developers used to have to know; now they don't, and the world keeps spinning.
> You can say that "it's okay if a website fails one time in a hundred, the user will just refresh and we're still more profitable". But wouldn't you agree that such a website is objectively of worse quality? It's cheaper, for sure.
I would say that's the reality we've been living in since ~2005. How often do SaaS products have bugs? How frequently do mobile apps ship a broken feature?
There are two components here: (1) value/utility & (2) cost/time.
There are many websites out there that can easily take a 1 in 100 error rate and still be useful.
But! If such a website, by dint of its shitty design, can be built with 1/100th of the resources (or 100x websites can be built with the same), then that might be a broader win.
Not every piece of code needs to fly in space or run nuclear reactors. (Some does! And it should always have much higher standards)
> Said differently: would you fly in a plane for which the autopilot was vibe coded? If not, it tells you something about the quality of the code.
I flew in a Boeing 737 MAX. To the above, that's a domain that should have called for higher software standards, but based on the incident rate I had no issue doing so.
> Do we always want better code? I don't know. What I see is that the trend is enshittification: more profit, worse products. I don't want that.
The ultimate tradeoff is between (expensive/less, better code) and (cheaper/more, worse code).
If everything takes a minimum amount of cost/effort, then some things will never be built. If that minimum cost/effort decreases, then they can be.
You and I are of like mind regarding enshittification and declining software/product standards, but I don't think standing in front of the technological advancement train is going to slow it.
If a thing can be built more cheaply, someone will do it. And then competitors will be forced to cheapen their product as well.
Imho, the better way to fight enshittification is creating business models that reward quality (and scale).
> You and I are of like mind regarding enshittification and declining software/product standards, but I don't think standing in front of the technological advancement train is going to slow it.
Note that I'm well aware that I won't change anything. I'm really just saying that AI will help the trend of making most software become worse. It sucks, but that's how it is :-).
The glass-half-full view would be that effective AI coding tools (read: more competent than a minimal-cost human) may actually improve average software quality!
I suppose it depends on how quickly the generative effectiveness improves.
> I'm suggesting you consider it from an objective perspective.
What is objective? That profitability is good? We're destroying our environment to the point where many of us will die from it for the sake of profitability. We're over-using limited natural resources for the sake of profitability. In my book that's not desirable at all.
Companies are profit-maximising machines. The path to more profitability tends to be enshittification: the company makes more money by making it worse for everybody. AI most definitely requires more resources and it seems like those resources will be used to do more, but of lower quality.
Surely that's profitable. But I don't think it is desirable.
I'm unconvinced that calculators have made most people a lot worse at arithmetic. There have always been people who are bad at math. It's likely there are fewer people who can quickly perform long division on paper, but it's also possible the average person is _more_ numerate because they can play around with a calculator and quickly build intuition.
Arithmetic is also near-useless if you have access to a calculator. And it's a completely different skill than reasoning about numbers, which is a very useful one.
But, logically, you need to spend time thinking about numbers to be good at reasoning about them, and the calculator is about reducing that time.
I feel there's a bit of a paradox, with many subjects, where we all know the basics are the absolute most important thing, but when we see the basics taught in the real world, it seems insultingly trivial.
I understand what you're saying, but I'm legitimately unconvinced that learning long division by hand is necessary to master division. If anything, perhaps we should be asking children to derive arithmetic from use of a calculator.
I think it’s pretty hard to reason about numbers without having mastered arithmetic. Or at least having beaten your brain against it long enough that you understand the concepts, even if you don’t have all the facts memorized.
I disagree; I think the focus on arithmetic actually enables people to say they're "bad at math", when symbolic reasoning is a completely different (and arguably much easier) skill. You can easily learn algebra without knowing long division.
Hell, if I had to do long division today without a computer I'd have to re-derive it.
I don't think it's so much about doing long division. To me, it's more about having an intuition that 30/100 is roughly "one third", and that three of those thirds make up the whole thing.
And I don't mean specifically those numbers, obviously. Same goes for 20/100, or understanding orders of magnitude, etc.
Many people will solve a "maths problem" with their calculator, end up with a result that says that "the frog is moving at 21km/s" and not realise that it doesn't make any sense. "Well I applied the recipe, the calculator gave me this number, I assume this number is correct".
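To sketch the kind of sanity check that should fire there (the comparison figures are my own ballpark numbers, not from any source):

    # Order-of-magnitude check: can a frog really be moving at 21 km/s?
    frog_speed = 21_000        # m/s, the calculator's "answer"
    sprinter_speed = 10        # m/s, roughly a world-class sprinter
    escape_velocity = 11_200   # m/s, roughly Earth's escape velocity
    print(frog_speed / sprinter_speed)   # ~2100x faster than a sprinter
    print(frog_speed > escape_velocity)  # True: that frog is leaving the planet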
It's not only arithmetic of course, but it's part of it. Some kind of basic intuition about maths. Just look at what people were saying during Covid. I have heard so many people say completely wrong stuff because they just don't have a clue when they see a graph. And then they vote.
I agree you can learn algebra without knowing (or being good at) long division on paper, but you need to have a good conceptual understanding of what division is and I don't think a lot of people get that without the rote process of doing it over and over in elementary school.
I can do plenty of arithmetic much faster than I could type it on a calculator keypad. That's like saying hardware keyboards are near-useless if you have access to a touchscreen.
Would you be able to do your numerical work without understanding what an addition or a subtraction is?
I feel like arithmetic is part of the basics to build abstraction. If I say "y = 3x + a", somewhere I have to understand what "3 times x" means and what the "+" means, right?
Or are you saying that you can teach someone to do advanced maths without having a clue about arithmetic?
Sure, there have always been people bad at math. But basic arithmetic is not really math. We used to drill it into kids, but we no longer do, and I can usually see the difference between generations. For example, women in my mother’s generation were not prioritised for education, but they are often pretty quick at arithmetic. Yet the kids and young adults I come across pull out their phones for basic additions and divisions. And I find myself pulling out my phone more and more often.
I mean, it’s not the end of the world, and as you’ve said, the raw number of numerate people is rising thanks to technology. But technology also seems to rob people of the motivation to learn somewhat useful skills, and even more so with LLMs.
For instance, you can certainly say that 381/7 is a positive number. And if I say "381/7 = 198", you can easily say that it is clearly wrong, e.g. because you immediately see that ~200 is roughly half of ~400, so it cannot be anywhere close to 1/7th.
I believe that this is an acquired skill that requires basic arithmetic. But if you need a calculator to realise that 381 is roughly twice as big as 198, then you can't do any of the reasoning above.
One may say "yeah but the point of the calculator is to not have to do the reasoning above", but I disagree. In life, we don't go around with a calculator trying to find links between stuff, like "there are 17 trees in this street, 30 cars, what happens if I do 17+30? Or 30-17? Or 30*17?". But if you have some intuition about numbers, you can often make more informed decisions ("I need to wait in one of those lines for the airport security check. This line is twice as long but is divided between three officers at the end, whereas this short line goes to only one officer. Which one is likely to be faster?").
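Roughly the back-of-the-envelope behind that last example (the queue lengths and the per-officer rate are invented for illustration):

    # Long line feeding three officers vs. short line feeding one officer.
    rate = 0.5                          # passengers cleared per officer per minute (assumed)
    long_line, long_officers = 60, 3
    short_line, short_officers = 30, 1
    print(long_line / (long_officers * rate))    # 40 minutes to clear
    print(short_line / (short_officers * rate))  # 60 minutes -- the "short" line is slower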
I see what you're saying, but I just don't care enough about numbers to draw the conclusions you did about the figure you presented. I just see a string of digits.
Try standing in line at a grocery store and listening to people get upset because the amount is much higher than they thought it would be. You will hear statements like "But how is it $43? I didn't buy anything that costs more than $5"
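Which is really just this (the basket is invented, but any ten items under $5 tell the same story):

    # Nothing over $5, yet the total is still about $43.
    basket = [4.99, 4.50, 3.75, 4.25, 4.80, 4.10, 3.95, 4.60, 4.30, 3.80]
    print(round(sum(basket), 2))   # 43.04 -- ten small numbers add up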
People that failed to grasp arithmetic cannot reason about numbers to a useful degree.
> People that failed to grasp arithmetic cannot reason about numbers to a useful degree.
I think you're extrapolating far too much from such a simple interaction, which doesn't imply anything about ability to reason about numbers, just their ability to compute addition. If you say "if a is larger than b, and b is larger than c, is a larger than c?", you're testing numerical reasoning ability.
I'm not confused. A calculator does arithmetic, not maths. The question was:
> Well, if everyone uses a calculator, how do we learn math?
Which doesn't make much sense, because a calculator doesn't do maths. So I answered the question that does make sense: if everyone uses a calculator, do we still learn arithmetic? And I believe we don't.
And then, if we suck at basic arithmetic, it makes it harder to be good at maths.
But somehow I was born in the age of GPS and yet I ended up with a strong mental map and navigation skills.
I suspect there will be plenty of people who grow up in the age of LLMs and maybe by reading so much generated code, or just coding things themselves for practice, will not have a hard time learning solid coding skills. It may be easy to generate slop, but it’s also easy to access high quality guidance.