> The problem with deriving the quadratic equation that way is that for most people, that is a lot of symbols to keep track of, and you have to not only do an unintuitive "completing the square" step but you have to unintuitively do it fully generically.
"Unintuitive" depends entirely on your introduction to the topic. If you're already completing the square, using it to solve quadratic equations you cannot factor is not unintuitive.
As far as doing it fully generically, well, how else do you get a generic formula? When teaching this, we would do some completing the square to solve quadratics, and then tell our students:
"You know, this is annoying to have to complete the square EVERY TIME. What if we just decided to solve it the really hard way once, with A, B, and C in the equation instead of the numbers, and see what we get?"
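Spelled out, that one generic "really hard way" pass is the standard completing-the-square derivation (sketched here for reference; any curriculum's version will look essentially like this):

```latex
\begin{align*}
ax^2 + bx + c &= 0 \\
x^2 + \tfrac{b}{a}x &= -\tfrac{c}{a} \\
x^2 + \tfrac{b}{a}x + \left(\tfrac{b}{2a}\right)^2 &= \left(\tfrac{b}{2a}\right)^2 - \tfrac{c}{a} \\
\left(x + \tfrac{b}{2a}\right)^2 &= \frac{b^2 - 4ac}{4a^2} \\
x &= \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\end{align*}
```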
I think you're speaking from the perspective of someone very casually comfortable with symbol manipulation. This does not describe the average middle school student. When I said "unintuitive", I was speaking from the perspective of an average middle school student, for whom this is all either at the edge of their ability or, often, a bit past it, and for a non-trivial number of them, way past it.
In high school, I was a tutor for a non-accelerated, non-honors class at about this level. After years of being in the accelerated course, it was a bit of an eye-opening experience. There are a lot of people who are just passed through this stuff with a C-, and I'm not even sure that's wrong, because a lot of people are never going to get to the point where they can fluidly derive any of these equations. What you, and probably a great deal of the HN commentariat, experience as "average" is actually way above average.
(And the students I was tutoring for, in the parlance of the day, would still mostly be considered "privileged". I would still not be calibrated for the mathematical skill of the truly disadvantaged.)
I teach mathematics and write math curriculum professionally, and have literally taught the lesson I describe above for the better part of a decade. I say "we" in that post because in more than half of those classes, I did not teach it alone, but in a co-taught inclusion class for special education and general education students together. None of the classes where I used this lesson was an "honors"/"accelerated"/"pre-AP"/etc. class.
Every single one of those students was able to derive the quadratic formula by completing the square. It was not easy for some of them, but every single one did it.
Your parenthetical also implies to me that you think that the "truly disadvantaged" have less "mathematical skill". I would encourage you to reflect on that.
> To me it is obvious that the method in the article is far superior than teaching completing the square.
I disagree. I would need some convincing that "two numbers that multiply to C and sum to B must have an average of B/2, so they must be B/2 + z and B/2 - z, so (B/2 + z)(B/2 - z) = C" is by any means obviously superior to completing the square. Neither is immediately intuitive; both will require prompting and teaching by the teacher. Completing the square has uses beyond deriving the quadratic formula; this does not.
I should say: I find this an incredibly cool and level-appropriate proof of the quadratic formula, but I think its merits as an improvement in pedagogy are dubious.
I doubt I can convince you. I’m just going by my experience teaching the topic. When students first learn to solve such equations, they have just been taught factoring and what it means to factor a trinomial. They know the product of the constant terms in the binomials must be c. It’s also easy to explain that the average of two numbers is the midpoint, and thus that if I start from the midpoint, I can get back to the two numbers I averaged by adding and subtracting some amount from it. The geometry makes this easier to explain than completing the square.
I’ve seen a shocking number of calculus students struggle with completing the square. The merits of the approach in the article are entirely obvious to me but like everyone else I’ve had my share of obvious beliefs turn out to be false.
It's phrased in a funny way, but this: "two numbers that multiply to C and sum to B must have an average of B/2, so they must be B/2 + z and B/2 - z" is pretty obvious.
If x + y = B, then the average of x and y is (x + y)/2 = B/2.
B/2 is then the number in between x and y, so you can represent x and y as B/2 + z and B/2 - z (where z is just half the distance between x and y, i.e. |x - y|/2).
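As a sanity check, that argument runs directly as code. This is a sketch (the function name is made up, and `cmath` is used only so the complex-pair case works too):

```python
import cmath

def numbers_with_sum_and_product(B, C):
    """Find u, v with u + v == B and u * v == C.

    The pair must average B/2, so write them as B/2 + z and B/2 - z.
    Then (B/2 + z)(B/2 - z) = B**2/4 - z**2 = C, so z = sqrt(B**2/4 - C).
    """
    z = cmath.sqrt(B * B / 4 - C)
    return B / 2 + z, B / 2 - z

# Example: numbers summing to 5 with product 6 are 3 and 2.
print(numbers_with_sum_and_product(5, 6))
```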
You are correct. It is generally expected to be part of an Algebra I curriculum (as per Common Core Appendix A's Traditional Pathway).
> Common Core High School: Algebra » Reasoning with Equations & Inequalities » Solve equations and inequalities in one variable. » 4 » a
> Use the method of completing the square to transform any quadratic equation in x into an equation of the form (x - p)^2 = q that has the same solutions. Derive the quadratic formula from this form.
> The part that really pisses me off is the "10 strikes and we wipe your phone bit". That happened to me too on an infrequently used iphone last year.
This is an optional setting, and I'm relatively certain it used to be off by default. I think my newest iPhone specifically prompted me about it during setup.
iPads can use GraphNCalc83[1] and Apple Classroom on a teacher iPad/Mac to restrict students to the app[2]. I'm not sure if there's an android equivalent.
I still find using a touch screen much more frustrating than a calculator with physical buttons, but this is a legit alternative.
A few years ago, Jacques Mattheij accidentally won a lot more bulk lego auctions than he meant to. His story of sorting through them with a homemade hopper/camera/ML/computer vision showed up on HN.
I felt the opposite. Tesla's answer came across the worst to me, and reads to me with the over-specificity of a student making excuses for why their paper isn't done on time.
> "It was a supplier-related issue".
Passing the buck is really not a great look to start with. It makes the rest of the response seem like a series of excuses.
Tesla chose the supplier. Tesla built and inspected and sold and shipped the car with those parts. This is a Tesla-related issue. The only time Tesla should mention a supplier issue is in response to, "why can't you sell me a car?"
> "[It] did not pose any threat to vehicle safety".
That's good, but this isn't the safety survey.
> "[T]here was a[...] false service alert [... that was fixed] within two weeks of being reported".
I'm not even sure how to react to this one. Bringing my car in to be told "false alarm" is really not reassuring. And two weeks to fix a false alert doesn't help me - I've probably already brought it in for service.
> "This proactive approach to improved reliability is one of the reasons why Tesla is the highest rated car brand among consumers, according to Consumer Reports." (Paraphrased:) Our cars are the safest and best performing, and Model S is #1 in satisfaction.
To me, blaming a supplier for the issues is a terrible way to start, and this is likewise a terrible closing.
"You just fell in the rankings to come in 3rd to last in our reliability survey. Any comments?" "Our cars are the safest and fastest and we have always ranked #1 on your customer satisfaction survey."
Those stats are undeniably good. They also make me think those are the stats the company cares about a lot more than reliability.
Yes, but the others don't say anything. That Tesla does this makes them look bad (e.g. your comment) but it's more information for the customer, and that's a good thing.
>Tesla chose the supplier. Tesla built and inspected and sold and shipped the car with those parts. This is a Tesla-related issue.
They are new at mass manufacturing and no doubt learned a lesson from this incident. And that is the whole point of their response, unlike the other American manufacturers they are successfully working away at rapidly improving quality and reliability. And you are angry at them for that.
But Tesla didn't say "we're new at this and we learned from it". They said it was a "supplier-related issue" that has been "addressed for cars in the field" and "resolved [...] with fundamental design improvements".
Nothing in the statement tells me that there won't be another "supplier-related issue" down the line. There should be controls in place for this.
Tesla is new at producing cars in large quantities, and it has greatly improved their quality and reliability. There is no way this could have happened unless they were learning from their mistakes, and that includes controls on suppliers. Do you really disagree?
I don't think they're particularly hard arguments to make.
Turing is the easiest:
What is a computer? What isn't a computer? If you can't answer that question, you don't have a computer. The Church-Turing thesis is the invention of the computer.
Lovelace and Hopper are notable for paradigm shifts (as much as I hate the term) in what computers "are".
Lovelace invented computers because the essence of the computer is the ability to perform computations, not just calculations:
Lovelace published the first algorithm (Babbage obviously had to have written a few to design an analytical engine) and was the first person to imagine computers as more than just a flexible calculator. In particular, she was the first to use a variable to represent anything beyond the intermediate/final step of a calculation: she was computing the 7th term of a series, so she used series/loop indices.
Hopper invented computers because a general-purpose computer is a computation machine with a human-friendly interface. A "smart" lightbulb that runs embedded Linux isn't a "computer" (until modified).
Grace Hopper conceived of abstract programming in human-aligned languages, and invented A-0, the first "compiler" (her term; now we'd call it a linker). Before Hopper, computers were only ever programmed directly in machine code; every step of a computer program was a specific operation. Before A-0, we just had computation machines, but no human-centered interface. After A-0, we had computers.
Zuse would be a perfectly excellent 5th name, as would Church, Godel, Herbrand, Bohm, and many others.
Turing is the easiest because he's the only one that makes any sense. You have to get incredibly philosophical to have Hopper or Lovelace on the list, as they very obviously did not invent the computer.
I implore you to read about the origins of the word computer and who those people were and what they did. Also just in general about the Analytical Engine, Babbage, and Lovelace.
I'm quite familiar with them. But the colloquial term "computer" isn't the same as the historical term, and that's the point here. We aren't talking about the job, or a design that wasn't implemented, or a mechanical computer because nobody means that when they say "computer".
> What is a computer? What isn't a computer? If you can't answer that question, you don't have a computer. The Church-Turing thesis is the invention of the computer.
I don't get this sentiment. So if we had the same devices we did nowadays, with all the same capabilities, but nobody had developed the theory behind computation... then we wouldn't actually have computers? Even though we could do exactly the same things as we do now?
Or heck, so this means even a modern layperson can't have computers? Because in order to have a computer they have to be able to define the Church-Turing thesis for you first? If I hand my laptop to my grandma then suddenly she has... what? An email-receiving brick with a touchscreen that somehow isn't a computer?
> Lovelace invented computers because [...]
No, what she's given credit for is being the first computer programmer (in our modern definitions). Now even that is disputed [1], but regardless, taking it at face value, that's what she gets credit for. Not for inventing the device she was using, but for tasks she realized she could use it for, which Babbage (allegedly?) did not realize.
> the essence of the computer is the ability to perform computations, not just calculations
I don't know if you can tease out a comprehensive definition for what 'computation' means here, because I don't think I can give a natural definition that would somehow pinpoint the credit to Lovelace rather than Babbage or Turing. Even if you look up what 'computation' means now, the dictionary will say something like "mathematical calculation". The only way I see to exclude 'calculation' from it is to say "the ability to perform Turing-complete calculations", a formal notion which just didn't exist back then, so the credit for that wouldn't go to Lovelace.
> Hopper invented computers because a general-purpose computer is a computation machine with a human-friendly interface.
No, she's given (like you acknowledged) credit for making the first linker. Not the first computer. Nobody I've ever seen tries to argue she invented the first computer. Are these claims you actually hear in the real world?
> Before Hopper, computers were only ever programmed directly in machine code; every step of a computer program was a specific operation. Before A-0, we just had computation machines, but no human-centered interface. After A-0, we had computers.
Great, so the credit to give here would be that Hopper invented high-level programming languages (or something along those lines). Not that she just flat-out invented the computer!
I'm not sure what the author's target audience is, but:
> But we know from high school physics that a = v' = p''
It would be exceedingly rare to see "p" used as the position variable in high school physics - it's almost exclusively reserved for momentum.
Most high school students do 1D physics with position as "x". Those who go on to study more physics usually use "s" as their displacement function/vector and maybe "r" as a position vector.
(Also, the overwhelming majority of high school physics students never touch calculus-based physics -- only about 50-60 thousand students take the AP Physics C exam each year. 5 times that take AP Physics 1, and even more take non-AP Physics.)
That's really helpful, I'll think about how to reword this. We covered calculus-based physics in Lebanon, which uses the French Baccalaureate program, so I was making some non-global assumptions.
Speaking as someone who came from a less-than-stellar high school, referring to concepts as being "from high school X" can be pretty demoralizing to hear, and it's almost never necessary to mention. Consider using "introductory" instead, unless your audience is explicitly only people who went to advanced high schools.