It feels like LLMs could be good contenders for solving integrals symbolically. After spending some time on it, it really feels like translating between two languages.
Derive 2 for DOS, on green-screen 286 (or maybe 386) computers in a small side room. The later Windows version was better. Then there was the DOS version of Minitab (version 5, I think) that came as floppy disks in the back of a spiral-bound book, which I used to generate data sets for students to process for homework, so everyone got a slightly different sample.
You can do a lot of numerical maths just with a noddy spreadsheet of course.
On the 1940s Manhattan Project, back when "computer" meant a job ("person who computes mathematical statements"), major advances were made in the integration of hyperbolic PDEs by substituting first electro-mechanical and then vacuum-tube machines for the human computers. You know, those hard-wired vacuum-tube monsters like ENIAC.
You could argue that the first useful thing electronic computers did was integration...
Electronics themselves can perform integration: an RC circuit or an op-amp integrator literally integrates its input signal.
It's come full circle. But with Lisp and lambda calculus even an elementary school kid could understand integration, since you are literally describing the process as if it were Lego blocks.
Though in Forth it would be far easier.
It's almost like telling the computer that multiplying is iterated addition, and dividing is iterated subtraction.
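A minimal sketch of that idea, in Python rather than Forth for brevity (the principle is language-independent):

```python
def mul(a, b):
    """Multiplication as iterated addition (non-negative b)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def div(a, b):
    """Division as iterated subtraction: count how many times b fits into a."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient, a  # quotient and remainder

print(mul(6, 7))   # 42
print(div(45, 6))  # (7, 3)
```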
Floating-point numbers are done with special memory 'blocks', and you can 'teach' the computer to multiply numbers bigger than 65536 in exactly the same way humans do with pen and paper.
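The pen-and-paper method for numbers bigger than one 16-bit cell is just schoolbook long multiplication over base-65536 "digits"; a sketch in Python (the limb representation here is my own toy convention for the example):

```python
BASE = 65536  # one 16-bit cell per "digit"

def to_limbs(n):
    """Split n into base-65536 digits, least significant first."""
    limbs = []
    while n:
        limbs.append(n % BASE)
        n //= BASE
    return limbs or [0]

def from_limbs(limbs):
    n = 0
    for d in reversed(limbs):
        n = n * BASE + d
    return n

def limb_mul(x, y):
    """Schoolbook multiplication: digit by digit, propagating the carry."""
    result = [0] * (len(x) + len(y))
    for i, xi in enumerate(x):
        carry = 0
        for j, yj in enumerate(y):
            t = result[i + j] + xi * yj + carry
            result[i + j] = t % BASE
            carry = t // BASE
        result[i + len(y)] += carry
    return result

a, b = 123456789, 987654321
print(from_limbs(limb_mul(to_limbs(a), to_limbs(b))) == a * b)  # True
```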
Heck, you can implement floats yourself by telling Forth how to handle them: follow the standard and define `f,`, `f+`, `f/`... and the output words by hand.
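As an illustration of rolling your own floats (shown in Python; the (mantissa, exponent) pair encoding and the normalization rule are my own toy convention, not the Forth standard's format):

```python
# A toy float is (mantissa, exponent), meaning mantissa * 2**exponent.

def fnorm(m, e):
    """Normalize: strip trailing zero bits from the mantissa."""
    if m == 0:
        return (0, 0)
    while m % 2 == 0:
        m //= 2
        e += 1
    return (m, e)

def fadd(a, b):
    """Align exponents, add mantissas, renormalize."""
    (ma, ea), (mb, eb) = a, b
    if ea > eb:
        ma <<= (ea - eb)
        ea = eb
    else:
        mb <<= (eb - ea)
    return fnorm(ma + mb, ea)

def fmul(a, b):
    """Multiply mantissas, add exponents."""
    (ma, ea), (mb, eb) = a, b
    return fnorm(ma * mb, ea + eb)

half = (1, -1)      # 0.5
quarter = (1, -2)   # 0.25
print(fadd(half, quarter))  # (3, -2), i.e. 0.75
print(fmul(half, half))     # (1, -2), i.e. 0.25
```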
Slower than a Forth written in assembly? Sure; but natively, on old '80s computers, Forth was about 10x faster than BASIC.
From that to calculus, it's just a matter of teaching the computer new rules.
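For a taste of what "teaching the computer new rules" looks like, here is a toy symbolic integrator in Python that knows only the sum rule and the power rule (the term encoding is invented for this example):

```python
# A term (c, n) stands for c*x**n; a polynomial is a list of terms.

def integrate_term(term):
    """Power rule: the integral of c*x^n is c/(n+1) * x^(n+1)."""
    c, n = term
    return (c / (n + 1), n + 1)

def integrate(poly):
    """Sum rule: integrate term by term."""
    return [integrate_term(t) for t in poly]

# integral of 3x^2 + 2x + 1  ->  x^3 + x^2 + x
print(integrate([(3, 2), (2, 1), (1, 0)]))  # [(1.0, 3), (1.0, 2), (1.0, 1)]
```

Each new rule (products, chain rule, trig) is just another clause, which is exactly why it reads like describing Lego blocks.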
And you don't need an LLM for that.