Has anybody tried learning a programming language from GPT-4?
I’m not a programmer myself, but I have made quite a few static web pages over the years for my own use. I code the HTML and CSS by hand, and I think I have a basic understanding of how they work. But JavaScript has always been a mystery to me.
So the other day I asked GPT-4 to teach me the basics of JavaScript. It responded with an understandable outline of the language and a few code samples. Nothing special there. But then I tried asking it questions, and that is where it really shone.
When, for example, I asked “In expressions like console.log("x is greater") or window.onload = function(), what is the role of the period (.) between console and log and between window and onload?,” it replied “The period (.) in JavaScript is used to access properties or methods of an object” and continued with a multiparagraph explanation, including examples. I then followed up with “How many properties or methods does the 'window' object have?,” and it gave me a list of common properties and methods. I continued asking questions and asking for feedback on some code I wrote until I hit my prompt limit. I felt like I was learning very quickly.
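To give a sense of what it was explaining (I'm reconstructing this from memory with my own made-up object, so treat it as an approximation rather than GPT-4's exact output): the thing to the left of the dot is an object, and the thing to the right is a property or method belonging to it.

    // An object with one property ("name") and one method ("greet")
    const person = {
      name: "Ada",
      greet: function () {
        console.log("Hi, " + this.name);
      }
    };

    console.log(person.name); // the dot reaches into "person" and reads its "name" property
    person.greet();           // the dot looks up the "greet" method, and () calls it

console.log and window.onload work the same way: console and window are objects the browser provides, and log and onload are a method and a property on them.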
The next day, I tried a similar thing with Python, which I also knew nothing about, and once again the lesson went very smoothly.
I have no idea how good it would be for more advanced learners, but for a beginner like me GPT-4 seems like a great programming teacher. Being able to ask questions and get full answers and feedback immediately was particularly exciting.
I’ve been asking questions about F# after a while of using OCaml. I’ve found GPT-4 hallucinating syntax that doesn’t exist, such as claiming you can write open <modulename> inside a function. It’s actually something I assumed too, because you can do this in OCaml with "let open <modulename> in", but you cannot do this in F#.
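For anyone unfamiliar with the OCaml side of this, here is a toy example of the local open I mean (my own snippet, not something GPT-4 produced):

    (* OCaml: a local "let open ... in" is legal inside a function body *)
    let sum_of_lengths lists =
      let open List in
      (* fold_left and length now refer to List.fold_left and List.length *)
      fold_left (fun acc l -> acc + length l) 0 lists

In F#, by contrast, an open declaration has to sit at the module or namespace level, which is exactly the part GPT-4 got wrong.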
Despite this, it’s still been a lot more helpful than searching, is mostly correct, and is my go-to resource.
ChatGPT works well until it doesn't. When it fails and generates garbage, it does so in a very convincing-sounding manner, and the learner may not yet have the tools to recognize that.
You’re right. But for some types of learning it’s not necessary to have a teacher that’s always right. If GPT-4 feeds me hallucinations occasionally about JavaScript or Python, I will find out soon enough when my programs fail to run or when web searches fail to confirm what it says. Being able to ask it questions and get right answers almost all the time more than makes up for the occasional error.
Another example where GPT-4 can be used productively for learning despite its fallibility is with human languages. It doesn’t always explain grammar correctly, and it can spout out hallucinated facts. But its ability to interact almost as if it were a real person, as well as its ability to explain the meanings of words and sentences quite accurately (in the case of major languages, at least), is of enormous value for people learning other languages. Few human language teachers are as knowledgeable and accurate as it is.
Why not just search “best book to learn X reddit”, take a look at a few threads, get the book everyone recommends, and read it? It was written by an actual expert on the topic, doesn’t hallucinate bullshit, is actually correct, and if it’s a good book it has a proper structure, so you learn things in the proper order.
GPT-4 is generally superior to reading a book or following a tutorial, in my experience.
You can't ask a book questions about parts you don't understand. Books will also talk about things that are not directly related to the problem you are trying to solve, which can be frustrating.
The biggest issue with GPT is when you are working with new or rapidly changing tech. For example, it might suggest Godot 3 syntax when you are working with Godot 4. But working with OpenGL, for example, I find questioning GPT-4 more useful than reading a book. The book was probably in its training data anyway.
That is a good method, too, of course. But I don’t always learn best by proceeding through material step-by-step. Often wandering around a subject as my curiosity takes me and using a lot of trial and error have enabled me to assimilate knowledge better in the end. Even if GPT-4’s answers are sometimes wrong, being able to ask it questions is extremely useful.