Having an LLM do something for you that you don't know how to do is asking for trouble. An expert can likely offload a few things that aren't all that important, but any junior is going to dig themselves into a significant hole with this technique.
But asking an LLM to help you learn how to do something is often an option. Can't one just learn it using other resources? Of course. LLMs shouldn't be a must-have. If at any point you have to depend on the LLM, that is a red flag. It should be one possible tool, used when it saves time, but swapped for other options when they make sense.
For example, I was new to a library and asked Copilot how to do some specific task. It gave me several options. I used this output to go to Google, find the matching documentation, and give it a read. I then went back to Copilot, wrote up my understanding of what the documentation said, and checked whether Copilot had anything to add.
Could I have just read the entire documentation? That was an option, but one that costs more time in exchange for deeper expertise. Sometimes that trade is worth making, but in this case shallower knowledge was enough to throw together a proof of concept, which fit my situation better.
Anyone just copying an AI's output and putting it in a PR without understanding what it does? That's asking for trouble, and it will come back to bite them.