I know you're joking, but I'm talking about legal responsibility like this: https://insideevs.com/news/575160/mercedes-accepts-legal-res.... In this case, Mercedes, not the developer, is assuming liability, but that responsibility will trickle down.
I doubt it; that outcome wouldn't be consistent with other rulings regarding engineering work. Depending on the jurisdiction, the engineer (an actual licensed engineer) stamps their seal of approval on the work and takes responsibility. Even if the software they used had a bug that caused it to produce wrong answers, they are still responsible.
That is yet to be determined. There is some pressure in the other direction. I know of one large corporation, doing a lot of work around the world of the kind where programming errors could destroy property or kill people, that has strict policies wrt. AI-generated code. These include an obligation to clearly mark code that was written by AI, as well as restrictions on when it's allowed. This is driven not just by potential IP issues, but also by security and export control.
(Yes, in a large enough corp, export control is a source of a surprisingly large amount of extra work...)
At some point we had to wear deodorant and a collared shirt, and boom, we became engineers.