A social one where you use experience to make inferences about what is going to happen next. Who hasn't observed an aggressive driver coming up from behind, weaving around other cars, and said to themselves something like "that guy is going to cut me off, better ease off the gas so there's a little extra room in front for when he does"? That level of situational awareness isn't coded for, is it?
Edit: I have a whole mental model for other drivers and different approaches for them. Someone driving like a grandma? Pass when available. Nervous/erratic/lost driver? Keep extra distance then pass as soon as possible. Aggressive driver? Relax, give some space and let them get ahead. And so on. I get that stereotyping is bad but ignoring the subtle signals other drivers give off seems like it would be myopic. An AI that doesn't anticipate what others will do on the road will always be reactive rather than proactive.
Most self-driving vehicles aren't just hand-coded; they incorporate some form of ML as well. Categorizing patterns of behavior is well within the reach of ML algorithms, but my understanding is that we have far more basic problems to solve first (Tesla's system seems to struggle with object tracking over time, which would be a necessary first step to recognizing patterns).
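To make "categorizing patterns of behavior" concrete, here's a toy sketch of labeling a driver from trajectory features with a nearest-centroid classifier. Everything here is illustrative, not from any real system: the feature set (lane changes per minute, mean headway in seconds, speed variance), the centroid values, and the labels are all made up for the example.

```python
import math

# Hypothetical prototype feature vectors per driver type:
# (lane_changes_per_min, mean_headway_s, speed_variance)
CENTROIDS = {
    "cautious":   (0.2, 3.5, 0.5),
    "aggressive": (3.0, 0.8, 4.0),
    "erratic":    (1.5, 2.0, 6.0),
}

def classify(features):
    """Return the label whose centroid is closest to the observed features."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))
```

A real system would learn those centroids (or a richer model) from data rather than hard-code them, but the shape of the problem is the same: summarize observed behavior as features, then map it to a category that a planner can condition on.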
Developing object tracking and a sense of object permanence would be a pretty big prerequisite. And here I was thinking about the RL model needed to decide what to do in the presence of other agents.
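For a sense of what "tracking with object permanence" means at its simplest: each track predicts where its object should be next under a constant-velocity model, detections are matched to the nearest prediction, and a track that misses a few frames coasts on its prediction instead of being dropped. This is a minimal sketch; all names, thresholds, and the greedy matching scheme are illustrative assumptions, not any production tracker.

```python
import math

class Track:
    def __init__(self, track_id, x, y):
        self.id = track_id
        self.x, self.y = x, y
        self.vx = self.vy = 0.0
        self.missed = 0  # consecutive frames with no matching detection

    def predict(self):
        # Constant-velocity guess at the next position.
        return self.x + self.vx, self.y + self.vy

    def update(self, x, y):
        # Re-estimate velocity from the displacement, then accept the detection.
        self.vx, self.vy = x - self.x, y - self.y
        self.x, self.y = x, y
        self.missed = 0

def step(tracks, detections, next_id, max_dist=5.0, max_missed=3):
    """Advance one frame: match detections to predictions (greedy nearest-neighbor)."""
    unmatched = list(detections)
    for t in tracks:
        px, py = t.predict()
        best = min(unmatched,
                   key=lambda d: math.hypot(d[0] - px, d[1] - py),
                   default=None)
        if best and math.hypot(best[0] - px, best[1] - py) <= max_dist:
            t.update(*best)
            unmatched.remove(best)
        else:
            # No detection this frame: coast on the prediction rather than
            # deleting the track -- this is the "object permanence" part.
            t.x, t.y = px, py
            t.missed += 1
    for d in unmatched:
        tracks.append(Track(next_id, *d))
        next_id += 1
    tracks[:] = [t for t in tracks if t.missed <= max_missed]
    return next_id
```

With this, a car seen in two frames, occluded for a frame, and reappearing roughly where the model predicted keeps the same track id; without the coasting step, the occlusion would spawn a brand-new track and any behavior history would be lost.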