And yet we have Waymo building 2cm-resolution maps and refusing to operate outside mapped areas, and (more scarily) Tesla geotagging false positives where Autopilot misidentifies some roadside feature and panic-brakes.
Are those approaches mutually incompatible? It seems to me that it would be easier to build a system which can drive fast when what it's seeing matches what it's expecting to see (most of the time), and which falls back to a much more conservative stance in the case of surprisal (i.e. expectation not matching observation).
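That fallback idea can be sketched as a tiny controller: compare what the map predicts against what the sensors report, and gate the target speed on the mismatch. Everything here (the feature vectors, the threshold, the speeds) is illustrative, not from any real stack:

```python
# Surprisal-gated speed control, as a toy: cruise while observations match
# the map's expectations, crawl when the mismatch crosses a threshold.
# All names and numbers are hypothetical.

def surprisal(expected: list[float], observed: list[float]) -> float:
    """Mean absolute mismatch between expected and observed features."""
    return sum(abs(e - o) for e, o in zip(expected, observed)) / len(expected)

def target_speed(expected: list[float], observed: list[float], *,
                 cruise_mps: float = 25.0, cautious_mps: float = 5.0,
                 threshold: float = 0.2) -> float:
    """Drive fast when the world matches the map; fall back when it doesn't."""
    if surprisal(expected, observed) < threshold:
        return cruise_mps      # expectation met: proceed at full speed
    return cautious_mps        # surprised: conservative stance
```

The hard part, of course, is not the gate itself but building an expectation signal that actually fires on a washed-out road rather than on every shadow.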
Yes. Cast your mind to lightly-traveled rural roads. If you are driving one of the many unpaved roads in Arizona during flash flood season, and the last car through was a couple hours ago, will you bet your life that the road is still there?
But hey, that's not a realistic danger in the city, right? Well, look up the sinkhole named "Steve" that opened up in the middle of an Oakland, CA freeway.
Then there's Highway 1, which is known for landslides. Not to mention that little incident with the Bay Bridge back in '89...