That's exactly the problem: the learned "concept" is not general-purpose at all. It's (from what we can tell) a bunch of special cases. While the AI may learn, as special cases, cavities inside cardboard boxes and barrels and foxholes, let's say, it still has no general concept of a cavity, nor does it have a concept of "X is large enough to hide Y". This is what children learn (or maybe innately know), but which AIs apparently do not.
> It still has no general concept of a cavity, nor does it have a concept of "X is large enough to hide Y". This is what children learn (or maybe innately know), but which AIs apparently do not.
I take it you don't have any hands-on knowledge of the field, because I've built systems that detect exactly such properties: either directly, through their mathematical constructs (sometimes literally via a single OpenCV function call), or through deep classifier networks. It's not exactly rocket science.
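To make the "single OpenCV function call" point concrete, here is a minimal sketch of one classical way to detect a cavity and test whether it could hide an object, assuming OpenCV 4's Python bindings. The synthetic shape, the depth threshold, and the "can hide" heuristic are all illustrative choices of mine, not the poster's actual system.

```python
import cv2
import numpy as np

# Synthetic binary image: a solid box with an opening carved out of the top,
# i.e. a shape with a concave cavity.
img = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(img, (40, 40), (160, 160), 255, thickness=-1)   # solid box
cv2.rectangle(img, (80, 40), (120, 120), 0, thickness=-1)     # carve out a cavity

# Trace the outer contour of the shape (OpenCV 4 returns contours, hierarchy).
contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnt = max(contours, key=cv2.contourArea)

# Convexity defects mark where the contour dips inward, i.e. candidate cavities.
hull = cv2.convexHull(cnt, returnPoints=False)
defects = cv2.convexityDefects(cnt, hull)

# Each defect row is (start_idx, end_idx, farthest_idx, depth * 256).
# Keep only defects deep enough to count as a real cavity (threshold is arbitrary).
cavity_depths = [d[0][3] / 256.0 for d in defects] if defects is not None else []
cavities = [depth for depth in cavity_depths if depth > 10]

# Crude "X is large enough to hide Y" test: compare the deepest cavity against
# the size of a hypothetical object to be hidden.
object_size = 25  # e.g. bounding-box height of the thing we want to hide
print("cavity depths:", cavities)
print("can hide object:", any(depth >= object_size for depth in cavities))
```

This obviously handles only a 2D silhouette and a single size measure; the point is just that "there is a concavity here, and it is at least this big" is directly computable from classical geometry, without any learning.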