ok.. this is revolutionary. Using the architecture itself as a way to capture an image prior hints at how network structure and captured invariances are related. By analogy, it suggests thinking of brain areas as both hard-coded prior knowledge, encoded in their natural arrangement, and plastic learning structures. Turning the problem on its head sheds new light on how we could conceive of network architectures.