"I don't think we can distinguish text and pictures so easily. Look at Chinese, look at Egytian hieroglyphs."
My understanding is linguistics research has pretty thoroughly debunked this idea.
I don't remember the experimental design (it was a long time ago, sorry), but I believe a study showed that Chinese readers basically translate the characters back into the sounds of spoken language in their heads before any processing of meaning takes place. In other words, pictographic mnemonics may be helpful when first learning the characters, but they play no role for a fluent reader.
I suspect a similar thing will hold for programming for a long time to come. Even if you replace keyboard characters with other icons, you're just substituting one arbitrary association between symbols and meaning for another. (Which is basically what language boils down to, anyway.)