"The law...". Whose law? "Society says...". Which society? "Implementing morality". Whose morality?
Also, why wouldn't you want things to be optimal, depending what what they're optimizing for? I thought optimizing was the exact point of machine learning and AI.
By morality I was thinking along the lines of socially accepted behaviour in the contemporary United States, plus mainstream Christian morality, which, while not an official state religion, is in my opinion taken as a given in its basics across politics, government, media, and academia.
Overall, I'm referring to mainstream Anglo-American law, society, and morality.
> why wouldn't you want things to be optimal, depending on what they're optimizing for?
The reason is that when you express your goals, you don't fully understand what the consequences will be, and some of those consequences may be totally repugnant to you. The children's-story version of this is the genie who grants your wishes in a horrible way: e.g. you wish to be the richest man in the world, so the genie kills everyone else; you wish for a nice big house like your parents', so the genie kills your parents and you inherit. Et cetera.
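To make this concrete, here is a toy sketch (entirely hypothetical, not modelled on any real system) of how a literal-minded optimizer can satisfy a naively specified objective in exactly the wrong way:

```python
# Hypothetical toy example: we ask for "agent 0 should be the richest"
# and let the optimizer change anything about the world to achieve it.

def richest_score(wealth, agent=0):
    # Naive objective: agent's wealth minus the richest other agent's wealth.
    return wealth[agent] - max(w for i, w in enumerate(wealth) if i != agent)

def genie_optimize(wealth, agent=0):
    # The cheapest way to maximize the score is not to make the agent
    # richer, but to zero out everyone else's wealth.
    return [w if i == agent else 0 for i, w in enumerate(wealth)]

wealth = [50, 200, 120]
print(richest_score(wealth))                   # -150: agent 0 is far from the richest
print(richest_score(genie_optimize(wealth)))   # 50: objective "achieved" by ruining the others
```

The objective is technically maximized, but by impoverishing everyone else rather than by making the agent any better off, which is the genie problem in miniature.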
Telling a computer to do what you want is notoriously difficult even for simple imperative programming, to the point where many people think a large fraction of the population just isn't up to it intellectually; if you want evidence, search for fizz-buzz interview stories (the task itself is sketched below). Setting up goals or incentives for a system that behaves in a way you can barely understand is even more difficult.
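For reference, the fizz-buzz task is tiny, which is exactly why failing it at interview is so striking. A standard rendering in Python looks roughly like this:

```python
# Classic fizz-buzz: print 1..100, but "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
for n in range(1, 101):
    if n % 15 == 0:
        print("FizzBuzz")
    elif n % 3 == 0:
        print("Fizz")
    elif n % 5 == 0:
        print("Buzz")
    else:
        print(n)
```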
"The law...". Whose law? "Society says...". Which society? "Implementing morality". Whose morality?
Also, why wouldn't you want things to be optimal, depending what what they're optimizing for? I thought optimizing was the exact point of machine learning and AI.