You're confused by the double meaning of the word "bias".
Here we mean mathematical biases.
For example, a good mathematical model will correctly tell you that people in Japan (geographical term) are more likely to be Japanese (ethnic / racial term). That's not "objectively morally bad"; it's "correct".
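To make that concrete, here's a toy Bayes calculation. All the numbers are rough, made-up illustrations (not official statistics), but they show that this kind of "bias" is just base-rate arithmetic:

```python
# Toy Bayes update: "people in Japan are probably Japanese" follows
# directly from base rates. All inputs below are assumed, illustrative
# numbers, not official statistics.

p_japanese = 0.016               # assumed: share of world population that is Japanese
p_in_japan_given_japanese = 0.97 # assumed: most ethnically Japanese people live in Japan
p_in_japan = 0.0165              # assumed: share of world population located in Japan

# Bayes: P(Japanese | in Japan) = P(in Japan | Japanese) * P(Japanese) / P(in Japan)
posterior = p_in_japan_given_japanese * p_japanese / p_in_japan
print(f"P(Japanese | in Japan) = {posterior:.2f}")  # ~0.94 with these toy inputs
```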
What you stated is true, but it's a short form of the commonly repeated (and untrue) claim that "98% of Japan is ethnically Japanese":
1. That figure comes from a report from 2006.
2. It's a misreading: the figure refers to "Japanese citizens", and the government in fact doesn't track ethnicity at all.
Also, the last time I was in Japan (Jan '20) there were easily ten times more immigrants everywhere than on my previous trip. Japan is full of immigrants from the rest of Asia these days, and they all speak perfect Japanese too.
Well, that's not the issue here. The problem is examples like image searches for "unprofessional hair" returning mostly Black people in the results. That is something we can judge as objectively morally bad.
Did you see the image in the linked article? Clearly the "unprofessional hair" results are people with curly hair. Some of them are white! It's not the algorithm's fault that P(curly|black) > P(curly|white).
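To see why, here's a toy simulation (synthetic data; the correlation strengths are assumptions for illustration, not measurements): a rule that only ever looks at hair type, never at race, still produces racially skewed results whenever hair type and race are correlated in the data.

```python
import random

random.seed(0)

def person(group, p_curly):
    # Sample one synthetic person; hair type correlates with group.
    return {"group": group, "curly": random.random() < p_curly}

# Assumed correlation strengths, purely for illustration:
# P(curly | A) = 0.8, P(curly | B) = 0.2
population = ([person("A", 0.8) for _ in range(1000)]
              + [person("B", 0.2) for _ in range(1000)])

# A "group-blind" rule: it only ever looks at hair type.
flagged = [p for p in population if p["curly"]]

share_a = sum(p["group"] == "A" for p in flagged) / len(flagged)
print(f"Share of group A among flagged results: {share_a:.2f}")  # ~0.80
```

Even though the rule never sees the group label, the flagged set skews heavily toward group A, simply because the feature it does see is unevenly distributed.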