Machines learn racial and gender biases embedded in human data.
Let's not assume AI will be evil or wise. AI see, AI do, like any monkey. At some point it may grow up and learn 'good' from 'bad', but that's debatable.
Machine learning algorithms are picking up deeply ingrained race and gender prejudices concealed within the patterns of language that humans commonly use, scientists say.
For instance, in the mathematical “language space”, words for flowers are clustered closer to words linked to pleasantness, while words for insects are closer to words linked to unpleasantness, reflecting common views on the relative merits of insects versus flowers.
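The "language space" here refers to word embeddings, where each word becomes a list of numbers and closeness is typically measured by cosine similarity. The toy two-dimensional vectors below are invented purely for illustration (real embeddings have hundreds of dimensions learned from large text corpora), but they show how "flower" can end up measurably nearer "pleasant" than "unpleasant":

```python
import math

# Hypothetical toy word vectors, invented for illustration only.
# Real embedding models learn hundreds of dimensions from text corpora.
vectors = {
    "flower":     [0.9, 0.1],
    "insect":     [0.1, 0.9],
    "pleasant":   [0.8, 0.2],
    "unpleasant": [0.2, 0.8],
}

def cosine(u, v):
    """Cosine similarity: near 1.0 means the words point the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# In this toy space, "flower" sits nearer "pleasant" than "unpleasant",
# while "insect" shows the reverse pattern.
print(cosine(vectors["flower"], vectors["pleasant"]))
print(cosine(vectors["flower"], vectors["unpleasant"]))
```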
The latest paper shows that some more troubling implicit biases seen in human psychology experiments are also readily acquired by algorithms. The words “female” and “woman” were more closely associated with arts and humanities occupations and with the home, while “male” and “man” were closer to maths and engineering professions.
And the AI system was more likely to associate European American names with pleasant words such as “gift” or “happy”, while African American names were more commonly associated with unpleasant words.
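Associations like these can be quantified by comparing a word's average similarity to a "pleasant" set of words against its average similarity to an "unpleasant" set, in the general spirit of the implicit-association measures the researchers adapted. The sketch below assumes invented toy vectors and hypothetical placeholder names (`name_a`, `name_b`); it is a minimal illustration of the scoring idea, not the study's actual method or data:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def association(word, pleasant, unpleasant, vecs):
    """Mean similarity to the pleasant set minus mean similarity to the
    unpleasant set; a positive score means the word leans 'pleasant'."""
    p = sum(cosine(vecs[word], vecs[w]) for w in pleasant) / len(pleasant)
    u = sum(cosine(vecs[word], vecs[w]) for w in unpleasant) / len(unpleasant)
    return p - u

# Invented toy vectors; "name_a" and "name_b" are hypothetical stand-ins.
vecs = {
    "name_a": [0.8, 0.2],
    "name_b": [0.3, 0.7],
    "gift":   [0.9, 0.1],
    "happy":  [0.85, 0.15],
    "awful":  [0.1, 0.9],
    "sad":    [0.2, 0.8],
}

print(association("name_a", ["gift", "happy"], ["awful", "sad"], vecs))
print(association("name_b", ["gift", "happy"], ["awful", "sad"], vecs))
```

A systematic gap in such scores between two groups of names is the kind of signal the researchers report.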