BGR published an article about machine learning: science proves that training AI to be ‘human’ makes it sexist and racist, too.
The machine learning systems that determine these outcomes inherently lean one way or the other.
The AI studies the words, how they’re used, and what words they’re used in association with, in order to provide natural language responses and answers in a way that we can understand. It also, as it turns out, learns some of our more unfortunate quirks.
After training the AI, scientists tested how it associates various words with others. For example, “flower” is more likely to be associated with “pleasant” than “weapon” is. That, of course, makes perfect sense. However, the trained AI also had a habit of associating typically Caucasian-sounding names with other things that it considered to be “pleasant,” rather than African-American-sounding names. The AI also shied away from pairing female pronouns with mathematics, and instead often associated them with artistic terms.
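The kind of association test described above can be sketched in a few lines of Python. This is a minimal illustration, not the researchers’ actual method: the word vectors below are invented toy values, whereas real systems learn vectors with hundreds of dimensions from large text corpora, and then measure which words sit closer together.

```python
import math

# Toy 3-dimensional word vectors, invented purely for illustration.
vectors = {
    "flower":   [0.9, 0.1, 0.2],
    "weapon":   [0.1, 0.9, 0.3],
    "pleasant": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    # Cosine similarity: higher means the two words appear in more
    # similar contexts according to the learned (here, toy) vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

flower_score = cosine(vectors["flower"], vectors["pleasant"])
weapon_score = cosine(vectors["weapon"], vectors["pleasant"])
print(flower_score > weapon_score)  # prints True with these toy vectors
```

The same comparison, run with name pairs or pronouns instead of “flower” and “weapon,” is how the biases in the trained models show up as measurable differences in similarity scores.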
This is obviously a huge issue since, in the name of creating AI that sounds and behaves more human, some of the standard training materials being used carry with them some of the worst parts of us. It’s a very interesting problem, and one that you can bet will get a lot of attention now that evidence seems to be mounting.