AI trained on novels tracks how racist and sexist biases have evolved


Books can document the cultural biases of the era when they were published
Ann Taylor/Alamy
The tendency of artificial intelligence models to pick up sexist and racist biases is a well-known and persistent problem, but researchers are now turning it to their advantage to analyse social attitudes through history. Training AI models on novels from a particular decade can instil them with the prejudices of that era, offering a new way to study how cultural biases have evolved over time.
Large language models (LLMs) such as ChatGPT learn by analysing large collections of text. They tend to inherit the biases found within their training data:…
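The idea that a model absorbs the statistical associations of its training text can be illustrated with a much simpler proxy than a full LLM. The sketch below is hypothetical and not from the study: it scores how often a target word (e.g. "doctor") co-occurs with one set of gendered words versus another in a toy corpus, a crude stand-in for the kind of association bias a model trained on period novels would internalise. All names and the sample text are invented for illustration.

```python
from collections import Counter

def cooccurrence_bias(text, group_a, group_b, targets, window=5):
    """Toy bias score in [-1, 1]: how much more often words in `targets`
    appear within +/-`window` tokens of group_a words than group_b words.
    Positive means the targets skew toward group_a."""
    tokens = text.lower().split()
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok in targets:
            span = tokens[max(0, i - window): i + window + 1]
            counts["a"] += sum(w in group_a for w in span)
            counts["b"] += sum(w in group_b for w in span)
    total = counts["a"] + counts["b"]
    return 0.0 if total == 0 else (counts["a"] - counts["b"]) / total

# Invented snippet standing in for a decade's worth of novels
corpus = ("he worked as a doctor while his brother rested . "
          "the doctor said he would visit . she was a nurse .")
print(cooccurrence_bias(corpus, {"he", "his"}, {"she", "her"}, {"doctor"}))
```

Real analyses of this kind typically use word embeddings or model probes rather than raw co-occurrence counts, but the underlying signal is the same: associations present in the text end up measurable in the trained model.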