
Biases in AI

Critical AI

The first volume of a new academic journal, Critical AI, was published in October 2023 and is available on open access. Volume 2 was published in April 2024.

The editor’s introduction to Vol. 1 explains what ‘critical AI’ means and why it is so important for the Humanities to be part of the conversation. If you want to read more by Critical AI editor Lauren Goodlad, see the co-authored article ‘Now the Humanities Can Disrupt AI’.

How AI reduces the world to stereotypes

This article by Rest of World, a nonprofit publication covering global technology outside the West, focuses on the way generative AI produces stereotypes and caricatures.

'Algorithms of Oppression' with Safiya Umoja Noble (USC)

This 4-minute video with scholar Safiya Noble focuses on the racial and gender biases in Google algorithms, but the same issues are relevant to generative AI. You can read the book based on Noble’s research via UCD Library.

'Gender Shades' with Joy Buolamwini (MIT)

This 5-minute video introduces Joy Buolamwini’s work in debunking the myth of machine neutrality by uncovering the biases that come from limited or biased training data. Similar biases are evident in the data used to train generative AI. You can read more about Buolamwini’s research via UCD Library.
