Maha Bali discusses cultivating critical AI literacies on episode 545 of the Teaching in Higher Ed podcast.
Quotes from the episode
We need to be particularly concerned about AI’s impact on data sovereignty, especially when it comes to religious information and indigenous data. These are areas where misuse or misinterpretation can have profound implications.
-Maha Bali
Bias in AI is not just an incidental issue; it’s a replication of the systemic biases we see in society.
-Maha Bali
It’s crucial that we trace back the sources and origins of information produced by AI.
-Maha Bali
We should align AI usage with our teaching philosophies and values. It’s not just about adopting the latest technology, but doing so in a way that enhances learning and stays true to educational principles.
-Maha Bali
Resources
- A Pedagogy for Liberation: Dialogues on Transforming Education, by Paulo Freire and Ira Shor
- Episode 524 with Jon Ippolito
- Jon Ippolito
- Don’t Trust AI to Cite its Sources, by Anna Mills and Maha Bali
- Tema Okun Writes About White Supremacy
- White Supremacy Culture, by Tema Okun
- Exploring Post-Plagiarism with Google NotebookLM, by Sarah Eaton
- When Knowledge is Dangerous, But Information is Power, by Audrey Watters
- Tressie McMillan Cottom Gives Mini Lecture on AI
- Cake-Making Analogy for Setting Generative AI Guidelines/Ethics, by Maha Bali
- When it comes to AI, is transparency enough? by Maha Bali
- Critical AI Literacy is Not Enough: Introducing Care Literacy, Equity Literacy & Teaching Philosophies, by Maha Bali
- Daniela Gachago and Nicola Palitt
- Google’s QuickDraw
- Bonni’s Google NotebookLM Audio Overview of Course Evaluations
- I have been hallucinated! by Laura Czerniewicz
- Nature Editorial Policies
Information
- Podcast: Teaching in Higher Ed
- Frequency: every two weeks
- Published: November 21, 2024, 13:00 UTC
- Length: 50 minutes
- Episode: 545
- Rating: suitable for all ages