Cathy O’Neil argues academics ignore the societal implications of tech at their peril.
For the short term, she proposes one solution: we urgently need an academic institute focused on algorithmic accountability.
- First, it should provide comprehensive ethics training for future engineers and data scientists at the undergraduate and graduate levels, with case studies drawn from real-world algorithms that are choosing winners and losers. Lecturers from humanities, social science and philosophy departments should weigh in.
- Second, the institute should offer a series of workshops, conferences and clinics focused on the intersection of different industries with the world of A.I. and algorithms. These should include experts in the content areas, lawyers, policymakers, ethicists, journalists and data scientists, and they should be tasked with poking holes in our current regulatory framework and imagining a more relevant one.
The Ivory Tower Can’t Keep Ignoring Tech
Also, Cathy O’Neil discusses her book, *Weapons of Math Destruction*, on the TED Radio Hour episode “Can We Trust the Numbers?”