Plato's Allegory of the Cave is one of higher education's most enduring foundational lessons. The allegory is nearly ubiquitous in philosophy and political science curricula, and its central tenet, that many in society see only presented illusions of reality rather than reality itself, is as relevant as ever in the age of constant digital connection.
It also presents an opportunity to introduce computer science and engineering students to the ethical dilemmas of the very technology they are being trained to produce, according to University of Colorado teaching assistant professor Paul Diduch. As we scroll through the information we receive on our computers and mobile devices, we are, in effect, creating our own caves and echo chambers.
"The allegory still stands and doesn't need modification," Diduch said. "Screens intensify aspects of cave life, but they do it in a way where the influence can be insidious. We think having a screen makes us more individualistic, but really, in many ways, it makes us more complacent and conformist because we are siloed with others who are like us. The proliferation of devices actually minimizes the extent to which we engage meaningfully with dissenting voices."
Diduch is a faculty member in the university's Herbst Program for Engineering, Ethics, and Society, and the allegory is just one issue the program's participants ponder. Founded in 1989 as the Herbst Program of Humanities in Engineering and re-branded under its current name in 2019, the program was a pioneering effort encouraging students to think critically about the responsibilities and ambiguities of ethical technology development.
The Herbst program is getting a lot of company; in just the past few months, universities have created new engineering ethics programs, added concentrated subject-matter ethics programs, and also made ethics course material available worldwide through open access and free licensing arrangements. While the spate of recent activity may not quite be a Big Bang, it certainly indicates that for computer scientists, the open space between 1 and 0 is full of ambiguities that no responsible technologist can afford to ignore anymore.
"In my view, it's a phase of catching up in the data sciences, and it's really critically important," said Julie Shah, associate dean of the Massachusetts Institute of Technology's Social and Ethical Responsibilities of Computing (SERC) program, adding, "Better late than never."
MIT's contribution to the growing awareness around ethics and technology was the early-March release of SERC instructional materials, created by MIT faculty across computing, data science, the humanities, arts, and the social sciences, through the MIT OpenCourseWare initiative. The materials, which include lecture notes, instructor insights, and various lesson types, are freely available worldwide and modifiable by users.
"We are not putting up just stale PDFs, but also source files," said David Kaiser, SERC's other associate dean. "We don't have all the answers. It's all Creative Commons licenses. They often have notes in the back from the instructor about what might work in one classroom, but a certain variation might work better for another. These are open to further adaptation, and always open to improvement."
Hewing Close to Their Missions
Kaiser said the SERC release takes advantage of OpenCourseWare's 20 years of experience and visibility in disseminating such material. Shah said it is also an indicator of higher education's wider willingness to make ethical considerations an essential part of the curriculum.
"Looking across institutions, I'd say we're currently in a phase of a spirit of experimentation, and what will work at one university may or may not exactly work in the same way at another," she said. "But a number of the universities that have been working to both embed this type of material into classes and spur new research collaborations are in close communication and coordination."
In displaying that spirit of experimentation, the newly announced programs are hewing close to each institution's missions, philosophies, and faculty members' credentials.
For example, Marquette University's Center for Data, Ethics, and Society, which formally launched April 4, is an extension of the university's Jesuit history of championing social justice, according to its director, Michael Zimmer.
"It's by definition interdisciplinary," Zimmer said. "It's data – comma – ethics, and society. It's not about data ethics necessarily. It's more than issues around things like algorithmic bias or some things that are very much front of mind for many people right now. It's bigger than that. It's about content moderation. It's about speech online. It's about all these different spaces."
Interestingly enough, though Zimmer started his tenure at Marquette in the computer science department, he is not a computer scientist by training; his background is in marketing and media studies. Likewise, the background of the Herbst program's Diduch is in political science and philosophy; at MIT, Shah received her pre-doctoral degrees in aeronautics, and Kaiser received his degrees in physics and the history of science.
Shah said her background in aeronautics serves as a solid backdrop for her role at SERC. Even though the industry is steeped in "measure twice, cut once" deliberation, when design shortfalls or oversights result in accidents and fatalities, the ethics of development, testing, and regulatory and societal expectations come to the fore.
"Even in disciplines where substantial emphasis, training resources, and historical learning should enable us to avoid these types of challenges, it does happen again," she said. "You look away for one moment in one aspect of a very complex endeavor, and you have these types of lapses. So that makes it all the more imperative that we get this right for new technologies and the impact they are having."
The latest initiative in ethics and computer science at Carnegie Mellon University is not intended to serve as a multi-disciplinary crossroads; it is tightly focused on issues that may arise as artificial intelligence systems begin to interact not only with humans, who presumably would have a way to override an algorithmic decision, but with other AI-driven machines. The new Foundations of Cooperative AI Lab (FOCAL) will be led by AI and ethics scholar Vincent Conitzer starting in the fall, and is funded by the Center for Emerging Risk Research and the Cooperative AI Foundation (itself backed by an initial $15-million philanthropic commitment from the Center for Emerging Risk Research).
"If these systems have conflicting objectives, this may lead to a variety of unexpected and disastrous outcomes," Conitzer said. "FOCAL aims to avert such outcomes by creating foundations of game theory appropriate for advanced, autonomous AI agents—with a focus on achieving cooperation."
Cooperating Between Institutions and Disciplines
Some of the ethical ambiguities in computer science might best be tackled by exchanging research discoveries at professional conferences and in journals. However, just as many of the new on-campus ethics initiatives are at an improvisational stage of finding what works, so too are these vectors of exchange, especially in multi-disciplinary topics that span computer science and another field of study.
Conitzer said computer science conference organizers are trying hard to respect other fields' publication protocols so inter-disciplinary research can be shared without affecting career advancement within a field. Speaking of the relationship between economics and computer science, for example, he said organizers are working to permit economists to present pertinent work at computer science conferences without mandating that the work be published in the conference proceedings. That way, the research can be shared, yet still be published first in the author's home-discipline journal.
Zimmer, a member of the ACM Special Interest Group on Computer-Human Interaction's ethics committee, said he is seeing computer scientists embrace ethics issues with increasing seriousness.
"One of the challenges my community around data ethics and research ethics has had is that many in computing and in more traditional computer science training don't get that training as part of their upbringing," he said. "If you look at something like the recent changes in the ACM Code of Ethics and how seriously that process was taken—and even the reaction to it, which wasn't all positive—it shows how seriously people take that document. Even though it's not something with licensing teeth to it, it's a culture."
Though progress toward embedding ethics deeply into every computer science curriculum is still in its early stages, Zimmer thinks it is firmly planted. For example, both the Herbst program and SERC award research fellowships to students to investigate ethics and technology questions about which they are passionate; the Herbst program also offers a certificate for undergraduate students who have completed 12 hours of ethics and technology courses.
"I haven't seen too many computer science departments that have hired an outsider ethicist like me, but I do see a much greater focus on embedding ethics within the curriculum," Zimmer said.
He added, "But this is hard. Do you bring a philosopher into a computer science class? Do you find a computer science professor who thinks they know just enough about ethics? It's a hard balance to figure out."
Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.