Digital technologies for learning, health, politics, and commerce have enriched the world. Digital heroes like Sir Tim Berners-Lee, Batya Friedman, Alan Kay, J.C.R. Licklider, and Joe Weizenbaum have blazed trails. Yet there is trouble. We depend upon software that nobody totally understands. We are vulnerable to cyberterrorism. Privacy is overrun by surveillance capitalism.7 Totalitarian control advances. Daily Internet news matching our beliefs makes it difficult to tell true from false. People are addicted to devices. Jobs disappear without social safety nets. Digital leviathans threaten to control all commerce. Our values are threatened.
There are risks of premature use of AI in domains such as health, criminal justice, senior care, and warfare. Much current AI is unreliable, without common sense, deceptive in hiding that it is an algorithm, unable to explain decisions, unjust, neither accountable nor responsible, and untrustworthy.
Our digital dreams are now digital nightmares.
We cannot put the genie back in the bottle, but we can ensure human control. Our ethical duty as computer scientists is to do so. We can advise thoughtful citizens and society on productive actions. There is much we can do by ourselves. The digital world can be made consistent with our values.
I will start with actions for ordinary people. I mean non-technical people who encounter technology daily—our parents, neighbors, and friends.
We must exercise self-control over digital technologies, limiting use of cellphones and establishing periods of disconnection, even for days at a time. We can establish zones of unavailability, with family or with friends. When was the last time you rode on transit and, instead of texting or playing a game, looked around, noticed something pleasing or bizarre, and even smiled and waved?
Parents can establish rules for appropriate digital immersion by their children. Just as there are bounds on TV viewing, there must be limits on Internet and social media time. Parents can explain their limits on digital junk food.
Many people feel inadequate with respect to technology. They feel they do not understand it, they cannot do it, and it is their fault because they are stupid. They are not stupid. Despite the work of UX professionals, functionality and complexity in software grow faster than simplicity. Users continue to be overwhelmed. Yet there are few tech concepts that cannot be explained by responsible computer professionals in plain language.
People should not put up with jargon: geekspeak. If someone uses complexity … concurrency … phishing … platform … system, ask them to explain. Tell your parents and neighbors they can expect to have technology ideas and concepts explained clearly. GOFAI … identity theft … intelligent tutor … machine learning … precision medicine … user experience … all can be explained in ways that leave people feeling less steamrolled by high tech.
Underlying many technology applications are social and ethical questions about their use, about our priorities, and about justice and goodness. Answers to the questions drive actions with respect to tech and applicable regulations and laws. Computer scientists must participate in ethical action.
People can express their approval or disapproval of big tech firms with their purchasing power. If you disagree with their behavior, do not buy their products. Do not download their apps. Tell people and the company what you did and why.
Citizens must be included in critical decisions about technology and its use. They must be full participants in social, political, and ethical discussions. They must make their voices heard.
Citizens can lobby and vote expressing support or disapproval of actions involving tech, such as using your city's streets for testing self-driving cars, or whether it should become a "smart city." These cases happened recently, the latter where I live in Toronto.3 Concerned citizens should write letters to the editor or to their representatives, and speak up through social media.
People invested in a company's stock can make their voices heard as shareholders, as Amazon experienced with respect to climate change. If you feel strongly, buy shares of the firm you feel is evil; speak up at the shareholders' meeting.
Other actions may be taken by society typically acting through governments. Before the 1960s, colleges were committed to a liberal education. This was thrown out by many universities in the 1970s. Many CS students today have little idea from their education about the nature of the world and of their responsibilities as citizens.
This must change. CS students should learn about virtue ethics and utilitarianism, Lear's agony and K's confusion, and the yearnings of the politically dispossessed. Students studying CS must not focus totally on computing and mathematics, as is now often true. Most colleges are government-funded, so society can apply pressure.
Computer science students should be required to take "computers and society" or "computer ethics," now typically not required by many universities. My 2019 textbook provides a thorough approach.1 Another imaginative and engaging method is to introduce issues by reading and viewing science fiction novels or films.2
Under the leadership of CS Professor Barbara Grosz, in collaboration with Philosophy Professor Alison Simmons, Harvard has developed a program called "Embedded Ethics."5 Philosophers are embedded into courses to teach students how to think about ethical and social issues. Students study bias in machine learning, fake news in networks, and accessibility in HCI. Results have been positive. The program keeps ethics at the forefront throughout the curriculum. Students are engaged, "expressing eagerness for … more opportunities to develop skills in ethical reasoning."
Medicine, law, and engineering license practitioners and accredit their qualifications and degree-granting institutions. Physicians, lawyers, and engineers may be held legally responsible for their work. Actions violating ethical norms are subject to sanctions, removal of qualifications to practice, even criminal prosecution. CS has not been subject to such standards, with no mechanism to ensure acceptable performance. This must change.
University computer science departments must agree to be accredited. General education accreditation requirements must be strengthened. The current ABET accreditation requires only half of a computers-and-society course, with no requirement for ethics. This is wrong.
Computer scientists must be licensed, with requirements for continuing education, adherence to professional practice standards, and procedures for disciplining CS malpractice, including fines, even prison, and removal of one's license.
Regulation has been sparse. Legal action has been rare. This is changing because of social media's hate speech, fake news, and privacy invasions. Europe has been at the vanguard of legislation, as with the General Data Protection Regulation.a It has been effective in enforcing laws against Big Tech.
Big Tech has created wonderful products. But the leviathans must be broken up to prevent increasing monopoly power—their ability to squash innovative competitors, to spread unchecked into new markets, and to control prices.
I have insisted that ordinary people not accept geekspeak, so you must do your part. Practice describing your tech work, or tech articles in the media, in plain English to relatives and friends.
Speak up when you encounter systems that are unusable, cluttered, and confusing, that have been inadequately tested, and that leave people unhappy and frustrated. Complain to managers if you are working for a firm shipping such monsters of poor design and careless implementation.
Bloatware is software cluttered with thousands of commands and features, most appealing only to a tiny minority of users. Much of today's software is packed with more features than any normal human needs or can use.6 Bloatware makes systems unreliable, unusable, and forbidding. Campaign vociferously against it.
Tech R&D requires CS talent to realize human dreams rather than the nightmares. AI agents helping people must be identified as algorithms. Decisions and actions of algorithms should be transparent and explainable. There are opportunities to ensure that algorithms make fair, just decisions.
Designing AI with goals such as openness, transparency, and fairness in mind is an example of value sensitive design (VSD), developed by University of Washington Professor Batya Friedman and collaborators.4 Computing professionals designing systems can devise ways to ensure that functionality and UI reflect their values. Other key system values are safety, simplicity, clarity, honesty, compassion, and empathy.
VSD is an example of technology motivated by social good. CS students can look for applications that address pressing societal problems, such as the environment. They can look for aspects of technology that speak to and respond to user needs, such as clarity of the user experience, or availability of sensitive customer support.
Because of the pervasiveness of digital technologies, CS students should consider government careers or even running for public office. Most elected officials were trained as lawyers. Occasionally, we see legislators who were doctors or teachers. Computer scientists should also undertake government careers and use their expertise and their ethical concerns to enrich tech policy.
Students can consider a firm's ethical track record in deciding whether to work there. Facebook saw this after misuses of social media during the 2016 U.S. election caused many new graduates to shun the company.
Employees can speak up when they believe their firm's actions are immoral. There are ways to escalate speech that are relatively safe. Start with private conversations and email with fellow workers. Send private statements of your beliefs to managers. If this has no effect, go public, first within the firm, then outside it.
Google is a case study in employee beliefs and actions. Workers spoke up about a military project viewed as odious, work for U.S. Immigration and Customs Enforcement (ICE) that seemed immoral, a search engine for China enforcing censorship, and when the firm's actions with respect to gender equality and safety seemed insufficient. Turmoil within the company continues to the present, especially since there seem to be firings in retaliation for speaking out or publishing papers.b Unions are forming in tech.
Two actions go beyond protesting. One can become a conscientious objector. Patterned on soldiers refusing to serve in war, this is the refusal to work. The distinction between general and selective objection is key. In 1969, my friend Louis Font became the first West Point graduate to declare himself a selective conscientious objector, refusing to serve in Vietnam even though he did not object to all war. One can object to a specific task or to all work, of course at great personal risk to one's career.
The final action is whistleblowing. This occurs when employees are so certain of the immorality of confidential actions that they announce them to the world. The best recent example is Frances Haugen's revelations about actions, and failures to act, by Facebook that have caused great harm. The U.S. government has protections forbidding retaliation against whistleblowers. The same is not true of companies; whistleblowing requires great courage.
There is much we can do. We can assume responsibility so our friends and neighbors understand enough tech to exercise their rights as citizens with respect to its use. We can ensure the education of computer scientists and governance of CS reflect our beliefs. We can pursue careers recalling our values and speak up when a company's actions violate them. We can work with politicians and activists to advance the public good.
This is a clarion call to explain computer science clearly, to take ownership of the responsibilities of CS, to overcome lethargy and defeatism, to think hard about our values, and to step forward and act.
a. See https://bit.ly/3A2gf6v
b. See https://bit.ly/3qw1Psa
This column is derived from the author's address to ACM SIGCHI for its 2020 Social Impact Award.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.