
Communications of the ACM

Viewpoint

Designing an Ethical Tech Developer


[Image: group of people standing outside, with faces shielded and obscured. Credit: Trismegist San]

Most students anticipate a difficult academic transition from high school to college, but for many, the social transition is just as jarring. To ease this stress, Harvard freshman Yuen Ler Chow developed the facial recognition app The FaceTag,7 which links to a database of fellow Harvard students' contact information. After users scan a classmate's face with the app, they gain access to that person's phone number or social media accounts. The app fixed the common problem of meeting someone once but barely catching their name, simplifying the exchange of contact information and streamlining the friend-making process.

More than 100 users downloaded the app in less than two months. However, after Chow posted a viral video to the social media platform TikTok, the comments section broadly agreed that The FaceTag was an unethical violation of privacy.10 The idea that a person need only scan someone's face to gain access to their personal information was too much, even for a generation of avid social media users. Commenters especially did not trust an 18-year-old to keep such sensitive information secure.

During my freshman year at Colby College in the fall of 2021, I explored computer science for the first time. In my introductory class, my skills grew quickly: I developed a diverse skill set in Python, from printing my first "Hello, World!" to processing massive data files. Within months, I was a capable (albeit inexperienced) programmer with a special interest in the ethical dilemmas presented by my introductory computer science professor. She gave us various extra-credit projects that let us explore real-world situations in which the skills we were developing were used or abused. The FaceTag in particular caught my eye because I have friends attending Harvard whose lives would be affected by this program. Yuen Ler Chow unknowingly encouraged me to explore how we can teach young people to wield the power of technology ethically.

For the most part, Yuen Ler Chow was very careful when creating his program. He explained that the data is secured in Google's Firebase, accessible only to him and to those to whom a user grants permission (private accounts require the user's explicit clearance). Students must enter their Harvard login to use the program, ensuring that only Harvard students can access this database. Chow says he aspires to be better than Mark Zuckerberg, the tech mogul who started Facebook as a 19-year-old Harvard sophomore.
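To make that access model concrete, here is a minimal sketch, in Python, of the kind of rule Chow describes: only authenticated users with a Harvard address can look up a profile, and a private profile additionally requires the owner's explicit permission. The class, field, and domain names are hypothetical stand-ins of my own; Chow's actual Firebase configuration is not public.

    # Hypothetical sketch of the access rule described above; not Chow's actual code.
    from dataclasses import dataclass, field

    @dataclass
    class Profile:
        owner_email: str
        contact_info: str
        is_private: bool = False
        approved_viewers: set = field(default_factory=set)

    def can_view(viewer_email: str, profile: Profile) -> bool:
        """Require a Harvard login; private profiles also need the owner's approval."""
        if not viewer_email.endswith("harvard.edu"):  # assumed institutional check
            return False
        if profile.is_private:
            return viewer_email in profile.approved_viewers
        return True

In a Firebase-backed app, the equivalent enforcement would typically live in server-side security rules rather than in client code, so a user cannot simply bypass the check.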

Although Chow's comparisons to Facebook were only in jest, his ambition is indicative of a generation of aspiring Zuckerbergs. Colleges in the U.S. awarded 79,598 degrees in computer science in 2018 (63,704 to men, 15,894 to women), an 11.5% increase from the prior year.14 The demand for programmers is ever-growing and the work ever more lucrative, with the average annual wage for programmers holding a bachelor's degree in computer science standing at $89,000 in 2020.4 The increased demand for students with a CS degree attracts more young people every year, and that counts only those with degrees in the field. Many programmers forgo higher education or begin writing code while still in school. The online accessibility of programming training is part of its allure, allowing young people to acquire coding skills so long as they have an interest and a computer.

As his standing in school suggests, Yuen Ler Chow is part of this young wave of programmers. Although both he and his users must have demonstrated impressive intellect to attend their prestigious university, they are still subject to the limitations of adolescent cognitive development related to decision making. The amygdala, the brain's center of emotional reactivity, drives the decisions and actions of adolescents.9 With maturity, the frontal cortex takes the wheel, allowing more logical decision making and reflection on long-term consequences. The underdeveloped frontal cortex of young people is associated with increased risk-taking behavior. Adolescent brains also overestimate rewards, encouraging risks to be taken without an accurate appraisal of the benefits. As a result, adolescent-designed systems such as The FaceTag are flawed on both ends, resting on the decision making of both the programmer and the target user. Neither stakeholder has the cognitive maturity to fully appreciate the risks of such a program, and, when the rewards seem so great, those risks pale in comparison.


The FaceTag has no shortage of such risks. Facial recognition's inaccuracies leave certain groups misidentified at alarming rates: in one study, women were misidentified 8.1% to 20.6% more often than men, and dark-skinned faces 11.8% to 19.2% more often than light-skinned faces.3 Even if a programmer is confident in the technology's ability to function, that does not mean users should trust each other. A top concern with programs such as The FaceTag is the risk posed by classmates who access users' information through legitimate use of the app. For example, accounts that are not private and permit immediate access to personal information could enable stalking or harassment. Additionally, the complications and inconveniences of meeting someone for the first time provide opportunities to consider whether they are trustworthy. By removing those checkpoints, the bumpy and awkward moments in which a person can pause and evaluate a stranger, the app leaves its users more susceptible to coercion and manipulation. This is especially dangerous for young women, who make up the majority of social media users and who are far more affected by violence in college than any other age or gender group.15,16 These risks are real, whether or not adolescents regard them as such.

Because both programmers and users share the blame for putting users' safety and privacy at risk, both have a part to play in creating and operating safer programs. This includes action by the institutions educating the next generation of programmers, namely teaching computing ethics, especially in introductory computer science courses. These lessons are not intended to substitute for instruction on programming skills and algorithmic design patterns. Instead, they should be an ongoing conversation about the values embedded in algorithms and systems and the differential harms and benefits that technology will have on diverse groups of stakeholders and communities. This conversation must start early and continue throughout the computing program.8 Computer science programs have considerable freedom in how they educate students to consider the ethical impact and responsibilities of technology creation, but we should not save the conversation for an intermediate-level college class or an instructor with a particular interest.


Students just starting on their computing pathway should read about the profit-motivated abuses of technology companies as well as about the computing researchers and professionals who are reshaping development culture with ethics by design.1 New resources are available to support CS faculty who would like to implement ethics lessons in their courses but feel unprepared by their own training to lead discussions or design new lessons. For example, there are hands-on curricula that require students to actively investigate ethical computing concepts such as fairness, accountability, and transparency as applied to the areas that interest them most.12 There are tools and resources for discussing emergent technologies through current case studies and fictional narratives, helping a young generation of programmers and designers develop the skills to think critically about the potential impact of technology on all of humanity and the planet.6 The goal of the movement to embed computing ethics throughout CS curricula, and in turn professional computing culture, is to create a more proactive generation of responsible technologists who know how to think critically about ethical impact at every stage of the design, development, and deployment life cycle. It may also mean deciding not to develop an idea at all, based on that ethical evaluation and a commitment to a set of responsible innovation and design values.

Institutions educating programmers should be actively discussing how best to teach computing ethics beyond accreditation requirements, so their programs reflect their own institutional or departmental values. Furthermore, early-career CS graduates, developers, and systems engineers also need to reexamine their own direct responsibilities to their technology users. Some tech companies have engaged in extensive 'ethics washing' in response to public scrutiny of their technical development, hiring, and business practices, and the results have been predictable and chilling.11 Other companies have instead partnered with ethicists and vulnerable stakeholders to retrain employees and have developed outstanding responsible innovation practices to share with others seeking to do the same.5,13 As a computer science community, ACM has recently updated its code of ethics and professional conduct to guide responsible computing practices and to affirm that individual ethical conduct is a responsibility of the profession.2

Students must learn about these expectations in their first introduction to computer science and throughout the rest of their professional training. The responsibility programmers have to their users goes beyond issues like biases built into code or flawed simulations; it extends to the effects of programming on all stakeholders in society. Although technology stands to optimize many areas of our lives and solve the grand challenges our planet faces, it often does so at the expense of justice, access, privacy, and security. Young people are especially at risk of ignoring (rather than prioritizing) the ethical repercussions of their work. By starting ethics education while their brains and habits are still developing, we will create a safer landscape for technological development that not only mitigates the shortcomings of adolescent psychological development but also builds a skill set these young programmers can use for the rest of their lives. Establishing a commitment to responsible computing practices and accountability among academic institutions, companies, and individual programming professionals (especially young aspiring Zuckerbergs) as soon as possible will ensure we do not sacrifice the well-being of any community in the name of technological progress.


Perhaps it is too idealistic to expect adolescent program developers and their target users to carefully consider the full, long-term impact of the programs they build and use. However, as a new programmer, I welcome continued discussion of the ethical impact of new and old technologies in my computing courses while I build my technical skills. I was fortunate to have a professor who found the time to explore ethical dilemmas in an introductory computer science course, and it has allowed me to execute my own projects without compromising the safety and security of my users. For example, I created a psychological experiment on attention that assigned a randomly generated identification code to every participant to maintain anonymity. While this may seem like a simple or even unnecessary accommodation, it builds the habit of prioritizing users' privacy and integrity above all else, an attitude I will carry throughout my programming career. The more we, as CS students, can learn about the importance of these conversations and our own personal responsibility as technology creators (and users), the more we can learn how to hold ourselves accountable and ensure our code does not harm others in the name of innovation and progress.
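As a minimal illustration of that anonymization choice (a sketch under my own naming assumptions, not the code from my actual experiment), each participant can be issued a random, non-identifying code, with all collected data keyed to that code rather than to a name or email address:

    # Minimal sketch: give each participant a random, non-identifying code
    # so responses can be linked across trials without storing names or emails.
    import secrets

    def new_participant_id() -> str:
        """Return a random 16-character hex code, e.g., 'a3f09c1d2b45e677'."""
        return secrets.token_hex(8)

    # Collected data is keyed only to the anonymous code (values are illustrative).
    responses: dict[str, list[float]] = {}
    pid = new_participant_id()
    responses[pid] = [0.42, 0.39, 0.51]  # reaction times in seconds

Because the identifier is generated independently of any personal information, even a leaked results file reveals nothing about who participated.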

References

1. Allyn, B. Here Are 4 Key Points from the Facebook Whistleblower's Testimony on Capitol Hill. NPR. (2021); https://n.pr/3QSrQgG

2. Association for Computing Machinery. ACM Code of Ethics and Professional Conduct. Association for Computing Machinery. (2018); https://bit.ly/3XGsII5

3. Buolamwini, J. and Gebru, T. Gender Shades. (2018); https://bit.ly/3Hms5h3

4. Bureau of Labor Statistics. U.S. Department of Labor, Occupational Outlook Handbook, Computer Programmers (2022); https://bit.ly/3XISwmK

5. Costanza-Chock, S. Design Justice: Community-Led Practices to Build the Worlds We Need. MIT Press. (2020); https://bit.ly/3XLBzYp

6. Doore, S.A. et al. Computing Ethics Narratives repository. (2021); https://bit.ly/3WvDxeh

7. Duggasani, S. and Gilman, I.Y. Can I scan your face? (Oct. 2021); https://bit.ly/3WkZsVL

8. Fiesler, C. et al. Integrating ethics into introductory programming classes. In Proceedings of the 52nd ACM Technical Symposium on Computer Science Education, (2021), 1027–1033.

9. Fryt, J. and Szczygiel, M. Predictors of positive and negative risk-taking in adolescents and young adults: similarities and differences. Europe's Journal of Psychology 17, 1 (2021), 17.

10. Haskins, C. Harvard Freshman's 'the Facetag' app stokes facial recognition debate. (Oct. 2021); https://bit.ly/3HLFvn3

11. Hao, K. We read the paper that forced Timnit Gebru out of Google. Here's what it says. MIT Technology Review. (2020); https://bit.ly/3ZNPUWa

12. Mozilla Foundation. Responsible Computer Science Challenge: Teaching Responsible Computing Playbook. (2021); https://mzl.la/3QV83x1

13. Mozilla Foundation. The Mozilla Manifesto (2020); https://mzl.la/3HCHiL1

14. National Center for Education Statistics. IPEDS: Integrated Postsecondary Education Data System: Computer Science Degrees. National Center for Education Statistics, Washington, D.C. 2022.

15. Pew Research Center. Social Media Fact Sheet: Demographics of social media users and adoption in the United States. (2020); https://pewrsr.ch/3WsT2nr

16. Stoner, J.E. and Cramer, R.J. Sexual violence victimization among college females: A systematic review of rates, barriers, and facilitators of health service utilization on campus. Trauma, Violence, & Abuse 20, 4 (2019), 520–533.

Author

Alexandra Gillespie (a.gillespie5@icloud.com) is a computational psychology major and a research assistant at the INSITE lab at Colby College in Waterville, ME, USA.


Copyright held by author.
Request permission to (re)publish from the owner/author
