In 2019, in a former railroad terminal in Pittsburgh, PA, a small tech start-up revealed its groundbreaking AI technology. The debut was met by a company whistleblower and an untimely zero-day hack on the company's systems. This series of events played out for audiences of 50 people over eight weeks, each night with a slightly different ending.
The immersive theater production behind these events, Project Amelia,a was written by a technologist (and the first author) and supported by a group of Carnegie Mellon University researchers to expose the public to potential crises of our technical future and learn from their reactions. Project Amelia tried to challenge the incongruence between people's privacy preferences and behaviors. People's beliefs about privacy and data sharing are often not informed by lived experiences that align risk perceptions with reality. Our hypothesis was that a performed encounter with a plausible privacy crisis might push people to act before facing a real one: an identity theft, an arrest, a stalker, or the stark realization that your data may be baked into a new product that appears dystopian.
Our project brought privacy risk to participants in a safer way: through immersive theater.b Delayed precaution and passivity are not unique to privacy; they also mark climate change, epidemiology, social equity, and other "wicked problems" where we struggle to instrument meaningful change. This article looks at our attempt to create and study the impact of an experiential narrative that gave people an embodied role and the agency to explore a crisis involving individual privacy and autonomy. It expands on a growing body of work around experiential futures that generates impactful experiences to help people bridge the "gulf"3 between what they see day-to-day and what they might experience if they were someone else, somewhere else, or in a time yet to come.
This performance also served as a testbed for ongoing research. Our team sought to explore whether this immersive experience of a privacy crisis changed behavior of those who attended. To do this, we observed performances, gathered interaction data, and asked attendees and actors to complete surveys and interviews.
Unlike traditional performances, immersive theater invites audiences to engage with actors in a less-structured format. Instead of taking assigned seats, each member of the audience can freely explore the space, be a "fly on the wall" as scenes are performed, or interact with cast members one-on-one.1 In some cases, the audience can even influence how the narrative unfolds. The result is that each person walks away with an experience unique to how they chose to navigate the event, and each night is special to the audience that participated.
Our research team had the opportunity to work closely with the large-scale immersive production Project Amelia,c written and conceived by team member Michael Skirpan and produced by Bricolage Production Company in partnership with Probable Models. The production was a technology-enabled immersive theater experience that invited audiences into the R&D labs of Aura, an imagined tech giant of the near future, to participate in the launch of a groundbreaking AI product: Amelia. Performances ran for eight weeks between September and November 2019, typically hosting 50 to 60 people per night, six nights per week (see Figure 1).
Prior to each performance, Aura's marketing director reached out to "invited attendees" of the product launch (that is, ticket holders). This initial communication included an online survey that was used to assign each person a "role" within the performance. They were also invited to link their social media accounts to an audience-specific database that was made available to technology installations and provided to actors to infuse into interactions within the performance.
On arrival, guests were checked in, asked to surrender their electronic devices, provided with a dedicated smartphone to take part in the social network of the narrative world, and given an RFID wristband that could be scanned to unlock their real-world data for experiences with several fictitious products.
The show began with tours of Aura's research labs. Each tour was framed for the role of the audience group; for instance, board members would meet the CEO, whereas journalists were shown key products and sold on the company's successes (see Figure 2). After the tours, everyone was called to an auditorium for the AI product launch: a sophisticated humanoid android named Amelia. Halfway into the presentation, a former employee arrived to interrupt the company's big debut with a whistleblowing revelation that the company was unethically experimenting on users. While Aura executives scrambled to save face, a hacker planted in the audience tampered with Amelia. These two frictions set the stage for the remainder of the show, which had seven unique endings determined by the audience's choices in the interactive scenes that followed (see Figure 3).
This plot simulated a moment where a company's lack of concern for privacy and ethics led to real human impacts. Through the story itself, private moments between audience and actors, and interactions with technology, an array of themes related to privacy and computing ethics emerged to both help raise audience awareness and also provide an experience with some real (albeit fictional) stakes.
One of the key story elements was the company whistleblower Felicia's revelation that Aura was running experiments on its customers' mental health and emotional well-being without any kind of oversight or consent. Felicia slowly leaked hints of an A/B test targeting the mental health of a group of users that ended with dire consequences for some. Those who followed Felicia's storyline participated in 30 minutes of actor-supported sleuthing to learn the details of an experiment eerily similar to the controversial Facebook Emotional Contagion Study.2 The audience's commitment to understanding the whistleblower's truths and bringing Aura to justice was one of the main factors determining the ending. This thread allowed for both interactive learning and deeper reflection on the role of whistleblowers in our society.
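The mechanics of the kind of experiment the story alludes to can be sketched in a few lines. This is a hypothetical illustration only: the condition names, sentiment labels, and functions are invented here and do not appear in the show or in any real study; the point is how silently bucketing users into feed-filtering arms requires no consent step at all.

```python
import random

# Hypothetical experimental arms for an uncontrolled A/B test on user
# well-being, loosely in the spirit of emotional-contagion-style studies.
CONDITIONS = ["control", "suppress_positive", "suppress_negative"]

def assign_condition(user_id: str, seed: int = 42) -> str:
    """Silently and deterministically bucket a user into an arm.

    Note what is absent: no consent prompt, no opt-out, no oversight.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return rng.choice(CONDITIONS)

def filter_feed(posts: list[dict], condition: str) -> list[dict]:
    """Drop posts of a given sentiment depending on the user's arm."""
    if condition == "suppress_positive":
        return [p for p in posts if p["sentiment"] != "positive"]
    if condition == "suppress_negative":
        return [p for p in posts if p["sentiment"] != "negative"]
    return posts
```

Because assignment is deterministic in the user ID, a user sees a consistently skewed feed over time without ever knowing an experiment is running.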
Another moment from the story showcased emerging issues in AI privacy and security: a hacker giving an adversarial input to Amelia. While the Aura execs were distracted, an actor planted in the audience approached Amelia and played an odd set of sounds. The attack was inspired by the infamous DolphinAttack using inaudible voice commands.4 It left Amelia in a state of confusion, unable to identify contexts, emotions, and solutions as the android had previously demonstrated. This moment not only acted as a teaching tool for a complex and relatively new security threat vector; it also provoked questions about how AI and other autonomous systems should be governed.
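The principle behind such inaudible commands can be illustrated with a toy signal model. A voice command (the baseband) is amplitude-modulated onto an ultrasonic carrier, so every transmitted frequency component sits above human hearing; nonlinearity in a microphone can then demodulate the hidden command. The parameters below are illustrative, not taken from the show or the original paper.

```python
import math

CARRIER_HZ = 25_000      # ultrasonic carrier (illustrative value)
BASEBAND_HZ = 1_000      # a single tone standing in for the voice command
AUDIBLE_LIMIT_HZ = 20_000  # rough upper bound of human hearing

def am_sample(t: float, depth: float = 0.5) -> float:
    """One sample of the amplitude-modulated ultrasonic signal."""
    envelope = 1.0 + depth * math.cos(2 * math.pi * BASEBAND_HZ * t)
    return envelope * math.cos(2 * math.pi * CARRIER_HZ * t)

def transmitted_frequencies() -> list[float]:
    """Spectral components of AM: the carrier plus two sidebands."""
    return [CARRIER_HZ - BASEBAND_HZ, CARRIER_HZ, CARRIER_HZ + BASEBAND_HZ]
```

All three transmitted components (24 kHz, 25 kHz, 26 kHz here) lie above the audible range, which is why the audience hears only "an odd set of sounds" while the device receives a command.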
The themes of privacy and ethics ran deeper than the storyline. Audience members were further engaged in interactive moments or throughlines that could allow them to spend time considering the questions being raised. For instance, two audience members per night were invited by the hacker to join "Cicada," an elite hacking organization trying to bring down Aura. Those who agreed were taken on a parallel shadow-track of the show that played out more like an escape room.d They searched for poorly hidden passwords, identified cameras living on insecure internal networks, and picked a lock to potentially shift the show's ending in favor of a hacker's revolt.
Other audience members were given moments of small-group time with Amelia. This often led to fascinating displays of people's inner hopes, fears, and misunderstandings of AI. Patrons would regularly ask Amelia, "[w]hat data are you accessing right now?" which offered Amelia ways to play around with the inferential possibilities of AI and predictive privacy (for example, using body language and tone to claim that Amelia inferred the person as 'threatening' or 'disturbed'). Sometimes Amelia asked the group whether AI was about to revolutionize the world. These moments would often spark impromptu discussion and debate among the audience.
Cast as board members or key stockholders, some attendees were brought into a boardroom to speak with the CEO following the whistleblower's accusations. They were tasked with deciding how aggressive the company should be toward Felicia. Their decisions impacted what happened in the coming scenes.
Nearly a dozen technology installations and fictitious "beta-products" were installed around the performance space. One product, Own Up, challenged people to take part in an entertaining privacy experiment. Audience members could check in to a machine that pulled from their recent social media history to display anonymized quotes of those playing the game on a large public projector screen. Those who decided to "own up" to what they said would walk to the center and press a button in front of everyone. The quotes that were unclaimed sat in a graveyard on the screen for all to consider. Another product, Aura Vision,e made a variety of inferences (often of questionable accuracy) about individuals using only their face, including age, gender, emotion, attractiveness, responsibility, and other attributes.
Our research team developed an IRB-approved protocol for collecting audience data that we could use to evaluate the impact of the experience. We found immersive theater to be a challenging environment for conducting research that respects participant consent—how do you convey to an audience that has lived in an imaginary world for the evening that our request for consent to use show data for research purposes is actually real? Technical difficulties with the show's Wi-Fi network and low-end smartphones posed additional challenges to our planned automated data collection. We nevertheless managed to collect some data in the form of interviews and surveys after the performances. Only a small fraction of audience members answered in-depth questions about their privacy intentions and behaviors. While the show did not give most participants specific skills for managing their privacy, our data suggests it was highly successful in equipping participants with motivation and broader frameworks for discussing privacy and making decisions. For example, one audience member said, "I walked away from Project Amelia with more awareness of my presence on the Internet, determined I wanted to manage the information that circulates [online]."
The inclusion of many different stakeholder characters and open dialogue also equipped audience members to pause and weigh decisions that others might consider less privacy protective. Several weeks later, we asked some audience members to recall any privacy-related actions they took as a result of Project Amelia. One explained how they decided to acquire a smart device: "I had always been kind of creeped out by smart speakers. But, I had the opportunity to get one for free and I thought I would try it out. Project Amelia brought up the ways technology can make life easier. While I don't completely trust the smart speaker algorithm, I decided I was okay with giving up that portion of privacy to have the connectivity."
Immersive theater and other narrative approaches can open new doors in online privacy education. Project Amelia afforded audiences and actors the opportunity to safely try on roles, behaviors, and opinions not available to them in everyday life. While how-to workshops and resources certainly have their place, technologists ought not underestimate the power of narrative and open-ended dialogue in effecting behavior change. In a survey, one audience member illustrated how the show impacted discussions with friends, "Project Amelia gave me a new way to initiate and frame conversations with family/friends. If you're talking about a theatrical production, people don't shut down as quickly. Once you're into the conversation, you can turn it toward reality."
Reflecting on the experience, our research team believes we could have been more successful in collecting data had we prioritized a smaller set of data targets and relied less on automated collection methods. Considerations for future researchers include:
Our team hopes to see more collaborations similar to Project Amelia, where artists and writers produce interesting, engaging, and thought-provoking content, researchers and industry players explore important questions at the heart of societal living, the audience actively participates in exploring an invented world, and all collectively enjoy and learn from the experience. Project Amelia focused on important current privacy and ethical dilemmas, but this approach can be borrowed and applied to many exciting areas, using fictional worlds to thoughtfully and positively advance our real world.
4. Zhang, G. DolphinAttack: Inaudible Voice Commands. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (CCS '17). ACM, New York, NY (2017), 103–117; DOI:https://doi.org/10.1145/3133956.3134052
a. See https://bit.ly/3nCzRZX
b. Immersive theater is in the family of dramatic methods that includes experiential learning, legislative theater, and futures work. Other technology-related works include Arizona State University's "Emerge: A Festival of Futures" (https://bit.ly/32aEdjg) and Science Gallery Dublin's Grow Your Own (https://bit.ly/3FA3tgw). For those who are curious, you might consult Dunne and Raby's 'Speculative Everything' (MIT Press, 2013) as well as the work of Super-Flux (https://bit.ly/33QpUB3), the Near Future Laboratory (https://bit.ly/3nCCg6V), the Situation Lab (https://bit.ly/3ADqzSF), the Institute for the Future (https://bit.ly/3qDMkOD), and the Center for Science and the Imagination (https://bit.ly/3qF6ojJ).
c. This episode of the PBS television series "Immersive World" shows details about the performance: https://to.pbs.org/33MP8QC
d. Escape rooms are group puzzle games that are usually done in staged environments where participants are locked in a room and must complete all puzzles, often related to an underlying story, in order to get out and ultimately "win."
e. A remake of the computer vision exhibit "Biometric Mirror" by Niels Wouters; see https://bit.ly/3GCls7k
The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.