Values Levers: Encouraging Ethical Debate within Software Design

Abstract: 

Emerging information and communications technologies enable individuals and communities to collect and share granular, accurate, and sometimes personal data about their lives and environments, raising numerous challenges for socially responsible innovation. To encourage socially responsible data collection, ICT designers must incorporate discussions of ethics, values, and social responsibilities into their design process. An ethnographic study of mobile ICT development suggests that values levers – design practices which encourage new conversations about social values and help designers come to consensus about those values – play an important role in encouraging responsible innovation in ICT. 

Full content: 

Values Levers: Building Critical Reflection into Software Design

Growing interest in corporate social responsibility and privacy by design is transforming ethical decision-making into a critical component of information and communication technology (ICT) development. Policymakers are increasingly encouraging corporate responsibility in the ecosystem of ICTs that collect, process, and share personal data. For example, US policymakers are asking the emerging mobile application market to self-regulate according to voluntary codes of conduct (Federal Trade Commission, 2012). Such self-regulation places responsibility for ethical decision-making on the designers of mobile data applications.

Ethical decision-making is critical in this space because emerging tracking and recording technologies enable individuals and communities to collect and share granular, accurate data about their lives and environments. Mobile devices and wearable sensors collect location, movement, and activity data that can contribute to new treatments in health and wellness, discoveries in public health and social science, and community building and civic engagement. But these data are also quite personal and sometimes sensitive, and such technologies raise challenges to privacy, consent, equity, and data retention and accountability (Shilton, 2009). Designers of pervasive data collection systems must make ethical decisions about the granularity of data collected, how (and if) user consent is sought, whether sensitive data will be stored over the long term, and whether sensitive data will be sold to third parties or turned over to governments upon request.
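To make one such decision concrete, consider how data granularity might be handled in code. The sketch below is purely illustrative (hypothetical Python, not drawn from any system discussed here): coarsening coordinates before storage trades analytic precision for privacy, and choosing the rounding level is exactly the kind of ethical design decision described above.

```python
# Hypothetical sketch: reducing the granularity of stored GPS coordinates.
# Roughly, 4 decimal places (~11 m) can pinpoint a building, while
# 2 decimal places (~1.1 km) only identify a neighborhood.

def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Round coordinates to `decimals` places; the choice of `decimals`
    is the ethical design decision, not an implementation detail."""
    return (round(lat, decimals), round(lon, decimals))

precise = (34.0702, -118.4467)          # fine-grained: a specific building
coarse = coarsen_location(*precise)     # coarse-grained: a neighborhood
print(precise, "->", coarse)            # (34.0702, -118.4467) -> (34.07, -118.45)
```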

Though building consideration of ethics directly into engineering practice is a growing movement (Johnson, 2011; Miller, Friedman, & Jancke, 2007; Shilton, 2010a; Spiekermann & Cranor, 2009; Verbeek, 2006), we still do not fully understand how technical affordances that support some values (e.g. efficiency and novelty) over others (e.g. privacy and security) are chosen and implemented during the design process. Encouraging software engineers and mobile developers to recognize the shifting and permeable boundary between data collection for individual or social goals, and corporate or government surveillance, is a challenging goal (Shilton & Estrin, 2012).

Ethnographic study of a mobile ICT development laboratory (Shilton, 2010a, 2013) suggested that some practices within design explicitly encourage attention to ethical issues and social responsibility. These are values levers: design practices which encourage new conversations about social values in design and help designers come to consensus about those values. Four design practices in particular – 1) participating in prototype testing, 2) participating in interdisciplinary teams, 3) internalizing advocacy from a team member dedicated to values issues, and 4) seeking and gaining new funding – proved effective at surfacing social issues and generating consensus around technological features based on social values. One final lever – navigating institutional ethical mandates – proved less effective because of its distance from the design process. By encouraging values levers, we can encourage attention to ethics and social responsibility within software design.


Background

Values levers are a part of how, as Verbeek writes, engineers “do ethics by other means” (2006, p. 369). Three years of participant-observation in a design laboratory provided a case study in design practices that can help software engineers attend to social values as part of their design work. Shilton performed interviews, document analysis, and participant observation at the Center for Embedded Networked Sensing (CENS), a science and technology research center based at the University of California, Los Angeles (UCLA). The project investigated how social values such as privacy, consent, equity, and social forgetting intersected with design work in this lab. Coding field notes, documents, and interviews revealed that these social values tended to surface during a variety of design activities. These activities, identified as values levers, opened new conversations about ethics and values (Shilton, 2013). By investigating what factors encourage engineers – consciously or not – to prioritize human values in their work, this study illustrated that the routinized practices of design shape the values incorporated into new technologies.


Examples

CENS engineers discovered and debated privacy, consent, and equity concerns during three commonplace laboratory activities: testing prototypes of their data collection applications and those of their colleagues; discussing their work with interdisciplinary colleagues; and shifting arrangements and responsibilities as teams grew in size due to increased funding and resources (Shilton, 2013).

As in many development labs, designers tried new systems themselves before they were tested with users. Prototype testing fostered a focus on personal data, which was distinctive within the design process. During self-testing, the kinds of data collected (such as granular location tracking) helped designers explore and experience the possible inferences that could be drawn from that data. Self-testing made personal data, and ethical concerns surrounding that data, material for designers. That internal prototype testing is important to good design is not surprising, but the effects of such testing on consideration of social values have gone unexplored. When CENS students ran their colleagues’ location-tracking programs, they gained new respect for privacy and equity as design criteria. A practice meant to check new products for usability and bugs had the unanticipated result of encouraging researchers to reflect on the sensitivity of the personal data in their systems.
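The inferential power of granular location data is easy to demonstrate. The sketch below is a hypothetical illustration (Python, not CENS code, with invented data): even a handful of timestamped location fixes support a credible guess at where a user sleeps.

```python
# Hypothetical sketch: inferring a likely home location from raw
# (timestamp, lat, lon) fixes by clustering coarsened night-time points.
from collections import Counter
from datetime import datetime

fixes = [
    ("2013-05-01T02:10:00", 34.0702, -118.4467),
    ("2013-05-01T03:40:00", 34.0701, -118.4469),
    ("2013-05-01T14:00:00", 34.0522, -118.2437),  # daytime, elsewhere
    ("2013-05-02T01:30:00", 34.0703, -118.4466),
]

def likely_home(fixes, night_hours=range(0, 6)):
    """Return the most frequent coarse grid cell visited at night."""
    cells = Counter(
        (round(lat, 3), round(lon, 3))
        for ts, lat, lon in fixes
        if datetime.fromisoformat(ts).hour in night_hours
    )
    return cells.most_common(1)[0][0] if cells else None

print(likely_home(fixes))  # -> (34.07, -118.447): a strong home-address guess
```

Self-testers who confront inferences like this on their own traces experience the sensitivity of the data directly, rather than in the abstract, which is precisely the reflective effect prototype testing had at CENS.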

Working alongside interdisciplinary colleagues was another practice that encouraged a focus on the social concerns surrounding personal data collection. While a majority of CENS designers were pursuing degrees in computer science (CS) or electrical engineering (EE), statisticians attended weekly meetings as well. Engineers tended to focus on system parameters such as speed and resiliency, but statisticians frequently referred designers back to issues in the data. This refocusing on (often sensitive) project data acted as a values lever: it opened not only statistical discussions but also ethical debate about data representation, sharing, and security. Being forced to talk across disciplinary boundaries helped the design team articulate the social values that mattered to it.

Seeking funding and subsequently gaining resources were also values levers at CENS, as growth in funding encouraged practices that fostered attention to values. Outside funding (such as from the National Science Foundation) ensured there would be graduate students to work on a project, full-time staff to concentrate on duties unsuited to graduate students, and resources such as mobile phones and server space to devote to a project. Better-funded projects had correspondingly large development teams. Larger teams required formal weekly meetings and clear lines of communication. Social values tended to come up in these meetings for a variety of reasons: leaders often attended larger meetings, as did interdisciplinary colleagues, and the discussions fostered by a larger, more diverse group tended to reveal social worries and opinions, which were then elevated to design concerns. This contrasted with smaller projects, which had little outside funding and only two or three developers. Design meetings for these projects were often spur-of-the-moment, and team members communicated largely over email. Fewer ethical concerns surfaced in the discussions of these small working teams.

Values levers were most often deployed by people close to design, including interdisciplinary colleagues and project leaders. But CENS designers were also influenced by actors farther from design, including university administrators charged with overseeing the responsible conduct of research. The university imposed its own ethical mandates on design, enforced through the oversight of the Institutional Review Board (IRB). Laboratory leaders were proactive in their relationship with the IRB, actively informing it of research developments. The IRB was, in turn, flexible and accommodating of design timelines and internal procedures. Though the IRB demonstrated its willingness to work with engineers, designers still considered seeking IRB approval to be undesirable or even painful. Negotiating outside ethical mandates required paperwork and considerable time, and therefore slowed the pace of testing and implementation. The focus on paperwork made IRB discussions into administrative tasks rather than practices central to design decision-making. In this way, the IRB functioned very differently from other values levers, which brought values discussions into design meetings. Outsider status frustrated the IRB’s effectiveness. Moving ethical reflection closer to design is an important attribute of effective values levers.


Related Solutions

Values levers are an approach to embedding ethics in design that reflects other, related approaches currently being explored in the technology ethics community. Values levers are complementary to, but distinct from, approaches focused on classroom education for engineers (Hollander, 2009) and interventions such as embedding ethics experts within design teams (Fisher & Mahajan, 2010). Values levers also build upon related approaches in value sensitive design (Friedman, Kahn, & Borning, 2006), critical making (Ratto, 2011), and reflective design (Sengers, Boehner, David, & Kaye, 2005). While the latter approaches are recognized as distinctive design practices, values levers occur in everyday design settings.

A values-levers approach is also complementary to specific interventions (what could be defined as external values levers) such as values cards (Friedman & Hendry, 2012), design scenarios (Carroll, 2000), and cultural probes (Halpern, Erickson, Forlano, & Gay, 2013). Indeed, these purposeful activities may serve as values levers by encouraging ethical discussion and consensus within design. But a focus on finding values levers attends to strengthening the existing practices that evoke values discussions within design. For a laboratory leader, corporate counsel, or funding representative who hopes to encourage discussion of social consequences during ICT design, values levers are a tool for reshaping workplace discussions. While educational, interventionist, and alternative design culture approaches are all important, a focus on values levers suggests that changes to the structure and work processes of design labs can encourage socially responsible innovation.


Limitations and Future Work

The major limitation of this research, like most qualitative work, is that values levers are not immediately generalizable. Work practices that open new discussions about social values may vary between organizations or ICT deployment contexts. Additional case studies from multiple design contexts will help identify and unpack other work practices that may serve as values levers. Shilton is pursuing both qualitative and quantitative follow-up research to identify and generalize values levers in ICT development, and to identify possible impediments to values levers within design.

An additional limitation of the CENS study is that the values analyzed all relate to privacy and surveillance. This limited range of social values was a product (and weakness) of the author’s method, rather than an attribute of values levers or of discussion at CENS. Because of the concerns raised by the pervasive nature of CENS data collection, and because of Shilton’s academic background, anti-surveillance values (e.g. privacy, consent, equity, and forgetting) were identified as objects of interest at the outset of the ethnography (Shilton, 2010b). So while a range of other values was discussed at CENS – for example, openness in open software development – the author did not explicitly collect data on the levers that raised those values. This limitation points to a considerable challenge for values research: deciding which values, or whose, to focus on. Researchers have long struggled with whether values should be identified deductively, as from a list or existing framework, or inductively, from observation of the research team (Le Dantec, Poole, & Wyche, 2009; Shilton, Koepfler, & Fleischmann, 2013). Because the discovery of values levers was limited by the deductive framework applied to data collection at CENS, Shilton’s ongoing work observing a contrasting technology design team will take a different approach. In this new work, the author is choosing values to focus on inductively, by examining and coding values expressed in early publications, and then observing how and why the values discussion changes over time (Shilton & Koepfler, 2013).

Though the concept is still young, values levers suggest that researchers interested in social responsibility in ICT should pay attention to the structures and practices of engineering workplaces. By encouraging social considerations as part of the work of design, we can encourage developers to implement appropriate standards for data protection and privacy, consent, data retention and forgetting, and equity. By incorporating social and human values as design criteria, software engineers can strive for positive social impacts, counter possible negative impacts, and encourage socially responsible innovations. Ongoing work will continue to study how work practices and organizational arrangements can directly influence socially responsible design.


References

Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge, MA: The MIT Press.

Federal Trade Commission. (2012). Protecting consumer privacy in an era of rapid change: recommendations for businesses and policymakers. Washington, DC: Federal Trade Commission.

Fisher, E., & Mahajan, R. (2010). Embedding the humanities in engineering: art, dialogue, and a laboratory. In M. E. Gorman (Ed.), Trading zones and interactional expertise: creating new kinds of collaboration (pp. 209–230). Cambridge, MA: MIT Press.

Friedman, B., & Hendry, D. G. (2012). The envisioning cards: a toolkit for catalyzing humanistic and technical imaginations. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems (pp. 1145–1148). New York, NY, USA: ACM. doi:10.1145/2207676.2208562

Friedman, B., Kahn, P. H., & Borning, A. (2006). Value sensitive design and information systems. In D. Galletta & P. Zhang (Eds.), Human-Computer Interaction and Management Information Systems: Applications (Vol. 6). New York: M.E. Sharpe.

Halpern, M. K., Erickson, I., Forlano, L., & Gay, G. K. (2013). Designing collaboration: comparing cases exploring cultural probes as boundary-negotiating objects. In Proceedings of the 2013 conference on Computer supported cooperative work (pp. 1093–1102). New York, NY, USA: ACM. doi:10.1145/2441776.2441900

Hollander, R. (2009). Ethics Education and Scientific and Engineering Research: What’s Been Learned? What Should Be Done? Summary of a Workshop. Washington, D.C.: National Academy of Engineering.

Johnson, D. G. (2011). Software Agents, Anticipatory Ethics, and Accountability. In G. E. Marchant, B. R. Allenby, & J. R. Herkert (Eds.), The Growing Gap Between Emerging Technologies and Legal-Ethical Oversight (pp. 61–76). Springer Netherlands.

Le Dantec, C. A., Poole, E. S., & Wyche, S. P. (2009). Values as lived experience: evolving value sensitive design in support of value discovery. In Proceedings of the 27th international conference on human factors in computing systems (CHI) (pp. 1141–1150). Boston, MA, USA: ACM.

Miller, J. K., Friedman, B., & Jancke, G. (2007). Value tensions in design: the value sensitive design, development, and appropriation of a corporation’s groupware system. In Proceedings of the 2007 international ACM conference on Supporting group work (pp. 281–290). Sanibel Island, Florida, USA: ACM.

Ratto, M. (2011). Critical Making: Conceptual and Material Studies in Technology and Social Life. The Information Society, 27(4), 252–260. doi:10.1080/01972243.2011.583819

Sengers, P., Boehner, K., David, S., & Kaye, J. (2005). Reflective design. In Proceedings of the 4th decennial conference on Critical computing: between sense and sensibility (pp. 49–58). New York, NY, USA: ACM. doi:10.1145/1094562.1094569

Shilton, K. (2009). Four billion little brothers?: privacy, mobile phones, and ubiquitous data collection. Communications of the ACM, 52(11), 48–53.

Shilton, K. (2010a). Technology development with an agenda: interventions to emphasize values in design. In Proceedings of the 73rd Annual Meeting of the American Society for Information Science & Technology (ASIST) (Vol. 47). Presented at the 73rd Annual Meeting of the American Society for Information Science & Technology (ASIST), Pittsburgh, PA.

Shilton, K. (2010b). Participatory sensing: building empowering surveillance. Surveillance & Society, 8(2), 131–150.

Shilton, K. (2013). Values levers: building ethics into design. Science, Technology & Human Values, 38(3), 374–397.

Shilton, K., & Estrin, D. (2012). Ethical issues in participatory sensing. In Ethics CORE. Urbana-Champaign, Illinois: University of Illinois. Retrieved from http://nationalethicscenter.org/content/article/177

Shilton, K., & Koepfler, J. A. (2013). Making space for values: communication & values levers in a virtual team. In Proceedings of the 6th International Conference on Communities and Technologies (C&T 2013). Presented at the 6th International Conference on Communities and Technologies (C&T 2013), Munich, Germany: ACM.

Shilton, K., Koepfler, J. A., & Fleischmann, K. R. (2013). Charting Sociotechnical Dimensions of Values for Design Research. The Information Society, 29(5).

Spiekermann, S., & Cranor, L. F. (2009). Engineering Privacy. IEEE Transactions on Software Engineering, 35(1), 67–82.

Verbeek, P.-P. (2006). Materializing Morality: Design Ethics and Technological Mediation. Science, Technology & Human Values, 31(3), 361–380. doi:10.1177/0162243905285847