On September 21, 2021 our SP2 doctoral candidate Verena Distler successfully defended her thesis titled:
“The Experience of Security in Human-Computer Interactions: Understanding Security Perceptions Through the Concept of User Experience”
*** Congratulations to Verena for being honoured with an Excellent Thesis Award by the Doctoral School in Humanities and Social Sciences ***
Members of the defense committee:
Katharina Krombholz, CISPA Helmholtz Center for Information Security
Florian Alt, Bundeswehr University Munich
Carine Lallemand, University of Luxembourg and Eindhoven University of Technology
Gabriele Lenzini, University of Luxembourg
In traditional interactions that do not rely on technology, most people are able to assess risks to their privacy and security and understand how to mitigate them. However, risk assessment and mitigation are more challenging when interacting with technology, and people’s perceptions of security and privacy risks are not always aligned with reality. Those who design technologies need to understand how people perceive the security of those technologies so that their designs do not contribute to erroneous perceptions. Instead, interactions with technology should be deliberately designed to ensure that people neither over- nor underestimate the security provided by the system.
This dissertation contributes to a better understanding of users’ perceptions of security in human-computer interactions. It investigates which factors induce a perception of security and privacy risks and how user-centered design can influence these factors to deliberately design for or against perceived security.
I use a mixed-methods approach to address these objectives, including a systematic literature review, empirical data collection with focus groups, expert co-creation sessions, user tests in a controlled environment and a quantitative survey experiment.
The first research objective is to analyze how security and privacy researchers induce a perception of security and privacy risks in research participants. We conducted a systematic literature review and focused our analysis on study methods; risk representation; the use of prototypes, scenarios, and educational interventions; the use of deception to simulate risk; and types of participants. We discuss the benefits and shortcomings of these methods, and identify key methodological, ethical, and research challenges in representing and assessing security and privacy risk. We also provide guidelines for the reporting of user studies in security and privacy.
The second research objective is to explore the factors that contribute to the acceptance of privacy and security risks in situations where people need to weigh the potential advantages of a technology against its associated privacy or security risks. We conducted a series of focus groups and highlighted the reasons why people accept compromises to their privacy and security, finding that perceived usefulness and the fulfilment of the psychological needs for autonomy and control were important factors. Our results suggest potential links between technology acceptance models and user experience models in the context of privacy-relevant interactions.
The third research objective is to design and evaluate examples of visible representations of security mechanisms, with a focus on encryption. We empirically studied the effects of these visual and textual representations on user experience, perceptions of security, and users’ understanding of encryption, in a series of both lab studies and online experiments. In a vignette experiment, we find that more complex descriptions of encryption, when designed carefully, can lead to better understanding and higher perceived security. However, we find no effect of novel visualizations of encryption on user experience (UX), perceived security, or understanding of encryption.
The fourth objective is to explore how we might make the link from subjective experience to more secure behaviors. We introduce a new framework of security-enhancing friction design. The framework suggests helping users behave more securely by designing for moments of negative UX in security-critical situations while also ensuring that overall UX remains at an acceptable level to avoid disuse of secure technologies.
Overall, this doctoral dissertation contributes to research in the field of human-computer interaction and, more specifically, usable privacy and security. It improves our understanding of the methods that researchers in the field of usable privacy and security use to create a perception of risk, and of the factors that make people accept or reject certain privacy trade-offs. This dissertation also helps researchers and creators of technology understand how their designs influence perceptions of security, UX, and understanding of encryption. This enables them to design for or against a perception of security, depending on the actual level of security provided by the technology. Finally, we conceptualize security-enhancing friction, a framework that suggests helping users behave more securely by designing for moments of negative UX.
Link to dissertation: