Spooks in the daylight – the perverse side of data privacy

By Rebekah Rousi

By now, the terms ‘data privacy’ and ‘General Data Protection Regulation’ (GDPR) are well known to expert and novice alike. There are many reasons why even the general public understands the need to be cautious and to protect personal information. Most are aware of identity theft, fraud, direct financial loss, and scarred reputation (Kesan, Hayes & Bashir, 2015), and even of the unforgetting, unforgiving nature of disclosing certain types of information via the Internet (see, e.g., Guitton’s (2019) article on sexpionage). But what are some of the murkier issues lurking beneath?

Our perception of privacy and potential threats operates on an embodied level (Smith & Ellsworth, 1985). When issues arise that are experienced as an imminent danger to our wellbeing, our bodies prepare to react – in order to survive. Freeze, flight or fight (Isen, Daubman & Nowicki, 1987) are basic modes of response to threat that activate and are sensed on various levels of consciousness. Responses range from being mildly creeped out, as in the case of ‘resigned acceptance’ (Shklovski et al., 2014) – whereby users understand that they are being watched and are disturbed by this fact, yet over time the panopticon-like sensation imposed by the device fades into an awareness and a dislike, coupled with the understanding that this state of being will remain regardless of attitude – to an aroused sense of spook: constant paranoia and disturbance. There is an edginess to the surveillance side that just doesn’t shake once people start conceptualising the ways these data-driven systems work.

Spookiness comes into play when these surveillance systems become personified – when people begin to imagine or conceptualise the people and intentions behind the surveillance technology, and when they feel that this form of techno-practice is a direct attack on their human integrity. This disturbed state is a fully embodied experience that causes stress and manifests in lack of sleep, depression, and even suicide, similar to the state generated by being stalked (Davis, Coker & Sanderson, 2002).

Now, here comes the flip side. Privacy can be, and is being, used by felons to protect their own identities when committing crimes (Jardine, 2018). Often these crimes involve a direct violation of others’ privacy and human integrity. The drive of many privacy-shielded, yet privacy-breaching, criminals is an embodied and highly aroused one, fed by controlling and harming others (Ferrell, 1997).

Then, to bring this down to our own behaviour in everyday life, all we have to observe is how we position ourselves in relation to other people’s data privacy. When perusing social media pages and websites, are we driven by a friendly affective sensation of curiosity and concern for our associates’ wellbeing? Or are there darker intentions behind following the information of those around us? All of this, in turn, boils down to our experience of cognitive affect from various angles, via differing intentions, and in alternate contexts. Think about that the next time you browse Google Scholar.

About the author:

Rebekah Rousi is an Associate Professor of Communication and Digital Economy at the University of Vaasa, Finland. Rousi is a human-centered specialist who focuses on examining the relationship between human experience and technology design. Rousi obtained her PhD in Cognitive Science at the University of Jyväskylä, Finland, on the topic of user experience from a cognitive semiotic perspective. Rousi has worked in a number of projects focusing on a range of topics from embodied and multisensory user experience to digital literacy in the context of mental health. Rousi is currently Principal Investigator of an Academy of Finland project titled, “The emotional experience of privacy and ethics in everyday pervasive systems (BUGGED)” and leads the VME Interaction Design Environment for development and research of future human-technology interaction. Rousi’s research interests include embodied experience in human-robot interaction, human-AI interaction, posthumanism, trust, and ethics in data-driven systems.


References:

Davis, K. E., Coker, A. L., & Sanderson, M. (2002). Physical and mental health effects of being stalked for men and women. Violence and Victims, 17(4), 429-443.

Ferrell, J. (1997). Criminological verstehen: Inside the immediacy of crime. Justice Quarterly, 14(1), 3-23.

Guitton, M. J. (2019). Manipulation through online sexual behavior: Exemplifying the importance of human factor in intelligence and counterintelligence in the Big Data era. The International Journal of Intelligence, Security, and Public Affairs, 21(2), 117-142.

Isen, A. M., Daubman, K. A., & Nowicki, G. P. (1987). Positive affect facilitates creative problem solving. Journal of Personality and Social Psychology, 52(6), 1122.

Jardine, E. (2018). Privacy, censorship, data breaches and Internet freedom: The drivers of support and opposition to Dark Web technologies. New Media & Society, 20(8), 2824-2843.

Kesan, J. P., Hayes, C. M., & Bashir, M. N. (2015). A comprehensive empirical study of data privacy, trust, and consumer autonomy. Indiana Law Journal, 91, 267.

Shklovski, I., Mainwaring, S. D., Skúladóttir, H. H., & Borgthorsson, H. (2014, April). Leakiness and creepiness in app space: Perceptions of privacy and mobile app use. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2347-2356).

Smith, C. A., & Ellsworth, P. C. (1985). Patterns of cognitive appraisal in emotion. Journal of Personality and Social Psychology, 48(4), 813.

*The featured image was made with the combination of Adobe Stock and Generative Fill on Adobe Photoshop.