Abstract
Under various legal frameworks, such as the European General Data Protection Regulation (GDPR), an end-user's consent constitutes one of the well-known legal bases for personal data processing. However, research has shown that most end-users have difficulty making sense of what they are consenting to in the digital world, and that marginalized people face even greater difficulties in managing their digital privacy. In this paper, drawing on an enactivist perspective in cognitive science, we develop a basic human-centric framework for digital consent. We argue that the act of consenting is a sociocognitive action comprising cognitive, collective, and contextual aspects. Based on this theoretical framework, we present a qualitative evaluation of the consent-gaining practices of the five big tech companies, i.e. Google, Amazon, Facebook, Apple, and Microsoft (GAFAM). The evaluation shows that these companies fall short in empowering end-users, as they largely neglect the human-centric aspects of the act of consenting. On this basis, we argue that their consent-gaining mechanisms violate principles of fairness, accountability, and transparency, and suggest that our approach may even cast doubt on the lawfulness of the acquired consent, particularly in light of the basic requirements for lawful consent under the GDPR.
Notes
- 1. To ensure the webpage was specifically designed to be compliant with the GDPR, we chose the local top-level domain of Austria where applicable. All consent forms mentioned the GDPR in one way or another.
- 2. Firefox 68.0.1 and Chrome 76.0.3809, respectively.
- 3. Through a third-party tool.
- 4. Including YouTube.
Acknowledgments
This work is partially funded through the EXPEDiTE project (Grant 867559) by the Austrian Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology under the program “ICT of the Future” between September 2018 and February 2020. We would like to express our great appreciation for valuable criticism and ideas contributed by Gustaf Neumann, Seyedeh Anahit Kazzazi, Seyedeh Mandan Kazzazi, Stefano Rossetti, Kemal Ozan Aybar, Rita Gsenger, and Niklas Kirchner.
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Human, S., Cech, F. (2021). A Human-Centric Perspective on Digital Consenting: The Case of GAFAM. In: Zimmermann, A., Howlett, R., Jain, L. (eds) Human Centred Intelligent Systems. Smart Innovation, Systems and Technologies, vol 189. Springer, Singapore. https://doi.org/10.1007/978-981-15-5784-2_12
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-5783-5
Online ISBN: 978-981-15-5784-2
eBook Packages: Intelligent Technologies and Robotics