Abstract
Concerns about social networks manipulating public opinion have become a recurring theme in recent years. Whether such an impact actually exists has so far been testable only to a very limited extent. Yet to guarantee the accountability of recommendation and information filtering systems, society needs to be able to determine whether they comply with ethical and legal requirements. This paper focuses on black box analyses, methods designed to systematically assess the performance of such systems while remaining minimally intrusive. Based on an application to Facebook’s News Feed, we describe the conditions that must be met to allow black box analyses of recommendation systems. While black box analyses have proven useful in the past, several barriers can easily get in the way, such as limited possibilities for automated account control, bot detection, and bot inhibition. Drawing on the insights from our case study and the state of the art in research on algorithmic accountability, we formulate several policy demands that need to be met in order to allow algorithmic decision-making (ADM) systems to be monitored for their compliance with social values.
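To make the idea of such a black box analysis concrete, the following minimal sketch (our illustration, not taken from the paper) simulates a small audit: several synthetic personas each retrieve their feed, and the pairwise overlap of the top-k items is compared. The function `fetch_feed` is a hypothetical placeholder for whatever access an auditor has, for example an automated account reading its own News Feed; no real platform API is assumed.

```python
# Conceptual sketch of a black-box audit of a recommendation feed.
# `fetch_feed(persona)` is a hypothetical stand-in for the auditor's access
# (e.g. an automated account scraping its own feed); no real API is assumed.

from itertools import combinations
from typing import Callable, Dict, List, Tuple


def overlap_at_k(a: List[str], b: List[str], k: int = 20) -> float:
    """Jaccard overlap of the top-k items shown to two personas."""
    top_a, top_b = set(a[:k]), set(b[:k])
    return len(top_a & top_b) / len(top_a | top_b) if (top_a | top_b) else 1.0


def audit(personas: List[str],
          fetch_feed: Callable[[str], List[str]],
          k: int = 20) -> Dict[Tuple[str, str], float]:
    """Collect each persona's feed and report pairwise top-k overlap.

    Low overlap between otherwise identical personas hints at personalization
    or unequal roll-out of content; systematic differences between groups of
    personas (e.g. by declared location) hint at bias.
    """
    feeds = {p: fetch_feed(p) for p in personas}
    return {(p, q): overlap_at_k(feeds[p], feeds[q], k)
            for p, q in combinations(personas, 2)}
```

In practice, the hard part is not this comparison but sustaining the data collection itself: the personas must behave like ordinary accounts, which is exactly where bot detection and bot inhibition interfere.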
Notes
- 1. Rundfunkstaatsvertrag (the German Interstate Broadcasting Treaty).
- 2. A regional, private television channel in Germany: https://www.rnf.de.
- 3.
- 4. Even when no login is needed, this approach yields a considerable advantage in certain cases, for example when geospatial data such as the IP address may be relevant (Krafft et al. 2019).
- 5. Self-selection refers to self-enrollment in these kinds of studies. It almost always biases the sample so that it is not representative of all users of a system (a toy illustration follows after these notes).
- 6.
- 7.
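The toy simulation below (our illustration with entirely made-up numbers, not from the paper) shows the self-selection effect mentioned in note 5: when willingness to enroll in a study correlates with the quantity being measured, the self-selected sample systematically misestimates the population value.

```python
# Toy illustration of self-selection bias (note 5). Assumption (made up):
# heavy users are both more likely to enroll and see more personalized content.

import random

random.seed(0)

# Population: usage intensity in hours/week, and the quantity we want to
# estimate (share of personalized items, assumed to grow with usage).
population = [random.uniform(0, 20) for _ in range(100_000)]
personalized_share = [min(1.0, 0.2 + 0.03 * h) for h in population]

# Self-selection: probability of enrolling grows with usage intensity.
enrolled = [i for i, h in enumerate(population) if random.random() < h / 40]

true_mean = sum(personalized_share) / len(personalized_share)
sample_mean = sum(personalized_share[i] for i in enrolled) / len(enrolled)

print(f"population mean:           {true_mean:.3f}")   # what we want to know
print(f"self-selected sample mean: {sample_mean:.3f}")  # systematically too high
```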
References
Bovens, L.: The ethics of dieselgate. Midwest Stud. Philos. 40(1), 262–283 (2016)
Diakopoulos, N.: Algorithmic accountability reporting: on the investigation of black boxes. Tow Center for Digital Journalism (2014)
Dreyer, S., Schulz, W.: Künstliche Intelligenz, Intermediäre und Öffentlichkeit. Technical report, Alexander von Humboldt Institut für Internet und Gesellschaft & Leibniz-Institut für Medienforschung (2019)
Krafft, T.D., Gamer, M., Zweig, K.A.: What did you see? A study to measure personalization in Google’s search engine. EPJ Data Sci. 8(1), 38 (2019)
Patil, S.V., Vieider, F., Tetlock, P.E.: Process versus outcome accountability. In: The Oxford Handbook of Public Accountability, pp. 69–89 (2014)
Sandvig, C., Hamilton, K., Karahalios, K., Langbort, C.: Auditing algorithms: research methods for detecting discrimination on internet platforms. In: Data and Discrimination: Converting Critical Concerns into Productive Inquiry, p. 22 (2014)
Schneble, C.O., Elger, B.S., Shaw, D.: The Cambridge Analytica affair and internet-mediated research. EMBO Rep. 19(8), e46579 (2018)
van Drunen, M., Helberger, N., Bastian, M.: Know your algorithm: what media organizations need to explain to their users about news personalization. International Data Privacy Law (2019)
Yang, Z., Wilson, C., Wang, X., Gao, T., Zhao, B.Y., Dai, Y.: Uncovering social network sybils in the wild. ACM Trans. Knowl. Discov. Data (TKDD) 8(1), 1–29 (2014)
Acknowledgement
We wish to thank Ralph Kühnl for bringing to our attention the issue of the perceived unequal roll-out of content from Facebook pages and for trusting us with access to the Facebook account of Rhein Neckar Fernsehen.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Krafft, T.D., Hauer, M.P., Zweig, K.A. (2020). Why Do We Need to Be Bots? What Prevents Society from Detecting Biases in Recommendation Systems. In: Boratto, L., Faralli, S., Marras, M., Stilo, G. (eds) Bias and Social Aspects in Search and Recommendation. BIAS 2020. Communications in Computer and Information Science, vol 1245. Springer, Cham. https://doi.org/10.1007/978-3-030-52485-2_3
DOI: https://doi.org/10.1007/978-3-030-52485-2_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-52484-5
Online ISBN: 978-3-030-52485-2
eBook Packages: Computer Science, Computer Science (R0)