Why Do We Need to Be Bots? What Prevents Society from Detecting Biases in Recommendation Systems

  • Conference paper
  • First Online:
Bias and Social Aspects in Search and Recommendation (BIAS 2020)

Abstract

Concerns that social networks manipulate public opinion have become a recurring theme in recent years. Whether such an influence actually exists could so far be tested only to a very limited extent. Yet to guarantee the accountability of recommendation and information-filtering systems, society needs to be able to determine whether they comply with ethical and legal requirements. This paper focuses on black box analyses as methods that are designed to systematically assess the behavior of such systems while remaining minimally intrusive. We describe the conditions that must be met to enable black box analyses of recommendation systems, based on an application to Facebook’s News Feed. While black box analyses have proven useful in the past, several barriers can easily get in the way, such as a limited possibility of automated account control, bot detection, and bot inhibition. Drawing on the insights from our case study and the state of the art in research on algorithmic accountability, we formulate several policy demands that must be met to allow monitoring of algorithmic decision-making (ADM) systems for their compliance with social values.
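The core idea of such a black box analysis can be illustrated with a toy sketch: several automated accounts query the same system, and the result lists they are shown are compared. High overlap between accounts suggests little personalization; low overlap suggests per-account tailoring (in the spirit of the personalization measurement in Krafft et al. 2019). All names and data below are invented for illustration; this is not the authors’ actual measurement pipeline.

```python
# Toy black-box audit sketch: compare the feeds that several
# (hypothetical) automated accounts were shown by the same system.
from itertools import combinations


def topk_jaccard(a, b, k=5):
    """Jaccard overlap of the top-k items two accounts were shown."""
    sa, sb = set(a[:k]), set(b[:k])
    return len(sa & sb) / len(sa | sb)


def personalization_score(feeds, k=5):
    """1 minus the mean pairwise top-k overlap across all account pairs:
    0.0 means every account saw the same items, 1.0 means disjoint feeds."""
    pairs = list(combinations(feeds, 2))
    mean_overlap = sum(topk_jaccard(a, b, k) for a, b in pairs) / len(pairs)
    return 1.0 - mean_overlap


# Feeds observed by three hypothetical bot accounts (invented data):
feeds = [
    ["post1", "post2", "post3", "post4", "post5"],
    ["post1", "post2", "post3", "post4", "post6"],
    ["post9", "post8", "post7", "post2", "post1"],
]
print(round(personalization_score(feeds, k=5), 3))  # → 0.611
```

In a real audit the hard part is not this comparison but obtaining the feeds at all, which is exactly where the barriers discussed in the paper (account control, bot detection, bot inhibition) arise.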


Notes

  1. Rundfunkstaatsvertrag (the German Interstate Broadcasting Treaty).

  2. A regional, private television channel in Germany: https://www.rnf.de.

  3. https://selenium.dev/.

  4. Even when no login is needed, this approach yields a great advantage in certain cases, for example, when geospatial data like the IP address might be relevant (Krafft et al. 2019).

  5. Self-selection refers to self-enrollment in these kinds of studies. It almost always biases the sample such that it is not representative of all users of a system.

  6. https://www.propublica.org/article/facebook-blocks-ad-transparency-tools.

  7. https://www.wired.com/2013/03/att-hacker-gets-3-years/.

References

  • Bovens, L.: The ethics of dieselgate. Midwest Stud. Philos. 40(1), 262–283 (2016)

  • Diakopoulos, N.: Algorithmic accountability reporting: on the investigation of black boxes. Tow Center for Digital Journalism (2014)

  • Dreyer, S., Schulz, W.: Künstliche Intelligenz, Intermediäre und Öffentlichkeit [Artificial intelligence, intermediaries, and the public sphere]. Technical report, Alexander von Humboldt Institut für Internet und Gesellschaft & Leibniz-Institut für Medienforschung (2019)

  • Krafft, T.D., Gamer, M., Zweig, K.A.: What did you see? A study to measure personalization in Google’s search engine. EPJ Data Sci. 8(1), 38 (2019)

  • Patil, S.V., Vieider, F., Tetlock, P.E.: Process versus outcome accountability. In: The Oxford Handbook of Public Accountability, pp. 69–89 (2014)

  • Sandvig, C., Hamilton, K., Karahalios, K., Langbort, C.: Auditing algorithms: research methods for detecting discrimination on internet platforms. In: Data and Discrimination: Converting Critical Concerns into Productive Inquiry, 22 (2014)

  • Schneble, C.O., Elger, B.S., Shaw, D.: The Cambridge Analytica affair and Internet-mediated research. EMBO Rep. 19(8), e46579 (2018)

  • van Drunen, M., Helberger, N., Bastian, M.: Know your algorithm: what media organizations need to explain to their users about news personalization. International Data Privacy Law (2019)

  • Yang, Z., Wilson, C., Wang, X., Gao, T., Zhao, B.Y., Dai, Y.: Uncovering social network Sybils in the wild. ACM Trans. Knowl. Discov. Data (TKDD) 8(1), 1–29 (2014)


Acknowledgement

We wish to thank Ralph Kühnl for bringing to our attention the issue of the perceived unequal roll-out of content from Facebook pages, and for trusting us with access to the Facebook account of Rhein Neckar Fernsehen.

Author information

Correspondence to Tobias D. Krafft.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Krafft, T.D., Hauer, M.P., Zweig, K.A. (2020). Why Do We Need to Be Bots? What Prevents Society from Detecting Biases in Recommendation Systems. In: Boratto, L., Faralli, S., Marras, M., Stilo, G. (eds) Bias and Social Aspects in Search and Recommendation. BIAS 2020. Communications in Computer and Information Science, vol 1245. Springer, Cham. https://doi.org/10.1007/978-3-030-52485-2_3


  • DOI: https://doi.org/10.1007/978-3-030-52485-2_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-52484-5

  • Online ISBN: 978-3-030-52485-2

  • eBook Packages: Computer Science; Computer Science (R0)
