Abstract
In this paper, we begin to address the question of which scientists are online. Prior studies have shown that Web users are only a segmented reflection of the off-line population; when studying online behaviors, we therefore need to be explicit about the representativeness of the sample under study in order to relate observed trends to the underlying population. When studying social phenomena on the Web, identifying individuals is essential for generalizing about specific segments of the off-line population. Specifically, we present a method for assessing the online activity of a known set of actors, tailored to the domain of science. We apply the method to a population of Dutch computer scientists and their coauthors. The results, combined with metadata about this set, provide insight into the representativeness of the sample of interest.
The results show that scientists of above-average tenure and performance are overrepresented online, suggesting that studies of scientists' online behaviors speak specifically to the behaviors of above-average-performing scientists. Given this finding, metrics of scientists' Web behaviors may provide a key tool for measuring knowledge production and innovation more quickly than traditional, delay-prone bibliometric studies.
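As a rough illustration of the kind of procedure the abstract describes (not the authors' actual method), the sketch below matches a known set of scientists, e.g. authors drawn from a bibliographic database, against names harvested from online profiles using simple name normalization. The sample names, data sources, and matching rule are hypothetical assumptions for illustration only.

```python
# Minimal, hypothetical sketch: estimate which members of a known set of
# scientists can be found online by matching normalized names against
# names harvested from web profiles. Data and matching rule are illustrative.
import unicodedata


def normalize(name: str) -> str:
    """Lowercase, strip accents, and collapse whitespace for crude matching."""
    stripped = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    return " ".join(stripped.lower().split())


# Known off-line population (e.g. authors taken from a bibliographic database).
known_scientists = ["Jan de Vries", "Anna Müller", "P. van den Berg"]

# Names collected from online profiles (e.g. homepages, social media bios).
online_profiles = ["anna muller", "jan de vries", "s. jansen"]

online_index = {normalize(n) for n in online_profiles}
found = [s for s in known_scientists if normalize(s) in online_index]

coverage = len(found) / len(known_scientists)
print(f"Online presence detected for {len(found)}/{len(known_scientists)} "
      f"scientists ({coverage:.0%}): {found}")
```

In practice, exact name matching of this kind misses variants and initials (as with "P. van den Berg" above), which is why author-disambiguation techniques are needed before representativeness can be assessed.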