Citation impact

"Citation metric" redirects here. It is not to be confused with Citation index.

Citation impact quantifies the citation usage of scholarly works.[1][2][3][4][5] It is a result of citation analysis or bibliometrics. Among the measures that have emerged from citation analysis are the citation counts for an individual article, an author, and an academic journal.

Article-level

Main article: Article-level metrics

One of the most basic citation metrics is how often an article was cited in other articles, books, or other sources (such as theses). Citation rates are heavily dependent on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians, hence neuroscience papers are much more often cited than papers in mathematics.[6][7] Similarly, review papers are more often cited than regular research papers because they summarize results from many papers. This may also be the reason why papers with shorter titles get more citations, given that they are usually covering a broader area.[8]

Journal-level

Main article: Journal-level metrics

The basic journal metric is the average citation count for the articles in a journal; the best-known derived measure is the impact factor, and many other journal-level metrics exist.

Impact factor and manuscript rejection rates

It is commonly believed that manuscripts are more often rejected at high-impact journals. However, in a random selection of 570 journals there was no such correlation.[9] Specific disciplines may show weak correlations; in the physical sciences there is even a negative correlation.[10]

Author-level

Main article: Author-level metrics

Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact.[11] The best-known measures include the h-index[12] and the g-index.[13] Each measure has advantages and disadvantages, spanning from bias to discipline-dependence and limitations of the citation data source.[14]
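To make two of these measures concrete, the following sketch is an illustrative Python implementation (not taken from the cited works) of the standard definitions: the h-index is the largest h such that h of an author's papers each have at least h citations, and the g-index is the largest g such that the g most-cited papers together have at least g² citations.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have >= g**2 citations."""
    total, g = 0, 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g
```

For a record of [10, 8, 5, 4, 3] citations, four papers have at least four citations each (h = 4), while the top five papers total 30 ≥ 25 citations (g = 5), illustrating how the g-index gives extra weight to highly cited papers.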

Alternatives

Main article: Altmetrics

An alternative approach to measuring a scholar's impact relies on usage data, such as the number of downloads from publishers, often analyzed at the article level.[15][16][17][18]

As early as 2004, the BMJ published the number of views for its articles, which was found to be somewhat correlated with citations.[19] In 2008 the Journal of Medical Internet Research began publishing view counts and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.[20]
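These two definitions can be sketched directly from the description above; the function names and the percentile convention here are illustrative assumptions, not taken from the cited study.

```python
from datetime import datetime, timedelta

def twimpact_factor(pub_date, tweet_times):
    """Number of tweets in the first seven days after publication."""
    cutoff = pub_date + timedelta(days=7)
    return sum(1 for t in tweet_times if pub_date <= t < cutoff)

def twindex(journal_twimpacts, value):
    """Rank percentile of one article's Twimpact factor among a
    journal's articles (share of articles at or below that value)."""
    below = sum(1 for x in journal_twimpacts if x <= value)
    return 100.0 * below / len(journal_twimpacts)
```

For example, an article published on 1 January with tweets on 2 January and 10 January has a Twimpact factor of 1, since only the first tweet falls inside the seven-day window.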

Recent developments

Further information: Citation analysis

An important recent development in research on citation impact is the discovery of universality, or citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by its average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline.[21] This finding suggested a universal citation impact measure that extends the h-index by rescaling citation counts and re-ranking publications accordingly; however, computing such a universal measure requires extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need.[22]
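The rescaling step itself is simple: each paper's raw citation count is divided by the mean count of papers from the same discipline and year. A minimal sketch, assuming input records of the form (discipline, year, citations):

```python
from collections import defaultdict

def rescale_citations(records):
    """Divide each paper's citation count by the mean count of its
    (discipline, year) group, so counts become comparable across fields.
    `records` is a list of (discipline, year, citations) tuples."""
    totals = defaultdict(lambda: [0, 0])  # group -> [sum of citations, paper count]
    for disc, year, c in records:
        totals[(disc, year)][0] += c
        totals[(disc, year)][1] += 1
    return [
        c / (totals[(disc, year)][0] / totals[(disc, year)][1])
        for disc, year, c in records
    ]
```

After rescaling, a mathematics paper cited twice as often as the average mathematics paper of its year scores the same (2.0) as a neuroscience paper cited twice as often as its own field's average, even though their raw counts differ greatly.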

While citation counts are often correlated with other measures of scholarly and scientific performance, causal statements linking a citation advantage with open access status[23][24][25][26][27][28][29][30] have been contradicted by some experimental and observational studies.[31][32]

Research suggests that the impact of an article can be partly explained by superficial factors, not only by its scientific merits.[33] Field-dependent factors are usually cited as an issue to be addressed not only when comparisons are made across disciplines, but also when different fields of research within one discipline are compared.[34] For instance, in medicine the number of authors, the number of references, article length, and the presence of a colon in the title all influence impact, among other factors; in sociology the number of references, article length, and title length are among the factors.[35] It has also been suggested that scholars engage in ethically questionable behavior to inflate the number of citations their articles receive.[36]

Automated citation indexing[37] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first example of automated citation indexing was CiteSeer, later followed by Google Scholar. More recently, advanced models for dynamic analysis of citation aging have been proposed.[38][39] The latter model has even been used as a predictive tool for estimating the citations a corpus of publications may receive at any point in its lifetime.

References

  1. Garfield, E. (1955) Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas. Science, Vol:122, No:3159, p. 108-111
  2. Garfield, E. (1973) Citation Frequency as a Measure of Research Activity and Performance in Essays of an Information Scientist, 1: 406-408, 1962-73, Current Contents, 5
  3. Garfield, E. (1988) Can Researchers Bank on Citation Analysis? Current Comments, No. 44, October 31, 1988
  4. Garfield, E. (1998) The use of journal impact factors and citation analysis in the evaluation of science. 41st Annual Meeting of the Council of Biology Editors, Salt Lake City, UT, May 4, 1998
  5. Moed, H. F. (2005a) Citation Analysis in Research Evaluation. NY Springer
  6. Derek J. de Solla Price (1963) Little Science, Big Science. Columbia University Press.
  7. Peder Olesen Larsen & Markus von Ins (2010) The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics (2010) 84:575–603, DOI 10.1007/s11192-010-0202-z
  8. Deng, B. (2015). Papers with shorter titles get more citations. Nature News, doi:10.1038/nature.2015.18246
  9. Pascal Rocha da Silva (2015) Selecting for impact: new data debunks old beliefs, Frontiers Blog, 21 Dec 2015
  10. Pascal Rocha da Silva (2016) New Data Debunks Old Beliefs: Part 2, Frontiers Blog, 4 March 2016
  11. Belikov, A.V.; Belikov, V.V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research 4: 884. doi:10.12688/f1000research.7070.1.
  12. Hirsch, J. E. (15 November 2005), "An index to quantify an individual's scientific research output", PNAS 102 (46): 16569–16572, arXiv:physics/0508025, Bibcode:2005PNAS..10216569H, doi:10.1073/pnas.0507655102, PMC 1283832, PMID 16275915
  13. Egghe, L. (2006), "Theory and practise of the g-index", Scientometrics 69 (1): 131–152, doi:10.1007/s11192-006-0144-7
  14. Francisco M. Couto, Catia Pesquita, Tiago Grego, and Paulo Veríssimo (2009), "Handling self-citations using Google Scholar", Cybermetrics 13 (1).
  15. Bollen, J.; Van de Sompel, H., Smith, J. and Luce, R. (2005), "Toward alternative metrics of journal impact: A comparison of download and citation data", Information Processing and Management 41 (6): 1419–1440, arXiv:cs.DL/0503007, doi:10.1016/j.ipm.2005.03.024
  16. Brody, T., Harnad, S. and Carr, L. (2005) Earlier Web Usage Statistics as Predictors of Later Citation Impact Journal of the American Association for Information Science and Technology (JASIST)
  17. Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Demleitner, M.; Murray, S. S. (2004), "The Effect of Use and Access on Citations", Information Processing and Management 41 (6): 1395–1402, arXiv:cs/0503029, Bibcode:2005IPM....41.1395K, doi:10.1016/j.ipm.2005.03.010.
  18. Moed, H. F. (2005b), "Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal", Journal of the American Society for Information Science and Technology 56 (10): 1088–1097, doi:10.1002/asi.20200.
  19. Perneger, T. V (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
  20. Eysenbach, Gunther (2011). "Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact". Journal of Medical Internet Research 13 (4): e123. doi:10.2196/jmir.2012. PMC 3278109. PMID 22173204.
  21. Radicchi, F.; Fortunato, S.; Castellano, C (11 November 2008), "Universality of citation distributions: Toward an objective measure of scientific impact", PNAS 105 (45): 17268–17272, arXiv:0806.0974, Bibcode:2008PNAS..10517268R, doi:10.1073/pnas.0806977105, PMC 2582263, PMID 18978030
  22. Hoang, D.; Kaur, J.; Menczer, F. (2010), "Crowdsourcing Scholarly Data", Proceedings of the WebSci10: Extending the Frontiers of Society On-Line, April 26-27th, 2010, Raleigh, NC: US
  23. Bibliography of Findings on the Open Access Impact Advantage
  24. Brody, T.; Harnad, S. (2004), "Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals", D-Lib Magazine 10: 6.
  25. Eysenbach G. (2006a) Citation Advantage of Open Access Articles. PLoS Biol. 2006;4(5) p. e157. Shows the Open Access citation advantage over non-Open Access papers, as well as a gold-OA over green-OA citation advantage. in the Proceedings of the National Academy of Sciences (PNAS)
  26. Eysenbach G (2006b) The Open Access Advantage. J Med Internet Res 2006;8(2):e8 Some follow-up data to the Eysenbach PLoS Biol study
  27. Hajjem, C., Harnad, S. and Gingras, Y. (2005) Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact IEEE Data Engineering Bulletin 28(4) pp. 39-47. The study analyzed 1,307,038 articles published across 12 years (1992-2003) in 10 disciplines (Biology, Psychology, Sociology, Health, Political Science, Economics, Education, Law, Business, Management). Comparing OA and NOA articles in the same journal/year, OA articles have consistently more citations (25%-250% varying with discipline and year)
  28. Lawrence, S, (2001) Online or Invisible? Nature 411 (2001) (6837): 521. Landmark paper: First demonstration of the Open Access citation advantage for self-archived papers
  29. MacCallum CJ & Parthasarathy H (2006) Open access increases citation rate. PLoS Biol 4(5): e176. Editorial about the Eysenbach study
  30. Gargouri, Y., Hajjem, C., Lariviere, V., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2010) Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research. PLOS ONE 5 (10) e13636
  31. Davis, P. M., Lewenstein, B. V., Simon, D. H., Booth, J. G., & Connolly, M. J. L. (2008). Open access publishing, article downloads and citations: randomised trial. BMJ, 337, a568-
  32. Open access, readership, citations: a randomized controlled trial of scientific journal publishing FASEB J. (2011) v25 n7:2129-2134
  33. Bornmann, L., & Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation, 64(1), 45-80.
  34. Anauati, Maria Victoria and Galiani, Sebastian and Gálvez, Ramiro H., Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research (November 11, 2014). Available at SSRN: http://ssrn.com/abstract=2523078
  35. van Wesel, M.; Wyatt, S.; ten Haaf, J. (2014). "What a difference a colon makes: how superficial factors influence subsequent citation". Scientometrics 98 (3): 1601–1615. doi:10.1007/s11192-013-1154-x.
  36. Wesel, M. van (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics 22 (1): 199–225. doi:10.1007/s11948-015-9638-0.
  37. Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries: 89–98. doi:10.1145/276675.276685.
  38. Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics 82 (2): 249–261. doi:10.1007/s11192-009-0085-z.
  39. Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics 88 (1): 199–211. doi:10.1007/s11192-011-0370-5.

This article is issued from Wikipedia, version of Wednesday, March 30, 2016. The text is available under the Creative Commons Attribution/Share Alike license, but additional terms may apply for the media files.