
Citation impact

Citation impact or citation rate is a measure of how many times an academic journal article, book, or author is cited by other articles, books, or authors.[1][2][3][4][5][6] Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics,[7][8] which specializes in the study of patterns of academic impact through citation analysis. The importance of journals can be measured by an average citation rate,[9][6] the ratio of the number of citations to the number of articles published within a given time period and in a given index, such as the journal impact factor or the CiteScore. These measures are used by academic institutions in decisions about tenure, promotion, and hiring, and hence are also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.

Article-level

One of the most basic citation metrics is how often an article is cited in other articles, books, or other sources (such as theses). Citation rates are heavily dependent on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians; hence neuroscience papers are much more often cited than papers in mathematics.[10][11] Similarly, review papers are more often cited than regular research papers because they summarize results from many papers. This may also explain why papers with shorter titles attract more citations, given that they usually cover a broader area.[12]

Most-cited papers

The most-cited paper in history is a paper by Oliver Lowry describing an assay to measure the concentration of proteins.[13] By 2014 it had accumulated more than 305,000 citations. The 10 most-cited papers all had more than 40,000 citations.[14] Reaching the top 100 required 12,119 citations by 2014.[14] Of the more than 58 million items in Thomson Reuters' Web of Science database, only 14,499 papers (~0.026%) had more than 1,000 citations in 2014.[14]

Journal-level

The simplest journal-level metric is the journal impact factor (JIF), the average number of citations that articles published by a journal in the previous two years have received in the current year, as calculated by Clarivate. Other companies report similar metrics, such as the CiteScore (CS), based on Scopus.
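
For illustration, the arithmetic behind the two-year JIF can be written out in a few lines of Python. This is a minimal sketch with invented figures; it ignores the practical question of which items are counted as "citable", which is where most disputes about the metric arise.

    # Minimal sketch of the two-year impact factor arithmetic (figures invented).
    def journal_impact_factor(citations_this_year: int, citable_items: int) -> float:
        """Citations received this year by articles the journal published in the
        two preceding years, divided by the number of citable items it published
        in those two years."""
        return citations_this_year / citable_items

    # Example: 4,200 citations in 2016 to articles from 2014-2015,
    # against 1,100 citable items published in 2014-2015.
    print(round(journal_impact_factor(4_200, 1_100), 1))  # 3.8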

However, a very high JIF or CS is often driven by a small number of very highly cited papers. For instance, most papers in Nature (impact factor 38.1 in 2016) were cited only 10 or 20 times during the reference year. Journals with a lower impact factor (e.g. PLOS ONE, impact factor 3.1) publish many papers that are cited 0 to 5 times but few highly cited articles.[15]

Journal-level metrics are often misinterpreted as measures of journal or article quality. However, using non-article-level metrics to determine the impact of a single article is statistically invalid. Moreover, studies of methodological quality and reliability have found that "reliability of published research works in several fields may be decreasing with increasing journal rank",[16] contrary to widespread expectations.[17]

Citation distributions are skewed for journals because a very small number of articles drive the vast majority of citations; therefore, some journals have stopped publicizing their impact factor, e.g. the journals of the American Society for Microbiology.[18] Citation counts mostly follow a lognormal distribution, except for the long tail, which is better fit by a power law.[19]
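
A small simulation makes the consequence of this skew concrete. The following Python sketch draws per-article citation counts from a lognormal distribution with invented parameters, so only the qualitative pattern should be read from it: the mean (what a journal-level average reports) sits well above the median article, and a small share of papers holds a large share of the citations.

    # Illustrative only: invented lognormal parameters, 10,000 simulated papers.
    import numpy as np

    rng = np.random.default_rng(0)
    citations = rng.lognormal(mean=1.5, sigma=1.2, size=10_000)

    print("mean:  ", round(citations.mean(), 1))      # inflated by the long tail
    print("median:", round(np.median(citations), 1))  # a typical article
    top_1pct = np.sort(citations)[-100:]
    print("share of all citations held by the top 1% of papers:",
          round(top_1pct.sum() / citations.sum(), 2))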

Other journal-level metrics include the Eigenfactor and the SCImago Journal Rank.

Author-level

Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact.[20] The best-known measures include the h-index[21] and the g-index.[22] Each measure has advantages and disadvantages,[23] ranging from bias and discipline-dependence to limitations of the citation data source.[24] Counting the number of citations per paper is also employed to identify the authors of citation classics.[25]
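
The definitions of these two indices translate directly into short functions. The following Python sketch uses an invented citation list and follows the standard textbook definitions (largest h such that h papers have at least h citations each; largest g such that the g most-cited papers together have at least g squared citations), not any particular database's implementation.

    def h_index(citations):
        """Largest h such that h papers have at least h citations each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    def g_index(citations):
        """Largest g such that the g most-cited papers have >= g*g citations in total."""
        cites = sorted(citations, reverse=True)
        total, g = 0, 0
        for rank, c in enumerate(cites, start=1):
            total += c
            if total >= rank * rank:
                g = rank
        return g

    papers = [25, 8, 5, 3, 3, 1, 0]   # invented citation counts
    print(h_index(papers))  # 3: only three papers have 4 or more citations
    print(g_index(papers))  # 6: the top 6 papers have 45 >= 36 citations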

Citations are distributed highly unequally among researchers. In a study based on the Web of Science database across 118 scientific disciplines, the top 1% most-cited authors accounted for 21% of all citations. Between 2000 and 2015, the proportion of citations that went to this elite group grew from 14% to 21%. The highest concentrations of 'citation elite' researchers were in the Netherlands, the United Kingdom, Switzerland and Belgium. 70% of the authors in the Web of Science database have fewer than 5 publications, so that the most-cited authors among the 4 million included in this study constitute a tiny fraction.[26]

Alternatives

An alternative approach to measuring a scholar's impact relies on usage data, such as the number of downloads from publishers, and on analysis of citation performance, often at the article level.[27][28][29][30]

As early as 2004, the BMJ published the number of views for its articles, which was found to be somewhat correlated with citations.[31] In 2008 the Journal of Medical Internet Research began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.[32]
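
A simplified reading of those two proposed metrics can be sketched as follows. The function names, dates, and counts below are invented for illustration and do not follow the paper's exact operationalisation.

    from datetime import datetime, timedelta

    def twimpact_factor(tweet_times, published, window_days=7):
        """Count tweets falling within `window_days` of the publication date."""
        cutoff = published + timedelta(days=window_days)
        return sum(1 for t in tweet_times if published <= t <= cutoff)

    def twindex(article_twimpact, cohort_twimpacts):
        """Rank percentile (0-100) of an article's Twimpact factor within a cohort."""
        at_or_below = sum(1 for x in cohort_twimpacts if x <= article_twimpact)
        return 100.0 * at_or_below / len(cohort_twimpacts)

    pub = datetime(2011, 6, 1)
    tweets = [pub + timedelta(days=d) for d in (0, 1, 1, 3, 6, 10, 30)]
    tw = twimpact_factor(tweets, pub)                     # 5 tweets in the first week
    print(tw, twindex(tw, [0, 1, 1, 2, 3, 5, 8, 13, 40])) # 5, ~66.7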

In response to growing concerns over the inappropriate use of journal impact factors in evaluating scientific outputs and scientists themselves, Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature and Science proposed citation distribution metrics as an alternative to impact factors.[33][34][35]

Open Access publications

Open access (OA) publications are accessible without cost to readers, hence they would be expected to be cited more frequently.[36] Some experimental and observational studies have found that articles published in OA journals do not receive more citations, on average, than those published in subscription journals;[37] other studies have found that they do.[38][39][40]

The evidence that author-self-archived ("green") OA articles are cited more than non-OA articles is somewhat stronger than the evidence that ("gold") OA journals are cited more than non-OA journals.[41] Two reasons for this are that many of today's top-cited journals are still only hybrid OA (the author has the option to pay for gold)[42] and that many pure author-pays OA journals are either of low quality or outright fraudulent "predatory journals" that prey on authors' publish-or-perish pressure, thereby lowering the average citation counts of OA journals.[43]

Recent developments

An important recent development in research on citation impact is the discovery of universality: citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by the average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline.[44] This finding has suggested a universal citation impact measure that extends the h-index by rescaling citation counts and re-ranking publications; however, computing such a universal measure requires extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need.[45][46] Kaur et al. proposed a statistical method to evaluate the universality of citation impact metrics, i.e., their capability to compare impact fairly across fields.[47] Their analysis identifies universal impact metrics, such as the field-normalized h-index.
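
The rescaling step at the heart of this result is simple to state: divide each paper's raw citation count c by the average count c0 of papers from the same discipline and publication year, and compare papers on c / c0 rather than on c itself. The following Python sketch, with invented fields and counts, shows how two papers with very different raw counts can land at the same relative position in their own fields.

    from collections import defaultdict

    papers = [  # (field, year, citations) -- invented examples
        ("neuroscience", 2015, 120),
        ("neuroscience", 2015, 30),
        ("mathematics",  2015, 12),
        ("mathematics",  2015, 3),
    ]

    # c0: mean citation count per (field, year) group
    totals, counts = defaultdict(float), defaultdict(int)
    for field, year, c in papers:
        totals[(field, year)] += c
        counts[(field, year)] += 1
    c0 = {key: totals[key] / counts[key] for key in totals}

    for field, year, c in papers:
        print(f"{field} {year}: {c} raw -> {c / c0[(field, year)]:.2f} rescaled")
    # The 120-citation neuroscience paper and the 12-citation mathematics paper
    # both rescale to 1.60, i.e. the same relative position within their fields.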

Research suggests that the impact of an article can be partly explained by superficial factors, not only by its scientific merits.[48] Field-dependent factors are usually listed as an issue to be tackled not only when comparisons are made across disciplines, but also when different fields of research within one discipline are compared.[49] For instance, in medicine the number of authors, the number of references, the article length, and the presence of a colon in the title influence impact, among other factors; in sociology, the number of references, the article length, and the title length are among the factors.[50] It has also been found that scholars engage in ethically questionable behavior to inflate the number of citations their articles receive.[51]

Automated citation indexing[52] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first example of automated citation indexing was CiteSeer, later followed by Google Scholar. More recently, advanced models for a dynamic analysis of citation aging have been proposed.[53][54] The latter model has also been used as a predictive tool for estimating the citations that might be obtained at any point in the lifetime of a corpus of publications.

Some researchers also propose that the journal citation rate on Wikipedia, next to the traditional citation index, "may be a good indicator of the work's impact in the field of psychology."[55][56]

According to Mario Biagioli: "All metrics of scientific evaluation are bound to be abused. Goodhart's law [...] states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it."[57]

See also

  • Mathematical Citation Quotient

References

  1. ^ Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID 14385826.
  2. ^ Garfield, E. (1973). "Citation Frequency as a Measure of Research Activity and Performance" (PDF). Essays of an Information Scientist. 1: 406–408.
  3. ^ Garfield, E. (1988). "Can Researchers Bank on Citation Analysis?" (PDF). Essays of an Information Scientist. 11: 354.
  4. ^ Garfield, E. (1998). "The use of journal impact factors and citation analysis in the evaluation of science". 41st Annual Meeting of the Council of Biology Editors.
  5. ^ Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Springer. ISBN 978-1-4020-3713-9.
  6. ^ a b Haustein, S. (2012). Multidimensional Journal Evaluation: Analyzing Scientific Periodicals beyond the Impact Factor. Knowledge and Information. De Gruyter. ISBN 978-3-11-025555-3. Retrieved 2023-06-06.
  7. ^ Leydesdorff, L., & Milojević, S. (2012). Scientometrics. arXiv preprint arXiv:1208.4566.
  8. ^ Harnad, S. (2009). Open access scientometrics and the UK Research Assessment Exercise. Scientometrics, 79(1), 147-156.
  9. ^ Garfield, Eugene (1972-11-03). "Citation Analysis as a Tool in Journal Evaluation". Science. 178 (4060). American Association for the Advancement of Science (AAAS): 471–479. Bibcode:1972Sci...178..471G. doi:10.1126/science.178.4060.471. ISSN 0036-8075. PMID 5079701.
  10. ^ de Solla Price, D. J. (1963). Little Science, Big Science. Columbia University Press. ISBN 9780231085625.
  11. ^ Larsen, P. O.; von Ins, M. (2010). "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index". Scientometrics. 84 (3): 575–603. doi:10.1007/s11192-010-0202-z. PMC 2909426. PMID 20700371.
  12. ^ Deng, B. (26 August 2015). "Papers with shorter titles get more citations". Nature News. doi:10.1038/nature.2015.18246. S2CID 186805536.
  13. ^ Lowry, O. H.; Rosebrough, N. J.; Farr, A. L.; Randall, R. J. (1951). "Protein measurement with the Folin phenol reagent". The Journal of Biological Chemistry. 193 (1): 265–275. doi:10.1016/S0021-9258(19)52451-6. PMID 14907713.
  14. ^ a b c van Noorden, R.; Maher, B.; Nuzzo, R. (2014). "The top 100 papers". Nature. 514 (7524): 550–553. Bibcode:2014Natur.514..550V. doi:10.1038/514550a. PMID 25355343.
  15. ^ Callaway, E. (2016). "Beat it, impact factor! Publishing elite turns against controversial metric". Nature. 535 (7611): 210–211. Bibcode:2016Natur.535..210C. doi:10.1038/nature.2016.20224. PMID 27411614.
  16. ^ Brembs, Björn (2018). "Prestigious Science Journals Struggle to Reach Even Average Reliability". Frontiers in Human Neuroscience. 12: 37. doi:10.3389/fnhum.2018.00037. PMC 5826185. PMID 29515380.
  17. ^ Triggle, Chris R; MacDonald, Ross; Triggle, David J.; Grierson, Donald (2022-04-03). "Requiem for impact factors and high publication charges". Accountability in Research. 29 (3): 133–164. doi:10.1080/08989621.2021.1909481. PMID 33787413. One might expect, therefore, that a high JIF factor indicates a higher standard of interest, accuracy and reliability of papers published therein. This is sometimes true but unfortunately is certainly not always the case (Brembs 2018, 2019). Thus, Björn Brembs (2019) concluded: "There is a growing body of evidence against our subjective notion of more prestigious journals publishing 'better' science. In fact, the most prestigious journals may be publishing the least reliable science."
  18. ^ Casadevall, A.; Bertuzzi, S.; Buchmeier, M. J.; Davis, R. J.; Drake, H.; Fang, F. C.; Gilbert, J.; Goldman, B. M.; Imperiale, M. J. (2016). "ASM Journals Eliminate Impact Factor Information from Journal Websites". mSphere. 1 (4): e00184–16. doi:10.1128/mSphere.00184-16. PMC 4941020. PMID 27408939.
  19. ^ Chatterjee, Arnab; Ghosh, Asim; Chakrabarti, Bikas K. (2016-01-11). Bornmann, Lutz (ed.). "Universality of Citation Distributions for Academic Institutions and Journals". PLOS ONE. 11 (1). Public Library of Science (PLoS): e0146762. Bibcode:2016PLoSO..1146762C. doi:10.1371/journal.pone.0146762. ISSN 1932-6203. PMC 4709109. PMID 26751563.
  20. ^ Belikov, A. V.; Belikov, V. V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1. PMC 4654436.
  21. ^ Hirsch, J. E. (2005). "An index to quantify an individual's scientific research output". PNAS. 102 (46): 16569–16572. arXiv:physics/0508025. Bibcode:2005PNAS..10216569H. doi:10.1073/pnas.0507655102. PMC 1283832. PMID 16275915.
  22. ^ Egghe, L. (2006). "Theory and practise of the g-index". Scientometrics. 69 (1): 131–152. doi:10.1007/s11192-006-0144-7. hdl:1942/981. S2CID 207236267.
  23. ^ Gálvez RH (March 2017). "Assessing author self-citation as a mechanism of relevant knowledge diffusion". Scientometrics. 111 (3): 1801–1812. doi:10.1007/s11192-017-2330-1. S2CID 6863843.
  24. ^ Couto, F. M.; Pesquita, C.; Grego, T.; Veríssimo, P. (2009). "Handling self-citations using Google Scholar". Cybermetrics. 13 (1): 2. Archived from the original on 2010-06-24. Retrieved 2009-05-27.
  25. ^ Serenko, A.; Dumay, J. (2015). "Citation classics published in knowledge management journals. Part I: Articles and their characteristics" (PDF). Journal of Knowledge Management. 19 (2): 401–431. doi:10.1108/JKM-06-2014-0220.
  26. ^ Reardon, Sara (2021-03-01). "'Elite' researchers dominate citation space". Nature. 591 (7849): 333–334. Bibcode:2021Natur.591..333R. doi:10.1038/d41586-021-00553-7. PMID 33649475.
  27. ^ Bollen, J.; Van de Sompel, H.; Smith, J.; Luce, R. (2005). "Toward alternative metrics of journal impact: A comparison of download and citation data". Information Processing and Management. 41 (6): 1419–1440. arXiv:cs.DL/0503007. Bibcode:2005IPM....41.1419B. doi:10.1016/j.ipm.2005.03.024. S2CID 9864663.
  28. ^ Brody, T.; Harnad, S.; Carr, L. (2005). "Earlier Web Usage Statistics as Predictors of Later Citation Impact". Journal of the Association for Information Science and Technology. 57 (8): 1060. arXiv:cs/0503020. Bibcode:2005cs........3020B. doi:10.1002/asi.20373. S2CID 12496335.
  29. ^ Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C.; Demleitner, M.; Murray, S. S. (2004). "The Effect of Use and Access on Citations". Information Processing and Management. 41 (6): 1395–1402. arXiv:cs/0503029. Bibcode:2005IPM....41.1395K. doi:10.1016/j.ipm.2005.03.010. S2CID 16771224.
  30. ^ Moed, H. F. (2005b). "Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal". Journal of the American Society for Information Science and Technology. 56 (10): 1088–1097. doi:10.1002/asi.20200.
  31. ^ Perneger, T. V. (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ. 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
  32. ^ Eysenbach, G. (2011). "Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact". Journal of Medical Internet Research. 13 (4): e123. doi:10.2196/jmir.2012. PMC 3278109. PMID 22173204.
  33. ^ Veronique Kiermer (2016). "Measuring Up: Impact Factors Do Not Reflect Article Citation Rates". The Official PLOS Blog.
  34. ^ "Ditching Impact Factors for Deeper Data". The Scientist. Retrieved 2016-07-29.
  35. ^ "Scientific publishing observers and practitioners blast the JIF and call for improved metrics". Physics Today. 2016. doi:10.1063/PT.5.8183.
  36. ^ Hitchcock, Steve (2013) [2004]. "The effect of open access and downloads ('hits') on citation impact: a bibliography of studies". opcit.eprints.org. University of Southampton. Retrieved 2023-01-22.
    Brody, T.; Harnad, S. (2004). "Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals". D-Lib Magazine. 10: 6.
    Eysenbach, G.; Tenopir, C. (2006). "Citation Advantage of Open Access Articles". PLOS Biology. 4 (5): e157. doi:10.1371/journal.pbio.0040157. PMC 1459247. PMID 16683865.
    Eysenbach, G. (2006). "The Open Access Advantage". Journal of Medical Internet Research. 8 (2): e8. doi:10.2196/jmir.8.2.e8. PMC 1550699. PMID 16867971.
    Hajjem, C.; Harnad, S.; Gingras, Y. (2005). "Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact" (PDF). IEEE Data Engineering Bulletin. 28 (4): 39–47. arXiv:cs/0606079. Bibcode:2006cs........6079H.
    Lawrence, S. (2001). "Free online availability substantially increases a paper's impact". Nature. 411 (6837): 521. Bibcode:2001Natur.411..521L. doi:10.1038/35079151. PMID 11385534. S2CID 4422192.
    MacCallum, C. J.; Parthasarathy, H. (2006). "Open Access Increases Citation Rate". PLOS Biology. 4 (5): e176. doi:10.1371/journal.pbio.0040176. PMC 1459260. PMID 16683866.
    Gargouri, Y.; Hajjem, C.; Lariviere, V.; Gingras, Y.; Brody, T.; Carr, L.; Harnad, S. (2010). "Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research". PLOS ONE. 5 (10): e13636. arXiv:1001.0361. Bibcode:2010PLoSO...513636G. doi:10.1371/journal.pone.0013636. PMC 2956678. PMID 20976155.
  37. ^ Davis, P. M.; Lewenstein, B. V.; Simon, D. H.; Booth, J. G.; Connolly, M. J. L. (2008). "Open access publishing, article downloads, and citations: randomised controlled trial". BMJ. 337: a568. doi:10.1136/bmj.a568. PMC 2492576. PMID 18669565.
    Davis, P. M. (2011). "Open access, readership, citations: a randomized controlled trial of scientific journal publishing". The FASEB Journal. 25 (7): 2129–2134. doi:10.1096/fj.11-183988. PMID 21450907. S2CID 205367842.
  38. ^ Chua, SK; Qureshi, Ahmad M; Krishnan, Vijay; Pai, Dinker R; Kamal, Laila B; Gunasegaran, Sharmilla; Afzal, MZ; Ambawatta, Lahiru; Gan, JY (2017-03-02). "The impact factor of an open access journal does not contribute to an article's citations". F1000Research. 6: 208. doi:10.12688/f1000research.10892.1. PMC 5464220. PMID 28649365.
  39. ^ Tang, M., Bever, J. D., & Yu, F. H. (2017). Open access increases citations of papers in ecology. Ecosphere, 8(7), e01887.
  40. ^ Niyazov, Y., Vogel, C., Price, R., Lund, B., Judd, D., Akil, A., ... & Shron, M. (2016). Open access meets discoverability: Citations to articles posted to Academia. edu. PLOS ONE, 11(2), e0148257.
  41. ^ Young, J. S., & Brandes, P. M. (2020). Green and gold open access citation and interdisciplinary advantage: A bibliometric study of two science journals. The Journal of Academic Librarianship, 46(2), 102105.
  42. ^ Torres-Salinas, D., Robinson-Garcia, N., & Moed, H. F. (2019). Disentangling Gold Open Access. In Springer Handbook of Science and Technology Indicators (pp. 129–144). Springer, Cham.
  43. ^ Björk, B. C., Kanto-Karvonen, S., & Harviainen, J. T. (2020). How frequently are articles in predatory open access journals cited. Publications, 8(2), 17.
  44. ^ Radicchi, F.; Fortunato, S.; Castellano, C. (2008). "Universality of citation distributions: Toward an objective measure of scientific impact". PNAS. 105 (45): 17268–17272. arXiv:0806.0974. Bibcode:2008PNAS..10517268R. doi:10.1073/pnas.0806977105. PMC 2582263. PMID 18978030.
  45. ^ Hoang, D.; Kaur, J.; Menczer, F. (2010). "Crowdsourcing Scholarly Data" (PDF). Proceedings of the WebSci10: Extending the Frontiers of Society On-Line. Archived from the original (PDF) on 2016-03-16. Retrieved 2017-02-20.
  46. ^ Kaur, J.; Hoang, D.; Sun, X.; Possamai, L.; JafariAsbagh, M.; Patil, S.; Menczer, F. (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS ONE. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi:10.1371/journal.pone.0043235. PMC 3440403. PMID 22984414.
  47. ^ Kaur, J.; Radicchi, F.; Menczer, F. (2013). "Universality of scholarly impact metrics". Journal of Informetrics. 7 (4): 924–932. arXiv:1305.6339. doi:10.1016/j.joi.2013.09.002. S2CID 7415777.
  48. ^ Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation. 64 (1): 45–80. doi:10.1108/00220410810844150. hdl:11858/00-001M-0000-0013-7A94-3. S2CID 17260826.
  49. ^ Anauati, M. V.; Galiani, S.; Gálvez, R. H. (2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". SSRN. doi:10.2139/ssrn.2523078. SSRN 2523078.
  50. ^ van Wesel, M.; Wyatt, S.; ten Haaf, J. (2014). "What a difference a colon makes: how superficial factors influence subsequent citation" (PDF). Scientometrics. 98 (3): 1601–1615. doi:10.1007/s11192-013-1154-x. hdl:20.500.11755/2fd7fc12-1766-4ddd-8f19-1d2603d2e11d. S2CID 18553863.
  51. ^ van Wesel, M. (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC 4750571. PMID 25742806.
  52. ^ Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries. pp. 89–98. doi:10.1145/276675.276685.
  53. ^ Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics. 82 (2): 249–261. doi:10.1007/s11192-009-0085-z. S2CID 38693917.
  54. ^ Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics. 88 (1): 199–211. doi:10.1007/s11192-011-0370-5. S2CID 30345334.
  55. ^ Banasik-Jemielniak, Natalia; Jemielniak, Dariusz; Wilamowski, Maciej (2021-02-16). "Psychology and Wikipedia: Measuring Psychology Journals' Impact by Wikipedia Citations". Social Science Computer Review. 40 (3): 756–774. doi:10.1177/0894439321993836. ISSN 0894-4393. S2CID 233968639.
  56. ^ "Psychology and Wikipedia: Measuring journals' impact by Wikipedia citations". phys.org. Retrieved 2021-09-08.
  57. ^ Biagioli, M. (2016). "Watch out for cheats in citation game". Nature. 535 (7611): 201. Bibcode:2016Natur.535..201B. doi:10.1038/535201a. PMID 27411599. S2CID 4392261.

Further reading

  • Chanson, Hubert (2007). "Research Quality, Publications and Impact in Civil Engineering into the 21st Century. Publish or Perish, Commercial versus Open Access, Internet versus Libraries ?". Canadian Journal of Civil Engineering. 34 (8): 946–951. doi:10.1139/l07-027.
  • Panaretos, J.; Malesios, C. (2009). "Assessing Scientific Research Performance and Impact with Single Indices". Scientometrics. 81 (3): 635–670. arXiv:0812.4542. doi:10.1007/s11192-008-2174-9. S2CID 1957865.

External links

  • Media related to Citation impact at Wikimedia Commons
