
Has the Time Come for Bibliometrics and the H-Index in Academic Radiology?

Appearing in this issue of Academic Radiology is an article by Rad and colleagues entitled “The H-index in Academic Radiology”. In this work, authors from the Mayo Clinic have produced what is, to the best of my knowledge, the first peer-reviewed publication using the H-index to characterize publications by members of academic radiology departments. There have been several excellent published descriptions of the H-index in particular and of bibliometrics in general; please see those articles for details. In brief, the H-index was introduced in 2005 by Hirsch, ostensibly for use in evaluating the publications of individual authors in physics. In general terms, higher H-indices are “better.” The H-index is computed from a list of publications ranked in descending order by times cited: H is the largest number n such that n articles in the list have n or more citations each. Among other reasons, this metric may be useful because it presumptively discounts both the disproportionate weight of a few highly cited articles and that of articles that have not yet been cited.
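The definition above reduces to a simple ranking check. As a minimal sketch (the citation counts are invented for illustration):

```python
def h_index(citations):
    """Return the largest n such that n papers have at least n citations each."""
    ranked = sorted(citations, reverse=True)  # rank papers by times cited, descending
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # the paper at this rank still "covers" its rank
            h = rank
        else:
            break
    return h

# A hypothetical record of six papers: one heavily cited, one uncited.
print(h_index([25, 8, 5, 3, 3, 0]))  # -> 3
```

Note how the single 25-citation paper raises H no further than any other paper with at least 3 citations would, and the uncited paper does not lower it; this insensitivity to the extremes is the property discussed above.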

Some authors have suggested that breakpoints or thresholds could be applied to H-index results to evaluate the career of an academician. It is in a similar vein that the authors of “The H-index in Academic Radiology” report their work here. In addition to something akin to a proof of concept and an apparently novel application to diagnostic radiology, these authors suggest that the H-index could be used to establish thresholds for decisions such as academic promotion. I found their article to be well written, interesting, timely, and thought-provoking. There are many reasons why I suggest that now is the time for academic radiology to explore bibliometrics and to be actively involved in their study and implementation in our academic communities. I am fond of a “pearls and pitfalls” approach and will employ it here.

Pearls

Metrics

I believe that we are in an era of metrics and have been for several years. Although I find this to be acutely evident in the administration of business aspects and training programs in academic radiology, it also seems to permeate all aspects of our lives as academic radiologists. Perhaps this is an offshoot of the application of business practices to our field; perhaps not. In any case, it seems logical to me that scientific (ie, academic or scholarly) productivity should not be expected to be immune to this influence. The H-index offers us one potential metric for characterizing that productivity. I submit that we in radiology should be proactive as an academic community and be involved in the characterization, and presumably the eventual application, of (biblio)metrics to our work, if and when appropriate.

Outcomes

I believe we are also in an era of outcomes. Logically, one would make a connection between metrics and outcomes, and this implied relationship could be used in several ways. One form could be what is in essence a self-fulfilling prophecy: establishment of normative threshold metrics that would determine a specific outcome. For example, the H-index could be applied to members of an academic radiology department as either a necessary or a sufficient parameter for academic promotion, annual review, incentive pay, and so on. Alternatively, this relationship between metrics and outcomes could be used to evaluate the past (for example, to assess a particular career when a candidate is considered for an academic society’s Gold Medal) or the future (during mentorship and advising of faculty). I am not necessarily promoting these uses, but they do seem interesting and potentially valuable.

Documentation

We are also, I believe, in an era of documentation, now pervasive in our professional lives. Bibliometrics such as the H-index offer a convenient method of providing what some may describe as objective data. Depending on the supporting literature and the community’s perception of individual bibliometrics, the rigorous use of these values may perhaps be viewed as objective as well. To me, it would be preferable if such an assessment were supported by a preponderance of peer-reviewed evidence in the literature, as well as by a thoughtful assessment of whether or not this is the “right thing to do.”

Transparency

This “hot topic” is a characteristic that seems to me to be closely connected to the discussion of bibliometrics. If we are to develop normative thresholding for consideration of career assessment, academic promotion, salary, incentive pay, and so on, I believe that individuals should have an opportunity to understand the “system” thus constructed and have an opportunity to modify and manage their own careers. For example, the criteria for academic promotion in university systems can be challenging to obtain or understand. Could a more transparent use of bibliometrics in these systems improve the process of academic promotion? I would like to think so.

Pitfalls

Is the H-index Generalizable?


What about Other Objective Success Criteria we Produce in Terms of Scholarly Pursuits?


Will this Adversely Affect Submission of Academic Works (in Radiology) to Meetings or Journals where the Work would be Best Suited?


Accuracy


Postscript


References

  • 1. Rad A.E., Brinjikji W., Cloft H.J., et al.: The H-index in academic radiology. Acad Radiol 2010; 17: pp. 817-821.

  • 2. Hirsch J.E.: An index to quantify an individual’s scientific research output. Proc Natl Acad Sci U S A 2005; 102: pp. 16569-16572.

  • 3. Castillo M.: Measuring academic output: the H-Index. AJNR Am J Neuroradiol 2009; 31: pp. 783-784.

  • 4. Krestin G.P.: Evaluating the quality of radiology research: what are the rules of the game? Radiology 2008; 249: pp. 418-424.

  • 5. Fuller C.D., Choi M., Thomas C.R.: Bibliometric analysis of radiation oncology departmental scholarly publication productivity at domestic residency training institutions. J Am Coll Radiol 2009; 6: pp. 112-118.

  • 6. Durieux V., Gevenois P.A.: Bibliometric indicators: quality measurements of scientific publication. Radiology 2010; 255: pp. 342-351.

This post is licensed under CC BY 4.0 by the author.