But when first the two black dragons sprang out of the fog upon the small clerk, they had merely the effect of all miracles—they changed the universe. He discovered the fact that all romantics know—that adventures happen on dull days, and not on sunny ones. When the cord of monotony is stretched most tight, it breaks with a sound like song. —G.K. Chesterton
No one likes monotony. For one thing, it’s boring. For another, it’s boring. And then there’s the unfortunate fact that it’s boring.
Monotony literally means “one tone”, and musically and rhetorically it can apply to a variety of different variables (or in this case, nonvariables), including pitch, intonation, and inflection. How many presentations at meetings have we sat through where the speakers seemed simply to drone on and on, grinding through the text of one interminable slide after another?
Monotony implies sameness and tediousness, a wearisome and unending succession with no alteration. It also tends to be quite uninformative, offering no new lessons. We typically glean little or nothing new from the same old routine. Some degree of fluctuation and novelty is required to engage and enlighten the mind.
Something similar might be said about educational evaluation. Accrediting organizations such as the Liaison Committee on Medical Education and the Accreditation Council for Graduate Medical Education require that programs conduct regular evaluations of their educational programs, but in many cases such data turn out to be less than scintillating.
Consider the case of educational evaluation in radiology residency programs, which generates vast quantities of data. Conservatively, assuming 1000 training positions each year over 4 years, with an average of two evaluations per rotation and 12 rotations per year, we estimate that more than 200,000 evaluations are completed each year across the United States. Assuming that each evaluation takes on average 4 minutes to complete, including logging on and off, this comes to 225 person-hours per year at our program alone. These numbers are probably doubled when faculty member evaluations of residents and resident evaluations of faculty are both included.