On Mean Means, and What a Mean might Mean

Since reading this unsurprising-but-still-upsetting article at chronicle.com, I’ve been thinking about the whole business of C as an “average” grade. It just doesn’t make a lot of sense to me. Even granted that the average – construed as the arithmetic mean – isn’t *technically* the middle grade of a set (that’s the median), it’s still an attempt to represent the middle – the central point around which it’s meaningful to measure deviation above or below. But on every grade scale I’ve ever worked under, as either student or teacher, there just isn’t that much below a C that isn’t failing: at Hunter College, for example, the available grades are A+, A, A-, B+, B, B-, C+, C, D, F. There isn’t even a C- available, and D is a no-credit grade (i.e., it converts to a fail if the student is taking the course pass/fail). So to say that C is the average seems to imply that there are roughly as many students failing the course as are getting A’s and B’s. How can this be? Were professors formerly just so mean as to fail all those students?
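
Just to put rough numbers on that: here’s a quick back-of-the-envelope sketch in Python, assuming the usual four-point values (A = 4.0, B = 3.0, C = 2.0, F = 0.0 – nothing above pins down an exact conversion, so take those as illustrative), of how large a failing contingent it would take to drag a class of A and B students down to a C mean.

```python
# Back-of-the-envelope check: how many failing grades does it take
# to pull a class mean down to C? Grade-point values are assumed
# (A = 4.0, B = 3.0, C = 2.0, F = 0.0), not taken from any official scale.

def f_share_for_c_mean(passing_grades, c_points=2.0, f_points=0.0):
    """Fraction of the class that must earn F (f_points) for the
    overall mean to equal c_points, given the grade points of
    everyone who passes."""
    passing_mean = sum(passing_grades) / len(passing_grades)
    # Solve (1 - x) * passing_mean + x * f_points = c_points for x.
    return (passing_mean - c_points) / (passing_mean - f_points)

# A passing group split evenly between A and B students (mean 3.5):
share = f_share_for_c_mean([4.0, 3.0])
print(f"F share needed for a C mean: {share:.0%}")  # about 43%
```

With the passing students split evenly between A’s and B’s, roughly 43% of the class would have to be failing for the mean to land on C, which is more or less the “as many failing as getting A’s and B’s” scenario above.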

The best explanation I’ve been able to come up with is that C can be “average” only if it takes into account the entire population, not just those in school – if people who have never even encountered the course material are treated as failing the course.

This seems pretty odd on a number of levels: it assumes that grades exist on an absolute scale, with a strangely universal reach; it presumes to judge people who aren’t in school, and to judge them fairly harshly; and, significantly, it assumes that about half the population shouldn’t be in school.

To my mind, this sheds new light on the phenomenon of grade inflation. It’s not that teachers are relaxing standards of excellence (or at least, not necessarily) – it’s that far fewer people are failing-by-default. They’re showing up and being counted, and what “average” folks can achieve is different from what anyone supposed – they’re not just luckier.
