Wednesday, February 9, 2011

A Full Day of Grading

Teachers, TAs, and professors love to complain about the amount of grading they have. But how much is that, exactly? I've seen studies claiming that assessments and assessment-related activities (I imagine this includes creating, proctoring, and grading them) take up 1/3 to 1/2 of a teacher's time. I certainly believe it. But I thought it might be interesting/useful/fun procrastination to make a rough estimate of how many hours per week a teacher (in this case, imagine a high school teacher) might reasonably spend just grading. And what I came up with rather amazed me (despite all the time I've personally spent grading).

Here are my basic low-end assumptions:

  • A teacher has about 100 students (imagine 4 classes of 25 students each). These days, many teachers teach 5 classes of 30 each. Some pampered private school teachers (like I was) teach 4 classes of 15 students each. Still, 100 is a reasonable estimate and a nice round number.
  • Each student hands in one page to be graded each day. This will typically be a single sheet of homework. It excludes any tests or quizzes, which are generally longer than one sheet, so this is definitely a low-end estimate.
  • Each page takes the teacher 1 minute to grade. Again, one minute sounds short to me if you're actually reading the work, and that's before you count the time it takes to determine the full-page grade and enter it into the computer system.

These estimates work out to 100 students × 1 page per day × 5 school days = 500 pages to grade per week, which at a minute per page comes to 500 minutes. That's 8.33 hours, or a full work day. Yes, a teacher could easily spend a full work day each week just grading. I find that astonishing.
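
For anyone who wants to fiddle with the assumptions, here's a minimal back-of-the-envelope sketch in Python. The parameter values are just the low-end guesses from the list above, not data from anywhere:

    # Back-of-the-envelope grading-time estimate.
    # All parameters are the post's low-end assumptions; adjust to taste.
    students = 100               # e.g., 4 classes of 25
    pages_per_student_day = 1    # one sheet of homework per student per day
    school_days_per_week = 5
    minutes_per_page = 1         # time to grade a single page

    pages_per_week = students * pages_per_student_day * school_days_per_week
    hours_per_week = pages_per_week * minutes_per_page / 60.0
    print(f"{pages_per_week} pages/week -> {hours_per_week:.2f} hours of grading")
    # 500 pages/week -> 8.33 hours of grading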

Tuesday, February 1, 2011

SODA: Instance Optimality

The highlight of SODA for me was the plenary talk Computational Geometry for Non-Geometers by Timothy Chan, especially the part in which he discussed instance optimality. The basic idea is an analysis that considers each input individually: on every input, the algorithm must perform within a constant factor of the best algorithm designed specifically for that input (but not for the order in which that input arrives). I believe this was first considered in a geometric setting (Instance-Optimal Geometric Algorithms), so you could imagine that the input is a set of points and the algorithm you're competing against doesn't know in advance in what order the points will be given to you. This wonderfully fixes a problem with worst-case analyses: an algorithm might perform poorly on most input instances yet still be declared worst-case optimal because some hard instance forces that running time anyway. Under an instance-optimal analysis, the algorithm is instead required to perform well on every input instance.
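
Roughly, and stated from memory rather than quoted from the paper, the order-oblivious version can be written like this: an algorithm A is instance optimal if, for every input set S,

    \mathrm{TIME}_A(S) \le c \cdot \mathrm{OPT}(S), \qquad \mathrm{OPT}(S) = \min_{A'} \max_{\sigma} \mathrm{TIME}_{A'}(\sigma(S)),

where the minimum ranges over all correct algorithms A', the maximum ranges over all orderings σ of S, and c is a constant. Note that the competing algorithm A' may depend on the set S itself, just not on the order σ in which its points arrive.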

In the study of data structures for moving objects, a similar idea known as motion sensitivity was introduced earlier. A motion-sensitive algorithm performs well under orderly motion, with performance that degrades as the point motion becomes more random. In my own work, I use entropy-based analyses to capture this concept. In fact, the instance-optimal analysis Timothy described also resulted in an entropy-like inequality, so perhaps this relationship is inevitable/necessary for these analyses.
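
For what it's worth, the entropy-like quantity in those instance-optimal bounds, as I remember it (so take this as a sketch of the shape rather than the paper's exact statement), looks like

    \mathcal{H}(S) = \min_{\{S_1,\dots,S_t\}} \sum_{k=1}^{t} \frac{|S_k|}{n} \log \frac{n}{|S_k|},

a minimum over suitably structured partitions of the n input points, with the optimal running time then on the order of n(\mathcal{H}(S) + 1). That same sum-of-(n_k/n) log(n/n_k) shape is exactly what appears in the entropy-based analyses I use for moving points, which is what makes the parallel so striking.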

It's an exciting new area that I'm hoping to see a lot more of. It was certainly an excellent talk.