14 January 2009

Measuring Online Instructor Activity

With expectations for growth in online education (due to the recession), there is renewed interest in attracting faculty members to online teaching assignments. There is a growing community of adjunct online instructors, but institutions seek to increase the number of regular faculty teaching online.

The University of Maryland, Baltimore County (UMBC) is highlighting its "most active" online instructors through a Blackboard Reports web site. Activity is measured not by the number of courses taught, but by the number of "page hits" instructors generate in their Blackboard course shells.

The Chronicle story that described UMBC's reporting of usage stats included context on how such reporting can be of value. Readers' comments on The Wired Campus include UMBC's John Fritz's explanation of the system and some of the rationale for collecting (and publishing) stats. The range of comments on Wired Campus is tremendous and reflects just about every reaction you can imagine to the publishing of such data. From "invasion of privacy" to "reasonable protection," concerns are posted from all perspectives: student, professor, and administrator.

Usage stats are not likely to become more acceptable to critics just because they are called an "indicator" rather than a measure, but that term does describe how many online instructors use stats when reviewing students' activity. "Hits," along with the number and frequency of log-ins, provide an indication of whether a student needs prompting. Of course, a check on the quality of the student's work further informs the decision to prompt.
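As a concrete illustration (not UMBC's or Blackboard's actual method), here is a minimal sketch in Python of how log-in recency might be turned into a prompt list. The sample data, the needs_prompting helper, and the seven-day threshold are all assumptions made for this example; a real decision would also weigh the quality of the student's work, as noted above.

    from datetime import datetime, timedelta

    # Invented log-in records per student. A real export from the LMS
    # would need to be parsed into timestamps first.
    logins = {
        "student_a": [datetime(2009, 1, 2), datetime(2009, 1, 9)],
        "student_b": [datetime(2009, 1, 3)],
        "student_c": [],
    }

    def needs_prompting(timestamps, as_of, max_gap_days=7):
        """Flag a student whose last log-in is older than max_gap_days,
        or who has never logged in at all."""
        if not timestamps:
            return True
        return (as_of - max(timestamps)) > timedelta(days=max_gap_days)

    as_of = datetime(2009, 1, 14)
    for student, stamps in logins.items():
        if needs_prompting(stamps, as_of):
            print(f"{student}: no recent activity -- consider a prompt")

The point of the sketch is only that recency and frequency are cheap to compute; they flag candidates for follow-up rather than measure learning.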

My personal experience with Blackboard included a crude test of the accuracy of the usage stats that Blackboard calls Course Statistics. Several graduate students spent timed sessions in a Blackboard shell and intentionally accessed files in certain patterns. We confirmed what we already suspected: the reported hits did not correlate well with time-on-task or even with the number of files accessed. Even with that lesson, we proceeded to use Course Statistics to identify MIAs (students "missing in action"), establish the date "drops" stopped attending class, and track performance as one indicator of student engagement. My informal conclusion after teaching online for almost 10 years is that page hits don't necessarily correlate with performance for students at the middle and high levels. But they do correlate for students on the low end, and they sometimes serve as an early warning signal to identify those students.
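For anyone inclined to run a similar crude test, the following Python sketch shows the general idea: compare the reported hit counts against independently timed sessions. All numbers are invented for illustration and are not the data from the test described above.

    # Each tuple: (reported_hits, minutes_on_task, files_opened)
    sessions = [
        (42, 30, 5),
        (118, 30, 5),   # same session length and files, very different hits
        (15, 60, 12),
        (77, 15, 3),
    ]

    def pearson(xs, ys):
        """Plain Pearson correlation coefficient, no external libraries."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    hits = [s[0] for s in sessions]
    minutes = [s[1] for s in sessions]
    files = [s[2] for s in sessions]
    print("hits vs. minutes:", round(pearson(hits, minutes), 2))
    print("hits vs. files:  ", round(pearson(hits, files), 2))

A coefficient near zero (or wildly unstable across small samples) would echo the finding above: hits are an indicator worth watching at the low end, not a measure of time-on-task.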

(John Fritz has shared his paper with the higher ed community; more on that in another post.)

© 2009 Mary Bold, PhD, CFLE. Email contact: bold[AT]marybold.com. The content of this blog or related web sites created by Mary Bold (www.marybold.com, www.boldproductions.com, College Intern Blog) is not under any circumstances to be regarded as legal or professional advice. Bold is the co-author of Reflections: Preparing for your Practicum or Internship, geared to college interns in the child, education, and family fields. She is a consultant and speaker on assessment, distance learning, and technology.
