Friday, May 30, 2008

Think about assessment

The point of this post is that online writing teachers would be well served by thinking about assessment. By assessment here, I don't mean evaluation of students and their work. I mean assessment in a course-level and, perhaps, programmatic sense: How effective are instructional methods for writing in the online setting? What methods emerge as "best practices" in this setting? Is student writing different in this environment? If so, how?

If we believe our methods are solid, if we believe that the opportunities in the online teaching world are substantial, if we believe that we can help our students learn to write effectively, then we need to mine the resources--textual and otherwise--at our fingertips to demonstrate that what we do online makes sense and is pedagogically sound.

There are many methods, some of them relatively untapped, to help us demonstrate what is happening in our classes and to measure how student behaviors affect learning. Of course, we can look at student evaluations or surveys. But, if we use message boards as a primary means of class communication, we could also look at student posting habits and how they relate to student success: Do early posters/discussion participants have different outcomes in a class (I looked at this in a post here in April 2006)? Do students who follow up their posts have a different experience? Does post length have any meaning in terms of course outcome? Number of errors? We could perform a number of interesting, localized assessments of our own class practices using the student texts archived right in our own courses.
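To make that concrete, here is a minimal sketch (not any particular study's method) of how one might start such a localized assessment: pulling a discussion-board export alongside final grades and checking whether early posting, post frequency, or post length track with course outcomes. The file names and column names (student_id, posted_at, body, final_grade) are hypothetical stand-ins for whatever your course management system actually exports.

```python
# Hypothetical sketch: relate archived posting habits to course outcomes.
# Assumes a discussion-board export and a grade roster as CSV files;
# all file and column names below are placeholders.
import pandas as pd

posts = pd.read_csv("discussion_export.csv", parse_dates=["posted_at"])
grades = pd.read_csv("final_grades.csv")  # columns: student_id, final_grade

# Per-student posting features: when they first posted, how often they
# posted, and how much they wrote per post.
features = (
    posts.groupby("student_id")
    .agg(
        first_post=("posted_at", "min"),
        post_count=("body", "size"),
        mean_post_words=("body", lambda s: s.str.split().str.len().mean()),
    )
    .reset_index()
)

merged = features.merge(grades, on="student_id")

# "Early poster" measured relative to the earliest first post in the class.
merged["days_to_first_post"] = (
    merged["first_post"] - merged["first_post"].min()
).dt.days

# Simple correlations between posting behavior and final grade.
print(
    merged[["days_to_first_post", "post_count", "mean_post_words", "final_grade"]]
    .corr()
)
```

A correlation table like this is only a starting point, of course; it flags patterns worth a closer qualitative look rather than proving that any posting habit causes better outcomes.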

Also, because of the large amount of informal writing we can easily assign and use online, can we find quantitative or qualitative differences in the writing students do in this environment? How do students collaborate online, in peer review and team-based projects? What works in e-collaborations? What doesn't?

We can also certainly think about our efforts in terms of "do no harm," bolstering the "no significant difference" aspect of online instruction by looking at retention, student evaluations, and perhaps even student grades in future writing courses.

I write this partially as a way to motivate myself, because I'm aware that right under my nose in my online class are archives containing a wealth of information that can help us build an understanding of not only how students perform online but how they learn in general.

It's an assessment-based education world, for sure. In order to support our work, like anyone else in teaching, we should be looking for ways, in our own classes, program-wide, and across the field, to demonstrate the strengths and weaknesses of what we are doing. There are a variety of resources to help develop the framework of such studies, including the impressive, recently published anthology Digital Writing Research: Technologies, Methodologies, and Ethical Issues, edited by Heidi McKee and Danielle Nicole DeVoss (Hampton Press, 2007). A CCCC group, chaired by Beth Hewett, is looking into best practices in online writing instruction. Check out CompPile for a search on [assessment]. Or delve into assessment resources and model articles on sites like MERLOT or Sloan-C or one of the many journals dedicated to online learning, such as the Online Journal of Distance Learning Administration.

Much of the research on online learning has focused on so-called "content" courses. It would seem there is an open vista before us for assessment of writing-centered courses in this environment. Exploring that will help us validate our teaching work to external audiences, our students, and, of course, ourselves.