No. Not that word. I would like to talk about the “F” word that haunts all conscientious faculty: Failure. When our cohort last met for a brainstorming session three weeks ago, Kerry asked a question that has stayed with me ever since: “What if we fail?”
We are designing something new, so the fear of failure is likely to haunt us. I also worry that I am designing a course without any student input, and while I know what students need to learn, I wish I knew more about how they will learn online.
Many studies have demonstrated, not surprisingly, that feeling isolated, struggling with course technology, and wrestling with illogical course navigation negatively affect how students learn in an online course. Yet in my own experience teaching composition online, I have reservations about how much students value peer-interaction as a way to avoid feeling isolated, a perspective shared by some of my colleagues. If the studies are right, students should seem more engaged by discussion forums and group projects, yet I do not see that in the courses I teach. If this type of social presence is critical to student engagement, where is the evidence from students to prove it?
I looked for recent studies that collected data from students about what they value in an online course. Two studies, both published in 2010, included student surveys. Yi Yang and Vance Durrington, in “Investigation of Students’ Perceptions of Online Course Quality,” surveyed 176 students and found that they valued peer-to-peer interaction, instructor feedback, and clear course structure, but of these three, they rated course structure as the most important.
My take: My students value peer-interaction to some extent, but they value instructor feedback far more, even though they don’t often read it. They do, however, contact me immediately if they cannot find something, whereas they are unlikely to contact me if they have questions about my feedback.
Marcia D. Dixson, in “Creating effective student engagement in online courses: What do students find engaging?” surveyed 186 students enrolled in 36 courses at six campuses. She found that students identified specific activities that increased their engagement in the course: application activities (applying concepts they learned to case studies or problem-solving tasks), debating concepts in a discussion forum, group projects, and research papers. Students reported less engagement with more passive activities: watching PowerPoint presentations and video lectures, and taking quizzes. These students also listed doing the reading as a passive activity.
My take: In general education courses, students are wary of group projects, though learning to research and write collaboratively is a learning objective in our composition courses. Students do need to interact with each other, but getting students to engage meaningfully in discussion forums continues to elude me. (I need to learn how to write better prompts.) I also know that the hours I put into creating videos often go to waste when data show how few students actually watch them. Yet a video that shows a student how to do an assignment is usually a hit. No thinking, just mimicry. Not. Good. Teaching.
The findings in a more recent study were more in line with my own experience. Penny Ralston-Berg, Janet Buckenmeyer, Casimir Barczyk, and Emily Hixon studied students’ experiences in online courses. In their spring 2015 article, “Students’ Perceptions of Online Course Quality: How Do They Measure Up to the Research?” they found that students did not rate peer-interaction more highly than elements such as course design and the relevance of course activities and materials to assignments. Their study, larger in scope (3,160 students surveyed at 31 colleges and universities in 22 states), asked students enrolled in online courses to rate course criteria according to the Quality Matters rubric. Students placed greater value on orientation (having “clear instructions for how to get started in the course”) and navigation (being able to “find various course components”) than on activities designed to encourage a social presence. They also ranked having a “clear grading policy” and “criteria for evaluating student work” among the top five most highly rated criteria.
These students placed far less importance on interactive learning activities, in part, Ralston-Berg and her colleagues conclude, because of poor experiences with group work. The lowest-rated item was having students introduce themselves to the class, though they did want to see an introduction from the instructor.
My take: These findings reinforce what I see in my online composition courses. Students value interacting with the instructor far more than interacting with their peers. The online platform individualizes the learning no matter how much peer-interaction I provide because students are not together physically. Too often, they feel like they are taking a course with me alone rather than with a class.
In my online teaching, I have not yet been able to counter that attitude in my students. Instructors only have so much time to spend with students individually, and we know they can and do learn from each other. Finding a way to do this successfully in an online course is a challenge, and I like that we have time to experiment with strategies that might work better than what I use now.
And as for failing, I think it’s better to situate what we are doing in terms of what it really is: a beta test course, designed by faculty who value student engagement and student learning. We will learn a great deal from teaching our course the first time, and that feedback mechanism will let us know what worked and what needs to be improved.
Dixson, Marcia D. “Creating effective student engagement in online courses: What do students find engaging?” Journal of the Scholarship of Teaching & Learning, vol. 10, no. 2, June 2010, https://www.iupui.edu/~josotl/archive/vol_10/no_2/v10n2dixson.pdf.
Ralston-Berg, Penny, Janet Buckenmeyer, Casimir Barczyk, and Emily Hixon. “Students’ Perceptions of Online Course Quality: How Do They Measure Up to the Research?” Internet Learning, vol. 4, no. 1, 2015, http://digitalcommons.apus.edu/cgi/viewcontent.cgi?article=1047&context=internetlearning.
Yang, Yi, and Vance Durrington. “Investigation of Students’ Perceptions of Online Course Quality.” International Journal on E-Learning, vol. 9, no. 3, July 2010, https://www.learntechlib.org/p/29460.