ASSESSING AUDIO-VISUAL FEEDBACK EFFECTIVENESS ON DOCTORAL STUDENTS’ TECHNICAL WRITING SKILLS


Feedback Effectiveness

Effective feedback is widely viewed as a vital element of student achievement in higher education (Evans, 2011; Sadler, 2013). Despite its importance to students, persistent problems remain with the timeliness, quality, and effectiveness of feedback, and with student engagement with it (Crook et al., 2012). Evans (2011) described a sense of disappointment with feedback practices among both faculty and students. In a decade-spanning review of studies on feedback, Evans (2011) reported that faculty members were dissatisfied with how feedback is applied in practice and that students were frustrated by the technical aspects of feedback.

Teachers and students hold divergent views on the effectiveness of feedback (Sadler, 2010). Student understanding of feedback is often lower than teachers realize. Walker (2009) reported that up to 30% of students do not understand the feedback provided by instructors. Such low levels of understanding mean that students may be unable to apply teacher feedback to assignments or, more fundamentally, may not grasp what the feedback means (Sadler, 2010). Weaver (2006) found that up to 40% of students questioned the credibility and value of the feedback provided by instructors on assessments.

A Need for Further Research

A review of the current literature shows that fewer studies on feedback are generated within the United States than in other nations, such as the United Kingdom (Clark, 2011). Globally, fewer studies address the views of higher education students and teachers (Scott et al., 2011). Students who return to school years after completing a prior degree are a particularly under-researched population (Evans, 2011).

Assessing effective feedback practices within specific individual or general contexts needs to remain a research focus (Sadler, 2010). A gap in the literature pointed to the need for an explanatory study investigating how feedback tied to learning outcomes might support student achievement (Clark, 2011). The timing was right to situate such a study within an emerging trend in higher education student assessment: using technology to provide feedback. Integrating technology into student feedback can allow instructors to place feedback where it is most effective (Phillips & Wolcott, 2014) and to build a relationship with students (Bye & Fallon, 2015).

Current Research

Under a research fellowship, I had the opportunity to lead a team of investigators in a quasi-experimental study examining the extent to which audio-visual feedback may improve rubric-graded scores for technical writing proficiency within an online, technologically supported learning environment. At the research site, many postgraduate students are members of the workforce returning to study after many years away from a scholarly environment. These students may have more difficulty making use of dialogue-based feedback in higher education settings.

Students in the treatment group (n = 21) received audio-visual feedback on assignments, while students in the control group (n = 30) received traditional print-based feedback. Quantitative data were collected using the course-based assignment rubrics to evaluate the effect of membership in the experimental or control group on student outcomes in the writing and organization of course assignments. The covariate was student grade point average (GPA) in the core and specialization courses of the student's degree program.
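For readers who want a concrete sense of how such a covariate-adjusted group comparison can be run, the sketch below uses Python's statsmodels to fit an ANCOVA of rubric scores on group membership with GPA as the covariate. It is a minimal illustration under assumed data, not the study's actual analysis; the file name and column names (group, gpa, rubric_score) are hypothetical.

    # A minimal sketch, assuming one row per student in a flat CSV;
    # an illustration of a covariate-adjusted comparison (ANCOVA),
    # not the study's actual analysis code. Column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Each row: group membership, program GPA (covariate),
    # and the graded-rubric technical writing score for an assignment.
    df = pd.read_csv("rubric_scores.csv")  # columns: group, gpa, rubric_score

    # Ordinary least squares with group as a categorical factor and GPA as
    # the covariate, summarized with a Type II analysis of variance table.
    model = smf.ols("rubric_score ~ C(group) + gpa", data=df).fit()
    anova_table = sm.stats.anova_lm(model, typ=2)

    print(model.summary())  # coefficient on C(group): adjusted group difference
    print(anova_table)      # F-test for the group effect after adjusting for GPA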

Results

In the current study, which used technology to deliver audio-visual feedback to an underserved research demographic (non-traditional, working adult doctoral students), there were no significant differences in technical writing scores between the group that received the technology intervention and the group that did not. Further, similar patterns of improvement and decline in weekly scores were observed within groups across the different weekly assignments. These results, together with the finding that the improvement in the group score mean was stronger, and the decline smaller, for the experimental group, suggest an important agenda for additional research on the utility of audio-visual feedback for the non-traditional doctoral student.

The results of the study could be extended through further analysis of the greater improvement and smaller decline in graded-rubric score means when feedback is provided in audio-visual form. Varying combinations of student interaction with digital technology and of data collection protocols could form the basis for such research. Methodological scenarios worth considering include the effects of two-way interaction with audio-visual technology, the collection of longitudinal data, the timing of the introduction of audio-visual feedback into a course, and the integration of best practices for delivering audio-visual feedback. Further study is needed to determine whether the advantages found in studies of other student populations transfer to instruction and assessment practices addressing the technical writing competencies of non-traditional, working adult doctoral students.

For those considering their personal research agendas, there are potentially rewarding pathways to pursue with respect to assessment, technology, the digital experience, and learning outcomes for non-traditional higher education students in online degree programs.

By Kenneth C. Sherman, PhD

References

Bye, J., & Fallon, W. (2015). Supporting a relational approach to feedback on assessments: The unintended impacts of institutional influences on student feedback. Employment Relations Record, 15(1), 27.

Clark, I. (2011). Formative assessment: Policy, perspectives, and practice. Florida Journal of Educational Administration & Policy, 4, 158–180.

Crook, A., Mauchline, A., Maw, S., Lawson, C., Drinkwater, R., Lundqvist, K., . . . Park, J. (2012). The use of video technology for providing feedback to students: Can it enhance the feedback experience for staff and students? Computers & Education, 58(1), 386-396. doi:10.1016/j.compedu.2011.08.025

Evans, C. (2011). Making sense of assessment feedback in higher education. Review of Educational Research, 83(1), 70-120.

Phillips, F., & Wolcott, S. (2014). Effects of interspersed versus summary feedback on the quality of students' case report revisions. Accounting Education, 23(2), 174-190. doi: 10.1080/09639284.2013.847328

Sadler, D. R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35, 535-550. doi:10.1080/02602930903541015

Sadler, D. R. (2013). Opening up feedback: Teaching learners to see. In S. Merry, M. Price, D. Carless, & M. Taras (Eds.), Reconceptualising feedback in higher education: Developing dialogue with students (pp. 54-63). London: Routledge.

Scott, D., Evans, C., Hughes, G., Burke, P. J., Watson, D., Walter, C., . . . Huttly, S. (2011). Facilitating transitions to masters-level learning—Improving formative assessment and feedback processes. Executive summary. Final extended report. London, UK: Institute of Education.

Walker, M. (2009). An investigation into written comments on assignments: Do students find them usable? Assessment & Evaluation in Higher Education, 34(1), 67-78. doi:10.1080/02602930801895752

Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379-394. doi:10.1080/02602930500353061
