Training Teachers to Give Effective Writing Feedback with Video Tools

Bridget Walsh

Department of Instructional Technology, Kennesaw State University

ITEC 7500: Capstone Experience & Portfolio

Prof. Rahn

December 2020

Training Teachers to Give Effective Writing Feedback with Video Tools

In the initial proposal, the goal of this capstone project was to determine if the addition of

video feedback would strengthen the impact of teacher feedback on student writing. I proposed

to start by surveying teachers about current methods and practices for formative feedback on

student writing. From there, I wanted to present teachers with the “feedback best practices” handout at

the February 2020 language arts department meeting. In a later February PD, I wanted to show

language arts teachers the eClass video tool, model the tool, and create how-to handouts and

video tutorials for the tool. In a March PD, I wanted to do the same for Screen-cast-o-matic.

After giving teachers two months to try the video tools, I would meet with a small group of

early adopters in May to troubleshoot any problems they may have had. My goal was to have

60% of participants implement the eClass video tool for at least one class period and 60% of

participants implement Screen-cast-o-matic for at least one class period. In an exit survey, I

would ask teachers to share class data and perceptions of video feedback’s impact on student

writing. Finally, I would analyze class data and teachers’ perceptions in order to determine if

training teachers on video feedback tools helped improve student writing.

Implementation

To complete the capstone project, I first reached out to the administrators in charge of professional

development. Together we set up dates for professional development in advance. Next, I shared

my capstone idea with teachers I am close to in the department. I theorized that if teachers had

more exposure to the idea of video feedback, my trainings might have more traction. I spent a

day in January creating materials, including the “eClass video note how-to,” and “Screen-cast-o-

matic how-to” handouts, for my first scheduled training. I also made a “feedback best practices”

handout based on the research I did on effective feedback in the fall (Nordrum, Evans &

Gustafsson, 2013; Sermsook, Liamnimitr & Pochakorn, 2017; Hattie, 2012). I presented these

feedback strategies and video tools at a new teachers’ meeting in January with several different

examples modeling effective individual and whole class feedback. Some of these examples were

video feedback I had created for students, and some were models created for the training. After

the training, I analyzed the results of the evaluative survey I gave teachers in professional

development. I also identified early adopters I might follow up with to address the gaps in my

instruction or the need that still exists in supporting teachers as they learn video tools for

feedback.

Next, in February, I followed up with a few participants in the initial training to discuss

their implementation of video feedback. Their input helped me understand that many teachers

found video feedback more time consuming than written feedback. I adjusted my strategy more

towards formative whole class and selective individual feedback; that is, video might be a

medium more appropriate for delivering some messages about writing—like big picture

corrections—than others. Additionally, I decided to add Flipgrid as a video tool in my follow-up

training since it is a medium that teachers at my school are somewhat familiar with already due

to past trainings. While my first professional development session took place at a new teachers’

meeting, I scheduled the second session with a larger audience in late February, as part of a

school-wide training day. I had thirty-five participants from almost every subject area, including

interpreters and career-track teachers (e.g., culinary, computer tech). This time, I created new

materials demonstrating Flipgrid’s potential for delivering powerful feedback on student work in

general, rather than writing specifically. Again, I analyzed teacher feedback from the session. In

March, I sent out another survey to interested participants asking if they had tried video tools for

feedback and about the experiences they had with these tools.

Project Outcomes

Based on survey results from a sample of 18 teachers, 60% of teachers who attended

the professional development in February said that they used either Flipgrid, Screen-cast-o-matic,

or eClass video notes in some form with students. Fewer teachers, about 36%, said that they used

video tools for student feedback, specifically. The majority of teachers, about 75%, expressed

interest in learning more about video tools in future sessions. However, teachers were more

uncertain about the impact on student learning. Of the teachers who had tried a video tool for

student feedback, there were mixed feelings about the impact of video feedback on student

learning, with about 28% reporting a positive impact and 42% reporting no impact. Generally,

teachers preferred in-person and/or written feedback, reasoning that in-person

feedback is “more efficient” and video feedback more cumbersome. However, since teachers

were surveyed before switching to digital instruction, I am curious how the change to distance

learning might impact their perception of video tools as a means of communicating feedback to

students.

Barriers Encountered

The project timeline, audience, tools, and evaluation I planned in my proposal changed

during the course of the project. First, I made several key changes to the timeline outlined in the

project proposal. For example, because I had written the capstone proposal before administrators

created the professional development schedule for spring, I changed my dates to work with their

schedule. In the proposal, I planned for two professional development sessions months apart.

Instead, two different sessions occurred within one month of each other

for two different audiences, with only a few participants from the first session in attendance at

the second session. While I had planned for the same crew of participants in both sessions, I still

used data collected from the first session to improve instruction for the following session.

Because I was working with existing calendars planned by administrators, the project timeline

was more condensed than I anticipated, with the final follow-up survey sent out to teachers in

March instead of May. The changes to the timeline actually worked in my favor since schools

switched to digital instruction mid-March due to the Coronavirus lockdown.

The audience I had planned for changed during the implementation of the capstone project.

Initially, I had planned to present in front of the language arts department. However, in late

January, I was invited to present at a new teachers’ meeting. I decided to take advantage of this

leadership opportunity to deliver my capstone video-feedback training. Right away, I needed to

change the content I had planned in the proposal for language arts teachers to suit a more general

audience of teachers, and new teachers at that. This meant including more Learning Management System

(LMS) instruction to support teachers in their use of video feedback as they learned our school’s

LMS. Next, since I delivered instruction to a group of teachers from a variety of subject areas—

not just language arts—I adapted video tutorials for strategies that might also work, say, in a

math class. For example, I did a brief tutorial on how to film student work through a document

camera. While my focus was still on improving student writing—and perhaps literacy, more

broadly—I adapted my strategy to include interdisciplinary writing and content learning.

Finally, I added Flipgrid as a video tool for student feedback. I did this for several

reasons. First, I wanted to adapt my project to a wider, interdisciplinary audience. While

Flipgrid, a tool for student video creation, does not produce student writing, I justified the change by

thinking of student videos as evidence of student thinking. Videos of student thinking can be part

of the writing process, as pre-writing before drafting or reflection after writing. Additionally, I

believed the medium of student content creation and teacher feedback (also in video form—as

one would comment on a discussion forum post) would lend itself more to an interdisciplinary

audience of teachers. Furthermore, multimedia student products, as opposed to written products,

help teachers differentiate instruction to a more diverse student population (Hobgood & Ormsby,

2010).

My first obstacles were working with the school calendar to schedule professional

development sessions, as described above. I also had to adapt plans for a different audience of

teachers than I had originally planned for. However, plans I outlined in my proposal changed

drastically when schools switched to digital learning following the statewide Coronavirus

lockdown. The lockdown, in one sense, ended my original project plans, and, in another sense,

created additional opportunities to support teachers as they transitioned to a fully online

curriculum. In April, I participated in a school-wide Zoom professional development session, re-

delivering parts of the PD I had hosted in person in February. My school administration created this

event to help teachers as they transitioned to online teaching. The biggest obstacle presented by

the lockdown was teacher burnout or overload from new tools. Many teachers I followed up with

simply did not have the time to try a new tool as they frantically tried to keep up with adapting

their curriculum to a digital format.

The district’s cancellation of end-of-course exams (EOCs) was another obstacle to the

evaluation of my project. In my proposal, I planned to compare EOC data of teachers who

implemented video feedback to teachers who did not. Instead, I relied solely on participant data

and teacher perception surveys in order to gauge the success of the project. Again, this worked

out well since the audience of the project was no longer language arts teachers, and data

comparison would be too difficult to do across grade levels and subject areas.

Follow-Up 

To follow up, I plan on remaining in contact with teachers who need support with video

tools, since on-going support is one of the criteria for integrating a new tool into regular use

(Essential Conditions, 2020). I have reached out to teachers who attended my sessions with

follow-up emails thanking them for coming and inviting them to reach out to me if they have

questions. Additionally, there are other teacher leaders who are proficient enough with these

video tools to assist others in their Professional Learning Communities, should they need further

help. The technology team has how-to guides for eClass Video Note, Screen-cast-o-matic, and

Flipgrid posted on the Berkmar High School Training page, for asynchronous support for

teachers who seek it out. While I do not plan to do any further professional development trainings on

video tools for student feedback, the school and the district’s technology teams have already

created the infrastructure to support teams wanting to incorporate video into their instruction.

Discussion and Reflection

From completing this capstone, I learned that technology diffusion takes time. I know

that the change process requires key influencers in order to achieve full adoption of an innovation. I

attempted to build change theory into my project by identifying lead innovators after the first PD

session, and seeking them out for feedback before redelivering to a broader audience. I also

considered change theory as I talked individually with colleagues, with the knowledge that

trusting relationships help facilitate the change process. While I did not list Diffusion of

Innovations & Change (PSC 1.4) as a standard I would address in this project, I quickly found

that, in order to meet the 60% goal for implementation, I would need to work strategically

with teachers to get them to adopt video tools for feedback. Therefore, through this project, I

have shown the disposition to embrace the change process.



Additionally, I have learned that good leadership requires listening to the needs of the

faculty. I made several key decisions to significantly alter my professional development content

based on feedback I received from new teachers. Although I had strategically planned a course of

action, through evaluating the tools I selected and reflecting on the needs of the teachers in my

training, I decided to change direction to broaden my audience. This decision led to a higher

regard for the professional development by the teachers present—the data from my evaluations

shows that video tools were much better received and more likely to be implemented by teachers

who attended the second training session, as opposed to the first. The change to my program

shows my skill in program evaluation (PSC 5.3) as well as selecting and evaluating digital tools

and resources (PSC 3.6). Tellingly, I did not include standard 5.3 in my capstone proposal either.

When I wrote the proposal, I did not understand fully the amount of work or flexibility it would

take to get teachers to adopt video tools for feedback. The capstone experience was essential to

my working knowledge of how to adapt a program to a new audience as well as how to listen to

that audience to increase the likelihood of adoption, a task I did not recognize as part of my

proposal until after the fact.

As part of this capstone experience, I improved my disposition to lead. First, I worked

closely with school leadership to schedule dates and pitch my capstone idea as one that would

benefit student learning. I improved my disposition to exhibit self-motivation, initiative, and

leadership. For the first time, I led a school-wide training in front of faculty I had been working

beside for over five years. Because I had to seek out leadership opportunities and communicate

effectively with school leadership in order to accomplish the goals of my project, I believe that

my disposition to lead effectively has improved considerably.



While, overall, I grew from this experience, I could have done more to ensure valid

results from the experiment. On one hand, my ability to adapt to the needs of the faculty was a

strength of this project; on the other hand, I see now where I could have done more to make

the design of this experiment produce more reliable results. Ideally, my project would have

varied less from what I proposed. For example, I originally designed this study

for the language arts department. The idea was to run two training sessions with one informing

the other to support teachers as they communicated with students through video. I was going to

measure results based on assessment data. However, this morphed into two different PD sessions

designed for two different audiences. For this reason (and because there was no assessment data

in the spring), I did not have the amount of data at the end of the project I had hoped for.

Additionally, my evaluation of the effectiveness of my program was based on teacher perception

surveys, which delivered mixed reviews. My results were ultimately inconclusive. If I had

followed through with some variation of the original design, I believe that I would have had more

reliable results.

Recommendations

I recommend that future instructional technology candidates keep the scope of their

projects manageable, since I believe that the complexity of mine was one of the reasons I

struggled to stick to my original plan. For teachers wishing to research the impact of video tools

on student writing, I recommend first establishing norms for which tools are used and in which learning situations

students receive feedback (i.e., formatively, as a public or private response, etc.), then asking the

teachers to test this method of feedback on a few students, rather than whole classes, as the

teachers enrolled in my program complained that creating video feedback was more laborious

than written feedback. Next, I recommend a pre-test/post-test method of analyzing data. That is,

the students who are going to receive video feedback are measured against themselves at the end

of the year and compared to the student growth of those who only received traditional written

feedback. While most of my research indicates that, if used correctly to supplement written

feedback, video feedback can have a positive impact on student learning, most of these studies’

participants are students learning English as a foreign language, rather than fluent English

language arts students. Therefore, there is more room for research on this topic, and I am curious

to see where future experiments lead.



References

Essential Conditions. (2020). International Society for Technology in Education.

https://www.iste.org/standards/essential-conditions

Hattie, J. (2012). Know Thy Impact. Educational Leadership, 70(1), 18–23.

https://www.ascd.org/publications/educational-leadership/sept12/vol70/num01/Know-Thy-

Impact.aspx

Hobgood, B., & Ormsby, L. (2010). Inclusion in the 21st-century classroom: Differentiating with

technology--Reaching every learner: Differentiating instruction in theory and practice.

UNC School of Education. Retrieved March 1, 2020, from

http://web.archive.org/web/20180125110137/www.learnnc.org/lp/editions/every-

learner/6776/

Nordrum, L., Evans, K., & Gustafsson, M. (2013). Comparing student learning experiences of

in-text commentary and rubric-articulated feedback: Strategies for formative assessment.

Assessment & Evaluation in Higher Education, 38(8), 919–940.

https://doi.org/10.1080/02602938.2012.758229

Sermsook, K., Liamnimitr, J., & Pochakorn, R. (2017). The Impact of Teacher Corrective

Feedback on EFL Student Writers' Grammatical Improvement. English Language

Teaching, 10(10). https://doi.org/10.5539/elt.v10n10p43
