
Evolution of DDI Initiative at Loyola Academy

• 2007-2009: Common element on semester exams; encouraged quarter exams (Math dept. only). Inadequate local assessments: standardized test results are the primary way to gauge the effectiveness of LA instruction. Lots of grade inflation and variation in teacher assessment practices, as evidenced by my review of semester exams over three years. Spanish 4 exam.
• 2008-2009: Test Day Policy change: designated exam days each week; each dept. had two possible testing days rather than designated days at the end of the quarter only. Lots of abuses with the change. Students very stressed. Root cause analysis: "Why does testing cause so much anxiety within the LA community (students and faculty)?" Reflected on why, what, when, and how we test students to determine mastery of course outcomes.
• June 2009: Reached out to Kim Marshall (Marshall Memo) for help finding an assessment/testing expert, as I had been taken with his work on interim assessments. Paul Bambrick-Santoyo: LA alum (Uncommon Schools consortium: North Star Academy). Reference article: "Data in the Driver's Seat." Paul agrees to lead LA's faculty institute: August 24.
• Paul's major premise: "Standards/outcomes are meaningless until you define how you will assess them."
• July 2009: Conference with NLNS at Boston Univ.: two-day workshop. I am both intrigued and nervous about DDI (Data-Driven Instruction) implementation at LA. We don't have identifiable outcomes for each course or a coherent curriculum (a written curriculum is fiction for most schools).
○ DDI: rigorous, common interim assessments are administered and analyzed four times a year to inform instruction and improve student achievement. At LA (year one): one course analyzed.
○ LA's current issues: academic rigor is not consistent within departments or throughout the school. Curriculum is not well articulated. Instruction reflects teachers' areas of expertise, preferences, and expectations for learning rather than what students need. Pockets of excellence. Lack of vertical articulation. Lots of grade inflation, yet a disconnect with standardized test performance. Summative assessments are mostly forced choice and do not reflect higher-order thinking skills. Gaps in achievement, as learning deficits are not addressed systematically.
• July 2009: Meeting with dept. chairs about the feasibility of DDI: they are hesitant but see the value. Shared my reflection on Paul's conference and reviewed the summer mailing to teachers with homework to review semester exams. (Reference: summer homework handout and DDI team assignments.)
• August 2009: Faculty Institute Day (Aug. 24) and Dept. Institute Day (Aug. 25): Paul Bambrick-Santoyo gave a day-long interactive workshop on DDI, rigorous assessments, and how to conduct data analysis. (Handout: data analysis sheets.) The goal of DDI is to get teachers talking about the data and then to plan for re-teaching.
• September 2009: Department chairs are nervous about reactions to DDI in their departments. Lots of hostility; common complaints: "too much time," "not enough time," "I already have all of my lessons, exams, etc. written for the entire year." I run before- and after-school information sessions with faculty, trying to demystify DDI and showing how LA's culture already supports it (athletics, AP exams). Pretty good attendance. Spent time showing how our current semester exam data did not provide any insights into student performance/achievement.
• September 2009: Faculty meeting (late start). Faculty work on first-quarter exams in DDI teams.
• October 7, 2009: Faculty meeting. I gave a presentation on the academic health of LA and how it is measured: enrollment, student performance on standardized tests, college matriculation information, and semester exam information (again showed how we could not gain insight from it). Sobering standardized test data: a slight decline, and gaps in achievement by gender and ethnicity. Articles on grade inflation and academic rigor distributed to faculty.
• October 2009: Prior to my trip to NYC, I gave the Oct. 7 presentation to the Programs/Policies committee of the Board of Trustees. Jeanie Egmon.
• October 2009: I traveled to NYC (College Board conference) and then spent one day with Paul Bambrick-Santoyo in Newark, participating in professional development sessions he was leading for Uncommon Schools administrators and faculty.
• November 11: Lunch forums with faculty: I discuss the phases of DDI, the benefits to individual teachers, and the benefits to our students and the school in general, in that it promotes excellence.
• Nov. 12: Individual teachers provide a simple analysis of one class to their chair. Pre-tenured teachers must present this quarter-exam data in pre-observation conferences; observe them in their DDI class.
• November 19: DDI teams reflect on first-quarter student performance. (Reference the handout.)
• December 1: Software purchase for data analysis.
• January 2010: Chairs to North Star: school visit and professional development on leading data analysis meetings with their faculty.
• January 2010: Semester exams: some compromise to keep the peace, but I am keeping faculty's feet to the fire.

Benefits
• Individual teachers:
○ Having quarter tests written and in hand before the quarter will drive better instruction.
○ Assessments are connected to course outcomes.
○ Assessment and analysis are linked.
○ Targeted follow-through for unsuccessful students: teachers are poolside.
○ Evidence to support the rigor of one's instruction.
○ Curriculum: quality and intensity.
○ Reflection on practice and student performance.
○ Focus on the results of instruction.
• School:
○ Student achievement will improve.
○ Insights into grade inflation and student performance, esp. gaps by ethnicity and gender.
○ Promotes an aligned curriculum: decreased curricular variance.
○ Provides us with evidence that supports the effectiveness of our instruction beyond student performance on standardized tests. Teachers are writing the exams to reflect the course outcomes they have written for their courses.
○ Allows LA to react systematically to the ACT issue.
○ May allow highly selective colleges/universities to perceive us differently.
○ Advances teacher professionalism: DDI is very Ignatian in that it promotes reflection on and evaluation of one's practice. Action.
○ If used correctly, DDI analysis will allow us to evaluate tenured teachers without creating another evaluative instrument. More authentic.

Obstacles
• Creating common assessments that reflect common outcomes is a challenge. The written curriculum is often fiction, and teachers choose not to adhere to it.
• Construction of rigorous assessments is a challenge. Need professional development.
• Rigorous instruction is a challenge: have to create an environment that supports teachers.
• Time is an issue: next year we will add more time for DDI teams to meet.
• One-on-one data analysis can be stressful and personal. Which is okay for some faculty members!
• Many of the Q1 exams were of poor quality. Okay: baby steps.
• "Teaching to the test" is a common complaint. Tests have always been a representation of the knowledge and skills that classroom instruction promotes. I counter with, "Haven't you always been teaching to the tests you write?"

Conclusion
• DDI is a journey. I am guessing that our implementation will take about five years. Not sharing that directly with the faculty; just asking them to be patient.
• Phases of DDI:
○ Phase I: Ignorance, confusion, overload.
○ Phase II: Feeling inadequate and distrustful of the whole process.
○ Phase III: Challenging the test.
○ Phase IV: Looking for causes but no action.
○ Phase V: Changing teacher practices: teachers follow through on analysis, and lesson plans reflect spiraling.
• Lunch forums were more successful than I anticipated: teachers did tell their stories of not buying into DDI at first and then seeing the value of doing the data analysis. Read Bill Lowe's email.
