Monday, November 24, 2014

During a discussion last week with several Greater Boston districts, I was excited to hear so many district and school leaders thinking about how to use their district common assessments to monitor the progress of low-performing students or subgroups.  MCAS, ACCESS, and other summative assessments provide us with important information about how our students performed at one point in time during the year.  One of the best aspects of these assessments is that we can easily manipulate the data to identify how certain groups of students performed, measure student growth, and compare our students with students in other schools, in other districts, and across the state.

But how are our students performing prior to March and May?  How will we know what to expect in the MCAS data?  Schools and districts have a lot of additional data they can draw from to monitor progress throughout the year and make adjustments to instruction and resources.  DIBELS, DRA, district or external benchmark assessments, and school-developed writing rubrics and common assessments can all provide the same type of comparable data to monitor student progress throughout the year.

One request to assessment developers and publishers: help teachers and leaders monitor the progress of subgroups by sharing data in more than just aggregate and student-level forms!
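In the meantime, a data team can roll a student-level export up to the subgroup level itself.  Here is a minimal sketch in Python, assuming a hypothetical export with a subgroup flag and a benchmark score per student (all field names and values below are invented for illustration, not from any real assessment system):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical student-level export: one row per student.
# The "subgroup" and "score" fields are assumptions for this sketch.
rows = [
    {"student_id": "001", "subgroup": "ELL", "score": 72},
    {"student_id": "002", "subgroup": "ELL", "score": 65},
    {"student_id": "003", "subgroup": "SWD", "score": 58},
    {"student_id": "004", "subgroup": "All Other", "score": 81},
    {"student_id": "005", "subgroup": "All Other", "score": 77},
]

# Group the student-level scores by subgroup.
by_group = defaultdict(list)
for row in rows:
    by_group[row["subgroup"]].append(row["score"])

# Report a count and average for each subgroup.
for group, scores in sorted(by_group.items()):
    print(f"{group}: n={len(scores)}, mean={mean(scores):.1f}")
```

The same roll-up could be done in a spreadsheet with a pivot table; the point is simply that subgroup views are recoverable from student-level data even when a vendor report does not provide them.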

Additional Resources

In case you missed the Edwin Roadshow, or are looking for more customized support, Thinkgate is offering a series of webinars on Edwin topics (see below).  All webinars are offered Nov. 25-Dec. 10.  Register and find out more here: 

  • Assessments:
    • Performance and Observational Assessments
    • Sharing Classroom Assessments
    • Assessment Results for Teachers
    • Assessment Administration - Online
    • Assessment Administration - Paper/Pencil
    • Populating your Item Banks
  • Curriculum:
    • Model Curriculum Units and Digital Resources
    • Curriculum Maps and Unit Plans
  • Instruction:
    • How Grouping Can Assist you with Interventions

Thursday, November 13, 2014

A question that frequently arises when discussing student assessment data is how to prevent educators from feeling defensive when the results are not as promising as hoped.  As with any team discussion, how the conversation is framed and the norms that are established matter.  Set clear guidelines for what will be discussed and why, framing the purpose of the discussion as identifying where students are struggling and how we as a school might better provide support.

When diving into the data, begin by reviewing the data through an objective lens.  Team members share what stands out to them from the data and what surprises them.  They do NOT make any "because statements" or begin to provide context to the data.  Rather, keep the discussion to statements like "I noticed that ELA student growth percentile increased this year by more than our math student growth percentile" or "I see that while the percent of students scoring advanced did not increase, the percent of students scoring warning did decrease."  Once the team can discuss the data in an objective manner, move on to drawing conclusions and interpretations, bringing in the contextual evidence as well.

There are a number of great protocols available to help guide team data discussions.  RBT's Data Driven Dialogue is very user-friendly, as is MA DESE's District Data Team Toolkit:

Resource to share:
The Regional Education Laboratory at EDC is hosting a webinar on Using Data to Inform Instruction.  The webinar will be held on November 24 from 10:30 to 12:00.  Learn more and register using the following link: 

Thursday, November 6, 2014

DSAC's New Data Specialist!

As the Greater Boston DSAC's newest team member, I'd like to introduce myself as the Data Specialist.  I've met many of you already through the Learning Walkthroughs, learning from student work, and professional development sessions I've participated in with you, and I look forward to visiting more Greater Boston schools in the next few weeks!

I began my career in education as an elementary teacher, primarily in grades 3 and 4.  I taught in New York City and Taunton, MA, before working in K-12 research with the Center for Education Policy Research at the Harvard Graduate School of Education, and as a consultant conducting education program evaluations.  For the past few years, I've studied schools' use of student assessment data to drive instruction and student achievement, and the impact of using video observation in teacher evaluation systems.

District Determined Measures - A Resource for Administering Common Assessments

As districts begin to finalize and implement DDMs this year, the question arises of how best to maintain consistency in district or school-developed assessments, both in administration and data collection.  While attending the Edwin Roadshow last week I was introduced to a component of the Edwin system that may prove quite useful in DDM implementation.  Using Edwin Teaching and Learning, teachers or administrators can develop common assessments to be shared within and across schools.  Teachers can then administer the assessment online to their group of students, and receive near real-time assessment results (within 48 hours).  This system is worth considering, as most districts are using a combination of vendor-provided and district-developed measures.  This system is also free to districts this year!

Finally, a plug for the PARCC Practice Test Regional Session.  If you're looking for additional information about the format and content of the PARCC assessment, sign up for one of the training sessions on the MA DESE website: