Wednesday, December 17, 2014

Going Beyond Summative Assessment Data...



Several of my recent blog posts have focused on standardized assessment data, as questions about this fall's MCAS and PARCC data are foremost on the minds of many teachers and administrators.  But beyond the summative assessments, what other data, already readily available, should schools and districts use to track student progress?  What other data sources should be used to identify at-risk students?

As I've thought about this question, I came across an article in Education Week focusing on the efforts of Tacoma, Washington's superintendent, Josh Garcia.  Garcia has transformed the district's use of data by moving beyond attendance, test scores, and graduation rates to incorporate additional measures that capture the whole child.  Tacoma's accountability system now includes more than three dozen measures, broken down by grade level and content area, from preschool through grade 12.

What I find most interesting about Tacoma's new accountability system is that it is both holistic and aligned.  The district identified four key goals to focus its efforts on raising the graduation rate, improving school climate, and raising the achievement of all students.  These goals are defined in the areas of academic excellence, partnerships, early learning, and safety.  Realizing that it was gathering very little data on its students, and little that could be used to monitor progress, the district began to look at what additional data could be gathered to align with each of the four key goals.  Data are now gathered from the community, from students in grades PK-12, and from faculty and staff.

A few examples of data Tacoma currently includes in the K-12 accountability system are as follows:

  • % of students accessing preschool and full day kindergarten programs
  • At each grade level: % of students meeting standards
  • Grade 8: % of students accessing extracurricular activities
  • Grade 9: % of students failing 1 class; % of students failing more than 1 class
  • High School: % of students accessing extracurricular activities; % of students receiving college acceptance letters
  • % of public visits to the district's website and social media pages
  • Community Partners, Healthy Youth, and School Climate survey responses
  • % of registered school volunteers

More information on Tacoma's accountability system can be found at the following link: http://www.tacoma.k12.wa.us/information/strategicplan/documents/tps-measuring-the-whole-child.pdf

Additional Resources:
The EdEval office has partnered with the MA Organization of Educational Collaboratives to organize a series of DDM office hours between January and May 2015.  DDM working group members (1-2 per district) are invited to attend these office hours to have their questions answered, learn about and share implementation strategies, and receive feedback on potential measures from ESE's DDM experts.  Space is limited.

Session schedule (each session runs two hours):

  • January 15th, 9:30-11:30 - Cape Cod Collaborative, 418 Bumps River Road, Osterville, MA 02655
  • March 11th, 9:30-11:30 - Assabet Valley Collaborative, 57 Orchard Street, Marlborough, MA 01752
  • May 6th, 10:00-12:00 - Collaborative for Educational Services, 97 Hawley St., Northampton, MA 01060





Tuesday, December 9, 2014

PARCC Assessment Resources

PARCC tests are scheduled to begin as early as March, and districts have already begun receiving emails about registering their students for the tests.  So how can you find out more and alleviate the PARCC anxiety your teachers may be facing?

The good news is: resources are out there!

  • All teachers administering PARCC assessments this year should visit PARCC's practice site, http://practice.parcc.testnav.com/, to check out the tutorials and take a sample test.  Learn about the different question types (e.g. drag and drop) and the math and ELA tools (e.g. protractor, ruler, compass, line reader, text highlighter, and my favorite, the answer eliminator).  The brief tutorials provide a wealth of information to help teachers prepare their students for a different testing format.  Open response scoring rubrics and PBA descriptors are also available.
  • For administrators: you should have received an email on October 27 with your PearsonAccess account information.  By December 23rd you'll need to review and update all student files for accommodations and student transfers since the October SIMS submission.
  • How to explain to students how to take an online assessment: http://tg.apulumtechnologies.ro/t101.html
  • How to prepare for PARCC once student data has been uploaded: https://massparcctrial.files.wordpress.com/2014/12/student-data.pdf
  • How to communicate with parents about PARCC: http://www.doe.mass.edu/parcc/CommTool/10Things.html

Thursday, December 4, 2014

Rethinking Student Assessments



This week during a PARCC training session I was able to try out the new online assessments in math and ELA.  My family is a family of educators, so it was pretty typical that I gave them a rundown of my testing experience.  My brother, one of the few non-educators at the table, posed the question: Why are the tests changing?  It seemed like a simple question, but I found I didn't have a simple answer.  Rather, it led me to reflect on precisely why our summative assessments are changing.

As educators we do need feedback about how our students are performing on grade-level standards.  Having that information at multiple points during the year, in a mode that allows you to compare your students with students from similar schools and districts, is valuable.  For an assessment to be truly useful, it needs to be a true measure not just of what we want students to learn, but of what we want them to know and be able to do.  Thinking about assessments in this way helps us realign the types of questions and the rigor of items with what we expect students to be able to do.  We also need our assessments to:
  • provide timely data (preferably within 48 hours of administration) to allow for teacher follow up and reteaching;
  • provide data in a usable format for teachers to access to make decisions about instruction and flexible student grouping;
  • assess standards in multiple ways and at multiple cognitive levels (many of our current assessments ask few, if any, questions that require students to analyze, evaluate, or create); and
  • provide more authentic assessment experiences, and ongoing assessment of proficiency.

Resources:
Article: Rethinking Assessment to Improve Student Outcomes http://www.slideshare.net/slideshow/embed_code/40090962

If you, like me, missed this week's Askwith Forum at the Harvard Graduate School of Education on closing the gap in African American educational excellence, check out the video at: http://www.gse.harvard.edu/news/14/11/closing-gap-african-american-educational-excellence

The forum discussed the purpose and goals of the White House Initiative on Academic Excellence for African American students, led by Executive Director David Johns.


Monday, November 24, 2014


During a discussion last week with several Greater Boston districts, I was excited to hear so many district and school leaders thinking about how to use their district common assessments to monitor the progress of low-performing students or subgroups.  MCAS, ACCESS, and other summative assessments provide us with important information about how our students performed at one point in time during the year.  One of the best aspects of these assessments is that we can easily manipulate the data to identify how certain groups of students performed, measure student growth, and compare our students with students across the district, in other districts, and across the state.

But how are our students performing prior to March and May?  How will we know what to expect in the MCAS data?  Schools and districts have a lot of additional data they can draw on to monitor progress throughout the year and to adjust instruction and resources.  DIBELS, DRA, district or external benchmark assessments, and school-developed writing rubrics and common assessments are all available and can provide the same type of comparable data for monitoring student progress across the year.

One request to the assessment developers and publishers: help teachers and leaders monitor the progress of subgroups by sharing data in more than just aggregate and student-level forms!
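
Until that happens, teams that can export student-level benchmark results can build their own subgroup summaries with a few lines of code.  Here is a minimal sketch in Python/pandas; the file name and column names (student_id, subgroup, benchmark_score) are hypothetical placeholders, not fields from any particular vendor's export, so you would need to map them to whatever your system actually produces.

```python
# Minimal sketch: summarize benchmark results by subgroup from a
# student-level export.  Assumes a hypothetical CSV with columns
# "student_id", "subgroup", and "benchmark_score" (0-100 scale);
# adjust the names to match your vendor's actual file.
import pandas as pd

results = pd.read_csv("benchmark_results.csv")  # hypothetical file name

subgroup_summary = (
    results.groupby("subgroup")["benchmark_score"]
    .agg(n_students="count", mean_score="mean", median_score="median")
    .round(1)
)

# Flag subgroups averaging below a locally chosen benchmark
# (70 here, purely as an example).
subgroup_summary["below_benchmark"] = subgroup_summary["mean_score"] < 70

print(subgroup_summary)
```

The same grouping can be crossed with grade level or assessment window to see how a subgroup's results move from one benchmark administration to the next.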

Additional Resources

In case you missed the Edwin Roadshow, or were looking for more customized support, Thinkgate is offering a series of webinars on Edwin topics (see below).  All webinars are offered Nov. 25-Dec. 10.  Register and find out more here: http://www.thinkgate.com/upcoming-webinars 

  • Assessments:
    • Performance and Observational Assessments
    • Sharing Classroom Assessments
    • Assessment Results for Teachers
    • Assessment Administration- Online
    • Assessment Administration-Paper/Pencil
    • Populating your Item Banks
  • Curriculum:
    • Model Curriculum Units and Digital Resources
    • Curriculum Maps and Unit Plans
  • Instruction:
    • How Grouping Can Assist you with Interventions

Thursday, November 13, 2014


A question that frequently arises when discussing student assessment data is how to prevent educators from feeling defensive when the results are not as promising as hoped.  As with any team discussion, how the conversation is framed and the norms that are established matter.  Set clear guidelines for what will be discussed and why, framing the purpose of the discussion as identifying where students are struggling and how we, as a school, might better provide support.



When diving into the data, begin by reviewing them through an objective lens.  Team members share what stands out to them in the data and what surprises them.  They do NOT make any "because statements" or begin to provide context for the data.  Rather, keep the discussion to statements like "I noticed that our ELA student growth percentile increased more this year than our math student growth percentile" or "I see that while the percent of students scoring Advanced did not increase, the percent of students scoring Warning did decrease."  Once the team can discuss the data in an objective manner, move on to drawing conclusions and interpretations, bringing in the contextual evidence as well.

There are a number of great protocols available to help guide team data discussions.  RBT's Data Driven Dialogue is very user-friendly, as is MA DESE's District Data Team Toolkit: http://www.doe.mass.edu/apa/dart/lg.html.

Resource to share:
The Regional Education Laboratory at EDC is hosting a webinar on Using Data to Inform Instruction.  The webinar will be held on November 24 from 10:30 to 12:00.  Learn more and register using the following link: http://www.relnei.org/events/using-data-building-capacity-educators.html

Thursday, November 6, 2014

DSAC's New Data Specialist!

As the Greater Boston DSAC's newest team member, I'd like to introduce myself as the Data Specialist.  I've already met many of you through the Learning Walkthroughs, learning from student work, and professional development sessions I've participated in with you, and I look forward to visiting more Greater Boston schools in the next few weeks!

I began my career in education as an elementary teacher, primarily in grades 3 and 4.  I taught in New York City and in Taunton, MA, before working in K-12 research with the Center for Education Policy Research at the Harvard Graduate School of Education and as a consultant conducting education program evaluations.  For the past few years, I've studied schools' use of student assessment data to drive instruction and student achievement, and the impact of using video observation in teacher evaluation systems.

District Determined Measures- Resource for Administering Common Assessments

As districts begin to finalize and implement DDMs this year, the question arises of how best to maintain consistency in district- or school-developed assessments, both in administration and in data collection.  While attending the Edwin Roadshow last week I was introduced to a component of the Edwin system that may prove quite useful in DDM implementation.  Using Edwin Teaching and Learning, teachers or administrators can develop common assessments to be shared within and across schools.  Teachers can then administer an assessment online to their students and receive near real-time results (within 48 hours).  The system is worth considering, as most districts are using a combination of vendor-provided and district-developed measures, and it is free to districts this year!

Finally, a plug for the PARCC Practice Test Regional Session.  If you're looking for additional information about the format and content of the PARCC assessment, sign up for one of the training sessions on the MA DESE website: http://www.doe.mass.edu/conference/?conferenceid=5967


Wednesday, July 30, 2014

MCAS Analysis for Grade-Level Teams


As you likely know by now, full preliminary MCAS results are scheduled to be available next week, on August 7th, in the MCAS Dropbox and in Edwin Analytics. These data are embargoed until the official release in mid-September, which means they CANNOT be released to the public or discussed in public meetings before then. As noted in the most recent Student Assessment Update, "principals and school officials should use these preliminary results to verify the accuracy of student information and student test status data, and to identify potential discrepancies to report to the Department."

There are a lot of data reports in Edwin Analytics, and it can be overwhelming to figure out how best to proceed, especially with limited time available for analysis. With the caveat that the results are preliminary, I thought it would be helpful to post some suggestions for MCAS analysis that have been received positively in our work with several schools. The MCAS Analysis for Grade-Level Teams outlines a few activities to get teams started reviewing the aggregate results as well as the curriculum and item analysis results (the template is at the end of the document). Please use it if you think it will be helpful to you and your teams!
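
If your team wants to go a step beyond the canned Edwin reports and work directly from a student-item export, the heart of an item analysis is easy to script. The sketch below is a generic illustration, not part of the MCAS Analysis for Grade-Level Teams document; the file name and columns (student_id, item_id, standard, correct) are hypothetical and would need to be mapped to the fields in your actual export.

```python
# Minimal sketch: percent correct by item and by standard, from a
# hypothetical long-format file with one row per student per item and
# columns "student_id", "item_id", "standard", and "correct" (0/1).
import pandas as pd

responses = pd.read_csv("item_responses.csv")  # hypothetical file name

# Percent of students answering each item correctly, weakest items first.
by_item = (
    responses.groupby(["item_id", "standard"])["correct"]
    .mean()
    .mul(100)
    .round(1)
    .sort_values()
)

# The same calculation rolled up to the standard level.
by_standard = (
    responses.groupby("standard")["correct"]
    .mean()
    .mul(100)
    .round(1)
    .sort_values()
)

print(by_item.head(10))  # a natural starting point for the team discussion
print(by_standard)
```

Sorting from lowest to highest percent correct surfaces the items and standards most worth the team's limited analysis time.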




Wednesday, July 16, 2014

Student & Staff Feedback Resources

ESE recently posted a variety of materials to support districts and schools in incorporating student and staff feedback in the educator evaluation system, as required in the upcoming school year. Resources include: 

  • Quick Reference Guide
  • Detailed guidance (Part VIII: Using Staff & Student Feedback in the Evaluation Process)
  • Model feedback surveys (districts have the option to adopt, adapt, or use their own instruments to collect feedback) with administration protocols and discussion prompts
  • Considerations for Collective Bargaining
  • Report & Recommendation on the Use of Parent Feedback in Educator Evaluation
  • ESE Survey Pilot Project Summary

These are resources to review and consider, but districts have the flexibility to determine feedback instruments (not limited to surveys), administration protocols, and processes for integrating feedback into the evaluation cycle.

Tuesday, July 15, 2014

ESE Student Assessment Update

Here is the latest "Student Assessment Update" from ESE, reproduced verbatim, with a lot of important dates and information.

July Conference Call Sessions for Districts Considering PARCC Computer-based Testing
The Department is holding conference call sessions on July 22 and 23 to give initial guidance to districts that have chosen or are considering choosing PARCC computer-based testing in spring 2015. Please visit  www.doe.mass.edu/news/news.aspx?id=10328 for more information. If your school will participate in or may participate in PARCC computer-based testing, contact your superintendent to learn more about your district plans to register for one of the conference calls.

MCAS/PARCC Choice Decisions by District
The Department has released a list of districts that responded to the MCAS/PARCC assessment choice survey by the early decision deadline (June 30, 2014). The list, posted at www.doe.mass.edu/news/news.aspx?id=10345, indicates each district’s choice of MCAS or PARCC testing for spring 2015.

Multiple-Choice Results Interpretive Guide
A Multiple-Choice Results Interpretive Guide for Mathematics and Grades 5 and 8 STE is now available in the Department's ESE Security Portal, in the DropBox Central – MCAS 2014 Data folder. The interpretive guide provides guidance in interpreting preliminary multiple-choice results.

MCAS-Alt Score Appeals 
Principals will receive MCAS-Alt score appeal results by regular mail no later than July 31.

MCAS Reporting and Accountability Teleconferences
Registration for the upcoming reporting teleconferences on August 7 and 8 will open on July 17. During the teleconferences, Department staff will share important information regarding the spring 2014 MCAS test results, and participants will have an opportunity to ask questions.

Important Upcoming Dates
  • July 17: Registration opens for the MCAS Reporting and Accountability teleconferences
  • July 22 and 23: PARCC computer-based testing conference call sessions
  • July 31: Principals receive MCAS-Alt score appeal results
  • August 7 and 8: MCAS Reporting and Accountability teleconferences

Friday, June 13, 2014

MCAS/Accountability Data Release Schedule

DESE recently posted the 2014 MCAS and Accountability Release Schedule. Here are the key dates summarized but please see the full document for more details. 

  • June 23rd - partial* preliminary MCAS Student Rosters posted electronically at DropBox Central in the Security Portal
  • June 26th - partial* preliminary reports available in Edwin Analytics (student rosters, as well as district/school item analyses and results by domain and cluster)
  • August 7th - full preliminary MCAS results available in both DropBox Central and Edwin Analytics
  • August 12th - Preliminary Accountability data provided through the Security Portal (the specific application is called something like “Accountability 2014”)
  • Aug 7-14th - MCAS discrepancy reporting window opens/closes
  • Mid-September - Official data released publicly (note, all data are embargoed prior to this date!)


*Partial results include: 
  • ELA, full results 
  • Mathematics, multiple-choice results only 
  • STE, grades 5 and 8, multiple-choice results only 
  • MCAS-Alt, full results in all subjects

What did students think of PARCC?

What did students think of the content and difficulty of the PARCC online PBA questions?

In ELA:  
  • 12% of the students thought that “All” or “Most” of the questions asked about things they had not yet learned this school year.
  • 88% thought that “Few” or “None” were about things not yet learned.
and
  • 28% thought the content was “harder” than their school work.
  • 56% thought it was “about the same” as their school work.
  • 17% thought it was “easier” than their school work. 
In Math:
  • 30% thought that “All” or “Most” of the questions asked about things they had not yet learned this school year.
  • 70% thought that “Few” or “None” were about things not yet learned.
and
  • 61% thought the content was “harder” than their school work.
  • 33% thought it was “about the same” as their school work.
  • 5% thought it was “easier” than their school work. 

These data come from the PARCC Field Test Student Survey Responses document, posted on ESE’s PARCC site. The site includes a number of resources to assist districts in the 2015 PARCC/MCAS decision (including an FAQ at the top of the page), as well as communication tools to use with various audiences (e.g. parents, educators, school committees).


I know I'll be perusing "MCAS and PARCC: How We Will Compare Results" and "MCAS and PARCC: Sustaining a Reliable Growth Measure."

Thursday, May 22, 2014

PARCC/MCAS Choice and Listening Tour

Districts are in the process of deciding whether to administer PARCC or MCAS tests in ELA and Mathematics for students in grades 3-8 in spring 2015. The timeline for making the decision is as follows:
  • Monday, May 19: District superintendents receive email with instructions and a pass code for registering the district's choice through an online tool
  • Mid to late-May: Information sessions to assist in decision-making (see below)
  • May 19–June 30: District superintendents register early decision
  • 12:00 p.m., Monday, June 30: Deadline for early decision submission
  • 12:00 p.m., Wednesday, October 1: Deadline for decision, pending availability
To help inform districts' decision-making around this choice, the Department has been holding a series of conference calls and regional meetings for district and charter leaders. There are two upcoming conference calls: Friday 5/23 at 10 a.m. and Wednesday 5/28 at 10 a.m. There are also a number of 2-hour regional meetings, too many to list here; more information about both is available online.

The sessions are designed to help districts answer questions such as:

  • Will my district choose to administer PARCC or MCAS tests for grades 3-8 in ELA and Mathematics in spring 2015?
  • If my district chooses PARCC:
    • Will my district administer the computer-based or paper-based PARCC tests?
    • How would my district plan technology resources for PARCC computer-based testing?
  • Will my district choose to administer spring 2015 PARCC tests in ELA and Mathematics in grade 9 and/or 11, if funding is available?
  • What are the accountability implications for my choice?
In addition, the Department is hosting a PARCC 2014 "listening tour" statewide to elicit feedback from test administrators and classroom teachers on the spring 2014 administration of the PARCC field test. These regional sessions are often scheduled back-to-back with the choice information sessions mentioned above. 

Questions? Please email parcc@doe.mass.edu

Monday, May 5, 2014

2014 TELL MASS Survey Results Available

ESE announced, in a recent Commissioner’s Update, that the TELL MASS survey results are now available for the state as well as for districts and schools that met or exceeded the minimum response threshold (50 percent and at least 5 respondents). More than 38,000 educators responded to the survey, which collected perception data on the following topics:
• Community Engagement and Support
• Teacher Leadership
• School Leadership
• Managing Student Conduct
• Use of Time
• Professional Development
• Facilities and Resources
• Instructional Practices and Support
• New Teacher Support


To view the aggregate results go to www.tellmass.org/results. The "Resources" tab provides tools and guides to assist in reviewing and discussing the survey data to inform school improvement.

Friday, May 2, 2014

School and District Feedback Survey for PARCC Field Test

The following was included in the most recent SAS update. If your school/district participated, or will participate, in PARCC, please take the time to share your thoughts, concerns, questions, etc. Surveys are due by June 13th. 

"PARCC created a survey that school and district leaders may use to submit additional feedback on the PARCC Performance-Based Assessment (PBA) and End-of-Year (EOY) Assessment. PARCC will use information gathered through this survey to improve test administration policies and protocols. District/school test coordinators and technology coordinators are encouraged to fill out this survey after their field test administration (PBA or EOY) but no later than June 13, 2014, at www.surveymonkey.com/s/PARCC-field-test-feedback. Staff of district/schools participating in both the PBA and the EOY may choose to give feedback through separate survey responses (i.e., one response after the PBA and one response after the EOY) or through a combined survey response.

Please note that test administrators may submit feedback through the Test Administrator Survey found at www.surveymonkey.com/s/3ZJSXH3."

Thursday, February 20, 2014

2014 TELL MASS – Extension to Complete Survey


Last week's Commissioner's Update included the following about the TELL Mass Survey (see earlier post related to the survey):

"The Department has extended the deadline to complete the 2014 TELL Mass Survey to Monday, Mar. 3. If you are an educator who did not receive an access code for your school, please contact the Help Desk at 800-310-2964 or helpdesk@tellmass.org. As a reminder, Race to the Top districts and schools are required to participate in the survey. To learn more and view additional resources, go to http://www.tellmass.org/."

Friday, February 14, 2014

ESE Seeking Input on School and District Profiles

ESE is inviting you and your colleagues to participate in a survey regarding the School and District Profiles. ESE has contracted with Public Consulting Group, Inc. (PCG) to better understand the current user experience of the Profiles website and to envision its future potential. In partnership with EOE and ESE, PCG will make recommendations regarding the public use of the portal and a strategy for overall Profiles improvement.


The survey will be open from Friday, February 14th through Friday, February 28th. ESE appreciates your participation and your help in spreading the word about the survey. Participation is voluntary and anonymous, but hugely important.

Friday, February 7, 2014

TELL MASS Survey Collection Underway

ESE announced recently:
"The Department has launched the second TELL MASS Survey for Massachusetts public school educators. The online survey, which is voluntary, confidential, and anonymous, allows us to gather educators' perspective on working conditions in their schools, including matters regarding instructional practices, available time, facilities, resources, and community engagement and support. The survey, which is open to all school-based, licensed public school educators, will remain open through Feb. 18, 2014. If you are an educator who did not receive an access code for your school, please contact the Help Desk at 800-310-2964or helpdesk@tellmass.org. As a reminder, Race to the Top districts and schools are required to participate in the survey. To learn more about the survey and view additional resources including a district-by-district response rate tracker, go to http://www.tellmass.org/."

TELL MASS includes questions on the following topics:
• Community Engagement and Support
• Teacher Leadership
• School Leadership
• Managing Student Conduct
• Use of Time
• Professional Development
• Facilities and Resources
• Instructional Practices and Support
• New Teacher Support

Results will be available for schools reaching the 50% minimum response rate with a minimum of 5 educators responding. You can view results from the 2012 survey on the website.