As the state sat trapped at home with two days, two feet of snow, and plenty of shoveling, anyone watching the news was inundated with data: forecasts predicting the timeline of the snow, snowfall totals, hourly temperatures, and wind speeds. Constant updates on weather conditions, road conditions, school closures, power outages, storm budget expenditures... the data went on and on. And as the storm drew to a close, we received our summative data and data comparisons: who had the highest and lowest snowfall totals? Who had the most storm damage?
While weathermen are frequently criticized for their inaccurate predictions, I'm actually quite impressed by how well they forecast given how changeable conditions are. In education, we lack satellites and weather balloons, and our landscape is equally variable, if not more so. So what instruments can we use to predict student performance?
EWIS, or the Early Warning Indicator System, tracks academic milestones that correlate positively with high school graduation.
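To make the idea concrete, here is a minimal sketch of how an early-warning check might work. The indicator names and thresholds (90% attendance, any course failure, any suspension) are assumptions for the sake of illustration, not the actual EWIS criteria.

```python
# Illustrative early-warning check. The indicators and cutoffs below are
# assumed for this example; real systems use validated, grade-specific thresholds.

def early_warning_flags(attendance_rate, course_failures, suspensions):
    """Return the list of indicators on which a student appears off track."""
    flags = []
    if attendance_rate < 0.90:   # chronic-absence threshold (assumed)
        flags.append("attendance")
    if course_failures >= 1:     # any core-course failure (assumed)
        flags.append("course performance")
    if suspensions >= 1:         # any suspension (assumed)
        flags.append("behavior")
    return flags

print(early_warning_flags(attendance_rate=0.87, course_failures=0, suspensions=2))
# → ['attendance', 'behavior']
```

The point is less the specific cutoffs than the structure: a handful of readily available measures, checked early and often, rather than waiting for a summative score.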
Interim assessment programs, like ANet and NWEA MAP, provide means of measuring student growth throughout the year and predicting student outcomes on future interim or summative assessments. These might take the form of a growth calculation or prediction, a comparison between students, classes, or schools, or resources for setting individual student goals.
Common rubrics for monitoring student open responses, performance tasks, and written responses and essays are also being used more frequently as the summative assessment landscape (PARCC, Smarter Balanced) changes.
And don't discount those other non-academic measures, like:
- early childhood education attendance;
- attendance K-12;
- school engagement and sense of belonging, as measured in middle or high school;
- extracurricular participation;
- behavior: suspension and office referral rates; and
- observed "soft-skill" behaviors like grit, communication, integrity, honesty, flexibility, and optimism.