How to Make Remote Instruction Work
By Alex Fernandez
As schools across the country resume instruction this fall, there are few certainties and seemingly endless questions. As the assessment director for Imagine Schools, a system with more than 30,000 K–12 students spread across seven states and the District of Columbia, I was particularly interested in not just how we’d assess our students this fall, but in how consistent our results would be from one school to the next—and even from one state to another. Perhaps the biggest question we had was, “Is remote assessment as effective as in-person assessment?”
As we attempted to prepare for everything—and we will see everything across our system—here are a few answers we discovered about assessment.
The Need for Assessment
Like any other school or district in the nation, we have legal requirements to assess our students and report on the results. Those requirements vary from state to state, of course, making assessment a complex job. While the varied state accountability requirements complicate assessment across our system, the accountability itself is its own benefit. We want to know that our instruction is effective at every school and in each classroom.
But assessment is also important for informing instruction. It’s simply the easiest and most direct way for teachers to figure out where each individual student is so they can meet them with the appropriate instruction. While assessing students is important for understanding what they have already learned, it’s even more important in helping us understand what they still need to learn.
Is Remote Assessment Effective?
During last year’s closures due to COVID-19, one of our many concerns was the efficacy of remote assessment. What good is assessment, after all, if it doesn’t accurately tell us what students know?
For a definitive answer to this question, Dr. Bill Younkin of the Biscayne Research Group performed some research for us. Sixteen of our schools and approximately 5,000 students agreed to participate in remote assessment for the research. Because we use the same assessment platform at all our schools, whether students test on campus or remotely, the testing tool itself was consistent throughout the study.
His findings indicated that remote assessment was indeed as effective as in-person assessment, with a couple of minor caveats. Exceptionally low scores were a bit less common with remote assessment, and exceptionally high scores were a bit more common. Both effects were observed more frequently at lower grades and almost disappeared at higher grades.
According to Dr. Younkin, the lack of extremely low scores suggests that more students take the assessments seriously at home. They are simply less likely to blow the test off if Mom or Dad is sitting there taking it seriously. At the other end, the slightly higher prevalence of extremely high scores is probably a result of parents helping their children on the assessment. Parents tend to think of assessments as something that determines who gets rewarded and who gets punished, and so they might be inclined to help their children get rewarded.
For this reason, it’s important to remind parents and families that the purpose of assessments is to provide teachers and principals with the data they need to make important decisions about student learning.
Is Distance Learning Effective?
Dr. Younkin also looked into the efficacy of online learning for us, comparing students’ assessment scores at the end of last year, following the school closures caused by the pandemic, with scores from the previous year. He found that assessment scores from 2020 were largely consistent with those from 2019, with the exception of the lower grade levels.
“The pattern of differences observed in the lower grade levels were not seen in virtual schools,” Dr. Younkin noted in his report, “leading to a conclusion that, even at these grade levels, information provision and training for ‘test administrators’ can overcome the issues at the lower grades.”
Overall, these findings led us to the conclusion that it may take some extra effort, but we can make distance learning and remote assessment work for the families we serve.
Planning for an Unpredictable Fall
Needless to say, the results gave us some confidence in planning for the fall. What was less clear to us was how to implement our plans across each of the seven states we serve, because each state and campus has unique circumstances.
We’ve decided to keep to our usual fall assessment schedule, in which we assess two to three weeks into the year, once students and staff have settled in a bit. We understand that in some systems, circumstances will require a change, but that’s going to be up to them. Those initial assessments will form the baseline for individual teachers, and together they will form a baseline for us as an organization.
We will then have three to four assessments for progress-monitoring throughout the year, ending with our spring test at the end of the school year. So far, the schedule for those is set to remain the same as in a more typical year.
We do expect most of our schools to open online, and we’re predicting that about 30,000 students will take their baseline assessments remotely. Of course, that’s a lot more students than the 5,000 who took remote assessment in the spring, but we feel much better prepared for it than at the end of last year. We’ve updated our protocols, doubled down on improving communication with parents, and provided our teachers with professional development, so we expect a smoother process this time around, and maybe even more accurate results than in the spring.
Putting Results to Work
Another thing that will be slightly different this year is how we put assessment results to work. With the sudden interruption to the end of the 2019–20 school year, we expect even more variation in student skills, abilities, and knowledge than in most years.
Every teacher knows that some skills are more essential than others. Learning to compare and contrast different versions of a story—say the written fairy tale of Cinderella versus the Disney movie—and learning what sounds each letter makes are both literacy skills students are expected to master. The latter skill, however, is a fundamental building block that students must have in place before they can learn to read. Without it, their progress will remain stalled. The other skill is also important, but students can learn to read without it—and, in fact, when they learn to read, they’ll probably also pick up the ability to compare two versions of the same story along the way.
Every teacher probably has a pretty good idea of what these Focus Skills are for the grades they teach, but they are likely a bit fuzzier on the Focus Skills for the grades before theirs. We’ve been providing our regional teams professional development so that they can get their teachers ready to focus on the areas that will help their students cover the most ground in the least amount of instructional time.
This fall certainly feels different, but when it comes to assessment, it will be much the same at the most basic level. The details may change, and we may have to drill down more here or there to fully support our students and teachers, but at the end of the day, the data will show us the way forward.
About the author
Alex Fernandez is the assessment director for Imagine Charter Schools. She can be reached at email@example.com.
This article was originally published by The Learning Counsel, a research institute and news media hub focused on providing context for the shift in education to digital curriculum.