Poverty Not Destiny for Educational Performance

This is the second in a series examining Virginia’s Standards of Learning.

by Matt Hurt

In the 2013-2014 school year, school superintendents in Virginia’s Region VII, a region encompassing Southwest Virginia, began to focus on declining student pass rates during their monthly regional meetings. The Virginia Board of Education had recently adopted more rigorous Standards of Learning in math and reading and implemented much more difficult Technology-Enhanced Items on the new SOL tests.

School board budgets had been slashed since the Great Recession of 2008. Many central office positions had been consolidated through staff reductions, and those who remained had to attend to the administrative requirements of state and federal mandates. The superintendents therefore decided to pool their resources and their talents by creating a consortium, the Comprehensive Instructional Program (CIP).

The mission of the CIP was simple: to improve student outcomes as measured primarily by Virginia’s Standards of Learning tests. Initially, data was analyzed to determine which division was the most successful on each SOL test. The most successful teachers of the most at-risk students in that division (as determined by SOL results) were recruited, and they spent the 2014-2015 school year sharing their pacing guides, instructional materials, and assessments, all of which were posted online for others to use. During the first year of implementation (2015-2016), the divisions that used the common pacing guides and common assessments realized greater gains on reading, writing, math, science, and history SOL tests than any other region in the state.

Collectively, the nineteen public school divisions that comprise Virginia’s Superintendents Region VII are the least well funded in the state, have the highest rate of students with disabilities, and have the second-highest rate of students who live in poverty. When folks hear this, they wouldn’t dream that this is also the region with the highest proficiency rates on Virginia’s Standards of Learning tests, but it’s true. It wasn’t always that way, but it has been for the last few years, thanks to their collaboration.

Figure 1. Regional per-pupil expenditures 2020 (most recent data available). ADM = Average Daily Membership (enrollment)
Figure 2. Regional demographics, 2021. SWDs = Students with Disabilities
Figure 3. SOL pass rates 2006-2021, all students, by region

When folks realized that Region VII had become the most successful region in the state despite also being one of the most disadvantaged, they decided to interview teachers and administrators to ask what brought about those significant improvements. These interviews produced two very consistent answers. First, putting teachers in charge of instructional decision-making was key. Each year, teams of teachers are assembled from across the consortium to make changes to pacing guides and assessments, as well as to decide on necessary curriculum updates or anything else that needs to be done to further our mission of improving outcomes for students. What those teachers say goes. The instructional frameworks those teams provide, specifically the common pacing guides and common quarterly assessments, were noted as linchpins of the entire process.

Second was the consortium’s reliance on data to drive better outcomes. Data is collected from SOL tests, benchmarks, state reports, grades, and other sources to determine what is working best, and those strategies are shared with everyone else. Relative outcome reports are also distributed to help inform the expectations of teachers and administrators. These reports help folks understand that if others are getting it done, they can as well.

One of the most eye-opening uses of data, employed early on, examined the relationship between poverty and SOL outcomes. For years, one would hear the trope “the SOL test is nothing more than a measure of poverty,” and many folks used that to dismiss their students’ subpar performance. Figure 4 displays the interplay between those two variables. The poverty rates (from the US Census Bureau) and the overall SOL pass rates for each division were converted to percentile ranks, and then charted on the scatterplot. Please note that the divisions with the most positive and negative outliers are identified.

Figure 4. Relationship between SOL pass rates and student poverty.
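For readers who want to reproduce this kind of chart with their own division data, here is a minimal sketch of the percentile-rank conversion and scatterplot. It is illustrative only, not the CIP’s actual analysis; the file name and columns (division, poverty_rate, sol_pass_rate) are hypothetical.

```python
# Illustrative sketch only; the file and column names are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("division_data.csv")  # one row per school division

# Convert each variable to a percentile rank (0-100) across divisions,
# mirroring the conversion described for Figure 4.
df["poverty_pct_rank"] = df["poverty_rate"].rank(pct=True) * 100
df["sol_pct_rank"] = df["sol_pass_rate"].rank(pct=True) * 100

# Scatterplot of the two percentile ranks.
plt.scatter(df["poverty_pct_rank"], df["sol_pct_rank"])
plt.xlabel("Poverty rate (percentile rank)")
plt.ylabel("SOL pass rate (percentile rank)")
plt.title("SOL pass rates vs. student poverty, by division")
plt.show()
```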

The most positive outlier had a higher poverty enrollment than 87% of school divisions in Virginia, yet its SOL pass rates exceeded those of 97% of divisions in the state. The most negative outlier had a lower poverty enrollment than 67% of divisions, yet was outperformed by 90% of divisions on SOL tests that year. The ten named divisions were those that outperformed this trend to the greatest degree. Eight of those ten are in Region VII, and nine of the ten are in the CIP consortium.

When this report was provided to teachers and administrators, and they could plainly see that some divisions with significantly higher poverty enrollments also earned significantly higher SOL pass rates, they realized that it could be done. Many of these folks resolved to raise the expectations they had of their students and of themselves, and when they did, they were able to improve student outcomes. There has been a significant negative correlation between a division’s relative enrollment of students who live in poverty and its relative SOL achievement, but the strength of that correlation declined from 2014 to 2019. Based on the analysis of the data, Region VII divisions appear to be the driving force behind that improvement.
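For those curious how that statistic can be tracked over time, the short sketch below computes the division-level correlation between poverty and SOL pass rates separately for each year; a value drifting toward zero would indicate a weakening link. The data layout (columns year, poverty_rate, sol_pass_rate) is assumed for illustration and is not the author’s actual dataset.

```python
# Illustrative sketch; the yearly data file and columns are assumptions.
import pandas as pd

df = pd.read_csv("division_data_by_year.csv")  # year, division, poverty_rate, sol_pass_rate

# Pearson correlation between poverty rates and SOL pass rates, by year.
# A correlation moving toward zero over time suggests poverty is becoming
# less predictive of division-level achievement.
yearly_corr = (
    df.groupby("year")[["poverty_rate", "sol_pass_rate"]]
      .apply(lambda g: g["poverty_rate"].corr(g["sol_pass_rate"]))
)
print(yearly_corr)
```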

The data from 2021 was skewed in many ways by our political and educational responses to the COVID-19 pandemic, which will be discussed in greater detail in the next article in this series.

Data investigations have yielded other ways in which low expectations have negatively impacted outcomes for students. This analysis was prompted by examining schools in the consortium that had failed to meet full state accreditation for some time. In all of these schools, an alarmingly high percentage of students had earned an A or a B as the final grade in a course but failed the SOL test associated with that course. Analysis showed that schools with lower SOL performance consistently had higher percentages of such students than schools with higher SOL scores.

Figure 5 compares the middle school in Region VII with the lowest SOL pass rates to the middle school with the highest pass rates. The table clearly demonstrates that it is easier for a student to earn an A or a B for a course in the bottom-performing school than in the top-performing school.

For example, in the bottom-performing middle school, 14% of students who earned an A failed the SOL test associated with that course, compared to 0% in the top-performing middle school. At the top-performing middle school, 56% of students who earned an A for the course scored “pass advanced” on the SOL test, compared to only 17% of A students at the bottom-performing middle school.

Figure 5. Final Grades and SOL Outcomes. MS = Middle School
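A simple cross-tabulation is enough to surface this pattern in any school’s own records. The sketch below is illustrative only and assumes student-level data with hypothetical columns school, final_grade, and sol_result; it is not the consortium’s reporting tool.

```python
# Illustrative sketch; the records file and columns are assumptions.
import pandas as pd

df = pd.read_csv("student_course_records.csv")  # school, final_grade, sol_result

# For each school and final letter grade, the share of students in each SOL
# outcome (fail / pass / pass advanced), as percentages -- the same kind of
# comparison shown in Figure 5.
table = (
    pd.crosstab([df["school"], df["final_grade"]],
                df["sol_result"], normalize="index") * 100
).round(1)
print(table)
```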

This trend is also consistent when reviewing the data for elementary schools, high schools, and the overall school division.

This trend also extends to analysis of subgroup data. In divisions with significantly lower subgroup scores than other divisions, there is a measurable difference in expectations among their subgroups. In those divisions, it is easier for students in some subgroups to earn A’s and B’s, even though those students are less likely to pass their SOL tests. These low expectations do a great disservice to those students: their performance may look good on their report cards, but their skills as measured by the SOL tests are deficient.

Since 2014, there has been a significant change in leadership philosophy in Region VII. Much of this was born of increased collaboration among administrators, as well as the use of data to replicate strategies from successful schools and divisions. Administrators are much more likely to set expectations for teachers but allow teachers the discretion to make the instructional decisions in their classrooms that best meet the needs of their students. Teacher and administrator evaluations are now much more aligned with assessing student outcomes than with evaluating inputs, as had been the case before.

Rather than spend time monitoring top-down mandates, administrators devote more time to working with teachers to understand their problems and remove the stumbling blocks that keep them from being as effective as they can be. Basically, there is much more servant leadership in play, because these administrators understand they have to support their teachers, students, and families for student outcomes to improve.

In conclusion, there is no simple algorithm that can be applied to improve student outcomes. Education is a people business, and managing people is very much like herding cats. Collaboration, reliance on data, problem solving, and a focus on the main thing (improving outcomes for students) seem to be the keys that have helped one of the most disadvantaged regions in the state become the most successful.

Matt Hurt is director of the Comprehensive Instructional Program based in Wise.


Comments


  1. LarrytheG

    I thank Matt for his thought-provoking essay per his usual. The “expectation” idea is connected to “grading”? And low-expectation schools grade high when in fact the actual academic performance is lower?

    So the first thing I wonder about is how we know the actual grades from schools? I’m not sure I see that in the VDOE SOL build-a-table or school profiles.

    But then that brings up another issue, which is (or seems to be) that unlike the SOLs, “grades” are not standardized but instead pretty subjective on a per-school or per-teacher basis? I’m ignorant about this aspect, so maybe someone can explain.

    1. Kathleen Smith

      Larry, grading and expectations are different. High expectation, in my opinion, is the altruistic teacher who believes that all children can and will learn if, as a teacher, I don’t give up, ever. I believe I can teach the same skills at the same high level regardless of your disability, economic status, or race.

  2. Stephen Haner

    “Expectations” is not a word held in favor in large parts of this commonwealth. A “trigger” to some, I’m sure. But you will never get better than you expect, and the expectations have to come from both teachers and parents. Education in that region is still seen as the ticket to success (correctly.)

    My wife’s first two jobs were in very rural SWVA elementary schools, and we’ll fuzz up the number of years ago…:). One thing she reported, though, was a driving ambition on the part of many of the parents that their children succeed in school and have better economic lives. That is and always has been crucial, although some other adult can substitute in the equation.

  3. Kathleen Smith

    Expectations are key. Congratulations Dr. Hurt and Region 7. I have worked with high-poverty populations my entire career. This is exactly what I have hoped for: Equity!!

  4. Virginian78

    I am a graduate of JJ Kelly High School in Wise and a graduate of the University of Virginia in Charlottesville. I had a better education in fundamental reading, writing, and math skills than anyone in my first-year dorm, which was dominated by NOVA and Richmond area guys. That was 40+ years ago, but it reinforces that basic skills taught by dedicated teachers are the best education a young student can have!

    1. LarrytheG

      My K-12 education was at about a half dozen or more schools – normal for kids of parents in the military.

      Needless to say, that “curriculum” is a potpourri of “standards” and “skills”. Ironically, the DOD schools (which are not available at all bases) score quite high on NAEP.

      One thing I’ve encountered in my discussions with my circle of teacher friends is that they are reluctant to hold a child back a grade level unless the kid is behind across the spectrum of subjects. Otherwise, they want to get him/her help in the area where they are having trouble and help them catch up and stay on grade level.

      It’s easy to say hold them back, but it has real consequences, and the tradeoffs are not so easy to be convinced of.

      Having said this, I do NOT believe that very basic reading, writing and math skills are good enough for academic performance in higher ed.

      The skill most lacking is the ability not only to read and write but to competently articulate the concepts encountered in that reading. In other words, critical thinking. The basic skills get you to that doorstep, but being able to critically evaluate and understand concepts is a step up, and it is mandatory for many 21st century jobs and occupations.

      If one wants to determine how they do at this, I invite them to read a medical professional’s write-up of their last encounter with their doctor, especially if there is a medical issue that involves describing the problem and planned therapy, etc. If you can read and understand that, then you can read and understand what it takes to be a medical professional in the 21st century.

      You may not, and may think it’s a “higher level,” but all I’m pointing out is that there are occupations in the 21st century that do require that level of competence if you want a job (and the pay that goes with it) at that level.

      1. Kathleen Smith

        Last week, I tried to explain curriculum. The scope of the SOL is what is supposed to be taught. The written curriculum is the framework in detail, with resources, of what should be taught. It sometimes includes the how. The how might differ for three or four kids in the class. So after all is taught, then it is assessed. In Region 7, we know the scope is good. Then teachers teach. That must be good too, as the assessments show that not only do the non-economically-disadvantaged kids pass, but so do the disadvantaged kids.

        An Asian student could be disabled and economically disadvantaged and an English Language Learner. So that student is in four subgroups.

        Bottom line for me in equity is the difference between haves and have-nots. I can see if there is a difference between students with disabilities who are haves and students with disabilities who are have-nots. Blacks who are haves and Blacks who are have-nots. Etc.

        1. LarrytheG

          re: ” The scope of the SOL is what is supposed to be taught.”

          Yes, and the testing validates how successfully it was taught.

          But what perplexes me is how and why letter grades given in different schools differ with respect to performance, and why letter grading is not standardized across schools and made at least somewhat consistent with SOL performance standards.

          It sort of makes a mockery of the SOLs in that what is done before the tests are given may not actually be effective even though it earns “good” grades, and it all comes apart when the actual SOL testing shows significant disparities between the letter grades and actual SOL performance.

          I cannot understand why ANY school district administration would allow this to go on and not require standardized letter grading consistent with SOL academic standards throughout the curriculum and school year.

          Before the SOLs and standardized testing in general, this is what happened. Each school had its own benchmarks, and no one really knew whether their grading standards were good and tough or bad and loose, until the kid showed up for college and the SATs, and then the truth came out.

          1. Matt Hurt

            Larry, I think Kathleen nailed it in her first post on this article. From my experience, it is pretty plain to see that the expectations, as evidenced by the relationship between final grades and SOL scores, are a function of school/division culture and the belief that the kids can accomplish what the state requires. If a teacher (or the broader school or division) doesn’t believe a student can perform at the level required by the state, they feel that assigning a poor grade is punishing the student for something they cannot control. So the teacher may lower the rigor of the assignments to match the “expected ability” of the students, or use creative gradebook tricks so as not to penalize the student because he/she “lacks ability.” The most nefarious part of all of this is that the lower expectations are more pronounced among some subgroups of students, even within the same school and division.

          2. LarrytheG

            Thanks Matt. I think we are on to something important here. It seems like more than just one teacher might be doing this in a particular school.

            But if that school has to give SOL tests, then doesn’t this strategy come unglued when the kids take the SOLs and their scores are nothing like their classroom grades?

            It seems counter-productive.

            I’m NOW starting to “get” the “expectations” thing, and yes, it’s nefarious. I’m surprised it apparently happens at this scope and scale in some places, jurisdictions, etc.

            I keep pointing out, beyond Region 7 and SWVA, that other places like NoVa and Henrico/Chesterfield have, within their same respective school divisions, some of the higher-scoring schools in the state and, at the same time, some of the lower-scoring schools.

            In other words, schools within the same district do not score similarly on the SOLs. Same district, same district rules and standards (in theory), but radically different outcomes on a per-school basis.

            And this is starting to look like each school has its own “culture,” its own grading standards and expectations, which may not be at all like those of other schools in the very same district.

            Is this right or am I off track?

          3. Matt Hurt

            Right on target!

  5. James Wyatt Whitehead

    I am eager to learn what Southwest Virginia will do to recover from the 15%-plus drop. They fell too, but not as hard as others. My bet is Region 7 might offer some early direction on how to recover lost learning.

  6. tmtfairfax

    Democracy Dies in Darkness. One would think the self-proclaimed national leader in journalism would be leading the way on news like this. But it’s easier to be woke.

    If the goal is to educate kids as best as we can, why not look at best practices and the data?

    1. LarrytheG

      Apparently what SW Region 7 is doing, i.e., collaborating on curriculum and academic performance standards, is something major school districts in other regions are not doing, even within their own school divisions.

      I’ll be honest: I don’t think most parents and others REALLY understand how K-12 education actually “works” these days, especially in lower-performing schools. I’m not sure that is solely the fault of the media.

      Blogs like BR have the potential to drill down and get to the issues… but the Matt Hurt way, not the “cranky” way.
