SOL Testing 101

This is the first in a series of four articles explaining Virginia’s Standards of Learning assessments, showcasing school districts that have demonstrated success despite significant challenges, providing context for the 2021 assessment results, and expressing concerns about recent General Assembly and Board of Education actions that could have significant negative unintended consequences. Given the crucial necessity of producing well-educated graduates, it is vitally important that Virginia citizens understand how the assessments work. — Matt Hurt

Virginia has administered Standards of Learning (SOL) assessments for more than twenty years. Over that period, many changes have taken place through actions of the General Assembly and the Virginia Board of Education. According to the Virginia Department of Education, the purpose of SOL testing is to “inform parents and communities about whether students — as individuals and collectively — are meeting the commonwealth’s expectations for achievement in English, mathematics, science and history. SOL tests allow the state Board of Education to identify schools that need assistance and support. The assessments also provide an objective means for measuring achievement gaps between student subgroups and for determining the progress of schools, divisions and the state toward closing these gaps.”

The SOL tests measure skills that are foundational to students’ success in future academic endeavors. I have yet to find anyone who could successfully argue that these skills are not valuable or that student success in these skills is not desired. If students cannot read, interpret, and understand written text, or reason through mathematical concepts, they will not be able to access higher-level courses and will be less well prepared for post-secondary education.

In this series, I focus primarily upon the critical reading and math SOLs. The history and science tests are less skill-based and rely more on assessing students’ ability to recall discrete facts.

The tests are criterion-referenced assessments designed to measure student proficiency relative to Virginia’s Standards of Learning (SOLs). These standards were first adopted by the Virginia Board of Education in 1995, and in 1997 the Board approved changes to the Standards of Accreditation that linked school accountability to the outcomes of assessments based on the SOLs. Prior to that time, the state testing programs consisted primarily of nationally normed assessments that measured a student’s achievement relative to his or her peer group from across the nation.

Virginia has identified a progression of reading and math skills from kindergarten through 12th grade. This progression assumes students have no academic background upon entering kindergarten, and begins that grade with letters, sounds, and numbers. Each year’s skills build upon the learning of the previous year. If competent instruction is provided, it is reasonable to expect students to master each grade’s skills within the course of one school year. If students fail to achieve proficiency, however, they fall into the at-risk category because they lack the prerequisite skills necessary to be successful in the next grade.

According to Virginia Code § 22.1-253.13:1, the Board of Education is required to review the Standards of Learning every seven years. During this process, changes are typically suggested to and approved by the Board. When this happens, the SOL tests must be updated to correspond to the changes in the standards.
As part of this process, the Board also must approve new cut scores (the number of questions that must be answered correctly to earn a pass proficient or pass advanced score). The Board accepts feedback from groups of teachers assembled for this purpose, as well as from the state superintendent, then makes the decision. These changes in standards and cut scores have impacted pass rates, both negatively and positively.

Early on, high school students who failed with a score of 375-399 (400 is passing) were afforded the opportunity to retake the test, as SOL tests are one of the criteria used to earn a high school diploma. In 2015, elementary and middle school students who failed within the same score range were also allowed an expedited retake. With the implementation of more rigorous math standards in 2012 and more rigorous reading standards in 2013, the SOL tests also included Technology Enhanced Items (TEIs) for the first time. These items are not straightforward multiple-choice questions; instead, students may be required to select all of the correct answer choices, drag labels to the correct position on a graphic, manipulate graphs, and so on. TEIs are typically considered more difficult than traditional multiple-choice questions.
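
To make the retake rules concrete, here is a minimal sketch (in Python) of how a scaled score maps to an outcome. The 400 passing score and the 375-399 expedited-retake band come straight from the rules described above; the 500 cut for pass advanced is an assumption added for illustration, not a figure taken from this article.

    # A minimal sketch, not VDOE's actual scoring logic.
    PASS_ADVANCED_CUT = 500  # assumed for illustration; not stated above
    PASS_CUT = 400           # per the article: 400 is passing
    RETAKE_FLOOR = 375       # per the article: 375-399 allows a retake

    def classify(scaled_score):
        """Return the outcome category for one scaled SOL score."""
        if scaled_score >= PASS_ADVANCED_CUT:
            return "pass advanced"
        if scaled_score >= PASS_CUT:
            return "pass proficient"
        if scaled_score >= RETAKE_FLOOR:
            return "fail, eligible for expedited retake"
        return "fail"

    for score in (350, 380, 399, 400, 510):
        print(score, "->", classify(score))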

Figure 1 and Figure 2 below display how changes in standards, cut scores, and testing protocols have impacted SOL pass rates over time.

The first major dip in scores indicated in these charts reflects the implementation of the more rigorous standards and TEIs in math (2012) and reading (2013). There was a significant increase in pass rates in 2015 that corresponded with the advent of expedited retakes at the elementary and middle school levels. In 2019, there was another significant increase in SOL pass rates, which corresponded with a new SOL test that had lower cut scores than the previous test. The decline in math pass rates in 2021 was much more significant than the decline in reading, in part because of the lower cut scores for the new reading tests that were first administered that year. Of course, our educational response to the COVID pandemic played a big role in the 2021 outcomes, but that will be discussed in the third article of this series.

Figure 1
Figure 2

Given that Virginia’s SOL tests are criterion-referenced and that cut scores are, to a certain degree, arbitrarily decided by the Board of Education, understanding these outcomes requires context.

Figures 3 through 6 compare Virginia SOL test pass rates with Virginia’s results on the National Assessment of Educational Progress (NAEP). NAEP tests are administered nationally every two years and can serve as a reference point for evaluating our outcomes relative to the rest of the country. These charts show a declining trend in state SOL results (caused by the implementation of more rigorous state tests) that corresponds with increases in NAEP results. This seems to support the notion that higher expectations (the increased rigor of SOL tests) yield higher performance (relative to the NAEP assessment). The topic of expectations and results will be explored further in the next article.


Matt Hurt is director of the Comprehensive Instructional Program based in Wise.



Comments

14 responses to “SOL Testing 101”

  1. LarrytheG

    Matt, thanks for your informative perspectives!

    I’m a little confused by the charts – specifically, what the y-axis means.

    The top line is the SOL pass rate but what is the bottom NAEP line?
    Is it the percent of kids who achieve “proficiency” – according to NAEP criteria?

    And that goes to what skills and capabilities are being taught – and tested?

    Many folks see “Reading” and “Math,” but really each contains a series of skills and capabilities that students must demonstrate to get correct answers.

    Right?

    How does NAEP criteria for “proficiency” compare and contrast with SOL achievement?

    1. Kathleen Smith

      Matt, good job. Larry, NAEP is what most say 4th graders should know and be able to do nationally, while the SOL tests assess what the SOLs say 4th graders must know and be able to do. The difference is really the scope of what was assessed.

      Scope + sequence = curriculum.

      Scope is weighted and outlined in the SOL blueprints, so sequence matters (how much time is needed to teach fractions vs. geometry). If 20/30 questions on the SOL assessment are fractions and 2/30 are geometry, you can kind of guess the sequence.

      We often forget that a finite 180 days per year is a factor in sequence. You can add scope, but you can’t change the total time allowed to teach.

      Basically NAEP is a good indicator of what kids should know and be able to do. If SOL curriculum has the right scope and sequence in determining what kids should know and be able to do, then it would follow that high SOL scores should equate to high NAEP scores.

      One exception to note: NAEP is a percentile. So 50 percent of the nation did better than you and 50 percent did worse, and plus or minus one standard deviation is good (a statistical range that would include about 66 percent of the test takers). The bell curve.

      SOL pass rates are not bell-curve driven. They are driven by cut scores. That’s Matt’s point: if the Board lowers the cut scores, more pass; if it raises them, more fail.
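
      To put rough numbers on that, here is a minimal Python sketch. The scores are made up (a hypothetical cohort drawn around a mean of 420), so only the direction of the effect matters: the same distribution of scores produces a different pass rate as the cut score moves.

        # Toy illustration: one fixed set of scaled scores, three cut scores.
        import random

        random.seed(1)
        scores = [random.gauss(420, 40) for _ in range(1000)]  # hypothetical cohort

        def pass_rate(scores, cut):
            # Share of scores at or above the cut, as a percentage.
            return 100 * sum(s >= cut for s in scores) / len(scores)

        for cut in (380, 400, 420):
            print("cut score", cut, "-> pass rate", round(pass_rate(scores, cut), 1), "%")

      Lowering the cut raises the pass rate and raising it lowers the pass rate, with no change at all in what the students actually know.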

      1. LarrytheG

        I’ve been in and out today and so has BR apparently.

        Thanks much for sharing more knowledge that I am learning from.

        On the NAEP – on scoring , Virginia scores high compared to other states – right?

        https://www.nationsreportcard.gov/profiles/stateprofile?chort=1&sub=MAT&sj=&sfj=NP&st=MN&year=2019R3

        but I found this interesting:

        https://nces.ed.gov/nationsreportcard/subject/publications/studies/pdf/2021036.pdf

        I THINK what this means is that Virginia’s cut scores are very low – dead last or close to it.

        wanna take a shot at explaining it?

        Is there a histogram of Virginia SOL scores for the entire cohort that would show the range of scores? (not sure I articulated this well).

          1. Kathleen Smith

            Larry, the article shows how they attempted to compare a bell-curve percentile to a state pass rate (pass/fail). Actually, according to your chart above, we were a little above the average. Think of average as about 66% scoring above basic. At or above proficient is a higher bar. If 66% are average, then you have 34% left: 17% at the low end and 17% at the high end. Actually, we look okay with 38% at the high end. What this shows is that we expect a little more of what kids should know and be able to do than MOST states. Massachusetts may or may not expect more, but its children are able to do more than those in most states (at least this data shows).
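
            For what it’s worth, the exact normal-distribution numbers are close to that back-of-the-envelope split: about 68 percent fall within one standard deviation of the mean, leaving roughly 16 percent in each tail. A quick check in Python:

              from statistics import NormalDist

              nd = NormalDist()  # standard normal: mean 0, SD 1
              within_1sd = nd.cdf(1) - nd.cdf(-1)
              upper_tail = 1 - nd.cdf(1)

              print(f"within +/- 1 SD: {within_1sd:.1%}")  # about 68.3%
              print(f"above +1 SD: {upper_tail:.1%}")      # about 15.9%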

  2. Here’s a link to some “released” Virginia SOL tests in case anyone wants to give them a try:

    https://www.doe.virginia.gov/testing/sol/released_tests/index.shtml

  3. Rafaelo

    Tocqueville warned of the modern penchant to think in ‘groups,’ rather than to consider each individual on their own. Is that problem built into the very concept of SOLs?

    Hairs on the back of my neck rise at: “The assessments also provide an objective means for measuring achievement gaps between student subgroups and for determining the progress of schools, divisions and the state toward closing these gaps.”

    When I was in school, each individual student got measured against other students in that class, learning from that teacher. Students were not burdened with the “achievement gaps” of a statewide “subgroup.” Collective guilt?

    Worse, how do schools close achievement gaps between subgroups? The Woke say systemic racism causes these gaps. Schools cannot eliminate systemic racism. It’s the whole system: housing and jobs and everything you see on TV, according to the Woke. Schools can’t fix that.

    The Asian subgroup, once the target of systemic racism, manages to preserve an achievement gap with all the others. How do we dumb them down?

    Let’s reconsider subgroups, and the value of statewide testing. How about we concentrate resources on teaching each individual student?

    1. LarrytheG

      FYI :

      “Proficiency Gap Groups – Student subgroups used to identify Focus schools under Virginia’s flexibility waiver.

      Proficiency Gap Group 1 – Students with disabilities, English language learners and economically disadvantaged students, regardless of race and ethnicity (unduplicated)
      Proficiency Gap Group 2 – African-American students, not of Hispanic origin, including those also counted in Proficiency Gap Group 1
      Proficiency Gap Group 3 – Hispanic students, of one or more races, including those also counted in Proficiency Gap Group 1”

      https://www.doe.virginia.gov/statistics_reports/school_report_card/accountability_terminology.shtml

      “Under NCLB, there are four different subgroups: race/ethnicity, socioeconomic status, limited English proficiency, and students with disabilities. If any one of these subgroups fails to make adequate yearly progress (AYP) under NCLB, the entire school fails.”

      https://eric.ed.gov/?id=EJ864096

    2. Kathleen Smith

      One of the problems of looking at gaps is the assumption that because I am in a certain subgroup, I did poorly, when in fact, I could have scored the highest score possible.

      On the other hand, we give lots and lots of money through Title I programs for improving outcomes for the economically disadvantaged, and additional funding to support students with disabilities. If we don’t look at the improvements in those groups, then we could be wasting our money. Look up the “black hole of Title I funding” in the late ’90s and early this century. This led to NCLB and the subgroup data.

      Teachers should look at individual growth. Policy makers, not so much. Hearing one story about making a difference in one person’s life is important, but making a difference in lots of lives is even greater.

      1. LarrytheG

        Pretty sure the subgroups came from the original No Child Left Behind law…

    3. Matt Hurt

      Here’s the problem with that. There is objective data which demonstrates that educators have higher expectations of kids in one subgroup than others. Using the subgroup data provides us a good jumping off point to then drill down to individual students. Unless we work effectively with individual students, we can’t move the needle on the subgroup. The next article in this series will touch on this a little bit.

      1. LarrytheG

        Glad you’re “back” here… I’m paying attention to your thoughts!

        Yes on the subgroups. But I did want to point out that these subgroups are not a Virginia-only thing nor a “woke” thing. If I recall, they were part of the No Child Left Behind legislation, which passed Congress with overwhelming bipartisan support in 2001 and was signed into law by President George W. Bush on Jan. 8, 2002.

        AND it mandated standardized testing in all states.

        Virginia’s SOLs got started about that time; dunno if it was chicken/egg or what, but prior to that there was no real standardized testing nor reporting of results, if I recall correctly.

        I just think some of these facts are getting mashed up and conflated with “woke” and VDOE.

Leave a Reply