The “Honesty Gap” Is Dishonest

by David Wojick

Loyal Baconeers know what the so-called “Honesty Gap” is, as we have been discussing it at length. It is heavily featured in a recent report from the Virginia Department of Education. The gap itself is the large difference between the numerical thresholds of “proficiency” used by Virginia and by the National Assessment of Educational Progress (NAEP) on certain tests. NAEP’s threshold is much higher than Virginia’s (and most other states’).

The gap has nothing to do with honesty, just language. Virginia has two tiers of passing scores: “proficient” and “advanced.” NAEP has three: “basic,” “proficient” and “advanced.” What Virginia calls proficient aligns closely with what NAEP calls basic. Comparing Virginia students’ SOL pass rates at the proficient level with their NAEP pass rates at the proficient level is likening apples to oranges.
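
To see how mismatched cut scores manufacture a “gap,” here is a minimal sketch with invented numbers. The score scale and both thresholds below are hypothetical, not actual SOL or NAEP cut scores; the point is only the arithmetic.

```python
# A minimal sketch with invented numbers: the score scale and both
# cut scores are hypothetical, not actual SOL or NAEP values.
import random

random.seed(0)
# Hypothetical test scores for 10,000 students on a 0-500 scale.
scores = [random.gauss(260, 40) for _ in range(10_000)]

low_cut = 230   # a lower "proficient" threshold, akin to Virginia's
high_cut = 300  # a higher "proficient" threshold, akin to NAEP's

low_rate = sum(s >= low_cut for s in scores) / len(scores)
high_rate = sum(s >= high_cut for s in scores) / len(scores)

print(f"pass rate at the lower threshold:  {low_rate:.0%}")   # about 77%
print(f"pass rate at the higher threshold: {high_rate:.0%}")  # about 16%
```

Same students, same scores; only the labels differ. The huge difference in “pass rates” reflects where the bar is set, not how honestly anyone is reporting.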

So, where does “honesty” come in? This is where the story gets interesting, and in my view unpleasant. It turns out that “honesty gap” is hyperbolic rhetoric, spawned by an advocacy group.

The Honesty Gap (HonestyGap.org) is a project of the Collaborative for Student Success, which is funded by a number of prominent left-wing foundations. The project home page immediately hits you with this grandiose pronouncement:

Parents deserve the truth. Historically, states have exaggerated the percent of students who are proficient – as demonstrated by the huge gaps that have existed between state NAEP scores and what states report as their proficiency rate.

This is truly dishonest. There is nothing untruthful, inaccurate or exaggerated in the proficiency ratings of Virginia and the other states. If anything, NAEP is using an extreme definition of proficient, and Virginia’s threshold is the more accurate of the two. Switching to a three-tier system would provide a bit more information, but why stop there? Why not five tiers, or twenty? The point is that there is nothing dishonest about the Virginia system. Calling this an “honesty gap” is a hoax.

“Proficient” turns out to have a fairly wide range of meanings, but after checking six different dictionaries I found that the most common synonym is “competent.” Virginia’s use of the term to cover around 70% of SOL test takers is certainly consistent with this definition. They are competent in the subject, for example reading or math. Those who fail to pass SOL exams are less than competent. This all makes sense.

A more extreme definition, which occurs occasionally, is “expert.” The NAEP definition is up there somewhere as well. Mind you, I seriously doubt that a third of the students are experts, in math for example. That claim would be ridiculous.

In fact, NAEP’s threshold looks to be in a kind of semantic limbo, floating somewhere between competent and expert. As I have written elsewhere, NAEP is a federal black box, so there is no way to tell what they think they are doing. See my article: https://www.cfact.org/2022/05/18/ridiculous-federal-4th-grade-reading-tests/.

In short, there is no honesty gap, and the claim that there is one is itself dishonest. Besides being semantic nonsense, the claim distracts from the real gaps that deserve our attention: the well-documented differences in test scores among various demographic groups. I address this issue in the CFACT article referenced above. Our tests may be semantically biased against low-income students. This is a serious issue.

David Wojick, Ph.D. is an independent analyst working at the intersection of science, technology and policy. He has been on the faculty of Carnegie Mellon University and the staffs of the U.S. Office of Naval Research and the Naval Research Lab. 


Comments

27 responses to “The ‘Honesty Gap’ Is Dishonest”

  1. LarrytheG

    Have you got a view on PISA and how it compares to NAEP?

    seems like NAEP is how we’d compare the states’ academic performance and PISA is how we’d compare countries’ academic performance.

    How do we know how “good” the SOLs are in measuring proficiency as compared to the way Massachusetts or Alabama or Germany measures proficiency?

    1. As far as I know all states use SOLs. It is a federal requirement. But proficiency is too crude a concept to be measured. It is not a scientific concept, just a vague word.

      There is no way to measure what a person knows.

      I gave my view on PISA in a reply to you in the last honesty gap Bacon. The countries way ahead have an education-driven culture, like Finland and Singapore. Among the big polyglot countries the US is right there with the rest, where we belong. Ranking based on three significant figures is ridiculous. We are in good shape.

        1. Fred Costello

        There is no way to measure what a person knows? Do you think that all tests are useless?

          1. Not at all, but tests are just tiny samples of the large population of what a student knows, or can think, etc. As cognitive indicators they are very crude and should be treated as such. Treating them like fuel gauges is ridiculous, and destructive.

          1. Fred Costello

            Your alternative?

      2. LarrytheG

        Maybe you said SOLs for each state and meant standardized tests for each state – ?

        https://en.wikipedia.org/wiki/List_of_state_achievement_tests_in_the_United_States

        1. Every state has education standards. Here is a listing:
          https://www.educationworld.com/standards/state/toc/

          Here is why: “All states and schools will have challenging and clear standards of achievement and accountability for all children, and effective strategies for reaching those standards.” — U.S. Dept. of Education

      3. LarrytheG

        Do you think all states use the same SOLs that Virginia does, not their own?

        What do standardized tests actually measure?

        re: PISA – yes but what does PISA measure?

        and so you don’t think PISA is legitimate either?

        Do you know if PISA is given in the US to select schools like NAEP is?

        re: “we are in good shape”.

        So Youngkin and other critics like JAB are dead wrong about Virginia?

        1. Tests do not measure anything. They are small samples of what the student does and does not know. Samples are not measurements.

          PISA says nothing about Virginia. My point was about PISA, in reply to your question. I do however think that most criticism of the education system is wrong. It assumes the system could be very different when it cannot.

          All the endless talk about “reforming” education is ridiculous. The present system is about as good as it can be, given existing conditions which cannot or will not be changed.

          The education system is like the car (and about as old). Minor improvement is always possible, but it is basically as good as it can be, absent some huge discovery.

          Mind you artificial intelligence might be that discovery for both systems. Self driving cars and “self driving” individual teachers are both emerging. Too soon to tell.

          1. LarrytheG

            So a test for an airline pilot or a doctor does not measure anything?

            re: the education system and change.

            are you talking about Virginia or all systems in the US or all systems worldwide? that such systems cannot change to improve?

          2. What would the units of measurement be?

            I am talking about the US, including Virginia. But I expect it is global as we have been doing education for a long time. It is still a teacher talking to a room full of kids.

            But I did not say what you say I said (as usual). I said that minor improvements are possible. Plus I pointed at one thing that might actually be a major improvement, which is artificial intelligence. Now I have said it twice.

      4. James C. Sherlock

        “We are in good shape”.

        Who is “we”? Does “we” include the students at Henry Marsh III Elementary in Richmond? https://schoolquality.virginia.gov/schools/henry-marsh-iii-elementary#fndtn-desktopTabs-assessments

        “There is no way to measure what a person knows.” So that is your assessment – there is no way to assess? Really?

        If your comment about “good shape” does not include Henry Marsh III and urban public schools across the Commonwealth, why not?

        Until it does, your narrowly pedantic approach to education is a distraction from the truth. But thanks, anyway.

        1. We is America, versus other countries.

          Sampling is not measurement. Assessment is a vague term that includes both plus lots of other things.

        2. Eric the half a troll

          He acknowledged the real problems that do exist. It is not pedantic to expose that the basis of much of the entire Youngkin “report” was (perhaps intentionally)… um, shall we say… in error…

          1. Nancy Naive

            Error. How kind.

        3. Matt Hurt

          Jim, at the state level, “we” can be (and are) in good shape while specific schools are not. For example, Virginia usually ranks in the top 10 in NAEP scores. However, you’re talking about school-level analysis, and you’re right, that school is not in good shape.

          It’s important that we parse this out correctly. For example, if the state was last in the country, “we” are not in good shape, and there would be cause to make some foundational changes to the entire enterprise. In our case, we’re doing pretty good overall (though we should be at the very top of the heap), but we have pockets in need of significant improvement, such as the school you point out.

          We need to differentiate these points so that we fully understand where the problem is to best apply a solution that targets the specific problem and doesn’t throw the baby out with the bath water.

            1. James C. Sherlock

            I agree 100%. You are making my point.

            The schools in Wise County where you live are amazingly effective at teaching poor children. You personally contribute to the success of Region 7 schools.

            Richmond, Petersburg, Danville, and Portsmouth schools are disturbingly ineffective.

            Loudoun County schools in the urban cluster near Dulles Airport are ineffective.

            These wide variations exist all over the state. That is why it frustrates me to continually have the conversation about the “average” school in Virginia.

            The discussion masks the problem in urban and urban cluster schools.

            The evidence of how to teach those kids effectively is available. See Success Academy. It frustrates me that Virginia ignores it.

  2. By the way, I am still working on the bias stuff. NAEP’s sample readings are on things like fishing and innkeepers (an archaic term). Low income people know little of these concepts. How about some stories on dealing with cockroaches or figuring out what food you can afford?

    1. DJRippert

      “Low income people know little of these concepts.”

      Pure elitism. If “Innkeeper” is an archaic term then it is archaic for middle class and wealthy kids too. And fishing? Seriously? Low income kids have some blind spot over the term “fishing”?

      Your comments seem to be more in the realm of excuse making than serious dialog.

      1. Nancy Naive

        In GOPworld, poor kids cross the river, they don’t fish in it.

      2. Middle- and high-income kids are likely to have stayed in an Inn, if not in an inn. Low-income kids, not so much. Likewise, I suspect the fraction of middle- and high-income kids who know people who fish, and the fraction who have fished, are much higher than for low-income inner-city kids.

        These are just the sort of empirical questions that should guide test design. We are dealing with different cultures, speaking different languages. It is a matter of bias due to prior knowledge and experience. The language, knowledge and experiences assumed by the tests could explain much of the test score differences.

        1. Lefty665

          Or they could highlight how poorly we have prepared some kids to thrive in our predominant culture, which is one of the goals of our educational system.

          If you advocate counting cockroaches or beans in a burrito you are playing to pretty low racial stereotypes. Is that really a virtue in testing?

  3. LarrytheG

    https://uploads.disquscdn.com/images/d7bd638f7c8a6688481b61a7e3df706055b85cb064fa3e9912522c91fb92b389.jpg Pretty much each state has its own definition of “proficiency” on its standardized tests (which are required by No Child Left Behind).

    they may or may not align with each other and because of that one cannot compare one state to another – literally apples to oranges.

    NAEP has tried to normalize by mapping each state’s standards to one common standard, which then allows comparisons between states.

    One step further – PISA is an OECD entity that tries to do essentially the same thing on a country-to-country basis.

    All of these players, the states, NAEP and PISA define what they say proficiency is.

    For instance, Virginia looks like this (see chart above).

    and NAEP looks like this:

    https://uploads.disquscdn.com/images/3c4afc5b85bfe0dcac3c4850c08124bdf3a6ad6df709050b87549c29fc8dabf7.jpg

    https://nces.ed.gov/nationsreportcard/reading/achieve.aspx

    there is no “gap”; they don’t measure proficiency exactly the same way.

    but after all is said and done, NAEP does rank Virginia in the top 10 for 4th-grade reading – even though it has dropped a bit, most states have also dropped a bit, not just Virginia.

    https://uploads.disquscdn.com/images/28a4c0508c8a9915a7fa52a037a2058d42047a70f29df748d2b3819a4e430a3b.jpg

    What IS also relevant, but not well reported, is what Sherlock alludes to: despite the high ranking, Virginia still has a lot of low-performing schools, as well as many children who do not meet even basic proficiency standards. This is essentially hidden by reporting a statewide average rather than a median score and a histogram of the distribution, which would much more clearly show the problem Virginia (and other states) has – and that data DOES exist but is apparently not reported.

    Virginia’s higher-scoring school districts pull up our average and essentially mask the lower scores of other districts.
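
    A minimal sketch with made-up division pass rates (none of these are real Virginia numbers) shows how a single statewide figure can look respectable while the distribution hides a badly lagging tail:

    ```python
    # A minimal sketch with invented pass rates (%) for ten hypothetical
    # school divisions: none of these numbers are real Virginia data.
    import statistics

    division_pass_rates = [92, 90, 88, 87, 85, 84, 82, 55, 48, 41]

    print(f"mean:   {statistics.mean(division_pass_rates):.1f}")    # 75.2
    print(f"median: {statistics.median(division_pass_rates):.1f}")  # 84.5

    # The single statewide figure hides the three divisions in crisis:
    print("lowest three:", sorted(division_pass_rates)[:3])  # [41, 48, 55]
    ```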

    1. DJRippert

      ” … literally apples to oranges”

      No, literally comparing apples to oranges would involve evaluating two different types of fruit.

      1. LarrytheG

        It’s sorta like comparing how JD Power evaluates a car versus how Consumer Reports might, or USNWR rating colleges versus, say, Niche or the College Scorecard.

        On NAEP versus PISA… on K-12 reading proficiency, etc., etc…

        so perhaps one might consider all of them “apples”?

  4. Lefty665

    You raise some good questions. But what it seems you have done in this Bacon and your linked article is to establish a correlation between Virginia SOL Proficient reading and NAEP Basic reading, not what either reliably measures or why one measure of proficiency is more valid than the other.

    Do either of them measure functional literacy that is a vital tool for daily living and conducive to a lifetime thriving in our culture, and how is that established?

    Can we use the tests as a blunt instrument? Is it enough to know that kids who do not pass either test are destined for marginal lives? Can we use that to determine who needs intensive reading training/education?
