College Graduation Rates and SAT Scores

This table shows the math and verbal SAT scores for Virginia’s public universities, along with college graduation rates. Table credit: Cranky’s Blog

John Butcher, of Cranky’s Blog fame, is turning his analytical gaze from K-12 schools to higher education. In his latest post, he explores the strong correlation between a Virginia public institution’s six-year graduation rate and the average SAT scores of its student body, as seen in the table to the left and the plotted chart below of median SAT math scores. (See his post for the chart of English SAT scores.)

The commentary in his post is sparse, but he makes interesting points in his email correspondence with me:

UVa and Mary&Bill both take very smart kids and graduate nearly all. The middle-tier colleges take less bright kids and graduate fewer. VCU takes still less bright kids and graduates still fewer. All these sit pretty well on the fitted line, except that JMU under-performs on the math datum.

Why should schools taking less able students graduate a smaller proportion? If they were doing their jobs, they would adapt to their clientèle and give them degrees. Doubtless the market would discount those degrees (surely it does already as to the kids who graduate now). But we wouldn’t see the kids being sloughed off.

VSU and Longwood both over-perform, albeit not by nearly enough. But they are doing something better, if not entirely right. What is it?

Image source: Cranky’s Blog

Six-year college graduation rates are the standard performance metric for U.S. colleges and universities. Four years to graduation is the ideal. Six years contain a lot of slack and, to my mind, represent a shamefully low hurdle. The inability of a student to graduate within six years constitutes a total failure, on the student’s part, the university’s part, or both. It represents a misallocation of resources by the higher ed system and a personal tragedy for the student, who typically accumulates thousands of dollars in loans and has no sheepskin to show for it.

We need to better understand the key variables affecting six-year graduation rates.

John gets the conversation rolling by noting that the odds are stacked in favor of smarter students (with smarts measured by SAT scores). Indeed, SAT scores account for almost 80% of the variation in the graduation rate. Smarter kids come disproportionately from well-off families that raise them in an environment that rewards educational achievement and also have the means to support them financially while in school. These students can spend more time studying and less time worrying how to pay tuition, fees, room, board and incidentals.
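As a side note on the statistic itself: the “almost 80% of the variation” figure is an R², i.e. the square of the Pearson correlation between median SAT and graduation rate. A minimal sketch of that computation, using hypothetical SAT/graduation pairs rather than the actual figures from the table:

```python
# Sketch: R^2 ("share of variation explained") is the square of the
# Pearson correlation between median SAT and six-year graduation rate.
# The data points below are hypothetical placeholders, NOT the actual
# SCHEV / Cranky's Blog figures.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical institutions: (median SAT math, six-year grad rate %)
sat = [700, 660, 620, 590, 560, 530, 500]
grad = [94, 88, 74, 70, 62, 50, 45]

r = pearson_r(sat, grad)
r_squared = r ** 2  # fraction of grad-rate variation tracked by SAT
print(f"r = {r:.3f}, R^2 = {r_squared:.3f}")
```

Run against the real table, this same arithmetic is all that sits behind the fitted line in the plotted chart.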

But the correlation is not perfect. Some institutions do better with the raw material (students) they are given than others, as can be seen by the squares above and below the plotted line. (Old Dominion University may be an outlier because its student population contains a high percentage of military personnel who leave when they rotate to an assignment in another location.)

John asks if institutions are gearing their curriculum and academic standards toward the students in their student body, as opposed, perhaps, to the students they wished they had. That hypothesis is worth pursuing.

Here’s another: Could the guidance and support given students play a role in college graduation rates?

In 2011 the University of Virginia performed slightly above expectations in six-year graduation rates, but only slightly. As part of its strategic plan (the Cornerstone Plan) instituted in 2013, UVa is pioneering the concept of “total advising,” which integrates academic advising, career advising, and coaching. One would hope that the program will show higher six-year graduation rates for the class of 2017.

Likewise, it would be interesting to see what the Virginia Military Institute, Christopher Newport University, and the University of Mary Washington — all of which performed above expectations — are doing differently from other universities.

One way or another, we need to figure out how to help students graduate on time and on budget.


32 responses to “College Graduation Rates and SAT Scores”

  1. Here’s a question.

    What would the average SAT score be in Virginia if all the courses the kids took were ONLY those paid for by the State – the SOQ/SOLs?

    Bonus Question – (and maybe Cranky can work his magic on this):

    if you SUBTRACT the SOQ spending from each of the Virginia schools and just count what the locality spends over and above their mandated SOQ match – what would the SAT scores look like? Would the schools that spend more money locally over and above the SOQs show a higher SAT score correlation?

    It’s my suspicion that much of what local school systems spend – over and above their SOQ match – goes for courses and studies that help boost SAT scores… things like AP, dual enrollment, baccalaureate and Governor’s School – all of which cost fairly substantial dollars.

  2. A few initial thoughts:

    First, the graduation rates reflect those that graduate from the institution in that time frame. It excludes those who transfer and may graduate elsewhere. For W&M and UVA at least, almost all that they accept end up graduating from somewhere. So I’m not sure that necessarily constitutes failure as Cranky suggests. At other schools on the list, however, it may be that many never do complete college.

    Second, some of the schools here are essentially doing remedial education. They are taking students who are not, in any reasonable sense, prepared for college. Remedial education adds time. Are we saying these colleges should not do that as part of their mission? (I am not answering that here, I am just pointing it out.)

    Third, you have to dig through the SCHEV data to find it, but all schools do not receive the same amount per student from the state. UVA gets significantly more per student than JMU, for example. Schools like UVA and W&M also take students coming from significantly higher income brackets than say ODU (again, this is in SCHEV data). It would almost be extraordinary if UVA didn’t graduate at a higher rate than ODU, given that UVA takes better students from higher income families and gets more money to educate them. It is difficult to say which one is really doing a better job on a “bang for buck” basis given their different situations.

  3. re: ” Second, some of the schools here are essentially doing remedial education. They are taking students who are not, in any reasonable sense, prepared for college. Remedial education adds time. Are we saying these colleges should not do that as part of their mission?”

    more than time – money – and for many, going into debt to pay college prices for remedial education as a result of what happened, or more precisely what did not happen, in their high school.

    Bacon says he wants more transparency. I do too, but not for the things he wants. I want to know WHAT high schools are delivering supposedly college-prepared students that are not… what schools and school districts? What are the numbers on a per-school and per-district basis – in Virginia?

    Further, I want to know for each college how many they accept rather than reject and then subsequently say are not ready. What tests are given by colleges to assure that incoming Freshmen ARE ready and can proceed BEFORE they get “accepted”?

    what is the average SAT score for students that are accepted but then have to be remediated on a per high school basis?

    what is the average SAT score at EACH high school and for the district?

    Do districts that spend more local money actually deliver higher scoring SAT scores? Is there a correlation between higher local funding for courses over and above the SOQ/SOLs and SAT?

    Conversely do districts that don’t spend much local money deliver lower scoring SAT scores?

    How about within each school district for their individual high schools?

    do students at each of those high schools have equal opportunity with regard to college-prep courses, AP, dual enrollment, baccalaureate, Governor’s School, etc., or are there disparities in what courses are offered that vary by school?
    what is the median SAT score PER HIGH SCHOOL?

    what are high schools doing to ensure that kids headed to college are ready and will not have to be remediated? what percent at each high school end up having to be remediated in college?

    Finally – why do we WAIT for the Feds to require more transparency in these data instead of Virginia itself requiring it especially when we seem to have a love/hate relationship with the Feds when we blame them for requiring data then blame them again for not requiring more?

    Don’t Virginia and its high schools have a responsibility to fairly and accurately report – on a per-high-school basis – data about their grads headed to college?

    When we send kids from high school to college who have to be remediated, we have wasted money TWICE. First, we have apparently failed at the high school level even though we’ve spent money, and then more money has to be spent at the college level to get them up to spec – and even then there appears to be a high fail rate, which raises the question back at the high school level: which high schools are having a high failure rate in college?

    Jim thinks it’s due to bad parents and bad teachers and disruptive kids.

    I want to know why – if that is true – it varies by school within the same district. Do we actually have schools where there are more bad parents, more bad teachers and more disruptive kids, and that’s why those schools “fail”? If the district administrators KNOW that a given school in their district has bad teachers, why do they not address it? How do you define what a “bad parent” is in high school? Do you blame the parents in high school if the kid is actually headed for college but woefully unprepared? Whose fault is that?

    so yes… I agree we need more transparency – but for different reasons than some… I want to know why some high schools are top notch and others dismal failures in the SAME school district. One presumes the administrators in such districts decide what resources to put in what schools and know whether there are or are not disparities in staffing and course offerings.

    My thinking is that the entire College SAT conundrum – is a direct reflection of the high schools where these unprepared kids come from.

  4. From the beginning the SAT has been advertised as a test that predicted success at college level work, so it is nice to see that demonstrated. That gets a bit of a “Duh”. Ditto to Izzo on the fact that transfers out often graduate somewhere else. Izzo is also right that many of the schools with challenged students take longer because of remediation. Missing from the discussion is the role played by money, because six years doesn’t always mean the student paid for 12 semesters – often there are interruptions for financial, health or family crisis reasons.

    And please understand that the screaming for transparency, transparency, transparency means the people running a school (or a hospital) spend all their time doing reports, reports, reports and then the whining will be about a fatter administration. Plus it leads to more litigation, litigation, litigation.

    Finally, UVA and the others who are intensifying their advising structure for students at risk of failure will find it does move the needle. It has elsewhere.

  5. re: “screaming for transparency” versus “blame mentality and related”

    It’s hard to use benchmarks like SAT as metrics for performance without also looking at other – already collected – metrics that are not released to the public but are released to government and industry… essentially kept from the public.

    so let me give you an example for a public school in Va:

    State Assessment Proficiency
    84 %
    Composite SAT/ACT Score
    SAT: 1140
    ACT: 28

    Graduation Rate
    90 %

    AP Enrollment
    40.2 %
    AP Test Pass Rate
    70.2 %

    Student-Teacher Ratio
    18 : 1

    Now this data is already collected AND made available to the govt, but is not available at DOE nor on that school’s or school district’s website.


    here’s another in the same school district – note the extreme differences in SAT, ACT and AP participation and scores.

    State Assessment Proficiency
    80 %
    Composite SAT/ACT Score
    SAT: 1010
    ACT: 23

    Graduation Rate
    81 %

    AP Enrollment
    28.2 %
    AP Test Pass Rate
    34.2 %

    Student-Teacher Ratio
    15 : 1

    why is this data – already collected – not provided by the State nor the School?

    and why does one school in the district have an SAT score of 1140 while the other one 1010, one with an AP pass rate of 70% and the other 34%?

    are the students at one school dumber or have lower IQs, or is there a difference in staffing or availability of remedial help, etc.?

    these issues cascade into the colleges… where students have to be re-taught high school by college-level instructors at college-level prices, paid for by taxpayers and student debt – and these kids came from schools that presumably offered AP courses, dual enrollment, and remediation BEFORE entry to college… so how did this happen and how do we do better?

    how do we do better – if we don’t want to know the data that already exists?

    I’m not taking a blame position on this – I think that kind of orientation is destructive and counter-productive – but if we are unwilling to address known issues by hiding data, then that only emboldens those who do seek to tear down the public schools.

    at some point – we have to address the issues or the torch and burn folks are going to prevail.

  6. I also want to add this to the discussion because it’s been part of the school discussion and I fear it may not appear on these pages unless I put it here:

    ” State Sen. Bill Stanley, R-Moneta, has a trio of bills aimed at reducing school suspensions and expulsions. One would cut maximum long-term suspensions from 364 days to 45 school days. Another would prohibit suspensions and expulsions for disruptive behavior unless that behavior causes or threatens injury. The third would prohibit pre-school and elementary school suspensions, except for drug and firearm offenses, and some criminal acts.

    A 2016 report by the Legal Aid Justice Center found that Virginia public schools issued 126,000 suspensions over the 2014-15 school year and that 20 percent of them were for elementary school students. Roughly 16,000 of those suspensions affected children from pre-K through third grade, the report found.”

    and again – should this data be more granular? Should people know which schools are seeing these kinds of issues? Sure, you can see it on a per-school basis on the “report cards,” but can you see it for all the schools in one school district, so we know if it is a generalized county-wide issue or specific to demographics or geography, etc.?

    I do applaud Mr. Stanley who is a GOP.. and would be curious to know more about his motivations as opposed to the motivations of the “throw them out” folks.

  7. “Smarter kids” is a euphemism for the detested IQ. The SAT has been measuring IQ, although the tests are being modified to measure achievement more. So the graphs show that students with a higher IQ are more likely to graduate from college. Isn’t that obvious? The correlation coefficient between SAT and SOL (in math and reading) is 0.98, so students with a high IQ also are higher in achievement. Isn’t that obvious? Our ideology that all people are created not only equal in dignity but also equal in ability blinds us to the data that shows students from wealthy parents, who probably have high IQs, inherit their high IQs, just as athletes inherit their abilities from their parents. Society consists of people with various abilities, each person contributing (or not) according to his particular skill. The six-year criterion is a sign that the high schools are not educating. The correlation between IQ and graduation rates is a sign that all colleges have somewhat the same education standards.

  8. re: ” The correlation coefficient between SAT and SOL (in math and reading) is 0.98, ”

    I’d like to see the source of that claim… is there actual data or is it just a belief?

    I also dispute the idea that we’re looking for equal outcomes. We’re not – never were… that’s why we have a wide array of non-college career paths, if the K-12 schools provide them in equal measure to college prep.

    we recognize that not all are equal in talent or other attributes, but the goal is to be all you can be… to meet your potential – and THAT’s the GOAL of public education – not a contest to see who gets to college and who becomes a loser.

    we’ve got a “throw out the baby with the bath water” mentality when we essentially say that if you are not high-dollar SAT college material – you’re not entitled to resources to help you be what you can be as a productive person who is employable.

    Too many K-12 districts prioritize their local money to focus on college attendance – even for those who probably ought not to go… and we have tools like CLEP to help guide kids – and parents – in making realistic choices.

    There is no shame in Community College or even Vocational School if you come out of it as an employable worker who can support themselves… which is, in my view, what the goal of K-12 education ought to be – not a place to wash out those not fit for a 4-yr college but perfectly fit and trainable for a real job. We’ve made non-college a mark of failure… and we basically preordain failure by talking about low IQs and other foolishness.

  9. where are the IQs in his work, Izzo?

    ” The correlation coefficient between SAT and SOL (in math and reading) is 0.98, so students with a high IQ also are higher in achievement.”

    and Izzo – citing the titles of publications for data you are claiming to be using is bogus. You need explicit references to the data you did use:

    1 – where is the data?
    2 – again, what pages?
    3 – once again, show me the data you accessed
    4 – what exactly are you actually comparing? where is the data you are citing?

    Izzo – this is how fake studies are done …

    it’s fairly typical of think tanks and other sites that claim that data proves something …

    they cite the publication and that’s it… there is no actual data that backs up what they are claiming…

    here’s one for you to try that the doc claims as a reference:


    tell me specifically what it is that he is claiming as supporting data…

    by the way – it’s a no-brainer that tests measuring academic performance are likely to correlate, but what does that have to do with IQ, and where is the data that shows a correlation between IQ and academic performance on ANY test?

  10. Izzo – where are the IQ data references in his work?

    ” The correlation coefficient between SAT and SOL (in math and reading) is 0.98, so students with a high IQ also are higher in achievement.”

    and even though he gives other references – none of them actually point to specific data – they instead point to titles of publications and websites… not the specific data he claims to be using…

  11. Larry,

    I’ll let Fred comment on sources and methodology.

    There are so many issues here it is difficult to know where to start. I think one of the things Fred was commenting on is income is correlated with student quality which is correlated with graduation rates, so the line Cranky plots is “obvious”.

    What would be more remarkable is an institution that makes its education available to less affluent students in an affordable way and graduates them to good-paying jobs, thereby increasing social mobility. There are rankings that attempt to measure this. The Social Mobility Index has an interesting approach and data.

    There are several things that jump out in the SMI rankings: 1) VMI, W&M, and UVA are in the bottom 25 of all schools in the country for enrolling low-income students (and W&L is last) — note that these are the top 3 schools in Cranky’s graduation rate list; 2) these schools are also among the highest in giving their Pell grants (which are intended to make higher ed accessible for lower income groups) to relatively wealthy students (top half) — W&L gives 62%, VMI 57%, UVA 41%, and W&M 30%. In contrast, Hampton U gives 0% to the top half; 3) the top 5 public schools in the state in the rankings are GMU, Norfolk State, Old Dominion, Virginia State, and Radford — which, interestingly, are all “greyed out” (afterthoughts?) in Cranky’s plot. Those are the ones SMI believes are actually doing a better job of enabling social mobility.

    I think Jim is going to have to wrestle with how to frame this new focus on higher education. Blogs like Cranky’s might suggest that the schools that get diamonds and polish them a little are producing the most “bang for the buck”. But the issue is complex, and ultimately goes to the role of higher education and the role of the state.

    • Izzo, the overview of the Social Mobility Index makes some interesting observations:

      Pell grants. “Despite its widespread promotion as a marker for inclusiveness, Pell grant participation is, in fact, a very poor indicator of campus economic diversity. … Pell Grants are not consistently given to students from disadvantaged family economic backgrounds. We broke new ground in the 2015 SMI by revealing, for each campus, the minimum percentage of its Pell Grant recipients who come from families making more than the median family income. The data show that at many campuses, over half of their Pell Grant recipients are from the richest half of our nation’s population. Further, as reported by the US Dept of Education, deductions and exclusions in the Pell Grant formula now permit some families making over $100,000 per year to receive Pell Grant awards.”

      I had no idea. I really thought the grants were an indicator of socio-economic diversity.

      Pricing opacity. “A key factor … suppressing college participation of the disadvantaged is its pricing opacity. Regardless of how skillfully and patiently an applicant navigates the financial aid maze, the level of any given university’s institutional aid cannot be known by him/her in advance. The university must first assemble admission offers to its freshman class, wait for acceptances of those offers, and, depending on the need mix of the students, parcel out available funding as award packages. To understand the suppression effect of this byzantine process, imagine what would happen to sales of cars if their price tags had huge, unaffordable numbers that could only be reduced if potential buyers were willing to apply for the right to purchase, fill out more forms to demonstrate financial need, and then wait months for possible acceptance/denial as a customer.”

      Hmmm. Opaque prices…. reminds me of the health care industry.

      Slavish adherence to rankings. “One egregious example of policy sycophancy to the periodical rankings has a noted university mandating no class sizes beyond 19 despite a student body of 16,000 (19 is a cutoff for the periodical in terms of evidencing “small class” sizes). Not only is there no research to support that 19 students vs 20 vs 30 in a college setting carries any impact on learning outcomes, such arbitrary measures clearly increase costs and jeopardize accessibility.”


    • P.S. I didn’t read Cranky the same way you did. His starting point was that SAT scores and graduation rates were strongly correlated. He focused on institutions that performed above or below expectations.

      • isn’t that kind of a “Duh” concept? At the college level why would you NOT think the higher the SAT, the higher the graduation rate – at institutions that are taking the higher scoring SATs?

        Is Cranky trying to correlate lower scores with specific institutions or what? I don’t see what you’d learn at that level unless you want to speculate that SAT has a similar problem to QCA “creep”…

  12. Well, the FIRST thing that should be done is to establish a testing regime that identifies kids who are NOT ready for college BEFORE they are accepted and enroll… and have to be remediated once enrolled. Why are the colleges doing this in the first place? Are they accepting kids who have acceptable SAT scores and QCAs who are then found to be not capable of entry-level college work?

    If that is the case – then this is a high school issue where those kids are not being vetted properly in the first place. There should be a CLEP-type testing regime that kids have to pass in order to demonstrate competence, because obviously QCA and apparently SAT clearly are not sufficient.

    and what legitimate role does any focus on IQ play in any of this in the first place if a kid passes the SOLs and graduates… even if they don’t go to a 4-year… they can go to a 2-year or get an occupational certificate or similar?

    I do taxes for mentally handicapped people who have JOBS… earn money… come have their taxes done… so don’t traffic in this bogus IQ stuff to start with. It has no place in serious discussions of education that fits the capabilities of the kid… even if they are not a genius, most are more than capable of doing typical non-college work in the economy.

    And in terms of data – I have yet to find a single school in Va that provides its median SAT on its own website, on the school district website, or on the VDOE website… but I KNOW that data actually is generated because you can get it from 3rd-party sites like Niche…

    to get SAT scores for 15 different high schools would be painful – one at a time – but perhaps others here know where the data can be easily accessed. I thank you in advance if you know… Perhaps Cranky does.

    I actually WOULD be interested in a table that shows mean SAT with school SOL pass rates – on a PER SCHOOL basis – and then a plot of that data.

    VDOE actually does provide datasets …

    but nowhere do I see SAT data…. or any other assessments other than SOLs.

    perhaps I don’t know where to look.. DOE is a bit of a rabbit warren …

    The US News ranking also does not provide SAT, but it uses AP for its college readiness index

    you can google for it – if I provide a second URL, BR holds the comment until Jim gets around to it.

  13. Jim,

    Fair enough. I was referring to Cranky’s original post on “bang for buck” which I thought gave a different impression.

    On Cranky’s SAT vs graduation rate analysis, it would be interesting to see it incorporate incomes as well. Some of the schools toward the bottom have students that are struggling with academics AND with paying to stay in school.

  14. re: ” Some of the schools toward the bottom have students that are struggling with academics”

    I was under the impression that SATs were independent assessments and not vulnerable to “creep” grading… so that the SAT score SHOULD BE some kind of indicator of college readiness…

    • Larry,

      re: “I was under the impression that SATs were independent assessments and not vulnerable to “creep” grading… so that the SAT score SHOULD BE some kind of indicator of college readiness…”

      Yes, it likely is an indicator of readiness; those schools at the bottom of Cranky’s analysis have lower SAT scores, and consequently some of their students are struggling with the college-level academics. The additional point was that there may be a double whammy in that they are also struggling with affording college.

  15. I was wrong. The correlation coefficients between SAT and SOL are above 0.9, some close to 0.98, but, for sociological studies, that is dead on. Many of the sociologists whose work I have read consider above 0.4 to be a strong correlation. I show a graph to give some sense of the degree of correlation.
    There is no explicit IQ data in my work because I could not find IQ data. I used the SAT score as the measure of IQ because SAT mostly measures IQ. I show the sources of the raw data. If anybody wants the Excel file with the data that I copied from the sources, I would be happy to email it to them. They could then check it against the sources — probably spot check it because a full check requires much labor. I welcome any corrections.

  16. One of the more obvious flaws in interpreting the SAT as an indicator of IQ is that the SAT can be prepared for. A cottage industry has evolved to help students improve their SAT scores, and the SAT itself is also measuring what you have learned in high school.

    If you gave the SAT to a middle schooler – most, with the exception of rare individuals, would do terribly on it – because the SAT is not really measuring innate intelligence as much as how accomplished a person is at learning and using techniques and knowledge in accomplishing verbal and math exercises.

    Take a simple math exercise in algebra… if the student had never been instructed in a particular concept like polynomial expressions, equations, and functions, then even though highly intelligent, they’d likely not be able to accomplish that exercise.

    Similarly – if a verbal narrative used words that the student had never encountered… say “polynomial”… and the passage used it in a sentence or paragraph – it’s quite likely the student would not easily understand it.

    both tasks depend far more on whether they have been introduced to the subject and acquainted with what it is and how it works, etc., than on innate intelligence.

    This goes back to the K-12 school they went to. Let’s assume they have a high IQ but their school does not offer these advanced subjects, or it does but the teacher is entry level and barely understands the concept themselves… then you can see that student, without SAT prep material, struggling with the test – even though they have high intelligence – because they really have not gotten good instruction in their school.

    You’d see this typically in schools in low-income neighborhoods with high-school-educated parents, where staff resources are not at the level of the schools in higher-income neighborhoods with college-educated parents who demand more competent instructors.

    This is why I think if we measured things like SAT on a per school basis we might know more and wonder how different schools in the same school district can have wide variation on SAT scores.

    I note that a number of others consider AP and CLEP scores better indicators of competence in given subjects – as well as indicators of the lack of proficiency.

    If you look at a school’s AP data, you’ll see even more widely divergent numbers: it is not unusual at all for less than half the students to take APs to start with, and then for about half of those to not pass the AP tests.

    I note also that the new DOE Student Profile data DOES have some of the AP data… though not all of it for every school…

    but I’ve heard that AP is a feared curriculum and test because you can actually fail it, whereas you always get a “number” for the SATs, even if it’s not as wonderful as one might hope.

    I’d be curious to see the correlation of college graduation with entering students’ high school AP or CLEP scores as a better correlating measure; it would clearly weed out the weaker students. If a kid takes AP language or math and fails, it’s not a good sign for college.

  17. There is no great mystery here. The top colleges get their pick of the top students, and the less desired colleges end up having to pick through the less qualified students not accepted by the top schools.

    not exactly rocket science except apparently some of the top schools are accepting kids with good SATs that are then found to be not prepared for entry- level College courses which IS a mystery in that one would presume that the SAT cannot be “fudged” as easily as individual HS QCAs.

    How can UVA weed these kids out BEFORE they are enrolled and cost taxpayers money for remediation?

    Use AP and CLEP either in addition to or instead of SAT which can be “prepped” for and produces a higher SAT score than say AP or CLEP would (because both AP and CLEP) are actual course work (like college level) and a much more comprehensive indicator than the SAT snapshot questions are.

    But why are these high level colleges taking questionable quality students to begin with?

    It’s because tuition has gotten so expensive that the demand for high dollar colleges has reduced and those colleges are dipping into the available pool as deep as they have to – to get full enrollment – even if it means they end up taking some that are over their heads and have to be remediated and/or they fail and leave.

    Of much greater interest to me are the K-12 high schools in Virginia and their individual mean SAT and AP/CLEP scores, which you will find vary fairly widely even within the same school district.

    Why is that, if the school district has the same curriculum and the same educational facilities and resources at each school?

    Are we saying that one entire school has less able students while other schools have much higher-performing students? The entire student body?

    I don’t think kids are widgets, but one would think that if a county had, say, five high schools with the same curriculum, equivalent staff resources, etc., they would yield roughly similar results on a whole-school basis.

    You can see why this is not the case with K-3; we know that neighborhood schools reflect the income and education demographics of those who live there. Low-income neighborhoods typically have lower-paid residents with less education, and testing fairly easily identifies their kids as “behind” the kids of better-educated, more highly paid parents in more upscale neighborhoods.

    The disparity in high school SAT/AP scores seems to indicate that low-income neighborhoods do not remediate the K-3 kids enough for them to have caught up by the time they reach high school.

    So we know, when we look at colleges and SATs, that they too reflect this conundrum with low-income neighborhoods and elementary schools.

    In other words, public schools are largely failing to get these kids up to grade level by high school, and thence ready for college, for the ones who go.

    I think looking at this issue at the college level is sort of like looking at the front leg of an elephant when it’s the back leg that has problems.

  18. Fred – I’m very impressed!!! I’m not sure I agree with your methodology, which seems to be a bit of looking for data to prove your original premise, but I give you credit for the documentation, the referencing, and putting it on Dropbox so others can get to it.

    I note, though, that the two references to Fairfax publications are now 404 errors, and I really wanted to see the document that had the Fairfax high school SATs on it.

    Did they provide that info on an existing page and just point you to it, or did you have to FOIA it, and they posted it for you but you could not download it?

    These are pretty high SAT scores for the 24 high schools; the lowest is 1417. These scores are a lot higher than those of a LOT of other Virginia schools.

    For instance, compare that to Deep Run HS in Henrico, which appears to be one of their better high schools: its median SAT is 1200. Highland Springs, the lowest, is 970. You read that right.

    It’s downright ridiculous that a third party like Niche provides the SATs while you have to FOIA the very same info from the school district itself.

  20. I did not have any original premise. The FCPS references were direct; they were not from FOIA requests. Somewhere on the FCPS website there is probably recent data; the older data might have been discarded. My conclusion, after looking at schools throughout Virginia, was that the FCPS SATs are high because FCPS has so many Asian students. Notice how great the improvement has been among Asian students compared to the improvement among the others. We can debate whether the performance of the Asian students is due to nature or nurture. The two are quite entangled.

  21. But Fred, it looks like you started out with that premise, no?

    It’s almost impossible to hold constant all the other influences in these kinds of analyses.

    Were you able to get SATs for ONLY Asians? Only whites? Or blacks?

    I’ve never seen SAT data at that level of granularity.

    And I’m not finding per-school SAT scores on a first pass of their website; they’re not in plain sight.

  22. Larry — 1. You are judging wrongly in assuming I had a premise. When I decide to study data, I totally immerse myself in it, looking for any and all trends. I do not start with any premise whatsoever, or any agenda. Have you ever tried that approach yourself? It is the most interesting way to learn from data. Why do you keep supposing that I start with a premise? 2. You must be patient and persistent in looking for data; don’t give up so easily. It is difficult to find. Just because you haven’t found the data does not mean that it is not available somewhere. Keep in mind that some data is available only with a FOIA request, but it is available.

  23. Fred – it’s your responsibility, as a person who has done a study and drawn conclusions, to show the data you used. It’s not my job to go looking for it.

    That SAT data is pretty important for anyone trying to validate your work.

    I’ve spent many hours, long before your report, looking for SAT scores on a per-school basis and have not found them reported at all, much less reported by demographics like race, ESOL, or FRM, even though the schools probably do have the data.

    I think such data would be fascinating and would probably provide insights.

    Without that data, there is no real way to make much of your work.

    Surely you remember if you had to FOIA it, and if you did, surely you have a copy of it and can share it, no?

    When you correlate with SOLs, are you comparing SOL language with SAT verbal and SOL math with SAT math? Did you actually get the math/verbal SAT breakdown as well?

  24. Larry: I reported the data honestly. All of the data that I got and used is in the Excel workbook that I provided. Even if I had saved what I got with a FOIA, you would want it signed by an FCPS official and notarized. If you think I am dishonest, then I cannot convince you.
    The comparison between SOL and SAT scores is given by subject in Report 143, which you should have downloaded.

  25. Fred – I’m only asking for the source SAT data – that’s it. …

    I did look at Report 143, and it left me not understanding this passage:

    “The mean SAT I and mean SOL test scores for 25 high schools in Fairfax County are closely correlated (Exhibit 1). The lowest correlation coefficient is 0.820, between the SAT math and the end-of-course SOL for Virginia and US History. The correlation coefficient is almost independent of whether SAT I math, SAT I reading, or SAT I writing is used. Most correlation coefficients are above 0.9.”

    How do you correlate something like the SOL History with the SAT at all?

    And why would you conclude that history is an equivalent proxy for comparing SOL math, reading, and writing with the SAT?

    I struggle to understand this when the SAT is usually not broken down into sub-classifications beyond verbal and math.

    Where have I gone wrong in understanding this?

    Perhaps I don’t understand how SAT data is reported by the schools. You seem to have gotten it broken down by subject and demographics; I’ve never seen that before.
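[Editor’s note: the correlation being disputed in this exchange is computed on paired per-school averages, so any two school-level measures can be correlated, whatever the subjects. A minimal Python sketch, using invented numbers rather than any actual Fairfax data:]

```python
# Pearson correlation of two per-school measures. The unit of analysis
# is the school, so the two measures need not cover the same subject.
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two paired lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical mean scores for five schools, invented for illustration.
sat_math = [520, 560, 610, 480, 590]   # mean SAT math per school
sol_history = [78, 83, 91, 70, 88]     # mean SOL History score per school
print(round(pearson_r(sat_math, sol_history), 3))
```

A high coefficient here would say only that schools strong on one measure tend to be strong on the other; it implies nothing about individual students or about one subject standing in for another.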
