The National Assessment of Educational Progress – the NAEP – is our longest-running “national report card” and has its roots in the early 1970s.

It is important to know that the NAEP is made up of multiple distinct assessments rather than being a single set of tests. The most prominent are the Long-Term Trend (LTT) assessment – which has remained mostly unchanged since the early 1970s and produces only national-level scores – and the Main NAEP (or just “NAEP”), which started in 1990 and produces state-level results. Since 2003 (and the No Child Left Behind federal law), the Main NAEP has been given every two years rather than every four years as before.

It is also worth knowing that the LTT has remained essentially unchanged since its inception, while the Main NAEP is driven by so-called “content frameworks” that change periodically to reflect the overall content of the U.S. education landscape.

For example, the most recent change to the reading framework came in 2009, and to the math framework in 2005. It is also useful to know that 10-12 points on the NAEP scale are roughly equivalent to a grade level; in other words, a change of 2-3 points or more on the NAEP – a quarter of a grade level or more – is a “big deal,” nothing like 2-3 points on the 100-point scales most of us are used to.
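The rule of thumb above can be made concrete with a quick back-of-the-envelope calculation (a minimal sketch; the 11-point figure is simply the midpoint of the article’s 10-12 point range, not an official conversion):

```python
# Rough rule of thumb from the text: roughly 10-12 NAEP scale points
# correspond to one grade level of learning; 11 is the midpoint.
points_per_grade = 11
drop = 3  # a 3-point national decline, as seen in the 2015 results

fraction_of_grade = drop / points_per_grade
print(f"A {drop}-point drop is about {fraction_of_grade:.0%} of a grade level")
# prints "A 3-point drop is about 27% of a grade level"
```

This is why the article treats a seemingly small 2-3 point national decline as substantial: it represents roughly a quarter of a year of learning.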

By its mission, NAEP must be a broad test rather than one aligned to a particular program or curriculum. This makes sense. States already have their own tests aligned to their standards and curricula, but we also need a broad measure that is not tailored to a particular state and does not change much over time (and, incidentally, will not force more teaching to the test). That is the only way we will be able to notice when states – or the nation – decline in performance; such changes may be unnoticeable on tightly aligned state tests that, by their nature, measure only what is explicit in each state’s program rather than the totality of expected knowledge.

Common Core standards were adopted by more than 40 states in 2010, and their implementation started in 2011 in a handful of states, with the bulk of the states implementing them in the 2012-13 and 2013-14 school years, ahead of the first administration of the federally funded Common Core tests in 2014-15. In other words, the 2015 NAEP is the first opportunity to examine how the country is doing using an independent and objective yardstick, after Common Core implementation was completed.

Despite Common Core promoters’ claims that the standards are “rigorous and internationally-benchmarked,” it has been clearly shown that they are of low quality and experimental in nature, so it is understandable that Common Core peddlers were anxious about the 2015 NAEP results – and some chose to undermine the upcoming NAEP verdict “just in case.”

The president of the pro-Common Core Fordham Institute floated the idea that “if the NAEP results will be terrible, blame the recession.” The American Institutes for Research (AIR), one of the main subcontractors for the federally funded Common Core tests, published – right on cue, a couple of days before the NAEP results were announced – a new report studying the NAEP in relation to Common Core math. The report suggests that NAEP results are invalid because NAEP is not sufficiently aligned with the Common Core, and recommends aligning NAEP to the Common Core.

In fact, if one reads the report carefully, its findings clearly say:

Overall, the review by expert panelists suggests that concordance between NAEP and the CCSS is reasonable at both grade levels. It makes sense that NAEP, which is required by its mission to be broad, will include some items that are outside the bounds of the CCSS and will not assess every standard in the CCSS. (emphasis added)

But in its conclusions, the report ignores its own findings and disingenuously argues, “We believe that this is an appropriate moment for NAEP’s Governing Board to review the framework in light of the CCSS.”

Education Week chose to join in this specious undermining of NAEP validity and placed the fact the NAEP alignment was found reasonable in scare quotes: “NAEP and Common-Core Math Show ‘Reasonable’ Overlap, Study Says.”

The NAEP results were published a few days later, and they were devastating: national drops of 2-3 points in 8th grade reading and math, and in 4th grade math (with many individual states dropping by 4, 5, and even 9 points). This was unprecedented since the Main NAEP’s inception more than 20 years earlier, and the decline – particularly in math – was essentially everywhere. While a statistically significant decline occurred in about half the states, all but seven states dropped in 8th grade math, and all but 13 states in the 4th grade assessment. Clearly, such a massive and meaningful decline does not look like the blip Arne Duncan argued it was. To his credit, the Fordham Institute’s president quickly acknowledged that it doesn’t seem to have anything to do with the economy after all.

Consequently, what was left was to undermine NAEP itself and ensure its alteration so that it would never again show Common Core’s deficiencies. This is the context in which the recent Hechinger Report piece by Jill Barshay, “Is It Time to Update NAEP?,” must be read.

Barshay interviews Fran Stancavage of AIR, the lead author of the report discussed above. She argues that the alignment of NAEP to the Common Core is so poor that NAEP cannot validly assess states, and that NAEP must be aligned with the Common Core as soon as possible. Until that is accomplished, NAEP will – in her opinion – falsely signal a decline in American education.

Luckily, the piece is so sloppily written, and Stancavage’s quotes so erroneous and mathematically incoherent (she holds only a BA and an MA in sociology), that one can only hope it will have little impact.

The main pitch of Barshay and Stancavage is that some items on NAEP are not covered by the Common Core. To illustrate this, they use a NAEP stem-and-leaf plot item. Barshay writes:

It’s an old-fashioned way to depict data, one that isn’t used much anymore. The Common Core standards, used in more than 40 states, intentionally pared down the list of data representations that students should learn to ones that are commonly used in the real world, such as bar charts and time-series graphs, so that students could spend more time mastering them.

This single paragraph shows the type of mathematical ignorance that makes me blush. First, the stem-and-leaf plot is not an “old-fashioned” technique that is no longer used. It is simply one of many possible representations, and not every representation needs to be listed in standards. Every professional with expertise in this area knows this. Indeed, the Eureka Math Common Core program, which is considered exemplary, addresses stem-and-leaf plots in grade 7 (lesson 14 in module 5). Further, her claim that bar charts are “commonly used in the real world” is incorrect – bar charts are pedagogical abstractions widely used in Singapore’s schools but rarely, if ever, used “in the real world.”
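For readers unfamiliar with the representation at issue, a stem-and-leaf plot is trivially simple: each value is split into a “stem” (its leading digits) and a “leaf” (its final digit), and leaves are listed beside their shared stem. A minimal sketch (the example scores are illustrative, not from any NAEP item):

```python
from collections import defaultdict

def stem_and_leaf(values):
    """Return a stem-and-leaf plot as a list of text rows."""
    plot = defaultdict(list)
    for v in sorted(values):
        stem, leaf = divmod(v, 10)  # e.g. 47 -> stem 4, leaf 7
        plot[stem].append(leaf)
    return [f"{stem} | {' '.join(str(leaf) for leaf in plot[stem])}"
            for stem in sorted(plot)]

scores = [47, 52, 55, 58, 61, 61, 64, 70]
for row in stem_and_leaf(scores):
    print(row)
# 4 | 7
# 5 | 2 5 8
# 6 | 1 1 4
# 7 | 0
```

Reading the plot back is just as simple – the row “5 | 2 5 8” stands for the values 52, 55, and 58 – which is precisely why calling it an “IQ test” is hard to take seriously.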

Stancavage piles on.

“It becomes something of an IQ test to figure out what the heck you’re looking at…The kinds of things we’re seeing, it wouldn’t matter if someone had become an expert implementer,” said Stancavage. “Some of these things the kids have never been exposed to.”

Well, wrong. It is not an IQ test. It is taught by the most-celebrated Common Core curriculum. Any 8th grader who has difficulty figuring out what a stem-and-leaf plot is – even without having been taught it – probably doesn’t belong in 8th grade anyway.

“They [NAEP items] haven’t changed it since 1990,” Stancavage added. “They probably are due. It’s 25 years.”

Houston, we have a problem. Stancavage is introduced in this piece as someone who “has helped develop the NAEP exam for the U.S. government.” On the AIR site she is described as someone who “directed the National Assessment of Educational Progress (NAEP) Validity Studies project” – yet she doesn’t know that the NAEP items have changed – multiple times! – since 1990? One can expect some amount of mathematical ignorance from a sociologist (although one wonders why a sociologist was chosen to lead a study of the mathematics alignment of NAEP and the Common Core), but gross ignorance of the nature of the NAEP from a director of the NAEP Validity Studies project?

Or, perhaps, the issue here is not ignorance but rather an abuse of her personal and institutional academic authority to peddle an ideology in which she – and her institution – are invested. After all, if Common Core dies, AIR will likely return to being a third-tier player in the testing market, the position it held prior to the Common Core.

But it is also disappointing that the Hechinger Report allowed itself to be used in this way.

Ze’ev Wurman is a former senior policy adviser in the U.S. Department of Education under President George W. Bush, and a senior fellow at American Principles Project.