The Glitch: When the Math Test Fails at Math

November 14, 2025

I have a computer science background, and I spend hours every day providing math instruction to my kids. So when I see calculation errors, I notice. It’s the same instinct that made me spot discrepancies in Oak Park Village’s Vision Zero crash analysis data back in 2024.

But I shouldn’t have to audit a school district’s addition to make sure my daughter gets a fair evaluation.

Good thing I did anyway.

The First Error: Kindergarten, Spring 2024

In April 2024, Principal Hussain Ali sent us an email: my daughter had scored 5 out of 6 points on Section 1 of Oak Park Elementary School District 97’s math acceleration rubric. She needed 6 points to proceed to the next round of testing.

Result: Denied.

But wait—let me just double-check the math.

She got:

  • Student Voice Survey: 1 point
  • Fall MAP (89th percentile): 1 point
  • Winter MAP (94th percentile): 2 points
  • Report Card (Meets standards in 60% of areas): 2 points

That’s 1 + 1 + 2 + 2 = 6 points.

Not 5.
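The check takes one line of code. Here's a minimal sketch in Python (category names and point values are as listed above; the structure of the district's actual system is unknown to me):

```python
# Section 1 rubric points as reported in the district's breakdown
section_1 = {
    "Student Voice Survey": 1,
    "Fall MAP (89th percentile)": 1,
    "Winter MAP (94th percentile)": 2,
    "Report Card (meets standards in 60% of areas)": 2,
}

total = sum(section_1.values())
print(total)  # 6 — not the 5 the district reported
```

That's it. First grade addition, automated.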

I sent a quick email to the district’s instructional coach: “I guess it’s moot now, but if my daughter got 1 point for the Student Voice survey, by my calculations she should be at 6 points for section 1. Maybe I misread something?”

Six hours later, I got this reply:

“You are actually correct! I contacted the district this afternoon and they discovered a glitch in the calculations. The exact 60% was marked a point lower on the rubric instead of receiving 2 points. Things have now been adjusted.”

The error: The district’s system marked her 60% Report Card performance one point too low—1 point instead of 2.

The result: 5 points instead of 6.

Who caught it: Not the district. Me. A parent with a calculator and a healthy skepticism of systems that claim to be data-driven.

Timeline: Four days elapsed from the initial denial (April 19) to the correction (April 23).

The Irony You Can’t Make Up

Before we get to whether the error “mattered,” let’s appreciate the cosmic irony here:

This was an application to accelerate past first grade math. The skills taught in first grade math include:

  • Basic addition and subtraction
  • Understanding place value
  • Simple word problems

And the district’s sophisticated, multi-criteria acceleration rubric—the system designed to evaluate whether my five-year-old daughter had mastered first grade math—failed at first grade addition.

1 + 1 + 2 + 2 = 6.

The rubric couldn’t add its own points correctly at the very grade level it was designed to assess.

Plot Twist: It Didn’t Matter (This Time)

The correction allowed my daughter to proceed to Section 2 testing, which she took in May. But even with the error fixed, she ultimately didn’t qualify for acceleration. She scored 8 out of 24 total points, far below the 18-point threshold.

The calculation error mattered for process (without the correction, she wouldn’t have tested at all). But even with correct math, the rubric’s thresholds were harsh enough that she fell short.

And one year later, it happened again.

The Second Error: First Grade, Spring 2025

In May 2025, Principal Ali sent another email announcing my daughter’s acceleration decision. This time, she scored 27 points out of a possible 46. She needed 36 points (80%). With 27 points (59%), she didn’t qualify.

But something bothered me about that number.

I requested the detailed rubric breakdown. When I got it in June, I did what any parent would do: I checked the math.

Section 1a: 18 points
Section 1b: 11 points

18 + 11 = 29.

Not 27.

I pointed this out in a July email. Principal Ali’s response confirmed it: “You are correct, the total is 29 points (18+11).”

The error: Arithmetic. 18 + 11 = 29, not 27.

The result: Understated her performance by 2 points.

Who caught it: Again, not the district. Me.

Still didn’t qualify: Even with the correct score of 29 points (63%), she was below the 80% threshold.

The Irony Strikes Again

And here’s where it gets really good:

This application was to accelerate past second grade math. Second grade math skills include:

  • Two-digit addition and subtraction
  • Understanding the relationship between addition and subtraction
  • Fluency with addition facts within 20

The acceleration rubric was evaluating whether my six-year-old daughter had mastered second grade math.

And it couldn’t correctly add 18 + 11.

That’s a second grade math problem. Addition of two-digit numbers. The exact skill level the rubric was supposed to assess.

Once again, the system failed its own test.

The Pattern

Let’s be clear about what happened here:

Kindergarten application (2023-2024):

  • Calculation error discovered: Yes
  • Direction of error: Understated performance
  • Who caught it: Parent
  • Grade level of error: First grade addition (accelerating past first grade)

First grade application (2024-2025):

  • Calculation error discovered: Yes
  • Direction of error: Understated performance
  • Who caught it: Parent
  • Grade level of error: Second grade addition (accelerating past second grade)

Two applications. Two calculation errors. Both caught by a parent, not by the district’s quality control processes. Both errors went in the same direction—against my daughter. Both errors occurred at exactly the grade levels the rubric was designed to assess.

The Question That Keeps Me Up at Night

My daughter ultimately didn’t qualify for acceleration in either year, even with the errors corrected. The thresholds are harsh, and she fell short by margins that no amount of arithmetic fixing could bridge.

But here’s what haunts me:

What about the kids whose parents don’t check the math?

If I hadn’t independently verified the rubric calculations—twice—these errors would have gone undetected. And while they didn’t change my daughter’s outcome, what about a child who scored 5 points when they should have scored 6? Or 35 points when they should have scored 36?

What about the families who trust the system to get the basic arithmetic right?

What This Reveals

This isn’t about individual mistakes. People make errors. I get that.

This is about a system that presents itself as rigorous and data-driven but can’t handle basic arithmetic at the grade levels it’s designed to assess.

Oak Park Elementary School District 97 uses a detailed, point-based rubric to make acceleration decisions. The rubric has specific thresholds where a single point can mean the difference between testing and not testing, qualifying and not qualifying.

When the stakes are that precise, the arithmetic has to be right.

And when it’s not—twice, at exactly the grade levels being evaluated—maybe the problem isn’t just execution. Maybe it’s a signal that the system isn’t as rigorous as it claims to be.

In my daughter’s case, the quality control was me. Armed with a calculator and a growing realization that “data-driven” doesn’t mean what I thought it meant.

The Equity Question

Here’s the uncomfortable truth: I have the time, education, and confidence to question these decisions. I can independently verify rubric calculations. I know how to write emails citing specific numbers and following up persistently when I don’t get answers.

I’ve been doing this for years with Oak Park Village infrastructure—emailing Bill McKenna about bike lane obstructions, applying to serve on the Transportation Commission, challenging data in Vision Zero reports. I know how slow institutional change can be, and I have the resources to keep pushing.

Most parents don’t.

And that means calculation errors—which the district has admitted happened at least twice in my daughter’s case—likely affect other families who trust the system to get the basics right.

How many children have been denied progression to Section 2 testing because of calculation errors that went undetected?

How many families received scores of 35 points when the correct score was 36?

How many kids missed opportunities because their parents didn’t think to audit the district’s arithmetic?

The Bigger Picture

My daughter didn’t qualify for math acceleration in kindergarten or first grade. Even with the correct math, her scores fell short of the district’s thresholds.

But the fact that I had to correct the district’s arithmetic twice—at exactly the grade levels the rubric was supposed to assess—tells me everything about this system:

It failed its own test. Twice.

And when you’re making decisions about five-year-olds and six-year-olds—decisions that shape their educational trajectories—maybe we should start by asking whether the system measuring them can pass the tests it’s giving.

Spoiler: It can’t.


Related Posts:

Next in series: The Ghost Rubric—when requirements reference tests that don’t exist. Coming next week.


This is part of an ongoing series documenting one family’s experience with gifted education acceleration in Oak Park Elementary School District 97. All facts are based on emails, rubric documents, and official communications obtained through public records requests and direct correspondence with district officials.

Names of school officials (principals, district administrators) are used as they are public officials performing official duties. Student and parent names are withheld to protect privacy.