The Last Resort: When a Simple Question Requires a FOIA Request
I asked a simple question.
Three times.
Over three months.
I never got an answer.
So I filed a Freedom of Information Act request.
This is the story of how asking “What research supports your rubric?” became a public records demand—and what that reveals about how school districts operate when accountability is optional.
The Question
On August 25, 2025, during my daughter’s acceleration appeal, I asked Oak Park Elementary School District 97’s Director of Teaching and Learning a straightforward question:
“Please share the validation the district used to set 92nd percentile = 0 points and 98–99th = 5 points on the rubric.”
The rubric had some odd scoring. On the AimsWeb assessment:
- 92nd percentile: 0 points
- 93rd–97th percentile: 1 point
- 98th–99th percentile: 5 points
One percentile point—92nd to 93rd—was the difference between getting zero credit and getting credit.
Five percentile ranks—93rd through 97th—earned the same single point.
Then suddenly, 98th percentile jumped to 5 points.
These seemed like arbitrary cutoffs. But maybe there was research showing these specific thresholds predicted successful acceleration?
I wanted to see it.
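The cliff-like scoring is easier to see spelled out. Here is a minimal sketch of the conversion as the rubric describes it—the function name and code are mine, not the district's:

```python
def aimsweb_points(percentile: int) -> int:
    """Convert a national percentile rank to rubric points,
    per the cutoffs on District 97's rubric."""
    if 98 <= percentile <= 99:
        return 5
    if 93 <= percentile <= 97:
        return 1
    return 0  # 92nd percentile and below earns nothing


# One percentile rank separates zero credit from credit (92 vs. 93),
# while the step from 97th to 98th is worth four extra points.
print(aimsweb_points(92), aimsweb_points(93), aimsweb_points(97), aimsweb_points(98))
```

Laid out this way, the question writes itself: what evidence says a student at the 92nd percentile deserves zero credit while one at the 98th deserves five?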
The Illinois Law
This isn’t just academic curiosity.
Illinois law—the Accelerated Placement Act (105 ILCS 5/14A-32)—explicitly requires that acceleration practices “be based on successful research on effective practices.”
Not “should be based on research.”
Not “we recommend research.”
“Be based on successful research.”
So when a district uses a rubric with specific percentile cutoffs to make acceleration decisions, there should be research supporting those cutoffs. Validation data. Evidence that these thresholds actually predict successful acceleration.
I asked to see it.
Request #1: August 25, 2025
In my email to Emilie Creehan (Director of Teaching & Learning) and Acting Superintendent Patrick Robinson during the appeal:
“Per our conversation last Wednesday, please share the validation the district used to set 92nd percentile = 0 points and 98–99th = 5 points on the rubric.”
I explained I’d looked at the AimsWeb development manual, which describes it as a universal screening and progress-monitoring tool. The manual notes that single measures have modest predictive validity (correlations typically .30s–.50s) and that composites perform better.
But that’s not the same as local evidence that tiny percentile differences near the ceiling meaningfully separate readiness for skipping a year.
“I’m happy to send the specific pages I’m looking at if useful,” I offered.
Response: None. The email didn’t address this question.
Request #2: September 4, 2025
After the appeal was denied, I sent a follow-up on September 4:
“This is the third request for the research or local outcome data supporting the rubric’s conversion of 92nd percentile = 0 points and 98–99th = 5 points. You have described the spread as valid; please share the sources you’re relying on.”
I also noted the broader research context:
“The costs of false negatives are well documented across meta-analyses and longitudinal work (Kulik & Kulik; Rogers; Steenbergen-Hu, Makel, & Olszewski-Kubilius; Colangelo/Assouline’s A Nation Deceived/Empowered; Gross; Lubinski & Benbow/SMPY): appropriately selected acceleration yields strong academic gains with neutral-to-positive social-emotional outcomes, while withholding acceleration after mastery is demonstrated drives disengagement.”
If the district had research showing their specific thresholds reduced false positives without increasing false negatives, I wanted to see it.
Response: Still nothing about validation research.
The Answer: September 11, 2025
Seven days later, Acting Superintendent Patrick Robinson finally responded:
“We do not have all of the detailed work readily available to provide.”
Not “here’s the research.”
Not “we’ll get it to you next week.”
“We do not have all of the detailed work readily available.”
Read that carefully.
After three requests over two weeks, the district’s response was that they don’t have the research “readily available.”
Which raises an obvious question: If you don’t have the research readily available, what were the thresholds based on?
What This Means
Let me be clear about what this response reveals:
Option A: The district has validation research but won’t share it.
Option B: The district once had research but lost it or can’t find it.
Option C: The thresholds were never based on research.
Any of these options is troubling.
If it’s Option A—they have research but won’t share it—why? Parents have a right to understand the basis for decisions affecting their children’s education. The research is either public information or the district should be able to provide citations.
If it’s Option B—they had research but can’t locate it—that suggests poor record-keeping for a critical decision-making tool. How can a district validate its rubric if it doesn’t know what the original validation was?
If it’s Option C—the thresholds were never research-based—then the district is violating Illinois law. The Accelerated Placement Act requires research-based practices. “We picked numbers that seemed reasonable” doesn’t meet that standard.
In any case, families making acceleration decisions deserve to know what the thresholds are based on.
The Broader Pattern
This wasn’t the only question the district couldn’t or wouldn’t answer:
The 80% Threshold
I asked why students need to score 36 out of 46 points (just over 78%, advertised as “80%”) to qualify.
Why 80%? Why not 75%? Or 85%?
Response (September 11, Robinson): “We do not have all of the detailed work readily available to provide.”
Same non-answer.
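Even the advertised number doesn't match the arithmetic. A quick check of the cutoff as stated on the rubric:

```python
# The rubric requires 36 of 46 points, advertised as an "80%" threshold.
threshold_points, total_points = 36, 46
actual_percent = round(threshold_points / total_points * 100, 1)
print(actual_percent)  # → 78.3
```

A district that can't explain a two-point gap between its advertised threshold and its actual one is unlikely to have validated the threshold itself.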
The Report Card Weighting
I asked why “Meets” on report cards = 0 points, the same as “Below.”
Why is meeting grade-level standards treated as failing the acceleration rubric?
Response: None. Question not addressed.
The Trimester 3 Omission
The rubric only uses Trimester 1 and 2 report cards. Trimester 3—the most recent data, closest to when students would enter the accelerated grade—doesn’t count.
I asked why.
Response: Practical limitations (teacher summer vacation timing). But no acknowledgment that this means using less predictive data.
Parent Participation
The district’s website states: “The MTSS team may be comprised of… a family or legal guardian of the referred student or a representative designated by a family or legal guardian.”
I requested inclusion on the MTSS team. I designated a 4th-grade teacher (Amy Mariani) as my representative.
Neither was contacted or included.
I asked for clarification of the district’s policy on parent participation.
Response: Parents participate by submitting applications and having appeal rights. Not by attending MTSS meetings.
No explanation of how this squares with the website’s statement that the MTSS team “may be comprised of… a family or legal guardian of the referred student.”
The Pattern: No Accountability
Here’s what these non-answers reveal:
When asked for the research basis of the rubric: “We do not have all of the detailed work readily available.”
When asked for validation data: Not provided.
When asked for policy clarification: Given process description, not legal basis.
When asked about specific thresholds: “Not readily available.”
The district operates an acceleration system with specific, consequential thresholds—and cannot or will not explain what those thresholds are based on.
The FOIA Request: November 4, 2025
After three months of requests and non-answers, I filed a Freedom of Information Act request.
This wasn’t my preference. FOIA requests are adversarial. They’re formal. They create a record of distrust.
But when a district won’t answer basic questions about the research basis for a system affecting dozens of students annually, FOIA becomes the only tool left.
What I Requested
The FOIA request asks for six categories of records:
Category 1: Rubric Validation Research
- Research studies, journal articles, or technical manuals the district relied on when establishing percentile-to-points conversions
- Local validation studies or outcome data showing the rubric predicts successful acceleration
- Longitudinal tracking of accelerated students: success rates vs. de-acceleration rates
- Statistical validation of cutoff scores
- Communications discussing research basis for thresholds
The longitudinal-tracking item is critical. Illinois publishes acceleration numbers by grade and demographics on the state Report Card—but not success rates. The district knows which students had to de-accelerate because the material was too challenging. If it has been raising bars in response to struggles, that data should exist. But it’s not public. Without it, we can’t know whether tightening thresholds actually improved outcomes or just reduced access.
Category 2: Rubric Development Process
- Meeting minutes from rubric development
- Names and credentials of who developed the rubrics
- Alternative threshold models that were considered
- Rationale for selecting current percentile conversions
Category 3: Comparative Application and Equity Analysis
- District-wide acceleration statistics by grade level (applications vs. approvals)
- Demographic breakdown of applicants and approval rates
- Documentation of students scoring below 80% who were approved anyway
- Internal analysis of the 10:1 disparity (276 seventh graders accelerated vs. 26 first graders)
Category 4: Assessment Platform Transition
- Documentation of transition from MAP/AimsWeb to new assessment (STAR)
- Timeline for updated rubrics
- Plans for validating new thresholds
Category 5: Parent Participation
- District interpretation of parent participation requirements
- Policy documents about MTSS team composition
- Communications about whether parents can attend MTSS meetings
Category 6: External Guidance
- Communications with Illinois State Board of Education about validation requirements
- ISBE feedback on District 97’s rubrics or procedures
Why This Matters
These aren’t gotcha questions.
They’re basic transparency requests:
- What research supports your thresholds? (Required by state law)
- How were the thresholds developed? (Public has a right to know)
- Do they predict successful acceleration? (Validation question)
- Why are early grades accelerated at 1/10 the rate of middle school? (Equity question)
- Can parents participate in decision-making? (Required by state law)
A district confident in its research-based, validated, equitable system should be able to answer these questions easily.
A district that responds “we do not have all of the detailed work readily available” is admitting something else entirely.
What I Expect to Find
I have three predictions about what the FOIA response will reveal:
Prediction 1: No validation research exists.
The thresholds were likely set through a combination of professional judgment, borrowing from other districts, and incremental adjustments over time. Not based on studies showing these specific cutoffs predict successful acceleration.
Prediction 2: Thresholds were raised in response to acceleration struggles.
Districts see some accelerated students struggle. They respond by raising bars. But if the original measures were wrong (screening tools instead of readiness tests; grade-level performance instead of above-grade potential), raising the bar doesn’t fix the problem.
Prediction 3: No equity analysis was conducted.
The 10:1 disparity (276 seventh graders vs. 26 first graders) suggests either (a) the rubric is calibrated differently by grade, or (b) it has unintended disparate impact. I expect no documentation showing the district analyzed this pattern or considered whether early-grade barriers were appropriate.
If I’m wrong—if the district has robust validation research, equity analyses, and clear policy justifications—wonderful. I’ll be relieved to see it.
But “we do not have all of the detailed work readily available” doesn’t inspire confidence.
The Broader Implications
This isn’t just about one family’s acceleration decision.
This is about how districts operate when accountability is optional.
The Accelerated Placement Act requires research-based practices.
Oak Park District 97 uses a rubric with specific percentile cutoffs to make consequential decisions about dozens of students each year.
When asked what research supports those cutoffs, the district says the research isn’t “readily available.”
If there’s no research, the system violates state law.
If there is research but the district won’t share it, families can’t meaningfully participate in decisions affecting their children.
Either way, this is a transparency failure.
Why FOIA Was the Last Resort
I didn’t want to file a FOIA request.
I wanted the district to answer a simple question: What research supports your rubric thresholds?
I asked three times.
I waited three months.
I was patient, professional, and specific.
The response: “We do not have all of the detailed work readily available.”
FOIA became the only tool left to get a straight answer about whether the district follows the law.
And that’s deeply troubling.
Because if parents have to file public records requests to learn the research basis for a supposedly research-based system, the system isn’t actually transparent.
It’s just performatively compliant.
What Comes Next
The district has five business days to respond to the FOIA request—either producing records, explaining why specific exemptions apply, or requesting a single five-day extension.
I expect they’ll request an extension. These are complex requests spanning multiple categories.
When the records arrive, I’ll review them carefully.
If they show robust validation research, I’ll acknowledge it. If they show the thresholds were evidence-based and equitable, I’ll say so.
But if they confirm what “we do not have all of the detailed work readily available” suggests—that the thresholds were never validated, never based on research showing they predict success, never analyzed for equity impact—then we have a bigger problem.
Because a district that requires 80% on a rubric to accelerate students should be able to explain why 80% is the right threshold.
And “we don’t have the detailed work readily available” isn’t an explanation.
It’s an admission.
Related Posts:
- The Feedback Loop: How Bad Rubrics Create Their Own Crisis - Why raising bars on wrong measures doesn’t fix the problem
- When Ready Isn’t Enough: How Rubrics Measure the Wrong Things - The two questions rubrics conflate
- The Wrong Tool: Why Screening Tests Don’t Belong on Acceleration Rubrics - AimsWeb design vs. acceleration use
- The Acceleration Gap: 276 to 26 - Grade-level disparities raise equity questions
This is part of an ongoing series documenting one family’s experience with gifted education acceleration in Oak Park Elementary School District 97. All facts are based on emails, rubric documents, and official communications obtained through public records requests and direct correspondence with district officials.
Names of district administrators and principals are used as they are public officials performing official duties. Teacher and staff names have been removed to protect privacy. Student names are withheld to protect privacy.