Reporting tool — not an official determination. This tool reflects DOE rulemaking data and modeled estimates. The public file is grouped at the institution × credential × 4-digit CIP level; the proposed rule would operate at the 6-digit CIP level. Institutions assign CIP codes to their own programs, so similar-looking majors may be classified differently across colleges. Use as a reporting lead, not a regulatory finding.

Methodology

How this tool works, what the data can and cannot say, and what every reporter should know before using it.

Total program groupings: 209,321
DOE-modeled fail: 2,874
Passes DOE model: 46,509
Insufficient public data: 159,938

Modeled fails by sector

Sector               Institutions   Fails
Proprietary          1,610          1,267
Public               1,822          1,144
Private nonprofit    1,664          463

Modeled fails by credential level

Credential                   Fails
Undergrad Certificate        1,659
Associate                    504
Master's                     372
Bachelor's                   300
Graduate Certificate         14
First Professional Degree    13
Doctoral                     11
Post-Bacc Certificate        1

Modeled fails by broad field

Field Fails
Personal & Culinary Services 892
Health Professions & Clinical Sciences 807
Visual & Performing Arts 363
Business & Management 117
Education 92
Family & Consumer Sciences 89
Liberal Arts & Humanities 60
Communications Technologies 45
Agriculture & Natural Resources 42
English & Literature 42
Psychology 37
Communications & Journalism 33
Social Sciences 28
Public Administration & Social Work 21
Computer & Information Sciences 20
Parks, Recreation & Fitness 19
Philosophy & Religious Studies 18
Mechanic & Repair Technologies 18
Theology & Religious Vocations 16
Interdisciplinary Studies 16
Biological & Life Sciences 16
Security & Protective Services 14
History 12
Foreign Languages & Linguistics 12
Construction Trades 11
Precision Production 11
Engineering Technologies 7
Transportation & Materials Moving 5
Library Science 3
Law & Legal Studies 2
Area, Ethnic & Cultural Studies 2
Architecture & Design 1
Basic Skills & Developmental Education 1
Physical Sciences 1
Mathematics & Statistics 1

What is PPD:2026?

The Program Performance Data file (PPD:2026) was released by the U.S. Department of Education as part of its 2026 rulemaking on earnings accountability for higher-education programs. It contains DOE's modeled estimates of how programs would fare under the proposed rule, based on earnings outcomes measured roughly four years after program completion. This tool uses that file as its primary data source.

What unit of analysis does it use?

The public PPD:2026 file is built at the level of institution × credential level × 4-digit CIP code. That is the unit of analysis throughout this tool. Each "program grouping" shown here represents one such combination.

The actual proposed rule would operate at the level of institution × credential level × 6-digit CIP code. This mismatch is one of the most important caveats in this dataset. A single record here may represent one 6-digit program, or it may represent several distinct 6-digit programs that share the same 4-digit CIP prefix.

DOE's own rulemaking materials note that approximately 83 percent of 4-digit CIP codes contain only one 6-digit CIP code at any given institution. The remaining 17 percent may contain multiple programs, and the public data cannot distinguish between them.
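The 4-digit versus 6-digit mismatch can be sketched in a few lines. The CIP values below are illustrative, not drawn from the actual file, and the truncation rule simply takes the `XX.XX` prefix of a standard `XX.XXXX` CIP code:

```python
# Sketch of collapsing 6-digit CIP codes to the 4-digit level used in the
# public PPD:2026 file. Program codes here are hypothetical examples.
from collections import defaultdict

def cip4(cip6: str) -> str:
    """Truncate a 6-digit CIP code (e.g. '12.0503') to its 4-digit prefix ('12.05')."""
    return cip6[:5]  # two digits, a dot, two digits

# Hypothetical 6-digit programs at one institution and credential level:
programs = ["12.0503", "12.0504", "51.3801"]

groupings = defaultdict(list)
for p in programs:
    groupings[cip4(p)].append(p)

# '12.05' now holds two distinct 6-digit programs the public file cannot
# separate; '51.38' holds one, the ~83 percent single-program case DOE describes.
```

A record in this tool corresponds to one key in `groupings`, not to one entry in its value list.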

Why are modeled results not final?

The PPD:2026 estimates are modeled, not adjudicated. They are based on a single cohort snapshot with some missing-data assumptions. They cannot account for:

  • Future institutional appeals or corrections
  • Teach-out arrangements or program closures before a rule takes effect
  • Changes in program structure or CIP coding
  • Student behavior changes in response to the proposed rule
  • Multi-year averaging: the actual rule would require a program to fail in 2 of 3 years in which the test is calculated

A program appearing as a modeled fail here is a reporting lead, not a regulatory determination.

How does the earnings test work?

Under the proposed rule, a program generally becomes a "low-earning outcome program" if it fails the earnings premium measure in 2 of 3 years in which the measure is calculated. Failing the measure means median earnings of program completers fall below a benchmark — either a state or national median for workers with only a high school diploma (for most undergraduate programs) or a bachelor's-degree-based benchmark (for most graduate programs).

Earnings are measured approximately four years after completion. This timing matters especially for fields where early-career incomes are lower but lifetime earnings are competitive — graduate-level programs in education, social work, or humanities, for example.

For CIP codes not listed in the benchmark statute (Section 84001), DOE falls back to the national high school median as the benchmark. This tool notes when that fallback applies.
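The test logic described above can be expressed as a short sketch. The function names, benchmark values, and dictionary layout are ours for illustration, not DOE's actual implementation:

```python
# Hedged sketch of the earnings-premium logic as described in this section.
# All names and numbers are illustrative assumptions, not DOE's code or data.

def passes_measure(median_earnings: float, benchmark: float) -> bool:
    """One year's test: completer median earnings must meet or exceed the benchmark."""
    return median_earnings >= benchmark

def pick_benchmark(cip_code: str, statutory_benchmarks: dict, national_hs_median: float) -> float:
    """Use the Section 84001 benchmark when the CIP code is listed;
    otherwise fall back to the national high-school median."""
    return statutory_benchmarks.get(cip_code, national_hs_median)

def low_earning_outcome(yearly_results: list) -> bool:
    """A program becomes a 'low-earning outcome program' if it fails the
    measure in 2 of the 3 years in which the measure is calculated."""
    fails = sum(1 for passed in yearly_results if not passed)
    return fails >= 2
```

For example, a program that fails in two of three calculated years (`[False, True, False]`) would be flagged, while one failing in only one year would not.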

Why do institution-assigned CIP codes matter?

Institutions assign CIP codes to their own programs. There is no external auditor confirming that similarly named majors at different colleges share the same CIP code. This creates meaningful comparability limitations:

  • A "Communication Studies" program could be coded under communication (09.01), journalism (09.04), or multi/interdisciplinary studies (30), depending on how a college structures it
  • Area studies, language programs, and interdisciplinary humanities fields are especially variable
  • Two institutions teaching nearly identical curricula may face different outcomes simply because of their coding choices

This tool flags program groupings in CIP families where classification variability is especially common.

What does "Insufficient public data" mean?

159,938 program groupings (76% of the dataset) carry an "Insufficient public data" designation. This means DOE's public file either does not include an earnings test result for this grouping, or the earnings data was suppressed for privacy reasons (typically fewer than 30 completers with earnings data). Absence of a fail designation is not the same as passing the test.

How is IPEDS completions data used?

This tool does not yet incorporate IPEDS 6-digit completions data. When available, it will be used as a structural bridge — not as an earnings file — to identify which 6-digit programs likely sit inside each 4-digit CIP grouping at a given institution. Any such bridge data will be clearly labeled as separate from DOE's earnings modeling.

How are city and metro assignments constructed?

City assignments come directly from the institution's address in the PPD:2026 geographic supplement. Metro area (CBSA) assignments also come from that supplement and reflect Census Bureau Core Based Statistical Area delineations (2023 vintage). Institutions without a CBSA assignment — typically those in rural areas or with non-standard geographic designations — are listed as "Not applicable" and do not appear in metro-level views.

Risk labels used in this tool

DOE-modeled fail
The public PPD:2026 file's master fail flag (mstr_obbb_fail_cip2_wageb) is set to 1 for this grouping. This is the corrected flag DOE released on January 2.
Passes DOE model
The master fail flag is set to 0. The program's median earnings meet or exceed the benchmark in DOE's modeling. This does not guarantee the program would pass under a final rule.
Insufficient public data
The master fail flag is null or the earnings test was not calculable for this grouping in the public file. This includes privacy-suppressed results and groupings with too few completers to test.
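The three labels above reduce to a simple mapping on the master fail flag. The flag name comes from the public file; the mapping function itself is this tool's convention, sketched here for clarity:

```python
# Minimal sketch of how the three risk labels map onto the public file's
# master fail flag. The mapping is this tool's labeling, not DOE's.

def risk_label(mstr_obbb_fail_cip2_wageb) -> str:
    if mstr_obbb_fail_cip2_wageb == 1:
        return "DOE-modeled fail"
    if mstr_obbb_fail_cip2_wageb == 0:
        return "Passes DOE model"
    # Null / not calculable / privacy-suppressed results all land here:
    return "Insufficient public data"
```

Note that the third branch is a catch-all: a null flag never implies a pass.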

What this tool should not be used for

  • Determining which specific majors will lose Title IV eligibility
  • Comparing similarly named programs across institutions as if they are equivalent
  • Treating "Passes DOE model" as a guarantee of regulatory compliance
  • Treating IPEDS completions (when added) as if it were DOE's earnings file
  • Any public-facing reporting without independent verification and additional reporting

The test for whether this tool is working

A reporter using this tool correctly should be able to say: "Here are the broad program groupings at colleges in my area that DOE's own modeling suggests may be exposed. Here are the caveats. Here are the questions I should ask next."

A reporter using it incorrectly would conclude: "This tool tells me which exact majors will lose aid."

Data sources: DOE PPD:2026 rulemaking file · DOE geographic supplement (Jan 7 release) · Census Bureau CBSA delineations (2023 vintage) · IPEDS 6-digit completions (not yet incorporated)