Freddie Mac AI Mandate Compliance Checklist: What Mortgage Companies Must Do Now
Freddie Mac Bulletin 2025-16, issued December 3, 2025, rewrote the rules for artificial intelligence in mortgage lending. As of March 3, 2026, every approved seller/servicer must operate an auditable AI governance program under the updated Section 1302.8 of the Seller/Servicer Guide. If your company sells loans to or services loans on behalf of Freddie Mac and uses any form of AI or machine learning, this mandate applies to you.
The deadline has passed. If you have not built or documented your AI governance framework, you are already out of compliance. This article breaks down exactly what the bulletin requires, provides a point-by-point compliance checklist, and maps out a catch-up plan for mortgage companies that still need to act.
What Freddie Mac Bulletin 2025-16 Requires
Bulletin 2025-16 updates Section 1302.8 of Freddie Mac's Single-Family Seller/Servicer Guide. The update moves beyond general information security policies into an explicit requirement: approved mortgage companies must maintain a comprehensive governance framework for the responsible development, deployment, and oversight of AI and machine learning systems.
The scope is broad. Any AI or machine learning used in connection with loan origination or servicing falls under the mandate. That includes automated underwriting models, document classification and extraction tools, automated valuation models (AVMs), fraud detection systems, quality control automation, borrower-facing chatbots, and servicing decision engines. If your loan origination system, QC platform, or document processing vendor embeds AI, you are covered. For lenders already using AI in underwriting, the compliance implications are significant — see our guide on how AI is revolutionizing mortgage underwriting.
The framework centers on three principles that Freddie Mac spells out explicitly: transparency, accountability, and ethical stewardship. In practical terms, this means you need to know where AI operates across your organization, who is responsible for governing it, and how you monitor and control it over time.
Senior Management Accountability
The updated guide requires senior management sign-off on AI policies. That means your Chief Information Officer, Chief Technology Officer, or Chief Risk Officer must formally approve the governance program. This is not a checkbox exercise. Freddie Mac expects named accountability at the leadership level.
Auditable Program
Your AI governance program must be auditable. That means documented policies, procedures, and practices with evidence trails. Freddie Mac expects compliance with standards like NIST 800-53 and ISO 27001. If you cannot demonstrate your AI governance during an examination, you have a problem.
The March 3, 2026 deadline has already passed. Mortgage companies that have not documented their AI governance program are technically non-compliant today. Freddie Mac has made AI governance an examination-ready requirement, meaning auditors can now ask for your AI inventory, governance policies, risk assessments, and oversight documentation during any review. GSE compliance is not optional. Loss of approved seller/servicer status is an existential threat for any mortgage operation that sells to or services for Freddie Mac.
Why This Mandate Changes Everything for Mortgage AI
Before December 2025, AI governance in mortgage lending was a best practice. Forward-thinking compliance teams built governance frameworks because it was the responsible thing to do. After Bulletin 2025-16, AI governance is a GSE requirement baked into your seller/servicer agreement.
This shift matters for three reasons.
First, GSE compliance is non-negotiable. If Freddie Mac determines that a seller/servicer is not meeting the requirements of Section 1302.8, the consequences can range from remediation requirements to suspension or revocation of approved status. For mortgage companies that depend on the secondary market, losing seller/servicer status would effectively shut down operations.
Second, the requirement applies to vendor AI, not just tools you build internally. Freddie Mac's framing explicitly covers AI embedded in third-party platforms. If your loan origination system uses machine learning for document classification or your QC vendor uses AI-powered fraud detection, those fall under the mandate. Most mortgage companies have far more AI exposure through vendors than through internal development.
Third, this mandate sets the floor, not the ceiling. The GAO published report GAO-25-107201 in September 2025, urging FHFA to provide clearer fair lending guidance for AI-powered mortgage technology. State-level AI regulation is accelerating, with Colorado's AI Act (SB 24-205) taking effect June 30, 2026. Federal agencies are converging on the same expectations. Freddie Mac's requirements today will be the minimum standard across the industry by 2027.
The Complete Compliance Checklist
This checklist maps to the specific requirements in Section 1302.8 of Freddie Mac's updated Seller/Servicer Guide. Each category addresses a distinct compliance area. Work through them in order.
AI Inventory
- Catalog every AI and machine learning tool, model, and system used in connection with loan origination or servicing
- Include vendor-embedded AI (document recognition in your LOS, AVMs, fraud detection, QC automation, chatbots)
- Document what each AI system does, what data it accesses, and what decisions it influences
- Classify each system by risk level (high risk: underwriting decisions, fair lending impact; medium risk: document processing; lower risk: internal communications)
- Update this inventory at least quarterly and after any major vendor or system change
Governance Framework
- Establish a written AI governance policy approved by senior management (CIO, CTO, or CRO)
- Define roles and responsibilities for AI oversight, including who owns each AI system and who reviews decisions
- Create an AI oversight committee or assign clear governance responsibility to an existing risk committee
- Document escalation procedures for AI-related incidents, failures, or concerns
- Align policies with recognized frameworks (NIST AI Risk Management Framework, ISO 27001, NIST 800-53)
Risk Assessment
- Conduct bias testing on all AI models that affect lending decisions, pricing, or borrower outcomes
- Perform fair lending analysis to identify potential disparate impact across protected classes
- Validate AI model performance against documented benchmarks on a regular schedule
- Assess AI-specific threats: model inversion, data poisoning, prompt injection, deepfake-generated documents
- Document all risk assessment results and remediation actions taken
Human Oversight
- Define decision review thresholds where human review is mandatory before AI recommendations become final
- Establish override procedures so qualified staff can reverse or modify AI-generated decisions
- Create exception handling workflows for cases where AI outputs are uncertain, conflicting, or flagged
- Train staff on how to evaluate and challenge AI recommendations
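In practice, a decision review threshold can be as simple as a gate that routes low-confidence or high-impact AI outputs to a human reviewer. The sketch below illustrates the idea; the specific thresholds and routing rules are assumptions for this example, and each lender would set its own under its governance policy.

```python
def route_decision(ai_recommendation: str, confidence: float,
                   loan_amount: float,
                   confidence_floor: float = 0.90,
                   amount_ceiling: float = 500_000) -> str:
    """Route an AI recommendation to auto-processing or mandatory
    human review. Thresholds here are illustrative placeholders."""
    if ai_recommendation == "deny":
        return "human_review"      # adverse outcomes always reviewed
    if confidence < confidence_floor:
        return "human_review"      # model confidence below floor
    if loan_amount > amount_ceiling:
        return "human_review"      # high-impact decision
    return "auto_proceed"
```

The point of encoding the gate explicitly is auditability: an examiner can read the routing rules directly, and every branch that forces human review maps to a documented policy threshold.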
Vendor Management
- Require third-party vendors to disclose all AI and ML components in their products
- Include AI governance clauses in vendor contracts (model transparency, audit rights, change notification, data handling)
- Assess vendor AI risk as part of your third-party risk management program
- Require vendors to comply with the same governance standards you apply internally
- Monitor vendor AI updates and model changes that could affect loan quality or compliance
Documentation and Audit Trail
- Log all AI-influenced decisions with sufficient detail for post-hoc review
- Track AI model versions, updates, and configuration changes over time
- Maintain records of all testing, validation, and bias assessment results
- Ensure documentation meets Freddie Mac examination standards and can be produced on request
- Retain records according to your regulatory retention schedule (minimum 3-5 years for most mortgage records)
Borrower Communication
- Update adverse action notice processes to provide specific, accurate reasons when AI influences credit decisions (per CFPB Circular 2023-03)
- Prepare disclosure language for borrowers when AI plays a material role in underwriting, pricing, or servicing decisions
- Ensure AI-generated communications to borrowers meet TRID, ECOA, and state-level disclosure requirements
"There are no exceptions to the federal consumer financial protection laws for new technologies."
Consumer Financial Protection Bureau, August 2024
Fair Lending and AI: The CFPB Connection
Freddie Mac's AI governance mandate does not exist in isolation. It intersects directly with the CFPB's fair lending framework, which has been tightening around AI for three years.
In September 2023, the CFPB issued guidance clarifying that lenders using AI or complex models for credit decisions must provide specific, accurate reasons in adverse action notices. Generic bucket categories are not sufficient. If your AI model denies a loan based on cash advance patterns, the adverse action notice must say that, not point to a vague "insufficient credit history" template.
In June 2024, the CFPB, along with the Federal Reserve, FDIC, NCUA, OCC, and FHFA, approved a rule requiring companies that use algorithmic appraisal tools to implement safeguards ensuring accuracy, preventing data manipulation, avoiding conflicts of interest, and complying with nondiscrimination laws.
The practical implication: your Freddie Mac AI governance program must also address fair lending. Bias testing is not optional. If your AI models touch underwriting, pricing, or appraisals, you need documented evidence that they do not produce discriminatory outcomes. The GAO's September 2025 report (GAO-25-107201) specifically recommended that FHFA provide clearer fair lending guidance as AI reshapes homebuying, signaling that more prescriptive requirements are coming.
For mortgage companies, the action item is straightforward: integrate fair lending testing into your AI risk assessment process. Test for disparate impact across race, ethnicity, gender, age, and other protected classes. Document your methodology and results. If you find bias, remediate and retest before deploying the model.
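One common first-pass disparate impact screen is the four-fifths (80%) rule, originally from the EEOC's selection guidelines and often borrowed as an initial check in lending analysis: compare the protected group's approval rate to the reference group's and flag ratios below 0.8. It is a screen, not a complete fair lending analysis. A minimal sketch:

```python
def adverse_impact_ratio(approved_protected: int, total_protected: int,
                         approved_reference: int, total_reference: int) -> float:
    """Ratio of the protected group's approval rate to the
    reference group's approval rate."""
    rate_protected = approved_protected / total_protected
    rate_reference = approved_reference / total_reference
    return rate_protected / rate_reference

# Example: 60/100 vs 80/100 approvals gives a ratio of 0.75,
# below the 0.8 screen, so this model would be flagged for review
ratio = adverse_impact_ratio(60, 100, 80, 100)
flagged = ratio < 0.8
```

A flagged result is a trigger for deeper analysis (regression-based testing, review of model features), not proof of discrimination on its own.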
Vendor AI: The Blind Spot Most Mortgage Companies Miss
Here is where most mortgage companies underestimate their exposure. The AI governance mandate covers vendor AI, and most mortgage operations run more vendor AI than they realize.
Your loan origination system likely uses AI for document classification and data extraction. Your AVM provider uses machine learning models trained on historical property data. Your QC platform may use AI to flag defects. Your fraud detection tools use pattern recognition models. Your servicing platform may use AI for loss mitigation decisioning. Even your borrower-facing chatbot or communication tools may use generative AI models.
According to the National Mortgage News Emerging Tech and AI Survey (2025), 63% of mortgage lenders reported using AI for document classification and indexing, 54% for document reading, and 21% for underwriting decisions. The majority of this AI runs inside vendor platforms, not custom-built internal systems.
Freddie Mac's mandate requires you to govern all of it. That means:
- Inventory vendor AI by requesting disclosure from every technology vendor about AI/ML components in their products
- Assess vendor AI risk as part of your third-party risk management program, aligned with OCC Bulletin 2023-17 and FFIEC guidance
- Update vendor contracts to include AI transparency clauses, audit rights, change notification requirements, and GSE compliance obligations
- Monitor vendor AI changes because a vendor model update could change how your loans are processed, scored, or reviewed
If your vendors cannot or will not provide the transparency Freddie Mac requires, that is a risk you need to document and escalate. FHFA's recent decision to terminate its own AI vendor contract shows how seriously regulators are taking AI vendor risk, even at the agency level.
Implementation Timeline: From Zero to Compliant
The deadline has passed. If you are starting from zero, here is a prioritized catch-up plan. The goal is to demonstrate good-faith progress and get to a defensible compliance posture as quickly as possible.
Week 1-2: Governance Foundation
- Draft and get senior management approval on your AI governance policy
- Assign a governance owner (CIO, CTO, CRO, or designated compliance lead)
- Establish or designate an AI oversight committee
- Begin your AI inventory with internal systems first
Week 3-4: AI Inventory and Vendor Outreach
- Complete your internal AI inventory (every AI tool, model, and system)
- Send AI disclosure requests to all technology vendors
- Classify AI systems by risk level
- Begin documenting data flows and decision points for high-risk AI systems
Week 5-6: Risk Assessment
- Conduct initial bias testing on high-risk AI models (underwriting, pricing, appraisals)
- Perform AI-specific threat assessment (data poisoning, model inversion, prompt injection)
- Document all risk findings and create remediation plans for identified issues
- Review and update adverse action notice processes for AI-influenced decisions
Week 7-8: Operationalize
- Finalize human oversight procedures and decision review thresholds
- Update vendor contracts with AI governance clauses
- Train staff on AI governance policies and their specific responsibilities
- Conduct tabletop exercise simulating an AI governance audit
Ongoing: Monitor and Maintain
- Quarterly AI inventory updates
- Regular bias testing and model validation cycles
- Annual AI governance policy review and senior management re-approval
- Include AI-specific threats in your annual security awareness training (a specific Bulletin 2025-16 requirement)
- Monitor regulatory developments from FHFA, CFPB, and state regulators
This timeline assumes a mid-size mortgage company with 50-500 employees. Larger organizations with more complex AI footprints may need additional time, but the priority order remains the same: governance policy first, inventory second, risk assessment third, ongoing monitoring last.
Colorado's AI Act (SB 24-205), signed into law May 2024 and taking effect June 30, 2026, will require mortgage companies using AI for eligibility, pricing, or fraud detection to provide disclosure about how AI contributed to decisions, the data types and sources used, and correction and appeal opportunities. The Mortgage Bankers Association has flagged that unclear AI definitions in state laws could put lenders at regulatory risk. Building your Freddie Mac compliance framework now positions you to meet Colorado and other state-level requirements as they take effect.
How ABT Helps Mortgage Companies Meet AI Governance Requirements
Access Business Technologies has managed technology environments for mortgage companies since 1999. As the largest Tier-1 Microsoft Cloud Solution Provider primarily dedicated to financial services, ABT works with hundreds of mortgage companies navigating the intersection of technology, compliance, and regulation.
Freddie Mac's AI governance mandate adds a new layer to what mortgage companies must manage. The technology environment where AI operates, from your Microsoft 365 tenant to your LOS integrations, needs the same governance discipline Freddie Mac now requires for AI itself. ABT's Guardian platform provides the control layer that manages, monitors, and secures the infrastructure AI runs on.
For mortgage companies facing the Bulletin 2025-16 requirements, the challenge is not just writing a governance policy. It is building the operational capability to inventory vendor AI across your technology stack, maintain audit trails, monitor for AI-specific threats, and produce documentation when examiners ask for it. That is where a managed services partner with deep mortgage compliance experience makes the difference.
ABT also helps mortgage companies manage the broader AI risk management frameworks that underpin GSE-level requirements, including alignment with NIST standards and federal regulatory guidance.
Related reading for your compliance team: TRID Compliance IT Checklist and Encompass Cloud Hosting Configuration Guide.
Need Help Meeting Freddie Mac's AI Governance Requirements?
ABT has helped mortgage companies meet GSE compliance requirements for over 25 years. Talk to our team about building the AI governance framework, vendor management processes, and technology controls that Bulletin 2025-16 demands.
Frequently Asked Questions
What does Freddie Mac Bulletin 2025-16 require?
Freddie Mac Bulletin 2025-16, issued December 3, 2025, updates Section 1302.8 of the Seller/Servicer Guide to require approved seller/servicers to establish a comprehensive governance framework for AI and machine learning systems used in loan origination or servicing. It mandates transparency, accountability, senior management oversight, and auditable documentation of AI governance practices.
When did the requirements take effect?
The Freddie Mac AI governance requirements took effect March 3, 2026. The bulletin was issued December 3, 2025, giving seller/servicers approximately 90 days to establish or update their AI governance programs. Any loan with an application received on or after March 3, 2026 must be processed under a compliant governance framework.
Which AI systems does the mandate cover?
The mandate covers any AI or machine learning used in connection with loan origination or servicing. This includes automated underwriting models, document classification and extraction, automated valuation models, fraud detection, quality control automation, borrower communication chatbots, servicing decision engines, and any vendor-embedded AI tools in your technology stack.
Do mortgage companies have to disclose AI use to borrowers?
Freddie Mac's mandate requires transparency about AI use within the governance framework. When AI materially influences lending decisions, existing CFPB requirements under ECOA and Regulation B already require specific, accurate adverse action notices. State laws like Colorado's AI Act add further borrower notification requirements. Mortgage companies should prepare disclosure language for AI-assisted decisions.
Does the mandate apply to vendor AI?
The mandate covers vendor-embedded AI alongside internally developed tools. Mortgage companies must inventory all third-party AI, require vendor disclosure of AI components, include governance clauses in contracts, assess vendor AI risk within their third-party risk management program, and monitor vendor model changes that could affect loan quality or compliance.
What are the consequences of non-compliance?
Non-compliance with Section 1302.8 can result in remediation requirements, increased examination scrutiny, suspension of seller/servicer approval, or revocation of approved status. Loss of Freddie Mac seller/servicer approval would prevent a mortgage company from selling loans to the GSE, effectively cutting off access to the secondary market.
Has Fannie Mae issued similar requirements?
As of March 2026, Fannie Mae has not issued an equivalent bulletin mandating AI governance for sellers and servicers. However, FHFA published its 2025 AI Compliance Plan covering both GSEs, and the GAO recommended FHFA provide written fair lending guidance for AI to both Fannie Mae and Freddie Mac. Mortgage companies should expect similar Fannie Mae requirements in the near term.