How to Write an AI RFP: Template, Sections, and Best Practices for 2026
Author: ZTABS Team
If you are procuring AI development services, knowing how to write an AI RFP is the difference between getting proposals you can actually compare and getting a pile of vague promises. A standard software RFP does not work for AI projects. The uncertainty is higher, the deliverables are less predictable, and vendor capabilities vary wildly — from agencies that repackaged web development as "AI" to teams with dozens of production deployments.
This guide walks you through every section of an AI RFP (Request for Proposal), explaining what to include, how to evaluate responses, and which mistakes derail procurement. Whether you are sourcing a conversational AI agent, a document processing pipeline, or a custom ML model, the structure here applies.
Why AI Projects Need a Different RFP
Traditional software RFPs assume predictable scope: build X features with Y technology on Z timeline. AI projects do not work that way. Here is why your RFP needs to account for this.
Uncertainty is built into the process. You may not know which model architecture performs best until you test three of them. A retrieval-augmented generation (RAG) pipeline that works perfectly on clean data may need significant rework when it encounters your actual documents. The RFP needs to allow for discovery and iteration.
Data is a first-class dependency. In traditional software, data is something you store and display. In AI, data determines whether the system works at all. Your RFP must describe the data you have, its quality, and its accessibility — or vendors will either overbid to cover the risk or underbid because they assumed the data was clean.
"Done" is a spectrum. A chatbot that answers 70% of questions correctly is a different product than one that answers 95% correctly. The cost difference between those two targets can be 3–5x. Your RFP needs measurable success criteria, not just feature lists.
Ongoing costs matter as much as build costs. LLM API fees, model retraining, monitoring, and infrastructure create recurring expenses that often exceed the original development cost within 18 months. A good RFP asks vendors to address the total cost of ownership.
If you have not yet defined what your AI project should accomplish, start with a project brief before writing the RFP.
Essential RFP Sections
Every AI RFP should include these ten sections. Missing even one leads to proposals that are impossible to compare.
- Project Overview — What you want to build and why
- Business Problem — The specific problem and its measurable cost
- Success Metrics — How you will measure whether the project succeeded
- Data Overview — What data is available, its quality, and access methods
- Technical Requirements — Constraints, integrations, security, and compliance
- Team and Governance — Internal stakeholders, decision-making process, and access
- Timeline — Key milestones and hard deadlines
- Budget Range — Realistic budget parameters
- Evaluation Criteria — How you will score proposals
- Proposal Format — What vendors should include in their response
Section-by-Section Guide
1. Project Overview
State what you want to build in plain language. Avoid jargon-heavy descriptions that assume the vendor knows your industry. Include:
- One-sentence summary: "We need an AI system that [does X] for [audience] using [data source]."
- Background: Why this project exists now (competitive pressure, cost problem, customer demand, strategic initiative).
- Scope boundaries: What is explicitly out of scope for this engagement.
Example:
We need an AI-powered document processing system that extracts key terms, dates, and obligations from commercial lease agreements and populates our property management database. We process approximately 200 leases per month. The current manual process takes 45 minutes per lease and has a 12% error rate.
2. Business Problem
Quantify the problem you are solving. Vendors use this to calibrate their solution and justify pricing. Include:
- Current process description (who does it, how long it takes, what it costs)
- Pain points (errors, delays, scalability limits, compliance risk)
- Business impact (revenue lost, hours wasted, customer churn)
Be specific. "We want to improve efficiency" tells a vendor nothing. "Our support team spends 1,200 hours per month answering repetitive questions that could be automated" gives them something to design against.
3. Success Metrics
Define what success looks like using measurable criteria. Without this, you will argue with the vendor about whether the project is "done."
| Metric | Target | How Measured |
|--------|--------|--------------|
| Accuracy | ≥ 92% correct extractions | Manual audit of 100 random samples monthly |
| Processing time | < 3 minutes per document | System logs |
| User adoption | > 80% of team using within 60 days | Usage analytics |
| Cost reduction | 50% reduction in manual processing cost | Before/after comparison |
| Uptime | 99.5% availability | Monitoring dashboard |
Include both hard requirements (must-have thresholds) and stretch goals (nice-to-have improvements).
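Success criteria like these can also be expressed as machine-checkable thresholds, which turns acceptance testing into a pass/fail report instead of a debate. A minimal Python sketch, assuming the metric names and targets from the example table above (nothing here is a required format):

```python
# Hypothetical success-metric check mirroring the example table.
# Each entry: target threshold plus the direction of "better".

HARD_REQUIREMENTS = {
    "accuracy_pct":        {"target": 92.0, "higher_is_better": True},
    "processing_time_min": {"target": 3.0,  "higher_is_better": False},
    "adoption_pct":        {"target": 80.0, "higher_is_better": True},
    "uptime_pct":          {"target": 99.5, "higher_is_better": True},
}

def evaluate(measured: dict) -> dict:
    """Return pass/fail per metric for one set of measured values."""
    results = {}
    for name, spec in HARD_REQUIREMENTS.items():
        value = measured[name]
        if spec["higher_is_better"]:
            results[name] = value >= spec["target"]
        else:
            results[name] = value <= spec["target"]
    return results

# Example: a monthly audit snapshot (values invented for illustration).
snapshot = {"accuracy_pct": 93.1, "processing_time_min": 2.4,
            "adoption_pct": 76.0, "uptime_pct": 99.7}
print(evaluate(snapshot))
# adoption_pct fails (76.0 < 80.0); the other three metrics pass.
```

Stretch goals can be kept in a separate dictionary with the same shape, so vendors see clearly which thresholds gate acceptance and which are aspirational.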
4. Data Overview
This is the section most AI RFPs get wrong — or skip entirely. Vendors need to understand your data to estimate effort accurately.
What to include:
- Data types: Text documents, database records, images, audio, structured/unstructured
- Volume: How much data is available for training or processing
- Quality: Is it clean, labeled, consistent? Are there known issues?
- Access: How will the vendor access the data? API, database connection, file transfer, VPN?
- Sensitivity: Does the data contain PII, PHI, financial records, or trade secrets?
- Labeling: Is any data already labeled or annotated for ML purposes?
- Historical data: How far back does the data go?
Example data summary:
We have 15,000 historical lease agreements in PDF format (2018–present). Approximately 60% are scanned documents (require OCR), 40% are digital PDFs. Documents range from 5 to 120 pages. No existing labels or annotations. Data is stored in SharePoint and accessible via API. Documents contain tenant PII (names, SSNs, financial data). All processing must occur within our Azure environment.
5. Technical Requirements
Specify constraints the vendor must work within. Be precise about what is non-negotiable and what is flexible.
Cover these areas:
- Cloud/hosting: Must deploy on AWS/Azure/GCP, on-premises, or flexible?
- Integration points: Systems the AI must connect to (CRM, ERP, databases, APIs)
- Security requirements: SOC 2, HIPAA, GDPR, data residency, encryption
- Performance: Latency requirements, throughput, concurrent users
- Language/region: Languages the system must support
- Existing tech stack: Relevant infrastructure already in place
- Data handling: Where data can and cannot be sent (no external API calls, no third-party model providers, etc.)
Do not over-specify the solution architecture unless you have a strong technical reason. Telling vendors which LLM to use or which vector database to deploy limits their ability to propose the best approach.
6. Team and Governance
Vendors need to understand who they will work with and how decisions get made.
- Project sponsor: Who owns the budget and has final approval
- Day-to-day contact: Who the vendor works with daily
- Technical contact: Who provides access to systems, data, and infrastructure
- Subject matter experts: Who can answer domain questions and validate output
- Approval process: How many people need to approve deliverables, and what is the expected turnaround
Also specify your preferred collaboration model (Agile sprints, fixed milestones, time-and-materials) and communication cadence (weekly standups, bi-weekly demos).
7. Timeline
Provide a realistic timeline. AI projects typically require:
- Discovery/scoping: 2–4 weeks
- Data preparation: 2–8 weeks (often underestimated)
- Development/iteration: 6–16 weeks
- Testing and evaluation: 2–6 weeks
- Deployment and handoff: 2–4 weeks
Sample timeline structure:
| Phase | Duration | Key Deliverable |
|-------|----------|-----------------|
| Vendor selection | 4–6 weeks | Signed contract |
| Discovery | 2–3 weeks | Technical specification, data assessment |
| MVP / proof of concept | 4–6 weeks | Working prototype with core functionality |
| Iteration and refinement | 4–8 weeks | Production-ready system meeting success metrics |
| Deployment and training | 2–3 weeks | Live system, documentation, team training |
| Warranty/support | 4–12 weeks | Bug fixes, performance tuning |
Include hard deadlines if they exist (regulatory deadlines, product launches, board presentations) and flag them clearly.
8. Budget Range
Share a budget range. Yes, really.
Vendors without a budget range either overbid (to be safe) or underbid (to win, then change-order you later). A range like "$80,000–$150,000 for MVP plus first year of maintenance" tells vendors whether the project is feasible and helps them calibrate scope.
If you genuinely have no idea what AI development costs, read our AI agent development cost breakdown to establish a baseline before writing the RFP.
9. Evaluation Criteria
Tell vendors exactly how you will score their proposals. This forces better responses and makes your selection defensible.
Example weighting:
| Criterion | Weight |
|-----------|--------|
| Technical approach and architecture | 25% |
| Team experience and qualifications | 20% |
| Relevant AI project portfolio | 20% |
| Pricing and value | 15% |
| Timeline and milestones | 10% |
| References and case studies | 10% |
Publish this in the RFP. Vendors who know the scoring criteria will address what matters to you instead of padding proposals with irrelevant content.
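Once reviewers score each criterion, the weighting can be applied mechanically. A hedged Python sketch, assuming a 1–5 scoring scale; the weights mirror the example above, and the vendor scores are invented for illustration:

```python
# Weighted proposal scoring using the example criteria and weights.
# Reviewer scores (1-5 scale) are hypothetical.

WEIGHTS = {
    "technical_approach": 0.25,
    "team_experience":    0.20,
    "portfolio":          0.20,
    "pricing":            0.15,
    "timeline":           0.10,
    "references":         0.10,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (1-5) into one weighted total."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

vendor_a = {"technical_approach": 4, "team_experience": 5, "portfolio": 3,
            "pricing": 4, "timeline": 4, "references": 5}
vendor_b = {"technical_approach": 5, "team_experience": 3, "portfolio": 4,
            "pricing": 3, "timeline": 5, "references": 4}

print(f"Vendor A: {weighted_score(vendor_a):.2f}")  # Vendor A: 4.10
print(f"Vendor B: {weighted_score(vendor_b):.2f}")  # Vendor B: 4.00
```

Averaging scores across multiple independent reviewers before weighting reduces the influence of any single evaluator's bias.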
10. Proposal Format
Standardize responses so you can compare them. Specify:
- Page limit: 15–25 pages is reasonable for an AI project
- Required sections: Mirror your evaluation criteria
- Pricing format: Fixed price, time-and-materials, phased pricing, or a combination
- Technical approach: Architecture diagram, technology choices with justifications
- Team bios: Specific people who will work on the project, not company-wide capabilities
- Case studies: 2–3 relevant past projects with measurable outcomes
- References: At least 2 clients the evaluators can contact
- Questions: A section for vendor questions and assumptions
Evaluation Criteria: How to Score Responses
Once proposals arrive, score them systematically. Here is what to look for in each area.
Technical Approach (25%)
- Does the proposed architecture make sense for your problem?
- Did they explain model selection rationale, not just name-drop GPT-4?
- Do they address data preparation, testing, and monitoring — not just development?
- Is there a clear plan for handling edge cases and failures?
- Did they identify risks and propose mitigations?
Team Experience (20%)
- Are the proposed team members named, with relevant backgrounds?
- Do they have AI/ML experience, or are they web developers pivoting to AI?
- Will senior people do the work, or just sell it?
- What is the team's availability and allocation?
For deeper vetting, use our list of 25 questions to ask an AI development company during the evaluation stage.
Past AI Work (20%)
- Can they show production AI systems (not just demos or prototypes)?
- Are the case studies relevant to your industry or problem type?
- Do they share measurable outcomes (accuracy rates, cost savings, processing improvements)?
- Can you verify the work with references?
Pricing Model (15%)
- Is the pricing transparent and broken down by phase?
- Do they account for ongoing costs (hosting, API fees, maintenance)?
- Is the pricing model appropriate for the work type? (Fixed price for well-defined scope, T&M for discovery-heavy work)
- Are there hidden assumptions that could inflate costs later?
References (10%)
- Contact at least two references for your top candidates
- Ask references about communication, deadline adherence, handling of scope changes, and post-launch support
- Ask specifically about AI performance in production vs. what was promised
AI-Specific Requirements to Include
Standard RFPs miss these. Include them to get proposals that address the full AI lifecycle.
Model Selection Approach
Ask vendors to explain how they will choose the right model — not just which model they plan to use. Good vendors evaluate multiple options (open-source vs. proprietary, model size vs. cost vs. accuracy tradeoffs) and can justify their recommendation.
Data Handling and Privacy
Specify exactly where your data can go. Key questions:
- Can data be sent to external APIs (OpenAI, Anthropic, Google)?
- Must all processing happen within your cloud environment?
- What data retention policies must the vendor follow?
- How will training data be handled after the project ends?
Testing and Evaluation Framework
Require a testing plan that goes beyond "we will test it." Ask for:
- Evaluation datasets: How will accuracy be measured? Against what benchmark?
- Edge case testing: How will the system handle unusual or adversarial inputs?
- Regression testing: How will updates be validated against prior performance?
- User acceptance testing: What does the UAT plan look like?
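To make terms like "evaluation dataset" and "regression testing" concrete for vendors, it can help to include a toy example of the workflow you expect. The Python sketch below is purely illustrative: the dataset, the `predict` interface, and the pass threshold are all hypothetical:

```python
# Sketch of a regression check: an updated system must not score below
# the prior release on a fixed evaluation dataset. All names are invented.

EVAL_SET = [  # (input document snippet, expected extraction)
    ("Lease commences 2024-01-01", "2024-01-01"),
    ("Rent due on the 1st of each month", "1st of each month"),
    ("Term of five (5) years", "five (5) years"),
]

def accuracy(predict, dataset) -> float:
    """Fraction of examples where the system's output matches the label."""
    correct = sum(1 for text, expected in dataset if predict(text) == expected)
    return correct / len(dataset)

def check_no_regression(predict, baseline_accuracy: float) -> bool:
    """A new release passes only if it meets or beats the prior release."""
    return accuracy(predict, EVAL_SET) >= baseline_accuracy
```

In practice the evaluation set would hold hundreds of labeled examples drawn from your real documents, and the check would run automatically before any model or prompt update goes live.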
Monitoring and Observability
The system needs to be monitored after deployment. Require vendors to describe:
- How they will track model performance (accuracy drift, latency, error rates)
- Alerting mechanisms for performance degradation
- Logging and audit trails for compliance
- Dashboard or reporting for stakeholders
Retraining and Maintenance Plan
AI systems degrade over time as data patterns change. Ask vendors to address:
- When and how the model will be retrained
- How new data will be incorporated
- Who is responsible for ongoing model performance (vendor, your team, shared)
- Estimated ongoing maintenance costs (monthly/annual)
Common Mistakes That Derail AI Procurement
Being too prescriptive on technology
Specifying "must use GPT-4 with LangChain and Pinecone" in your RFP tells vendors you have already decided the architecture. This prevents experienced teams from proposing better alternatives. Describe the problem and constraints. Let vendors propose the solution.
No data section
Sending an RFP with no data description is like asking a construction company to bid on a building without showing them the lot. Every vendor will either make optimistic assumptions (leading to cost overruns) or pad their estimate significantly (making you overpay). Include the data overview section, even if the answer is "we have not assessed our data yet."
Fixed price for R&D-heavy work
AI projects often require experimentation. If you do not know whether your data supports the accuracy targets you want, a fixed-price contract creates misaligned incentives — the vendor will cut corners to stay on budget. For discovery and proof-of-concept phases, time-and-materials or phased contracts work better.
Ignoring ongoing costs
A system that costs $100,000 to build but $4,000 per month to operate costs $148,000 by the end of year one and $196,000 by end of year two. Require vendors to estimate total cost of ownership for at least 24 months.
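The arithmetic behind these figures is simple enough to spell out in the RFP so every vendor models it the same way. A sketch using the numbers from this example:

```python
def total_cost_of_ownership(build_cost: int, monthly_opex: int, months: int) -> int:
    """Build cost plus cumulative operating cost over the horizon."""
    return build_cost + monthly_opex * months

# Figures from the example above: $100,000 to build, $4,000/month to operate.
print(total_cost_of_ownership(100_000, 4_000, 12))  # 148000 after year one
print(total_cost_of_ownership(100_000, 4_000, 24))  # 196000 after year two
```

Asking vendors to fill in their own `build_cost` and `monthly_opex` estimates over the same 24-month horizon makes their bids directly comparable.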
No success metrics
Without measurable success criteria, there is no objective way to determine if the project delivered value. "Build a chatbot" is not a success metric. "Build a chatbot that resolves 60% of Tier 1 support tickets without human intervention within 90 days of launch" is.
Asking for fixed timelines on uncertain scope
Demanding a firm delivery date for a project that requires data exploration and model experimentation creates adversarial dynamics. Use milestone-based timelines where each phase has deliverables, and the next phase is scoped based on what was learned.
RFP Timeline and Process
A well-run AI RFP process takes 8–12 weeks from draft to signed contract. Rushing it leads to poor vendor selection.
Recommended timeline
| Week | Activity |
|------|----------|
| 1–2 | Draft RFP internally, gather input from stakeholders and technical team |
| 3 | Review and finalize RFP, identify vendor longlist (6–10 vendors) |
| 4 | Distribute RFP to vendors |
| 4–5 | Q&A period (allow vendors to ask clarifying questions, share answers with all) |
| 5–7 | Vendors prepare and submit proposals |
| 7–8 | Initial evaluation, create shortlist (2–4 vendors) |
| 8–9 | Vendor presentations and technical deep-dives |
| 9–10 | Reference checks, final evaluation |
| 10–11 | Negotiate terms and finalize contract |
| 11–12 | Contract signed, project kickoff |
Process tips
- Share Q&A publicly. When one vendor asks a clarifying question, share the answer with all vendors. This levels the playing field and improves all proposals.
- Require live presentations. A written proposal tells you about the company. A live presentation tells you about the team you will actually work with.
- Run a paid pilot. For high-stakes projects ($100K+), consider a paid 2–4 week pilot with your top 1–2 vendors before committing to a full engagement. This costs $5,000–$15,000 but can save you from a six-figure mistake.
- Check for cultural fit. AI projects require close collaboration. The vendor's communication style, responsiveness, and willingness to push back on bad ideas matter as much as technical ability.
Frequently Asked Questions
How long should an AI RFP be?
10–20 pages is the sweet spot. Shorter and you leave out critical information vendors need. Longer and you are likely over-specifying the solution instead of the problem. The RFP should be detailed enough that a qualified vendor can produce a meaningful proposal without a discovery call — but concise enough that they actually read the whole thing.
Should I include a budget range in the RFP?
Yes. Vendors who do not know your budget either price themselves out (bidding $200K when your budget is $80K) or price too low to win (then change-order you later). A range like "$75K–$150K including first year of support" helps vendors calibrate their proposal. If you are unsure about realistic budgets, our cost breakdown guide can help.
How many vendors should I invite?
Send the RFP to 5–8 vendors. Fewer than 5 limits your options. More than 8 creates an evaluation burden that slows down the process without improving outcomes. Build your longlist from referrals, industry directories, case studies, and research on AI development companies.
What if I do not have a technical team to evaluate proposals?
Hire an independent AI consultant to review proposals, or engage a technically strong member of your team even if AI is not their specialty. The goal is having someone who can evaluate whether the proposed architecture is reasonable, not someone who can build it themselves. You can also ask vendors to present their approach in non-technical terms as part of the proposal format.
Can I use this RFP structure for off-the-shelf AI tools?
This guide is designed for custom AI development engagements. If you are evaluating off-the-shelf AI tools or platforms, the procurement process is different — you are comparing existing capabilities rather than commissioning custom development. That said, sections on success metrics, data requirements, and total cost of ownership still apply.
Next Steps
Writing an AI RFP is not something you do in an afternoon. Budget a week to do it right, with input from the people who understand the business problem, the data, and the technical constraints.
To get started:
- Complete a project brief to clarify scope and requirements
- Assess your data readiness and document what you have
- Define 3–5 measurable success metrics
- Set a realistic budget range using cost benchmarks
- Use the section structure in this guide to draft your RFP
- Circulate internally for review before sending to vendors
If you want to skip the RFP process and talk directly to an AI development team that has built production AI systems across industries, get in touch with our team. We can help you scope the project, assess feasibility, and move to development faster — whether that starts with a project brief, a paid pilot, or a full engagement.
Explore our AI development services to see what we build and how we work.