Vendor evaluation scorecard with weighted criteria for a 6-figure software purchase
Prompt
You are a procurement lead at a mid-market company. Build a vendor evaluation scorecard for a major software purchase. The scorecard must reduce bias and produce a defensible decision.
Software category: [category]
Budget range: [budget]
Key stakeholders: [stakeholders]
Include:
1. Weighted evaluation criteria across 6 dimensions
2. Scoring scale (1–5) with concrete anchors
3. Stakeholder weighting (who scores what and why)
4. Dealbreaker criteria
5. Template for running the final decision meeting
Make the weighting logic explicit.
Example output
# Vendor Evaluation Scorecard — Customer Data Platform
**Category:** CDP for mid-market B2B SaaS
**Budget:** $180k–$300k annual
**Stakeholders:** Marketing Ops (primary), Engineering, Data, Security, Finance
---
## Weighted Criteria (Total 100 points)
| Dimension | Weight | Rationale |
|---|---|---|
| Product fit and core features | 25 | The core reason for the purchase |
| Integration ecosystem | 15 | 70% of value comes from connected tools |
| Time to value | 15 | We need to be in production within 90 days |
| Security and compliance | 15 | SOC 2, GDPR, PII handling are hard blockers |
| Total cost of ownership (3-year) | 15 | Hidden costs routinely double sticker price |
| Vendor viability and support | 15 | Implementation partners and roadmap confidence |
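The weighted total can be sketched as a simple weighted sum over the 1–5 scores, normalized so a perfect scorecard earns 100 points. This is a minimal sketch; the dimension keys and the example scores are illustrative, not from a real evaluation:

```python
# Weights from the table above (points out of 100).
WEIGHTS = {
    "product_fit": 25,
    "integrations": 15,
    "time_to_value": 15,
    "security": 15,
    "tco": 15,
    "vendor_viability": 15,
}

def weighted_total(scores: dict[str, int]) -> float:
    """Convert 1-5 scores into a 0-100 weighted total.

    Each dimension contributes weight * (score / 5), so a vendor
    scoring 5 on every dimension earns the full 100 points.
    """
    for dim, score in scores.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{dim}: score {score} outside the 1-5 scale")
    return sum(WEIGHTS[dim] * scores[dim] / 5 for dim in WEIGHTS)

# Hypothetical vendor: strong on fit and security, weak on TCO.
example = {
    "product_fit": 5, "integrations": 4, "time_to_value": 4,
    "security": 5, "tco": 2, "vendor_viability": 3,
}
print(weighted_total(example))  # 79.0
```

Normalizing by 5 keeps the published weights readable as points, so stakeholders can sanity-check a total against the table without extra arithmetic.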
---
## Scoring Scale with Anchors
### Product Fit
- **5** — Covers every must-have and 80% of nice-to-haves out of the box
- **4** — Covers all must-haves, requires light config for nice-to-haves
- **3** — Covers must-haves with documented workarounds
- **2** — Missing one must-have that requires custom engineering
- **1** — Missing multiple must-haves
### Integration Ecosystem
- **5** — Native integrations with 100% of our current stack and 2+ on our roadmap
- **4** — Native for current stack, roadmap items via API
- **3** — Covers current stack, some via middleware
- **2** — Requires custom integration work for key tools
- **1** — Closed ecosystem or poor API docs
### Time to Value
- **5** — Production use case live in 30 days with vendor professional services
- **3** — 60–90 days with standard implementation
- **1** — 6+ months or requires internal engineering sprint
### Security and Compliance
- **5** — SOC 2 Type II, ISO 27001, GDPR DPA ready, EU data residency
- **3** — SOC 2 Type II, can sign our DPA with minor edits
- **1** — Only self-attestation, no third-party audit
### Total Cost of Ownership (3-year)
- Score inversely proportional to all-in cost including implementation, training, and staffing
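One way to make the inverse-cost scoring concrete is linear scaling between the cheapest and most expensive finalists. The scorecard does not prescribe a specific formula, so treat this as one reasonable operationalization; the vendor names and costs are hypothetical:

```python
def tco_score(cost: float, min_cost: float, max_cost: float) -> float:
    """Map a 3-year all-in cost onto the 1-5 scale, inversely.

    The cheapest finalist anchors 5, the most expensive anchors 1,
    and costs in between scale linearly.
    """
    if max_cost == min_cost:
        return 5.0  # identical costs: price does not discriminate
    fraction = (cost - min_cost) / (max_cost - min_cost)
    return round(5 - 4 * fraction, 1)

# Hypothetical 3-year all-in costs (license + implementation + staffing).
costs = {"Vendor A": 540_000, "Vendor B": 780_000, "Vendor C": 900_000}
lo, hi = min(costs.values()), max(costs.values())
for vendor, cost in costs.items():
    print(vendor, tco_score(cost, lo, hi))
# Vendor A 5.0 / Vendor B 2.3 / Vendor C 1.0
```

Anchoring to the finalist set keeps the score relative, which matches how TCO is actually compared; a fixed budget-based anchor would be an alternative design.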
### Vendor Viability
- **5** — Public or late-stage, clear roadmap, named customer reference at our stage
- **3** — Series B/C, lukewarm references
- **1** — Early-stage, no comparable customer references
---
## Stakeholder Weighting
| Criterion | Primary Scorer | Secondary Scorer |
|---|---|---|
| Product fit | Marketing Ops | Data |
| Integrations | Engineering | Marketing Ops |
| Time to value | Marketing Ops | Engineering |
| Security | Security | Engineering |
| TCO | Finance | Procurement lead |
| Vendor viability | Procurement lead | Marketing Ops |
The primary scorer's number is the score of record. A secondary scorer who would have scored a criterion 2+ points differently flags it, and every flag triggers a discussion before the final meeting.
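The 2-point flag rule can be sketched as a quick check across both score sheets. Only the threshold comes from the rule above; the criteria and scores here are hypothetical:

```python
FLAG_THRESHOLD = 2  # a 2+ point gap between scorers triggers a discussion

def discrepancies(primary: dict[str, int], secondary: dict[str, int]) -> list[str]:
    """Return the criteria where primary and secondary scores differ by 2+ points."""
    return [
        criterion
        for criterion, p_score in primary.items()
        if abs(p_score - secondary[criterion]) >= FLAG_THRESHOLD
    ]

primary = {"product_fit": 5, "integrations": 4, "security": 3}
secondary = {"product_fit": 4, "integrations": 2, "security": 3}
print(discrepancies(primary, secondary))  # ['integrations']
```

Running this per vendor before the final meeting turns the anomalies discussion into a fixed agenda item rather than an open-ended debate.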
---
## Dealbreakers (Override Any Score)
- Cannot sign our DPA without redlines we consider unacceptable
- No SOC 2 Type II report available
- Requires more than 1 FTE in year one to maintain
- Pricing model penalizes growth (e.g. charges per contact above 2x expected volume)
---
## Final Decision Meeting Template
1. **Scores on screen** — each vendor's total and breakdown (5 min)
2. **Anomalies discussion** — any scores that varied significantly between primary and secondary scorers (15 min)
3. **Dealbreaker check** — any blockers that override the math (10 min)
4. **Reference check debrief** — what we heard from at least 2 customer references per finalist (15 min)
5. **Recommendation vote** — each voting stakeholder states their preference and one-sentence reason (10 min)
6. **Decision or next steps** — commit or name the specific follow-up needed
The score is guidance, not verdict. The meeting is where judgment enters the process.