---
name: proposal-writing-05-relevant-experience
description: Write the relevant experience section of a consulting proposal. Use when the user asks to draft past projects, project spotlights, similar assignments, or the experience section of a proposal.
---
Relevant Experience
Acknowledgement: Shared by Peter Bamuhigire, techguypeter.com, +256 784 464178.
Use When
- Use this skill to draft or revise the relevant experience section of a proposal or EoI.
- Load it when evaluator confidence depends on comparable project evidence.
Do Not Use When
- The task is unrelated to past performance or comparable assignments.
- The user only needs supporting domain knowledge rather than this section.
Required Inputs
- The assignment brief and any evaluation criteria on experience.
- The selected proposer profile.
- Verified project examples, dates, roles, clients, scope, and outcomes.
Workflow
- Read the bid requirements and determine what kinds of experience matter most.
- Load the proposer profile and any procurement or sector context that shapes relevance.
- Select the strongest comparable assignments and structure them using the guidance below.
- Convert each selected assignment into a proof story: context, challenge, role, intervention, result, and relevance.
- Emphasize outcomes, relevance, and evaluator-fit rather than raw volume.
- Verify consistency with the profile, team, and methodology sections.
- For premium proposals or public case studies, apply the premium commercial writing case-study pattern before finalising.
Quality Standards
- Keep examples directly relevant to the assignment and evaluation criteria.
- Use British English and East African professional tone unless the bid format requires otherwise.
- Prefer quantified outcomes, named roles, and clear similarity signals.
Anti-Patterns
- Do not list projects with no outcomes or no relevance logic.
- Do not exaggerate roles, scope, or contracting relationships.
- Do not overload the section with marginal examples that dilute the strongest evidence.
Outputs
- A proposal-ready relevant experience section with evaluator-focused project evidence.
References
- ../profiles/SKILL.md for proposer selection and voice.
- ../sectors/SKILL.md for procurement and sector routing.
- ../proposal-storytelling-and-evaluator-journey/SKILL.md and ../references/proposal-narrative-patterns-and-case-story-spine.md for case-story structure and evaluator relevance.
- ../premium-commercial-writing/SKILL.md for premium proof discipline, case-study relevance, and public-facing case-study polish.
- Relevant proposal-wide references when positioning or proof structure needs reinforcement.
This section is typically the most heavily weighted in evaluation scoring. It is where the firm proves — not claims — that it has done this before and can do it again.
What to Gather Before Writing
For each project to include:
- Client name and country
- Assignment title
- Duration and year
- The firm's specific role (lead, sub-contractor, individual consultant)
- Scope: what was actually done
- Quantified outcomes — this is the most important input
- A reference contact (name, title, phone or email)
Aim to gather at least four projects. Prioritise projects that are:
- The same type of assignment as the current bid
- In the same country or sub-region as the client
- For the same type of client (government ministry, NGO, regulatory body, private sector)
- The largest or most prestigious engagements available
Structure
Summary Table
Open with a table listing all included projects at a glance. This allows evaluators to quickly assess coverage before reading the detail.
| # | Client | Country | Assignment | Year | Key Outcome |
|---|--------|---------|------------|------|-------------|
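Where project details are already held as structured records, the summary table can be generated rather than typed, which keeps it consistent with the project cards that follow. A minimal sketch — the field names (`client`, `country`, `assignment`, `year`, `key_outcome`) and the sample record are illustrative assumptions, not part of any real bid:

```python
# Sketch: render the at-a-glance summary table from project records.
# Field names and the sample project are illustrative assumptions.
projects = [
    {"client": "Ministry of Health", "country": "Uganda",
     "assignment": "HMIS upgrade", "year": 2023,
     "key_outcome": "Reduced reporting lag from 14 days to 3 days"},
]

def summary_table(projects):
    """Return a markdown summary table, numbering projects in the
    order given (i.e. already sorted by relevance, not by date)."""
    header = "| # | Client | Country | Assignment | Year | Key Outcome |"
    sep = "|---|--------|---------|------------|------|-------------|"
    rows = [
        f"| {i} | {p['client']} | {p['country']} | {p['assignment']} "
        f"| {p['year']} | {p['key_outcome']} |"
        for i, p in enumerate(projects, start=1)
    ]
    return "\n".join([header, sep, *rows])

print(summary_table(projects))
```

Because the row order comes straight from the input list, sorting the records by relevance before rendering also enforces the ordering rule for the table.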
Project Cards
Follow with a detailed card for each project using a consistent template:
CLIENT: [Organisation name], [Country]
ASSIGNMENT: [Title]
PERIOD: [Month/Year – Month/Year]
OUR ROLE: [Lead implementer / Team leader / Sub-consultant]
SCOPE: [Two to three sentences describing what was done]
OUTCOMES:
• [Quantified result — e.g., "Reduced processing time from 14 days to 3 days"]
• [Quantified result — e.g., "Trained 127 staff across four departments"]
• [Quantified result — e.g., "System achieved 99% uptime in first six months"]
REFERENCE: [Name, Title, Phone/Email]
Order project cards by relevance to the current bid, not by date.
For premium, digital, AI, website, service-design, or support-heavy proposals, add one line after each project card: Relevance to this assignment: [specific parallel in outcome, user group, technology, service journey, risk, operating context, or support model]. This helps evaluators see why the example matters, not only that it exists.
The Outcomes Rule
Every project card must contain at least one quantified outcome. If the user cannot provide numbers, ask specifically:
- How many users, staff, or beneficiaries were served?
- What changed as a result — processing time, error rate, cost, coverage?
- Was the project delivered on time and on budget?
- What did the client say after delivery?
If exact figures are unavailable, use ranges or relative measures: "reduced by approximately 30–40%", "served over 5,000 beneficiaries".
A project card without outcomes is significantly weaker than one with them. Never submit a card that only describes what was done without stating what was achieved.
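The outcomes rule can be checked mechanically before a card is accepted into the draft. A minimal sketch, assuming each card is held as a dict with an `outcomes` list (the check only looks for digits, so it flags word-only quantities such as "first six months" for manual review):

```python
import re

def has_quantified_outcome(outcomes):
    """Return True if at least one outcome string contains a digit,
    which covers counts, percentages, and ranges like "30-40%"
    or "over 5,000". Word-only quantities are not detected."""
    return any(re.search(r"\d", o) for o in outcomes)

# Illustrative card — the client and outcome are assumptions.
card = {
    "client": "Example NGO",
    "outcomes": ["Trained 127 staff across four departments"],
}
print(has_quantified_outcome(card["outcomes"]))
```

A card that fails this check is the cue to go back to the user with the four questions above before drafting continues.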
Tone and Length Rules
- Keep the section to three to six pages, depending on the number of projects
- Factual and specific — evaluators may contact references to verify claims
- Order by relevance, not chronology
- Never include projects that are not genuinely comparable — it weakens the section
- Follow east-african-english standards throughout