
21.5 Maturity Models and Roadmaps

Organizations rarely achieve comprehensive supply chain security overnight. The journey from ad-hoc practices to systematic, automated security requires sustained effort across multiple dimensions—tooling, processes, skills, and culture. Maturity models provide a framework for understanding where you are, where you need to be, and what capabilities to develop along the way. Combined with roadmaps that sequence improvements over time, maturity models transform abstract security goals into concrete, achievable plans.

The value of maturity models lies not in achieving the highest level for its own sake, but in providing a common vocabulary for discussing capability gaps and a structured approach to improvement. An organization at maturity level 2 is not "failing"—it simply has different priorities and capabilities than one at level 4. The question is whether your current maturity aligns with your risk profile and business requirements.

Supply Chain Security Maturity Models

Maturity models describe progressive levels of capability, typically ranging from initial/ad-hoc practices to optimized/continuous improvement. While several general security maturity models exist—including the Building Security In Maturity Model (BSIMM) and OWASP SAMM—supply chain security-specific models are still emerging.

Five-level supply chain security maturity model:

| Level | Name | Characteristics |
|-------|------|-----------------|
| Level 1 | Initial | Ad-hoc practices, reactive response, no formal program |
| Level 2 | Developing | Basic tooling deployed, manual processes, inconsistent coverage |
| Level 3 | Defined | Standardized processes, automated scanning, policy-driven |
| Level 4 | Managed | Comprehensive coverage, metrics-driven, proactive risk management |
| Level 5 | Optimizing | Continuous improvement, advanced automation, industry leadership |

Capabilities at each level:

Level 1 - Initial:

  • No formal dependency tracking
  • Vulnerabilities discovered reactively (through incidents or external reports)
  • No SBOM generation or management
  • Developers make dependency decisions without security input
  • Incident response improvised when supply chain issues arise

Level 2 - Developing:

  • Dependency scanning deployed for some applications
  • Manual SBOM generation for critical applications
  • Basic policies exist but enforcement is inconsistent
  • Some developer awareness of supply chain risks
  • Vulnerability remediation tracked but SLAs often missed
  • Security team aware of major supply chain incidents

Level 3 - Defined:

  • Dependency scanning integrated into CI/CD for all applications
  • Automated SBOM generation for all releases
  • Documented policies with automated enforcement
  • Regular training on supply chain security
  • Vulnerability remediation meets SLAs consistently
  • Incident response playbooks include supply chain scenarios
  • Metrics tracked and reported to leadership

Level 4 - Managed:

  • Comprehensive scanning including transitive dependencies
  • SBOM management with automated vulnerability correlation
  • Policy-as-code with admission control enforcement
  • Provenance verification for dependencies
  • Risk-based prioritization with business context
  • Proactive monitoring for emerging threats
  • Integration with enterprise risk management
  • Regular third-party assessments

Level 5 - Optimizing:

  • Continuous process improvement based on metrics and incidents
  • Advanced automation including self-healing capabilities
  • Contribution to upstream security (reporting, fixes, funding)
  • Industry benchmark comparisons
  • Innovation in supply chain security practices
  • Influence on standards and community practices
  • Predictive risk analysis

SLSA framework as maturity model:

The Supply-chain Levels for Software Artifacts (SLSA) framework provides another maturity perspective, focused specifically on build integrity. SLSA v1.0 (released April 2023) focuses on the Build track:

| SLSA Build Level | Requirements |
|------------------|--------------|
| Build L0 | No SLSA requirements met |
| Build L1 | Provenance exists showing how the package was built |
| Build L2 | Hosted build service with signed provenance |
| Build L3 | Hardened builds with non-falsifiable provenance |

Additional tracks (for example, a Source track) are planned for future versions. SLSA levels complement general maturity models by providing concrete build-integrity targets within a broader supply chain security program.
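
To make the level definitions easier to track internally, the sketch below maps a few provenance attributes to an approximate SLSA Build level. It is a simplification for inventory dashboards, not an implementation of the SLSA specification, and the attribute names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class BuildEvidence:
    """Illustrative evidence about how an artifact was built."""
    has_provenance: bool        # a provenance document exists (Build L1)
    hosted_signed_build: bool   # hosted build service that signs provenance (Build L2)
    hardened_build: bool        # hardened, non-falsifiable build environment (Build L3)

def approximate_slsa_build_level(e: BuildEvidence) -> int:
    """Rough internal mapping to SLSA Build L0-L3; defer to the spec for real scoring."""
    if not e.has_provenance:
        return 0
    if not e.hosted_signed_build:
        return 1
    if not e.hardened_build:
        return 2
    return 3

# Example: provenance exists and is signed by a hosted builder, but the
# build environment is not yet hardened -> approximately Build L2.
print(approximate_slsa_build_level(
    BuildEvidence(has_provenance=True, hosted_signed_build=True, hardened_build=False)))
```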

Assessment Methodologies

Maturity assessment establishes your current state, identifying strengths and gaps against the maturity model. Assessment approaches range from informal self-assessment to rigorous third-party audit.

Assessment approach options:

| Approach | Rigor | Cost | Best For |
|----------|-------|------|----------|
| Self-assessment | Low-Medium | Low | Initial baseline, ongoing monitoring |
| Peer review | Medium | Low-Medium | Validation of self-assessment, knowledge sharing |
| Consultant assessment | Medium-High | Medium | Objective perspective, industry benchmarking |
| Third-party audit | High | High | Compliance requirements, external assurance |

Self-assessment process:

  1. Assemble assessment team: Include representatives from security, engineering, operations, and compliance

  2. Review evidence: Collect artifacts demonstrating capabilities—tool configurations, process documents, metrics, incident records

  3. Score against criteria: For each capability area, determine which maturity level best describes current state

  4. Document rationale: Record evidence supporting each score; this enables tracking over time

  5. Identify quick wins: Note capabilities that are close to the next level with modest effort

  6. Validate with stakeholders: Review findings with practitioners to confirm accuracy

Self-assessment template:

# Supply Chain Security Maturity Assessment
**Assessment Date**: [Date]
**Assessors**: [Names/Roles]

### Capability Area: Dependency Scanning

**Current Level**: 2 - Developing

**Evidence**:
- `Snyk` deployed in CI/CD for 60% of applications
- No scanning for infrastructure-as-code dependencies
- Manual triage process, no automation
- Weekly reports generated but not consistently reviewed

**Gaps to Level 3**:
- Scanning coverage for remaining 40% of applications
- IaC dependency scanning
- Automated triage based on policy
- Integration with vulnerability management system

**Estimated Effort to Level 3**: 2-3 months

### [Repeat for each capability area]
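
Recording the scores from each capability area in a structured form makes it easy to roll them up and spot the largest gaps. A minimal sketch; the capability areas, current levels, and targets below are illustrative:

```python
# Minimal sketch: roll up self-assessment scores and gaps to target levels.
# Capability areas, current levels, and targets are illustrative.
assessment = {
    "Dependency scanning": {"current": 2, "target": 3},
    "SBOM management":     {"current": 1, "target": 3},
    "Policy enforcement":  {"current": 2, "target": 4},
    "Incident response":   {"current": 3, "target": 3},
}

def summarize(scores):
    # One common convention: the weakest capability area caps overall maturity.
    overall = min(v["current"] for v in scores.values())
    print(f"Overall maturity (weakest area): Level {overall}")
    gaps = {area: v["target"] - v["current"] for area, v in scores.items()}
    for area, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        status = "meets target" if gap <= 0 else f"{gap} level(s) below target"
        print(f"  {area}: L{scores[area]['current']} -> target L{scores[area]['target']} ({status})")

summarize(assessment)
```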

Third-party assessment considerations:

External assessments provide objectivity and industry benchmarking but require careful scoping:

  • Define scope clearly: Which applications, environments, and capabilities are in scope?
  • Select qualified assessors: Look for supply chain security expertise, not just general security
  • Prepare evidence in advance: Assessments are more efficient when documentation is ready
  • Plan for findings: Budget time and resources to address assessment findings
  • Use findings constructively: Assessment is about improvement, not blame

Gap Analysis and Prioritization

Assessment reveals gaps between current and desired maturity. Gap analysis structures these findings for action; prioritization sequences them based on risk, effort, and dependencies.

Gap analysis methodology:

  1. Identify desired state: What maturity level do you need for each capability? This depends on risk profile, regulatory requirements, and business context—not every organization needs Level 5 everywhere.

  2. Document gaps: For each capability area, list specific gaps between current and desired state.

  3. Assess gap characteristics:

     • Risk impact: How much does this gap increase risk?
     • Effort required: What resources are needed to close the gap?
     • Dependencies: Does closing this gap require other gaps to be closed first?
     • Quick win potential: Can partial progress be achieved quickly?

  4. Prioritize gaps: Use a prioritization framework to sequence gap closure.

Gap prioritization framework:

                     High Risk Impact
         ┌─────────────────┬─────────────────┐
         │                 │                 │
         │   Priority 2    │   Priority 1    │
         │   (Plan)        │   (Immediate)   │
         │                 │                 │
High     ├─────────────────┼─────────────────┤  Low
Effort   │                 │                 │  Effort
         │   Priority 4    │   Priority 3    │
         │   (Backlog)     │   (Quick Wins)  │
         │                 │                 │
         └─────────────────┴─────────────────┘
                     Low Risk Impact

  • Priority 1: High risk, low effort—address immediately
  • Priority 2: High risk, high effort—plan and resource
  • Priority 3: Lower risk, low effort—quick wins to demonstrate progress
  • Priority 4: Lower risk, high effort—address when resources permit
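
The quadrant is simple to encode once gaps are captured in a structured form, which keeps prioritization consistent across assessment cycles. A minimal sketch with illustrative gaps and High/Low ratings (where a Medium rating falls relative to the boundary remains a judgment call, as the example that follows illustrates):

```python
# Minimal sketch of the risk/effort quadrant. Ratings are High/Low only;
# where Medium ratings land is a judgment call for the assessment team.
def priority(risk, effort):
    if risk == "High" and effort == "Low":
        return 1  # Immediate
    if risk == "High":
        return 2  # Plan and resource
    if effort == "Low":
        return 3  # Quick win
    return 4      # Backlog

# Illustrative gaps, sorted into priority buckets.
gaps = [
    ("Critical apps not scanned", "High", "Low"),
    ("No SBOM management system", "High", "High"),
    ("Incomplete developer training", "Low", "Low"),
    ("No contribution to upstream", "Low", "High"),
]
for name, risk, effort in sorted(gaps, key=lambda g: priority(g[1], g[2])):
    print(f"P{priority(risk, effort)}: {name} (risk={risk}, effort={effort})")
```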

Gap prioritization example:

| Gap | Risk Impact | Effort | Priority | Timing |
|-----|-------------|--------|----------|--------|
| No scanning for 40% of apps | High | Medium | 1 | Q1 |
| No SBOM management system | Medium | High | 2 | Q2-Q3 |
| No provenance verification | Medium | Medium | 2 | Q2 |
| Incomplete developer training | Low | Low | 3 | Q1 |
| No contribution to upstream | Low | High | 4 | Future |

Roadmap Development: Phased Approach

A roadmap translates prioritized gaps into a time-sequenced plan, organizing improvements into phases that build upon each other.

Roadmap template structure:

## Supply Chain Security Program Roadmap

### Vision Statement
[Where we want to be in 2-3 years]

### Current State Summary
- Overall Maturity: Level 2
- Key Gaps: [List top 3-5]
- Risk Posture: [Assessment]

### Phase 1: Foundation (Q1-Q2 Year 1)
**Objective**: Establish basic coverage and visibility

**Key Initiatives**:
1. Expand dependency scanning to 100% of applications
   - Owner: [Name]
   - Dependencies: None
   - Success Criteria: All apps scanned in CI/CD

2. Implement automated SBOM generation
   - Owner: [Name]
   - Dependencies: Scanning deployment
   - Success Criteria: SBOMs for all releases

3. Document supply chain security policies
   - Owner: [Name]
   - Dependencies: None
   - Success Criteria: Policies approved and published

**Resources Required**: [Headcount, budget, tools]
**Target Maturity**: Level 3 for core capabilities

### Phase 2: Automation (Q3-Q4 Year 1)
**Objective**: Reduce manual effort, improve consistency

**Key Initiatives**:
1. Deploy policy-as-code enforcement
2. Implement automated vulnerability triage
3. Integrate SBOM with vulnerability management

**Target Maturity**: Full Level 3

### Phase 3: Advanced Capabilities (Year 2)
**Objective**: Proactive risk management

**Key Initiatives**:
1. Implement provenance verification
2. Deploy admission control for deployments
3. Establish metrics-driven improvement program

**Target Maturity**: Level 4 for critical applications

### Dependencies and Risks
- [Key dependencies between phases]
- [Risks that could delay progress]

### Resource Summary
| Phase | Headcount | Tooling | Total Investment |
|-------|-----------|---------|------------------|
| Phase 1 | 2 FTE | $100K | $350K |
| Phase 2 | 2 FTE | $50K | $300K |
| Phase 3 | 3 FTE | $150K | $600K |

Phasing principles:

  1. Build foundations first: Basic visibility (scanning, SBOMs) enables everything else
  2. Sequence dependencies: Don't plan initiatives that depend on incomplete prerequisites (a sketch for checking this mechanically follows this list)
  3. Include quick wins early: Early visible progress builds momentum and credibility
  4. Plan for learning: Initial phases reveal information that refines later phases
  5. Maintain flexibility: Roadmaps should adapt as circumstances change
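
The dependency-sequencing check in principle 2 can be automated once initiatives and their prerequisites are recorded in a structured form. A minimal sketch, assuming a simple phase-numbered roadmap; the initiative names and phase assignments are illustrative:

```python
# Minimal sketch: flag initiatives scheduled before their prerequisites.
# Initiative names and phase assignments are illustrative, not prescriptive.
roadmap = {
    "Expand dependency scanning": {"phase": 1, "depends_on": []},
    "Document supply chain policies": {"phase": 1, "depends_on": []},
    "Automated SBOM generation": {"phase": 1, "depends_on": ["Expand dependency scanning"]},
    "Policy-as-code enforcement": {"phase": 2, "depends_on": ["Document supply chain policies"]},
    "Provenance verification": {"phase": 3, "depends_on": ["Automated SBOM generation"]},
}

def sequencing_issues(plan):
    """Return readable issues; same-phase dependencies still need in-phase ordering."""
    issues = []
    for name, item in plan.items():
        for dep in item["depends_on"]:
            if dep not in plan:
                issues.append(f"{name}: prerequisite '{dep}' is not on the roadmap")
            elif plan[dep]["phase"] > item["phase"]:
                issues.append(f"{name} (phase {item['phase']}) is scheduled before its "
                              f"prerequisite {dep} (phase {plan[dep]['phase']})")
    return issues

for line in sequencing_issues(roadmap) or ["No sequencing issues found"]:
    print(line)
```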

Milestone Setting and Progress Tracking

Milestones mark significant achievements along the roadmap, providing checkpoints for progress assessment and stakeholder communication.

Effective milestone characteristics:

  • Specific: Clear definition of what completion means
  • Measurable: Objective criteria for determining achievement
  • Meaningful: Represents genuine progress, not arbitrary checkpoint
  • Time-bound: Target date for completion

Milestone examples:

| Milestone | Definition | Success Criteria | Target Date |
|-----------|------------|------------------|-------------|
| Full scan coverage | All applications scanned in CI/CD | 100% of apps have passing scans | Q1 End |
| SBOM baseline | SBOMs generated for all releases | SBOMs in artifact registry for all releases | Q2 End |
| Policy enforcement | Automated policy checks in pipeline | No deployments bypass policy checks | Q3 End |
| Metrics program | Regular reporting to leadership | Monthly executive reports published | Q4 End |
| Level 3 certification | Third-party validation of maturity | Assessment confirms Level 3 | Year 1 End |

Progress tracking mechanisms:

  • Milestone dashboard: Visual representation of milestone status and timeline
  • Monthly reviews: Regular assessment of progress against plan
  • Stakeholder updates: Communication to sponsors and leadership on progress
  • Risk tracking: Monitoring factors that could delay milestones
  • Adjustment process: Defined method for revising timelines when needed
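
Progress reporting is easier to sustain when milestone status lives in structured data and the report is generated from it rather than hand-edited. A minimal sketch that renders a status table like the template below; the milestone names, dates, and statuses are illustrative:

```python
# Minimal sketch: render the milestone table of a progress report from
# structured data. Names, dates, and statuses are illustrative.
milestones = [
    {"name": "Full scan coverage", "status": "🟢 Complete",
     "original": "Q1", "forecast": "Q1", "notes": "100% coverage"},
    {"name": "SBOM baseline", "status": "🟡 In Progress",
     "original": "Q2", "forecast": "Q2 (at risk)", "notes": "80% complete"},
    {"name": "Policy enforcement", "status": "⚪ Not Started",
     "original": "Q3", "forecast": "Q3", "notes": "Depends on SBOM"},
]

def render_milestone_table(items):
    lines = [
        "| Milestone | Status | Original Date | Current Forecast | Notes |",
        "|-----------|--------|---------------|------------------|-------|",
    ]
    for m in items:
        lines.append(f"| {m['name']} | {m['status']} | {m['original']} "
                     f"| {m['forecast']} | {m['notes']} |")
    return "\n".join(lines)

print(render_milestone_table(milestones))
```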

Progress reporting template:

## Roadmap Progress Report - [Month Year]

### Overall Status: 🟢 On Track / 🟡 At Risk / 🔴 Behind

### Milestone Progress
| Milestone | Status | Original Date | Current Forecast | Notes |
|-----------|--------|---------------|------------------|-------|
| Full scan coverage | 🟢 Complete | Q1 | Achieved Q1 | 100% coverage |
| SBOM baseline | 🟡 In Progress | Q2 | Q2 (at risk) | 80% complete, tooling delay |
| Policy enforcement | ⚪ Not Started | Q3 | Q3 | Depends on SBOM |

### Key Accomplishments This Month
- [Accomplishment 1]
- [Accomplishment 2]

### Issues and Risks
- [Issue/Risk 1]: [Status and mitigation]

### Adjustments to Plan
- [Any changes to scope, timeline, or resources]

Continuous Improvement Cycles

Maturity is not a destination but a practice. Continuous improvement ensures that your program evolves with changing threats, technologies, and organizational needs.

Continuous improvement integration:

  1. Regular reassessment: Conduct maturity assessments annually (or more frequently during rapid improvement phases) to validate progress and identify new gaps

  2. Incident learning: Every supply chain incident—internal or industry-wide—is an improvement opportunity. Formalize learning loops that translate incident findings into program improvements.

  3. Metrics-driven refinement: Use the metrics program (Section 21.4) to identify underperforming areas and prioritize improvements

  4. Benchmark comparison: Periodically compare your program against industry benchmarks and peer organizations to identify improvement opportunities

  5. Technology evolution: As new tools and techniques emerge (e.g., Sigstore, SLSA), evaluate and incorporate them into your program

  6. Stakeholder feedback: Gather input from developers, security teams, and leadership on program effectiveness and friction points
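
If assessments are captured in the structured form sketched earlier, year-over-year comparison becomes a trivial diff. A minimal sketch with illustrative capability areas and levels:

```python
# Minimal sketch: compare two assessment snapshots to see where maturity moved.
# Capability areas and levels are illustrative.
previous = {"Dependency scanning": 2, "SBOM management": 1, "Policy enforcement": 2}
current  = {"Dependency scanning": 3, "SBOM management": 3, "Policy enforcement": 2}

for area in sorted(current):
    before = previous.get(area)
    delta = current[area] - before if before is not None else None
    trend = "new area" if delta is None else (
        "improved" if delta > 0 else "unchanged" if delta == 0 else "regressed")
    print(f"{area}: L{before if before is not None else '-'} -> L{current[area]} ({trend})")
```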

Plan-Do-Check-Act cycle:

          ┌────────────────────┐
          │       PLAN         │
    ┌────►│  - Set objectives  │─────┐
    │     │  - Define actions  │     │
    │     └────────────────────┘     ▼
┌───┴───┐                        ┌───────┐
│  ACT  │                        │  DO   │
│Improve│                        │Execute│
└───▲───┘                        └───┬───┘
    │     ┌────────────────────┐     │
    │     │      CHECK         │     │
    └─────┤  - Measure results │◄────┘
          │  - Compare to plan │
          └────────────────────┘

Apply this cycle at multiple levels:

  • Program level: Annual or semi-annual cycle for major program improvements
  • Initiative level: Quarterly cycle for specific capability improvements
  • Operational level: Monthly or weekly cycle for process refinements

Avoiding maturity model pitfalls:

  • Level chasing: Pursuing higher levels for status rather than genuine capability improvement
  • Checkbox compliance: Meeting criteria technically without achieving underlying security improvement
  • Assessment fatigue: Over-assessing without acting on findings
  • Rigid adherence: Following the model when business needs suggest different priorities
  • Ignoring context: Applying same maturity targets regardless of risk profile or organizational constraints

Maturity models are maps, not territories. They help you navigate, but the actual terrain—your organization's specific risks, constraints, and culture—determines the right path. Use maturity models as guides, not rigid prescriptions.

Recommendations

We recommend the following approaches to maturity models and roadmaps:

  1. Adopt a maturity model: Use an established framework or develop one appropriate to your context. The model provides a common vocabulary and a structured approach to improvement.

  2. Assess honestly: Accurate assessment is more valuable than favorable assessment. Identify where you truly are to plan effectively for where you need to be.

  3. Set appropriate targets: Not every organization needs Level 5 maturity. Set targets based on your risk profile, regulatory requirements, and business context.

  4. Prioritize gaps rigorously: You cannot close every gap simultaneously. Use structured prioritization to sequence improvements by risk impact and feasibility.

  5. Build roadmaps that adapt: Roadmaps should guide but not constrain. Build in review points where plans can be adjusted based on learning and changing circumstances.

  6. Track milestones visibly: Clear milestones with visible tracking create accountability and enable stakeholder communication.

  7. Embed continuous improvement: Maturity is maintained through ongoing effort. Build learning loops that translate experience into program refinement.

  8. Celebrate progress: Acknowledge achievements along the way. Sustained improvement requires motivation, and recognition of progress builds momentum.

Maturity models and roadmaps transform supply chain security from an overwhelming challenge into a manageable journey. By understanding where you are, defining where you need to be, and planning the path between, you create a foundation for systematic, sustainable improvement.