Executive Summary
Defense programs are deploying DevSecOps practices at unprecedented speed — driven by the Software Acquisition Pathway (DoD 5000.87), DISA's DevSecOps Reference Design, and the escalating urgency of delivering software-intensive capabilities faster than the threat environment evolves. Yet most programs lack a rigorous, quantitative framework for measuring whether their DevSecOps practices are actually improving. They track activity — pipeline runs, sprint velocity, code coverage percentages — without a coherent model connecting these metrics to mission outcomes or program health.
This white paper, authored by Kurt A. Richardson, PhD, presents the Continuum DevSecOps Maturity Framework (CDMF) — a five-domain, 40-KPI scoring model built specifically for defense programs. CDMF is informed by DORA research, the DISA DevSecOps Fundamentals Guide, DoD 5000.87 requirements, and Continuum's operational experience across Space Force, Navy, and Army programs. It provides program managers, contracting officers, and technical leads with a quantitative, defensible, and actionable measurement system.
Most defense programs measuring DevSecOps are measuring the wrong things — pipeline mechanics rather than outcomes. The CDMF shifts measurement from activity (how many pipelines run) to outcomes (how fast, how reliably, how securely does software reach operational capability). This distinction is the difference between a measurement system that drives improvement and one that generates compliance artifacts.
Introduction: The Measurement Gap
The DoD has made a decisive commitment to DevSecOps. The 2019 DoD Digital Modernization Strategy, the 2021 DoD DevSecOps Reference Design, and the Software Acquisition Pathway all reflect a strategic judgment that the ability to deliver software rapidly, securely, and continuously is itself a warfighting capability. Programs that cannot demonstrate this capability increasingly find themselves disadvantaged in source selection, program reviews, and budget justification.
But commitment to DevSecOps and effective practice of DevSecOps are different things. A program can have a CI/CD pipeline, use agile sprints, and produce vulnerability scan reports — and still be delivering software at the same speed, with the same defect density, and with the same security posture as a waterfall program from 2005. The pipeline exists; the outcomes have not changed. Measurement is the mechanism by which organizations discover whether their practices are producing the results they were adopted to achieve.
The DoD Measurement Mandate
DoD 5000.87 requires programs operating under the Software Acquisition Pathway to demonstrate continuous delivery capability through "software development and deployment metrics." The DISA DevSecOps Reference Design specifies that programs should measure cycle time, defect escape rate, and pipeline reliability. CMMC Level 2 requires documented and measured security practices. None of these directives specify how to build a comprehensive measurement system — that gap is what the CDMF addresses.
Who This Paper Is For
This framework is written for program managers and deputy program managers needing to report DevSecOps progress to program offices and FYDP stakeholders; contracting officers writing CLIN structures and evaluation criteria for DevSecOps-intensive task orders; technical leads and DevSecOps practitioners building measurement pipelines; and auditors and oversight organizations assessing program DevSecOps claims. It does not require deep technical background — the framework is designed to be both technically precise and programmatically useful.
Why Maturity Measurement Matters
Measurement in DevSecOps serves four distinct functions. These functions are often conflated, yet each requires different metrics and speaks to a different audience.
The DORA Research Foundation
The DORA (DevOps Research and Assessment) research program, now operated by Google Cloud, is the most rigorous large-scale empirical study of software delivery performance in existence. The 2023 State of DevOps report covers over 36,000 professionals across industries and establishes validated causal links between specific technical practices and organizational outcomes. DORA's four key metrics — Deployment Frequency, Lead Time for Changes, Time to Restore Service, and Change Failure Rate — form the empirical backbone of the CDMF's Delivery Speed and Quality domains.
| DORA Metric | Elite Performers | High Performers | Medium Performers | Low Performers |
|---|---|---|---|---|
| Deployment Frequency | On-demand (multiple per day) | Once every 1–7 days | Once every 1–6 months | Less than once every 6 months |
| Lead Time for Changes | <1 hour | 1 day – 1 week | 1–6 months | >6 months |
| Time to Restore Service | <1 hour | <1 day | 1 day – 1 week | >1 week |
| Change Failure Rate | 0–5% | 5–10% | 10–15% | 15–30% |
DORA benchmarks reflect commercial software organizations. Defense programs operating under ATOs, classified environments, and DoD change management requirements will typically score lower on raw deployment frequency — and that is appropriate. The CDMF adjusts benchmarks for defense-specific constraints, separating deployment capability (how fast could you deploy if authorized) from deployment cadence (how frequently do you actually deploy, given program governance requirements).
Existing Maturity Models & Gaps
Several DevSecOps maturity models exist, but none fully addresses the defense program context. Understanding their strengths and gaps is prerequisite to understanding what the CDMF adds.
| Model | Origin | Strengths | Defense Gaps |
|---|---|---|---|
| DORA Framework | Google / DORA Research | Empirically validated; causal links to outcomes; widely adopted benchmark data | No security domain; commercial focus; no compliance measurement; no defense-specific constraints |
| CMMI-DEV v2.0 | CMMI Institute | Comprehensive process areas; government-recognized; appraisal methodology | Process-focused, not outcome-focused; limited DevOps coverage; no security integration; heavyweight assessment |
| DISA DevSecOps Ref Design | DISA | DoD-specific; pipeline architecture guidance; security requirements | Descriptive architecture, not measurement framework; no KPI definitions; no scoring methodology |
| SAFe DevOps Competency | Scaled Agile Inc. | Integrated with Agile planning; continuous delivery pipeline model | Enterprise Agile context; limited security measurement; not defense-specific; commercial benchmarks only |
| CNCF Cloud Native Maturity | Cloud Native Computing Foundation (CNCF) | Cloud-native technology coverage; progressive maturity levels | Technology-centric; no team/culture domain; no compliance measurement; commercial cloud assumptions |
| CDMF (this paper) | Continuum Resources LLC | Five domains including security and compliance; defense-specific benchmarks; quantitative KPIs; interactive scoring | Designed to close the gaps above; purpose-built for DoD and aligned to 5000.87, the DISA Reference Design, and CMMC |
The Continuum DevSecOps Maturity Framework
The CDMF organizes DevSecOps measurement into five domains, each containing eight KPIs, scored on a 1–5 maturity scale. The five domains span the full DevSecOps value stream — from software delivery speed through team culture — and together produce a holistic maturity score with a defensible, auditable methodology.
Scoring Methodology
Each KPI is scored on a 1–5 scale with defined thresholds at each level. Domain scores are the unweighted average of the eight KPI scores within that domain. The Overall CDMF Score is a weighted average of domain scores — weighting is configurable to reflect program priorities but defaults to equal weighting across all five domains.
| Score Level | Label | Characteristics | Program Posture |
|---|---|---|---|
| Level 1 | Initial | Ad hoc or inconsistent practices; no automated measurement; reactive response to failures | High risk; significant improvement investment required |
| Level 2 | Defined | Documented practices; some automation; measurement beginning; reactive culture predominates | Moderate risk; clear improvement path visible |
| Level 3 | Managed | Consistent practices; automated pipelines; metrics tracked; security integrated; proactive improvement | Adequate for most DoD programs; continuous improvement active |
| Level 4 | Advanced | Optimizing practices; predictive metrics; security-first culture; rapid recovery; high automation | Strong posture; demonstrates elite DevSecOps capability |
| Level 5 | Elite | Continuous experimentation; feedback loops at every level; security and delivery unified; industry-leading metrics | Elite performance; DoD benchmark program |
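To make the scoring arithmetic concrete, the following minimal sketch computes domain scores and the overall CDMF score from hypothetical KPI ratings. The KPI values and the equal default weights are illustrative assumptions, not prescribed inputs.

```python
from statistics import mean

# Hypothetical KPI scores (1-5), eight per domain as defined in the CDMF.
kpi_scores = {
    "D1 Delivery Speed & Frequency": [4, 3, 3, 4, 3, 2, 3, 4],
    "D2 Quality & Reliability":      [3, 3, 2, 3, 4, 2, 3, 3],
    "D3 Security Posture":           [4, 3, 4, 3, 3, 4, 5, 3],
    "D4 Compliance Readiness":       [3, 3, 2, 3, 3, 2, 3, 2],
    "D5 Team Capability & Culture":  [3, 2, 3, 3, 2, 3, 3, 3],
}

# Domain weights default to equal weighting; adjust to reflect program priorities
# (adjusted weights should still sum to 1.0).
weights = {domain: 1 / len(kpi_scores) for domain in kpi_scores}

# Domain score: unweighted average of the eight KPI scores in that domain.
domain_scores = {domain: mean(scores) for domain, scores in kpi_scores.items()}

# Overall CDMF score: weighted average of the domain scores.
overall = sum(weights[d] * s for d, s in domain_scores.items())

for domain, score in domain_scores.items():
    print(f"{domain}: {score:.2f}")
print(f"Overall CDMF score: {overall:.2f}")
```

A program that wants to emphasize, say, Security Posture during an ATO push can raise that domain's weight, provided the adjusted weighting is documented so the score remains defensible across reporting periods.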
Delivery Speed & Frequency
Delivery speed is the most visible and most frequently misunderstood DevSecOps dimension. Programs often conflate deployment frequency with delivery capability — measuring how often they push updates without measuring whether they could push updates more frequently if the governance environment permitted. For defense programs under ATO-gated change management, the CDMF distinguishes between pipeline capability (the technical ability to deploy on demand) and authorized deployment cadence (how frequently governance processes permit production deployments).
Domain 1 KPI Definitions
| KPI | Definition | Level 3 Target | Level 4–5 Target |
|---|---|---|---|
| D1.1 — Pipeline Capability Rate | % of completed features that could be deployed to production within 1 business day if governance approved | >70% | >90% |
| D1.2 — Authorized Deployment Frequency | Number of production deployments per month (normalized for program governance constraints) | ≥2/month | ≥weekly |
| D1.3 — Lead Time for Changes | Time from code commit to production deployment for a standard change (p50) | <5 days | <24 hours |
| D1.4 — Pipeline Cycle Time | Time from pipeline trigger to deployment-ready artifact (p50 and p95) | p50 <30 min | p50 <10 min |
| D1.5 — Pipeline Success Rate | % of pipeline runs that complete successfully without manual intervention | >85% | >95% |
| D1.6 — Batch Size | Average number of commits or stories per production deployment (smaller = lower risk) | <20 commits | <5 commits |
| D1.7 — Queue Wait Time | Average time changes wait in review, approval, or testing queues before pipeline execution | <2 days | <4 hours |
| D1.8 — Environment Availability | % uptime of non-production environments (dev, test, staging) that block pipeline progress | >95% | >99% |
D1.1 measures what your pipeline can do; D1.2 measures what your governance allows. A program with D1.1=Level 5 (pipeline always ready) and D1.2=Level 2 (infrequent authorized deployments) is a healthy technical capability constrained by governance overhead. A program with D1.1=Level 2 and D1.2=Level 2 has both a technical and a governance problem. These require different interventions and should never be conflated in reporting.
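To illustrate how Domain 1 KPIs can be derived from pipeline telemetry, the sketch below computes D1.3 (Lead Time for Changes, p50) and D1.2 (Authorized Deployment Frequency) from commit and deployment timestamps. The record structure and values are assumptions for the example, not a prescribed schema; real programs would pull these from their version control and deployment tooling.

```python
from datetime import datetime
from statistics import median

# Hypothetical change records: commit timestamp and production deployment timestamp.
changes = [
    {"commit": datetime(2025, 3, 3, 9, 0),   "deployed": datetime(2025, 3, 6, 14, 0)},
    {"commit": datetime(2025, 3, 10, 11, 0), "deployed": datetime(2025, 3, 12, 10, 0)},
    {"commit": datetime(2025, 3, 17, 8, 30), "deployed": datetime(2025, 3, 24, 16, 0)},
]

# D1.3 Lead Time for Changes (p50): median commit-to-production interval in days.
lead_times_days = [(c["deployed"] - c["commit"]).total_seconds() / 86400 for c in changes]
p50_lead_time = median(lead_times_days)

# D1.2 Authorized Deployment Frequency: production deployments in the reporting month.
deployments_this_month = sum(1 for c in changes if c["deployed"].month == 3)

print(f"D1.3 lead time p50: {p50_lead_time:.1f} days")           # Level 3 target: <5 days
print(f"D1.2 deployments this month: {deployments_this_month}")  # Level 3 target: >=2/month
```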
Quality & Reliability
Quality in DevSecOps is not just the absence of defects — it is the ability to deliver changes with high confidence, detect failures quickly when they occur, and restore service rapidly. DORA research has conclusively established that high-performing teams have lower change failure rates despite deploying more frequently — disproving the intuition that speed comes at the cost of quality. The mechanisms that enable high-frequency deployment (small batch sizes, automated testing, feature flags, progressive delivery) are the same mechanisms that prevent defects.
Domain 2 KPI Definitions
| KPI | Definition | Level 3 Target | Level 4–5 Target |
|---|---|---|---|
| D2.1 — Change Failure Rate | % of production deployments that result in degraded service, rollback, or emergency fix | <10% | <5% |
| D2.2 — Mean Time to Restore (MTTR) | Average time from production incident detection to service restoration (p50) | <4 hours | <1 hour |
| D2.3 — Defect Escape Rate | % of defects first detected in production vs. detected earlier in pipeline | <15% | <5% |
| D2.4 — Test Automation Coverage | % of testable code paths covered by automated tests (unit + integration + e2e combined) | >70% | >85% |
| D2.5 — Test Execution Time | Total automated test suite execution time (impacts pipeline cycle time and developer feedback loop) | <15 min | <5 min |
| D2.6 — Technical Debt Ratio | Ratio of estimated debt remediation time to estimated new feature development time (from static analysis) | <20% | <10% |
| D2.7 — Rollback Frequency | Number of production rollbacks per quarter (distinct from Change Failure Rate — measures rollback as mechanism) | <2/quarter | 0–1/quarter |
| D2.8 — Mean Time Between Failures (MTBF) | Average time between production reliability events requiring incident response | >30 days | >90 days |
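A minimal sketch of how D2.1 and D2.2 might be calculated from deployment and incident records follows. The counts and timestamps are hypothetical; in practice these would come from the program's CI/CD and incident-management tooling.

```python
from datetime import datetime
from statistics import median

# Hypothetical quarterly records.
deployments = 24          # production deployments this quarter
failed_deployments = 2    # deployments causing degraded service, rollback, or emergency fix

incidents = [             # (detected, restored) timestamp pairs for production incidents
    (datetime(2025, 1, 14, 10, 5), datetime(2025, 1, 14, 12, 40)),
    (datetime(2025, 2, 20, 22, 15), datetime(2025, 2, 21, 1, 0)),
]

# D2.1 Change Failure Rate: failed deployments as a share of all production deployments.
change_failure_rate = failed_deployments / deployments

# D2.2 Mean Time to Restore (p50): median detection-to-restoration interval in hours.
restore_hours = [(restored - detected).total_seconds() / 3600 for detected, restored in incidents]
mttr_p50 = median(restore_hours)

print(f"D2.1 change failure rate: {change_failure_rate:.1%}")  # Level 3 target: <10%
print(f"D2.2 MTTR p50: {mttr_p50:.1f} hours")                  # Level 3 target: <4 hours
```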
Security Posture
Security posture measurement in DevSecOps goes beyond traditional vulnerability management to assess whether security is embedded in every stage of the pipeline — "shifted left" to the point where security findings are discovered and remediated during development, not at ATO review or during operational testing. For defense programs, the DISA STIG compliance rate and the vulnerability mean-time-to-mitigate (MTTM) are the most operationally significant security KPIs.
Domain 3 KPI Definitions
| KPI | Definition | Level 3 Target | Level 4–5 Target |
|---|---|---|---|
| D3.1 — DISA STIG Compliance Rate | % of applicable STIG checks passing in latest scan across all deployed components | >85% | >95% |
| D3.2 — Critical Vuln MTTM | Mean time from critical vulnerability discovery (CVSS 9.0+) to verified remediation in production | <30 days | <7 days |
| D3.3 — SAST/DAST Pipeline Coverage | % of pipeline runs that execute both static and dynamic security analysis before deployment authorization | >90% | 100% |
| D3.4 — SBOM Coverage | % of deployed artifacts with a current, machine-readable SBOM in SPDX or CycloneDX format | >80% | 100% |
| D3.5 — Security Gate Pass Rate | % of pipeline runs that pass all security policy gates without waiver or exception | >75% | >90% |
| D3.6 — Open Critical/High Findings (POA&M Age) | Average age of open Critical and High security findings on the POA&M | <60 days | <30 days |
| D3.7 — Secret Exposure Events | Number of confirmed credential/secret exposure events per quarter (target: zero; tracked for trend) | 0/quarter | 0/quarter |
| D3.8 — Dependency Freshness | % of third-party dependencies within 2 major versions of current and free of known CVEs | >80% | >95% |
Secret Exposure Events is the only CDMF KPI where the Level 3 and Level 4–5 targets are identical: zero. A single confirmed credential exposure event in a DoD program is a potential reportable incident under DFARS 252.204-7012 and requires immediate remediation, root cause analysis, and program office notification. Programs should track this metric not for improvement targeting but for trend analysis — any non-zero quarter requires a formal corrective action review.
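The sketch below illustrates two of the Domain 3 calculations, D3.2 (Critical Vuln MTTM) and D3.4 (SBOM Coverage). The finding dates and artifact names are hypothetical; actual inputs would come from the program's vulnerability management and artifact registry tooling.

```python
from datetime import date
from statistics import mean

# Hypothetical critical findings (CVSS 9.0+): discovery and verified remediation dates.
critical_findings = [
    {"discovered": date(2025, 2, 3),  "remediated": date(2025, 2, 12)},
    {"discovered": date(2025, 2, 18), "remediated": date(2025, 3, 5)},
]

# D3.2 Critical Vuln MTTM: mean days from discovery to verified remediation in production.
mttm_days = mean((f["remediated"] - f["discovered"]).days for f in critical_findings)

# D3.4 SBOM Coverage: share of deployed artifacts with a current, machine-readable SBOM.
deployed_artifacts = {"api-service", "web-ui", "etl-worker", "auth-proxy"}
artifacts_with_sbom = {"api-service", "web-ui", "auth-proxy"}
sbom_coverage = len(artifacts_with_sbom & deployed_artifacts) / len(deployed_artifacts)

print(f"D3.2 critical vuln MTTM: {mttm_days:.0f} days")  # Level 3 target: <30 days
print(f"D3.4 SBOM coverage: {sbom_coverage:.0%}")        # Level 3 target: >80%
```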
Compliance Readiness
Compliance readiness in the CDMF measures not just whether a program is compliant today, but whether it is continuously maintaining the evidence, documentation, and processes required to demonstrate compliance on demand. The distinction between "compliant during the audit" and "continuously compliant" is the difference between a reactive compliance program and a mature one. For programs under DoD 5000.87, continuous compliance is not optional — the Software Acquisition Pathway requires ongoing measurement and reporting, not point-in-time snapshots.
Domain 4 KPI Definitions
| KPI | Definition | Level 3 Target | Level 4–5 Target |
|---|---|---|---|
| D4.1 — ATO Currency Rate | % of system components operating under a current, non-expired ATO or IATT | 100% | 100% + continuous monitoring |
| D4.2 — RMF Control Compliance Rate | % of implemented RMF controls in a compliant state in the most recent assessment | >90% | >97% |
| D4.3 — CMMC Practice Coverage | % of applicable CMMC Level 2 practices with documented implementation and evidence (for CUI-handling programs) | >85% | >95% |
| D4.4 — Audit Finding Closure Rate | % of audit findings closed within the committed remediation timeline (from most recent audit cycle) | >80% | >95% |
| D4.5 — Documentation Currency | % of required program documentation (SSP, SAR, POAM, ConMon) updated within the required cycle | >85% | >97% |
| D4.6 — ConMon Automation Rate | % of continuous monitoring activities (scans, log reviews, metric collection) that are automated | >70% | >90% |
| D4.7 — Software Acquisition Pathway Metrics | % of DoD 5000.87-required delivery metrics currently collected and reported to program office | >80% | 100% |
| D4.8 — Supply Chain Risk Assessment Currency | % of critical third-party components with a current (annual) supply chain risk assessment | >75% | >95% |
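As an example of how a Domain 4 KPI can be computed automatically, the following sketch evaluates D4.5 (Documentation Currency) from last-updated dates and required refresh cycles. The documents, dates, and cycle lengths shown are illustrative assumptions; real inputs would come from the program's document repository metadata.

```python
from datetime import date, timedelta

# Hypothetical required documents: last-updated date and required refresh cycle in days.
documents = {
    "SSP":    {"last_updated": date(2025, 1, 10), "cycle_days": 365},
    "SAR":    {"last_updated": date(2024, 1, 20), "cycle_days": 365},
    "POA&M":  {"last_updated": date(2025, 3, 1),  "cycle_days": 30},
    "ConMon": {"last_updated": date(2025, 2, 25), "cycle_days": 30},
}

today = date(2025, 3, 15)

# D4.5 Documentation Currency: share of documents updated within their required cycle.
current = [
    name for name, doc in documents.items()
    if today - doc["last_updated"] <= timedelta(days=doc["cycle_days"])
]
currency_rate = len(current) / len(documents)

print(f"Current documents: {sorted(current)}")
print(f"D4.5 documentation currency: {currency_rate:.0%}")  # Level 3 target: >85%
```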
Team Capability & Culture
DORA research identifies team culture — specifically, the Westrum organizational culture typology — as the strongest single predictor of software delivery performance. Programs with generative cultures (high trust, information flows freely, risks are shared, failure is treated as an opportunity to learn) significantly outperform those with pathological or bureaucratic cultures, even when controlling for technical practice maturity. This domain is the hardest to measure but the most important to improve.
Domain 5 KPI Definitions
| KPI | Definition | Level 3 Target | Level 4–5 Target |
|---|---|---|---|
| D5.1 — Deployment Autonomy | % of production deployments that teams can execute without requiring external approvals beyond the defined governance gate | >70% | >90% |
| D5.2 — Blameless Postmortem Rate | % of production incidents that result in a completed blameless postmortem within 5 business days | >80% | >95% |
| D5.3 — Training Investment | Average hours per team member per quarter in DevSecOps-relevant training (technical, security, process) | >8 hours/qtr | >16 hours/qtr |
| D5.4 — Westrum Culture Score | Quarterly team survey score on Westrum organizational culture scale (1–7), measuring information flow quality | >5.0 | >6.0 |
| D5.5 — Flow Efficiency | Ratio of value-adding time to total lead time across the delivery pipeline (eliminates wait time waste) | >40% | >60% |
| D5.6 — Cross-Functional Collaboration Index | Quarterly survey measuring dev, sec, ops, and test team coordination effectiveness (1–5 scale) | >3.5 | >4.2 |
| D5.7 — On-Call Burden | Average after-hours incident response hours per engineer per month (measures operational health) | <4 hrs/mo | <1 hr/mo |
| D5.8 — Team Retention Rate | Annual team retention rate for DevSecOps practitioners (high turnover destroys knowledge and slows delivery) | >80% | >90% |
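Flow Efficiency (D5.5) is often the least intuitive Domain 5 KPI to compute. The sketch below derives it from active-work and queue-wait hours of the kind that can be extracted from ticket state transitions; the values are hypothetical.

```python
# D5.5 Flow Efficiency: value-adding time divided by total lead time across work items.
# Hypothetical per-item data: hours actively worked vs. hours spent waiting in queues.
work_items = [
    {"active_hours": 10, "wait_hours": 26},
    {"active_hours": 6,  "wait_hours": 10},
    {"active_hours": 14, "wait_hours": 30},
]

total_active = sum(item["active_hours"] for item in work_items)
total_lead = sum(item["active_hours"] + item["wait_hours"] for item in work_items)

flow_efficiency = total_active / total_lead
print(f"D5.5 flow efficiency: {flow_efficiency:.0%}")  # Level 3 target: >40%
```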
KPI Reference Library
The following interactive library provides implementation guidance for each of the 40 CDMF KPIs — data sources, measurement tools, calculation methods, and common pitfalls. Select a domain to focus the view.
Interactive Scoring Tool
Use the following tool to self-assess your program against the CDMF. Rate each capability area on the 1–5 scale by clicking the dots. The tool calculates domain scores and an overall CDMF score in real time. This assessment is designed for a team discussion session — plan 60–90 minutes with dev, sec, ops, and program management representatives present.
Defense Program Benchmarks
The following benchmarks represent aggregate performance data from Continuum's program engagements and publicly available DoD DevSecOps assessments. They are intended to provide context for self-assessment scores — not as absolute targets, since program-specific constraints (classification level, mission criticality, governance overhead) significantly affect achievable scores.
Typical CDMF Scores by Program Type
Common Score Patterns and What They Mean
| Score Pattern | Typical Diagnosis | Priority Interventions |
|---|---|---|
| D1 High, D2 Low | Program deploys frequently but with high defect rates — speed without quality gates. Common in early agile adoptions that skipped test automation investment. | Invest in D2.3 (defect escape) and D2.4 (test automation coverage) before further increasing deployment frequency |
| D1 Low, D3 High | Strong security culture but pipeline bottlenecks blocking delivery — over-gating. Security gate configuration may be adding friction without proportionate risk reduction. | Analyze D1.7 (queue wait time) — are security gates causing queuing? Automate security checks to reduce gate latency |
| D3 High, D4 Low | Good security practices but poor compliance documentation. Technical security implemented but not documented for ATO purposes — common in technically strong teams that underinvest in documentation. | D4.5 (documentation currency) and D4.6 (ConMon automation) — automate evidence collection from existing tools |
| D1–D4 High, D5 Low | Strong technical practice but cultural/organizational dysfunction. Technically excellent team with high on-call burden, low collaboration, or retention problems — a burnout risk indicator. | Prioritize D5.7 (on-call burden) and D5.8 (retention) — technical excellence is not sustainable without cultural health |
| All Domains Level 2 | Uniformly developing program. Practices documented but inconsistent; some automation; reactive culture. This is the most common profile for programs 12–24 months into DevSecOps adoption. | Focus on one domain for improvement per quarter; avoid parallel improvement initiatives that dilute effort and produce no domain breakthrough |
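The score patterns above lend themselves to simple automated triage. The following toy sketch flags the first matching diagnosis; the "high" and "low" thresholds are illustrative assumptions, not CDMF-defined cut lines.

```python
# Toy triage of the score patterns in the table above; returns the first matching diagnosis.
def diagnose(domains: dict) -> str:
    high = lambda d: domains[d] >= 3.5   # illustrative threshold for a "high" domain score
    low = lambda d: domains[d] <= 2.5    # illustrative threshold for a "low" domain score
    if high("D1") and low("D2"):
        return "Speed without quality gates: invest in D2.3/D2.4 before raising frequency"
    if low("D1") and high("D3"):
        return "Over-gating: analyze D1.7 queue wait time and automate security checks"
    if high("D3") and low("D4"):
        return "Security under-documented: automate evidence collection (D4.5/D4.6)"
    if all(high(d) for d in ("D1", "D2", "D3", "D4")) and low("D5"):
        return "Burnout risk: prioritize D5.7 on-call burden and D5.8 retention"
    if all(2.0 <= s < 3.0 for s in domains.values()):
        return "Uniformly developing: focus improvement on one domain per quarter"
    return "No single dominant pattern: review domain gaps individually"

print(diagnose({"D1": 4.1, "D2": 2.2, "D3": 3.4, "D4": 3.0, "D5": 3.1}))
```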
DoD-Specific Guidance
The CDMF is designed to be applicable across program types, but defense programs face specific constraints that affect how KPIs are interpreted and targeted. This section provides guidance for the three most common DoD deployment contexts.
Software Acquisition Pathway (DoD 5000.87)
Programs operating under the Software Acquisition Pathway must demonstrate continuous delivery capability to progress through lifecycle gates. The program office expects to see evidence of the metrics in Table 7 of the DISA DevSecOps Fundamentals Guide. The CDMF maps directly to these requirements:
- Deployment frequency and lead time (D1.2, D1.3) are explicitly required by 5000.87 Section 3.4 and must be reported at program reviews.
- Change failure rate and defect escape rate (D2.1, D2.3) demonstrate quality of continuous delivery and are evaluated during Operational Acceptance reviews.
- DISA STIG compliance and POA&M currency (D3.1, D3.6) are required for ATO maintenance under the RMF Continuous Monitoring strategy.
- The Software Acquisition Pathway Metrics KPI (D4.7) specifically tracks whether the program is collecting and reporting the full set of metrics required by the pathway — a meta-metric that validates measurement system completeness.
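To show how these mapped KPIs might be packaged for program office reporting, the sketch below assembles a simple metrics export. The field names, program identifier, and values are hypothetical; the actual reporting format is defined by the program office and the metrics tables in the DISA DevSecOps Fundamentals Guide.

```python
import json
from datetime import date

# Hypothetical mapping of pathway-reported metrics to CDMF KPIs and current values.
metrics_package = {
    "program": "EXAMPLE-PROGRAM",
    "reporting_period_end": str(date(2025, 3, 31)),
    "metrics": [
        {"kpi": "D1.2", "name": "Authorized Deployment Frequency", "value": "3 deployments/month"},
        {"kpi": "D1.3", "name": "Lead Time for Changes (p50)", "value": "2.8 days"},
        {"kpi": "D2.1", "name": "Change Failure Rate", "value": "8%"},
        {"kpi": "D2.3", "name": "Defect Escape Rate", "value": "12%"},
        {"kpi": "D3.1", "name": "DISA STIG Compliance Rate", "value": "91%"},
        {"kpi": "D3.6", "name": "Open Critical/High Finding Age", "value": "41 days"},
    ],
}

print(json.dumps(metrics_package, indent=2))
```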
Classified / SIPR Environment Adjustments
Programs operating classified pipelines face additional constraints that affect benchmark targets:
- Deployment frequency (D1.2): Authorized deployment cadence is typically lower due to classified change management requirements. Adjust benchmark to ≥monthly rather than ≥weekly for Level 3 on classified systems.
- SBOM coverage (D3.4): Open-source tooling for SBOM generation may not be available in air-gapped environments. Programs should document the limitation and use available alternatives; the KPI measures the program's best achievable posture given constraints.
- Westrum culture score (D5.4): Survey instruments for classified programs must be administered through cleared channels. Programs should ensure survey anonymity is maintained to get accurate responses.
Continuum's Space Force Program Reference
Continuum led the first SpOC Operational Acceptance under the Software Acquisition Pathway — establishing what CDMF Level 3–4 looks like in practice for a Space Force program. Key validated benchmarks from that engagement: Lead Time for Changes below 3 days for standard changes; STIG compliance above 90% at acceptance; blameless postmortem process established with 85% capture rate. These are achievable targets, not aspirational ones.
The measurement framework that supported that acceptance used CDMF-aligned metrics to demonstrate delivery capability to the program office. The metrics provided the evidence base for the acceptance decision — not subjective assessments, but quantified delivery performance against defined thresholds. This is the model for how CDMF scores should be used in future Operational Acceptance events.
Improvement Roadmap
Improving DevSecOps maturity is a sequenced investment, not a parallel initiative. Programs that attempt to improve all five domains simultaneously typically improve none — effort is diffused, progress is invisible, and stakeholders lose confidence in the improvement program. The following phased roadmap reflects the sequencing that Continuum recommends based on the typical causal dependencies between domains.
Phase 1: Baseline and instrumentation. Conduct the CDMF self-assessment. Instrument the pipeline for automated data collection on all 40 KPIs where feasible. Establish baseline values. Identify the two or three KPIs with the largest gap between current state and Level 3 targets. Do not begin improvement initiatives until you have a measurement baseline — otherwise you cannot demonstrate improvement.
Phase 2: Delivery and quality foundations. Invest in the D1 and D2 foundational capabilities — pipeline automation, test coverage, and defect detection. These are the prerequisites for everything else: you cannot measure security gate effectiveness without a reliable pipeline; you cannot trust compliance automation without reliable test data. Target: bring D1 and D2 to Level 3 before investing heavily in D3–D5.
Phase 3: Security integration. Once the pipeline is reliable and quality gates are effective, integrate security tooling as native pipeline stages. Automate STIG scanning, SAST/DAST execution, SBOM generation, and vulnerability correlation. Drive D3 to Level 3. Reducing Critical Vulnerability MTTM and improving STIG compliance rate at this stage typically requires both tooling investment and process change around remediation workflows.
Phase 4: Compliance automation. Automate the evidence collection for continuous monitoring. Connect pipeline telemetry to RMF documentation. Automate POA&M updates from vulnerability scan results. Drive D4 to Level 3. A compliance automation investment at this stage — after D1–D3 are mature — is dramatically more effective because the underlying data quality is high.
Phase 5: Culture and sustained improvement. Invest explicitly in the culture KPIs — blameless postmortem processes, training investment, on-call burden reduction, and Westrum culture improvement. Begin pursuing Level 4–5 targets in the domains where you are strongest. Culture improvement is an ongoing investment, not a project — establish it as a permanent program element with quarterly measurement and leadership accountability.
The Continuum Approach
Continuum Resources does not offer DevSecOps maturity assessment as an advisory service disconnected from delivery. Every Continuum program engagement is itself a CDMF-measured deployment — we operate the measurement framework on our own program work, track our own KPIs, and use the same evidence base we deliver to clients to support our own program reporting. When we assess a client program's maturity, we are comparing against benchmarks we have validated through operational practice, not consulting frameworks we have read about.
- CDMF Baseline Assessment: Independent structured assessment of program DevSecOps maturity across all five domains and 40 KPIs. Includes data collection planning, team interviews, pipeline artifact analysis, and a scored findings report with benchmark comparison. Deliverable: CDMF score card, gap analysis, and prioritized improvement roadmap.
- KPI Instrumentation Design: Design and implementation of the automated data collection infrastructure to continuously measure CDMF KPIs from pipeline telemetry, scanning tools, ticketing systems, and survey instruments. Deliverable: measurement pipeline producing weekly KPI reports.
- Quarterly Maturity Reviews: Ongoing quarterly assessments tracking CDMF score progression, validating improvement initiatives, and updating the improvement roadmap. Provides the continuous evidence base for program management reporting and ATO maintenance.
- DoD 5000.87 Metrics Package: Development of the specific metrics package required by the Software Acquisition Pathway, formatted for program office reporting and Operational Acceptance review. Aligned to the DISA DevSecOps Fundamentals Guide Table 7 metrics.
- Culture Measurement Program: Design and administration of the Westrum culture survey and cross-functional collaboration measurement program — the aspects of DevSecOps maturity that automated tools cannot capture. Includes facilitated results workshops and action planning.
Engagement Models
| Engagement | Scope | Duration | Outcome |
|---|---|---|---|
| Baseline Assessment | Full 40-KPI CDMF assessment with benchmark comparison, gap analysis, and improvement roadmap | 4–6 weeks | CDMF score card, gap analysis, prioritized roadmap for program management reporting |
| KPI Instrumentation | Automated data collection pipeline for continuous CDMF measurement from existing tooling | 6–10 weeks | Weekly automated KPI reports; measurement infrastructure for continuous tracking |
| Quarterly Review Program | Recurring quarterly CDMF assessments tracking progression; roadmap updates; PM reporting support | Ongoing | Documented maturity progression; evidence package for SAP reviews and ATO maintenance |
| Full DevSecOps Program | Baseline + KPI instrumentation + improvement delivery + quarterly reviews; end-to-end DevSecOps maturity improvement | 12–18 months | Documented progression from current to target maturity with full evidence base |
Conclusion
DevSecOps maturity without measurement is aspiration. The CDMF provides defense programs with the structured, quantitative measurement system needed to move from aspiration to evidence — 40 KPIs across five domains that, taken together, answer the question program offices, oversight organizations, and warfighters actually need answered: Is this program getting better at delivering software faster, safer, and more reliably than it was before?
The framework is not the objective. The objective is delivering operational capability to the warfighter at the speed of relevance, with the security posture the threat environment requires. The CDMF is the measurement instrument that tells you whether you are on that path — and if not, which specific practices, which specific metrics, and which specific investments will move you there.
Ready to Measure Your DevSecOps Maturity?
Contact Continuum Resources for a complimentary CDMF Baseline Assessment tailored to your program type, classification environment, and current pipeline maturity.
References
- [DORA-2023] Google Cloud DORA Research Team — "2023 State of DevOps Report" — dora.dev, 2023. The empirical research foundation for CDMF Domains 1 and 2 benchmarks.
- [DORA-2018] Forsgren, N., Humble, J., Kim, G. — "Accelerate: The Science of Lean Software and DevOps" — IT Revolution Press, 2018. The causal research linking DevOps practices to organizational performance.
- [DISA-DEVSECOPS] DISA — "DevSecOps Fundamentals Guide v2.0" — October 2021. Authoritative DoD reference for DevSecOps pipeline requirements and metrics.
- [DOD-5000-87] Department of Defense — "Operation of the Software Acquisition Pathway" — DoD Instruction 5000.87, October 2020. The acquisition framework establishing continuous delivery requirements for DoD software programs.
- [CMMC-2] Department of Defense — "Cybersecurity Maturity Model Certification (CMMC) 2.0" — 32 CFR Part 170, 2024. Compliance framework governing CDMF Domain 4 compliance KPIs for CUI-handling programs.
- [WESTRUM-2004] Westrum, R. — "A typology of organisational cultures" — BMJ Quality & Safety, vol. 13, 2004. The organizational culture model underlying CDMF Domain 5 cultural measurement.
- [NIST-SSDF] NIST — "Secure Software Development Framework (SSDF) v1.1" — SP 800-218, 2022. Practices informing CDMF Domain 3 security KPI definitions.
- [SAFe-DEVSECOPS] Scaled Agile Inc. — "SAFe DevOps Competency" — scaledagileframework.com, 2023. Continuous delivery pipeline model and flow efficiency concepts applied in CDMF Domain 5.
- [CR-05] Richardson, K.A. — "WP-CR-2025-05: Zero Trust Architecture for CI/CD Pipelines" — Continuum Resources, 2025. The pipeline security architecture that CDMF Domain 3 KPIs are designed to measure.
- [CR-03] Richardson, K.A. — "WP-CR-2025-03: AI Governance for Federal Contractors" — Continuum Resources, 2025. Governance framework complementary to CDMF Domain 4 compliance measurement.
- [ACCELERATE] Forsgren, N., Kersten, M., Humble, J. — "The DORA State of DevOps 2019–2023 Research Program" — Evidence base for DORA four-key metric benchmarks used in this framework.