Preparing Evidence for DCC Level 0: What Assessors Actually Look For
By Jay Hopkins · Published 3 March 2026 · Updated 21 April 2026 · 11 min read
Evidence is the currency of DCC Level 0. Every answer in your self-assessment must be supported by something an assessor can inspect. Weak evidence generates clarification cycles; fabricated evidence generates findings and sometimes outright refusal of certification. This guide covers what the assessor is actually looking for, control by control, with specific examples of strong and weak evidence.
The fundamental principle: operational, not performative
The single most important idea in DCC evidence is that assessors want to see that your controls are operating in the normal course of business, not that you produced documentation for the assessment.
A policy written last week and signed the day before submission is weaker evidence than a policy that has been in place for two years with three documented reviews. A patch report generated this morning for the assessment is weaker than a monthly patch report series showing the same process running for twelve months.
The assessor's test: could this organisation produce the same evidence tomorrow, next week, next quarter, without special effort? If the answer is yes, the evidence is operational. If it required a mini-project to produce, it is performative, and the assessor's confidence in the control's sustainability drops accordingly.
Evidence categories
Across the L0 requirement set, evidence falls into five categories:
Policy and procedure documents. Information Security Policy, Acceptable Use Policy, Incident Response Plan, joiner-mover-leaver procedure, Supplier Due Diligence procedure.
Configuration outputs. Firewall rulesets, cloud security group exports, Group Policy or MDM configuration, AD user exports, patch management dashboard screenshots.
Records of operation. Patch deployment reports, vulnerability scan outputs, user access review sign-offs, incident ticket logs, change management records.
Training and vetting records. Security awareness training completion logs, BS 7858 vetting certificates, starter checklists.
Supply chain assurance evidence. Supplier questionnaires, copies of supplier CE certificates, MSP contracts showing security obligations.
Most submissions need evidence from all five categories. A submission heavy on policy and procedure documents and light on records of operation is the classic "policy shelf-ware" warning sign for an assessor.
Control-by-control evidence expectations
Firewalls
Strong evidence:
Exported firewall ruleset with timestamps on rule creation and last review.
Quarterly firewall review log showing rules reviewed, removed, or updated.
Screenshot of the firewall administrative login page confirming MFA is enforced.
For cloud: Terraform or AWS Config output showing the current security group state.
Weak evidence:
A written statement that "our firewall is configured appropriately".
A ruleset exported for the first time ever, five minutes before submission.
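The quarterly review log above is easy to maintain programmatically once your ruleset export includes review dates. A minimal sketch of a staleness check, assuming a hypothetical export format with `rule_id` and `last_reviewed` fields (adapt to whatever your firewall or cloud provider actually emits):

```python
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=90)  # quarterly review cycle

def stale_rules(ruleset, today):
    """Return rule IDs whose last review is older than the quarterly interval.

    `ruleset` is a list of dicts with hypothetical keys `rule_id` and
    `last_reviewed` (a datetime.date); field names will differ per vendor.
    """
    return [r["rule_id"] for r in ruleset
            if today - r["last_reviewed"] > REVIEW_INTERVAL]

# Illustrative rules, not real configuration
rules = [
    {"rule_id": "allow-https-inbound", "last_reviewed": date(2026, 3, 1)},
    {"rule_id": "legacy-ftp-inbound",  "last_reviewed": date(2025, 6, 12)},
]
print(stale_rules(rules, today=date(2026, 4, 21)))  # → ['legacy-ftp-inbound']
```

Run monthly and file the output alongside the ruleset export, and you have exactly the timestamped review trail the assessor wants.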
Secure configuration
Strong evidence:
A one-page device build standard document per device type.
MDM or Intune policy export showing enforcement.
A sample device compliance report showing 95%+ adherence across the estate.
Weak evidence:
"We use standard builds."
A build document written specifically for the assessment.
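The 95%+ adherence figure above is just a ratio over the device compliance export. A sketch of the calculation, assuming a hypothetical MDM export with a boolean `compliant` field per device:

```python
def compliance_rate(devices):
    """Percentage of devices meeting the build standard.

    `devices` is a list of dicts with a hypothetical boolean key `compliant`,
    as might appear in an MDM or Intune compliance export.
    """
    if not devices:
        return 0.0
    compliant = sum(1 for d in devices if d["compliant"])
    return round(100 * compliant / len(devices), 1)

# Synthetic fleet: every 20th device non-compliant
fleet = [{"host": f"LT-{i:03}", "compliant": i % 20 != 0} for i in range(1, 101)]
print(compliance_rate(fleet))  # → 95.0
```

Recording this number monthly, with the underlying export, turns a single screenshot into the historical series assessors prefer.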
Security update management
Strong evidence:
Patch management policy stating the 14-day critical SLA.
A monthly patch dashboard screenshot series from the last three months.
An exception log showing any systems outside the SLA and the approved exception.
Weak evidence:
"We have WSUS."
A single current patch status report without historical context.
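The exception log above implies you can tell, for any critical patch, whether it beat the 14-day SLA. A minimal sketch of that check, assuming hypothetical record fields (`patch`, `severity`, `released`, `installed`) and illustrative patch identifiers:

```python
from datetime import date

CRITICAL_SLA_DAYS = 14  # the 14-day critical-patch SLA from the policy

def sla_breaches(deployments, today):
    """Flag critical patches installed (or still pending) past the SLA.

    `deployments` is a list of dicts with hypothetical keys `patch`,
    `severity`, `released`, and `installed` (None if still outstanding).
    """
    breaches = []
    for d in deployments:
        if d["severity"] != "critical":
            continue
        closed = d["installed"] or today  # open items accrue until today
        if (closed - d["released"]).days > CRITICAL_SLA_DAYS:
            breaches.append(d["patch"])
    return breaches

# Illustrative records, not a real patch report
report = [
    {"patch": "KB-EXAMPLE-1", "severity": "critical",
     "released": date(2026, 4, 1), "installed": date(2026, 4, 9)},
    {"patch": "KB-EXAMPLE-2", "severity": "critical",
     "released": date(2026, 3, 1), "installed": None},  # still outstanding
]
print(sla_breaches(report, today=date(2026, 4, 21)))  # → ['KB-EXAMPLE-2']
```

Anything this check flags should appear in the exception log with an approved exception, which is precisely the pairing the assessor looks for.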
User access control
Strong evidence:
The last three monthly user access review sign-offs.
MFA enforcement policy export from Microsoft Entra or equivalent.
The last five joiner-mover-leaver records with dates.
Separation of duty evidence (named administrators with dual accounts).
Weak evidence:
User list CSV with no review or sign-off evidence.
"We disable leavers' accounts."
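"We disable leavers' accounts" becomes strong evidence once you can show the check that proves it. A sketch of a leaver reconciliation, assuming hypothetical fields (`user`, `left`, `disabled`) joined from HR leaver records and a directory export:

```python
from datetime import date

def unrevoked_leavers(leavers, today, grace_days=1):
    """Return leavers whose account was not disabled within the grace window.

    `leavers` is a list of dicts with hypothetical keys `user`, `left`, and
    `disabled` (None if the account is still active).
    """
    late = []
    for l in leavers:
        disabled = l["disabled"] or today  # still-active accounts count as late
        if (disabled - l["left"]).days > grace_days:
            late.append(l["user"])
    return late

# Illustrative records, not real personnel data
records = [
    {"user": "a.smith", "left": date(2026, 3, 31), "disabled": date(2026, 3, 31)},
    {"user": "b.jones", "left": date(2026, 2, 28), "disabled": None},  # missed
]
print(unrevoked_leavers(records, today=date(2026, 4, 21)))  # → ['b.jones']
```

A monthly run of this reconciliation, signed off and filed, is exactly the user access review record the strong-evidence list describes.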
Malware protection
Strong evidence:
AV/EDR management console screenshot showing compliance across the endpoint estate.
Recent alert handling evidence (ticket or response log).
Email gateway configuration showing attachment and link handling.
Weak evidence:
"All our laptops have antivirus."
Screenshot of a single device's AV status.
Governance evidence
Beyond the technical controls, L0 requires governance evidence:
Information Security Policy. Covering scope, roles, acceptable use, data classification, incident reporting, remote working, supplier management, physical security. Signed by a director, reviewed within the last 12 months.
Incident Response Plan. Named roles, notification thresholds, MOD contact routes, testing evidence (even a single tabletop exercise in the last 12 months).
Acceptable Use Policy. Signed by every in-scope user as part of their onboarding or annual reacknowledgement.
Supplier Due Diligence procedure. How you evaluate your own suppliers for cybersecurity posture.
Staff vetting. BS 7858 certificates for staff with access to MOD data in sensitive roles, or equivalent evidence.
Evidence retention
The scheme expects evidence to be retained for the duration of the certificate (three years) plus a reasonable period for annual attestation and audit. A practical baseline:
Policies and procedures: the current version plus the previous two versions.
Configuration outputs: quarterly snapshots for the three-year certificate duration.
Operational records: retained for at least 12 months rolling.
Training and vetting records: retained for employment duration plus six years.
This is not a formal scheme requirement but is what the annual attestation process needs to draw on.
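The baseline above translates into simple date arithmetic per evidence type. A sketch encoding those rules, with hypothetical category names chosen for illustration:

```python
from datetime import date

def retention_until(kind, created, employment_end=None):
    """Return the date an evidence item can be discarded, per the baseline above.

    Hypothetical category names encoding the article's baseline:
      - "config_snapshot": three years from capture (certificate duration)
      - "operational":     12-month rolling window
      - "vetting":         employment end plus six years
    """
    if kind == "config_snapshot":
        return created.replace(year=created.year + 3)
    if kind == "operational":
        return created.replace(year=created.year + 1)
    if kind == "vetting":
        if employment_end is None:
            raise ValueError("vetting records need an employment end date")
        return employment_end.replace(year=employment_end.year + 6)
    raise ValueError(f"unknown evidence kind: {kind}")

print(retention_until("config_snapshot", date(2026, 4, 1)))  # → 2029-04-01
print(retention_until("vetting", date(2020, 1, 6),
                      employment_end=date(2026, 3, 31)))     # → 2032-03-31
```

Wiring rules like these into whatever stores your evidence keeps the pack aligned with the annual attestation cycle without manual housekeeping.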
Common evidence errors
Submitting the document without the record. A policy without evidence it is applied is not evidence.
Over-submitting. Uploading 40 unrelated documents and asking the assessor to find the relevant parts. Submit specifically against each question.
Evidence without authorship or date. Unsigned, undated documents raise immediate authenticity questions.
Screenshots cropped too tightly. If the hostname, timestamp, or user is cropped out, the evidence does not support the claim.
Over-edited evidence. Evidence that has been cleaned up "for neatness" often loses the operational fingerprint an assessor is specifically looking for.
How Fig Group prepares evidence
At Fig, evidence preparation is a consultant-supported activity, not a DIY exercise:
The consultant runs an evidence planning call at engagement opening and produces a specific evidence list for your scope.
You upload evidence into a shared secure folder structured by control family.
The consultant reviews the evidence before submission, flagging thin items, inconsistencies, or missing pieces.
The Fig technology platform runs across your estate to provide machine-generated evidence for the technical controls (which is stronger than screenshots because the data is objective).
Final submission happens only when the consultant is satisfied the evidence pack is complete.
This pre-review discipline is why Fig's first-pass rate on L0 is materially higher than audit-only models: the assessor sees a clean pack, not a work-in-progress.
Key questions from MOD suppliers researching this topic
What evidence do DCC Level 0 assessors look for first?
Assessors usually start with evidence that proves operational control: policies in use with review dates, security configurations with timestamps, access governance records with sign-offs, and patch or malware reporting with historical context.
Can we use existing documentation instead of creating new files?
Yes, and you should. Existing operational evidence is stronger than newly drafted paperwork because it demonstrates controls are genuinely embedded, not produced only for audit. Assessors specifically look for the "operational fingerprint" on evidence.
How recent should our evidence be?
Evidence should reflect real operating practice. Configuration outputs and operational records should be from the last 30 to 90 days; policies should have been reviewed within the last 12 months. Outdated screenshots and stale reports are a common cause of clarification requests.
Are screenshots alone enough for certification?
Screenshots help but rarely stand alone. Assessors expect corroborating records: policy statements that frame the screenshot, change logs that show configuration history, or management sign-off evidence for reviews the screenshot represents.
How long should evidence records be retained?
Retain evidence for the certificate duration (three years) plus annual attestation cycles. A practical baseline: current-version plus two previous versions of policies, quarterly configuration snapshots, rolling 12-month operational records, and employment-duration-plus-six-years for vetting records.
Related DCC articles
Keep reading.
Technical Guides
Preparing for DCC Level 1 Assessment: A Practical Six-Phase Guide
DCC Level 1 is substantially more involved than Level 0 and is where most suppliers underestimate the effort. This guide walks through a practical six-phase preparation approach covering scoping, governance, technical controls, platform gap analysis, mock assessment, and submission.
How Long Does Defence Cyber Certification Take? A Realistic Timeline for L0 and L1 Assessment
The honest answer to "how long does DCC take" depends more on the supplier's starting posture than on the Certification Body's turnaround. L0 can complete in under three weeks for a prepared organisation. L1 is a six to twelve week engagement. This guide walks through both, with the specific factors that lengthen or shorten each phase.
Scoping Your Organisation for DCC Level 0: The Decisions That Make or Break Your Assessment
Scope is the decision that most often determines whether a DCC Level 0 engagement runs clean or drags on for weeks. This guide walks through how scope is actually constructed, the decisions an assessor will challenge, the five common scoping errors, and how to align DCC scope with your Cyber Essentials scope.