This practice is about asking one question before every change: could this break something related to security or CUI protection? It’s not about being paralyzed by analysis. It’s about doing the thinking before implementation, not after. CM.L2-3.4.3 tracks the change. CM.L2-3.4.4 analyzes whether the change is safe. They work together. When changes affect access controls or system boundaries, that analysis also informs AC.L2-3.1.1 authorization decisions.
What the assessor is actually evaluating
The assessor is checking whether you think about security when things change. Specifically:
Did someone with security knowledge look at this change? Not necessarily a full-time security person. But someone who knows what CUI is, where it lives, and what happens if systems go down or get tampered with. In a small shop, that might be the owner. The point is that security wasn’t an afterthought.
Did you document the impact analysis? The thinking has to be visible. You can’t just say you thought about it. You need evidence that says “we looked at this change and determined the risk is low because it’s only affecting non-CUI systems” or “we determined this is high-risk because it touches the file server where CUI is stored, so we scheduled testing in advance.”
Did the analysis inform the approval decision? If something was flagged as high-risk, did that change how it was approved? Did approval happen at a higher level? Did implementation get scheduled differently? If the analysis came back clean, did everyone just rubber-stamp it? The analysis needs to matter.
What a realistic SSP definition looks like
[Organization Name] reviews all proposed changes to information systems for security impact prior to approval. For each change request, the Systems Administrator or IT Director documents an impact analysis that considers the following: whether the change affects systems storing, processing, or transmitting CUI; potential impact on system availability, integrity, or confidentiality; required testing or rollback procedures; and any required security controls or exceptions.
Changes are categorized as low-risk or high-risk based on the impact analysis. Low-risk changes (e.g., applying vendor patches to non-CUI systems, installing approved software) are approved by the Systems Administrator. High-risk changes (e.g., modifications to CUI storage systems, changes to security controls, network configuration changes) require approval by the IT Director and are tested in a controlled environment before production implementation.
The impact analysis and categorization are documented in the change request and retained with the change record.
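The analysis fields listed in the SSP language above can be captured as a simple structured record. A minimal sketch in Python; every field name and value here is a hypothetical example, not something prescribed by the control:

```python
# Illustrative change-request record capturing the impact-analysis fields
# from the example SSP definition. All names and values are hypothetical.
change_request = {
    "id": "CR-0042",
    "description": "Enable SMB signing on the file server",
    "touches_cui_systems": True,   # stores, processes, or transmits CUI?
    "impact": {                    # availability / integrity / confidentiality
        "availability": "medium",
        "integrity": "low",
        "confidentiality": "medium",
    },
    "testing_required": True,
    "rollback_plan": "Restore prior configuration from backup",
    "risk_level": "high",          # derived from the analysis
    "approved_by": "IT Director",  # high-risk changes escalate per the SSP
}
```

Whether this lives in a ticketing system, a spreadsheet, or a form matters less than every change record answering the same questions.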
The key details:
It names who does the analysis. Not a generic “the appropriate person.” The Systems Administrator or IT Director, identified by role and capable of being pointed at when the assessor asks.
It describes what the analysis covers. Not “security impact.” The example spells out: does it touch CUI systems? What’s the risk to availability, integrity, confidentiality? Do we need testing? Can we roll it back? These are the real questions that matter.
It ties analysis to risk categorization. The analysis isn’t just a report that gets filed. It results in a risk level that changes how the approval and implementation happen. Low-risk changes are approved faster. High-risk changes get tested.
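The categorization logic the example SSP describes can be sketched as a short decision function. The criteria and role names below mirror the example SSP but are illustrative; tune them to your own environment:

```python
def categorize(change):
    """Derive a risk level from the impact analysis, following the
    example SSP: changes touching CUI systems, security controls, or
    network configuration are high-risk; everything else is low-risk."""
    high_risk_flags = (
        change.get("touches_cui_systems"),
        change.get("modifies_security_controls"),
        change.get("changes_network_config"),
    )
    return "high" if any(high_risk_flags) else "low"

def required_approver(risk_level):
    """High-risk changes escalate to the IT Director; low-risk changes
    are approved by the Systems Administrator."""
    return "IT Director" if risk_level == "high" else "Systems Administrator"

# A vendor patch on a non-CUI workstation vs. a CUI file-server change:
patch = {"touches_cui_systems": False}
cui_server_change = {"touches_cui_systems": True}
```

The point of writing it this way is that the risk level is an output of the analysis and an input to the approval path, which is exactly the traceability the assessor is looking for.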
How to present your evidence
- Change management process that includes security impact analysis
- Completed change records showing impact analysis for recent changes
- Examples of both low-risk and high-risk change analysis
- Testing records for high-risk changes
- Documentation of risk categorization and how it drove approval/implementation decisions
When the assessor reaches CM.L2-3.4.4, have these ready:
Impact analysis examples in your change records. Don’t create fake ones. Pull real changes from the last six months. If one was a Windows patch on a workstation, the analysis might be “non-CUI system, vendor-supported update, low risk.” If one was a configuration change on your CUI server, the analysis should be longer and more detailed.
Both categories represented. You need to show that you distinguish between low-risk and high-risk changes. If every change in your log is identical or has the same canned analysis, the assessor will suspect you’re going through the motions without thinking.
Testing records for anything high-risk. If you categorized a change as high-risk, you should have evidence that you tested it before rolling it to production. That might be a lab environment test, a pilot group test, or a rollout schedule that included a verification step.
A clear connection between analysis and approval. If a change was marked high-risk, it should be approved by a higher authority or at a higher level in your organization. If a change was marked low-risk, the approval is faster. The logic should be traceable.
Keep answers short. Show the evidence, don't describe it. Let the assessor drive. For more on how to present in the assessment room, see How to Present Evidence in the Assessment Room.
Common failures
No impact analysis documented. Changes are tracked and approved, but nothing in the change record says anything about security. The assessor will ask what security impact this change had and won't find an answer.
Generic or identical analysis for all changes. Every change record has the exact same two-sentence impact statement. That looks like a template that someone filled in without thinking. Real changes have different profiles and should have different analyses.
High-risk changes not tested before production. If you identified something as high-risk and implemented it directly in production, the assessor will push. If you identified it as high-risk, you should have tested it somewhere safe first.
Analysis that contradicts the approval. You documented that something is high-risk and required IT Director approval, but the approval came from a junior admin. Or you marked something as low-risk but it touched a critical CUI system. Inconsistency creates doubt.
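A lightweight consistency check over your own change log can catch these contradictions before an assessor does. A sketch, assuming each record carries fields like the ones shown (all field names hypothetical):

```python
def audit(records):
    """Flag change records where the impact analysis and the
    approval/testing evidence contradict each other."""
    findings = []
    for r in records:
        if r["risk_level"] == "high" and r.get("approved_by") != "IT Director":
            findings.append((r["id"], "high-risk change not approved by IT Director"))
        if r["risk_level"] == "high" and not r.get("tested_before_production"):
            findings.append((r["id"], "high-risk change with no pre-production testing"))
        if r["risk_level"] == "low" and r.get("touches_cui_systems"):
            findings.append((r["id"], "change touching CUI systems marked low-risk"))
    return findings
```

Running something like this quarterly against your change log is cheap, and it surfaces exactly the inconsistencies the assessor will probe.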
Clear, documented thinking about each change. Different analyses for different types of changes. Risk categorization that makes sense and actually drives how the change gets approved and implemented. Testing for anything risky. When all that lines up, the assessor trusts that security thinking is integrated into your process as a core practice, not a checkbox exercise.
If you use an MSP/MSSP
If your MSP or MSSP proposes changes, they should be documenting the security impact analysis. If they’re not, ask for it. This is foundational ITIL practice, and a professional services provider should have it.
What you’re looking for:
The MSP’s impact analysis. When they propose a change, do they include a security risk assessment? If they just say “we’re going to patch the servers” and nothing more, that’s not enough.
Your review and approval. Once the MSP provides the analysis, you need to review it and approve it. That’s your role in the process. Don’t just accept what they say as the authority on whether something is risky. You know your environment and your CUI. You should be the one deciding whether the risk level is acceptable.
Feedback loop. If the MSP categorizes something as low-risk but you think it touches critical systems and should be high-risk, that needs to be a conversation. Your change management process should allow you to override or escalate.
The stronger MSSPs I've worked with maintain their own change management system with documented impact analysis. They can export a change log for your review period that shows security impact was considered for each change. That's a sign they're mature enough to be trusted with CUI systems. If your MSP can provide that, it makes your life easier in the assessment. If they can't, you need to implement the analysis layer yourself, on top of their change tracking.
This page covers CM.L2-3.4.4 from NIST SP 800-171 Rev 2 (3.4.4). The guidance here is based on experience in real CMMC assessments and is intended to help you prepare. It is not legal or compliance advice. Your organization’s situation is unique, and you should work with qualified professionals for formal assessment preparation.