Show Completion Rates and Behaviour Change in QBRs (2026)
Show Completion Rates and Behaviour Change in QBRs with benchmarks, risk impact, and a 4-step narrative MSPs can use. Turn updates into retention. Read now.

DefendWise
TL;DR
Completion rates prove training was delivered. Behaviour change proves it worked. MSPs who show both completion rates and behaviour change in QBRs transform a routine status update into a retention conversation backed by real risk reduction data. This guide defines each metric, provides industry benchmarks, and gives you a framework for presenting security awareness training results that clients actually care about.
Introduction
Most MSP quarterly business reviews follow a familiar script. Ticket counts, uptime numbers, patch compliance, and somewhere near the end, a line item showing that 92% of the client’s staff completed their security awareness training. Everyone nods. The meeting moves on.
That single completion figure is doing almost no work for you. It doesn’t tell the client whether their people are actually better at spotting phishing. It doesn’t quantify risk reduction. And it definitely doesn’t justify next quarter’s spend.
The MSPs winning retention conversations are the ones who show completion rates and behaviour change in QBRs together, layering participation data with outcome data to answer the question clients actually care about: are we reducing risk, or just staying busy?
This guide defines what completion rates and behaviour change metrics actually measure, provides the benchmarks you need for context, and walks through a QBR presentation framework that connects training activity to business outcomes. It’s written for MSPs and vCIOs delivering security awareness training who want their quarterly reviews to land.
What Is a Training Completion Rate?
Training completion rate is the percentage of assigned employees who finished a training module within the designated completion window, typically 30 to 90 days from assignment. An employee counts as “complete” when they reach the final screen and either pass an assessment or acknowledge completion.
This metric proves one thing: the organisational mechanism for delivering training works. People were assigned modules and they finished them. That matters for compliance, for audit evidence, and for establishing baseline discipline.
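As a sketch of the windowed definition above, here is how the calculation might look in code. The record layout and the 60-day window are illustrative assumptions, not any particular platform's schema.

```python
from datetime import date, timedelta

# Illustrative assignment records: (employee, assigned_on, completed_on or None).
assignments = [
    ("a.chen",  date(2026, 1, 5), date(2026, 1, 20)),
    ("b.ortiz", date(2026, 1, 5), date(2026, 3, 30)),  # finished outside the window
    ("c.patel", date(2026, 1, 5), None),               # never finished
    ("d.kim",   date(2026, 1, 5), date(2026, 2, 1)),
]

WINDOW = timedelta(days=60)  # assumed policy window (typically 30 to 90 days)

completed = sum(
    1 for _, assigned, finished in assignments
    if finished is not None and finished - assigned <= WINDOW
)
rate = 100 * completed / len(assignments)
print(f"Completion rate: {rate:.0f}%")  # 2 of 4 inside the window -> 50%
```

Breaking the same calculation down by department or hire date is usually how enrolment gaps get named before they show up in a QBR.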
What Completion Rate Doesn’t Prove
It does not prove that employees paid attention. It does not prove they retained the information. And it does not prove their behaviour changed afterward.
The metric is also vulnerable to gaming. Employees who click through modules at maximum speed while reading email in another window still register as completions. Research from PhishSkill found that 95% completion with poor engagement produces worse outcomes than 75% completion with genuine attention, and that module design matters more than enforcement.
Completion Rate Benchmarks by Industry
Knowing where your client sits relative to their industry peers gives the completion figure context in a QBR. Here are current benchmarks:
| Industry | Typical Completion Rate | Notes |
|---|---|---|
| Financial Services | 90 to 98% | Regulatory pressure from FINRA and PCI DSS drives enforcement |
| Healthcare | 85 to 95% | High completion but paradoxically high phishing click rates |
| Government (Federal) | 90 to 98% | FISMA requirements with audit enforcement |
| Government (State/Local) | 65 to 85% | Less regulatory pressure |
| Technology | 70 to 90% | Engineers 85 to 95%, sales and admin 60 to 75% |
| Education | 60 to 80% | Faculty resistance to mandates, structural barriers |
| Retail/Hospitality | 60 to 80% | High turnover, seasonal dips below 50% |
Source: PhishSkill completion rate benchmarks, April 2026
As a practical MSP target, ISA Cybersecurity recommends treating 90% or higher as strong and 70% or lower as a warning sign that something is wrong with the training program itself, whether that's complexity, boredom, or organisational apathy.
The Pricing Factor Most MSPs Overlook
Per-seat pricing creates a quiet incentive to limit who gets trained. If you’re paying per user, covering every receptionist, intern, and contractor starts to feel expensive. That constraint artificially caps completion rates and leaves gaps in coverage. Flat-fee models, like DefendWise’s unlimited-user approach, remove this barrier entirely, making universal training coverage economically viable regardless of headcount.
What Is Behaviour Change in Security Awareness Training?
Behaviour change means measurable shifts in how employees act when facing real or simulated threats. Not knowing the right answer on a quiz, but doing the right thing when a phishing email hits their inbox on a Tuesday afternoon.
As Brightside AI puts it: completion rates prove training happened. They don’t prove behavior changed. Your team can watch every module, pass every quiz, and still fall for the next well-crafted phish.
This is where the real story lives for your QBR. Five metrics capture behaviour change in ways that matter.
1. Phishing Click Rate (Phish-Prone Percentage)
This measures the percentage of employees who click a link or open an attachment in a simulated phishing email. It’s the most widely used behaviour change metric in security awareness programs.
The numbers tell a clear story. According to KnowBe4’s benchmarking study of over 9.5 million users across 30,000 organizations, the baseline phish-prone percentage across all industries was 34.3% before training. After 90 days of consistent training, that figure dropped to 17.6%. After a full year, it reached 5%, an 86% reduction.
Your target: below 5% click rate. Anything above 20% should raise concern.
2. Reporting Rate
Reporting rate measures the percentage of employees who correctly identify and flag suspicious emails through official channels (like a phish-alert button). This captures proactive behaviour rather than passive non-clicking. Someone who ignores a phishing email isn’t helping. Someone who reports it is actively contributing to the organisation’s defence.
Brightside AI recommends targeting at least 70% reporting rates as a benchmark for strong security culture. Research shows reporting rates can more than double with consistent training, climbing from roughly 34% before training to 74% after 12 months.
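Both behaviour metrics above reduce to simple ratios over simulation results. A minimal sketch, assuming a hypothetical per-campaign result format:

```python
# Hypothetical per-employee results from one simulated phishing campaign.
results = [
    {"user": "a.chen",  "clicked": False, "reported": True},
    {"user": "b.ortiz", "clicked": True,  "reported": False},
    {"user": "c.patel", "clicked": False, "reported": True},
    {"user": "d.kim",   "clicked": False, "reported": False},  # ignored it: not helping
]

n = len(results)
click_rate = 100 * sum(r["clicked"] for r in results) / n
report_rate = 100 * sum(r["reported"] for r in results) / n

print(f"Click rate: {click_rate:.0f}%, reporting rate: {report_rate:.0f}%")
# 1 of 4 clicked (25%), 2 of 4 reported (50%)
```

Note that the two denominators are the same population, which is what lets you show the click line falling while the reporting line rises on a single slide.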
3. Dwell Time (Time to Report)
Dwell time measures the gap between when a phishing email arrives and when someone reports it. Shorter dwell times mean faster threat detection and less time for an attacker to move laterally or exfiltrate data.
The financial stakes are real. The IBM Cost of a Data Breach Report found that breaches detected in less than 200 days cost approximately $3.87 million, while those lasting longer climb to $5.01 million. Track median dwell time quarterly and show the trend line going down.
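Tracking that trend is straightforward once report timestamps are logged. A sketch with invented numbers:

```python
import statistics

# Hypothetical dwell times in minutes (email arrival to first report), per quarter.
dwell_by_quarter = {
    "Q1": [240, 95, 310, 60, 180],
    "Q2": [120, 45, 200, 30, 90],
    "Q3": [35, 20, 75, 15, 50],
}

# The median resists skew from the one email nobody reports for days.
for quarter, minutes in dwell_by_quarter.items():
    print(f"{quarter}: median dwell time {statistics.median(minutes):.0f} min")
# Q1: 180 min, Q2: 90 min, Q3: 35 min -- the shrinking trend line for the slide
```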
4. Repeat-Clicker Reduction
Some employees fail phishing simulations repeatedly. Tracking whether these habitual clickers improve over time is one of the clearest indicators of genuine behaviour change. Research indicates that behaviour-focused programs make users six times less likely to click and seven times more likely to report threats compared to legacy approaches.
5. Human Risk Score
A composite metric that quantifies an individual's likelihood of falling victim to cyberattacks based on observed behaviours: failing phishing tests, weak passwords, ignoring patches, reporting speed. Think of it as an inverted credit score for cybersecurity: a higher number means higher risk, so you want it trending down. Platforms that calculate this give you a single number to track per user and per organisation over time.
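Each vendor weights its own proprietary signals, but the shape of the calculation is a weighted composite. A purely illustrative sketch (the signal names and weights here are invented, not any platform's formula):

```python
# Invented signals and weights for illustration; real platforms use their own.
WEIGHTS = {
    "phish_fail_rate": 0.4,  # fraction of recent simulations failed
    "weak_password":   0.2,  # 1.0 if password-hygiene flags exist, else 0.0
    "patch_overdue":   0.2,  # fraction of required patches overdue
    "slow_reporting":  0.2,  # normalised reporting delay (0 = instant, 1 = never)
}

def human_risk_score(signals: dict) -> float:
    """Weighted composite on a 0-100 scale; higher means higher risk."""
    return 100 * sum(WEIGHTS[k] * signals[k] for k in WEIGHTS)

employee = {
    "phish_fail_rate": 0.25,
    "weak_password":   0.0,
    "patch_overdue":   0.1,
    "slow_reporting":  0.3,
}
print(f"Risk score: {human_risk_score(employee):.0f}/100")  # prints "Risk score: 18/100"
```

Averaging the score across a workforce gives the single organisation-level trend line that works well on a QBR slide.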
The Healthcare Paradox: Why Completion Alone Fails
Here is the most compelling argument you can make in a QBR. Healthcare organisations consistently achieve 85 to 95% completion rates, among the highest of any industry. Yet they also have some of the highest phishing click rates. Completion without behaviour change is a checkbox exercise. When you show completion rates and behaviour change in QBRs side by side, this gap becomes impossible to ignore.
What Is a QBR (Quarterly Business Review)?
In the MSP context, a quarterly business review is a structured meeting where the MSP presents performance data, risk posture, and strategic recommendations to the client’s decision-makers. It’s the primary touchpoint for demonstrating value beyond break-fix support.
QBRs Are Shifting From Operational Reviews to Risk Reviews
The role of QBRs has changed. According to the 2025 MSP Global State of the Industry Report, more than two-thirds of MSPs say their clients expect them to play a strategic role in cybersecurity and risk management, not just technology operations.
The old format (ticket counts, uptime percentages, patching statistics, service usage) is accurate and often impressive, but it rarely sparks meaningful decisions. Ironscales put it well: if QBRs focus on what was delivered rather than what changed, they risk becoming disconnected from the realities clients are navigating outside that conference room.
The client’s real question is simple: Are we meaningfully reducing risk, or are we just staying busy?
Security awareness training metrics are where MSPs can answer that question most concretely. When you show completion rates and behaviour change in QBRs, you move from reporting activity to demonstrating outcomes.
How Completion Rates and Behaviour Change Work Together in QBRs
Think of it in three layers.
Layer 1: Participation. Completion rates. Did people show up for training? This is your input metric.
Layer 2: Behaviour. Phishing click rates, reporting rates, dwell time, repeat-clicker trends. Did people change what they do? This is your outcome metric.
Layer 3: Culture. Proactive reporting of real (not simulated) threats, peer-to-peer security conversations, voluntary engagement with optional training. Did the organisation shift its posture? This is your long-term metric.
Most MSPs present Layer 1 and stop. The ones retaining clients at higher margins present all three.
A Four-Step QBR Narrative
Here is a concrete structure for presenting security awareness training results. Each step builds on the previous one.
Step 1: “Here’s who completed training.”
Show the completion rate for the quarter alongside the trend over previous quarters. Compare to the industry benchmark. If the client is a 40-person accounting firm hitting 95% completion, that means only two people missed. Name the gap (were they on leave? new hires not yet enrolled?) and show the plan to close it.
Step 2: “Here’s how their behaviour changed.”
This is the slide that earns attention. Show the phishing simulation click rate going down. Show the reporting rate going up. If you track dwell time, show it shrinking. If you have repeat-clicker data, show how many former habitual clickers have improved. Trend lines matter more than any single quarter’s numbers.
Step 3: “Here’s what that means for your risk.”
Translate the behaviour data into business terms. The IBM Cost of a Data Breach Report found that employee training reduced breach costs by an average of $258,629. That figure gives you a dollar amount to put in front of a client CFO. Pair it with the Verizon DBIR finding that 60% of breaches involve the human element, and the value proposition writes itself.
A 2025 study in the International Journal of Science and Research Archive found that behaviourally-driven training produced a 48% increase in phishing email detection and a 36% reduction in policy violations. Those are the kinds of outcomes that justify continued investment.
Step 4: “Here’s what we recommend next quarter.”
Close with forward-looking recommendations. Targeted training for repeat clickers. New modules covering emerging threats (deepfakes, QR phishing, AI voice scams). Policy changes based on what the data revealed. This positions the MSP as a strategic advisor, not a vendor reading off a dashboard.
Practitioners on Reddit’s r/msp community confirm this approach works. One MSP noted that “the retention argument basically writes itself when you can show behavior trend data in QBRs.” Completion data opens the door. Behaviour change data closes the deal.
Benchmarks Cheat Sheet for MSP QBRs
Use this table as a quick reference when preparing QBR slides. Every metric has a clear target and a source your client can verify.
| Metric | What It Measures | Good Target | Timeline | Key Source |
|---|---|---|---|---|
| Completion Rate | Training delivered | 90%+ | Per quarter | ISA Cybersecurity |
| Phishing Click Rate | Susceptibility to phishing | Below 5% | After 12 months of training | KnowBe4 Benchmarking Report |
| Reporting Rate | Proactive threat detection | 70%+ | After 12 months of training | Brightside AI |
| Dwell Time | Speed of threat reporting | Declining quarter over quarter | Ongoing trend | Hoxhunt KPI Framework |
| Repeat-Clicker Rate | Habitual risk improvement | Declining quarter over quarter | Ongoing trend | Industry consensus |
| Human Risk Score | Composite individual risk | Improving trend | Ongoing trend | Keepnet Labs |
When you show completion rates and behaviour change in QBRs using this benchmarks table, you give clients the industry context they need to understand whether their numbers are good, average, or concerning.
Common Mistakes When Presenting SAT Metrics in QBRs
The “Green Dashboard” Trap
Showing a 95% completion rate in a green box and moving on. It looks great. It proves nothing about whether people are safer. Without behaviour change data alongside it, you’re giving the client a false sense of progress.
Comparing to Unrealistic Benchmarks
Expecting a 0% phishing click rate is not realistic. Even well-trained organisations have people who occasionally slip. Setting unrealistic targets makes real progress look like failure.
Presenting Snapshots Instead of Trends
One quarter’s phishing click rate is a data point. Three quarters is a trend. Always show trend lines. A client whose click rate went from 28% to 14% to 8% can see the trajectory. A single “8%” number without context doesn’t communicate progress.
Skipping the Financial Connection
Security metrics in isolation don’t resonate with business decision-makers. Connect every behaviour change metric to financial or compliance outcomes. The $258,629 average breach cost reduction figure from IBM is there for exactly this reason.
Using Vendor Jargon
“Phish-prone percentage” means nothing to a client CEO. “The percentage of your staff who clicked a simulated phishing link” does. Translate every metric into plain language. Save the acronyms for your internal team.
Ignoring Compliance Evidence
Many clients need security awareness training evidence for Essential Eight, ISO 27001, NIST CSF, or cyber-insurance questionnaires. If your QBR doesn’t explicitly map training data to these frameworks, you’re leaving value on the table. Platforms that generate compliance-ready reporting mapped to these standards save hours of manual collation.
How to Automate QBR Reporting for Security Awareness Training
The biggest barrier to showing completion rates and behaviour change in QBRs consistently is the admin burden. Pulling data from the training platform, formatting it into slides, adding trend lines, and repeating the exercise across 30 or 50 client tenants eats hours every quarter.
Modern SAT platforms built for MSPs should handle this automatically. The features that matter most:
Auto-generated branded reports. White-label PDFs with your logo, your colours, your domain. The client sees you as the security authority, not a third-party vendor. This matters for trust and for the QBR narrative.
Multi-tenant roll-ups. A single console where you can view and compare completion rates, click rates, and reporting rates across every client. Spot the outliers fast. Prepare per-client QBR decks without switching between portals.
Compliance evidence packs. Pre-mapped to Essential Eight, ISO 27001, NIST CSF, and cyber-insurance questionnaires. Instead of manually assembling audit evidence, export a ready-made pack.
Microsoft 365 directory sync. Auto-enrol new joiners and remove leavers without manual user maintenance. This directly affects completion rates, because gaps in enrolment create gaps in coverage that show up as lower numbers in your QBR.
Trend data built in. Quarter-over-quarter comparisons generated automatically, not assembled in a spreadsheet.
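Even without a platform feature, the roll-up itself is simple. A sketch of a quarter-over-quarter comparison across hypothetical client tenants:

```python
# Hypothetical multi-tenant metrics keyed by client, then quarter.
metrics = {
    "Acme Accounting": {"Q1": {"completion": 88, "click": 22},
                        "Q2": {"completion": 95, "click": 14}},
    "Beta Legal":      {"Q1": {"completion": 72, "click": 31},
                        "Q2": {"completion": 81, "click": 26}},
}

for client, q in metrics.items():
    dc = q["Q2"]["completion"] - q["Q1"]["completion"]  # completion delta, points
    dk = q["Q2"]["click"] - q["Q1"]["click"]            # click-rate delta, points
    print(f"{client}: completion {q['Q1']['completion']}% -> "
          f"{q['Q2']['completion']}% ({dc:+d} pts), "
          f"click rate {q['Q1']['click']}% -> {q['Q2']['click']}% ({dk:+d} pts)")
```

Completion rising while click rate falls, per tenant, is the Layer 1 plus Layer 2 picture in one view; the point of the platform features above is that nobody should be assembling this by hand fifty times a quarter.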
DefendWise was built specifically for this workflow. It’s an AI-native, white-label security awareness training platform for MSPs with a multi-tenant console, automated onboarding, compliance reporting, and branded QBR PDFs. One flat fee of $399 per month covers unlimited users and unlimited client organisations, so there’s no per-seat math complicating your margins as client headcounts grow.
If your current platform makes it hard to show behaviour change data in QBRs alongside completion rates, that’s a tooling problem worth solving. Start a free 7-day trial to see what automated QBR reporting looks like in practice, with no credit card and no sales call required.
Putting It All Together
The shift from showing completion rates alone to showing completion rates and behaviour change in QBRs is not cosmetic. It fundamentally changes the conversation from “did training happen?” to “is training working?”
Completion is your input metric. It proves the machine runs. Behaviour change is your outcome metric. It proves the machine produces results. Presenting both, layered with industry benchmarks and financial context, positions the MSP as a strategic risk advisor rather than a checkbox vendor.
The data supports this approach from every angle. KnowBe4’s benchmarking across 30,000 organisations shows phishing susceptibility dropping from 34.3% to 5% with sustained training. IBM quantifies the financial impact at $258,629 in average breach cost savings. And practitioners across MSP communities consistently report that behaviour trend data is what turns QBRs into retention events.
Clients don’t renew because you ran training. They renew because you proved it made their organisation safer. Show both metrics. Show them together. Show them every quarter.
Frequently Asked Questions
What is a good completion rate for security awareness training?
A completion rate of 90% or higher is considered strong across most industries. Below 70% signals a problem, whether that’s training content that’s too complex, too boring, or an organisational culture that doesn’t treat security training seriously. Financial services and federal government organisations routinely hit 90 to 98% due to regulatory pressure, while education and retail tend to land between 60 and 80%.
How do you measure behaviour change in security awareness training?
Five metrics capture behaviour change effectively: phishing simulation click rate (target below 5% after 12 months), email reporting rate (target 70% or higher), dwell time (the gap between a phishing email arriving and someone reporting it), repeat-clicker reduction (tracking whether habitual failers improve), and human risk score (a composite score combining multiple behavioural signals). Each measures a different dimension of whether employees act differently after training.
Why is completion rate alone not enough for QBRs?
Completion rate proves training was delivered. It doesn’t prove anyone learned anything or changed their behaviour. Healthcare is the clearest example: the industry consistently achieves 85 to 95% completion rates while maintaining some of the highest phishing click rates. Without behaviour change data, a high completion number can create a false sense of security that leaves the client exposed.
What security metrics should MSPs include in QBRs?
At minimum: training completion rate, phishing simulation click rate (with trend over time), reporting rate, and compliance status against relevant frameworks. For more sophisticated presentations, add dwell time, repeat-clicker data, human risk scores, and financial context (like IBM’s $258,629 average breach cost reduction figure). Always show trends, not just single-quarter snapshots.
How often should MSPs run phishing simulations to show behaviour change in QBRs?
Monthly simulations provide enough data points to show meaningful trends in quarterly reviews. Running simulations less frequently (once per quarter, for example) gives you only one data point per QBR period, making it impossible to distinguish a real trend from random variation. Vary the simulation types and difficulty levels to prevent employees from recognising a pattern.
What is a human risk score?
A human risk score is a composite metric that quantifies an individual employee’s likelihood of causing a security incident based on observed behaviours. It factors in phishing test results, password practices, patch compliance, reporting speed, and other signals. It works similarly to a credit score, giving you a single number to track over time per person and per organisation. A declining score across the workforce is one of the clearest indicators of culture-level change.
How do I handle clients with low completion rates in a QBR?
Don’t hide the number. Present it alongside the reason and the fix. Low completion usually traces back to one of three causes: training content that doesn’t engage, a lack of management enforcement, or structural barriers like high turnover or shift work. Show the client the industry benchmark for their sector, identify what’s dragging the number down, and propose a specific plan (shorter modules, manager escalation workflows, better onboarding automation) with a target for next quarter.
Can showing behaviour change data in QBRs help with client retention?
Yes, and practitioners confirm it. MSPs on Reddit’s r/msp community report that showing behaviour trend data alongside completion rates directly supports retention conversations. When clients can see that their phishing click rate dropped from 30% to 7% over three quarters, the value of the training program becomes tangible. That’s a harder thing to cancel than a line item that just says “security awareness training, delivered.”