Plan 1 of 3

First-Time Implementation

You've been appointed to build a bank-wide risk identification process where none exists (or where the current process is a compliance exercise that satisfies nobody). This is the project plan.

30 weeks end-to-end · 2-3 full-time staff · 6-10 part-time BU coordinators · 6 phases with decision gates

Team Structure

You don't need a large team. You need a small, senior team with direct access to the CRO and authority to convene business unit heads.

  • Executive Sponsor (CRO or Deputy CRO; 2-3 hrs/week): removes blockers, signs off on methodology decisions, presents to Board
  • Risk Identification Lead (new hire or senior internal appointment; full-time): owns the process end-to-end; designs, builds, facilitates, reports
  • Methodology Analyst (from risk analytics or 2nd line; full-time, Weeks 1-20): taxonomy design, scoring calibration, data analysis, template building
  • BU Risk Coordinators (one per major business unit, 6-10 people; 1-2 days/week during active phases): complete bottom-up templates, attend workshops, own BU-level risk content
  • Regulatory Liaison (from compliance or regulatory affairs; 1 day/week): regulatory mapping, traceability, alignment with supervisory expectations
  • GRC / Data Support (from risk technology or operational risk; 2 days/week): tooling, data extraction, GRC platform configuration (if applicable)
Practical note: At a mid-tier bank (assets under $50B / £40B), the Risk Identification Lead often doubles as the Methodology Analyst. At a G-SIB, these are separate roles. The BU Risk Coordinators are not new hires — they're existing risk managers in each business unit who take on this responsibility as part of the annual cycle.
Phase 1: Foundation Setting
Weeks 1-6

Before you identify a single risk, you need the architecture: governance mandate, taxonomy, regulatory mapping, risk criteria, and context assessment. Skip this and everything downstream is built on sand.

Weeks 1-2 Governance and Mandate

  • Draft the Risk Identification Policy (scope, roles, frequency, reporting lines)
  • Secure CRO sign-off and Board Risk Committee awareness paper
  • Define the RACI: who owns identification, who contributes, who approves, who audits
  • Identify BU Risk Coordinators and brief them on their role
  • Agree on the annual calendar (when each phase runs, tied to ICAAP/ILAAP/Board cycle)

Weeks 3-4 Risk Taxonomy and Regulatory Mapping

  • Design the three-level taxonomy (L1 categories, L2 sub-categories, L3 granular risks)
    Tab 1
  • Map taxonomy to regulatory categories across all relevant jurisdictions
    Tab 2
  • Cross-reference against industry loss databases (ORX, SAS OpRisk, internal loss data)
  • Validate taxonomy with CRO and business unit heads (circulate for comment)
  • Map COSO objectives to each L1/L2 category

Week 5 External and Internal Context

  • Conduct PESTLE analysis (Political, Economic, Social, Technological, Legal, Environmental)
    Tab 3
  • Assess internal environment across the seven COSO ERM elements
    Tab 4
  • Review recent regulatory examination findings and internal audit reports
  • Compile a "straw man" risk list (initial universe seeded from taxonomy + context + loss data)

Week 6 Risk Criteria and Scoring Design

  • Define five-level impact scales: Financial, Regulatory, Reputational, Customer/Operational
    Tab 5
  • Define likelihood scale (1-5) anchored to frequency or probability
    Tab 5
  • Define vulnerability and speed-of-onset scales
    Tab 5
  • Calibrate CET1 impact anchors (institution-specific)
  • Get CRO sign-off on scoring methodology
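The CET1 anchors above are institution-specific, but the mechanics are simple: each impact scale maps a measured quantity onto the 1-5 levels. A minimal sketch in Python, with purely illustrative basis-point boundaries (the real anchors come from your own calibration):

```python
# Illustrative mapping from CET1 impact (in basis points) to the
# five-level financial impact scale. The boundary values below are
# placeholders, NOT calibrated anchors -- substitute your own.
CET1_ANCHORS_BPS = [5, 15, 50, 150]  # boundaries between levels 1|2|3|4|5

def financial_impact_level(cet1_impact_bps: float) -> int:
    """Return the 1-5 impact level for a CET1 impact in bps."""
    level = 1
    for boundary in CET1_ANCHORS_BPS:
        if cet1_impact_bps >= boundary:
            level += 1
    return level
```

The same pattern applies to the Regulatory, Reputational, and Customer/Operational scales once each has a measurable anchor.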
Phase 1 Deliverables
  • Approved Risk Identification Policy document
  • Three-level risk taxonomy with regulatory mapping (Tabs 1-2)
  • PESTLE and internal environment assessments (Tabs 3-4)
  • Calibrated risk criteria and scoring methodology (Tab 5)
  • Straw man risk list for workshop seeding
  • Annual calendar agreed with CRO
Decision Gate 1: CRO reviews foundation outputs and confirms the taxonomy, criteria, and scoring methodology before proceeding to process design. If the taxonomy doesn't fit the institution's business model, everything downstream will need rework.
Phase 2: Process Design
Weeks 7-10

Design the mechanics of how identification will actually work: the workshop format, the bottom-up template, the reconciliation process, and the assessment workflow.

Week 7 Top-Down Workshop Design

  • Design the SWIFT workshop structure (opening, independent review, systematic walk-through, prioritisation, emerging risks, close)
    Tab 6
  • Build the SWIFT Prompt Matrix with institution-specific guide words per L1 category
    Tab 7
  • Draft the participant list (CRO, business unit heads, CFO, COO, Head of Compliance, CTO)
  • Design the pre-workshop independent assessment form
    Tab 9

Week 8 Bottom-Up Template Design

  • Finalise the standardised 11-field bottom-up template
    Tab 8
  • Add worked examples (2-3 per template) so BU coordinators understand the expected quality
  • Map the 10 specialist sub-processes (RCSA, Conduct, Cyber, AML, Third-Party, Model Risk, etc.) and identify who feeds each
  • Create a submission timeline and quality checklist for BU coordinators

Week 9 Reconciliation and Assessment Design

  • Design the reconciliation process: how top-down and bottom-up outputs are compared
    Tab 10
  • Define gap types (blind spot, scope gap, assessment gap) and resolution process
  • Design the four-dimensional scoring worksheet with formulas
    Tab 11
  • Design the risk interaction matrix structure
    Tab 12
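The reconciliation comparison itself is mechanical once both views are keyed to the taxonomy. A hedged sketch, assuming risks are keyed by taxonomy code with 1-5 severity scores, and treating a difference of two or more levels as an assessment gap (scope gaps still need a human read of the definitions):

```python
def classify_gaps(top_down: dict, bottom_up: dict) -> dict:
    """Compare top-down vs bottom-up risk views, keyed by taxonomy code.

    Both inputs map risk code -> severity score (1-5). Risks seen by
    only one side are blind spots; risks both sides saw but scored
    2+ levels apart are assessment gaps.
    """
    td, bu = set(top_down), set(bottom_up)
    return {
        "blind_spot_bottom_up": sorted(td - bu),  # the BUs missed these
        "blind_spot_top_down": sorted(bu - td),   # the workshop missed these
        "assessment_gap": sorted(
            code for code in td & bu
            if abs(top_down[code] - bottom_up[code]) >= 2
        ),
    }
```

Every code this returns still needs an owner and a resolution meeting; the automation only finds the gaps, it doesn't close them.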

Week 10 Documentation and Reporting Design

  • Design the 14-field risk inventory record
    Tab 15
  • Design the one-page risk profile template
    Tab 16
  • Design the principal risk report format for the Board Risk Committee
    Tab 18
  • Document the entire end-to-end process in a methodology document (for audit trail and training)
Phase 2 Deliverables
  • Workshop agenda, SWIFT prompt matrix, participant list, and logistics
  • Bottom-up template with worked examples and submission guide
  • Reconciliation process design with gap taxonomy
  • Scoring, interaction, and reporting templates ready for pilot
  • End-to-end methodology document
Decision Gate 2: CRO and BU Risk Coordinators review the templates and process design. BU coordinators must confirm the bottom-up template is feasible given their data availability. This is where you learn if the process is too ambitious or too light.
Phase 3: Pilot
Weeks 11-16

Run the full process with one or two business units before rolling out bank-wide. The pilot will expose every assumption that doesn't survive contact with reality.

Weeks 11-12 Pilot Bottom-Up

  • Select 1-2 pilot business units (choose one complex, one simpler)
  • Brief pilot BU coordinators on the template and expectations
  • Pilot BUs complete bottom-up templates
    Tab 8
  • Review submissions: are risk definitions clear? Are controls specific? Is data quality assessed honestly?
  • Document what worked, what confused people, what needs to change

Week 13 Pilot Workshop

  • Distribute pre-workshop independent assessments to pilot participants (7-10 days before)
    Tab 9
  • Run a half-day SWIFT workshop with the pilot group
    Tabs 6-7
  • Test the facilitation approach: does the SWIFT format generate genuine new risks or just confirm existing ones?
  • Record workshop outputs and facilitator observations

Week 14 Pilot Reconciliation

  • Compare pilot workshop output (top-down) against pilot BU submissions (bottom-up)
    Tab 10
  • Identify gaps: risks found top-down but missing bottom-up, and vice versa
  • Test the gap resolution process: does it work? How long does it take?

Week 15 Pilot Scoring and Assessment

  • Score pilot risks using four-dimensional methodology
    Tab 11
  • Test the interaction matrix with pilot risks
    Tab 12
  • Run a bow-tie analysis on one material pilot risk
    Tab 13
  • Assess scoring calibration: are scores differentiating meaningfully, or is everything clustered at 3?
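The clustering check in the last bullet can be automated. A sketch, with the 60% concentration threshold as an illustrative assumption:

```python
from collections import Counter

def scores_cluster(scores: list, threshold: float = 0.6) -> bool:
    """True if one score level dominates the distribution -- i.e. the
    criteria are not differentiating. The 60% threshold is illustrative."""
    if not scores:
        return False
    top_count = Counter(scores).most_common(1)[0][1]
    return top_count / len(scores) > threshold
```

If the check fires on the pilot scores, revisit the scale anchors in Tab 5 before full roll-out rather than after.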

Week 16 Pilot Review and Refinement

  • Conduct lessons-learned session with all pilot participants
    Tab 29
  • Revise templates, scoring criteria, and process flow based on pilot findings
  • Update methodology document with changes
  • Prepare CRO briefing on pilot results and recommended adjustments
Don't skip the pilot. Institutions that roll out risk identification without piloting almost invariably have to re-do the first cycle. Common pilot discoveries: the bottom-up template has too many fields (or too few), the SWIFT guide words don't map to the institution's business mix, the scoring criteria produce clustered scores that don't differentiate, and the reconciliation process takes three times longer than expected.
Decision Gate 3: CRO reviews pilot results. Key question: does the process produce meaningfully different output from the existing risk register? If it just confirms what everyone already knew, the methodology needs sharpening before full roll-out.
Phase 4: Full Roll-Out
Weeks 17-22

Now run the full process across all business units. This is the first complete risk identification cycle.

Week 17 Launch

  • Brief all BU Risk Coordinators (workshop + written guide + worked examples)
  • Distribute bottom-up templates to all business units with deadline
    Tab 8
  • Distribute pre-workshop independent assessments to senior management
    Tab 9
  • Activate specialist sub-process inputs (RCSA, cyber, conduct, AML, model risk, third-party)

Weeks 18-19 Bottom-Up Collection and Quality Review

  • BUs complete and submit bottom-up templates
  • Risk ID team reviews each submission for quality (definitions clear? controls specific? data quality honest?)
  • Challenge sessions with BU coordinators who submit weak content (this will happen)
  • Collect and integrate specialist sub-process outputs

Week 20 Top-Down SWIFT Workshop

  • Distribute data pack (PESTLE summary, straw man, pre-workshop assessment aggregation)
  • Run full SWIFT workshop (half day with senior management)
    Tabs 6-7
  • Include Delphi panel for emerging risks (or run as separate session)
    Tab 21
  • Document all identified risks, prioritisation votes, and emerging risk assessments

Week 21 Reconciliation

  • Map top-down outputs against bottom-up submissions
    Tab 10
  • Identify and classify all gaps (blind spots, scope gaps, assessment gaps)
  • Run gap resolution meetings with relevant BU heads
  • Build the enterprise portfolio view (aggregate exposure across all BUs)

Week 22 Assessment

  • Score all identified risks: inherent (4 dimensions) and residual (with control effectiveness)
    Tab 11
  • Build the risk interaction matrix (T/A/C relationships between material risks)
    Tab 12
  • Run concentration analysis: single-name, sector, geographic
    Tab 14
  • Complete bow-tie analysis for the 5-10 most material risks
    Tab 13
  • Log all disagreements in the disagreement log (do not suppress minority views)
    Tab 23
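The inherent and residual scoring in the first bullet can be sketched as follows. The weights and the multiplicative control-effectiveness treatment are illustrative assumptions, not the methodology itself:

```python
def inherent_score(impact, likelihood, vulnerability, speed_of_onset,
                   weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted composite of the four 1-5 dimensions.
    The weights are illustrative placeholders, not house methodology."""
    dims = (impact, likelihood, vulnerability, speed_of_onset)
    return round(sum(w * d for w, d in zip(weights, dims)), 2)

def residual_score(inherent, control_effectiveness):
    """Discount inherent score by control effectiveness (0.0 = no
    mitigation, 1.0 = fully effective), floored at 1 to stay on-scale."""
    return round(max(1.0, inherent * (1.0 - control_effectiveness)), 2)
```

Whatever formula you adopt, the point of encoding it is consistency: every risk in Tab 11 is scored by the same arithmetic, and the calibration debate happens on the anchors and weights, not risk by risk.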
Decision Gate 4: CRO reviews the full risk identification output. Key questions: are there any material risks missing? Does the scoring differentiate meaningfully? Were any gaps escalated that the institution has never discussed before? If the answer to the third question is "no," the reconciliation process may not be working.
Phase 5: Documentation and Integration
Weeks 23-26

Turn the identification output into the living documents that connect to capital planning, Board reporting, and regulatory submissions.

Week 23 Risk Inventory and Profiles

  • Populate the 14-field risk inventory for every identified risk
    Tab 15
  • Create one-page risk profiles for all material risks (typically 15-25 risks)
    Tab 16
  • Assign risk owners and confirm they accept ownership

Week 24 KRIs and Monitoring

  • Define 2-3 KRIs per material risk with green/amber/red thresholds
    Tab 17
  • Map material risks to stress scenarios (ICAAP, ILAAP, CCAR, recovery plan)
    Tab 19
  • Document assumptions and register them for challenge
    Tab 22
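The green/amber/red evaluation behind each KRI threshold can be sketched as a small helper; the `higher_is_worse` flag handles KRIs (such as liquidity or coverage ratios) where a low reading is the warning sign:

```python
def kri_status(value, amber, red, higher_is_worse=True):
    """Return 'green' / 'amber' / 'red' for a KRI reading.
    Set higher_is_worse=False for metrics where a LOW value breaches."""
    if not higher_is_worse:
        # Negate so the same comparison logic applies in both directions.
        value, amber, red = -value, -amber, -red
    if value >= red:
        return "red"
    if value >= amber:
        return "amber"
    return "green"
```

For example, an outage count of 12 against amber/red thresholds of 10/15 reads amber, while a coverage ratio of 0.80 against amber/red floors of 0.95/0.90 reads red.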

Week 25 Board Reporting

  • Produce the first Principal Risk Report for the Board Risk Committee
    Tab 18
  • Include: material risks, scores, trends, appetite status, KRI alerts, key changes
  • CRO presents to Board Risk Committee

Week 26 Regulatory Integration

  • Complete the regulatory vs economic risk gap analysis
    Tab 20
  • Feed material risk list into ICAAP/ILAAP scenario design
  • Confirm regulatory traceability (every methodology phase maps to applicable requirements)
    Tab 28
Phase 5 Deliverables
  • Complete risk inventory with all identified risks (Tab 15)
  • Risk profiles for all material risks (Tab 16)
  • KRI dashboard with live thresholds (Tab 17)
  • First Board Risk Committee report (Tab 18)
  • Stress scenario mapping for ICAAP/ILAAP (Tab 19)
  • Regulatory vs economic gap analysis (Tab 20)
Phase 6: Embed and Sustain
Weeks 27-30

The process exists. Now make it survive the next reorganisation, budget cut, and CRO departure. This phase turns a project into a permanent capability.

Week 27 Training Programme

  • Design the five-level training programme (Board, CRO function, BU leads, front-line, new joiners)
    Tab 26
  • Deliver initial training to BU Risk Coordinators
  • Create training materials that survive staff turnover (methodology guide, template instructions, worked examples)

Week 28 Audit and Assurance

  • Design the seven-area internal audit programme
    Tab 27
  • Brief Internal Audit on the methodology (they need to understand it to audit it)
  • Schedule the first independent audit of the risk identification process

Week 29 Ongoing Cycle Design

  • Define the event-driven trigger framework (six categories of events that force re-identification)
    Tab 24
  • Set baseline process KPIs (completion rates, timeliness, scoring consistency)
    Tab 25
  • Design the quarterly re-identification cycle (see Plan 3 below)
  • Schedule next annual full cycle in the corporate calendar

Week 30 Lessons Learned and Close

  • Conduct full lessons-learned review across all phases
    Tab 29
  • Publish the "Year 1" report to the CRO summarising the process, findings, and recommendations
  • Transition from "project" to "business as usual" operating model
Decision Gate 6: CRO and Board Risk Committee confirm the risk identification process is established and meets regulatory expectations. The process transitions from project status to BAU. The Risk Identification Lead's role becomes permanent (if it wasn't already).

Plan 2 of 3

Annual Full Cycle

The process is built. This is the playbook for running the complete annual risk identification cycle — typically in Q4 to feed into ICAAP/ILAAP and annual planning in Q1.

12 weeks per cycle · 1-2 full-time staff · 6-10 part-time BU coordinators · 1 half-day workshop
Timing: Most banks run the annual cycle in Q4 (October–December) so that output feeds into ICAAP/ILAAP drafting in January and Board approval by March. If your regulatory cycle is different, adjust accordingly — the key constraint is that risk identification output must be available before capital planning begins.
Phase 1: Preparation
Weeks 1-2
  • Refresh the PESTLE analysis: what has changed in the external environment since last cycle?
    Tab 3
  • Refresh internal environment assessment: org changes, strategy shifts, new products, M&A
    Tab 4
  • Review and update taxonomy if needed (new risk categories from regulatory change or new business lines)
    Tab 1
  • Update the straw man risk list with last year's inventory + any new risks from quarterly cycles
  • Distribute pre-workshop independent assessments to senior management (allow 7-10 days)
    Tab 9
  • Distribute bottom-up templates to BU coordinators with deadline and updated worked examples
    Tab 8
  • Review prior year's lessons learned and process KPIs — apply improvements to this cycle
    Tabs 25, 29
Phase 2: Bottom-Up Collection
Weeks 3-5
  • BU coordinators complete standardised bottom-up templates (11 fields per risk)
    Tab 8
  • Collect specialist sub-process inputs: RCSA, conduct risk, cyber/ICT, AML/CFT, third-party, model risk
  • Risk ID team reviews each submission for quality: Are definitions specific enough to act on? Are controls named, not vague? Is data quality honestly assessed?
  • Challenge sessions with BUs that submit weak or copy-paste content from last year
  • Track completion rates and chase late submissions (process KPI)
    Tab 25
Quality test: If a BU's submission looks identical to last year's, it probably is. Require each BU to explicitly confirm what has changed and identify at least one new or materially changed risk. If nothing has changed in a year, the BU isn't looking hard enough.
Phase 3: Top-Down Workshop
Weeks 6-7
  • Prepare workshop data pack: PESTLE summary, internal environment changes, aggregated pre-workshop assessments, straw man risk list
  • Run half-day SWIFT workshop (typically 3-4 hours with senior management)
    Tabs 6-7
  • Facilitate systematic walk-through of each L1 category using SWIFT guide words
  • Run multivoting prioritisation to rank top-down risks by perceived materiality
  • Run Delphi panel for emerging risks (3-5 year horizon) — can be same session or separate
    Tab 21
  • Document all workshop outputs, including minority views and disagreements
    Tab 23
Phase 4: Reconciliation
Weeks 8-9
  • Map top-down workshop outputs against bottom-up BU submissions: which risks appear in both? Which appear in only one?
    Tab 10
  • Classify every gap: blind spot (missed entirely), scope gap (narrow definition), assessment gap (different severity view)
  • Run gap resolution meetings with relevant BU heads — every gap needs an owner and a resolution
  • Build the enterprise portfolio view: do BU-level risks aggregate into something the institution hasn't priced?
  • Update assumption register with any challenged assumptions
    Tab 22
This is where the methodology earns its keep. The reconciliation step is the single most valuable part of the process. If your top-down and bottom-up views agree perfectly, something is wrong — either the workshop wasn't challenging enough or the BUs tailored their submissions to match expectations. Real reconciliation should surface 3-10 genuine gaps per cycle.
Phase 5: Assessment
Week 10
  • Score all identified risks: four dimensions (Impact, Likelihood, Vulnerability, Speed of Onset), inherent and residual
    Tab 11
  • Update risk interaction matrix (T=Trigger, A=Amplifier, C=Correlated) for all material risks
    Tab 12
  • Run concentration analysis: single-name, sector, geographic exposures against limits
    Tab 14
  • Refresh bow-tie analysis for the 5-10 most material risks
    Tab 13
  • Assess data quality / confidence rating on every risk score
  • Determine materiality classification for each risk
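The interaction matrix is, structurally, a directed graph with T/A/C edge labels. A sketch of how it might be held and queried, with hypothetical risk codes:

```python
RELATIONS = {"T", "A", "C"}  # Trigger, Amplifier, Correlated

def add_interaction(matrix, source, target, relation):
    """Record that `source` triggers / amplifies / correlates with `target`."""
    if relation not in RELATIONS:
        raise ValueError(f"unknown relation: {relation!r}")
    matrix.setdefault(source, {})[target] = relation

def most_connected(matrix):
    """Risk with the most inbound + outbound interactions -- a natural
    candidate for the bow-tie refresh in the following bullet."""
    counts = {}
    for source, targets in matrix.items():
        counts[source] = counts.get(source, 0) + len(targets)
        for target in targets:
            counts[target] = counts.get(target, 0) + 1
    return max(counts, key=counts.get)
```

Even held in a spreadsheet (Tab 12), the same question is worth asking each cycle: which risk sits at the centre of the most interactions?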
Phase 6: Documentation, Reporting, and Integration
Weeks 11-12
  • Update the risk inventory (add new risks, remove retired risks, refresh all 14 fields)
    Tab 15
  • Update risk profiles for all material or changed risks
    Tab 16
  • Refresh KRI dashboard (new KRIs, updated thresholds, current values)
    Tab 17
  • Produce the annual Principal Risk Report for Board Risk Committee
    Tab 18
  • Update stress scenario mapping (link inventory to ICAAP/ILAAP/CCAR scenarios)
    Tab 19
  • Update regulatory vs economic gap analysis
    Tab 20
  • Confirm regulatory traceability (all requirements mapped to process steps)
    Tab 28
  • Conduct lessons-learned review and update process KPIs
    Tabs 25, 29
  • CRO presents to Board Risk Committee
Annual Cycle Output Package
  • Updated risk inventory (complete, 14 fields per risk)
  • Risk profiles for all material risks
  • KRI dashboard with current status
  • Principal Risk Report for the Board
  • Stress scenario mapping (feeds into ICAAP/ILAAP)
  • Emerging risk register (3-5 year horizon)
  • Reconciliation gap analysis report
  • Regulatory vs economic gap analysis
  • Lessons learned and process KPIs

Plan 3 of 3

Quarterly Re-Identification

Not a full cycle — a targeted check: what has changed? Are there new risks? Have existing risks materially shifted? This is what the Fed means by "quarterly re-identification" in SR 15-18, and what the PRA expects in its ongoing risk assessment requirements.

10 working days · 1 full-time staff · 6-10 BU coordinators (1 hour each) · 4 times per year
Critical distinction: Quarterly re-identification is not just refreshing scores on the existing risk list. It asks: are there new risks that weren't there three months ago? Have any risks changed in nature, not just severity? This is the difference between re-assessment (updating numbers) and re-identification (asking whether the list itself is still right). Most banks do the former. Regulators expect the latter.
Step 1: Trigger Review
Days 1-3
  • Review all six event-driven trigger categories: has anything fired since last quarter?
    Tab 24
  • Market dislocation: Any significant market events? Rate shocks, credit spread widening, equity drawdowns, commodity spikes?
  • Regulatory change: New regulations, supervisory statements, enforcement actions, or examination findings?
  • Peer event: Did a comparable institution suffer a material loss? (If yes, could it happen here?)
  • Internal incident: Material operational losses, near-misses, audit findings, or compliance breaches?
  • Strategic change: New products, markets, M&A, major outsourcing, or reorganisation?
  • Emerging risk signal: Has any emerging risk moved closer to materialisation?
    Tab 21
  • Review KRI dashboard: any amber or red breaches?
    Tab 17
  • Quick pulse check with BU Risk Coordinators: "Anything new or materially changed?" (email or 15-minute call)
Step 2: Targeted Update
Days 4-7
  • Add any newly identified risks to the inventory (full 14-field record)
    Tab 15
  • Re-score any risks where context has materially changed (four dimensions)
    Tab 11
  • Update risk interaction matrix if new interactions have emerged
    Tab 12
  • Refresh concentration analysis if portfolio composition has shifted
    Tab 14
  • Update emerging risk register: horizon changes, new signals, upgraded/downgraded risks
    Tab 21
  • Challenge any assumptions that have been tested by events
    Tab 22
  • Retire any risks that are no longer relevant (document why)
Light-touch is fine. In a quiet quarter with no trigger events, the quarterly review might take 3-4 days and result in minor score adjustments and no new risks. That's acceptable — the value is in the discipline of checking, not in the volume of changes. But in a quarter that saw a peer bank failure and a major regulatory change, the review should be correspondingly more thorough.
Step 3: Reporting
Days 8-10
  • Update the Principal Risk Report (changes-only format: new risks, changed scores, changed trends)
    Tab 18
  • Flag any new or upgraded material risks to the Board Risk Committee (interim paper if needed)
  • Update risk profiles for any risks that changed materially
    Tab 16
  • Refresh KRI dashboard with current quarter values and trend arrows
    Tab 17
  • Log any assumption challenges or disagreements
    Tabs 22-23
  • Update process KPIs for the quarter
    Tab 25
  • File quarterly review documentation (audit trail)
Quarterly Output
  • Updated risk inventory (changes highlighted)
  • Quarterly principal risk report (changes-only format)
  • Updated KRI dashboard
  • Updated emerging risk register
  • Trigger review documentation (audit trail)

Putting It All Together

The Annual Calendar

Once implemented, the risk identification process runs on a predictable annual rhythm with quarterly checkpoints.

Q1 (Jan–Mar)

Quarterly review

Post-annual cycle check. Feed risk output into ICAAP/ILAAP drafts. Board approves annual risk report. Internal audit begins.

Q2 (Apr–Jun)

Quarterly review

Mid-year trigger review. Internal audit findings received. Training delivery. Process improvement actions from lessons learned.

Q3 (Jul–Sep)

Quarterly review + preparation

Q3 trigger review. Begin preparing for annual cycle: update PESTLE, refresh taxonomy, schedule workshop, brief BU coordinators.

Q4 (Oct–Dec)

Full annual cycle

12-week full cycle: bottom-up collection, SWIFT workshop, reconciliation, assessment, documentation, Board reporting. Feeds Q1 ICAAP/ILAAP.

Ready to build the process?

Download the complete toolkit — the book, templates, AI prompts, and Copilot agent. Everything referenced in these playbooks.