Team Structure
You don't need a large team. You need a small, senior team with direct access to the CRO and authority to convene business unit heads.
| Role | Commitment | Who | Why |
| --- | --- | --- | --- |
| Executive Sponsor | 2-3 hrs/week | CRO or Deputy CRO | Removes blockers, signs off on methodology decisions, presents to the Board |
| Risk Identification Lead | Full-time | New hire or senior internal appointment | Owns the process end-to-end: designs, builds, facilitates, reports |
| Methodology Analyst | Full-time (Weeks 1-20) | From risk analytics or second line | Taxonomy design, scoring calibration, data analysis, template building |
| BU Risk Coordinators | 1-2 days/week during the active phase | One per major business unit (6-10 people) | Complete bottom-up templates, attend workshops, own BU-level risk content |
| Regulatory Liaison | 1 day/week | From compliance or regulatory affairs | Regulatory mapping, traceability, alignment with supervisory expectations |
| GRC / Data Support | 2 days/week | From risk technology or operational risk | Tooling, data extraction, GRC platform configuration (if applicable) |
Practical note: At a mid-tier bank (assets under $50B / £40B), the Risk Identification Lead often doubles as the Methodology Analyst. At a G-SIB, these are separate roles. The BU Risk Coordinators are not new hires — they're existing risk managers in each business unit who take on this responsibility as part of the annual cycle.
Before you identify a single risk, you need the architecture: governance mandate, taxonomy, regulatory mapping, risk criteria, and context assessment. Skip this and everything downstream is built on sand.
Weeks 1-2 Governance and Mandate
- Draft the Risk Identification Policy (scope, roles, frequency, reporting lines)
- Secure CRO sign-off and a Board Risk Committee awareness paper
- Define the RACI: who owns identification, who contributes, who approves, who audits
- Identify BU Risk Coordinators and brief them on their role
- Agree on the annual calendar (when each phase runs, tied to the ICAAP/ILAAP/Board cycle)
Weeks 3-4 Risk Taxonomy and Regulatory Mapping
- Design the three-level taxonomy (L1 categories, L2 sub-categories, L3 granular risks) (Tab 1)
- Map the taxonomy to regulatory categories across all relevant jurisdictions (Tab 2)
- Cross-reference against industry loss databases (ORX, SAS OpRisk, internal loss data)
- Validate the taxonomy with the CRO and business unit heads (circulate for comment)
- Map COSO objectives to each L1/L2 category
Week 5 External and Internal Context
- Conduct a PESTLE analysis (Political, Economic, Social, Technological, Legal, Environmental) (Tab 3)
- Assess the internal environment across the seven COSO ERM elements (Tab 4)
- Review recent regulatory examination findings and internal audit reports
- Compile a "straw man" risk list (initial universe seeded from taxonomy + context + loss data)
Week 6 Risk Criteria and Scoring Design
- Define five-level impact scales across four categories: Financial, Regulatory, Reputational, and Customer/Operational (Tab 5)
- Define the likelihood scale (1-5), anchored to frequency or probability (Tab 5)
- Define vulnerability and speed-of-onset scales (Tab 5)
- Calibrate CET1 impact anchors (institution-specific)
- Get CRO sign-off on the scoring methodology
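The scoring design agreed this week lends itself to a small worked sketch. This is illustrative only: the weights, the aggregation rule, and the residual adjustment below are placeholder assumptions, not the approved methodology. All four dimensions are assumed to use the 1-5 scales defined above.

```python
from dataclasses import dataclass

# Placeholder weights -- the CRO-approved methodology defines the real
# aggregation rule. All dimensions use the 1-5 scales from Week 6.
WEIGHTS = {"impact": 0.4, "likelihood": 0.3, "vulnerability": 0.2, "speed_of_onset": 0.1}

@dataclass
class RiskScore:
    impact: int          # worst of Financial / Regulatory / Reputational / Customer
    likelihood: int
    vulnerability: int
    speed_of_onset: int

    def inherent(self) -> float:
        """Weighted four-dimensional inherent score (illustrative aggregation)."""
        return sum(w * getattr(self, dim) for dim, w in WEIGHTS.items())

    def residual(self, control_effectiveness: float) -> float:
        """Residual score given control effectiveness in [0, 1] (illustrative)."""
        return self.inherent() * (1 - 0.8 * control_effectiveness)

r = RiskScore(impact=4, likelihood=3, vulnerability=3, speed_of_onset=2)
print(round(r.inherent(), 2))      # 3.3
print(round(r.residual(0.5), 2))   # 1.98
```

Whichever formula is approved, encoding it once like this, rather than in per-BU spreadsheets, makes the Week 15 calibration check and later audits much easier.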
Phase 1 Deliverables
- Approved Risk Identification Policy document
- Three-level risk taxonomy with regulatory mapping (Tabs 1-2)
- PESTLE and internal environment assessments (Tabs 3-4)
- Calibrated risk criteria and scoring methodology (Tab 5)
- Straw man risk list for workshop seeding
- Annual calendar agreed with CRO
Decision Gate 1: CRO reviews foundation outputs and confirms the taxonomy, criteria, and scoring methodology before proceeding to process design. If the taxonomy doesn't fit the institution's business model, everything downstream will need rework.
Design the mechanics of how identification will actually work: the workshop format, the bottom-up template, the reconciliation process, and the assessment workflow.
Week 7 Top-Down Workshop Design
- Design the SWIFT workshop structure (opening, independent review, systematic walk-through, prioritisation, emerging risks, close) (Tab 6)
- Build the SWIFT Prompt Matrix with institution-specific guide words per L1 category (Tab 7)
- Draft the participant list (CRO, business unit heads, CFO, COO, Head of Compliance, CTO)
- Design the pre-workshop independent assessment form (Tab 9)
Week 8 Bottom-Up Template Design
- Finalise the standardised 11-field bottom-up template (Tab 8)
- Add worked examples (2-3 per template) so BU coordinators understand the expected quality
- Map the 10 specialist sub-processes (RCSA, Conduct, Cyber, AML, Third-Party, Model Risk, etc.) and identify who feeds each
- Create a submission timeline and quality checklist for BU coordinators
Week 9 Reconciliation and Assessment Design
- Design the reconciliation process: how top-down and bottom-up outputs are compared (Tab 10)
- Define gap types (blind spot, scope gap, assessment gap) and the resolution process
- Design the four-dimensional scoring worksheet with formulas (Tab 11)
- Design the risk interaction matrix structure (Tab 12)
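The gap taxonomy can be made concrete with a short sketch. One plausible mapping, under the assumption that a blind spot is a risk seen top-down but missing bottom-up, a scope gap is the reverse, and an assessment gap is a material score divergence; adjust the labels to your own definitions.

```python
def reconcile(top_down: dict, bottom_up: dict, tolerance: int = 1) -> dict:
    """Compare top-down and bottom-up risk views, keyed by taxonomy ID.

    Values are 1-5 scores. The three gap-type definitions below are an
    illustrative reading of the Week 9 design, not a prescribed standard.
    """
    gaps = {"blind_spot": [], "scope_gap": [], "assessment_gap": []}
    # Identified top-down but absent from every BU submission
    for risk_id in top_down.keys() - bottom_up.keys():
        gaps["blind_spot"].append(risk_id)
    # Raised bottom-up but absent from the senior-management view
    for risk_id in bottom_up.keys() - top_down.keys():
        gaps["scope_gap"].append(risk_id)
    # Present in both views, but the scores diverge beyond tolerance
    for risk_id in top_down.keys() & bottom_up.keys():
        if abs(top_down[risk_id] - bottom_up[risk_id]) > tolerance:
            gaps["assessment_gap"].append(risk_id)
    return gaps

td = {"CR-01": 4, "OP-03": 5, "ST-02": 3}   # hypothetical taxonomy IDs
bu = {"CR-01": 4, "OP-03": 2, "LQ-01": 3}
print(reconcile(td, bu))
```

Even a toy comparison like this forces the design question the pilot will otherwise expose: what counts as "the same risk" across the two views, and how much score divergence warrants a resolution meeting.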
Week 10 Documentation and Reporting Design
- Design the 14-field risk inventory record (Tab 15)
- Design the one-page risk profile template (Tab 16)
- Design the principal risk report format for the Board Risk Committee (Tab 18)
- Document the entire end-to-end process in a methodology document (for audit trail and training)
Phase 2 Deliverables
- Workshop agenda, SWIFT prompt matrix, participant list, and logistics
- Bottom-up template with worked examples and submission guide
- Reconciliation process design with gap taxonomy
- Scoring, interaction, and reporting templates ready for pilot
- End-to-end methodology document
Decision Gate 2: CRO and BU Risk Coordinators review the templates and process design. BU coordinators must confirm the bottom-up template is feasible given their data availability. This is where you learn if the process is too ambitious or too light.
Run the full process with one or two business units before rolling out bank-wide. The pilot will expose every assumption that doesn't survive contact with reality.
Weeks 11-12 Pilot Bottom-Up
- Select 1-2 pilot business units (one complex, one simpler)
- Brief pilot BU coordinators on the template and expectations
- Pilot BUs complete bottom-up templates (Tab 8)
- Review submissions: are risk definitions clear? Are controls specific? Is data quality assessed honestly?
- Document what worked, what confused people, and what needs to change
Week 13 Pilot Workshop
- Distribute pre-workshop independent assessments to pilot participants (7-10 days before) (Tab 9)
- Run a half-day SWIFT workshop with the pilot group (Tabs 6-7)
- Test the facilitation approach: does the SWIFT format generate genuinely new risks, or just confirm existing ones?
- Record workshop outputs and facilitator observations
Week 14 Pilot Reconciliation
- Compare pilot workshop output (top-down) against pilot BU submissions (bottom-up) (Tab 10)
- Identify gaps: risks found top-down but missing bottom-up, and vice versa
- Test the gap resolution process: does it work? How long does it take?
Week 15 Pilot Scoring and Assessment
- Score pilot risks using the four-dimensional methodology (Tab 11)
- Test the interaction matrix with pilot risks (Tab 12)
- Run a bow-tie analysis on one material pilot risk (Tab 13)
- Assess scoring calibration: are scores differentiating meaningfully, or is everything clustered at 3?
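The "clustered at 3" failure mode can be tested mechanically rather than by eyeballing the worksheet. A sketch, with illustrative thresholds (a minimum standard deviation and a maximum share at the modal score) that you would set from your own methodology:

```python
from statistics import stdev
from collections import Counter

def calibration_check(scores: list[float], min_stdev: float = 0.75) -> dict:
    """Flag clustered scoring from a pilot cycle.

    Both thresholds (min_stdev and the 60% modal-share cap) are
    illustrative assumptions, not prescribed values.
    """
    dist = Counter(round(s) for s in scores)
    modal_share = max(dist.values()) / len(scores)   # share at the most common score
    return {
        "stdev": round(stdev(scores), 2),
        "modal_share": round(modal_share, 2),
        "clustered": stdev(scores) < min_stdev or modal_share > 0.6,
    }

pilot = [3, 3, 3.2, 2.8, 3, 3, 3.1, 3, 2.9, 3]   # hypothetical pilot scores
print(calibration_check(pilot))
```

A set like the one above would fail on both tests, which is the signal to revisit the Week 6 impact anchors before full roll-out.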
Week 16 Pilot Review and Refinement
- Conduct a lessons-learned session with all pilot participants (Tab 29)
- Revise templates, scoring criteria, and process flow based on pilot findings
- Update the methodology document with the changes
- Prepare a CRO briefing on pilot results and recommended adjustments
Don't skip the pilot. Institutions that roll out risk identification without piloting almost invariably have to re-do the first cycle. Common pilot discoveries: the bottom-up template has too many fields (or too few), the SWIFT guide words don't map to the institution's business mix, the scoring criteria produce clustered scores that don't differentiate, and the reconciliation process takes three times longer than expected.
Decision Gate 3: CRO reviews pilot results. Key question: does the process produce meaningfully different output from the existing risk register? If it just confirms what everyone already knew, the methodology needs sharpening before full roll-out.
Now run the full process across all business units. This is the first complete risk identification cycle.
Week 17 Launch
- Brief all BU Risk Coordinators (workshop + written guide + worked examples)
- Distribute bottom-up templates to all business units with a deadline (Tab 8)
- Distribute pre-workshop independent assessments to senior management (Tab 9)
- Activate specialist sub-process inputs (RCSA, cyber, conduct, AML, model risk, third-party)
Weeks 18-19 Bottom-Up Collection and Quality Review
- BUs complete and submit bottom-up templates
- Risk ID team reviews each submission for quality (definitions clear? controls specific? data quality honest?)
- Run challenge sessions with BU coordinators who submit weak content (this will happen)
- Collect and integrate specialist sub-process outputs
Week 20 Top-Down SWIFT Workshop
- Distribute the data pack (PESTLE summary, straw man, pre-workshop assessment aggregation)
- Run the full SWIFT workshop (half day with senior management) (Tabs 6-7)
- Include a Delphi panel for emerging risks (or run it as a separate session) (Tab 21)
- Document all identified risks, prioritisation votes, and emerging risk assessments
Week 21 Reconciliation
- Map top-down outputs against bottom-up submissions (Tab 10)
- Identify and classify all gaps (blind spots, scope gaps, assessment gaps)
- Run gap resolution meetings with the relevant BU heads
- Build the enterprise portfolio view (aggregate exposure across all BUs)
Week 22 Assessment
- Score all identified risks: inherent (4 dimensions) and residual (with control effectiveness) (Tab 11)
- Build the risk interaction matrix (T/A/C relationships between material risks) (Tab 12)
- Run concentration analysis: single-name, sector, geographic (Tab 14)
- Complete bow-tie analysis for the 5-10 most material risks (Tab 13)
- Log all disagreements in the disagreement log (do not suppress minority views) (Tab 23)
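The interaction matrix can be held as a simple adjacency structure rather than a grid of cells. A sketch, assuming T/A/C stands for triggers / amplifies / correlates-with; substitute whatever relationship labels your methodology actually uses:

```python
from collections import defaultdict

# interactions[source][target] -> relationship code ("T", "A", or "C").
# The T/A/C reading below is an assumption, not a prescribed standard.
interactions = defaultdict(dict)

def link(source: str, target: str, kind: str) -> None:
    assert kind in {"T", "A", "C"}
    interactions[source][target] = kind

link("CR-01", "LQ-01", "T")   # e.g. credit stress triggers liquidity stress
link("CY-02", "OP-03", "A")   # a cyber event amplifies operational losses
link("MK-01", "CR-01", "C")   # market and credit cycles correlate

def downstream(risk_id: str) -> list[str]:
    """Risks this risk triggers or amplifies."""
    return [t for t, k in interactions[risk_id].items() if k in {"T", "A"}]

print(downstream("CR-01"))
```

Held this way, trigger/amplify chains fall out directly, which is useful when mapping material risks to joint stress scenarios in Week 24.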
Decision Gate 4: CRO reviews the full risk identification output. Key questions: are there any material risks missing? Does the scoring differentiate meaningfully? Were any gaps escalated that the institution has never discussed before? If the answer to the third question is "no," the reconciliation process may not be working.
Turn the identification output into the living documents that connect to capital planning, Board reporting, and regulatory submissions.
Week 23 Risk Inventory and Profiles
- Populate the 14-field risk inventory for every identified risk (Tab 15)
- Create one-page risk profiles for all material risks (typically 15-25 risks) (Tab 16)
- Assign risk owners and confirm they accept ownership
Week 24 KRIs and Monitoring
- Define 2-3 KRIs per material risk with green/amber/red thresholds (Tab 17)
- Map material risks to stress scenarios (ICAAP, ILAAP, CCAR, recovery plan) (Tab 19)
- Document assumptions and register them for challenge (Tab 22)
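Green/amber/red evaluation is worth automating from day one so dashboard status never depends on manual colour-coding. A minimal sketch; the threshold semantics (inclusive bounds, direction of "worse") are assumptions to replace with your approved KRI definitions:

```python
def kri_status(value: float, amber: float, red: float,
               higher_is_worse: bool = True) -> str:
    """Map a KRI reading onto green/amber/red given two thresholds.

    Assumes thresholds are inclusive and that breaching red dominates.
    For KRIs where lower values are worse (e.g. a coverage ratio),
    set higher_is_worse=False and pass the thresholds on the same scale.
    """
    if not higher_is_worse:
        value, amber, red = -value, -amber, -red
    if value >= red:
        return "red"
    if value >= amber:
        return "amber"
    return "green"

# Hypothetical KRI: staff attrition rate for a key-person risk
print(kri_status(0.07, amber=0.10, red=0.15))  # green
print(kri_status(0.12, amber=0.10, red=0.15))  # amber
print(kri_status(0.18, amber=0.10, red=0.15))  # red
```

Encoding the direction per KRI matters: a rising attrition rate and a falling liquidity coverage ratio are both deteriorations, and a dashboard that hard-codes "higher is worse" will silently mis-colour the latter.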
Week 25 Board Reporting
- Produce the first Principal Risk Report for the Board Risk Committee (Tab 18)
- Include: material risks, scores, trends, appetite status, KRI alerts, key changes
- CRO presents to the Board Risk Committee
Week 26 Regulatory Integration
- Complete the regulatory vs economic risk gap analysis (Tab 20)
- Feed the material risk list into ICAAP/ILAAP scenario design
- Confirm regulatory traceability (every methodology phase maps to applicable requirements) (Tab 28)
Phase 5 Deliverables
- Complete risk inventory with all identified risks (Tab 15)
- Risk profiles for all material risks (Tab 16)
- KRI dashboard with live thresholds (Tab 17)
- First Board Risk Committee report (Tab 18)
- Stress scenario mapping for ICAAP/ILAAP (Tab 19)
- Regulatory vs economic gap analysis (Tab 20)
The process exists. Now make it survive the next reorganisation, budget cut, and CRO departure. This phase turns a project into a permanent capability.
Week 27 Training Programme
- Design the five-level training programme (Board, CRO function, BU leads, front-line, new joiners) (Tab 26)
- Deliver initial training to BU Risk Coordinators
- Create training materials that survive staff turnover (methodology guide, template instructions, worked examples)
Week 28 Audit and Assurance
- Design the seven-area internal audit programme (Tab 27)
- Brief Internal Audit on the methodology (they need to understand it to audit it)
- Schedule the first independent audit of the risk identification process
Week 29 Ongoing Cycle Design
- Define the event-driven trigger framework (six categories of events that force re-identification) (Tab 24)
- Set baseline process KPIs (completion rates, timeliness, scoring consistency) (Tab 25)
- Design the quarterly re-identification cycle (see Plan 3 below)
- Schedule the next annual full cycle in the corporate calendar
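The first two baseline KPIs can be computed directly from submission records. A sketch, assuming a minimal hypothetical record shape; real records would carry more fields (reviewer, quality rating, resubmission count):

```python
from datetime import date

def cycle_kpis(submissions: list[dict], deadline: date) -> dict:
    """Completion rate and timeliness from BU submission records.

    Each record is assumed (illustratively) to look like:
      {"bu": "Retail", "submitted": date(...) or None}
    Assumes at least one BU is in scope.
    """
    received = [s for s in submissions if s["submitted"] is not None]
    on_time = [s for s in received if s["submitted"] <= deadline]
    return {
        "completion_rate": len(received) / len(submissions),
        "timeliness": len(on_time) / max(len(received), 1),
    }

subs = [
    {"bu": "Retail", "submitted": date(2025, 5, 9)},
    {"bu": "Markets", "submitted": date(2025, 5, 14)},
    {"bu": "Wealth", "submitted": None},
]
print(cycle_kpis(subs, deadline=date(2025, 5, 12)))
```

Scoring consistency, the third KPI, is the same dispersion test used in Week 15's calibration check, run per cycle and trended.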
Week 30 Lessons Learned and Close
- Conduct a full lessons-learned review across all phases (Tab 29)
- Publish the "Year 1" report to the CRO summarising the process, findings, and recommendations
- Transition from "project" to "business as usual" operating model
Decision Gate 6: CRO and Board Risk Committee confirm the risk identification process is established and meets regulatory expectations. The process transitions from project status to BAU. The Risk Identification Lead's role becomes permanent (if it wasn't already).