How to Conduct Training Needs Analysis With Free Templates

When audits loom, incidents spike or regulations shift across IATA, IMDG and ADR, the reflex is often “book a course.” The harder question is what to train, who to train, and why. Without a clear line from risk and performance to learning, budgets get burned on generic content, certificates lapse, and compliance — not to mention confidence on the job — is left to chance.

A rigorous training needs analysis (TNA) changes that. It links business and compliance goals to roles and capabilities, gathers evidence from data and people, separates true skills gaps from process or tooling issues, and prioritises what matters. The result: targeted, measurable outcomes and the right blend of delivery — on‑the‑job aids, e‑learning, virtual classroom or in‑house sessions — so learning transfers to the job.

This practical guide shows you exactly how to conduct a TNA, step by step, with free templates and examples you can use straight away. You’ll clarify goals, choose the level of analysis (organisation, team/role, individual), engage stakeholders, define required standards, collect and analyse evidence, prioritise needs, select solutions, build a plan and budget, implement for transfer, and measure impact. We’ll also walk through a lithium batteries shipping example and cover record‑keeping and recertification. Here’s how to run a TNA that stands up to auditors and improves real‑world performance.

Step 1. Clarify the goal and confirm TNA is the right solution

Before you gather any data, anchor your training needs analysis to a clear business or compliance outcome. CIPD emphasises starting from organisational performance and statutory requirements, not “a course by default”. AIHR also notes that training isn’t always the fix; gaps may stem from culture, process, tooling or incentives. When learning how to conduct training needs analysis, begin by defining the outcome and testing whether a capability gap is truly the cause.

Decision checklist: is TNA the right response?

  • Outcome defined: What performance or compliance result must change (e.g., IATA/IMDG/ADR compliance, incident rate, turnaround time, audit finding)?
  • Measure agreed: How will success be measured and by when (baseline, target, timeframe)?
  • Root cause fits training: Evidence suggests a skills/knowledge/behaviour gap, not primarily process, tech, staffing or policy.
  • Compliance drivers: Statutory/recertification obligations require role‑specific competence.
  • Stakeholder alignment: Sponsors, managers and SMEs agree on the problem and constraints.

Craft a concise goal statement: “Reduce lithium battery shipment rejections by 30% by Q3 by closing pack/mark/declare capability gaps in export ops, aligned to current IATA PI 965/966 and ADR 2025 updates.” This sets scope, standards and metrics, and keeps the TNA focused on what must improve, not just what to teach.

Step 2. Choose the level(s) of analysis: organisation, team/role, individual

Choosing the right level for your training needs analysis prevents scatter‑gun data collection and anchors findings to decisions. Both CIPD and AIHR highlight three legitimate levels — organisation, group/job role and individual — which can be used singly or in combination depending on the goal you set in Step 1.

  • Organisation level: Use when goals are enterprise‑wide (for example, aligning all DG handling to IATA/IMDG/ADR updates). Typical inputs: audit findings, incident trends, regulator feedback, customer complaints, workforce and capability reports.
  • Team/role level: Use when a department or function shows performance gaps (for example, export ops misdeclaring lithium batteries). Inputs: process KPIs, quality escapes, LMS completions by role, observations, manager focus groups.
  • Individual level: Use for specific people or critical posts (for example, a new DGSA candidate or a shipper repeatedly failing checks). Inputs: performance reviews, assessments, observed behaviour, certification status.

When to combine levels

If an audit flags systemic non‑compliance, start at organisation level to set standards, then drill into affected roles. If only one site or route underperforms, begin at team/role and validate individual capability. As you learn how to conduct training needs analysis, plan to cascade findings down — and roll them up — so evidence stays aligned from enterprise priorities to the person doing the job.

Step 3. Assemble your TNA team and engage stakeholders

A solid training needs analysis stands or falls on who’s in the room. CIPD advises involving subject matter experts, operational managers and impacted employees; AIHR stresses stakeholder management to align expectations. Bring the right voices together early, set clear roles, and protect confidentiality because you’ll handle sensitive performance and compliance data. This is a key discipline when learning how to conduct training needs analysis that leads to action.

Build the core TNA team

Bring a small, empowered group who can access data, interpret standards and authorise change.

  • Executive sponsor: Sets the goal, unblocks access, approves scope and trade‑offs.
  • TNA lead (L&D/DGSA): Orchestrates the analysis, methods and reporting; ensures compliance lens.
  • Compliance/Quality: Interprets IATA/IMDG/ADR and local policy; validates competence standards.
  • Operations manager(s): Owns process performance; provides KPIs and context.
  • SMEs by mode/role: Translate standards into real tasks; spot practical gaps and risks.
  • Data/People analytics: Mines LMS, incident and performance data; ensures data integrity.
  • HR/LMS admin: Confirms certifications, expiries and learning records; manages learner cohorts.
  • Front‑line representatives: Surface realities, workarounds and usability issues.
  • IT/Security (as needed): Enables safe data access and tool support.
  • Finance/PMO (as needed): Frames budget constraints and timelines.

Engage stakeholders early and often

Plan structured touchpoints so insights are co‑owned, not imposed.

  • Kick‑off: Align on outcomes, scope, timelines, confidentiality and decision rights.
  • Discovery: Stakeholder interviews, focus groups and observations scheduled by role/site.
  • Data access: Formal requests for KPIs, audits, LMS and incident logs with privacy controls.
  • Interim playback: Share emerging themes and root‑cause hypotheses for challenge and buy‑in.
  • Solution shaping: Co‑design success measures and feasibility constraints before prioritising.
  • Sign‑off: Agree the final gap analysis and next steps to move into solution design.

Set a simple RACI for key deliverables, publish a one‑page brief (“why, what, when, who”), and keep minutes of decisions. This keeps everyone aligned and accelerates consent when you move from findings to action.

Step 4. Define required capabilities, standards and compliance requirements

This is where you nail “what good looks like.” CIPD advises starting with capability analysis and statutory requirements, then using competency frameworks; AIHR adds translating outcomes into behaviours, skills and knowledge. If you’re learning how to conduct training needs analysis that survives audits, anchor every capability to a recognised standard (IATA DGR, IMDG Code, ADR/RID) and your SOPs.

Translate standards into role‑based capabilities

Begin with the regulations that apply to your modes and sites, then map them to real tasks for each role.

  • List applicable rules: IATA DGR, IMDG Code, ADR/RID, plus licences, customer mandates and internal policies.
  • Map tasks to standards: Classify, pack, mark/label, document, segregate, load/unload, report — cite the clause or chapter that governs each.
  • Define proficiency levels: Awareness, Application, Mastery for each task to set depth of competence.
  • Specify evidence: Certificates required, refresher cadence, on‑the‑job sign‑offs and acceptable records.

Write measurable competence standards and outcomes

Turn each task into a standard people can be assessed against; a structured sketch follows the example below.

  • Use a clear pattern: Action + Object + Conditions + Standard + Evidence.
  • Example: “Prepare a compliant Shipper’s Declaration for dangerous goods under the current IATA DGR, using the company template, with zero errors on first check, evidenced by dual sign‑off.”
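
To keep these standards consistent and machine‑checkable, you can capture each one as a structured record. Here is a minimal sketch in Python, assuming a simple in‑house register; the class and field names are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class CompetenceStandard:
    """One assessable standard: Action + Object + Conditions + Standard + Evidence."""
    action: str         # e.g. "Prepare"
    obj: str            # what is acted on, e.g. "Shipper's Declaration"
    conditions: str     # governing edition, tools, templates
    standard: str       # pass criterion, e.g. "zero errors on first check"
    evidence: str       # acceptable proof, e.g. "dual sign-off"
    source_clause: str  # the regulation/SOP reference that keeps it auditable

# The worked example above, expressed as data (clause reference is a placeholder).
lithium_declaration = CompetenceStandard(
    action="Prepare",
    obj="compliant Shipper's Declaration for dangerous goods",
    conditions="current IATA DGR, using the company template",
    standard="zero errors on first check",
    evidence="dual sign-off",
    source_clause="IATA DGR (current edition) + SOP ID",
)
print(lithium_declaration)
```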

Apply compliance and risk filters

Not all capabilities are equal; prioritise those with legal and safety impact.

  • Legal must‑haves first: Anything explicitly mandated by IATA/IMDG/ADR or certifications.
  • High‑risk tasks next: Steps where errors lead to incidents, rejections or audit findings.
  • Operational essentials: Capabilities that materially affect KPIs (throughput, right‑first‑time).

Document the source standard for every requirement and agree it with Compliance/Quality. This keeps your training needs analysis defensible, focused and ready for evaluation in later steps.

Step 5. Inventory existing training, certifications and performance data

Before collecting fresh evidence, mine what you already have. CIPD recommends “organisational data and intelligence” as a starting point; AIHR points to HRIS/LMS and skills inventories. When you’re learning how to conduct training needs analysis that avoids duplication and targets the real gaps, build a consolidated picture of current capability, compliance and performance.

Build a single view of current capability

Create one working register that links roles to standards, training completed and outcomes. Prioritise accuracy and identify the source of truth for each field.

  • Training catalogue mapped to standards: Courses, objectives and the IATA/IMDG/ADR clauses they cover.
  • LMS records: Enrolments, completions, assessment scores, pass/fail, delivery mode.
  • Certificates and licences: Issue/expiry dates, refresher cadence, modality-specific approvals.
  • Competence matrices/OJT sign‑offs: Task-level evidence, observer, date and validity.
  • Audits and inspections: Internal/external findings, severity, corrective actions, repeat rates.
  • Incidents and quality escapes: Near misses, shipment rejections, root causes, costs.
  • Process KPIs: Right‑first‑time, error types, cycle time, rework and customer complaints.
  • Coaching/observation notes: Behavioural evidence, barriers, usability issues.
  • Training effort and spend: Hours, backfill, travel, vendor costs.

Data hygiene and quick metrics

De‑duplicate records, close obvious data gaps, and calculate baseline indicators to guide focus; a worked sketch follows the list.

  • Compliance coverage: trained_and_in_date / required_in_scope.
  • Expiry risk (90 days): certificates_expiring_≤90d / required_in_scope.
  • Right‑first‑time uplift: RFT_trained − RFT_untrained.
  • Repeat finding rate: repeat_audit_findings / total_findings.
  • Duplication index: overlapping_courses_covering_same_clause.
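
A minimal sketch of these baseline calculations in Python, assuming a simple per‑person register exported from your LMS/HRIS (field names and figures are illustrative):

```python
from datetime import date, timedelta

# Illustrative register: one record per person in scope for a given standard.
records = [
    {"trained": True,  "cert_expiry": date(2025, 8, 1), "rft": 0.97},
    {"trained": True,  "cert_expiry": date(2026, 2, 1), "rft": 0.99},
    {"trained": False, "cert_expiry": None,             "rft": 0.88},
]

today = date(2025, 6, 1)
in_scope = len(records)

# Compliance coverage: trained_and_in_date / required_in_scope
trained_and_in_date = sum(
    1 for r in records
    if r["trained"] and r["cert_expiry"] and r["cert_expiry"] > today
)
coverage = trained_and_in_date / in_scope

# Expiry risk (90 days): certificates expiring within 90d / required_in_scope
expiring = sum(
    1 for r in records
    if r["cert_expiry"] and today < r["cert_expiry"] <= today + timedelta(days=90)
)
expiry_risk_90d = expiring / in_scope

# Right-first-time uplift: RFT_trained - RFT_untrained
trained = [r["rft"] for r in records if r["trained"]]
untrained = [r["rft"] for r in records if not r["trained"]]
rft_uplift = sum(trained) / len(trained) - sum(untrained) / len(untrained)

print(f"coverage={coverage:.0%} expiry_risk_90d={expiry_risk_90d:.0%} rft_uplift={rft_uplift:+.1%}")
```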

This inventory becomes your evidence base. With it in hand, you can now plan purposeful data collection to fill the gaps and validate what the numbers suggest.

Step 6. Plan your data collection and select methods

With your capability map and current-state inventory in hand, design how you will gather fresh, credible evidence. Convert hypotheses into answerable questions, decide which methods will surface the best signal, define sampling, cadence and logistics, and agree how you’ll triangulate results. CIPD advises a mixed‑methods approach and clear plans for frequency and extent; AIHR lists observations, questionnaires, interviews, assessments, skills audits, HRIS and text mining. When learning how to conduct training needs analysis, plan for confidentiality from the outset.

Define the questions and map the best method

Frame 1–2 questions per capability and choose the lightest method that can reliably answer them. Blend sources to validate findings.

  • Mine existing data: LMS, incidents, audits, KPIs for trends and hotspots; use cautious text/HRIS mining to spot patterns.
  • Targeted surveys: Perception and knowledge at scale; keep short, role‑specific, and tied to standards.
  • Interviews/focus groups: Depth and context on barriers, workarounds and feasibility.
  • Observations/work samples: Real behaviours against a checklist aligned to IATA/IMDG/ADR and SOPs.
  • Assessments: Knowledge or practical tests where certification requires evidence.

Sampling, scheduling, ethics and quality

Plan for coverage across roles, sites and shifts without disrupting operations. Protect people and data, and bake in quality controls.

  • Sampling plan: Define who, where and how many; include high‑risk tasks and low‑volume edge cases.
  • Scheduling: Fit around peaks; gain manager buy‑in; provide alternatives for remote/shift workers.
  • Pilot instruments: Test surveys/checklists; fix ambiguous or leading items.
  • Calibration: Use standard rubrics; brief assessors to improve inter‑rater reliability.
  • Privacy & security: Anonymise where possible, obtain informed consent, secure storage, and restrict access to sensitive performance data.

Decide upfront how you’ll integrate outputs into a single gap matrix so Step 7 runs fast and clean.

Step 7. Collect evidence: surveys, interviews, observations and assessments

This is where your plan meets the floor. Execute quickly, respectfully and consistently: use standard templates, brief participants, protect confidentiality, and focus on high‑signal moments linked to IATA/IMDG/ADR tasks. If you’re showing teams how to conduct training needs analysis that leads to change, keep collection light on disruption and heavy on usable evidence.

Surveys (breadth at speed)

  • Keep them short and role‑specific: 8–12 items tied to the capabilities you defined.
  • Ask three things: knowledge (objective items), confidence (Likert), and barriers (free text).
  • Anchor to standards: Reference the clause/task in each item to aid analysis.
  • Protect anonymity: Aggregate reporting; flag ethics and data handling upfront.

Interviews and focus groups (depth and context)

  • Use semi‑structured guides: “What makes [task] hard on a busy shift?” “Where do errors creep in?”
  • Record with consent: Capture quotes and examples; note environmental or process constraints.
  • Triangulate: Include managers, SMEs and front‑line staff to balance perspectives.

Observations and work samples (see the real job)

  • Checklist aligned to SOPs and codes: Behaviourally anchored, pass/fail plus comments.
  • Calibrate observers: Shadow in pairs initially to improve reliability.
  • Note conditions: Tools, time pressure, handovers; apply a stop rule if unsafe.

Knowledge tests and practical assessments (evidence of competence)

  • Validate items with SMEs: Map each to a capability and source clause.
  • Set clear thresholds: Agree pass criteria and retake rules with Compliance/Quality.
  • Secure integrity: Version control, invigilation as needed, and immediate feedback where appropriate.

Close the loop daily: log completions, tag evidence to roles/tasks, and capture emergent risks. You’re now ready to consolidate, compare against standards, and move into gap and root‑cause analysis.

Step 8. Analyse skills gaps and diagnose root causes

Now convert evidence into decisions. Compare each role’s observed capability against the standards you defined, then quantify the gap and explain why it exists. CIPD’s focus on capability analysis and AIHR’s warning that not every problem is a training problem both apply here: distinguish skill/knowledge gaps from process, tooling or culture issues. When you learn how to conduct training needs analysis well, this is the moment you protect budget and target risk.

Build the gap matrix

Create a single view per capability that shows requirement, current evidence, the size of the gap and its risk. Keep it auditable by citing the source (IATA/IMDG/ADR clause, SOP, assessment ID); the sketch after the list shows the scoring arithmetic.

  • Structure: Capability | Required level | Current level | Evidence | Risk (legal/safety/operational) | Gap score | Notes.
  • Level scale: Awareness = 1, Application = 2, Mastery = 3.
  • Compute a gap score: gap_score = max(0, required - current) * risk_weight * frequency_weight.
  • Use hard data first: RFT by task, incident/rejection counts, audit repeats; then triangulate with surveys/interviews.
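
The gap score in the bullets above can be computed directly. A sketch, with illustrative weights (agree your own weighting scale with Compliance/Quality):

```python
def gap_score(required: int, current: int,
              risk_weight: float, frequency_weight: float) -> float:
    """gap_score = max(0, required - current) * risk_weight * frequency_weight.

    Levels use the scale above: Awareness=1, Application=2, Mastery=3.
    """
    return max(0, required - current) * risk_weight * frequency_weight

# Example row: declaration task requires Mastery (3), observed at Application (2);
# legal/safety exposure weighted 3.0, performed daily so frequency weighted 2.0.
print(gap_score(required=3, current=2, risk_weight=3.0, frequency_weight=2.0))  # 6.0
```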

Diagnose root causes

For each material gap, identify the dominant cause and the right lever. Document the reasoning; you’ll use it to justify solutions and measures later.

  • 5 Whys / fishbone: Surface people, process, tools, environment and policy factors behind the error.
  • Skill vs will vs way: Is it knowledge/skill, motivation/feedback, or workflow/tools/SOP clarity?
  • Compliance triggers: If the standard is misunderstood or changed recently, favour targeted refresher and job aids; if SOPs conflict with the code, fix process first.
  • Feasibility check: Validate with SMEs and managers that proposed fixes are practical in real operating conditions.

Only tag items as “training needs” when the root cause is competence. Everything else becomes a non‑training action (SOP tweak, checklist, system prompt, staffing). This keeps your training needs analysis tight and prepares you to prioritise and set outcomes next.

Step 9. Prioritise needs and set measurable learning outcomes

With root causes clear, convert your gap matrix into a prioritised, auditable backlog. CIPD advises focusing on the most critical performance gaps and defining how impact will be measured; AIHR stresses aligning outcomes to behaviours, not just courses. In practice, only include items where competence is the lever; park process/tooling fixes in a separate action log. This is a pivotal moment in how to conduct training needs analysis that protects time and budget.

Use a simple scoring model to order the work:
priority_score = gap_score × business_impact × compliance_weight ÷ effort
where business_impact reflects KPI movement (RFT, rejections, incidents), compliance_weight emphasises legal/safety exposure, and effort estimates time/cost to close.
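
As a runnable sketch (the weights and scales are whatever you calibrate with your sponsors, not fixed values):

```python
def priority_score(gap_score: float, business_impact: float,
                   compliance_weight: float, effort: float) -> float:
    """priority_score = gap_score * business_impact * compliance_weight / effort."""
    if effort <= 0:
        raise ValueError("effort must be positive")
    return gap_score * business_impact * compliance_weight / effort

# Example: gap score 6.0, strong KPI impact (3), legal exposure (3), moderate effort (2).
print(priority_score(6.0, business_impact=3, compliance_weight=3, effort=2))  # 27.0
```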

Apply clear rules to sequence “must do” items:

  • Legal or safety critical: Mandated by IATA/IMDG/ADR or linked to incidents.
  • High-volume/high-risk tasks: Frequent steps with costly errors.
  • Time-sensitive expiries: Certificates due within 90 days.
  • Feasible quick wins: Low effort with visible impact.
  • Strategic alignment: Direct line to stated business goals.

Now write measurable learning outcomes that mirror the capability standard and evidence. Use: Audience + Behaviour + Conditions + Standard + Measure + When.

  • Export operators: “By 30 June, prepare a compliant IATA DG Shipper’s Declaration with zero critical errors on first check across three consecutive live jobs, evidenced by dual sign‑off and RFT ≥ 98%.”
  • Warehouse team (IMDG): “By Q3, apply segregation rules during stuffing for classes 3/8 with 100% adherence on sampled loads, verified via observation checklist.”

You’re ready to choose solutions and delivery modes that can achieve these outcomes at least cost and disruption.

Step 10. Select learning solutions and delivery modes

With priorities and outcomes set, pick the lightest mix of solutions that reliably change behaviour on the job. CIPD cautions against defaulting to “a course”; instead build an integrated, blended approach and plan how impact will be measured and transferred. AIHR likewise recommends considering non‑training fixes and using ADDIE to align design with outcomes. Choosing well here keeps your training needs analysis from wasting budget.

Match solutions to the gap

  • Knowledge refresh (codes/clauses): Short e‑learning/microlearning with checks; quick‑reference job aids mapped to IATA/IMDG/ADR clauses.
  • Hands‑on skill/behaviour: Facilitated workshops, scenario‑based practice, role plays, coached on‑the‑job (OJT) with observed sign‑offs.
  • High‑risk/low‑frequency tasks: Simulations, drills and supervised practice against an observation checklist; periodic revalidation.
  • Process/system clarity: Update SOPs, add workflow prompts or checklists; coaching for adoption (non‑training first).
  • Certification/recertification: Formal courses meeting regulatory standards (e.g., IATA/IMDG/ADR modules) and DGSA preparation where applicable.

Choose delivery modes

  • E‑learning/microlearning: Scale and speed for knowledge; spaced boosters to reduce decay.
  • Virtual classroom (regulator‑approved where required): Interactive practice with minimal travel.
  • Classroom/public: Deep practice and assessment with peers when equipment or invigilation is needed.
  • In‑house/on‑site: Contextualised scenarios, SOP alignment and minimal disruption.
  • OJT/coaching/mentoring: Real‑work application and transfer, evidenced by sign‑off.

Design for transfer and measurement

  • Blend and space: Pre‑work for baseline, live practice for skill, post‑work boosters for retention.
  • Performance support: Checklists, templates and job aids at the point of need.
  • Evaluation: Track engagement, assessment, on‑the‑job RFT/incident movement and compliance coverage against the outcomes you set in Step 9.

Step 11. Build the training plan, schedule and budget

You’ve chosen solutions; now turn them into a delivery plan that tells everyone who will learn what, when, how, and at what cost. Treat this like a mini‑project: phase for risk, respect operational peaks, and link every activity back to the measurable outcomes you set in Step 9. This is where learning how to conduct training needs analysis becomes a calendar, a cost line and a commitment.

Convert priorities into a workable plan

Sequence the work so compliance and performance move first, while dependencies and capacity are protected.

  • Chunk into work packages: One capability = one package with outcome, audience, delivery mode and evidence.
  • Phase by risk and expiry: Tackle legal/safety critical items and certificates due ≤90 days first.
  • Map cohorts and capacity: Who needs it, how many seats per session, SME/trainer availability, assessor coverage.
  • Honour dependencies: Update SOPs/job aids before practice; align system prompts and checklists.
  • Schedule intelligently: Avoid peak ops, mix formats (microlearning pre‑work, live practice, OJT sign‑off), and space learning.
  • Governance and controls: RACI, sign‑off points, change control for scope/schedule/cost, and a simple risk log.

Build a transparent budget (and sanity‑check ROI)

Capture all costs and show value in operational terms; a runnable sketch of the quick maths follows the list.

  • Direct costs: Vendor fees, materials, invigilation, venue/tech.
  • Indirect costs: Learner time, backfill/overtime, SME/trainer prep, evaluation effort.
  • Implementation aids: SOP updates, job aids, LMS configuration.
  • Contingency: 10–15% for schedule slippage or extra cohorts.
  • Quick maths: cost_per_learner = (direct_costs + backfill_costs + travel) / learners
    ROI = (benefit - cost) / cost, where benefit = avoided rejections/fines + hours saved.
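
The quick maths above, as a runnable sketch (all figures are illustrative assumptions, not benchmarks):

```python
def cost_per_learner(direct_costs: float, backfill_costs: float,
                     travel: float, learners: int) -> float:
    return (direct_costs + backfill_costs + travel) / learners

def roi(benefit: float, cost: float) -> float:
    """ROI = (benefit - cost) / cost."""
    return (benefit - cost) / cost

cost = 12_500.0                      # total package cost
benefit = 30 * 800.0 + 120 * 35.0    # e.g. avoided rejections + hours saved, at assumed rates
print(f"cost per learner: £{cost_per_learner(9_000, 2_500, 1_000, 50):,.0f}")  # £250
print(f"ROI: {roi(benefit, cost):.0%}")  # 126%
```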

Use a one‑page plan per package (template)

Keep it scannable so sponsors and managers can approve quickly.

Work package | Audience | Owner | Start | End | Dependencies | Delivery | Cost | Success KPI
Lithium battery declaration refresher | Export ops (Site A/B) | L&D + Compliance | 03 Jun | 28 Jun | SOP v7, job aid v2 | Micro + VC + OJT | £12.5k | RFT ≥ 98%, rejections −30%

Once the plan and budget are signed off, brief managers, schedule cohorts, publish the training calendar, and ready your comms and performance support for a clean launch into Step 12.

Step 12. Implement, communicate and enable on-the-job transfer

This is the moment the plan becomes performance. Implementation should be quiet on disruption and loud on clarity: who learns what, when, why, and how success will be evidenced on the job. CIPD stresses transfer and measurement; AIHR’s ADDIE model places you squarely in Implement. If you’re learning how to conduct training needs analysis that actually changes outcomes, obsess over manager engagement, performance support and real‑work practice.

Launch clear communications

  • State the why and the win: Link cohorts to the goal, KPIs and compliance standards (IATA/IMDG/ADR) in plain language.
  • Be explicit on commitment: Dates, duration, delivery mode, pre‑work and OJT sign‑offs.
  • Set expectations for managers: Release time, coaching duties, evidence required, and how progress will be tracked.
  • Tell learners what support exists: Job aids, SMEs, escalation routes; reiterate privacy and data handling.

Enable transfer on the job

  • Publish updated SOPs and job aids first: Make them easy to find at point of work and map to clauses.
  • Schedule immediate application: Real tasks within 24–72 hours of training, with observed checklists.
  • Use OJT sign‑offs: Behaviourally anchored evidence captured in the LMS/HRIS.
  • Reinforce with spacing and nudges: Micro refreshers, quick quizzes and manager huddles.
  • Buddy/coach cover: SMEs available during peak risk tasks; calibrate assessors for consistency.

Run operational controls

  • Capture evidence in real time: Attendance, scores, observations, sign‑offs and exceptions.
  • Hold weekly stand‑ups: Review adoption, remove blockers, adjust schedule if operations shift.
  • Apply change control: If a code/SOP changes mid‑rollout, pause, update materials, then resume.
  • Close the loop with stakeholders: Share early signals against outcomes to sustain buy‑in.

Strong implementation and transfer make Step 13 straightforward: you’ll have clean evidence to measure impact, report and iterate with confidence.

Step 13. Measure impact, report and iterate continuously

Evaluation is where your training needs analysis proves its worth. CIPD stresses measuring impact, engagement and transfer; AIHR’s ADDIE model closes with Evaluate. Track what you promised in Step 9, using mixed evidence and comparing against the baselines you set. Keep results auditable by tying every metric to a capability, role and source (IATA/IMDG/ADR clause, SOP, assessment ID).

What to measure (and how)

  • Engagement: Enrolment, attendance, completion, time‑to‑complete; flag no‑shows and drop‑offs.
  • Learning: Assessment scores, re‑test uplift, error types by clause; knowledge_gain = post − pre (sketched after this list).
  • Behaviour/transfer: OJT sign‑offs, observed adherence to SOPs, right‑first‑time by task.
  • Results: Shipment rejections, incident/near‑miss counts, repeat audit findings, cycle time and rework.
  • Compliance coverage: In‑date certificates vs required; expiry_risk_90d = expiring_≤90d / in_scope.
  • Stakeholder signal: Manager and learner feedback on usability of job aids and feasibility on shift.
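
A sketch of the learning and status calculations, assuming per‑learner pre/post scores and the baseline → target → actual framing used below (all numbers illustrative):

```python
pre  = {"A101": 62, "A102": 70, "A103": 55}
post = {"A101": 88, "A102": 90, "A103": 81}

# knowledge_gain = post - pre, per learner and on average
gains = {learner: post[learner] - pre[learner] for learner in pre}
avg_gain = sum(gains.values()) / len(gains)

def rag_status(actual: float, target: float, baseline: float) -> str:
    """Traffic-light a KPI: green at/above target, amber if improved on baseline, else red."""
    if actual >= target:
        return "green"
    return "amber" if actual > baseline else "red"

print(f"average knowledge gain: {avg_gain:.1f} points")                     # 24.0 points
print("RFT status:", rag_status(actual=0.96, target=0.98, baseline=0.91))   # amber
```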

Reporting cadence and decisions

  • Weekly (during rollout): Engagement, learning, early transfer signals; remove blockers fast.
  • Monthly (stabilised): Results and compliance trends; show baseline → target → actual.
  • Per cohort/site: Simple dashboards with traffic‑light status and narrative on risks and fixes.

Iterate deliberately

  • If behaviour didn’t shift: Revisit root causes; strengthen OJT, job aids, manager coaching.
  • If process/tooling is the brake: Log as a non‑training action; fix SOPs/systems first.
  • If outcomes are met: Embed in SOPs, schedule refreshers, and retire duplicate courses.

Close the loop by feeding insights back into the next cycle. That’s how to conduct training needs analysis as an ongoing LNA, not a one‑off event.

Step 14. Download your free TNA templates and examples

Speed up delivery and keep your evidence audit‑ready with downloadable templates aligned to Steps 1–13. They’re built for dangerous goods contexts, so you can map capabilities directly to IATA DGR, IMDG Code and ADR clauses and your SOP IDs. If you’re learning how to conduct training needs analysis efficiently, these will help you move from discussion to decisions faster.

  • TNA brief & RACI: One‑page scope, roles and governance.
  • Capability & standards map: Role tasks with clause references and proficiency levels.
  • Data inventory workbook: Training, certificates, KPIs, audits and incidents.
  • Data collection kit: Survey, interview and observation templates plus test blueprint.
  • Gap matrix & root cause: Scoring model and notes.
  • Prioritisation model: priority_score = gap × impact × compliance ÷ effort.
  • Learning outcomes builder: Audience–Behaviour–Conditions–Standard–Measure.
  • Solution selector & blend planner: Map gaps to delivery modes.
  • Plan, schedule & budget: Calendar, cohorting and cost calculator.
  • OJT sign‑off & competence matrix: Evidence of transfer.
  • Evaluation dashboard: Baselines, targets and actuals.

Formats: Excel/Google Sheets, Word/Docs and PowerPoint/Slides. Copy, customise and version‑control for your operation.

Step 15. Example walkthrough: lithium batteries shipping TNA (IATA/ADR/IMDG)

Here’s a compact, real‑world walkthrough to show how to conduct training needs analysis for lithium batteries shipped by air, road and sea. The scenario: a spike in shipment rejections and audit findings on lithium consignments. The objective is to cut rejections, lift right‑first‑time (RFT) documentation and packing, and restore full compliance against the IATA Dangerous Goods Regulations (DGR), the IMDG Code and ADR.

  • Goal and scope: Reduce lithium battery rejections within the next quarter; restore 100% in‑date certifications for affected roles; align SOPs to current IATA/IMDG/ADR requirements.
  • Levels of analysis: Organisation (set standards), team/role (export operations, warehouse, driver/loader, customer‑facing shippers), and targeted individual follow‑ups.
  • Capabilities defined: Classify, pack, mark/label, segregate and document lithium shipments; proficiency levels set (Awareness/Application/Mastery) with evidence expectations per role.
  • Inventory scan: Mapped courses to standards; spotted certificate expiries and overlapping modules; RFT dips clustered around lithium tasks; recurring audit themes.
  • Data collection: Short role‑specific knowledge checks; floor observations with a clause‑mapped checklist; SME/manager interviews to surface process constraints.
  • Gap and root‑cause analysis: Knowledge gaps on special provisions; label placement and document accuracy issues; SOPs lagging the latest code; some workflow/tooling barriers.
  • Prioritisation and outcomes: Legal/safety‑critical first; written outcomes per role (for example, “prepare lithium DG documentation error‑free on first check across consecutive live jobs”).
  • Solutions and delivery: Micro refreshers and quick‑reference job aids mapped to the codes; virtual clinics for tricky scenarios; in‑house practicals with observed OJT sign‑offs; SOP updates before rollout.
  • Plan, budget and measures: Phased four‑week sprint around operational peaks; tracked engagement, knowledge gain, OJT evidence, RFT and rejection trend; compliance coverage monitored and expiries cleared.

This end‑to‑end slice shows the disciplines from Steps 1–14 in action: standards‑anchored capabilities, mixed evidence, clear root causes, targeted solutions and measurable outcomes across IATA, IMDG and ADR contexts.

Step 16. Ensure compliance, record-keeping and recertification

Compliance isn’t a one‑off event. To make your training needs analysis stick, you need airtight records and predictable recertification that track back to IATA DGR, IMDG Code and ADR/RID, as well as your SOPs. Auditors expect traceability from standard to learner, evidence of competence, and proof your records are current.

Build an audit‑ready training record

  • Single source of truth: Central LMS/HRIS register for all DG roles and sites.
  • Role-to-standard map: Each capability linked to the governing clause and SOP ID.
  • Evidence attached: Certificates, assessment IDs/scores, OJT sign‑offs, assessor names/dates.
  • Validity tracking: Issue/expiry dates, refresher cadence, and regulator approvals where relevant.
  • Version control: Code/SOP version trained against; change history.
  • Exception logging: Any temporary dispensations with risk acceptance and end dates.

Control recertification and change

  • Automated alerts and dashboards: expiry_risk_90d = expiring_≤90d / in_scope (see the sketch after this list).
  • Rolling cohorts: Pre‑book refreshers for high‑risk roles; clear backlogs fast.
  • Change triggers: Retrain when codes/SOPs update, after extended leave, or on role change.
  • Gated work: Assignment to DG tasks contingent on in‑date competence.
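
A minimal sketch of the alerting and gating logic, assuming a certificate register exported from the LMS/HRIS (names, dates and fields are illustrative):

```python
from datetime import date, timedelta

certs = [
    {"person": "J. Patel", "role": "Export ops", "expiry": date(2025, 7, 10)},
    {"person": "M. Rossi", "role": "Loader",     "expiry": date(2026, 1, 15)},
    {"person": "K. Wong",  "role": "Export ops", "expiry": date(2025, 5, 20)},
]

today = date(2025, 6, 1)
horizon = today + timedelta(days=90)

expired = [c["person"] for c in certs if c["expiry"] <= today]
expiring_soon = [c["person"] for c in certs if today < c["expiry"] <= horizon]
expiry_risk_90d = len(expiring_soon) / len(certs)

def may_assign_dg_task(person: str) -> bool:
    """Gate DG work: assignment requires an in-date certificate."""
    return any(c["person"] == person and c["expiry"] > today for c in certs)

print("expired:", expired)                                      # ['K. Wong']
print(f"expiry_risk_90d: {expiry_risk_90d:.0%}")                # 33%
print("may assign J. Patel:", may_assign_dg_task("J. Patel"))   # True
```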

Governance and data protection

  • Clear RACI: Who approves, observes, audits and escalates.
  • Periodic internal audits: Sample records against standards; fix gaps promptly.
  • Confidentiality and retention: Role‑based access, lawful basis, and defined retention periods.

This final step closes the loop: your training needs analysis produces evidence that stays current, defensible and ready for any audit.

Key takeaways and next steps

You now have a practical, audit‑ready way to run a TNA that improves real‑world performance and keeps IATA/IMDG/ADR compliance intact. The rhythm is consistent: start with outcomes, choose the right level, engage stakeholders, define standards, mine existing data, collect mixed evidence, separate training from non‑training fixes, prioritise by risk, blend delivery for transfer, measure, and keep records current.

  • Clarify the goal: Test whether training is the lever before acting.
  • Map capabilities to standards: Tie tasks to clauses/SOPs with proficiency levels.
  • Triangulate evidence: Use surveys, interviews, observations and assessments.
  • Prioritise by risk and effort: Set measurable outcomes that mirror the standard.
  • Design for transfer: Blend solutions, capture OJT evidence, report and iterate.

If you want expert support to tailor this to dangerous goods roles — from lithium batteries to DGSA — speak to Logicom Hub. We’ll help turn your TNA into confident, compliant performance on the job.