Templates for TNA and Learning Outcomes


A training programme without a Training Needs Analysis (TNA) is a solution in search of a problem. It may be well-designed, engaging, and professionally delivered — and still miss the point entirely, because no one has systematically established what the real performance gap is, where it sits in the organisation, who is affected, and what addressing it would actually look like. Equally, a TNA without clearly written learning outcomes leaves even a well-diagnosed need without a measurable destination. The two are inseparable: the TNA tells you what problem you are solving; the learning outcomes tell you how you will know you have solved it.

This article provides both the conceptual framework and the practical templates L&D professionals need to conduct rigorous TNAs and write learning outcomes that are specific, measurable, and genuinely linked to performance. Every template included here is ready to adapt for your own organisation, regardless of sector or scale.

Key Takeaways

  • A TNA should operate at three levels simultaneously: organisational, job or role, and individual — as defined by the Open University’s framework for workplace learning.
  • The most reliable TNAs triangulate data from multiple sources — surveys, interviews, performance data, and observation — rather than relying on any single method.
  • Learning outcomes must begin with an observable, measurable action verb. Vague verbs such as “understand,” “know,” or “appreciate” cannot be assessed and should always be replaced.
  • Bloom’s Revised Taxonomy provides a hierarchical framework for writing learning outcomes that match the cognitive demand of the job — from basic recall through to analysis, evaluation, and creation.
  • The SMART framework (Specific, Measurable, Achievable, Relevant, Time-bound) and Bloom’s Taxonomy are most effective when used together, not as alternatives.
  • Learning outcomes written at the right cognitive level directly inform assessment design, content sequencing, and the choice of delivery method — making them the single most important design document in any training programme.

What is a Training Needs Analysis — and why does it matter?

A Training Needs Analysis is a process for identifying the gap between the actual and the desired knowledge, skills, and abilities in a job. The need for such analysis usually arises from an organisational problem — a lower-than-expected quarter for the sales team, changing technology threatening operational continuity, or persistently low customer satisfaction scores forcing a team to become more agile and customer-focused.

The purpose is not simply to identify what training is needed. It is to ensure that training is the right response at all. Performance gaps can arise from a lack of knowledge or skill — in which case training is appropriate — but they can equally stem from unclear expectations, inadequate tools, poor processes, or motivational factors that no training programme will resolve. A well-conducted TNA distinguishes between these causes and recommends training only where it will genuinely make a difference.

  • 90% of L&D professionals agree that proactively building skills will help their organisations navigate the future of work
  • eNPS scores improved from 57 to 65 after implementing TNA-led L&D programmes
  • A comprehensive TNA must address 3 levels: organisational, job or role, and individual
  • 4+ data collection methods should be triangulated for a reliable TNA

The three levels of a TNA

The TNA is widely regarded as the most important part of the training lifecycle. A classic TNA will usually examine needs at three levels: organisational, job or role, and individual. Understanding what each level reveals — and what questions to ask at each — is the foundation of any robust needs analysis process.

Level 1

Organisational analysis

Links the TNA to corporate strategy and HRD planning. Examines how well the organisation is equipped to meet current and future challenges — factoring in technology change, workforce planning, and strategic direction.

Data sources: Corporate documents, strategic plans, workforce data, senior stakeholder interviews, engagement surveys

Level 2

Job / role analysis

Identifies the knowledge, skills, and behaviours required to perform a specific role effectively. Establishes the “should be” standard against which current employee capability is compared.

Data sources: Job descriptions, competency frameworks, interviews with job-holders, observation, performance standards

Level 3

Individual analysis

Pinpoints the specific gaps in knowledge, skill, or behaviour for individual employees. Identifies who needs training, in what areas, and at what depth — enabling targeted rather than blanket provision.

Data sources: Performance appraisals, self-assessments, manager observations, assessment centres, 360-degree feedback

TNA data collection methods

Surveys, interviews, performance reviews, and observations should be used together to gather comprehensive data. Relying on just one method can lead to incomplete or biased insights. The table below summarises the main data collection methods, their strengths, and when each is most appropriate.

Method | Strengths | Limitations | Best used for
Surveys & questionnaires | Scalable; reaches large groups quickly; quantifiable data; anonymity encourages honesty | Surface-level; low response rates possible; cannot probe nuance | Organisational-level and individual self-assessment at scale
Interviews | Rich qualitative data; uncovers hidden needs; builds stakeholder buy-in | Time-intensive; subject to interviewer bias; difficult to scale | Senior stakeholders, managers, and key role-holders
Focus groups | Surfaces shared experiences; efficient for exploring team-level needs; builds consensus | Group dynamics may suppress minority views; requires skilled facilitation | Role or team-level analysis; validating survey findings
Performance data review | Objective; directly links gaps to measurable outcomes; credible with senior leadership | Does not explain why a gap exists; data quality depends on systems | Identifying organisational and team-level performance gaps
Observation | Captures actual behaviour (not reported behaviour); identifies hidden task components | Observer effect may alter behaviour; resource-intensive; ethical considerations | Job/role-level analysis; process or compliance training
Appraisal & 360 data | Multi-source; already collected; directly reflects individual performance against role expectations | Quality varies; may reflect manager bias; historical rather than forward-looking | Individual-level gap identification; leadership development TNAs

Template 1: TNA scoping document

The TNA scoping document is completed before any data collection begins. It defines the boundaries of the analysis, establishes what business problem or strategic objective is driving it, and secures stakeholder agreement on the purpose and expected outputs. Without this step, TNA work frequently expands beyond its original scope or produces findings that do not connect clearly to a decision.

Template 1 — TNA Scoping Document

Organisation / business unit [Insert department, team, or business unit name]
TNA lead [Name and role of the person conducting the TNA]
Sponsoring stakeholder [Name and role of the senior sponsor commissioning this TNA]
Business problem or trigger [What performance issue, strategic objective, or organisational change has prompted this TNA? Be specific — include data where available, e.g. “Sales conversion rate has declined from 42% to 31% over two quarters”]
Desired outcome [What does success look like? What should be different once the gap has been addressed? E.g. “Conversion rate returns to 40%+ and remains stable for two consecutive quarters”]
TNA scope [Which levels will this TNA address? Tick all that apply: ☐ Organisational ☐ Job / role ☐ Individual. Which roles or populations are in scope?]
Data collection methods [Which methods will be used? ☐ Survey ☐ Interviews ☐ Focus groups ☐ Performance data review ☐ Observation ☐ Appraisal / 360 data ☐ Other]
Key stakeholders to involve [List names / roles of managers, employees, subject matter experts, and HR contacts to be interviewed or surveyed]
Timeline [Data collection: __ / __ / ____ to __ / __ / ____  |  Findings report: __ / __ / ____  |  Recommendations agreed: __ / __ / ____]
Assumptions and constraints [Any budget limitations, time constraints, access restrictions, or assumptions built into the scope of this analysis]
Is training the likely solution? [Preliminary view on whether the gap is a knowledge/skill issue (training appropriate), or a process/motivation/environment issue (non-training intervention needed). To be confirmed following data collection.]

Template 2: skills gap analysis matrix

Once data has been collected, the skills gap analysis matrix provides a structured format for recording and comparing current capability against required capability for each role in scope. The gap score — the difference between required and current level — drives prioritisation of training interventions.

Template 2 — Skills Gap Analysis Matrix  |  Role: ________________  |  Date: ________________

Skill / competency | Required level (1–5) | Current level (1–5) | Gap score | Priority | Training recommended?
[e.g. Consultative selling techniques] | 4 | 2 | 2 | High | Yes — structured programme
[e.g. CRM system proficiency] | 3 | 2 | 1 | Medium | Yes — on-the-job coaching
[e.g. Product knowledge — core range] | 4 | 4 | 0 | None | No — gap not present
[Add skill / competency] | | | | TBC |
[Add skill / competency] | | | | TBC |
Scoring guide: 1 = No capability  |  2 = Basic awareness  |  3 = Developing  |  4 = Proficient  |  5 = Expert / can coach others  |  Gap = Required minus Current  |  Priority High = gap of 2+; Medium = gap of 1; None = gap of 0
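The matrix's scoring rule is simple enough to sketch in code. The snippet below is a minimal illustration in Python, following the scoring guide above (gap equals required minus current; a gap of 2 or more is High priority, 1 is Medium, 0 is None). The skill names and levels are the template's own example rows, not real data.

```python
# Minimal sketch of the skills gap matrix scoring logic.
# Follows the scoring guide: gap = required - current;
# priority High for a gap of 2+, Medium for 1, None for 0.

def gap_priority(required: int, current: int) -> tuple[int, str]:
    """Return (gap score, priority band) for one skill."""
    gap = required - current  # Gap = Required minus Current
    if gap >= 2:
        return gap, "High"
    if gap == 1:
        return gap, "Medium"
    return gap, "None"

# Example rows from the matrix: (skill, required level, current level)
matrix = [
    ("Consultative selling techniques", 4, 2),
    ("CRM system proficiency", 3, 2),
    ("Product knowledge - core range", 4, 4),
]

# Sort so the largest gaps surface first for prioritisation
for skill, req, cur in sorted(matrix, key=lambda r: r[1] - r[2], reverse=True):
    gap, priority = gap_priority(req, cur)
    print(f"{skill}: gap {gap}, priority {priority}")
```

In practice the thresholds themselves are a design choice; an organisation with a five-point scale and many large gaps might reserve "High" for gaps of 3 or more.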

Template 3: TNA findings and recommendations summary

Once all data has been gathered and analysed, the findings and recommendations summary translates the raw analysis into actionable decisions. This is the document presented to the sponsoring stakeholder — it should be concise, evidence-based, and focused on the “so what” rather than the methodology.

Template 3 — TNA Findings & Recommendations Summary

Business problem addressed [Restate the original business problem or strategic objective that triggered this TNA]
Data sources used [List what data was collected, from whom, and when — e.g. “14 manager interviews, survey of 62 employees, review of Q2 performance data and exit interview themes”]
Key findings [Summarise the 3–5 most significant findings from the data — what are the main gaps, their causes, and their scope? Which are skill/knowledge issues (training-addressable) and which are not?]
Populations requiring training [Which roles, teams, or individuals have been identified as requiring training, and at what level of urgency?]
Non-training recommendations [Any gaps that are better addressed through process redesign, role clarification, management practice changes, or other non-training interventions — with brief rationale]
Proposed training interventions [For each training gap: proposed intervention type, target audience, estimated duration, and indicative timeline — e.g. “Workshop-based consultative selling programme for 18 account managers, 2 days, Q2”]
Indicative budget requirement [High-level cost estimate for the recommended training interventions, including design, delivery, and facilitation]
Success metrics [How will you know the training has worked? List 2–3 measurable indicators — e.g. “Sales conversion rate returns to 40%+ by end of Q3; post-training competency assessment average of 80%+; manager-rated behaviour change at 60 days”]
Stakeholder sign-off [Name: ________________  |  Role: ________________  |  Date: ________________  |  Signature: ________________]

From TNA to learning outcomes: the critical bridge

The TNA establishes what needs to change. Learning outcomes define — precisely and measurably — what a learner must be able to do differently as a result of training. They are the bridge between diagnosis and design, and they determine everything that follows: the content selected, the delivery methods used, the activities designed, and the assessments employed.

Learning objectives are clearly written, specific statements of observable learner behaviour or action that can be measured upon completion of an educational activity. They are the foundation for instructional alignment whereby the learning objectives, assessment tools, and instructional methods mutually support the desired learning outcome.

“A course or training programme without learning outcomes is like a vehicle without a steering wheel — both spell a lack of direction, and that’s a disaster waiting to happen.”

— Teachfloor, How to Write Learning Objectives Using Bloom’s Taxonomy

Bloom’s Taxonomy: writing outcomes at the right cognitive level

Most training programmes tell learners what they need to know. Bloom’s Taxonomy asks a different question: what do they need to be able to do with it? That shift — from content delivery to capability building — is why this framework, first published in 1956, still shapes how L&D professionals and instructional designers structure learning today.

The revised Bloom’s Taxonomy organises learning into six levels of increasing cognitive complexity. The critical insight for L&D practitioners is that the level at which you write a learning outcome should match the cognitive demand of the job — not default to the lowest level because it is easiest to teach or assess.

Level | Cognitive skill | Key action verbs | Example learning outcome (corporate training)
1 — Remember | Recall facts from memory | Define, list, name, recall, state, identify, match, recognise | By the end of this module, participants will be able to list the five stages of the company’s complaint escalation process
2 — Understand | Explain and interpret meaning | Explain, describe, summarise, interpret, classify, compare, paraphrase | Explain the difference between assertive and aggressive communication styles and when each is likely to occur
3 — Apply | Use knowledge in new situations | Apply, demonstrate, implement, use, solve, execute, perform, show | Demonstrate the STAR feedback model in a role-play scenario with a direct report
4 — Analyse | Break down and examine components | Analyse, differentiate, compare, contrast, examine, distinguish, organise | Analyse a sample customer interaction and identify where the conversation deviated from consultative selling principles
5 — Evaluate | Make judgements and justify decisions | Evaluate, judge, justify, critique, argue, assess, rank, defend, appraise | Evaluate the appropriateness of three proposed conflict resolution approaches given the specifics of a case study scenario
6 — Create | Produce something new and original | Design, develop, construct, build, propose, plan, formulate, devise | Design a 90-day onboarding plan for a new team member that reflects the team’s current priorities and the individual’s development needs

Strong vs. weak learning outcomes: a comparison

Verbs such as “understand”, “know”, “learn”, “appreciate”, “believe”, “be familiar with”, and “comprehend” are not observable or measurable and should be avoided. The table below shows common weak formulations alongside stronger rewrites that meet the SMART and Bloom’s criteria.

Weak — avoid this | Strong — use this instead
Participants will understand the importance of active listening | Participants will demonstrate active listening by accurately summarising a colleague’s concern before responding
Learners will be aware of the company’s data protection policy | Learners will identify at least three data handling scenarios that require escalation under the company’s GDPR policy
Delegates will learn about coaching approaches for line managers | Delegates will apply the GROW model in a 20-minute coaching conversation and receive structured peer feedback against defined criteria
Participants will appreciate the value of diversity in teams | Participants will analyse a team scenario and identify two specific ways in which diverse perspectives contributed to — or were excluded from — the decision-making process
Learners will know how to handle a difficult performance conversation | Learners will conduct a structured performance conversation using the SBI (Situation-Behaviour-Impact) model, maintaining a constructive tone throughout a role-play with an assessor
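The weak-verb rule is mechanical enough to automate as a first-pass screen over a draft set of outcomes. The sketch below is a minimal illustration, not a substitute for editorial review: the weak-verb list is taken from this article, and the sample statements are the table's own examples.

```python
# First-pass screen for vague, non-assessable verbs in learning outcomes.
# The weak-verb list comes from this article; it is a starting point,
# not a complete linguistic check - flagged outcomes still need rewriting
# by a human against the SMART and Bloom's criteria.
import re

WEAK_VERBS = [
    "understand", "know", "learn about", "appreciate",
    "believe", "be familiar with", "be aware of", "comprehend",
]

def flag_weak_outcome(outcome: str) -> list[str]:
    """Return the weak verbs found in an outcome (empty list = passes)."""
    text = outcome.lower()
    # \b word boundaries stop "learn" matching inside "learners", etc.
    return [v for v in WEAK_VERBS if re.search(rf"\b{re.escape(v)}\b", text)]

# Examples from the comparison table above
outcomes = [
    "Participants will understand the importance of active listening",
    "Participants will demonstrate active listening by accurately "
    "summarising a colleague's concern before responding",
]

for outcome in outcomes:
    hits = flag_weak_outcome(outcome)
    verdict = f"rewrite (found: {', '.join(hits)})" if hits else "passes first check"
    print(f"- {outcome[:55]}... {verdict}")
```

A check like this only tells you an outcome is *not obviously* unmeasurable; confirming that the verb, context, and criterion genuinely match the job still requires judgement.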

Template 4: learning outcomes writing frame

Use this template when writing learning outcomes for any training programme. Each outcome should stand alone — one action verb, one skill or knowledge area, one context or condition, and one criterion where possible. Avoid bundling two outcomes into a single statement.

Template 4 — Learning Outcomes Writing Frame

Programme title [Insert programme name]
Target audience [Role, team, or group for whom these outcomes are written]
Linked TNA gap [Which skill / knowledge gap identified in the TNA do these outcomes address?]
Bloom’s level targeted [Remember / Understand / Apply / Analyse / Evaluate / Create — choose the level that matches the cognitive demand of the job, not the easiest level to deliver]
Outcome stem “By the end of this programme / module / session, participants will be able to…”
# | Action verb | Skill / knowledge / behaviour | Context / condition | Standard / criterion
LO1 | Demonstrate | [The skill or knowledge to be applied] | [In what setting or scenario] | [To what standard or level of accuracy]
LO2 | | | |
LO3 | | | |
LO4 | | | |
Assessment method [How will each outcome be assessed? e.g. Role-play with structured observation, written scenario response, manager verification checklist, skills assessment at 30 days post-training]
Link to business metric [Which business outcome will these learning outcomes contribute to, and how will that contribution be measured? e.g. “Improvement in customer satisfaction score” / “Reduction in grievance volume” / “Increase in internal promotion rate”]
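The row structure of the writing frame (one verb, one skill, one context, one criterion per outcome) can be sketched as a simple string assembly. This is an illustration only: the stem and field names mirror the template's columns, and the example values are hypothetical.

```python
# Assemble a learning outcome from the writing frame's four parts.
# Field names mirror the template's columns; the example values are
# illustrative, not prescriptive.

STEM = "By the end of this programme, participants will be able to"

def write_outcome(verb: str, skill: str, context: str, criterion: str) -> str:
    """One action verb, one skill, one context, one criterion per outcome."""
    return f"{STEM} {verb} {skill} {context}, {criterion}."

lo1 = write_outcome(
    verb="demonstrate",
    skill="the STAR feedback model",
    context="in a role-play scenario with a direct report",
    criterion="covering all four STAR elements without prompting",
)
print(lo1)
```

Keeping the parts separate like this makes the no-bundling rule easy to enforce: if a row needs two verbs, it is two outcomes.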

Aligning TNA, learning outcomes, and assessment: the instructional design chain

A TNA and a set of well-written learning outcomes are only as valuable as the alignment between them and everything that follows in the design process. The table below shows how each element of the instructional design chain feeds the next — and the questions to ask at each stage to maintain alignment.

The instructional design alignment chain

Stage | Key question | Output
1. TNA | What is the performance gap, why does it exist, and is training the right solution? | Scoping document, skills gap matrix, recommendations summary
2. Learning outcomes | What must learners be able to do differently at the end — at the right cognitive level? | SMART, Bloom’s-aligned outcome statements for each gap
3. Assessment design | How will we know each outcome has been achieved — and at what point in the learning journey? | Assessment methods matched to each outcome; evaluation plan
4. Content design | What content, activities, and resources are needed to enable learners to achieve each outcome? | Session plans, materials, e-learning modules, case studies
5. Delivery method | What format and channel best supports the learning outcomes and the target audience? | Workshop, e-learning, blended, coaching, on-the-job practice
6. Evaluation | Did the training produce the behaviour change and business outcome identified in the TNA? | Kirkpatrick Level 3 and 4 data; ROI measurement; next TNA cycle

Conclusion

A rigorous TNA and clearly written learning outcomes are not administrative burdens — they are the professional foundations of effective L&D practice. Together, they ensure that every training investment is made for a specific, evidenced reason; that its success can be defined in advance and measured afterwards; and that the design of the programme flows logically from the needs identified rather than from habit, convenience, or supplier catalogue.

The templates provided in this article are starting points. Every organisation’s TNA will reflect its own context, data landscape, and stakeholder dynamics. What does not change is the underlying logic: diagnose precisely, write outcomes at the right cognitive level, align assessment and design to those outcomes, and measure what matters. When those steps are followed consistently, training stops being a cost and becomes an investment with a visible return.

Alpha Learning Centre supports organisations at every stage of this process — from TNA facilitation and learning outcome design through to programme delivery and evaluation. Contact us to find out how we can help you build a more rigorous, impactful L&D function.
