Deliverable guides
Find purpose, required sections, naming rules, and examples for each key output so you build and hand off correctly.
Mini-guides for the main outputs. Each covers: purpose, required sections, naming, what good looks like, common problems, and an example.
Module plan (planning doc)
Purpose: Single handoff so the development team can build without guessing. Defines objectives, evidence, activities, and simulation frame.
Required sections: Module overview (title, UCI, duration, role, 3–5 objectives); Assessment strategy (types, coverage map, scoring); Activity map (sections, durations, types, materials); Simulation frame (stand-ups, ticket source, KPIs, escalation); Source materials (links).
Naming: Reference by module number/UCI in the doc; no separate file naming rule for the plan itself.
What good looks like: A developer can start building slides and labs and knows exactly which objective each asset supports; simulation frame is specific enough to write stand-up prompts and ticket scenarios.
Common problems: Objectives too vague; assessment map missing or not aligned; simulation frame decorative; source links missing; open issues not listed.
Example: Planned Document Example; Planning page, Step 6.
Facilitator guide (FG)
Purpose: Tell the instructor what to do during delivery: when to run stand-ups, what to prompt, timing, coaching tips, and red flags. Not just a content summary.
Required sections: Module/lesson overview; timing (per section or day); facilitation moves (e.g. “First 5 min: stand-up”; “After lab: debrief with these questions”); coaching tips and common pitfalls; simulation-specific prompts (ticket context, KPIs); links to slides/labs/assets.
Naming: By module number (e.g. MBP 301 = Module Blueprint 301); the FG itself can be “FG 301” or “Facilitator Guide – Module 301”.
What good looks like: An instructor can run the session by following the guide without having to invent when to do what. Simulation (stand-up, tickets, KPIs) is clearly tied to each block.
Common problems: Notes are a content recap only, not actionable; no timing; simulation prompts missing; guide is too thin, so the instructor has to improvise.
Example: See Developing (Custom) and Implementation for FG expectations.
Labs (guided lab, SBA)
Purpose: Hands-on task in a workplace context. Guided lab = steps + reflection; SBA = scenario + content parts/steps + questions (client-style). The learner should feel they are doing a real job task.
Required sections (guided lab): Title; Objectives (2–3 bullets); Requirements; Scenario/Instructions; Steps (with subtasks); Reflection.
Required sections (SBA): Title; Objectives (2–4); Requirements; Scenario/Instructions; Content Parts and Steps; Questions (e.g. for QA in documentation).
Naming: GLAB 301.1.1 (guided lab), SBA 301, etc.; see the naming convention. Graded versions: R-ALAB, R-SBA.
What good looks like: Instructions are complete; scenario affects the task (not just flavor); steps are clear; reflection ties back to the objective. No “assume the learner knows X” without stating X.
Common problems: Vague instructions; scenario decorative; missing requirements or steps; no reflection; doesn’t map to an objective.
Example: Custom and Vendor lab examples. Templates.
KBAs and SBAs (assessments)
Purpose: KBA = knowledge check (concepts); SBA = skills-based (doing). Both must map to a planning objective and feel job-aligned where possible.
Required sections: Clear prompt/scenario; questions or tasks; success criteria or rubric; link to objective(s) in blueprint.
Naming: KBA 301, SBA 301; graded: R-KBA, R-SBA. Always graded when used as a required assessment.
What good looks like: Each item maps to one or more objectives; questions/tasks are unambiguous; scoring is clear; simulation context shapes the task (e.g. client question, ticket context).
Common problems: Assessment checks recall instead of performance; no objective mapping; vague prompts; simulation context absent or decorative.
Example: QA standards and practice tool; Planning Step 3.
Slide decks
Purpose: Context, concepts, and examples that support the objectives. Slides should feed into doing (labs, discussions), not replace it; 75% of lesson time should be active learning.
Required: One objective per slide cluster (or a clear link to one); workplace/simulation context where relevant; clear visuals; reflection or discussion prompts where appropriate. A slide deck must never be the only deliverable for a module; there must also be activities and assessments.
Naming: By lesson/module (e.g. Lesson 301.1 slides). Prefix table: Developing.
What good looks like: Slides set up the scenario and concepts; learners then do something (lab, discussion, exit ticket). Simulation (role, scenario) appears in the deck, not only in the FG.
Common problems: Slides reteach what could be read async; no link to objectives or activities; no workplace context; deck used as the whole lesson (passive).
Example: Custom slide example and Vendor wrapper.
Simulation scenarios (wrapper / frame)
Purpose: The “world” learners work in: company, role, team, stand-ups, ticket source, KPIs. Used in planning (the simulation frame) and in delivery (the FG, slides, and labs reference it). This is not a single document; it appears in the plan, the FG, and the materials.
Required (in planning doc): Workplace context and role; team structure; stand-ups (when, what); ticket source; KPIs; escalation path.
Required (in delivery): FG and slides that reference this frame so instructors and learners know the scenario.
Naming: No single file; frame is a section in the plan and reflected in asset names/context.
What good looks like: The frame changes what learners do (tickets, stand-ups, deliverables). “Workplace aligned” = task and context match; “too classroomy” = generic exercises with a scenario sentence pasted on.
Common problems: Frame is decorative (mentioned once); instructors don’t know how to run stand-ups or tickets; KPIs never used; scenario doesn’t affect the task.
Example: Planning Step 5; Module 134 example; Vendor wrapper.
Quick reference
| Deliverable | Purpose | Key link |
|---|---|---|
| Module plan | Handoff for build | Planning Step 6 |
| Facilitator guide | Instructor to-do list | Custom, Implementation |
| Labs / SBAs | Job-aligned tasks | Naming, Templates |
| KBAs / SBAs | Mapped assessments | Planning Step 3, QA |
| Slide decks | Context for doing | Custom, Vendor |
| Simulation frame | Role, stand-ups, KPIs | Planning Step 5 |