Quality Assurance Process

Run self-review, complete the QA worksheet, submit a clean package, and address feedback until approved. Use the PS ID Companion for rubric-aligned feedback alongside the worksheet.

What you need before starting

What you need to produce

Completed QA worksheet; submission package (slides, labs, assessments, facilitator guide, blueprint, notes for reviewer); responses to QA feedback until final approval.

What to do

  1. Run the “Before you submit” checklist (below).
  2. Complete the QA Worksheet & Rubric for the module.
  3. Optionally run a PS ID Companion review; revise based on its feedback.
  4. Assemble and label the submission package; submit via Dashboard.
  5. Address first-round feedback; resubmit.
  6. Address second-round if needed; obtain final approval.

Exit criteria

QA is complete when: the worksheet is finished; self-review is done; the package is submitted and correctly labeled; all feedback is addressed; status is Approved.

Common mistakes

  • Skipping self-review and sending incomplete or mislabeled files.
  • Thin facilitator notes (a common cause of rework).
  • Objectives and assessments not aligned.
  • Simulation context that is decorative rather than task-affecting.
  • Vague lab/SBA instructions.
  • Broken links or incorrect naming.


Before you submit for QA

Run this self-check so reviewers can focus on quality, not basics:

  • Objectives match assessments: Each objective has at least one assessment that measures it; no orphan objectives or assessments.
  • Instructions are complete: Labs and SBAs have clear scenario, steps, and requirements; nothing assumed.
  • Simulation context affects the task: The workplace frame shapes what learners do, not just the story around it.
  • Activity timing is realistic: Durations and sequencing are feasible for the cohort.
  • Facilitator notes are actionable: Notes tell the instructor what to do (e.g., “Prompt a stand-up in the first 5 minutes”), not just what to know.
  • Naming and file structure are correct: Prefixes and organization match the naming convention.
  • Links and assets work: All links resolve; embedded assets load; no broken references.
  • Content varies in format: The module mixes formats (scenarios, discussions, video, interactive elements) rather than relying solely on slides and labs. If it’s just “slides then lab then test,” it needs more. See Non-Negotiables & Standards.
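The first check above — no orphan objectives or assessments — can be verified mechanically. A minimal sketch, using made-up objective IDs and assessment names (the real module would supply its own):

```python
# Hypothetical alignment check: every objective is measured by at least
# one assessment, and every assessment maps to a listed objective.
# The IDs below are illustrative, not from any real module.

objectives = {"LO1", "LO2", "LO3"}

# Map each assessment to the objective(s) it claims to measure.
assessments = {
    "lab-01": {"LO1"},
    "sba-01": {"LO2", "LO3"},
}

# Union of all objectives that some assessment covers.
measured = set().union(*assessments.values())

orphan_objectives = objectives - measured  # objectives nothing measures
orphan_assessments = {                     # assessments tied to no listed objective
    name for name, covers in assessments.items() if not covers & objectives
}

assert not orphan_objectives, f"Unmeasured objectives: {orphan_objectives}"
assert not orphan_assessments, f"Unaligned assessments: {orphan_assessments}"
```

Either assertion failing means the module needs revision before submission.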

What to send and how to label

  • Package: Slides, labs, assessments, facilitator guide, module blueprint, notes for reviewer (known gaps or open questions).
  • Label: Module number, asset type, version if needed. Submit via Product Development Dashboard.
  • Common rework causes: Thin facilitator notes; objectives/assessments not aligned; simulation decorative; vague lab/SBA instructions; broken links or wrong naming; incomplete worksheet.
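Since wrong naming is a listed rework cause, label checks can be scripted. The pattern below is a hypothetical stand-in for the actual naming convention (module number, asset type, optional version, e.g. "M03_Lab_v2"); adjust it to whatever the convention actually specifies:

```python
import re

# Assumed label shape: M<two-digit module>_<asset type>[_v<version>].
# This regex is illustrative only; the real convention may differ.
LABEL = re.compile(r"^M\d{2}_(Slides|Lab|Assessment|FG|Blueprint|Notes)(_v\d+)?$")

def is_labeled(filename: str) -> bool:
    """Return True if the file stem follows the assumed naming convention."""
    stem = filename.rsplit(".", 1)[0]  # drop the extension, keep the stem
    return bool(LABEL.match(stem))
```

For example, `is_labeled("M03_Lab_v2.docx")` passes, while `is_labeled("final_lab.docx")` flags a file to rename before submitting.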

QA Worksheet and Rubric

Use the QA Worksheet & Rubric for self-reviews and to guide QA reviewers through the evaluation.

QA Review Workflow

SME Self-Review

Est. Time Allocation: 6 hours
  • Complete QA Worksheet & Rubric.
  • Review all module components.
  • Document any issues found.

First-Round QA Review

Est. Time Allocation: 5 hours per 5.5 hours of content
  • QA team reviews worksheet and materials.
  • Provides detailed feedback.
  • Scores against rubric.

SME Review and Revisions

Est. Time Allocation: 6 hours
  • Address QA comments and questions.
  • Make corrections and revisions.
  • Resubmit via Airtable.

Second-Round QA Review

Est. Time Allocation: 2.5 hours
  • Verify first-round feedback addressed.
  • Provide additional feedback if needed.
  • Track progress in Airtable.

Final QA Review

Est. Time Allocation: 1 hour
  • Review for final approval.
  • Ensure all feedback addressed.
  • Move to “Approved” status.
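The stage estimates above can be totaled for scheduling. A sketch assuming a module with exactly 5.5 hours of content, so the rate-based first-round review comes to 5 hours (real modules scale that line by their content hours):

```python
# Rough planning arithmetic from the stage estimates above.
# Assumption: a module with 5.5 hours of content, so the first-round
# review (5 h per 5.5 h of content) works out to 5 hours.
content_hours = 5.5

stages = {
    "SME self-review": 6.0,
    "First-round QA review": 5.0 * (content_hours / 5.5),
    "SME review and revisions": 6.0,
    "Second-round QA review": 2.5,
    "Final QA review": 1.0,
}

total = sum(stages.values())
print(f"Estimated total QA effort: {total} hours")  # 20.5 for this module
```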

Product Development Dashboard

Track QA progress and feedback using the Product Development Dashboard.

Quality indicators (review criteria)


General Standards

  • Content accuracy.
  • Technical completeness.
  • Alignment with objectives.
  • Accessibility compliance.

Writing Standards

  • Clarity and consistency.
  • Grammar and style.
  • Terminology usage.
  • Instructional clarity.

Interactive QA Practice Tool

Practice evaluating curriculum quality by reviewing a module against these standards. Mark each standard as “Yes” only if all verification points are met.

This tool helps you develop your ability to assess curriculum quality systematically. Use the Additional Notes to record your observations and reasoning.
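The "Yes only if all verification points are met" rule is an all-or-nothing gate. A minimal sketch with illustrative verification points (the worksheet sections below define the real ones):

```python
# Gating rule for each standard: "Yes" only when every verification
# point passes. The point names here are illustrative.

def standard_passes(points: dict[str, bool]) -> str:
    """Return 'Yes' only if all verification points are met."""
    return "Yes" if all(points.values()) else "No"

points = {
    "objectives_measurable": True,
    "blooms_level_appropriate": True,
    "aligned_with_title": False,  # one unmet point fails the whole standard
}
print(standard_passes(points))  # No
```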

QA Review Worksheet

Module Information

1. Module Overview and Introduction

The introduction tells learners how to get started and where to find essential course components.

  • Introduction clearly explains module purpose.
  • Learning objectives are clearly stated.
  • Required materials are listed.
  • Navigation instructions are provided.
  • Technical requirements are specified.

2. Learning Objectives (Competencies)

Outcomes match the module's rigor and suit its level, title, and description.

  • Objectives are measurable and specific.
  • Bloom's taxonomy level is appropriate.
  • Objectives align with module title.
  • Objectives are achievable within module timeframe.
  • Prerequisites are clearly stated.

3. Assessment and Measurement

Assessments measure the achievement of the stated learning objectives.

  • Assessments directly measure objectives.
  • Grading criteria are clearly defined.
  • Assessment instructions are clear.
  • Rubrics are provided where needed.
  • Practice opportunities are available.

4. Writing Standards

Writing adheres to standard English rules and best practices.

  • Grammar and spelling are correct.
  • Terminology is consistent.
  • Writing style is professional.
  • Content is well-organized.
  • Instructions are clear and concise.

5. Workplace Simulation / Alignment

Materials read as professional practice (role, scenario, deliverable), not test prep. Content is framed in workplace context.

  • Target role or workplace context is clear.
  • Scenarios feel like real job tasks.
  • Deliverables match workplace expectations (e.g., KB article, ticket notes).
  • Simulation frame (stand-ups, tickets, KPIs) is reflected where applicable.

Next step

Implementation: Hand off the course package and support instructor training.
