Custom Module Development Guide
This guide walks you through the process of creating a custom module using the Understanding by Design (UbD) framework and the Per Scholas Workplace Simulation model. You will plan backward from job-ready outcomes, build daily learning experiences around real-world tasks, and design assessments that reflect actual work.
We will use a real example throughout—our Windows Server Administration module—to show how each step works in practice.
Development Process
Step 1: Confirm Where the Module Fits
Before designing anything, make sure you understand where your module sits in the course. This helps you avoid repeating vendor content, connect your learning goals to what came before, and align with future simulation and assessment moments.
Clarify the Context
- What course or track is this part of?
- What modules or vendor content come before and after?
- What skill bundle does this fall under?
- What simulation arc is in play?
- What real-world role are learners stepping into?
Example: Windows Server Administration (Module 161)
- Course: IT Support
- Comes After: CompTIA A+ Core 2 + A+ Certification
- Skill Bundle: Enterprise IT tools and services
- Preceding Module: A+ Review and Certification
- Following Module: Capstone - Troubleshooting and System Monitoring
- Simulation Arc: Learners have just passed their A+ exam and are now starting their first day as junior infrastructure technicians on the EdgeTech IT team
- Module Focus: Managing core Windows Server tasks in a real production environment simulation
This context sets the tone: the learner isn’t a student anymore—they’re a newly hired junior sysadmin joining a team.
Step 2: Define Clear Learning Outcomes
Now that you know where your module fits, define what learners should walk away being able to do. Each learning outcome should be:
- Tied to the workplace
- Measurable
- Aligned with either technical skills, professional behaviors, or simulation performance
Use Bloom’s verbs to focus on what learners can do—not just what they understand.
| Type | Outcome Example | Ask Yourself |
| --- | --- | --- |
| Technical | Configure Active Directory and user permissions | Will this prepare them for what they’ll actually do on the job? |
| Behavioral | Escalate a server issue with clear documentation and follow-up | Can we observe and evaluate this behavior during a simulation? |
| Simulation | Participate in team standup as a Tier 1 sysadmin | Will learners apply this in a realistic workplace setting? |
Example: Windows Server Administration
- Configure and manage user accounts in Active Directory
- Set up basic file and print services in Windows Server
- Document system changes and escalate issues using internal handoff templates
- Participate in daily team standups with ticket updates and KPI tracking
Step 3: Build Content and Learning Activities
Now build the content that helps learners reach the goals above. Everything should reflect what a junior team member would be doing on the job—reading documentation, following procedures, using the tools, and working in a team.
What to include:
- Instructional slides or guides for key concepts
- Labs or hands-on tasks using real tools or virtual environments
- Scenario briefings and role-based team tasks
- Team reflections or retrospectives
- Standup meetings and asynchronous check-ins
Example Activities
- Slide Deck: Walkthrough of user account creation and GPO management
- Lab: Create, disable, and reset user accounts in a test domain (see the scripted sketch after this list)
- Pair Task: Review and edit Active Directory permissions using a shared VM
- Standup: Report on server maintenance progress and blockers
- Scenario Briefing: “You’re onboarding three new employees who need access to shared folders, printers, and company drives by noon today.”
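To make the account-management lab concrete for developers building it, here is a minimal scripted sketch of the create/disable/reset lifecycle the lab covers. It drives the real ActiveDirectory cmdlets (New-ADUser, Disable-ADAccount, Set-ADAccountPassword) from Python via subprocess; the domain, OU path, account name, and passwords are hypothetical lab values, and learners themselves would more likely run the cmdlets directly in PowerShell.

```python
import subprocess

def run_ps(command: str) -> str:
    """Run a PowerShell command and return its stdout (raises on failure)."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Hypothetical test-domain values -- replace with your lab environment's.
ou = "OU=NewHires,DC=edgetech,DC=test"

# Create the account (New-ADUser ships with the ActiveDirectory module).
run_ps(
    "New-ADUser -Name 'Jordan Lee' -SamAccountName jlee "
    f"-Path '{ou}' -Enabled $true "
    "-AccountPassword (ConvertTo-SecureString 'TempP@ss123!' -AsPlainText -Force)"
)

# Disable the account, as in the 'disable' step of the lab.
run_ps("Disable-ADAccount -Identity jlee")

# Reset the password, as in the 'reset' step.
run_ps(
    "Set-ADAccountPassword -Identity jlee -Reset "
    "-NewPassword (ConvertTo-SecureString 'NewP@ss456!' -AsPlainText -Force)"
)
```

A scripted version like this also doubles as an instructor answer key: run it against the test domain to verify the lab environment before learners start.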
Step 4: Design Assessments
With outcomes and content in place, now design assessments that give clear evidence of learner progress. These should feel like real deliverables or checkpoints from the workplace. Include both individual and team-based tasks.
| Type | Example | Focus |
| --- | --- | --- |
| Knowledge | Quiz on key Windows Server features and terminology | Understanding of foundational concepts |
| Skills | Lab where learners configure file sharing and permissions | Demonstration of applied skills |
| Behavioral | Peer review or instructor check-in after standup | Professional communication and collaboration |
| Simulation | KPI tracking during simulation days | Measurable workplace behaviors (e.g., documentation, escalation, clarity) |
Example Assessments
- KBA (Knowledge-Based Assessment): 10-question quiz on user roles, group policy, and server roles
- SBA (Skills-Based Assessment) Lab: Create a secure shared drive with read/write permissions for HR staff (see the scripted sketch after this list)
- Reflection Prompt: “Describe a blocker you hit today and how you resolved or escalated it.”
- KPI Rubric: Clear updates during standup, accurate escalation notes, response to feedback
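One possible answer key for the SBA lab above, sketched with the real New-SmbShare cmdlet and again driven from Python via subprocess. The share path, share name, and group names are hypothetical; -ChangeAccess grants read/write without full control.

```python
import subprocess

def run_ps(command: str) -> str:
    """Run a PowerShell command and return its stdout (raises on failure)."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Hypothetical path and group names -- adjust to your lab domain.
share_path = r"C:\Shares\HR"

# Create the folder, then share it: HR staff get read/write (Change),
# while Domain Admins keep full control.
run_ps(f"New-Item -ItemType Directory -Force -Path '{share_path}'")
run_ps(
    f"New-SmbShare -Name 'HRShare' -Path '{share_path}' "
    "-ChangeAccess 'EDGETECH\\HR Staff' -FullAccess 'EDGETECH\\Domain Admins'"
)
```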
Step 5: Build the Simulation Layer
Now connect your learning to a simulation context that ties it all together. This is where learners apply what they’ve learned in a “real job” setting with the added pressure of team dynamics, time constraints, blockers, and responsibilities.
Simulation Elements to Plan:
- The scenario (what’s happening this week?)
- The learner’s role and team structure
- The primary tasks or challenges
- KPIs or observable behaviors you’ll assess
Example: Windows Server Administration
- Scenario: You’re on the EdgeTech internal IT team during a company-wide onboarding sprint
- Trigger: 25 new employees are joining this week. You’re responsible for user setup, printer access, shared drive permissions, and access tickets
- Simulation Days: Each day represents a different onboarding wave or issue escalation
- KPIs: Documentation accuracy, follow-through on escalations, professional tone in communication, timely standup participation
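If you plan to track KPIs across simulation days, it helps to decide up front how observations will be recorded. Here is a minimal sketch in Python; the KPI names come from the list above, while the learner ID and the 1-3 scoring scale are hypothetical choices, not a Per Scholas standard.

```python
from collections import defaultdict

# KPI names from the module example above.
KPIS = {"documentation_accuracy", "escalation_follow_through",
        "professional_tone", "standup_participation"}

# observations[learner][day] -> {kpi_name: score}
observations = defaultdict(dict)

def log_day(learner, day, scores):
    """Record one simulation day's KPI scores (1 = emerging, 3 = strong)."""
    unknown = set(scores) - KPIS
    if unknown:
        raise ValueError(f"Unknown KPIs: {unknown}")
    observations[learner][day] = scores

def trend(learner, kpi):
    """Day-by-day scores for one KPI, to target instructor feedback."""
    return [observations[learner][d].get(kpi) for d in sorted(observations[learner])]

log_day("jlee", 1, {"documentation_accuracy": 2, "standup_participation": 3})
log_day("jlee", 2, {"documentation_accuracy": 3, "standup_participation": 3})
print(trend("jlee", "documentation_accuracy"))  # [2, 3]
```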
Step 6: Review and Iterate
Before finalizing your module, use our internal quality standards to review for clarity, alignment, and simulation fidelity. Share drafts with peers or SMEs and pilot the module if possible.
Internal Review Checklist
- Are outcomes, content, and assessments all aligned?
- Do simulation scenarios feel authentic?
- Are tools and workflows realistic for the learner’s role?
- Is everything accessible, up to date, and clearly written?
- Does the module follow the Curriculum Style Guide and use the provided templates?
Feedback from the Windows Server module pilot:
- Labs were clear, but instructions needed screenshots
- Learners wanted one sample handoff template before writing their own
- Retrospectives worked well—added reflection prompts to all days
- KPI tracking helped instructors provide targeted feedback
Developer Tips
The prompts below give you starting points for brainstorming with an AI assistant at each stage of module development. Each pair shows a reusable template followed by a filled-in example.
Module Context
- Template: Suggest real-world workplace scenarios where understanding [Your Module Topic, e.g., 'Windows Boot Process'] is critical for an entry-level IT Support role.
- Example: Generate 3 diverse, realistic scenarios for an IT Help Desk technician related to troubleshooting the Windows boot sequence. Include potential user descriptions and urgency levels.
- Template: Analyze the following course outline and suggest where a module on [Your Module Topic] would fit best, explaining the connections to preceding and subsequent topics: [Paste Course Outline Here]
- Example: Review this course outline: [Outline]. Where would a module on 'Windows Boot Troubleshooting' fit logically? What prerequisite knowledge should be covered before, and what advanced topics could follow?
Learning Outcomes
- Template: Review the following learning outcomes for clarity, measurability, and relevance to an IT Support role. Suggest improvements using strong action verbs: [Paste Outcomes Here]
- Example: Critique these learning outcomes for a module on Active Directory basics: [Outcome 1, Outcome 2]. Are they SMART? Suggest revisions using verbs from Bloom's Taxonomy.
- Template: Generate examples of observable behaviors in a simulated help desk environment that would demonstrate mastery of the following outcome: [Paste Outcome Here]
- Example: For the outcome 'Accurately document troubleshooting steps in a ticketing system', provide 3 examples of how a learner could demonstrate this during a simulated support call.
Assessment Design
- Template: Generate 5 multiple-choice questions testing knowledge of [Specific Concept, e.g., 'common Windows boot errors'], including plausible distractors and explanations for the correct answer.
- Example: Create 5 multiple-choice questions about the difference between `ipconfig` and `ping` for network troubleshooting. Include explanations for why the correct answer is right and the others are wrong. (A sample output shape follows these prompts.)
- Template: Draft a rubric to assess the skill of [Specific Skill, e.g., 'remotely assisting a user with a software installation']. Include criteria for technical accuracy, communication clarity, and professionalism, with 3 performance levels (e.g., Novice, Competent, Proficient).
- Example: Draft a 3-level rubric (Needs Improvement, Meets Expectations, Exceeds Expectations) for assessing a learner's ability to clearly explain a technical solution to a non-technical user during a simulated support interaction.
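When prompting for question banks like the multiple-choice prompts above, it can help to show the AI the exact output shape you want to review. A minimal sketch of one such shape in Python, with hypothetical question content:

```python
# One multiple-choice item in a reviewable shape: distractors plus
# per-option explanations. The content here is a hypothetical example.
question = {
    "stem": "A Windows PC boots to a black screen after an update. "
            "Which tool lets you uninstall the update without booting the OS?",
    "options": {
        "A": "Windows Recovery Environment (WinRE)",
        "B": "Task Manager",
        "C": "Disk Cleanup",
        "D": "Event Viewer",
    },
    "answer": "A",
    "explanations": {
        "A": "Correct: WinRE can uninstall recent updates from outside the OS.",
        "B": "Task Manager requires a running desktop session.",
        "C": "Disk Cleanup frees disk space; it cannot remove updates.",
        "D": "Event Viewer only inspects logs after the system boots.",
    },
}

assert question["answer"] in question["options"]
```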
Simulation Development
- Template: Generate 3 detailed user personas needing IT support. Include their job role, technical comfort level, personality type (e.g., impatient, friendly, confused), and a specific technical issue related to [Module Topic].
- Example: Create a persona for a busy Marketing Manager who needs help connecting to the office VPN but is not tech-savvy and is currently stressed about a deadline.
- Template: Outline a branching decision path for a simulation where a learner must diagnose [Specific Problem, e.g., 'a printer connectivity issue']. Include potential diagnostic steps, common mistakes, and consequences for incorrect choices.
- Example: Outline a decision tree for a simulation task where a Tier 1 tech must decide whether to resolve a user's software issue directly or escalate it to Tier 2. Include factors like issue complexity, user permissions, and estimated resolution time.
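Before asking the AI for the full branching scenario, it can help to sketch the core decision logic yourself so you can judge what it returns. A minimal sketch of the Tier 1 resolve-or-escalate decision, using the three factors named in the example above; the thresholds are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    """The three factors from the example prompt above."""
    complexity: int           # 1 (simple) to 5 (complex) -- assumed scale
    needs_admin_rights: bool  # does the fix exceed the tech's permissions?
    est_minutes: int          # estimated time to resolve at Tier 1

def decide(ticket: Ticket) -> str:
    """Return 'resolve' or 'escalate'; thresholds are hypothetical."""
    if ticket.needs_admin_rights:
        return "escalate"  # insufficient permissions always force a handoff
    if ticket.complexity >= 4 or ticket.est_minutes > 30:
        return "escalate"  # too complex or too slow for a Tier 1 queue
    return "resolve"

print(decide(Ticket(complexity=2, needs_admin_rights=False, est_minutes=15)))  # resolve
print(decide(Ticket(complexity=3, needs_admin_rights=True, est_minutes=10)))   # escalate
```

Each branch in a sketch like this becomes a decision point in the simulation, with a consequence attached to the wrong choice.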
Prompting Best Practices
- Adapt & Specify: Modify these example prompts with your specific module topic, concepts, and learner context.
- Provide Context: Give the AI background information (e.g., target audience, learning objectives) for more relevant responses.
- Iterate: If the first response isn't perfect, refine your prompt and ask again. Try asking for alternatives or different formats.
- Validate: Always review AI output for accuracy, relevance, and alignment with your instructional goals. Don't trust it blindly.
- Combine: Use AI as a brainstorming partner or draft assistant, then apply your expertise to finalize the content.
The PS ID Companion builds on these AI best practices and is your go-to tool for planning, brainstorming, and generating high-quality instructional prompts tailored to Per Scholas modules. Use it to:
- Scaffold your learning objectives and daily plans
- Get prompt suggestions for labs, scenarios, and assessments
- Overcome creative blocks
- Ensure your prompts are job-aligned and simulation-ready
Quality Review Checklist
- Alignment: Objectives, content, activities, and assessments are aligned.
- Accessibility: Meets WCAG standards; all media accessible.
- Accuracy: Technical content is correct and up-to-date.
- Clarity: Instructions and explanations are clear and unambiguous.
- Completeness: All necessary info, resources, and steps included.
- Engagement: Activities are motivating and relevant.
- Functionality: Links, interactive elements, and labs work as expected.
- Consistency: Follows style guide and module template.