MOC Section 3

This section provides guidance for completing the Maintenance of Certification (MOC) Section 3 portion of your accreditation application. Activities submitted under Section 3 must meet the Royal College of Physicians and Surgeons of Canada’s standards for either Self-Assessment Programs (SAPs) or Simulation-Based Activities. These standards ensure that accredited activities are developed by physician organizations, are based on identified learning needs, and include mechanisms for assessment, feedback, and evaluation.

Use the help provided for each question to ensure your responses align with the required administrative, educational, and ethical standards. If you are unsure which standard applies to your activity, or need further clarification, please contact the CPD Office.

180. Describe the key knowledge areas or themes assessed by this self-assessment program

Short Explanation:
This question asks you to clearly identify the primary subject areas, competencies, or clinical themes that the self-assessment program is designed to evaluate. These should align with the learning objectives and reflect the educational needs of the target audience.

Hints for Answering:

  • Refer to the results of your needs assessment to determine the most relevant knowledge areas.
  • Consider including themes such as diagnostic reasoning, treatment planning, procedural skills, or interprofessional collaboration, depending on the focus of your program.
  • Ensure that the knowledge areas are specific and relevant to the intended audience’s scope of practice.

Tips:

  • Use concise, descriptive language to outline each theme.
  • Avoid overly broad or vague terms (e.g., “general medicine”)—be as specific as possible.
  • Ensure consistency between the themes listed here and the learning objectives provided elsewhere in the application.

Example Answer:
The self-assessment program titled "Optimizing Care in Geriatric Medicine" is designed for family physicians and internists who provide care to older adults. The program assesses knowledge in the following key areas:

  1. Polypharmacy and Deprescribing – Identifying potentially inappropriate medications and applying deprescribing guidelines.
  2. Cognitive Impairment – Recognizing early signs of dementia and differentiating between types of cognitive decline.
  3. Falls Prevention – Assessing fall risk and implementing evidence-based prevention strategies.
  4. Advance Care Planning – Facilitating goals-of-care discussions and documenting patient preferences.
  5. Multimorbidity Management – Coordinating care for patients with multiple chronic conditions using a patient-centered approach.

Why this is important:
Clearly defining the knowledge areas ensures that the self-assessment program is purposeful, targeted, and aligned with the Royal College’s educational standards. It also helps reviewers assess whether the program addresses meaningful gaps in knowledge or practice.

Need more help? Email CPD

181. What learning methods were selected to help the CPD program meet the stated learning objectives? Describe the rationale for the selected format (e.g. multiple-choice questions, short answer questions, etc.) to enable participants to review their current knowledge or skills in relation to current scientific evidence

Short Explanation:
This question asks you to describe the instructional methods used in your self-assessment program and explain why these methods are appropriate for achieving the learning objectives. You should also explain how the format allows participants to compare their knowledge or skills against current evidence or best practices.

Hints for Answering:

  • Identify the types of questions or activities used (e.g., multiple-choice, case-based scenarios, short answer).
  • Explain how these formats support self-assessment and reflection.
  • Reference how the format aligns with the learning objectives and the needs of the target audience.
  • Mention any feedback mechanisms that help participants understand their performance.

Tips:

  • Be specific about the structure and delivery of the assessment (e.g., online platform, timed modules, immediate feedback).
  • Highlight how the format encourages active learning and critical thinking.
  • Ensure that the rationale connects the method to both the content and the learner’s ability to assess their own competence.

Example Answer:
The self-assessment program "Optimizing Care in Geriatric Medicine" uses a combination of multiple-choice questions (MCQs) and case-based scenarios to assess participants’ knowledge and clinical reasoning. The selected methods support the learning objectives by:

  1. Multiple-choice questions – These are structured to test factual knowledge and application of clinical guidelines. Each question includes detailed feedback with references to current evidence-based resources.
  2. Case-based scenarios – These simulate real-world clinical situations, requiring participants to apply knowledge in context and make decisions based on patient presentations.
  3. Immediate feedback – After each question, participants receive explanations for correct and incorrect answers, along with links to relevant guidelines and literature.
  4. Reflective prompts – At the end of each module, participants are encouraged to reflect on areas of strength and identify opportunities for further learning.

This format was chosen to promote active engagement, support knowledge retention, and help participants evaluate their clinical decision-making in relation to current best practices.

Why this is important:
Selecting appropriate learning methods ensures that the self-assessment program is educationally effective and aligned with accreditation standards. It also enhances the learner’s ability to identify knowledge gaps and plan future learning based on evidence-informed feedback.

Need more help? Email CPD

182. Describe the process that allows participants to demonstrate or apply knowledge, skills, clinical judgment or attitudes (e.g. through the creation of an answer sheet and scoring or web based assessment tools)

Short Explanation:
This question asks you to describe how your self-assessment program enables participants to actively demonstrate their knowledge, skills, clinical judgment, or attitudes. You must also explain how their responses are captured and how feedback is provided.

Hints for Answering:

  • Describe the format used to collect participant responses (e.g., digital answer sheet, online quiz platform, paper-based form).
  • Explain how responses are scored or evaluated.
  • Indicate how and when participants receive feedback, and what kind of feedback is provided (e.g., correct answers, explanations, references).
  • Ensure that the process supports self-reflection and identifies areas for improvement.

Tips:

  • Include details about any tools or platforms used (e.g., learning management systems, custom-built web tools).
  • If applicable, describe how the process supports formative learning (e.g., feedback loops, reflective prompts).
  • Make sure the documentation you submit matches the description provided here.

Example Answer:
In the "Optimizing Care in Geriatric Medicine" self-assessment program, participants complete a series of case-based multiple-choice questions using a secure online platform. The process includes:

  1. Web-based answer sheet – Participants select their responses directly within the platform, which automatically records and stores their answers.
  2. Automated scoring – Each response is scored in real time, with immediate feedback provided after submission.
  3. Detailed feedback – For each question, participants receive:
    • An explanation of the correct answer
    • A rationale for why other options are incorrect
    • References to current clinical guidelines or literature
  4. Reflective summary – At the end of the module, participants receive a summary of their performance, highlighting strengths and areas for improvement. They are encouraged to document a personal learning plan based on this feedback.

Why this is important:
Providing a structured process for participants to demonstrate and reflect on their knowledge ensures that the self-assessment program meets educational standards. It also supports meaningful learning by helping participants identify gaps and plan future development.

Need more help? Email CPD

183. How will feedback be provided to participants on their performance to enable the identification of any areas requiring improvement through the development of a future learning plan?

Short Explanation:
This question asks you to describe the feedback mechanisms built into your self-assessment program. The goal is to ensure that participants receive meaningful, evidence-based feedback that helps them recognize knowledge gaps and plan future learning.

Hints for Answering:

  • Explain how feedback is delivered (e.g., immediately after each question, at the end of the module, via downloadable report).
  • Describe the type of feedback provided (e.g., correct answers, explanations, references, reflective prompts).
  • Indicate whether participants are encouraged or required to create a learning plan based on their results.
  • Confirm that participants can access the feedback after the program ends so they can continue their learning.

Tips:

  • Feedback should be specific, actionable, and linked to current evidence or guidelines.
  • Consider including tools or prompts that support reflection and planning (e.g., “commitment to change” forms, personal learning plan templates).

Example Answer:
In the "Optimizing Care in Geriatric Medicine" self-assessment program, participants receive structured feedback through the following process:

  1. Immediate feedback – After each question, participants are shown the correct answer, a brief explanation, and references to relevant clinical guidelines or literature.
  2. Performance summary – At the end of the module, participants receive a personalized report summarizing their performance across all knowledge areas.
  3. Reflective prompts – The report includes prompts encouraging participants to reflect on areas where they performed well and those needing improvement.
  4. Learning plan template – Participants are provided with a downloadable template to document their learning goals, strategies for improvement, and a timeline for follow-up.

This approach ensures that feedback is timely, evidence-based, and supports the development of a future learning plan tailored to the participant’s needs.

Why this is important:
Providing detailed and constructive feedback helps participants identify gaps in their knowledge or skills and supports the development of targeted learning strategies. This aligns with the goals of Section 3 activities to promote reflective practice and continuous improvement.

Need more help? Email CPD

184. Does the program provide participants with references justifying the appropriate answer?

Short Explanation:
This question asks whether your self-assessment program includes references that support the correct answers provided to participants. These references should be credible, up-to-date, and relevant to the content being assessed.

Hints for Answering:

  • If your program includes references (e.g., clinical guidelines, peer-reviewed articles, textbooks), select Yes.
  • If references are not provided, select No.

Tips:

  • References should be included in the feedback provided after each question or in a summary document.
  • Use sources that are evidence-based and widely accepted in the medical community.
  • Ensure that references are current and clearly linked to the content of the question.

Example Answer:
Yes. In the "Optimizing Care in Geriatric Medicine" self-assessment program, each question includes references to relevant clinical guidelines and peer-reviewed literature. These references are provided alongside the feedback for each answer to support evidence-based learning and help participants explore topics in greater depth.

Why this is important:
Providing references enhances the credibility of the self-assessment program and supports participants in verifying and expanding their knowledge. It also aligns with the Royal College’s emphasis on evidence-informed continuing professional development. Education Standard 5 states: "Providing specific feedback on which answers were correct and incorrect with references enables specialists to determine if there are important aspects of their knowledge, skills, clinical judgment or attitudes that need to be addressed through engaging in further learning activities."

Need more help? Email CPD

185. Describe how the references are provided to participants

Short Explanation:
This question asks you to explain the method by which participants access the references that support the correct answers in your self-assessment program. These references should be easily accessible and clearly linked to the content being assessed.

Hints for Answering:

  • Indicate whether references are embedded directly in the feedback, listed at the end of the module, or provided in a downloadable format.
  • Describe the format (e.g., hyperlinks, citations, PDF handout) and how participants are guided to use them.
  • Ensure that the references are clearly associated with specific questions or learning points.

Tips:

  • Use a consistent referencing format (e.g., APA, Vancouver) throughout the program.
  • If using hyperlinks, ensure they are functional and lead to reputable sources.
  • Consider including a reference list or bibliography at the end of each module for easy review.
  • Ensure that participants can access the references after completing the program for ongoing learning.

Example Answer:
In the "Optimizing Care in Geriatric Medicine" self-assessment program, references are provided in the following ways:

  1. Inline with feedback – After each question, participants receive a brief explanation of the correct answer, followed by one or more references to relevant clinical guidelines or peer-reviewed articles.
  2. Clickable links – Where possible, references include hyperlinks to open-access sources or institutional library resources.
  3. Reference list – A complete list of all references used in the module is available at the end of the program in a downloadable PDF format, organized by topic for easy navigation.
  4. Post-program access – Participants retain access to the reference list and feedback summaries after completing the program, allowing them to revisit the material and support ongoing learning.

Why this is important:
Providing accessible, evidence-based references supports transparency, reinforces learning, and encourages participants to engage with current clinical literature. Ensuring post-program access further supports reflective practice and continuous professional development.

Need more help? Email CPD

190. How will the overall learning program and each individual module (if applicable) be evaluated by participants?

Short Explanation:
This question asks you to describe how participants will evaluate the simulation-based learning program, including both the overall experience and any individual modules or scenarios.

Hints for Answering:

  • Indicate whether you are using the evaluation form template provided by the CPD Office.
  • Describe any additional evaluation methods you may be using to gather participant feedback.

Tips:

  • The CPD Office’s evaluation form meets accreditation requirements.
  • You may enhance the evaluation process with additional tools (e.g., follow-up surveys, debriefing feedback, or reflective prompts).
  • Consider how you will use the evaluation data for quality improvement.

Example Answer:
The "Acute Care Simulation for Internal Medicine Residents" program uses the standard evaluation form provided by the CPD Office to collect participant feedback. This form meets accreditation requirements and includes questions on:

  1. Achievement of learning objectives
  2. Relevance of the simulation to clinical practice
  3. Quality of facilitation and feedback
  4. Perceived impact on future practice
  5. Identification of bias or commercial influence

In addition to the standard form, participants are invited to provide open-ended comments and suggestions for improvement. Optional follow-up surveys may also be used to assess longer-term impact on clinical performance.

Why this is important:
A structured evaluation process ensures that the simulation-based activity meets educational standards and provides valuable insights for continuous improvement. Using the CPD Office’s approved template ensures compliance with accreditation requirements.

Need more help? Email CPD

191. Describe the key knowledge areas or themes assessed by this simulation program

Short Explanation:
This question asks you to identify the core clinical or professional competencies that the simulation-based activity is designed to assess. These should reflect the learning objectives and be relevant to the target audience’s scope of practice.

Hints for Answering:

  • Refer to the needs assessment and learning objectives to determine the most relevant knowledge areas.
  • Consider including both clinical and non-clinical competencies (e.g., communication, teamwork, crisis resource management).
  • Be specific—avoid overly broad categories like “emergency medicine” or “surgery.”

Tips:

  • Use clear, concise language to list each theme or competency.
  • Ensure alignment with CanMEDS roles or other recognized competency frameworks, if applicable.
  • The themes should reflect what participants are expected to demonstrate or apply during the simulation.

Example Answer:
The "Acute Care Simulation for Internal Medicine Residents" program assesses the following key knowledge areas and competencies:

  1. Recognition and initial management of shock – Identifying types of shock and initiating appropriate resuscitation measures.
  2. Airway management – Performing rapid assessment and intervention in patients with compromised airways.
  3. Team communication and leadership – Applying closed-loop communication and role delegation in high-stress scenarios.
  4. Clinical decision-making under pressure – Prioritizing interventions and adapting to evolving clinical situations.
  5. Ethical decision-making and professionalism – Navigating complex scenarios involving goals of care and patient autonomy.

Why this is important:
Clearly identifying the knowledge areas ensures that the simulation is purposeful, targeted, and aligned with accreditation standards. It also helps reviewers assess whether the activity addresses meaningful gaps in practice and supports the development of essential competencies.

Need more help? Email CPD

192. What simulation methods were selected to enable participants to demonstrate their abilities, skills, clinical judgment or attitudes? e.g. Role playing, standardized patients, theatre-based simulation, task trainers, virtual patients etc.

Short Explanation:
This question asks you to describe the simulation modalities used in your program and explain how they allow participants to demonstrate key competencies. The methods should be appropriate for the learning objectives and the clinical or professional skills being assessed.

Hints for Answering:

  • Identify the specific simulation methods used (e.g., high-fidelity mannequins, standardized patients, virtual simulations).
  • Explain how each method supports the demonstration of knowledge, skills, judgment, or attitudes.
  • Consider how realism, interactivity, and feedback are integrated into the simulation experience.

Tips:

  • Choose methods that align with the complexity of the scenarios and the competencies being assessed.
  • If multiple methods are used, describe how they complement each other.
  • Ensure that the selected methods allow for observation, assessment, and feedback.

Example Answer:
The "Acute Care Simulation for Internal Medicine Residents" program uses a combination of simulation methods to support a comprehensive learning experience:

  1. High-fidelity mannequins – Used for scenarios involving cardiac arrest, respiratory failure, and shock management. These mannequins allow participants to perform real-time assessments and interventions in a realistic environment.
  2. Standardized patients – Employed in scenarios involving communication challenges, such as goals-of-care discussions and delivering difficult news. These interactions assess empathy, professionalism, and communication skills.
  3. Task trainers – Used for procedural skills such as central line insertion and airway management, allowing for hands-on practice and assessment.
  4. Facilitated debriefing – Each simulation is followed by a structured debrief that reinforces learning objectives and encourages reflection on clinical judgment and team dynamics.

Why this is important:
Selecting appropriate simulation methods ensures that participants can meaningfully engage with the content and demonstrate the competencies being targeted. It also supports the validity and educational impact of the simulation-based activity.

Need more help? Email CPD

193. How will learners participate in the simulation?

Short Explanation:
This question asks you to describe how participants will engage with the simulation-based activity. This includes the structure of their involvement, the roles they will play, and how they will interact with the simulation environment.

Hints for Answering:

  • Indicate whether learners will participate individually, in teams, or both.
  • Describe the roles they will assume (e.g., primary clinician, observer, team leader).
  • Explain how learners will interact with the simulation (e.g., hands-on procedures, decision-making, communication tasks).

Tips:

  • Be clear about the level of active participation expected.
  • If observers are included, describe how they are engaged (e.g., through observation checklists, participation in debriefing).
  • Ensure that the participation model supports the learning objectives and allows for assessment and feedback.

Example Answer:
In the "Acute Care Simulation for Internal Medicine Residents" program, learners participate in small interprofessional teams of 3–5 individuals. Each participant rotates through active roles such as:

  1. Team leader – Responsible for clinical decision-making and directing team actions.
  2. Primary responder – Performs hands-on assessments and interventions (e.g., airway management, IV access).
  3. Communicator – Interfaces with standardized patients or family members to gather history and convey information.

All participants are actively involved in the simulation scenarios and are observed by faculty using structured assessment tools. Observers also participate in the debriefing sessions, where they reflect on team dynamics, communication, and clinical performance.

Why this is important:
Clearly outlining how learners will participate ensures that the simulation is interactive, learner-centered, and aligned with the intended competencies. It also supports meaningful feedback and reflective learning.

Need more help? Email CPD

194. How will learners provide responses to online simulation? (e.g. Through an online response sheet or web based assessment tools)

Short Explanation:
This question asks you to describe the method by which participants will submit their responses during an online simulation-based activity. The process should allow learners to demonstrate their knowledge, skills, or decision-making in a structured and trackable way.

Hints for Answering:

  • Identify the platform or tool used to collect responses (e.g., learning management system, custom web form, simulation software).
  • Describe how responses are recorded (e.g., typed answers, multiple-choice selections, scenario-based decision trees).
  • Indicate whether the tool allows for feedback and performance tracking.

Tips:

  • Ensure the response method is user-friendly and accessible.
  • If using a custom or third-party tool, briefly describe its features.
  • Make sure the method supports both formative assessment and feedback delivery.

Example Answer:
In the "Virtual Emergency Response Simulation" program, learners provide responses through a secure web-based simulation platform. The process includes:

  1. Interactive decision points – Participants are presented with branching scenarios where they must select clinical actions or provide short written responses.
  2. Online response sheet – Each decision is recorded in real time using an embedded response form within the simulation interface.
  3. Automated tracking – The platform logs all responses and timestamps, allowing facilitators to review decision-making pathways.
  4. Integrated feedback – After each scenario, participants receive immediate feedback on their choices, including explanations and references.

Why this is important:
A clear and structured response process ensures that learners can actively engage with the simulation, demonstrate their competencies, and receive meaningful feedback. It also supports documentation and evaluation for accreditation purposes.

Need more help? Email CPD

195. How will learners receive feedback after the completion of an online simulation?

Short Explanation:
This question asks you to describe the method and timing of feedback delivery to participants after they complete an online simulation. Feedback should help learners understand their performance and identify areas for improvement.

Hints for Answering:

  • Indicate whether feedback is provided immediately, after a delay, or during a scheduled debrief.
  • Describe the format of the feedback (e.g., automated reports, annotated responses, facilitator comments).
  • Explain how the feedback supports reflection and future learning.

Tips:

  • Feedback should be specific, evidence-based, and linked to the learning objectives.
  • Consider including references, performance summaries, or reflective prompts.
  • Ensure that the feedback method is accessible and easy to interpret.

Example Answer:
In the "Virtual Emergency Response Simulation" program, learners receive feedback immediately after completing each scenario. The feedback process includes:

  1. Automated performance summary – Participants receive a detailed report outlining their decisions, correct and incorrect responses, and the rationale behind each answer.
  2. Evidence-based references – Each feedback item includes links to relevant clinical guidelines or literature to support further learning.
  3. Reflective prompts – The feedback report includes questions encouraging learners to reflect on their performance and identify areas for improvement.
  4. Downloadable feedback – Participants can download their feedback report for future reference and use it to inform their personal learning plan.

Why this is important:
Timely, structured feedback helps learners understand their strengths and areas for growth. It also supports reflective practice and aligns with the Royal College’s standards for simulation-based CPD activities.

Need more help? Email CPD

196. How will learners receive feedback (debrief) after the completion of a live simulation?

Short Explanation:
This question asks you to describe the debriefing process that follows live simulation activities. Debriefing is a critical component of simulation-based learning, allowing participants to reflect on their performance, receive feedback, and consolidate learning.

Hints for Answering:

  • Describe when and how the debriefing occurs (e.g., immediately after the scenario, in small groups, one-on-one).
  • Indicate who facilitates the debrief (e.g., trained faculty, simulation educators).
  • Explain the structure or model used if known (e.g., advocacy-inquiry, plus-delta, PEARLS).

Tips:

  • Ensure the debriefing is structured, learner-centered, and psychologically safe.
  • Include opportunities for participants to reflect on clinical decisions, communication, teamwork, and emotional responses.
  • Consider using tools or prompts to guide reflection and support future learning.
  • Engage observers in the debriefing—even if they did not actively participate in the simulation, they can offer valuable insights and learn from the discussion.
  • Learners must be engaged in the debriefing process to receive credit for the full duration of the activity, including periods when they are not actively participating in the simulation.

Example Answer:
In the "Acute Care Simulation for Internal Medicine Residents" program, a structured debriefing session is conducted immediately following each live simulation scenario. The debriefing process includes:

  1. Facilitated discussion – Led by trained faculty using the PEARLS framework (Promoting Excellence and Reflective Learning in Simulation), which combines learner self-assessment, focused facilitation, and directive feedback.
  2. Team-based reflection – Participants discuss what went well, what could be improved, and how team dynamics influenced outcomes.
  3. Clinical reasoning review – Facilitators guide learners through key decision points, highlighting evidence-based practices and alternative approaches.
  4. Emotional processing – Space is provided for participants to express emotional reactions and discuss the human factors involved in the scenario.
  5. Observer engagement – Observers are invited to share their perspectives and contribute to the discussion, enhancing the learning experience for all.
  6. Action planning – Learners are encouraged to identify takeaways and set goals for future practice.

Why this is important:
Debriefing transforms the simulation experience into meaningful learning by helping participants reflect on their actions, receive feedback, and plan for improvement. Including observers in the debriefing enriches the discussion and reinforces learning across the group. It also ensures that all participants, regardless of their role in the simulation, are eligible to receive credit for the full activity.

Need more help? Email CPD

197. How will feedback (debrief) be provided to learners on their performance to enable the identification of any areas requiring improvement through the development of a future learning plan?

Short Explanation:
This question asks you to describe how the debriefing process will help learners understand their performance and identify areas for improvement. The goal is to support reflective practice and the development of a personalized learning plan.

Hints for Answering:

  • Explain how feedback during the debrief is linked to the learning objectives and performance expectations.
  • Describe how facilitators guide learners to recognize strengths and areas for growth.
  • Indicate whether tools or prompts are used to help learners document their reflections or create a learning plan.

Tips:

  • Use a structured debriefing model (e.g., PEARLS, advocacy-inquiry) to ensure consistency and depth.
  • Encourage learners to reflect on both clinical and non-clinical aspects of their performance (e.g., communication, teamwork).
  • Consider providing a written feedback report—this can enhance retention, support future learning, and serve as a reference for the learner’s ongoing development.
  • Provide tools such as a learning plan template or commitment to change form to help learners translate feedback into action.
  • Ensure that all participants, including observers, are engaged in the debrief and have the opportunity to reflect on their learning.

Example Answer:
In the "Acute Care Simulation for Internal Medicine Residents" program, feedback is provided through a structured debriefing session immediately following each scenario. The process includes:

  1. Facilitated reflection – Faculty guide learners through a discussion of their clinical decisions, communication, and teamwork using the PEARLS framework.
  2. Performance-based feedback – Facilitators provide targeted feedback based on observed behaviors and alignment with learning objectives.
  3. Self-assessment and peer input – Learners are encouraged to assess their own performance and receive input from peers and observers.
  4. Learning plan development – Participants are provided with a template to document areas for improvement, strategies for further learning, and goals for future practice.
  5. Optional written report – Learners may receive a brief written summary of their performance and feedback to support reflection and future planning.

Why this is important:
Effective debriefing helps learners internalize feedback, recognize gaps in their performance, and take ownership of their professional development. Providing a written summary can reinforce learning and support the creation of a meaningful future learning plan.

Need more help? Email CPD

198. How will the simulation program be evaluated by the learners?

Short Explanation:
This question asks you to describe the process by which learners will evaluate the simulation-based program. The evaluation should capture feedback on the program’s effectiveness, relevance, and educational value.

Hints for Answering:

  • Indicate whether you are using the CPD Office’s standard evaluation template.
  • Describe any additional evaluation tools or methods you may be using.
  • Explain what aspects of the program learners are asked to evaluate (e.g., achievement of learning objectives, realism of scenarios, quality of facilitation).

Tips:

  • The CPD Office strongly recommends using its standard evaluation template, which includes all elements required for Royal College accreditation.
  • You may supplement the standard form with additional questions or follow-up surveys.
  • Ensure the evaluation process includes both quantitative (e.g., rating scales) and qualitative (e.g., open-ended comments) components.
  • Consider how you will use the evaluation data to improve future iterations of the program.

Example Answer:
The "Acute Care Simulation for Internal Medicine Residents" program uses the CPD Office’s standard evaluation template to gather feedback from participants. This form includes questions on:

  1. Achievement of stated learning objectives
  2. Quality of facilitation and feedback
  3. Perceived impact on future practice
  4. Identification of any commercial bias

In addition to the standard form, participants are invited to provide open-ended comments and suggestions for improvement. The feedback is reviewed by the planning committee and used to inform future program development.

Why this is important:
Learner evaluations provide essential insights into the effectiveness and quality of the simulation program. Using the CPD Office’s approved template ensures compliance with accreditation standards and supports continuous quality improvement.

Need more help? Email CPD

199. Will the program use any evaluation strategies, other than self-report, to assess the degree to which the intended outcomes were achieved?

Short Explanation:
This question asks whether your simulation program includes any evaluation methods beyond participant self-report to assess whether learning objectives and intended outcomes were met.

Hints for Answering:

  • If you are using only self-report methods (e.g., post-activity surveys), select No.
  • If you are using additional strategies (e.g., facilitator observation, pre/post-tests), select Yes and describe them in Question 200.
  • Consider how these strategies provide objective or structured evidence of learning.

Tips:

  • The use of multiple evaluation methods to assess learning outcomes is encouraged.
  • Using strategies beyond self-report can strengthen the educational impact and credibility of your program.
  • These strategies can also inform future improvements and support accreditation documentation.

Examples of Evaluation Strategies Other Than Self-Report:

  1. Pre- and post-tests – Assess knowledge or skills before and after the simulation to measure learning gains.
  2. Facilitator observation and feedback – Faculty use structured tools to observe and assess learner performance during the simulation.
  3. In-practice observations – Learners are observed applying skills in their clinical environment following the simulation.
  4. Chart reviews – Patient charts are reviewed to assess changes in clinical practice or documentation habits following the simulation.
  5. Peer assessment – Participants provide structured feedback to one another based on observed behaviors.
  6. Video review and scoring – Simulations are recorded and reviewed for detailed performance analysis.

Example Answer:
Yes. In the "Acute Care Simulation for Internal Medicine Residents" program, faculty use structured observation checklists to assess learner performance during each scenario. These checklists are aligned with the learning objectives and include criteria for clinical decision-making, communication, and teamwork. In addition, participants complete a pre- and post-simulation quiz to assess knowledge acquisition. Where feasible, follow-up in-practice observations and chart reviews are conducted to evaluate the transfer of learning into clinical settings.

Why this is important:
Incorporating evaluation strategies beyond self-report enhances the validity of your program’s outcomes assessment. It also supports continuous improvement and demonstrates a commitment to high-quality, evidence-informed education.

Need more help? Email CPD

200. Please describe the evaluation method

Short Explanation:
This question asks you to provide a detailed description of the evaluation strategy or strategies your program will use to assess whether the intended learning outcomes were achieved—especially if you answered “Yes” to Question 199.

Hints for Answering:

  • Describe the specific tools, processes, or instruments used to evaluate learner performance.
  • Indicate how the evaluation aligns with the learning objectives and competencies targeted by the simulation.
  • If multiple methods are used (e.g., observation, chart review, pre/post-tests), explain how they work together.

Tips:

  • Be clear about who conducts the evaluation (e.g., faculty, peers, external reviewers).
  • Describe how data will be collected, analyzed, and used for feedback or program improvement.
  • If applicable, mention how the evaluation supports the development of a future learning plan.

Example Answer:
In the "Acute Care Simulation for Internal Medicine Residents" program, a multi-method evaluation approach is used to assess learner outcomes:

  1. Structured observation checklists – Faculty use validated checklists during each simulation to assess clinical decision-making, communication, and teamwork.
  2. Pre- and post-tests – Participants complete a short quiz before and after the simulation to measure knowledge acquisition.
  3. In-practice observation – Where feasible, faculty observe learners applying simulation-based skills in their clinical environment within 4–6 weeks of the session.
  4. Chart reviews – Patient charts are reviewed to assess whether simulation-based learning has influenced documentation or clinical decision-making.
  5. Data analysis – Results from all evaluation tools are compiled and reviewed by the planning committee to identify trends, strengths, and areas for improvement.

Why this is important:
A well-defined evaluation method provides objective evidence of learning and supports continuous improvement of the simulation program. It also helps ensure that the activity meets accreditation standards and delivers meaningful educational outcomes.

Need more help? Email CPD

Submitter Comment

Short Explanation:
This is an open text box where you can share any additional information, clarifications, or concerns related to your application. It’s your opportunity to communicate directly with the application reviewer.

Hints for Answering:

  • Use this space to:
    • Clarify any responses you feel may need context
    • Note any pending details or documents
    • Ask questions or flag concerns about the application process
    • Provide background information that may help reviewers understand your program

Tips:

  • Be clear and concise—this field is reviewed by the CPD Office and can help prevent delays or follow-up emails.
  • If you’re unsure where to include a specific detail elsewhere in the application, this is a good place to mention it.

Need more help? Email CPD

CPD Reviewer Comment

Short Explanation:
This section is reserved for internal use by the CPD reviewer. It allows the reviewer to provide feedback, suggestions, or required revisions based on their evaluation of the application.

Hints for Understanding:

  • Applicants cannot edit this field directly.
  • Reviewers may use this space to highlight missing information, clarification requests, or recommendations for improvement.

Tips:

  • Although you cannot change the content of this section, you should carefully review and implement any suggested revisions.
  • Addressing reviewer comments thoroughly and promptly can help expedite approval and ensure your application meets accreditation standards.
  • If you are unsure how to respond to a comment, contact the CPD Office for guidance.

Why this is important:
Reviewer comments provide valuable guidance to help align your application with accreditation requirements. Responding to this feedback demonstrates a commitment to quality improvement and increases the likelihood of your activity being approved.

Need more help? Email CPD