CBME and Assessment
Competency-Based Medical Education

 Competency-based medical education is an outcomes-based approach to the
 design, implementation, assessment and evaluation of a medical education
 program using an organizing framework of competencies.

                 The International CBME Collaborators, 2009
Traditional versus CBME: Start with System Needs




Frenk J. Health professionals for a new century: transforming education
to strengthen health systems in an interdependent world. Lancet. 2010
The Transition to Competency

Structure/Process (fixed length, variable outcome):
 •Knowledge acquisition
 •Single subjective measure
 •Norm-referenced evaluation
 •Evaluation setting removed
 •Emphasis on summative

Competency-Based Education (variable length, defined outcome):
 •Knowledge application
 •Multiple objective measures
 •Criterion-referenced evaluation
 •Evaluation setting: direct observation
 •Emphasis on formative

Carraccio et al., 2002
Miller’s Assessment Pyramid
 •DOES (impact on patient): faculty observation, audits, surveys
 •SHOWS HOW: standardized patients
 •KNOWS HOW: extended matching / CRQ
 •KNOWS: MCQ exam
Training and Safe Patient Care

        Trainee performance* × Appropriate level of supervision**
        must equal safe, effective, patient-centered care

* a function of level of competence in context
** a function of attending competence in context
Educational Program
Variable                    Structure/Process            Competency-based
Driving force: curriculum   Content-knowledge            Outcome-knowledge
                              acquisition                  application
Driving force: process      Teacher                      Learner
Path of learning            Hierarchical                 Non-hierarchical
                              (Teacher→student)            (Teacher↔student)
Responsibility: content     Teacher                      Student and Teacher
Goal of educational         Knowledge acquisition        Knowledge application
  encounter
Typical assessment tool     Single subjective measure    Multiple objective measures
Assessment tool             Proxy                        Authentic (mimics real
                                                           tasks of profession)
Setting for evaluation      Removed (gestalt)            Direct observation
Evaluation                  Norm-referenced              Criterion-referenced
Timing of assessment        Emphasis on summative        Emphasis on formative
Program completion          Fixed time                   Variable time

Carraccio et al., 2002
Assessment “Building Blocks”
 Choice of right outcomes tied to an effective
  curriculum – step 1!!
 Right combination of assessment methods
  and tools
  – MiniCEX, DOPS, Chart stimulated recall (CSR),
    medical record audit
 Effective application of the methods and tools
 Effective processes to produce good
  judgments
Measurement Tools: Criteria
Cees van der Vleuten’s utility index:
 Utility = V × R × A × EI × CE / Context*
  – Where:
      V = validity
      R = reliability
      A = acceptability
      EI = educational impact
      CE = cost effectiveness

    *Context = ∑ Microsystems
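
As a purely illustrative aside (not part of the original slides): because the factors multiply, a tool that scores poorly on any one criterion ends up with low overall utility. A minimal sketch, assuming each factor is scored on a 0–1 scale and context is treated as a simple scalar weight:

```python
# Rough sketch of van der Vleuten's utility index as a product of 0-1
# factor scores scaled by a context weight. The factor names follow the
# slide; the numeric scores below are invented for the example.
def utility(validity, reliability, acceptability,
            educational_impact, cost_effectiveness, context_weight=1.0):
    """Crude utility score: V x R x A x EI x CE / Context."""
    return (validity * reliability * acceptability *
            educational_impact * cost_effectiveness) / context_weight

# A highly valid but poorly accepted tool still scores low,
# because the factors multiply rather than add.
print(utility(0.9, 0.8, 0.3, 0.7, 0.6))  # ~0.09
```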
Criteria for “Good” Assessment¹
     – Validity or Coherence
     – Reproducibility or Consistency
     – Equivalence
     – Feasibility
     – Educational effect
     – Catalytic effect
         • This is the “new” addition – relates to feedback
           that “drives future learning forward.”
     – Acceptability

¹ Ottawa Conference Working Group, 2010
Measurement Model
Donabedian Model (adapted)
 • Structure: the way a training program is set up and the conditions
   under which the program is administered
     • Organization, people, equipment and technology
 • Process: the activities that result from the training program
 • Outcomes: the changes (desired or undesired) in individuals or
   institutions that can be attributed to the training program
Assessment During Training: Components

[Diagram: the trainee, an advisor, and program leaders interact around a
structured portfolio, overseen by a Clinical Competency Committee; the
portfolio feeds the program summative assessment process and then licensing
and certification.]

Clinical Competency Committee
 •Periodic review – professional growth opportunities for all
 •Early warning systems

Structured Portfolio
 •ITE (formative only)
 •Monthly evaluations
 •Mini-CEX
 •Medical record audit/QI project
 •Clinical question log
 •Multisource feedback
 •Trainee contributions (personal portfolio)
    o Research project

Trainee
 •Review portfolio
 •Reflect on contents
 •Contribute to portfolio

Program Leaders
 •Review portfolio periodically and systematically
 •Develop early warning system
 •Encourage reflection and self-assessment

Program Summative Assessment Process

Licensing and Certification
 •Licensure and certification in Qatar
Model For Programmatic Assessment
               (With permission from CPM van der Vleuten)

[Diagram: a timeline in which training activities, assessment activities,
and supporting activities run in parallel and feed a periodic committee
review. Legend: learning task; learning artifact; single assessment
data point; single certification data point for mastery tasks; learner
reflection and planning; social interaction around reflection (supervision);
learning task that is also an assessment task.]
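
To make the model above concrete (an illustration added here, not part of the original slides), programmatic assessment can be read as a portfolio that accumulates many low-stakes data points, with the high-stakes decision reserved for a committee looking at the aggregate. The class names, fields, and threshold below are invented for the sketch:

```python
# Illustrative data structures for programmatic assessment: individual
# data points stay low-stakes; the committee decides from the aggregate.
from dataclasses import dataclass, field

@dataclass
class DataPoint:
    date: str          # e.g. "2014-03-01"
    tool: str          # e.g. "Mini-CEX", "MSF", "chart audit"
    competency: str    # e.g. "Patient care"
    narrative: str     # feedback intended to drive learning forward

@dataclass
class Portfolio:
    trainee: str
    data_points: list = field(default_factory=list)

    def add(self, point: DataPoint) -> None:
        self.data_points.append(point)

    def ready_for_committee(self, minimum_points: int = 10) -> bool:
        # The committee, not any single data point, makes the summative
        # call; the threshold here is an arbitrary placeholder.
        return len(self.data_points) >= minimum_points
```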
Assessment Subsystem
 An assessment subsystem is a group of
  people who work together on a regular basis
  to perform evaluation and provide feedback
  to a population of trainees over a defined
  period of time
 This system has a structure to carry out
  evaluation processes that produce an
  outcome
 The assessment subsystem must ultimately
  produce a valid entrustment judgment
Assessment Subsystem

 This group shares:
  – Educational goals and outcomes
  – Linked assessment and evaluation processes
  – Information about trainee performance
  – A desire to produce a trainee truly competent (at
    a minimum) to enter practice or fellowship at the
    end of training
Assessment Subsystem
 The subsystem must:
  – Involve the trainees in the evaluation structure
    and processes
  – Provide both formative and summative
    evaluation to the trainees
  – Be embedded within, not outside, the overall
    educational system (assessment is not an
    “add-on”)
  – Provide a summative judgment for the
    profession and public
     • Effective Evaluation = Professionalism
Subsystem Components
 Effective Leadership
 Clear communication of goals
  – Both trainees and faculty
 Evaluation of competencies is multi-faceted
 Data and Transparency
  – Involvement of trainees
  – Self-directed assessment and reflection by
    trainees
  – Trainees must have access to their “file”
Subsystem Components

 “Competency” committees
  – Need wisdom and perspectives of the group
 Continuous quality improvement
  – The evaluation program must provide data as
    part of the CQI cycle of the program and
    institution
  – Faculty development
 Supportive Institutional Culture
Multi-faceted Evaluation

[Diagram: an evaluation wheel with the structured portfolio at the hub and
the competencies around the rim – patient care, medical knowledge,
practice-based learning and improvement, interpersonal skills and
communication, professionalism, and systems-based practice – each sampled
by one or more tools:]
 •Medical record audit and QI project
 •Multisource feedback (MSF): directed per protocol, twice/year
 •Mini-CEX: 10/year
 •EBM/question log
 •Faculty evaluations
 •ITE: 1/year

 ■ Trainee-directed   ■ Direct observation
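
Purely as an illustration of how such a blueprint could be written down (not part of the original deck), the wheel can be encoded as a mapping from each tool to its frequency and the competencies it samples. The tool names and stated frequencies follow the slide; the tool-to-competency pairings and the "assumed" frequencies are guesses for the example:

```python
# Hypothetical encoding of the multi-faceted evaluation blueprint.
# Competency mappings and "assumed" frequencies are illustrative only.
assessment_blueprint = {
    "Mini-CEX": {
        "frequency": "10/year",
        "competencies": ["Patient care", "Interpersonal skills and communication"],
    },
    "Multisource feedback (MSF)": {
        "frequency": "twice/year, directed per protocol",
        "competencies": ["Professionalism", "Interpersonal skills and communication"],
    },
    "Medical record audit / QI project": {
        "frequency": "assumed: once per year",
        "competencies": ["Practice-based learning and improvement", "Systems-based practice"],
    },
    "EBM / clinical question log": {
        "frequency": "assumed: ongoing",
        "competencies": ["Practice-based learning and improvement"],
    },
    "In-training exam (ITE)": {
        "frequency": "1/year",
        "competencies": ["Medical knowledge"],
    },
    "Faculty evaluations": {
        "frequency": "assumed: monthly",
        "competencies": ["Patient care", "Professionalism"],
    },
}

def tools_for(competency: str) -> list:
    """List the tools in the blueprint that sample a given competency."""
    return [tool for tool, spec in assessment_blueprint.items()
            if competency in spec["competencies"]]

print(tools_for("Practice-based learning and improvement"))
```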
Assessment During Training: Components

[The same Assessment During Training diagram as above – trainee, advisor,
and program leaders around the structured portfolio, overseen by the
Clinical Competency Committee, feeding the program summative assessment
process and then licensing and certification – here via the USMLE and the
American Boards of Medical Specialties.]
Performance Data
 A training program cannot reach its full
  potential without robust and ongoing
  performance data
  – Aggregation of individual trainee performance
  – Performance measurement of the quality and
    safety of the clinical care provided by the
    training institution and the program
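
As an added, purely illustrative sketch of what "aggregation of individual trainee performance" might look like in practice, the snippet below averages repeated workplace-based ratings per trainee and competency so a committee sees a longitudinal summary rather than isolated scores. The trainees, scale, and ratings are invented:

```python
# Hypothetical aggregation of repeated workplace-based assessment ratings
# per trainee and competency; all data are invented for illustration.
from collections import defaultdict
from statistics import mean

ratings = [
    # (trainee, competency, score on an assumed 1-5 scale)
    ("Trainee A", "Patient care", 3),
    ("Trainee A", "Patient care", 4),
    ("Trainee A", "Medical knowledge", 4),
    ("Trainee B", "Patient care", 2),
    ("Trainee B", "Patient care", 3),
]

grouped = defaultdict(list)
for trainee, competency, score in ratings:
    grouped[(trainee, competency)].append(score)

# A summary a clinical competency committee might review:
for (trainee, competency), scores in sorted(grouped.items()):
    print(f"{trainee} / {competency}: "
          f"mean {mean(scores):.1f} over {len(scores)} observations")
```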
Competency Committees
Assessment During Training: Components

[The Assessment During Training diagram is shown again here, unchanged from
the previous slide.]
Model For Programmatic Assessment
               (With permission from CPM van der Vleuten)

[The programmatic assessment timeline diagram is shown again here, unchanged
from the earlier slide.]
Committees and Information
 Evaluation (“competency”) committees can be
  invaluable
  • Develop group goals
  • “Real-time” faculty development
  • Key for dealing with difficult trainees
 Key “receptor site” for frameworks/milestones
  • Synthesis and integration of multiple assessments
“Wisdom of the Crowd”
 Hemmer (2001) – Group conversations more
  likely to uncover deficiencies in professionalism
  among students
 Schwind, Acad. Med. (2004) –
   • 18% of resident deficiencies requiring
     active remediation only became apparent
     through group discussion.
      • Average discussion 5 minutes/resident
        (range 1 – 30 minutes)
“Wisdom of the Crowd”
 Williams, Teach. Learn. Med. (2005)
   • No evidence that individuals in groups
     dominate discussions.
      • No evidence of ganging up or piling on
 Thomas (2011) – Group assessment
  improved inter-rater reliability and reduced
  range restriction in multiple domains in an
  internal medicine residency
Narratives and Judgments
 Pangaro (1999) – Matching students to a
  “synthetic” descriptive framework (RIME) was
  reliable and valid across multiple clerkships
 Regehr (2007) – Matching students to a
  standardized set of holistic, realistic vignettes
  improved discrimination of student performance
 Regehr (2012) – Faculty-created narrative
  “profiles” (16 in all) produced consistent
  rankings of excellent, competent and problematic
  performance.
The “System”

[Diagram: residents are assessed within the program (direct observations,
audit and performance data, multi-source feedback, simulation, in-training
exam); faculty, program directors and others feed a committee that provides
judgment and synthesis. With program-level aggregation the results flow as
NAS Milestones through the institution and program to accreditation
(ACGME/RRC); without aggregation they flow via ABIM Fastrak to certification
(ABIM). Milestones and EPAs serve as the guiding framework and blueprint.]
Questions


Editor's Notes

  1. So what is the outcome, and what is the framework?
  2. Areas in red are to emphasize that the learner can and must have an active role in the process