New York College of Osteopathic Medicine
Learning Outcomes Assessment 2009-2010
January 2009
Taskforce Members
John R. McCarthy, Ed.D.
Pelham Mead, Ed.D.
Mary Ann Achziger, M.S.
Felicia Bruno, M.A.
Claire Bryant, Ph.D.
Leonard Goldstein, D.D.S., Ph.D.
Abraham Jeger, Ph.D.
Rodika Zaika, M.S.
Ron Portanova, Ph.D.
Table of Contents
OVERVIEW 4
I. Introduction and Rationale 5
II. Purpose and Design 9
III. Specifics of the Plan 11
Mission of NYCOM 11
Learning Outcomes 11
Compiling the Data 17
Stakeholders 17
IV. Plan Implementation 18
Next Steps 18
V. Conclusion 20
A. OUTCOME INDICATORS – DETAIL 24
1. Pre-matriculation data 24
Forms 26
2. Academic (pre-clinical) course-work 47
Forms – LDB / DPC Track 49
Forms – Institute for Clinical Competence (ICC) 55
3. Clinical Clerkship Evaluations / NBOME Subject Exams 86
Forms 88
4. Student feedback (assessment) of courses/clinical clerkships and PDA project 92
Forms 94
5. COMLEX USA Level I, Level II CE & PE,
Level III data (NBOME) 120
6. Residency match rates and overall placement rate 121
7. Feedback from (AACOM) Graduation Questionnaire 122
Forms 123
8. Completion rates (post-doctoral programs) 142
9. Specialty certification and licensure 143
10. Career choices and geographic practice location 144
11. Alumni Survey 145
Forms 146
B. BENCHMARKS 151
Bibliography 152
Appendices: 153
Chart 1 Proposed Curriculum and Faculty Assessment Timeline
Institute for Clinical Competence:
Neurological Exam – Student Version Parts I & II
Taskforce Members
List of Tables and Figures
Figure 1 Cycle of Assessment 9
Figure 2 Outcome Assessment along the Continuum 15
Figure 3 Data Collection Phases 22
Table 1 Assessment Plan Guide 23
New York College of Osteopathic Medicine
Learning Outcomes Assessment Plan
February 2009
Overview
This document was developed by the NYCOM Task Force on Learning Outcomes
Assessment and was accepted by the dean in January 2009. Although a few of the assessment
tools and processes described in the document are new, most have been employed at NYCOM
since its inception to inform curriculum design and implementation and to gauge progress and
success in meeting the institution’s mission, goals and objectives.
The Learning Outcomes Assessment Plan documents the processes and measures used by
the institution to gauge student achievement and program (curricular) effectiveness. The results
of these activities are used by faculty to devise ways to improve student learning and by
administrators and other stakeholder groups to assess institutional effectiveness and inform
planning, decision-making, and resource allocation.
Certain of the measures described in later sections of this document constitute key
performance indicators for the institution, for which numerical goals have been set. Performance
on these measures has a significant effect on institutional planning and decision-making
regarding areas of investment and growth, program improvement, and policy.
Key performance indicators and benchmarks are summarized below and also on page 151.
Indicator: Benchmark
Number of Applicants: Maintain relative standing among Osteopathic Medical Colleges
Admissions Profile: Maintain or improve current admissions profile based on academic criteria (MCAT, GPA, colleges attended)
Attrition: 3% or less
Remediation rate (preclinical): 2% reduction per year
COMLEX USA scores (first-time pass rates, mean scores): Top quartile
Students entering OGME: Maintain or improve OGME placement
Graduates entering Primary Care careers: Maintain or improve Primary Care placement
Career characteristics (licensure, board certification, geographic practice, and scholarly achievements): TBD
I. Introduction and Rationale
At NYCOM we believe it is our societal responsibility to monitor the quality of our
students' education through continual assessment of educational outcomes. On-going program
evaluation requires longitudinal study (repeated observations over time) and the use of empirical
data gathered through a scientific methodology.
At Thomas Jefferson University, an innovative study was implemented circa 1970, which
was ultimately titled “Jefferson Longitudinal Study of Medical Education”.1 As a result of
implementation of this longitudinal study plan, Thomas Jefferson University was praised by the
Accreditation Team for the Middle States Commission on Higher Education for “…their
academic interest in outcome data, responsiveness to faculty and department needs and the clear
use of data to modify the curriculum and teaching environment….their use of this data has
impacted many components of the curriculum, the learning environment, individual student
development, and program planning…” (TJU, 2005).
1 Center for Research in Medical Education and Health Care: Jefferson Longitudinal Study of Medical Education, Thomas Jefferson University, 2005.
The Jefferson Longitudinal Study of Medical Education has been the most productive
longitudinal study of medical students and graduates of a single medical school. This study has
resulted in 155 publications in peer-reviewed journals. Many were presented before national or
international professional meetings prior to their publication (TJU, 2005).
According to Hernon and Dugan (2004), the pressure on higher education institutions to
prove accountability has moved beyond the acceptance of and reliance on self-reports and anecdotal
evidence compiled during the self-regulatory accreditation process. It now encompasses an
increasing demand from a variety of constituencies to demonstrate institutional effectiveness by
focusing on quality measures, such as educational quality, and on cost efficiencies.
Accountability focuses on results as institutions quantify or provide evidence that they are
meeting their stated mission, goals, and objectives. Institutional effectiveness is concerned, in
part, with measuring (Hernon and Dugan, 2004):
Programmatic outcomes: such as applicant pool, retention rates, and graduation rates.
Such outcomes are institution-based and may be used to compare internal year-to-year
institutional performance and as comparative measures with other institutions.
Student learning outcomes: oftentimes referred to as educational quality and concerned
with attributes and abilities, both cognitive and affective, which reflect how student
experiences at the institution supported their development as individuals. Students are
expected to demonstrate acquisition of specific knowledge and skills.
At NYCOM, we recognize that our effectiveness as an institution must ultimately be
assessed and expressed by evaluating our success in achieving our Mission in relation to the
following Outcomes:
1. Student Learning / Program Effectiveness
2. Research and Scholarly Output
3. Clinical Services
The present document focuses on #1, above, viz., Student Learning / Program Effectiveness.
That is, it is intended only as a Learning Outcomes Assessment Plan. At the same time, we are
cognizant that Institutional Effectiveness/Outcomes derive from numerous inputs, or “means” to
these “ends,” including:
1. Finances
2. Faculty Resources
3. Administrative Resources
4. Student Support Services
5. Clinical Facilities and Resources
6. Characteristics of the Physical Plant
7. Information Technology Resources
8. Library Resources
We believe it is our obligation to continually assess the impact of any changes in the inputs,
processes, and outputs of this institution.
The evaluation approach in this Assessment Plan provides for on-going data collection
and analysis targeted specifically at assessing outcomes of student achievement and program
effectiveness (educational quality). Assessment of achievement and program effectiveness is
based on objective, quantifiable information (data).
Because the NYCOM Learning Outcomes Assessment Plan operates on a continual
assessment cycle, its report is available, with scheduled updates, as a resource in the
decision-making process.
The report provides outcomes data, recommendations, and suggestions intended to inform key
policy makers and stakeholders2 of areas of growth and/or improvement, together with proposed
changes to policy that strengthen both overall assessment and data-driven efforts to improve
student learning.
2 NYCOM Administration, academic committees, faculty, potential researchers, and students.
II. Purpose and Design
Well-designed plans for assessing student learning outcomes link learning outcomes,
measures, data analysis, and action planning in a continuous cycle of improvement illustrated
below.
Figure 1 Cycle of Assessment
Ten principles guide the specifics of NYCOM’s Learning Outcomes Assessment Plan:
1. The plan provides formative and summative assessment of student learning.3
2. The primary purpose for assessing outcomes is to improve student learning.
3. Developing and revising an assessment plan is a long-term, dynamic, and collaborative
process.
4. Assessments use the most reliable and valid instruments available.
3 Examples of the former include post-course roundtable discussions, Institute for Clinical Competence (ICC)
seminars, and data from the Course/Faculty Assessment Program. Examples of the latter include the AACOM
Graduation Questionnaire, COMLEX scores, NBOME subject exam scores, and clerkship evaluations.
[Figure 1 depicts a continuous four-step loop, starting with: define intended learning outcomes; identify methods of measuring outcomes; collect data; review results and use them to make decisions regarding program improvement; then repeat.]
5. Assessment priorities are grounded in NYCOM’s mission, goals, and learning outcomes.
6. The assessment involves a multi-method approach.
7. Assessment of student learning is separate from evaluation of faculty.
8. The primary benefit of assessment is the provision of evidence-based analysis to inform
decision-making concerning program revision and improvement and resource allocation.
9. The assessment plan must provide a substantive and sustainable mechanism for fulfilling
NYCOM’s responsibility to ensure the quality, rigor, and overall effectiveness of our
programs in educating competent and compassionate physicians.
10. The assessment plan yields valid measures of student outcomes that provide stakeholders
with relevant and timely data to make informed decisions on changes in curricular design,
implementation, program planning, and the overall learning environment.
Outcomes assessment is a continuous process of measuring institutional effectiveness,
focused on planning for, determining, understanding, and improving student learning. At
NYCOM, we are mindful that an integral component of this assessment plan is to ensure that the
plan and the reporting process measure what they are intended to measure (student achievement
and program effectiveness).
III. Specifics of the Plan
The NYCOM assessment plan articulates eleven student learning outcomes, which are
linked to both the institutional mission and the osteopathic core competencies.
Mission of NYCOM
The New York College of Osteopathic Medicine of the New York Institute of
Technology is committed to training osteopathic physicians for a lifetime of learning and
practice, based upon the integration of evidence-based knowledge, critical thinking and the tenets
of osteopathic principles and practice. The college is also committed to preparing osteopathic
physicians for careers in primary care, including health care in the inner city and rural
communities, as well as to the scholarly pursuit of new knowledge concerning health and
disease. NYCOM provides a continuum of educational experiences to its students, extending
through the clinical and post-graduate years of training. This continuum provides the future
osteopathic physician with the foundation necessary to maintain competence and compassion, as
well as the ability to better serve society through research, teaching, and leadership.
Learning Outcomes
The following eleven (11) Learning Outcomes that guide this plan stem from NYCOM’s mission
(above) and the osteopathic core competencies:
1. The Osteopathic Philosophy: Upon graduation, a student must possess the ability to
demonstrate the basic knowledge of Osteopathic philosophy and practice, as well as
Osteopathic Manipulative Treatment.
2. Medical Knowledge: A student must possess the ability to demonstrate medical
knowledge through passing of course tests, standardized tests of the NBOME, post-
course rotation tests, research activities, presentations, and participation in directed
reading programs and/or journal clubs, and/or other evidence-based medicine activities.
3. Practice-based learning and improvement: Students must demonstrate their ability to
critically evaluate their methods of clinical practice, integrate evidence-based medicine
into patient care, show an understanding of research methods, and improve patient care
practices
4. Professionalism: Students must demonstrate knowledge of professional, ethical, legal,
practice management, and public health issues applicable to medical practice.
5. Systems-based practice: Students must demonstrate an understanding of health care
delivery systems, provide effective patient care and practice cost-effective medicine
within the system.
6. Patient Care: Students must demonstrate the ability to effectively treat patients and
provide medical care which incorporates the osteopathic philosophy, empathy, preventive
medicine education, and health promotion.
7. Communication skills: Students must demonstrate interpersonal and communication
skills with patients and other healthcare professionals, which enable them to establish and
maintain professional relationships with patients, families, and other healthcare providers.
8. Primary Care: Students will be prepared for careers in primary care, including health care
in the inner city, as well as rural communities.
9. Scholarly/Research Activities: Students will be prepared for the scholarly pursuit of new
knowledge concerning health and disease. Students in NYCOM’s 5-year Academic
Medicine Scholars Program will be prepared as academic physicians in order to address
this nation’s projected health care provider shortage and the resulting expansion of
medical school training facilities.
10. Global Medicine and Health policy: Students will be prepared to engage in global health
practice, policy, and the development of solutions to the world’s vital health problems.
11. Cultural Competence: Students will be prepared to deliver the highest quality medical
care, with the highest degree of compassion, understanding, and empathy toward cultural
differences in our global society.
The NYCOM assessment plan provides for analysis of learning outcomes for two
curricular tracks and four categories of students.
NYCOM has historically tracked student data across the curriculum, paying particular
attention to cohorts of students (see below), as well as NYCOM’s two curricular tracks:
a) Lecture-Discussion Based track: integrates the biomedical and clinical sciences along
continuous didactic ‘threads’ delivered according to a systems-based approach;
b) Doctor Patient Continuum track: a problem-based curriculum whose cornerstone is
small-group, case-based learning.
Current data gathering incorporates tracking outcomes associated with several subcategories of
student (important to the institution) within the 4-year pre-doctoral curriculum and the 5-year
pre-doctoral Academic Medicine Scholars curriculum. The pre-doctoral populations are defined
according to the following subcategories:
Traditional:4
BS/DO: The BS/DO program is a combined baccalaureate/doctor of osteopathic
medicine program requiring successful completion of a total of 7 years (undergraduate, 3
years; osteopathic medical school, 4 years).
MedPrep: A pre-matriculation program offering academic enrichment to facilitate the
acceptance of underrepresented minority and economically disadvantaged student
applicants.5
4 All other students not inclusive of BS/DO, MedPrep, and EPP defined cohorts.
5 The program is funded by the New York State Collegiate Science and Technology Entry Program and the
NYCOM Office of Equity and Opportunity Programs.
EPP (Émigré Physician Program): A 4-year program, offered by NYCOM, to educate
émigré physicians to become DOs to enable them to continue their professional careers in
the U.S.
The NYCOM assessment plan includes data from four phases of the medical education
continuum (as illustrated in Figure 2 and Figure 3): pre-matriculation, the four-year pre-doctoral
curriculum6, post-graduation data, and careers and practice data.
Within the NYCOM Learning Outcome Assessment Plan, the Task Force has chosen the
following outcome indicators for assessment of program effectiveness at different points in the
medical education continuum:
Pre-matriculation data, including first-year student survey;
Academic (pre-clinical) course-work (scores on exams, etc.) – attrition rate;
Clinical Clerkship Evaluations (3rd/4th year) and NBOME Subject Exams;
Student feedback (assessment) of courses and 3rd and 4th year clinical clerkships and
PDA-based Patient and Educational Activity Tracking;
COMLEX USA Level I, Level II CE & PE, and Level III data, including:
o First-time and overall pass rates and mean scores;
o Comparison of NYCOM first time and overall pass rates and mean scores to
national rankings;
Residency match rate and placement rate (AOA / NRMP);
Feedback from AACOM Graduation Questionnaire;
Completion rates of Post-Doctoral programs;
Specialty certification and licensure;
Career choices (practice type: academic, research, etc.);
Geographic practice locations;
Alumni survey.
The Outcome Indicators – Detail sections of this plan (pages 24-150) show the various
data sources and include copies of the forms or survey questionnaires utilized in the
data-gathering process.
The NYCOM assessment plan identifies specific sources of data for each phase.
Figure 2 illustrates which of the above measures are most relevant at each phase of the medical
education continuum.
6 And the five-year pre-doctoral Academic Medicine Scholars program
[Figure 2, Outcome Assessment along the Continuum, appears here.]
The NYCOM assessment plan describes the collection and reporting of data,
responsibilities for analysis and dissemination, and the linkage to continuous program
improvement and institutional planning.
Compiling the Data
Discussions with departmental leaders and deans confirmed that data gathering occurs at
various levels throughout the institution. Development of a central repository (centralized
database) facilitates data gathering, data mining and overall efficiency as it relates to data
analysis, report generation, and report dissemination. This includes utilization of internal
databases (internal to NYCOM) as well as interfacing with external organizations’ databases,
including the AOA (American Osteopathic Association), AACOM (American Association of
Colleges of Osteopathic Medicine), AMA (American Medical Association), and the ABMS
(American Board of Medical Specialties).
Stakeholders
Information from the data collection serves to inform NYCOM administration, relevant
faculty, appropriate research and academic/administrative committees, including the following:
Curriculum Committee
Student Progress Committee
Admissions Committee
Deans and Chairs Committee
Clinical and Basic Science Chairs
Research Advisory Group
Academic Senate
The NYCOM assessment plan sets forth benchmarks, goals, and standards of performance.
The major elements of the plan are summarized in Table 1: Assessment Plan Guide:
Learning Outcomes/Metrics/Benchmarks found at the end of this chapter.
IV. Plan Implementation
As discussed earlier, most of the assessment tools and processes described in the
document have been employed at NYCOM since its inception to inform curriculum design and
implementation and to gauge progress and success in meeting the institution’s mission, goals and
objectives. Beginning in fall 2008, however, assessment efforts have been made more
systematic; policies, procedures, and accountabilities are now documented and more widely
disseminated.
The Office of Program Evaluation and Assessment (OPEA), reporting to the Associate
Dean for Academic Affairs, is responsible for directing all aspects of plan refinement and
implementation.
Next steps
1. Develop a shared, central repository for pre-matriculation, pre-doctoral, and post-graduate
data (see Figure 3). Time Frame: Academic Year 2010-2011
Centralized database: Development of a (shared or central) repository
(database) utilized by internal departments of NYCOM. WEAVEonline is
a web-based assessment system, utilized by numerous academic
institutions across the country, for assessment and planning purposes.
Utilizing this program facilitates centralization of data. The central
database is comprised of student data categorized as follows:
Pre-matriculation Data includes demographics, the AACOM pre-matriculation survey, academic
data (GPA), and other admissions data (MCATs, etc.).
Data is categorized according to student cohort as previously
described (see item III. Specifics of the Plan on pages 13-14).
Pre-doctoral Data includes academic (pre-clinical) course work, course grades, end-of-year
grade point averages, the newly implemented, innovative Course /
Faculty assessment program data (described in Section 4), ratings of
clinical clerkship performance, performance scores on COMLEX USA
Level I and Level II CE & PE, descriptors of changes in academic status
(attrition), and AACOM Graduation questionnaires.
Post-graduate/Career Data includes residency match rate, residency choice, hospitals of
residency, geographic location, chosen specialty, performance on
COMLEX Level III, geographic and specialty area(s) of practice
following graduation, licensure, board certification status, scholarly work,
professional activities/societies, faculty appointments, type(s) of practice
(academic, clinical, research).
This database supports and assimilates collaborative surveys utilized by
internal departments in order to capture requested data (see item III.
Specifics of the Plan on pages 13-14) essential for tracking students during
and after post-graduate training. Specific data (e.g., COMLEX Level III,
board certification, and licensure) is provided by external databases,
through periodic reports or queries initiated by NYCOM; the database
therefore provides for assimilating this external data and incorporating it
into the institutional reporting format.
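To make the shape of such a repository concrete, the sketch below models one record per student with a slot for each data collection phase. It is purely illustrative: the class and field names are our own invention, not WEAVEonline's schema or NYCOM's actual data dictionary.

from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record types for the four data collection phases;
# the field names are illustrative only.

@dataclass
class PreMatriculation:
    cohort: str                       # "Traditional", "BS/DO", "MedPrep", or "EPP"
    mcat_total: Optional[int] = None
    gpa_total: Optional[float] = None
    gpa_science: Optional[float] = None
    interview_score: Optional[int] = None

@dataclass
class PreDoctoral:
    track: str                        # "LDB" or "DPC"
    course_grades: dict = field(default_factory=dict)  # course name -> "H"/"P"/"F"
    comlex_level1: Optional[int] = None
    comlex_level2_ce: Optional[int] = None

@dataclass
class PostGraduate:
    residency_program: Optional[str] = None
    specialty: Optional[str] = None
    comlex_level3: Optional[int] = None

@dataclass
class Career:
    licensed: bool = False
    board_certified: bool = False
    practice_state: Optional[str] = None

@dataclass
class StudentRecord:
    student_id: str
    pre_matriculation: Optional[PreMatriculation] = None
    pre_doctoral: Optional[PreDoctoral] = None
    post_graduate: Optional[PostGraduate] = None
    career: Optional[Career] = None

# The central repository: one record per student, merged over time from
# internal departments and external sources (AOA, AACOM, AMA, ABMS).
repository: dict = {}
repository["S0001"] = StudentRecord("S0001",
    pre_matriculation=PreMatriculation(cohort="Traditional", mcat_total=28))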
2. Establish metrics. Time Frame: Academic Year 2010-2011
Benchmarks and Reporting: Conduct a retrospective data analysis in
order to establish baseline metrics (see Compiling the Data on page 17).
Following development of these metrics, institutional benchmarks are
established. Benchmarks align with the Institutional Goals as written above.
Reporting of data analysis occurs on an annual basis: an annual
performance report is compiled from all survey data and external sources.
The reporting timeframe coincides with the end of the academic year, and
updates to the report occur semi-annually as additional (external) data is
received. Data reporting includes benchmarking against Institutional Goals
(mission) in order to provide projections regarding the effectiveness of the
learning environment, quality improvement indicators, long-range and
strategic planning processes, and cost analysis/budgetary considerations.
Reports are disseminated to key policy makers and stakeholders, as
previously identified (see Stakeholders on page 17), as well as to other
staff deemed appropriate for inclusion in the reporting of assessment analysis.
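As a minimal sketch of what this retrospective baseline step might look like, the fragment below summarizes hypothetical historical first-time pass rates and checks a current year against a "maintain or improve" benchmark. The numbers and function names are invented for illustration; they are not NYCOM data.

from statistics import mean, stdev

# Hypothetical historical first-time COMLEX Level I pass rates, by class year.
historical_pass_rates = {2005: 0.91, 2006: 0.93, 2007: 0.90, 2008: 0.94}

def baseline_metrics(rates_by_year):
    """Summarize retrospective data to establish a baseline metric."""
    rates = list(rates_by_year.values())
    return {"mean": mean(rates), "sd": stdev(rates),
            "latest": rates_by_year[max(rates_by_year)]}

def meets_benchmark(current_rate, baseline):
    # A "maintain or improve" benchmark: at or above the historical mean.
    return current_rate >= baseline["mean"]

baseline = baseline_metrics(historical_pass_rates)
print(meets_benchmark(0.95, baseline))  # True: 0.95 >= historical mean of 0.92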
V. Conclusion
The impact on student learning of such things as changes in the demographics of medical school
applicants, admissions criteria, curricula, priorities, and methods of delivery of medical education
deserves careful discussion, planning, and analysis before, during, and after implementation. This
plan facilitates change management at three points:
o Planning, by providing evidence to support decision-making;
o Implementation, by establishing mechanisms for setting performance targets and
monitoring results, and
o Evaluation, by systematically measuring outcomes against goals and providing evidence
of whether the change has achieved its intended objectives.
At NYCOM, accountability is seen as both a requirement and a responsibility. As healthcare
delivery, pedagogy, and the science of medicine constantly change, monitoring the rigor and
effectiveness of the learning environment through assessment of student learning outcomes
throughout the medical education continuum becomes paramount.
Figure 3 Data Collection Phases
[Figure 3 depicts the assessment process spanning four data collection phases: pre-matriculation data, pre-doctoral data, post-graduate data, and career data.]
Learning Outcomes7 – Students will:
• Demonstrate basic knowledge of OPP & OMT
• Demonstrate medical knowledge
• Demonstrate competency in practice-based learning and improvement
• Demonstrate professionalism and ethical practice
• Demonstrate an understanding of health care delivery systems
• Demonstrate the ability to effectively treat patients
• Demonstrate interpersonal and communication skills
• Be prepared for careers in primary care
• Be prepared for the scholarly pursuit of new knowledge
• Be prepared to engage in global health practice, policy, and solutions to world health problems
• Be prepared to effectively interact with people of diverse cultures and deliver the highest quality of medical care

Data Collection Phases8:
• Pre-matriculation
• Pre-doctoral
• Post-graduate
• Career

Assessment Methods:
• Didactic academic performance (LDB curriculum / DPC curriculum)
• Formative / summative experiences: patient simulations (SPs / robotic)
• Student-driven course, clerkship, and faculty assessment
• Clinical clerkship performance
• PDA-based patient and education tracking
• Surveys
• Standardized tests
• Alumni feedback
Vis-a-vis:
• Admissions data (applicant pool demographics)
• Course exams
• End-of-year pass rates
• Coursework
• Analysis of residency trends data
• Standardized tests / subject exams
• COMLEX I & II scores
• Analysis of specialty choice
• Analysis of geographic practice area
• Academic attrition rates
• Remediation rates
• Graduation and post-graduate data
• External surveys

Metrics9:
• Applicant pool
• Admissions profile
• Academic attrition rates
• Remediation rates (pre-clinical years)
• COMLEX USA scores I & II (1st-time pass rate / mean score)
• Number of graduates entering OGME programs
• Graduates entering Primary Care (PC)11
• Career data: licensure (within 3 years); board certification; geographic practice area; scholarly achievements

Development of benchmarks10: see Benchmarks detail (pages 5 & 151).
7 Complete detail of Learning Outcomes found in III., pages 11-13.
8 See Figure 3, page 22.
9 List of Metrics is not all-inclusive.
10 See complete detail of benchmarks—pages 5 & 151.
11 Primary Care: Family Medicine, Internal Medicine, and Pediatrics.
Table 1 – Assessment Plan Guide: Learning Outcomes / Data Sources / Metrics
Outcome Indicators – Detail
1. Pre-matriculation data
Data gathered prior to students entering NYCOM, broken down by student cohort,
includes the following:
Traditional, MedPrep, and BS/DO students
AACOM pre-matriculation survey given to students;
Total MCAT scores;
Collegiate GPA (total GPA, including undergraduate/graduate);
Science GPA;
College(s) attended;
Undergraduate degree (and graduate degree, if applicable);
Gender;
Age;
Ethnicity;
State of residence;
Pre-admission interview score.
Additional data is gathered on the MedPrep student cohort and incorporates the
following:
Pre-matriculation lecture based exam and quiz scores;
Pre-matriculation DPC (Doctor Patient Continuum) based facilitator assessment
scores and content exam scores;
ICC (Institute for Clinical Competence) Professionalism Assessment Rating Scale (PARS)
scores.
Émigré Physician Program students
TOEFL (Test of English as a Foreign Language) score;
EPP Pre-Matriculation Examination score;
Medical school attended;
Date of MD degree;
Age;
Ethnicity;
Country of Origin.
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
MedPrep 2008 Program Assessment
MedPrep Grade Table
NYCOM Admissions Interview Evaluation Form
Application for Émigré Physicians Program (EPP)
AACOM Pre-matriculation survey (first-year students)
NYCOM Interview Evaluation Form – Émigré Physicians Program
Samples of the forms/questionnaires follow
MedPrep 2008 Program Assessment
Successful completion of the MedPrep Pre-Matriculation Program takes into consideration the
following 3 assessment components:
1. Lecture-Discussion Based (LDB)
2. DPC (Doctor Patient Continuum)
3. ICC (Institute for Clinical Competence)
A successful candidate must achieve a passing score for all 3 components. Strength in one
area will not compensate for weakness in another.
1. The first component assesses the Lecture-Discussion Based portion of the MedPrep Pre-
Matriculation Program. It is comprised of 3 multiple-choice quizzes and 1 multiple-choice exam
covering:
Histology
Biochemistry
Physiology
Genetics
Physiology
OMM
Pharmacology
Pathology
Microbiology
Clinical Reasoning Skills
Each of the three quizzes constitutes 10% of an individual's overall LDB score, and the final exam
(to be conducted on June 27) constitutes 70% of an individual's overall LDB score, together
comprising 100% of the Lecture-Discussion portion of the program.
2. The second component is based upon your performance in the DPC portion of the MedPrep Pre-
Matriculation Program. There will be a facilitator assessment (to be conducted on June 26),
which will comprise 30% of an individual's grade, and a final written assessment, which will
comprise 70% of an individual's overall DPC score.
** Note – Both the Lecture-Discussion Based and DPC passing scores are calculated as
per NYCOM practice:
Average (mean) minus one standard deviation
Not to be lower than 65%
Not to be higher than 70%
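To make this rule concrete, here is a brief illustrative sketch of the stated computation (mean minus one standard deviation, bounded to the 65-70% band); the sample scores are invented:

from statistics import mean, stdev

def passing_score(scores):
    """NYCOM practice as stated above: mean minus one standard deviation,
    never lower than 65% and never higher than 70%."""
    cutoff = mean(scores) - stdev(scores)
    return min(max(cutoff, 65.0), 70.0)

# Invented class scores: mean 78, SD about 6.3, raw cutoff about 71.7,
# which the upper bound pulls down to 70.
print(passing_score([70, 74, 78, 82, 86]))  # 70.0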
3. The third component is the ICC encounter designed to assess your Doctor Patient
Interpersonal skills. This assessment is evaluated on the PARS scale described to you in the
Doctor Patient Interpersonal Skills session on June 12 by Dr. Errichetti.
After the program ends, on June 27th, all three components of the assessment will be compiled
and reviewed by the MedPrep Committee. The director of admissions, who is a member of the
committee, will prepare notification letters that will be mailed to you within two weeks.
Please note:
The written communication you will receive ONLY contains acceptance information. NO
grades will be distributed. Exams or other assessments (with the exception of the Lecture-
Discussion Based quizzes, which have already been returned) will not be shared or returned.
Please DO NOT contact anyone at NYCOM requesting the status of your candidacy. No
information will be given on the phone or to students on campus.
Thank you for your participation in the MedPrep Pre-Matriculation Program. The faculty
and staff have been delighted to meet and work with you. We wish you success!
Sincerely,
Bonnie Granat
MedPrep Grade Table columns: Last Name, First Name; Quiz #1 Score (10% of overall LDB score); Quiz #2 Score (10% of overall LDB score); Quiz #3 Score (10% of overall LDB score); LDB Final Exam Score (70% of overall LDB score); Overall LDB Score (exam and quizzes combined); Overall DPC Score; Overall ICC Score.
NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE
ADMISSIONS INTERVIEW EVALUATION FORM
Applicant______________________________________________________ Date____/_____/____
CATEGORY / CRITERIA / VALUE / RATING
I. PERSONAL PRESENTATION (Value: 50): Maturity; life experience/travel; extracurricular activities/hobbies; communication skills; self-assessment (strengths/weaknesses); AACOMAS & supplemental statement
II. OSTEOPATHIC MOTIVATION (Value: 15): Knowledge of the profession; talked to a DO / letter from a DO
III. PRIMARY CARE MOTIVATION (Value: 15): Interest in primary care
IV. OVERALL IMPRESSION (Value: 20): Exposure to medicine (volunteer experience, employment experience, unique academic experiences, research)
TOTAL RATING: 100
OTHER COMMENTS: PLEASE USE OTHER SIDE
(REQUIRED)
INTERVIEWER:
Name______________________________
Signed__________________________________________
Comments on Applicant _____________________________________________________
COMMENTS:
Interviewer_______________________________________
14. List all Colleges attended (Undergraduate, Graduate, Professional – US and Home Country) List in chronological order
Institution Name | Location | Dates of Attendance | Major Subject | Degree granted or expected (Date)
Medical Specialty (if any) ___________________ No. of years in practice _________
15. Have you had any U.S. military experience? Yes ( ) No ( )
If yes, was your discharge honorable? Yes ( ) No ( )
16. List employment in chronological order, beginning with your current position:
Title or Description Where Dates Level of Responsibility
17. Work/daytime telephone number________________________
area code phone
18. How do you plan to finance your NYCOM education? Personal funds ________ Loans ________
19. Were you ever the recipient of any action for unacceptable academic performance or conduct
violations (e.g. probation, dismissal, suspension, disqualification, etc.) by any
college or school? Yes ( ) No ( )
If yes, were you ever denied readmission? Yes ( ) No ( )
20. Have you ever been convicted of a misdemeanor or felony (excluding parking violations)? Yes ( ) No ( )
If your answer to #19 or #20 is yes, please explain fully:
21. Evaluation Service used: Globe Language Services ______ Joseph Silny & Assocs. ______
World Education Services ______ IERF _____
*22. TOEFL Score(s): ________________________________
*ALL CANDIDATES MUST TAKE TOEFL / TOEFL Scores Cannot Be Older Than 2 YEARS
If you plan to take or retake the TOEFL, enter date: _____/_____ (mo./yr.)
(NYCOM’s TOEFL Code is #2486; copies cannot be accepted)
USMLE WILL NOT BE ACCEPTED IN LIEU OF TOEFL
All evaluations must be received directly from the evaluation service and are subject to approval by the New York
College of Osteopathic Medicine.
Personal Comments: Please discuss your reasons for applying to the EPP program.
Selection of candidates is competitive; achieving a minimum, passing TOEFL Score
does not automatically guarantee an interview.
I certify that all information submitted in support of my application is complete and correct to the best of my knowledge.
Date: Signature: ______________________________________
PLEASE MAIL APPLICATION AND FEE ($60.00 CHECK OR MONEY ORDER ONLY, PAYABLE TO NYCOM) TO:
New York College of Osteopathic Medicine
of New York Institute of Technology
Office of Admissions/ Serota Academic Center Room 203
Northern Blvd.
Old Westbury, NY 11568-8000
NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE
INTERVIEW EVALUATION FORM – ÉMIGRÉ PHYSICIANS PROGRAM
Applicant:___________________________________ Date:________________
State:___________________________
CATEGORY / CRITERIA TO BE ADDRESSED / VALUE / RATING
1. Oral Comprehension (ability to understand questions, content): Value 30
2. Personal Presentation (appropriate response, ability to relate to interviewers): Value 30
3. Verbal Expression (clarity, articulation, use of grammar): Value 30
4. Overall Impression (unique experiences, employment, research): Value 10
OVERALL RATING: 100
INTERVIEWER RECOMMENDATION:
Accept_____________
Reject_____________
COMMENTS:______________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
NAME:_____________________________
SIGNED:____________________________
2. Academic (pre-clinical) course-work
Data captured during the pre-clinical years of NYCOM's 4-year pre-doctoral program and 5-year
Academic Medicine Scholars program, which include the following:
Curricular Tracks: Lecture-Discussion Based / Doctor Patient Continuum
Pre-clinical course pass/failure rate as determined by class year (year 1 and year
2) and overall at end of year 2 (tracking each class and in aggregate for two
years);
Failure rates for components of the Nervous System course or Behavior course;
Course grades (H/P/F);
Exam scores;
Scores (pass/fail rate) on Core Clinical Competency OSCE exams;
Professionalism Assessment Rating Scale (PARS) scores;
Students dismissed for pre-clinical course failure (and remediated);
Students with double course failures (and remediated);
Failure rates due to cognitive and/or OMM lab portions of courses;
Repeat students (aligned with Learning Specialist intervention);
Changes in academic status (attrition, as identified above);
End-of-year class rankings.
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
Introduction to Osteopathic Medicine / Lecture-Based Discussion
Doctor-Patient Continuum (DPC) – Biopsychosocial Sciences I
Grading and Evaluation Policy
DPC – Clinical Sciences II – Grading Policy
Assessing the AOA Core Competencies at NYCOM
Institute for Clinical Competence (ICC) Professionalism Assessment
Rating Scale (PARS)
SimCom-T(eam) Holistic Scoring Guide
Case A – Dizziness, Acute (scoring guides)
Samples of the forms/questionnaires follow
Introduction to Osteopathic Medicine / Lecture-Based Discussion
Grading and Evaluation
1. At the conclusion of this course, students will receive a final cognitive score and a final OMM laboratory
score.
2. Both a student’s final cognitive score and a student’s final OMM laboratory score must be at a
passing level in order to pass this course.
3. Cognitive Score
a. A student’s cognitive score is comprised of the following two components:
i. Written Examinations and Quizzes pertaining to course lectures and corresponding
required readings, cases, course notes, and PowerPoint presentations
ii. Anatomy Laboratory Examinations and Quizzes
b. The weighting of the two components of the final cognitive score is as follows:
Summary of Cognitive Score Breakdown
Cognitive Score Component % of Final Cognitive Score
Written Examinations and Quizzes 75%
Anatomy Laboratory Examinations and
Quizzes
25%
Total Cognitive Score 100%
c. Written Examinations and Quizzes
i. There will be three written examinations and four written quizzes in this course.
ii. The written examinations and quizzes will consist of material from all three threads
(Cellular and Molecular Basis of Medicine, Structural and Functional Basis of Medicine,
Practice of Medicine).
iii. Up to 25% of the written exam and quiz material will come from directed readings.
iv. For the purpose of determining passing for this course, the written examinations will be
worth 90% of the final written score and the quizzes will be worth 10% (2.5% each) of the
final written score. This weighting is illustrated in the following table:
Summary of Written Exam/Quiz Score Breakdown
Written Exam/Quiz # % of Final Written Score
Written Exam #1 25%
Written Exam #2 30%
Written Exam #3 35%
Total Written Exam Score 90%
Written Quiz #1 2.5%
Written Quiz #2 2.5%
Written Quiz #3 2.5%
Written Quiz #4 2.5%
Total Written Quiz Score 10%
Total Written Score 100%
d. Anatomy Laboratory Examinations and Quizzes
i. There will be two Anatomy laboratory examinations in this course
ii. There will be Anatomy laboratory quizzes in this course, conducted during Anatomy
laboratory sessions.
iii. For the purpose of determining passing for this course, each Anatomy lab examination
will be worth 45% of students’ final Anatomy lab score and all Anatomy lab quizzes
combined will be worth 10% of students’ final Anatomy lab score. This weighting is
illustrated in the following table:
Summary of Anatomy Lab Exam/Quiz Score Breakdown
Anatomy Lab Exam/Quiz # % of Final Anatomy Score
Anatomy Lab Exam #1 45%
Anatomy Lab Exam #2 45%
Anatomy Lab Quizzes 10%
Total Anatomy Lab Exam/Quiz Score 100%
4. OMM Laboratory Score
a. A student’s OMM laboratory score in this course is comprised of an OMM laboratory examination
and laboratory quizzes, as follows:
i. There will be one OMM laboratory practical examination in this course
ii. There will be two OMM laboratory practical quizzes in this course conducted during OMM
laboratory sessions
iii. There will be a series of OMM laboratory written quizzes in this course conducted during
OMM laboratory sessions.
b. The weighting of the components of the OMM laboratory final score is as follows: For the purpose
of determining passing for this course, the OMM laboratory practical examination will be worth 70%
of the final OMM laboratory score, the OMM laboratory practical quizzes will be worth 20% (10%
each) of the final OMM laboratory score, and the OMM laboratory written quizzes will be worth 10%
(all OMM lab written quizzes combined) of the OMM laboratory score. This weighting is illustrated
in the following table:
Summary of OMM Laboratory Exam/Quiz Score Breakdown
OMM Laboratory Exam/Quiz % of Final OMM Laboratory Score
OMM Laboratory Practical Exam 70%
OMM Laboratory Practical Quiz #1 10%
OMM Laboratory Practical Quiz #2 10%
OMM Laboratory Written Quizzes (all quizzes
combined)
10%
Total OMM Laboratory Score 100%
5. Examinations and quizzes may be cumulative.
6. Honors Determination
a. For the purpose of determining who will be eligible to receive a course grade of Honors (“H”), the
final cognitive score and final OMM laboratory score will be combined in a 75%/25% ratio,
respectively.
b. Using the formula noted above, students scoring in the top 10% (and who have not taken a makeup
exam within the course or remediated the course) will receive a course grade of Honors.
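As a worked example of the weightings above, the arithmetic below combines invented scores according to the stated percentages; it is only an illustration of the formulas, not an official grade calculation tool.

# Worked example of the stated weightings; all scores are invented.
written = (0.25*84 + 0.30*80 + 0.35*88       # three written exams (90% total)
           + 0.025*(90 + 85 + 95 + 80))      # four quizzes at 2.5% each
anatomy = 0.45*82 + 0.45*86 + 0.10*90        # two lab exams plus combined quizzes
cognitive = 0.75*written + 0.25*anatomy      # final cognitive score (75%/25%)

omm_lab = (0.70*88                            # OMM practical exam
           + 0.10*85 + 0.10*83                # two practical quizzes (10% each)
           + 0.10*90)                         # all written quizzes combined (10%)
honors_basis = 0.75*cognitive + 0.25*omm_lab  # ranking basis for top-10% Honors
print(round(cognitive, 1), round(honors_basis, 1))  # 84.6 85.3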
DOCTOR PATIENT CONTINUUM (DPC) – BIOPSYCHOSOCIAL SCIENCES I
Grading and Evaluation Policy:
The examinations and evaluations are weighted as follows:
Evaluation Criteria: Percent of Grade
Content Examination 55%
Component Examinations 25%
Facilitator Assessment 20%
Content Examination: There will be a mid-term exam and an end-of-term exam, each weighted equally. The
examinations will cover the learning issues submitted by the case-study groups. Questions will be based on the
common learning issues (covered by all groups) and learning issues specific to individual groups (unique issues).
Component Exams: Distribution of the component exams will be as follows:
Exams based on Anatomy lectures and labs = 20%
Graded assignments offered by problem set instructors, which might include quizzes, position papers,
and/or other exercises = 5%
Facilitator Assessment: Facilitators will meet individually with students twice during the term to evaluate their
performance. The first evaluation will be ‘formative’ only, i.e., to advise students of their progress, and will not be
recorded as a grade. The end-of-term evaluation will be used to assess the student’s progress/participation in the
group and other class related activities. Students will also complete Self-Assessment Forms to supplement the
evaluation process.
The grading of this course is on a “PASS/FAIL/HONORS” basis.
1) Students will be evaluated each Term using the multiple components as described above.
2) Each year at the end of the 1st Term:
a) All students will be assigned an interim grade of I (Incomplete);
b) Each student will be informed of his/her final average, a record of which will be maintained in the office of
the DPC Academic Coordinator and the Director of the DPC program.
3) Students who earn less than a 1st-Term average of 70%, or a content exam score of <65%, will be officially
informed that their performance was deficient for the 1st Term. The student, in consultation with the Course
Coordinator, will present a plan designed to resolve the deficiency. This information will also be forwarded to
the Associate Dean of Academic Affairs for tracking purposes.
4) Students with a 1st-Term average <70%, or a content exam score of <65%, will be allowed to continue with the
class. However, in order to pass the year the student must achieve a final yearly average (1st and 2nd Term)
of 70% or greater, with a content exam average (for the two Terms) of 65% or greater (an illustrative
sketch of this determination follows this policy).
5) All students who meet the requirements for passing the year (see 4) will then be awarded the grade of P (Pass)
or H (Honors) for each of the two Terms.
6) Students who fail the year (see 4) will be awarded a grade of I (Incomplete) and will be permitted (with
approval of the Associate Dean for Academic Affairs) to sit for a comprehensive reassessment-examination.
The reassessment exam will be constructed by the course faculty and administered by the Course Coordinator.
The exam may include both written and oral components. Successful completion of the reassessment
examination will result in the awarding of a grade of P for the two Terms. Failure of the comprehensive
reassessment exam will result in the awarding of a grade of F (Fail) for the two terms, and a recommendation to
the Associate Dean of Academic Affairs that the student be dismissed from the College.
7) Students whose failure of the year (i.e. overall yearly average <70%) can be attributed to low facilitator
assessment scores present a special concern. The student has been determined, by his/her facilitators, to be
deficient in the skills necessary to effectively interact with patients and colleagues. This deficiency may not be
resolvable by examination. Such failures will be evaluated by the Director of the DPC program, the Associate
Dean of Academic Affairs and/or the Committee on Student Progress (CSP) to determine possible remediation
programs or to consider other options including dismissal.
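The following minimal sketch applies the pass rules stated in items 4-6 above; the function name and sample scores are invented, and Honors and facilitator-related determinations are left to the committees described above.

def dpc_year_outcome(term_averages, content_exam_scores):
    """Stated DPC rule: pass requires a yearly average of 70% or greater
    AND a two-term content exam average of 65% or greater; otherwise the
    grade is Incomplete pending the comprehensive reassessment exam."""
    yearly_avg = sum(term_averages) / len(term_averages)
    content_avg = sum(content_exam_scores) / len(content_exam_scores)
    if yearly_avg >= 70 and content_avg >= 65:
        return "P"   # Pass (or H, where Honors criteria are met)
    return "I"       # Incomplete: reassessment examination required

# A student deficient in Term 1 can still pass on the yearly average:
print(dpc_year_outcome([68, 76], [64, 70]))  # "P" (average 72; content 67)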
DOCTOR PATIENT CONTINUUM (DPC) – CLINICAL SCIENCES II
Grading Policy:
1. The grading of this course is on a “PASS/FAIL/HONORS” basis. Grades will be determined by performance
in the three components of the course, OMM, Clinical Skills, and Clinical Practicum, as follows:
Evaluation Criteria: Percent of Grade
OMM 40%
Clinical Skills 40%
Clinical Practicum 20%
In both the OMM and Clinical Skills components of the course, student evaluations will encompass written
and practical examinations. In order to pass the course, both the written and practical examinations in OMM
AND Clinical Skills must be passed. Students who fail to achieve a passing score in either Clinical Skills or
OMM will be issued a grade of “I” (Incomplete). Such students will be offered the opportunity to remediate
the appropriate portion of the course. Re-evaluation will be conducted under the supervision of the DPC
faculty. Successful completion of the re-evaluation examination, both written and practical, will result in the
awarding of a grade of P (Pass). Failure of the comprehensive reassessment exam will result in the
awarding of a grade of U (Unsatisfactory) for this course.
2. Grading of the OMM component will be evaluated according to the following criteria:
Evaluation Criteria: Percent of Grade
OMM written (weighted) 50%
OMM practical (average) 50%
3. Grading of the Clinical Practicum component will be evaluated according to the following criteria:
Evaluation Criteria: Percent of Grade
Attendance and Participation 15%
Case Presentation 35%
Clinical Mentor Evaluation 50%
4. Grading of the Clinical Skills component will be evaluated according to the following criteria:
Evaluation Criteria: Percent of Grade
Class participation/assignments 5%
ICC participation/assignments 10%
Timed examination #1
– Practical portion 20%
– Written portion 5%
Timed examination #2
– Practical portion 20%
– Written portion 5%
Timed Comprehensive examination
– Practical portion 25%
– Written portion 10%
Pre-clinical Years: Years I and II DPC Track
Assessing the American Osteopathic Association (AOA) Core Competencies at
New York College of Osteopathic Medicine (NYCOM)
A. Background
In recent years, there has been a trend toward defining, teaching and assessing a number
of core competencies physicians must demonstrate. The Federation of State Medical Boards
sponsored two Competency-Accountability Summits in which a “theoretical textbook” on good
medical practice was drafted to guide the development of a competency-based curriculum. The
competencies include: medical knowledge, patient care, professionalism, interpersonal
communication, practice-based learning, and system-based practice. The AOA supports the
concepts of core competency assessment and added an additional competency: osteopathic
philosophy and osteopathic clinical medicine.
Arguably it is desirable to begin the process of core competency training and assessment
during the pre-clinical years. Patient simulations, i.e., using standardized patients and robotic
simulators, allow for such training and assessment under controlled conditions. Such a pre-clinical
program provides basic clinical skills acquisition in a patient-safe environment. NYCOM has
responded to this challenge by creating a two-year “Core Clinical Competencies” seminar that
requires students to learn and practice skills through various patient simulations in the Institute
For Clinical Competence (ICC). In this seminar the ICC assesses a sub-set of the above
competencies taught in the lecture-based and discussion-based clinical education tracks.
The following is a list of the competencies assessed during the pre-clinical years at
NYCOM, and reassessed during the third year (osteopathic medicine objective structured
clinical examination) and fourth year (voluntary Clinical Skills Capstone Program). It should be
noted that there is a fair amount of skills overlap between the competencies; for example,
proper communication is manifested in a number of competencies.
B. Core Clinical Competencies
1. Patient Care: Provide compassionate, appropriate, and effective treatment and health promotion
Skills:
Data-gathering: history-taking, physical examination (assessed with clinical skills
checklists)
Develop differential diagnosis
Interpret lab results, studies
Procedural skills, e.g. intubation, central line placement, suturing, catheterization
Provide therapy
2. Interpersonal and communication skills: Effective exchange of information and collaboration
with patients, their families, and health professionals.
Skills:
Communication with patients and their families across a spectrum of multicultural
backgrounds (assessed with the Professionalism Assessment Rating Scale)
Health team communication
Written communication (SOAP note, progress note)
3. Professionalism: Commitment to carrying out professional responsibilities and ethical
commitments
Skills:
Compassion, respect, integrity for others
Responsiveness to patient needs
Respect for privacy, autonomy
Communication and collaboration with other professionals
Demonstrating appropriate ethical consideration
Sensitivity and responsiveness to a diverse patient population including e.g. gender,
age, religion, culture, disabilities, sexual orientation.
4. Osteopathic Philosophy and Osteopathic Clinical Medicine: Demonstrate, apply knowledge
of osteopathic manipulative treatment (OMT); integrate osteopathic concepts and OMT into
medical care; treating the person, and not just the symptoms
Skills:
Utilize caring, compassionate behavior with patients
Demonstrate the treatment of people rather than the symptoms
Demonstrate understanding of somato-visceral relationships and the role of the
musculoskeletal system in disease
Demonstrate listening skills in interaction with patients
Assessing disease (pathology) and illness (patient’s response to disease)
Eliciting psychosocial information
C. Assessment of Core Competencies
The ICC utilizes formative assessment to evaluate learner skills and the effectiveness of
NYCOM’s clinical training programs. Data on student performance in the ICC is tracked from
the first through the fourth year. The ICC satellite at St. Barnabas assesses students during their
clerkship years as well as interns and residents in a number of clinical services. It uses a variety
of methods to assess competencies:
1. Written evaluations
Analytic assessment – skills checklists that document data-gathering ability
Global-holistic rating scales to assess doctor-patient communication (Professionalism
Assessment Rating Scale) and health team communication (SimCom-T)
SOAP note and progress note assessment
2. Debriefing / feedback – a verbal review of learner actions following a patient simulation
program provided by standardized patients and instructors as appropriate.
Core Clinical Competencies 590 (MS 1)
Core Clinical Competencies 690 (MS 2)
The courses provide a horizontal integration between clinical courses provided by the LDB and
DPC programs (small group discussion and demonstration) and the OMM department. They
provide practice with simulated patients (with some variation in this aspect as noted below),
formative assessment, end-of-year summative assessment, and remediation.
1. SP PROGRAM, METRICS AND HOURS
MS 1 Program – SP: different program, same standardized examination
LDB
SP program: training with formative assessment (see next bullet for formative assessment
metrics)
End-of-year OSCE assessing history-taking (checklists designed for each SP case), PE (see
attached physical examination criteria), and interpersonal communication (see attached
program in doctor-patient communication, “Professionalism Assessment Rating Scale”)
Hours: 13.5 / year (including OSCE)
DPC
Clinic visits to substitute for SP encounters
End of year OSCE (same as LDB)
Hours: Should be equivalent to the number of SP hours in the LDB program
NOTE: The purpose of the OSCE is to assess the clinical training of both the LDB and DPC
programs. It is assumed the LDB and DPC faculty will work on this OSCE together with the
OMM department.
MS 1 Program – Patient Simulation Program
LDB and DPC
Same program in basic procedures for both LDB and DPC students as outlined in the
syllabus distributed during the curriculum committee meeting
Hours: 5 hours / year
MS 2 Program – SP
LDB and DPC – same program, different approaches, same standardized exam
SP program: training with formative assessment (see next bullet for formative assessment
metrics)
End-of-year OSCE assessing history-taking (checklists designed for each SP case), PE (see
attached physical examination criteria), and interpersonal communication (see attached
program in doctor-patient communication, “Professionalism Assessment Rating Scale”)
Hours: 13.5 hours / year (including OSCE)
NOTE: It is assumed that the LDB and DPC program schedules will vary but that the
content will be equivalent
MS 2 Program – Patient Simulation Program
LDB and DPC – same program, same standardized exam
Students work in the same group throughout the year
End of year OSCE assessing medical team communication using the SimCom-T rating scale
(attached)
Group grade assigned for the OSCE (reflecting the spirit of the SimCom-T rating scale)
Hours: 11 / year (including OSCE)
2. Attendance
All activities and exams are mandatory.
Make-ups are done at the discretion of the ICC.
NOTE: Make-ups will be done as close to an activity as possible because delaying them, e.g. to
the end of the year, will incur additional training expenses (e.g. re-training an SP for a case played
months earlier) for the ICC.
3. Grading and remediation
Pass / fail
Grading is based upon:
o Attendance
o Participation
o End-of-year OSCE (standards to be set)
ICC Hours

MS1 – LDB
Clinical Practice: 8 SP exercises @ 1.5 hours each (12 hours per student); 5 patient simulation program exercises @ 1 hour each (5 hours per student)
OSCE: End-of-year SP OSCE, 1.5 hours per student (approximately 6.25 days)
Total Hours: 13.5 (SP) + 5 (Pat Sim) = 18.5

MS1 – DPC
Clinical Practice: Clinic experience to substitute for SP exercises (students will receive information re: communication and PE competencies); 5 patient simulation program exercises @ 1 hour each (5 hours per student)
Total Hours: 0 (SP) + 5 (Pat Sim) = 5

MS2 – LDB and DPC
Clinical Practice: 8 SP exercises @ 1.5 hours each (12 hours per student); 6 patient simulation program exercises, plus ACLS (10 hours per student)
OSCE: End-of-year SP OSCE, 1.5 hours per student (approximately 6.25 days); end-of-year Pat Sim OSCE, 1 hour per student (approximately 5 days)
Total Hours: 13.5 (SP) + 11 (Pat Sim) = 24.5
Institute For Clinical Competence (ICC)
Professionalism Assessment Rating Scale (PARS)
Dear Students:
As part of your professional development, standardized patients (SPs) in the ICC will be
evaluating your interpersonal communication with them using the Professionalism Assessment
Rating Scale (PARS).
This scale evaluates two types of interpersonal communication, both important to quality health
care:
Patient Relationship Quality – Rapport, empathy, confidence and body language.
Patient Examination Quality – Questioning, listening, information exchanging and careful and
thorough physical examination.
Arguably, patients (real or simulated) are in the best position to assess your interpersonal
communication with them because you are directly relating to them during an intimate,
face-to-face, hands-on encounter. They are in the best position, literally, to observe your eye
contact, demeanor and body language because they are in the room with you. We would
recommend you take their feedback seriously, but perhaps “with a grain of salt.”
The term standardized patient is to some degree a misnomer – SPs can be standardized to
present the same challenge and the same medical symptoms to each student, but they cannot be
standardized to feel the same way about you and your work with them compared to other
students. This is true in life as well as clinical work – some people will like you better than others,
and patients are people! You may communicate with one patient the way you do with the next,
but receive slightly different ratings. This is to be expected. Unlike the analytic checklists we use
to document if you asked particular questions or performed certain exams correctly, there are no
dichotomous / “right or wrong” communication ratings. Patients are people who may tune into
different things during an encounter. We think this slight variation in observation is an asset that
will help you understand that patients are individuals who must be approached as individuals.
Another word about the ratings you will receive: the ratings are not absolute numbers that
constitute an unconditional assessment of your communication skills. Some days you may be
better than other days. We use the rating numbers (a 1-8 holistic scale) to chart progress over
time. We do see improvements during the first two years of the typical student’s training, but the
ratings are used to track your progress as much as to structure a conversation with the SP, or a
faculty member, during debriefing. We would recommend you take responsibility during SP
debriefing and ask the SP questions about the work you just did.
The holistic 1 – 8 scale is broken down into two parts: Ratings of 1 – 4 are considered “lower
quality” communication, i.e. what might be considered acceptable at a novice or trainee level, but
less acceptable for an experienced professional. Ratings of 5 – 8 are considered “higher quality”
communication, i.e. more professional-quality communication regardless of the training or
experience level.
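As a purely illustrative sketch of the banding just described (it assumes nothing beyond the 1-8 scale and the 1-4 / 5-8 split above), a rating can be bucketed and a student’s ratings averaged over encounters like this:

# Illustrative only: bucket a PARS rating and average ratings over time.
def pars_band(rating):
    """Ratings 1-4 are 'lower quality'; 5-8 are 'higher quality'."""
    if not 1 <= rating <= 8:
        raise ValueError("PARS ratings run from 1 to 8")
    return "lower quality" if rating <= 4 else "higher quality"

encounters = [3, 4, 5, 6]                  # one student's ratings over a year
print(pars_band(encounters[-1]))           # -> higher quality
print(sum(encounters) / len(encounters))   # mean used to chart progress -> 4.5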
Professionalism Assessment Rating Scale (PARS)
Standardized patients will rate “to what degree” you demonstrated relationship quality and
examination quality on the following factors:
RELATIONSHIP QUALITY
To what degree did the student …
Lower Higher
Quality Quality
1 Establish and maintain rapport 1 2 3 4 5 6 7 8
2 Demonstrate empathy 1 2 3 4 5 6 7 8
3 Instill confidence 1 2 3 4 5 6 7 8
4 Use appropriate body language 1 2 3 4 5 6 7 8
EXAMINATION QUALITY
To what degree did the student …
Lower Higher
Quality Quality
5 Elicit information clearly, effectively 1 2 3 4 5 6 7 8
6 Actively listen 1 2 3 4 5 6 7 8
7 Provide timely feedback / information / counseling 1 2 3 4 5 6 7 8
8 Perform a thorough, careful physical exam or treatment 1 2 3 4 5 6 7 8
(1 = less experienced or unprofessional; 8 = more professional)
The following pages are a guide to the PARS, giving examples of “lower quality” and
“higher quality” communication.
1 Establish and maintain rapport
Establish and maintain a positive, respectful collaborative working relationship with the patient.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Overly familiar, e.g. “Hi Bill, I’m John. How are you doing today?”
Higher quality: Appropriate address, e.g. “Hi Mr. Jones, I’m Student-doctor Smith. Is it OK if I call you Bill?”

Lower quality: No agenda set.
Higher quality: Set an agenda, e.g. “We have ___ minutes for this exam. I’ll take a history, examine you… etc.”

Lower quality: No collaboration with the patient, i.e. carried out the exam without patient consent or agreement.
Higher quality: Collaborative mindset, e.g. “Let’s figure out what’s going on.” “We’re going to work out this problem together.”

Lower quality: Took notes excessively, i.e. spent more time taking notes than interacting.
Higher quality: Spent more time interacting with the patient than taking notes.

Lower quality: Began physically examining the patient without “warming” the patient up, asking consent, etc.
Higher quality: Asked consent for the physical examination, e.g. “Is it OK for me to do a physical exam?”

Lower quality: Did not protect the patient’s modesty, e.g. did not use a drape sheet.
Higher quality: Respected the patient’s modesty at all times, e.g. used a drape sheet when appropriate.

Lower quality: Did not direct the patient to get dressed after the exam; left the door open when examining the patient.
Higher quality: Let the patient cover up following an examination.

Lower quality: Talked “down” to the patient, did not seem to respect the patient’s intelligence.
Higher quality: Seemed to assume the patient is intelligent.

Lower quality: Rude, crabby or overtly disrespectful.
Higher quality: Never rude or crabby; always respectful.

Lower quality: Dress and hygiene problems, e.g. wore distracting perfume / cologne; poor hygiene (unclean, dirty nails, body odor, did not wash hands); touched hair continually; unprofessional dress (jeans, facial jewelry such as tongue or nose studs, overly suggestive or revealing garments).
Higher quality: Dressed professionally, i.e. in a clean white coat, clean clothes, etc.

Lower quality: Seemed angry with the patient.
Higher quality: Seemed to like the patient.
2 Demonstrate empathy
Demonstrate both empathy (compassion, understanding, concern, support) and inquisitiveness
(curiosity, interest) in the patient’s medical problem and life situation.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
EMPATHY
Lower quality: No expressions of concern about the patient’s condition or situation.
Higher quality: Expressed concern about the patient’s condition or situation, e.g. “That must be painful.” “I’m here to try to help you.”

Lower quality: Failed to acknowledge positive behavior / lifestyle changes the patient has made.
Higher quality: Reinforced behavior / lifestyle changes the patient has made, e.g. “That’s great you quit smoking.”

Lower quality: Failed to acknowledge that suggested behavior / lifestyle changes might be difficult.
Higher quality: Acknowledged that suggested behavior / lifestyle changes might be difficult.

Lower quality: Empathic expressions seemed insincere, superficial.
Higher quality: Empathic expressions seemed genuine.

Lower quality: Detached, aloof, overly “business-like,” robotic in demeanor; seeming lack of compassion and caring.
Higher quality: Compassionate and caring, “warm.”

Lower quality: Accused the patient of being non-compliant, e.g. “Why don’t you take better care of yourself?” “You should have come in sooner.”
Higher quality: Positive reinforcement of things the patient is doing well, e.g. “That’s great that you stopped smoking.” “I’m glad you are taking your medication on a regular basis.”

INQUISITIVENESS – An aspect of empathy is inquisitiveness, the ability to attempt to understand the patient, both medically and personally.

Lower quality: Focused on symptoms, but not the patient, i.e. did not explore how the medical problem / symptoms affect the patient’s life.
Higher quality: Tried to understand how the medical problem / symptoms affect the patient’s life, or vice versa, e.g. “How is this affecting your life?” “Tell me about yourself.”

Lower quality: Failed to explore activities of daily living.
Higher quality: Explored activities of daily living, e.g. “Describe a typical day in your life.” “Tell me about your stress.”

Lower quality: Failed to explore the patient’s response to diagnosis and / or treatment.
Higher quality: Inquired as to the patient’s response to diagnosis and / or treatment.

Lower quality: Failed to explore barriers to behavior / lifestyle change.
Higher quality: Explored barriers to behavior / lifestyle change.
3 Instill confidence
Instilling confidence that the medical student or doctor is able to help and treat the patient.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Conveyed his / her anxiety, e.g. by avoiding eye contact, laughing or smiling nervously, a sweaty handshake, or statements such as: “This is making me nervous.” “This is the first time I’ve ever done this.” “I don’t know what I’m doing.” Apologized inappropriately to the patient, e.g. “I’m sorry, but I have to examine you.”
Higher quality: Conveyed an appropriately confident demeanor, e.g. made eye contact, shook hands firmly, etc.

Lower quality: Overly confident, cocky.
Higher quality: Never cocky; appropriately humble without undermining the patient’s confidence.

Lower quality: When making suggestions, used tentative language, e.g. “Maybe you should try…” “I’m not sure but…”
Higher quality: When making suggestions, used authoritative language, e.g. “What I suggest you do is…”

Lower quality: Made excuses for his / her lack of skill or preparation by making statements such as: “I’m just a medical student.” “They didn’t explain this to me.” “Do you know what I’m supposed to do next?”
Higher quality: Offered to help the patient or get information if he / she could not provide it by saying, e.g. “Let me ask the attending physician.” “I don’t know but let me find out for you.”
4 Use appropriate body language
The ability to use appropriate gestures, signs and body cues.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Overly casual posture, e.g. leaning against the wall or putting feet up on a stool when interviewing the patient.
Higher quality: Professional posture, i.e. carried himself / herself like an experienced, competent physician.

Lower quality: Awkward posture, e.g. stood stiffly when taking a history; stood as if he / she was unsure what to do with his / her body.
Higher quality: Natural, poised posture.

Lower quality: Uncomfortable or inappropriate eye contact, e.g. stared at the patient too long and / or never looked at the patient.
Higher quality: Used appropriate eye contact.

Lower quality: Avoided eye contact when listening.
Higher quality: Made eye contact when listening, whether at eye level or not.

Lower quality: Stood or sat too close to or too distant from the patient.
Higher quality: Maintained an appropriate “personal closeness” and “personal distance.”

Lower quality: Turned away from the patient when listening.
Higher quality: Maintained appropriate body language when listening to the patient.
5 Elicit information clearly, effectively
Effectively ask questions in an articulate, understandable, straightforward manner.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Used closed-ended, yes / no questions exclusively, e.g. “How many days have you been sick?” “Ever had surgery?” “Any cancer in your family?”
Higher quality: Used open-ended questions to begin an inquiry, and closed-ended questions to clarify, e.g. “Tell me about the problem.” “What do you do in a typical day?” “How is your health in general?”

Lower quality: Used open-ended / non-clarifying questions exclusively.
Higher quality: Used open-ended questions to begin an inquiry, and closed-ended questions to clarify.

Lower quality: Student’s questions were inarticulate, e.g. mumbled, spoke too fast, foreign accent problems, stuttered*, etc.
Higher quality: Student was articulate, asked questions in an intelligible manner.
* NOTE: Consider stuttering a form of inarticulation for rating purposes, i.e. do not make allowances for stuttering.

Lower quality: Asked confusing, multi-part or overly complex questions, e.g. “Tell me about your past medical conditions, surgeries and allergies.”
Higher quality: Asked one question at a time, in a straightforward manner, e.g. “Tell me about your allergies.”

Lower quality: Asked leading questions, e.g. “No cancer in your family, right?” “No surgeries?” “You only have sex with your wife, right?”
Higher quality: Asked direct questions, e.g. “Do you have any cancer in your family?” “Any surgeries?” “Are you monogamous?”

Lower quality: Jumped from topic to topic in a “manic,” disjointed or disorganized way.
Higher quality: Organized interview; stayed focused, asked follow-up questions before moving to another topic.

Lower quality: Asked questions in a robotic way, i.e. as if reading from a prepared checklist.
Higher quality: Asked questions in a conversational way, i.e. listened to the response, and then asked another question.

Lower quality: Constantly cut off the patient, i.e. did not let the patient finish sentences.
Higher quality: Allowed the patient to finish sentences and thoughts before asking the next question.
6 Actively listen
Both listen and respond appropriately to the patient’s statements and questions.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Asked questions without listening to the patient’s response.
Higher quality: Asked questions and listened to the patient’s response.

Lower quality: No overt statements made indicating he / she was listening.
Higher quality: Said, e.g. “I’m listening.”

Lower quality: Turned away from the patient when listening.
Higher quality: Maintained appropriate body language when listening to the patient.

Lower quality: Kept asking the same question(s) because he / she did not seem to remember what had been asked.
Higher quality: If necessary, asked the same questions to obtain clarification, e.g. “Can you tell me again how much you smoke?” “I know you told me this, but when was the last time you saw your doctor?”

Lower quality: Wrote notes without indicating he / she was listening.
Higher quality: Indicated, when writing, that he / she was listening, e.g. “I have to write down a few things while we talk, OK?”

Lower quality: Did not seem to be listening, seemed distracted.
Higher quality: Attentive to the patient.

Lower quality: Kept talking, asking questions, etc. while the patient was discussing a personal issue, a health concern, fear, etc.
Higher quality: Was silent when necessary, e.g. if the patient was discussing a personal issue, a health concern, fear, etc.
7 Provide timely feedback / information / counseling
Explain or summarize information (e.g. results of physical exams, patient education
activities, etc.), or provide counseling, in a clear and timely manner.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Did not explain examination procedures, e.g. just started examining the patient without explaining what he / she was doing.
Higher quality: Explained procedures, e.g. “I’m going to check your legs for edema.” “I’m going to listen to your heart.”

Lower quality: Did not provide feedback at all, or provided minimal feedback.
Higher quality: Periodically provided feedback regarding what he / she heard the patient saying, e.g. “It sounds like your work schedule makes it difficult for you to exercise.” “I hear in your voice that your family situation is causing you a lot of stress.”

Lower quality: Did not summarize information at all.
Higher quality: Periodically summarized information, e.g. “You had this cough for 3 weeks, it’s getting worse and now you’ve got a fever. No one is sick at home and you haven’t been around anyone who is sick.”

Lower quality: Provided empty or unprofessional feedback, e.g. “OK… OK… OK… OK…” “Gotcha… gotcha… gotcha…” “Great!” “Awesome!” “Cool!”
Higher quality: Feedback was meaningful, useful and timely.

Lower quality: Examined the patient without providing feedback about the results of the exam.
Higher quality: Provided feedback about results of the physical exam, e.g. “Your blood pressure seems fine.”

Lower quality: Refused to give the patient information he / she requested, e.g. “You don’t need to know that.” “That’s not important.”
Higher quality: Gave information to the patient when requested, or offered to get it if he / she could not answer the patient’s questions.

Lower quality: Used medical jargon without explanation, e.g. “What you experienced was a myocardial infarction.”
Higher quality: Explained medical terms, e.g. “What you experienced is a myocardial infarction, meaning a heart attack.”

Lower quality: Ended the exam abruptly; no closure, no information about the next steps.
Higher quality: Let the patient know what the next step was, provided closure, e.g. “Let’s review the exam and your health…”
8 Conduct a thorough, careful physical exam or treatment
Conduct physical exams and / or treatment in a thorough, careful manner vs. a tentative or
superficial manner.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: Conducted a superficial examination, e.g. avoided touching the patient, touched the patient with great tentativeness.
Higher quality: Conducted a careful examination, e.g. examined on skin when appropriate.

Lower quality: Hurried through the exam.
Higher quality: Used the full amount of time allotted to examine the patient.

Lower quality: Avoided inspecting (looking at) the patient’s body / affected area.
Higher quality: Thoroughly inspected (looked at) the affected area, e.g. with the gown open.

Lower quality: Consistently palpated, auscultated and / or percussed over the exam gown.
Higher quality: Consistently palpated, auscultated and / or percussed on skin.

Lower quality: Exam not bilateral (when appropriate).
Higher quality: Bilateral exam (when appropriate).

Lower quality: Rough exam, e.g. started, stopped, re-started the exam; fumbled with instruments.
Higher quality: Conducted a smooth exam from beginning to end.

Lower quality: Did not watch the patient’s expressions during an examination in order to assess pain.
Higher quality: Looked for facial expressions to assess pain.

Lower quality: Did not thoroughly examine the site of the chief complaint, e.g. did not examine heart and / or lungs if the chief complaint was a breathing problem.
Higher quality: Thoroughly examined the site of the chief complaint.
9 Conduct the examination in an organized manner
Overall conduct the exam in an organized, systematic way vs. a disorganized or unsystematic
way.
Lower Quality Higher Quality
1 2 3 4 5 6 7 8
Lower quality: No clear opening, e.g. did not set an agenda; abruptly began the exam.
Higher quality: Clear opening, e.g. set an agenda and followed it; began the exam after a proper introduction.

Lower quality: Medical interview not organized – history jumped from topic to topic.
Higher quality: Organized the medical interview vs. jumping from topic to topic.

Lower quality: No clear closure, e.g. did not summarize information gathered during the history and physical examination; did not ask the patient “Any more questions?”; did not clarify next steps.
Higher quality: Clear closure, e.g. summarized information gathered during the history and physical examination; asked the patient “Any more questions?”; clarified next steps.
SimCom-T(eam) Holistic Scoring Guide
The SimCom-T is a holistic health care team communication training program and rating scale. The nine-factor scale of SimCom-T
rates team members’ performance as a unit, i.e. individual team member performance should be considered a reflection upon the
entire team.
Rate each factor individually.
Ratings should be global, i.e. reflect the most characteristic performance of the team vs. individual incidents.
The following pages are a guide to SimCom-T, providing behavioral examples representative of each score for the SimCom-T
competencies. (An illustrative sketch of how factor scores might be recorded follows the competency list below.)
Score Performance Level Description – The team…
1 Limited … consistently demonstrates novice and / or dysfunctional team attributes
2 Basic … inconsistently operates at a functional level
3 Progressing … demonstrates basic and average attributes
4 Proficient … is proficient and consistent in performance
5 Advanced … is experienced and performs at an expert level
CNE Not applicable … a factor could not be evaluated for some reason
Competency (rated from 1 = Lower Quality to 5 = Higher Quality, or CNE)
1 Leadership establishment and maintenance 1 2 3 4 5 CNE
2 Global awareness 1 2 3 4 5 CNE
3 Recognition of critical events 1 2 3 4 5 CNE
4 Information exchange 1 2 3 4 5 CNE
5 Team support 1 2 3 4 5 CNE
6 External team support 1 2 3 4 5 CNE
7 Patient support 1 2 3 4 5 CNE
8 Mutual trust and respect 1 2 3 4 5 CNE
9 Flexibility 1 2 3 4 5 CNE
10 Overall Team Performance 1 2 3 4 5 CNE
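Purely as an illustration (the sentinel value and the averaging below are assumptions, not part of SimCom-T itself), the factor scores for one simulated event might be recorded like this, with CNE factors excluded from any summary statistic:

# Illustrative only: record SimCom-T factor scores (1-5) with CNE excluded.
CNE = None  # assumed sentinel for "could not be evaluated"

team_scores = {
    "Leadership establishment and maintenance": 4,
    "Global awareness": 3,
    "Recognition of critical events": 4,
    "Information exchange": 5,
    "Team support": 4,
    "External team support": CNE,  # e.g. no family member in this scenario
    "Patient support": 4,
    "Mutual trust and respect": 5,
    "Flexibility": 3,
}

rated = [s for s in team_scores.values() if s is not None]
print(f"Mean of rated factors: {sum(rated) / len(rated):.2f}")  # -> 4.00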
1. Leadership Establishment and Maintenance
Team members both establish leadership and maintain leadership throughout.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Leader not
established
▪ Roles not assigned
▪ No discussion
regarding role
assignment
▪ Unable to identify
leader
▪ Many leaders
▪ No clear role
definition
▪ Leadership not
explicit throughout
event
▪ Leadership not
maintained
throughout the event
▪ Role switching
without leader
involvement
▪ Leader explicitly
identified
▪ Roles defined
▪ Leadership explicitly
identified and
maintained
▪ Roles defined and
maintained
▪ Leader delegates
responsibility
Examples ▪ Team operating
dysfunctionally
without a leader
▪ Team members
taking on similar roles
and role switching
consistently
▪ Team members
unsure of who is
responsible for
different tasks
▪ Leader timid and
does not take charge
▪ Team member roles
unclear and/or
inconsistent
▪ A team member asks,
“Who is running the
code?” and another
says, “I am,” but does
not communicate
leadership
responsibilities.
▪ Team members are
assigned roles but do
not take on the
assignment
▪ Team members
select a leader
▪ A team member
volunteers to handle
the situation
▪ Roles clearly defined
by team members
and/or leader
▪ Leadership and roles
are established very
early in the event and
are maintained
throughout the event
▪ Clarity of leadership
and roles is evident
throughout the event
and with the team
members
2. Global Awareness
Team members monitor and appropriately respond to the total situation, i.e. the work environment and the patient’s condition.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Does not monitor the
environment and
patient
▪ Does not respond to
changes in the
environment and
patient
▪ Monitoring and
response to changes
in the environment
and patient rarely
occur
▪ Fixation errors
▪ Monitoring and
response to the
environment and
patient are not evident
throughout the event
▪ Monitors the
environment and
patient
▪ Responds to changes
in the environment
and patient
▪ Consistently monitors
the environment and
patient
▪ Consistently responds
to changes in the
environment and
patient
Examples ▪ There is no summary
of procedures, labs
ordered, or results of
labs
▪ Team is task oriented
and does not
communicate about
the event
▪ Event manager loses
focus and becomes
task oriented
▪ There is no clear
review of the lab
results and/or
summary of
procedures.
▪ Leader says, “Team, let’s review our differential diagnosis and labs,” and the team does not respond to the leader.
▪ Some of the team members discuss results and possible problems among themselves.
▪ Leader says, “Team, let’s review our differential diagnosis and labs,” and the team reviews the situation.
▪ Event manager
remains at the foot of
the bed keeping a
global assessment of
the situation
▪ Leader announces
plan of action for the
event.
3. Recognition of Critical Events
Team promptly notes and responds to critical changes in the patient’s status and / or environment.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Does not monitor or respond to critical deviations from steady state
▪ Fails to recognize or acknowledge crisis
▪ “Tunnel Vision”
▪ Fixation errors are consistently apparent
▪ Team reactive rather
than proactive
▪ Critical deviations
from steady state are
not announced for
other members
▪ Monitors and
responds to critical
deviations from steady
state
▪ Recognizes need for
action
▪ All team members
consistently monitor
and respond to
critical deviations from
steady state
▪ Anticipates potential
problems
▪ Practices a proactive
approach and attitude
▪ Recognizes need for
action
▪ “Big Picture”
Examples ▪ Patient stops
breathing, and team
does not recognize
the situation
throughout the event
▪ Patient is pulseless,
and no CPR is started
throughout the event
▪ Patient stops
breathing, and team
does not recognize
this situation for a
critical time period
▪ Patient is pulseless,
and no CPR is started
for a critical time
period
▪ Leader says, “Team, let’s review our differential diagnosis; are there any additional tests that we should request?”
▪ “John, the sats are
dropping, please be
ready, we might have
to intubate.”
▪ “Melissa, the blood
pressure is dropping.
Get ready to start the
2nd IV and order a
type and cross.”
4. Information Exchange
Patient and procedural information is exchanged clearly.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Communication
between team
members is not
noticeable
▪ Requests by others
are not acknowledged
▪ No feedback loop
▪ No orders given
▪ Vague
communication
between team
members
▪ Not acknowledging
requests by others
▪ Feedback loop left open
▪ Orders not clearly
given
▪ Communication
between team and
response to requests
by others inconsistent
▪ Feedback loops open
and closed
▪ Orders not directed to
a specific team
member
▪ Team communicates
and acknowledges
requests throughout
the event
▪ Feedback loops
closed
▪ Explicit
communication
consistently
throughout the event
▪ Team acknowledges
communication
▪ Closed loop
communication
throughout event
Examples ▪ No summary of
events.
▪ No additional
information sought
from the team
members.
▪ Event manager says,
“I need a defibrillator,
we might have to
shock this patient,”
and no team member
acknowledges the
order. The request
was not given
explicitly to a team
member.
▪ One team member
says to another in a
low voice, “We need
to place a chest tube,”
but the event
manager does not
hear the
communication.
▪ Event manager
requests a
defibrillator, but not
explicitly to a
particular team
member; several
team members
attempt to get the
defibrillator
▪ Jonathan says to
event manager, “We
need to place a chest
tube.” Event manager
responds, “OK, get
ready for it.”
▪ Leader says, “Team, let’s summarize what has been done so far.”
▪ Leader says, “Mary
please start an IV.”
Mary responds,
“Sorry, I do not know
how, please ask
someone else to do
it.”
▪ Event manager
summarizes events.
▪ Event manager seeks
additional information
from all team
members
▪ Event manager says,
“Peter, I want you to
get the defibrillator,
we might have to
shock this patient.”
Peter responds, “Yes,
I know where it is and
I’ll get it.”
5. Team Support
The team works as a unit, asking for or offering assistance when needed vs. team members “going it alone.”
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ No assistance or help
asked for or offered
▪ Team members act
unilaterally
▪ No recognition of
mistakes
▪ Team members
watching and not
participating
▪ Team members take
over when not
needed
▪ Mistakes not
addressed to the
team
▪ Negative feedback
▪ Assistance is offered
when needed only
after multiple requests
▪ Team recognizes
mistakes and
constructively
addresses them
▪ Team member(s)
ask(s) for help when
needed
▪ Assistance provided
to team member(s)
who need(s) it
Examples ▪ During a shoulder dystocia event, the critical situation is recognized, but no help is requested; the team attempts to resolve the situation on its own
▪ Wrong blood type delivered and administered, and no backup behaviors to correct the mistake
▪ Team member
administers
medication without
consulting the event
manager
▪ Charles knows that the patient is a Jehovah’s Witness and does not let the team know when a T&C is ordered.
▪ A team member does not communicate that he / she does not know how to use a defibrillator, attempts to use it anyway, and fails.
▪ During a shoulder
dystocia event, the
critical situation is
recognized, and event
manager calls for help
▪ Wrong blood type
delivered, attempt
made by team
member to administer
the blood but another
team member
recognizes the
mistake and stops the
transfusion before it
starts
▪ Team member
consults with the
event manager before
administering
medication
6. External Team Support
Work team provides the “external team” (family members and / or other health care professionals) with information and support as
needed.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team fails to
recognize or interact
with other significant
people who are
present during the
encounter
▪ Team recognizes other significant people who are present during the encounter but fails to interact with them
▪ Team inconsistently
interacts with other
significant people who
are present during the
encounter
▪ Team interacts with
other significant
people who are
present during the
encounter
▪ Team effectively
interacts with other
significant people who
are present during the
encounter
Examples ▪ Team fails to interact
with a distraught
family member and/or
para-professional
▪ Team fails to interact
appropriately with a
distraught family
member
▪ Team does not
cooperate with a
para-professional
7. Patient Support
Work team provides the patient and significant others with information and emotional support as needed.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team fails to interact
with patient if
conscious
▪ Team fails to show
empathy or respect
for a patient
(conscious or
unconscious)
▪ Team fails to provide
appropriate
information when
requested to do so
▪ Team’s interaction with the patient is minimal and, when it occurs, lacks respect or empathy
▪ Team inconsistently
shows empathy or
respect for a patient
(conscious or
unconscious)
▪ Team inconsistently
provides information
when requested to do
so
▪ Team shows empathy
toward patient
▪ Team provides
appropriate
information when
requested to do so
▪ Team demonstrates
consistent and
significant respect
and empathy for
patient
▪ Appropriate
information is
provided consistently
Examples ▪ Team deals with an
unconscious patient
with a lack of respect,
e.g. by joking about
his / her condition
▪ Charles knows that the patient is a Jehovah’s Witness and does not let the team know when a T&C is ordered.
▪ Charles lets the leader know that the patient is a Jehovah’s Witness and that she refused blood products.
8. Mutual Trust and Respect
The team demonstrates civility, courtesy and trust in collective judgment.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team exhibits e.g.
rudeness, overt
distrust/mistrust,
anger or overt doubt
or suspicion toward
each other
▪ Few team members
exhibit rudeness,
overt distrust, anger
or suspicion toward
each other
▪ Team inconsistently demonstrates respect; rudeness, distrust or anger toward each other at times
▪ Team exhibits e.g.
civility, courtesy, and
trust in collective
judgment
▪ Team is significantly
respectful of each
other
▪ Praise when
appropriate
Examples ▪ Angry, stressed event
manager says to team
member, “I can’t
believe you can’t
intubate the patient.
What’s the matter with
you?”
▪ Team member says to another, “You don’t know what you’re doing – let me do it for you.”
▪ Event manager
recognizes a chest
tube is needed, and
barks, “Michelle, I
want you to put in a
chest tube, I want you
to do it now, and I
want you to do it right
on your first attempt.”
▪ Leader overbearing
and intimidating
▪ Stressed but
composed leader
recognizes a team
member cannot
intubate the patient
and offers assistance
▪ Team member says
to another, “Are you
OK? Let me know if I
can help you.”
▪ Event manager recognizes a chest tube is needed and says, “Michelle, this patient needs a chest tube – can you put it in now?”
▪ Leader is clear, direct,
and calm.
▪ Team members will
thank each other
when appropriate.
9. Flexibility
The team adapts to challenges, multitasks effectively, reallocates functions, and uses resources effectively; team self-correction.
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team rigidly adheres
to individual team
roles
▪ Inefficient resource
allocation / use
▪ Minimal adaptability
and/or hesitation to
changing situations
▪ Team can adapt to
certain situations, but
not all
▪ Generally very flexible
▪ Multi-tasks effectively
▪ Reallocates functions
▪ Uses resources
effectively
▪ Team adapts to challenges consistently
▪ Engages in self-correction
Examples ▪ Ambu-bag not
working, and no
reallocation of
resources established
▪ Team members stay
in individual roles,
failing to support each
other e.g. by failing to
recognize fatigue of
those giving CPR
▪ Patient’s hysterical
family member
disrupts the team and
team continues
providing care,
ignoring disruptive
relative
▪ Ambu-bag not
working, and an
airway team member
gives mouth-to-mouth
with a mask and
event manager asks
another team member
to retrieve a working
ambu-bag
▪ Team members
alternate giving CPR,
recognizing fatigue of
those giving CPR
▪ Patient’s hysterical
family member
disrupts the team and
a team manages the
situation, e.g.
removes, counsels, or
reassures the family
member
10. Overall Team Performance
Lower Quality Higher Quality
Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Consistently
operating at a novice
training level
▪ Demonstrates
inconsistent efforts to
operate at a
functional level
▪ Inconsistently demonstrates below-average and average attributes
▪ Demonstrates
significant
cohesiveness as a
team unit;
▪ Performs proficiently
▪ Consistently operates
at an experienced
and professional
level; performs as
experts
Training
Level
▪ Team requires
training at all levels;
unable to function
independently
▪ Team needs training
at multiple levels to
function
independently
▪ Team needs focused
training to function
independently
▪ Team can function
independently with
supervision
▪ Team functions
independently
Case A – Dizziness, Acute
Student ___________________________ Student ID _________ SP ID _________
History Scoring: Give students credit (Yes) if they ask any of the following questions and / or SPs
give the following responses. If the question(s) are not asked or the response(s) not given, give no
credit (No). (A minimal scoring sketch follows the checklist below.)
HISTORY CHECKLIST Yes No
1 ONSET, e.g. “When did dizziness start?”
• “The dizziness started last night when I was cleaning up after dinner.”
2 PAST MEDICAL HISTORY OF PROBLEM, e.g. “Ever had this problem
before?”
“I almost passed out once in a restaurant a few months ago. The EMT
truck came and checked me out and they thought I was dehydrated
from exercising. I had just come from the gym.”
3 QUALITY, e.g. “Describe the dizziness.”
• “Every few minutes or so I get the feeling the room is spinning and I
feel a little nauseous, then it goes away and I feel OK. Then it starts all
over again.”
4 AGGRAVATING, e.g. “What makes the dizziness worse?”
“Standing up with my eyes open makes me feel dizzy.”
5 PALLIATIVE, e.g. “What makes the dizziness better?”
“Closing my eyes and laying down makes the dizziness better.”
6 HEAD INJURIES, e.g. “Have you bumped or injured your head?”
• “No head injuries.”
7 PAST MEDICAL HISTORY, e.g. “How is your health in general?”
“In general I’ve been very healthy.”
8 MEDICATIONS, e.g. “Are you taking any medications for this problem or
anything else?”
“I’m not taking anything. I thought of taking Dramamine but I wasn’t
sure it would help.”
9 DIET, e.g. “What do you eat in a typical day?”
“A regular diet, toast and coffee in the morning, usually take out for
lunch, Chinese, a pizza or sub, something like that, and a regular meal
at night.”
10 TOBACCO USE, e.g. “Do you smoke?”
• “I used to smoke half a pack a day, but now I’m down to 4 or 5,
sometimes a couple more if I’m stressed.”
11 ADLs, e.g. “How is this affecting your life?”
“I couldn’t go to work today.”
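The Yes / No credit rule above reduces to a simple tally. A minimal sketch, with abbreviated item names and made-up results (illustrative only), might look like this:

# Illustrative only: tally Yes/No credit for the history checklist above.
history_credit = {
    "onset": True,
    "past history of problem": True,
    "quality": True,
    "aggravating": False,        # question never asked -> no credit
    "palliative": True,
    "head injuries": True,
    "past medical history": True,
    "medications": True,
    "diet": False,
    "tobacco use": True,
    "ADLs": True,
}

score = sum(history_credit.values())
print(f"History score: {score}/{len(history_credit)}")  # -> 9/11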
Case A – Dizziness, Acute
PE SCORING:
COLUMN 1: NO CREDIT: If any box in this column is checked, the exam was done incorrectly or
incompletely; the checked “Incorrect Details” box records the reason(s) why.
COLUMN 2: FULL CREDIT: If the “Correct” box is checked, the exam was done correctly /
completely.
COLUMN 3: NO CREDIT: If the “Not Done” box is checked, the exam was not attempted at all.
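Stated as logic, a sketch of the per-item credit rule (an illustration of the three-column key above, nothing more) could read:

# Illustrative only: map the three checkbox columns to item credit.
def pe_credit(incorrect_details: bool, correct: bool, not_done: bool) -> int:
    if not_done or incorrect_details:   # columns 1 and 3: no credit
        return 0
    return 1 if correct else 0          # column 2: full credit

print(pe_credit(False, True, False))  # correct / complete -> 1
print(pe_credit(True, False, False))  # done incorrectly   -> 0
print(pe_credit(False, False, True))  # not attempted      -> 0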
Physical Examination Checklist (columns: 1 Incorrect – Details, 2 Correct, 3 Not Done)
12 Perform fundoscopic examination
Did not ask the patient to fix their gaze at a point in front of them.
Exam room not darkened.
Otoscope used instead of ophthalmoscope.
“Left eye – left hand – left eye” or “right eye – right hand – right eye” rule not followed.
Exam not bilateral.
13 Assess Cranial Nerve II – Optic – Assess Visual
Fields by Confrontation
Examiner not at approximate eye-level with
patient, and / or no eye contact.
Examiner’s hands not placed outside of patient’s
field of vision.
Did not ask “Tell me when you see my fingers.”
Did not test both upper and lower fields, and / or
bilaterally.
14 Assess Cranial Nerves II and III – Optic and
Oculomotor: Assess direct and consensual
reactions
Did not shine a light obliquely into each pupil
twice to check both the direct reaction and
consensual reaction.
Did not assess bilaterally.
15 Assess Cranial Nerves II and III – Optic and
Oculomotor: Assess near reaction and near
response
Did not test in normal room light.
Finger, pencil, etc. placed too close or too far
from the patient’s eye.
Did not ask the patient to look alternately at the
finger or pencil and into the distance.
16 Assess Cranial Nerve III – Oculomotor: Assess
convergence
Did not ask the patient to follow his / her finger or
pencil as he / she moves it in toward the bridge of
the nose.
17 Assess Cranial Nerves III, IV and VI – Oculomotor, trochlear and abducens: Assess
extraocular muscle movement
Examiner did not assess extra-ocular muscle movements in at least 6 positions of gaze using,
for example, the “H” pattern.
Did not instruct the patient not to move the head during the exam.
18 Assess Cranial Nerve VIII – Acoustic / Weber test
Did not produce a sound from tuning fork, e.g. by
not holding the fork at the base
Did not place the base of the tuning fork firmly on
top middle of the patient’s head.
Did not ask the patient where the sound appears
to be coming from.
19 Assess Cranial Nerve VIII – Acoustic / Rinne test
Did not produce a sound from the tuning fork, e.g. by not holding the fork at the base.
Did not place the base of the tuning fork against the mastoid bone behind the ear.
Did not ask the patient to say when he / she no longer hears the sound, then hold the end of
the fork near the patient’s ear and ask if he / she can hear the vibration.
Did not tap again for the second ear.
Did not assess bilaterally.
20 Assess Gait
Did not ask the patient to walk, turn and come back, looking for imbalance, postural
asymmetry and type of gait (e.g. shuffling, walking on toes, etc.).
21 Perform Romberg Test
Did not direct patient to stand with feet together,
eyes closed, for at least 20 seconds without
support.
Did not stand in a supportive position, e.g. behind
patient or with hand behind patient.
Case A – Dizziness, Acute
RELATIONSHIP QUALITY
To what degree did the student …
Lower Higher
Quality Quality
1 Establish and maintain rapport 1 2 3 4 5 6 7 8
2 Demonstrate empathy 1 2 3 4 5 6 7 8
3 Instill confidence 1 2 3 4 5 6 7 8
4 Use appropriate body language 1 2 3 4 5 6 7 8
EXAMINATION QUALITY
To what degree did the student …
Lower Higher
Quality Quality
5 Elicit information clearly, effectively 1 2 3 4 5 6 7 8
6 Actively listen 1 2 3 4 5 6 7 8
7 Provide timely feedback / information / counseling 1 2 3 4 5 6 7 8
8 Perform a thorough, careful physical exam or treatment 1 2 3 4 5 6 7 8
3. Clinical Clerkship Evaluations / NBOME Subject Exams
Data compiled from 3rd/4th year clerkships includes:
Student Performance Evaluations from specific hospitals (attending/supervising
physicians, and/or residents) based upon the 7 core Osteopathic Competencies.
Data is broken down further by student cohort (traditional, BS/DO, and Émigré)
and is quantified according to curricular track (Lecture Discussion-Based and
Doctor Patient Continuum);
NBOME Subject Exam scores for each of the (6) core clerkships and OMM.
Core clerkships include:
a) Family Medicine
b) Medicine
c) OB-GYN
d) Pediatrics
e) Psychiatry
f) Surgery
NBOME Subject Exam statistics are shared with 3rd year students as a frame of
reference to determine their performance relative to their NYCOM peers. These
data also serve as a general guide for COMLEX II CE preparation and
performance;
Students provide feedback on their clinical experiences during their clerkships,
via the “PDA project”:
a) The PDA is a tool utilized for monitoring clerkship activities. The
DEALS (Daily Educational Activities Logs Submission) focuses on
educational activities, while the LOG portion focuses on all major
student-patient encounters. A rich data set is available for comparing
patient encounters and educational activities across all sites for all
clerkships.
b) PDA data is used as a multimodal quality-assessment tool for curricular
exposure, as well as OMM integration across all hospitals (including
“outside” clerkships), for Patient Encounters and Educational Activities
(a hypothetical sketch of this kind of cross-site comparison follows this list).
Reports from student focus groups: these reports are based upon in-person group
interviews conducted by a full-time NYCOM Medical Educator; the feedback is analyzed
to ensure consistency in clerkship education and experiences, as well as to identify
program improvement indicators.
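A hypothetical sketch of the kind of cross-site comparison this data set supports follows; the field names and records are invented for illustration and are not the actual DEALS / LOG schema:

# Illustrative only: compare PDA-logged patient encounters across sites.
from collections import Counter

encounters = [  # hypothetical records; real logs carry far more detail
    {"site": "Hospital A", "clerkship": "Medicine", "omm_used": True},
    {"site": "Hospital A", "clerkship": "Medicine", "omm_used": False},
    {"site": "Hospital B", "clerkship": "Medicine", "omm_used": True},
]

per_site = Counter(e["site"] for e in encounters)
omm_rate = sum(e["omm_used"] for e in encounters) / len(encounters)
print(per_site)                                 # encounters per site
print(f"OMM integration rate: {omm_rate:.0%}")  # -> 67%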
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
Clinical Clerkship Student Performance Evaluation
Samples of the forms/questionnaires follow
NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE
OFFICE OF CLINICAL EDUCATION
Northern Boulevard – Old Westbury, NY 11568-8000
Tel.: 516-686-3718 – Fax: 516-686-3833
(*) Only ONE form, with COMPOSITE GRADE & COMMENTS, should be sent to the Hospital’s Office of
Medical Education for the DME SIGNATURE.
COURSE # _______________________________ (For NYCOM Purposes ONLY)
STUDENT (Last, First): _____________________, _______________   Class Year: ______   HOSPITAL: _______________________
ROTATION (Specialty): _____________________________   ROTATION DATES: From ____/____/____ To ____/____/____
EVALUATOR (Attending Physician / Faculty Preceptor): _________________________________________   TITLE: _______________________________________
A. Student logs by PDA:   REVIEWED (at least 10 patients)   /   NOT REVIEWED
B. Student’s unique “STRENGTHS” (Very Important –To be incorporated into the
College’s Dean’s Letter)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
____________
C. Student’s LIMITATIONS (areas requiring special attention for future professional growth)
______________________________________________________________________________
______________________________________________________________________________
____________________________________
D. For the items below, CIRCLE the most appropriate number corresponding to the
following rating scale:
Exceptional = 5   Very Good = 4   Average = 3   Marginal = 2   Failure = 1   N/A = no opportunity to observe
CORE COMPETENCY (See definitions on reverse side) RATING
Patient Care 5 4 3 2 1 N/A
Medical Knowledge 5 4 3 2 1 N/A
Practice-Based Learning & Improvement 5 4 3 2 1 N/A
Professionalism 5 4 3 2 1 N/A
System-Based Practice 5 4 3 2 1 N/A
Interpersonal and Communication Skills 5 4 3 2 1 N/A
Osteopathic Manipulative Medicine 5 4 3 2 1 N/A
OVERALL GRADE 5 4 3 2 1 (FAILURE)
Evaluator Signature: ____________________________________________________ Date: _______/________/_______
Student Signature: ____________________________________________________ Date: _______/________/_______ (Ideally at Exit Conference)
(*) DME Signature: _________________________________________________ Date: _______/________/_______
Please Return to: Hospital’s Office of Medical Education
OVER
The Seven Osteopathic Medical Competencies
Physician competency is a measurable demonstration of suitable or sufficient
knowledge, skill sets, experience, values, and behaviors that meet established
professional standards, are supported by the best available medical evidence, and are in
the best interest of the well-being and health of the patient.
Patient Care: Osteopathic patient care is the ability to effectively determine and
monitor the nature of a patient’s concern or problem; to develop, maintain, and to
bring to closure the therapeutic physician-patient relationship; to appropriately
incorporate osteopathic principles, practices and manipulative treatment; and to
implement effective diagnostic and treatment plans, including appropriate patient
education and follow-up, that are based on best medical evidence.
Medical Knowledge: Medical Knowledge is the understanding and
application of biomedical, clinical, epidemiological, biomechanical, and social and
behavioral sciences in the context of patient-centered care.
Practice-Based Learning & Improvement: Practice-Based learning
and improvement is the continuous evaluation of clinical practice utilizing evidence-based
medicine approaches to develop best practices that will result in optimal patient care
outcomes.
Professionalism: Medical professionalism is a duty to consistently demonstrate
behaviors that uphold the highest moral and ethical standards of the osteopathic profession.
This includes a commitment to continuous learning and the exhibition of personal and social
accountability. Medical professionalism extends to those normative behaviors ordinarily
expected in the conduct of medical education, training, research, and practice.
System-Based Practice: System-based practice is an awareness of and
responsiveness to the larger context and system of health care, and the ability to effectively
identify and integrate system resources to provide care that is of optimal value to individuals
and society at large.
Interpersonal & Communication Skills: Interpersonal and
communication skills are written, verbal, and non-verbal behaviors that facilitate
understanding the patient’s perspective. These skills include building the physician-patient
relationship, opening the discussion, gathering information, empathy, listening, sharing
information, reaching agreement on problems and plans, and providing closure. These skills
extend to communication with patients, families, and members of the health care team.
Osteopathic Manipulative Medicine: Osteopathic philosophy is a holistic
approach that encompasses the psychosocial, biomedical, and biomechanical aspects of both
health and disease, and stresses the relationship between structure and function, with
particular regard to the musculoskeletal system.
Definitions Provided by the National Board of Osteopathic Medical Examiners
(NBOME)
4. Student feedback (assessment) of courses / Clinical clerkship / PDA project
Data received on courses and faculty through the newly implemented, innovative
Course / Faculty Assessment program (see below: NYCOM Student Guide for
Curriculum and Faculty Assessment). Students are randomly assigned, in teams,
to evaluate one course (and its associated faculty) during the two-year pre-clinical
curriculum. The outcome of each student-team assessment is presented to the
Curriculum Committee in the form of a one-page Comprehensive Report;
Clerkship Feedback (quantitative and “open-ended” feedback) provided through
“Matchstix” (a web-based feedback program): this information is shared with
NYCOM Deans and Clinical Chairs, Hospital Directors of Medical Education
(DMEs), Hospital Department Chairs and Clerkship Supervisors. The information
is also posted on the web to assist 2nd-year students in choosing 3rd-year Core
Clerkship sites (transparency). The data is further used in two-year comparisons
of quantitative ratings and student feedback shared with NYCOM Deans and
Chairs, as well as Hospital DMEs (an illustrative sketch of such a comparison
follows this list);
Clerkship Feedback via PDA: quantitative and open-ended (qualitative) feedback
on all clerkships is collected via student PDA submission. The information is
utilized as a catalyst for clerkship quality enhancement. This data-set is used as a
multimodal quality assessment tool for curricular exposure as well as OMM
integration across all hospitals (including “outside” clerkships) for Patient
Encounters and Educational Activities;
Reports from student focus groups: these reports are based upon in-person group
interviews conducted by a full-time NYCOM Medical Educator; the feedback is analyzed
to ensure consistency in clerkship education and experiences, as well as to identify
program improvement indicators;
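An illustrative sketch of such a two-year comparison follows; the item names and means are invented for the example:

# Illustrative only: year-over-year deltas on clerkship feedback items.
year_means = {
    "2008": {"adequate supervision": 3.9, "committed to teaching": 4.2},
    "2009": {"adequate supervision": 4.1, "committed to teaching": 4.0},
}

for item in year_means["2008"]:
    delta = year_means["2009"][item] - year_means["2008"][item]
    print(f"{item}: {delta:+.1f} vs. prior year")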
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
NYCOM Student Guide for Curriculum and Faculty Assessment
Clerkship (site) feedback from Clerkship students
Clinical Clerkship Focus Group Form
4th Year PDA Feedback Questionnaire
Student End-of-Semester Program Evaluations (DPC)
DPC Program Assessment Plan
Osteopathic Manipulative Medicine (OMM) Assessment Forms
Samples of the forms/questionnaires follow
Site Feedback
Rotation: Surgery
Site: (*) MAIMONIDES MEDICAL CENTER
This is an anonymous feedback form. No student identification data is transmitted.
Questions marked with * are mandatory.
Section I. Please respond to each statement in this section according to the following
scale.
STRONGLY DISAGREE <-> STRONGLY AGREE
1* There were adequate learning opportunities (teaching patients, diversity of pathology and
diagnostic procedures)
Strongly Disagree Disagree Neutral Agree Strongly Agree
2* There were opportunities to practice osteopathic diagnosis and therapy
Strongly Disagree Disagree Neutral Agree Strongly Agree
3* There was adequate supervision and feedback (e.g., reviews of my H&P, progress notes and
clinical skills)
Strongly Disagree Disagree Neutral Agree Strongly Agree
4* I had the opportunity to perform procedures relevant for my level of training
Strongly Disagree Disagree Neutral Agree Strongly Agree
5* I was evaluated fairly for my level of knowledge and skills
Strongly Disagree Disagree Neutral Agree Strongly Agree
6* Attending physicians and/or house staff were committed to teaching
Strongly Disagree Disagree Neutral Agree Strongly Agree
7* Overall, I felt meaningfully engaged and well integrated with the clinical teams (e.g., given
sufficient patient care responsibilities)
Strongly Disagree Disagree Neutral Agree Strongly Agree
8* The DME and/or clerkship director was responsive to my needs as a student
Strongly Disagree Disagree Neutral Agree Strongly Agree
9* There were adequate library resources at this facility
Strongly Disagree Disagree Neutral Agree Strongly Agree
10* A structured program of directed readings and/or journal club was a component of this
rotation.
Strongly Disagree Disagree Neutral Agree Strongly Agree
11* The lectures were appropriate for this rotation (e.g., quality, quantity and relevance of
topics)
Strongly Disagree Disagree Neutral Agree Strongly Agree
12* Educationally useful teaching rounds were conducted on a regular basis.
Strongly Disagree Disagree Neutral Agree Strongly Agree
13* This rotation reflected a proper balance of service and education
Strongly Disagree Disagree Neutral Agree Strongly Agree
14* This rotation incorporated a psychosocial component in patient care
Strongly Disagree Disagree Neutral Agree Strongly Agree
15* Overall, I would recommend this rotation to others
Strongly Disagree Disagree Neutral Agree Strongly Agree
Section II. Psychomotor skills
Indicate the number you performed in an average week during this rotation for each of
the following:
16* History and Physicals
17* Osteopathic structural examinations
18* Osteopathic Manipulative Treatments
19* Starting IVs
20* Venipunctures
21* Administering injections
22* Recording notes on medical records
23* Reviewing X-Rays
24* Reviewing EKGs
25* Urinary catheterizations
26* Insertion and removal of sutures
27* Minor surgical procedures (assist)
28* Major surgical procedures (assist)
29* Care of dressings and drains
30* Sterile field maintenance
Section III
31* Comment on unique STRENGTHS and Positive Features of this rotation
32* Comment on the LIMITATIONS and Negative Features of this rotation
33* Comment on the extent to which the Learning Objectives for the rotation were met (e.g.,
specific topics / patient populations to which you were or were not exposed)
Section IV. Please list your clinical instructors with whom you had substantial contact
on this rotation and provide a general rating of their effectiveness as Teachers using the
scale below.
5 = EXCELLENT, 4 = VERY GOOD, 3 = AVERAGE, 2 = BELOW AVERAGE, 1 = POOR
For example – John Smith – 4
34* List clinical instructors and rating in the box below
To submit your feedback, enter your password below and then click on Submit Feedback button
Focus Groups on Clinical Clerkships
NAME OF HOSPITAL:
LOCATION:
DATE OF SITE VISIT:
The students’ comments on the clinical rotations are as follows:
(Name of Clerkship)
STRENGTHS:
WEAKNESSES:
4th Year PDA Feedback Questionnaire
1. Clinic Site
2. Rotation
3. Date
4. There were adequate learning opportunities
5. There were opportunities to practice Osteopathic diagnosis & therapy
6. I was evaluated fairly for my level of knowledge and skills
7. Attending physicians and/or house staff were committed to teaching
8. Overall, I felt meaningfully engaged and well integrated with the clinical teams
9. The DME and/or clerkship director was responsive to my needs as a student
10. This rotation reflected a proper balance of service and education
11. Overall, I would recommend this clerkship to others
12. Comments
13. Strengths/Positive Features of Rotation
14. Limitations/Negative Features of Rotation
15. List and Rate Clinical Instructors
Student End-of-Semester Program Evaluations
The DPC Student End-of-Semester Program Evaluation is an assessment of
each course that occurred during the semester and the corresponding faculty
members.
DPC END OF SEMESTER EVALUATION
Directions:
1. Please write in your year of graduation here: .
2. Enclosed you will find a blank scantron sheet.
3. Please make sure that you are using a #2 pencil to fill in your answers.
4. Please fill in the following Test Form information on the Scantron Sheet:
DPC Class 2011 – Bubble in Test Form A
DPC Class 2012 – Bubble in Test Form B
5. No other identifying information is necessary.
6. Please complete each of the following numbered sentences throughout
this evaluation using the following responses:
A. Excellent – couldn’t be better
B. Good – only slight improvement possible
C. Satisfactory – about average
D. Fair – some improvement needed
E. Poor – considerable improvement needed
7. There are spaces after each section in which you can write comments.
(When making comments, please know that your responses will be shared with DPC faculty,
Dept. chairs, and deans, as part of ongoing program evaluation.)
BIOPSYCHOSOCIAL SCIENCES COURSE EVALUATION:
I. CASE STUDIES COMPONENT
Excellent   Good   Satisfactory   Fair   Poor
1. This course, overall is A B C D E
2. My effort in this course, overall is A B C D E
3. The case studies used in small
group are A B C D E
4. My preparation for each group
session was A B C D E
5. Other available resources for use in
small group are A B C D E
6. Facilitator assessments are A B C D E
7. Self assessments are A B C D E
8. Content Exams – midterm and final
are A B C D E
9. The group process in my group can
be described as A B C D E
10. The wrap-ups in my group were A B C D E
11. The quality of the learning issues
developed by my group was A B C D E
Overall comments on Case Studies
II. STUDENT HOUR COMPONENT:
Excellent   Good   Satisfactory   Fair   Poor
12. The monthly student hours are A B C D E
Overall Comments On The Student Hour
III. FACILITATOR RATINGS
Please circle your group number/the name of your group facilitator(s).
Group Facilitators
A Dr. _____________________ and Dr. _____________________
B Dr. _____________________ and Dr. _____________________
C Dr. _____________________ and Dr. _____________________
D Dr. _____________________ and Dr. _____________________
Please bubble in your response to each of the following items (a brief sketch of this
bubble-to-weight mapping follows this section):
Strongly Agree   Agree   Disagree   Strongly Disagree
13. Maintained appropriate directiveness 5 (A) 4 (B) 2 (C) 1 (D)
14. Supported appropriate group process 5 (A) 4 (B) 2 (C) 1 (D)
15. Supported student-directed learning 5 (A) 4 (B) 2 (C) 1 (D)
16. Gave appropriate feedback to group 5 (A) 4 (B) 2 (C) 1 (D)
17. Ensured that learning issues were appropriate 5 (A) 4 (B) 2 (C) 1 (D)
18. Overall, these facilitators were
effective 5 (A) 4 (B) 2 (C) 1 (D)
Overall Facilitator Comments
(Comments on individual facilitators are welcome)
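Note that the facilitator items above pair the scantron bubbles with non-uniform weights (the scale skips 3). A minimal sketch of that mapping, illustrative only:

# Illustrative only: the A-D bubbles map to weights 5, 4, 2, 1 (no 3).
FACILITATOR_SCALE = {
    "A": ("Strongly Agree", 5),
    "B": ("Agree", 4),
    "C": ("Disagree", 2),
    "D": ("Strongly Disagree", 1),
}

responses = ["A", "B", "B", "C"]                  # bubbles for one item
weights = [FACILITATOR_SCALE[r][1] for r in responses]
print(sum(weights) / len(weights))                # item mean -> 3.75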
IV. PROBLEM SETS/DISCUSSION SESSIONS COMPONENT
A. Course Evaluation:
Excellent   Good   Satisfactory   Fair   Poor
19. These sessions, overall were A B C D E
20. My effort in these sessions, overall
was A B C D E
21. The organization of these sessions
was A B C D E
22. Handouts in general were A B C D E
Problem Sets/Discussion Sessions Comments
(Please comment as to whether problem sets were too many, too few, too involved.)
V. PROBLEM SETS/DISCUSSION SESSIONS COMPONENT
B. Presenter Evaluation:
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
23. The Problem Set topic on ______________ was A B C D E
24. The instructor, ______________, for the problem set named in #23 was A B C D E
25. The Problem Set topic on ______________ was A B C D E
26. The instructor, ______________, for the problem set named in #25 was A B C D E
27. The Problem Set topic on ______________ was A B C D E
28. The instructor, ______________, for the problem set named in #27 was A B C D E
29. The Problem Set topic on ______________ was A B C D E
30. The instructor, ______________, for the problem set named in #29 was A B C D E
31. The Problem Set topic on ______________ was A B C D E
32. The instructor, ______________, for the problem set named in #31 was A B C D E
Problem Sets/Discussion Sessions Comments
(Comments on individual instructors are welcome)
VI. ANATOMY COMPONENT
A. Course Evaluation:
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
33. This component, overall, was A B C D E
34. My effort in this component was A B C D E
35. My preparation for each lab session was A B C D E
36. Organization of the component was A B C D E
37. Quizzes were A B C D E
38. Resource Hour / Reviews were A B C D E
Anatomy Component Comments
VII. ANATOMY COMPONENT
B. Teaching Evaluation:
Please bubble in your response to each of the following items:
Strongly Agree (A)  Agree (B)  Disagree (C)  Strongly Disagree (D)
39. The faculty were available to answer questions in the lab 5 (A) 4 (B) 2 (C) 1 (D)
40. The faculty initiated student discussion 5 (A) 4 (B) 2 (C) 1 (D)
41. The faculty were prepared for each lab session 5 (A) 4 (B) 2 (C) 1 (D)
42. The faculty gave me feedback on how I was doing 5 (A) 4 (B) 2 (C) 1 (D)
43. The faculty were enthusiastic about the course 5 (A) 4 (B) 2 (C) 1 (D)
44. Overall, the instructors were effective 5 (A) 4 (B) 2 (C) 1 (D)
Anatomy Component Comments
(Comments on individual instructors are welcome)
CLINICAL SCIENCES COURSE
I. CLINICAL SKILLS LAB COMPONENT
A. Course Evaluation:
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
45. This component, overall, was A B C D E
46. My effort in this component was A B C D E
47. My preparation for each lab session was A B C D E
48. Organization of the component was A B C D E
49. Examinations were A B C D E
50. Handouts/PowerPoints were A B C D E
51. I would rate my physical exam and history-taking skills at this time to be A B C D E
Overall Comments on Clinical Skills Component / Individual Labs
(Comments on individual instructors are welcome)
I. CLINICAL SKILLS LAB COMPONENT
B. Teaching Evaluation:
Please bubble in your response to each of the following items:
Strongly Agree (A)  Agree (B)  Disagree (C)  Strongly Disagree (D)
52. The faculty were available to answer questions in the lab 5 (A) 4 (B) 2 (C) 1 (D)
53. The faculty initiated student discussion 5 (A) 4 (B) 2 (C) 1 (D)
54. The faculty were prepared for each lab session 5 (A) 4 (B) 2 (C) 1 (D)
55. The faculty gave me feedback on how I was doing 5 (A) 4 (B) 2 (C) 1 (D)
56. The faculty were enthusiastic about the course 5 (A) 4 (B) 2 (C) 1 (D)
57. Overall, the instructors were effective 5 (A) 4 (B) 2 (C) 1 (D)
Overall Comments on Clinical Skills Component / Individual Labs
(Comments on individual instructors are welcome)
II. OMM COMPONENT
A. Course Evaluation:
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
58. This component, overall, was A B C D E
59. My effort in this component was A B C D E
60. My preparation for each lab session was A B C D E
61. Organization of the component was A B C D E
62. Presentations / Lectures were A B C D E
63. Handouts were A B C D E
64. Quizzes were A B C D E
65. Practical exams were A B C D E
66. Resource Hour / Reviews were A B C D E
Overall Comments on OMM Component / Individual Labs
(Comments on individual instructors are welcome)
II. OMM COMPONENT
B. Teaching Evaluation
Please bubble in your response to each of the following items:
Strongly Agree (A)  Agree (B)  Disagree (C)  Strongly Disagree (D)
67. The faculty were available to answer questions in the lab 5 (A) 4 (B) 2 (C) 1 (D)
68. The faculty initiated student discussion 5 (A) 4 (B) 2 (C) 1 (D)
69. The faculty were prepared for each lab session 5 (A) 4 (B) 2 (C) 1 (D)
70. The faculty gave me feedback on how I was doing 5 (A) 4 (B) 2 (C) 1 (D)
71. The faculty were enthusiastic about the course 5 (A) 4 (B) 2 (C) 1 (D)
72. Overall, the instructors were effective 5 (A) 4 (B) 2 (C) 1 (D)
Overall Comments on OMM Component / Individual Labs
(Comments on individual instructors are welcome)
III. ICC COMPONENT
A. Course Evaluation:
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
73. This component, overall, was A B C D E
74. My effort in this component was A B C D E
75. My preparation for each lab session was A B C D E
76. Organization of this component was A B C D E
77. The helpfulness/usefulness of the ICC standardized patient encounters was A B C D E
78. The helpfulness/usefulness of the ICC robotic patient encounters was A B C D E
79. Are Clinical Skills laboratory exercises appropriate for the ICC? [A] YES [B] NO
Overall Comments on the ICC Component
(Comments on individual instructors are welcome)
IV. CLINICAL PRACTICUM COMPONENT
80. I participated in Clinical Practicum this semester: [A] YES [B] NO
If you answered NO, you have finished this evaluation; if you answered YES, please continue this questionnaire until the end. Thank you.
A. Course Evaluation
Excellent (A)  Good (B)  Satisfactory (C)  Fair (D)  Poor (E)
81. This component, overall, was A B C D E
82. My effort in this component was A B C D E
83. My preparation for each lab session was A B C D E
84. Organization of this component was A B C D E
85. The helpfulness/usefulness of the Clinical Practicum was A B C D E
86. The organization of the case presentations was A B C D E
87. Are Clinical Skills laboratory exercises appropriate for the Clinical Practicum? [A] YES [B] NO
Please bubble in your response to each of the following items:
Strongly Agree (A)  Agree (B)  Disagree (C)  Strongly Disagree (D)
88. The case presentation exercise was a valuable learning experience 5 (A) 4 (B) 2 (C) 1 (D)
Overall Comments on Clinical Practicum Course
IV. CLINICAL PRACTICUM COMPONENT
B. Mentor Evaluation:
Please bubble in your response to each of the following items:
Strongly Agree (A)  Agree (B)  Disagree (C)  Strongly Disagree (D)
89. The preceptor was available to answer my questions 5 (A) 4 (B) 2 (C) 1 (D)
90. I was supported in my interaction with patients 5 (A) 4 (B) 2 (C) 1 (D)
91. Student-directed learning was supported 5 (A) 4 (B) 2 (C) 1 (D)
92. I had appropriate feedback 5 (A) 4 (B) 2 (C) 1 (D)
93. Overall, this preceptor/site was effective 5 (A) 4 (B) 2 (C) 1 (D)
Preceptor Name _______________________
Overall Comments on Clinical Practicum Mentor
(Comments on individual instructors are welcome)
DPC: Program Assessment Plan
I. Pre-matriculation Evaluation – What determines whether an applicant will choose the DPC program?
Comparison of the students who chose the LDB program vs. the DPC program with regard to the following outcome measures:
GPA (overall, science)
MCAT scores
Gender
Age
Race
College size
College geographic location
Prior PBL exposure
OMM understanding
Research background
Volunteer work
Employment experience
Graduate degree
Scholarships/awards
II. Years at NYCOM – How do we evaluate whether the DPC program is accomplishing its goals while the students are at NYCOM?
Comparison of Facilitator Assessments for each term, to monitor student growth
Comparison of Clinical Practicum Mentor Evaluations from Term 2 and Term 3, to evaluate students' progress in their clinical experience
Comparison of content exam scores from terms 1 through 4
Comparison of entrance questionnaire responses (administered during the first week of medical school) to corresponding exit questionnaire responses (administered at the end of year 4)
Evaluation of the Student DPC End-of-Term Evaluations
Comparison of the following measures to those outcomes achieved by the students in
the LDB program:
OMM scores
Anatomy scores
ICC PARS scores
ICC OSCE scores
Summer research
Summer volunteerism
Research effort (publications, abstracts, posters, presentations)
Shelf exams
COMLEX I, II, III scores and pass rates
Fellowships (Academic, Research)
III. Post-Graduate Training/Practice – What happens to DPC students once they leave NYCOM? How do they compare to students who matriculated through the LDB program?
Comparison of the following measures to those outcomes achieved by the students in
the LDB program:
Internships
Residencies
Fellowships
Specialty (medicine)
Specialty board certifications
AOA membership
AMA membership
Publications
Research
Teaching
OMM Assessment Forms
5. COMLEX USA Level I, Level II CE & PE, and Level III data (NBOME)
a) First-time and overall pass rates and mean scores;
b) Comparison to national averages;
c) Comparison to college (NYCOM) national ranking.
Report provided by Associate Dean for Academic Affairs
6. Residency match rates and overall placement rate
Data compiled as received from the American Osteopathic Association (AOA) and
the National Resident Matching Program (NRMP).
Report provided by Associate Dean for Clinical Education
7. Feedback from (AACOM) Graduation Questionnaire
Annual survey report received from AACOM comparing NYCOM graduates' responses to numerous questions/categories (including demographics, specialty choice, overall perception of pre-doctoral training, indebtedness, and more) with the responses of graduating classes at osteopathic medical schools nationwide.
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
AACOM Survey of Graduating Seniors
Samples of the forms/questionnaires follow
8. Completion rates (post-doctoral programs)
Percent of NYCOM graduates completing internship/residency training programs.
Report provided by Office of Program Evaluation and Assessment
9. Specialty certification and licensure
Data compiled from state licensure boards and specialty certification organizations (board certification) on NYCOM graduates.
Report provided by Office of Program Evaluation and Assessment
10. Career choices and geographic practice location
Data include practice type (academic, research, clinical, and so on) and practice location, obtained from licensure boards as well as the NYCOM Alumni Survey.
Report provided by Office of Program Evaluation and Assessment
11. Alumni Survey
Follow-up survey periodically sent to alumni requesting information on topics such as practice location, specialty, residency training, board certification and so on.
Specific forms/questionnaires utilized to capture the above-detailed information include the
following:
Alumni Survey
Samples of the forms/questionnaires follow
ALUMNI SURVEY
NAME
LAST FIRST NYCOM CLASS YEAR
HOME ADDRESS
PRACTICE ADDRESS
HOME PHONE ( ) OFFICE PHONE ( )
E-MAIL ADDRESS
INTERNSHIP HOSPITAL _______________________ RESIDENCY HOSPITAL _______________________ FIELD OF STUDY _______________________
FELLOWSHIPS COMPLETED:
CERTIFICATIONS YOU HOLD:
IF SPOUSE IS ALSO A NYCOM ALUMNUS, PLEASE INDICATE SPOUSE’S NAME AND CLASS YEAR:
EXCLUDING INTERNSHIP, RESIDENCY AND FELLOWSHIP, HAVE YOU EARNED ANY ADDITIONAL ACADEMIC DEGREES OR CERTIFICATES BEYOND YOUR MEDICAL DEGREE (E.G., MPH, MBA, MHA, PHD, MS)? (PLEASE LIST)
CURRENT PRACTICE STATUS: FULL-TIME PRACTICE___ PART-TIME PRACTICE _____ INTERN/RESIDENCY _____ RETIRED/NOT PRACTICING _____
What specialty do you practice most
frequently? (Choose one)
Allergy and Immunology
Anesthesiology
Cardiology
Colorectal Surgery
Dermatology
Emergency Medicine
Endocrinology
Family Practice
Gastroenterology
Geriatrics
Hematology
Infectious Diseases
Internal Medicine
Neonatology
Nephrology
Neurology
Nuclear Medicine
Obstetrics & Gynecology
Occupational Medicine
Ophthalmology
Oncology
Otolaryngology
Orthopedic Surgery
Psychiatry
Pediatrics
Plastic/Recon. Surgery
Physical Medicine/Rehab
Pathology
Pulmonary Medicine
Radiology
Rheumatology
Surgery (general)
Thoracic Surgery
Radiation Therapy
Urology
Other (Please specify)
____________________
Current military status (if applicable):
Active Duty
Inactive Reserve
Active Reserve
What is the population of the
geographic area of your practice?
(Choose one)
5,000,000 +
1,000,000 – 4,999,999
500,000 – 999,999
250,000 – 499,999
100,000 – 249,999
50,000 – 99,999
25,000 – 49,999
10,000 – 24,999
5,000 – 9,999
Less than 5,000
How would you describe this
geographic area? (Choose one)
Inner City
Urban
Suburban
Small Town – Rural
Small Town – Industrial
Other ______________________
What functions do you perform in
your practice? (check all that apply)
Preventive care/patient education
Acute care
Routine/non-acute care
Consulting
Supervisory/managerial responsibilities
Research
Teaching
Hospital Rounds
What best describes the setting in
which you spend the most time?
Intensive Care Unit of Hospital
Inpatient Unit of Hospital (not ICU/CCU)
Outpatient Unit of Hospital
Hospital Emergency Room
Hospital Operating Room
Freestanding Urgent Care Center
Freestanding Surgical Facility
Nursing Home or LTC Facility
Solo practice physician office
Single Specialty Group practice physician
office
Multiple Specialty Group practice physician
office
University Student Health facility
School-based Health center
HMO facility
Rural Health Clinic
Inner-city Health Center
Other Community Health Center
Other Freestanding Outpatient facility
Correctional facility
Industrial facility
Mobile Health Unit
Other (Please specify)
__________________________________
Do you access medical information
via the internet?
Never
Sometimes
Often
What percent of your time is spent in primary
care? (family medicine or gen. internal medicine)
0%
1 – 25%
26 – 50%
51 – 75%
76 – 100%
What percent of your practice is outpatient?
0%
1 – 25%
26 – 50%
51 – 75%
76 – 100%
Do you engage in any of the following
activities? (check all that apply)
Professional organization
leadership position
Volunteer services in the
community
School or team physician
Free medical care
Leadership in church,
congregation
Local government
Speaking on medical
topics to community
groups
How many CME programs or other
professional training sessions did you
attend last year?
none
1 – 5
6 – 10
11 – 15
more than 15
Have you ever done any
of the following?
Author or co-author
a professional paper
Contribute to an article
Direct a research project
Participate in clinical
research
Present a lecture at a
professional meeting or
CME program
Serve on a panel
discussion at a
professional meeting
How often do you read
medical literature regarding
new research findings?
Rarely
Several times a year
Monthly
Weekly
Daily
How frequently do you apply
osteopathic concepts into
patient care?
Never
Rarely
Often
Always
In your practice do you employ any of
the following?
(check all that apply)
Structural examination or
musculoskeletal
considerations in
diagnosis
Indirect OMT techniques
High Velocity OMT
Myofascial OMT
Cranial OMT
Palpatory diagnosis
Please indicate how important each of the following skills has been in your success as a physician, and how well NYCOM prepared you in that skill. For each skill, mark one rating (Strong / Moderate / Weak) under each of two headings: "How important to my practice" and "How well NYCOM prepared me."
Biomedical science knowledge base
Clinical skills
Patient educator skills
Empathy and compassion for patients
Understanding of cultural differences
Osteopathic philosophy
Clinical decision making
Foundation of ethical standards
Ability to communicate with other health care providers
Ability to communicate with patients and families
Knowing how to access community resources
Ability to understand and apply new medical information
Understanding of the payor/reimbursement system
Ability to search and retrieve needed information
Manipulative treatment skill
Ability to use medical technology
Diagnostic skill
Skill in preventive care
Understanding of public health issues & the public health system
Professionalism
Please return to:
NYCOM of NYIT, Office of Alumni Affairs
Northern Boulevard, Serota Bldg., Room 218
Old Westbury, New York 11568
or
fax to (516) 686-3891 or (516) 686-3822
as soon as possible.
Thank you for your cooperation!
NYCOM Benchmarks
1-Applicant Pool
Benchmark: To maintain relative standing among Osteopathic Medical Colleges based on
the number of applicants.
2-Admissions Profile
Benchmark: Maintain or improve the current admissions profile based on academic criteria such as MCAT scores, GPA, and colleges attended.
3-Academic Attrition Rates
Benchmark: To maintain or improve upon our current 3% academic attrition rate.
4-Remediation rates (pre-clinical years)
Benchmark: A 2% per year reduction in the number of students remediating in the pre-clinical years.
5-COMLEX USA Scores
Benchmark: Top quartile in the National Ranking of 1st time pass rate and Mean Score.
6-Students entering Osteopathic Graduate Medical Education (OGME)
Benchmark: Maintain or improve the current OGME placement.
7-Graduates entering Primary Care (PC) 12
Benchmark: Maintain or improve the current Primary Care placement.
8-Career Data – Licensure (within 3 years, post-graduate), Board Certification, Geographic
Practice, and Scholarly Achievements.
Benchmark: TBD
12 Family Medicine, Internal Medicine, and Pediatrics
BIBLIOGRAPHY
Gonnella, J.S., Hojat, M., & Veloski, J.J. Jefferson Longitudinal Study of Medical Education. Retrieved December 17, 2008, from http://jdc.jefferson.edu/jlsme/1
Hernon, P., & Dugan, R.E. (2004). Outcomes Assessment in Higher Education. Westport, CT: Libraries Unlimited.
APPENDICES
NEUROLOGICAL EXAMINATION
©2009 New York College of Osteopathic Medicine 011509
1 Assess Cranial Nerve I – Olfactory
Examiner checks the patient's sense of smell using, for example, coffee, soap, peppermint, or orange peels.
2 Assess Cranial Nerve II – Optic: Assessing Visual Fields by Confrontation
Examiner stands at approximate eye level with the patient, making eye contact. The patient is then asked to return the examiner's gaze, e.g. by saying "Look at me." The examiner starts by placing his/her hands outside the patient's field of vision, lateral to the head, then, with fingers wiggling (so the patient can easily see them), brings the fingers into the patient's field of vision (hands diagonal, or hands horizontal). The examiner must ask the patient, "Tell me when you see my fingers." Assess upper, middle and lower fields, bilaterally.
3 Assess Cranial Nerve II – Optic: Assessing Visual Acuity
For ICC purposes, use the handheld Rosenbaum Pocket Screener (eye chart). NOTE: Use a Snellen eye chart if the patient stands 20 feet from the chart. Ask the patient to cover one eye while testing the other eye. The Rosenbaum eye chart is held in good light approximately 14 inches from the eye. Determine the smallest line of print from which the patient can read more than half the letters. The patient's visual acuity is recorded as two numbers, e.g. "20/30," where the top number is the distance of the patient from the chart and the bottom number is the distance at which a normal eye can read that line. Repeat with the other eye.
4 Assessing Cranial Nerves II and III – Optic and Oculomotor: Assessing Direct and Consensual Reactions
Examiner asks the patient to look into the distance, then shines a light obliquely into each pupil twice to check both the direct reaction (pupillary constriction in the same eye) and the consensual reaction (pupillary constriction in the opposite eye). Must be assessed bilaterally.
5 Assessing Cranial Nerves II and III – Optic and Oculomotor: Assessing Near Reaction and Near Response
Assessed in normal room light, testing one eye at a time. The examiner holds a finger, pencil, etc. about 10 cm from the patient's eye and asks the patient to look alternately at the finger or pencil and then into the distance. Note pupillary constriction with near focus.
6 Assessing Cranial Nerve III – Oculomotor: Assessing Convergence
Examiner asks the patient to follow his/her finger or pencil as it is moved in toward the bridge of the nose. Converging eyes normally follow the object to within 5–8 cm of the nose.
7 Assessing Cranial Nerves III, IV and VI – Oculomotor, Trochlear and Abducens: Assessing Extraocular Muscle Movement
Examiner assesses muscle movements in at least 6 positions of gaze by tracing, for example, an "H pattern" with the hand and asking the patient to follow the hand with the eyes without turning the head.
8 Assessing Cranial Nerve V – Trigeminal (Sensory)
Examiner assesses sensation in 3 sites: ophthalmic, maxillary and mandibular. The examiner may use fingers, cotton, etc. for the assessment. Assess bilaterally.
9 Assessing Cranial Nerve V – Trigeminal (Motor)
Examiner asks the patient to move his or her jaw from side to side, OR palpates the masseter muscles and asks the patient to clench his/her teeth. Note the strength of the muscle contractions.
10 Assessing Cranial Nerve VII – Facial: Motor Testing
Examiner asks the patient to perform any 4 of the following 6 exams: raise both eyebrows; close the eyes tightly, then try to open them against the examiner's resistance; frown; smile; show upper and lower teeth; puff out the cheeks. Note any weakness or asymmetry.
11 Assessing Cranial Nerve VIII – Acoustic: Weber Test (for lateralization)
Use a 512 Hz or 1024 Hz tuning fork. The examiner starts the fork vibrating, e.g. by tapping it on the opposite hand, leg, etc. The base of the tuning fork is placed firmly on top of the patient's head, and the patient is asked, "Where does the sound appear to be coming from?" (Normally it will be sensed in the midline.)
12 Assessing Cranial Nerve VIII – Acoustic: Rinne Test (to compare air and bone conduction)
Use a 512 Hz or 1024 Hz tuning fork. The examiner starts the fork vibrating, e.g. by tapping it on the opposite hand, leg, etc., and places the base of the fork against the mastoid bone behind the ear. The patient is asked to say when he/she no longer hears the sound. When the sound is no longer heard, the examiner moves the tuning fork (without re-striking it), holds it near the patient's ear, and asks if he/she can hear the vibration. The examiner must vibrate the tuning fork again for the second ear. Bilateral exam. NOTE: Normally, air conduction is greater than bone conduction (AC > BC).
13 Assessing Cranial Nerve VIII – Acoustic: Gross Auditory Acuity
Examiner asks the patient to occlude (cover) one ear, then whispers words or numbers into the non-occluded ear from approximately 2 feet away and asks the patient to repeat the whispered words or numbers. Compare bilaterally.
OR: The examiner asks the patient to occlude (cover) one ear, rubs thumb and forefinger together next to the patient's non-occluded ear, and asks the patient if the sound is heard. Compare bilaterally.
14 Assessing Cranial Nerves IX and X – Glossopharyngeal and Vagus: Motor Testing
First, the examiner asks the patient to swallow. Next, the patient is asked to say "aah" while the examiner observes for symmetrical movement of the soft palate or a deviation of the uvula. OPTIONAL: Use a light source to help visualize the palate and uvula. NOTE: The sensory component of cranial nerves IX and X is tested via the "gag reflex."
15 Assessing Cranial Nerve XI – Spinal Accessory: Motor Testing
Examiner asks the patient to shrug his/her shoulders up against the examiner's hands; apply resistance. Note the strength and contraction of the trapezius muscles. Next, the patient is asked to turn his or her head against the examiner's hand; apply resistance. Observe the contraction of the opposite sternocleidomastoid muscle. Assess bilaterally.
16 Assessing Cranial Nerve XII – Hypoglossal: Motor Testing
First, the examiner inspects the patient's tongue as it lies on the floor of the mouth; note any asymmetry, atrophy or fasciculations. Next, the patient is asked to protrude the tongue; note any asymmetry, atrophy or deviation from the midline. Finally, the patient is asked to move the tongue from side to side; note any asymmetry of the movement.
17 Assessing Lower Extremities – Motor Testing
With the patient in the supine position, test bilaterally. Test flexion of the hip by placing your hand on the patient's thigh and asking the patient to raise the leg against resistance. Test extension of the hip by having the patient push the posterior thigh against your hand. (Continued in item 18.)
18 Assessing Lower Extremities – Motor Testing (continued)
With the patient in a seated position, test bilaterally. Test adduction of the hip by placing your hands firmly between the knees and asking the patient to bring the knees together. Test abduction of the hip by placing your hands firmly outside the knees and asking the patient to spread the legs against resistance.
19 Assessing Upper Extremities – Motor Testing
Examiner asks the patient to pull (flex) and push (extend) the arms against the examiner's resistance. Bilateral exam.
20 Assessing Lower Extremities – Motor Testing
Examiner asks the patient to pull (flex) and push (extend) the legs against the examiner's resistance. Bilateral exam.
21 Assessing Lower Extremities – Motor Testing
Examiner asks the patient to dorsiflex and plantarflex the ankle against resistance. Compare bilaterally.
22 Assessing the Biceps Reflex
Examiner partially flexes the patient's arm, then strikes the biceps tendon with the reflex hammer (pointed or flat end) with enough force to elicit a reflex but not so much as to cause patient discomfort. OPTIONAL: Examiner places the thumb or a finger firmly on the biceps tendon and strikes it with the pointed end of the reflex hammer only. Reflexes must be assessed bilaterally. The examiner must produce a reflex for credit.
23 Assessing the Triceps Reflex
Examiner flexes the patient's arm at the elbow, then taps the triceps tendon with the reflex hammer. Reflexes must be assessed bilaterally. The examiner must produce a reflex for credit.
24 Assessing the Brachioradialis Reflex
With the patient's hand resting in a relaxed position, e.g. on a table, in his/her lap or supported by the examiner's arm, the examiner strikes the radius about 1 or 2 inches above the wrist with the reflex hammer. Reflexes must be assessed bilaterally. The examiner must produce a reflex for credit.
25 Assessing the Patellar Tendon Reflex
The patient is asked to sit with the legs dangling off the exam table. Reflexes are assessed by striking the patient's patellar tendon with the reflex hammer on skin. Reflexes must be assessed bilaterally. The examiner must produce a reflex for credit. OPTIONS: The examiner can place his/her hand on the patient's quadriceps, but this is optional. The patient's knees can be crossed.
26 Assessing the Achilles Reflex
Examiner dorsiflexes the patient's foot at the ankle. The Achilles tendon is struck with the reflex hammer on skin, with socks completely off (removed at the direction of the examiner). Reflexes must be assessed bilaterally. The examiner must produce a reflex for credit.
27 Assessing the Plantar, or Babinski, Response
Examiner strokes the lateral aspect of the sole from the heel to the ball of the foot, curving medially across the ball, with an object such as the end of a reflex hammer. On skin, with socks completely off (removed at the direction of the examiner). The assessment must be done bilaterally. Note the movement of the toes (normally the toes curl downward).
28 Assessing Rapid Alternating Movements
Examiner must do all three assessments for credit:
Pronate/Supinate – Examiner directs the patient to pronate and supinate one hand rapidly on the other.
Touching Thumbs Rapidly – Patient is directed to touch his/her thumb rapidly to each finger of the same hand, bilaterally.
Slapping Thighs Rapidly – Patient is directed to slap his/her thigh rapidly with the back of the hand and then with the palm of the hand, bilaterally.
29 Assessing Finger-to-Nose Movements
Examiner directs the patient to touch the examiner's finger with his or her finger and then to place that finger on his or her own nose. The examiner moves his/her finger randomly during multiple movements.
30 Assessing Gait
Examiner asks the patient to perform the following:
Walk, turn and come back. Note imbalance, postural asymmetry, type of gait (e.g. shuffling, walking on toes), swinging of the arms, and how the patient negotiates turns.
Heel-to-toe (tandem) walking. Note any ataxia not previously obvious.
Shallow knee bend. Difficulties here suggest proximal weakness (extensors of the hip), weakness of the quadriceps (the extensor of the knee), or both.
31 Performing the Romberg Test
Examiner directs the patient to stand with feet together and eyes closed for at least 20 seconds without support. During this test, the examiner must stand behind the patient to provide support in case the patient loses his/her balance.
32 Testing for Pronator Drift
Examiner directs the patient to stand with eyes closed, simultaneously extending both arms with palms turned upward, for at least 20 seconds. During this test, the examiner must stand behind the patient to provide support in case the patient loses his/her balance.
SPECIAL TESTING
1 Sensory Testing
First, the examiner demonstrates what sharp vs. dull means by brushing the patient with a soft object, e.g. a cotton ball or the smooth end of a tongue depressor, and a semi-sharp object, e.g. the broken end of a tongue depressor. The examiner then performs the test on the arms and legs bilaterally by randomly brushing the patient's skin with the soft and semi-sharp objects. The patient is directed to keep his/her eyes closed during the examination as he or she identifies sharp vs. dull on the skin. Bilateral exam, upper and lower extremities.
TASKFORCE MEMBERS
John R. McCarthy, Ed.D. Associate Director, Clerkship Education
Pelham Mead, Ed.D. Director, Faculty Development
Mary Ann Achziger, M.S. Associate Dean, Student Affairs
Felicia Bruno, M.A. Assistant Dean, Student Administrative
Services/Alumni Affairs/Continuing Education
Claire Bryant, Ph.D. Assistant Dean, Preclinical Education
Leonard Goldstein, DDS, Ph.D. Director, Clerkship Education
Abraham Jeger, Ph.D. Associate Dean, Clinical Education
Rodika Zaika, M.S. Director, Admissions
Ron Portanova, Ph.D. Associate Dean, Academic Affairs