New York College of Osteopathic Medicine Learning Outcomes Assessment 2009-2010.

By Dr. Pelham Mead, Director of Assessment and Faculty Development.

Taskforce Members

John R. McCarthy, Ed.D.
Pelham Mead, Ed.D.
Mary Ann Achziger, M.S.
Felicia Bruno, M.A.
Claire Bryant, Ph.D.
Leonard Goldstein, DDS, Ph.D.
Abraham Jeger, Ph.D.
Rodika Zaika, M.S.
Ron Portanova, Ph.D.

Table of Contents
OVERVIEW 4
I. Introduction and Rationale 5
II. Purpose and Design 9
III. Specifics of the Plan 11
Mission of NYCOM 11
Learning Outcomes 11
Compiling the Data 17
Stakeholders 17
IV. Plan Implementation 18
Next Steps 18
V. Conclusion 20
A. OUTCOME INDICATORS – DETAIL 24

  1. Pre-matriculation data 24
    Forms 26
  2. Academic (pre-clinical) course-work 47
    Forms – LDB / DPC Track 49
    Forms – Institute for Clinical Competence (ICC) 55
  3. Clinical Clerkship Evaluations / NBOME Subject Exams 86
    Forms 88
  4. Student feedback (assessment) of courses/Clinical clerkships / PDA project 92
    Forms 94
  5. COMLEX USA Level I, Level II CE & PE, Level III data (NBOME) 120
  6. Residency match rates and overall placement rate 121
  7. Feedback from (AACOM) Graduation Questionnaire 122
    Forms 123
  8. Completion rates (post-doctoral programs) 142
  9. Specialty certification and licensure 143
  10. Career choices and geographic practice location 144
  11. Alumni Survey 145
    Forms 146
B. BENCHMARKS 151

Bibliography 152

Appendices: 153

Chart 1: Proposed Curriculum and Faculty Assessment Timeline
Institute for Clinical Competence: Neurological Exam – Student Version, Parts I & II
Taskforce Members

List of Tables and Figures

Figure 1 Cycle of Assessment 9
Figure 2 Outcome Assessment along the Continuum 15
Figure 3 Data Collection Phases 22
Table 1 Assessment Plan Guide 23

New York College of Osteopathic Medicine
Learning Outcomes Assessment Plan February 2009

Overview

This document was developed by the NYCOM Task Force on Learning Outcomes Assessment and was accepted by the dean in January 2009. Although a few of the assessment tools and processes described in the document are new, most have been employed at NYCOM since its inception to inform curriculum design and implementation and to gauge progress and success in meeting the institution’s mission, goals and objectives.
The Learning Outcomes Assessment Plan documents the processes and measures used by the institution to gauge student achievement and program (curricular) effectiveness. The results of these activities are used by faculty to devise ways to improve student learning and by administrators and other stakeholder groups to assess institutional effectiveness and inform planning, decision-making, and resource allocation.
Certain of the measures described in later sections of this document constitute key performance indicators for the institution, for which numerical goals have been set. Performance on these measures has a significant effect on institutional planning and decision-making regarding areas of investment and growth, program improvement, and policy.

Key performance indicators and benchmarks are summarized below and also on page 151 of the plan.
Indicator / Benchmark
• Number of Applicants: Maintain relative standing among Osteopathic Medical Colleges
• Admissions Profile: Maintain or improve current admissions profile based on academic criteria (MCAT, GPA, colleges attended)
• Attrition: 3% or less
• Remediation rate (preclinical): 2% reduction per year
• COMLEX USA scores (first-time pass rates, mean scores): Top quartile
• Students entering OGME: Maintain or improve OGME placement
• Graduates entering Primary Care careers: Maintain or improve Primary Care placement
• Career characteristics: Licensure, board certification, geographic practice, and scholarly achievements – TBD

I. Introduction and Rationale

At NYCOM we believe it is our societal responsibility to monitor the quality of our students’ education through continual assessment of educational outcomes. On-going program evaluation mandates longitudinal study (repeated observations over time) and the use of empirical data based on a scientific methodology.
At Thomas Jefferson University, an innovative study, ultimately titled the “Jefferson Longitudinal Study of Medical Education,” was implemented circa 1970.1 As a result of this longitudinal study, Thomas Jefferson University was praised by the

1 Center for Research in Medical Education and Health Care: Jefferson Longitudinal Study of Medical Education, Thomas Jefferson University, 2005.

Accreditation Team for the Middle States Commission on Higher Education for “…their academic interest in outcome data, responsiveness to faculty and department needs and the clear use of data to modify the curriculum and teaching environment…their use of this data has impacted many components of the curriculum, the learning environment, individual student development, and program planning…” (TJU, 2005).
The Jefferson Longitudinal Study of Medical Education has been the most productive longitudinal study of medical students and graduates of a single medical school. It has resulted in 155 publications in peer-reviewed journals, many of which were presented at national or international professional meetings prior to publication (TJU, 2005).
According to Hernon and Dugan (2004), the pressure on higher education institutions to demonstrate accountability has moved beyond acceptance of and reliance on the self-reports and anecdotal evidence compiled during the self-regulatory accreditation process. It now encompasses an increasing demand from a variety of constituencies to demonstrate institutional effectiveness by focusing on quality measures, such as educational quality, and on cost efficiencies.
Accountability focuses on results as institutions quantify or provide evidence that they are meeting their stated mission, goals, and objectives. Institutional effectiveness is concerned, in part, with measuring (Hernon and Dugan, 2004):
• Programmatic outcomes: such as applicant pool, retention rates, and graduation rates. Such outcomes are institution-based and may be used to compare internal year-to-year institutional performance and as comparative measures with other institutions.

• Student learning outcomes: oftentimes referred to as educational quality and concerned with attributes and abilities, both cognitive and affective, which reflect how student experiences at the institution supported their development as individuals. Students are expected to demonstrate acquisition of specific knowledge and skills.

At NYCOM, we recognize that our effectiveness as an institution must ultimately be assessed and expressed by evaluating our success in achieving our Mission in relation to the following Outcomes:

  1. Student Learning / Program Effectiveness
  2. Research and Scholarly Output
  3. Clinical Services

The present document focuses on #1, above, viz., Student Learning / Program Effectiveness. That is, it is intended only as a Learning Outcomes Assessment Plan. At the same time, we are cognizant that Institutional Effectiveness/Outcomes derive from numerous inputs, or “means” to these “ends,” including:

  1. Finances
  2. Faculty Resources
  3. Administrative Resources
  4. Student Support Services
  5. Clinical Facilities and Resources
  6. Characteristics of the Physical Plant
  7. Information Technology Resources
  8. Library Resources

We believe it is our obligation to continually assess the impact of any changes in the inputs, processes, and outputs of this institution.
The evaluation approach in this Assessment Plan provides for on-going data collection and analysis targeted specifically at assessing outcomes of student achievement and program effectiveness (educational quality). Assessment of achievement and program effectiveness is based on objective, quantifiable information (data).
Because the NYCOM Learning Outcomes Assessment Plan operates on a continual assessment cycle, the resulting report is available, with scheduled updates, as a resource in the decision-making process.

The report provides outcomes data, recommendations, and suggestions intended to inform key policy makers and stakeholders2 of areas of growth and/or improvement, together with proposed changes to policy that strengthen both overall assessment and data-driven efforts to improve student learning.

2 NYCOM Administration, academic committees, faculty, potential researchers, and students.

II. Purpose and Design

Well-designed plans for assessing student learning outcomes link learning outcomes, measures, data analysis, and action planning in a continuous cycle of improvement illustrated below.

Figure 1 Cycle of Assessment

Ten principles guide the specifics of NYCOM’s Learning Outcomes Assessment Plan:

  1. The plan provides formative and summative assessment of student learning.3
  2. The primary purpose for assessing outcomes is to improve student learning.
  3. Developing and revising an assessment plan is a long-term, dynamic, and collaborative process.
  4. Assessments use the most reliable and valid instruments available.

3 Examples of the former include post-course roundtable discussions, Institute for Clinical Competence (ICC) seminars, and data from the Course/Faculty Assessment Program. Examples of the latter include the AACOM Graduation Questionnaire, COMLEX scores, NBOME subject exam scores, and clerkship evaluations.

  5. Assessment priorities are grounded in NYCOM’s mission, goals, and learning outcomes.
  6. The assessment involves a multi-method approach.
  7. Assessment of student learning is separate from evaluation of faculty.
  8. The primary benefit of assessment is the provision of evidence-based analysis to inform decision-making concerning program revision and improvement and resource allocation.
  9. The assessment plan must provide a substantive and sustainable mechanism for fulfilling NYCOM’s responsibility to ensure the quality, rigor, and overall effectiveness of our programs in educating competent and compassionate physicians.
  10. The assessment plan yields valid measures of student outcomes that provide stakeholders with relevant and timely data to make informed decisions on changes in curricular design, implementation, program planning, and the overall learning environment.

Outcomes assessment is a continuous process of measuring institutional effectiveness focusing on planning, determining, understanding, and improving student learning. At NYCOM, we are mindful that an integral component of this assessment plan is to ensure that the plan and the reporting process measure what they are intended to measure (student achievement and program effectiveness).

III. Specifics of the Plan

The NYCOM assessment plan articulates eleven student learning outcomes, which are linked to both the institutional mission and the osteopathic core competencies.

Mission of NYCOM

The New York College of Osteopathic Medicine of the New York Institute of Technology is committed to training osteopathic physicians for a lifetime of learning and practice, based upon the integration of evidence-based knowledge, critical thinking and the tenets of osteopathic principles and practice. The college is also committed to preparing osteopathic physicians for careers in primary care, including health care in the inner city and rural communities, as well as to the scholarly pursuit of new knowledge concerning health and disease. NYCOM provides a continuum of educational experiences to its students, extending through the clinical and post-graduate years of training. This continuum provides the future osteopathic physician with the foundation necessary to maintain competence and compassion, as well as the ability to better serve society through research, teaching, and leadership.

Learning Outcomes

The following eleven (11) Learning Outcomes that guide this plan stem from NYCOM’s mission (above) and the osteopathic core competencies:

  1. The Osteopathic Philosophy: Upon graduation, a student must possess the ability to demonstrate the basic knowledge of Osteopathic philosophy and practice, as well as Osteopathic Manipulative Treatment.
  2. Medical Knowledge: A student must possess the ability to demonstrate medical knowledge through passing of course tests, standardized tests of the NBOME, post-course rotation tests, research activities, presentations, and participation in directed reading programs and/or journal clubs, and/or other evidence-based medicine activities.

  3. Practice-based learning and improvement: Students must demonstrate their ability to critically evaluate their methods of clinical practice, integrate evidence-based medicine into patient care, show an understanding of research methods, and improve patient care practices.
  4. Professionalism: Students must demonstrate knowledge of professional, ethical, legal, practice management, and public health issues applicable to medical practice.
  5. Systems-based practice: Students must demonstrate an understanding of health care delivery systems, provide effective patient care, and practice cost-effective medicine within the system.
  6. Patient Care: Students must demonstrate the ability to effectively treat patients and provide medical care which incorporates the osteopathic philosophy, empathy, preventive medicine education, and health promotion.
  7. Communication skills: Students must demonstrate interpersonal and communication skills with patients and other healthcare professionals, which enable them to establish and maintain professional relationships with patients, families, and other healthcare providers.
  8. Primary Care: Students will be prepared for careers in primary care, including health care in the inner city, as well as rural communities.
  9. Scholarly/Research Activities: Students will be prepared for the scholarly pursuit of new knowledge concerning health and disease. Students in NYCOM’s 5-year Academic Medicine Scholars Program will be prepared as academic physicians in order to address this nation’s projected health care provider shortage and the resulting expansion of medical school training facilities.

  10. Global Medicine and Health Policy: Students will be prepared to engage in global health practice, policy, and the development of solutions to the world’s vital health problems.
  11. Cultural Competence: Students will be prepared to deliver the highest quality medical care, with the highest degree of compassion, understanding, and empathy toward cultural differences in our global society.

The NYCOM assessment plan provides for analysis of learning outcomes for two curricular tracks and four categories of students.

NYCOM has historically tracked student data across the curriculum, paying particular attention to cohorts of students (see below), as well as NYCOM’s two curricular tracks:
a) Lecture-Based Discussion track: integrates the biomedical and clinical sciences along continuous didactic ‘threads’ delivered according to a systems-based approach;
b) Doctor Patient Continuum track: a problem-based curriculum, whose cornerstone is small-group, case-based learning.

Current data gathering incorporates tracking outcomes associated with several student subcategories of importance to the institution within the 4-year pre-doctoral curriculum and the 5-year pre-doctoral Academic Medicine Scholars curriculum. The pre-doctoral populations are defined according to the following subcategories:
• Traditional:4
• BS/DO: The BS/DO program is a combined baccalaureate/doctor of osteopathic medicine program requiring successful completion of a total of 7 years (undergraduate, 3 years; osteopathic medical school, 4 years).
• MedPrep: A pre-matriculation program offering academic enrichment to facilitate the acceptance of underrepresented minority and economically disadvantaged student applicants.5

4 All other students not inclusive of BS/DO, MedPrep, and EPP defined cohorts.
5 The program is funded by the New York State Collegiate Science and Technology Entry Program and the NYCOM Office of Equity and Opportunity Programs.

• EPP (Émigré Physician Program): A 4-year program, offered by NYCOM, to educate émigré physicians to become DOs, enabling them to continue their professional careers in the U.S.

The NYCOM assessment plan includes data from four phases of the medical education continuum (as illustrated in Figure 2 and Figure 3): pre-matriculation, the four-year pre-doctoral curriculum6, post-graduation data, and careers and practice data.

Within the NYCOM Learning Outcome Assessment Plan, the Task Force has chosen the following outcome indicators for assessment of program effectiveness at different points in the medical education continuum:
• Pre-matriculation data, including first-year student survey;
• Academic (pre-clinical) course-work (scores on exams, etc.) – attrition rate;
• Clinical Clerkship Evaluations (3rd/4th year) and NBOME Subject Exams;
• Student feedback (assessment) of courses and 3rd- and 4th-year clinical clerkships, and PDA-based Patient and Educational Activity Tracking;
• COMLEX USA Level I, Level II CE & PE, and Level III data, including:
  o First-time and overall pass rates and mean scores;
  o Comparison of NYCOM first-time and overall pass rates and mean scores to national rankings;
• Residency match rate and placement rate (AOA / NRMP);
• Feedback from AACOM Graduation Questionnaire;
• Completion rates of post-doctoral programs;
• Specialty certification and licensure;
• Career choices (practice type – academic, research, etc.);
• Geographic practice locations;
• Alumni survey.

The Outcome Indicators—Detail sections of this plan (pages 24 through 150) show the various data sources and include copies of the forms or survey questionnaires utilized in the data gathering process.
The NYCOM assessment plan identifies specific sources of data for each phase.

Figure 2 illustrates which of the above measures are most relevant at each phase of the medical education continuum.

6 And the five-year pre-doctoral Academic Medicine Scholars program

The NYCOM assessment plan describes the collection and reporting of data, responsibilities for analysis and dissemination, and the linkage to continuous program improvement and institutional planning.

Compiling the Data

Discussions with departmental leaders and deans confirmed that data gathering occurs at various levels throughout the institution. Development of a central repository (centralized database) facilitates data gathering, data mining and overall efficiency as it relates to data analysis, report generation, and report dissemination. This includes utilization of internal databases (internal to NYCOM) as well as interfacing with external organizations’ databases, including the AOA (American Osteopathic Association), AACOM (American Association of Colleges of Osteopathic Medicine), AMA (American Medical Association), and the ABMS (American Board of Medical Specialties).
Stakeholders

Information from the data collection serves to inform NYCOM administration, relevant faculty, appropriate research and academic/administrative committees, including the following:

ï Curriculum Committee
ï Student Progress Committee
ï Admissions Committee
ï Deans and Chairs Committee
ï Clinical and Basic Science Chairs
ï Research Advisory Group
ï Academic Senate

The NYCOM assessment plan sets forth benchmarks, goals, and standards of performance.

The major elements of the plan are summarized in Table 1: Assessment Plan Guide: Learning Outcomes/Metrics/Benchmarks found at the end of this chapter.

IV. Plan Implementation

As discussed earlier, most of the assessment tools and processes described in the document have been employed at NYCOM since its inception to inform curriculum design and implementation and to gauge progress and success in meeting the institution’s mission, goals and objectives. Beginning in fall 2008, however, assessment efforts have been made more systematic; policies, procedures, and accountabilities are now documented and more widely disseminated.
The Office of Program Evaluation and Assessment (OPEA), reporting to the Associate Dean for Academic Affairs, is responsible for directing all aspects of plan refinement and implementation.
Next steps

  1. Develop a shared, central repository for pre-matriculation, pre-doctoral, and post- graduate data (see Figure 3). Time Frame: Academic Year 2010-2011

Centralized database: Development of a shared, central repository (database) used by NYCOM’s internal departments. WEAVEonline, a web-based assessment system used by numerous academic institutions across the country for assessment and planning purposes, facilitates this centralization of data. The central database comprises student data categorized as follows:
Pre-matriculation Data includes demographics, the AACOM pre-matriculation survey, academic data (GPA), and other admissions data (MCAT scores, etc.).
Data are categorized according to student cohort as previously described (see item III, Specifics of the Plan, on pages 13-14).

Pre-doctoral Data includes academic (pre-clinical) course work, course grades, end-of-year grade point averages, the newly implemented, innovative Course / Faculty assessment program data (described in Section 4), ratings of clinical clerkship performance, performance scores on COMLEX USA Level I and Level II CE & PE, descriptors of changes in academic status (attrition), and AACOM Graduation questionnaires.

Post-graduate/Career Data includes residency match rate, residency choice, hospitals of residency, geographic location, chosen specialty, performance on COMLEX Level III, geographic and specialty area(s) of practice following graduation, licensure, board certification status, scholarly work, professional activities/societies, faculty appointments, type(s) of practice (academic, clinical, research).
This database supports and assimilates collaborative surveys used by internal departments to capture requested data (see item III, Specifics of the Plan, on pages 13-14) essential for tracking students during and after post-graduate training. Specific data (e.g., COMLEX Level III, board certification, and licensure) are provided by external databases through periodic reporting or queries from NYCOM; the database therefore provides for assimilation of this external data into the institutional reporting format.
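The plan does not prescribe a particular schema for the central repository. The following Python sketch is purely illustrative, with hypothetical field names, of how the three data categories described above might be organized in a centralized student record.

# Illustrative only: hypothetical field names for the three data categories
# described above (pre-matriculation, pre-doctoral, post-graduate/career).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PreMatriculationData:
    cohort: str                                  # e.g., "Traditional", "BS/DO", "MedPrep", "EPP"
    undergraduate_gpa: Optional[float] = None
    mcat_total: Optional[int] = None
    aacom_survey_completed: bool = False

@dataclass
class PreDoctoralData:
    course_grades: dict = field(default_factory=dict)   # course name -> grade (H/P/F)
    comlex_level1_score: Optional[int] = None
    comlex_level2_ce_score: Optional[int] = None
    attrition_status: Optional[str] = None

@dataclass
class PostGraduateCareerData:
    residency_match: Optional[str] = None
    comlex_level3_score: Optional[int] = None
    board_certified: bool = False
    practice_location: Optional[str] = None

@dataclass
class StudentRecord:
    student_id: str
    pre_matriculation: PreMatriculationData
    pre_doctoral: PreDoctoralData
    post_graduate: PostGraduateCareerData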

  2. Establish metrics. Time Frame: Academic Year 2010-2011

Benchmarks and Reporting: Conduct a retrospective data analysis in order to establish baseline metrics (see Compiling the Data on page 17).

Following development of these metrics, institutional benchmarks are established. Benchmarks align with Institutional Goals as written above.

Reporting of data analysis occurs on an annual basis. An annual performance report is compiled from all survey data and external sources. The reporting timeframe coincides with the end of the academic year, and the report is updated semi-annually as additional (external) data are received.

Data reporting includes benchmarking against Institutional Goals (mission) in order to provide projections regarding the effectiveness of the learning environment, quality improvement indicators, long-range and strategic planning processes, and cost analysis/budgetary considerations.

Reports are disseminated to key policy makers and stakeholders, as previously identified (see Stakeholders on page 17), as well as to other staff deemed appropriate for inclusion in the reporting of assessment analysis.

V. Conclusion

The impact on student learning of such things as changes in the demographics of medical school applicants, admissions criteria, curricula, priorities, and methods of delivery of medical education deserves careful discussion, planning, and analysis before, during, and after implementation. This plan facilitates change management at three points:
• Planning, by providing evidence to support decision-making;
• Implementation, by establishing mechanisms for setting performance targets and monitoring results; and
• Evaluation, by systematically measuring outcomes against goals and providing evidence of whether the change has achieved its intended objectives.
At NYCOM, accountability is seen as both a requirement and a responsibility. As healthcare delivery, pedagogy, and the science of medicine constantly change, monitoring the rigor and effectiveness of the learning environment through assessment of student learning outcomes throughout the medical education continuum becomes paramount.

Figure 3 Data Collection Phases
[Figure: the Assessment Process links four data collection phases – Pre-matriculation Data, Pre-doctoral Data, Post-Graduate Data, and Career Data.]

Table 1 – Assessment Plan Guide: Learning Outcomes / Data Sources / Metrics

Learning Outcomes7 – Students will:
• Demonstrate basic knowledge of OPP & OMT
• Demonstrate medical knowledge
• Demonstrate competency in practice-based learning and improvement
• Demonstrate professionalism and ethical practice
• Demonstrate an understanding of health care delivery systems
• Demonstrate the ability to effectively treat patients
• Demonstrate interpersonal and communication skills
• Be prepared for careers in primary care
• Be prepared for the scholarly pursuit of new knowledge
• Be prepared to engage in global health practice, policy, and solutions to world health problems
• Be prepared to effectively interact with people of diverse cultures and deliver the highest quality of medical care

Data Collection Phases8:
• Pre-matriculation
• Pre-doctoral
• Post-graduate
• Career

Assessment Methods:
• Didactic Academic Performance (LDB Curriculum; DPC Curriculum)
• Formative / Summative Experiences: Patient Simulations (SPs / Robotic)
• Student-driven Course, Clerkship, and Faculty Assessment
• Clinical Clerkship Performance
• PDA-Based Patient and Education Tracking
• Surveys
• Standardized Tests
• Alumni Feedback

Metrics9:
• Admissions Data (Applicant Pool demographics)
• Course Exams
• End-of-year pass rates
• Coursework
• Analysis of Residency Trends Data
• Standardized Tests / Subject Exams
• COMLEX I & II Scores
• Analysis of Specialty Choice
• Analysis of geographic practice area
• Academic Attrition rates
• Remediation rates
• Graduation and post-graduate data
• External surveys

Development of benchmarks10:
• Applicant Pool
• Admissions Profile
• Academic Attrition rates
• Remediation rates (pre-clinical years)
• COMLEX USA Scores I & II (1st-time pass rate / mean score)
• Number of graduates entering OGME programs
• Graduates entering Primary Care (PC)11
• Career Data: Licensure (within 3 years); Board Certification; Geographic Practice Area; Scholarly achievements

7 Complete detail of Learning Outcomes found in III., pages 11-13.
8 See Figure 3, page 22.
9 List of Metrics is not all-inclusive.
10 See complete detail of benchmarks—pages 5 & 151.
11 Primary Care: Family Medicine, Internal Medicine, and Pediatrics.

Outcome Indicators – Detail

  1. Pre-matriculation data
    Data gathered prior to students entering NYCOM, broken down by student cohort, include the following:
    Traditional, MedPrep, and BS/DO students
 • AACOM pre-matriculation survey given to students;
 • Total MCAT scores;
 • Collegiate GPA (total GPA, including undergraduate/graduate);
 • Science GPA;
 • College(s) attended;
 • Undergraduate degree (and graduate degree, if applicable);
 • Gender;
 • Age;
 • Ethnicity;
 • State of residence;
 • Pre-admission interview score.

Additional data is gathered on the MedPrep student cohort and incorporates the following:
 • Pre-matriculation lecture-based exam and quiz scores;
 • Pre-matriculation DPC (Doctor Patient Continuum)-based facilitator assessment scores and content exam scores;
 • ICC (Institute for Clinical Competence) Professionalism Assessment Rating Scale (PARS) scores.

Émigré Physician Program students
 • TOEFL (Test of English as a Foreign Language) score;
 • EPP Pre-Matriculation Examination score;
 • Medical school attended;
 • Date of MD degree;
 • Age;
 • Ethnicity;
 • Country of origin.

Specific forms/questionnaires utilized to capture the above-detailed information include the following:
 • MedPrep 2008 Program Assessment
 • MedPrep Grade Table
 • NYCOM Admissions Interview Evaluation Form
 • Application for Émigré Physicians Program (EPP)
 • AACOM Pre-matriculation Survey (first-year students)
 • NYCOM Interview Evaluation Form – Émigré Physicians Program

Samples of the forms/questionnaires follow.

MedPrep 2008 Program Assessment
Successful completion of the MedPrep Pre-Matriculation Program takes into consideration the following 3 assessment components:

  1. Lecture-Discussion Based (LDB)
  2. DPC (Doctor Patient Continuum)
  3. ICC (Institute for Clinical Competence)

A successful candidate must achieve a passing score for all 3 components. Strength in one area will not compensate for weakness in another.

  1. The first component assesses the Lecture-Discussion Based portion of the MedPrep Pre-Matriculation Program. It is comprised of 3 multiple-choice quizzes and 1 multiple-choice exam covering the following subject areas:
    • Histology
    • Biochemistry
    • Physiology
    • Genetics
    • Physiology
    • OMM
    • Pharmacology
    • Pathology
    • Microbiology
    • Clinical Reasoning Skills

Each of the three quizzes constitutes 10% of an individual’s overall LDB score, and the final exam (to be conducted on June 27) constitutes 70% of an individual’s overall LDB score, for a total of 100% in the Lecture-Discussion portion of the program.

  2. The second component is based upon your performance in the DPC portion of the MedPrep Pre-Matriculation Program. There will be a facilitator assessment (to be conducted on June 26), which will comprise 30% of an individual’s grade, and a final written assessment, which will comprise 70% of an individual’s overall DPC score.

** Note – Both the Lecture-Discussion Based and DPC passing scores are calculated as per NYCOM practice (an illustrative calculation follows this list):
 • Average (mean) minus one standard deviation;
 • Not to be lower than 65%;
 • Not to be higher than 70%.
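A minimal Python sketch of how these weightings and the passing-score rule could be computed, assuming hypothetical function names and sample scores (only the weights and bounds come from the text above; the standard deviation is taken as the sample standard deviation, which the program description does not specify):

# Illustrative sketch of the MedPrep score weightings and passing-score rule.
from statistics import mean, stdev

def overall_ldb_score(quiz_scores, final_exam_score):
    """Three quizzes at 10% each plus the final exam at 70% (totals 100%)."""
    return 0.10 * sum(quiz_scores) + 0.70 * final_exam_score

def overall_dpc_score(facilitator_score, written_score):
    """Facilitator assessment 30%, final written assessment 70%."""
    return 0.30 * facilitator_score + 0.70 * written_score

def passing_threshold(class_scores):
    """Mean minus one standard deviation, bounded between 65% and 70%."""
    raw = mean(class_scores) - stdev(class_scores)
    return min(max(raw, 65.0), 70.0)

# Example with hypothetical class and student scores
class_scores = [82.0, 74.5, 68.0, 91.0, 77.5]
threshold = passing_threshold(class_scores)
student = overall_ldb_score([85, 78, 80], 76)   # three quizzes, then final exam
print(f"threshold={threshold:.1f}, student={student:.1f}, pass={student >= threshold}")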

  3. The third component is the ICC encounter designed to assess your Doctor Patient Interpersonal skills. This assessment is evaluated on the PARS scale described to you in the Doctor Patient Interpersonal Skills session on June 12 by Dr. Errichetti.

After the program ends, on June 27th, all three components of the assessment will be compiled and reviewed by the MedPrep Committee. The director of admissions, who is a member of the committee, will prepare notification letters that will be mailed to you within two weeks.

Please note:

The written communication you will receive contains ONLY acceptance information. NO grades will be distributed. Exams or other assessments (with the exception of the Lecture-Discussion Based quizzes, which have already been returned) will not be shared or returned.

Please DO NOT contact anyone at NYCOM requesting the status of your candidacy. No information will be given on the phone or to students on campus.

Thank you for your participation in the MedPrep Pre-Matriculation Program. The faculty and staff have been delighted to meet and work with you. We wish you success!

Sincerely,

Bonnie Granat

MedPrep Grade Table (column headings):
Last Name, First Name | Quiz #1 Score (10% of Overall LDB Score) | Quiz #2 Score (10% of Overall LDB Score) | Quiz #3 Score (10% of Overall LDB Score) | LDB Final Exam Score (70% of Overall LDB Score) | Overall LDB Score (Exam and Quizzes Combined) | Overall DPC Score | Overall ICC Score

NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE ADMISSIONS INTERVIEW EVALUATION FORM

Applicant Date / /

CATEGORY / CRITERIA / VALUE / RATING

I. PERSONAL PRESENTATION – MATURITY; LIFE EXPERIENCE/TRAVEL; EXTRACURRICULAR ACTIVITIES/HOBBIES; COMMUNICATION SKILLS; SELF-ASSESSMENT (STRENGTHS/WEAKNESSES); AACOMAS & SUPPLEMENTAL STATEMENT – VALUE: 50

II. OSTEOPATHIC MOTIVATION – KNOWLEDGE OF THE PROFESSION; TALKED TO A DO/LETTER FROM A DO – VALUE: 15

III. PRIMARY CARE MOTIVATION – INTEREST IN PRIMARY CARE – VALUE: 15

IV. OVERALL IMPRESSION – EXPOSURE TO MEDICINE (VOLUNTEER EXPERIENCE; EMPLOYMENT EXPERIENCE; UNIQUE ACADEMIC EXPERIENCES; RESEARCH) – VALUE: 20

TOTAL RATING: 100

OTHER COMMENTS (REQUIRED): PLEASE USE OTHER SIDE

Comments on Applicant

COMMENTS:

Interviewer

APPLICATION FOR EMIGRE PHYSICIANS PROGRAM (EPP)
Application Deadline: March 16, 2009

  1. SSN:            2. Name:
                        Last                    First
  3. Do you have educational materials under another name? Yes ( ) No ( ) If yes, indicate name:
  4. Have you previously applied? Yes ( ) No ( ) Year(s):
  5. Preferred Mailing Address:
     Street                    Apt. #
     City            State            Zip code
     Telephone: (Area code) (Number)
     E-mail:

  6. Permanent and/or Legal Residence:
     Street                    Apt. #
     City            State            Zip code
     Telephone: (Area code) (Number)

  7. Year you emigrated to the United States:

NOTE: Only U.S. Citizens or Permanent Residents** are eligible.
(Attach copy of citizenship papers/green card, front and back)
(** APPLICANTS MUST BE IN POSSESSION OF “GREEN CARD” AT TIME OF APPLICATION)

  8. Are you a U.S. citizen? Yes ( ) No ( )
  9. Are you a Permanent Resident? Yes ( ) No ( ) Year Green Card issued:          Green Card No.:
  10. Sex: Male ( ) Female ( )

  11. Date of Birth (M/D/Y): __/__/__    11a. Place of Birth (city, country):

  12. How do you describe yourself? Black ___   Mex. Amer./Chicano ___   White ___   Other Hispanic ___   Asian/Pac. Isl. ___   Other ___

  13. Please list the members of your household:

Name, Relationship to you (e.g. spouse, child, etc.), Age

  14. Education: Institution Name | Location | Dates of Attendance | Major Subject | Degree granted or expected (Date) | Medical Specialty (if any) | No. of years in practice

  15. Have you had any U.S. military experience? Yes ( ) No ( ) If yes, was your discharge honorable? Yes ( ) No ( )
  16. List employment in chronological order, beginning with your current position:

Title or Description Where Dates Level of Responsibility

  17. Work/daytime telephone number: (area code) (phone)
  18. How do you plan to finance your NYCOM education? Personal funds ___   Loans ___
  19. Were you ever the recipient of any action for unacceptable academic performance or conduct violations (e.g., probation, dismissal, suspension, disqualification, etc.) by any college or school? Yes ( ) No ( )
      If yes, were you ever denied readmission? Yes ( ) No ( )
  20. Have you ever been convicted of a misdemeanor or felony (excluding parking violations)? Yes ( ) No ( )
      If your answer to #19 or #20 is yes, please explain fully:
  21. Evaluation Service used: Globe Language Services / Joseph Silny & Assocs. / World Education Services / IERF

*22. TOEFL Score(s):

*ALL CANDIDATES MUST TAKE TOEFL / TOEFL
Scores Cannot Be Older Than 2 YEARS

If you plan to take or retake the TOEFL, enter date (mo./yr.): ____/____

(NYCOM’s TOEFL Code is #2486; copies cannot be accepted)

USMLE WILL NOT BE ACCEPTED IN LIEU OF TOEFL

All evaluations must be received directly from the evaluation service and are subject to approval by the New York College of Osteopathic Medicine.

Personal Comments: Please discuss your reasons for applying to the EPP program.

I certify that all information submitted in support of my application is complete and correct to the best of my knowledge. Date: Signature:

PLEASE MAIL APPLICATION AND FEE ($60.00 CHECK OR MONEY ORDER ONLY, PAYABLE TO NYCOM) TO:

2008-09 Academic Year Survey – First Year Students

TO THE STUDENTS: Your opinions and attitudes about your medical education, your plans for medical practice, and information about your debt are very important as the colleges and the osteopathic profession develop and plan for the future of osteopathic medical education. Please take some time to complete the following questionnaire to help in planning the future of osteopathic medical education. The information you provide in this survey will be reported only in aggregate or summary form; individually identifiable information will not be made available to the colleges or other organizations. The reason we ask for your identification is to allow for longitudinal studies linking your responses as first-year students to your responses when this survey is readministered in your fourth year.

Please print in Capital Letters. Please fill in marks like this:

Last Name / Suffix / First Name / Middle Name (or Maiden Name if Married Woman Using Husband’s Name)

Osteopathic College:

0 ATSU-SOMA 0 LECOM-Bradenton 0 OU-COM 0 TUNCOM
0 ATSU/KCOM 0 LECOM-PA 0 PCOM 0 UMDNJ-SOM
0 AZCOM 0 LMU-DCOM 0 PCSOM 0 UNECOM
0 CCOM 0 MSUCOM 0 PNWU-COM 0 UNTHSC/TCOM
0 DMU-COM 0 NSU-COM 0 RVUCOM 0 VCOM
0 GA-PCOM 0 NYCOM 0 TOUROCOM 0 WesternU/COMP
0 KCUMB-COM 0 OSU-COM 0 TUCOM-CA 0 WVSOM

Part I: CAREER PLANS

P1. Plans Upon Graduation: Please indicate what type of osteopathic internship you plan to do. (Choose only one.)

0 a. Traditional rotating
0 b. Special emphasis – Indicate type: 1. Anesthesiology 0   2. Diagnostic Radiology 0   3. Emergency Med. 0   4. Family Practice 0   5. General Surgery 0   6. Psychiatry 0   7. Pathology 0
0 c. Specialty track – Indicate type: 1. Internal Medicine 0   2. Internal Medicine/Peds. 0   3. Ob/Gyn 0   4. Otolaryn./Facial Plastic Surg. 0   5. Pediatrics 0   6. Urological Surgery 0
0 d. Pursue AOA/ACGME dual approved internship
0 e. Not planning osteopathic internship. Reason: 1. Allopathic residency 0   2. Other 0 (Please specify)
0 f. Undecided

P2. a. Immediate Post-Internship Residency Plans: Select the one item that best describes your plans immediately after internship (or upon graduation if not planning an osteopathic internship).

0 1. Pursue osteopathic residency
0 2. Pursue allopathic residency (see Item P2b)
0 3. Pursue AOA/ACGME dual approved residency (see Item P2b)
0 4. Enter governmental service (e.g., military, NHS Corps, Indian Health Service, V.A., state/local health dept.) (see Item P2b)

If you are not doing a residency, please indicate your post-internship plans.

0 5. Practice in an HMO
0 6. Self-employed with or without a partner
0 7. Employed in group or other type of private practice (salary, commission, percentage)
0 8. Other professional activity (e.g., teaching, research, administration, fellowship)
0 9. Undecided or indefinite post-graduation/internship plans

b. If you plan to pursue an allopathic or AOA/ ACGME dual approved residency, please give all the reasons that apply to you.

0 1. Desire specialty training not available in osteopathic program
0 2. Believe better training and educational opportunities available
0 3. Located in more suitable geographic location(s)
0 4. Located in larger institutions
0 5. Better chance of being accepted in program
0 6. Allow ABMS Board certification
0 7. Opens more career opportunities
0 8. Military or government service obligation
0 9. Shorter training period
0 10. Higher pay
0 11. Other, please specify

P3. Long-Range Plans: Select the one item that best describes your intended activity five years after internship and residency training.

0 1. Enter governmental service (e.g., military, NHS Corps, Indian Health Service, V.A., state/local health dept.)
0 2. Practice in an HMO
0 3. Self-employed with or without a partner
0 4. Employed in group or other type of private practice (salary, commission, percentage)
0 5. Other professional activity (e.g., teaching, research, administration, fellowship)
0 6. Undecided or indefinite
P4.-a. Area of Interest: Select one specialty in which you are most likely to work or seek training.

0 1. Family Practice                                        0 17. Ob/Gyn including subspecialties
0 2. General Internal Medicine                              0 18. Ophthalmology
0 3. Internal Medicine Subspecialty                         0 19. Otolaryngology
0 4. Osteopathic Manip. Ther. & Neuromusculoskeletal Med.   0 20. Pathology including subspecialties
0 5. General Pediatrics                                     0 21. Physical Medicine & Rehabilitation Med.
0 6. Pediatrics Subspecialty                                0 22. Preventive Medicine including subspec.
0 7. Allergy and Immunology                                 0 23. Proctology
0 8. Anesthesiology                                         0 24. Radiology (Diagnostic) including subspec.
0 9. Critical Care                                          0 25. Sports Medicine
0 10. Dermatology                                           0 26. General Surgery
0 11. Emergency Medicine                                    0 27. Orthopedic Surgery
0 12. Geriatrics                                            0 28. Surgery, subspecialty
0 13. Medical Genetics                                      0 29. Vascular Surgery
0 14. Neurology including subspecialties                    0 30. Urology/Urological Surgery
0 15. Psychiatry including subspecialties                   0 31. Undecided or Indefinite
0 16. Nuclear Medicine

P4b. Please select one item that best describes your plans for board certification.

0 1. AOA Boards (osteopathic)
0 2. ABMS Boards (allopathic) (see Item P4c)
0 3. Both boards (see Item P4c)
0 4. Other, please specify
0 5. Not planning board certification
0 6. Undecided or indefinite

c. If you selected ABMS or both boards in item P4b, please indicate all the reasons for your choice.

0 1. ABMS board certification is more widely recognized
0 2. ABMS board certification has more colleague acceptance
0 3. ABMS board certification carries more prestige
0 4. ABMS board certification provides more opportunities (career, residencies, etc.)
0 5. Personal desire for dual certification
0 6. Hospital privileges more readily obtained with ABMS board certification
0 7. Licenses more readily obtained with ABMS board certification
0 8. Other, please specify

P5. Please indicate the importance of each of the following factors affecting your specialty choice decision. Use the scale below.
(1) Major Influence  (2) Strong Influence  (3) Moderate Influence  (4) Minor Influence  (5) No Influence/NA
a. Intellectual content of the specialty (type of work, diagnostic programs, diversity)   0 0 0 0 0
b. Like dealing with people (type of person, type of patient) more than techniques        0 0 0 0 0

P6. Answer only ONE item.
a. State (two-letter abbreviation) where you expect to locate after completion of internship and residency?

b. Fill in if non-U.S. 0
c. Fill in if unknown/undecided 0

P7. a. What is the population of the city/town/area of legal residence where you plan to be employed or in practice after completion of internship or residency?

1. Major metropolitan area (1,000,001 or more) 0        7. Town under 2,500 0
2. Metropolitan area (500,001 – 1,000,000) 0            8. Other, please specify 0
3. City (100,001 – 500,000) 0
4. City (50,001 – 100,000) 0
5. City or town (10,001 – 50,000) 0
6. City or town (2,501 – 10,000) 0                      9. Undecided or indefinite 0

b. Are you planning to practice in any underserved or shortage areas? Yes 0 No 0 Unsure 0


A6. Non-educational Debts You Will Incur While in Medical School: Show the total amount of non-educational school debt (such as car loans, credit cards, medical expenses, and living expenses) that you will incur during medical school. Do not include your home mortgage in this figure. If none, enter zero. $________

A7. a. How many years do you expect to take to repay the indebtedness for your osteopathic education? (Max 30 yrs.) ____
    b. Do you anticipate participating in a student loan consolidation program for repayment?  0 Yes   0 No   0 Undecided

[Response bubble grid omitted.]

Part Ill: DEMOGRAPHIC DATA

This information is for classification purposes only and is considered confidential. Information will only be used by AACOM and affiliated organizations in totals or averages.

D1. Date of Birth: __/__/__    D2. Sex: Male 0   Female 0

D3. Marital Status: Married/cohabiting 0   Single/other 0

D4. SSN:

AACOM asks for your Social Security Number so that we can track data longitudinally – a similar survey is administered during graduation, and this number allows us to analyze changes in responses. AACOM provides reports to the COMs only in aggregate and does not include any individual identifiers.

D5. Dependents: Including yourself, how many dependents do you support financially?  1   2   3   4   5 or more
                                                                                      0   0   0   0   0

D6. Ethnic background: Indicate your ethnic identification from the categories below. Please mark all that apply.

a. Black/African American 0           h. Chinese 0           n. Indian/Pakistani 0
b. American Indian/Alaskan Native 0   i. Filipino 0          o. Other Pacific Islander 0
c. White 0                            j. Hawaiian 0          p. Southeast Asian (non-Vietnamese) 0
d. Mexican American/Chicano 0         k. Korean 0            q. Other Asian 0
e. Puerto Rican (Mainland) 0          l. Vietnamese 0        r. Other, specify ________ 0
f. Puerto Rican (Commonwealth) 0      m. Japanese 0
g. Other Hispanic 0

D7. Citizenship Status: U.S. 0    Permanent Resident 0    Other 0 (Please specify)

D8. State of Legal Residence: Use 2 letter postal abbreviation.

D9. Population of city/town/area of legal residence:

a. Major metropolitan area (1,000,001 or more) 0
b. Metropolitan area (500,001 – 1,000,000) 0
c. City (100,001 – 500,000) 0
d. City (50,001 – 100,000) 0
e. City or town (10,001 – 50,000) 0
f. City or town (2,501 – 10,000) 0
g. Town under 2,500 0
h. Other 0 (Please specify)

D10. a. Father’s Education: Select the highest level of education your father attained. Complete this item even if he is deceased.

1. Professional Degree (DO/MD, JD, DDS, etc.) 0 (See Item D10b below)
2. Doctorate (Ph.D., Ed.D., etc.) 0
3. Master’s 0
4. Bachelor’s 0
5. Associate Degree/Technical Certificate 0
6. High School Graduate 0
7. Less than High School 0

b. If your father’s professional degree is in the Health Professions field, please select one of the following: DO/MD 0   Other 0

D11. a. Mother’s Education: Select the highest level of education your mother attained. Complete this item even if she is deceased.

1. Professional Degree (DO/MD, JD, DDS, etc.) 0 (See Item D11b below)
2. Doctorate (Ph.D., Ed.D., etc.) 0
3. Master’s 0
4. Bachelor’s 0
5. Associate Degree/Technical Certificate 0
6. High School Graduate 0
7. Less than High School 0

b. If your mother’s professional degree is in the Health Professions field, please select one of the following: DO/MD 0   Other 0

D12. Parents’ Income: Give your best estimate of your parents’ combined income before taxes for the prior year.

a. Less than $20,000 0 d. $50,000 – $74,999 0 g. $200,000 or more 0
b. $20,000 – $34,999 0 e. $75,000 – $99,999 0 h. Deceased/Unknown 0
c. $35,000 – $49,999 0 f. $100,000 – $199,999 0

D13. Financial Independence: Do you consider yourself financially independent from your parents? Yes 0

No 0

Thank you very much for your cooperation!

NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE INTERVIEW EVALUATION FORM – ÉMIGRÉ PHYSICIANS PROGRAM

Applicant: Date:

State:

CATEGORY / CRITERIA TO BE ADDRESSED / VALUE / RATING

1. Oral Comprehension – Ability to understand questions, content – Value: 30
2. Personal Presentation – Appropriate response, ability to relate to interviewers – Value: 30
3. Verbal Expression – Clarity, articulation, use of grammar – Value: 30
4. Overall Impression – Unique experiences, employment, research – Value: 10

OVERALL RATING: 100

INTERVIEWER RECOMMENDATION:
Accept
Reject

COMMENTS:

NAME:

SIGNED:

  2. Academic (pre-clinical) course-work
    Data captured during the pre-clinical years of NYCOM’s 4-year pre-doctoral program and 5-year Academic Medicine Scholars program, which include the following:
    Curricular Tracks: Lecture-Based Discussion / Doctor Patient Continuum

 • Pre-clinical course pass/failure rate as determined by class year (year 1 and year 2) and overall at end of year 2 (tracking each class and in aggregate for two years);
 • Failure rates of (components of) the Nervous System course or Behavior course;
 • Course grades (H/P/F);
 • Exam scores;
 • Scores (pass/fail rate) on Core Clinical Competency OSCE exams;
 • Professionalism Assessment Rating Scale (PARS);
 • Students determined as pre-clinical course dismissals (and remediated);
 • Students determined to have double course failures (and remediated);
 • Failure rates due to cognitive and/or OMM lab portions of a course;
 • Repeat students (aligned with Learning Specialist intervention);
 • Changes in academic status (attrition, as identified above);
 • End-of-year class rankings.

Specific forms/questionnaires utilized to capture the above-detailed information include the following:

 • Introduction to Osteopathic Medicine / Lecture-Based Discussion
 • Doctor-Patient Continuum (DPC) – Biopsychosocial Sciences I Grading and Evaluation Policy
 • DPC – Clinical Sciences II – Grading Policy
 • Assessing the AOA Core Competencies at NYCOM
 • Institute for Clinical Competence (ICC) Professionalism Assessment Rating Scale (PARS)
 • SimCom-T(eam) Holistic Scoring Guide
 • Case A – Dizziness, Acute (scoring guides)

Samples of the forms/questionnaires follow.

Introduction to Osteopathic Medicine / Lecture-Based Discussion

Grading and Evaluation

  1. At the conclusion of this course, students will receive a final cognitive score and a final OMM laboratory score.
  2. Both a student’s final cognitive score and a student’s final OMM laboratory score must be at a passing level in order to pass this course.
  3. Cognitive Score
    a. A student’s cognitive score is comprised of the following two components:
    i. Written Examinations and Quizzes pertaining to course lectures and corresponding required readings, cases, course notes, and PowerPoint presentations
    ii. Anatomy Laboratory Examinations and Quizzes
    b. The weighting of the two components of the final cognitive score is as follows:
    Summary of Cognitive Score Breakdown
    Cognitive Score Component % of Final Cognitive Score
    Written Examinations and Quizzes 75%
    Anatomy Laboratory Examinations and Quizzes 25%
    Total Cognitive Score 100%

c. Written Examinations and Quizzes
i. There will be three written examinations and four written quizzes in this course.
ii. The written examinations and quizzes will consist of material from all three threads (Cellular and Molecular Basis of Medicine, Structural and Functional Basis of Medicine, Practice of Medicine).
iii. Up to 25% of the written exam and quiz material will come from directed readings.
iv. For the purpose of determining passing for this course, the written examinations will be worth 90% of the final written score and the quizzes will be worth 10% (2.5% each) of the final written score. This weighting is illustrated in the following table:
Summary of Written Exam/Quiz Score Breakdown
Written Exam/Quiz # % of Final Written Score
Written Exam #1 25%
Written Exam #2 30%
Written Exam #3 35%
Total Written Exam Score 90%
Written Quiz #1 2.5%
Written Quiz #2 2.5%
Written Quiz #3 2.5%
Written Quiz #4 2.5%
Total Written Quiz Score 10%
Total Written Score 100%

d. Anatomy Laboratory Examinations and Quizzes
i. There will be two Anatomy laboratory examinations in this course
ii. There will be Anatomy laboratory quizzes in this course, conducted during Anatomy laboratory sessions.
iii. For the purpose of determining passing for this course, each Anatomy lab examination

will be worth 45% of students’ final Anatomy lab score and all Anatomy lab quizzes combined will be worth 10% of students’ final Anatomy lab score. This weighting is illustrated in the following table:
Summary of Anatomy Lab Exam/Quiz Score Breakdown
Anatomy Lab Exam/Quiz # % of Final Anatomy Score
Anatomy Lab Exam #1 45%
Anatomy Lab Exam #2 45%
Anatomy Lab Quizzes 10%
Total Anatomy Lab Exam/Quiz Score 100%

  4. OMM Laboratory Score
    a. A student’s OMM laboratory score in this course is comprised of an OMM laboratory examination and laboratory quizzes, as follows:
    i. There will be one OMM laboratory practical examination in this course
    ii. There will be two OMM laboratory practical quizzes in this course conducted during OMM laboratory sessions
    iii. There will be a series of OMM laboratory written quizzes in this course conducted during OMM laboratory sessions.
    b. The weighting of the components of the OMM laboratory final score is as follows: For the purpose of determining passing for this course, the OMM laboratory practical examination will be worth 70% of the final OMM laboratory score, the OMM laboratory practical quizzes will be worth 20% (10% each) of the final OMM laboratory score, and the OMM laboratory written quizzes will be worth 10% (all OMM lab written quizzes combined) of the OMM laboratory score. This weighting is illustrated in the following table:
    Summary of OMM Laboratory Exam/Quiz Score Breakdown
    OMM Laboratory Exam/Quiz % of Final OMM Laboratory Score
    OMM Laboratory Practical Exam 70%
    OMM Laboratory Practical Quiz #1 10%
    OMM Laboratory Practical Quiz #2 10%
    OMM Laboratory Written Quizzes (all quizzes combined) 10%
    Total OMM Laboratory Score 100%
  5. Examinations and quizzes may be cumulative.
  6. Honors Determination
    a. For the purpose of determining who will be eligible to receive a course grade of Honors (“H”), the final cognitive score and final OMM laboratory score will be combined in a 75%/25% ratio, respectively.
    b. Using the formula noted above, students scoring in the top 10% (and who have not taken a make-up exam within the course or remediated the course) will receive a course grade of Honors.
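To make the weightings above concrete, here is a small illustrative Python sketch that combines the written, anatomy, and OMM components as described and applies the 75%/25% Honors formula; the function names and sample scores are hypothetical and not part of the course documents.

# Illustrative sketch of the score weightings described above.

def final_written_score(exam1, exam2, exam3, quizzes):
    """Written exams: 25% + 30% + 35% = 90%; four quizzes at 2.5% each = 10%."""
    return 0.25 * exam1 + 0.30 * exam2 + 0.35 * exam3 + 0.025 * sum(quizzes)

def final_anatomy_score(lab_exam1, lab_exam2, lab_quiz_avg):
    """Two anatomy lab exams at 45% each; all lab quizzes combined = 10%."""
    return 0.45 * lab_exam1 + 0.45 * lab_exam2 + 0.10 * lab_quiz_avg

def final_cognitive_score(written, anatomy):
    """Written component 75%, anatomy laboratory component 25%."""
    return 0.75 * written + 0.25 * anatomy

def final_omm_lab_score(practical_exam, practical_quiz1, practical_quiz2, written_quiz_avg):
    """Practical exam 70%, two practical quizzes at 10% each, written quizzes combined 10%."""
    return (0.70 * practical_exam + 0.10 * practical_quiz1 +
            0.10 * practical_quiz2 + 0.10 * written_quiz_avg)

def honors_composite(cognitive, omm_lab):
    """Honors ranking combines cognitive and OMM lab scores in a 75%/25% ratio."""
    return 0.75 * cognitive + 0.25 * omm_lab

# Example with hypothetical scores
written = final_written_score(82, 85, 88, [90, 75, 80, 85])
anatomy = final_anatomy_score(84, 79, 88)
cognitive = final_cognitive_score(written, anatomy)
omm = final_omm_lab_score(86, 90, 82, 88)
print(round(cognitive, 1), round(omm, 1), round(honors_composite(cognitive, omm), 1))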

DOCTOR PATIENT CONTINUUM (DPC) – BIOPSYCHOSOCIAL SCIENCES I

Grading and Evaluation Policy:

The examinations and evaluations are weighted as follows:

Evaluation Criteria: Percent of Grade
Content Examination 55%
Component Examinations 25%
Facilitator Assessment 20%

Content Examination: There will be a mid-term exam and an end-of-term exam, each weighted equally. The examinations will cover the learning issues submitted by the case-study groups. Questions will be based on the common learning issues (covered by all groups) and learning issues specific to individual groups (unique issues).

Component Exams: Distribution of the component exams will be as follows:
• Exams based on Anatomy lectures and labs = 20%
• Graded assignments offered by problem set instructors, which might include quizzes, position papers, and/or other exercises = 5%

Facilitator Assessment: Facilitators will meet individually with students twice during the term to evaluate their performance. The first evaluation will be ‘formative’ only, i.e., to advise students of their progress and will not be recorded for grade. The end of the term evaluation will be used to assess the student’s progress/participation in the group and other class related activities. Students will also complete Self-Assessment Forms to supplement the evaluation process.

The grading of this course is on a “PASS/FAIL/HONORS” basis.

1) Students will be evaluated each Term using the multiple components as described above.
2) Each year at the end of the 1st Term:
a) All students will be assigned an interim grade of I (Incomplete);
b) Each student will be informed of his/her final average, a record of which will be maintained in the office of the DPC Academic Coordinator and the Director of the DPC program.
3) Students who earn less than a 1st-Term average of 70%, or a content exam score of <65%, will be officially informed that their performance was deficient for the 1st Term. The student, in consultation with the Course Coordinator, will present a plan designed to resolve the deficiency. This information will also be forwarded to the Associate Dean of Academic Affairs for tracking purposes.
4) Students with a 1st-Term average <70%, or a content exam score of <65%, will be allowed to continue with the class. However, in order to pass the year the student must achieve a final yearly average (1st- and 2nd- term) of 70% or greater with a content exam average (for the two Terms) of 65% or greater.
5) All students who meet the requirements for passing the year (see 4) will then be awarded the grade of P (Pass) or H (Honors) for each of the two Terms.

6) Students who fail the year (see 4) will be awarded a grade of I (Incomplete) and will be permitted (with approval of the Associate Dean for Academic Affairs) to sit for a comprehensive reassessment-examination. The reassessment exam will be constructed by the course faculty and administered by the Course Coordinator. The exam may include both written and oral components. Successful completion of the reassessment examination will result in the awarding of a grade of P for the two Terms. Failure of the comprehensive reassessment exam will result in the awarding of a grade of F (Fail) for the two terms, and a recommendation to the Associate Dean of Academic Affairs that the student be dismissed from the College.
7) Students whose failure of the year (i.e. overall yearly average <70%) can be attributed to low facilitator assessment scores present a special concern. The student has been determined, by his/her facilitators, to be deficient in the skills necessary to effectively interact with patients and colleagues. This deficiency may not be resolvable by examination. Such failures will be evaluated by the Director of the DPC program, the Associate Dean of Academic Affairs and/or the Committee on Student Progress (CSP) to determine possible remediation programs or to consider other options including dismissal.
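As an illustration of how the weights and thresholds above combine, the following Python sketch computes a term average and the pass determination; the names and sample scores are hypothetical and are not taken from the course documents.

# Illustrative sketch of the DPC Biopsychosocial Sciences I grading rules described above.

def term_average(content_exam, component_exams, facilitator):
    """Content examination 55%, component examinations 25%, facilitator assessment 20%."""
    return 0.55 * content_exam + 0.25 * component_exams + 0.20 * facilitator

def passes_year(term1_avg, term2_avg, content1, content2):
    """Pass requires a yearly average >= 70% and a two-term content exam average >= 65%."""
    yearly_avg = (term1_avg + term2_avg) / 2
    content_avg = (content1 + content2) / 2
    return yearly_avg >= 70 and content_avg >= 65

# Example with hypothetical term scores
t1 = term_average(content_exam=72, component_exams=80, facilitator=85)
t2 = term_average(content_exam=68, component_exams=78, facilitator=82)
print(round(t1, 1), round(t2, 1), passes_year(t1, t2, 72, 68))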

DOCTOR PATIENT CONTINUUM (DPC) – CLINICAL SCIENCES II

Grading Policy:

  1. The grading of this course is on a “PASS/FAIL/HONORS” basis. Grades will be determined by performance in the three components of the course, OMM, Clinical Skills, and Clinical Practicum, as follows:

Evaluation Criteria: Percent of Grade

OMM 40%
Clinical Skills 40%
Clinical Practicum 20%

In both the OMM and Clinical Skills components of the course, student evaluations encompass written and practical examinations. To pass the course, the written and practical examinations in both OMM and Clinical Skills must be passed. Students who fail to achieve a passing score in either Clinical Skills or OMM will be issued a grade of “I” (Incomplete) and will be offered the opportunity to remediate the appropriate portion of the course. Re-evaluation will be conducted under the supervision of the DPC faculty. Successful completion of the re-evaluation examination, both written and practical, will result in a grade of P (Pass); failure of the comprehensive reassessment exam will result in a grade of U (Unsatisfactory) for the course. (An illustrative calculation of the composite course grade follows the component breakdowns below.)

  2. The OMM component will be graded according to the following criteria:

Evaluation Criteria: Percent of Grade

OMM written (weighted) 50%
OMM practical (average) 50%

  3. The Clinical Practicum component will be graded according to the following criteria:

Evaluation Criteria: Percent of Grade

Attendance and Participation 15%
Case Presentation 35%
Clinical Mentor Evaluation 50%

  4. The Clinical Skills component will be graded according to the following criteria:

Evaluation Criteria: Percent of Grade

Class participation/assignments 5%
ICC participation/assignments 10%
Timed examination #1
• Practical portion 20%
• Written portion 5%
Timed examination #2
• Practical portion 20%
• Written portion 5%
Timed Comprehensive examination
• Practical portion 25%
• Written portion 10%
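As a reading aid only, the sketch below shows how the component weights above combine into a composite course score, with the written/practical pass requirement checked separately, as the policy describes. The function names and input format are hypothetical.

```python
# Illustrative sketch of the DPC Clinical Sciences II composite grade using
# the weights listed above; names and input format are hypothetical.

COMPONENT_WEIGHTS = {"OMM": 0.40, "Clinical Skills": 0.40, "Clinical Practicum": 0.20}

def composite_score(scores):
    """Weighted course score from a dict of component scores (each 0-100)."""
    return sum(COMPONENT_WEIGHTS[name] * scores[name] for name in COMPONENT_WEIGHTS)

def passes_written_and_practical(omm_written, omm_practical, cs_written, cs_practical):
    """Both written and practical exams must be passed in OMM and Clinical Skills."""
    return all([omm_written, omm_practical, cs_written, cs_practical])

# Example: strong component scores still leave a grade of I (Incomplete) if any
# written or practical examination in OMM or Clinical Skills was failed.
score = composite_score({"OMM": 85, "Clinical Skills": 78, "Clinical Practicum": 90})
eligible = passes_written_and_practical(True, True, True, False)
print(round(score, 1), "eligible to pass" if eligible else "I (Incomplete)")   # -> 83.2 I (Incomplete)
```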

Pre-clinical Years: Years I and II DPC Track

Assessing the American Osteopathic Association (AOA) Core Competencies at New York College of Osteopathic Medicine (NYCOM)
A. Background

In recent years, there has been a trend toward defining, teaching, and assessing a number of core competencies physicians must demonstrate. The Federation of State Medical Boards sponsored two Competency-Accountability Summits at which a “theoretical textbook” on good medical practice was drafted to guide the development of a competency-based curriculum. The competencies include: medical knowledge, patient care, professionalism, interpersonal communication, practice-based learning, and system-based practice. The AOA supports the concept of core competency assessment and added an additional competency: osteopathic philosophy and osteopathic clinical medicine.

Arguably, it is desirable to begin core competency training and assessment during the pre-clinical years. Patient simulations, i.e., using standardized patients and robotic simulators, allow for such training and assessment under controlled conditions. Such a pre-clinical program provides basic clinical skills acquisition in a patient-safe environment. NYCOM has responded to this challenge by creating a two-year “Core Clinical Competencies” seminar that requires students to learn and practice skills through various patient simulations in the Institute for Clinical Competence (ICC). In this seminar the ICC assesses a sub-set of the above competencies taught in the lecture-based and discussion-based clinical education tracks.

The following is a list of the competencies assessed during the pre-clinical years at NYCOM and reassessed during the third year (osteopathic medicine objective structured clinical examination) and fourth year (voluntary Clinical Skills Capstone Program). It should be noted that there is a fair amount of overlap in skills between the competencies; for example, proper communication is manifested in a number of them.

B. Core Clinical Competencies

  1. Patient Care: Provide compassionate, appropriate, and effective treatment and health promotion

Skills:
ï Data-gathering: history-taking, physical examination (assessed with clinical skills checklists)
ï Develop differential diagnosis
ï Interpret lab results, studies
ï Procedural skills, e.g. intubation, central line placement, suturing, catheterization
ï Provide therapy

  2. Interpersonal and communication skills: Effective exchange of information and collaboration with patients, their families, and health professionals.

Skills:
ï Communication with patients and their families across a spectrum of multicultural backgrounds (assessed with the Professionalism Assessment Rating Scale)

ï Health team communication
ï Written communication (SOAP note, progress note)

  3. Professionalism: Commitment to carrying out professional responsibilities and ethical commitments

Skills:
ï Compassion, respect, integrity for others
ï Responsiveness to patient needs
ï Respect for privacy, autonomy
ï Communication and collaboration with other professionals
ï Demonstrating appropriate ethical consideration
ï Sensitivity and responsiveness to a diverse patient population including e.g. gender, age, religion, culture, disabilities, sexual orientation.

  4. Osteopathic Philosophy and Osteopathic Clinical Medicine: Demonstrate and apply knowledge of osteopathic manipulative treatment (OMT); integrate osteopathic concepts and OMT into medical care; treat the person, not just the symptoms

Skills:
ï Utilize caring, compassionate behavior with patients
ï Demonstrate the treatment of people rather than the symptoms
ï Demonstrate understanding of somato-visceral relationships and the role of the musculoskeletal system in disease
ï Demonstrate listening skills in interaction with patients
ï Assessing disease (pathology) and illness (patient’s response to disease)
ï Eliciting psychosocial information

C. Assessment of Core Competencies

The ICC utilizes formative assessment to evaluate learner skills and the effectiveness of NYCOM’s clinical training programs. Data on student performance in the ICC is tracked from the first through the fourth year. The ICC satellite at St. Barnabas assesses students during their clerkship years as well as interns and residents in a number of clinical services. It uses a variety of methods to assess competencies:

  1. Written evaluations
    ï Analytic assessment – skills checklists that document data-gathering ability
    ï Global-holistic rating scales to assess doctor-patient communication (Professionalism Assessment Rating Scale) and health team communication (SimCom-T)
    ï SOAP note and progress note assessment
  2. Debriefing / feedback – a verbal review of learner actions following a patient simulation program provided by standardized patients and instructors as appropriate.

Core Clinical Competencies 590 (MS 1)
Core Clinical Competencies 690 (MS 2)

The courses provide horizontal integration between the clinical courses offered by the LDB and DPC programs (small-group discussion and demonstration) and the OMM department. They provide practice with simulated patients (with some variation in this aspect, as noted below), formative assessment, end-of-year summative assessment, and remediation.

  1. SP PROGRAM, METRICS AND HOURS

MS 1 Program – SP
LDB and DPC – different programs, same standardized examination

LDB
• SP program: training with formative assessment (see next bullet for formative assessment metrics)
• End-of-year OSCE assessing history-taking (checklists designed for each SP case), PE (see attached physical examination criteria) and interpersonal communication (see attached program in doctor-patient communication, the Professionalism Assessment Rating Scale)
• Hours: 13.5 / year (including OSCE)

DPC
• Clinic visits to substitute for SP encounters
• End-of-year OSCE (same as LDB)
• Hours: should be equivalent to the number of SP hours in the LDB program

NOTE: The purpose of the OSCE is to assess the clinical training of both the LDB and DPC programs. It is assumed the LDB and DPC faculty will work on this OSCE together with the OMM department.

MS 1 Program – Patient Simulation Program

LDB and DPC

• Same program in basic procedures for both LDB and DPC students, as outlined in the syllabus distributed at the Curriculum Committee meeting
• Hours: 5 hours / year

MS 2 Program – SP

LDB and DPC – same program, different approaches, same standardized exam
• SP program: training with formative assessment (see next bullet for formative assessment metrics)
• End-of-year OSCE assessing history-taking (checklists designed for each SP case), PE (see attached physical examination criteria) and interpersonal communication (see attached program in doctor-patient communication, the Professionalism Assessment Rating Scale)
• Hours: 13.5 hours / year (including OSCE)
• NOTE: It is assumed that the LDB and DPC program schedules will vary but that the content will be equivalent

MS 2 Program – Patient Simulation Program

LDB and DPC – same program, same standardized exam
• Students work in the same group throughout the year
• End-of-year OSCE assessing medical team communication using the SimCom-T rating scale (attached)
• Group grade assigned for the OSCE (reflecting the spirit of the SimCom-T rating scale)
• Hours: 11 / year (including OSCE)

  2. Attendance
• All activities and exams are mandatory.
• Make-ups are done at the discretion of the ICC.

NOTE: Make-ups will be done as close to an activity as possible because delaying them, e.g. to the end of the year, will incur additional training expenses for the ICC (e.g. re-training an SP for a case played months earlier).

  3. Grading and remediation
• Pass / fail
• Grading is based upon:
    o Attendance
    o Participation
    o End-of-year OSCE (standards to be set)

ICC Hours

MS1 (Clinical Practice / OSCE / Total Hours)

LDB
• Clinical practice: 8 SP exercises @ 1.5 hours each = 12 hours per student
• OSCE: end-of-year SP OSCE, 1.5 hours per student (approximately 6.25 days)
• Subtotal: 13.5 hours (SP)
• Clinical practice: 5 patient simulation program exercises @ 1 hour each = 5 hours per student
• Subtotal: 5 hours (Pat Sim)
• Total = 18.5 hours

DPC
• Clinical practice: clinic experience to substitute for SP exercises; students will receive information re: communication and PE competencies
• Subtotal: 0 hours (SP)
• Clinical practice: 5 patient simulation program exercises @ 1 hour each = 5 hours per student
• Subtotal: 5 hours (Pat Sim)
• Total = 5 hours

MS2 (Clinical Practice / OSCE / Total Hours)

LDB and DPC
• Clinical practice: 8 SP exercises @ 1.5 hours each = 12 hours per student
• OSCE: end-of-year SP OSCE, 1.5 hours per student (approximately 6.25 days)
• Subtotal: 13.5 hours (SP)
• Clinical practice: 6 patient simulation program exercises, plus ACLS = 10 hours per student
• OSCE: end-of-year Pat Sim OSCE, 1 hour per student (approximately 5 days)
• Subtotal: 11 hours (Pat Sim)
• Total = 24.5 hours
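For readers checking the arithmetic, the small sketch below reproduces the per-student hour totals listed above; the function and variable names are hypothetical and carry no official meaning.

```python
# Illustrative check of the ICC hour totals above (names are hypothetical).

def icc_hours(sp_exercises, sp_hours_each, sp_osce_hours, patsim_hours):
    """Return (SP subtotal, overall total) in hours per student."""
    sp_subtotal = sp_exercises * sp_hours_each + sp_osce_hours
    return sp_subtotal, sp_subtotal + patsim_hours

# MS1 LDB: 8 SP exercises @ 1.5 h, 1.5 h SP OSCE, 5 h patient simulation
print(icc_hours(8, 1.5, 1.5, 5))    # -> (13.5, 18.5)

# MS2 LDB/DPC: 8 SP exercises @ 1.5 h, 1.5 h SP OSCE, 10 h simulation + 1 h Pat Sim OSCE
print(icc_hours(8, 1.5, 1.5, 11))   # -> (13.5, 24.5)
```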

Institute For Clinical Competence (ICC)
Professionalism Assessment Rating Scale (PARS)

Dear Students:

As part of your professional development, standardized patients (SPs) in the ICC will be evaluating your interpersonal communication with them using the Professionalism Assessment Rating Scale (PARS).

This scale evaluates two types of interpersonal communication, both important to quality health care:

• Patient Relationship Quality – Rapport, empathy, confidence and body language.
• Patient Examination Quality – Questioning, listening, information exchanging and careful and thorough physical examination.

Arguably patients (real or simulated) are in the best position to assess your interpersonal communication with them because you are directly relating to them during an intimate, face-to- face, hands-on encounter. They are in the best position, literally, to observe your eye contact, demeanor and body language because they are in the room with you. We would recommend you take their feedback seriously, but perhaps “with a grain of salt.”

The term standardized patient is to some degree a misnomer – SPs can be standardized to present the same challenge and the same medical symptoms to each student, but they cannot be standardized to feel the same way about you and your work with them compared to other students. This is true in life as well as clinical work – some people will like you better than others, and patients are people! You may communicate with one patient the way you do with the next, but receive slightly different ratings. This is to be expected. Unlike the analytic checklists we use to document if you asked particular questions or performed certain exams correctly, there are no dichotomous / “right or wrong” communication ratings. Patients are people who may tune into different things during an encounter. We think this slight variation in observation is an asset that will help you understand that patients are individuals who must be approached as individuals.

Another word about the ratings you will receive – the ratings are not absolute numbers that constitute an unconditional assessment of your communication skills. Some days you may be better than other days. We use the rating numbers (a 1-8 holistic scale) to chart progress over time. We do see improvements during the first two years of the typical student’s training, but the ratings are used to track your progress as much as to structure a conversation with the SP, or a faculty member, during debriefing. We would recommend you take responsibility during SP debriefing and ask them questions about the work you just did.

The holistic 1 – 8 scale is broken down into two parts: Ratings of 1 – 4 are considered “lower quality” communication, i.e. what might be considered acceptable at a novice or trainee level, but less acceptable for an experienced professional. Ratings of 5 – 8 are considered “higher quality” communication, i.e. more professional-quality communication regardless of the training or experience level.
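For illustration only, the sketch below shows how a single 1-8 rating maps onto the two quality bands described above, and how ratings might be averaged across encounters to chart progress; the function names and sample data are hypothetical.

```python
# Minimal sketch of the PARS 1-8 holistic bands (names and data hypothetical).

def pars_band(rating):
    """Classify one PARS rating into the bands described above."""
    if not 1 <= rating <= 8:
        raise ValueError("PARS ratings range from 1 to 8")
    return "lower quality (1-4)" if rating <= 4 else "higher quality (5-8)"

def mean_rating(ratings):
    """Average a student's ratings on one factor across several SP encounters."""
    return sum(ratings) / len(ratings)

# Hypothetical ratings on 'Demonstrate empathy' across four encounters in a year:
empathy = [3, 4, 5, 6]
print(pars_band(empathy[-1]), round(mean_rating(empathy), 2))   # -> higher quality (5-8) 4.5
```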

© 2007 NYCOM Do not reproduce or distribute without permission 9/4/07

Professionalism Assessment Rating Scale (PARS)

Standardized patients will rate “to what degree” you demonstrated relationship quality and
examination quality on the following nine factors:

RELATIONSHIP QUALITY

To what degree did the student …    (Lower Quality = 1-4, Higher Quality = 5-8)
1 Establish and maintain rapport 1 2 3 4 5 6 7 8
2 Demonstrate empathy 1 2 3 4 5 6 7 8
3 Instill confidence 1 2 3 4 5 6 7 8
4 Use appropriate body language 1 2 3 4 5 6 7 8
EXAMINATION QUALITY

To what degree did the student …    (Lower Quality = 1-4, Higher Quality = 5-8)
5 Elicit information clearly, effectively 1 2 3 4 5 6 7 8
6 Actively listen 1 2 3 4 5 6 7 8
7 Provide timely feedback / information / counseling 1 2 3 4 5 6 7 8
8 Perform a thorough, careful physical exam or treatment 1 2 3 4 5 6 7 8

(Lower = less experienced or unprofessional; Higher = more professional)

The following pages are a guide to the PARS, giving examples of “lower quality” and “higher quality” communication.

1 Establish and maintain rapport
Establish and maintain a positive, respectful collaborative working relationship with the patient.

Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
• Overly familiar (“Hi Bill, I’m John. How are you doing today.”) vs. appropriate address, e.g. “Hi Mr. Jones, I’m Student-doctor Smith. Is it OK if I call you Bill?”
• No agenda set; no collaboration with the patient, i.e. carries out the exam without patient consent or agreement vs. set an agenda (“We have minutes for this exam. I’ll take a history, examine you…..etc.”) and a collaborative mindset (“Let’s figure out what’s going on.” “We’re going to work out this problem together.”)
• Took notes excessively, i.e. spent more time taking notes than interacting vs. spent more time interacting with the patient than taking notes.
• Began physically examining the patient without “warming” the patient up, asking consent, etc. vs. asked consent for the physical examination, e.g. “Is it OK for me to do a physical exam?”
• Did not protect patient’s modesty (did not use a drape sheet; did not direct patient to get dressed after exam; left door open when examining patient) vs. respected patient’s modesty at all times (used a drape sheet when appropriate; let patient cover up following an examination).
• Talked “down” to the patient, did not seem to respect the patient’s intelligence vs. seemed to assume the patient is intelligent.
• Rude, crabby or overtly disrespectful vs. never rude, crabby; always respectful.
• Dress and hygiene problems (wore distracting perfume/cologne; poor hygiene, e.g. uncleanly, dirty nails, body odor, did not wash hands; touched hair continually; unprofessional dress, e.g. jeans, facial jewelry such as tongue or nose studs, overly suggestive or revealing garments) vs. dressed professionally, i.e. in a clean white coat, clean clothes, etc.
• Seemed angry with the patient vs. seemed to like the patient.

2 Demonstrate empathy

Demonstrate both empathy (compassion, understanding, concern, support) and inquisitiveness (curiosity, interest) in the patient’s medical problem and life situation.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
EMPATHY
No expressions of concern about patient’s
condition or situation. Expressed concern about patient’s condition or
situation, e.g.
 “That must be painful.”
 “I’m here to try to help you.”
Failed to acknowledge positive behavior /
lifestyle changes the patient has made. Reinforced behavior/lifestyle changes the patient
has made, e.g. “That’s great you quit smoking.”
Failed to acknowledge suggested behavior /
lifestyle changes might be difficult. Acknowledged that suggested behavior/lifestyle
changes might be difficult.
Empathic expression seemed insincere,
superficial. Empathic expressions seemed genuine.
Detached, aloof, overly “business-like,” robotic in
demeanor.

Seeming lack of compassion, caring. Compassionate and caring, “warm.”
Accused patient of being non-compliant, e.g.
 “Why don’t you take better care of yourself?”
 “You should have come in sooner.” Positive reinforcement of things patient is doing
well, e.g.
 “That’s great that you stopped smoking.”
 “I’m glad you are taking your medication on a regular basis.
INQUISITIVENESS – An aspect of empathy is inquisitiveness, the ability to attempt to understand the patient, both medically and personally.
Focused on symptoms, but not the patient, i.e.
did not explore how the medical problem / symptoms affect the patient’s life.

Failed to explore activities of daily living. Tried to understand how the medical problem /
symptoms affect the patient’s life, or vice versa.
 “How is this affecting your life?”
 “Tell me about yourself.”
 “Describe a typical day in your life.”
 “Tell me about your stress.”
Failed to explore patient’s response to diagnosis
and / or treatment. Inquires as to patient’s response to diagnosis and
/ or treatment
Failed to explore barriers to behavior / lifestyle
change. Explored barriers to behavior / lifestyle change.

3 Instill confidence

Instilling confidence that the medical student or doctor is able to help and treat the patient.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Conveyed his / her anxiety, e.g.
 By avoiding eye contact
 Laughing or smiling nervously
 Sweaty hand shake

Made statement such as:
 “This is making me nervous.”
 “This is the first time I’ve ever done this.”
 “I don’t know what I’m doing.”

Apologized inappropriately to the patient. E.g.
 “I’m sorry, but I have to examine you.” Conveyed an appropriately confident demeanor,
e.g.
 Made eye contact
 Shook hands firmly, etc.
Overly confident, cocky. Never cocky, appropriately humble without
undermining the patient’s confidence.
When making suggestions, used tentative
language, e.g.
 “Maybe you should try…”

 “I’m not sure but …” When making suggestions, used authoritative
language, e.g.
 “What I suggest you do is…”
Made excuses for his/her lack of skill or
preparation by making statements such as:
 “I’m just a medical student.”

 “They didn’t explain this to me.”

 “Do you know what I’m supposed to do next?” Offered to help the patient or get information if he
/ she could not provide it by saying, e.g.

 “Let me ask the attending physician”

 “I don’t know but let me find out for you.”

4 Use appropriate body language

The ability to use appropriate gestures, signs and body cues.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Overly casual posture, e.g. leaning against
the wall or putting feet up on a stool when interviewing the patient. Professional posture, i.e. carried himself / herself
like an experienced, competent physician.
Awkward posture, e.g.
ï Stood stiffly when taking a history
ï Stood as if he / she was unsure what to do with his / her body. Natural, poised posture.
Uncomfortable or inappropriate eye contact
e.g. stared at the patient too long and / or never looked at the patient. Used appropriate eye contact.
Avoided eye contact when listening. Made eye contact when listening, whether eye
level or not.
Stood or sat too close or too distant from the
patient. Maintained an appropriate “personal closeness”
and “personal distance.”
Turned away from the patient when listening. Maintained appropriate body language when
listening to the patient.

5 Elicit information clearly, effectively

Effectively ask questions in an articulate, understandable, straightforward manner.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Used closed-ended, yes / no questions
exclusively, e.g.

 “How many days have you been sick?”
 “Ever had surgery?”
 “Any cancer in your family?” Used open-ended questions to begin an inquiry,
and closed-ended questions to clarify, e.g.

 “Tell me about the problem.”
 “What do you do in a typical day?”
 “How is your health in general?”
Used open-ended questions / non-clarifying
questions exclusively.
Used open-ended questions to begin an inquiry,
and closed-ended questions to clarify.
Student’s questions were inarticulate, e.g.
mumbled, spoke too fast, foreign accent problems, stuttered*, etc.

  • NOTE: Consider stuttering a form of inarticulation for rating purposes, i.e. do not make allowances for stuttering.
Student was articulate, asked questions in an intelligible manner.
Asked confusing, multi-part or overly complex questions, e.g.

 “Tell me about your past medical conditions, surgeries and allergies.” Asked one question at a time, in a straight-forward
manner.

 “Tell me about your allergies.”
Asked leading questions, e.g.
 “No cancer in your family, right?”
 “No surgeries?”
 “You only have sex with your wife, right?” Asked direct questions, e.g.

 “Do you have any cancer in your family?
 “Any surgeries?”
 “Are you monogamous?”
Jumped from topic to topic
in a “manic,” disjointed or disorganized way. Organized interview.

Stayed focused, asked follow up questions before moving to another topic.
Asked questions in a robotic way,
i.e. as if reading from a prepared checklist. Asked questions in a conversational way, i.e.
listened to the response, and then asked another question.
Constantly cut off patient, i.e. did
not let patient finish sentences. Allowed patient to finish sentences and thoughts
before asking the next question.

6 Actively listen

Both listen and respond appropriately to the patients’ statements and questions.

Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Asked questions without listening to the
patient’s response. Asked questions and listened to patient’s
response.
No overt statements made indicating he / she
was listening. Said, e.g. “I’m listening.”
Turned away from the patient when listening. Maintained appropriate body language when
listening to the patient.
Kept asking the same question(s) because
the physician didn’t seem to remember what he / she asked. If necessary, asked the same questions to obtain
clarification, e.g.
 “Can you tell me again how much you smoke?”
 “I know you told me this, but when was the last time you saw your doctor?”
Wrote notes without indicating he / she was
listening. When writing indicated he / she is listening, e.g.
 “I have to write down a few things when we talk, OK?”
Did not seem to be listening, seemed
distracted. Attentive to the patient.
Kept talking, asking questions, etc. if the
patient was discussing a personal issue, a health concern, fear, etc. Was silent when necessary, e.g. if the patient was
discussing a personal issue, a health concern, fear, etc.

7 Provide timely feedback / information / counseling
Explain, summarize information (e.g. results of physical exams, provides patient education activities, etc.), or provide counseling in a clear and timely manner.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Did not explain examination procedures, e.g. just started examining the patient without explaining what he / she was doing.
Explained procedures, e.g.

 “I’m going to check your legs for edema.”

 “I’m going to listen to your heart.”
Did not provide feedback at all, or provided minimal feedback Periodically provided feedback regarding what he / she heard the patient saying.
 “It sounds like your work schedule makes it difficult for you to exercise.”
 “I hear in your voice that your family situation is causing you a lot of stress.”
Did not summarize information at all. Periodically summarized information.
 “You had this cough for 3 weeks, it’s getting worse and now you’ve got a fever. No one is sick at home and you haven’t been around anyone who is sick.”
Provided empty feedback or unprofessional feedback, e.g.

 “OK…..OK…..OK…..OK…”
 “Gotcha..gotcha…gotcha,..”
 “Great ” “Awesome” “Cool” Feedback was meaningful, useful and timely.
Examined the patient without providing feedback about the results of the exam. Provided feedback about results of the physical exam.

 “Your blood pressure seems fine.”
Refused to give the patient information he / she requested, e.g.

“You don’t need to know that.” “That’s not important.” Gave information to the patient when requested, or offered to get it if he / she couldn’t answer the patient’s questions.
Used medical jargon without explanation, e.g.

 “What you experienced was a myocardial infarction.” Explained medical terms.

 “What you experienced is a myocardial infarction, meaning a heart attack.”
Ended the exam abruptly.

No closure, no information about the next steps Let the patient know what the next step was, provided closure.

 “Let’s review the exam and your health…”

8 Conduct a thorough, careful physical exam or treatment

Conduct physical exams and / or treatment in a thorough, careful manner vs. a tentative or superficial manner.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
Conducted a superficial examination, e.g.
 Avoided touching the patient
 Touched patient with great tentativeness Conducted a careful examination, e.g.
 Examined on skin when appropriate
Hurried through the exam. Used the full amount of time allotted to examine
the patient.
Avoided inspecting (looking at) the patient’s
body / affected area. Thoroughly inspected (looked at) the affected
area e.g. with gown open.
Consistently palpated, auscultated and / or
percussed over the exam gown. Consistently palpated, auscultated and / or
percussed on skin.
Exam not bi-lateral (when appropriate). Bi-lateral exam (when appropriate).
Rough exam, e.g.
 Started, stopped, re-started the exam.
 Fumbled with instruments Conducted a smooth exam from beginning to
end.
Did not look to see what patient’s expressions
were during an examination in order to assess pain. Looked for facial expressions to assess pain.
Did not thoroughly examine the site of the
chief complaint, e.g.
 Did not examine heart and / or lungs if chief complaint was a breathing problem Thoroughly examined the site of the chief
complaint.

9 Conduct the examination in an organized manner

Overall conduct the exam in an organized, systematic way vs. a disorganized or unsystematic way.
Lower Quality: 1 2 3 4    Higher Quality: 5 6 7 8
No clear opening, e.g.
 Did not set an agenda
 Abruptly began the exam

Medical interview not organized – history jumped from topic to topic

No clear closure, e.g.
 Did not summarize information gathered during the history and physical examination

 Did not ask patient “Any more questions?”

 Did not clarify next steps Clear opening, e.g.
 Set an agenda and followed it
 Began the exam after a proper introduction

Organize the medical interview vs. jumping from topic to topic

Clear closure, e.g.

 Summarized information gathered during the history and physical examination

 Asked patient “Any more questions?”

 Clarified next steps

SimCom-T(eam) Holistic Scoring Guide

The SimCom-T is a holistic health care team communication training program and rating scale. The nine-factor scale of SimCom-T rates team members’ performance as a unit, i.e. individual team member performance should be considered a reflection upon the entire team.

Rate each factor individually.
Ratings should be global, i.e. reflect the most characteristic performance of the team vs. individual incidents.

Competency    (ratings run from 1 = Lower Quality to 5 = Higher Quality; CNE = could not be evaluated)
1 Leadership establishment and maintenance 1 2 3 4 5 CNE
2 Global awareness 1 2 3 4 5 CNE
3 Recognition of critical events 1 2 3 4 5 CNE
4 Information exchange 1 2 3 4 5 CNE
5 Team support 1 2 3 4 5 CNE
6 External team support 1 2 3 4 5 CNE
7 Patient support 1 2 3 4 5 CNE
8 Mutual trust and respect 1 2 3 4 5 CNE
9 Flexibility 1 2 3 4 5 CNE
10 Overall Team Performance 1 2 3 4 5 CNE

The following pages are a guide to SimCom-T, providing behavioral examples representative of each score for the SimCom-T competencies.

Score Performance Level Description – The team…
1 Limited ….consistently demonstrates novice and / or dysfunctional team attributes
2 Basic ….inconsistently operates at a functional level
3 Progressing ….demonstrates basic and average attributes
4 Proficient ….proficient and consistent in performance
5 Advanced ….experienced and performing at a significant expert level
CNE Not applicable ….A factor could not be evaluated for some reason

  1. Leadership Establishment and Maintenance

Team members both establish leadership and maintain leadership throughout.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Leader not established
▪ Roles not assigned
▪ No discussion regarding role assignment ▪ Unable to identify leader
▪ Many leaders
▪ No clear role definition ▪ Leadership not explicit throughout event
▪ Leadership not maintained throughout the event
▪ Role switching without leader involvement ▪ Leader explicitly identified
▪ Roles defined ▪ Leadership explicitly identified and maintained
▪ Roles defined and maintained
▪ Leader delegates responsibility
Examples ▪ Team operating dysfunctionally without a leader
▪ Team members taking on similar roles and role switching consistently
▪ Team members unsure of who is responsible for different tasks ▪ Leader timid and does not take charge
▪ Team member roles unclear and/or inconsistent ▪ A team member asks, “Who is running the code?” and another says, “I am,” but does not communicate leadership responsibilities.
▪ Team members are assigned roles but do not take on the assignment ▪ Team members select a leader
▪ A team member volunteers to handle the situation
▪ Roles clearly defined by team members and/or leader ▪ Leadership and roles are established very early in the event and is maintained throughout the event
▪ Clarity of leadership and roles is evident throughout the event and with the team members

  2. Global Awareness

Team members monitor and appropriately respond to the total situation, i.e. the work environmental and the patient’s condition.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Does not monitor the environment and patient
▪ Does not respond to changes in the environment and
patient ▪ Monitoring and response to changes in the environment and patient rarely occur
▪ Fixation errors ▪ Monitoring and response to the environment and patient are not evident throughout the event ▪ Monitors the environment and patient
▪ Respond to changes in the environment and patient ▪ Consistently monitors the environment and patient
▪ Consistently respond to changes in the environment and
patient
Examples ▪ There is no summary of procedures, labs ordered, or results of labs
▪ Team is task oriented and does not communicate about the event ▪ Event manager loses focus and becomes task oriented
▪ There is no clear review of the lab results and/or summary of procedures. ▪ Leader says, “Team, lets review our differential diagnosis and labs,” and team does not respond to the leader.
▪ Some of the team members discuss among themselves results and possible problems. ▪ Leader says, “Team, lets review our differential diagnosis and labs,” and team reviews the situation.
▪ ▪ Event manager remains at the foot of the bed keeping a global assessment of the situation
▪ Leader announces plan of action for the event.

  3. Recognition of Critical Events

Team promptly notes and responds to critical changes in the patient’s status and / or environment.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Does not monitor or respond to critical deviations from steady state
▪ Fails to recognize or acknowledge crisis
▪ “Tunnel Vision” ▪ Fixation errors are consistently apparent
▪ ▪ Team reactive rather than proactive
▪ Critical deviations from steady state are not announced for other members ▪ Monitors and responds to critical deviations from steady state
▪ Recognizes need for action ▪ All team members consistently monitors and responds to critical deviations from steady state
▪ Anticipates potential problems
▪ Practices a proactive approach and attitude
▪ Recognizes need for action
▪ “Big Picture”
Examples ▪ Patient stops breathing, and team does not recognize the situation throughout the event
▪ Patient is pulseless, and no CPR is started throughout the event ▪ Patient stops breathing, and team does not recognize this situation for a critical time period
▪ Patient is pulseless, and no CPR is started for a critical time period ▪ ▪ Leader says, “Team, lets review our differential diagnosis, are there any additional tests that we should request?” ▪ “John, the sats are dropping, please be ready, we might have to intubate.”
▪ “Melissa, the blood pressure is dropping. Get ready to start the 2nd IV and order a type and cross.”

  4. Information Exchange

Patient and procedural information is exchanged clearly.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Communication between team members is not noticeable
▪ Requests by others are not acknowledged
▪ No feedback loop
▪ No orders given ▪ Vague communication between team members
▪ Not acknowledging requests by others
▪ Feedback loop left opened
▪ Orders not clearly given ▪ Communication between team and response to requests by others inconsistent
▪ Feedback loops open and closed
▪ Orders not directed to a specific team member ▪ Team communicates and acknowledges requests throughout the event
▪ Feedback loops closed ▪ Explicit communication consistently throughout the event
▪ Team acknowledges communication
▪ Closed loop communication throughout event
Examples ▪ No summary of events.
▪ No additional information sought from the team members. ▪ Event manager says, “I need a defibrillator, we might have to shock this patient,” and no team member acknowledges the order. The request was not given explicitly to a team member.
▪ ▪ One team member says to another in a low voice, “We need to place a chest tube,” but the event manager does not hear the communication.
▪ Event manager requests a defibrillator, but not explicitly to a particular team member; several team members
attempt to get the defibrillator ▪ Jonathan says to event manager, “We need to place a chest tube.” Event manager responds, “OK, get ready for it.”
▪ Leader says, “Team, let’s summarize what has been done so far.”
▪ Leader says, “Mary please start an IV.” Mary responds, “Sorry, I do not know how, please ask someone else to do
it.” ▪ Event manager summarizes events.
▪ Event manager seeks additional information from all team members
▪ Event manager says, “Peter, I want you to get the defibrillator, we might have to shock this patient.” Peter responds, “Yes, I know where it is and I’ll get it.”

  5. Team Support

The team works as a unit, asking for or offering assistance when needed vs. team members “ going it alone.”

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ No assistance or help asked for or offered
▪ Team members act unilaterally
▪ No recognition of mistakes
▪ Team members watching and not
participating ▪ Team members take over when not needed
▪ Mistakes not addressed to the team
▪ Negative feedback ▪ Assistance is offered when needed only after multiple requests ▪ Team recognizes mistakes and constructively addresses them ▪ Team member(s) ask(s) for help when needed
▪ Assistance provided to team member(s) who need(s) it
Examples ▪ During a shoulder dystocia event, the critical situation is recognized, but no help is requested or attempts to resolve situation on their own
▪ Wrong blood type delivered and administered, and no backup behaviors to correct the mistake
▪ Team member administers medication without consulting the event manager ▪ Charles knows that the patient is a Jehovah’s Witness and does not let the team know when a T&C is ordered.
▪ A team member does not communicate that he/she does not know how to use a defibrillator, attempts to do it anyway, and fails. ▪ ▪ ▪ During a shoulder dystocia event, the critical situation is recognized, and the event manager calls for help
▪ Wrong blood type delivered, attempt made by team member to administer the blood but another team member recognizes the mistake and stops the transfusion before it starts
▪ Team member consults with the event manager before administering
medication

  6. External Team Support

Work team provides “ external team” (family members and / or other health care professionals) with information and support as needed

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team fails to recognize or interact with other significant people who are present during the encounter ▪ Team recognizes other significant people who are present during the encounter but fails to interact
with them ▪ Team inconsistently interacts with other significant people who are present during the encounter ▪ Team interacts with other significant people who are present during the encounter ▪ Team effectively interacts with other significant people who are present during the encounter
Examples ▪ Team fails to interact with a distraught family member and/or para-professional ▪ Team fails to interact appropriately with a distraught family member
▪ Team does not cooperate with a para-professional ▪ ▪ ▪

  7. Patient Support

Work team provides the patient and significant others with information and emotional support as needed.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team fails to interact with patient if conscious
▪ Team fails to show empathy or respect for a patient (conscious or unconscious)
▪ Team fails to provide appropriate information when requested to do so ▪ Team’s interaction with the patient is minimal and, when it occurs, lacks respect or empathy ▪ Team inconsistently shows empathy or respect for a patient (conscious or unconscious)
▪ Team inconsistently provides information when requested to do so ▪ Team shows empathy toward patient
▪ Team provides appropriate information when requested to do so ▪ Team demonstrates consistent and significant respect and empathy for patient
▪ Appropriate information is provided consistently
Examples ▪ Team deals with an unconscious patient with a lack of respect,
e.g. by joking about his / her condition
▪ Charles knows that the patient is a Jehovah’s Witness and does not let the team know when a T&C is
ordered. ▪ ▪ ▪ Charles lets the leader know that the patient is a Jehovah Witness and that she refused blood products. ▪

  8. Mutual Trust and Respect

The team demonstrates civility, courtesy and trust in collective judgment.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team exhibits e.g. rudeness, overt distrust/mistrust, anger or overt doubt or suspicion toward
each other ▪ Few team members exhibit rudeness, overt distrust, anger or suspicion toward each other ▪ Team inconsistently demonstrates respect, rudeness, distrust or anger toward each other ▪ Team exhibits e.g. civility, courtesy, and trust in collective judgment ▪ Team is significantly respectful of each other
▪ Praise when appropriate
Examples ▪ Angry, stressed event manager says to team member, “I can’t believe you can’t intubate the patient. What’s the matter with you?”
▪ Team member says to another, “You don’t know what you’re doing-let me do it for you.”
▪ Event manager recognizes a chest tube is needed, and barks, “Michelle, I want you to put in a chest tube, I want you to do it now, and I want you to do it right on your first attempt.” ▪ Leader overbearing and intimidating ▪ ▪ Stressed but composed leader recognizes a team member cannot intubate the patient and offers assistance
▪ Team member says to another, “Are you OK? Let me know if I can help you.”
▪ Event manager recognizes a chest tube is needed and says, “Michelle, this patient needs a chest tube-can you put it in now?” ▪ Leader is clear, direct, and calm.
▪ Team members will thank each other when appropriate.

  9. Flexibility

The team adapts to challenges, multitasks effectively, reallocates functions, and uses resources effectively; team self correction.

Lower Quality   Higher Quality  

Score 1 2 3 4 5 CNE
Level Limited Basic Progressing Proficient Advanced
Description ▪ Team rigidly adheres to individual team roles
▪ Inefficient resource allocation / use ▪ Minimal adaptability and/or hesitation to changing situations ▪ Team can adapt to certain situations, but not all ▪ Generally very flexible
▪ Multi-tasks effectively
▪ Reallocates functions
▪ Uses resources effectively ▪ Team adapts to challenges consistently
▪ Engages self- correction
Examples ▪ Ambu-bag not working, and no reallocation of resources established
▪ Team members stay in individual roles, failing to support each other e.g. by failing to recognize fatigue of those giving CPR
▪ Patient’s hysterical family member disrupts the team and team continues providing care, ignoring disruptive relative ▪ ▪ ▪ Ambu-bag not working, and an airway team member gives mouth-to-mouth with a mask and event manager asks another team member to retrieve a working ambu-bag
▪ Team members alternate giving CPR, recognizing fatigue of those giving CPR
▪ Patient’s hysterical family member disrupts the team and a team manages the situation, e.g. removes, counsels, or
reassures the family member ▪

  10. Overall Team Performance Lower Quality Higher Quality
    Score 1 2 3 4 5 CNE
    Level Limited Basic Progressing Proficient Advanced
    Description ▪ Consistently operating at a novice training level ▪ Demonstrates inconsistent efforts to operate at a functional level ▪ Inconsistently demonstrates below and average attributes ▪ Demonstrates significant cohesiveness as a team unit;
    ▪ Performs proficiently ▪ Consistently operates at an experienced and professional level; performs as experts
    Training Level ▪ Team requires training at all levels; unable to function independently ▪ Team needs training at multiple levels to function independently ▪ Team needs focused training to function independently ▪ Team can function independently with supervision ▪ Team functions independently

Case A – Dizziness, Acute

Student Student ID SP ID

History Scoring: Give students credit (Yes) if they ask any of the following questions and / or SPs give the following responses. If the question(s) are not asked or the response(s) not given, give no credit (No).

HISTORY CHECKLIST   Yes No

1 ONSET, e.g. “When did dizziness start?”
ï “The dizziness started last night when I was cleaning up after dinner.”
2 PAST MEDICAL HISTORY OF PROBLEM, e.g. “Ever had this problem before?”
 “I almost passed out once in a restaurant a few months ago. The EMT truck came and checked me out and they thought I was dehydrated from exercising. I had just come from the gym.”
3 QUALITY, e.g. “Describe the dizziness.”
ï “Every few minutes or so I get the feeling the room is spinning and I feel a little nauseous, then it goes away and I feel OK. Then it starts all over again.”
4 AGGRAVATING, e.g. “What makes the dizziness worse?”
 “Standing up with my eyes open makes me feel dizzy.”
5 PALLIATIVE, e.g. “What makes the dizziness better?”
 “Closing my eyes and laying down makes the dizziness better.”
6 HEAD INJURIES, e.g. “Have you bumped or injured your head?”
ï “No head injuries.”
7 PAST MEDICAL HISTORY, e.g. “How is your health in general?”
 “In general I’ve been very healthy.”
8 MEDICATIONS, e.g. “Are you taking any medications for this problem or anything else?”
 “I’m not taking anything. I thought of taking Dramamine but I wasn’t sure it would help.”
9 DIET, e.g. “What do you eat in a typical day?”
 “A regular diet, toast and coffee in the morning, usually take out for
lunch, Chinese, a pizza or sub, something like that, and a regular meal at night.”
10 TOBACCO USE, e.g. “Do you smoke?”
ï “I used to smoke ½ a pack a day, but now I’m down to 4 or 5, sometimes a couple more if I’m stressed.”
11 ADLs, e.g. “How is this affecting your life?”
 “I couldn’t go to work today.”

Case A – Dizziness, Acute

PE SCORING:
• COLUMN 1: NO CREDIT: If any box is checked, the exam was done “incorrectly” or “incompletely”; the checked “Incorrect Details” box records the reason(s) why.
• COLUMN 2: FULL CREDIT: If the “Correct” box is checked, the exam was done “Correctly / Completely.”
• COLUMN 3: NO CREDIT: If the “Not Done” box is checked, the exam was not attempted at all.
(A brief illustrative sketch of this credit rule follows.)
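The following is a minimal sketch of the three-column credit rule above, for illustration only; the function name and boolean input format are assumptions rather than part of the form.

```python
# Illustrative sketch of the PE checklist credit rule described above.
# The function name and input format are assumptions, not part of the form.

def pe_item_credit(any_incorrect_box_checked, correct_checked, not_done_checked):
    """Return 1 for full credit or 0 for no credit on a single PE checklist item."""
    if not_done_checked:                        # column 3: exam not attempted at all
        return 0
    if any_incorrect_box_checked:               # column 1: done incorrectly or incompletely
        return 0
    return 1 if correct_checked else 0          # column 2: done correctly / completely

print(pe_item_credit(any_incorrect_box_checked=False,
                     correct_checked=True,
                     not_done_checked=False))   # -> 1 (full credit)
```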

Physical Examination Checklist    (1) Incorrect – Details    (2) Correct    (3) Not Done
12 Perform fundoscopic examination
 Did not ask the patient to fix their gaze at a point in front of them.
 Exam room not darkened.
 Otoscope used instead of ophthalmoscope
 “Left eye-left hand-left eye” or “right eye-right hand -right eye rule” not followed.
 Exam not bilateral.







13 Assess Cranial Nerve II – Optic – Assess Visual Fields by Confrontation
 Examiner not at approximate eye-level with patient, and / or no eye contact.
 Examiner’s hands not placed outside of patient’s field of vision.
 Did not ask “Tell me when you see my fingers.”
 Did not test both upper and lower fields, and / or bilaterally.





14 Assess Cranial Nerves II and III – Optic and Oculomotor: Assess direct and consensual reactions
 Did not shine a light obliquely into each pupil twice to check both the direct reaction and consensual reaction.
 Did not assess bilaterally.




15 Assess Cranial Nerves II and III – Optic and Oculomotor: Assess near reaction and near response
 Did not test in normal room light.
 Finger, pencil, etc. placed too close or too far from the patient’s eye.
 Did not ask the patient to look alternately at the finger or pencil and into the distance.




16 Assess Cranial Nerve III – Oculomotor: Assess convergence
 Did not ask the patient to follow his / her finger or pencil as he / she moves it in toward the bridge of the nose.




17 Assess Cranial Nerve III, IV and VI – Oculomotor, trochlear and abducens: Assessing extraocular muscle movement
 Examiner did not assess extra-ocular muscle movements in at least 6 positions of gaze using, for example, the “H” pattern.
 Did not instruct patient to not move the head during the exam.




18 Assess Cranial Nerve VIII – Acoustic / Weber test
 Did not produce a sound from tuning fork, e.g. by not holding the fork at the base
 Did not place the base of the tuning fork firmly on top middle of the patient’s head.
 Did not ask the patient where the sound appears to be coming from.




19 Assess Cranial Nerve VIII – Acoustic / Rinne test
 Did not produce a sound from tuning fork, e.g. by not holding the fork at the base
 Did not place the base of the tuning fork against the mastoid bone behind the ear.
 Did not ask patient to say when he / she no longer hears the sound, hold the end of the fork near the patient’s ear and ask if he / she can hear the vibration.
 Did not tap again for the second ear.
 Did not assess bilaterally.





20 Assess Gait
 Did not ask patient to walk, turn and come back to look for imbalance, postural asymmetry and type of gait (e.g. shuffling, walking on toes, etc.)



21 Perform Romberg Test
 Did not direct patient to stand with feet together, eyes closed, for at least 20 seconds without support.
 Did not stand in a supportive position, e.g. behind patient or with hand behind patient.



Case A – Dizziness, Acute

RELATIONSHIP QUALITY

To what degree did the student …    (Lower Quality = 1-4, Higher Quality = 5-8)
1 Establish and maintain rapport 1 2 3 4 5 6 7 8
2 Demonstrate empathy 1 2 3 4 5 6 7 8
3 Instill confidence 1 2 3 4 5 6 7 8
4 Use appropriate body language 1 2 3 4 5 6 7 8
EXAMINATION QUALITY

To what degree did the student …    (Lower Quality = 1-4, Higher Quality = 5-8)
5 Elicit information clearly, effectively 1 2 3 4 5 6 7 8
6 Actively listen 1 2 3 4 5 6 7 8
7 Provide timely feedback / information / counseling 1 2 3 4 5 6 7 8
8 Perform a thorough, careful physical exam or treatment 1 2 3 4 5 6 7 8

  3. Clinical Clerkship Evaluations / NBOME Subject Exams

Data compiled from 3rd/4th year clerkships includes:

 Student Performance Evaluations from specific hospitals (attending/supervising physicians, and/or residents) based upon the 7 core Osteopathic Competencies. Data is broken down further by student cohort: traditional, BS/DO, and Émigré and is quantified according to curricular track (Lecture Discussion-Based and Doctor Patient Continuum);
 NBOME Subject Exam scores for each of the (6) core clerkships and OMM. Core clerkships include:
a) Family Medicine
b) Medicine
c) OB-GYN
d) Pediatrics
e) Psychiatry
f) Surgery

NBOME Subject Exam statistics are shared with 3rd year students as a frame of reference to determine their performance relative to their NYCOM peers. These data also serve as a general guide for COMLEX II CE preparation and performance;
 Students provide feedback on their clinical experiences during their clerkships, via the “PDA project”:
a) The PDA is a tool utilized for monitoring clerkship activities. The DEALS (Daily Educational Activities Logs Submission) focuses on educational activities, while the LOG portion focuses on all major student-patient encounters. A rich data set is available for comparing patient encounters and educational activities across all sites for all clerkships.

b) PDA data is used as a multimodal quality assessment tool for curricular exposure as well as OMM integration across all hospitals (including “outside” clerkships) for Patient Encounters and Educational Activities.

 Reports from student focus groups—these reports are based upon in-person group interviews by a full-time NYCOM Medical Educator and feedback is analyzed in order to ensure consistency in clerkship education and experiences, as well as for program improvement indicators.

Specific forms/questionnaires utilized to capture the above-detailed information include the following:

• Clinical Clerkship Student Performance Evaluation

Samples of the forms/questionnaires follow

NEW YORK COLLEGE OF OSTEOPATHIC MEDICINE
OFFICE OF CLINICAL EDUCATION
Northern Boulevard – Old Westbury, NY 11568-8000
Tel.: 516-686-3718 – Fax: 516-686-3833
(*) Only ONE form, with COMPOSITE GRADE & COMMENTS, should be sent to the Hospital’s Office of Medical Education for the DME SIGNATURE.

COURSE # (For NYCOM Purpose ONLY):

STUDENT (Last, First):                     Class Year:
HOSPITAL:
ROTATION (Specialty):                      ROTATION DATES: From   /   /     To   /   /
EVALUATOR (Attending Physician / Faculty Preceptor):                      TITLE:

A. Student logs by PDA:   [ ] REVIEWED (at least 10 patients)   [ ] NOT REVIEWED
B. Student’s unique “STRENGTHS” (Very Important – To be incorporated into the College’s Dean’s Letter)

C. Student’s LIMITATIONS (areas requiring special attention for future professional growth)

D. For items below CIRCLE the most appropriate number corresponding to the following rating scale:

Exceptional = 5   Very Good = 4   Average = 3   Marginal = 2   Failure = 1   N/A = no opportunity to observe

CORE COMPETENCY (See definitions on reverse side)   RATING

Patient Care 5 4 3 2 1 N/A
Medical Knowledge 5 4 3 2 1 N/A
Practice-Based Learning & Improvement 5 4 3 2 1 N/A
Professionalism 5 4 3 2 1 N/A
System-Based Practice 5 4 3 2 1 N/A
Interpersonal and Communication Skills 5 4 3 2 1 N/A
Osteopathic Manipulative Medicine 5 4 3 2 1 N/A

OVERALL GRADE 5 4 3 2 1 (FAILURE)

Evaluator Signature: Date:
/ /
Student Signature: Date:
/ /
(Ideally at Exit Conference)
(*) DME Signature: Date:
/ /

Please Return to: Hospital’s Office of Medical Education

The Seven Osteopathic Medical Competencies

Physician Competency is a measurable demonstration of suitable or sufficient knowledge, skill sets, experience, values, and behaviors that meet established professional standards, are supported by the best available medical evidence, and are in the best interest of the well-being and health of the patient.

Patient Care: Osteopathic patient care is the ability to effectively determine and monitor the nature of a patient’s concern or problem; to develop, maintain, and to bring to closure the therapeutic physician-patient relationship; to appropriately incorporate osteopathic principles, practices and manipulative treatment; and to implement effective diagnostic and treatment plans, including appropriate patient education and follow-up, that are based on best medical evidence.

Medical Knowledge: Medical Knowledge is the understanding and application of biomedical, clinical, epidemiological, biomechanical, and social and behavioral sciences in the context of patient-centered care.

Practice-Based Learning & Improvement: Practice-based learning and improvement is the continuous evaluation of clinical practice utilizing evidence-based medicine approaches to develop best practices that will result in optimal patient care outcomes.

Professionalism: Medical professionalism is a duty to consistently demonstrate behaviors that uphold the highest moral and ethical standards of the osteopathic profession. This includes a commitment to continuous learning and the exhibition of personal and social accountability. Medical professionalism extends to those normative behaviors ordinarily expected in the conduct of medical education, training, research, and practice.

System-Based Practice: System-based practice is an awareness of and responsiveness to the larger context and system of health care, and the ability to effectively identify and integrate system resources to provide care that is of optimal value to individuals and society at large.

Interpersonal & Communication Skills: Interpersonal and communication skills are written, verbal, and non-verbal behaviors that facilitate understanding the patient’s perspective. These skills include building the physician-patient relationship, opening the discussion, gathering information, empathy, listening, sharing information, reaching agreement on problems and plans, and providing closure. These skills extend to communication with patients, families, and members of the health care team.

Osteopathic Manipulative Medicine: Osteopathic philosophy is a holistic approach that encompasses the psychosocial, biomedical, and biomechanical aspects of both health and disease, and stresses the relationship between structure and function, with particular regard to the musculoskeletal system.
Definitions Provided by the National Board of Osteopathic Medical Examiners (NBOME)

  4. Student feedback (assessment) of courses / Clinical clerkship / PDA project

• Data received on courses and faculty through the newly implemented, innovative Course / Faculty Assessment program (see below: NYCOM Student Guide for Curriculum and Faculty Assessment). Students are randomly assigned, in teams, to evaluate one course (and its associated faculty) during the two-year pre-clinical curriculum. The outcome of the student-team assessment is presented to the Curriculum Committee in the form of a one-page Comprehensive Report;
• Clerkship Feedback (quantitative and “open-ended” feedback) provided through “Matchstix” (a web-based feedback program): this information is shared with NYCOM Deans and Clinical Chairs, Hospital Directors of Medical Education (DMEs), Hospital Department Chairs and Clerkship Supervisors. The information is also posted on the web to assist 2nd year students in choosing 3rd year Core Clerkship Sites (transparency). This data is also utilized via two (2) year comparisons of quantitative data and student feedback shared with NYCOM Deans & Chairs, as well as Hospital DMEs;
• Clerkship Feedback via PDA: quantitative and open-ended (qualitative) feedback on all clerkships is collected via student PDA submission. The information is utilized as a catalyst for clerkship quality enhancement. This data-set is used as a multimodal quality assessment tool for curricular exposure as well as OMM integration across all hospitals (including “outside” clerkships) for Patient Encounters and Educational Activities;

 Reports from student focus groups—these reports are based upon in-person group interviews by a full-time NYCOM Medical Educator and feedback is analyzed in order to ensure consistency in clerkship education and experiences, as well as for program improvement indicators;
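Below is a minimal sketch of how the two-year comparison of quantitative clerkship feedback described above might be tabulated. The CSV file name, column names, and 1–5 Likert coding are illustrative assumptions, not the actual Matchstix export format.

    # Illustrative sketch: two-year comparison of mean clerkship feedback scores.
    # Assumes a CSV export with columns "site", "year", and "item_score" (1-5 Likert);
    # the file name and columns are hypothetical, not the actual Matchstix layout.
    import csv
    from collections import defaultdict

    def mean_scores_by_site_year(path):
        totals = defaultdict(lambda: [0.0, 0])  # (site, year) -> [sum, count]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                key = (row["site"], int(row["year"]))
                totals[key][0] += float(row["item_score"])
                totals[key][1] += 1
        return {k: s / n for k, (s, n) in totals.items() if n}

    def two_year_change(means, site, year):
        """Change in a site's mean rating versus the prior year, if both years exist."""
        this_year, last_year = means.get((site, year)), means.get((site, year - 1))
        if this_year is None or last_year is None:
            return None
        return round(this_year - last_year, 2)

    if __name__ == "__main__":
        means = mean_scores_by_site_year("clerkship_feedback.csv")  # hypothetical export
        for site, year in sorted(means):
            print(site, year, round(means[(site, year)], 2), two_year_change(means, site, year))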

Specific forms/questionnaires utilized to capture the above-detailed information include the following:

 NYCOM Student Guide for Curriculum and Faculty Assessment
 Clerkship (site) feedback from Clerkship students
 Clinical Clerkship Focus Group Form
 4th Year PDA Feedback Questionnaire
 Student End-of-Semester Program Evaluations (DPC)
 DPC Program Assessment Plan
 Osteopathic Manipulative Medicine (OMM) Assessment Forms

Samples of the forms/questionnaires follow

Rotation: Surgery
Site: (*) MAIMONIDES MEDICAL CENTER
This is an anonymous feedback form. No student identification data is transmitted.

Questions marked with * are mandatory.

Section I. Please respond to each statement in this section according to the following scale.

STRONGLY DISAGREE <-> STRONGLY AGREE

1* There were adequate learning opportunities (teaching patients, diversity of pathology and diagnostic procedures)
Strongly Disagree Disagree Neutral Agree Strongly Agree

2* There were opportunities to practice osteopathic diagnosis and therapy
Strongly Disagree Disagree Neutral Agree Strongly Agree

3* There was adequate supervision and feedback (e.g., reviews of my H&P, progress notes and clinical skills)
Strongly Disagree Disagree Neutral Agree Strongly Agree

4* I had the opportunity to perform procedures relevant for my level of training
Strongly Disagree Disagree Neutral Agree Strongly Agree

5* I was evaluated fairly for my level of knowledge and skills
Strongly Disagree Disagree Neutral Agree Strongly Agree

6* Attending physicians and/or house staff were committed to teaching
Strongly Disagree Disagree Neutral Agree Strongly Agree

7* Overall, I felt meaningfully engaged and well integrated with the clinical teams (e.g., given sufficient patient care responsibilities)
Strongly Disagree Disagree Neutral Agree Strongly Agree

8* The DME and/or clerkship director was responsive to my needs as a student
Strongly Disagree Disagree Neutral Agree Strongly Agree

9* There were adequate library resources at this facility
Strongly Disagree Disagree Neutral Agree Strongly Agree

10* A structured program of directed readings and/or journal club was a component of this rotation.
Strongly Disagree Disagree Neutral Agree Strongly Agree

11* The lectures were appropriate for this rotation (e.g., quality, quantity and relevance of topics)
Strongly Disagree Disagree Neutral Agree Strongly Agree

12* Educationally useful teaching rounds were conducted on a regular basis.
Strongly Disagree Disagree Neutral Agree Strongly Agree

13* This rotation reflected a proper balance of service and education
Strongly Disagree Disagree Neutral Agree Strongly Agree

14* This rotation incorporated a psychosocial component in patient care
Strongly Disagree Disagree Neutral Agree Strongly Agree

15* Overall, I would recommend this rotation to others
Strongly Disagree Disagree Neutral Agree Strongly Agree

Section II. Psychomotor skills

Indicate the number you performed on an average week during this rotation for each of the following:

16* History and Physicals

17* Osteopathic structural examinations

18* Osteopathic Manipulative Treatments

19* Starting IVs

20* Venipunctures

21* Administering injections

22* Recording notes on medical records

23* Reviewing X-Rays

24* Reviewing EKGs

25* Urinary catheterizations

26* Insertion and removal of sutures

27* Minor surgical procedures (assist)

28* Major surgical procedures (assist)

29* Care of dressings and drains

30* Sterile field maintenance

Section III
31* Comment on unique STRENGTHS and Positive Features of this rotation

32* Comment on the LIMITATIONS and Negative Features of this rotation

33* Comment on the extent to which the Learning Objectives for the rotation were met (e.g., specific topics/patient populations to which you were or were not exposed)

Section IV. Please list your clinical instructors with whom you had substantial contact on this rotation and provide a general rating of their effectiveness as Teachers using the scale below.

5=EXCELLENT, 4=VERY GOOD, 3=AVERAGE, 2=BELOW AVERAGE,
1=POOR
For example – John Smith – 4

34* List clinical instructors and rating in the box below

To submit your feedback, enter your password below and then click on Submit Feedback button

Focus Groups on Clinical Clerkships

NAME OF HOSPITAL:

LOCATION:

DATE OF SITE VISIT:

The student’s comments on the clinical rotations are as follows:

(Name of Clerkship) STRENGTHS:

WEAKNESSES:

4th Year PDA Feedback Questionnaire

  1. Clinic Site
  2. Rotation
  3. Date
  4. There were adequate learning opportunities
  5. There were opportunities to practice Osteopathic diagnosis & therapy
  6. I was evaluated fairly for my level of knowledge and skills
  7. Attending physicians and/or house staff were committed to teaching
  8. Overall, I felt meaningfully engaged and well integrated with the clinical teams
  9. The DME and/or clerkship director was responsive to my needs as a student
  10. This rotation reflected a proper balance of service and education
  11. Overall, I would recommend this clerkship to others
  12. Comments
  13. Strengths/Positive Features of Rotation
  14. Limitations/Negative Features of Rotation
  15. List and Rate Clinical Instructors

Student End-of-Semester Program Evaluations
The DPC Student End-of-Semester Program Evaluation is an assessment of each course that occurred during the semester and the corresponding faculty members.

DPC END OF SEMESTER EVALUATION

Directions:

  1. Please write in your year of graduation here: .
  2. Enclosed you will find a blank scantron sheet.
  3. Please make sure that you are using a #2 pencil to fill in your answers.
  4. Please fill in the following Test Form information on the Scantron Sheet:
    ï DPC Class 2011 – Bubble in Test Form A
    ï DPC Class 2012 – Bubble in Test Form B
  5. No other identifying information is necessary.
  6. Please complete each of the following numbered sentences throughout this evaluation using the following responses:

A. Excellent – couldn’t be better
B. Good – only slight improvement possible
C. Satisfactory – about average
D. Fair – some improvement needed
E. Poor – considerable improvement needed

  7. There are spaces after each section in which you can write comments.

(When making comments, please know that your responses will be shared with DPC faculty, Dept. chairs, and deans, as part of ongoing program evaluation.)
BIOPSYCHOSOCIAL SCIENCES COURSE EVALUATION:

I. CASE STUDIES COMPONENT

  1. This course, overall is
  2. My effort in this course, overall is
  3. The case studies used in small group are
  4. My preparation for each group session was
  5. Other available resources for use in small group are
  6. Facilitator assessments are
  7. Self assessments are
  8. Content Exams – midterm and final are
  9. The group process in my group can be described as
  10. The wrap-ups in my group were
  11. The quality of the learning issues developed by my group was

Overall comments on Case Studies

II. STUDENT HOUR COMPONENT:

Excellent   Good   Satisfactory   Fair   Poor

  1. The monthly student hours are A B C D E

Overall Comments On The Student Hour

III. FACILITATOR RATINGS

Please circle your group number / the name of your group facilitator(s).

Group Facilitators
A Dr. and Dr.
B Dr. and Dr.
C Dr. and Dr.
D Dr. and Dr.

Please bubble in your response to each of the following items:

Strongly Agree (A = 5)   Agree (B = 4)   Disagree (C = 2)   Strongly Disagree (D = 1)
  1. Maintained appropriate directiveness 5 (A) 4 (B) 2 (C) 1 (D)
  2. Supported appropriate group process 5 (A) 4 (B) 2 (C) 1 (D)
  3. Supported student-directed learning 5 (A) 4 (B) 2 (C) 1 (D)
  4. Gave appropriate feedback to group 5 (A) 4 (B) 2 (C) 1 (D)
  5. Ensured that learning issues were appropriate 5 (A) 4 (B) 2 (C) 1 (D)
  6. Overall, these facilitators were effective 5 (A) 4 (B) 2 (C) 1 (D)

Overall Facilitator Comments
(Comments on individual facilitators are welcome)

IV. PROBLEM SETS/DISCUSSION SESSIONS COMPONENT

A. Course Evaluation:

  1. These sessions, overall were
  2. My effort in these sessions, overall was
  3. The organization of these sessions was
  4. Handouts in general were

Problem Sets/Discussion Sessions Comments
(Please comment as to whether problem sets were too many, too few, too involved.)

V. PROBLEM SETS/DISCUSSION SESSIONS COMPONENT

B. Presenter Evaluation:

  1. The Problem Set topic on ______ was
  2. The instructor, ______, for the problem set named in #23 was
  3. The Problem Set topic on ______ was
  4. The instructor, ______, for the problem set named in #25 was
  5. The Problem Set topic on ______ was
  6. The instructor, ______, for the problem set named in #27 was
  7. The Problem Set topic on ______ was
  8. The instructor, ______, for the problem set named in #29 was
  9. The Problem Set topic on ______ was
  10. The instructor, ______, for the problem set named in #31 was

Problem Sets/Discussion Sessions Comments
(Comments on individual instructors are welcome)

VI. ANATOMY COMPONENT

A. Course Evaluation:

Excellent   Good   Satisfactory   Fair   Poor

  1. This component, overall was A B C D E
  2. My effort in this component was A B C D E
  3. My preparation for each lab session was A B C D E
  4. Organization of the component was A B C D E
  5. Quizzes were A B C D E
  6. Resource Hour / Reviews were A B C D E

Anatomy Component Comments

VII. ANATOMY COMPONENT

B. Teaching Evaluation:

Please bubble in your response to each of the following items:

  1. The faculty were available to answer questions in the lab
  2. The faculty initiated student discussion
  3. The faculty were prepared for each lab session
  4. The faculty gave me feedback on how I was doing
  5. The faculty were enthusiastic about the course
  6. Overall, the instructors were effective

Anatomy Component Comments
(Comments on individual instructors are welcome)

CLINICAL SCIENCES COURSE

I. CLINICAL SKILLS LAB COMPONENT

A. Course Evaluation:

Excellent   Good   Satisfactory   Fair   Poor

  1. This component, overall was A B C D E
  2. My effort in this component was A B C D E
  3. My preparation for each lab session was A B C D E
  4. Organization of the component was A B C D E
  5. Examinations were A B C D E
  6. Handouts/PowerPoints were A B C D E
  7. I would rate my physical exam and history taking skills at this time to be A B C D E

Overall Comments on Clinical Skills Component / Individual Labs
(Comments on individual instructors are welcome)

I. CLINICAL SKILLS LAB COMPONENT

B. Teaching Evaluation:

Please bubble in your response to each of the following items:

  1. The faculty were available to answer questions in the lab
  2. The faculty initiated student discussion
  3. The faculty were prepared for each lab session
  4. The faculty gave me feedback on how I was doing
  5. The faculty were enthusiastic about the course
  6. Overall, the instructors were effective

Overall Comments on Clinical Skills Component / Individual Labs
(Comments on individual instructors are welcome)

II. OMM COMPONENT

A. Course Evaluation:

Excellent   Good   Satisfactory   Fair   Poor

  1. This component, overall was A B C D E
  2. My effort in this component was A B C D E
  3. My preparation for each lab session was A B C D E
  4. Organization of the component was A B C D E
  5. Presentations / Lectures were A B C D E
  6. Handouts were A B C D E
  7. Quizzes were A B C D E
  8. Practical exams were A B C D E
  9. Resource Hour / Reviews were A B C D E

Overall Comments on OMM Component / Individual Labs
(Comments on individual instructors are welcome)

II. OMM COMPONENT

B. Teaching Evaluation

Please bubble in your response to each of the following items:

  1. The faculty were available to answer questions in the lab
  2. The faculty initiated student discussion
  3. The faculty were prepared for each lab session
  4. The faculty gave me feedback on how I was doing
  5. The faculty were enthusiastic about the course
  6. Overall, the instructors were effective

Overall Comments on OMM Component / Individual Labs
(Comments on individual instructors are welcome)

III. ICC COMPONENT

A. Course Evaluation:

  1. This component, overall was
  2. My effort in this component was
  3. My preparation for each lab session was
  4. Organization of this component was
  5. The helpfulness/usefulness of the ICC standardized patient encounters was
  6. The helpfulness/usefulness of the ICC robotic patient encounters was
  7. Are Clinical Skills laboratory exercises appropriate for the ICC?
    [A] YES [B] NO

Overall Comments on the ICC Component
(Comments on individual instructors are welcome)

IV. CLINICAL PRACTICUM COMPONENT

  1. I participated in Clinical Practicum this semester: [A] YES [B] NO
    If you answered NO to this question, you have finished this evaluation; if you answered YES, please continue this questionnaire until the end. Thank you.

A. Course Evaluation

  1. This component, overall was
  2. My effort in this component was
  3. My preparation for each lab session was
  4. Organization of this component was
  5. The helpfulness/usefulness of the Clinical Practicum was
  6. The organization of the case presentations was
  7. Are Clinical Skills laboratory exercises appropriate for the Clinical Practicum?

Please bubble in your response to each of the following items:

Strongly Agree (A = 5)   Agree (B = 4)   Disagree (C = 2)   Strongly Disagree (D = 1)

  1. The case presentation exercise was a valuable learning experience 5 (A) 4 (B) 2 (C) 1 (D)

Overall Comments on Clinical Practicum Course

IV. CLINICAL PRACTICUM COMPONENT

B. Mentor Evaluation:

Please bubble in your response to each of the following items:

  1. The preceptor was available to answer my questions
  2. I was supported in my interaction with patients
  3. Student-directed learning was supported
  4. I had appropriate feedback
  5. Overall, this preceptor/site was effective

Preceptor Name

Overall Comments on Clinical Practicum Mentor
(Comments on individual instructors are welcome)

DPC: Program Assessment Plan

I. Pre-matriculation Evaluation – What determines whether an applicant will pick the DPC program?
ï Comparison of the students who chose the LDB program vs. the DPC program with regard to the following outcome measures:
− GPA scores (overall, science)
− MCAT scores
− Gender
− Age
− Race
− College size
− College Geographic location
− Prior PBL exposure
− OMM understanding
− Research Background
− Volunteer Work
− Employment Experience
− Graduate Degree
− Scholarships/Awards

II. Years at NYCOM – How do we evaluate if the DPC program is accomplishing its goals while the students are at NYCOM?
ï Comparison of Facilitator Assessments for each term, to monitor student growth
ï Comparison of Clinical Practicum Mentor Evaluations from Term 2 and Term 3, to evaluate the student’s clinical experience progress
ï Comparison of Content exam scores from terms 1 through 4.
ï Comparison of entrance questionnaire (administered during first week of medical school) responses to corresponding exit questionnaire administered at the end of year 4
ï Evaluation of the Student DPC End-of-Term Evaluations
ï Comparison of the following measures to those outcomes achieved by the students in the LDB program:
− OMM scores

DPC: Program Assessment Plan

− Anatomy scores
− ICC PARS scores
− ICC OSCE scores
− Summer research
− Summer Volunteerism
− Research effort (publications, abstracts, posters, presentations)
− Shelf-exams
− COMLEX I, II, III scores and pass rate
− Fellowships (Academic, Research)

III. Post-Graduate Training / Practice – What happens to the DPC student once they leave NYCOM? How do they compare to those students who matriculated through the LDB program?
ï Comparison of the following measures to those outcomes achieved by the students in the LDB program (an illustrative track-comparison sketch follows this list):
− Internships
− Residencies
− Fellowships
− Specialty (medicine)
− Specialty board certifications
− AOA membership
− AMA membership
− Publications
− Research
− Teaching
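A minimal sketch of the kind of track comparison listed above (DPC vs. LDB graduates on one shared numeric outcome measure, such as a COMLEX score) appears below; the sample scores and the choice of Welch's t-test are illustrative assumptions, not a prescribed analysis.

    # Illustrative sketch: compare DPC vs. LDB cohorts on one numeric outcome measure.
    # The score lists and the use of Welch's t-test are assumptions for illustration only.
    from math import sqrt
    from statistics import mean, stdev

    def welch_t(a, b):
        """Welch's t statistic for two independent samples (unequal variances allowed)."""
        va, vb = stdev(a) ** 2, stdev(b) ** 2
        return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

    if __name__ == "__main__":
        dpc_scores = [512, 498, 530, 541, 505]        # hypothetical COMLEX scores, DPC track
        ldb_scores = [520, 489, 515, 534, 500, 528]   # hypothetical COMLEX scores, LDB track
        print("DPC mean:", round(mean(dpc_scores), 1))
        print("LDB mean:", round(mean(ldb_scores), 1))
        print("Welch t:", round(welch_t(dpc_scores, ldb_scores), 2))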

OMM Assessment Forms

  5. COMLEX USA Level I, Level II CE & PE, and Level III data (NBOME)

a) First-time and overall pass rates and mean scores;

b) Comparison to national averages;

c) Comparison to college (NYCOM) national ranking.

Report provided by Associate Dean for Academic Affairs
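A minimal sketch of how first-time and overall pass rates and mean first-attempt scores could be computed from a roster of exam attempts; the record layout and the 400 passing cutoff used here are illustrative assumptions, not NBOME specifications.

    # Illustrative sketch: first-time pass rate, overall pass rate, and mean first-attempt score.
    # Each attempt is (student_id, attempt_number, score); the 400 cutoff is an assumption.
    PASSING = 400

    def comlex_summary(attempts):
        first = {}   # student_id -> first-attempt score
        best = {}    # student_id -> best score across all attempts
        for sid, attempt_number, score in attempts:
            if attempt_number == 1:
                first[sid] = score
            best[sid] = max(best.get(sid, score), score)
        first_pass = sum(s >= PASSING for s in first.values()) / len(first)
        overall_pass = sum(s >= PASSING for s in best.values()) / len(best)
        mean_first = sum(first.values()) / len(first)
        return first_pass, overall_pass, mean_first

    if __name__ == "__main__":
        sample = [("A", 1, 525), ("B", 1, 390), ("B", 2, 440), ("C", 1, 610)]  # hypothetical
        fp, op, m = comlex_summary(sample)
        print(f"first-time pass {fp:.0%}, overall pass {op:.0%}, mean first-attempt score {m:.0f}")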

  6. Residency match rates and overall placement rate

Data compiled as received from the American Osteopathic Association (AOA) and the National Residency Match Program (NRMP).

Report provided by Associate Dean for Clinical Education

  7. Feedback from (AACOM) Graduation Questionnaire

Annual survey report received from AACOM comparing NYCOM graduates’ responses to numerous questions/categories (including demographics, specialty choice, overall perception of pre-doctoral training, indebtedness, and more) to nationwide osteopathic medical school graduating class responses.

Specific forms/questionnaires utilized to capture the above-detailed information include the following:

 AACOM Survey of Graduating Seniors

Samples of the forms/questionnaires follow

AMERICAN ASSOCIATION OF COLLEGES OF OSTEOPATHIC MEDICINE
2007-08 Academic Year Survey of Graduating Seniors

TO THE STUDENTS: Your opinions and attitudes about your medical education, your plans for medical practice, and information about your debt are very important as the colleges and the osteopathic profession develop and plan for the future of osteopathic medical education. Please take some time to complete the following questionnaire to help in planning the future of osteopathic medical education. The information you provide in this survey will be reported only in aggregate or summary form; individually identifiable information will not be made available to the colleges. The reason we ask for your identification is to allow for longitudinal studies linking your responses when you took a similar survey as a first-year medical student to your responses as a graduating medical student.

Please print in Capital Letters:

Please fill in marks like this: ●

Last
Name Suffix

First Name

Osteopathic College

Middle Name

or Maiden Name if Married Woman Using Husband’s Name

0 ATSU-SOMA 0 LECOM-Bradenton 0 OU-COM 0 UMDNJ-SOM
0 ATSU/KCOM 0 LECOM-PA 0 PCOM 0 UNECOM
0 AZCOM 0 LMU-DCOM 0 PCSOM 0 UNTHSC/TCOM
0 CCOM 0 MSUCOM 0 TOUROCOM 0 VCOM
0 DMU-COM 0 NSU-COM 0 TUCOM-CA 0 WesternU/COMP
0 GA-PCOM 0 NYCOM 0 TUNCOM 0 WVSOM
0 KCUMB-COM 0 OSU-COM

Part I: OPINIONS

  1. Instruction. Please evaluate the amount of instruction provided in each of the areas listed below. Please note, this item concludes on the next page. Use the scale below.
    (1) Appropriate (2) Inadequate (3) Excessive

a. Basic medical science
b. Behavioral science
c. Biostatistics
d. Bioterrorism
e. Care of ambulatory patients
f. Care of elderly (geriatrics)
g. Care of hospitalized patients
h. Care of patients with HIV/AIDS
i. Clinical decision-making
j. Clinical pharmacology
k. Clinical science
l. Cost-effective medical practice
m. Diagnostic skills
n. Drug and alcohol abuse
o. Family/domestic violence
p. Genetics
q. Health promotion & disease prevention
r. Human sexuality
s. Independent learning & self-evaluation
t. Infection control/health care setting
u. Infectious disease prevention
v. Integrative medicine

(1) Appropriate (2) Inadequate (3) Excessive

Patient education
rr. Therapeutic management

  2. Please rate your overall satisfaction with the quality of your medical education.

0 a. Very satisfied O b. Satisfied O c. Neither satisfied nor dissatisfied O d. Dissatisfied O e. Very dissatisfied

  3. Using the following scale, please indicate how confident you are in your ability to perform the following examinations:
    Use the scale below.
    (1) Completely Confident (2) Mostly Confident (3) Fairly Confident
    (4) Somewhat Confident (5) Not At All Confident (6) No Opportunity to Perform

a. General adult examination
b. General pediatric examination
c. Well-baby examination
d. Breast and pelvic examination
e. Prostate and testicular examination
f. Osteopathic structural examination
g. Sports participation examination

  4. Please indicate whether you agree or disagree with the following statements about your first two years of medical education. Use the scale below.
    (1) Strongly Agree (2) Agree (3) Disagree (4) Strongly Disagree (5) No Opinion

a. Basic & clinical science course objectives were made clear to students
b. Basic science courses were sufficiently integrated with each other
c. Basic science courses were sufficiently integrated with clinical training
d. Course objectives & examination content matched closely
e. Course work adequately prepared students for clerkships
f. The first two years of medical school were well organized
g. Students were provided with timely feedback on performance
h. There was adequate exposure to patient care during the first two years
i. Osteopathic principles were adequately integrated into course work
j. An appropriate amount of training was provided in OMT
k. There was adequate preparation for COMLEX Level I

  5. a. Please indicate whether you agree or disagree with the following statements about your Required Clerkships during the last two years of medical education. Please use the scale below.
    (1) Strongly Agree (2) Agree (3) Disagree (4) Strongly Disagree (5) No Opinion

  1. Clear goals and objectives were set
  2. I was able to design own goals and objectives
  3. Clear performance objectives were set
  4. Clerkships were well-organized
  5. Rounds were conducted as scheduled
  6. Timely feedback was provided on performance
  7. There was too large a role by residents in teaching and evaluation
  8. There was appropriate diversity of patients and their health issues
  9. There was an appropriate number of inpatient experiences
  10. Each clerkship had an osteopathic orientation
  11. I was asked relevant and pertinent questions on patient diagnosis, treatment options, management, and follow-up care
  12. I felt free to ask questions
  13. The attending seemed interested in my opinions
  14. Personal concerns were addressed by the attending while on rotation
  15. I was treated with respect
  16. I was able to discuss progress on rotation with attending
  17. The attending critically evaluated me during rotation
  18. I was able to discuss the final rotation evaluation with the attending
  19. The attending based the evaluation on direct observation
  20. I was able to meet & discuss areas of concern with the attending outside of the clinical setting
  21. I lived a reasonable distance from rotation sites
  22. The rotations prepared me for examinations
  23. Testing was provided at end of each clerkship
  24. There was adequate preparation for COMLEX Level 2-CE
  25. There was adequate preparation for COMLEX Level 2-PE

b. Please indicate whether you agree or disagree with the following statements about your Selective/Elective Clerkships during the last two years of medical education. Please note, this item concludes on the next page. Please use the scale below.
(1) Strongly Agree (2) Agree (3) Disagree (4) Strongly Disagree (5) No Opinion

  1. Clear goals and objectives were set
  2. I was able to design own goals and objectives
  3. Clear performance objectives were set
  4. Clerkships were well-organized
  5. Rounds were conducted as scheduled
  6. Timely feedback was provided on performance
  7. There was too large a role by residents in teaching and evaluation
  8. There was appropriate diversity of patients and their health issues
  9. There was an appropriate number of inpatient experiences
  10. Each clerkship had an osteopathic orientation
  11. Osteopathic principles & practice (OPP) were well-integrated in each clerkship
  12. There was appropriate technology usage for situation
  13. I was able to work on a personal basis with patients
  14. The attending modeled excellent patient relationship skills
  15. Support staff was friendly and supportive
  16. Coverage hours were set and finished on time
  17. I was asked relevant and pertinent questions on patient diagnosis, treatment options, management, and follow-up care
  18. I felt free to ask questions
  19. The attending seemed interested in my opinions
  20. Personal concerns were addressed by the attending while on rotation
  21. I was treated with respect
  22. I was able to discuss progress on rotation with attending
  23. The attending critically evaluated me during rotation
  24. I was able to discuss the final rotation evaluation with the attending
  25. The attending based the evaluation on direct observation
  26. I was able to meet & discuss areas of concern with the attending outside of the clinical setting
  27. I lived a reasonable distance from rotation sites
  28. The rotations prepared me for examinations
  29. Testing was provided at end of each clerkship
  30. There was adequate preparation for COMLEX Level 2-CE
  31. There was adequate preparation for COMLEX Level 2-PE
  6. a. How was your osteopathic medical school involved in your third- and fourth-year education? Check all that apply.
    0 1. COMLEX Level 2-CE preparation 0 2. COMLEX Level 2-PE preparation
    0 3. Distance learning 0 4. E-mail
    0 5. Faculty visit 0 6. Newsletter
    b. In your view how appropriate was your osteopathic medical school involvement in your clerkship years?

0 1. Excessive involvement 0 2. Outstanding involvement
0 3. Adequate involvement 0 4. Some, but inadequate involvement
0 5. Not involved

0 Other, please specify

  7. At this time, how satisfied are you that you selected osteopathic medicine as a career?

0 a. Very satisfied
0 b. Satisfied
0 c. Mixed feelings
0 d. Dissatisfied
0 e. Very dissatisfied

  8. If given the opportunity to begin your medical education again, would you prefer to enroll in:

0 a. The osteopathic medical school from which you are about to graduate

0 b. Another osteopathic medical school
0 c. An allopathic medical school
0 d. Would not have gone to medical school at all

  9. Please indicate your agreement with the following statements regarding your training in Osteopathic Manipulative Treatment, Principles and Practice. Please use the scale below.
    (1) Strongly agree (2) Agree (3) Neither agree nor disagree (4) Disagree (5) Strongly disagree

a. Well prepared to diagnose structural problems
b. Well prepared to treat structural problems
c. Well prepared to document findings in a structural examination
d. Had opportunity to practice OPP during first two years in medical school
e. Had opportunity to practice OPP during in-hospital rotations
f. Had opportunity to practice OPP during ambulatory primary care rotations
g. Had opportunity to practice OPP during ambulatory non-primary care rotations
h. Had osteopathic physician role models during the first two years in medical school
i. Had osteopathic physician role models during required in-hospital rotations
j. Had osteopathic physician role models during ambulatory primary care rotations
k. Had osteopathic physician role models during ambulatory non-primary care rotations
l. Had osteopathic physician role models during selectives/electives

  10. What percentage of your training was delivered by allopathic physicians? None 1-25% 26-50% 51-75% 76-100%
    a. During the first two years of medical school
    b. During required in-hospital rotations
    c. During required ambulatory primary care rotations
    d. During required ambulatory non-primary care rotations
    e. During selectives/electives
  11. Please use as much of this page as you wish to submit suggestions for improvement or positive comments on your medical education. Your comments will be fed back to the schools absolutely ANONYMOUSLY in the spirit of helping to improve osteopathic medical education. Please print or write legibly.

Part II: CAREER PLANS

P1. Plans Upon Graduation: Please indicate what type of osteopathic internship you plan to do. (Choose only one.)

0 a. Traditional rotating
0 b. Special emphasis. Indicate type: 1. Anesthesiology 0   2. Diagnostic Radiology 0   3. Emergency Med. 0   4. Family Practice 0   5. General Surgery 0   6. Psychiatry 0
0 c. Specialty track. Indicate type: 1. Internal Medicine 0   2. Internal Medicine/Peds. 0   3. Ob/Gyn 0   4. Otolaryn./Facial Plastic Surg. 0   5. Pediatrics 0   6. Urological Surgery 0
0 d. Pursue AOA/ACGME dual approved internship
0 e. Not planning osteopathic internship. Reason: 1. Allopathic residency 0   2. Other 0   Please specify
0 f. Undecided

P2. a. Immediate Post-Internship Residency Plans: Select the one item that best describes your plans immediately after internship (or upon graduation if not planning an osteopathic internship).

0 1. Pursue osteopathic residency
0 2. Pursue allopathic residency (see Item P2b)
0 3. Pursue AOA/ACGME dual approved residency (see Item P2b)
0 4. Enter governmental service (e.g., military, NHS Corps, Indian Health Service, V.A., state/local health dept.) (see Item P2b)

If you are not doing a residency, please indicate your post-internship plans.

0 5. Practice in an HMO
0 6. Self-employed with or without a partner
0 7. Employed in group or other type of private practice (salary, commission, percentage)
0 8. Other professional activity (e.g., teaching, research, administration, fellowship)
0 9. Undecided or indefinite post-graduation/internship plans

b. If you plan to pursue an allopathic or AOA/ACGME dual approved residency, please give all the reasons that apply to you.

0 1. Desire specialty training not available in osteopathic program
0 2. Believe better training and educational opportunities available
0 3. Located in more suitable geographic location(s)
0 4. Located in larger institutions
0 5. Better chance of being accepted in program
0 6. Allow ABMS Board certification
0 7. Opens more career opportunities
0 8. Military or government service obligation
0 9. Shorter training period
0 10. Higher pay
0 11. Other, please specify

P3. Long-Range Plans: Select the one item that best describes your intended activity five years after internship and residency training.

0 1. Enter governmental service (e.g., military, NHS Corps, Indian Health Service, V.A., state/local health dept.)
0 2. Practice in an HMO
0 3. Self-employed with or without a partner
0 4. Employed in group or other type of private practice (salary, commission, percentage)
0 5. Other professional activity (e.g., teaching, research, administration, fellowship)
0 6. Undecided or indefinite

P4. a. Area of Interest: Select one specialty in which you are most likely to work or seek training.

0 1. Family Practice 0 17. Ob/Gyn including subspecialties
0 2. General Internal Medicine 0 18. Ophthalmology
0 3. Internal Medicine Subspecialty 0 19. Otolaryngology
0 4. Osteopathic Manip. Ther. & Neuromusculoskeletal Med. 0 20. Pathology including subspecialties
0 5. General Pediatrics 0 21. Physical Medicine & Rehabilitation Med.
0 6. Pediatrics Subspecialty 0 22. Preventive Medicine including subspec.
0 7. Allergy and Immunology 0 23. Proctology
0 8. Anesthesiology 0 24. Radiology (Diagnostic) including subspec.
0 9. Critical Care 0 25. Sports Medicine
0 10. Dermatology 0 26. General Surgery
0 11. Emergency Medicine 0 27. Orthopedic Surgery
0 12. Geriatrics 0 28. Surgery, subspecialty
0 13. Medical Genetics 0 29. Vascular Surgery
0 14. Neurology including subspecialties 0 30. Urology/Urological Surgery
0 15. Psychiatry including subspecialties 0 31. Undecided or Indefinite
0 16. Nuclear Medicine

b. Please select one item that best describes your plans for board certification.

0 1. AOA Boards (osteopathic)
0 2. ABMS Boards (allopathic) (see Item P4c)
0 3. Both boards (see Item P4c)

0 4. Other, please specify

0 5. Not planning board certification
0 6. Undecided or indefinite

c. If you selected ABMS or both boards in item P4b, please indicate all the reasons for your choice.

0 1. ABMS board certification is more widely recognized
0 2. ABMS board certification has more colleague acceptance
0 3. ABMS board certification carries more prestige
0 4. ABMS board certification provides more opportunities (career, residencies, etc.)
0 5. Personal desire for dual certification

0 6. Hospital privileges more readily obtained with ABMS board certification.
0 7. Licenses more readily obtained with ABMS board certification
0 8. Other, please specify

P5. Please indicate the importance of each of the following factors affecting your specialty choice decision. Use the scale below.
(1) Major Influence (2) Strong Influence (3) Moderate Influence (4) Minor Influence (5) No Influence/NA

g. Peer influence (encouragement from practicing physicians, faculty, or other students)
h. Skills/abilities (possess the skills required for the specialty or its patient population)
i. Debt level (level of debt, length of residency, high malpractice insurance premiums)
j. Academic environment (courses, clerkships in the specialty area)
k. Opportunity for research/creativity
l. Desire for independence
m. Previous experience

Part IV: DEMOGRAPHIC DATA

This information is for classification purposes only and is considered confidential. Information will only be used by AACOM and affiliated organizations in totals or averages.

D1. Date of Birth  /  /        D2. Sex: Male 0
Female 0

D3. Marital Status: Married/cohabiting 0
Single/other 0

D4. SSN

AACOM asks for your Social Security Number so that we can track data longitudinally-a similar survey is administered on matriculation, and this number allows us to analyze changes in responses. AACOM provides reports to the COMs only in aggregate and does not include any individual identifiers.

D5. Dependents: Including yourself, how many dependents do you support financially?   1   2   3   4   5 or more

D6. Ethnic background: Indicate your ethnic identification from the categories below. Please mark all that apply.


D7. Citizenship Status: U.S. 0
Permanent Resident 0
Other 0 Please specify

D8. State of Legal Residence: Use 2 letter postal abbreviation.

D9. Population of city/town/area of legal residence:

a. Major metropolitan area (1,000,001 or more) 0
b. Metropolitan area (500,001 – 1,000,000) 0
c. City (100,001 – 500,000) 0
d. City (50,001 – 100,000) 0

e. City or town (10,001 – 50,000) 0
f. City or town (2,501 – 10,000) 0
g. Town under 2,500 0
h. Other 0

Please specify

D10. a. Father’s Education: Select the highest level of education your father attained. Complete this item even if he is deceased.

1. Professional Degree (DO/MD, JD, DDS, etc.) 0
(See Item D10b below)

2. Doctorate (Ph.D., Ed.D., etc.) 0
3. Master’s 0
4. Bachelor’s 0
5. Associate Degree/Technical Certificate 0
6. High School Graduate 0
7. Less than High School 0

b. If your father’s professional degree is in the Health Professions field, please select one of the following: DO/MD 0 Other 0

D11. a. Mother’s Education: Select the highest level of education your mother attained. Complete this item even if she is deceased.

1. Professional Degree (DO/MD, JD, DDS, etc.) 0
(See Item D11b below)

2. Doctorate (Ph.D., Ed.D., etc.) 0
3. Master’s 0
4. Bachelor’s 0
5. Associate Degree/Technical Certificate 0
6. High School Graduate 0
7. Less than High School 0

b. If your mother’s professional degree is in the Health Professions field, please select one of the following: DO/MD 0 Other 0

D12. Parents’ Income: Give your best estimate of your parents’ combined income before taxes for the prior year.

a. Less than $20,000 0 d. $50,000 – $74,999 0 g. $200,000 or more 0
b. $20,000 – $34,999 0 e. $75,000 – $99,999 0 h. Deceased/Unknown 0
c. $35,000 – $49,999 0 f. $100,000- $199,999 0

D13. Financial Independence: Do you consider yourself financially independent from your parents? Yes 0

No 0


  8. Completion rates (post-doctoral programs)

Percent of NYCOM graduates completing internship/residency training programs.

Report provided by Office of Program Evaluation and Assessment
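A minimal sketch of the completion-rate calculation described above; the graduate records, field layout, and status labels are illustrative assumptions.

    # Illustrative sketch: percent of a graduating cohort completing post-doctoral training.
    # Records are (graduate_id, cohort_year, completed_training: bool); all values hypothetical.
    def completion_rate(records, cohort_year):
        cohort = [r for r in records if r[1] == cohort_year]
        if not cohort:
            return None
        return 100.0 * sum(r[2] for r in cohort) / len(cohort)

    if __name__ == "__main__":
        records = [("G1", 2005, True), ("G2", 2005, True), ("G3", 2005, False), ("G4", 2006, True)]
        print(f"2005 cohort completion rate: {completion_rate(records, 2005):.1f}%")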

  9. Specialty certification and licensure

Data compiled from state licensure boards and other specialty certification organizations (board certification) on NYCOM graduates.

Report provided by Office of Program Evaluation and Assessment

  10. Career choices and geographic practice location

Data includes practice type (academic, research, clinical, and so on) and practice location. Data obtained from licensure boards, as well as NYCOM Alumni survey.

Report provided by Office of Program Evaluation and Assessment

  11. Alumni Survey

Follow-up survey periodically sent to alumni requesting information on topics such as practice location, specialty, residency training, board certification, and so on.

Specific forms/questionnaires utilized to capture the above-detailed information include the following:

 Alumni Survey

Samples of the forms/questionnaires follow

ALUMNI SURVEY

NAME

LAST

FIRST

NYCOM CLASS YEAR
HOME ADDRESS
PRACTICE ADDRESS

HOME PHONE (
)
OFFICE PHONE (
)
E-MAIL ADDRESS

INTERNSHIP HOSPITAL

RESIDENCY HOSPITAL

FIELD OF STUDY

FELLOWSHIPS COMPLETED:

CERTIFICATIONS YOU HOLD:
IF SPOUSE IS ALSO A NYCOM ALUMNUS, PLEASE INDICATE SPOUSE’S NAME AND CLASS YEAR:
EXCLUDING INTERNSHIP, RESIDENCY AND FELLOWSHIP, HAVE YOU EARNED ANY ADDITIONAL ACADEMIC DEGREES OR CERTIFICATES BEYOND YOUR MEDICAL DEGREE (I.E., MPH, MBA, MHA, PHD, MS)? (PLEASE LIST)

CURRENT PRACTICE STATUS:
FULL-TIME PRACTICE
PART-TIME PRACTICE
INTERN/RESIDENCY
RETIRED/NOT PRACTICING

What specialty do you practice most frequently? (Choose one)

 Allergy and Immunology
 Anesthesiology
 Cardiology
 Colorectal Surgery
 Dermatology
 Emergency Medicine
 Endocrinology
 Family Practice
 Gastroenterology
 Geriatrics
 Hematology
 Infectious Diseases

 Internal Medicine
 Neonatology
 Nephrology
 Neurology
 Nuclear Medicine
 Obstetrics & Gynecology
 Occupational Medicine
 Ophthalmology
 Oncology
 Otolaryngology
 Orthopedic Surgery
 Psychiatry

 Pediatrics
 Plastic/Recon. Surgery
 Physical Medicine/Rehab
 Pathology
 Pulmonary Medicine
 Radiology
 Rheumatology
 Surgery (general)
 Thoracic Surgery
 Radiation Therapy
 Urology
 Other (Please specify)

Current military status (if applicable):  Active Duty  Inactive reserve  Active Reserve

What is the population of the
geographic area of your practice?
(Choose one)

How would you describe this geographic area? (Choose one)

 5,000,000 +
 1,000,000 – 4,999,999
 500,000 – 999,999
 250,000 – 499,999
 Inner City
 Urban

 100,000 – 249,999
 50,000 – 99,999
 25,000 – 49,999

 Suburban
 Small Town – Rural

 10,000 – 24,999
 5,000 – 9,999
 Less than 5,000

 Small town – industrial Other

What functions do you perform in your practice? (check all that apply)

What best describes the setting in which you spend the most time ?

 Preventive care/patient education
 Acute care
 Routine/non-acute care
 Consulting
 Intensive Care Unit of Hospital
 Inpatient Unit of Hospital (not ICU/CCU)
 Outpatient Unit of Hospital
 Hospital Emergency Room
 Hospital Operating Room
 Freestanding Urgent Care Center
 Freestanding Surgical Facility
 Nursing Home or LTC Facility
 Solo practice physician office
 Single Specialty Group practice physician office
 Multiple Specialty Group practice physician office

 Supervisory/managerial responsibilities
 Research
 Teaching
 Hospital Rounds
 University Student Health facility
 School-based Health center
 HMO facility
 Rural Health Clinic
 Inner-city Health Center
 Other Community Health Center
 Other Freestanding Outpatient facility
 Correctional facility
 Industrial facility
 Mobile Health Unit
 Other (Please specify)

Do you access medical information via the internet?
 Never
 Sometimes
 Often

What percent of your time is spent in primary care? (family medicine or gen. internal medicine)
 0%
 1 – 25%
 25 – 50%
 50 – 75%
 75 – 100%

What percent of your practice is outpatient?
 0%
 1 – 25%
 25 – 50%
 50 – 75%
 75 – 100%

Do you engage in any of the following activities? (check all that apply)
 Professional organization leadership position
 Volunteer services in the community
 School or team physician
 Free medical care
 Leadership in church, congregation
 Local government
 Speaking on medical topics to community groups

How many CME programs or other professional training sessions did you attend last year?
 none
 1-5
 5-10
 10-15
 more than 15

Have you ever done any of the following?
 Author or co-author a professional paper
 Contribute to an article
 Direct a research project
 Participate in clinical research
 Present a lecture at a professional meeting or CME program
 Serve on a panel discussion at a professional meeting

How often do you read medical literature regarding new research findings?
 Rarely
 Several times a year
 Monthly
 Weekly
 Daily

How frequently do you apply osteopathic concepts into patient care?
 Never
 Rarely
 Often
 Always
In your practice do you employ any of the following? (check all that apply)
 Structural examination or musculoskeletal considerations in diagnosis
 Indirect OMT techniques
 High Velocity OMT
 Myofascial OMT
 Cranial OMT
 Palpatory diagnosis

Please indicate how important each of the following skills has been in your success as a physician, and how well NYCOM prepared you in that skill. For each skill, rate both “How important to my practice” and “How well NYCOM prepared me” as Strong, Moderate, or Weak.

 Biomedical science knowledge base
 Clinical skills
 Patient educator skills
 Empathy and compassion for patients
 Understanding of cultural differences
 Osteopathic philosophy
 Clinical decision making
 Foundation of ethical standards
 Ability to communicate with other health care providers
 Ability to communicate with patients and families
 Knowing how to access community resources
 Ability to understand and apply new medical information
 Understanding of the payor/reimbursement system
 Ability to search and retrieve needed information
 Manipulative treatment skill
 Ability to use medical technology
 Diagnostic skill
 Skill in preventive care
 Understanding of public health issues & the public health system
 Professionalism

Please return to:

NYCOM of NYIT, Office of Alumni Affairs
Northern Boulevard, Serota Bldg., Room 218
Old Westbury, New York 11568 or
fax to (516) 686-3891 or (516) 686-3822
as soon as possible.

Thank you for your cooperation!

NYCOM Benchmarks

1- Applicant Pool
Benchmark: To maintain relative standing among Osteopathic Medical Colleges based on the number of applicants.

2- Admissions Profile
Benchmark: Maintain or improve current admissions profile based on academic criteria such as MCAT, GPA, or Colleges attended.

3- Academic Attrition Rates
Benchmark: To maintain or improve our current 3% Academic Attrition rate

4- Remediation rates (pre-clinical years)
Benchmark: A 2% per year reduction in the number of students remediating in the pre-clinical years.

5- COMLEX USA Scores
Benchmark: Top quartile in the National Ranking of 1st time pass rate and Mean Score.

6- Students entering Osteopathic Graduate Medical Education (OGME) Benchmark: Maintain or improve the current OGME placement.

7- Graduates entering Primary Care (PC) 12
Benchmark: Maintain or improve the current Primary Care placement.

8- Career Data – Licensure (within 3 years, post-graduate), Board Certification, Geographic Practice, and Scholarly achievements.
Benchmark: TBD

12 Family Medicine, Internal Medicine, and Pediatrics
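A minimal sketch of how attainment of the numeric benchmarks above (attrition at or below 3%, a 2% per-year reduction in remediation, top-quartile COMLEX ranking) could be checked each year; the input figures are illustrative assumptions, and the remediation benchmark is read here as a two-percentage-point drop versus the prior year.

    # Illustrative sketch: check this year's figures against the numeric benchmarks above.
    # All input numbers are hypothetical examples, not NYCOM data.
    def check_benchmarks(attrition_rate, remediation_now, remediation_prior, comlex_rank, n_schools):
        return {
            "attrition <= 3%": attrition_rate <= 0.03,
            "remediation reduced >= 2 points vs. prior year": (remediation_prior - remediation_now) >= 0.02,
            "COMLEX first-time pass rate in top quartile": comlex_rank <= n_schools / 4,
        }

    if __name__ == "__main__":
        # hypothetical inputs: 2.5% attrition, remediation 6% vs. 9% last year, ranked 5th of 25 schools
        for name, met in check_benchmarks(0.025, 0.06, 0.09, 5, 25).items():
            print(f"{name}: {'met' if met else 'not met'}")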

BIBLIOGRAPHY

Gonnella, J.S., Hojat, M., & Veloski, J.J. Jefferson Longitudinal Study of Medical Education.
Retrieved December 17, 2008, from http://jdc.jefferson.edu/jlsme/1

Hernon, P. & Dugan, R.E. (2004). Outcomes Assessment in Higher Education. Libraries Unlimited: Westport, CT

APPENDICES

NEUROLOGICAL EXAMINATION

1 Assess Cranial Nerve I

  • Olfactory
Examiner checks the patient’s sense of smell using,
    e.g. coffee, soap, peppermint, orange peels, etc.
    2 Assess Cranial Nerve II
  • Optic: Assessing Visual Fields by Confrontation

 Examiner stands at approximate eye-level with patient, making eye contact.

 Patient is then asked to return examiner’s gaze
e.g. by saying “Look at me.”

 Examiner starts by placing his / her hands outside the patient’s field of vision, lateral to head.

 With fingers wiggling (so patient can easily see them) the examiner brings his / her fingers into the patient’s field of vision.


 Examiner must ask the patient “Tell me when you see my fingers.”

 Assess upper, middle and lower fields, bilaterally.

NEUROLOGICAL EXAMINATION

3
Assess Cranial Nerve II – Optic: Assessing Visual Acuity

 For ICC purposes, handheld Rosenbaum Pocket Screener (eye chart)

 NOTE: Use a Snellen eye chart if the patient stands 20’ from the chart

 Ask patient to cover one eye while testing the other eye

 Rosenbaum eye chart is held in good light approximately 14” from eye

 Determine the smallest line of print from which patient can read more than half the letters

 The patient’s visual acuity score is recorded as two numbers, e.g. “20/30” where the top number is the distance the patient is from the chart and the bottom number is the distance the normal eye can read that line.

 Repeat with the other eye

NEUROLOGICAL EXAMINATION

4 Assessing Cranial Nerves II and III

  • Optic and Oculomotor: Assessing direct and Consensual Reactions

 Examiner asks the patient to look into the distance, then shines a light obliquely into each pupil twice to check both the direct reaction (pupillary constriction in the same eye) and consensual reaction (pupillary constriction in the opposite eye).

 Must be assessed bilaterally.

5
Assessing Cranial Nerves II and III – Optic and Oculomotor: Assessing Near Reaction and Near Response

 Assessed in normal room light, testing one eye at a time.

 Examiner holds a finger, pencil, etc. about 10 cm. from the patient’s eye.

 Asks the patient to look alternately at the finger or pencil and then into the distance.

 Note pupillary constriction with near focus.

Close focus

Distant focus

NEUROLOGICAL EXAMINATION

6 Assessing Cranial Nerve III

  • Oculomotor: Assessing Convergence

 Examiner asks the patient to follow his / her finger or pencil as he / she moves it in toward the bridge of the nose to within about 5 to 8 centimeters.

 Converging eyes normally follow the object to within 5 – 8 cm. of the nose.

7 Assessing Cranial Nerve III, IV and VI

  • Oculomotor, Trochlear And Abducens: Assessing Extra Ocular Muscle Movement

 Examiner assesses muscle movements in at least 6 positions of gaze by tracing, for example, an “H pattern” with the hand and asking the patient to follow the hand with their eyes without turning the head.

NEUROLOGICAL EXAMINATION

8 Assessing Cranial Nerve V

  • Trigeminal (Sensory)
    Ophthalmic

Maxillary

Examiner assesses sensation in 3 sites:
 Ophthalmic
 Maxillary
 Mandibular

 Examiner may use fingers, cotton, etc. for the assessment.
 Assess bilaterally.
Mandibular

9 Assessing Cranial Nerve V

  • Trigeminal (Motor)

 Examiner asks the patient to move his or her jaw from side to side

OR

 Examiner palpates the masseter muscles and asks the patient to clench his / her teeth.

 Note strength of muscle contractions.

NEUROLOGICAL EXAMINATION

10 Assessing Cranial Nerve VII – Facial (Motor)

 Examiner asks the patient to show teeth and puff cheeks.

NEUROLOGICAL EXAMINATION

11
Assess Cranial Nerve VIII – Acoustic

Weber test – for lateralization

 Use a 512 Hz or 1024 Hz tuning fork.

 Examiner starts the fork vibrating e.g. by tapping it on the opposite hand, leg, etc.

 Base of the tuning fork placed firmly on top of the patient’s head.

 Patient asked “Where does the sound appear to be coming from?” (normally it will be sensed in the midline).

NEUROLOGICAL EXAMINATION

12 Assessing Cranial Nerve VIII – Acoustic

Rinne test – to compare air and bone conduction

 Use a 512 Hz or 1024 Hz tuning fork.
 Examiner starts the fork vibrating, e.g. by tapping it on the opposite hand, leg, etc.
 Base of fork placed against the mastoid bone behind the ear.
 Patient asked to say when he / she no longer hears the sound.
 When the sound is no longer heard, examiner moves the tuning fork (without re-striking it), holds it near the patient’s ear, and asks if he / she can hear the vibration.
 Examiner must vibrate the tuning fork again for the second ear.
 Bilateral exam.

NOTE: (AC>BC): Air conduction greater than bone conduction.

NEUROLOGICAL EXAMINATION

13 Assessing Cranial Nerve VIII –

  • Gross Auditory Acuity

 Examiner asks patient to occlude (cover) one ear.

 Examiner then whispers words or numbers into non- occluded ear from approximately 2 feet away.

 Asks patient to repeat whispered words or numbers.

 Compare bilaterally.

OR

 Examiner asks patient to occlude (cover) one ear.

 Examiner rubs thumb and forefinger together next to patient’s non-occluded ear and asks the patient if the sound is heard.

 Compare bilaterally.

NEUROLOGICAL EXAMINATION

14
Assessing Cranial Nerve IX and X – Glossopharyngeal and Vagus: Motor Testing

 First, examiner asks the patient to swallow.

 Next, patient asked to say ‘aah’ and examiner observes for symmetrical movement of the soft palate or a deviation of the uvula.

 OPTIONAL: Use a light source to help visualize palate and uvula.

NOTE: sensory component of cranial nerves IX and X is testing for the “gag reflex”

Swallowing

Saying “Aah”

NEUROLOGICAL EXAMINATION

15 Assessing Cranial Nerve XI

  • Spinal Accessory: Motor Testing

 Examiner asks the patient to shrug his / her shoulders up against the examiner’s hands. Apply resistance.
 Note strength and contraction of trapezius muscles.

 Next, patient asked to turn his or her head against examiner’s hand. Apply resistance.
 Observe the contraction of the opposite sternocleido- mastoid muscle.

 Assess bilaterally.

NEUROLOGICAL EXAMINATION

16 Assessing Cranial Nerve XII – Hypoglossal: Motor Testing

 First, examiner inspects patient’s tongue as it lies on the floor of the mouth.
 Note any asymmetry, atrophy or fasciculations.
 Next, patient asked to protrude the tongue.
 Note any asymmetry, atrophy or deviations from the midline.
 Finally, patient asked to move the tongue from side to side.
 Note any asymmetry of the movement.

NEUROLOGICAL EXAMINATION

17
Assessing Lower Extremities – Motor Testing

With patient in supine position, test bilaterally

 Test flexion of the hip by placing your hand on the patient’s thigh and asking him / her to raise the leg against resistance.

 Test extension of the hip by having the patient push the posterior thigh against your hand.

CONTINUED

NEUROLOGICAL EXAMINATION

18
Assessing Lower Extremities – Motor Testing

With patient in seated position, test bilaterally

 Test adduction of the hip by placing hands firmly between the knees and asking the patient to bring the knees together.

 Test abduction of the hip by placing hands firmly outside the knees and asking the patient to spread his / her legs against resistance.

NEUROLOGICAL EXAMINATION

19
Assessing Upper Extremities – Motor Testing

 Examiner asks patient to pull (flex) and push (extend) the arms against the examiner’s resistance.

 Bilateral exam.

20
Assessing Lower Extremities – Motor Testing

 Examiner asks the patient to pull (flex) and push (extend) the legs against the examiner’s resistance.

 Bilateral exam.

NEUROLOGICAL EXAMINATION

21
Assessing Lower Extremities – Motor Testing

 Examiner asks patient to dorsiflex and plantarflex the ankle against resistance

 Compare bilaterally

NEUROLOGICAL EXAMINATION

22 Assessing the Biceps Reflex

 Examiner partially flexes patient’s arm.

 Strike biceps tendon with reflex hammer (pointed or flat end) with enough force to elicit a reflex, but not so much to cause patient discomfort.

OPTIONAL: Examiner places the thumb or finger firmly on the biceps tendon and strikes it, using the pointed end of the reflex hammer only.

 Reflexes must be assessed bilaterally.

 Examiner must produce a reflex for credit.

23 Assessing the Triceps Reflex

 Examiner flexes the patient’s arm at the elbow, and then taps the triceps tendon with reflex hammer.

 Reflexes must be assessed bilaterally.

 Examiner must produce a reflex for credit.

NEUROLOGICAL EXAMINATION

24
Assessing the Brachioradialis Reflex

 With the patient’s hand resting in a relaxed position, e.g. on a table, his / her lap or supported by examiner’s arm, the examiner strikes the radius about 1 or 2 inches above the wrist with the reflex hammer.

 Reflexes must be assessed bilaterally.

 Examiner must produce a reflex for credit.

NEUROLOGICAL EXAMINATION

25 Assessing the Patellar Tendon Reflex

 First, patient asked to sit with their legs dangling off the exam table.

 Reflexes assessed by striking the patient’s patellar tendon with a reflex hammer on skin.

 Reflexes must be assessed bilaterally.

 Examiner must produce a reflex for credit.

OPTIONS:
 Examiner can place his / her hand on the patient’s quadriceps, but this is optional.
 Patient’s knees can be crossed.

NEUROLOGICAL EXAMINATION

26
Assessing the Achilles Reflex

 Examiner dorsiflexes the patient’s foot at the ankle

 Achilles tendon struck with the reflex hammer on skin, socks completely off (removed at the direction of the examiner).

 Reflexes must be assessed bilaterally.

 Examiner must produce a reflex for credit.

NEUROLOGICAL EXAMINATION

27
Assessing the Plantar, or Babinski, Response

 Examiner strokes the lateral aspect of the sole from the heel to the ball of the foot, curving medially across the ball, with an object such as the end of a reflex hammer.

 On skin, socks completely off (removed at the direction of the examiner).

 Assessment must be done bilaterally

 Note movement of the toes (normally toes would curl downward).

NEUROLOGICAL EXAMINATION

28 Assessing Rapid Alternating Movements

Examiner must do all three assessments for credit:

 Examiner directs the patient to pronate and supinate one hand rapidly on the other.

 Patient directed to touch his / her thumb rapidly to each finger on same hand, bilaterally.

 Patient directed to slap his / her thigh rapidly with the back side of the hand, and then with the palm side of the hand, bilaterally.

NEUROLOGICAL EXAMINATION

29
Assessing Finger-to-Nose Movements

 Examiner directs the patient to touch the examiner’s finger with his or her finger, and then to place that finger on his or her nose.
 Examiner moves his / her finger randomly during multiple movements.

NEUROLOGICAL EXAMINATION

30 Assessing Gait

Examiner asks patient to perform the following:

Walk, turn and come back

 Note imbalance, postural asymmetry, type of gait (e.g. shuffling, walking on toes, etc.), swinging of the arms, and how patient negotiates turns.

Heel-to-toe (tandem walking)
 Note any ataxia not previously obvious.

Shallow knee bend
 Note: difficulties here suggest proximal weakness (extensors of the hip), weakness of the quadriceps (the extensor of the knee), or both.

NEUROLOGICAL EXAMINATION

31 Performing the Romberg Test

 Examiner directs the patient to stand with feet together, eyes closed for
at least 20 seconds without support.

 During this test, examiner must stand behind the patient to provide support in case the patient loses his / her balance.

32 Testing for Pronator Drift

 Examiner directs the patient to stand with eyes closed, simultaneously extending both arms, with palms turned upward, for at least 20 seconds.

 During this test, examiner must stand behind the patient to provide support in case the patient loses his / her balance.


TASKFORCE MEMBERS

John R. McCarthy, Ed.D. Associate Director, Clerkship Education
Pelham Mead, Ed.D. Director, Faculty Development
Mary Ann Achziger, M.S. Associate Dean, Student Affairs
Felicia Bruno, M.A. Assistant Dean, Student Administrative Services/Alumni Affairs/Continuing Education
Claire Bryant, Ph.D. Assistant Dean, Preclinical Education
Leonard Goldstein, DDS, PH.D. Director, Clerkship Education
Abraham Jeger, Ph.D. Associate Dean, Clinical Education
Rodika Zaika, M.S. Director, Admissions
Ron Portanova, Ph.D. Associate Dean, Academic Affairs

