Understanding MCC’s examination process: There’s more to the score

August 2, 2018

This year, it’s not only the pass score that has changed for the Medical Council of Canada Qualifying Examination (MCCQE) Part I; it’s also the scale on which scores are reported.

As candidates receive their results from the first session of the MCCQE Part I based on the new Blueprint, they will probably focus on the key words “pass” or “fail.” But candidates, as well as program directors, may also want to know how well they performed.

This year, two important changes were introduced: a new pass score and a new scale on which scores are reported. Scores prior to 2018 were reported on a 50−950 scale; beginning in 2018, scores are reported on a 100−400 scale. A score of 300, for example, means something quite different on the new scale than it did on the old one.
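As a purely arithmetic illustration of why the same number cannot be read the same way on both scales (this is not an official conversion between the two), the sketch below simply shows where the number 300 sits within each reported range:

    # Purely illustrative: where does the same number sit within each
    # reported range? This is NOT an official MCC score conversion.

    def position_in_range(score: float, low: float, high: float) -> float:
        """Return the score's relative position within [low, high], from 0 to 1."""
        return (score - low) / (high - low)

    old = position_in_range(300, 50, 950)   # ~0.28 of the way up the old 50-950 range
    new = position_in_range(300, 100, 400)  # ~0.67 of the way up the new 100-400 range

    print(f"300 on the old scale: {old:.0%} of the reported range")
    print(f"300 on the new scale: {new:.0%} of the reported range")

The point is simply that a given number occupies a very different position on the two reporting ranges, which is why direct comparisons are discouraged.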

Understanding how and why the scale for the MCCQE Part I has changed can help candidates and program directors interpret results and make decisions.

From Blueprint to exam to pass score

These changes go back to the Medical Council of Canada (MCC) Blueprint. “A blueprint is a framework to set up our exams,” explains Dr. Claire Touchie, Chief Medical Education Officer at the MCC. Established through a practice analysis in 2013, the new Blueprint marked a major shift in the knowledge, skills and behaviours required from physicians tested on the exams.

“We are now focusing more on communication and professional behaviours, psychosocial aspects of care, and health promotion and illness prevention. Given this evolution, it is best practice to change the scale on which the exam is scored to highlight the different emphasis.”

Dr. Liane Patsula,
Associate Director of Psychometrics and Assessment Services, MCC

“With a new exam framework and scale, the next step was a standard setting exercise to determine the passing score,” says Dr. Touchie. “Because this was a new Blueprint and exam structure, not only did we reset the score, but we also changed our standard.” She stresses that pass rates for MCC exams are based on the expected standard for physicians, not on passing or failing a certain percentage of candidates.

A rigorous standard setting exercise was carried out in June 2018 with a panel of 22 physicians from across Canada, representing a range of faculties of medicine, specialties, and years of experience supervising students and residents. The panel recommended a pass score of 226, which was approved by the Central Examination Committee (CEC; see “Setting the standard”). The CEC, composed of physicians and medical educators from across the country, is responsible for awarding pass/fail results to MCCQE Part I and Part II candidates.

Comparing old and new

“Another reason to change the scale was to discourage candidates and program directors from automatically comparing scores between the old and new scales,” Dr. Touchie says.

But the transition itself can create difficulties. Dr. Patsula offered one scenario: “A program director might have 200 applicants. Some of them have a score on the old scale and some on the new. How do they compare?”

Although scores on the previous exams and the 2018 exams should not be compared directly, Dr. Patsula recommends determining where each score falls in terms of the mean and standard deviation for its exam session (see How your MCCQE Part I score can be used to assess relative performance).
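One common way to express where a score falls relative to its session is a standardized (z) score: subtract the session mean and divide by the session standard deviation. The sketch below uses hypothetical session statistics purely for illustration; these are not actual MCCQE Part I figures, and the approach is only a generic illustration of the mean-and-standard-deviation comparison described above.

    # A minimal sketch of relative-performance comparison using z-scores.
    # The means and standard deviations below are HYPOTHETICAL placeholders,
    # not actual MCCQE Part I session statistics.

    def z_score(score: float, session_mean: float, session_sd: float) -> float:
        """How many standard deviations a score sits above or below its session mean."""
        return (score - session_mean) / session_sd

    # Hypothetical example: one applicant scored on the old 50-950 scale,
    # another on the new 100-400 scale.
    old_scale_z = z_score(score=560, session_mean=500, session_sd=100)  # assumed stats
    new_scale_z = z_score(score=265, session_mean=250, session_sd=25)   # assumed stats

    print(f"Old-scale applicant: {old_scale_z:+.1f} SD from the session mean")
    print(f"New-scale applicant: {new_scale_z:+.1f} SD from the session mean")

In this made-up example, both applicants sit about 0.6 standard deviations above their respective session means, which is the kind of like-for-like comparison the mean-and-standard-deviation approach makes possible even when the reporting scales differ.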

Understanding the score

To help candidates understand their results, the MCC will continue to send out a one-page Statement of Results (see an example), which includes the final result, the total score and the score required to pass. In addition, a Supplemental Information Report (see an example) will be provided, presenting subscores in graphical format for dimensions of care and physician activities. This allows candidates to assess their relative strengths and weaknesses and to compare their subscores with the mean subscores of first-time candidates who passed the exam in April 2018.

“These reports are first and foremost for candidates, who can decide whether to share one or both reports,” explains Dr. Patsula. With candidates and MCC stakeholders fully informed about the changes introduced to the MCCQE Part I, interpretation of results should be more meaningful and useful.