Multisource Feedback As Part of the Medical Board of Australia's Professional Performance Framework: Outcomes From a Preliminary Study

Date
2018
Authors
Narayanan, A
Farmer, EA
Greco, MJ
Item type
Journal Article
Publisher
BioMed Central
Abstract

Background: The recent introduction of the Professional Performance Framework by the Medical Board of Australia is intended to strengthen continuing professional development (CPD) for the roughly 100,000 medical practitioners in Australia. An important option within the Framework is the use of multisource feedback from patients, colleagues and self-evaluation, allowing doctors to reflect on their performance and identify methods for self-improvement. The aim of this study was to explore the relationships between patient feedback, colleague feedback and self-evaluation, with doctors completing the same questionnaires as their patients and colleagues.

Methods: Feedback data for around 2,000 doctors belonging to four different groups were collected through non-probability sampling from nearly 100,000 patients and 24,000 colleagues. Reliability analysis was performed using single-measures intraclass correlation coefficients, Cronbach's alpha and signal-to-noise ratios. Analysis of variance was used to identify significant differences in scores between items and sub-populations of doctors; principal component analysis, with Kaiser-Meyer-Olkin (KMO) sampling adequacy and Bartlett's test of sphericity, was used to reveal components of doctor performance; and correlation analysis was used to identify convergence between sets of scores from different sources.

Results: Patients rated doctors highest on respect shown and lowest on reassurance provided. Colleagues rated doctors highest on trustworthiness and lowest on ability to say 'no'. In self-evaluation, doctors gave themselves lower scores on the patient questionnaire and the colleague questionnaire (10% and 12% lower, respectively) than they received from their patients and colleagues. There were weak but positive correlations between self-scores and received scores, indicating some convergence of agreement, with doctors appearing more comfortable with self-evaluation from the perspective of patients than from that of colleagues.

Conclusions: Supplementing patient and colleague feedback with self-evaluation may help doctors confirm for themselves, through convergence, areas for enhanced CPD. If self-evaluation is used, the colleague questionnaire may be sufficient, since aspects of clinical competence, management, communication and leadership, as well as patient care, can be addressed through colleague items. Mentoring of doctors in CPD should aim to make doctors feel more comfortable about being rated by colleagues, to enhance convergence between self-scores and evaluations from the perspective of colleagues.
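The Methods paragraph names several standard psychometric checks: Cronbach's alpha, single-measures intraclass correlation, KMO sampling adequacy, Bartlett's test of sphericity, principal component analysis, analysis of variance and correlation analysis. The Python sketch below is an illustration only of how these checks are typically run on Likert-style feedback data; it is not the authors' analysis code, and all data, column names, group sizes and effect sizes are hypothetical.

import numpy as np
import pandas as pd
import pingouin as pg
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity, calculate_kmo)
from scipy.stats import f_oneway, pearsonr
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical wide-format data: 200 raters scoring 10 items on a 1-5 scale.
items = [f"item_{i}" for i in range(1, 11)]
ratings = pd.DataFrame(rng.integers(1, 6, size=(200, 10)), columns=items)

# Internal consistency of the item set (Cronbach's alpha with 95% CI).
alpha, ci = pg.cronbach_alpha(data=ratings)
print(f"Cronbach's alpha = {alpha:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")

# Single-measures ICC: hypothetical long format, 30 doctors x 5 raters each.
long = pd.DataFrame({
    "doctor": np.repeat(np.arange(30), 5),
    "rater": np.tile(np.arange(5), 30),
    "score": rng.integers(1, 6, size=150),
})
icc = pg.intraclass_corr(data=long, targets="doctor",
                         raters="rater", ratings="score")
print(icc.loc[icc["Type"] == "ICC1", ["Type", "ICC", "CI95%"]])

# Pre-PCA checks: Bartlett's test of sphericity and KMO sampling adequacy.
chi2, p_bart = calculate_bartlett_sphericity(ratings)
kmo_per_item, kmo_overall = calculate_kmo(ratings)
print(f"Bartlett chi2 = {chi2:.1f} (p = {p_bart:.3g}); overall KMO = {kmo_overall:.2f}")

# PCA on standardised items to look for components of doctor performance.
standardised = (ratings - ratings.mean()) / ratings.std(ddof=0)
pca = PCA().fit(standardised)
print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))

# One-way ANOVA comparing a mean score across four hypothetical doctor groups.
groups = [rng.normal(4.0 + 0.1 * g, 0.5, size=50) for g in range(4)]
F, p_anova = f_oneway(*groups)
print(f"ANOVA: F = {F:.2f}, p = {p_anova:.3g}")

# Pearson correlation between hypothetical self-scores and received scores.
self_scores = rng.normal(3.8, 0.4, size=100)
received_scores = 0.3 * self_scores + rng.normal(3.0, 0.4, size=100)
r, p_r = pearsonr(self_scores, received_scores)
print(f"Self vs received: r = {r:.2f}, p = {p_r:.3g}")

On real multisource feedback data, the wide-format table would hold one column per questionnaire item and one row per completed questionnaire, with the ICC computed per doctor across their raters; the synthetic data here exists only to make the sketch runnable end to end.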

Keywords
Multisource feedback; Self-evaluation; Professional performance; Continuing professional development
Source
BMC Medical Education, 18, 323 (2018). https://doi.org/10.1186/s12909-018-1432-7
Rights statement
© The Author(s) 2018. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.