Co-worker Assessment and Physician Multisource Feedback
Abstract
Background: The College of Physicians and Surgeons of Alberta’s Physician Achievement Review (PAR) program uses questionnaire data from several sources, including co-workers, to provide formative feedback to physicians, promoting quality improvement and continuing professional development within the profession. The co-worker assessment questionnaires (CAQs) used in PAR were developed over more than a decade and have not been psychometrically evaluated since, either on their own or across the nine medical specialties that comprise PAR.
Aim: The purpose of this study was to develop a fuller understanding of the CAQs’ psychometric profile, including the interprofessional constructs being measured and any changes in performance scores over time.
Method: A purposive sample of co-worker data for 1341 physicians across nine medical specialties in Alberta was evaluated. Secured PAR databases containing CAQ data (n = 9674) were accessed and analyzed using univariate and multivariate parametric techniques to: (a) evaluate the psychometric profile of the CAQ within and across specialty groupings; (b) determine whether physician characteristics and co-worker familiarity were associated with PAR performance scores; and (c) evaluate whether a difference existed between Time 1 PAR feedback and Time 2 PAR performance.
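As a rough illustration only (not drawn from the thesis), the following Python sketch shows how internal consistency of the kind reported below might be computed for a respondents-by-items rating matrix; the rating data and scale are simulated assumptions.

```python
# Minimal sketch: Cronbach's alpha for a co-worker rating matrix (simulated data).
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)        # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 30 co-worker assessors rating 20 items on a 5-point scale
rng = np.random.default_rng(0)
ratings = rng.integers(3, 6, size=(30, 20))
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```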
Results: Internal consistency of all CAQs remains extremely high (i.e., > 0.90), suggesting potential unidimensionality. Generalizability coefficients were not as robust as originally reported. Variance components across the collection of CAQs indicated that efforts to improve reliability are better directed at the processes associated with data collection (e.g., assessor selection practices) than at revising questionnaire items. Principal components analyses were conducted as a variable-reduction procedure. Each CAQ contained a number of redundant questionnaire items (range: 5 to 11), and while the component structure remained similar to the factor structure published in the original research, component labelling was updated to reflect the CanMEDS competencies supporting interprofessional practice (i.e., communicator and collaborator). For family physicians and pediatricians, a new component emerged, reflecting the consolidation of the communicator and professional roles into one previously unidentified component labelled “good doctor”. Independent t-tests, ANOVA, and linear regression analyses provided evidence that certain physician characteristics, together with co-worker familiarity, were significant predictors of CAQ performance scores within specialty groupings. Socio-demographic characteristics influenced CAQ scores differently across medical specialties; however, across all specialties co-worker familiarity demonstrated a significant positive linear relationship with CAQ scores. Finally, significant increases and decreases in CAQ scores from Time 1 to Time 2 were found depending upon specialty grouping.
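Purely as an illustrative sketch, and not the study’s actual analysis, the fragment below shows principal components analysis used for variable reduction followed by a linear regression of a mean CAQ score on co-worker familiarity; all column names, the familiarity scale, and the simulated data are assumptions introduced here.

```python
# Sketch: PCA for variable reduction, then OLS of mean CAQ score on familiarity (simulated data).
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
import statsmodels.api as sm

rng = np.random.default_rng(1)
items = pd.DataFrame(rng.integers(1, 6, size=(200, 20)),
                     columns=[f"item_{i + 1}" for i in range(20)])

# Principal components analysis on standardized item scores
pca = PCA()
pca.fit(StandardScaler().fit_transform(items))
print("Variance explained by first 3 components:",
      pca.explained_variance_ratio_[:3].round(2))

# Linear regression: does co-worker familiarity predict the overall CAQ score?
familiarity = rng.integers(1, 5, size=200)              # hypothetical 4-point familiarity scale
caq_mean = items.mean(axis=1) + 0.1 * familiarity       # simulated positive association
model = sm.OLS(caq_mean, sm.add_constant(familiarity)).fit()
print(model.params)
```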
Conclusion: The CAQ provides physicians with reliable data on interprofessional collaboration. Opportunity exists to improve the reliability of these tools by addressing the unique variance components generated by the unbalanced nature of multisource feedback collection processes. Given the manner in which the tools were constructed, future revision efforts ought to include focus groups with specialty-specific co-workers to illuminate how different clinical contexts shape what interprofessionality uniquely means within each specialty grouping.
Keywords
Education, Health Sciences, Education--Higher
Citation
Trueman, G. (2013). Co-worker Assessment and Physician Multisource Feedback (Doctoral thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca. doi:10.11575/PRISM/25478