Browsing by Author "Sibley, Kathryn M."
Now showing 3 of 3 results
Item (Open Access): A scoping review of the globally available tools for assessing health research partnership outcomes and impacts (2023-12-22)
Mrklas, Kelly J.; Boyd, Jamie M.; Shergill, Sumair; Merali, Sera; Khan, Masood; Moser, Cheryl; Nowell, Lorelli; Goertzen, Amelia; Swain, Liam; Pfadenhauer, Lisa M.; Sibley, Kathryn M.; Vis-Dunbar, Mathew; Hill, Michael D.; Raffin-Bouchal, Shelley; Tonelli, Marcello; Graham, Ian D.

Abstract

Background: Health research partnership approaches have grown in popularity over the past decade, but the systematic evaluation of their outcomes and impacts has not kept pace. Identifying partnership assessment tools and key partnership characteristics is needed to advance partnerships, partnership measurement, and the assessment of their outcomes and impacts through systematic study.

Objective: To locate and identify globally available tools for assessing the outcomes and impacts of health research partnerships.

Methods: We searched four electronic databases (Ovid MEDLINE, Embase, CINAHL Plus, PsycINFO) with an a priori strategy from inception to June 2021, without limits. We screened studies independently and in duplicate, keeping only those involving a health research partnership and the development, use and/or assessment of tools to evaluate partnership outcomes and impacts. Reviewer disagreements were resolved by consensus. Study, tool and partnership characteristics, together with emerging research questions, gaps and key recommendations, were synthesized using descriptive statistics and thematic analysis.

Results: We screened 36 027 de-duplicated citations, reviewed 2784 papers in full text, and kept 166 studies and three companion reports. Most studies originated in North America and were published in English after 2015. Most of the 205 tools we identified were questionnaires and surveys targeting researchers, patients and public/community members. While tools were comprehensive and usable, most were designed for single use and lacked validity or reliability evidence. Challenges associated with the interchange and definition of terms (i.e., outcomes, impacts, tool type) were common and may obscure partnership measurement and comparison. Very few of the tools identified in this study overlapped with tools identified by other, similar reviews. Partnership tool development, refinement and evaluation, including tool measurement and optimization, are key areas for future tools-related research.

Conclusion: This large scoping review identified numerous single-use tools that require further development and testing to improve their psychometric and scientific qualities. The review also confirmed that the health research partnership domain and its measurement tools are still nascent and actively evolving. Dedicated efforts and resources are required to better understand health research partnerships, partnership optimization, and partnership measurement and evaluation using valid, reliable and practical tools that meet partners’ needs.

Item (Open Access): How are health research partnerships assessed? A systematic review of outcomes, impacts, terminology and the use of theories, models and frameworks (2022-12-14)
Mrklas, Kelly J.; Merali, Sera; Khan, Masood; Shergill, Sumair; Boyd, Jamie M.; Nowell, Lorelli; Pfadenhauer, Lisa M.; Paul, Kevin; Goertzen, Amelia; Swain, Liam; Sibley, Kathryn M.; Vis-Dunbar, Mathew; Hill, Michael D.; Raffin-Bouchal, Shelley; Tonelli, Marcello; Graham, Ian D.

Abstract

Background: Accurate, consistent assessment of outcomes and impacts is challenging in the health research partnerships domain. Increased focus on tool quality, including conceptual, psychometric and pragmatic characteristics, could improve the quantification, measurement and reporting of partnership outcomes and impacts.
This cascading review was undertaken as part of a coordinated, multicentre effort to identify, synthesize and assess a vast body of health research partnership literature.

Objective: To systematically assess the outcomes and impacts of health research partnerships, relevant terminology and the type/use of theories, models and frameworks (TMF) arising from studies using partnership assessment tools with known conceptual, psychometric and pragmatic characteristics.

Methods: Four electronic databases were searched (MEDLINE, Embase, CINAHL Plus and PsycINFO) from inception to 2 June 2021. We retained studies containing partnership evaluation tools with (1) conceptual foundations (reference to TMF), (2) empirical, quantitative psychometric evidence (evidence of validity and reliability, at minimum) and (3) one or more pragmatic characteristics. Outcomes, impacts, terminology, definitions and TMF type/use were abstracted verbatim from eligible studies using a hybrid (independent abstraction–validation) approach and synthesized using summary statistics (quantitative), inductive thematic analysis and deductive categories (qualitative). Methodological quality was assessed using the Quality Assessment Tool for Studies with Diverse Designs (QATSDD).

Results: Application of the inclusion criteria yielded 37 eligible studies. Study quality scores were high (mean 80%, standard deviation 0.11%) but revealed needed improvements (i.e. methodological, reporting, user involvement in research design). Only 14 (38%) studies reported 48 partnership outcomes and 55 impacts; most were positive effects (43, 90% and 47, 89%, respectively). Most outcomes were positive personal, functional, structural and contextual effects; most impacts were personal, functional and contextual in nature. Most terms described outcomes (39, 89%), and 30 of 44 outcome/impact terms were unique, but few were explicitly defined (9, 20%). Terms were complex and mixed on one or more dimensions (e.g. type, temporality, stage, perspective). Most studies made explicit use of study-related TMF (34, 92%). There were 138 unique TMF sources, and these informed tool construct type/choice and hypothesis testing in almost all cases (36, 97%).

Conclusion: This study synthesized partnership outcomes and impacts, deconstructed term complexities and evolved our understanding of TMF use in tool development, testing and refinement studies. Renewed attention to basic concepts is necessary to advance partnership measurement and research innovation in the field.

Systematic review protocol registration: PROSPERO CRD42021137932 (https://www.crd.york.ac.uk/prospero/display_record.php?RecordID=137932)

Item (Open Access): Selecting implementation models, theories, and frameworks in which to integrate intersectional approaches (2022-08-04)
Presseau, Justin; Kasperavicius, Danielle; Rodrigues, Isabel B.; Braimoh, Jessica; Chambers, Andrea; Etherington, Cole; Giangregorio, Lora; Gibbs, Jenna C.; Giguere, Anik; Graham, Ian D.; Hankivsky, Olena; Hoens, Alison M.; Holroyd-Leduc, Jayna; Kelly, Christine; Moore, Julia E.; Ponzano, Matteo; Sharma, Malika; Sibley, Kathryn M.; Straus, Sharon

Abstract

Background: Models, theories, and frameworks (MTFs) provide the foundation for a cumulative science of implementation, reflecting a shared, evolving understanding of various facets of implementation. One under-represented aspect in implementation MTFs is how intersecting social factors and systems of power and oppression can shape implementation. There is value in enhancing how MTFs in implementation research and practice account for these intersecting factors. Given the large number of MTFs, we sought to identify exemplar MTFs that represent key implementation phases within which to embed an intersectional perspective.

Methods: We used a five-step process to prioritize MTFs for enhancement with an intersectional lens.
We mapped 160 MTFs to three previously prioritized phases of the Knowledge-to-Action (KTA) framework. Next, 17 implementation researchers/practitioners, MTF experts, and intersectionality experts agreed on criteria for prioritizing MTFs within each KTA phase. The experts used a modified Delphi process to agree on an exemplar MTF for each of the three prioritized KTA framework phases. Finally, we reached consensus on the final MTFs and contacted the original MTF developers to confirm MTF versions and explore additional insights.

Results: We agreed on three criteria for prioritizing MTFs: acceptability (mean = 3.20, SD = 0.75), applicability (mean = 3.82, SD = 0.72), and usability (median = 4.00, mean = 3.89, SD = 0.31) of the MTF. The top-rated MTFs were the Iowa Model of Evidence-Based Practice to Promote Quality Care for the ‘Identify the problem’ phase (mean = 4.57, SD = 2.31), the Consolidated Framework for Implementation Research for the ‘Assess barriers/facilitators to knowledge use’ phase (mean = 5.79, SD = 1.12), and the Behaviour Change Wheel for the ‘Select, tailor, implement interventions’ phase (mean = 6.36, SD = 1.08).

Conclusions: Our interdisciplinary team engaged in a rigorous process to reach consensus on MTFs reflecting specific phases of the implementation process and prioritized each to serve as an exemplar in which to embed intersectional approaches. The resulting MTFs correspond with specific phases of the KTA framework, which itself may be useful for those seeking particular MTFs for particular KTA phases. This approach also provides a template for how other implementation MTFs could be similarly considered in the future.

Trial registration: Open Science Framework Registration osf.io/qgh64