Are ‘hybrid’ interventions inherently self-sabotaging?
=======================================================

* Penelope Hawe

* Complexity
* Healthcare quality improvement
* Implementation science

In this issue of *BMJ Quality & Safety*, Hampton and colleagues report a process evaluation of an intervention trial intended to encourage older patients’ involvement in their hospital care.1 The logic of the intervention, Your Care Needs You (YCNY), was that more patient involvement in aspects of care in hospital will carry over to home after discharge, preventing avoidable repeat admissions. YCNY was described as a ‘hybrid’ intervention. Ward-level staff were obliged to deliver ‘fixed’ components: a booklet, an advice sheet and a video. But they were also invited to design and deliver ‘flexible’ components, that is, any other components that the ward team thought would also encourage patients to take part in the selected aspects of their care (some examples were offered by the investigators).

One of the eight wards went all in, embracing the challenge of designing flexible components. But the others chose differently, keeping to the fixed components only. Overwhelmingly, these became ‘taskified’, that is, delivered in a perfunctory way. The authors concluded that hybrid intervention research seems to be at a ‘crossroads’ and that it is hard for already pressured staff to do the creative work required for the flexible components. That observation fits their data. But engagement with the fixed components was also lacklustre. Maybe the message received by ward staff was simply: these components you *have to do*, but these other ones you *don’t*. Or maybe: some components are important and these others are less so. If so, then their results are not surprising. Perhaps hybrid interventions are destined to be problematic. Although the findings of the YCNY trial are yet to be fully reported, what has been reported so far resonates with many of us trying to make interventions more effective. It is timely, therefore, to explore the logic of hybrid interventions.

## Mixed and contradictory notions about core, flexible, fixed and adaptable components have developed over time

Frustratingly, terms are used interchangeably but also differently. In essence, two types of intervention structure have been pursued. To some researchers, ‘core’ components are fixed or unchanging across sites, but they may be accompanied by ‘flexible’ components that are allowed to vary from place to place. Fidelity (integrity) is defined by adherence to the delivery of the core components. This idea was first endorsed in 2000 in the Medical Research Council (MRC) guidance on complex interventions, which referred to ‘constant’ (core or fixed) and ‘variable’ (or adaptable) components.2

To other researchers, the *form* of a component is not fixed. It can vary from site to site (adapt) while maintaining fidelity to the *function* the component plays in the intervention theory or hypothesised change process.3–5 All components are ‘core’ (ie, essential) and all are permitted to adapt if necessary, as part of a codesign process with the sites. If components adhere to the same function in different settings/sites, then the integrity of the intervention is preserved.3–5 The advantage of this is that context-level adaptation is considered at the efficacy trial stage, creating at the outset (in theory at least) interventions that are demonstrably transportable from place to place.

The new MRC guidance on complex interventions embraces this idea of functional fidelity.6 It then invites researchers, if they are still inclined to keep some components fixed in form, to decide whether variation is permitted or prohibited for particular components. Damschroder and colleagues suggest that the distinction between core and adaptable components may only be discerned over time, by trial and error in multiple contexts.7 Unfortunately, however, some of the language now in use makes it even clearer that continuing to give components different status demands better a priori justification. Butler and colleagues use the term ‘discretionary’ when referring to additional or flexible components, a term which implies a less vital role in the change process.8 The Consolidated Framework for Implementation Research uses the term ‘adaptable periphery components’, which may inadvertently carry a similar marginal connotation.7 Essentially, researchers using this way of thinking have to be comfortable telling ward staff that they may deliver things which researchers (currently at least) deem unimportant to the theory or mechanism of change. In contrast, others argue that all the components are part of the cause or mechanism.3 4 If they are not, then they perhaps have no business being there.9 At best, then, hybrid interventions send a deflating message about the value of practitioner-led thinking. At worst, they may be self-sabotaging.

## Adaptable components harness practitioner agency and creativity

Hybrid interventions uniquely combine mechanisms of action from opposite ends of Greenhalgh and colleagues’ innovation theory spectrum.10 The fixed components are underpinned by a managerial mechanism of action (ie, specific, orderly, planned, make-it-happen). This is tested alongside a more social and emergent mechanism of action that underpins the practitioner-led components (ie, unpredictable, self-organising, let-it-happen).10 It is not unusual to see reports of interventions unfolding successfully with a managerial mechanism of action (such as monitoring systems for community-based prevention).11 But complexity-harnessing interventions that minimally prescribe the process and provide maximum feedback to allow review and adjustment are successful too.12 Indeed, there is strong direct evidence that fostering reinvention across sites and allowing practitioners to modify interventions to suit their needs makes practices more likely to be adopted.10 By passing off the so-called flexible or periphery elements as optional, a hybrid intervention therefore undermines the very actions that might lead to effective and sustained problem-solving.

How this has come about is puzzling. Maybe Greenhalgh’s seminal work on diffusion of innovation has been taken up in different ways because it is hard to shake the different explanatory schema that researchers automatically bring to what they observe. Greenhalgh’s team spoke about complex innovations having ‘fuzzy boundaries’. Interventions were conceptualised as having a ‘hard core’ (‘irreducible elements of the innovation itself’) and a ‘soft periphery’ (the organisational structures and systems needed for full implementation).10 Soft periphery elements were listed as part of the assimilation process; they were not listed as part of the innovation. However, to some researchers, elements are components to be delivered and counted, wherever they appear in a hypothesised change process.
To others, elements might be capacities and processes to be identified and coached or better rewarded. In any event, in network theory a core-periphery structure does not necessarily equate to weakness at the periphery. Actors on the periphery just hold a different type of power to those in the centre.13 Peripheral actors connect central actors to novel resources (material, social, emotional, informational).13 These may prove make-or-break when it comes to the adoption of innovation.

## We are still learning what interventions (really) are

Intervention design requires deep reflection and what Greenhalgh calls ‘epistemological labour’. For example, components may have hidden and multiple functions. Resources designed to increase and distribute knowledge can have other (more) important roles, for example, to build relationships between staff and patients.14 The proportion of staff trained to deliver an intervention can be a better predictor of outcomes than the time staff spend on intervention delivery.15 In other words, complex interventions act through multistrand pathways of change. This means that conventional measures of the dose of intended components can fail to detect how change occurs. Intervention logic must be interrogated at the outset. Components must also be fully observed to see how they function in practice and from place to place. Replication studies have been fruitful.5

What gets theorised matters too. Individual behaviour theory tends to dominate intervention design.16 A more logical starting point would be to get to know the context or system into which the intervention is to be introduced and how the problem is recurrently produced by that system. Theories about settings and system dynamics can be drawn on to determine what functions need to be enhanced, extinguished or introduced by the intervention’s components or strategies. Activity setting theory, for example, identifies the number of roles in the setting (roles like leadership, supervision and feedback) and how they are distributed.17 Ecological systems theory invites consideration of the available resources/capacity (people, time, materials, skills), including whether these are sufficient to ‘couple’ with the intervention.18

## Conclusion

All intervention components should matter, and their functions or roles in the local context/system need to be theorised in the change process. We cannot predict in advance which components will be most important. That will vary from site to site and depend on complex interaction dynamics. Studies like those of Hampton and her colleagues, with their extensive investment in ethnographic methods, are therefore vital for increasing our understanding.1 Indeed, a wide lens is critical to understanding intervention complexity and to countering the popular tendency to narrow the gaze to ‘barriers and enablers’.19 Finally, researchers should be fully cognisant of pre-existing dynamics, appreciating that adaptation is not just a programme-level phenomenon but a system-level capability.20 In other words, a worthwhile intervention is not merely something that has been implemented. It betters the system as a whole.

## Ethics statements

### Patient consent for publication

Not applicable.

### Ethics approval

Not applicable.

## Footnotes

* Contributors PH wrote the paper and is the guarantor.
* Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
* Competing interests None declared.
* Provenance and peer review Commissioned; internally peer reviewed.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/).

## References

1. Hampton S, Murray J, Lawton R, et al. Understanding the challenges and successes of implementing a ‘hybrid’ intervention in health care settings: findings from a process evaluation of a patient involvement trial. BMJ Qual Saf 2025;34:92–9. doi:10.1136/bmjqs-2024-017268
2. Medical Research Council. A framework for the development and evaluation of randomised controlled trials for complex interventions to improve health. London: MRC, 2000.
3. Bauman LJ, Stein REK, Ireys HT. Reinventing fidelity: the transfer of social technology among settings. Am J Community Psychol 1991;19:619–39. doi:10.1007/BF00937995
4. Hawe P, Shiell A, Riley T. Complex interventions: how “out of control” can a randomised controlled trial be? BMJ 2004;328:1561–3.
5. Fixsen DL, Naoom SF, Blase KA, et al. Implementation research: a synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network (FMHI Publication #231), 2005.
6. Skivington K, Matthews L, Simpson SA, et al. A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance. BMJ 2021;374:n2061.
7. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50. doi:10.1186/1748-5908-4-50
8. Butler M, Epstein RA, Totten A, et al. AHRQ series on complex intervention systematic reviews—paper 3: adapting frameworks to develop protocols. J Clin Epidemiol 2017;90:19–27. doi:10.1016/j.jclinepi.2017.06.013
9. Evans RE, Moore G, Movsisyan A, et al. How can we adapt complex population health interventions for new contexts? Progressing debates and research priorities. J Epidemiol Community Health 2021;75:40–5. doi:10.1136/jech-2020-214468
10. Greenhalgh T, Robert G, MacFarlane F, et al. Diffusion of innovations in service organisations. Milbank Q 2004;82:581–629.
11. Green AM, Innes-Hughes C, Rissel C, et al. Codesign of the Population Health Information Management System to measure reach and practice change of childhood obesity programs. Public Health Res Pract 2018;28:e2831822. doi:10.17061/phrp2831822
12. Braithwaite J, Churruca K, Long JC, et al. When complexity science meets implementation science: a theoretical and empirical analysis of systems change. BMC Med 2018;16:63. doi:10.1186/s12916-018-1057-z
13. Wasserman S, Faust K. Social network analysis: methods and applications. New York, NY: Cambridge University Press, 1994.
14. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health 2015;36:307–23. doi:10.1146/annurev-publhealth-031912-114421
15. Goenka S, Tewari A, Arora M, et al. Process evaluation of a tobacco prevention program in Indian schools - methods, results and lessons learnt. Health Educ Res 2010;25:917–35. doi:10.1093/her/cyq042
16. Moore GF, Evans RE. What theory, for whom and in which context? Reflections on the application of theory in the development and evaluation of complex population health interventions. SSM Popul Health 2017;3:132–5. doi:10.1016/j.ssmph.2016.12.005
17. O’Donnell CR, Tharp RG, Wilson K. Activity settings as the unit of analysis: a theoretical basis for community intervention and development. Am J Community Psychol 1993;21:501–20. doi:10.1007/BF00942157
18. Trickett EJ. Community psychology: individuals and interventions in community context. Annu Rev Psychol 2009;60:395–419. doi:10.1146/annurev.psych.60.110707.163517
19. Haynes A, Loblay V. Rethinking barriers and enablers in qualitative health research: limitations, alternatives, and enhancements. Qual Health Res 2024. doi:10.1177/10497323241230890
20. Loblay V, Garvey K, Shiell A, et al. Can adaptation to ‘extraordinary’ times teach us about ways to strengthen community-based chronic disease prevention? Insights from the COVID-19 pandemic. Crit Public Health 2022;32:127–38. doi:10.1080/09581596.2021.2006147