A Systems-Aware Evaluation Approach for Innovation
Published in Social Sciences, Computational Sciences, and Public Health
Health and social systems are constantly evolving. They operate as complex adaptive systems (CAS), where even small disruptions can lead to large, unpredictable outcomes. In these settings, evaluating innovation requires approaches beyond fixed project plans and linear assumptions (1,2).
To meet this challenge, researchers are developing evaluation methods that better reflect the complexity of real-world implementation. These approaches consider stakeholder diversity, system feedback, and emergent behaviours. Rather than relying on rigid metrics, they offer ways to generate context-sensitive insights and support evaluation processes that adapt as innovations unfold (1,2).
In this post, we share a short video about a recent study evaluating RAPIDx AI, a decision-support tool for emergency cardiac care, implemented across 12 hospitals in South Australia (3). While the study focused on clinicians, the broader approach highlights the importance of engaging a wide range of stakeholders, such as patients, administrative staff, and health service leaders.
In the study, we worked with a co-design group that included a broad mix of stakeholders: patients, clinicians, managers, researchers, policymakers, and digital developers. This group helped shape the evaluation process from the outset, contributing to the design and validation stages (1,3). Surveys and interviews were developed in collaboration with these stakeholders and focused on capturing the practical, ethical, and contextual dimensions of implementing innovation, such as perceived usefulness, acceptability, and feasibility in real-world conditions (3,4).
Even so, there is room to grow. In future evaluations, tools like surveys and interviews should be applied more widely, not just with clinicians but also with administrative teams, leadership, and community members. This would offer a more complete view of how innovations are experienced and adapted across different roles and environments, supporting more equitable and system-sensitive implementation (1,2).
This approach draws on insights from complexity science and transdisciplinary research, which see knowledge emerging through interaction between people, systems, tools, and contexts (1,4,5). From this perspective, evaluation is not merely a technical measurement exercise but a dynamic system engagement and learning process.
Such thinking is gaining traction across public health, social science, and computational research. Conventional, output-focused evaluation methods often miss key factors like power dynamics, shifting contexts, and unintended outcomes. Adaptive approaches, and their ongoing development, offer a way to understand innovation as it moves through, and is shaped by, the people and systems involved (2,3).
We share this example not as a fixed model, but as an invitation to reflect on how we might evaluate innovation differently, embracing complexity, collaboration, and systems thinking. We welcome further dialogue with others working at the intersection of change, evaluation, and impact.
References
1. Pinero De Plaza MA, Yadav L, Kitson A. Co-designing, measuring, and optimizing innovations and solutions within complex adaptive health systems. Front Health Serv. 2023;3:1154614. https://doi.org/10.3389/frhs.2023.1154614
2. Romera AJ, Bratman EZ, Pinero de Plaza MA, Descalzo AM, Ghneim-Herrera T. Freeing transdisciplinarity from the project straightjacket: reframing the problem. Soc Sci Humanit Open. 2025;11:101483. https://doi.org/10.1016/j.ssaho.2025.101483
3. Pinero De Plaza MA, Lambrakis K, Marmolejo-Ramos F, Beleigoli A, Archibald M, Yadav L, et al. Human-centred AI for emergency cardiac care: Evaluating RAPIDx AI with PROLIFERATE_AI. Int J Med Inform. 2025;196:105810. https://doi.org/10.1016/j.ijmedinf.2025.105810
4. Archibald MM, Lawless MT, Pinero de Plaza MA, Kitson AL. How transdisciplinary research teams learn to do knowledge translation (KT), and how KT in turn impacts transdisciplinary research: A realist evaluation and longitudinal case study. Health Res Policy Syst. 2023;21(1):20. https://doi.org/10.1186/s12961-023-00967-x
5. Pinero De Plaza MA, Conroy T, Mudd A, Kitson A. Using a complex network methodology to track, evaluate, and transform fundamental care. In: Nurses and Midwives in the Digital Age. IOS Press; 2021. p. 31–5. https://ebooks.iospress.nl/volumearticle/58606