Meet the Editor: An Interview with Lisa Saldana, our new Associate Editor for Implementation Science Communications

An interview with Lisa Saldana, who has joined the Implementation Science Communications Board. Director and Senior Research Scientist at Chestnut Health Systems’ Lighthouse Institute, Lisa is also Core Faculty with the Implementation Research Institute (IRI), an R25 at Washington University.

Lisa Saldana is the Director and Senior Research Scientist at Chestnut Health Systems’ Lighthouse Institute and Director of the Center for Implementation Science at the Lighthouse Institute. Her research emphasizes advancing the use of evidence-based practice in public-serving systems. A clinical psychologist by training, she is an NIH-funded implementation scientist focused on the development, evaluation, and implementation of prevention-focused evidence-based practices (EBPs). Lisa has led a multidisciplinary team of researchers, and together, during their time at the Oregon Social Learning Center, they developed the Stages of Implementation Completion (SIC®) and Cost of Implementing New Strategies (COINS) tools, which have been widely adopted to measure, track, and facilitate implementation processes, milestones, and resource use across behavioral and physical healthcare programs worldwide. She serves as a mentor to many emerging scientists with an interest in implementation science and evidence-based practice and has the privilege of collaborating with investigators and community partners throughout the country on numerous federally funded projects.

The following is an interview between Elvin Geng, Editor-in-Chief of Implementation Science Communications, and Lisa.

Elvin: Thank you so much for joining the journal as an AE. The timing is crucial: we have had a surge in submissions, and we need the kind of expertise that you bring to guide the journal as well as the field. On a personal level, I’m also very excited to work more closely with you because I have followed your work for years. You developed the Stages of Implementation Completion and COINS, tools that have shaped how the field measures and supports implementation. These are major contributions to the field. Could I ask you to say a word or two about these, how they came about, and what has surprised you about how the research community has ended up using them?

Lisa: Thanks Elvin! I can’t tell you how excited I am to join this esteemed editorial board and team. Implementation Science Communications is one of my favorite journals and I’m delighted to support it during this period of massive growth. And congratulations to you and the editors for earning such an incredible initial impact score. 

Thanks for asking about the SIC and COINS tools. I wish I could take credit for the initial idea of measuring implementation process, but this idea originated from one of the original implementation trials funded under NIH’s initial Dissemination and Implementation Research in Health initiative. The grant was awarded to Oregon Social Learning Center, with Patti Chamberlain as the PI and Hendricks Brown as a Co-I. Hendricks had the original notion that the passing of time, and what you did with that time, was important for understanding implementation. The gap we were trying to fill was significant: the study was a head-to-head trial comparing the effectiveness of two bundled implementation strategies, and we needed to determine whether the strategies were actually being used, how they were being used, and whether they ended in successful implementation. All of these are a given part of an original proposal now, but for those initial trials, we were trying to figure out what the plane was actually supposed to look like while we were building it, and flying it.

I was at the right place at the right time, happened to have some measurement development skills, and was very privileged to be asked to lead this effort. I integrated my behavioral observation background and spent a lot of time grappling with how to operationalize processes. I also happened to have some (very minimal) training in economics and policy, and it was clear, as we observed implementation processes, that significant fiscal decisions were impacting implementation along the way. We, in turn, developed the COINS tool to be used alongside the SIC to capture the costs associated with the implementation process. While both were developed to help us observe our outcomes for that initial trial, they now are commonly used throughout the field, which has been absolutely unexpected. It has been such a fun ride to collaborate with people all over the world and learn about the really cool innovations they are trying to move into practice.

Elvin: As you step into the associate editor role at Implementation Science Communications, where do you see the biggest white spaces in the current literature? What kinds of manuscripts are you hoping to champion? 

Lisa: This is probably not surprising, but I am very grounded in the practical and pragmatic side of implementation science, and I would like to see the field move toward helping systems integrate and utilize the knowledge learned over the last couple of decades to create systems change and build sustainable infrastructures. I would like to see us move beyond the theoretical and apply what we have learned on a larger scale. Across the globe we are witnessing how vulnerable our healthcare and scientific infrastructures are, with many systems being dismantled at rapid speed. During the period of rebuilding, it would be wonderful to see more manuscripts providing examples of methods, strategies, and outcomes focused on using knowledge from the IS field to help create infrastructures that can exist and sustain within communities independent of external resources.

Elvin: If you had to name one conceptual or methodological shift you'd like to see the field make in the next five years, what would it be — and why?

Lisa: Great question. I would love to see rigorous designs developed that also take into consideration that real world partners do not work on the same timeline as research trials. This goes in both directions – partners who want to adopt an intervention but need more time than a particular design allows, and partners who need to implement a solution to an immediate need, now, and cannot wait for the study timeline. There are, of course, really cool variations of rollout designs emerging and it will be interesting to see how these methods evolve. And I am thrilled that “pacing” of implementation is gaining increased attention in the field, in part thanks to the special collection that ISC and IS are supporting.

There will always be a tension between the pace of science and real-world need, but I would love to see more innovation toward solving this gap. 

Elvin: One of the things I like to ask is what your favorite paper has been recently in implementation science and why. 

Lisa: Another great question. There is so much great work happening that it is hard to pick one, but related to your last question, I would have to say the recent article by Hendricks Brown and colleagues on rollout designs in Implementation Science, “What scientific inferences can be made with randomized implementation rollout trials.” I think this is a great example of a paper that challenges us to think beyond our typical effectiveness designs and to consider how we can leverage learnings across time from different cohorts that are on different implementation timelines. I appreciated how the authors laid out different methods that could be utilized to answer different questions. What really struck me, though, was how they outlined the steps that are needed to maintain rigor: for example, recommendations for when randomization of sites should occur under different designs. This seems simple, but it is anything but, and I found these insights really helpful as I think about our current and future trials.

Elvin: Can you tell us the last book you read, what it was about and why you liked it? 

Lisa: I am biased, but my sister is an author and recently wrote a book, “What We Remember Will Be Saved.” It is a written documentary of the stories of refugees in the Middle East who sustain their culture and values through their art and what they can literally bring with them on their backs. She gives voice to their stories of preserving their heritage through food, song, textiles, dance, and so forth. I really appreciated this book, not only because it was written by my sister, but because it provided a beautiful example of hope and the steps that people naturally take to sustain what is most valuable to them, even in the most unpredictable and challenging times.

Elvin: Anything else you would like us to know about you?  

Lisa: Folks may say that I am a little pollyannaish, but I hope that glass-half-full perspective helps propel the implementation science field forward. I truly believe that our expertise is what is needed right now to help shape the direction of equitable and quality healthcare. I am excited to help support these advancements through my role with IS Communications!

📝 Explore the journal and submit your manuscript.

📢 Sign up to receive article alerts from Implementation Science Communications



Related Collections


Advancing the Science and Metrics on the Pace of Implementation

The persistent 17-year span between discovery and application of evidence in practice has been a rallying cry for implementation science. That frequently quoted time period often implies that implementation needs to occur faster. But what do we really know about the time required to implement new evidence-based practices into routine settings of care? Does implementation take 17 years? Is implementation too slow? Can it be accelerated? Or does a slower pace of implementing new evidence-based innovations serve a critical function? In many cases—pandemics, health inequities, urgent social crises—pressing needs demand timely implementation of rapidly accruing evidence to reduce morbidity and mortality. Yet many central tenets of implementation, such as trust, constituent inclusion, and adaptation, take time and may require a slow pace to ensure acceptability and sustained uptake.

To date, little attention and scant data address the pace of implementation. Speed is a rarely studied or reported metric in implementation science. Few in the field can answer the question, “how long does implementation take?” Answering that question requires data on how long various implementation phases take, or how long it takes to achieve implementation outcomes such as fidelity, adoption, and sustainment. Importantly, we lack good data on how different implementation strategies may influence the amount of time to achieve given outcomes.

To advance knowledge about how long implementation takes and how long it “should optimally” take, this collection seeks to stimulate the publication of papers that can advance the measurement of implementation speed, along with the systematic study of influences on and impacts of speed across diverse contexts, to more adequately respond to emerging health crises and benefit from emerging health innovations for practice and policy. In particular, we welcome submissions on 1) methodological papers that facilitate development, specification, and reporting on metrics of speed, and 2) data-based research (descriptive or inferential) that reports on implementation speed metrics, contextual factors and/or active strategies that affect speed, or the effects of implementation speed on important outcomes in various contexts.

Areas of interest include but are not limited to:

  • Data-based papers documenting the pace of moving through various implementation phases, and identifying factors (e.g., implementation context, process, strategies) that affect the pace of implementation (e.g., accelerators and inhibitors)
  • Data-based papers from multi-site, including multi-national, studies comparing pace of innovation adoption, implementation, and sustainment across various contexts
  • Data-based papers reporting time to implementation in the face of urgent social conditions (e.g., climate change, disaster relief)
  • Papers on how to accelerate time to delivery of treatment discoveries for specific health conditions (e.g., cancer, infectious disease, suicidality, opioid epidemic)
  • Data-based papers on the timeliness of policy implementation, including factors influencing the time from data synthesis to policy recommendation, and from policy recommendation to implementation
  • Span of time needed to: achieve partner collaboration, including global health partnerships; adapt interventions to make them more feasible, usable, or acceptable; achieve specific implementation outcomes (e.g., adoption, fidelity, scale-up, sustainment); de-implement harmful or low-value innovations; or identify failed implementation efforts
  • Effect of implementation pace on attainment of key outcomes such as constituent engagement, intervention acceptability or sustainability, health equity, or other evidence of clinical, community, economic, and/or policy benefits.
  • Papers addressing the interplay between pace and health equity, speed and sustainability, and other considerations that impact decision-making on implementation
  • Methodological pieces that advance designs for testing speed or metrics for capturing the pace of implementation

This Collection welcomes submission of a range of article types. Should you wish to submit to this Collection, please read the submission guidelines of the journal you are submitting to, i.e., Implementation Science or Implementation Science Communications, to confirm that the article type is accepted by that journal.

Articles for this Collection should be submitted via the submission systems of Implementation Science or Implementation Science Communications. During the submission process you will be asked whether you are submitting to a Collection; please select "Advancing the Science and Metrics on the Pace of Implementation" from the dropdown menu.

Articles will undergo the standard peer-review process of the journal in which they are considered, Implementation Science or Implementation Science Communications, and are subject to all of that journal’s standard policies. Articles will be added to the Collection as they are published.

The Editors have no competing interests with the submissions which they handle through the peer-review process. The peer-review of any submissions for which the Editors have competing interests is handled by another Editorial Board Member who has no competing interests.

This Collection is organized by:

  • Gila Neta, PhD, MPP, National Institutes of Health, Bethesda, United States
  • Enola Proctor, PhD, Washington University in St. Louis, United States
  • Alex Ramsey, PhD, Washington University in St. Louis, United States

Publishing Model: Open Access

Deadline: Jun 30, 2026