Implementation Science Presentation

  1. Implementation Science. Dr. Akanksha Dani, JR2, Community Medicine
  2. Evidence-Based Practice (EBP) Evidence-based practice is the conscientious use of current best evidence in making decisions about patient care. It is often described as a three-legged stool: 1. The best available research evidence bearing on whether and why a treatment works; 2. Clinical expertise (clinical judgement and experience) to rapidly identify each patient's unique health state and diagnosis and the individual risks and benefits of potential interventions; 3. Patient preferences and values.
  3. EVIDENCE HIERARCHY (strongest to weakest): systematic reviews & meta-analyses of RCTs; RCTs; cohort studies (people with/without the condition, followed prospectively); case-control studies (people with/without the condition, examined retrospectively); cross-sectional surveys; case reports; uncontrolled data with no control group (expert opinions, perspectives, etc.)
  4. Efficacy and Effectiveness Trials
  5. Background – It has been widely reported that EBPs take, on average, 17 years to be incorporated into routine general practice. Research-to-practice gap – whether researchers' published findings are translated into PUBLIC HEALTH IMPACT has not been a concern of traditional healthcare researchers.
  6. Know-do gap Bridging the know-do gap is one of the most important challenges for public health in this century. It also presents a great opportunity for strengthening health systems.
  7. Implementation Science Definition: the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and hence improve the quality and effectiveness of health services.


  8. What will implementation research do? 1. Quantify the gap between routine care and potential care with proven or promising interventions 2. Analyse barriers to understand what hinders change 3. Describe what implementers do to bring about change 4. Enable successful adaptation and implementation of interventions 5. Assess implementation and outcomes 6. Understand the sustainability of changes
  9. Discovery → Development → Delivery → Improved Health Outcomes. Discovery: what is the pathophysiology? Development: what are the diagnosis & intervention? Delivery: how do we best deliver the intervention, and does the intervention & delivery model work? The delivery system plays a central role in closing the know-do gap.
  10. Implementation research requires TRANS-DISCIPLINARY RESEARCH that may include: • health services researchers • economists • sociologists • anthropologists • organizational scientists • operational partners in administration • front-line clinicians • patients
  11. Quality improvement vs implementation science vs dissemination. Quality improvement: begins with a specific problem in a specific healthcare system; recognized at the level of the provider, it leads to the design and trial of strategies to improve that specific problem in that system. Implementation science: typically begins with EBPs that are underutilized; identifies and addresses the resultant quality gaps at the provider, clinic, or healthcare-system level; takes as part of its mission the explicit goal of developing generalizable knowledge that can be applied widely beyond the individual system under study. Dissemination: the spread of information about the intervention.
  12. Examples in implementation science. Presenter: Dr. Rashmi Kulkarni (JR3); Guide: Dr. Shrikala Acharya (Additional Professor), Dept of Community Medicine, Seth GSMC and KEMH
  13. IR characteristics and application. Systematic: the systematic study of how a specific set of activities integrates an evidence-based public health intervention within specific settings, and of how health outcomes vary across communities. Multidisciplinary: analysis of the biological, social, economic, political, system, and environmental factors that impact implementation; interdisciplinary collaborations between behavioural and social scientists, clinicians, epidemiologists, statisticians, engineers, business analysts, policy makers, and stakeholders.
  14. Contextual: relevant to local specificities and needs, yet generates generalizable knowledge that can be applied across contexts (culture, community). Complex: dynamic and adaptive; multi-scale, occurring at multiple levels of health care systems and community practices; analyzes multi-component programs and policies.
  15. IR is NOT: • routine, applied operations research • basic biomedical research (e.g., discovery of a new gene pathway, or etiology research) • initial or replication intervention efficacy trials in a top-down, controlled setting • routine program progress reporting • monitoring & evaluation (M&E is part of IR, not synonymous with it)
  16. E.g., zinc deficiency and diarrhoea: • What is the association of zinc deficiency with severity of diarrhoea? Epidemiological research • What is the effect of zinc as an adjunct in the treatment of diarrhoea? Clinical efficacy research • What is the effect of a programme promoting zinc as an adjunct in the treatment of diarrhoea? Programme effectiveness research • How can the barriers to scaling up the zinc promotion programme be overcome so that it reaches all children with diarrhoea? Implementation research
  17. Operational research (OR) focuses on a specific, local, clearly defined setting and context. • Implementation research (IR) starts with a specific setting and applies findings to broader contexts through scale-up and other implementation processes. • Health systems research (HSR) focuses on a broader context, covering many settings under the umbrella of an entire system.
  18. Voluntary Medical Male Circumcision Scale-up in Eastern and Southern Africa • With evidence that male circumcision (MC) reduces the risk of HIV transmission in specific settings, countries in Eastern and Southern Africa are working to scale up MC service delivery and coverage. Delivering this evidence-based male circumcision intervention includes opportunities for operations research, implementation research, and health systems research.
  19. Operational research: Which locations should be targeted for delivering MC services in Eastern Africa? Implementation research: How can access to MC services be improved among populations who are currently not reached by them? Health systems research: What has been the impact of the rapid scale-up of MC programs on fragile health systems?
  20. IR constraints to scale-up: • funding • stakeholder access to information at different levels • lack of political support • frequent changes in staff or policies at any level • lack of skilled facilitators • the influence and relationships of different contextual levels (individual, organization, community, policy)
  21. Interventions with skilled birth attendants (SBAs) to address the MDG of improving maternal, newborn, and child health. Constraints and potential interventions, by level: Community/household. Constraint: perceptions of SBAs, decision making. Intervention: community-level promotion of services and behavioural modifications to increase demand for services. Health services delivery. Constraints: shortage and distribution of appropriately qualified staff (of appropriate gender); inadequate drugs and medical supplies; lack of equipment and infrastructure. Intervention: task-shifting and redistribution of personnel and resources.
  22. Health sector policy and strategic management. Constraints: employment systems, supply procurement processes. Intervention: task-shifting and redistribution of personnel. Public policies cutting across sectors. Constraints: poor availability of communication, poor transport infrastructure. Interventions: quality assurance and monitoring; transportation vouchers. Environmental and contextual characteristics. Constraints: corruption, weak government, geographic barriers. Interventions: transparency; transportation vouchers.
  23. Approaches to identify constraints • Systematic analysis • Discussion with concerned stakeholders • Routine monitoring of health sector activities • Annual health sector review meetings
  24. RE-AIM FRAMEWORK design questions. Reach: who is intended to benefit, and who actually participates or is exposed to the intervention? Effectiveness: what are the most important benefits you are trying to achieve, and what is the likelihood of negative outcomes? Adoption: where is the program or policy applied, and who applies it? Implementation: how consistently is the program or policy delivered, how will it be adapted, how much will it cost, and how/why will the results come about? Maintenance: when will the initiative become operational, how long will it be sustained (setting level), and how long are the results sustained (individual level)?
  25. Planned Parenthood and smoking cessation example • Patient-randomized study (n = 1154) in low-income Planned Parenthood clinics. • Eligible and target population: women smokers coming into the clinic for contraception, wellness, or non-pregnancy follow-up. • Intervention: 9-minute tailored video, clinician advice to quit, brief behavioural counselling, follow-up phone calls. • Control: advice to quit and a Stop Smoking brochure. Glasgow R et al. A brief smoking cessation intervention. AJPH, 2000, 90: 786-789.
  26. RE-AIM framework • Reach: 99% of Planned Parenthood clinics had identified smoking status; 76% of smokers approached participated; no demographic differences between participants and decliners. • Effectiveness: at 6-week follow-up, 10.2% quit in the intervention group vs 6.9% in the control group (p < .05). • Adoption: approached 4 clinics in the lowest-SES neighbourhoods in the area.
  27. Implementation: >85% delivery of all components except phone calls (43%). • Maintenance (after 6-month follow-up): individual level, higher but not significant (30-day biochemically validated abstinence: 6.4% vs 3.8%). • Conclusion: this brief, clinic-based intervention appears to be effective in reaching and enhancing cessation among female smokers, a traditionally underserved population. (A quick check of the 6-week comparison appears in the sketch below.)
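To make the 6-week effectiveness comparison concrete, here is a minimal two-proportion z-test sketch in Python. The per-arm sample sizes are an assumption (an even split of the n = 1154 participants), since the slides do not report them; the quit counts are back-calculated from the 10.2% and 6.9% rates.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Return (z, two-sided p) for H0: p1 == p2, using the pooled SE."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                 # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided normal tail
    return z, p_value

# Assumed even split: ~59/577 intervention vs ~40/577 control quitters
z, p = two_proportion_ztest(x1=59, n1=577, x2=40, n2=577)
print(f"z = {z:.2f}, p = {p:.3f}")   # p just under .05, consistent with the slide
```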
  28. Discussion with concerned stakeholders 1. Specify and describe: • A discussion of TB control with stakeholders. • Constraint: the increasing defaulter rate among TB patients. • Possible causes: poor health services management, social stigma, negative attitudes of health workers towards TB patients.
  29. Quantify and elaborate: • How widespread is the problem? • In which regions does it occur most persistently? • What are the potential areas of low compliance, and who is most affected? • What are the consequences of the problem? (increasing morbidity, deaths, waste of resources, development of multi-drug resistance, etc.)
  30. Identify contributing factors: • Poorly trained staff, inadequate TB health education materials, limited patient understanding of treatment, or failure to provide systematic advice and counselling to patients. • These factors may inhibit patient understanding of treatment requirements, causing the high defaulter rate.
  31. Q: Please select the best example of an implementation research question. A. What is the effect of zinc as an adjunct for treatment of diarrhea? B. What is the effect of distributing insecticide-treated nets to prevent malaria in vulnerable populations? C. How can tuberculosis treatment be delivered effectively in rural areas? D. Does a health education program increase access to antiretroviral therapy?
  32. Answer C. How can tuberculosis treatment be delivered effectively in rural areas?
  33.  Thank You
  34. IMPLEMENTATION SCIENCE Dr. Ashwini B. Sapkal Junior Resident 2 Department of Community Medicine Seth GSMC & KEM Hospital Parel, Mumbai
  35. Principles and methods of implementation science. Implementation studies differ from clinical studies. The first step in understanding implementation studies is to distinguish implementation processes from the EBPs they seek to implement. An implementation intervention is "a single method or technique to facilitate change", while an implementation strategy is "an integrated set, bundle, or package of discrete implementation interventions, ideally selected to address specific identified barriers to implementation success".
  36. Implementation interventions may include, for example, efforts to change behaviour at the patient, provider, system, or even policy level. Common examples at the provider level include education/training, audit-and-feedback, and performance incentives.
  37. Strategies targeting the provider, team, or clinic level may include QI techniques or other systems-redesign efforts, team-based performance incentives, learning collaboratives, or community engagement. Facilitation, meaning guided efforts by internal or external organizational staff to support multiple levels of system change through provider- or team-based coaching, is increasingly recognized as critical to the success of many implementation efforts.
  38. In contrast to clinical research, which typically focuses on the health effects of an EBP, implementation studies typically focus on the rates and quality of use of EBPs rather than on their effects. Such EBPs may be as "simple" as increasing the use of a single medication (such as beta-blockers in individuals who have experienced a myocardial infarction, or metabolic side-effect monitoring for individuals taking antipsychotic medications) or as "complex" as instituting psychotherapies like cognitive behavioural therapy, or even multi-component care paradigms such as the collaborative chronic care model.
  39. The implementation strategy and the EBP are two different concepts. For instance, in studying the effects of a program to increase effective cognitive-behavioural therapy use for bipolar disorder, the impact of cognitive-behavioural therapy on health status would be an EBP outcome (and more typically the focus of a clinical study), while the proportion of clinicians providing cognitive-behavioural therapy, or the proportion of patients who attend a minimum number of cognitive-behavioural therapy sessions, would be a more typical implementation study outcome.
  40. Thus the crux of implementation studies is their focus on evaluating the process of implementation and its impact on the EBP of interest.
  41. Definitions: Evaluation research refers to the kind of applied social research that attempts to evaluate the effectiveness of social programs. Impact measures how well programs deliver an intervention and the resulting outcomes. Example: is the ART treatment program (as a preventive strategy) reducing HIV incidence?
  42.  Impact Evaluation:
  43. These studies can involve one or more of three broad types of evaluation: 1] Process evaluation 2] Formative evaluation 3] Summative evaluation
  44. Process evaluation: • It describes the characteristics of use of an EBP. • Data are collected before, during, and/or after the implementation and analysed by the research team without feedback to the implementation team and without intent to change the ongoing process. • Process evaluation can be undertaken in a purely observational study (e.g., in preparation for developing an implementation strategy) or during the course of a spontaneous or planned system or policy change.
  45. Formative evaluation: • It utilizes the same methods as process evaluation, • but data are fed back to the implementation team and/or staff in the target system during the study, in order to adapt and improve the process of implementation during the course of the protocol. • Formative evaluation is conceptually similar to the fidelity monitoring that goes on as part of any traditional clinical trial, but differs in that it is specified a priori in a study hypothesis or research question.
  46. A quantitative version of formative evaluation in clinical trials is the use of sequential multiple assignment randomized trials (SMART), or adaptive intervention designs, which are used to either augment or switch treatments at critical decision points where there is evidence of initial non-response (a minimal sketch of SMART-style assignment follows below). • The use of formative evaluation has implications for controlled-trial validity, which may differ between clinical and implementation trials.
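As a rough illustration of the SMART logic just described, the following Python sketch follows one participant through a two-stage design. Treatments "A"/"B" and the augment/switch tactics are hypothetical placeholders, not drawn from the slides.

```python
import random

def smart_assign(responded_to_stage1):
    """Trace one participant through a two-stage SMART design."""
    path = []
    stage1 = random.choice(["A", "B"])               # first randomisation
    path.append(f"stage 1: treatment {stage1}")
    if responded_to_stage1:
        path.append("responder: continue stage-1 treatment")
    else:
        # non-responders are re-randomised at the critical decision point
        stage2 = random.choice([f"augment {stage1}", "switch treatment"])
        path.append(f"stage 2: {stage2}")
    return path

for responded in (True, False):
    print(smart_assign(responded))
```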
  47. Summative evaluation: • It is a compilation, at study end, of the impact of the implementation strategy. • Summative evaluation measures typically assess impacts on the process of care (e.g., increased use or quality of the targeted EBP). • Another common component of summative evaluation is to characterize the economic impact of an implementation strategy and its effects.
  48. Implementation studies most commonly do not employ formal cost-effectiveness analyses but instead conduct more focal "business impact analyses". • Such analyses focus on estimating the financial consequences of adopting a clinical practice within a specific healthcare setting or system. • Typically this includes costs to the system associated with both the implementation strategy and the utilization of the EBP (see the sketch below).
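A toy sketch of the two cost components named above. All names and figures are hypothetical placeholders, included only to show how strategy costs and EBP-utilization costs combine into a simple business impact estimate.

```python
# Hypothetical implementation-strategy costs for one healthcare system
strategy_costs = {
    "facilitator_time": 12_000.0,
    "provider_training": 8_000.0,
    "audit_feedback_reports": 3_000.0,
}
ebp_cost_per_patient = 150.0    # assumed incremental cost of delivering the EBP
patients_reached = 400          # assumed patients receiving the EBP post-adoption

total_cost = sum(strategy_costs.values()) + ebp_cost_per_patient * patients_reached
print(f"Estimated cost to the system: {total_cost:,.0f}")
```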
  49. Types of evaluation data: Data for the process, formative, and/or summative evaluation can come from various sources and can include quantitative data, qualitative data, or both. Data can be collected across various levels of observation, including patients, providers, systems, and broader environmental factors such as community, policy, or economic indices.
  50. Common quantitative measures include structured surveys and tools that assess, for example, organizational context, provider attitudes and behaviours, or patient receptivity to change. Administrative data are often utilized, either in focal target populations or at the broader system level, to characterize, for example, baseline and change in rates of utilization of particular practices.
  51. Measures of fidelity to the EBP are often central components of the evaluation plan, and these can be quantitative, qualitative, or both. Common qualitative data collection methods include semi-structured interviews with patients, providers, or other stakeholders; focus groups; direct observation of clinical processes; and document review.
  52. Qualitative data collection and analysis can either be structured as hypothesis-free exploration (or related approaches) or can utilize directed content analysis to address pre-specified issues such as hypothesis testing or measurement of intervention fidelity. Most implementation evaluation processes include mixed qualitative and quantitative measures and require careful attention in the study design to the various ways of combining such data.
  53. Implementation outcomes (definitions): Acceptability: perception among stakeholders that a new intervention is agreeable. Adoption: intention to apply, or actual application of, a new intervention. Appropriateness: perceived relevance of the intervention to a setting, audience, or problem. Feasibility: extent to which an intervention can be applied. Fidelity: extent to which an intervention is applied as originally designed/intended. Implementation costs: costs of the delivery strategy, including the costs of the intervention itself. Coverage: extent to which eligible patients/population actually receive the intervention. Sustainability: extent to which a new intervention becomes routinely available/is maintained post-introduction. (A sketch computing two of these outcomes from service records follows below.)
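To show how two of these outcomes might be operationalized, here is a minimal sketch computing coverage and fidelity from hypothetical service records; the field names and the 6-session protocol threshold are invented for illustration.

```python
# Hypothetical patient-level service records
records = [
    {"eligible": True,  "received": True,  "sessions": 8},
    {"eligible": True,  "received": True,  "sessions": 4},
    {"eligible": True,  "received": False, "sessions": 0},
    {"eligible": False, "received": False, "sessions": 0},
]
PROTOCOL_SESSIONS = 6   # assumed minimum dose specified by the protocol

eligible = [r for r in records if r["eligible"]]
treated = [r for r in eligible if r["received"]]

coverage = len(treated) / len(eligible)   # eligible people actually reached
fidelity = sum(r["sessions"] >= PROTOCOL_SESSIONS for r in treated) / len(treated)
print(f"coverage = {coverage:.0%}, fidelity = {fidelity:.0%}")
```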
  54. RE-AIM framework: IMI campaign. Reach: background data from a situational analysis based on national surveys, HMIS, and WHO concurrent monitoring and evaluation; selection criteria: districts with an estimated number of children who missed DPT3/P3 > 13,000 OR DPT3/P3 coverage < 70%; 173 districts (rural) and 17 urban areas. Effectiveness (risk reduction): for Mission Indradhanush, a 6.7% increase in immunization coverage. Adoption: 3 rounds in 2017 and 2 rounds in 2018; intersectoral coordination with other ministries, other programs, mobilizers, and partners.
  55. Implementation (level of enforcement): pre-campaign activities comprised a situational analysis, training and orientation, preparation of microplans after a headcount survey, assessment of district readiness by national monitors and common review missions, and initiation of the first and subsequent rounds. Maintenance (sustainability): daily reporting from ANM to planning unit to district unit to state unit to national unit; coverage evaluation by 30-cluster sampling, twice a year (in 5 rounds). (A minimal sketch of a 30-cluster coverage estimate follows below.)
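As a rough illustration of the 30-cluster coverage evaluation mentioned above, the sketch below estimates coverage from simulated per-cluster counts (7 children per cluster, in the style of the classic EPI 30x7 survey). The counts are randomly generated placeholders, and the standard error uses between-cluster variability, since children within a cluster are not independent.

```python
import math
import random

random.seed(1)
CHILDREN_PER_CLUSTER = 7
# Simulated number of immunised children in each of 30 clusters
immunised = [random.randint(3, 7) for _ in range(30)]

props = [x / CHILDREN_PER_CLUSTER for x in immunised]   # per-cluster coverage
coverage = sum(props) / len(props)                      # equal-size clusters
var_between = sum((p - coverage) ** 2 for p in props) / (len(props) - 1)
se = math.sqrt(var_between / len(props))                # SE of the cluster mean

print(f"coverage = {coverage:.1%} +/- {1.96 * se:.1%} (approx. 95% CI)")
```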
  56. Controlled Implementation Trials. Dr. Barsha Pathak, JR-3, Dept. of Community Medicine
  57. Concept • Controlled implementation trials help to identify barriers to and facilitators of evidence-based practice (EBP) uptake under naturalistic conditions. • How are they different? Other such studies seek to enhance EBP ADOPTION by employing specific implementation strategies.
  58. TYPES (by randomisation) 1. Parallel group 2. Interrupted time-series 3. Stepped-wedge 4. SMART
  59. Design concepts • Parallel design: randomised and prospective, with concurrent comparison groups. • Stepped-wedge design: all sites eventually receive implementation support, crossing from control to intervention at successive steps (a small sketch of such a rollout schedule follows below). • SMART design: sequential randomisation with multivariate nested analysis. • Time-series design: the outcome of interest is measured at multiple time points before and after the implementation effort.
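To make the stepped-wedge concept concrete, here is a minimal sketch that generates a rollout schedule; the cluster and step counts are illustrative, not from the slides.

```python
import random

def stepped_wedge(n_clusters=6, n_steps=3):
    """Return a cluster-by-period matrix: 0 = control, 1 = intervention."""
    clusters = list(range(n_clusters))
    random.shuffle(clusters)                 # randomise the crossover order
    per_step = n_clusters // n_steps
    schedule = {}
    for position, cluster in enumerate(clusters):
        step = position // per_step + 1      # period at which this cluster crosses
        schedule[cluster] = [int(period >= step) for period in range(n_steps + 1)]
    return [schedule[c] for c in range(n_clusters)]

# Every cluster starts in control, and all are exposed by the final period
for row in stepped_wedge():
    print(row)
```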
  60. Implementation science versus traditional health services and clinical trials
  61. 1. They focus on the impact of the implementation strategy on the use of an evidence-based practice (EBP) rather than on the health impact of the EBP itself. 2. They take a fundamentally different approach to validity.
  62. What do efficacy and effectiveness study designs focus on? E.g.: • addressing health-system issues and constraints; • integrating health-service provision in non-specialised settings through new human resources, improved capacity, strengthened information systems, health financing, and overall leadership and governance.
  63. WHAT DOES AN IMPLEMENTATION SCIENCE DESIGN DO?
  64. Implementation science explains how evidence-based interventions work in real-world/usual practice settings. • It pays particular attention to the audience that will be using the evidence-based research, • the context in which implementation of an evidence-based study occurs, and the factors that influence its implementation.
  65. EXAMPLE: Implementation science for closing the treatment gap for mental disorders
  66. BACKGROUND • At least 10% of the world's population is affected by one of a wide range of mental disorders. • Meanwhile, knowledge about mental disorders has improved, and so has the cost-effectiveness of interventions.
  67. THE TREATMENT GAP HAS REMAINED SIGNIFICANTLY LARGE (especially in low- and middle-income countries).
  68. To bridge the treatment gap for mental disorders, WHO envisages that treatment for priority mental disorders will be provided by primary care doctors. • Hence, large-scale training of general practitioners was taken up. But recent evaluations underline the limited effectiveness of training interventions. • BUT WHY?
  69. Practical difficulties • Evidence-based interventions worked in only 50% of replication sites. • Unpredictable behaviour of the surrounding system. • Lack of capacity to integrate. Hence: "CONTEXT IS EVERYTHING" (per implementation science).
  70. PRIME PROJECT • This project tries to address the "knowledge-action gap" by generating evidence to support the implementation and scale-up of mental health care in primary care and maternal health care. • A comprehensive mental health care plan was developed. • The role of the PRIME team was restricted to capacity building of staff on the WHO mhGAP training material and evaluation of the outcomes.
  71. • On evaluation, they found poor translation of the evidence-based WHO mhGAP guideline interventions into routine practice. • The barriers found were: 1. Providers were not particularly supportive. 2. Non-availability of psychotropic drugs, and lack of reporting of mental health indicators in the routine health management information system. 3. Poor accountability of district administration. 4. Low priority accorded to mental disorders.
  72. • Ultimately, the PRIME team proceeded with the implementation phase. • A CHANGE PACKAGE was designed: 1. Mental health service delivery through the existing public health system in LMICs can be strengthened only with strong facilitation by an external resource team. 2. An additional human resource, in the form of a case manager, is essential to establish true collaborative models of care.
  73. Measurement of implementation science: • ACCEPTABILITY • ADOPTION • APPROPRIATENESS • FEASIBILITY • IMPLEMENTATION COSTS • COVERAGE AND SUSTAINABILITY
  74. Efficacy vs effectiveness vs implementation design principles. Hypothesis. Efficacy: interventions are implemented. Effectiveness: interventions are implemented and may be followed, but not sustained. Implementation: interventions are adopted and sustained. Population & settings. Efficacy: any specialised setting. Effectiveness: non-specialised practice sites. Implementation: the unit of observation may be patients, providers, or primary health care centres; the typical setting is non-specialised practice sites. Outcome measures. Efficacy: health outcomes are many. Effectiveness: short and specific. Implementation: emphasis on adoption measures. Interventionists. Efficacy: clinicians (PhDs, MSWs) hired and trained by the PI. Effectiveness: counsellors for mental health. Implementation: endogenous counsellors. Context. Efficacy: the trial should be successful at all costs. Effectiveness: work within "typical" conditions. Implementation: maintain typical conditions. Validity emphasis. Efficacy: internal >> external. Effectiveness: external >> internal. Implementation: plan to optimise the protocol in real time using formative evaluation while systematically documenting adaptations.