Position Statements

AACP Position Statement on Implementation of Evidence-Based Practice

December 2005


In recent years, considerable emphasis in public policy for behavioral health care has been placed on implementation of evidence-based practices (EBPs). This has been a component of SAMHSA’s science-to-service initiative, a key recommendation of the President’s New Freedom Commission, and a focus for advocacy groups such as NAMI. SAMHSA has developed the National Registry of Effective Programs and Practices (NREPP), which will identify EBPs for behavioral health, as well as a variety of funding mechanisms to encourage implementation and dissemination of EBPs at state and local levels. Policy makers and state legislatures are supporting, and passing, legislation that ties behavioral health funding to EBP implementation. Increasingly, EBP implementation is being identified as a “movement” that is directly related to improvement of the quality of behavioral health care.

Among community psychiatrists, the emerging “EBP movement” has aroused considerable controversy and vocal debate. On the one hand, community psychiatrists in both administrative and clinical roles recognize the importance of maximizing the utilization and dissemination of the most effective clinical practices and the latest research in public sector settings. This is particularly important in large public sector systems, where such practices may replace reliance on traditional approaches or program models that have not been demonstrated to have as much value as newer research-supported interventions. As described in initial work on evidence-based medicine by the Cochrane Collaboration in the United Kingdom, the positive impact of “EBP” implementation, within this framework, can include both “pruning” (our term) practices or programs that have been demonstrated to be suboptimal (e.g., some traditional SMI day treatment programs that provide maintenance rather than rehabilitative and recovery-oriented services) and “planting” (our term) practices and programs that have demonstrated value (e.g., individualized placement and support approaches to vocational rehabilitation for the same population). On the other hand, community psychiatrists in policy, administrative, and clinical roles recognize the complexity and diversity of the systems in which they work and the populations they serve, and are concerned about the apparent expectation that increasing proportions of funding be tied to evidence-based practices that may relate to only a minority of all clinical activity. In many cases these EBPs may not have been adequately researched to determine all the factors that contribute to “effectiveness” in the full spectrum of systems, settings, and populations in which real community care takes place. This becomes a particular concern with regard to the application of existing research to culturally diverse populations, and the availability of adequate research on the effectiveness of culturally specific practices and programs.

Regardless of where individual community psychiatrists position themselves in this debate, the AACP has been able to establish a consensus on some key principles related to how EBPs should be utilized and implemented in public behavioral health systems. The current document is a position statement reflecting this consensus. This document is intended to provide both a focal point for AACP members to participate in state and local discussions about the implementation of EBPs, and strategies to improve quality of behavioral health care in general. It is also a vehicle with which AACP can make specific recommendations to SAMHSA and other behavioral health administrations about how EBPs are most likely to have a beneficial impact on the quality of care and the quality of system performance overall.

Six Principles

  1. Quality of care is the ultimate goal. AACP supports all efforts to improve the quality of behavioral health care delivered by public or private sector systems to the populations those systems are responsible to serve. Within this overarching goal of quality improvement, we view utilization of EBPs as a very important strategy, but also as just one of several strategies that must all be employed by behavioral health systems in order to improve overall quality.
  2. Continuous Quality Improvement within a comprehensive quality management systems approach is the essential process for achieving quality of care as a system outcome. AACP recognizes that the process of improving quality of behavioral health services and outcomes involves utilization of strategies of Continuous Quality Improvement that engage all levels of the system in a flexible partnership to identify, implement, and support the most effective approaches for the particular populations and settings in that system.
      1. At the program or intervention level, we support strategies that incentivize movement toward “fidelity” yet also encourage meaningful examination of how research-derived fidelity measures can be adapted locally. These fidelity measures should themselves be evaluated in order to expand knowledge about the application of the practice in question in diverse settings.
      2. At the clinician level, we recognize that workforce “readiness” and existing competencies are critical elements in EBP implementation. These require appropriate investment in a continuing process of formal training, ongoing supervision, practice improvement, competency development, and opportunities for input into the application of the practice as EBP implementation proceeds.
  3. Recovery-oriented, person-centered services are a core feature of system development. AACP recognizes that scientific research and treatment is only one element of the equation that promotes successful outcomes for individuals and families with mental health and substance use disorders. As noted by the Institute of Medicine (Strengthening the Evidence Base and Quality Improvement Infrastructure, in Improving the Quality of Health Care for Mental and Substance-Use Conditions: Quality Chasm Series. Committee on Crossing the Quality Chasm: Adaptation to Mental Health and Addictive Disorders. Washington, DC, 2006, pp. 140-209), EBP is the integration and application of the best research evidence with clinical expertise and client values. Even in the context of psychopharmacology, let alone other types of rehabilitative intervention, issues of recovery and hope, professional and peer support, culturally appropriate care, and self-directed care all must be factored into information that is derived from the scientific literature in order to maximize adherence, personal investment, and autonomy in the recovery process. This is further supported by the recent movement toward “community based participatory research”, in which interventions are evaluated in real world contexts with direct participation of the recipients of the interventions in both quantitative and qualitative evaluations of outcome.
  4. Creativity and flexibility are essential elements of the process. AACP believes that all efforts to improve quality of care in behavioral health should be designed to encourage, rather than stifle, creativity on the part of programs and practitioners. Any strategy that devalues or discourages innovative “promising practices” or “emerging practices” that have not yet had the opportunity to be fully researched would be counterproductive to the application of creative behavioral health techniques to highly diverse populations in highly diverse settings. Research strategies in complex behavioral health systems should be cognizant of the new field of “complexity science”, which is developing methodologies to describe and evaluate real world phenomena that are innately interactive and complex, in which the interactions themselves contribute to the “emergence” of new strategies and approaches.
  5. Evaluation using meaningful system data to measure quality and outcomes at the system level is a critical feature of EBP implementation in the context of systemic continuous quality improvement. AACP strongly values the use of objective data and evaluation to support quality improvement efforts. However, we also recognize that data and evaluation generated in research settings are only a portion of the data that are necessary for improving quality in real world systems. It cannot simply be assumed that “more” EBP in a system is directly tied to better quality of care overall. AACP therefore strongly encourages formal evaluation of the impact of a range of system improvement strategies (including EBPs) on measurable indicators of system quality and clinical outcomes.
  6. All efforts to implement EBP must reflect the context of the existing service system, the population it serves, and the need to use scarce resources efficiently. AACP recognizes that all behavioral health systems operate within limited financial and human resources. Formally recognized EBPs constitute only a small percentage of all the possible clinical activities within any system. AACP supports strategies that encourage systems to be creative in developing the capacity to engage in quality improvement activities that maximize the efficiency and effectiveness of scarce resources, providing scientifically sound interventions to the widest possible populations while achieving the best outcomes. AACP does not support strategies that may encourage investment of scarce resources in certain highly expensive practices for narrow subpopulations without regard to the larger system impact and the need to maximize the effectiveness of resource utilization for the population as a whole.


  1. Quality of care is the ultimate goal. The Institute of Medicine has identified five components of health professional education that must be incorporated into the design of quality-based systems to address the “quality chasm”. These five components are: person-directed care, interdisciplinary teamwork, utilization of evidence-based practice, quality improvement, and informatics (use of information systems). This approach recognizes that the EBP initiative in general medicine was not intended to apply universally, but would encourage practitioners, within a larger framework for quality service delivery, to incorporate utilization of EBPs when they are available and appropriate, while acknowledging that most activity in health care does not have a specific evidence base. Within this framework, AACP strongly supports the utilization of any available scientific evidence in a continuing quest to improve service delivery and clinical outcomes.
  2. Continuous Quality Improvement is an essential process. The more successful efforts to achieve widespread implementation of EBP (such as in Oregon) emphasize that translating “science” into real world service systems must occur while recognizing that this translation is a complex interactive improvement process that has not been well studied in behavioral health systems. Consequently, systems that make an overall commitment to measurement-based comprehensive quality management are more likely to be successful. Further, evidence-based interventions must be flexible and applicable in any setting, especially among culturally diverse populations. This precludes reliance on overly circumscribed research-based program models that suffer from excessively rigid definitions of “adherence” or “fidelity.” Significant issues in achieving success in this regard relate to the variability of efficacy in real world settings; the role of core clinical practices such as empathic relationships (and the related need to balance “science” with personal recovery, autonomy, and choice for consumers), accurate and comprehensive assessment, and interdisciplinary teamwork; and the complex process of engaging clinicians at various levels of readiness to change in ongoing workforce development. Successful systems and programs recognize that this process is needed to engage consumers, families, clinicians, and managers in a participatory partnership that balances scientific research with real world application through a continuous feedback loop that is regularly evaluated and modified to achieve successively better results. Although the critical importance of this approach has been well described in conventional management science literature, and is consistent with the emerging field of complexity science, it has been surprisingly under-utilized in behavioral health service delivery.
  3. Recovery-oriented, person-centered services are a core feature of system development. There is a growing awareness of the importance of recovery-oriented, self-directed care as a critical element of system design. This is a feature of large Federal initiatives such as Access to Recovery, and is a key recommendation of the New Freedom Commission Report. The balance between consumer involvement and strict scientific method is an essential feature of true quality improvement. This is most clearly recognizable in evaluation of medication algorithms, where incorporation of components like peer education and support contributed to the variance in outcome as significantly as the prescription of particular medications based on the “scientific algorithm” (REF). In addition, Steve Morgan and Scott Miller (Miller SD, Duncan BL, & Hubble MA. Outcome Informed Clinical Work, in Norcross JA and Goldfried M, eds. Handbook of Psychotherapy Integration, 2nd Edition. New York, Oxford University Press, 2005, pp. 84-104) have been advancing a concept termed “practice-based evidence”, which essentially asserts that “data” generated by individual consumers or clients in a clinical context can and should be utilized, along with research data and other information about successful treatment, to inform the process of treatment planning and service delivery. This is fully consistent with community based participatory research, in which the data generated by service recipients are a formal component of the research process of evaluating the impact of the intervention on the cohort as a whole.
  4. Creativity and flexibility are essential elements of the process. The current focus on formal identification of “official” EBPs and delineation of hierarchies of evidence to support such identification (as in the recent NREPP proposal for identification of which practices are truly “evidence-based”) has some definite value. However, current application of this methodology can have the unintended effect of stifling the creativity and flexibility in the field that allow innovations to be rapidly adapted and piloted in order to improve care. For example, it is now common in discussions of EBP implementation to discuss “promising practices” or “emerging practices” as if they were inherently inferior, and therefore not worthy of the same level of attention and support. This type of approach fails to recognize that the field must go beyond the limitations of what can be researched in order to seek wider application and more creative adaptations with diverse populations in diverse settings, AND that this type of innovation should be strongly encouraged. In addition, the very nature of research on “programs” inherently stifles flexibility and creativity, because of the need to create a scientifically rigorous research design. For example, even if a certain program has been evaluated to be “evidence-based” as a whole, individual interventions within the program may have demonstrable value on their own, even though they have not been specifically and separately studied outside the context of the program model. One obvious example is that “integrated treatment” of co-occurring disorders (IDDT Toolkit) (Center for Mental Health Services (2003, draft version). Co-occurring disorders: Integrated dual disorders treatment implementation resource kit. Retrieved April 11, 2006) is presented as a package, without acknowledging that specific components that are applicable for single disorders of either type (e.g., motivational enhancement, skill building groups, appropriately matched psychopharmacology) could be defined as reasonably evidence supported interventions in any setting, even if that setting is not providing all elements of the researched program model.
  5. Evaluation of EBP implementation in the context of comprehensive quality management and quality improvement using system data is a critical feature. The recent focus on detailed delineation of scientific evidence for particular interventions or programs by researchers contrasts sharply with the relatively meager emphasis on the need to develop data to evaluate the system process of quality improvement, and the role of any practice (evidence-based or otherwise) in contributing to better clinical outcomes. There is concern that the scientific research community has relatively little experience in the evaluation of complex system quality improvement processes. Further, data on system improvement are only just now beginning to be collected, through projects like the Co-occurring Disorder State Infrastructure Grants, Children’s System of Care Grants, and Transformation State Infrastructure Grants. The inherent assumptions in many discussions of EBP implementation, that “success” is measured by “fidelity scores” and that increasing “fidelity scores” will necessarily be associated with better clinical outcomes in a system, have not actually been sufficiently evaluated. Any strategies for system improvement utilizing EBPs that do not provide for evaluation of the process of improvement, and incorporate the capacity to evaluate the success of various local adaptation approaches, will probably not do justice to the opportunity for the field to develop an increasingly “evidence-based” system improvement technology for real world application.
  6. Evaluation of system improvement, including EBP implementation, must occur within a real system context. As community psychiatrists, we recognize the fundamental challenge of providing services within the context of limited resources to real populations. The concept of “community engaged scholarship” (Morgan S., Miller S.) has emerged as a way of acknowledging the critical need to prioritize the formal linkage of research with the experience of learning and practicing in community systems and settings. Evaluation of EBP must facilitate understanding of how to work within system constraints. A commonly cited example is the implementation of Assertive Community Treatment in rural settings without adequate populations to support a conventional team, and the consequent use of “hybrid” models that do not have the same level of research support (Santos AB, Deci PA, Lachance KR, Dias JK, Sloop TB, Hiers TG, Bevilacqua JJ. Providing assertive community treatment for severely mentally ill patients in a rural area. Hospital & Community Psychiatry. 1993;44:34-9). Further elaboration of the adaptation of research into real systems involves active consideration of key questions like: within the context of limited funding and training resources, what is the best balance between investing in full implementation of a particular specialized but highly costly “EBP” (like multisystemic therapy) for a small number of clients, versus a broader application utilizing adaptation of MST strategies into existing community based wraparound services for a larger population (Henggeler SW, Schoenwald SK, Borduin CM, Rowland MD, & Cunningham PB (1998). Multisystemic treatment of antisocial behavior in children and adolescents. New York: Guilford Press)? If such discussions are not encouraged, and formally recognized, the ability of creative system developers to use existing research to have maximal impact may be severely compromised.

Recommendations to SAMHSA, state and local funders, and system developers and planners:

  1. Using the latest advances in scientific research in behavioral health care, including implementation of specifically defined evidence-based practices and programs, is a critical element of improving quality and outcomes in behavioral health service systems. SAMHSA, state, and local behavioral health authorities can play a vital role in this process, in five key areas:
    1. Supporting research in both:
      • expansion of knowledge and application of clinical intervention strategies in behavioral health populations, and
      • expansion of knowledge about systems approaches to implementation of innovative practices to improve quality.
    2. Collecting and disseminating information to the field about the latest advances in science, in BOTH treatment research and in system application and implementation research.
    3. Developing mechanisms to work in partnership with key intermediaries (SAMHSA with states; states with counties, regions, etc.) to initiate quality improvement processes to expand the capacity of systems at all levels in order to incorporate innovative approaches into routine practice.
    4. Supporting consistent evaluation of the impact of evidence-based practice implementation on quality and outcomes in the system as a whole, to make sure that the allocation of resources for these efforts has demonstrable value.
    5. Recognizing that the process of implementation requires appropriate resources dedicated to organizing quality improvement at the system level, providing technical assistance and consultation at the program level, and supporting workforce development strategies for clinicians.
  2. In this regard, funding and other initiatives to encourage implementation of EBP should only occur when broad system quality management efforts to improve overall quality of care and outcomes are supported. These efforts should recognize the equal importance of investment in recovery-oriented, person-centered care, interdisciplinary teamwork, quality improvement, and informatics.
  3. SAMHSA’s efforts to catalogue EBPs through the NREPP process may be useful to inform the public of existing science, but are not sufficient as a mechanism to guide the field in the selection and implementation of EBPs. Rather, SAMHSA should develop a culture of partnership with state entities to encourage a wide-ranging discussion of the current and future role of evidence-based practices in state and local efforts to improve the quality and efficiency of service delivery. Within this partnership, SAMHSA and the states should identify a relatively SMALL list of evidence-based interventions or programs that every system should consider implementing because they are fundamental contributors to overall quality of care. This list should emphasize inclusion of very basic approaches (e.g., high quality recovery-oriented assessment and person-centered treatment planning, routine screening for comorbidity, opiate maintenance treatment, etc.), as well as culturally specific approaches, and ensure adequate attention to both child and adult populations. This would assist systems to shift from the idea that “all practice should be EBP” to the idea that “there are particular high priority EBPs that we should be sure to implement over time in the context of overall quality improvement”. Once this list has been generated, a wide range of additional “promising practice” and “emerging practice” interventions and programs should be identified as well, with a description of the current level and limitations of available evidence and consensus support for each practice, with the goal of explicitly encouraging complex and participatory experimentation with adoption and further evaluation of these approaches with a wide range of populations in diverse settings.
  4. Any support for implementation of EBP should REQUIRE the use of organized system change strategies such as comprehensive quality management and quality improvement that necessitate the participation of consumers, providers, and other stakeholders.
  5. System quality improvement efforts involving EBP should incorporate and encourage development of the capacity to evaluate all of the following:
    1. the process of implementation and adaptation of the EBP to the local environment, including the pragmatics of dissemination of practice improvement strategies within systems that have scarce resources, and
    2. the impact on overall system quality of care and clinical outcomes. Federal, state, and local systems should develop mechanisms for collecting information generated by these evaluations and make such data available to any systems wishing to embark on the process of EBP implementation.
  6. Creativity and flexibility should be encouraged by the following specific strategies at the federal, state, and local levels:
    1. Any listing of evidence-based and/or promising and emerging practices should incorporate areas for further exploration in implementation (which have not yet been researched) as opposed to just fidelity replication.
    2. There should be encouragement to describe promising and emerging practices, or creative adaptations of evidence-based or other formal program models in real world settings, even with only initial evidence of success, as a prelude for further investigation and implementation.
    3. There should be continued efforts and formal structures to develop quality improvement partnerships in which system intermediaries (states, counties) can provide organized feedback regarding their efforts to guide the adaptation of scientific advances to real world systems in the context of quality improvement.