Overview

The field of social work has called for the use of evidence-based practice; however, the concept of evidence-based practice is not necessarily embraced by all in social work. Proponents of evidence-based practice in social work maintain that using empirically tested interventions is fundamental to building a core knowledge base in social work. Critics of evidence-based practice in social work have questioned what constitutes "best practices" and who identifies "best evidence." These questions continue today, fueling controversy and debate. Evidence-based practice is an ongoing, self-correcting process, constantly improving and evolving in response to the changing needs of clients and the environment. Regardless of the challenges, social workers will benefit from an understanding of how to conduct evidence-based research and incorporate the results into effective practice.

Education Category: Management
Release Date: 10/01/2023
Expiration Date: 09/30/2026

Table of Contents

Audience

This course is designed for social workers in all settings with an interest in evidence-based practice.

Accreditations & Approvals

As a Jointly Accredited Organization, NetCE is approved to offer social work continuing education by the Association of Social Work Boards (ASWB) Approved Continuing Education (ACE) program. Organizations, not individual courses, are approved under this program. Regulatory boards are the final authority on courses accepted for continuing education credit. NetCE is accredited by the International Accreditors for Continuing Education and Training (IACET). NetCE complies with the ANSI/IACET Standard, which is recognized internationally as a standard of excellence in instructional practices. As a result of this accreditation, NetCE is authorized to issue the IACET CEU. NetCE is recognized by the New York State Education Department's State Board for Social Work as an approved provider of continuing education for licensed social workers #SW-0033. This course is considered self-study, as defined by the New York State Board for Social Work. Materials that are included in this course may include interventions and modalities that are beyond the authorized practice of licensed master social work and licensed clinical social work in New York. As a licensed professional, you are responsible for reviewing the scope of practice, including activities that are defined in law as beyond the boundaries of practice for an LMSW and LCSW. A licensee who practices beyond the authorized scope of practice could be charged with unprofessional conduct under the Education Law and Regents Rules.

Designations of Credit

Social workers participating in this intermediate to advanced course will receive 3 Clinical continuing education clock hours. NetCE is authorized by IACET to offer 0.3 CEU(s) for this program.

Individual State Behavioral Health Approvals

In addition to states that accept ASWB, NetCE is approved as a provider of continuing education by the following state boards: Alabama State Board of Social Work Examiners, Provider #0515; Florida Board of Clinical Social Work, Marriage and Family Therapy and Mental Health Counseling, CE Broker Provider #50-2405; Illinois Division of Professional Regulation for Social Workers, License #159.001094; Illinois Division of Professional Regulation for Licensed Professional and Clinical Counselors, License #197.000185; Illinois Division of Professional Regulation for Marriage and Family Therapists, License #168.000190;

Course Objective

The purpose of this course is to increase the knowledge base of social workers and other allied mental health professionals so that they may incorporate the tenets of evidence-based practice into their own work with clients.

Learning Objectives

Upon completion of this course, you should be able to:

  1. Define evidence-based practice.
  2. Describe the historical trends of evidence-based practice.
  3. Discuss what constitutes "best evidence" for evidence-based practice.
  4. Identify the arguments for and against the inclusion of evidence-based practice in social work.
  5. Describe the barriers to and predictive factors for implementing evidence-based practice in social work.
  6. Discuss how evidence-based practice is used ethically in social work with racially and ethnically diverse populations and communities.
  7. Describe practical tips to implement evidence-based practice into day-to-day work.

Faculty

Alice Yick Flanagan, PhD, MSW, received her Master’s in Social Work from Columbia University, School of Social Work. She has clinical experience in mental health in correctional settings, psychiatric hospitals, and community health centers. In 1997, she received her PhD from UCLA, School of Public Policy and Social Research. Dr. Yick Flanagan completed a year-long post-doctoral fellowship at Hunter College, School of Social Work in 1999. In that year, she taught the course Research Methods and Violence Against Women to master's degree students and conducted qualitative research studies on death and dying in Chinese American families.

Previously a faculty member at Capella University and Northcentral University, Dr. Yick Flanagan is currently a contributing faculty member at Walden University, School of Social Work, and a dissertation chair at Grand Canyon University, College of Doctoral Studies, working with Industrial Organizational Psychology doctoral students. She also serves as a consultant/subject matter expert for the New York City Board of Education and for publishing companies in online curriculum development, including developing practice MCAT questions in psychology and sociology. Her research focuses on culture and mental health in ethnic minority communities.

Faculty Disclosure

Contributing faculty, Alice Yick Flanagan, PhD, MSW, has disclosed no relevant financial relationship with any product manufacturer or service provider mentioned.

Director of Development and Academic Affairs

Sarah Campbell

Director Disclosure Statement

The Director of Development and Academic Affairs has disclosed no relevant financial relationship with any product manufacturer or service provider mentioned.

About the Sponsor

The purpose of NetCE is to provide challenging curricula to assist healthcare professionals to raise their levels of expertise while fulfilling their continuing education requirements, thereby improving the quality of healthcare.

Our contributing faculty members have taken care to ensure that the information and recommendations are accurate and compatible with the standards generally accepted at the time of publication. The publisher disclaims any liability, loss or damage incurred as a consequence, directly or indirectly, of the use and application of any of the contents. Participants are cautioned about the potential risk of using limited knowledge when integrating new techniques into practice.

Disclosure Statement

It is the policy of NetCE not to accept commercial support. Furthermore, commercial interests are prohibited from distributing or providing access to this activity to learners.

Technical Requirements

Supported browsers for Windows include Microsoft Internet Explorer 9.0 and up, Mozilla Firefox 3.0 and up, Opera 9.0 and up, and Google Chrome. Supported browsers for Macintosh include Safari, Mozilla Firefox 3.0 and up, Opera 9.0 and up, and Google Chrome. Other operating systems and browsers that include complete implementations of ECMAScript edition 3 and CSS 2.0 may work, but are not supported. Supported browsers must utilize the TLS encryption protocol v1.1 or v1.2 in order to connect to pages that require a secured HTTPS connection. TLS v1.0 is not supported.

Implicit Bias in Health Care

The impact of implicit biases on healthcare outcomes has become a concern, as there is some evidence that implicit biases contribute to health disparities, professionals' attitudes toward and interactions with patients, quality of care, diagnoses, and treatment decisions. This may produce differences in help-seeking, diagnoses, and ultimately treatments and interventions. Implicit biases may also unwittingly produce professional behaviors, attitudes, and interactions that reduce patients' trust and comfort with their provider, leading to earlier termination of visits and/or reduced adherence and follow-up. Disadvantaged groups are marginalized in the healthcare system and vulnerable on multiple levels; health professionals' implicit biases can further exacerbate these existing disadvantages.

Interventions or strategies designed to reduce implicit bias may be categorized as change-based or control-based. Change-based interventions focus on reducing or changing cognitive associations underlying implicit biases. These interventions might include challenging stereotypes. Conversely, control-based interventions involve reducing the effects of the implicit bias on the individual's behaviors. These strategies include increasing awareness of biased thoughts and responses. The two types of interventions are not mutually exclusive and may be used synergistically.

#71483: Evidence-Based Practice in Social Work

INTRODUCTION

"Evidence-based practice" is not necessarily a new or recent concept. In essence, evidence-based practice involves using the most current scientific evidence to support the decisions made for interventions and client care [1]. It has its roots in evidence-based medicine, and it can be traced back to World War II [2]. Today, it has been adopted in other fields such as counseling, mental health, criminal justice, nursing, and education [47]. Many assert that evidence-based practice is particularly relevant in an economic climate with decreased funding for agencies in which the question of whether the program, service, and/or intervention is effective is at the heart of approved reimbursement. Stakeholders (e.g., clients, insurers) want to feel confident that interventions have been demonstrated to achieve optimal success [48]. Furthermore, there is greater discussion of transparency and accountability today, not only in social work but in all fields. Regardless of whether social work practitioners view the collection of empirical data as a vital part of the occupation, social workers should be proficient in evaluating empirical literature in order to select the types of services and interventions most effective for their clients [3].

The social work profession has long advocated for evidence-based practice [4]. In an analysis of publications between 2006 and 2010 in PsycINFO, a major library database, twice as many citations used the key terms "social work" and "evidence-based practice" as used the terms "psychotherapy" and "evidence-based practice." This may be due to social work's longstanding struggle with its identity and its attempts to legitimize and locate itself within the sciences [48]. However, not everyone in the field of social work has embraced the concept. Proponents of evidence-based practice in social work maintain that empirically tested interventions are fundamental to building a core knowledge base in social work [1]. However, critics have questioned what constitutes "best evidence" and who gets to identify what encompasses "best evidence." The goal of this course is to provide an overview of the key definitions, historical evolution, and controversies of evidence-based social work practice, allowing professionals to be informed about the ongoing dialogue.

DEFINITIONS OF EVIDENCE-BASED PRACTICE

Evidence-based practice is defined as the "conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients" [5]. Various terms have been used to describe evidence-based practice, including evidence-based treatment, evidence-based intervention, and evidence-informed intervention. These terms are generally used interchangeably [6]. However, for the purpose of this course, the term "evidence-based practice" will be used.

Some have argued that certain fields, such as social work, counseling, clinical psychology, and therapy, are an art. When making day-to-day decisions and intervening with clients, practitioners tend to employ a combination of common practice wisdom, experiences, sense of self, the therapeutic relationship, professional ethics, and awareness of issues of transference and countertransference. Evidence-based practice should integrate three basic elements: best research evidence, clinical expertise, and patient values [49,73]. Evidence-based practice involves using the most up-to-date scientific knowledge, expert knowledge, and/or best research evidence to guide clinical practice, answer direct practice questions, and make the most appropriate clinical decisions [7,8,9]. It is important to note that evidence-based practice is a verb—it is something that is performed, not something that is produced [73]. Evidence-based practice can be viewed as a vehicle for understanding what constitutes valid knowledge as well as how such knowledge is produced [74]. Consequently, some practitioners maintain that evidence-based practice is in conflict with the values instilled in these professions. Consider that evidence-based practice assists in answering two important questions: How do social workers know the services they offer are ethical and competent? How do they know that they are providing the best available treatment or intervention, or that services are offered in a way that benefits clients [10]? All practitioners should be able to answer these questions.

STEPS IN EVIDENCE-BASED PRACTICE

There are five steps to practicing evidence-based social work [11]. First, a question must be posed around the practice area or need, and the question must be answerable. For example, "What can be done to solve homelessness?" is not an easily answerable question. Next, the best available evidence should be searched in order to find an answer to the question that has been posed. The evidence should be critically evaluated regarding its scientific validity and usefulness, using systematic reviews whenever possible [50]. The evidence is then evaluated and integrated with the practitioner's experiences and observations and the client's values and situation before being applied to the practice decision. Finally, it is vital to evaluate the outcomes of the decision, using single-case designs if possible. Some assert that both quantitative and qualitative studies should be used, depending on the question posed [51]. Ideally, these steps are followed with all clients—integrating evidence to make an informed intervention plan. For client problems that are not necessarily unique or that occur frequently (e.g., abuse), some experts recommend that evidence can be collected and analyzed in general terms, without being applied to a specific client [12]. For problems that are more unusual, practitioners should always search for available current evidence formulated by others [12]. These steps do not divorce the "person" from or devalue the practitioner or the client; the practitioner's values, biases, experiences, and worldviews and the client's strengths, cultural beliefs, religion/spirituality, and self-identified needs are still considered [73,75,76].

A HISTORICAL OVERVIEW OF EVIDENCE-BASED PRACTICE

As noted, the concept of evidence-based practice is not necessarily a new one. Even dating back to ancient times, there are some indications of attempts to make decisions regarding medical interventions based on prior testing. In the 1850s, Florence Nightingale's efforts to sanitize hospital conditions were based on evidence-based practice steps—Nightingale identified the problem and critically evaluated and appraised the evidence [20,52]. Evidence-based practice as it is known today can be traced back to World War II [2]. In the 1960s, there was a movement in the public service sector to evaluate welfare programs and interventions for efficacy, referred to as the New Public Management Model [77]. However, it was not until 1972, and the work of Dr. Cochrane, then director of the Medical Research Council Epidemiology Research Unit in Cardiff and involved in an evaluation research project to assess a governmental agency, that the concept of evidence-based practice began to gain notice [2]. He also published a book titled Effectiveness and Efficiency: Random Reflections on Health Services. These efforts laid the foundation for evidence-based practice in a variety of disciplines today. The underlying goal was to advocate for interventions supported by scientific evidence, placing medicine on a more secure scientific foundation [78].

Also in the 1970s, Dr. David Sackett and his colleagues wrote manuscripts (published in the Canadian Medical Association Journal) about how to critically appraise clinical research, and from these manuscripts, the term "critical appraisal" began to be widely applied to evidence-based medicine. In 1992, the Evidence-Based Medicine Working Group wrote an article about the role of teaching evidence-based medicine in the Journal of the American Medical Association [20]. In 2001, evidence-based medicine was selected by The New York Times as an idea of the year [79].

With the proliferation of digital access to clinical information, journals, and abstracts beginning in the 1990s, evidence-based practice became more practical, with practitioners more easily able to access literature in a variety of settings [20]. Repositories of critical appraisal of evidence emerged, notably the Cochrane Collaboration [53]. Today, the Cochrane Collaboration in the United Kingdom and the Rand Corporation in the United States promote evidence-based medicine, advocating for using experimental studies, meta-analysis, and systematic reviews to determine effectiveness of interventions [21,54]. In the United States, there is ongoing debate about evidence-based practice and the efforts of researchers and scholars are often fragmented or sub-optimally organized [22]. However, much work has been done in the field of evidence-based mental health practice. There are organizations, such as the Human Services Research Institute, that promote studies on mental health issues and interventions and offer practitioners toolkits to help measure outcomes. The Center for Quality Assessment and Improvement in Mental Health has developed a searchable database to help practitioners locate the validated instruments and measures needed to conduct assessments in mental health. In addition, the Agency for Healthcare Research and Quality has developed principles for evaluating the effectiveness of interventions and publishes reviews of related evidence [22]. Today, this concept is met with both praise and criticism, fueled by sociopolitical and economic forces [79,80].

EVIDENCE-BASED PRACTICE IN SOCIAL WORK

Social work can trace the roots of evidence-based practice to Mary Richmond, who advocated for systematically collecting evidence before arriving at a diagnosis in her seminal 1917 work Social Diagnosis [23]. In the early 20th century, Jane Addams, a famous social worker known for her role in the settlement house movement and her work with immigrants and the poor, also advocated for obtaining systematic data and evidence for community work [24].

In 1915, Abraham Flexner questioned whether social work was a legitimate profession. He argued that one of the criteria of a profession is that it draws upon scientific knowledge, which he stated social work did not [25]. In the 1970s, these questions regarding the legitimacy of social work as a profession were revisited, as some viewed social work as more of an art than a science. This led to the question of whether social workers should become familiar with and conduct empirically based practice [26,81].

In 1988, the National Institute of Mental Health created a task force to examine social work research and social work faculty's level of involvement in social work research [24]. This Task Force on Social Work Research published a report recommending that seven Social Work Research Development Centers be housed in social work programs, with each center testing intervention models and approaches for different population groups [24].

In the 1990s, the term "evidence-based practice" was applied in the social work literature by Eileen Gambrill [27,28]. Since then, social workers have grappled with implementing evidence-based practice. In 2006, the Austin Initiative was developed with the goal of holding continual symposium meetings in order to advance the teaching of evidence-based practice in social work curricula [26]. In 2007, leaders in social work were invited by the National Institute of Mental Health to identify best practices in implementing evidence-based practice and discuss key exemplars in the field [82]. In 2015, the Council on Social Work Education revised its accrediting standards for educational programs with very clear statements about how social workers should understand, use, and implement research evidence to guide social work practice, policy, and service delivery [55]. However, today, there remains much confusion and misconception about evidence-based practice in social work. Some use the term to mean identifying research-supported interventions, while others use it to mean a five-step decision-making process [83].

THE DEBATE ABOUT EVIDENCE-BASED SOCIAL WORK PRACTICE

Debates about evidence-based practice in social work abound. First, there is a question of the context in which it should be employed. For example, should evidence-based practice be used in a micro (with individuals) or macro (with organizations) context [1]? The terms used are also a source of controversy; debates over the semantics of "interventions" versus "treatment" and "client" versus "client systems" have emerged. The term "evidence" alone has raised questions regarding ownership of data and expertise. In evidence-based practice, evidence is defined as research findings of various quality (levels of evidence) according to the study structure and statistical weight of the resultant data [13]. Some experts differentiate between hard and soft data. Hard data refers to quantifiable evidence that is supported by research studies that follow the empirical process, while soft data refers to qualitative studies and anecdotes from clients' experiences; observations from interviews with clients, service providers, and other stakeholders; and formal and informal observations [14]. It is also unclear how practitioners should act if evidence does not exist or if there is conflicting evidence regarding a particular intervention [13].

Some have argued that the term "evidence-informed practice" is more suitable than "evidence-based practice," as it focuses on the practitioner's actions being informed rather than being based solely on the evidence [15]. However, the process of evidence-informed practice is similar to evidence-based practice, making the distinction primarily semantic [15].

The term "empirically supported intervention" is also used in the social work literature. However, empirically supported interventions utilize even more rigid criteria than evidence-based practice. For example, an evidence-based practitioner can use literature on systematic reviews to understand the effectiveness of an intervention or treatment, but in order for an intervention to be considered empirically supported, it must be based on evaluation of randomized clinical trial study data by at least two independent investigators [16]. Unlike psychology, social work does not tend to use descriptors such as strong, modest, or controversial to differentiate the evidence [56]. The empirically supported interventions paradigm has been advocated by the National Association of Social Workers, the National Association of Public Child Welfare Administrators, the National Institutes of Health, the Joint Commission, and the Council on Accreditation [16]. However, empirically supported interventions are still not well-distinguished in the literature [56]. Other terms are used, including research-supported, empirically supported, evidence-driven, or evidence-guided. However, the emergence of these new terminologies has introduced more confusion [51]. Further, some consider manualized and other standardized interventions to be examples of evidence-based practice, but standardization does not necessary equate with evidence-based practice [80].

As of 2023, the profession of social work has not reached a consensus on evidence-based practice. In a national study involving 973 social work faculty members, there was a lack of agreement regarding the definition and necessary components of evidence-based practice. Even though a large majority (90%) believed that experimental studies with random assignment and control groups were necessary for evidence-based practice, 40% also indicated that studies that were not experiments and did not involve a control group could be sufficient for evidence-based practice [17]. Regardless of the terms used, when social workers rely on knowledge based on scientific evidence, common practice wisdom may be challenged. Gilgun asserts that [18]:

When social workers are steeped in relevant research, we then have to hold this knowledge lightly and be willing to modify our knowledge in response to clients. If we do this, then we will base practice on evidence from two directions—from what clients communicate to us in subtle and forthright ways and from what we know from multiple other sources.

This supports social work practitioners' role as lifelong learners [19].

As noted, a variety of pro and con arguments have been made regarding the incorporation of evidence-based principles into social work practice. These controversies are likely to continue, and it is helpful to have a firm grasp of both sides of the argument when considering their impact on the social work profession.

ARGUMENTS AGAINST THE INCORPORATION OF EVIDENCE-BASED PRACTICE

Differences Between Social Work and Medicine

As evidence-based practice was originally derived from medicine, some argue that differences between the two professional fields mean that the practice cannot be applied to social work [26,79]. Often, medical interventions can be broken into distinct steps or rules to be followed for all or most patients [79]. However, social work takes into account all social, biologic, psychologic, cultural, and institutional factors, making the development of a set of universal steps for interventions much more difficult [26,52]. In social work, with its emphasis on working with those who are marginalized and vulnerable, the rules-based characteristics of evidence-based practice may appear at odds with core social work values [80]. Medicine is also highly positivistic (i.e., focusing on measurability), while social work is interpretivistic/constructivistic, relying on reflexivity and the social context [29,52]. This leads to unpredictability and difficulty in quantifying social work experiences.

Limitations of Empirical Literature

Opponents argue that the empirical knowledge base in social work is narrow and limited—meaning empirical information is not available for every clinical issue and every client population. It is important to note that if practitioners are unable to locate literature that meets evidence-based criteria, this does not necessarily mean that an intervention is "bad" [26]. The reality is that research topics are influenced by a host of factors, including funding sources and priorities, perception of what is "important," and other hidden social values [26].

Assigning of Value or Privilege

Some assert that evidence-based practice privileges certain knowledge—specifically, knowledge that is quantifiable and measurable [30]. Studies that use experimental designs featuring randomization and control and experimental groups are valued in evidence-based practice [79,84]. However, other types of quantitative and qualitative methods have value as well and can provide insight into clients' realities [26,30]. Social work as a field deals with diverse social problems, and change in clients' lives frequently occurs gradually and in small ways that cannot always be captured using the experimental designs favored by evidence-based practice [30]. Statistical significance in research cannot be equated with practice significance, nor does it take into account context, with the complexities and subtle nuances of clients' lived experience [79,84]. Furthermore, the insights and opinions of social workers and other service providers are valuable [57].

Negation of Clinical Experiences

The importance of practitioners' clinical experiences and intuition in shaping practice decisions should not be ignored [20]. Opponents of evidence-based social work practice posit that the essence of the art of clinical work with clients (e.g., use of self, intuition, rapport building with clients, past experiences) is not captured in evidence-based practice. Frequently, top-down decision-making does not take into account the context and complexities of human problems. As a result, practitioners must use discretion, making it challenging to standardize clinical practice decision-making [85].

Threat to Client Autonomy

Some experts maintain that a focus on evidence-based outcomes could potentially threaten the client autonomy that social work values [30,56,81]. For example, evidence-based literature may demonstrate that a certain medication is effective, but a client may prefer to avoid pharmacologic interventions and employ alternative treatment. In this case, should the practitioner push for what the literature supports despite the client's wishes?

Practical Constraints

Opponents of evidence-based social work practice maintain that agencies often have limited resources (e.g., money, time, expertise) with which to promote evidence-based practice. In some cases, evidence-based practice interventions are inconsistent with the norms of a particular organization or employer. Social workers may find the status quo to be simpler and more aligned with daily operations [81]. This raises the issue of fidelity, or the degree to which agencies implement an evidence-based intervention or program and the degree to which it can be adapted to a specific setting [85].

Another practical impediment is the length of the research, publication, and dissemination process. Given how long the scientific process can take when it comes to data collection, evaluation, and publication of findings in peer-reviewed journals, the literature may be outdated even before public scrutiny and critiques of the findings are possible [31,80]. In addition, social workers do not appear to value and utilize such sources of knowledge. One study found that 22% of surveyed social workers had never read literature from peer-reviewed journals [79].

ARGUMENTS FOR THE INCLUSION OF EVIDENCE-BASED PRACTICE

Ethics and Values

Proponents of evidence-based social work argue that professionals are expected to practice good social work, which entails providing clients the best treatment possible [14]. In order to do so, social workers should amass and evaluate the available evidence before making informed decisions and formulating a client's intervention plan. In a climate that emphasizes efficiency, transparency, and accountability, social workers are also expected to demonstrate effectiveness to their funders [58]. The National Association of Social Workers Code of Ethics states [32]:

Social workers practice within their areas of competence and develop and enhance their professional expertise. Social workers continually strive to increase their professional knowledge and skills and to apply them in practice. Social workers should aspire to contribute to the knowledge base of the profession.

Although there may be challenges to implementing evidence-based practice, ethical codes demand that practitioners address these challenges rather than avoid them [19].

Client-Centered Care

As noted, one of the criticisms of evidence-based social work is that it is too mechanistic and procedural. In response to this, proponents claim that the steps in evidence-based practice are focused on each individual client. If followed correctly, the decisions that result from the process are deliberate and client-centered [19,86].

In addition, evidence-based practice does not violate client self-determination. All collected evidence is synthesized taking into account the client's background, characteristics, environment, support system resources, and preferences [19,86].

Valuing Clinical Expertise

Some experts object to the argument that evidence-based practice ignores the practitioner's vast clinical experience and background. One of the steps in evidence-based practice is the evaluation of evidence based on experience and the specific situation, which highlights the role of the practitioner's clinical background and intuition [19].

Objective Evaluations

Taking a broader definition of evidence-based practice and using a range of "hard" and "soft" data as best evidence, evidence-based clinical practice is advantageous because it moves away from client insight as a sign of progress. Instead, it focuses on observable behavioral change [28]. This behavioral change can, in turn, trigger clients' understanding of their emotional life [28].

Human Error

A mixed-methods study found that social workers relied on and valued knowledge that came from their work experience and from their colleagues and supervisors [80]. However, practitioners are not necessarily rational and may fail to use critical thinking skills and objectivity in their clinical decision-making. Consequently, the rigorous and systematic process of evaluating the "why" in professional practice inherent in evidence-based practice helps ensure that personal biases and human error are mitigated [73].

Improving Social Work Core Knowledge

Proponents of evidence-based practice in social work maintain that using empirically tested interventions is fundamental to building a core knowledge base in social work [1]. By expanding this core knowledge base, social work as a discipline can be more legitimate and "professional," ultimately increasing its credibility.

Educated Clients

In an increasingly technology-oriented society, consumers have greater access to information and are becoming more educated about the services they receive. Because of this, they expect service providers and professionals to be up to date on the latest developments in their field [51]. For example, 80% of adult Internet users (18 years of age and older) search online for information on a health topic each year [33]. Among individuals with diagnosed psychiatric disorders who use the Internet, 64.7% have proactively employed the Internet to find more health-related information [34]. Because social work clients are active consumers, they can be encouraged to use the Internet for research and to offer feedback on what they discover [9]. This becomes a vehicle for empowerment, improving communication and reducing the power disparities between practitioners and clients. However, this easy access to information increases the risk that clients will obtain erroneous information; it is the responsibility of social workers to educate clients to critically evaluate information [51].

Valuing All Research Methods

It is true that randomized experimental designs are preferred in evidence-based practice [19]. However, other research methods have value as well. Some "less rigorous" methods are more suitable to answering certain questions.

MEETING IN THE MIDDLE

Using the definition that evidence-based practice is the "integration of the best research evidence with clinical expertise and client values in making practice decisions," some experts advocate for an evidence-based practice model that bridges the viewpoints of proponents and opponents [1]. Best research evidence may refer to employing both applied (e.g., intervention and outcome research) and basic research studies [1]. Clients' values are based on their culture, upbringing, expectations, and environment, and these values are not discarded when using evidence-based practice. Instead, they should be at the forefront, shaping decisions made by both the client and social worker [1]. Evidence-based practice can be context sensitive [87]. It is not merely about the outcome or about what is perceived to work; rather, the question of what works should be extended to: "What works for whom and in what context?" The practitioner's clinical expertise is also part of this equation. Similarly, Gilgun suggests that [18]:

There are four cornerstones of evidence-based practice in social work: (1) what we know from research and theory; (2) what we and other professionals have learned from our clients, or practice wisdom, which also includes professional values; (3) what we, as social workers, have learned from personal experience; and (4) what clients bring to practice situations. All four come into play and mutually affect each other as we go about our daily work with clients. In sum, evidence-based practice promotes a high degree of practitioner reflection and mindfulness.

IMPLEMENTING EVIDENCE-BASED PRACTICE

BARRIERS

There are logistical barriers to social workers and practitioners fully implementing evidence-based practice. Some social service organizations do not have the technology, hardware, datasets, and information technology specialists to assist practitioners in accessing the information needed for evidence-based practice [35]. Implementing evidence-based practice well requires social workers to spend time learning how to use resources, search for literature, and develop research skills, but social workers and other practitioners often have heavy caseloads and are extremely busy seeing clients [26,59,60]. One study of licensed master social workers found that the perceived lack of time and the expenses associated with evidence-based practice were prominent barriers [83].

Some social workers have expressed that their agencies are not supportive of evidence-based practice [55]. Managers and administrative leadership may not be receptive to evidence-based practice because of the seeming rigidity of the process and the realities of daily organizational life [77]. There are also within-group differences in how managers and administrators perceive evidence-based practice. Some equate evidence-based practice with business procedural models, aligning it with quality or human resource management [88]. Others, on the other hand, contextualize evidence-based practice in terms of specific project operations; in these cases, definitions are more varied [88]. Misalignments of perspectives can affect overall attitudes toward evidence-based practice.

Social workers often do not have sufficient research skills to conduct outcome studies and evaluate interventions, and locating and appraising evidence-based literature requires additional training [35,59]. Some social workers are anxious about their ability to formulate answerable research questions and use quantitative and qualitative methods and data analysis [51,57]. In addition, many social workers view research and the empirical literature as irrelevant and impractical. The research produced from clinical studies is often university-based and viewed as outdated or not reflecting the problems of the real world [29,57,60]. Consequently, some social workers tend not to employ research literature and do not view mutual collaboration between practitioners and researchers favorably [29].

Social workers' fears may also influence their willingness to employ evidence-based models. Many anxieties involve the perception that evidence-based practice is a top-down, mechanistic, one-size-fits-all approach that devalues client autonomy and clinical experience and wisdom [29]. Furthermore, perceptions of limited self-efficacy and of insufficient resources for effectively implementing evidence-based practice are barriers [82]. However, if used correctly, evidence-based practice supports client and practitioner expertise.

FACTORS THAT PREDICT ADOPTION

Although there are many barriers, there are also factors that support implementation of evidence-based social work practice. These organizational or individual factors are key in a social worker's decision to adopt an evidence-based practice.

Not surprisingly, social workers who are willing to alter their interventions in light of new evidence are more open to implementing evidence-based practice [35]. Social workers' desire and readiness for change should be supported by colleagues, employers, and the profession as a whole. When staff perceive that an organization is committed to fostering change and when practitioners are encouraged to take on challenging tasks that will help them grow, use of evidence-based practice increases [35]. When there is an organizational culture of continuous quality improvement and collaboration between clinicians, researchers, and policymakers, adoption rates for evidence-based practice increase [60]. Practitioners who are given external positive incentives (e.g., monetary bonuses, extra vacation time) for learning new skills and interventions are more likely to adopt evidence-based practice [35]. Because time and energy are often constrained due to heavy caseloads, social workers may feel more motivated if the time they put into implementing a new intervention is rewarded in the incentive structures of an agency [35]. Organizations may also provide education on the principles of evidence-based practice. Practitioners who participate in education and training that focuses on specific skills, such as research, library database searches, and critically appraising intervention studies, are more likely to incorporate evidence-based models into their practice [35]. Studies indicate that time and resources are important facilitators of evidence-based practice; practitioners with high levels of self-efficacy and adequate time/resources are more likely to carry out evidence-based practice [61]. Social work educators and recent social work graduates tend to be most familiar with evidence-based practice [82]. This is largely due to the Council on Social Work Education's mandate on the inclusion of evidence-based practice in social work curricula. This speaks to the need to train and empower social workers to execute evidence-based practice.

PROMOTION OF EVIDENCE-BASED PRACTICE IN SOCIAL WORK

There are several steps social service organizations and social workers can take to support the incorporation of evidence-based research into practice [29]. Agencies and universities should make coordinated efforts to join resources. Pooling empirical articles (e.g., systematic reviews, meta-analyses, randomized, controlled trials) that document the effectiveness of various interventions in different populations is beneficial for researchers and practitioners alike. Empirical findings should be translated into practical social work applications, keeping in mind that social workers tend to have busy schedules and intense caseloads. Active collaboration and trust among all stakeholders is necessary in order to effectively pool resources [89].

Increased access to digital and web-based learning technologies such as smartphones is crucial to allow social workers to conduct point-of-care literature searches [52]. In order to effectively conduct literature searches, training is needed in how to effectively use databases, the Internet, and other resources to locate articles and instruments. Social work education should prepare students to overcome the logistical barriers of implementing evidence-based practice. In addition, more research is needed to understand social workers' decision-making processes when applying findings from evidence-based research to their work with clients.

ETHICAL ISSUES

It is important that social work practitioners and researchers explore the ethical implications of conducting evidence-based practice [36]. Specifically, this type of practice can involve the ethical principles of beneficence, self-determination, conflict of interest, and confidentiality [37,62,90]:

  • Beneficence: Beneficence refers to the duty to do good, and practitioners should consider the benefits of conducting evidence-based research. What specific benefits will the group being studied obtain? What are the risks, and are there any negative outcomes for vulnerable populations? What additional safeguards can be implemented to promote the welfare of the client(s) and mitigate risks?

  • Self-determination: Self-determination refers to the duty to maximize an individual's rights to make his/her own decisions. Clients' self-determination and autonomy should be protected. For example, when implementing an intervention, do the clients feel they have no choice but to comply or risk having services somehow negatively affected? What information is shared about best available research when planning interventions? To what extent do clients understand the information to make an informed decision? The informed-consent process should not discount clients' values and preferences.

  • Conflict of interest: Potential dual-role issues and conflicts of interest can arise when a social worker is both the practitioner working with the client and the researcher collecting evidence about his/her own practice and interventions. What will the social worker report to the agency if the findings about the intervention are negative? When a practitioner is evaluating his/her own practice, he/she must ask whether the client's interests are prioritized.

  • Confidentiality or anonymity: All social workers should implement adequate safeguards to promote clients' privacy, anonymity, and/or confidentiality.

  • Study design: The study design and data collection procedures should serve to answer the research question. If not, this raises the issue of inconveniencing the clients, which is not ethical. Do the study design and procedures ultimately serve to empower the clients?

EVIDENCE-BASED PRACTICE IN A MULTICULTURAL CONTEXT: WORKING WITH DIVERSE CLIENTS

Social workers and practitioners wrestle with the challenge of providing culturally competent services. The cultural fit or relevance of evidence-based interventions for minority groups is a source of continued debate, as studies have traditionally excluded minority groups or have sample sizes too small to make the findings meaningful [63,64]. In addition, definitions of wellness, mental health, and health might have different meanings for different groups [65,66]. The goal of cultural competence is to reduce the differences between the norms and belief systems of clients from diverse cultural groups and the institutional cultural norms of service delivery agents. Ultimately, this will mitigate the disparities that exist in the current mental health and healthcare systems [38].

Historically, there are four ways that intervention studies have dealt with the issue of diversity [39]. These four categories may be viewed as existing along a continuum, with one end of the continuum consisting of full involvement of the cultural group (the most culturally sensitive) and the other end not recognizing the importance of including racial and ethnic minorities in the sample (the least culturally sensitive). The categories are:

  • Full partnership: With this approach, researchers collaborate with the community and consider the diverse interests of the group when formulating a culturally sensitive and competent intervention, identifying the research question(s), collecting data, analyzing and interpreting data, and disseminating research findings.

  • Ad hoc involvement: In this category, the population of interest and the community are asked to provide input after the intervention has been implemented or after the research question has been posed. The targeted group may be viewed as an ad-hoc advisory group.

  • No involvement in the process: In some cases, researchers include racial and ethnic minorities in the sample, but they do not attempt to target the intervention to the specific group, nor do they attempt to gain insight into the group's perspective about the intervention.

  • No diversity represented in the sample: Some researchers do not actively recruit diverse racial and ethnic minorities in the study sample. If there is some diversity in the sample, there are no sufficient numbers to be able to conduct statistically valid analyses.

Another way to explore the issue of cultural competency is through the debate about emic and etic perspectives. The etic perspective maintains that, along important dimensions, all humans are basically similar. On the other hand, the emic perspective argues that it is vital for professionals to begin from the paradigm that unique cultural characteristics exist in various cultural groups. This emic orientation acknowledges individual differences within culturally different populations while simultaneously viewing clients/patients within the context of their primary cultural group [40]. It is believed that etic interventions can be used for all groups and if modification is necessary, it would be minimal. This is also referred to as universal psychotherapy, which argues that the mechanism for change is the same for all clients. According to this perspective, all individuals share common denominators (e.g., psychological qualities or attributes) [91]. In contrast, emic interventions focus on formulating interventions that reflect the group's characteristics and value systems, with the belief that these interventions will be more effective for the target group [41]. The emic perspective calls for racial/ethnic psychotherapies or cultural adaptation treatments, whereby interventions are specifically tailored to the specific group [91]. Either approach may be taken when conducting evidence-based practice. However, the emic perspective is considered culturally sensitive and ethically sound, and these interventions may have a greater likelihood of success.

When implementing evidence-based social work practice, it is important to consider how evidence-based practice translates when working with culturally and racially diverse clients and the potential controversies involved. A research study's ecologic validity may be in question if the intervention is only tested and shown effective with white, middle-class clients [42]. In other words, practitioners should determine if there is a gap between the client's ethnocultural experience and background and the attributes of the research population [42]. The litmus test may be whether there is some evidence that demonstrates the intervention's effectiveness with a specific racial or ethnic minority group and whether the community has come to own or accept the intervention to a certain degree [39].

Culturally adapting the intervention may be necessary. Cultural adaptation refers to a deliberative process of modifying an evidence-based intervention to take into account the culture of the group or client. The modification is collaborative, involving persons familiar with the cultural value systems of the targeted population [67]. Surface structure adaptations are "light" or "modest" in that minor attributes are modified to fit the client's preferences. An example of a surface structure adaptation is racial/ethnic matching of the client and therapist [86]. Deep structure modifications involve incorporating cultural values and explanatory models of illness into interventions and engaging with local indigenous healers and the client's networks [64,86]. One of the limitations of culturally adapting evidence-based interventions is that the core of the standardized intervention could be altered [64]. However, it may be that there are common active factors that help promote change that can be incorporated into standardized protocols [65].

The issue of culture and how it is measured is also at the heart of the debate. In the discussion of evidence-based practice, it is acknowledged that culture is dynamic, involving a group's shared values, beliefs, sociopolitical histories, and rituals [41]. It also interacts with a group's or client's sociocultural and institutional context, including socioeconomic status, health disparities, racism, and access to health and mental health services [41]. These points will all affect the effectiveness of a given intervention.

The following five-step process may be used to develop culturally relevant evidence-based practice [68]:

  1. Understand the general literature about the risk and protective factors for the identified problem.

  2. Analyze systematic reviews of culturally relevant risk and protective factors.

  3. Translate cultural information about specific risk and protective factors to the specific ethnocultural context.

  4. Develop culturally specific quantitative measures/instruments.

  5. Use information from steps 1 through 4 to formulate an evaluation study for culturally specific evidence-based practice.

Research methods and how evidence is gathered with diverse populations will also influence evidence-based practice discussions. Some scholars have advocated for an expansion of acceptable evidence-based research methods to include qualitative research methods [91]. When working with some ethnic populations (e.g., Native American groups), using the art of storytelling is a culturally congruent strategy of acquiring data. Evidence collection via digital storytelling (i.e., filming and audio recording) places the ownership in the hands of the community and its leaders, who often wish to perpetuate their history and traditions to the next generation [43]. The transcripts of these stories and the analyses of the resultant qualitative data could produce rich evidence.

The inclusion of certain qualitative research strategies would help to mitigate the concern many racial and ethnic minority communities have that researchers will enter into the community to impose their research agenda. Participatory action research, for example, promotes collaboration and assists in reducing the unequal and hierarchical structure of the researcher (expert) and the participant (research subject) [44]. This is extremely important when dealing with marginalized populations who have experienced a history of oppression and racism. Of course, there are limitations to using qualitative data (e.g., the risk of bias), but triangulating the data (i.e., obtaining multiple sources for verification) can minimize these risks. Overall, such methods are congruent with culturally sensitive practice and research [43,91]. To do this well, culturally adapted treatments require collaboration with local communities and opportunities for local healers, religious and spiritual leaders, and other cultural networks to offer input [91].

PRACTICAL TIPS FOR PRACTITIONERS

Formulating Research Questions

The first step in conducting evidence-based practice is to formulate a question that will drive the research [69]. This research question is a descriptive clinical inquiry about an intervention and its potential outcome(s) [60]. The question will dictate the evidence used to answer it and should be succinct, clear, and as specific as possible. One way to develop the question is to use the PICO formula: population, intervention, comparison, and outcome [45]. Some use PICOT, with the T representing the time period of data collection [69]. Using this approach, the first step is to pinpoint the client population. The more detailed one can be regarding the attributes or characteristics of the client population, the more helpful the question will be and the more applicable the research findings will be. If possible, practitioners should identify gender, socioeconomic status, racial/ethnic minority status, health conditions, religion, and other factors. Examples of strong population components include:

  • Male clients diagnosed with generalized anxiety disorder

  • Chinese immigrants who came to the United States in the last five years

  • Hispanic adolescents residing in single-parent households

Next, pinpoint the program, therapy, or treatment being considered. Again, it is vital to be as specific as possible when describing the intervention, including details such as the setting, frequency, type of modality, and provider [45]. For example:

  • Eight-week psychoeducational group

  • 15-minute diabetes prescreening

  • A set of three weekly social work home visits

Determining a comparison component is optional, but it can be useful when further evaluating the findings. The comparison group may be a control group, a group of patients on a waiting list, or a placebo group.

Finally, define the intended outcome of the intervention. This outcome provides the criteria by which success or effectiveness is measured [45]. Examples include:

  • Increased knowledge about triggers of anger

  • Decreased levels of depression as measured by the Center for Epidemiologic Studies Depression Scale

Ultimately, a good PICO question should be practically significant and ethical. In addition, the question should lead to data collection that will not harm research participants [69]. The Centre for Evidence-Based Medicine provides a good overview of developing focused PICO questions at https://www.cebm.ox.ac.uk/resources/ebm-tools/asking-focused-questions. In addition, the American Academy of Ambulatory Care Nursing has developed a template for PICOT questions, which is available at https://www.aaacn.org/sites/default/files/documents/misc-docs/1e_PICOT_Questions_template.pdf.
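
Because the PICO(T) elements are discrete, it can help to record them as a structured checklist before searching. The following minimal Python sketch is one way to capture the components and assemble them into a readable question; the class and field names are hypothetical and chosen purely for illustration.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PicoQuestion:
        """Hypothetical container for the parts of a PICO(T) question."""
        population: str                      # P: who the clients are
        intervention: str                    # I: the program, therapy, or treatment
        comparison: Optional[str] = None     # C: optional comparison condition
        outcome: str = ""                    # O: how success is measured
        time_frame: Optional[str] = None     # T: data collection period, if used

        def as_question(self) -> str:
            parts = [f"For {self.population}, does {self.intervention}"]
            if self.comparison:
                parts.append(f"compared with {self.comparison}")
            parts.append(f"lead to {self.outcome}")
            if self.time_frame:
                parts.append(f"over {self.time_frame}")
            return " ".join(parts) + "?"

    # Components drawn from the example question discussed in the next section
    question = PicoQuestion(
        population="mothers younger than 30 years of age with children younger than 12 years",
        intervention="an eight-week psychoeducational group about healthy eating",
        comparison="mothers on a waiting list",
        outcome="improved food choice decisions",
    )
    print(question.as_question())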

Search Techniques

A good way to begin searching databases is to identify key words. Key words may be gleaned from the PICO formula used to develop the question [45]. Alternatively, the entire question may be entered, but this will likely produce less focused results. Take, for example, the following question: How effective is an eight-week psychoeducational group about healthy eating in improving food choice decisions for young mothers (younger than 30 years of age) with young children (younger than 12 years of age) compared with the food choice decisions of mothers on a waiting list? Using this question, a social worker might employ search terms such as "food choice," "young adults," and "psychoeducation" or "psychoeducation and healthy eating." From there, synonyms may be generated for the terms identified. When an article is retrieved, the practitioner can glean other common terminologies, which can provide new ideas for locating additional literature [92]. Depending on how many results a given search returns, filters may then be applied, perhaps further limiting the search by methodology (e.g., clinical trial) or outcome (e.g., weight loss) [8,70,92]. If too many articles are found, decisions will need to be made about the criteria used for inclusion and exclusion (e.g., research design, publication date) [93].
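
As an illustration of how key words and filters might be combined programmatically, the short sketch below queries PubMed through the freely available NCBI E-utilities esearch service. The specific terms, field tags, and result limit are illustrative assumptions rather than a prescribed search strategy; the same terms can simply be typed into a database's search box.

    import requests

    # NCBI E-utilities "esearch" endpoint for PubMed (free to use)
    ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

    # Key words drawn from the PICO components, combined with Boolean operators;
    # the terms and the publication-type filter here are illustrative only.
    search_term = (
        '("psychoeducation"[Title/Abstract] AND "healthy eating"[Title/Abstract]) '
        'AND "randomized controlled trial"[Publication Type]'
    )

    response = requests.get(
        ESEARCH_URL,
        params={"db": "pubmed", "term": search_term, "retmode": "json", "retmax": 20},
        timeout=30,
    )
    response.raise_for_status()
    result = response.json()["esearchresult"]

    print("Matching records:", result["count"])
    print("First PubMed IDs:", result["idlist"])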

Databases

Several databases are available to social workers searching for articles and reports to support evidence-based practice. Some are free, while others require a paid subscription. Please note that this is far from a comprehensive list, and practitioners are encouraged to explore other options provided by their agency or organization.

ACP Journal Club
https://www.acpjournals.org/loi/ajc
Agency for Healthcare Research and Quality
https://www.ahrq.gov/prevention/guidelines/index.html
American Psychological Association Databases and Electronic Resources
https://www.apa.org/pubs/databases
BioMed Central
https://www.biomedcentral.com
California Evidence-Based Clearinghouse for Child Welfare
https://www.cebc4cw.org
The Campbell Collaboration
https://www.campbellcollaboration.org
The Cochrane Library
https://www.cochranelibrary.com
Epistemonikos
https://www.epistemonikos.org
Essential Evidence Plus
https://www.essentialevidenceplus.com
Evidence-Based Behavioral Practice
https://ebbp.org
BMJ Mental Health
https://mentalhealth.bmj.com
Evidence Alerts from BMJ
https://www.evidencealerts.com
Health Services/Technology Assessment Texts (HSTAT)
https://www.ncbi.nlm.nih.gov/books/NBK16710
Human Services Research Institute
https://www.hsri.org
Joanna Briggs Institute
https://joannabriggs.org
Medline/PubMed PICO Search
https://pubmedhh.nlm.nih.gov/pico
National Association of State Mental Health Program Directors Research Institute
https://www.nri-inc.org
Public Library of Science (PLOS)
https://plos.org
PubMed
https://pubmed.ncbi.nlm.nih.gov
Research in Practice
https://www.researchinpractice.org.uk/all
Substance Abuse and Mental Health Services Administration Evidence-Based Practices Resource Center
https://www.samhsa.gov/ebp-resource-center
Turning Research into Practice (Trip) Database
https://www.tripdatabase.com

The key to searching and learning how best to use the different databases is to practice. As an additional resource, the article "Evidence Searching for Evidence-Based Psychology Practice" provides a summary of the different types of databases and their strengths and weaknesses [45]. The article may be accessed online at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3077562.

With the increased use of the Internet, there are more open-access sources of information that have not been peer reviewed. It is important to ensure that the information is credible. Experts recommend assessing the following elements in any potential source of information [60]:

  • Credentials and affiliations of the authors

  • Abstract containing a summary of the research

  • Reference list

Levels/Hierarchy of Evidence

A hierarchical weighting system is used to convey the quality of evidence [70]. However, there are many different ways to conceptualize the levels of evidence. One framework is based on the type of evidence produced by the methodology [71]:

  • Causal evidence: Using experiments (e.g., randomized controlled trials), quasi-experiments, and multivariate statistical analyses, the evidence substantively links the intervention or practice to the outcome. Causal evidence is considered the strongest.

  • Indicative evidence: Using pre- and post-test designs, the evidence would lead one to link the intervention to the outcome. Indicative evidence is moderately robust.

  • Descriptive evidence: Using logic models and descriptive designs (e.g., surveys, qualitative interviews), the evidence shows that it is plausible the intervention works and perhaps why it works. Descriptive evidence is the least robust.

The Cochrane Collaboration uses a system that classifies the level of evidence as [72]:

  • Level 1: Strongest level of evidence, produced by systematic reviews of randomized controlled studies

  • Level 2: Moderate level of evidence, produced by at least one randomized controlled study and possibly a quasi-experimental study

  • Level 3: Lower level of evidence, produced by cohort studies, case control studies, or case series studies
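
For practitioners who track retrieved studies in a spreadsheet or a simple script, a Cochrane-style level can be recorded alongside each citation. The Python sketch below is a deliberately simplified illustration (the design labels and the mapping are assumptions for demonstration); appraising evidence in practice requires judgment about study quality, not keyword matching.

    from typing import Optional

    # Deliberately simplified mapping of study designs to the three Cochrane-style
    # levels described above; real appraisal weighs study quality, not just labels.
    EVIDENCE_LEVELS = {
        "systematic review of randomized controlled studies": 1,
        "randomized controlled study": 2,
        "quasi-experimental study": 2,
        "cohort study": 3,
        "case control study": 3,
        "case series study": 3,
    }

    def evidence_level(design: str) -> Optional[int]:
        """Return the mapped level for a study design, or None if it is not listed."""
        return EVIDENCE_LEVELS.get(design.strip().lower())

    print(evidence_level("Randomized controlled study"))   # 2
    print(evidence_level("Qualitative interview study"))   # None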

CONCLUSION

Evidence-based practice in social work is premised on and supports the ethical principles and values of social work [10]. Knowing if interventions and programs are effective is important for funders, agencies, and practitioners. However, while the benefits of using evidence-based practice in the field of social work may be embraced in theory, there are challenges to real-life implementation that should be acknowledged. Fiscal constraints, time and resource constraints, and lack of training on the implementation of evidence-based practice are a few of the barriers that impede adoption. Just as important as being able to produce and expand the social work knowledge base is the necessity to translate the findings into practical applications for social workers. The findings from evidence-based searches should be made accessible to practitioners via online clearinghouses and databases that are easily available to users [46]. Furthermore, definitions of "best evidence" may need to be broadened to include qualitative research in addition to systematic reviews, clinical trials, and meta-analyses. Regardless of the challenges, social workers will benefit from an understanding of how to conduct evidence-based research and incorporate the results into effective practice.

Works Cited

1. McNeece CA, Thyer BA. Evidence-based practice and social work. J Evid Based Soc Work. 2004;1(1):7-25.

2. Boruch R, Soydan H, deMoya D. The Campbell collaboration. Brief Treat Crisis Inter. 2004;4(3):277-287.

3. Royse D, Thyer BA, Padgett DK, Logan TK. Program Evaluation: An Introduction. 6th ed. Boston, MA: Cengage Learning; 2016.

4. Thyer B, Pignotti M. Clinical social work and evidence-based practice: an introduction to the special issue. Clin Soc Work J. 2011;39(4):325-327.

5. Sackett DL, Rosenberg W, Gray JAM, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn't. Brit Med J. 1996;312:71-72.

6. Social Work Policy Institute. Evidence-Based Practice. Available at https://www.socialworkers.org/News/Research-Data/Social-Work-Policy-Research/Evidence-Based-Practice. Last accessed September 11, 2023.

7. Shdaimah CS. What does social work have to offer evidence-based practice? Ethics Soc Welfare. 2009;3(1):18-31.

8. Shlonsky A, Baker T, Fuller-Thomson E. Using methodological search filers to facilitate evidence-based social work practice. Clin Soc Work J. 2011;39(4):390-399.

9. Yunong H, Fengzhi M. A reflection on reasons, preconditions, and effects of implementing evidence-based practice in social work. Soc Work. 2009;54(2):177-181.

10. Farley AJ, Feaster D, Schapmire TJ, D'Ambrosio JG, et al. The challenges of implementing evidence-based practice: ethical considerations in practice, education, policy, and research. Social Work and Society. 2009;7(2).

11. Rubin A, Parrish D. Challenges to the future of evidence-based practice in social work education. J Soc Work Educ. 2007;43(3):405-428.

12. Straus SE, Glasziou P, Richardson WS, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. 5th ed. New York, NY: Churchill Livingstone; 2018.

13. Reinhardt JP. Research methods in evidence-based practice: understanding the evidence. Generations. 2010;34(1):36-42.

14. Murdach AD. What good is soft evidence? Soc Work. 2010;55(4):309-316.

15. Entwistle VA, Sheldon TA, Sowden A, Watt IS. Evidence-informed patient choice: practical issues of involving patients in decisions about health care technologies. Int J Technol Assess. 1998;14(2):212-225.

16. Walker JS, Briggs HE, Koroloff N, Friesen BJ. Implementing and sustaining evidence-based practice in social work. J Soc Work Educ. 2007;43:361-375.

17. Rubin A, Parrish D. Views of evidence-based practice among faculty in MSW programs: a national survey. Res Soc Work Educ. 2007;17(1):110-122.

18. Gilgun JF. The four cornerstones of evidence-based practice in social work. Res Soc Work Prac. 2005;15(1):52-61.

19. Gibbs L, Gambrill E. Evidence-based practice: counterarguments to objections. Res Soc Work Prac. 2002;12(3):452-476.

20. Rahman A, Applebaum R. What's all this about evidence-based practice? The roots, the controversies, and why it matters. Generations. 2010;34(1):6-12.

21. Broom A, Adams J, Tovey P. Evidence-based healthcare in practice: a study of clinician resistance, professional de-skilling and inter-specialty differentiation in oncology. Soc Sci Med. 2009;68(1):192-200.

22. Beinecke RH. Implementation of evidence-based mental health practice in England. Int J Ment Health. 2004/2005;33(4):64-79.

23. Lorenz W. Response: hermeneutics and accountable practice: lessons from the history of social work. Res Soc Work Prac. 2012;22(5):492-498.

24. Jenson JM. Connecting science to intervention: advances, challenges, and the promise of evidence-based practice. Soc Work Res. 2005;29(3):131-135.

25. Flexner A. Is Social Work a Profession? Available at https://pages.uoregon.edu/adoption/archive/FlexnerISWAP.htm. Last accessed September 11, 2023.

26. Adams KB, LeCroy CW, Matto HC. Limitations of evidence-based practice for social work education: unpacking the complexity. J Soc Work Educ. 2009;45(2):165-186.

27. Gambrill E. Evidence-based practice: an alternative to authority-based practice. Fam Soc J Contemp H. 1999;80:341-350.

28. Zayas L, Drake B, Jonson-Reid M. Overrating or dismissing the value of evidence-based practice: consequences for clinical practice. Clin Soc Work J. 2011;39(4):400-405.

29. McNeill T. Evidence-based practice in an age of relativism: toward a model for practice. Soc Work. 2006;51(2):147-156.

30. Furman R. Ethical considerations of evidence-based practice. Soc Work. 2009;54(1):82-84.

31. Nevo I, Slonim-Nevo V. The myth of evidence-based practice: towards evidence-informed practice. Brit J Soc Work. 2011;41(6):1176-1197.

32. National Association of Social Workers. Code of Ethics. Available at https://www.socialworkers.org/About/Ethics/Code-of-Ethics/Code-of-Ethics-English. Last accessed June 26, 2023.

33. Rainie L. E-Patients and Their Hunt for Health Information. Available at https://www.pewresearch.org/internet/2013/10/10/e-patients-and-their-hunt-for-health-information. Last accessed June 26, 2023.

34. Khazaal Y, Chatton A, Cochand S, Zullino D. Quality of web-based information on cocaine addiction. Patient Educ Couns. 2008;72(2):336-341.

35. Franklin C, Hopson LM. Facilitating the use of evidence-based practice in community organizations. J Soc Work Educ. 2007;43(3):377-404.

36. Antle BJ, Regehr C. Beyond individual rights and freedoms: meta-ethics in social work research. Soc Work. 2003;48(1):135-144.

37. Kenyon P. What Would You Do? An Ethical Case Workbook for Human Service Professionals. Pacific Grove, CA: Brooks/Cole Publishing Company; 1999.

38. Nyatanga B. Cultural competence: a noble idea in a changing world. Int J Palliat Nurs. 2008;14(7):315.

39. Wells SJ, Merritt LM, Briggs HE. Bias, racism and evidence-based practice: the case for more focused development of the child welfare evidence base. Child Youth Serv Rev. 2009;31(11):1160-1171.

40. Locke DC. A not so provincial view of multicultural counseling. Counselor Education and Supervision. 1990;30(1):18-25.

41. La Roche MJ, Christopher MS. Changing paradigms from empirically supported treatment to evidence-based practice: a cultural perspective. Prof Psychol Res Prac. 2009;40(4):396-402.

42. Bernal G, Jiménez-Chafey MI, Rodríguez MMD. Cultural adaptation of treatments: a resource for considering culture in evidence-based practice. Prof Psychol Res Pr. 2009;40(4):361-368.

43. Lucero E. From tradition to evidence: decolonization of the evidence-based practice system. J Psychoactive Drugs. 2011;43(4):319-324.

44. Silverstein LB, Auerbach CF. Using qualitative research to develop culturally competent evidence-based practice. Am Psychol. 2009;64(4):274-275.

45. Falzon L, Davidson KW, Bruns D. Evidence searching for evidence-based psychology practice. Prof Psychol Res Pr. 2010;41(6):550-557.

46. Gray M, Schubert L. Sustainable social work: modelling knowledge production, transfer, and evidence-based practice. Int J Soc Welf. 2012;21(2):203-214.

47. National Association of Social Workers. Evidence-Based Practice: NASW Practice Snapshot. Available at https://www.socialworkers.org/News/Research-Data/Social-Work-Policy-Research/Evidence-Based-Practice. Last accessed September 9, 2020.

48. Konrad SC, Flynn ML, Sela-Amit M. Art in social work: equivocation, evidence, and ethical quandaries. Res Soc Work Pract. 2019;29(6):693-697.

49. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.

50. Graaf G, Ratliff GA. Preparing social workers for evidence-informed community-based practice: an integrative framework. Journal of Social Work Education. 2018;54:S5-S19.

51. Parrish DE. Evidence-based practice: a common definition matters. Journal of Social Work Education. 2018;54(3):407-411.

52. Mackey A, Bassendowski S. The history of evidence-based practice in nursing education and practice. Journal of Professional Nursing. 2017;33(1):51-55.

53. Dillard DM. The history of evidence-based practice. International Journal of Childbirth Education. 2017;32(2):7-10.

54. Littell JH, White H. The Campbell Collaboration: providing better evidence for a better world. Res Soc Work Pract. 2018;28(1):6-12.

55. Grady MD, Wike T, Putzu C, et al. Recent social work practitioners' understanding and use of evidence-based practice and empirically supported treatments. J Soc Work Educ. 2018;54(1):163-179.

56. Drisko JW, Friedman A. Let's clearly distinguish evidence-based practice and empirically supported treatments. Smith College Studies in Social Work. 2019;89(3-4):264-281.

57. Kelly L. Reconceptualising professional knowledge: the changing role of knowledge and evidence in social work practice. Social Work Education. 2017;36(3):245-256.

58. Morago P. Evidence-based practice: from medicine to social work. European Journal of Social Work. 2006;9(4):461-477.

59. Wike TL, Grady M, Massey M, et al. Newly educated MSW social workers' use of evidence-based practice and evidence-supported interventions: results from an online survey. J Soc Work Educ. 2019;55(3):504-518.

60. Zimmerman K. Essentials of evidence based practice. International Journal of Childbirth Education. 2017;32(2):37-43.

61. Shapira Y, Enosh G, Havron N. What makes social work students implement evidence-based practice behaviors? J Soc Work Educ. 2017;53(2):187-200.

62. Helps S. The ethics of researching one's own practice. J Fam Therapy. 2017;39(3):348-365.

63. Abe J, Grills C, Ghavami N, Xiong G, Davis C, Johnson C. Making the invisible visible: identifying and articulating culture in practice-based evidence. American Journal of Community Psychology. 2018;62(1/2):121-134.

64. Huey SJ Jr, Tilley JL, Jones EO, Smith CA. The contribution of cultural competence to evidence-based care for ethnically diverse populations. Annual Review of Clinical Psychology. 2014;10:305-338.

65. Ramos G, Brookman-Frazee L, Kodish T, Rodriguez A, Lau AS. Community providers' experiences with evidence-based practices: The role of therapist race/ethnicity. Cultur Divers Ethnic Minor Psychol. 2020; [Epub ahead of print].

66. Rogers-Sirin L. Psychotherapy from the margins: how the pressure to adopt evidence-based-treatments conflicts with social justice-oriented practice. Journal for Social Action in Counseling & Psychology. 2017;9(1):55-78.

67. Wang M, Lam Y. Evidence-based practice in special education and cultural adaptations. Res Pract Pers Sev D. 2017;42(1):53-61.

68. Whitbeck LB. Some guiding assumptions and a theoretical model for developing culturally specific preventions with Native American people. J Community Psychol. 2006;34(2):183-192.

69. Riva JJ, Malik KM, Burnie SJ, Endicott AR, Busse JW. What is your research question? An introduction to the PICOT format for clinicians. Journal of the Canadian Chiropractic Association. 2012;56(3):167-171.

70. Teolis MG. Improving nurses' skills and supporting a culture of evidence-based practice. Medical Reference Services Quarterly. 2020;39(1):60-66.

71. Schalock RL, Gomez LE, Verdugo MA, Claes C. Evidence and evidence-based practices: are we there yet? Intellect Dev Disab. 2017;55(2):112-119.

72. Cochrane Consumer Network. Levels of Evidence. Available at https://consumers.cochrane.org/cochrane-and-systematic-reviews#levels. Last accessed June 26, 2023.

73. Verbist AN, Winters AM, Antle BF, Collins-Camargo C. A review of treatment decision-making models and factors in mental health practice. Families in Society. 2020;101(4):444-455.

74. Hübner L. Reflections on knowledge management and evidence-based practice in the personal social services of Finland and Sweden. Nordic Social Work Research. 2016;6(2):114-125.

75. Finne J, Ekeland T-J, Malmberg-Heimonen I. Social workers use of knowledge in an evidence-based framework: a mixed methods study. European Journal of Social Work. 2022;25(3):443-456.

76. Lwin K, Beltrano N. Rethinking evidence-based and evidence-informed practice: a call for evidence-informed decision making in social work education and child welfare practice. Social Work Education. 2022;41(2):166-174.

77. Liedgren P, Kullberg C. "Easy ride or born to be wild"? The travelling of evidence-based social work to Sweden. European Journal of Social Work. 2022;25(2):224-237.

78. Lilienfeld SO. What is "evidence" in psychotherapies? World Psychiatry. 2019;18(3):245-246.

79. Ekeland T-J. Evidence-based practice in social work: perceptions and attitudes among Norwegian social workers. European Journal of Social Work. 2019;22(4):611-622.

80. Finne J. Evidence-based practice in social work: who are the critics? Journal of Social Work. 2021;21(6):1433-1449.

81. Mosley JE, Marwell NP, Ybarra M. How the "what works" movement is failing human service organizations, and what social work can do to fix it. Human Service Organizations. 2019;43(4):326-335.

82. Prock KA, Drechsler K, Hessenauer S. Social workers' knowledge and attitudes about evidence-based practice: differences between graduate students, educators, and practitioners. Clinical Social Work Journal. 2022;50(3):233-241.

83. Washburn M, Parrish DE, Oxhandler HK, Garrison B, Ma AK. Licensed master of social workers' engagement in the process of evidence-based practice: barriers and facilitators. Journal of Evidence-Based Social Work. 2021;18(6):619-635.

84. Tomkins L, Bristow A. Evidence-based practice and the ethics of care: what works or what matters? Human Relations. 2023;76(1):118-143.

85. Bakkeli V, Breit E. From "what works" to "making it work:" a practice perspective on evidence-based standardization in frontline service organizations. Social Policy & Administration. 2022;56(1):87-102.

86. Fennig M. Cultural adaptations of evidence-based mental health interventions for refugees: implications for clinical social work. British Journal of Social Work. 2021;51(3):964-981.

87. Pawson R, Tilley N. Caring communities, paradigm polemics, design debates. Evaluation. 1998;4(1):73-90.

88. Bäck A, Schwarz U, Hasson H, Richter A. Aligning perspectives? Comparison of top and middle-level managers' views on how organization influences implementation of evidence-based practice. British Journal of Social Work. 2020;50(4):1126-1145.

89. Denvall V, Skillmark M. Bridge over troubled water: Closing the research-practice gap in social work. British Journal of Social Work. 2021;51(7):2722-2739.

90. Drisko JW. Incorporating evidence-based practice into informed consent practice. Families in Society. 2021;102(1):67-77.

91. La Roche MJ. Changing multicultural guidelines: clinical and research implications for evidence-based psychotherapies. Professional Psychology: Research and Practice. 2021;52(2):111-120.

92. Pascoe KM, Waterhouse-Bradley B, McGinn T. Systematic literature searching in social work: a practical guide with database appraisal. Research on Social Work Practice. 2021;31(5):541-551.

93. Stannard D, Jacobs W. Accessing and selecting the best available evidence: the second step in evidence-based practice. AORN Journal. 2021;114:336-338.


Copyright © 2023 NetCE, PO Box 997571, Sacramento, CA 95899-7571
Mention of commercial products does not indicate endorsement.