Applied Research in Quality of Life

The Official Journal of the International Society for Quality-of-Life Studies

Law Awareness and Abidance and Radicalism Prevention Among Hong Kong Youth

  • Chau-kiu Cheung
  • Cindy Xinshan Jia

Book Review: Handbook on Tourism and Quality of Life Research II

  • Robertico Croes

Is Too Much Time on the Internet Making us Less Satisfied with Life?

  • Ana Suárez Álvarez
  • María R. Vicente

Peter Krause: A Pioneer in Household Panel Surveys and Quality of Life Applications

  • Peter Krause

Attachment in Young Adults and Life Satisfaction at Age 30: A Birth Cohort Study

  • Julie A. Blake
  • Hannah J. Thomas
  • James G. Scott

Crafting One’s Life and its Relationship with Psychological Needs: A Scoping Review

  • Andrew D. Napier
  • Gavin R. Slemp
  • Dianne A. Vella-Brodrick

Educational Quality of the University of the Third Age and Subjective Well-being: Based on a Perspective of Self-determination

  • Jianxin Zhang

Incredible Work Environments in Brazil: What the 2020 Award-Winners can Teach us

  • Marcia Sierdovski
  • Luiz Alberto Pilatti
  • Claudia Tania Picinin

Support for a Single Underlying Dimension of Self-Reported Health in a Sample of Adults with Low Back Pain in the United States

  • Ron D. Hays
  • Anthony Rodriguez
  • Maria Orlando Edelen

The Art of Living Well: Cultural Participation and Well-Being

  • Fabrice Murtin
  • Leonardo Zanobetti

Peer Relationship Problems, Fear of Missing Out, Family Affective Responsiveness, and Internet Addiction among Chinese Adolescents: A Moderated Mediation Model

  • Yuhang Cheng

The Role of Environmental, Economic, and Social Dimensions of Sustainability in the Quality of Life in Spain

  • Nuria Huete-Alcocer
  • Víctor Raúl López-Ruiz
  • Domingo Nevado-Peña

Two Decades of Academic Service-Learning in Chinese Higher Education: A Review of Research Literature

  • Yang-yang Wan

Housing Tenure, Intrahousehold Homeownership Structure and Health

  • Tongtong Qiu
  • Siliang Wang

Planfulness in Psychological Well-being: Mediating Roles of Self-Efficacy and Presence of Meaning in Life

  • Theodoros Kyriazos

Changes in Job Strain in the US, Europe and Korea

  • Benoît Arnaud
  • Agnès Parent-Thirion

Bicultural Acceptance Attitude as a Protective Factor Against the Effect of Acculturative Stress on Life Satisfaction Among Korean Multicultural Adolescents

  • Jong-Hye Park
  • Sung-Man Bae

Childhood Psychological Maltreatment and Subjective Vitality: Longitudinal Mediating Effect of Cognitive Flexibility

  • Hasan Kütük
  • Seydi Ahmet Satıcı

The Effect of Loneliness on Subjective Well-Being: Evidence from the UK Household Longitudinal Study 2017–2021

  • Nico Seifert

Unraveling the Nexus between Overeducation and Depressive Symptoms in China: The Roles of Perceived Fairness of Earnings and Job Autonomy

  • Xiaohang Zhao
  • Skylar Biyang Sun

A Longitudinal Study of the Effect of Memory on the Quality of Life of European Adults and Older Adults

  • Irene Fernández
  • Noemí Sansó
  • José M. Tomás

Exposure to Family Violence and School Bullying Perpetration among Children and Adolescents: Serial Mediating Roles of Parental Support and Depression

Which One is the Best for Evaluating the Multidimensional Structure of Meaning in Life Among Chinese: A Comparison of Three Multidimensional Scales

  • Zhiwei Zhou

‘Born Free’ Dreams: South African Township Youth Discuss Their Hopes for a Better Life in Future

  • Valerie Møller
  • Benjamin J. Roberts
  • Dalindyebo Zani

Why do Middle-Aged Adults Report Worse Mental Health and Wellbeing than Younger Adults? An Exploratory Network Analysis of the Swiss Household Panel Data

  • Dawid Gondek
  • Laura Bernardi
  • Chiara L. Comolli

Dynamic Analysis of Loneliness at Older Ages in Europe by Gender

  • Ricardo Pagan
  • Miguel Angel Malo

How Much are you Willing to Accept for Being Away From Home? Internal Migration and Job Satisfaction Among Formal-Informal Ecuadorian Workers

  • Cristian Ortiz
  • Aldo Salinas
  • Viviana Huachizaca

Decomposing Cultural Adaptation and Social Support in Relation to New Media Use and Psychological Well-Being Among Immigrants: a Chain Mediation Model

  • Damilola Adetola Bolaji
  • Tosin Yinka Akintunde

A randomized controlled trial of mindfulness-based intervention on individuals with physical disabilities in China

  • Lu-yin Liang
  • Daniel T. L. Shek

Temporal Focus Profiles in the College and the Workplace: Exploration and Relationships with Well-being Constructs in Mexico

  • Daniel A. Cernas-Ortiz

Successful Life Conduct in Very Old Age: Theoretical Implications and Empirical Support from a Population-Based Study

  • Roman Kaspar
  • Andrea Albrecht
  • Jaroslava Zimmermann

“Helping Others Makes Me Feel Better”: Trait Gratitude, Resilience, and Helping Behavior Improve Mental Health during a COVID-19 Lockdown

  • Ningning Feng

Changes in Daily Life Habits during COVID-19 and Their Transitory and Permanent Effects on Italian University Students’ Anxiety Level

  • Giovanni Busetta
  • Maria Gabriella Campolo
  • Demetrio Panarello

Association of Lifestyle Factors with Multimorbidity Risk in China: A National Representative Study

  • Xinying Sun

The Impact of the Pandemic on Health and Quality of Life of Informal Caregivers of Older People: Results from a Cross-National European Survey in an Age-Related Perspective

  • Marco Socci
  • Mirko Di Rosa
  • Sara Santini

The Effect of Attitudes Towards Money on Over-Indebtedness Among Microfinance Institutions’ Customers in Tanzania

  • Pendo Shukrani Kasoga
  • Amani Gration Tegambwage

Perceived Social Exclusion Partially Accounts for Social Status Effects on Subjective Well-Being: A Comparative Study of Japan, Germany, and the United States

  • Christina Sagioglou
  • Carola Hommerich

Fertility Intention in Hong Kong: Declining Trend and Associated Factors

  • Mengtong Chen
  • Camilla Kin Ming Lo

Can Social Participation Reduce and Postpone the Need for Long-Term Care? Evidence from a 17-Wave Nationwide Survey in Japan

  • Takashi Oshio
  • Kemmyo Sugiyama
  • Toyo Ashida

Transition Patterns of Intergenerational Solidarity and Digital Communication During and After the COVID-19 Pandemic in South Korea: Association with Older Parents’ Cognitive Decline

  • Woosang Hwang

Measurement Invariance of a Quality-of-life Measure, CASP-12, within the English Longitudinal Study of Ageing (ELSA)

  • Ali Alattas
  • Farag Shuweihdi
  • Robert West

Community Identity as an Indicator of Quality of Life: A Theoretical Model and Empirical Test

  • Yangyang Fan

What Makes People Happy with their Lives in Developing Countries? Evidence from Large-Scale Longitudinal Data on Ghana

  • Richmond Atta-Ankomah
  • Kwame Adjei-Mantey
  • Andrew Agyei-Holmes

Long Working Hours and Job Satisfaction in Platform Employment: An Empirical Study of On-Demand Delivery Couriers in China

  • Donghao Liu

Psychological Capital and Labor Market Participation of Arab Women in Israel

  • Rivka Sigal
  • Piotr Michoń

Urban Green Space Usage and Life Satisfaction During the Covid-19 Pandemic

  • Martin Refisch
  • Jörg Hartmann

Impact of Academic Service-Learning on Students: an Evaluation Study of a University-Level Initiative in China

  • Jing-wen Ju

Understanding Access and Utilization of Healthcare Services Among African Immigrant Women in the United States: the Application of Health Belief Model

  • Gashaye Melaku Tefera

Long Covid: A Syndemics Approach to Understanding and Response

  • Merrill Singer
  • Nicola Bulled

The Mediating Roles of Mindfulness in Marriage and Mindfulness in Parenting in the Relationship Between Parents’ Dispositional Mindfulness and Emotion Regulation of Their Children

  • Ezgi Güney Uygun
  • Seher Merve Erus


Editors: Glenn A. Martinez and Ron Martinez

About the journal

Applied Linguistics publishes research into language with relevance to real-world issues. The journal is keen to help make connections between scholarly discourses, theories, and research methods…

Applied Linguistics, Social Problems, and Social Change

This latest Virtual Issue explores how applied linguistics can support social change and address social issues. Topics examined include race and class in English Language Teaching, the linguistic and pedagogical issues around the use of gender-inclusive language in Spanish, working towards the diversity and equity of knowledge, and race, representation, and diversity in the American Association of Applied Linguistics.

Highly cited articles

Inform your research by reading a selection of papers currently making an impact. This collection of recent, highly cited articles showcases the high-quality research being published in the journal, and encompasses significant themes in the field. 

Open Access articles

Explore the full archive of Open Access articles from Applied Linguistics, including:

Deceptive Identity Performance: Offender Moves and Multiple Identities in Online Child Abuse Conversations by Emily Chiang and Tim Grant

Discipline, Level, Genre: Integrating Situational Perspectives in a New MD Analysis of University Student Writing by Sheena Gardner, Hilary Nesi, and Douglas Biber

Special Issues of Applied Linguistics

Considering 'trans-' perspectives in language theories and practices.

The notion of ‘trans-’ has been gaining momentum and visibility within an increasingly globalized world. This special issue brings together researchers working in different applied linguistics paradigms, research areas, and world regions to weigh divergent, as well as convergent views on the recent ‘trans- turn’ in applied linguistics. The articles are a mix of conceptually driven pieces illustrated with empirical data and data-driven pieces with full theorization, and they consider a variety of ‘trans-’ perspectives, including their theoretical origins and empirical applications.

Innovation in Research Methods

This special issue focuses on the emerging features of the methodological landscape that represent both challenges and opportunities. Its theme is innovation but it is not concerned with what is merely novel; its sweep is broader, exploring the relationship between methodological thinking and the evolution of new approaches within the discipline. In bringing these together, the collection aims to illustrate that methodological investment is as fundamental as theory building to disciplinary development.

Resources for Authors and Researchers

Interested in submitting your research?

Read the Instructions for Authors and learn more about the Applied Linguistics submission process and requirements.

Make an impact with your work

Have you published an article? What should you do now? Read our top tips on promoting your work to reach a wider audience and ensure your work makes an impact.

Top Tips for Publishing in Linguistics Journals

Watch our top tips for publishing in Linguistics Journals video, featuring helpful advice from our Linguistics Journals editors.

From the OUPblog

How linguistics can help us catch sex offenders

As sex offenders take ever-more sophisticated measures to mask their identities online, Emily Chiang and Tim Grant suggest that forensic linguistics may be the key to catching them out.

Racial biases in academic knowledge

Ryuko Kubota explores the way epistemological racism, or biases in academic knowledge, reinforces institutional and individual forms of racism.

Preventing miscommunication: lessons from cross-cultural couples

Dr Kaisa S. Pietikäinen discusses how miscommunications in cross-cultural couples are relatively uncommon, and how they often use lingua franca English as a way of communicating.

Translanguaging and Code-Switching: what's the difference?

Li Wei explores the differences between these key analytical concepts of Translanguaging and Code-Switching.



Published in collaboration with

British Association for Applied Linguistics (BAAL)

BAAL is a professional association based in the UK, which provides a forum for people interested in language and applied linguistics.

International Association for Applied Linguistics (AILA)

AILA (originally founded in 1964 in France) is an international federation of national and regional associations of Applied Linguistics.

Related Titles

ELT Journal

  • Online ISSN 1477-450X
  • Print ISSN 0142-6001


Applied Research Articles: narrowing the gap between research and organizations

Revista de Gestão

ISSN: 2177-8736

Article publication date: 9 October 2018

Issue publication date: 9 October 2018

de Mello, A.M. and Pedroso, M. (2018), "Applied Research Articles: narrowing the gap between research and organizations", Revista de Gestão, Vol. 25 No. 4, pp. 338-339. https://doi.org/10.1108/REGE-10-2018-075

Emerald Publishing Limited

Copyright © 2018, Adriana Marotti de Mello and Marcelo Pedroso

Published in Revista de Gestão. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

Applied research articles: narrowing the gap between research and organizations

Since 1994, REGE has published articles whose main objective is to contribute to the development of scientific knowledge in management. These articles stem from academic research and make relevant theoretical contributions to the field of administration. One of the classical, but still contemporary, discussions about academic research concerns its relevance and impact (Saes et al., 2017; MacIntosh et al., 2017). Traditionally, academic research is supported by scientific and methodological rigor. Despite the need for such rigor, research cannot do without relevance and impact on society (Vermeulen, 2005).

However, there is a growing discussion not only on how to make management research relevant, but also on how to assess its impact on the academy and on society in general. There is a debate on the distance between research and practice: the gap between what academic research produces and the relevance and use of this knowledge by the management community. Although it seems somewhat obvious that management research should inform management policies and practices and vice versa, in a collaborative process, this does not seem to happen in a continuous and productive way for both sides (Saes et al., 2017; Wood, 2017; Banks et al., 2016).

In other words, there is a claim for scientific knowledge developed in the academy to be used for the benefit of organizations and, ultimately, of society. After all, an important part of scientific research is financed by society, through public research institutions (including state and federal universities) and funding agencies (such as CNPq and FAPESP). In recent years, important initiatives such as the establishment of professional master's and PhD programs have increased the discussion and production of applied knowledge, seeking to transfer knowledge produced in the academy more directly to public and private organizations, thus bringing science and business together.

In this regard, critical elements of research evaluation consider the following questions: does the scope of the research address a theoretical and/or practical gap? Does the research problem address something new or counter-intuitive? Can the research results contribute to theory or to management practices?

Following Gregor and Hevner (2013), the contribution of an applied article can be classified into one of three types:

Innovation: the paper presents new solutions to new problems.

Improvement: the paper brings new solutions to known problems.

Extrapolation: the paper extends known solutions to new problems.

The expression "applied article" appears with variations and under different designations, such as technical articles, technological reports, and technical reports (Motta, 2017). Notwithstanding the distinct names, these papers aim to study or solve a practical problem. The appropriate adoption of a research method can bring greater scientific rigor to this kind of paper.

In summary, applied articles take an approach that moves from practice to theory. The research objective originates from a practical problem, which is studied or solved through the application of theoretical elements, preferably with the use of scientific methods. The target audience of an applied article is therefore researchers and teachers, as well as practitioners – and the latter, within the administration area, are mainly managers who work in public and private organizations.

In this respect, REGE now also publishes applied articles in its issues. In addition, we launch this call for papers for the Special Edition on applied articles, to be published in Number 4 of 2019. The guest editor will be Dr Marcelo Pedroso, coordinator of the Professional Master in Entrepreneurship at the School of Economics, Business and Accounting of the University of São Paulo. Submissions may address the following themes:

public administration;

entrepreneurship, innovation and technology;

financial and accounting management;

human resources and organizations;

sustainability;

marketing; and

operations.

Adriana Marotti de Mello and Marcelo Pedroso

Banks, G.C., Pollack, J.M., Bochantin, J.E., Kirkman, B.L., Whelpley, C.E. and O'Boyle, E.H. (2016), "Management's science–practice gap: a grand challenge for all stakeholders", Academy of Management Journal, Vol. 59 No. 6, pp. 2205-2231.

Gregor, S. and Hevner, A.R. (2013), "Positioning and presenting design science research for maximum impact", MIS Quarterly, Vol. 37 No. 2, pp. 337-356.

MacIntosh, R., Beech, N., Bartunek, J., Mason, K., Cooke, B. and Denyer, D. (2017), "Impact and management research: exploring relationships between temporality, dialogue, reflexivity and praxis", British Journal of Management, Vol. 28, pp. 3-13, doi: 10.1111/1467-8551.12207.

Motta, G.S. (2017), "Como escrever um bom artigo tecnológico?" ["How to write a good technological article?"], Revista de Administração Contemporânea, Vol. 21 No. 5, available at: http://dx.doi.org/10.1590/1982-7849rac2017170258

Saes, M.S.M., Mello, A.M. and Sandes-Guimarães, L.V. (2017), "Revistas brasileiras em Administração: relevância para quem?" ["Brazilian journals in Administration: relevance for whom?"], Revista de Administração de Empresas, Vol. 57 No. 5, pp. 515-519.

Vermeulen, F. (2005), "On rigor and relevance: fostering dialectic progress in management research", Academy of Management Journal, Vol. 48, pp. 978-982.

Wood, T. Jr (2017), "Resisting and surviving the mainstream scientific model: findings on social relevance and social impact in the tropics", Management Learning, Vol. 48 No. 1, pp. 65-79, doi: 10.1177/1350507616659832.



The Behavior Analyst, Vol. 37, No. 1, May 2014

The Evidence-Based Practice of Applied Behavior Analysis

Timothy A. Slocum

Utah State University, Logan, UT USA

Ronnie Detrich

Wing Institute, Oakland, CA USA

Susan M. Wilczynski

Ball State University, Muncie, IN USA

Trina D. Spencer

Northern Arizona University, Flagstaff, AZ USA

Teri Lewis

Oregon State University, Corvallis, OR USA

Katie Wolfe

University of South Carolina, Columbia, SC USA

Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7–33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith’s arguments and extend the discussion of the relevant issues. Although we support many of Smith’s (The Behavior Analyst, 36, 7–33, 2013) points, we contend that Smith’s definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

Almost 45 years ago, Baer et al. (1968) described a new discipline—applied behavior analysis (ABA). This discipline was distinguished from the experimental analysis of behavior by its focus on social impact (i.e., solving socially important problems in socially important settings). ABA has produced remarkably powerful interventions in fields such as education, developmental disabilities and autism, clinical psychology, behavioral medicine, organizational behavior management, and a host of other fields and populations. Behavior analysts have long recognized that developing interventions capable of improving client behavior solves only one part of the problem. The problem of broad social impact must be solved by having interventions implemented effectively in socially important settings and at scales of social importance (Baer et al. 1987; Horner et al. 2005b; McIntosh et al. 2010). This latter set of challenges has proved to be more difficult. In many cases, demonstrations of effectiveness are not sufficient to produce broad adoption and careful implementation of these procedures. Key decision makers may be more influenced by variables other than the increases and decreases in the behaviors of our clients. In addition, even when client behavior is a very powerful factor in decision making, it does not guarantee that empirical data will be the basis for treatment selection; anecdotes, appeals to philosophy, or marketing have been given priority over evidence of outcomes (Carnine 1992; Polsgrove 2003).

Across settings in which behavior analysts work, there has been a persistent gap between what is known from research and what is actually implemented in practice. Behavior analysts have been concerned with the failed adoption of research-based practices for years (Baer et al. 1987). Even in the fields in which behavior analysts have produced powerful interventions, the vast majority of current practice fails to take advantage of them.

Behavior analysts have not been alone in recognizing serious problems with the quality of interventions employed in practice settings. In the 1960s, many within the medical field recognized a serious research-to-practice gap. Studies suggested that a relatively small percentage (estimates range from 10 to 25%) of medical treatment decisions were based on high-quality evidence (Goodman 2003). This raised the troubling question of what basis was used for the remaining decisions if it was not high-quality evidence. These concerns led to the development of evidence-based practice (EBP) of medicine (Goodman 2003; Sackett et al. 1996).

The research-to-practice gap appears to be universal across professions. For example, Kazdin (2000) has reported that less than 10% of the child and adolescent mental health treatments reported in the professional literature have been systematically evaluated and found to be effective, and that those that have not been evaluated are more likely to be adopted in practice settings. In recognition of their own research-to-practice gaps, numerous professions have adopted an EBP framework. Nursing and other areas of health care, social work, clinical and educational psychology, speech and language pathology, and many others have adopted this framework and adapted it to the specific needs of their discipline to help guide decision-making. Not only have EBP frameworks been helping to structure professional practice, but they have also been used to guide federal policy. With the passage of No Child Left Behind (2002) and the reauthorization of the Individuals with Disabilities Education Improvement Act (2005), the federal Department of Education has aligned itself with the EBP movement. A recent memorandum from the federal Office of Management and Budget instructed agencies to consider evidence of effectiveness when awarding funds, to increase the use of evidence in competitions, and to encourage widespread program evaluation (Zients 2012). The memo, which used the term evidence-based practice extensively, stated: “Where evidence is strong, we should act on it. Where evidence is suggestive, we should consider it. Where evidence is weak, we should build the knowledge to support better decisions in the future” (Zients 2012, p. 1).

EBP is more broadly an effort to improve decision-making in applied settings by explicitly articulating the central role of evidence in these decisions and thereby improving outcomes. It addresses one of the long-standing challenges for ABA: the need to effectively support and disseminate interventions in the larger social systems in which our work is embedded. In particular, EBP addresses the fact that many decision-makers are not sufficiently influenced by the best evidence that is relevant to important decisions. EBP is an explicit statement of one of ABA’s core tenets—a commitment to evidence-based decision-making. Given that the EBP framework is well established in many disciplines closely related to ABA and in the larger institutional contexts in which we operate (e.g., federal policy and funding agencies), aligning ABA with EBP offers an opportunity for behavior analysts to work more effectively within broader social systems.

Discussion of issues related to EBP in ABA has taken place across several years. Researchers have extensively discussed methods for identifying well-supported treatments (e.g., Horner et al. 2005a; Kratochwill et al. 2010), and systematically reviewed the evidence to identify these treatments (e.g., Maggin et al. 2011; National Autism Center 2009). However, until recently, discussion of an explicit definition of EBP in ABA has been limited to conference papers (e.g., Detrich 2009). Smith (2013) opened a discussion of the definition and critical features of EBP of ABA in the pages of The Behavior Analyst. In his thought-provoking article, Smith raised many important points that deserve serious discussion as the field moves toward a clear vision of EBP of ABA. Most importantly, Smith (2013) argued that behavior analysts must carefully consider how EBP is to be defined and understood by researchers and practitioners of behavior analysis.

Definitions Matter

We find much to agree with in Smith’s paper, and we will describe these points of agreement below. However, we have a core disagreement with Smith concerning the vision of what EBP is and how it might enhance and expand the effective practice of ABA. As behavior analysts know, definitions matter. A well-conceived definition can promote conceptual understanding and set the context for effective action. Conversely, a poor definition or confusion about definitions hinders clear understanding, communication, and action.

In providing a basis for his definition of EBP, Smith refers to definitions in professions that have well-developed conceptions of EBP. He quotes the American Psychological Association (APA) (2005) definition (which we quote here more extensively than he did):

Evidence-based practice in psychology (EBPP) is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences. This definition of EBPP closely parallels the definition of evidence-based practice adopted by the Institute of Medicine (2001, p. 147) as adapted from Sackett et al. (2000): “Evidence-based practice is the integration of best research evidence with clinical expertise and patient values.” The purpose of EBPP is to promote effective psychological practice and enhance public health by applying empirically supported principles of psychological assessment, case formulation, therapeutic relationship, and intervention.

The key to understanding this definition is to note how APA and the Institute of Medicine use the word practice. Clearly, practice does not refer to an intervention; instead, it references one’s professional behavior. This is the sense in which one might speak of the professional practice of behavior analysis. The American Psychological Association Presidential Task Force on Evidence-Based Practice (2006) further elaborates this point:

It is important to clarify the relation between EBPP and empirically supported treatments (ESTs)…. ESTs are specific psychological treatments that have been shown to be efficacious in controlled clinical trials, whereas EBPP encompasses a broader range of clinical activities (e.g., psychological assessment, case formulation, therapy relationships). As such, EBPP articulates a decision-making process for integrating multiple streams of research evidence—including but not limited to RCTs—into the intervention process. (p. 273)

In contrast, Smith defined EBP not as a decision-making process but as a set of interventions that have been shown to be efficacious through rigorous research. He stated:

An evidence-based practice is a service that helps solve a consumer’s problem. Thus it is likely to be an integrated package of procedures, operationalized in a manual, and validated in studies of socially meaningful outcomes, usually with group designs. (p. 27).

Smith’s EBP is what APA has clearly labeled an empirically supported treatment. This is a common misconception found in conversation and in published articles (e.g., Cook and Cook 2013) but at odds with formal definitions provided by many professional organizations, definitions that result from extensive consideration and debate by representative leaders of each professional field (e.g., APA 2005; American Occupational Therapy Association 2008; American Speech-Language Hearing Association 2005; Institute of Medicine 2001).

Before entering into the discussion of a useful definition of EBP of ABA, we should clarify the functions that we believe a useful definition of EBP should perform. First, a useful definition should align with the philosophical tenets of ABA, support the most effective current practice of ABA, and contribute to further improvement of ABA practice. A definition that is in conflict with the foundations of ABA or detracts from effective practice clearly would be counterproductive. Second, a useful definition of EBP of ABA should enhance social support for ABA practice by describing its empirical basis and decision-making processes in a way that is understandable to professions that already have well-established definitions of EBP. A definition that corresponds with the fundamental components of EBP in other fields would promote ABA practice by improving communication with external audiences. This improved communication is critical in the interdisciplinary contexts in which behavior analysts often practice and for legitimacy among those familiar with EBP who often control local contingencies (e.g., policy makers and funding agencies).

Based on these functions, we propose the following definition: Evidence-based practice of applied behavior analysis is a decision-making process that integrates (a) the best available evidence with (b) clinical expertise and (c) client values and context. This definition positions EBP as a pervasive feature of all professional decision-making by a behavior analyst with respect to client services; it is not limited to a narrowly restricted set of situations or decisions. The definition asserts that the best available evidence should be a primary influence on all decision-making related to services for clients (e.g., intervention selection, progress monitoring, etc.). It also recognizes that evidence cannot be the sole basis for a decision; effective decision-making in a discipline as complex as ABA requires clinical expertise in identifying, defining, and analyzing problems, determining what evidence is relevant, and deciding how it should be applied. In the absence of this decision-making framework, practitioners of ABA would be conceptualized as behavioral technicians rather than analysts. Further, the definition of EBP of ABA includes client values and context. Decision-making is necessarily based on a set of values that determine the goals that are to be pursued and the means that are appropriate to achieve them. Context is included in recognition of the fact that the effectiveness of an intervention is highly dependent upon the context in which it is implemented. The definition asserts that effective decision-making must be informed by important contextual factors. We elaborate on each component of the definition below, but first we contrast our definition with that offered by Smith (2013).

Although Smith (2013) made brief reference to the other critical components of EBP, he framed EBP as a list of multicomponent interventions that can claim a sufficient level of research support. We agree with his argument that such lists are valuable resources for practitioners and therefore developing them should be a goal of researchers. However, such lists are not, by themselves, a powerful means of improving the effectiveness of behavior analytic practice. The vast majority of decisions faced in the practice of behavior analysis cannot be made by implementing the kind of manualized, multicomponent treatment packages described by Smith.

There are a number of reasons a list of interventions is not an adequate basis for EBP of ABA. First, there are few interventions that qualify as “practices” under Smith’s definition. For example, when discussing the importance of manuals for operationalizing treatments, Smith stated that the requirement that a “practice” be based on a manual “sharply reduces the number of ABA approaches that can be regarded as evidence based. Of the 11 interventions for ASD identified in the NAC (2009) report, only the three that have been standardized in manuals might be considered to be practices, and even these may be incomplete” (p. 18). Thus, although the example referenced the autism treatment literature, it seems apparent that even a loose interpretation of this particular criterion would leave all practitioners with a highly restricted number of intervention options.

Second, even if more “practices” were developed and validated, many consumers cannot be well served with existing multicomponent packages. In order to meet their clients’ needs, behavior analysts must be able to selectively implement focused interventions alone or in combination. This flexibility is necessary to meet the diverse needs of their clients and to minimize the response demands on direct care providers or staff, who are less likely to implement a complicated intervention with fidelity (Riley-Tillman and Chafouleas 2003).

Third, the strategy of assembling a list of treatments and describing these as “practices” severely limits the ways in which research findings are used by practitioners. With the list approach to defining EBP, research only impacts practice by placing an intervention on a list when a specific criterion has been met. Thus, any research on an intervention that is not sufficiently broad or manualized to qualify as a “practice” has no influence on EBP. Similarly, a research study that shows clear results but is not part of a sufficient body of support for an intervention would also have no influence. A study that provides suggestive results but is not methodologically strong enough to be definitive would have no influence, even if it were the only study that is relevant to a given problem.

The primary problem with a list approach is that it does not provide a strong framework that directs practitioners to include the best available evidence in all of their professional decision-making. Too often, practitioners who consult such lists find that no interventions relevant to their specific case have been validated as “evidence-based” and therefore EBP is irrelevant. In contrast, definitions of EBP as a decision-making process can provide a robust framework for including research evidence along with clinical expertise and client values and context in the practice of behavior analysis. In the next sections, we explore the components of this definition in more detail.

Best Available Evidence

The term “best available evidence” occupies a critical and central place in the definition and concept of EBP; this aligns with the fundamental reliance on scientific research that is one of the core tenets of ABA. The Behavior Analyst Certification Board (2010) Guidelines for Responsible Conduct for Behavior Analysts repeatedly affirm ways in which behavior analysts should base their professional conduct on the best available evidence. For example:

Behavior analysts rely on scientifically and professionally derived knowledge when making scientific or professional judgments in human service provision, or when engaging in scholarly or professional endeavors.

  • The behavior analyst always has the responsibility to recommend scientifically supported most effective treatment procedures. Effective treatment procedures have been validated as having both long-term and short-term benefits to clients and society.
  • Clients have a right to effective treatment (i.e., based on the research literature and adapted to the individual client).

A Continuum of Evidence Quality

The term best implies that evidence can be of varying quality, and that better quality evidence is preferred over lower quality evidence. Quality of evidence for informing a specific practical question involves two dimensions: (a) relevance of the evidence and (b) certainty of the evidence.

The dimension of relevance recognizes that some evidence is more germane to a particular decision than is other evidence. This idea is similar to the concept of external validity. External validity refers to the degree to which research results apply to a range of applied situations whereas relevance refers to the degree to which research results apply to a specific applied situation. In general, evidence is more relevant when it matches the particular situation in terms of (a) important characteristics of the clients, (b) specific treatments or interventions under consideration, (c) outcomes or target behaviors including their functions, and (d) contextual variables such as the physical and social environment, staff skills, and the capacity of the organization. Unless all conditions match perfectly, behavior analysts are necessarily required to use their expertise to determine the applicability of the scientific evidence to each unique clinical situation. Evidence based on functionally similar situations is preferred over evidence based on situations that share fewer important characteristics with the specific practice situation. However, functional similarity between a study or set of studies and a particular applied problem is not always obvious.

The dimension of certainty of evidence recognizes that some evidence provides stronger support for claims that a particular intervention produced a specific result. Any instance of evidence can be evaluated for its methodological rigor or internal validity (i.e., the degree to which it provides strong support for the claim of effectiveness and rules out alternative explanations). Anecdotes are clearly weaker than more systematic observations, and well-controlled experiments provide the strongest evidence. Methodological rigor extends to the quality of the dependent measure, treatment fidelity, and other variables of interest (e.g., maintenance of skill acquisition), all of which influence the certainty of evidence. But the internal validity of any particular study is not the only variable influencing the certainty of evidence; the quantity of evidence supporting a claim is also critical to its certainty. Both systematic and direct replication are vital for strengthening claims of effectiveness (Johnston and Pennypacker 1993; Sidman 1960). Certainty of evidence is based on both the rigor of each bit of evidence and the degree to which the findings have been consistently replicated. Although these issues are simple in principle, operationalizing and measuring rigor of research is extremely complex. Numerous quality appraisal systems for both group and single-subject research have been proposed and used in systematic reviews (see below for more detail).

Under ideal circumstances, consistently high-quality evidence that closely matches the specifics of the practice situation is available; unfortunately, this is not always the case, and evidence-based practitioners of ABA must proceed despite an imperfect evidence base. The mandate to use the best available evidence specifies that the practitioner make decisions based on the best evidence that is available. Although this statement may seem rather obvious, the point is worth underscoring because the implications are highly relevant to behavior analysts. In an area with considerable high-quality relevant research, the standards for evidence should be quite high. But in an area with more limited research, the practitioner should take advantage of the best evidence that is available. This may require tentative reliance on research that is somewhat weaker or is only indirectly relevant to the specific situation at hand. For example, ideally, evidence-based practitioners of ABA would rely on well-controlled experimental results that have been replicated with the precise population with whom they are working. However, if this kind of evidence is not available, they might have to make decisions based on a single study that involves a similar but not identical population.

This idea of using the best of the available evidence is very different from one of using only extremely high-quality evidence (i.e., empirically supported treatments). If we limit EBP to considering only the highest quality evidence, we leave the practitioner with no guidance in the numerous situations in which high-quality and directly relevant evidence (i.e., precise matching of setting, function, behavior, motivating operations and precise procedures) simply does not exist. This approach would lead to a form of EBP that is irrelevant to the majority of decisions that a behavior analyst must make on a daily basis. Instead, our proposed definition of EBP asserts that the practitioner should be informed by the best evidence that is available.

Expanding Research on Utility of Treatments

Smith (2013) argued that the research methods used by behavior analysts to evaluate these treatments should be expanded to more comprehensively describe the utility of interventions. He suggested that too much ABA research is conducted in settings that do not approximate typical service settings, optimizing experimental control at the expense of external validity. Along this same line of reasoning, he noted that it is important to test the generality of effects across clients and identify variables that predict differential effectiveness. He suggested that systematically reporting results from all research participants (e.g., the intent-to-treat model) and purposively selecting participants would provide a more complete account of the situations in which treatments are successful and those in which they are unsuccessful. Smith argued that researchers should include more distal and socially important outcomes because with a narrow target “behavior may change, but remain a problem for the individual or may be only a small component of a much larger cluster of problems such as addiction or delinquency.” He pointed out that in order to best support effective practice, research must demonstrate that an intervention produces or contributes to producing the socially important outcomes that would cause a consumer to say that the problem is solved.

Further, Smith argued that many of the questions most relevant to EBP—questions about the likely outcomes of a treatment when applied in a particular type of situation—are well suited to group research designs. He argued that RCTs are likely to be necessary within a program of research because:

most problems pose important actuarial questions (e.g., determining whether an intervention package is more effective than community treatment as usual; deciding whether to invest in one intervention package or another, both, or neither; and determining whether the long-term benefits justify the resources devoted to the intervention)…. A particularly important actuarial issue centers on the identification of the conditions under which the intervention is most likely to be effective. (p. 23)

We agree that selection of research methods should be driven by the kinds of questions being asked and that group research designs are the methods of choice for some types of questions that are central to EBP. Therefore, we support Smith’s call for increased use of group research designs within ABA. If practice decisions are to be informed by the best available evidence, we must take advantage of both group and single-subject designs. However, we disagree with Smith’s statement that EBP should be limited to treatments that are validated “usually with group designs” (Smith, p. 27). Practitioners should be supported by reviews of research that draw from all of the available evidence and provide the best recommendations possible given the state of knowledge on the particular question. In most areas of behavior analytic practice, single-subject research makes up a large portion of the best available evidence. The Institute of Education Sciences (IES) has recognized the contribution single case designs can make toward identifying effective practices and has recently established standards for evaluating the quality of single case design studies (Institute of Education Sciences, n.d.; Kratochwill et al. 2013).

Classes of Evidence

Identifying the best available evidence to inform specific practice decisions is extremely complex, and no single currently available source of evidence can adequately inform all aspects of practice. Therefore, we outline a number of strategies for identifying and summarizing evidence in ways that can support the EBP of ABA. We do not intend to cover all sources of evidence comprehensively, but merely outline some of the options available to behavior analysts.

Empirically Supported Treatment Reviews

Empirically supported treatments (ESTs) are identified through a particular form of systematic literature review. Systematic reviews bring a rigorous methodology to the process of reviewing research. The development and use of these methods are, in part, a response to the recognition that the process of reviewing the literature is subject to threats to validity. The systematic review process is characterized by explicitly stated and replicable methods for (a) searching for studies, (b) screening studies for relevance to the review question, (c) appraising the methodological quality of studies, (d) describing outcomes from each study, and (e) determining the degree to which the treatment (or treatments) is supported by the research. When the evidence in support of a treatment is plentiful and of high quality, the treatment generally earns the status of an EST. Many systematic reviews, however, find that no intervention for a particular problem has sufficient evidence to qualify as an EST.

Well-known organizations in medicine (e.g., Cochrane Collaboration), education (e.g., What Works Clearinghouse), and mental health (e.g., National Registry of Evidence-based Programs and Practices) conduct EST reviews. Until recently, systematic reviews have focused nearly exclusively on group research; however, systematic reviews of single-subject research are quickly becoming more common and more sophisticated (e.g., Carr 2009; NAC 2009; Maggin et al. 2012).

Systematic review for EST status is one important way to summarize the best available evidence because it can give a relatively objective evaluation of the strength of the research literature supporting a particular intervention. But systematic reviews are not infallible; as with all other research and evaluation methods, they require skillful application and are subject to threats to validity. The results of reviews can change dramatically based on seemingly minor changes in operational definitions and procedures for locating articles, screening for relevance, describing treatments, appraising methodological quality, describing outcomes, summarizing outcomes for the body of research as a whole, and rating the degree to which an intervention is sufficiently supported (Slocum et al. 2012a; Wilczynski 2012). Systematic reviews and claims based upon them must be examined critically with full recognition of their limitations, just as one examines primary research reports.

Behavior analysts encounter many situations in which no ESTs have been established for the particular combination of client characteristics, target behaviors, functions, contexts, and other parameters for decision-making. This dearth may exist because no systematic review has addressed the particular problem or because a systematic review has been conducted but failed to find any well-supported treatments for the particular problem. For example, in a recent review of all of the recommendations in the empirically supported practice guides published by the IES, 45% of the recommendations had minimal support (Slocum et al. 2012b). As Smith (2013) noted, only 3 of the 11 interventions that the NAC identified as meeting quality standards might be considered practices in the sense that they are manualized. In these common situations, a behavior analyst cannot respond by simply selecting an intervention from a list of ESTs. A comprehensive EBP of ABA requires additional strategies for reviewing research evidence and drawing practice recommendations from existing evidence—strategies that can glean the best available evidence from an imperfect research base and formulate practice recommendations that are most likely to lead to favorable outcomes under conditions of uncertainty.

Other Methods for Reviewing Research Literature

The three strategies outlined below may complement systematic reviews in guiding behavior analysts toward effective decision-making.

Narrative Reviews of the Literature

There has been a long tradition across disciplines of relying on narrative reviews to summarize what is known with respect to treatments for a class of problems (e.g., aggression) or what is known about a particular treatment (e.g., token economy). The author of the review, presumably an expert, selects the theme and synthesizes the research literature that he or she considers most relevant. Narrative reviews allow the author to consider a wide range of research including studies that are indirectly relevant (e.g., those studying a given problem with a different population or demonstrating general principles) and studies that may not qualify for systematic reviews because of methodological limitations but which illustrate important points nonetheless. Narrative reviews can consider a broader array of evidence and have greater interpretive flexibility than most systematic reviews.

As with all sources of evidence, there are difficulties with narrative reviews. The selection of the literature is left up to the author’s discretion; there are no methodological guidelines and little transparency about how the author decided which literature to include and which to exclude. There is always the risk of confirmation bias: the author may have emphasized literature that is consistent with her preconceived opinions. Even with a peer-review process, it is always possible that the author neglected or misinterpreted research relevant to the discussion. These concerns notwithstanding, narrative reviews may provide the best available evidence when no systematic reviews exist or when substantial generalizations from the systematic review to the practice context are needed. Many textbooks (e.g., Cooper et al. 2007) and handbooks (e.g., Fisher et al. 2011; Madden et al. 2013) provide excellent examples of narrative reviews that can provide important guidance for evidence-based practitioners of ABA.

Best Practice Guides

Best practice guides are another source of evidence that can inform decisions in the absence of available and relevant systematic reviews. Best practice guides provide recommendations that reflect the collective wisdom of an expert panel. It is presumed that the recommendations reflect what is known from the research literature, but their validity derives largely from the panel's expertise rather than from the rigor of its methodology. Recommendations from best practice panels are usually much broader than those from systematic reviews. They can provide important information about how to implement a treatment, how to adapt it for specific circumstances, and what is necessary for broad-scale or system-wide implementation.

The limitations of best practice guides are similar to those of narrative reviews; specifically, potential bias and lack of transparency are significant concerns. Panel members are typically not selected using a specific set of operationalized criteria. Bias is possible if the panel is drawn too narrowly. If the panel is drawn too broadly, however, it may have difficulty reaching consensus (Wilczynski 2012).

Empirically Supported Practice Guides

Empirically supported practice guides, a more recently developed strategy, integrate the strengths of systematic reviews and best practice panels. In this type of review, an expert panel is charged with developing recommendations on a topic. As part of the process, a systematic review of the literature is conducted. Following the systematic review, the panel generates a set of recommendations, objectively determines the strength of evidence for each recommendation, and assigns an evidence rating. When there is little empirical evidence directly related to a specific issue, the panel's recommendations may have weak research support but may nonetheless be based on the best evidence that is available. The obvious advantage of empirically supported practice guides is the greater transparency about the review process and the certainty of the recommendations. Practice recommendations are usually broader than those derived from systematic reviews and address issues related to implementation and acceptable variations that enhance a treatment's contextual fit (Shanahan et al. 2010; Slocum et al. 2012b). Although empirically supported practice guides offer the objectivity of a systematic review and the flexibility of best practice guidelines, they also face potential sources of error from both methods. Systematic and explicit criteria are used to review the research and rate the level of evidence for each recommendation; however, it is the panel that formulates the recommendations. Thus, the results of these reviews are influenced by the selection of panel members. When research evidence is incomplete or equivocal, panelists must exercise judgment in interpreting the evidence and drawing conclusions (Shanahan et al. 2010).

Other Units of Analysis

Smith (2013) weighed in on the critical issue of the unit of analysis when describing and evaluating treatments (Slocum and Wilczynski 2008). The unit of analysis refers to whether EBP should focus on (a) principles, such as reinforcement; (b) tactics, such as backward chaining; (c) multicomponent packages, such as Functional Communication Training; or (d) even more comprehensive systems, such as Early Intensive Behavioral Intervention. After reviewing the ongoing debate between those favoring a smaller unit of analysis that focuses on specific procedures and those favoring a larger unit of analysis that evaluates the effects of multicomponent packages, Smith made a case that the multicomponent treatment package is the key unit in EBP. Smith noted that practitioners rarely solve a client’s problem with a single procedure; instead, solutions typically involve combinations of procedures. He argued that the unit should be “a service aimed at solving people’s problems” and that procedures that are merely components of such services are not sufficiently complete to be the proper unit of analysis for EBP. He further stated that these treatment packages should include strategies for implementation in typical service settings and an intervention manual.

We concur that the multicomponent treatment package is a particularly significant and strategic unit of treatment because it specifies a suite of procedures and exactly how they are to be used together to solve a problem. Validated treatment packages are far more than the sum of their parts. A well-developed treatment package can be revised and optimized over many iterations in a way that would be difficult or impossible for a practitioner to accomplish independently. In addition, research outcomes from implementation of treatment packages reflect the interaction of the components, and these interactions may not be evident in the research literature on the individual components. Further, research on the outcomes from multicomponent packages can evaluate broader and more socially important outcomes than is generally possible when evaluating more narrowly defined treatments. For example, in the case of teaching a child with autism to communicate, research on a focused procedure such as time delay may indicate that its use leads to more independent communicative responses; however, research on a comprehensive Early Intensive Behavioral Intervention can evaluate the impact of the program on children’s global development or intellectual functioning.

Having recognized our agreement with Smith (2013) on the special importance of multicomponent treatment packages for EBP, we hasten to add that this type of intervention is not enough to support a broad and robust EBP of ABA. EBP must also provide guidance to the practitioner in the frequently encountered situations in which well-established treatment packages are not available. In these situations, problems may be best addressed by building an intervention from a set of elemental components. These components, referred to as practice elements (Chorpita et al. 2005, 2007) or kernels (Embry 2004; Embry and Biglan 2008), may be validated either directly or indirectly. The practitioner assembles a particular combination of components to solve a specific problem. Because this newly constructed package has not been evaluated as a whole, there is additional uncertainty about its effectiveness, and the quality of evidence may be considered lower than that for a well-supported treatment package (Slocum et al. 2012b; Smith 2013; however, see Chorpita (2003) for a differing view). Nonetheless, treatment components that are supported by strong evidence provide the practitioner with tools to solve practical problems when EST packages are not relevant.

In some cases, behavior analysts are presented with problems that cannot be addressed even by assembling established components. In these cases, the ABA practitioner must apply principles of behavior to construct an intervention and must rely on these principles to guide sensible modifications of interventions in response to client needs and to support their sound implementation. Principles of behavior are broadly generalized statements describing behavioral relations. Their empirical base is extremely large and diverse, including both human and nonhuman participants across numerous contexts, behaviors, and consequences. Although principles of behavior are based on an extremely broad research literature, they are also stated at a broad level. As a result, the behavior analyst must use a great deal of judgment in applying principles to particular problems, and a particular attempt to apply a principle to solve a problem may not be successful. Thus, although behavioral principles are supported by evidence, newly constructed interventions based on these principles have not yet been evaluated. These interventions must be considered less certain or validated than treatment packages or elements that have been demonstrated to be effective for specific problems, populations, and contexts (Slocum et al. 2012b).

Evidence-based practitioners of ABA recognize that the process of selecting and implementing treatments always includes some level of uncertainty (Detrich et al. 2013). One of the fundamental tenets of ABA, shared with many other professions, is that the best evidence regarding the effectiveness of an intervention does not come from systematic literature reviews, best practice guides, or principles of behavior, but from close, continual contact with the relevant outcomes (Bushell and Baer 1994). The BACB guidelines (2010) state that “behavior analysts recognize limits to the certainty with which judgments or predictions can be made about individuals” (item 3.0 [c]). As a result, “the behavior analyst collects data…needed to assess progress within the program” (item 4.07) and “modifies the program on the basis of data” (item 4.08). Thus, an important feature of the EBP of ABA is that professional decision-making does not end with the selection of an initial intervention. The process continues with ongoing progress monitoring and adjustments to the treatment plan as needed to achieve the targeted outcomes. Progress monitoring and data-based decision-making are the ultimate hedge against the inherent uncertainties of imperfect knowledge derived from research. As the quality of the best available evidence decreases, the importance of frequent, direct measurement of client progress increases.

Practice decisions are always accompanied by some degree of uncertainty; however, better decisions are likely when multiple sources of evidence are integrated. For example, a multicomponent treatment package may be an EST for clients who differ slightly from those the practitioner currently serves. Confidence in the use of this treatment may be increased if there is evidence showing that the central components are effective with clients belonging to the population of interest. The principles of behavior might further inform sensible variations appropriate for the specific context of practice. Considered together, these sources of evidence increase the confidence the behavior analyst can have in the intervention. And when the plan is implemented, progress monitoring may reveal the need for additional adjustments. Each of these classes of evidence answers different questions for the practitioner, resulting in a more fine-grained analysis of the clinical problem and its solutions (Detrich et al. 2013).

Client Values and Context

To be compatible with the underlying tenets of ABA, parallel with other professions, and conducive to effective practice, a definition of EBP of ABA must include client values and context among the primary contributors to professional decision-making. Baer et al. (1968) suggested that the word applied refers to an immediate and important change in behavior that has practical value and that this value is determined “by the interest which society shows in the problems” (p. 92)—that is, by social values. Wolf (1978) went on to specify that behavior analytic practice can be termed successful only if it addresses goals that are meaningful to our clients, uses procedures that our clients judge appropriate, and produces effects that our clients value. These foundational tenets of ABA correspond with the centrality of client values in classic definitions of EBP (e.g., Institute of Medicine 2001). Like medical professionals and those in the many other fields that have adopted similar conceptualizations of EBP, behavior analysts have long recognized that client values are critical contributors to responsible decision-making.

Behavior analysts have defined the client to include the individual who is the focus of behavior change, other individuals who are critical to the behavior change process (Baer et al. 1968; Heward et al. 2005), and outside individuals or groups who may have a stake in the target behavior or improved outcomes (Baer et al. 1987; Wolf 1978). Wolf (1978) argued that only our clients can judge the social validity of our work and suggested that behavior analysts address three levels of social validity: (a) the social significance of the goals, (b) the social desirability of the procedures, and (c) the social importance of the outcomes. With respect to the selection of interventions, Wolf noted, “not only is it important to determine the acceptability of treatment procedures to participants for ethical reasons, it may also be that the acceptability of the program is related to effectiveness, as well as to the likelihood that the program will be adopted and supported by others” (p. 210). He further maintained that clients are the ultimate arbiters of whether or not the effects of a program are sufficiently helpful to be termed successful.

The concept of social validity directs our attention to some important aspects of the context of intervention. Intervention always occurs in some context, and features of that context can directly influence the fidelity with which the intervention is implemented and its effectiveness. Albin et al. (1996) expanded further on the contextual variables that may be critical for designing and implementing effective interventions. They described the concept of contextual fit, the congruence between a behavioral support plan and its context, and indicated that this fit determines the plan's implementation, effectiveness, and maintenance.

Contextual fit includes the issues of social validity but also explicitly encompasses issues associated with the individuals who implement treatments and manage other aspects of the environments within which treatments are implemented. Behavioral intervention plans prescribe the behavior of implementers. These implementers may include professionals, such as therapists and teachers, as well as nonprofessionals, such as family and community members. It is important to consider the characteristics of these implementers when developing plans because the success of a plan may hinge on how well it corresponds with the values, skills, goals, and stressors of the implementers. Effective plans must be within the skill repertoire of the implementers, or training to fidelity must occur to introduce the plan components into that repertoire. Values, goals, and stressors refer to motivating operations that determine the reinforcing or punishing value of implementing the plan. Plans whose implementation or outcomes provide little reinforcement and substantial punishment are unlikely to be implemented with fidelity or maintained over time. The effectiveness of behavioral interventions is also influenced by their compatibility with other aspects of their context. Plans that are compatible with ongoing routines are more likely to be implemented than those that conflict with them (Riley-Tillman and Chafouleas 2003). Interventions also require various kinds of resources to be implemented and sustained. For example, financial resources may be necessary to purchase curricula, equipment, or other goods, and human resources may be needed for direct service, training, supervision, administration, and consultation. Fixsen et al. (2005) completed an extensive review of contextual variables that can influence the quality of intervention implementation. Behavior analytic practice is unlikely to be effective if it does not consider the context in which interventions will be implemented.

Extensive behavior analytic research has documented the importance of social validity and other contextual factors in producing behavioral changes with practical value. This research tradition is as old as our field (e.g., Jones and Azrin 1969) and continues through the present day. For example, Strain et al. (2012) provided multiple examples of the impact of social validity considerations on relevant outcomes. They reported that integrating client values, preferences, and characteristics in the selection and implementation of an intervention can successfully inform decisions regarding (a) how to design service delivery systems, (b) how to support implementers with complex strategies, (c) when to fade support, (d) how to identify important and unanticipated effects, and (e) how to focus future research efforts.

Benazzi et al. (2006) examined the effect of stakeholder participation in intervention planning on the acceptability and usability of behavior intervention plans (BIPs) based on descriptive functional behavior assessments. Plans developed by behavior experts were rated as high in technical adequacy but low in acceptability. Conversely, plans developed by key stakeholders were highly acceptable but lacked technical adequacy. When the process included both behavior experts and key stakeholders, however, BIPs were considered both acceptable and technically adequate. Thus, in the absence of key stakeholder input, the BIPs developed by behavior analysts may be marginalized and implementation may be less likely to occur. A practical commitment to effective interventions that are implemented and maintained with integrity over time therefore requires that behavior analysts consider motivational variables such as the alignment of interventions with the values, reinforcers, and punishers of relevant stakeholders.

Clinical Expertise

All of the key components of expert behavior analytic practice (i.e., identifying important behavioral problems, recognizing underlying behavioral processes, weighing the evidence supporting various treatment options, selecting and implementing treatments in complex social contexts, engaging in ongoing data-based decision making, and being responsive to client values and context) require clinical expertise. Clinical expertise refers to the competence attained by practitioners through education, training, and experience that results in effective practice (American Psychological Association Presidential Task Force on Evidence-Based Practice 2006). Clinical expertise is the means by which the best available evidence is applied to individual cases in all their complexity. Based on the work of Goodheart (2006), we suggest that clinical expertise in EBP of ABA includes (a) knowledge of the research literature and its applicability to particular clients, (b) incorporation of the conceptual system of ABA, (c) breadth and depth of clinical and interpersonal skills, (d) integration of client values and context, (e) recognition of the need for outside consultation, (f) data-based decision making, and (g) ongoing professional development. In the sections that follow, we describe each component of clinical expertise in ABA.

Knowledge and Application of the Research Literature

ABA practitioners must be skilled in applying the best available evidence to unique cases in specific contexts. The role of the best available evidence in EBP of ABA was discussed above. Practitioners need to be knowledgeable about the scientific literature and able to appropriately apply the literature to behaviors, clients, and contexts that are rarely a perfect match to the behaviors, clients, and contexts in any particular study. This confluence of knowledge and skillful application requires that the behavior analyst respond to the functionally important features of cases. A great deal of training is necessary to build the expertise required to discriminate critical functional features from those that are incidental. These discriminations must be made with respect to the presenting problem (i.e., the behavioral patterns that have been identified as problematic, their antecedent stimuli, motivating operations, and consequences); client variables such as histories, skills, and preferences; and contextual variables that may impact the effectiveness of various treatment options as applied to the particular case. These skills are reflected in BACB Guidelines 1.01 and 2.10 cited above.

Incorporation of the Conceptual System

The critical features of a case must be identified and mapped onto the conceptual system of ABA. It is not enough to recognize that a particular feature of the environment is important; it must also be understood in terms of its likely behavioral function. This initial conceptualization is necessary in order to generate reasonable hypotheses that may be tested in more thorough analyses. Developing the skill of describing cases in terms of likely behavioral functions typically requires a great deal of formal and informal training as well as ongoing learning from experience. These repertoires are usually acquired through extensive training, supervised practice, and ongoing feedback from client outcomes. This is recognized in the BACB Guidelines; for example, Guideline 4.0 states that “the behavior analyst designs programs that are based on behavior analytic principles” (BACB 2010).

Breadth and Depth of Clinical and Interpersonal Skills

Evidence-based practitioners of behavior analysis must be able to implement various assessment and intervention procedures with fidelity and often to train and supervise others to implement such procedures with fidelity. Further, clinical expertise in ABA requires that the practitioner have effective interpersonal skills. For example, the practitioner must be able to explain the behavioral philosophy and approach, in nonbehavioral terms, to various audiences who may have different theoretical orientations. BACB Guideline 1.05 specifies that behavior analysts “use language that is fully understandable to the recipient of those services” (BACB 2010).

Integration of Client Values and Context

In all aspects of their work, practitioners of evidence-based ABA must integrate the values and preferences of the client and other stakeholders as well as the features of the specific context that may impact the effectiveness of an intervention. These factors can be considered additional variables that the behavior analyst must attend to when planning and providing behavior-analytic services. For example, when assessment data suggest that behavior serves a particular function, a range of intervention alternatives may be considered (see Geiger, Carr, and LeBlanc for an example of a model for selecting treatments for escape-maintained problem behavior). A caregiver’s statements might suggest that one type of intervention is not viable due to limited resources while another treatment is acceptable based on financial considerations, available resources, or other practical factors; the behavior analyst must have the training and expertise to evaluate and incorporate these factors into initial treatment selection and to re-evaluate them as part of progress monitoring for both treatment integrity and client improvement. BACB Guideline 4.0 states that the behavior analyst “involves the client … in the planning of … programs, [and] obtains the consent of the client,” and Guideline 4.1 states that “if environmental conditions hamper implementation of the behavior analytic program, the behavior analyst seeks to eliminate the environmental constraints, or identifies in writing the obstacles to doing so” (BACB 2010).

Recognition of Need for Outside Consultation

Behavior analysts engaging in responsible evidence-based practice discriminate between behaviors and contexts that are within the scope of their training and those that are not, and they respond differently based on this discrimination. For example, a behavior analyst who has been trained to provide assessment and intervention for severe problem behavior may not have the specific training to provide organizational behavior management services to a corporation; in this case, a behavior analyst with clinical expertise would make this discrimination and seek additional consultation or make appropriate referrals. This aspect of expertise is described in BACB (2010) Guidelines 1.02 and 2.02.

Data-Based Decision Making

Data-based decision making plays a central role in the practice of ABA and is an indispensable feature of clinical expertise. The process includes identifying useful measurement pinpoints, constructing measurement systems, and graphing results, as well as identifying meaningful patterns in data, interpreting those patterns, and responding to them appropriately (e.g., maintaining, modifying, replacing, or ending a program). The functional features of the case, the best available research evidence, and the new evidence obtained through progress monitoring must inform these judgments and are central to this model of EBP of ABA. BACB (2010) Guidelines 4.07 and 4.08 specify that behavior analysts collect data to assess progress and modify programs on the basis of those data.
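
To make this process concrete, the following sketch illustrates one way a simple progress-monitoring decision rule might be expressed in code. It is a minimal sketch only, assuming hypothetical session data, an arbitrary aim of 0.5 additional correct responses per session, and a six-session review window; none of these names, values, or thresholds comes from the BACB Guidelines or the sources cited above.

```python
# A minimal sketch of a data-based decision rule, assuming hypothetical
# session data and an illustrative aim; not a BACB-prescribed method.
from statistics import mean

def trend_slope(values):
    """Ordinary least-squares slope of the values against session index."""
    n = len(values)
    xs = list(range(n))
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def review_program(sessions, aim_per_session=0.5, window=6):
    """Compare recent progress against an aim line and suggest a response."""
    slope = trend_slope(sessions[-window:])
    if slope < aim_per_session:
        return f"slope {slope:.2f} < aim {aim_per_session}: consider modifying the program"
    return f"slope {slope:.2f} >= aim {aim_per_session}: maintain the program"

# Hypothetical data: correct responses per session for one client.
correct_responses = [2, 3, 3, 4, 4, 5, 5, 5, 6, 6]
print(review_program(correct_responses))
```

In practice, visual analysis of graphed data and clinical judgment remain central to ABA; a numeric rule such as this could at most supplement, never replace, them.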

Ongoing Professional Development

Clinical expertise is not static; rather, it requires ongoing professional development. Clinical expertise in ABA requires ongoing contact with the research literature to ensure that practice reflects current knowledge about the most effective and efficient assessment and intervention procedures. The critical literature includes primary empirical research as well as reviews and syntheses such as those described in the section on “Best Available Evidence”. In addition, professional consensus on important topics for professional practice evolves over time; in ABA, for example, there has recently been increased emphasis on ethics and competence in supervision. All of these dynamics point to the need for ongoing professional development. This need is reflected in the requirement that certified behavior analysts “undertake ongoing efforts to maintain competence in the skills they use by reading the appropriate literature, attending conferences and conventions, participating in workshops, and/or obtaining Behavior Analyst Certification Board certification” (Guideline 1.03, BACB 2010).

Conclusions

We propose that EBP of ABA be understood as a professional decision-making framework that draws on the best available evidence, client values and context, and clinical expertise. We argue that this conception of EBP of ABA is more compatible with the basic tenets of ABA and more closely aligned with definitions of EBP in other fields than that provided by Smith (2013). It is noteworthy that this notion of EBP is not necessarily in conflict with many of the observations and arguments put forth by Smith (2013). His concerns were primarily about how to define and validate ESTs, which are an important means of informing practitioners about the best available evidence to integrate into their overall EBP.

Given the close alignment between the proposed framework of EBP of ABA and broadly accepted descriptions of behavior analytic practice, one might wonder whether EBP offers anything new. We believe that the EBP of ABA framework offered here has several important implications for our field. First, it draws together numerous elements of ABA practice into a single coherent system, which can help behavior analysts provide an explicit rationale for their decision-making to clients and other stakeholders. The EBP of ABA provides a decision-making framework that supports a cogent and transparent description of (a) the evidence considered, including direct and frequent measurement of the client’s behavior; (b) why this evidence was identified as the “best available” for the particular case; (c) how client values and contextual factors influenced the process; and (d) the ways in which clinical expertise was used to conceptualize the case and integrate the various considerations. This transparency and explicitness allow the behavior analyst to offer empirically based treatment recommendations while addressing the concerns raised by stakeholders. The framework also highlights the critical analysis required to be an effective behavior analyst. For example, if an EST is available and appropriate, the behavior analyst can describe the relevance and certainty of the evidence for this intervention. If no relevant EST is available, the behavior analyst can describe how the best available evidence supports the intervention and emphasize the importance of progress monitoring.

Second, the EBP framework prompts the behavior analyst to refer to the important client values that underlie both the goals and the specific methods of intervention and to describe how the intervention is supported by features of the context. This requires the behavior analyst to explicitly recognize that the effectiveness of an intervention is always context dependent. By serving as a prompt, the EBP framework should increase behavior analysts’ adherence to this central tenet of ABA.

Third, by explicitly recognizing the role of clinical expertise, the framework gives the behavior analyst a way to talk about the complex skills required to make appropriate decisions about client needs. In addition, because the proposed definition of EBP of ABA is so closely aligned with definitions in other professions such as medicine and psychology, it provides a common framework and language for communicating about a particular case and can thereby enhance collaboration between behavior analysts and other professionals.

Fourth, this framework for EBP of ABA suggests directions for the further development of behavior analysis itself. Examination of the meaning of best available evidence encourages behavior analysts to continue to refine methods for systematically reviewing research literature and identifying ESTs. Further, behavior analysts could better support EBP by developing methods for validating other units of intervention such as practice elements, kernels, and even the principles of behavior; when these are invoked to support interventions, they must be backed by a clearly specified research base.

Finally, the explicit recognition of the role of clinical expertise in the EBP of ABA has important implications for training behavior analysts. This framework suggests that decision-making is at the heart of EBP of ABA and could be an organizing theme for ABA training programs. Training programs could systematically teach students to articulate the chain of logic that is the basis for their treatment recommendations. The chain of logic would include statements about which research was considered and why, how the client’s values influenced decision-making, and how contextual factors influenced the selection and adaptation (if necessary) of the treatment. This type of training could be embedded in all instructional activities. Formally requiring students to articulate a rationale for their decisions and to receive feedback about those decisions would sharpen their clinical expertise.

In addition to influencing our behavior analytic practice, the EBP of ABA framework shapes our relationship with other members of the broader human service field as well as with individuals and agencies that control contingencies relevant to practitioners and scientists. Methodologically rigorous reviews that identify ESTs and other treatments supported by the best available evidence are extremely important for working with organizations that control funding for behavior analytic research and practice. Federal funding for research and service provision is moving strongly toward EBP and ESTs. This trend is clear in education through the No Child Left Behind Act of 2001, the Individuals with Disabilities Education Act of 2004, the funding policies of IES, and the What Works Clearinghouse. The recent memorandum by the Director of the Office of Management and Budget (Zients 2012) makes it clear that the importance of EBP is not limited to a single discipline or to one political party. In addition, insurance companies are increasingly making reimbursement decisions based, in part, on whether credible scientific evidence supports the use of a treatment (Small 2004), and they have consistently adopted criteria for scientific evidence that are closely related to EST criteria (Bogduk and Fraifeld 2010). As a result, reimbursement for ABA services may depend on the scientific credibility of EST reviews, a critical component of EBP. Methodologically rigorous reviews that identify ESTs within a broader framework of EBP appear to be critical if ABA is to maintain and expand its access to federal funding and insurance reimbursement for services. Establishing this literature base will require behavior analysts to develop appropriate methods for reviewing and summarizing research based on single-subject designs. IES has established standards for reviewing such studies, but to date there are no accepted methods for calculating a measure of effect size as an objective basis for combining results across studies (Kratochwill et al. 2013). If behavior analysts developed such a measure, it would represent a significant methodological advance for the field and would increase the credibility of behavior analytic research with agencies that fund research and services.
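
To make the effect-size issue concrete, the sketch below computes nonoverlap of all pairs (NAP), one of several proposed, but, as noted above, not yet generally accepted, nonoverlap metrics for single-subject data: the proportion of baseline-treatment data pairs in which the treatment observation shows improvement. The AB phase data are hypothetical, and the sketch is illustrative rather than a recommended standard.

```python
# A minimal sketch of one proposed single-case effect-size metric:
# nonoverlap of all pairs (NAP). The AB phase data below are hypothetical.
from itertools import product

def nap(baseline, treatment):
    """Share of (baseline, treatment) pairs in which the treatment
    observation is larger (improvement), with ties counted as half."""
    pairs = list(product(baseline, treatment))
    score = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
    return score / len(pairs)

baseline_phase = [1, 2, 2, 1, 3]      # responses per session before treatment
treatment_phase = [4, 5, 4, 6, 5, 6]  # responses per session during treatment
print(f"NAP = {nap(baseline_phase, treatment_phase):.2f}")  # 1.00: complete nonoverlap
```

A single number of this kind discards the within-phase patterning that visual analysis captures, which is one reason no such metric has yet won general acceptance.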

EBP of ABA emphasizes the research-supported selection of treatments and data-driven decisions about treatment progress that have always been at the core of ABA. ABA’s long-standing recognition of the importance of social validity is reflected in the definition of EBP. This framework for EBP of ABA offers many positive professional consequences for scientists and practitioners while promoting the best of the behavior analytic tradition and making contact with developments in other disciplines and the larger context in which behavior analysts work.

  • Albin RW, Lucyshyn JM, Horner RH, Flannery KB. Contextual fit for behavior support plans. In: Koegel LK, Koegel RL, Dunlap G, editors. Positive behavioral support: Including people with difficult behaviors in the community. Baltimore: Brookes; 1996. pp. 81–92.
  • American Occupational Therapy Association. Occupational therapy practice framework: domain and process (2nd ed.). American Journal of Occupational Therapy. 2008;62:625–683. doi:10.5014/ajot.62.6.625.
  • American Psychological Association (2005). Policy statement on evidence-based practice in psychology. http://www.apa.org/practice/resources/evidence/evidence-based-statement.pdf.
  • American Psychological Association Presidential Task Force on Evidence-Based Practice. Evidence-based practice in psychology. American Psychologist. 2006;61:271–285. doi:10.1037/0003-066X.61.4.271.
  • American Speech-Language-Hearing Association (2005). Evidence-based practice in communication disorders [position statement]. www.asha.org/policy.
  • Baer DM, Wolf MM, Risley TR. Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis. 1968;1:91–97. doi:10.1901/jaba.1968.1-91.
  • Baer DM, Wolf MM, Risley TR. Some still-current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis. 1987;20:313–327. doi:10.1901/jaba.1987.20-313.
  • Behavior Analyst Certification Board (2010). Guidelines for responsible conduct for behavior analysts. http://www.bacb.com/index.php?page=57.
  • Benazzi L, Horner RH, Good RH. Effects of behavior support team composition on the technical adequacy and contextual-fit of behavior support plans. The Journal of Special Education. 2006;40(3):160–170. doi:10.1177/00224669060400030401.
  • Bogduk N, Fraifeld EM. Proof or consequences: who shall pay for the evidence in pain medicine? Pain Medicine. 2010;11(1):1–2. doi:10.1111/j.1526-4637.2009.00770.x.
  • Bushell D Jr, Baer DM. Measurably superior instruction means close, continual contact with the relevant outcome data. Revolutionary! In: Gardner R III, Sainato DM, Cooper JO, Heron TE, Heward WL, Eshleman J, Grossi TA, editors. Behavior analysis in education: Focus on measurably superior instruction. Pacific Grove: Brooks; 1994. pp. 3–10.
  • Carnine D. Expanding the notion of teachers’ rights: access to tools that work. Journal of Applied Behavior Analysis. 1992;25(1):13–19. doi:10.1901/jaba.1992.25-13.
  • Carr JE, Severtson JM, Lepper TL. Noncontingent reinforcement is an empirically supported treatment for problem behavior exhibited by individuals with developmental disabilities. Research in Developmental Disabilities. 2009;30:44–57. doi:10.1016/j.ridd.2008.03.002.
  • Chorpita BF. The frontier of evidence-based practice. In: Kazdin AE, Weisz JR, editors. Evidence-based psychotherapies for children and adolescents. New York: Oxford; 2003. pp. 42–59.
  • Chorpita BF, Daleiden EL, Weisz JR. Identifying and selecting the common elements of evidence based interventions: a distillation and matching model. Mental Health Services Research. 2005;7:5–20. doi:10.1007/s11020-005-1962-6.
  • Chorpita BF, Becker KD, Daleiden EL. Understanding the common elements of evidence-based practice: misconceptions and clinical examples. Journal of the American Academy of Child and Adolescent Psychiatry. 2007;46:647–652. doi:10.1097/chi.0b013e318033ff71.
  • Cook BG, Cook SC. Unraveling evidence-based practices in special education. Journal of Special Education. 2013;47(2):71–82. doi:10.1177/0022466911420877.
  • Cooper JO, Heron TE, Heward WL. Applied behavior analysis. 2nd ed. Upper Saddle River: Pearson; 2007.
  • Detrich, R. (Chair) (2009). Evidence-based, empirically supported, best practice: What does it all mean? Symposium conducted at the annual meeting of the Association for Behavior Analysis International, Phoenix, AZ.
  • Detrich R, Slocum TA, Spencer TD. Evidence-based education and best available evidence: Decision-making under conditions of uncertainty. In: Cook BG, Tankersley M, Landrum TJ, editors. Advances in learning and behavioral disabilities, 26. Bingley, UK: Emerald; 2013. pp. 21–44.
  • Embry DD. Community-based prevention using simple, low-cost, evidence-based kernels and behavior vaccines. Journal of Community Psychology. 2004;32:575–591. doi:10.1002/jcop.20020.
  • Embry DD, Biglan A. Evidence-based kernels: fundamental units of behavioral influence. Clinical Child and Family Psychology Review. 2008;11:75–113. doi:10.1007/s10567-008-0036-x.
  • Fisher WW, Piazza CC, Roane HS, editors. Handbook of applied behavior analysis. New York: Guilford Press; 2011.
  • Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature (FMHI publication #231). Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
  • Goodheart CD. Evidence, endeavor, and expertise in psychology practice. In: Goodheart CD, Kazdin AE, Sternberg RJ, editors. Evidence-based psychotherapy: Where practice and research meet. Washington, DC: APA; 2006. pp. 37–61.
  • Goodman KW. Ethics and evidence-based education: Fallibility and responsibility in clinical science. New York: Cambridge University Press; 2003.
  • Heward WL, et al., editors. Focus on behavior analysis in education: Achievements, challenges, and opportunities. Upper Saddle River: Prentice Hall; 2005.
  • Horner RH, Carr EG, Halle J, McGee G, Odom S, Wolery M. The use of single-subject research to identify evidence-based practice in special education. Exceptional Children. 2005;71(2):165–179. doi:10.1177/001440290507100203.
  • Horner RH, Sugai G, Todd AW, Lewis-Palmer T. Schoolwide positive behavior support. In: Bambera LM, Kern L, editors. Individualized supports for students with problem behaviors: Designing positive behavior plans. New York: Guilford Press; 2005. pp. 359–390.
  • Individuals with Disabilities Education Improvement Act of 2004, 70 Fed. Reg. (2005).
  • Institute of Education Sciences, U.S. Department of Education (n.d.). What Works Clearinghouse procedures and standards handbook (Version 3.0). Washington, DC. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_procedures_v3_0_standards_handbook.pdf.
  • Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press; 2001.
  • Johnston JM, Pennypacker HS. Strategies and tactics of behavioral research. 2nd ed. Hillsdale: Erlbaum; 1993.
  • Jones RJ, Azrin NH. Behavioral engineering: stuttering as a function of stimulus duration during speech synchronization. Journal of Applied Behavior Analysis. 1969;2:223–229. doi:10.1901/jaba.1969.2-223.
  • Kazdin AE. Psychotherapy for children and adolescents: Directions for research and practice. New York: Oxford University Press; 2000.
  • Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.
  • Kratochwill TR, Hitchcock JH, Horner RH, Levin JR, Odom SL, Rindskopf DM, et al. Single-case intervention research design standards. Remedial & Special Education. 2013;34(1):26–38. doi:10.1177/0741932512452794.
  • Madden GJ, Dube WV, Hackenberg TD, Hanley GP, Lattal KA, editors. American Psychological Association handbook of behavior analysis. Washington, DC: American Psychological Association; 2013.
  • Maggin DM, O’Keeffe BV, Johnson AH. A quantitative synthesis of single-subject meta-analyses in special education, 1985–2009. Exceptionality. 2011;19:109–135. doi:10.1080/09362835.2011.565725.
  • Maggin DM, Johnson AH, Chafouleas SM, Ruberto LM, Berggren M. A systematic evidence review of school-based group contingency interventions for students with challenging behavior. Journal of School Psychology. 2012;50:625–654. doi:10.1016/j.jsp.2012.06.001.
  • McIntosh K, Filter KJ, Bennett JL, Ryan C, Sugai G. Principles of sustainable prevention: designing scale-up of School-wide Positive Behavior Support to promote durable systems. Psychology in the Schools. 2010;47(1):5–21.
  • National Autism Center. National Standards Project: Findings and conclusions. Randolph: National Autism Center; 2009.
  • No Child Left Behind Act of 2001, Pub. L. No. 107-110 (2002).
  • Polsgrove L. Reflections on the past and future. Education and Treatment of Children. 2003;26:337–344.
  • Riley-Tillman TC, Chafouleas SM. Using interventions that exist in the natural environment to increase treatment integrity and social influence in consultation. Journal of Educational & Psychological Consultation. 2003;14(2):139–156. doi:10.1207/s1532768xjepc1402_3.
  • Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. British Medical Journal. 1996;312(7023):71. doi:10.1136/bmj.312.7023.71.
  • Sackett DL, Straus SE, Richardson WS, Rosenberg W, Haynes RB, editors. Evidence-based medicine: How to teach and practice EBM. Edinburgh: Livingstone; 2000.
  • Shanahan, T., Callison, K., Carriere, C., Duke, N. K., Pearson, P. D., Schatschneider, C., et al. (2010). Improving reading comprehension in kindergarten through 3rd grade: A practice guide (NCEE 2010-4038). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. http://ies.ed.gov/ncee/wwc/publications/practiceguides. Accessed 12 Sept 2013.
  • Sidman M. Tactics of scientific research: Evaluating experimental data in psychology. New York: Basic Books; 1960.
  • Slocum, T. A., & Wilczynski, S. (2008). The unit of analysis in evidence-based practice. Invited paper presented at the meeting of the Association for Behavior Analysis International, Chicago, IL.
  • Slocum TA, Detrich R, Spencer TD. Evaluating the validity of systematic reviews to identify empirically supported treatments. Education and Treatment of Children. 2012a;35:201–234. doi:10.1353/etc.2012.0009.
  • Slocum TA, Spencer TD, Detrich R. Best available evidence: three complementary approaches. Education and Treatment of Children. 2012b;35:27–55.
  • Small RH. Maximize the likelihood of reimbursement when appealing managed care medical necessity denials. Getting Paid in Behavioral Healthcare. 2004;9(12):1–3.
  • Smith T. What is evidence-based behavior analysis? The Behavior Analyst. 2013;36:7–33.
  • Strain PS, Barton EE, Dunlap G. Lessons learned about the utility of social validity. Education and Treatment of Children. 2012;35(2):183–200. doi:10.1353/etc.2012.0007.
  • Wilczynski SM. Risk and strategic decision-making in developing evidence-based practice guidelines. Education and Treatment of Children. 2012;35:291–311. doi:10.1353/etc.2012.0012.
  • Wolf M. Social validity: the case for subjective measurement, or how applied behavior analysis is finding its heart. Journal of Applied Behavior Analysis. 1978;11:203–214. doi:10.1901/jaba.1978.11-203.
  • Zients, J. D. (2012). M-12-14. Memorandum to the heads of executive departments. From: Jeffrey D. Zients, Acting Director. Subject: use of evidence and evaluation in the 2014 Budget. www.whitehouse.gov/sites/default/files/omb/…/2012/m-12-14.pdf. Accessed 30 Sept 2012.
