

Evaluating the Effects of Publication Bias in Single-Case Research Design for Evidence-Based Practices in Autism Spectrum Disorder

Name: TETDEDXDowdy-temple-0225E-13314.pdf
Size: 1.906 MB
Format: PDF
Genre: Thesis/Dissertation
Date: 2018
Author: Dowdy, Arthur G.
Advisor: Tincani, Matt
Committee members: Axelrod, Saul; Hantula, Donald A.; Fisher, Amanda Guld
Department: Special Education
Subjects: Psychology, Behavioral; Behavioral Sciences; Autism Spectrum Disorder; Evidence-based Practice; File Drawer Problem; Publication Bias; Replication Crisis; Single-subject Research Design
Permanent link to this record: http://hdl.handle.net/20.500.12613/2797
    
DOI: http://dx.doi.org/10.34944/dspace/2779
    Abstract
In single-case research design (SCRD), experimental control is demonstrated when the researcher's application of an intervention, known as the independent variable, reliably produces a change in behavior, known as the dependent variable, and the change is not otherwise explained by confounding or extraneous variables. SCRD studies that fail to demonstrate experimental control may not be published, because researchers may be unwilling to submit papers with null findings and journals may be unwilling to publish null outcomes (i.e., publication bias). This lack of submission and publication of null findings, which leads to a disproportion of positive studies in the published research literature, is known as the "file drawer effect" (Rosenthal, 1979; Ferguson & Heene, 2012). Recently, researchers and policy organizations have identified evidence-based practices (EBPs) for children with autism spectrum disorder (ASD) based on systematic reviews of SCRD studies (Odom, Collet-Klingenberg, Rogers, & Hatton, 2010). However, if SCRD studies that do not demonstrate experimental control (i.e., null studies) are disproportionately unpublished due to the file drawer effect, the result may be a misrepresentation of positive findings, leading interventions to be deemed evidence-based that actually lack sufficient empirical support (Sham & Smith, 2014; Shadish, Zelinsky, Vevea, & Kratochwill, 2016). Social narratives, exercise, self-management, and response interruption/redirection are interventions for children with ASD that have been named EBPs according to the National Autism Standards (NAC; 2009) and the National Professional Development Center on Autism Spectrum Disorder (NPDC; 2010); however, these interventions have not yet been evaluated for potential publication bias.
The study employed and extended methods similar to Sham and Smith (2014), comparing the procedures and results of published articles against unpublished dissertations and theses for interventions identified as EBPs, in order to evaluate methodological rigor and the possibility of publication bias, the file drawer effect, and lack of replication. Specifically, the results of published and unpublished studies were compared to determine whether published studies showed greater treatment effects, which would indicate the file drawer effect. In addition, SCRD quality indicators were used to evaluate whether published studies tended to be of higher quality, as this would mitigate possible publication bias indicated by larger effect sizes (ES) in published studies. Three out of four EBPs (social narratives, antecedent exercise, and response interruption and redirection) yielded different ES when published studies were compared to unpublished studies; in contrast, self-management yielded a similar ES for published and unpublished studies. For social narratives and antecedent exercise, unpublished studies showed lower estimated ES than published studies, whereas for response interruption and redirection, unpublished studies showed a higher estimated ES than published studies. Study quality was generally similar for published and unpublished studies within each EBP, with the exception of antecedent exercise, for which differences in study quality were identified based on visual and statistical analyses. Lastly, there did not appear to be differences in treatment outcomes between published and unpublished studies when study quality was considered in the analysis.
Implications of the results are discussed with respect to the file drawer effect and publication bias in EBPs, along with a call to increase publication of negative findings and replication studies in peer-reviewed journals, which would help identify and establish boundary criteria for EBPs.
Collections: Theses and Dissertations

    DSpace software (copyright © 2002 - 2023)  DuraSpace
    Temple University Libraries | 1900 N. 13th Street | Philadelphia, PA 19122
    (215) 204-8212 | scholarshare@temple.edu
