

    An Investigation of the Content and Concurrent Validity of the School-wide Evaluation Tool

    File: TETDEDXBloomfield-temple-0225E ... (PDF, 798.5 KB)
    Genre: Thesis/Dissertation
    Date: 2015
    Author: Bloomfield, Alison Elizabeth
    Advisor: Fiorello, Catherine A.
    Committee members: Pendergast, Laura L.; DuCette, Joseph P.; Farley, Frank; Tincani, Matt
    Department: School Psychology
    Subjects: Educational Psychology; Educational Tests & Measurements; School-wide Evaluation Tool; Schoolwide Positive Behavior Support; SET; SWPBIS; SWPBS
    Permanent link to this record: http://hdl.handle.net/20.500.12613/2608
    
    DOI: http://dx.doi.org/10.34944/dspace/2590
    Abstract
    The School-wide Evaluation Tool (SET) is a commonly used measure of the implementation fidelity of school-wide positive behavior interventions and supports (SWPBIS) programs. The current study examines the content and concurrent validity of the SET to establish whether an alternative approach to weighting and scoring the SET might provide a more accurate assessment of SWPBIS implementation fidelity. Twenty published experts in the field of SWPBIS completed online surveys to obtain ratings of the relative importance of each item on the SET to sustainable SWPBIS implementation. Using the experts' mean ratings, four novel SET scoring approaches were developed: unweighted, reweighted using mean ratings, unweighted dropping lowest quartile items, and reweighted dropping lowest quartile items. SET 2.1 data from 1,018 schools were used to compare the four novel and two established SET scoring methods and examine their concurrent validity with the Team Implementation Checklist 3.1 (TIC; across a subsample of 492 schools). Correlational data indicated that the two novel SET scoring methods with dropped items were both significantly stronger predictors of TIC scores than the established SET scoring methods. Continuous SET scoring methods have greater concurrent validity with the TIC overall score and greater sensitivity than the dichotomous SET 80/80 Criterion. Based on the equivalent concurrent validity of the unweighted SET with dropped items and the reweighted SET with dropped items compared to the TIC, this study recommends that the unweighted SET with dropped items be used by schools and researchers to obtain a more cohesive and prioritized set of SWPBIS elements than the existing or other SET scoring methods developed in this study.
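The scoring approaches the abstract compares can be sketched as follows. This is an illustrative assumption-laden sketch, not the actual SET instrument: the item keys, the expert ratings, the 0–2 item scale, and the "drop the lowest quartile" cutoff rule are all hypothetical stand-ins for the study's real items and data.

```python
import statistics

def drop_lowest_quartile(expert_ratings):
    """Keep items whose mean expert importance rating exceeds the first
    quartile of all ratings (hypothetical selection rule)."""
    cutoff = statistics.quantiles(list(expert_ratings.values()), n=4)[0]
    return [item for item, r in expert_ratings.items() if r > cutoff]

def unweighted_score(item_scores, keep=None):
    """Unweighted percentage score over the kept items
    (assumes each item is scored 0-2)."""
    keep = keep if keep is not None else list(item_scores)
    vals = [item_scores[i] for i in keep]
    return 100 * sum(vals) / (2 * len(vals))

def reweighted_score(item_scores, expert_ratings, keep=None):
    """Percentage score weighting each item by its mean expert rating."""
    keep = keep if keep is not None else list(item_scores)
    earned = sum(expert_ratings[i] * item_scores[i] for i in keep)
    possible = sum(expert_ratings[i] * 2 for i in keep)
    return 100 * earned / possible
```

Passing the full item set gives the unweighted or reweighted variants; passing `keep=drop_lowest_quartile(expert_ratings)` gives the corresponding "dropped items" variants, of which the unweighted one is the approach the study recommends.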
    ADA compliance
    For Americans with Disabilities Act (ADA) accommodation, including help with reading this content, please contact scholarshare@temple.edu.
    Collections: Theses and Dissertations

    DSpace software (copyright © 2002 - 2023) DuraSpace
    Temple University Libraries | 1900 N. 13th Street | Philadelphia, PA 19122
    (215) 204-8212 | scholarshare@temple.edu
    Open Repository is a service operated by Atmire NV
