    Systematic criterion-referenced test development in an English-language program

Name: Kumazawa_temple_0225E_10639.pdf
Size: 1.370 MB
Format: PDF
Genre: Thesis/Dissertation
Date: 2011
Author: Kumazawa, Takaaki
Advisors: Beglar, David; Brown, James Dean
Committee members: Childs, Marshall; Sick, James; Schaefer, Edward
Department: Educational Administration
Subjects: Educational Tests and Measurements; Educational Evaluation; Educational Administration; Achievement Test; Criterion-referenced Test; Diagnostic Test; Generalizability Theory; Many Faceted Rasch Model; Test Development
Permanent link: http://hdl.handle.net/20.500.12613/1673
DOI: http://dx.doi.org/10.34944/dspace/1655
    Abstract
Although classroom assessment is one of the most frequent practices carried out by teachers in all educational programs, limited research has been conducted to investigate the dependability and validity of criterion-referenced tests (CRTs). The main purpose of this study is to develop a criterion-referenced test for first-year Japanese university students in a general English program. To this end, four research questions are formulated: (a) To what extent do the criterion-referenced items function effectively? (b) To what extent do the facets of persons, items, sections, classes, and subtests contribute to the total score variation in two CRT forms? (c) To what extent are two CRT forms dependable when administered as pretests and posttests? and (d) To what extent are two CRT forms valid when administered as pretests and posttests? Two CRT forms made up of vocabulary (k = 25), listening (k = 20), and reading (k = 25) subtests were administered to 249 students using a counterbalanced design. Criterion-referenced item analyses showed that most items were working well for criterion-referenced purposes. Both univariate and multivariate generalizability studies indicated that most of the variance was accounted for by the interaction effect, followed by the items effect, and then by the persons effect. FACETS analyses showed separation for all the facets included in the analyses, with item separation greater than person separation, indicating that the students' ability estimates were similar because they had taken a placement test whose results were used to form proficiency-based classes. Both univariate and multivariate decision studies indicated that the CRT forms were moderately to highly dependable. The content validity of the CRT forms was supported because the test content was strongly linked to what was taught in class. The construct validity was supported mainly because a fair amount of score gain was observed. This study elucidates how these statistical analyses can be applied to CRT development, and how CRT development can be carried out as part of curriculum development.
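As background for the dependability analyses the abstract refers to, the formulas below sketch the standard generalizability-theory indices for the simplest single-facet person × item (p × i) design. This is an illustrative simplification rather than the study's actual multifacet design (which also crosses sections, classes, and subtests), and the symbols (variance components σ², number of items n_i, cut score λ) are conventional notation, not values drawn from the study.

% Phi coefficient: dependability for absolute decisions in a p x i design
% (Brennan, 2001); sigma^2 terms are variance components, n_i is the item count
\Phi = \frac{\sigma_p^2}{\sigma_p^2 + \left(\sigma_i^2 + \sigma_{pi,e}^2\right)/n_i}

% Phi(lambda): criterion-referenced dependability at cut score lambda
% (Brennan & Kane, 1977); mu is the mean score on the same scale as lambda
\Phi(\lambda) = \frac{\sigma_p^2 + (\mu - \lambda)^2}{\sigma_p^2 + (\mu - \lambda)^2 + \left(\sigma_i^2 + \sigma_{pi,e}^2\right)/n_i}

Larger values of Φ(λ) indicate that mastery/nonmastery classifications at the cut score would change little across randomly parallel sets of items, which is the sense in which the abstract describes the CRT forms as moderately to highly dependable.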
    ADA compliance
For Americans with Disabilities Act (ADA) accommodation, including help with reading this content, please contact scholarshare@temple.edu.
    Collections
    Theses and Dissertations
