    Peer Review Use in the EFL Writing Classroom

    File: TETDEDXNeff-temple-0225E-12063.pdf (PDF, 5.037 MB)
    Genre: Thesis/Dissertation
    Date: 2015
    Author: Neff, Peter Edward
    Advisor: Beglar, David
    Committee members: Kozaki, Yoko; Petchko, Katerina; Fujioka, Mayumi; Sawyer, Mark
    Department: CITE/Language Arts
    Subjects: English as a Second Language; English as a Foreign Language; Peer Review; Second Language Writing; TESOL
    Permanent link to this record: http://hdl.handle.net/20.500.12613/3335
    DOI: http://dx.doi.org/10.34944/dspace/3317
    Abstract
    This study was an examination of peer review use in English composition courses at a Japanese university. Approximately 100 students in four writing classes engaged in four modes of peer review: face-to-face, handwritten (both on-draft and using an evaluation sheet), and computer-assisted. The learners in the study represented a range of proficiencies, from lower-intermediate to advanced, so the assigned writing passages were limited to single paragraphs rather than the full-length essays that have typically been used in prior research in this area. Each peer review session was preceded by training in peer review, including modeling and whole-class editing, as well as suggestions for each particular mode the learners participated in. After each session, students completed questionnaires in order to assess their evaluations of the activities, both as reviewers and as comment receivers. The questionnaire data were then analyzed using a variety of statistical methods--including Rasch analysis, descriptive statistics, and parametric and non-parametric measures--first to validate the questionnaire instrument, and second to ascertain the degree to which each peer review mode was favorably or unfavorably received by the participants. Additionally, the participants' written drafts and peer comments were quantitatively and qualitatively analyzed in order to answer several research questions that focused on: the number and type of peer suggestions the learners made in each mode, the number and type of suggestions that were incorporated into later drafts by the authors, the degree to which suggestions and revisions were affected by learner proficiency, and the accuracy of the peer suggestions. For the research questions concerned with learner evaluations of the peer review modes, findings were mixed.
The participants responded favorably to reading others' drafts and receiving comments, but they were less comfortable reviewing and making suggestions for their peers. Computer-assisted peer review was the most positively received overall, particularly by those in the High Proficiency Group. Person measures for Low Proficiency learners, on the other hand, were generally higher for on-draft peer review, while those for Intermediate Proficiency participants tended not to indicate strong endorsement of any particular mode. In order to answer the next set of research questions, the participants' drafts and peer suggestions were analyzed. Most of the learners' suggestions, particularly for those in the Low Proficiency Group, tended to be local in nature, concerning such areas as word choice, grammar, and mechanics; fewer suggestions were made at the sentence or whole-text level. In terms of incorporation of suggestions by authors into later drafts, oral peer review led to the highest rate of suggested revisions, while review using an evaluation sheet of guided questions resulted in the lowest rate. Learner proficiency did not have a significant bearing on suggestions or revisions, except in the case of the High Proficiency Group, whose members made significantly more suggestions during computer-assisted peer review than during the other modes. Finally, over 73% of peer suggestions were determined to be accurate across all four modes. These findings indicate that peer review can work on even the most limited of scales with learners of even modest language proficiency. No single mode of peer review succeeded in all areas, and instructors are encouraged to blend different modes if possible. However, if a single mode is preferred or required, computer-assisted review is a strong choice.
    ADA compliance
    For Americans with Disabilities Act (ADA) accommodation, including help with reading this content, please contact scholarshare@temple.edu
    Collections: Theses and Dissertations
