Citation
Exploring Relationships Between Peer Review, Revision Strategies, and Self-Efficacy in Online College Composition

Material Information

Title:
Exploring Relationships Between Peer Review, Revision Strategies, and Self-Efficacy in Online College Composition
Creator:
Antee, Audrey M
Place of Publication:
[Gainesville, Fla.]
Florida
Publisher:
University of Florida
Publication Date:
2017
Language:
english
Physical Description:
1 online resource (155 p.)

Thesis/Dissertation Information

Degree:
Doctorate ( Ed.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Curriculum and Instruction
Teaching and Learning
Committee Chair:
ANTONENKO,PAVLO
Committee Co-Chair:
KUMAR,SWAPNA
Committee Members:
KOHNEN,ANGELA MARIE
ORTAGUS,JUSTIN CHARLES

Subjects

Subjects / Keywords:
collaborate -- composition -- metacognitive -- peers -- self-efficacy
Teaching and Learning -- Dissertations, Academic -- UF
Genre:
bibliography ( marcgt )
theses ( marcgt )
government publication (state, provincial, territorial, dependent) ( marcgt )
born-digital ( sobekcm )
Electronic Thesis or Dissertation
Curriculum and Instruction thesis, Ed.D.

Notes

Abstract:
This study examined the possible impact of instructor scaffolding on peer review in online college freshman composition courses. The instruction, Collaborative Multimedia Peer Review, was developed from the social cognitive model for sequential skill acquisition in order to prompt students to use feedback that would promote self-efficacy and revision skills. Students also utilized VoiceThread, an online tool for creating multimedia discussions. The purpose of the study was to determine what relationships may exist between the writing self-efficacy of students, their revision skills, and their perceptions of peer review, revision, and learning. Students using Collaborative Multimedia Peer Review were compared to students using peer review worksheets to determine what differences, if any, existed in writing self-efficacy and revision efforts between the two groups of students. The qualitative data included peer review feedback, follow-up responses to feedback, and rough and final drafts which were part of the last essay assignment in the course. Quantitative data were collected from responses to the Writing Self-Regulatory Efficacy Scale and a survey on student perceptions. While findings showed the groups to be very similar in their revision skills and in their final self-efficacy scores, analysis revealed that the students using Collaborative Multimedia Peer Review began the course with a lower average self-efficacy than those in course sections using peer review worksheets. The improvement in scores on the Writing Self-Regulatory Efficacy Scale at the end of the course could indicate that Collaborative Multimedia Peer Review may have contributed to improvements in self-efficacy. The discussion considers how the results contribute to the development of more standard, effective peer review practices for online composition courses, and implications for how instructors categorize writing skill levels and frame peer review activities are discussed. ( en )
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis:
Thesis (Ed.D.)--University of Florida, 2017.
Local:
Adviser: ANTONENKO,PAVLO.
Local:
Co-adviser: KUMAR,SWAPNA.
Statement of Responsibility:
by Audrey M Antee.

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Classification:
LD1780 2017 ( lcc )



Full Text

PAGE 1

1 EXPLORING RELATIONSHIPS BETWEEN PEER REVIEW, REVISION STRATEGIES, AND SELF-EFFICACY IN ONLINE COLLEGE COMPOSITION By AUDREY ANTEE A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF EDUCATION UNIVERSITY OF FLORIDA 2017

PAGE 2

2 2017 Audrey Antee

PAGE 3

3 ACKNOWLEDGMENTS I recognize the irony in writing about motivation and composition while experiencing firsthand the challenges of staying motivated and undertaking a writing task of this size. I could not have succeeded without the help and support of many people. I am very fortunate in the encouragement I have had from my friends and family. When I really struggled with burnout, they reminded me of the basics: just get started. I would also like to thank my colleagues at Florida State College at Jacksonville for their time in helping me assess data and their willingness to listen to me go on about my ideas for improving our practice. I thank my parents for always pushing me to pursue my goals and believing in my ability to succeed. Knowledge, reading, and exploration have always been valued in my home because of the importance my parents placed on them. I am grateful for the love and support my in-laws provided and for the matter-of-fact texts from my mother-in-law telling me that of course I could do this. I am also grateful for the one at home who would repeat, over and over, things he had told me just five minutes before, and for the subtle ways he let me know he missed me (such as wearing my unicorn slippers). I also appreciate the faculty in the online Educational Technology program. I am so inspired by the dedication and passion they all invested into the success of each student in this program. I am thankful for my dissertation committee as well. I had heard terrible anecdotes about experiences with committees at other schools, yet I have been fortunate to have individuals on my committee who asked me thought-provoking questions and offered helpful feedback at each major step in the approval process. I am

PAGE 4

4 particularly grateful for the support I received from my dissertation chair. When I struggled to make sense of some of my data or to finalize a chapter, he offered his time, expertise, and encouragement. Finally, I am grateful for my students, though they may not ever read this. While they can be the sources of great frustration, their failures and successes are why I continue to teach. I am often impressed by their determination and persistence in spite of many personal and academic obstacles. They have been a great source of inspiration to me.

PAGE 5

5 TABLE OF CONTENTS
ACKNOWLEDGMENTS 3
LIST OF TABLES 8
LIST OF FIGURES 9
ABSTRACT 10
CHAPTER
1 INTRODUCTION 12
   Context 12
   Problem 14
   Problem Statement 17
   Proposed Approach to Peer Review 17
   Purpose of the Study 18
   Research Questions 19
   Significance 19
2 LITERATURE REVIEW 22
   Understanding Writing and Composition 22
      Social Cognitivism 23
      Models for Learning Written Composition 24
   Self-Efficacy and Self-Regulation in Writing 27
      Observational Learning, Modeling, and Writing Self-Efficacy 29
      Feedback and Writing Self-Efficacy 34
      Error Management Training and the Writing Process 37
   Feedback and Revision 38
   Peer Feedback and Revision 40
   Student Perceptions of Peer Review 44
   Feedback Modality 47
   Treatment: Collaborative Multimedia Peer Review 49
      Observation Level 50
      Emulation Level 53
      Self-Control Level 55
      Self-Regulation Level 56
   Conceptual Framework 56
3 METHODOLOGY 65
   Research Design 65
      Participants and Context 66

PAGE 6

      Exploring Group Homogeneity Relative to Writing Skills 66
      Treatment 67
      Instruments 70
         Writing self-efficacy 70
         Revision 70
         Perceptions 71
      Procedure 73
      Data Analysis 74
   Limitations and Ethical Considerations 81
4 FINDINGS 88
   Self-Efficacy 88
      Quantitative Self-Efficacy Data 88
      Qualitative Self-Efficacy Data 90
         Feedback types in CMPR 90
         Feedback types using the peer review worksheet 97
         Self-efficacy in follow-up comments for CMPR 98
         Self- ... 102
         Self-reflection in feedback comments for CMPR 103
   Revision Skill 103
   Collaborative Multimedia Peer Review, Self-Efficacy, and Revision 105
   Student Perceptions, Self-Efficacy, and Revision 106
      Perceptions 106
      Correlation Analysis 108
      Modality Preferences 108
   Summary 109
5 DISCUSSION AND IMPLICATIONS 119
   Writing Self-Efficacy 121
      Writing Self-Efficacy Scale 122
      Feedback Types 123
      Self-Reflection 125
   Peer Review Dialogue 125
   Revision Practices 128
   Relationships between CMPR, Writing Self-Efficacy, and Revision 130
   CMPR Student Perceptions 132
   Revising CMPR 133
   Implications for Future Practice 136
APPENDIX
A DIAGNOSTIC ESSAY RUBRIC 141
B QUALITATIVE ANALYSIS CODEBOOK 143

PAGE 7

LIST OF REFERENCES 145
BIOGRAPHICAL SKETCH 155

PAGE 8

8 LIST OF TABLES
Table  page
2-1  ... 60
2-2  Treatment for teaching the writing process using the social cognitive model for sequential skill acquisition 63
3-1  Timeline for implementation 84
3-2  Interrater reliability with average measure Intraclass Correlation Coefficient for revision changes 86
3-3  Revision analysis categories based on Cho and ... taxonomy 86
3-4  Alignment of research questions, data sources, and data analysis 86
4-1  Observed and Adjusted Differences between Groups on Self-Efficacy Posttest 113
4-2  Descriptive Statistics for Pre- and Posttest Scores 113
4-3  Writing Self-Regulatory Efficacy Scale questions showing a significant increase from pretest to posttest in the CMPR sections 113
4-4  Writing Self-Regulatory Efficacy Scale results for questions showing a significant increase from pretest to posttest in the peer review worksheet sections 114
4-5  Frequency of feedback types in six VoiceThread peer review groups 114
4-6  Average number of surface-level changes and meaning-level changes for peer review conditions 117
4-7  ... prediction of performance average, and actual final average in CMPR 117
4-8  Collaborative Multimedia Peer Review student perceptions of usefulness of giving/receiving feedback and VoiceThread 118

PAGE 9

9 LIST OF FIGURES
Figure  page
2-1  Hayes and Flower Cognitive Process Model of the Composing Process 59
2-2  Mitchell and Taylor Audience Response Model for Writing 59
2-3  Hannah using VoiceThread to post an audio comment and using the pen tool 60
2-4  VoiceThread 61
2-5  One of the questions embedded in the instructional video 61
2-6  Error management training within instructional video 62
2-7  Peer review interaction on VoiceThread 62
2-8  Conceptual Framework 64
3-1  Example of two surface and two macro-level revisions in student writing 85
4-1  Example of student using directive feedback on peer review worksheet 115
4-2  Example of praise comments for students using peer review worksheets 117

PAGE 10

10 Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Education EXPLORING RELATIONSHIPS BETWEEN PEER REVIEW, REVISION STRATEGIES, AND SELF-EFFICACY IN ONLINE COLLEGE COMPOSITION By Audrey Antee December 2017 Chair: Pavlo Antonenko Major: Curriculum and Instruction This study examined the possible impact of instructor scaffolding on peer review in online college freshman composition courses. The instruction, Collaborative Multimedia Peer Review, was developed from the social cognitive model for sequential skill acquisition in order to prompt students to use feedback that would promote self-efficacy and revision skills. Students also utilized VoiceThread, an online tool for creating multimedia presentations. The purpose of the study was to determine what relationships may exist between students' writing self-efficacy, revision skills, and perceptions of peer review, revision, and learning. Students using Collaborative Multimedia Peer Review were compared to students using peer review worksheets to determine what differences, if any, existed in writing self-efficacy and revision efforts between the two groups of students. The qualitative data included students' peer review feedback, follow-up responses to feedback, and rough and final drafts which were part of the last essay assignment in the course. Quantitative data were collected from responses to the Writing Self-Regulatory Efficacy Scale and a survey on student perceptions. While findings showed the groups to be very similar in their

PAGE 11

11 revision skills and in their final self-efficacy scores, analysis revealed that the students using Collaborative Multimedia Peer Review began the course with a lower self-efficacy average than those in course sections using peer review worksheets. The improvement in scores on the Writing Self-Regulatory Efficacy Scale at the end of the course could indicate that Collaborative Multimedia Peer Review may have contributed to improvements in self-efficacy. The discussion considers how the results contribute to the development of more standard, effective composition instruction and peer review practices for online composition courses. Also, implications for how instructors categorize writing skill levels, frame composition instruction, and deliver feedback are discussed.

PAGE 12

12 CHAPTER 1 INTRODUCTION Context Written communication skills are a fundamental component of K-12 and college education, and employers expect college graduates to have mastered those skills as well as critical thinking and the ability to collaborate. Communication has been identified, along with critical thinking, collaboration, and creativity, as one of the skills within the framework of 21st century skills that educational institutions should focus on in the age of a global society (Partnership for 21st Century Learning, 2007). Locally, per the Florida Department of Education, every public college and university in Florida must require its degree-seeking students to successfully complete at least one college composition class. Additionally, the Association of American Colleges and Universities published the results of a national survey of four hundred employers, which show 83% of those surveyed believe the ability to work in teams is very important, and 82% believe the ability to effectively communicate in writing is very important; however, many employers do not feel that colleges and universities are adequately preparing students in communication and collaboration (Hart Research Associates, 2015). In the field of composition studies, which includes theory and practice in teaching and learning composition at the college level, a critical component of helping students learn college-level written communication lies in peer review, which is a method of collaborative learning also referred to as peer editing, peer critiquing, peer evaluation,

PAGE 13

13 and peer assessment (Breuch, 2003; Bruffee, 1984; Ede & Lunsford, 1984; Magnifico, 2010; Mitchell & Taylor, 1979; Ward, 1994). Suthers (2006) defines collaborative learning as a process in which learners construct knowledge from social interactions (p. 318). Peer review is a collaborative learning activity in which students evaluate whether a peer's writing meets a set of criteria and can be improved. Its value in the teaching of composition lies in emulating feedback from an authentic audience, allowing students to learn how to anticipate audience expectations and to revise. Though the writing process has multiple stages, the stage impacted by peer review, which largely determines the level of writing performance, is the revision stage. Revision is a complex task that can encompass previous stages; a person revising a first draft may revisit the planning and drafting stages (Faigley & Witte, 1981), and how individuals approach revision determines how skilled they are as writers. For skilled writers, revision involves an awareness of audience and context as well as a focus on global issues that impact meaning and coherence, which include the development, style, and organization of a text more so than surface errors such as comma usage or capitalization (Faigley & Witte, 1981; Kellogg, 2008; Sanders-Reio, Alexander, Reio, & Newman, 2014; Yang, 2011; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002). Whether or not a person fully engages in revision by tackling the more challenging global issues of content and organization may be influenced by that person's self-efficacy. Those with higher confidence in their abilities, or higher self-efficacy, are more likely to persist in difficult tasks and to self-regulate (Bandura, 1991,

PAGE 14

14 1997; Jones, 2008; Schunk & Zimmerman, 1997; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002; Zimmerman & Schunk, 2001). To be self-regulated is to engage in self-directive attempts to transform mental abilities into academic performance (Zimmerman, 2008). Self-regulation for writing involves cognitive, social, motivational, and behavioral processes (Zimmerman & Risemberg, 1997). Feedback can shape students' writing self-efficacy, which in turn can influence self-regulation, and writing self-efficacy has been shown to impact writing performance, as students with more confidence in their ability to write tend to be more successful at completing complex or difficult tasks, like writing (Baaijen, Galbraith, & de Glopper, 2014; Bandura, 1991, 1997; Bruning & Kauffman, 2015; Jones, 2008; MacArthur & Philippakos, 2013; Pajares, 2003; Schunk & Zimmerman, 1997; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002; Zimmerman & Risemberg, 1997). These three facets of college composition, peer review, self-efficacy, and the ability to revise, are therefore interrelated: peer feedback can influence self-efficacy, which in turn influences the extent to which one self-regulates well enough to engage in challenging and complex tasks, such as revision. Problem Though peer review is typically a staple in first-year college composition courses, there is no agreed-upon strategy or set of strategies for implementing peer review, and much of the research on its effectiveness is inconsistent (Armstrong & Paulson, 2008; Paulson, Alexander, & Armstrong, 2007). From an instructor perspective, peer review can be ineffective because students are ill-equipped to offer substantive feedback to their classmates

PAGE 15

15 (Patchan, Hawk, Stevens, & Schunn, 2013). Student perspectives on the value of peer review can vary. Students often do not trust themselves or their peers to provide helpful feedback (Brammer & Rees, 2007; Kasanga, 2004; Kaufman & Schunn, 2011; Ludemann & Mcmakin, 2014). Conversely, there is evidence demonstrating that peer review can be effective (Carifio, Jackson, & Dagostino, 2001; Cho, Schunn, & Charney, 2006; Min, 2005; Patchan et al., 2013; Yang, 2011). This discrepancy in the value of peer review could be the result of irregularities in how peer review activities are designed and executed or in how much guidance is offered by the instructor. Despite its theoretical basis in social learning, the way peer review is implemented does not always promote collaborative learning because some methods for peer review, like checklists or handouts that are popular in college composition courses, do not prompt student dialogue (Hauptle, 2006; Keeley, 2014). The need for an effective peer review process is especially important in online freshman composition courses because there is no opportunity for a traditional face-to-face dialogue between learners, yet collaborative learning and social presence are integral in online contexts where instructor support is more diminished (Curtis & Lawson, 2001; Garrison, 2007). At the institution where I teach, online courses are typically taught in the Blackboard LMS with a course shell (pre-made course content) developed by a subject matter expert and instructional designers. In the course shell for ENC 1101 English Composition I, peer review is a graded assignment, but it is not truly collaborative. The instructions in the shell indicate students receive full credit for peer review by completing and posting peer review worksheets to the other members of their group.

PAGE 16

16 Replies to the posted worksheets are not required, nor are they explicitly encouraged. Also of the utmost importance are peer review practices that promote writing self-efficacy, which has been shown to aid students as they self-regulate to revise. Most studies on the impact of writing self-efficacy on performance examine outcomes such as course grades or success on a final essay draft as an indicator of actual writing competence rather than the revision practices that distinguish developing writers from skilled writers (Ludemann & Mcmakin, 2014; MacArthur & Philippakos, 2013; Raedts, Rijlaarsdam, van Waes, & Daems, 2007; Sanders-Reio et al., 2014; Zimmerman & Bandura, 1994). Alternately, there are some studies that examine the impact of feedback on revision strategies without accounting for self-efficacy (Baker, 2016; Carifio, Jackson, & Dagostino, 2001; Cho & MacArthur, 2011; Covill, 2010; Yang, 2011), though it has been shown that feedback can influence self-efficacy. While there are some studies that examine all three, feedback, self-efficacy, and revision, such research may focus on instructor feedback rather than peer feedback (Duijnhouwer, Prins, & Stokking, 2012) or examine contexts other than freshman writing courses (Kaufman & Schunn, 2011). As Baker (2016) explains, there is a need for additional research on the peer review and revision process in online college composition courses that also considers the role of self-efficacy, an important non-cognitive variable that impacts motivation to learn.

PAGE 17

17 Problem Statement Though the practice of peer review is believed to be vital for learners who are developing writing skills in college composition courses, there is little continuity in what methods of peer review work best, and the role of self-efficacy and revision skills in online freshman composition courses needs to be examined in more depth. Proposed Approach to Peer Review Online composition courses are not consistent in how peer review is developed. Some courses, like the online composition courses at Florida State College at Jacksonville, use a peer review worksheet with questions students must answer about a peer's text. Other courses, instead of using a discussion forum in an LMS, may require students to post comments on student-developed blog posts (Delgado & McDougald, 2013; Vasileiou, 2016), and for some courses, online reciprocal peer review systems are used, like SWoRD (scaffolded writing and rewriting in the discipline), though not necessarily in composition courses. For the SWoRD system specifically, a rubric and form are utilized for peer review. Reviewers download essays to be reviewed, log into the system, paste comments into a form, and choose point ratings associated with the rubric (Cho, 2006; Cho & MacArthur, 2010; Cho, Schunn, & Charney, 2006; Kaufman & Schunn, 2011). For all of these methods, replies to the completed forms or conversations between students about the feedback are not required. Also, while the importance of revision is examined in some cases (Cho & MacArthur, 2010; Kaufman & Schunn, 2011), the impact of peer review on self-efficacy is not considered in developing the peer review process. I developed a peer review scaffolding tool that supports a more collaborative learning-based peer review process that also incorporates rich multimedia for student

PAGE 18

18 discussions. Within online asynchronous discussion forums, where collaborative learning is expected to occur, effective collaboration may not automatically occur, especially where discussion threads lack depth and fail to involve strong student interaction (Mooney, Southard, & Burton, 2014). Interaction is a requirement for collaborative learning (Zhao, Sullivan, & Mellenius, 2014), yet in the previously described methods of online peer review, only one-way interactions are employed. In my approach to peer review, two-way interaction is emphasized since students are expected to respond to the feedback they receive. In addition to emphasizing qualities of collaborative learning, my approach is designed to teach students how to provide feedback that should promote self-efficacy, a necessary component for self-regulation and developing revision skills (Baaijen, Galbraith, & de Glopper, 2014; Bandura, 1991, 1997; Bruning & Kauffman, 2015; E. Jones, 2008; MacArthur & Philippakos, 2013; Pajares, 2003; Schunk & Zimmerman, 1997; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002; Zimmerman & Risemberg, 1997). Purpose of the Study This study aimed to explore possible relationships (a) between peer review scaffolding, hereinafter called Collaborative Multimedia Peer Review (CMPR), and students' writing self-efficacy, (b) between CMPR and students' actual revision process from rough to final draft, (c) between students' perceptions of peer review, revision, and learning and their self-efficacy and revision process, and (d) between feedback modality and students' self-efficacy and revision process in multiple sections of an online freshman composition course. It also aimed to examine what differences, if any, may exist in writing self-

PAGE 19

19 efficacy and revision efforts between sections using CMPR instructions and sections using a more traditional peer review form to conduct peer review and revision. In an effort to simulate a more interactive collaborative environment, such as face-to-face collaboration, students in randomly selected sections interacted in VoiceThread, an online technology. Unlike typical text-based discussion boards, VoiceThread allows users to create multimedia discussion threads by developing screencasts from uploaded files on which others can post voice, video, or text comments. Students in comparison sections went through the traditional process of peer review and revision, which was built into the course as completion of a peer review worksheet. Research Questions In investigating the possible relationships between CMPR and students' self-efficacy and revision process, the following research questions were addressed:
1. To what extent does Collaborative Multimedia Peer Review for giving feedback influence students' writing self-efficacy in an online freshman composition course?
2. To what extent does Collaborative Multimedia Peer Review contribute to students' revision process from rough to final draft?
3. What relationship, if any, exists between Collaborative Multimedia Peer Review, students' writing self-efficacy, and their revision process in an online freshman composition course?
4. What relationship, if any, exists between students' use of text, audio, and video comments in VoiceThread and their self-efficacy and revision strategies?
Significance The significance of this study lies in its contribution to solving an important problem of practice, that is, improving the design and implementation of online college

PAGE 20

20 composition courses and, specifically, the factors that influence student learning in composition courses, namely self-efficacy and the revision process. This study adds to what is known about the relationships between scaffolding, peer review, self-efficacy, and revision in online freshman composition courses, an important issue in the field of composition studies, as a freshman composition course of some type is required for degree completion at all colleges and universities nationwide. This study also fills a need for additional data-driven research and practice identifying the qualities of peer review and peer review scaffolding that best facilitate the peer review process and promote useful student feedback that improves student self-efficacy and revision, particularly for instructors of freshman college composition who may not have a background in rhetoric and composition theory or learning theory. Most institutions require a graduate degree in English (often in English literature) in order for a candidate to be considered qualified to teach composition, so many of those teaching incoming college freshmen have a background in English literature and are not initially familiar with the pedagogy associated with teaching composition (Association of Departments of English, 2009). Such was my experience when I first began teaching college-level writing courses, and my understanding of the value of peer review depended on how it was presented in the courses I taught, not on a guiding principle for how peer review should be implemented. Additionally, the treatment for this study incorporated VoiceThread, and multimedia tools for online media-rich collaboration are relatively new. VoiceThread is marketed as a tool for developing interactive lectures and multimedia discussions. While

PAGE 21

21 there is a body of research on screencasting technologies used by instructors for giving feedback (Crook et al., 2012; Vincelette & Bostic, 2013; Yuan & Kim, 2015), this study adds to the current body of research on the use of screencasting and multimedia technology, specifically VoiceThread, for multimedia-enhanced feedback and dialogue between students. The students using VoiceThread in this study logged in to access rough drafts, used VoiceThread to develop multimedia feedback, and created follow-up discussions on that feedback.

PAGE 22

22 CHAPTER 2 LITERATURE REVIEW This chapter discusses the theoretical foundations for some of the more well-known writing process models and the principles of social cognitivism underpinning collaborative writing. It illustrates the connections between the social aspects of learning to improve writing skills (modeling and feedback) and self-efficacy and learning, which are all important for developing writers to improve their revision skills. Then, perceptions of peer review are discussed as an important variable influencing student engagement in collaborative writing, as are modalities for engaging students in peer review. In practice, there is little consistency in how instructors format peer review, so a proposed approach to peer review is described, based on social cognitivism and writing process theory, that emphasizes qualities of collaborative learning often correlated with gains in writing self-efficacy. Understanding Writing and Composition In many ways, the development of composition theory has mirrored advances in learning theory. The field of composition studies has its basis in various philosophical traditions related to language acquisition, linguistics, philology, and the creative process of writing. Nystrand, Greene, and Wiemelt (1993) trace the progression of composition theory from formalism, strongly influenced by behaviorism; to reader response, with a basis in cognitive psychology, the psycholinguistic revolution, and constructivism; to structuralism and post-structuralism, influenced by cognitive constructivist and socio-constructivist perspectives of language, linguistics, and composition. Modern language and composition scholars maintain that language, speaking, and writing are dialogical,

PAGE 23

23 and meaning is socially constructed, dependent on context (Nystrand, Greene, & Wiemelt, 1993). Specifically, Bakhtin (1981) poses the concept of signs (spoken as utterances or written as words) as being subject to certain conditions, like social conditions, that make context critical to meaning making; Bakhtin (1981) frames this concept as a relationship between the sign (the part) and social groups or communities (the whole). What a text means is determined by the interpretation of both the writer and the reader (Nystrand, Greene, & Wiemelt, 1993). As writing has become viewed as dialogic, approaches to writing pedagogy have also evolved, from focusing on writing as a product and teaching students to emulate model texts, to teaching students a sequential set of steps or stages; however, post-process theory critiques the idea of writing as a clear set of steps and better situates writing and the teaching of writing as dialogical (Breuch, 2003). Essentially, writing is public, interpretive, and situated, utilizing dialogs: inner dialogs, dialogs between students, dialogs between students and the teacher, and dialogs between students and social institutions (Breuch, 2003). From this perspective, the role of the audience in writing becomes critical in teaching students how to become more skilled writers, able to adapt to various contexts. Social Cognitivism Writing is an important part of learning in most knowledge domains. Vygotsky, one of the most notable learning theorists linking learning to language acquisition and

PAGE 24

24 use, posits that language acquisition and cognition, the psychological process of thinking, are interdependent. The way in which mental functions are altered by the mediation of language signs means that knowledge, and thereby learning, becomes a social, communicative, and discursive process (Duffy, 1996, p. 12). However, the idea of learning as a social process is also a precept of social cognitivism. While Duffy (1996) emphasizes the social and discursive nature of learning, it is also useful to examine the interaction between both processes. In his description of social cognitive theory, Bandura (1991) describes a triadic structure of reciprocal causation that includes cognitive factors, behavior, and environment; as such, learning is a product of shared beliefs as well as individual beliefs (Bandura, 1991, 1997). Models for Learning Written Composition Some prominent models of the writing process are based in principles of cognitivism. Most notable is the Hayes and Flower Cognitive Process Model of the Composing Process (Figure 2-1), which identifies the roles of long-term memory, the task environment, and self-monitoring in the process of planning, writing, and reviewing. Additionally noteworthy is Kellogg's Model of Working Memory in Writing, which examines three key systems that encapsulate the writing process (formulation, execution, and monitoring) and which emphasizes how working memory shapes the writing process (Chanquoy, 2009). While these models acknowledge the importance of context and self-regulation, or

PAGE 25

25 monitoring, they do not fully explore the impact of a peer audience or a discourse community on the writer. Magnifico (2010) elaborates on how conceptualizing the audience as the writing context bridges key ideas in cognitive and sociocultural perspectives. From a cognitive perspective, writers imagine an audience in order to frame their own ideas about what style and genre to use (Magnifico, 2010, p. 175-176). According to Ong (1975), a writer can fictionalize in his imagination an audience he has learned to know not from daily life but from earlier writing; this view, however, does not take into consideration the differences in what knowledge individuals bring to a text or in how individuals construct their knowledge of audience. In response to this position, Ede and Lunsford (1984) argue that the audience should not be devalued in a model for the process of writing. They consider instances where a writer should consider feedback from an actual person as a representative of the intended audience. Kellogg (2008) likewise ties the ability to anticipate an audience to experience in a discourse community, something that could take several years of experience writing in that particular domain. It is reasonable to assume, then, that incoming freshmen are unlikely to have adequate experience with academic discourse to successfully anticipate the demands of a fictionalized audience.

PAGE 26

26 academic discourse operates. She proposes a strategic knowledge framework as a way learners bridge cognition and social context that includes goals, strategies, and awareness; though she emphasizes the importance of students learning to adapt to an academic context for writing, the dialogue she discusses is primarily an internal one as students consider readings and how to respond to readings with their writing. From a socio-cognitive perspective, audience has a more active function, including members of a community who offer feedback and who help the writer consider community norms and think about his or her writing in a new way (Magnifico, 2010). Mitchell and Taylor's (1979) Audience Response Model for Writing (Figure 2-2) considers the writer in reference to his or her audience and classifies writing as being good or bad according to what the audience sees as being effective or ineffective rather than according to qualities of the text itself. Ede and Lunsford (1984) argue that the Audience Response Model for Writing does not give the writer and the audience equal importance. The concept of audience model by Ede and Lunsford (1984) illustrates the various roles of an invoked audience and an addressed audience in an integrated view of the writing situation. Neither of these models illustrates how students learn to negotiate audience as part of the learning process. Mitchell and Taylor (1979) also suggest that writing for an actual audience has a positive effect on motivation, yet Ede and Lunsford (1984) disagree because they believe such a claim oversimplifies the relationship between the writer and the audience.

PAGE 27

27 None of these models accounts for writing self-efficacy, which in turn impacts self-regulation and performance. The social cognitive model for sequential skill acquisition, while not solely used as a model for writing instruction, considers self-efficacy beliefs and provides a scaffold for helping learners transition from needing social support to self-regulation (Zimmerman & Kitsantas, 2002). It includes four levels, the first two of which, observation and emulation, are forms of social learning that are intended to lead to the last two levels, self-control and self-regulation (which will be discussed in more detail in the following section) (Zimmerman & Kitsantas, 2002). Though the social cognitive model for sequential skill acquisition does not specifically elaborate on all facets of the writing process, it provides a framework for an instructional design that could apply to social cognitive learning across disciplines, and it has been chosen as the model used for this study because it (a) recognizes the importance of self-efficacy to skill acquisition, (b) accounts for the dialogic nature of learning as well as writing, and (c) can be utilized for any aspect of teaching the writing process, in this case, specifically how to give peers feedback and how skilled writers approach revision. Self-Efficacy and Self-Regulation in Writing From a social cognitive perspective, self-efficacy and the ability to self-regulate are dependent on interaction with others. Self-efficacy, a person's belief in his or her capability to accomplish a task, influences the ability to self-regulate. Self-regulation, a self-directive process through which learners transform their mental abilities into task-related skills, is needed in order for learners to attempt to solve complex problems (Zimmerman & Schunk, 2001, p. 1). Zimmerman's (2002) phases and subprocesses of self-regulation locate self-efficacy in the forethought phase. This phase

PAGE 28

28 includes beliefs that precede learning, but beliefs are also a result of previous learning experiences. Zimmerman (2002) illustrates how the forethought phase leads to the performance phase, which is followed by the self-reflection phase and cycles back to the forethought phase. For example, a student who repetitively receives poor grades in a writing course would form beliefs about his or her inability to write, which equates to low self-efficacy towards writing tasks, and these beliefs would play a role in the self-regulatory practices for future writing tasks. In a study of 95 freshmen taking a university writing course (some advanced writing and some regular writing), Zimmerman and Bandura (1994) surveyed students to determine whether their self-efficacy for writing, academic self-efficacy, and grade goals influenced their achievement in the course. Students' self-efficacy for writing influenced their perceived self-efficacy for academic achievement, while their perceived self-efficacy for academic achievement was found to impact final course grades. Perceived self-efficacy for academic achievement was also found to directly and indirectly impact personal grade goals, which correlated with final course grades (Zimmerman & Bandura, 1994).
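To make the direct-and-indirect language above concrete, the sketch below decomposes a hypothetical effect of perceived self-efficacy on final grades into a direct path and an indirect path through grade goals, using two ordinary least-squares fits. The data, variable names, and coefficients are invented for illustration; this is not Zimmerman and Bandura's (1994) dataset or analysis.

# Illustrative sketch only: invented data showing how a "direct and indirect effect"
# claim can be decomposed with two regressions (a minimal mediation-style example).
import numpy as np

rng = np.random.default_rng(7)
n = 95  # matches the reported sample size; the data themselves are hypothetical

self_efficacy = rng.normal(70, 10, n)                      # perceived self-efficacy score
grade_goals = 0.5 * self_efficacy + rng.normal(20, 5, n)   # personal grade goals
final_grade = 0.3 * self_efficacy + 0.4 * grade_goals + rng.normal(10, 5, n)

# Path a: self-efficacy -> grade goals (simple OLS slope)
a = np.polyfit(self_efficacy, grade_goals, 1)[0]

# Paths b and c': final grade regressed on self-efficacy and grade goals together
X = np.column_stack([np.ones(n), self_efficacy, grade_goals])
coefs, *_ = np.linalg.lstsq(X, final_grade, rcond=None)
direct_effect, b = coefs[1], coefs[2]

print(f"Indirect effect via grade goals (a*b): {a * b:.2f}")
print(f"Direct effect of self-efficacy:        {direct_effect:.2f}")

Read this as a sketch of the reasoning only; a full path-analytic study would estimate and test these paths within a single structural model rather than two separate fits.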

PAGE 29

29 Sanders-Reio and colleagues (2014) also examined the relationship between student beliefs about writing and their writing performance in a study of 738 students taking an undergraduate educational psychology course. To measure beliefs about their own writing, the researchers developed a Beliefs about Writing Survey and aligned it with a progression of writing skills: knowledge telling, knowledge transforming, and knowledge crafting. They measured writing self-efficacy with the Writing Self-Efficacy Index, and to determine whether students fear and/or avoid writing, they used a revised version of the Daly-Miller Writing Apprehension Test. Writing performance was assessed with an essay that Sanders-Reio et al. (2014) had scored by two instructors for the educational psychology course with high interrater reliability. They found that all three measures predicted writing performance, with beliefs about writing being the strongest predictor and self-efficacy for mechanical skills being the second strongest and the only one of the self-efficacy subscales (not self-regulatory writing skills or substantive writing skills, which were also subscales) that was a significant predictor. There was a relationship between writing self-efficacy and grades on the writing performance assessment (Sanders-Reio et al., 2014, p. 9), though the authors also note that there are indications that students with strong self-efficacy beliefs overestimated their writing skills. Observational Learning, Modeling, and Writing Self-Efficacy In self-regulated learning, self-efficacy beliefs are shaped in part through modeling by parents, teachers, coaches, and peers. Modeling can also help recalibrate a learner's self-efficacy if his or her self-efficacy is disproportionate to a complex task; students unfamiliar with more complex tasks may overestimate their abilities, and failing to perform as they expect to can be detrimental to their self-efficacy on future tasks (Bandura, 1997, p. 66).

PAGE 30

30 In describing self-regulatory systems, which include self-observation, judgmental process, and self-reaction, he explains the significance of the judgmental component, which houses a subcategory: social comparison, or comparing one's own performance with the performance of others. He also classifies this in a later publication as a source of self-efficacy: seeing or visualizing people similar to oneself perform successfully typically raises efficacy beliefs in observers that they themselves possess the capabilities to master comparable activities (Bandura, 1997). Observing a model offers a view into the social system that would allow a learner to gain a better understanding of the skills needed to successfully complete a complex and somewhat unfamiliar task and provide a realistic expectation for being successful based on seeing others succeed at the task. In discussing research on the impact of self-efficacy and motivation on writing development, Bruning and Kauffman (2015) cite studies that have shown that self-efficacy and writing skills are improved when students are involved with modeling strategies and a focus on process goals (not outcome goals initially) followed by opportunities to practice. Once automation has occurred, learners can focus on outcome goals. Also, in a relevant study of revision skills and self-regulation, Zimmerman and Kitsantas (2002) examined how the type of model students observed impacted their learning when they participated in revision practice. Specifically, Zimmerman and Kitsantas (2002) found that coping models elicited more effective practice and revision strategies; coping models differ from mastery models in that coping models begin by making errors which are gradually identified and eliminated. In

PAGE 31

31 their study of 72 undergraduate students who were randomly assigned to one of six conditions with or without mastery or coping modeling and with or without social feedback, the researchers found that students in the modeling conditions outperformed those with no modeling, though those who used the coping model outperformed those who observed mastery models. For these groups, their self-efficacy beliefs did not improve but decreased as they realized after observation that they had overestimated their ability to perform. Though the adjustment was a downward adjustment, this is still a beneficial change in self-efficacy beliefs since inflated self-efficacy can be detrimental once one fails to perform as expected. Zimmerman and Kitsantas (2002) also found that students' self-efficacy scores ultimately correlated with their posttest writing skills, which is a desired outcome if students are to self-regulate through the writing process. The value of observation or modeling to learning is further supported with studies such as Raedts, Rijlaarsdam, van Waes, and Daems' (2007) examination of whether students' self-efficacy in writing is relative to the complexity of the task and whether observation of peer models would improve their knowledge of how to write a literature review more so than traditional instruction. In their study of 144 students taking a first-year research methodologies and psychology course at a Flemish university, Raedts, Rijlaarsdam, van Waes, and Daems (2007) used a pretest-posttest quasi-experimental design. Students took pretests measuring intelligence, reading skills, self-efficacy beliefs, and task knowledge. The experimental group received an intervention with videos containing peer models demonstrating excerpts of the writing process; following the videos, students completed observational

PAGE 32

32 exercises based on what they learned from the videos. Posttests were given on task knowledge, writing self-efficacy, and writing performance. At both the pretest and the posttest, students' writing self-efficacy was compared with writing performance. Raedts, Rijlaarsdam, van Waes, and Daems (2007) found that students in the experimental group had more accurately calibrated writing self-efficacy and writing performance scores than those in the control group, who had overestimated their writing competence by about 7%. Students in the experimental group did not demonstrate more knowledge of what a literature review should look like, but they did have significantly more knowledge of writing strategies, particularly information gathering and planning, than those in the control group. Overall, students in the observational learning group outperformed those in the control group in the quality of literature review produced at the end of the experiment, demonstrating how observation and modeling can not only improve learning, but can also align students' self-efficacy with their ability to perform a complex task (Raedts, Rijlaarsdam, van Waes, & Daems, 2007).
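As a rough illustration of the calibration idea in the study just described, the short sketch below compares hypothetical self-efficacy predictions with scored performance on a common 0-100 scale; a positive mean gap indicates overestimation. The numbers are invented and do not reproduce Raedts and colleagues' data.

# Illustrative sketch only: invented scores showing how self-efficacy "calibration"
# can be expressed as the average gap between predicted and actual performance.
predicted = [78, 85, 70, 90, 65]   # students' self-efficacy predictions (0-100)
actual    = [72, 80, 68, 81, 66]   # scored writing performance (0-100)

gaps = [p - a for p, a in zip(predicted, actual)]
mean_overestimation = sum(gaps) / len(gaps)

# Values near zero indicate well-calibrated self-efficacy beliefs; this hypothetical
# group overestimates by 4.2 points on average.
print(f"Mean overestimation: {mean_overestimation:.1f} points")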

PAGE 33

33 Through a two-stage process of developing curriculum for developmental writing courses at a two-year college, MacArthur and Philippakos (2013) found that instructors modeling self-regulation strategies specifically for the writing process led to student gains in writing confidence and in writing achievement. Using design research in order to develop and evaluate specific instructional strategies, the researchers collected data through interviews, essay scores for an essay at the beginning of the term and one at the end, and measures of self-efficacy for writing, achievement goals, and beliefs about writing (MacArthur & Philippakos, 2013, p. 183). The curriculum from the first term was evaluated and revised for the second term. The instructional components included a discussion on the elements of a certain genre of writing, examples of weak and successful essays (the former as a basis for characteristics of the genre and the latter for evaluating whether a paper exhibits the characteristics necessary for a genre), instructor modeling through think-alouds for the writing process from planning to draft to revision, student collaborative practice of the modeled strategy with instructor guidance, student application of the strategy on an essay, practice evaluating papers written by unknown peers, peer review in pairs, and finally students editing papers with instructor feedback. This cycle was repeated for another paper with less instructor guidance. The instructional components were revised after the first implementation to include reviews and quizzes on the genres; added support for text organization and the connections between the characteristics of a genre, organization, and text evaluation for revision purposes; goal-setting strategy instruction to improve self-regulation; more activities for practicing refuting an opposing position for the persuasive genre; an emphasis on giving feedback as well as procedures for teaching peer review; and lessons on editing that included one-on-one student conferences with the instructor. MacArthur and Philippakos (2013) collected data from 34 students in the second term of the study in order to measure writing achievement, motivation (self-efficacy, goals, and beliefs), and writing strategies. They found that most students made statistically significant gains in writing quality, and there were increases in self-efficacy, affect, and mastery motivation

PAGE 34

34 as opposed to performance motivation. Interviews with 16 students supported the previous findings. While this study adds to the body of research on the importance of modeling and observational learning, the researchers did not specifically correlate modeling with gains in self-efficacy, though they note that half of those interviewed indicated an increase in confidence for writing as a result of the course. They also identified not including a control group as a limitation of the study, meaning the effects of the instructional strategy cannot be isolated from other factors. Feedback and Writing Self-Efficacy Feedback from others also shapes self-efficacy. Bandura (1991) explains how individuals who receive feedback that they are performing well compared to their peers have good self-efficacy but become complacent, are satisfied with mediocre performance, and set less challenging personal goals; conversely, negative comparative feedback leads to lower self-efficacy and a decline in performance. Feedback conveying progressive mastery, by contrast, strengthened perceptions of personal efficacy, fostered efficient analytic thinking, and transformed self-evaluation from self-discontent to self-satisfaction (Bandura, 1991, p. 272-273). Bandura (1997) identifies social persuasion as another source of self-efficacy. To build the self-efficacy and motivation that drive self-regulatory strategies, feedback must have immediate benefits, be neither overly complimentary nor negative, but must, instead, indicate that a person is improving without having necessarily achieved complete mastery of a topic (Bandura, 1991, 1997). In a study on the impact of writing feedback perceptions and writing self-efficacy perceptions on self-regulation, Ekholm, Zumbrunn, and Conklin (2014) found that both

PAGE 35

35 perceptions were predictive of writing self-regulatory perceptions. The researchers surveyed 115 undergraduate students, freshmen through seniors, taking English and education courses at a university. Responses to a writing feedback perceptions scale, a writing self-efficacy scale, and a writing self-regulation aptitude scale were analyzed using correlation analysis and a series of regression analyses in order to inspect the effect of self-efficacy perceptions on feedback perceptions and the effect of feedback perceptions on self-regulation aptitude perceptions (Ekholm et al., 2014). While the researchers found relationships among feedback perceptions, self-efficacy, and self-regulation, they note limitations in the data being self-reported, in its generalizability, and in the possibility that other factors can influence self-regulation. Ludemann and Mcmakin (2014) assessed the writing confidence of 37 first-year students taking a general psychology course in their study on the perceived helpfulness of peer editing to student grades on writing assignments. The authors indicate that the edited papers provided feedback on surface-level changes; peer editing in this sense differs from peer review, which involves focusing on rhetorical issues like audience, focus, development, etc. (Armstrong & Paulson, 2008). Despite calling the activity peer editing, Ludemann and Mcmakin (2014) recorded student comments that indicated the editing exercise may have included responding to format and clarity of writing as well as surface errors, engaging students in an activity that included both peer review and editing.


36 peer review process for assigned writings. For two assigned writings, peer review was required prior to final submission. Interestingly, Ludemann and Mcmakin (2014) found that students believed that providing feedback (not recei ving it) was helpful in improving measured through the Daly Miller Writing Apprehension Test and the results from the perceived helpfulness of peer editing and essay scor es. They attribute this result to their small sample size, the need for a baseline assessment to determine variations in writing ability, and a need for a more sensitive assessment of writing confidence. Another possible limitation of this study may have b een its detachment from writing process theory. There is no mention of whether students were taught characteristics of the writing process other than peer review, and the specific requirements for the writing assignments which were focused on psychology co ncepts. Papers were exchanged once, and 20 minutes were allowed for peer review. Then 5 minutes were allotted for reviews were anonymous, so students could not ask question s directly of the person who reviewed his or her essay. In several studies on feedback, participants are left anonymous (Cho, 2006; Cho & MacArthur, 2010; Cot, 2014; Johnson, 2001; Kaufman & Sch unn, 2011; Patchan et al., 2013) (Cho & MacArthur, 2010; Patchan et al., 2013) Some argue keeping reviewers anonymous allows them to offer more critical feedback and avoid any bias against the individual whose writing is being reviewed (Cot, 2014; Johnson, 2001) However many studies


37 when individuals lack the motivation to contribute to group efforts (Kerr & Bruun, 1983; Kreijns, Kirschner, & Jochems, 2003; Zhao et al., 2014) Also, in their studies on Hung (2016) and Jones, Georghiades, and Gunson (2012) found that students preferred seeing their reviewers. For this study, peer review was not conducted anonymously so as to encourage face to face social interaction and motivate group members to engage in the peer r eview process. Error Management Training and the Writing Process The concept of writing as a process presupposes that in trying to write according to a particular set of conditions, individuals do not typically achieve a perfect draft on the first try: we make errors when we write or do not meet the standard of writing we are attempting to achieve, and we learn partly through addressing those errors or inadequacies Part of the writing process is discovering shortcomings in content, organization, audience writing can be improved by negotiating those issues hence the importance of revision. Error management training is a strategy that frames error making as a natural part of the learning process (Keith & Frese, 2008) Though instructors may recognize how error correction feedback does not encourage students to think about how they can better develop the ability to a ddress global issues in their writing, many instructors still resort to giving brief, critical feedback (Ferris, 2014) A focus on trying to avoid shortcomings ca efficacy (Carter & Beier, 2010) However, a study on 173 adults between the ages of 20 and 66 found that low structure error management training can be beneficial for self efficacy and performance (Carter & Beier, 2010) Error management t raining is meant to decrease the impact of negative


38 errors (Keith & Frese, 2008) Th is strategy is well suited as an approach for instructing students in how to improve their revision process as students must work through flaws and inadequacies throughout their writing process to arrive at final draft that meets a certain standard Though it has not been researched in the context of a college writing course, this perspective has potential as a way of helping developing writing students learn how to manage issues in writing related to grammar, content, and organization as part of their writ ing process. Feedback and Revision Though most studies on writing skill use a course grade or an essay grade to determines his or her level of skill as a writer. An influen tial study on revision was conducted by Faigley and Witte (1981) in which they describe a revisi on taxonomy for Inexperienced or novice writers tend to lack an awareness of audience and context revisions to a text to try to make certain that readers see matters the way the author (Kellogg, 2008, p. 9) This level of expertise is typically found in professi onal writers who are able to anticipate audience expectations; however, Faigley and Witte (1981) explain that based on their observations of the revision strategies of writers across different levels, some expert writers (professional writers) are able to revise mentally before committing text to paper, much as skilled mathematicians can work out comp lex math problems in their head s and arrive at the correct answer without working


39 all parts of a problem out. Alternatively, developing writers are unlikely to have mastered the process of internal revision, and feedback is meant to serve as guidance for a udience expectations. One study illustrated how diagnostic and prescriptive comments can be helpful feedback and lead to improved revision skills. Carifio, Jackson, and Lorrain (2001) studied twenty eight first and second year students taking an introduct ory literature course at a public community college in the Northeast. Students were given modified personal essays from a college writing textbook. They revised pretest and posttest essays and evaluated three treatment essays. None of the essays contained errors in grammar/mechanics. The essays given to the experimental group had specifically marked flaws with a separate key on what the mistakes were and suggestions on how to fix them while the control group had the same essays with only a summary of the pr oblems, specifying that no errors were in grammar/mechanics and instructing the students to revise with no suggestions for how to go about doing so. The researchers evaluated the revisions based on a system of Effective Change variables based on the Faigle y Witte system with the added categories of Development Change, Coherence Change, and Organization Change (Carifio, Jackson, & Dagostino, 2001). The researchers discovered that diagnostic and prescriptive feedback can be helpful as students in both groups improved in positive revisions : they made needed changes and had fewer negative revisions correcting areas that were not flawed or making a change that did not fix the problem from the pretest to the posttest ; the experimental group became more proficient at improving flawed sentences H owever, the study cannot


40 how long the ability to revise may last beyond the intensive practice sessions that were part of this study (Carifio, Jackson, & Dagostino, 2001). Peer Feedback and Revision By engaging in dialogue with peers, students may be better able to understand (2010) note that peers commu nicate by sharing the same language or way of speaking, meaning that they may more effectively understand comments from each other than comments that include professional or academic jargon with which they are unfamiliar. In a study o f 28 undergraduate psy chology students in a twelve week research methods course in which the researchers examined feedback and student revision, Cho and MacArthur (2010) found that students were actually better at incorporating multiple peer comments into their revisions than comments from a single expert. A subject matter expert with experience in the content of the course and in teaching the course was used to provide feedback to students but was not actually teaching the course being studied ; therefore, the researchers could control for possible extraneous influences. Students were randomly assigned to one of three experimental conditions. Students in one group received feedback only from the subject matter expert, students in another group received feedback from one of their peers, and students in another group received feedback from six they only received reviews based on thei r experimental condition. To control for writing skill, students took a multiple choice writing skill test prior to completing the writing assignment for the study. Students wrote a draft corresponding to a writing assignment in that course and reviewed th e drafts of others in SWoRD, an online tool for managing


41 drafts and reviews. They were unaware that a subject matter expert would be giving feedback and were even told they would not receive feedback from the course instructor. Reviews were done anonymousl y and students were instructed to create pseudonyms when they registered for the SWoRD system. As part of the reviewing process, students rated the draft being reviewed according to a rubric and were supposed to provide a comment for each area of the rubri c. Feedback was characterized as directive, which involves suggestions for specific changes; nondirective, which observes a general area needing improvement; praise comments; y; and off task comments. To assess revision, Cho and MacArthur (2010) used a coding system based on the Faigley Witte Revision Taxonomy. Through a follow up ANOVA, analysis of revision types in each group revealed that students who received feedback from multiple peers made more complex repair revisions than those in other groups. A multivariate regression analysis on feedback types and revision type showed that non directive feedback was predictive of complex repair revisions and new content revisions. Al so, a multiple regression analysis on the revision types and writing improvement indicated that complex repair revisions positively predict final draft writing quality (Cho & MacArthur, 2010) Directive feedback was associated with surface changes, and non directive comments were associated with complex repair revisions. Though they found that students receiving feedback from multiple peers recei ve non directive comments which led to complex repair revisions, some limitations lie in having blind reviews since, in many peer review activities, students know their reviewers and vice versa. Also, results might not generalize to other disciplines.


42 Yang (2011) developed a reciprocal peer review system that encouraged students to participate in modeling, coaching, scaffolding, articulation, reflection, and exploration as part of their peer review process. In the system, they can collect their examples of su rface errors (like run ons, fragments, spelling, etc.) that they can refer to as they offer reviews to peers and as they look at their own reviewed writing. Similar to coaching, scaffolding gives students access to definitions and examples, but these are f or error types found under global revision (coherence, organization, etc.). For articulation, peers provide comments to accompany identified error types, and those receiving the feedback can evaluate the helpfulness of the feedback and respond. Yang (2011) exts in the system, for reflection, there is a tool in the system that allows students to compare their originally submitted draft with the revisions of that draft provided by peers. A track changes feature allows students to see specific changes. To imple ment exploration, Yang (2011) collected data from 95 students at a science and technolog y university in Taiwan taking a third term English as Foreign Language writing class who had also passed the intermediate level of a nationwide English proficiency test. Using content analysis for first and final drafts, Yang (2011) peer review and revision processes.


43 In a t hree year study on peer review and revision in junior level sociology courses, Baker (2016) Students submitted drafts (complete or incomplete) the day be fore peer review was to take place in class. Students received blinded drafts, a rubric for evaluating drafts, and an instruction packet. Prior to peer review on the same class day, the instructor gave a lesson on how to give formative feedback. During pee r review, students had to select statements from categories on the rubric that corresponded with the expectations of the writing assignment, and they had to also provide detailed formative comments on their own for each section. They were told only the pee r review form with the rubric would be returned to the original author, so any comments on the actual draft would not be seen. Final drafts were due three and a half weeks after students received completed peer review forms. Baker (2016) analyzed peer review comments and revisions using content ana lysis and a version of the Faigley Witte revision taxonomy was used to differentiate between surface level and meaning level revision changes. The majority of student comments involved meaning level changes, and most students made meaning level changes in their drafts as opposed to surface level changes However, meaning level changes primarily involved adding text to the end of an essay rather than making meaning level changes to text that was reviewed. Baker (2016) notes that a significant limitation lies in not comparing peer review comments to revision choices, and it is possible that some students already knew what they needed to change particularly in cases where incomplete drafts were submitted for peer review. A limitation not mentioned was there was no comparison between students who revis ed without peer


44 review or who revised with limited guidance from the instructor to show the difference in revisions. Overall, t hough studies show how both expert and peer feedback can impact revision, not all studies show a change or improvement from rough to final draft when peer review is compared to no peer review. In a study comparing students in a peer review condition with t hose in a self review condition and no review, Covill (2010) found no significant difference in the final drafts. The researcher examined 61 students taking sections of a sophomore level psychology class. Each section was put into a different rev iew condition, and all students regardless of their group was provided with instructor feedback on the rough draft for all three required literature reviews before submitting their final drafts. While there was no significant difference in the final draft s, following a Covill (2010) found that more formal revisions were made by students in the no review condition than in the other two conditions S pecifically, students made more global revisions. While Covill's (2010) findings are unexpected, it is important to note that all students received a lesson in making reader based prose, and students in the no review condition still did receive feedback from the instruct or on rough drafts. Student Perceptions of Peer Review In professional writing contexts, peer review is an accepted practice for providing feedback which is then used to revise. Writing groups are typically considered to be a s, be they creative writers or doctoral students working on dissertations. These individuals seek out feedback from peers in order to improve their book which examined the origin of peer review from outside of


45 and naturally, as it were, among status equals and outside the influence of a teacher whose authority cou ld undermine the power of collaborative learning among perceptions on peer review, students indicate a preference for expert feedback as opposed to feedback from peers, which seem s to contradict previous findings on how peer feedback can influence self efficacy and revision. Though much research has been done on the value of peer review to show that it is an important element of the writing process, student perceptions on its value vary. If students do not value peer review, efficacy and self regulatory improve. According to Brammer and Rees who had completed their freshmen writing sequence and were in an intensive writing average of all sections of the course (Bra mmer & Rees, 2007). The researchers note of self efficacy in her writing ability I don't think they can do it competently, just like I don't think I can give a good Peer review b/c I am a horrible writer 2007, p. 80). However, frequency of peer review related positively with perceived value, and perceived value was found to have a positive correlation with required in class peer review. Brammer and Rees (2007) conclude that students who were in structed in how


46 to perform peer review and were required to participate in peer review more frequently found it to be a valuable practice. Kasanga (2004) in her study of 52 students in academic writing at a University in South Africa determined that stude nts primarily used teacher feedback over peer feedback, possibly due to the perception of the teacher as the one who awards final grades and due to negative feedback, which studies cited by Kasanga indicate may lead to a lack of trust between peers and rel However, despite these conclusions, students in the study reported a willingness to participate in peer review and satisfaction with the comments provided by peers (Kasanga, 2004). Ludemann and Mcmakin's (2014) study previously discussed in the section on Feedback and Writing Self impact the final grade for that assignment (p. 135). Despite these student concerns, L psychology class also found that students believed that providing feedback (not receiving it) was helpful to improving their own writing. Interestingly, even when peer review While some students indicate a preference for instructor feedback, Cho, Schunn, and Charney (2006) found that in a blind peer review stud y, there was no significant difference in the perceived helpfulness of peer comments versus expert comments. Cho, Schunn, and Charney (2006) studied 30 undergraduate students in research methods course. Stude nts were told their reviewers might be the subject matter expert,


47 a single peer, or six peers. Students wrote a draft corresponding to a writing assignment in that course and reviewed the drafts of others in SWoRD, an online tool for managing drafts and re views. Then, students rated the helpfulness of feedback using a scale of 1 undergraduate course, Cho, Schunn, and Charney (2006) used a two way mixed ANOVA to analyze the feedback helpfulness ratings and found no evidence that peer feedback was perceived to be less helpful than expert feedback. In the same study, 88 undergraduate students in a psychology course for non psychology majors participated in peer review in SWoRD as well, and Cho, Schunn, and Charney (2006) found that the type of feedback may impact perceived helpfulness. In the large undergraduate course, directive feedbac k and praise feedback positively influenced student perceptions while critical feedback seemed to negatively influence student perceptions on the helpfulness of feedback. Feedback Modality There has been some debate as to whether written feedback is the be st modality for providing feedback, and some studies show that audio or audiovisual feedback is perceived to be more detailed and more personal than written feedback alone (Crook et al., 2012; Hung, 2016; Vincelette & Bostic, 2013) Howev er, most studies on modalities other than written are small scale, and very few focus on peer feedback. Vincelette & Bostic (2013) studied 39 students and their 5 composition instructors on their use and perceptions of instructor feedback via screencast T he researchers found that students felt screencast feedback was more effective and that screencast feedback led them to made better revisions.


48 Crook and colleagues (2012) studied 287 students and 27 staff members on the use of video technology for providi ng feedback but found that most students preferred written feedback. Students and staff were surveyed on the possible benefits of video feedback before and after their use of a Web 2.0 resources called ASSET which serves as a video repository for students and staff. Though the majority of staff surveyed indicated a preference for giving video feedback, students preferred written feedback. This outcome could be a result of the video feedback not being individualized. Instead, generic videos were developed an d students could choose the ones they needed to watch from a repository. Students cited the lack of individualized comments as a disadvantage to video feedback compared to written feedback (Crook et. al., 2012). In a comparison of learner engagement with written peer to peer feedback and video feedback, Hung (2016) had 60 English as a Foreign Language (EFL) students in groups post video feedback on Facebook to members of their group. Facebook was chosen because students would receive updates when posts w ere made to their group and because groups could be made private. The researcher developed the Video mediated ora l feedback questionnaire (VOFQ) and offered a detailed description of how the questionnaire was developed, piloted, and modified to ensure reli ability and content validity. Students also completed 300 500 word reflections and fifteen students were randomly chosen to participate in semi structured interviews. These videos were not screencasts but were video recordings of individual students talki ng into a video camera. Each student was required to create a three minute video responding to (Hung, 2016, p. 93) Then, students had to view the videos posted by their group members and


create a two-minute video offering feedback, identifying weaknesses and ways to improve, to each group member on his or her initial three-minute video (Hung, 2016, p. 93). Hung (2016) found that students engaged in observation through modeling. One-third of the students engaged in discussions with their peers about the feedback, and students engaged in cognitive strategies for achieving learning goals, such as writing down comments after viewing a video to consider for further improvement. Learning gains, however, were not examined in this study. Most problems identified with different modalities of feedback, whether video or screencast, parallel issues already acknowledged as challenges to students understanding and using written feedback, such as too much feedback causing cognitive overload, but some issues are related to technology use.

Treatment: Collaborative Multimedia Peer Review

The treatment for this study was based on the social cognitive model for sequential skill acquisition because the model accounts for self-efficacy beliefs, provides a scaffold for helping learners transition from needing social support to self-regulation, and has been used as a model for instruction to improve student revision, albeit revision focused solely on sentence combining (Zimmerman & Kitsantas, 2002). The four levels involved are observation, emulation, self-control, and self-regulation. At the observation level, students observed a video modeling how to offer feedback during peer review, emphasizing the feedback types that have been correlated with global revision, and how to revise (Cho & MacArthur, 2010). At the emulation level, students enacted providing feedback to each other, which was also a form of practice in evaluating essays that should have helped them when


they actually revised. Students participated in peer review with VoiceThread instead of the discussion board or other text-based commenting tools because Raedts and colleagues (2007) found that students who observed video with peer models had better calibrated self-efficacy beliefs and outperformed those in a condition without video. Also, Jones, Georghiades, and Gunson (2012) examined student perceptions of peer tutor feedback via screen capture and found that all interviewed students commented on how knowing the person who provided the feedback shaped their reception of that feedback. For the next level, self-control, students composed a self-reflection, since this level was when they were measuring their standard of how to evaluate their writing and make decisions for revision against what they saw in the videos and experienced in peer review. During the self-regulation level, Zimmerman and Kitsantas (2002) explain that learners shift from observing and reflecting to performing, which requires self-regulation. At this level, students revised and submitted a final draft to demonstrate their competence at revising. They repeated this process for all four essays in the course.

Observation Level

The treatment included a video modeling how to provide feedback during peer review, emphasizing Cho (2006) and Cho and MacArthur's (2010) nondirective and praise types of feedback while introducing other types, with the exception of criticism, as well as how to question and consider the feedback one received from peers (Table 2-1). It also modeled a developing writer progressing toward becoming a skilled writer. Modeling is a type of observational learning, which


has been well established as an effective instructional strategy for promoting self-efficacy (Bandura, 1991, 1997; Bruning & Kauffman, 2015; Raedts, Rijlaarsdam, van Waes & Daems, 2007; Schunk & Zimmerman, 1997; Wang & Wu, 2008; Zimmerman & Kitsantas, 2002). The students used in the video were volunteers and not students in the freshman composition sections being studied. During the first minute and a half, the video explains the purpose of peer review, which is to help a person become better at evaluating his or her own writing by practicing evaluating the writing of his or her peers. It also explains which areas of writing, such as content and organization, are most important to evaluate. The next thirty seconds describe the types of feedback students should give, without specifically labeling them as directive, nondirective, and praise. (The names of those who volunteered to be in the video have been changed.) In the ten minutes that follow, the video shows Hannah using VoiceThread, an online multimedia discussion tool, in order to provide audio comments (Figure 2-3). The video captures Hannah as she logs in and as she accesses her group and the drafts posted. It also shows Hannah as she responds to feedback posted to her on VoiceThread. The remainder of the video shows Sarah as she reviews the comments her group members have posted in reply to feedback that she provided. Sarah uses the option to post a video comment to further explain her initial feedback to Hannah. The video also shows Sarah as she considers feedback and begins revising her essay, highlighting how her revision approach can be improved (Figure 2-4).


Throughout the video, there are eight questions on concepts from the video that allow students to test their knowledge of proper peer review practice as they progress through the video. One of the questions is demonstrated in Figure 2-5. Rather than the coping model used in prior research, the video utilized error management training, a method that emphasizes to the learner that making errors is an essential part of the learning process and frames errors or inadequacies as a form of positive feedback in learning specific skills or in decision making (Keith & Frese, 2008). In the instructional video, there are areas where making errors or not meeting the writing standard is shown to be a natural part of learning. Early in the video, text explains how peer reviewers are not expected to be experts in academic writing (Figure 2-6). This comes up again when one of the volunteers, Hannah, expresses concern that her feedback to Daryl may be confusing. Text in the video reiterates that making mistakes is a natural part of the process and that the recipient can ask for clarification if the feedback is confusing. The video encourages discussion between classmates to work out when someone has made an error in his or her writing or when feedback seems inaccurate. Also in the video, one of the student volunteers demonstrates her revision process. She starts by correcting grammar, which is shown to be a mistake since examining and changing content is more important initially. The video again indicates mistakes are normal. Also, for outline assignments, students were permitted to revise and resubmit their outlines for a higher grade when they had errors in content. For example, the first outline assignment required students to prewrite and develop an outline on the essay


topic: illustrate the qualities of a purchaser of a specific product. Students whose outlines did not address the topic prompt were provided with feedback letting them know they did not address the prompt and were encouraged to examine the lecture once more before resubmitting an outline for the instructor to review. Also, comments in the instructional video emphasized the process of making changes or needing to improve areas as a normal part of writing; for example, a thesis statement may change multiple times before the writer finds the statement that best expresses what the essay is about, and the video models this kind of thesis development before hitting on the best possible option. Since writing is a recursive process, and the point of revision is to improve upon organization, audience awareness, and grammar and mechanics, error management training is well suited to supporting the revision process.

Emulation Level

Following the video, students emulated the performance modeled in the video by actually participating in peer review, which corresponds with the emulation level. Zimmerman and Kitsantas (2002) contend that social feedback is essential at this level since a connection has been shown to exist between social feedback, achievement, and motivation. Students were placed into heterogeneous groups of three or four, and each student posted an essay draft on an assigned topic. VoiceThread is a tool that allows


users to create multimedia discussion threads by developing screencasts from uploaded files on which others can post text, voice, or video comments. There is also a drawing tool that allows one to highlight as he or she offers voice/video comments. Each student created a VoiceThread offering feedback on other group members' drafts, pointing to the specific areas that were discussed in the screencast. Once group members received a VoiceThread with feedback, they were supposed to reply on that thread by clicking a button to create a text, voice, or video comment. The replies should have included an explanation for which suggestions from the screencast they intended to use, which they did not intend to use, and an explanation why. This stage allowed students to receive social feedback on how well their drafts and their comments to each other met the writing standards of the group and allowed them to engage in a dialogue. A reply that questioned a suggestion could prompt a follow-up text, voice, or video comment from the reviewer justifying the original suggestion. Essentially, each student was required to contribute a total of six posts as part of a dialogue with two other classmates about the feedback they received and the feedback they provided to other group members within VoiceThread, as illustrated in Figure 2-7 below. Requiring students to develop thoughtful replies to feedback in VoiceThread emphasizes the interaction between students, which is essential for collaborative learning to occur (Zhao et al., 2014).


Self-control Level

At the self-control level, students began making decisions about what changes they would implement in their own drafts. The standard that learners measure their performance against, described by Zimmerman and Kitsantas (2002), is in this study the instructional video and classmates in their peer review groups (p. 661). Students were required to compose a self-reflection which prompted them to elaborate on their decision-making. The reflection was based on the chapter "Reflective Writing and the Revision Process" (Giles, 2010) from Writing Spaces: Readings on Writing, Volume 1, which is an open-source online book about writing directed at composition students. The chapter can be accessed at http://writingspaces.org/sites/default/files/giles--reflective-writing-and-the-revision-process.pdf. In the chapter, Giles (2010) introduces important points of reflection and a letter to the reader assignment in which students address the following:

Tell the reader what you intend for the essay to do for its readers. Describe its purpose(s) and the effect(s) you want it to have on the readers. Say who you think the readers are.

Describe your process of working on the essay. How did you narrow the assigned topic? What kind of planning did you do? What steps did you go through, what changes did you make along the way, what decisions did you face, and how did you make the decisions? How did comments from your peers, in peer workshop, help you? How did any class activities on style, editing, etc., help you?

Consider how well the letter and essay match up. Does the essay really do what your letter promises? If not, then use the draft of your letter as a revising tool to make a few more adjustments to your essay. Then, when the essay is polished and ready to hand in, polish the letter as well and hand them in together.


Giles (2010) also provides a model letter to the reader. Following this approach, students were instructed to include a letter to the reader on the last page of each essay final draft. Self-reflection is a key component of the learning process, particularly as it supports self-regulation (Bandura, 1991; Duijnhouwer et al., 2012; Negretti, 2012).

Self-regulation Level

Zimmerman and Kitsantas (2002) explain that, during the self-regulation phase, learners shift from observing and reflecting to performing, which requires self-regulation. Learners shifted from practice, with an understanding of what decisions skilled writers make during revision, to actually using those skills. They should have made changes to their essay draft based on what they learned about revising from the instructional video, from their dialogues with classmates in VoiceThread, and from their self-reflections. Following revision, they submitted a final draft to demonstrate their competence at revising. This four-step procedure, shown in Table 2-2, was repeated for each of the four essays in this online composition course.

Conceptual Framework

The conceptual framework for this study is informed by social cognitive principles on the writing process (Flower, 1990; Flower & Hayes, 1981; Magnifico, 2010; Mitchell & Taylor, 1979; Zimmerman & Kitsantas, 2002) and empirical evidence that presents various peer review practices that are relevant for teaching composition in an online environment. Within college composition courses, peer review is viewed as an important form of collaborative learning for students developing their written communication skills because peer review allows students to share writing and feedback that is intended to


emulate feedback from an authentic audience. This practice is meant to teach students how to anticipate audience expectations as they revise their writing (Magnifico, 2010; Paulson et al., 2007). Revision is viewed as a key feature of the writing process since it is a complex task that may include revisiting previous stages of the writing process (such as planning or drafting); writers who focus more on global issues, areas that impact meaning and coherence, are considered to be more skilled (Faigley & Witte, 1981; Kellogg, 2008). A quality of learners believed to impact their ability to revise is self-efficacy (Bandura, 1997; Zimmerman & Schunk, 2001). The conceptual framework depicted in Figure 2-8 demonstrates two key problems. The first is the absence of a standard strategy or set of strategies for how to implement peer review that might have consistent results. Though peer review is a collaborative activity, there are many instances of peer review as a one-way interaction (Hauptle, 2006; Keeley, 2014). Another problem lies in a lack of research on how peer feedback impacts self-efficacy and revision specifically as it applies to college composition. Though peer review has been shown to be effective in some cases (Carifio, Jackson, & Dagostino, 2001; Cho, Schunn, & Charney, 2006; Patchan et al., 2013), in other studies peer review is shown to be ineffective according to student or instructor perceptions (Bedore, 2004; Ludemann & Mcmakin, 2014), so it would seem that students may not have a clear understanding of how to provide feedback that promotes self-efficacy and revision.


The purpose of this study was to examine the relationship between three variables: peer review, self-efficacy, and revision. The treatment illustrated in Figure 2-8 depicts Collaborative Multimedia Peer Review, an approach based on the Social Cognitive Model of Sequential Skill Acquisition because the model is supported by research as promoting social learning, self-regulation, and a learner's ability to evaluate his or her performance. The video developed for these students defined and emphasized the types of feedback research has shown to promote self-efficacy and revision focused on global issues. It also demonstrated how to provide feedback in different modalities in VoiceThread, emphasized error management training instead of error avoidance, and modeled revision that values meaning-level changes over surface-level changes. In the current context, students would be adapting their performance in peer review to evaluate their own drafts as they revise. The control for this study is also shown in Figure 2-8. These students participated in peer review activities predicated on one-way interactions. Students reviewed each other's drafts in a small group, filled out a peer review worksheet on each draft reviewed, and posted the worksheets on a discussion forum for their other group members to collect. Unlike the treatment, students in this group did not review instructions or modeling on how to give feedback, interact in peer review, or revise beyond basic class instructions. Ideally, this research would demonstrate whether instructional scaffolding for peer review based on the Social Cognitive Model of Sequential Skill Acquisition has a positive impact on students' self-efficacy, revision practices, and perceptions of peer review and performance. To determine if there is an impact on student learning, the two


groups are compared, and the group using scaffolding is closely examined for correlations between variables.

Figure 2-1.

Figure 2-2.


Table 2-1. Examples of feedback types and definitions

Directive
Definition: Explicit suggestions for change applicable only to a specific paper.
Example: "... seemingly unrelated sources. However, they are all good references and could provide valuable contribution to your arguments. So try to present them in your favor, and build logical connection among these 3 references."

Non-directive
Definition: General comment applicable to any paper.
Example: "re ..."

Praise
Definition: An encouraging comment.
Example: "... using previous information to branch off and inquire about other issues relating to your ..."

Summary
Definition: Restates main points.
Example: "... expect a decreased frequency of scientific conversation involving prediction and explanation in a non-scientific setting."

Figure 2-3. Hannah using VoiceThread to post an audio comment and using the pen tool. (Photo courtesy of author)


Figure 2-4. VoiceThread. (Photo courtesy of author)

Figure 2-5. One of the questions embedded in the instructional video. (Photo courtesy of author)


Figure 2-6. Error management training within the instructional video. (Photo courtesy of author)

Figure 2-7. Peer review interaction on VoiceThread.


Table 2-2. Treatment for teaching the writing process using the social cognitive model for sequential skill acquisition

Observation
Theory: Learners develop a clear image of what skill they need to perform through observing a model (Zimmerman & Kitsantas, 2002).
Practice: Students watch a video modeling how to provide feedback in peer review (important to self-efficacy) and modeling how to revise, focusing on error management training.

Emulation
Theory: Learners emulate performance while receiving social feedback, which influences their motivation and level of achievement (Zimmerman & Kitsantas, 2002).
Practice: Students participate in peer review (giving and receiving feedback) in VoiceThread, engaging in a dialogue with group members. They evaluate the writing of others.

Self-control
Theory: Learners compare their own performance to the standard of the model on which their practice attempts are based (Zimmerman & Kitsantas, 2002, p. 660).
Practice: Students write a self-reflection explaining their thought process for decisions about how to use feedback and justification for revision decisions they are making. They evaluate their own writing.

Self-regulation
Theory: Learners adapt their performance to changes in internal and external conditions; self-efficacy beliefs impact this stage (Zimmerman & Kitsantas, 2002, p. 660).
Practice: Students enact revision decisions, utilizing the strategies of skilled writers.


Figure 2-8. Conceptual framework.


CHAPTER 3
METHODOLOGY

Research Design

The following research questions were addressed:

1. To what extent does Collaborative Multimedia Peer Review contribute to students' writing self-efficacy in an online freshman composition course?

2. To what extent does Collaborative Multimedia Peer Review contribute to students' revision choices in an online freshman composition course?

3. What relationship, if any, exists between Collaborative Multimedia Peer Review, writing self-efficacy, and revision in an online freshman composition course?

4. What are students' perceptions of receiving and providing feedback using voice and video comments in VoiceThread, and how do these perceptions relate to their writing self-efficacy and revision strategies?

This study utilized a convergent mixed methods research design with parallel quantitative and qualitative data collection and evaluation in order to determine whether and how CMPR contributes to writing self-efficacy, contributes to student revision choices, and what, if any, relationships exist between CMPR, writing self-efficacy, and revision in online college composition courses. Five sections of an online ENC 1101 course were used (n = 41), with two sections utilizing CMPR (n = 22) and the other three sections utilizing peer review worksheets traditionally used in this course (n = 19). A mixed methods design was appropriate because combining quantitative and qualitative data provides a fuller understanding of the relationships between peer review, writing self-efficacy, and student revision (Creswell & Clark, 2007, p. 62).


Participants and Context

The site of this study was a large college (40,000+ students) in northeast Florida. A full teaching load is fifteen workload units, or five sections of a three-credit course, so five sections of online ENC 1101 College Composition were used. All five sections were taught by the same instructor, the primary investigator, from a premade shell in the Blackboard Learning Management System developed by the college. Two of the sections were randomly assigned to use CMPR by flipping a coin, and the shell materials for peer review were edited to include the peer review scaffolding video and the peer review process modeled after the social cognitive model of sequential skill acquisition. The other three sections received pre-training and engaged in peer review with the worksheet that was already built into the premade course shell. Sections of ENC 1101 have an enrollment cap of 22 students; however, to be included in the study, students had to anonymously indicate their consent, complete at least three of the four peer review assignments, and submit a rough and final draft for the final essay module. Forty-one students met these criteria, 22 from sections using CMPR and 19 from the sections using peer review worksheets. The participants for this study fell into two of Creswell's (2013) sampling categories. Though primarily chosen for convenience, the participants in this sample were also representative of typical freshman students at this institution.

Exploring Group Homogeneity Relative to Writing Skills

As of November 2013 in Florida, students who graduated from a Florida public high school in 2007 or later are not required to take a college placement test to determine college readiness for English.


To establish a baseline of writing ability, students in all four sections were given a diagnostic timed writing prompt prior to the first essay assignment. The diagnostic essays were evaluated with a departmental rubric, which is used for writing effectiveness essay assessments (Appendix B). A selection of essays was scored by another experienced writing faculty member who was familiar with the rubric. McGraw and Wong (1996) explain that when measurements share both metric and variance, as is the case when comparing multiple raters' scores, reliability should be assessed with the intraclass correlation coefficient (ICC). Shrout and Fleiss (1979) indicate there are three different study types, or cases, for determining interrater reliability with the ICC. Where two people are rating the same targets, as is the case here, the raters should be treated as fixed effects (Shrout & Fleiss, 1979). Interrater reliability was moderate at .643 (N = 16). The average rubric scores determined by each rater were within one level of each other, with one rater consistently scoring slightly higher. With regard to faculty reliability in scoring student writing with rubrics, Good (2012) found that even with training on using a specific rubric to assess focus, content, organization, style, and language conventions, faculty struggle to agree on style and organization.

Treatment

The online ENC 1101 course shell required four practice assignments, each of which was a paragraph meant to lead directly to composing an essay on the same topic. However, outlines have been shown to be beneficial to developing writers by lessening the potential cognitive load associated with complex writing tasks (Baaijen et al., 2014; Kellogg, 1988), so the course content in all shells was adjusted to include a well-developed essay outline as a practice assignment rather than a paragraph.
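As a brief illustration of the reliability and group-homogeneity checks described earlier in this section, the following minimal sketch shows how two fixed raters' rubric scores and the two groups' diagnostic scores could be analyzed. This is not the analysis script used in the study: the numbers, variable names, and the choice of the pandas, pingouin, and scipy libraries are illustrative assumptions only.

    # Minimal sketch of the interrater reliability and group-homogeneity checks.
    # All data values and column names below are hypothetical.
    import pandas as pd
    import pingouin as pg
    from scipy import stats

    # Interrater reliability: the same two raters score every diagnostic essay,
    # so raters are treated as fixed effects (Shrout & Fleiss Case 3, i.e., ICC3).
    ratings = pd.DataFrame({
        "essay": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],   # essay IDs (hypothetical)
        "rater": ["A", "B"] * 6,                          # the two raters
        "score": [3.0, 3.5, 2.5, 3.0, 4.0, 4.0, 3.5, 4.0, 3.0, 3.5, 2.5, 3.0],
    })
    icc = pg.intraclass_corr(data=ratings, targets="essay",
                             raters="rater", ratings="score")
    print(icc.loc[icc["Type"] == "ICC3", ["Type", "ICC", "CI95%"]])

    # Group homogeneity: independent-samples t-test comparing diagnostic rubric
    # scores for the CMPR sections and the worksheet sections.
    cmpr_scores = [3.2, 3.4, 2.8, 3.6, 3.1]        # hypothetical CMPR scores
    worksheet_scores = [3.0, 2.7, 3.3, 2.9, 3.2]   # hypothetical worksheet scores
    t_stat, p_value = stats.ttest_ind(cmpr_scores, worksheet_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

A nonsignificant t-test, as in this sketch, would support treating the two groups as comparable in baseline writing ability before the peer review treatment began.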


Major essay assignments included in the course shell were an example essay, a compare and contrast essay, a cause or effect essay, and a definition essay. While the definition essay was meant to be a means of incorporating other rhetorical strategies and was the only assignment requiring research, I decided to replace it with a rhetorical analysis essay because one of the course outcomes is analyzing a reading selection. The required textbook, The Little Seagull Handbook, provides a section on how to write a rhetorical analysis of an argument. A proctored timed essay on a general topic was also a requirement of the course. For the sections using CMPR, a module was developed that focused on how to participate in peer review. In the module, the instructional video was assigned as part of the class lecture. The lecture explained the types of feedback, emphasizing the use of directive comments, nondirective comments, and praise comments, and described the requirements for peer review. The module also included a handout students could view online or download, which provided instructions and screenshots showing how to log in to VoiceThread, how to access peer review groups, and how to post comments. The module included a practice peer review assignment that required students to log in to VoiceThread, access the group already created for them by the instructor, and comment on a sample student essay. The practice assignment allowed students to become familiar with VoiceThread and troubleshoot any technical issues prior to participating in peer review with their classmates. In subsequent modules, for each writing assignment students were required to submit an outline. This was graded and returned with instructor feedback indicating


whether students demonstrated an understanding of the writing task prior to the deadline for student rough drafts. Students submitted their rough drafts through an assignment link in Blackboard. This was consistent with how they submitted other written work such as their outlines and their final essay drafts. I collected the rough drafts submitted by the deadline by downloading them, and then I uploaded each draft to VoiceThread in order to minimize potential technical difficulties that might inhibit students from uploading to VoiceThread. To prevent students from expecting the instructor to identify all errors on their drafts, instructor feedback was not provided on rough drafts in VoiceThread. As part of the rough draft assignment description, students were told the following:

Since this class is teaching you to evaluate your own writing effectively, I will not be automatically reviewing drafts, but I am happy to answer any specific questions you may have about your writing like, for example, "I am having trouble with my thesis. Would you please review it and let me know if you think it accurately explains what my essay is about?" This is just an example, but any specific questions you may have about your writing, I am happy to answer. Just email me or see me during online or face-to-face office hours.

Despite these comments, I looked over drafts and notified students who seemed to have misunderstood the assignment that they needed to reread the assignment description and get in touch with me for further assistance. Students who were in sections not using CMPR worked from the peer review worksheet and the instructions already provided in the premade course shell. However, they received pre-training on the purpose of peer review and how to give feedback. They also completed a practice peer review activity with a sample student essay. These students were similarly informed that the instructor would not be providing feedback on their rough drafts unless specific help was solicited. Here too, I contacted


students who seemed to have entirely misunderstood an assignment, encouraging them to reread the assignment description and to get in touch with me.

Instruments

The dependent variables in this study were writing self-efficacy, how individual students revise, and student perceptions on giving and receiving feedback.

Writing self-efficacy

Writing self-efficacy was measured using the Writing Self-Regulatory Efficacy Scale. The scale was developed and validated by Zimmerman and Bandura (1994) to measure perceived efficacy for steps in the writing process in regular and advanced college writing courses. The survey was given to all students before they began the first peer review assignment and after the fourth and final peer review exercise to determine whether any changes in self-efficacy may have occurred. The instructional video modeled how to provide feedback in peer review, which should have positively impacted self-efficacy, so it was assumed that students in that group who began with low self-efficacy scores would have higher scores following instruction and peer review. Zimmerman and Bandura (1994) reported the reliability of the Writing Self-Regulatory Efficacy Scale at α = .91. The assessment included twenty-five questions that ask students for perceptions about their ability to engage in steps of the writing process (planning, organizing, and revising), their creative capability for writing, and their ability to self-regulate through the process (Zimmerman & Bandura, 1994).

Revision

To measure student revision, a version of the Faigley-Witte Revision Taxonomy was used. The original taxonomy was assessed for validity, and earlier versions with


additional categories were revised to improve reliability (Faigley & Witte, 1981). Cho and MacArthur (2010) identified four types of revision change condensed down from Faigley and Witte's (1981) taxonomy of 24 types of revision change. In my research design, Cho and MacArthur's (2010) version of the instrument was used. Cho and MacArthur (2010) verified the reliability of their coding scheme by selecting ten percent of student drafts in their study for coding by another rater; Cohen's Kappa for coding agreement was acceptable at .83. To ensure reliability in using Cho and MacArthur's (2010) coding system, another writing faculty member coded a sample of student essay revisions, and interrater reliability was assessed using the intraclass correlation coefficient (ICC). Then, I coded all of the remaining revised essays. For the fourth essay assignment, each sentence in the revised essay was compared to the peer review draft, and the frequency of changes was counted. Frequencies were evaluated quantitatively.

Perceptions

Many studies on peer review also examine student perceptions because perceptions influence student self-efficacy, interest, goal orientation, and, ultimately, their engagement with the task (Brammer & Rees, 2007; Cho, Schunn, & Charney, 2006; Kasanga, 2004). To measure student perceptions of receiving feedback and using VoiceThread, a seven-point Likert-type Judgment of Learning item common in metacognition literature was used (Anderson & Thiede, 2008; Thiede & Anderson, 2003). To assess participants' perception of learning as a result of the peer review, the participants responded to the following question: How well do you think you understood what revisions needed to be made based on your PEERS'


feedback? 1 (very poorly) to 7 (very well). To assess their perception of learning as a result of giving peer reviews, they answered the following question: How well do you think you understood how to improve your own writing as a result of GIVING reviews to others? 1 (very poorly) to 7 (very well). Together, this was a two-item measure used upon completion of each revision. A seven-point Likert scale was also used to determine the degree to which students felt VoiceThread was useful: How useful was VoiceThread for engaging in peer review with your classmates? 1 (not useful) to 7 (very useful). Because VoiceThread allowed for different comment types (text, audio, and video), an open-ended question asked students to identify their preferred method or combination of methods for giving feedback in VoiceThread: text, audio, video, or a combination. Finally, a five-point Likert Prediction of Performance item was used to assess how well participants believed they performed on the revisions (e.g., Thomas, Antonenko, & Davis, 2016). Specifically, the participants responded to the question, "What score do you expect to receive for your revised essay?" They selected a score from 1 to 5 for each category of the same rubric that was used to evaluate the diagnostic essays. Likert-scale questions were evaluated quantitatively, while the open-ended question on student preferences for feedback modality in VoiceThread was examined through thematic analysis. Results were analyzed with other data. Validity was ensured through data triangulation with the scores from the Writing Self-


Regulatory Efficacy Scale and the revision taxonomy. Also, reporting discrepant information further enhances validity, and so a detailed description of the coding process and of how quantitative and qualitative data were merged is provided.

Procedure

ENC 1101 online is a sixteen-week course. During the first week, students reviewed the syllabus and acknowledged that they had read and understood it, including assignment descriptions and how peer review was executed. The drop/add period ended after the first week. During week two, once drop/add ended, the faculty dropped students for nonattendance if they had not completed any of week one's assignments. The diagnostic essay was due during week 2 of the course and was submitted as an essay quiz in Blackboard. The Writing Self-Regulatory Efficacy Scale survey was also due during week 2. The instructor video on providing feedback in peer review and revising was required viewing in week 3 for course sections randomly chosen for the treatment group. Students included in this study spent an average of 41 minutes in the lecture content where the video was embedded. Students participated in peer review for the first time during week 4; students in the treatment used VoiceThread, and those in the control posted peer review worksheets to the discussion board in Blackboard. The peer review process was repeated for each of four essay assignments. Data were collected from peer review for the final essay, in week 13, giving students ten weeks to practice giving and receiving feedback from peers and to practice revising. Also, for the final essay, rough and final drafts were collected to assess students' revision skills. Finally, students took the Writing Self-Regulatory Efficacy


Scale survey in week 15 of the course. The sequence of activities is illustrated in Table 3-1.

Data Analysis

Diagnostic scores

Diagnostic essays were assessed using the faculty-developed departmental rubric. Essays were scored from level 1 to 5 in content, organization, conventions, and language and audience. Level 5 is equivalent to demonstrating excellence or A-grade-level skill in that area, level 4 is considered very good or equivalent to a B, level 3 indicates average skill and is equal to a C in that area, and so on. Another experienced writing faculty member familiar with the rubric scored a selection of essays to ensure interrater reliability. Interrater reliability was moderate at .643 (N = 16). An independent t-test showed no significant difference in diagnostic scores between the students in sections using CMPR (M = 3.20, SD = .59) and students in sections using peer review worksheets (M = 3.04, SD = .80); t(39) = .76, p = .45.

Research Question 1: To what extent does collaborative multimedia peer review contribute to students' writing self-efficacy in an online freshman composition course?

Writing self-efficacy was measured using the Writing Self-Regulatory Efficacy Scale, which assesses perceived efficacy for steps in the writing process (Zimmerman & Bandura, 1994). The Writing Self-Regulatory Efficacy Scale was given to students in all four sections before they began the first peer review assignment and after the fourth and final peer review exercise to determine whether any changes in self-efficacy may have occurred. Because the instructional video modeled how to provide feedback that would positively impact self-efficacy, it was assumed that students in that group who began with low self-efficacy scores would


have higher scores following instruction and peer review. The posttest scores were compared through a between-subjects ANCOVA with pretest scores as the covariate. Qualitative data from student-developed screencasts were also analyzed to explore possible associations between instructor modeling, feedback, and self-efficacy. Student-developed VoiceThreads were coded using Cho, Schunn, and Charney's (2006) six-category coding scheme to determine what type of feedback students gave each other: directive, which involves suggestions for specific changes; nondirective, which observes a general area needing improvement; praise comments; criticism comments; summary comments; and off-task comments. Thematic analysis (Braun & Clarke, 2006) was used to extract and analyze the follow-up dialogues in VoiceThreads because it is an effective method of finding and analyzing themes in qualitative data and can be used within different theoretical frameworks, as it is not bound to a particular theory (Braun & Clarke, 2006). Six groups, each with three members, were analyzed, so eighteen VoiceThread conversations were collected. Ideally, each thread should have contained at least three interactions: the initial feedback post or posts, a reply from the person receiving feedback, and another response from the person who originally posted feedback. In some cases, the initial feedback was one post critiquing the entire essay, and in others, the initial feedback involved multiple posts critiquing the writing on each page separately or each paragraph separately. Replies from the person receiving the feedback should have included an explanation for which suggestions he or she intended to use and which he or she did not intend to use, and why. None of the group

PAGE 76

conversations developed beyond a reply from the person being reviewed. Of the thirty-six interactions that occurred over the six groups, fourteen included a response to each classmate who provided feedback. For three of the drafts, the student writer posted a brief general reply such as "I appreciate all your guys' suggestions. Thank you," and for four drafts, one of the reviewers viewed the draft but did not post feedback.

The coding followed Braun and Clarke's (2006) six phases of analysis, which serve as guidelines for performing thematic analysis. For phase 1, I familiarized myself with the data first by transcribing the VoiceThreads and also by printing out comments and reviewing them for accuracy while rereading or listening again to the VoiceThreads. Braun and Clarke (2006) indicate that transcribing data facilitates a thorough understanding of the data. As my examination of qualitative data is based on theory related to feedback, self-efficacy, and revision, my coding was more theory driven than data driven, so for phase 2, I manually generated initial codes, looking specifically for patterns in feedback characteristics. In phase 3, I began searching for themes, thinking about the relationship between codes, between themes, and between different levels of themes (e.g., main overarching themes and sub-themes). As part of this process, I transferred my initial codes onto the files, created a table with the name of each code and a brief description, and began to tentatively designate themes (Braun & Clarke, 2006).
Braun and Clarke (2006) explain how phase 4 should involve reviewing extracts for each theme and determining whether there may be a pattern. Once a discernible pattern emerged, I reread my data to determine if the themes fit. According to Braun and Clarke (2006), at the end of phase 4, the different themes would be evident. At that point, I also had an idea of how they interacted, and I moved the coded data into NVivo 11. Saldana (2012) discusses how first-time coders should code manually but notes that most researchers utilize Computer-Assisted Qualitative Data Analysis Software, which "permit[s] you to organize evolving and potentially complex coding systems" (Saldana, 2012, p. 24). During phase 5, I organized my data extracts by theme, and in doing so, I was able to define each theme and refine the themes further to determine what, if any, sub-themes existed, how each theme related to the other themes that emerged, and how they contributed to the overall story told by the data (Braun & Clarke, 2006). Once I determined how the themes related to each other and the overall story told through the data, I began phase 6, which involved writing a rich description of what the data revealed within and across themes in a concise and logical manner. Braun and Clarke (2006) also point out that, at this stage, the narrative being developed should do more than describe the data; it should make an argument as well.

Research Question 2: To what extent does collaborative multimedia peer review contribute to students' revision process in an online freshman
composition course?

Student revisions were analyzed from the fourth essay assignment. Each student's revisions were coded using a version of the Faigley-Witte Revision Taxonomy. For the revisions of the fourth essay assignment, each sentence in the revised draft was compared to the peer review draft and identified as either No Change, Surface Change, Micro-level Change, or Macro-level Change (Cho & MacArthur, 2010; Faigley & Witte, 1981). The Compare Documents tool in Microsoft Word was used to show the tracked changes from rough to final draft. These files were printed along with the original rough drafts, allowing the primary investigator to more easily examine differences since, for essays where a lot of content was altered, it could be difficult to follow the tracked changes in the margins of a document.

To determine interrater reliability, a random selection of ten rough and final drafts was reviewed by another experienced English faculty member. Preceding the rating sessions, we met for three hours to review Cho and MacArthur's (2010) and Faigley and Witte's (1981) descriptions of change types and examples of error types. Two rating sessions, each lasting two hours, followed. As we marked changes directly onto duplicate copies of drafts, we compared our ratings for every other document. During training, we agreed on how to count each change type. Where Cho and MacArthur (2010) counted every word in a change, such as where a sentence or sentences were deleted and/or replaced, we decided to count the number of sentences instead of words in instances where changes had been made, as was the method in Faigley and Witte's (1981) research. We agreed counting the words might overrepresent meaning-level changes since meaning-changing revisions could include multiple sentences with several words when the writer may have only altered or added a
few thoughts. In the excerpt provided in Figure 3-1, a student made two surface changes and two macro-level changes, adding new points. The surface changes involved minor corrections such as subject-verb agreement; these changes were counted as two surface changes. In addition, the student created new content by adding the last two sentences in the paragraph, which bring up points that did not previously exist in the essay. (This differs from extending content, a micro-level change, in that extended content adds to a point that already existed.) The addition of two sentences with new points is counted as two macro-level changes, one for each sentence, instead of as sixty-four macro-level changes for the number of words added.

Interrater reliability was assessed using the intraclass correlation coefficient (ICC). Of the three different cases which Shrout and Fleiss (1979) indicate can be utilized to determine the ICC, the two-way mixed model with the absolute agreement type was used here since the two raters were fixed and the rate of exact agreement was being examined, N = 10. The ICC was found for each change type, with all types showing high reliability (Table 3-2). For surface changes, the rate of reliability was high, with an average measure ICC of .982. As Cho and MacArthur (2010) divided micro-level changes into two categories (complex repair and extended content) and macro-level changes into two categories (new points and organization), the ICC was found for each subcategory. The average measure ICC for complex repairs indicated a high rate of reliability at .985, and for extended content, the average measure ICC was .988. Where new points were identified, the average measure ICC was .997, and the average
measure ICC for organization was .959. Finally, the reliability between raters for indicating when no change had been made to sentences was calculated, and the average measure ICC was .988. Table 3-2 reports the ICC and the confidence intervals for each category as well as p values. Results were evaluated quantitatively, looking primarily for instances of micro-level and macro-level change as indications that the students are becoming more skilled at revision. The revision analysis categories are provided in Table 3-3. Faigley and Witte's (1981) results for the percentage of surface and meaning-level changes were used to rank each student as being either inexperienced (1), advanced (2), or expert (3). These ranks do not denote that individuals are actually expert writers, advanced writers, or inexperienced writers overall; they are used here to indicate student revision practices based on previous findings for revision practices. Because the dependent variable was ordinal and not continuous, a nonparametric test, the Mann-Whitney U test, was used.

Research Question 3: What relationship, if any, exists between Collaborative Multimedia Peer Review, students' writing self-efficacy, and students' revision processes in an online freshman composition course?

Quantitative and qualitative data already collected from the CMPR group were converged and analyzed (n = 22). Quantitative data on self-efficacy and revision were analyzed with correlation analysis. Quantitative results were compared with qualitative data from peer review for a triangulated view of how the variables, self-efficacy and revision, interact with CMPR (Creswell, 2014).
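As an illustration of the average measure ICC computation described above, the sketch below uses the pingouin library in Python. It is a minimal example under assumptions: the long-format counts and column names are hypothetical, and the ICC2k row (average measures, absolute agreement) is read off because, for two fixed raters, its point estimate should match the two-way mixed, absolute-agreement value reported here.

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format counts: one row per essay per rater for a single
    # change type (e.g., how many surface changes each rater tallied in an essay).
    data = pd.DataFrame({
        "essay": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
        "rater": ["A", "B"] * 5,
        "count": [12, 11, 5, 5, 20, 19, 8, 9, 3, 3],
    })

    icc = pg.intraclass_corr(data=data, targets="essay", raters="rater", ratings="count")
    # ICC2k = average measures, absolute agreement.
    print(icc.set_index("Type").loc["ICC2k", ["ICC", "CI95%", "pval"]])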

Research Question 4: What are students' perceptions of receiving and providing feedback using voice and video comments in VoiceThread, and how do these perceptions relate to their self-efficacy and revision strategies?

Participant responses to the Judgment of Learning survey, the usefulness-of-VoiceThread question, and the Prediction of Performance item were examined using descriptive statistics and a series of Spearman rho correlation analyses. Spearman rho was appropriate because the data being compared were ordinal. Thematic analysis was used to analyze responses to the open-ended question on student preferences for feedback modality in VoiceThread. Table 3-4 demonstrates the alignment of the research questions, data sources, and analysis.

Limitations and Ethical Considerations

Studying one's own institution or oneself may raise issues of power and risk to the researcher, the participants, and the site. However, my knowledge of my institution, my experiences in teaching college composition, and my familiarity with the students and the challenges to teaching composition situate my personal, professional practice as an appropriate topic for my research. In order to prevent ethical concerns related to issues of power, I made it clear to students that grades were not dependent on students granting permission to be subjects in the study. Everyone was required to participate in peer review as part of the course, not the study.
A significant limitation arose from the number of students who persisted in the course and agreed to participate. Ideally, at least 80 students across sections would have consented and persisted in completing the course. However, retention in my past online freshman composition courses has ranged from 60% of the students enrolled finishing and passing to as low as 45% of the students enrolled finishing and passing. Retention was similarly poor for the courses in this study. This led to the sample size being much lower than 80 students, which impacts how representative the sample is of the population. The retention and success rate in the sections using peer review worksheets was notably lower than in sections using CMPR; however, the workload in all sections was equivalent, with the only difference apparent in instruction for peer review and peer review activities, with students using CMPR having the added challenge of using a new technology outside of the course learning management system. Though the causes for attrition cannot be fully known, when students withdraw, they are required to provide a reason that is automatically sent to the instructor and dean. The withdrawal alerts for students in both groups were similar. Regardless of the group they were in, students who withdrew indicated that challenges to persisting in the course were external, involving an inability to manage school and work or school and family or a preference for face-to-face instruction. One of the students in a CMPR section who had initially participated wrote, "My work hours are conflicting with my schooling, I can't keep up with when things are due." Similarly, a student in the group using peer review worksheets wrote as a reason for withdrawing, "Do not have the time to take classes right now. Have a lot more on my
plate than I expected." While more students withdrew from sections using peer review worksheets, their explanations for withdrawing were essentially the same as the reasons given by students who withdrew from CMPR sections. Due to the similarities in reasons given for withdrawal by students in both groups, it is evident that the reasons students did not persist were not a result of differences in peer review activities. Also, there were students who remained in the class who did not participate in peer review often enough to be included in the study. Though peer review was a required assignment associated with points, students may have had life issues that prevented them from engaging in all of the peer review exercises, or they may not have been motivated to complete the peer review discussions. Also, as illustrated by Zimmerman and Kitsantas (2002), learners sometimes overestimate their ability to perform a task, which can elevate their self-efficacy. After observing a model, Zimmerman and Kitsantas (2002) found that learners' self-efficacy decreased as they realized they had underestimated the difficulty of a task. If students overestimated their ability to perform revision tasks, it could impact the differences in pre- and posttest self-efficacy scores in this study.

Interrater reliability might be a limitation as well if raters are biased or disagree on definitions or performance criteria. However, twice a semester, faculty meet to score common writing assessments for developmental English courses, and yearly, faculty meet to assess a selection of 1101 essays with the departmental rubric (Appendix A) for institutional effectiveness. In both of these instances, definitions are discussed and norming occurs with sample essays prior to scoring. The raters involved in both of these
processes typically agree on performance outcomes and interpretations of the scoring rubrics used.

Technical issues and differences in student access to personal computers that are up to date and meet software requirements may pose a barrier to some. Hung (2016) identified technical challenges in a study on using multimodal technology for feedback, such as poor video quality, poor sound quality, slow internet connection, and a lack of the proper equipment. Similar issues may have posed a challenge for some students in this study. There is also the possibility that confounding variables, like feedback from other sources, could have impacted the outcomes of this study. Multiple students specifically asked me for additional help outside of the feedback that everyone received on final drafts. Some received support via email, some spoke to me over the phone, and some met with me through video conferencing. Also, local students, regardless of whether they are taking courses online, have access to writing tutors at each campus location, and all students have access to Smarthinking, an online tutoring service that allows students to submit essays for review or to schedule synchronous online tutoring sessions. I made students aware of all of these resources because it would have been unethical not to disclose the services available to support them. Students I actually spoke to over the phone or through video conferencing indicated that they used tutoring at least twice.

Table 3-1. Timeline for implementation
Week 2: Students complete diagnostic essay quiz and Writing Self-Regulatory Efficacy Scale survey
Week 3: Students view instructor video on providing feedback in peer review and revising
Table 3-1. Continued
Week 4: Students engage in peer review for essay 1 rough draft
Week 5: Students revise and submit essay 1 final draft with self-reflection and answer Judgment of Learning and Prediction of Performance questions
Week 6: Students encouraged to review instructor video on providing feedback in peer review and revising
Week 7: Students engage in peer review for essay 2 rough draft
Week 8: Students revise and submit essay 2 final draft with self-reflection and answer Judgment of Learning and Prediction of Performance questions
Week 9: Students encouraged to review instructor video on providing feedback in peer review and revising
Week 10: Students engage in peer review for essay 3 rough draft
Week 11: Students revise and submit essay 3 final draft with self-reflection and answer Judgment of Learning and Prediction of Performance questions
Week 12: Students encouraged to review instructor video on providing feedback in peer review and revising
Week 13: Students engage in peer review for essay 4 rough draft
Week 14: Students revise and submit essay 4 final draft with self-reflection and answer Judgment of Learning, usefulness of VoiceThread, VoiceThread modality preference, and Prediction of Performance questions
Week 15: Students retake Writing Self-Regulatory Efficacy Scale survey

Figure 3-1. Example of two surface and two macro-level revisions in student writing

Table 3-2. Interrater reliability with average measure Intraclass Correlation Coefficient for revision changes
Change Type / ICC / Confidence Interval / Sig.
Surface: .982 / .857–.996 / .000
Micro-level: Complex repair: .985 / .940–.996 / .000
Micro-level: Extended content: .988 / .953–.997 / .000
Macro-level: New points: .997 / .990–.999 / .000
Macro-level: Organization: .959 / .833–.990 / .000
No change: .988 / .953–.997 / .000

Table 3-3. Revision analysis categories based on the Cho and MacArthur revision taxonomy
Surface change: Changes such as correcting spelling, tense, punctuation, abbreviation, etc.
Micro-level change: Complex repair (at the sentence or paragraph level, fixing points by changing or deleting content); Extended content (elaborating on a point by adding content)
Macro-level change: New points (adding entirely new points or paragraphs, not just elaborating on an existing point); Organization (changing or deleting transitional elements)
No change: No changes made to a sentence

Table 3-4. Alignment of research questions, data sources, and data analysis
Research question: To what extent does collaborative multimedia peer review contribute to students' writing self-efficacy in an online freshman composition course?
Data source: Writing Self-Regulatory Efficacy Scale. Data analysis: Between-subjects ANCOVA with pretest as the covariate to determine if there is a significant difference in posttest scores.
Data source: Student-developed VoiceThreads and comments. Data analysis: Coding for themes related to self-efficacy in the feedback provided using Cho, Schunn, and Charney's (2006) six-category coding scheme, and using thematic analysis to find themes in follow-up comments.

Table 3-4. Continued
Research question: To what extent does collaborative multimedia peer review contribute to students' revision process in an online freshman composition course?
Data source: Revision codes based on Cho and MacArthur's (2010) revised version of the Faigley-Witte (1981) taxonomy. Data analysis: Each change made from rough draft to final draft identified as either Surface change, Micro-level change, Macro-level change, or No change.
Research question: What relationship, if any, exists between collaborative multimedia peer review, students' writing self-efficacy, and students' revision process in an online freshman composition course?
Data source: Writing Self-Regulatory Efficacy Scale, Revision Taxonomy results. Data analysis: Paired samples t test for pre/posttest results on the Writing Self-Regulatory Efficacy Scale in the treatment group, followed by Spearman rho correlation analysis for self-efficacy and revision.
Data source: Interpretation of correlation analysis and qualitative data. Data analysis: Correlation results and qualitative analysis results aligned and weighed equally.
Research question: What are students' perceptions of receiving and providing feedback using voice and video comments in VoiceThread, and how do these perceptions relate to their self-efficacy and revision strategies?
Data source: Judgment of Learning survey, usefulness of VoiceThread, VoiceThread modality preference, Prediction of Performance rubric score, Writing Self-Regulatory Efficacy Scale, Revision Taxonomy results. Data analysis: Descriptive statistics, correlation analyses using Spearman rho, combined with thematic analysis of comments in response to the modality preference question.

CHAPTER 4
FINDINGS

This study investigated whether Collaborative Multimedia Peer Review (CMPR) contributes to writing self-efficacy, contributes to student revision choices, and what, if any, relationships exist between CMPR, writing self-efficacy, and revision in online college composition courses. A convergent mixed methods research design with parallel quantitative and qualitative data collection and evaluation was used. This chapter explores the findings as they relate to each of the four research questions.

1. To what extent does Collaborative Multimedia Peer Review contribute to students' writing self-efficacy in an online freshman composition course?
2. To what extent does Collaborative Multimedia Peer Review contribute to students' revision process in an online freshman composition course?
3. What relationship, if any, exists between Collaborative Multimedia Peer Review, students' writing self-efficacy, and students' revision process in an online freshman composition course?
4. What are students' perceptions of receiving and providing feedback using voice and video comments in VoiceThread, and how do these perceptions relate to their self-efficacy and revision strategies?

Self-Efficacy

Quantitative Self-Efficacy Data

Writing self-efficacy was measured quantitatively using the 25-question Writing Self-Regulatory Efficacy Scale (Zimmerman & Bandura, 1994). The reliability of the assessment is α = .91 as reported by Zimmerman and Bandura (1994). The scale assesses students' perceptions about their ability to engage in steps of the writing process (planning, organizing, and revising), their creative capability for writing, and their ability to self-regulate through the process (Zimmerman & Bandura, 1994). The Writing Self-
Regulatory Efficacy Scale was given to students before they began the first peer review assignment and after the fourth and final peer review exercise to determine whether any change in self-efficacy may have occurred. (Twenty-two students met the study criteria in sections using VoiceThread and nineteen in sections using worksheets.) Survey scores from the beginning of the course (pretest) and survey scores taken at the end of the course (posttest) were analyzed with a between-subjects ANCOVA using pretest scores as the covariate (N = 41). Peer review type (CMPR and worksheet) was the between-subjects factor. The covariate produced a statistically significant influence in this model, F(2, 38) = 8.75, p = .005, ηp² = .19. The covariate accounted for 20% of the difference between groups. Observed and adjusted marginal means from the ANCOVA analysis are provided in Table 4-1. After the variance associated with the pretest was accounted for, the difference between groups' posttest scores did not reach significance, F(2, 38) = 1.87, p = .18, ηp² = .05. Observed and adjusted means are provided in Table 4-1. A mixed ANOVA was run with peer review type (CMPR and worksheet) as the between-subjects factor and time as the within-subjects factor; pretest and posttest scores were significantly different, F(1, 39) = 7.50, p = .009, ηp² = .16. Table 4-2 provides descriptive statistics.

In order to more closely examine how groups scored on specific questions, paired samples t tests were run for each measure of the scale for each group to determine whether there might be significant differences in writing confidence for specific questions within each group. For the sections using VoiceThread, there was a significant increase in writing self-efficacy scores for questions 4, 5, 17, and 20.
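The covariate-adjusted comparison reported above can be illustrated with a brief sketch. The example below assumes the pingouin library in Python; the data frame and its condition, pretest, and posttest columns are hypothetical stand-ins for the survey file, not the study data.

    import pandas as pd
    import pingouin as pg

    # Hypothetical scores: one row per student with condition, pretest mean,
    # and posttest mean on the Writing Self-Regulatory Efficacy Scale.
    df = pd.DataFrame({
        "condition": ["CMPR"] * 4 + ["Worksheet"] * 4,
        "pretest":   [4.1, 4.4, 4.0, 4.6, 4.9, 4.7, 5.0, 4.8],
        "posttest":  [4.6, 4.8, 4.5, 5.0, 5.2, 5.1, 5.5, 5.3],
    })

    # Between-subjects ANCOVA on posttest scores with pretest as the covariate.
    ancova = pg.ancova(data=df, dv="posttest", covar="pretest", between="condition")
    print(ancova[["Source", "F", "p-unc", "np2"]])

    # A group-by-time mixed ANOVA, as also reported above, would instead use
    # pg.mixed_anova on long-format data with a time column and a subject id.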

Table 4-3 displays the significant results. For the sections using peer review worksheets, there was a significant increase in writing self-efficacy scores for questions 4, 5, 7, 16, 17, 18, and 22, as shown in Table 4-4. Both groups made significant increases in questions 4, 5, and 17, which concern confidence with planning and beginning a paper (Zimmerman & Bandura, 1994). Sections that used the peer review worksheets also demonstrated statistically significant increases in questions 7, 16, 18, and 22. Like questions 4 and 5, questions 7 and 18 involve writing with a reader in mind; question 16 deals specifically with the perception of how well a person can self-regulate when concentration wanders, and question 22 addresses confidence in revising the organization of a paper.

Qualitative Self-Efficacy Data

Qualitative data were also collected from peer review conversations in VoiceThread. A codebook is provided with definitions (Appendix B). The data were analyzed both deductively, searching for specific codes based on previous research on feedback, and inductively using thematic analysis.

Feedback types in CMPR

Feedback on drafts for the fourth essay assignment was coded using Cho, Schunn, and Charney's (2006) six-category coding scheme, which includes directive feedback, nondirective feedback, praise comments, criticism comments, summary comments, and off-task comments. Six groups, each with three members, were analyzed. The most frequent feedback types overall were directive feedback (explicit suggestions for specific
changes) and praise, as shown in Table 4-5. According to Cho, Schunn, and Charney (2006), the directive and praise feedback types are the ones perceived to be most helpful by students. Through their perceived helpfulness, these types of feedback should have a positive influence on self-efficacy; however, while directive comments and praise were the most frequent feedback types, the length and focus of directive comments varied greatly. For example, in VoiceThread peer review Group 1, each student provided multiple directive comments. Some comments emphasized superficial issues (grammar or mechanics) and some emphasized content-related writing issues. The following excerpts illustrate some of the variety in the depth and breadth of directive comments (underlined) from one of the stronger groups. Comments on Student 1's essay:

Thank you for sharing your essay Student 1. I enjoyed reading it. I enjoyed your first page and agree with you on many points. The only thing on the first page that I noticed was some grammatical errors and a few punctuation errors that could be fixed to really make everything flow better, but that is all I could find on this page.

I noticed in the last two paragraphs you metioned [sic] the cancer thing twice and it sounds a bit repetitive. I would maybe look at reworking those two paragraphs to include what you want to say but not repeat the thought. I enjoyed reading your essay and think you did a terrific job.

I noticed a spelling error on this page and the use of the word author a lot. I also ... second article you mentioned. I agree with a lot of what you wrote and think you wrote it very well.

The reviewer's directive comments are focused mostly on correcting repetitive wording and grammar. The final comment recommends that the writer make sure to use statements
from the second article. Another reviewer's feedback makes a directive comment in regards to content while most statements provide praise:

... sharing. I found your essay well written, easy to read and it flowed nicely. I see that you used a lot of logos, but as a reader, I would like to see a little more of your view of how he uses pathos to relate to the reader. I like how much you included from the second article and found it extremely interesting as well. You really made some good points and have me going back and reviewing the article again.

The latter feedback would be preferable as it focuses on content rather than surface correctness. Another comment likewise focuses on content and purpose rather than grammar or sentence structure, and it is also directive:

... in your essay. You give great examples of ethos. I would try to find at least one more example of ethos, and as far as your pathos you have a great understanding on what pathos are but I feel that you could of used pathos in more detail. I picked the same article, and as I read the article I looked for words that triggered my emotions such as cancer, obesity, violate; all these words have some type of emotional impact. I ... second page but when I go to page number two it is blank. I was looking for your thoughts on how the author uses logos. Don't forget about the second source. I picked a source that disagreed with Leonhardt's opinions about stopping sugar. Good Luck.

One reviewer also made a directive comment about a classmate's use of first-person point of view when the assignment instructed students to write in third person:
... supposed to be written in third person, and I see where you have ... about that. I believe our essay was to be written in third person, so we ... rewording that.

Overall, Group 1 members provided useful feedback to each other in the form of directive and praise comments, though most comments focused on surface issues. This was not the case with every group, though the group interactions took place at the end of the term, when I expected most students to have reached a higher level of competence at reviewing than what they demonstrated at the beginning of the term. In Group 3, for example, each group member differed in the level of peer review demonstrated. Student 3's feedback to one group member read:

... details and information you have provided! In your last sentence I would ... read that sentence a few times, the wording was a little weird.. [sic] Your essay was very strong!

Here, Group 3 Student 3 has offered a nondirective comment about run-on sentences, not clearly identifying an example or examples, and a directive comment about changing a word choice. Student 3 only comments on surface issues in her review of that classmate's draft. In her review of the other group member's draft, Student 3 provides directive comments related to the use of point of view and mentions a key aspect of how point of view relates to the purpose of the assignment:

I think you wrote a good essay, but noticed that you took a standpoint on the article. I think we were supposed to analyze how well the author explained their argument rather than taking a side. Refraining from using ... and leave your personal opinion on the topic out, your essay would better fit the guidelines. Good work though!

Unlike Student 3, Student 1 in Group 3 offered no constructive comments on the other group members' essays:

I really wanted to find something to help you with, but frankly I thought you did a good job with your paper. I thought it was easy to read and it flowed between subjects well.

Student 1's comment on the other member's draft similarly lacked constructive feedback:

Great job with your paper. I don't see any glaring issues that need fixing, and your grammar seems fine to me. I didn't cite the author of the essay I chose for my own topic and I like the way you did that at the end.

Group 3 Student 1 focuses on only offering praise to the two other group members, while the third member did not contribute any feedback to the other members of the group, though in a conversation with the instructor about her essay, she indicated that she did review the comments that were left for her. The interaction in Group 3 left Student 3 contributing the most constructive feedback, yet one of the other group members only offered praise and another made no comments at all.

In Group 6, the differences in length and substance of feedback comments were more prominent. Student 3 in this group provided detailed audio feedback on each of the other members' drafts, leaving nine minutes and twelve seconds of feedback that included comments on the purpose of the essay and on how to connect specific persuasive appeals to what Student 1 discusses in his essay:

[Highlights essay with pen tool] ... you remember what the purpose of the essay is, which is to analyze rhetorical appeals instead of just kind of restating what the article ...
And so on the second paragraph, [Highlights second paragraph with pen tool] facts and things he did, so my suggestion would just kind of be maybe talk about how this, [Highlights second paragraph with pen tool] in this paragraph, maybe how he uses logos, which is the use of like facts and information, or I think ethos is that, whichever one is meant, whichever one explains the use of facts and information is the appeal I think this essay, or this paragraph [Highlights second paragraph with pen tool] could be used for that.

For Student 2's draft, Student 3 left a recording of substantive, directive, and praise audio feedback. Here, Student 3 focused on how Student 2 could more clearly explain her points and give examples to support her claims:

... sentence [highlight first sentence with pen tool], you address the author in this one, which I think is good, but like address the author, you should start in the very beginning [highlight first paragraph with pen tool] and then when you address him throughout the ... assumptions about who ... who the author is, you know. And then in the second sentence, ... warming. What is the research going on in sentence three right here, but what is the research he used to convince his credibility and why do think he used that possibly? Like I said. These are just my suggestions. Feel free ...

By contrast, Group 6 Student 1 offered the other group members only brief praise and nondirective comments about grammar:

... very informative, all paragraphs are well stated. I did see some small grammer [sic] issues but that was about it.

... opening paragraph is very detailed and to the point. just a few minor grammer [sic] errors, but for some reason im [sic] not able to highlight or show them, othere [sic] then [sic] that good job and good closing paragraph.

Group 6 Student 2 also provided feedback that lacked substance, though it was more directive:

in your first paragraph i [sic] would re read [sic] it i [sic] think that you meant no instead of on in your foruth [sic] paragraph i [sic] would maybe include a quote from the actual article. i [sic] think your esaay [sic] is good and the flow is ok i would just check your spellingn [sic] and grammar.

i [sic] like your beginning paragraph it is very clear and it explains what we are going to be reading about i [sic] also think the flow of your essay is good in your last paragraph the first scentence [sic] is a little confusing i [sic] think your conclusion is very strong i [sic] like how you stated your opinion and why clearly.

Compared with Student 3, the other two members of Group 6 did not seem to have invested an equal amount of time or effort. The frequency of feedback types is provided in Table 4-5. While directive feedback varied in content, it is interesting to note that, of the 122 directive comments, twenty-four referenced specific assignment requirements. In some cases, students provided feedback that offered suggestions about meeting assignment content requirements. When Group 1 Student 3 wrote, "I believe our essay was to
be written in third person," she was not only offering directive feedback on content but also referencing a specific requirement of the essay assignment that Group 1 Student 2 had failed to meet. In a different group, Group 3, a student suggested, "I think if you changed that part of your essay and leave your personal opinion on the topic out, your essay would better fit the guidelines." Some of the directive comments that alluded to the assignment requirements noted that classmates may have left out key content information, such as the reminder that writers were supposed to analyze how well the author explained their argument rather than take a side, and so had not met an essential content requirement for the essay assignment.

Feedback types using the peer review worksheet

Due to the absence of a significant difference in self-efficacy improvement on the Writing Self-Regulatory Efficacy Scale between the sections using CMPR and the sections using peer review worksheets, a selection of peer review worksheets was examined to determine whether there may have been similarities in feedback despite the peer review conditions being different. While the directions on the worksheet primarily instructed students to answer questions that essentially required them to describe and evaluate their classmates' drafts, students were also instructed to pay attention to the worksheet directions, which called for a helpful and constructive tone when making suggestions or pointing out areas that needed improvement. Unlike the students using CMPR, students using peer
review worksheets did not have a lesson on the specific feedback types (direct, indirect, praise). Despite the lack of instruction on feedback types, multiple students utilized some directive and praise comments in their completion of the peer review worksheets. Figure 4-1 provides an example of how one student utilized directive comments (underlined) on one page of a worksheet response. Also, students using peer review worksheets often provided praise comments despite not being instructed in that type of feedback (Figure 4-2). Though praise was not typically included in the actual worksheet, it could be seen in discussion posts where completed worksheets were attached. Like the feedback from the sections using CMPR, the feedback from students in sections required to use peer review worksheets varied. Some students completed the worksheet offering only basic answers to the worksheet questions with no directive comments, some provided one or two directive comments that focused on surface issues such as formatting or point of view, whereas others offered more comprehensive comments on content and organization.

Self-efficacy in follow-up comments for CMPR

A vital feature of CMPR with VoiceThread involved students contributing, in addition to feedback on drafts, an additional six posts as dialogue with two other classmates about the feedback they received and the feedback they provided to other group members within VoiceThread. The rationale for this stage was to prompt the dialogue between writer and audience supported by theory as being important to the development of writing skills. However, most students did not offer substantial replies to the feedback they received, so interactive dialogue was not present in peer review groups.

Of the eighteen students in the six groups being examined, twelve posted responses to the feedback they received. Of those twelve, most only acknowledged the suggestions and stated an intention to use them, which did not prompt further comments. One student, however, offered substantial replies to her group members' feedback, and another student responded with questions that went unanswered by the other members of his group. The majority of replies expressed gratitude to other group members for their review and a succinct acceptance of the feedback. Expressing gratitude is a socio-emotional behavior that can lead to a sense of community in group work (Kwon, Liu, & Johnson, 2014), so such comments may have also impacted writing self-efficacy. Group 1 Student 1's reply to Student 2's feedback, reproduced below, exemplifies how replies thanked reviewers for the suggestions offered and indicated they would be used:

Thank you for sharing your essay Student 1. I enjoyed reading it. I enjoyed your first page and agree with you on many points. The only thing on the first page that I noticed was some grammatical errors and a few punctuation errors that could be fixed to really make everything flow better, but that is all I could find on this page. I noticed in the last two paragraphs you metioned [sic] the cancer thing twice and it sounds a bit repetitive. I would maybe look at reworking those two paragraphs to include what you want to say but not repeat the thought. I enjoyed reading your essay and think you did a terrific job. I noticed a spelling error on this page and the use of the word author a lot. I also ... I agree with a lot of what you wrote and think you wrote it very well.

Thank you Student 2 for your feed back [sic] I will be sure to read it over and fix the grammatical errors and punctuations before the final draft thank youn [sic]

Here, Group 1 Student 1 replied with thanks and an abrupt acknowledgement of feedback related to grammar and punctuation. Student 1 passed on the opportunity to request more information about how errors impacted the flow of the essay. The student could have requested that the reviewer indicate a particular example of where this happened in the essay or asked for some feedback geared more towards the content, and the reply did not prompt further comment from Student 2. Most replies similarly did little to further a dialogue between group members, as exemplified in exchanges in Group 3 and in Group 6 below:

... details and information you have provided! ... oppositionist maybe? I just had to re read [sic] that sentence a few times, the wording was a little weird.. [sic] Your essay was very strong!

Student 2, Thank you for your feedback. I will change my wording where you mentioned it and re read [sic] my paper to correct any run on sentences or grammatical issues.

... it I think that you meant no instead of on in your foruth [sic] paragraph I would maybe include a quote from the actual article. I think your esaay [sic] is good and the flow is ok I would just check your spelling and grammar.

... Student 2 ... I need to make changes on.

In Groups 1 and 4, two students offered replies to feedback that should have generated further discussion. In Group 1, Student 3 politely disagreed with the
suggestions from both group members about her essay not addressing certain persuasive appeals. To both group members, Student 3 explained where she thought she had discussed the persuasive appeals being questioned, and she solicited both group members to follow up with her on whether her reply clarified that she had sufficiently addressed the persuasive appeals. The bold statements show where Student 3 sought additional discussion:

Group 1 Student 3 reply to Student 1: Hi Student 1. Just wanted to thank you for reading the essay on A month without sugar and thank you for your observations. ... you did mention that you were looking to see where I had written about pathos in the essay. And I thought I captured that information in the 3rd or ... a personal connection to the readers by noting many of them may have ... understanding where they are as it relates to coming out of the holidays ... days without sugar, and he gives them Whole 30 information to use as an actual regiment or recommended popular food regiment. I thought I had captured that information. Ok. Thank you so much for taking the time to read the essay.

Group 1 Student 3 reply to Student 2: Hi Amanda. Thank you for reading my essay on Dr. Leonhardt. I thank you for your comments ... pathos section in the last paragraph before you get to the conclusion of my essay. And it is in the section where I mention that he writes the, or published the article in December right around or right after the holidays ... essay towards the end. ... scroll to the last page on the cited page to see your response ... last night when I was taking a look and I just thought I would take a look today hoping to see your response, and there you were on that last page. That was an oversight on my end. However, I do appreciate you taking the time to respond.

Despite Student 3's explanation and request for additional comments from her group members, no further discussion followed her replies. In Group 4 also, a group member attempted to further the discussion with his group members.

Group 4 Student 3 reply to Student 1: Student 1, I appreciate your feedback. I know that it may be difficult to see where I was going with my thesis. I may rework that to see if I can make things more clear. Did you see any other errors or omissions? I just want to make sure that it made sense and that I was touching on all of the persuasive arguments. Student 1, If you do see anything else, please let me know. Do you think that I did my in text citation correctly?

Group 4 Student 3 reply to Student 2: Student 2, I appreciate your feedback. I will look over your suggestions and make some changes. I did agree with the author's arguments. If you happen to see any other errors or anything that you think I should add, please let me know.

While Group 4 Student 3 does not question the feedback he received from his group members, he does request further review of his writing, which was not forthcoming from the other members of his group.

Self-efficacy in follow-up comments for peer review worksheets

For students using peer review worksheets on group discussion boards in Blackboard, though replies to feedback were not required or explicitly encouraged, some students expressed gratitude and/or succinctly acknowledged the feedback that was provided. For example, one student wrote, "Thank You [name deleted] I was rushing to submit this night so it's a little over the place but thank you for your notes I will resumit [sic] it with the things I'm missing." While the similarity between groups in the types of feedback provided, use of replies, and expressions of gratitude may have contributed to the absence of a
significant difference between groups in writing self-efficacy improvement, quantitative analysis showed that CMPR pretest scores on the Writing Self-Regulatory Efficacy Scale differed from the pretest scores of those using the peer review worksheet. Qualitative results indicate that students do not need to be taught specific feedback types in order to provide feedback that has been supported as promoting self-efficacy, but the peer review instruction may still have supported gains in self-efficacy in the CMPR sections.

Self-reflection in feedback comments for CMPR

While not frequent, an additional theme of interest that emerged from thematic analysis of feedback in the CMPR groups was self-reflection as part of feedback to classmates. Self-efficacy and reflection have been found to contribute to critical thinking and learning, with self-reflection being important for metacognitive monitoring to occur (Isaacson & Fujita, 2006; Phan, 2014; Steiner, 2016; Zimmerman, 1998). For example, one student's comment reflected on the required reading on which the essay was based: "You really made some good points and have me going back and reviewing the article again." In Group 3 also, Student 1 provides only praise, but reflection is occurring when Student 1 notes, "I didn't cite the author of the essay I chose for my own topic and I like the way you did that at the end."

Revision Skill

To examine students' revision process, rough and final drafts were collected and compared to identify the types of changes that occurred. Cho and MacArthur (2010) pared down Faigley and
Witte's (1981) taxonomy of 24 types of revision change to four types of revision change: no change, surface change, micro-level change, or macro-level change. Overall, the average proportion of surface changes made out of all of the possible change types (excluding no change) for the students using CMPR was 47.7%, while the average proportion of surface changes for the students using peer review worksheets was 48.7%. The averages for surface changes and meaning changes, which include all micro- and macro-level changes, can be found in Table 4-6. According to Faigley and Witte's (1981) analysis of the averages for change types between inexperienced, advanced, and expert reviewers, inexperienced writers made overwhelmingly more surface changes than meaning changes, with meaning changes making up only 12% of all changes between drafts. In examining the averages for each peer review condition, both exceed the percentage of meaning-level changes Faigley and Witte (1981) associated with inexperienced writers; however, a group average can be misleading, particularly with smaller sample sizes such as the one I am working with. In order to better represent the revision skill of each student, Faigley and Witte's (1981) results for the percentage of surface and meaning-level changes were used to identify each student as being either inexperienced (1), advanced (2), or expert (3). According to Faigley and Witte's (1981) analysis of revision, individuals who had been identified as inexperienced writers prior to their research ended up having meaning-level changes that made up 12% of all changes between drafts, while advanced students and experts (also labeled at the beginning of their research) had meaning-level changes making up 24% and 34% of changes, respectively. Students in the
present study whose meaning-level changes were less than 24% were identified as inexperienced, those whose meaning-level changes were between 24% and 33% were identified as advanced, and those who had 34% or more meaning-level changes were labeled expert. For this study, categorizing a student according to these labels does not signify that they are actually expert writers, advanced writers, or inexperienced writers overall; the labels instead categorize student revision practices based on previous findings for revision practices. In cases where students made almost no changes, fewer than ten, from rough draft to final draft, their final essay score determined how they were categorized. While the Mann-Whitney U Test indicates that the group using CMPR has students more skilled in revision than the peer review worksheet group, there was no statistically significant difference between the groups (U = 200, df = 39, p = .765). Based on the revision data, the two groups display equivalent skill at revising, in many cases making meaningful global changes to their writing regardless of the peer review condition.

Collaborative Multimedia Peer Review, Self-Efficacy, and Revision

In order to determine whether a relationship exists between CMPR, students' writing self-efficacy, and revision, data from students in the CMPR group were examined. Scores on the writing self-efficacy scale from the beginning of the course and from the end of the course were analyzed with a paired samples t test. Self-efficacy posttest scores were included in correlation analysis with student rankings based on the revision taxonomy results. Correlation analysis findings were combined with qualitative analysis of the peer review groups and evaluated.
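To make the categorization rule and the nonparametric comparison above concrete, the following sketch shows one way they could be computed in Python with scipy; it is a minimal illustration, and the percentages and group lists are hypothetical placeholders rather than the study data.

    from scipy.stats import mannwhitneyu

    def revision_rank(meaning_pct: float) -> int:
        # Thresholds follow the percentages applied above: under 24% meaning-level
        # changes = inexperienced (1), 24-33% = advanced (2), 34% or more = expert (3).
        if meaning_pct < 24:
            return 1
        elif meaning_pct < 34:
            return 2
        return 3

    # Hypothetical meaning-level change percentages for students in each condition.
    cmpr_pcts = [18.0, 30.5, 41.2, 26.0, 55.3]
    worksheet_pcts = [22.4, 28.9, 36.7, 19.5, 48.0]

    cmpr_ranks = [revision_rank(p) for p in cmpr_pcts]
    worksheet_ranks = [revision_rank(p) for p in worksheet_pcts]

    # Mann-Whitney U test on the ordinal ranks (two-sided).
    u_stat, p_value = mannwhitneyu(cmpr_ranks, worksheet_ranks, alternative="two-sided")
    print(u_stat, p_value)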

The paired samples t test showed a difference approaching significance between writing self-efficacy scale scores from the beginning of the course (M = .437, SD = .780) and the end of the course (M = .479, SD = 1.06); t(22) = 2.06, p = .051. Nonparametric correlation analysis was used to determine whether an association exists between differences in writing self-efficacy scale scores from the beginning and end of the course and revision skill rankings. Neither of the correlation coefficients indicated a correlation, and none were significant: Kendall tau-b, τ = .054, p = .768, and Spearman rho, ρ = .064, p = .776.

Student Perceptions, Self-Efficacy, and Revision

To examine students' perceptions of receiving and providing feedback using voice and video comments in VoiceThread and how these perceptions relate to self-efficacy and revision strategies, descriptive statistics for usefulness of VoiceThread, prediction of performance, and judgement of learning are examined; correlations between prediction of performance, writing self-efficacy, and revision skills are analyzed; and a thematic analysis of comments in response to the modality preference question is combined with the former data sources.

Perceptions

Student perceptions are examined in peer review studies because perceptions can influence student self-efficacy, interest, goal orientation, and, ultimately, student engagement with the task (Brammer & Rees, 2007; Cho, Schunn, & Charney, 2006; Kasanga, 2004). To assess how well participants in the CMPR group believed they performed on the revisions for the final essay assignment, the students completed a five-point Likert Prediction of Performance item (e.g., Thomas, Antonenko, & Davis,
2016). Students were provided with the departmental rubric with the 5 performance levels and point distribution that was used to assess their diagnostic essay and were asked to predict their performance on the final essay. For each category of the rubric (Content, Organization, Conventions, and Language/Audience), students indicated the level at which they expected their final essay to be rated. The average difference between students' predictions of performance using the rubric and actual performance was .68. Though three students overestimated their final essay score by one level or more, overall, students were fairly accurate in their predictions for how well they would perform on the final essay assignment, as shown in Table 4-7.

To gauge perceptions of learning from peer review using VoiceThread, a seven-point Likert Judgment of Learning item was used. To ascertain participants' perception of learning as a result of the peer review, the students were asked how well they understood what revisions needed to be made based on their peers' reviews of their writing and how well they understood what revisions needed to be made to their own writing as a result of giving reviews to others. A seven-point Likert scale was also used to determine the degree to which students felt VoiceThread was useful for facilitating peer review. The means and standard deviations are reported in Table 4-8. Students indicated that receiving feedback from peers and giving feedback to peers resulted in a good understanding of how to approach revision on their own drafts. Of the 22 students surveyed, only one
suggested giving feedback may not have helped her understanding of how to revise her own draft by selecting 3 on the scale of 1 to 7. All students indicated that they found VoiceThread useful for facilitating peer review; however, these students did not experience any other type of peer review in this class, nor did most fully participate in the program by engaging in interactive discussions with group members as part of peer review.

Correlation Analysis

A series of rank-order correlation analyses were run to ascertain whether a relationship existed between changes in writing self-efficacy from the beginning of the course to the end, prediction of performance, and revision skill. Each analysis showed weak correlations of no statistical significance: for writing self-efficacy and prediction of performance, ρ = .387, p = .075; for prediction of performance and revision skill, Spearman rho, ρ = .106, p = .640; and for writing self-efficacy and revision skill, ρ = .064, p = .776.

Modality Preferences

Students were also asked which modality (video, audio, or text) was preferred for giving feedback in VoiceThread. Two students failed to respond to the question. Ten students preferred text comments to avoid technical challenges and because it was the type of feedback they preferred to receive, as it was easier to review. One student explained, "Very often the audio option did not work and it is nice to go back and read ..." While the remaining ten students indicated a preference for giving audio feedback, only five of
those ten actually provided audio feedback during the final peer review assignment. The other five individuals provided text comments in VoiceThread. One student who preferred audio pointed to its socioemotional value, describing a "better connection and explanation so that your peer don't [sic] take your response thee [sic] ..." Another explained that the recordings "allowed me to speak my mind and to let the writer hear ..."

Summary

Current data analysis indicates that there was not a significant difference between the groups on the Writing Self-Regulatory Efficacy Scale at the end of the course, although the CMPR group's beginning scores were notably lower than those of students using the peer review worksheet. As relates to revision skills, the two groups were similar despite differing in peer review instruction and in peer review type. Nor did data analysis reveal a correlation between CMPR, writing self-efficacy, and revision skills for students using CMPR. Yet survey questions on student perceptions of peer review, use of VoiceThread, prediction of performance, and preferred modality for giving feedback showed students believe that giving and receiving feedback impacted their revision choices, in addition to VoiceThread being a useful tool for peer review. They also showed students can predict their performance with some accuracy and demonstrated how students vary in the mode of feedback that they prefer.

In an attempt to determine what impact CMPR may have had on writing self-efficacy when compared to students using peer review worksheets with no required follow-up interactions, quantitative and qualitative data were collected and analyzed. Quantitative data came in the form of the results of a 25-question Writing Self-Regulatory Efficacy Scale from the beginning and the end of the term. Results were analyzed with a between-subjects ANCOVA, and analysis showed the pretest scores contributed strongly to the outcome. Without the pretest as a covariate, no significant difference in posttest scores was found. Further analysis of individual questions on the Writing Self-Regulatory Efficacy Scale through paired t tests revealed significant differences for specific questions within each group from the beginning of the term to the end, yet these results illustrated an equivalence in self-efficacy improvement between the two groups despite differing peer review conditions. Both groups made significant increases on Writing Self-Regulatory Efficacy Scale questions 4, 5, and 17, which relate to the planning portion of the writing process. A minor distinction between the groups became apparent in how the sections using peer review worksheets showed a significant increase in confidence related to self-regulation and perceptions of their ability to revise the organization of an essay. Initial coding of qualitative data showed that feedback in the CMPR condition met the criteria for improving student self-efficacy by primarily consisting of directive and praise comments. Further examination showed that comments varied in substance and length, with most feedback focusing on surface writing issues. In order to draw conclusions about why there may not have been significant differences in writing self-efficacy between the groups, the researcher also
examined peer review worksheets and found that, despite not receiving instruction on types of feedback, some students in the peer review worksheet group also used directive comments and praise. Additionally, it was found that, for the most part, the CMPR group was not collaborative in its peer review discussions in VoiceThread. An essential element of CMPR was the collaborative component, which was required through dialogue between group members, but while some students posted replies to feedback, extended dialogue was not present in peer review groups.

Like writing self-efficacy between groups, revision skills were also found to be mostly equivalent. Using Cho and MacArthur's (2010) pared-down version of Faigley and Witte's (1981) taxonomy of revision changes, student changes from rough draft to final draft were identified, and frequencies were reported. While the most frequent type of change was surface change, students also made many meaning-level changes, which are associated with more skilled writing. Students were categorized based on their percentages of meaning-level changes, following Faigley and Witte's (1981) findings from their analysis. The Mann-Whitney U test was run to compare the revision skills of those in the CMPR group with the revision skills of those using peer review worksheets, and no statistically significant difference was found. The frequency of specific types of revision change and the results of the Mann-Whitney U test indicated that the groups were alike in their revision practices.

Though it was expected that a relationship between CMPR, writing self-efficacy, and revision would be found, further analysis of CMPR student averages on the Writing Self-Regulatory Efficacy Scale was done through a paired t test, and results confirmed previous writing self-efficacy findings that there was no significant
112 improvement in writing self efficacy within that group. Furthermore, no correlations were found between the variables that would indi cate a relationship between CMPR Finally, to understand CMPR student perceptions and their relationship to writing self dgement of learning, prediction of performance, and preferred modality in VoiceThread were would perform on their final essay were relatively accurate, within one level of their actual final rubric score. For questions on their perceptions of giving and receiving feedback, students implied that receiving feedback from peers improved their understanding for how to revise their own writing (M= 5.45, SD= .91), and similarly, gi ving feedback to peers resulted in a good understanding for how to approach revision on their own drafts (M= 5.18, SD= 1.09). Also, students perceived that VoiceThread was very useful as a medium for peer review (M= 5.95, SD= 1.13). However, the series of order correlation analyses showed weak correlations of no statistical significance between writing self efficacy, prediction of performance, and revision skill, and interestingly, while students were evenly divided about their preference f or giving feedback through text comments or audio comments in VoiceThread only five of the ten individuals who responded with a preference for using the audio comment feature actually posted audio feedback in the final peer review activity.


Table 4-1. Observed and adjusted differences between groups on self-efficacy posttest
Group: Observed Mean, Standard Deviation, Estimated Mean, Standard Error
CMPR: 4.69, .98, 4.81, .19
Peer Review Worksheet: 5.33, .96, 5.20, .21

Table 4-2. Descriptive statistics for pretest and posttest scores
CMPR (N = 22): Pretest M = 4.35, SD = .80; Posttest M = 4.69, SD = .98
Peer Review Worksheet (N = 19): Pretest M = 4.84, SD = .90; Posttest M = 5.33, SD = .96

Table 4-3. Writing Self-Regulatory Efficacy Scale questions showing a significant increase from pretest to posttest in the CMPR sections (N = 22)
Question 4: I can come up with an unusual opening paragraph to capture readers' attention. (Mean difference = .773, SD = 1.31, t = 2.77, p = .011*)
Question 5: I can write a brief but informative overview that will prepare readers well for the main thesis of my paper. (Mean difference = .727, SD = 1.32, t = 2.59, p = .017*)
Question 17: When I write on a lengthy topic, I can create a variety of good outlines for the main sections of my paper. (Mean difference = .682, SD = 1.49, t = 2.14, p = .044*)
Question 20: I can find ways to motivate myself to write a paper even when the topic holds little interest for me. (Mean difference = .727, SD = 1.42, t = 2.40, p = .026*)
*p < .05

Table 4-4. Writing Self-Regulatory Efficacy Scale results for questions showing a significant increase from pretest to posttest in the peer review worksheet sections (N = 19)
Question 4: I can come up with an unusual opening paragraph to capture readers' attention. (Mean difference = .789, SD = 1.13, t = 3.03, p = .007*)
Question 5: I can write a brief but informative overview that will prepare readers well for the main thesis of my paper. (Mean difference = .632, SD = 1.26, t = 2.19, p = .042*)
Question 7: I can adjust my style of writing to suit the needs of any audience. (Mean difference = .789, SD = 1.03, t = 3.34, p = .004*)
Question 16: I can refocus my concentration on writing when I find myself thinking about other things. (Mean difference = .737, SD = 1.45, t = 2.22, p = .040*)
Question 17: When I write on a lengthy topic, I can create a variety of good outlines for the main sections of my paper. (Mean difference = .632, SD = 1.07, t = 2.59, p = .019*)
Question 18: When I want to persuade a skeptical reader about a point, I can come up with a convincing quote from an authority. (Mean difference = .947, SD = 1.31, t = 3.15, p = .006*)
Question 22: I can revise a first draft of any paper so that it is shorter and better organized. (Mean difference = .632, SD = 1.26, t = 2.19, p = .042*)
*p < .05

Table 4-5. Frequency of feedback types in six VoiceThread peer review groups
Directive: 122
Praise: 113
Nondirective: 34
Summary: 17
Off Task: 14
Criticism: 6


Figure 4-1. Example of student using directive feedback on peer review worksheet.


Figure 4-2. Example of praise comments for students using peer review worksheets.

Table 4-6. Average percentage of surface-level changes and meaning-level changes for peer review conditions
Collaborative Multimedia Peer Review: Surface Changes 47.7%, Meaning Changes 52.3%
Peer Review Worksheet: Surface Changes 48.7%, Meaning Changes 51.2%

Table 4-7. Diagnostic rubric score average, prediction of performance rubric score average, and actual final essay rubric score average in CMPR (one row per student; the last column is the difference between the prediction and actual averages)
Diagnostic   Prediction   Actual Final   Difference
4.50         4.00         4.00           .00
3.25         3.50         3.75           .25
4.00         3.00         4.00           1.00
3.25         4.50         4.50           .00
3.25         3.75         3.75           .00
3.25         3.00         3.25           .25
2.25         2.75         2.75           .00
4.00         4.00         4.75           .75
4.00         3.00         3.25           .25
2.00         4.75         4.00           .75
3.50         4.25         2.50           1.75
2.75         3.00         4.50           1.50
3.00         3.50         4.00           .50
3.25         4.25         4.25           .00
3.25         3.75         4.50           .75
3.50         4.00         3.25           .75
3.00         4.50         4.50           .00
2.75         4.75         4.50           .25
2.75         4.00         4.25           .25
2.75         3.75         3.50           .25
3.50         4.25         4.75           .50
2.75         4.75         4.00           .75

Table 4-8. Collaborative Multimedia Peer Review student perceptions of usefulness of giving/receiving feedback and VoiceThread
How well do you think you understood what revisions needed to be made based on your peers' review of your writing? (Minimum = 4, Maximum = 7, M = 5.45, SD = .91)
How well do you think you understood what revisions needed to be made to your own writing as a result of giving reviews to others? (Minimum = 3, Maximum = 7, M = 5.18, SD = 1.09)
How useful was VoiceThread for engaging in peer review with your classmates? (Minimum = 4, Maximum = 7, M = 5.95, SD = 1.13)


CHAPTER 5
DISCUSSION AND IMPLICATIONS

The purpose of this study was to examine the relationships between instruction based on Zimmerman and Kitsantas' (2002) social cognitive model for sequential skill acquisition and students' writing self-efficacy, revision skills, and perceptions as they relate to peer review, revision, and learning in online college composition courses. Collaborative Multimedia Peer Review (CMPR) was the instructional strategy that incorporated composition theory and social cognitive theory in order to support students in collaborative learning to develop revision skills. Composition theory emphasizes the importance of audience in the writing process (Breuch, 2003; Ede & Lunsford, 1984; Kellogg, 2008; Magnifico, 2010; Mitchell & Taylor, 1979). Social cognitivism similarly emphasizes how learning is a social activity and self-efficacy is important to self-regulation (Bandura, 1991, 1997; Schunk & Zimmerman, 1997; Zimmerman, 2002; Zimmerman & Schunk, 2001). CMPR was developed to promote student self-efficacy through observational learning, specific feedback types, and dialogue between students in peer review. As a result, it was hypothesized that student revision skills would be more focused on global issues in writing rather than surface issues. VoiceThread was the technology selected to facilitate peer review for CMPR in order to replicate a face-to-face collaborative environment through the availability of its multimedia communication tools, as studies have shown students may prefer video or screencasting feedback over written feedback (Crook et al., 2012; Hung, 2016).

While two sections of online composition utilized CMPR (n = 22), the other three sections utilized peer review worksheets (n = 19). For these students, pretraining on peer review was provided, but they were not specifically taught about feedback types that promote writing self-efficacy. Rough drafts and completed peer review worksheets were exchanged on small-group discussion boards in Blackboard, and students were not required to respond to each other's feedback. The courses ran for a full semester, sixteen weeks. Students took a diagnostic essay quiz and the Writing Self-Regulatory Efficacy Scale in week 2, and they studied peer review in week 3. They participated in peer review for the first time in week 4 and had three more peer review activities (one for each assigned essay), with the last peer review taking place in week 14. In weeks 15 and 16, they completed the Writing Self-Regulatory Efficacy Scale once more.

Interestingly, the students using CMPR were found to have a much lower average on their initial attempt at the Writing Self-Regulatory Efficacy Scale than the students using the peer review worksheet, though the groups were often found to be similar in their peer review interactions for the last writing assignment. Despite the limitations of having a small number of students in each group, the differences in pretest averages and the improvements in scores on the Writing Self-Regulatory Efficacy Scale at the end of the course could indicate CMPR may have contributed to improvements in self-efficacy. Student perceptions also provide some insight into the reasons a student may prefer text over audio comments and vice versa. These results could have implications for how instructors deliver feedback in online classes as well. CMPR students also found both giving and receiving feedback to be useful to their revision practices. Four research questions guided this study:


1. To what extent does Collaborative Multimedia Peer Review impact students' writing self-efficacy in an online freshman composition course?

2. To what extent does Collaborative Multimedia Peer Review impact students' revision process in an online freshman composition course?

3. What relationship, if any, exists between Collaborative Multimedia Peer Review, students' writing self-efficacy, and their revision processes in an online freshman composition course?

4. What are students' perceptions of giving and receiving feedback using voice and video comments in VoiceThread, and how do these perceptions relate to their writing self-efficacy and revision strategies?

Writing Self-Efficacy

Self-efficacy has been shown to impact self-regulation, especially for difficult tasks (Bandura, 1991, 1997; E. Jones, 2008; Schunk & Zimmerman, 1997; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002; Zimmerman & Schunk, 2001), and the process of writing and revising is a complex, yet essential, component of composition that draws on writing self-efficacy (Bandura, 1997; Cho, Schunn, & Charney, 2006; Ekholm et al., 2014; Zimmerman & Kitsantas, 2002). Students also experienced error management training, where they were encouraged to manage rather than avoid errors (Keith & Frese, 2008). Because error management training helps learners deal more positively with the negative emotions that can result from making mistakes (Carter & Beier, 2010; Keith & Frese, 2008), it works well in writing instruction, where making errors is an accepted part of the process. In feedback on assignments leading up to each final essay, I emphasized negotiating errors rather than avoiding errors by being encouraging in my responses and by allowing students to revise and resubmit their outlines as they worked through their understanding of each essay assignment. I suspected that greater improvements in self-efficacy would result from CMPR. The instructional model was developed according to the social cognitive model for sequential skill acquisition, a model that emphasizes self-efficacy improvement through observational learning and social interaction (Zimmerman & Kitsantas, 2002), and error management training counters the negative effects avoiding errors can have on self-efficacy.

Writing Self-Efficacy Scale

According to the quantitative data, while students using CMPR did not achieve a higher level of writing self-efficacy at the end of the course than the students using peer review worksheets, the CMPR group began the course with a lower average on the Writing Self-Efficacy Scale pretest. This could indicate that the treatment may still have impacted the students using CMPR, which supports the research demonstrating how feedback and error management training can positively impact self-efficacy. Another possibility is that some students in CMPR overestimated their writing self-efficacy at the beginning of the course or that variations in writing self-efficacy over the duration of the course played a role in the difference between groups at the end. Other research has shown that learners may overestimate their writing ability, which may impact the accuracy of their reported self-efficacy and how it increases or decreases (Raedts, Rijlaarsdam, van Waes, & Daems, 2007; Sanders-Reio et al., 2014; Zimmerman & Kitsantas, 2002). Self-efficacy can also vary over the course of a learning task, which can impact future performance (Bernacki, Nokes-Malach, & Aleven, 2015). Because self-efficacy can vary in this way, more accurate measures of self-efficacy for the duration of a course may require more frequent assessment, perhaps after each peer review activity.


Also, an analysis of individual questions on the Writing Self-Regulatory Efficacy Scale showed that students in both groups overlapped in how their perceived ability to create and plan improved from the beginning of the course to the end, indicating that many students believe their weakness lies in getting started on a writing task as opposed to reviewing or revising their work. While a significant distinction between novice and more skilled writers lies in revision practices that focus more on global issues (Faigley & Witte, 1981; Kellogg, 2008; Sanders-Reio et al., 2014; Yang, 2011; Zimmerman & Bandura, 1994; Zimmerman & Kitsantas, 2002), another important distinction between novice writers and those who are more skilled lies in the ability to demonstrate an understanding of how writing is a recursive process (Becker, 2006; Witte & Faigley, 1981). The instructional video used in CMPR did not touch on the planning or development stage of the writing process, but both groups of students had to develop an outline as a prewriting activity in the course. Outlining has been shown to lessen the cognitive load associated with the writing process (Baaijen et al., 2014; Kellogg, 1988). Practice and experience in developing an outline for each writing assignment may have contributed to how both groups perceived improvement in this specific area of self-efficacy. Qualitative data were equally important to the conclusions drawn from this study, so weighing the quantitative results against the qualitative data collected from peer review groups was a key step in drawing conclusions about students' writing self-efficacy.

Feedback Types

Qualitative data analysis revealed similarities in the type and depth of feedback used by students in both groups. Specific feedback types (directive, nondirective, and praise) were taught in CMPR because perceptions of the usefulness of peer review can impact self-efficacy (Cho, Schunn, & Charney, 2006; Ekholm et al., 2014). However, analysis revealed that students using the peer review worksheet in some cases utilized the same feedback types. As peer review is commonly used in K-12 courses as well as in college composition (Loretto, DeMartino, & Godley, 2016; Schunn, Godley, & DeMartino, 2016), this finding could indicate that students experienced peer review prior to freshman composition and potentially knew how to provide directive comments and praise (even though they may not have known the specific labels associated with those feedback types). This has important ramifications for how college composition instructors approach peer review since the findings indicate that students in freshman composition do not necessarily need to be taught the feedback types that may promote self-efficacy. CMPR focused mainly on demonstrating how to give feedback. When developing peer review activities for college composition, modeling how to give feedback may not be as important to improving student self-efficacy as strategies for prompting meaningful dialogue beyond giving and receiving feedback.

However, many of the directive comments touched on specific assignment requirements. Some students closely examined what was being communicated in other students' drafts in terms of the assignment expectations rather than the broader qualities of well-developed college writing. However, understanding the writing situation, the context of the writing task, is an essential skill for students developing their written communication skills.


Comments that referenced the assignment requirements would have shown students where they were not meeting the expectations of the writing task and informed some of their revision decisions. I believe this is evident in the number of macro-level new-point changes students made from their rough to final drafts, which totaled 122 changes made by 21 of the 41 students from both groups. Students being more aware of whether they were meeting expectations for the writing task may have influenced their self-efficacy positively or negatively depending on how those comments were received, but the additional presence of praise comments in the feedback should have had a positive impact on self-efficacy in both groups.

Self-Reflection

Qualitative data analysis also revealed the presence of self-reflection within students' feedback. Students in both the CMPR groups and peer review worksheet groups were required to write reflections as part of each essay submission, but evidence of self-reflection within feedback posts indicates that metacognitive monitoring may have been taking place as students reviewed their peers' drafts. This adds to research indicating that the act of giving feedback can benefit the reviewer's own writing (MacArthur & Philippakos, 2013), and it aligns with students' perceptions of giving feedback improving their understanding of how to approach revision, discussed in CMPR Student Perceptions below. Also, the presence of self-reflection for the five students who referenced their own writing as part of their feedback comments may indicate that critical thinking as a result of collaborative learning was occurring in these cases.

Peer Review Dialogue

Interaction between writer and audience is essential to the development of writing skills (Breuch, 2003; Mitchell & Taylor, 1979; Nystrand, Greene, & Wiemelt, 1993). Dialogue following feedback posts for the CMPR group was an important component of the peer review design (Cho & MacArthur, 2010), and articulating which suggestions they chose to take or not take would have allowed students to convey their thinking process about revision choices (Yang, 2011). For both groups of students, gratitude was expressed, which can create a sense of community in group work (Kwon et al., 2014), and such comments may have also impacted writing self-efficacy. However, analysis of responses to feedback in both groups showed the groups to be similar in the absence of responses focused on discussing revision. Meaningful responses to feedback were a required component of peer review in CMPR, but replies from reviewees to their reviewers were mostly superficial, and there was no further dialogue beyond a reply when a reply was provided. Additionally, students using the peer review worksheet also posted replies in many cases though they were not encouraged to do so, which was unexpected. The results indicate that, in the case of the students using CMPR, discussion in a different online environment may not overcome known challenges associated with meaningful dialogue between students in online course discussions. (Their perceptions on modality are discussed further below.) When responses to peer feedback were provided, they were often superficial and did not prompt an ongoing dialogue. The few responses that might have prompted further discussion were ignored by other group members. The lack of collaborative dialogue may have been because some feedback was abrupt and lacked depth.


In cases where feedback consisted only of praise or identified minor surface and typographical errors, there was little opportunity for meaningful replies, though two students posed questions to prompt further discussion that were ignored by classmates. Another possible cause may have been that students did not see the importance of dialogue following initial feedback posts. The dialogue was one component of peer review that, if not completed, could result in a loss of points but not failure of the assignment. Since responses were not a higher-stakes activity, students may have believed they were not as important a component as the initial comments giving feedback. These findings further illustrate how online discussions often fail to achieve the level of critical thinking and learning for which they are developed (Maurino, 2007; Schwartzman & Morrissey, 2010; Mooney et al., 2014). Without meaningful dialogue, the collaborative piece that was critical to Collaborative Multimedia Peer Review was mostly absent (Mooney et al., 2014; Zhao et al., 2014). For some students using the peer review worksheet, replies on the discussion board were provided without prompting from the instructor. While these replies also did not make meaningful contributions to discussions on writing, they were similar to responses in CMPR in that they expressed gratitude. Both groups had individuals who expressed gratitude to reviewers for their feedback. While that may have improved the sense of community for individuals in each group, it would not have prompted collaborative critical thinking on revision choices. The lack of meaningful interaction in the CMPR group, though dialogue was supposed to be part of peer review, and the presence of responses in the other groups, though responses were not necessary, are noteworthy results since they imply a common knowledge of feedback and interaction that students may automatically engage in when reviewing drafts.

Also, unlike general online discussions, in research on peer review interactions the instructor does not typically interject (Cho, 2006; Cho & MacArthur, 2010; Kaufman & Schunn, 2011). However, in order to promote more collaborative learning in peer review, students need to have more meaningful conversations, and increased instructor presence may be necessary. Research indicates that online discussions may benefit from direct prompts by the instructor within each group or from guiding discussion questions. In online peer review discussions, as in any online discussion, increased instructor involvement may be required to ensure more meaningful online conversations between students (Maurino, 2007). This has important implications for composition instructors in developing peer review online since instructor interaction within peer review groups is not typically a key element of peer review activities.

Revision Practices

Examining revision practices was intended to show how many changes were meaning-level changes, those that address global issues of development, style, and organization. Individuals are considered to be more skilled if their revision is focused on meaning-level changes (Faigley & Witte, 1981; Kellogg, 2008; Sanders-Reio, Alexander, Reio, & Newman, 2014; Yang, 2011). Based on each student's percentage of meaning-level change, students were categorized according to Faigley and Witte's (1981) analysis of the averages for change types between inexperienced, advanced, and expert writers.


An examination of revision practices showed the students in the different peer review conditions were very similar in the frequency of surface changes and meaning changes, though I expected students in the CMPR group to have more meaning-level changes than students using peer review worksheets. These results may confirm how, in some cases, differences in peer review condition do not significantly impact student writing, such as in Covill's (2010) study, where no significant difference was found in the final drafts of students in a peer review condition, a self-review condition, and a no-review condition. The results may also show that observational learning plays an important role in peer review when groups are arranged heterogeneously by skill. Weaker students in both groups had the opportunity to observe the writing of stronger students who performed well on the diagnostic essay and on subsequent writing assignments.

Alternatively, the results may instead identify an important issue in trying to categorize students' skill levels based on their revision practices. Students using CMPR and students using peer review worksheets aligned closely in the average amount of meaning changes compared to surface changes, but none met the criteria for being categorized as inexperienced writers, which was unexpected since they were all in a freshman college composition course. From the analysis of revision alone, it would seem that these students demonstrated the revision practices of more skilled writers. This points to a need for further research on the revision practices of individuals at various levels and for the development of more robust instruments to measure revision skill. These findings mean that student writing and revision practices need to be examined more closely to arrive at a better baseline for how college freshmen write and revise. Meeting the thresholds for meaning-level change may not accurately identify those students as expert writers.

When Faigley and Witte (1981) studied their revision taxonomy, drafts were handwritten. Tools and affordances for writing and revision have changed considerably since then. In their analysis of revision, participants hand-wrote drafts with different color pens. Unlike in the early 1980s, access to word processors is now ubiquitous, and deleting as one writes makes large, meaning-level changes easier. Differences between handwritten composition and writing with a word processor have been found (Chadwick & Bruce, 1989; Mogey & Hartley, 2013). Chadwick and Bruce (1989) specifically found that students using word processors make more macrostructure changes. While versions of the Faigley and Witte (1981) revision taxonomy are still used to analyze revision, there have not been updated discussions on how students' revision practices can be used to categorize their skill level. If composition instructors had a better idea of how novice, advanced, and expert writers revise, we would be better able to teach students how to become more skilled. Also, modeling revision made up a brief section of the CMPR instructional video, about two and a half minutes towards the end of the video. Once composition faculty understand how students at different levels approach revision, we can develop peer review instruction that places more emphasis on modeling the revision practices of expert writers.

Relationships between CMPR, Writing Self-Efficacy, and Revision


Self-efficacy is a key aspect of self-regulation, and self-regulation is necessary for tackling complex learning tasks such as revision (Bandura, 1991, 1997; Bruning & Kauffman, 2015; MacArthur & Philippakos, 2013; Pajares, 2003; Schunk & Zimmerman, 1997). As a result, correlational analysis was performed to ascertain whether there may have been relationships between CMPR, students' writing self-efficacy, and their revision processes. I expected to see a moderate to strong positive correlation, but no such relationship was discovered. The absence of a correlation between CMPR, writing self-efficacy, and revision choices may be due to the small number of students who remained in the CMPR group and, consequently, to decreased statistical power. It may also relate to the application of Faigley and Witte's (1981) taxonomy to the writing and revision context in my study. Students were characterized according to Faigley and Witte's (1981) findings. In their study, participants were categorized based on the classes they were taking or their writing experience, and the revisions each group made were analyzed. However, there were only six participants in each level, and the participants were volunteers, not required to write well for a grade or other compensation. These qualities could have impacted the accuracy of the category averages and caused them to be misapplied in the current study. Also, as noted in the previous section, students' use of writing and revision technologies such as word processing tools could have impacted how they revised, which would have impacted the correlational analysis. Reconsidering how revision practices are categorized and how instructors model revision, as discussed in the sections above, would also potentially impact the correlation we might see between an instructional method, self-efficacy, and revision skills in future research.
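As an illustration of the correlational step described above, the sketch below shows how rank-order correlations could be computed in Python with SciPy. It is only an illustrative sketch: Spearman's rho is assumed here as the rank-order statistic, and the values and variable names are hypothetical rather than the study's data. With samples this small, statistical power is low, which is consistent with the weak, non-significant correlations reported in Chapter 4.

from scipy import stats

# Hypothetical per-student measures for one peer review condition: final
# self-efficacy average, predicted final-essay rubric score, and percentage
# of meaning-level revision changes (pared-down Faigley & Witte taxonomy).
self_efficacy   = [4.2, 4.8, 5.1, 4.5, 5.6, 4.9, 5.3, 4.4]
predicted_score = [3.5, 4.0, 4.5, 3.0, 4.75, 4.25, 4.0, 3.75]
meaning_pct     = [48.0, 55.5, 60.2, 44.1, 58.7, 51.3, 53.0, 49.5]

# Rank-order (Spearman) correlations between the variables of interest.
rho, p = stats.spearmanr(self_efficacy, meaning_pct)
print(f"self-efficacy vs. meaning-level revision: rho = {rho:.2f}, p = {p:.3f}")

rho, p = stats.spearmanr(self_efficacy, predicted_score)
print(f"self-efficacy vs. prediction of performance: rho = {rho:.2f}, p = {p:.3f}")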


CMPR Student Perceptions

An important goal of this study was also to explore student perceptions of receiving and providing feedback and of the usefulness of VoiceThread, and how those perceptions relate to students' self-efficacy and revision strategies. Student perceptions are an important variable, as research has shown perceptions can influence student self-efficacy and engagement with a task (Brammer & Rees, 2007; Cho, Schunn, & Charney, 2006; Kasanga, 2004). Student responses to a judgement of learning question and a prediction of performance question provided some interesting results. While the variables were not correlated in this study, students believed both giving and receiving feedback impacted their revision practices, which is consistent with other research on student perceptions of peer feedback (Baker, 2016; Cho, Schunn, & Charney, 2006; Kasanga, 2004; Ludemann & Mcmakin, 2014). Also, most students' predictions of how they would perform on their final essay were within one level of their actual final rubric score, which shows that students may have succeeded in their metacognitive monitoring, an important feature of self-regulation (Anderson & Thiede, 2008; Thiede & Anderson, 2003; Thomas, Antonenko, & Davis, 2016).

Of equal value were student responses regarding the usefulness of VoiceThread and their preferences for text or audio comments. Half of the students in CMPR did prefer to give or receive audio comments, but the other half preferred text. The students who preferred audio comments explained they had the sense of having a more personal connection with each other and of being better understood through voice comments as opposed to text comments. This finding adds to the research supporting how audio or audiovisual feedback may be preferred in some cases because it is perceived to be more detailed and more personal than written feedback (Crook et al., 2012; Hung, 2016; Jones et al., 2012; Vincelette & Bostic, 2013). However, the preference of half of the students for text comments is equally noteworthy. Rather than valuing personal connections, some students preferred text comments because they believed text comments were easier to review and apply.

These results have important implications for not only peer review, but also the delivery of feedback in online learning environments. Some learning management systems, for example Canvas, provide instructors with affordances to generate audio and video feedback in addition to, or as a substitute for, traditional text-based feedback. The results of this study indicate that this decision may impact students in a class in different ways, with some students appreciating audio or video feedback due to an enhanced perception of personal connection with the instructor, whereas others may wish to receive text feedback that allows them to use review strategies that they are comfortable with. Student preferences for certain feedback types may impact the ease with which they are able to use feedback to improve, so it may be necessary to offer feedback in multiple modalities.

Revising CMPR

Findings from this study identified some areas where CMPR could be improved for future implementation. While peer review groups were initially heterogeneous based on students' diagnostic essay scores, some students' outlines (submitted prior to peer review) showed that they did not understand specific assignment requirements in spite of the general writing skills displayed through their diagnostic essay scores.


An addition to CMPR could include an assignment checkup quiz, a low-stakes quiz students must complete after reviewing the module lecture but before they can access the links for submitting their essay outlines and drafts. This assessment would alert the instructor to which students in the course need additional instructional support before going forward in their writing process. Offering more support to students who seem not to understand course content would also direct peer review feedback and dialogue to focus primarily on the quality of writing and not on whether or not group members have mentioned required terms or concepts from the assignment description. Also, since students using peer review worksheets utilized directive feedback and praise comments without being instructed to do so, the video used in CMPR to instruct students in how to give feedback could be edited to cut out the explanation of feedback types. Instead, more attention could be given to modeling follow-up dialogue and to modeling effective revision strategies.

To better facilitate collaborative learning through the required dialogue, instructor presence needs to become a part of peer review activities in CMPR, and social presence needs to be better prompted. Students need guidance on how to interact in ways that allow for more collaborative learning opportunities. Peer review activities could be developed as protocol-based discussions, a type of structured discussion that is meant to promote problem solving among students as well as self-reflection and support from others in the group (Zydney, Denoyelles, & Kyeong-Ju Seo, 2012). In addition to posting a draft in VoiceThread, students would use the software to also post a video identifying the areas they believe need improvement and their purpose for choosing a particular topic. The instructor can also guide the peer review discussion by addressing specific questions to each student in the group after they have offered their own initial feedback. The instructor's comments can direct additional feedback. An instructor comment might be: "Student 1, Student 2 has said she is working on being more specific in the examples and details she provides. In looking at Student 2's paragraph three, offer her some feedback on her use of specific details. Think about how you utilize details in your own writing. Provide Student 2 with a strategy that you use when you write to help you formulate specific details." In this comment, the instructor would directly address a student and suggest something he could offer more feedback on without telling him specifically what may be wrong or what corrections to recommend. A similar comment from the instructor could be directed to Student 2 in this example, directing her to talk to Student 1 about how she thinks his feedback will improve her writing.

The dialogue requirement also might seem more important to students if it were graded separately, so that students would see a zero or a low score in their grades if they did not participate in the follow-up dialogue. In Zydney et al.'s (2012) use of protocol-based discussion, the researchers found that students preferred clear, simple instructions with rubrics and due date reminders. For CMPR, a rubric could be developed and shared with students for grading the timely posting of drafts, the inclusion of a video by the draft author indicating goals and areas that he or she wants to improve on, and initial feedback posts. A separate rubric could be developed and shared that demonstrates how follow-up dialogue would be graded.


The day before a deadline, the instructor could also send out a reminder announcement to the class about upcoming deadlines.

Implications for Future Practice

Students encounter challenges outside of classes that may impact their ability to participate in course assignments or may impact their motivation to persist in a course, and it is difficult to account for all possible issues. In this study, sample size became a major limitation as many students did not persist in the course or failed to fully engage in all assignments. Despite my predictions that students in the CMPR group would improve in self-efficacy and revision skills over students using the peer review worksheet, CMPR students did not outperform the comparison group; nevertheless, the findings have significant implications for future practice in developing peer review activities in online composition.

Given how important self-efficacy is to performance, composition instructors need to be more intentionally aware of the impact feedback (peer feedback and instructor feedback) can have on student self-efficacy. Error management training (Keith & Frese, 2008) in particular should be more widely used in composition instruction as a means of promoting self-efficacy and encouraging students to expect to have to make meaningful changes throughout their writing process. CMPR students' writing self-efficacy at the end of the course did not significantly exceed that of students using the peer review worksheet, yet CMPR students also scored notably lower on the Writing Self-Regulatory Efficacy Scale at the beginning of the course, and the difference between their pre- and posttest self-efficacy scores approached significance. This result, in addition to their perceptions about the usefulness of VoiceThread and their comments at the end of the course, supports the possibility that using CMPR may have influenced the self-efficacy of students who struggle with writing or perceive themselves to be poor writers.

Additionally, the similarities between how the groups engaged in peer review have implications for what content may be important to include in peer review instruction. Planning is an important element of writing self-efficacy and should be something instructors specifically guide students through. Instruction for how to plan an essay was not specifically addressed in CMPR, though it was included in all courses through a planning and outlining worksheet students used prior to submitting their rough drafts. The examination of how each group scored on specific Writing Self-Regulatory Efficacy Scale questions showed that students in both groups improved on questions related to the creative and planning aspects of writing. This finding may have resulted from the planning activities that were provided in all courses, particularly the outline assignments. Outlines were required for all classes because outlining can decrease the cognitive load associated with a complex task (Baaijen et al., 2014; Kellogg, 1988). In the case of this study, requiring outlines may have contributed to self-efficacy improvement in the planning stage of the writing process.

In developing writing and peer review instruction, particular types of feedback may not need to be explained. Feedback that is perceived well by those who receive it impacts self-efficacy (Cho, Schunn, & Charney, 2006; Ekholm et al., 2014), but types of feedback do not need to be defined for students as part of peer review. Students who were assigned peer review worksheets used directive feedback and praise comments even though they were not taught to use these types of comments, while students using CMPR were.


These results indicate that specific types of feedback that promote self-efficacy do not need to be explicitly defined when demonstrating how to engage in effective peer review. While feedback types may not need to be specifically defined as they were in CMPR, a stronger focus on improving self-regulation and on modeling effective revision strategies should be emphasized. The analysis of individual questions on the Writing Self-Regulatory Efficacy Scale showed a lack of significant improvement on most questions related to revision. As a result, a larger part of composition instruction needs to include specific approaches to revision. The findings from the current study are important in how they highlight what specific qualities of the writing process, peer interaction, and revision should be part of composition instruction.

Another meaningful conclusion revealed through this study is that challenges in prompting students to participate in discussions that promote learning can persist despite modeling dialogue and changing the discussion environment. The results illustrated how students using CMPR failed to engage in meaningful discussions as part of peer review. The dialogue that was a required component of peer review in CMPR courses lacked depth, and replies, if any were posted, were lacking in content. Composition instructors do not typically view peer review discussions as being the same as general discussions in online classes, so instructors do not usually engage with students in their peer review discussions. However, if composition faculty want collaborative learning to occur, it may be necessary to develop additional prompts to post within peer review discussions so that students make posts that indicate they are thinking critically about the stages of writing and how to improve their own practices. Other strategies for stimulating productive discussions may also apply (e.g., Stahl, 2015).

As demonstrated in this study, the various writing skill levels students bring to a composition course can be difficult to ascertain and to categorize, but studying how students revise may help. Once composition instructors understand how students at different skill levels approach revision, and the role word processing software plays, we can more effectively model advanced and expert revision strategies to students as part of composition instruction. We would also be able to offer feedback more specifically geared towards helping students attain better revision skills. Word processing software makes large changes to content and organization easier than they would be for handwritten compositions. Use of this software may have an important influence on individuals' revision practices, something that has not been explored in other studies using versions of the Faigley and Witte (1981) revision taxonomy. More research is needed on the revision processes of individuals at various writing skill levels in order to better develop a standard for how novices, advanced students, and skilled writers revise.

The division in students' preferences for text or audio comments in this study also has implications for how feedback should be delivered and for designers of multimedia feedback software. Instructors for online courses, not just online composition courses, need to consider the modality of their feedback in order to make feedback accessible to students. If students prefer text feedback, as half of the students using CMPR did, they might not utilize feedback delivered via audio or video.


Asking students at the beginning of a course what their feedback preferences are could be a way of ensuring the modality fits their preferences, but whether or not delivering feedback in different modalities is possible may depend on the resources available in specific learning management systems or on what other software instructors can access. Leaving audio feedback in Blackboard, for example, is a complex process at my institution. This makes it difficult for instructors, and for students, to utilize audio for feedback or discussion posts within the LMS.

While this study found few differences between groups using different peer review conditions, the findings add to the research on peer review and the revision practices of students taking freshman composition and bring us closer to conceptualizing a potential standard practice in CMPR. These findings provide some insight into a few features that may support students' self-efficacy and interactions in peer review, such as error management training, requiring planning activities like outlines, instructor presence in peer review discussions, and a stronger focus on modeling revision strategies. Yet the findings of this study also raise more questions that practitioners and researchers in composition studies need to examine in order to come to a better understanding of how to design peer review and feedback delivery in online courses that can develop stronger academic writers. Written communication is an important interdisciplinary skill, so composition instructors have a responsibility to examine our instructional practices, how we facilitate peer review, and how feedback is both delivered and received in order to understand what our students need and to improve upon current teaching methods.


APPENDIX A
DIAGNOSTIC ESSAY RUBRIC INDICATORS

Content
Level 5: Has a clear and perceptive thesis that is fully supported with specific, relevant examples and contains no generalizations.
Level 4: Has a clear thesis that is supported with many specific, relevant examples but may have a few generalizations.
Level 3: Has a clear thesis that is supported with some specific details and examples but contains some generalizations.
Level 2: Has a thesis that is not clearly defined and is supported largely by generalizations.
Level 1: Has no thesis and has inadequate or irrelevant support.

Organization
Level 5: Demonstrates a clear and insightful progression of ideas; transitions are seamless and skillful; has an organic and sophisticated structure.
Level 4: Demonstrates a clear progression of ideas; transitions are effective; has an effective structure.
Level 3: Demonstrates a clear progression of ideas; transitions are adequate; has an adequate structure.
Level 2: Lacks a clear progression of ideas; transitions are abrupt and sometimes inaccurate; has an inconsistent structure.
Level 1: Lacks a progression of ideas; transitions are nonexistent or inaccurate; has no discernible structure.

Conventions
Level 5: Errors in grammar, mechanics, spelling, and word choice are insignificant.
Level 4: Errors in grammar, mechanics, spelling, and word choice are occasional.
Level 3: Some errors in grammar, mechanics, spelling, and word choice are evident but do not interfere with readability.
Level 2: Errors in grammar, mechanics, spelling, and word choice are frequent and sometimes interfere with readability.
Level 1: Numerous errors in grammar, mechanics, spelling, and word choice are abundant and often interfere with readability.

Language and Audience
Level 5: Language choices are fresh and innovative. Sentences are skillfully worded. Sentence structure is both varied and sophisticated. Tone is exemplary and shows sophisticated awareness of the audience and purpose.
Level 4: Language choices are precise and purposeful. Sentences are clearly worded and unambiguous. Sentence structure is varied and competent. Tone is well suited to the purpose and audience.
Level 3: Language choices are appropriate and accurate. Sentences are generally clearly worded, though there may be minor problems with syntax or omissions; these minor errors do not interfere with readability. Sentence structure is varied. Tone is appropriate for the purpose and audience.
Level 2: Language choices are adequate; occasional errors in diction or usage may interfere with meaning. Occasional errors in syntax or problems with wording may lead to ambiguity or interfere with readability. Sentence structure lacks variety and tends to be mechanical. Tone shows some awareness of audience and purpose, though some choices may be inappropriate for the writing task.
Level 1: Language choices are limited, inadequate, or inaccurate. Syntactical errors or ambiguous wording may lead to confusion. Sentence structure is simplistic or disjointed. Tone is inappropriate for the specific audience and purpose.


APPENDIX B
QUALITATIVE ANALYSIS CODEBOOK

Code name: Description

Assignment: Feedback references a particular aspect of the lecture, such as explaining a persuasive appeal.

Change Type: Revision codes from Cho, K., & MacArthur, C. (2010). Student revision with peer and expert reviewing. Learning and Instruction, 20(4), 328-338. http://doi.org/10.1016/j.learninstruc.2009.08.006
  Macro level: Changing meaning
    New Points: Adding entirely new points or paragraphs, not just elaborating on an existing point
    Organization: Changing or deleting transitional elements
  Micro level: Meaning-preserving change
    Micro Complex: At the sentence or paragraph level, fixing points by changing or deleting
    Micro Extended: Elaborating on a point by adding
  No change: Indication writing requires no revision
  Surface: Changes such as correcting spelling, tense, punctuation, abbreviation, technical errors, etc.

Closing Remark: Polite closing to a comment offering "good luck," "best wishes," or some similarly polite statement.

Feedback Plan: Laying out how feedback will be provided.

Feedback Type: Types of feedback from Cho, K., Schunn, C., & Charney, D. (2006). Commenting on writing: Typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260-294. http://doi.org/10.1177/0741088306289261
  Criticism: Making a negative observation about one's paper (Cho, Schunn, & Charney, 2006)
  Directive: Suggestions for specific changes
  Nondirective: General comment applicable to any paper; observes a general area needing improvement
  Off task: Comment not related to essay improvement
  Praise: Encouraging observations on the whole or a portion of the paper
  Summary: Comments where the feedback summarizes an element of the essay or the entire essay

Follow Up: A comment predicts future performance or an expectation that the person whose draft is being reviewed may already plan to fix something.

Gratitude: Appreciation or thanks

Modality: Which comment type in VoiceThread was utilized
  Audio
  Text

Reference Peer: A reviewer refers back to what another reviewer in the group commented.

Reply: Individual's response to peers' feedback
  Accept feedback: In reply, the student indicates he/she will use feedback
  Reject feedback: Student indicates he/she won't use feedback
  Justification: Student explains why he/she won't use feedback
  Request: Requests additional comments or poses a question for additional feedback

Self reflection: Reviewer reflects on an element of his/her own writing as part of feedback

Technical: Indication that someone is experiencing technical issues.

Uncertainty: The person indicates that he/she is not sure about a comment by using question marks, contradicting him/herself, or using a term that indicates uncertainty.


LIST OF REFERENCES

An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students' interactions during asynchronous online discussions. Computers and Education, 53(3), 749-760. https://doi.org/10.1016/j.compedu.2009.04.015

Anderson, M. C. M., & Thiede, K. W. (2008). Why do delayed summaries improve metacomprehension accuracy? Acta Psychologica, 128(1), 110-118. https://doi.org/10.1016/j.actpsy.2007.10.006

Armstrong, S. L., & Paulson, E. J. (2008). Whither "peer review"? Terminology matters for the writing classroom. Teaching English in the Two-Year College, 35(4), 398-407.

Association of Departments of English. (2009). Education in the balance: A report on the academic workforce in English. Profession, 65, 180-244. https://doi.org/10.1632/prof.2009.2009.1.180

Baaijen, V. M., Galbraith, D., & de Glopper, K. (2014). Effects of writing beliefs and planning on writing performance. Learning and Instruction, 33, 81-91. https://doi.org/10.1016/j.learninstruc.2014.04.001

Baker, K. M. (2016). Peer review as a strategy for improving students' writing process. Active Learning in Higher Education. https://doi.org/10.1177/1469787416654794

Bakhtin, M. M. (1981). Discourse in the novel. In The dialogic imagination: Four essays (C. Emerson & M. Holquist, Trans.). Austin, TX: University of Texas Press.

Bandura, A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50(2), 248-287. https://doi.org/10.1016/0749-5978(91)90022-L

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman and Company.

Becker, A. (2006). A review of writing model research based on cognitive processes. In A. Horning & A. Becker (Eds.), Revision: History, theory, and practice (pp. 25-49). West Lafayette, IN: Parlor Press. https://doi.org/10.1080/09585176.2010.504574

… review and self-assessment. Writing Program Administration, 34(2), 11-36. Retrieved from http://go.galegroup.com/ps/i.do?id=GALE|A279262778&v=2.1&u=gain40375&it=r&p=AONE&sw=w&asid=78e281c42a00437f98b2023ed2ceb01f

Bernacki, M. L., Nokes-Malach, T. J., & Aleven, V. (2015). Examining self-efficacy during learning: Variability and relations to behavior, performance, and learning. Metacognition and Learning, 10(1), 99-117. https://doi.org/10.1007/s11409-014-9127-x

Brammer, C., & Rees, M. (2007). Peer review from the students' perspective: Invaluable or invalid? Composition Studies, 35(2), 71-85. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,uid&db=aph&AN=28006696&site=ehost-live

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101. https://doi.org/10.1191/1478088706qp063oa

Breuch, L. (2003). Post-process "pedagogy": A philosophical exercise. In V. Villanueva (Ed.), Cross-talk in comp theory: A reader (pp. 97-125).

Bruffee, K. A. (1984). Collaborative learning and the "Conversation of Mankind." College English, 47(7), 635-652. https://doi.org/10.1007/s13398-014-0173-7.2

Bruning, R. H., & Kauffman, D. F. (2015). Self-efficacy beliefs and motivation in writing development. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed., pp. 160-173). New York: Guilford Publications.

Carifio, J., Jackson, I., & Dagostino, L. (2001). Effects of diagnostic and prescriptive comments on the revising behaviors of community college students. Community College Journal of Research and Practice, 25, 109-122. https://doi.org/10.1080/10668920150218498

Carter, M., & Beier, M. E. (2010). The effectiveness of error management training with working-aged adults. Personnel Psychology, 63(3), 641-675. https://doi.org/10.1111/j.1744-6570.2010.01183.x

Chadwick, S., & Bruce, N. (1989). The revision process in academic writing: From pen & paper to word processor. Hong Kong Papers in Linguistics and Language Teaching, 12. Retrieved from http://lp.hscl.ufl.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=ED347525&site=eds-live

Chanquoy, L. (2009). Revision process. In R. Beard, D. Myhill, & M. Nystrand (Eds.), The SAGE handbook of writing development (pp. 80-97). London: SAGE Publications.

Cho, K. (2006). Commenting on writing: Typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260-294. https://doi.org/10.1177/0741088306289261

Cho, K., & MacArthur, C. (2010). Student revision with peer and expert reviewing. Learning and Instruction, 20(4), 328-338. https://doi.org/10.1016/j.learninstruc.2009.08.006


Cho, K., & MacArthur, C. (2011). Learning by reviewing. Journal of Educational Psychology, 103(1), 73-84. https://doi.org/10.1037/a0021950

Cho, K., Schunn, C., & Charney, D. (2006). Commenting on writing: Typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260-294. https://doi.org/10.1177/0741088306289261

Cho, K., Schunn, C., & Wilson, R. (2006). Validity and reliability of scaffolded peer assessment of writing from instructor and student perspectives. Journal of Educational Psychology, 98(4), 891-901. https://doi.org/10.1037/0022-0663.98.4.891

Coté, R. (2014). Peer feedback in anonymous peer review in an EFL writing class in Spain. Gist Education and Learning Research Journal, 9(9), 67-87.

Covill, A. (2010). Comparing peer review and self-review as ways to improve college students' writing. Journal of Literacy Research, 42, 199-226. https://doi.org/10.1080/10862961003796207

Creswell, J. W., & Clark, V. L. P. (2007). Designing and conducting mixed methods research (1st ed.). Los Angeles: SAGE Publications.

Creswell, J. W. (2013). Qualitative inquiry and research design. Los Angeles: Sage.

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Los Angeles: SAGE.

Crook, A., Mauchline, A., Maw, S., Lawson, et al. (2012). The use of video technology for providing feedback to students: Can it enhance the feedback experience for staff and students? Computers and Education, 58(1), 386-396. https://doi.org/10.1016/j.compedu.2011.08.025

Curtis, D. D., & Lawson, M. J. (2001). Exploring collaborative online learning. Journal of Asynchronous Learning Networks, 5(1), 21-34. https://doi.org/10.1016/j.jcss.2007.08.004

Delgado, O. G., & McDougald, J. (2013). Developing writing through blogs and peer feedback. Íkala, 18(3), 45-61. Retrieved from http://aprendeenlinea.udea.edu.co/revistas/index.php/ikala/article/viewArticle/16196

Dringus, L. P., Snyder, M. M., & Terrell, S. R. (2010). Facilitating discourse and enhancing teaching presence: Using mini audio presentations in online forums. Internet and Higher Education, 13(1-2), 75-77. https://doi.org/10.1016/j.iheduc.2009.11.001

Duffy, T., & Cunningham, D. (1996). Constructivism: Implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology: A project of the Association for Educational Communications and Technology (pp. 170-198). New York: Macmillan Library Reference.

Duijnhouwer, H., Prins, F. J., & Stokking, K. M. (2012). Feedback providing improvement strategies and reflection on feedback use: Effects on students' writing motivation, process, and performance. Learning and Instruction, 22(3), 171-184. https://doi.org/10.1016/j.learninstruc.2011.10.003

Ede, L., & Lunsford, A. (1984). Audience addressed/audience invoked: The role of audience in composition theory and pedagogy. College Composition and Communication, 35(2), 155-171. https://doi.org/10.2307/358093

Ekholm, E., Zumbrunn, S., & Conklin, S. (2014). The relation of college student self-efficacy toward writing and writing self-regulation aptitude: Writing feedback perceptions as a mediating variable. Teaching in Higher Education, 20(2), 197-207. https://doi.org/10.1080/13562517.2014.974026

Faigley, L., & Witte, S. (1981). Analyzing revision. College Composition and Communication, 32(4), 400-414. https://doi.org/10.2307/356602

Ferris, D. R. (2014). Responding to student writing: Teachers' philosophies and practices. Assessing Writing, 19, 6-23. https://doi.org/10.1016/j.asw.2013.09.004

Flower, L. (1990). Reading-to-write: Exploring a cognitive and social process. New York: Oxford University Press.

Flower, L., & Hayes, J. R. (1981). A cognitive process theory of writing. College Composition and Communication, 32(4), 365-387. https://doi.org/10.2307/356600

Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks, 11(1), 61-72.

Giles, S. L. (2010). Reflective writing and the revision process: What were you thinking? In Writing spaces: Readings on writing (Vol. 1, pp. 191-204). Parlor Press. Retrieved from http://writingspaces.org/

Good, J. (2012). Crossing the measurement and writing assessment divide: The practical implications of inter-rater reliability in faculty development. WAC Journal, 23, 19-30. Retrieved from http://stats.lib.pdx.edu/proxy.php?url=http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=90484928&site=ehost-live

Hart Research Associates. (2015). Falling short? College learning and career success.


149 response process to the writing classroom. Issues in Writing 16 (2), 162 183. Retrieved from http://proquest.umi.com/pqdweb?did=1519127651&Fmt=7&clientId=18803&RQT=3 09&VName=PQD Hung, T A. (2016). Enhancing feedback provision through multimodal video technology. Computers & Education 98 90 101. https://doi.org/10.1016/j.compedu.2016.03.009 Isaacson, R., & Fujita, F. (2006). Metacognitive knowledge monitoring and self regulated learning: Academic success and reflections on learning. The Journal Scholarship of S cholarship of Teaching and Learning 6 (1), 39 55. https://doi.org/10.1177/0165551511400955 Johnson, R. (2001). The Next Frontier of the Student Centered Classroom: Teaching Students To Recognize Qual ity Writing through the Use of Peer Evaluation Retrieved from http://files.eric.ed.gov/fulltext/ED463813.pdf Jones, E. (2008). Predicting performance in first semester college basic writers: Revisiting the role of self beliefs. Contemporary Educational Psychology 33 (2), 209 238. https://doi.org/10.101 6/j.cedpsych.2006.11.001 Jones, N., Georghiades, P., & Gunson, J. (2012). Student feedback via screen capture Higher Education 64 (5), 593 607. h ttps://doi.org/10.1007/s10734 012 9514 7 year writing course. Tydskrif Vir Taalonderrig/Journal for Language Teaching 38 (1), 64 99. Kaufman, J. H., & Schunn, C. D. (2011). S for writing: Their origin and impact on revision work. Instructional Science 39 (3), 387 406. https://doi.org/10.1007/s11251 010 9133 6 Keeley, S. E. (2014 successful practices. Writing & Pedagogy 6 (2), 379 397. https://doi.org/10.1558/wap.v6i2.379 Keith, N. & Frese, M. (2008). Effectiveness of error management training: A meta analysis. The Journal of Applied Psychology 93 (1), 59 69. https://doi.org/10.1037/0021 9010.93.1.59 Kellogg, R. (1988). Attentional overload and writing performan ce: Effects of rough draft and outline strategies. Journal of Experimental Psychology: Learning, Memory and Cognition 14 (2lev p1), 355 365. https://doi.org/10.1037/0278 7393.14.2.355 Kellogg, R. (2008). Training writing skills: A cognitive developmental perspective.


Kerr, N. L., & Bruun, S. E. (1983). Dispensability of member effort and group motivation losses: Free-rider effects. Journal of Personality and Social Psychology, 44(1), 78-94. https://doi.org/10.1037/0022-3514.44.1.78
… of communities of inquiry: Effects of learning technology use on cognitive presence in asynchronous online discussions. Internet and Higher Education, 27, 74-89. https://doi.org/10.1016/j.iheduc.2015.06.002
Kreijns, K., Kirschner, P. A., & Jochems, W. (2003). Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: A review of the research. Computers in Human Behavior, 19(3), 335-353. https://doi.org/10.1016/S0747-5632(02)00057-2
Kwon, K., Liu, Y. H., & Johnson, L. P. (2014). Group regulation and social-emotional interactions observed in computer supported collaborative learning: Comparison between good vs. poor collaborators. Computers & Education, 78, 185-200. https://doi.org/10.1016/j.compedu.2014.06.004
… peer review of writing. Research in the Teaching of English, 51(2), 134-161.
Ludemann, P. M., & McMakin, D. (2014). Perceived helpfulness of peer editing activities: First-year students' views and writing performance outcomes. Psychology Learning and Teaching, 13(2), 129-136. https://doi.org/10.2304/plat.2014.13.2.129
MacArthur, C., & Philippakos, Z. (2013). Self-regulated strategy instruction in developmental writing: A design research project. Community College Review, 41(2), 176-195. https://doi.org/10.1177/0091552113484580
Magnifico, A. M. (2010). Writing for whom? Cognition, motivation, and a writer's audience. Educational Psychologist, 45, 167-184. https://doi.org/10.1080/00461520.2010.493470
Maurino, P. (2007). Looking for critical thinking in online threaded discussions. Journal of Educational Technology Systems, 35(3), 241-260. https://doi.org/10.2190/P4W3-8117-K32G-R34M
McGraw, K. O., & Wong, S. P. (1996). Forming inferences about some intraclass correlation coefficients. Psychological Methods, 1(4), 390. https://doi.org/10.1037/1082-989X.1.4.390
Min, H. (2005). Training students to become successful peer reviewers. System, 33(2), 293-308. https://doi.org/10.1016/j.system.2004.11.003
Mitchell, R., & Taylor, M. (1979). The integrating perspective: An audience-response model for writing. College English, 41(3), 247-271. https://doi.org/10.2307/376441


Mogey, N., & Hartley, J. (2013). To write or to type? The effects of handwriting and word processing on the written style of examination essays. Innovations in Education and Teaching International, 50(1), 85-93. https://doi.org/10.1080/14703297.2012.748334
Mooney, M., Southard, S., & Burton, C. H. (2014). Shifting from obligatory discourse to rich dialogue: Promoting student interaction in asynchronous threaded discussion postings. Online Journal of Distance Learning Administration, 17(1), 1.
Negretti, R. (2012). Metacognition in student academic writing: A longitudinal study of metacognitive awareness and its relation to task perception, self-regulation, and evaluation of performance. Written Communication, 29(2), 142-179. https://doi.org/10.1177/0741088312438529
Nystrand, M., Greene, S., & Wiemelt, J. (1993). Where did composition studies come from? An intellectual history. Written Communication, 10(3), 267-333.
Ong, W. (1975). The writer's audience is always a fiction. PMLA, 90(1), 9-21. https://doi.org/10.2307/461344
Pajares, F. (2003). Self-efficacy beliefs, motivation, and achievement in writing: A review of the literature. Reading & Writing Quarterly, 19(2), 139-158. https://doi.org/10.1080/10573560308222
Partnership for 21st Century Skills. (2007). Framework for 21st century learning. Retrieved from http://www.p21.org/our-work/p21-framework
Patchan, M. M., Hawk, B., Stevens, C. A., & Schunn, C. D. (2013). The effects of skill diversity on commenting and revisions. Instructional Science, 41(2), 381-405. https://doi.org/10.1007/s11251-012-9236-3
Paulson, E. J., Alexander, J., & Armstrong, S. (2007). Peer review re-viewed: Investigating the juxtaposition of composition students' eye movements and peer-review processes. Research in the Teaching of English, 41(3), 304-335. https://doi.org/10.2307/40171733
Phan, H. P. (2014). Self-efficacy, reflection, and achievement: A short-term longitudinal examination. Journal of Educational Research, 107(2), 90-102. https://doi.org/10.1080/00220671.2012.753860
Raedts, M., Rijlaarsdam, G., van Waes, L., & Daems, F. (2007). Observational learning through video-based models: Impact on students' accuracy of self-efficacy beliefs, task knowledge and writing performances. In S. Hidi & P. Boscolo (Eds.), Writing and Motivation (pp. 219-240). New York.


Saldaña, J. (2012). An introduction to codes and coding. In The Coding Manual for Qualitative Researchers (2nd ed., pp. 1-8). SAGE Publications.
Sanders-Reio, J., Alexander, P. A., Reio, T. G., & Newman, I. (2014). Do students' beliefs about writing relate to their writing self-efficacy, apprehension, and performance? Learning and Instruction, 33, 1-11. https://doi.org/10.1016/j.learninstruc.2014.02.001
Schunk, D. H., & Zimmerman, B. J. (1997). Social origins of self-regulatory competence. Educational Psychologist, 32(4), 195-208. https://doi.org/10.1207/s15326985ep3204_1
Schunn, C., Godley, A., & DeMartino, S. (2016). The reliability and validity of peer review of writing in high school AP English classes. Journal of Adolescent & Adult Literacy, 60(1), 13-23.
Schwartzman, R., & Morrissey, M. (2010). Collaborative student groups and critical thinking in an online basic communication course. In L. Shedletsky & J. E. Aitken (Eds.), Cases on online discussion and interaction: Experiences and outcomes (pp. 39-64). Hershey: Information Science Reference.
Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86(2), 420-428. https://doi.org/10.1037/0033-2909.86.2.420
Stahl, G. (2015). Computer-supported academically productive discourse. In L. Resnick, C. Asterhan, & S. Clark (Eds.), Socializing intelligence through academic talk and dialogue (pp. 213-224). AERA Publications.
… Regulated Learning through an Authentic Assignment. International Journal of Teaching and Learning in Higher Education, 28(2), 271-282. Retrieved from http://www.isetl.org/ijtlhe/
Suthers, D. (2006). Technology affordances for intersubjective meaning making: A research agenda for CSCL. International Journal of Computer-Supported Collaborative Learning, 1(3), 315-337. https://doi.org/10.1007/s11412-006-9660-y
Thiede, K. W., & Anderson, M. C. M. (2003). Summarizing can improve metacomprehension accuracy. Contemporary Educational Psychology, 28(2), 129-160. https://doi.org/10.1016/S0361-476X(02)00011-5
Thomas, A. O., Antonenko, P. D., & Davis, R. (2016). Understanding metacomprehension accuracy within video annotation systems. Computers in Human Behavior, 58, 269-277. https://doi.org/10.1016/j.chb.2016.01.014


Vasileiou, L. (2016). Online peer review across sections. Teaching English in the Two-Year College, 44(1), 90-96.
Vincelette, E. J., & Bostic, T. (2013). Show and tell: Student and instructor perceptions of screencast assessment. Assessing Writing, 18(4), 257-277. https://doi.org/10.1016/j.asw.2013.08.001
Wang, S., & Wu, P. (2008). The role of feedback and self-efficacy on web-based learning: The social cognitive perspective. Computers & Education, 51, 1589-1598. https://doi.org/10.1016/j.compedu.2008.03.004
Ward, I. (1994). Literacy, ideology, and dialogue: Towards a dialogic pedagogy. Albany: State University of New York Press.
Witte, S. P., & Faigley, L. (1981). Coherence, cohesion, and writing quality. College Composition and Communication, 32(2), 189-204. https://doi.org/10.2307/356693
… British Journal of Educational Technology, 42(4), 687-700. https://doi.org/10.1111/j.1467-8535.2010.01059.x
Yuan, J., & Kim, C. (2015). Effective feedback design using free technologies. Journal of Educational Computing Research, 52(3), 408-434. https://doi.org/10.1177/0735633115571929
Zhao, H., Sullivan, K. P. H., & Mellenius, I. (2014). Participation, interaction and social presence: An exploratory study of collaboration in online peer review groups. British Journal of Educational Technology, 45(5), 807-819. https://doi.org/10.1111/bjet.12094
Zimmerman, B. J. (1998). Developing self-fulfilling cycles of academic regulation: An analysis of exemplary instructional models. In Self-regulated learning: From teaching to self-reflective practice (pp. 1-19). New York, NY: Guilford Publications.
Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64-70.
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166-183. Retrieved from http://www.jstor.org/stable/30069464
Zimmerman, B. J., & Bandura, A. (1994). Impact of self-regulatory influences on writing course attainment. American Educational Research Journal, 31(4), 845-862.


Zimmerman, B. J., & Kitsantas, A. (2002). Acquiring writing revision skill: Shifting from process to outcome self-regulatory goals. Journal of Educational Psychology, 94(4), 660-668. https://doi.org/10.1037/0022-0663.91.2.241
Zimmerman, B. J., & Risemberg, R. (1997). Becoming a self-regulated writer: A social cognitive perspective. Contemporary Educational Psychology, 22, 73-101. https://doi.org/10.1006/ceps.1997.0919
Zimmerman, B. J., & Schunk, D. H. (2001). Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.
Zydney, J. M., deNoyelles, A., & Kyeong-Ju Seo, K. (2012). Creating a community of inquiry in online environments: An exploratory study on the effect of a protocol on interactions within asynchronous discussions. Computers & Education, 58(1), 77-87. https://doi.org/10.1016/j.compedu.2011.07.009


BIOGRAPHICAL SKETCH

Audrey Michelle Antee graduated from the University of Louisiana at Monroe. She continued her education, earning a degree in literature in 2004. She worked as a part-time instructor at Florida State College at Jacksonville in 2005, teaching composition and literature face-to-face and online. She also taught for Baker College and Cochise College as an online instructor. In 2011, Audrey was hired as a Professor of English at Florida State College at Jacksonville, where she continues working today. She teaches professional development workshops to colleagues on technology and teaching practices in different modalities and has been involved in online curriculum development at FSCJ. Her research interests include computer-supported collaborative learning in freshman composition, mobile learning in formal and informal learning environments, and student motivation in composition courses.