Abstract
Online peer feedback is an effective instructional strategy to enhance students' learning processes and outcomes. However, the literature lacks a comprehensive understanding of the influential factors that play a key role in the effective implementation of online peer feedback. This systematic review provides an overview of the current state of online peer feedback implementation in higher education contexts and explores the role of students' characteristics and online learning environments in relation to their learning processes and outcomes. To achieve this goal, the PRISMA method was followed, and a coding scheme was developed to create a framework that can guide the implementation of online peer feedback in higher education settings. This framework depicts factors that should be taken into account for effective implementation of online peer feedback in terms of four dimensions: students' characteristics (demographic characteristics, academic background, and personality and psychological features), environmental conditions (learning platform and setting), learning processes (content, feedback activity design, and technology), and learning outcomes including cognitive outcomes (e.g., acquisition of knowledge, comprehension, application, analysis, synthesis, and evaluation), behavioral outcomes (engagement, communication, and teamwork), and affective outcomes (satisfaction, motivation, attitude, self-efficacy, sense of autonomy, and confidence). We conclude this study by discussing the framework, limitations, and ideas for future research and practice.
1 Introduction
Feedback is a critical component of students' learning and performance (Hattie & Timperley, 2007) that has become increasingly important in higher education (Maringe, 2010). Due to increasing teaching workloads and growing class sizes in higher education (Shi, 2019), peer feedback has become a crucial form of feedback for this sector (Noroozi et al., 2023; Cho & Schunn, 2007). Peer feedback is an effective instructional strategy to support students' learning processes and outcomes at a large scale (Er et al., 2021; Noroozi et al., 2016, 2023; Taghizadeh Kerman et al., 2022a). Implementing peer feedback in classrooms not only helps teachers activate students' engagement but also helps students broaden and deepen their understanding of the topic (Bayat et al., 2022; Noroozi et al., 2022).
The literature reveals that peer feedback has positive impacts on students' learning, such as improving professional skills (Brill, 2016; Lowell & Ashby, 2018), enhancing writing performance (Huisman et al., 2018; Nelson & Schunn, 2009; Noroozi et al., 2023; Shang, 2019), and fostering argumentation skills (Noroozi & Hatami, 2019). Peer feedback also provides opportunities for active interactions and meaningful negotiations (Al Qunayeer, 2020), and improves judgment skills, decision-making skills (Bayat et al., 2022), self-regulation skills (Ku & Lohr, 2003), and communication skills (Ritzhaupt & Kumar, 2015).
In parallel with the growing popularity of online modalities in higher education, the implementation of online peer feedback has increased exponentially over the last decade due to its convenience, flexibility, and accessibility (Latifi et al., 2021; Noroozi et al., 2016; Taghizadeh Kerman et al., 2022a). Online tools have provided an effective, time-saving, and easy way to set up peer feedback activities, particularly in classes with a large number of students (Er et al., 2021; Noroozi et al., 2016; Latifi et al., 2021). When implemented as an online activity, learners can take advantage of the flexibility to choose when and where they want to participate in the feedback tasks (Tsai et al., 2002). Additionally, the data collected from students’ online peer feedback activities can be recorded, revisited, and reflected upon for a better understanding of the feedback processes and any emerging issues (Banihashem et al., 2022a; Er et al., 2021).
Although peer feedback offers numerous benefits for students' learning and performance, its application in higher education is not without challenges (see Cho et al., 2006; Noroozi et al., 2016, 2023). Some of these challenges concern students’ attitudes toward and perceptions of peers and their feedback, such as low levels of trust, low tolerance for critical feedback, and resistance to it (Hu & Lam, 2010; Panadero & Alonso-Tapia, 2013). Moreover, issues in the implementation of peer feedback can arise from students’ inadequate skill and knowledge levels, including limited feedback literacy, unfamiliarity with criteria, limited experience with providing and receiving feedback (Winstone et al., 2017), insufficient specialized knowledge and literacy about the topic (Van Zundert et al., 2010; Valero Haro et al., 2019, 2023), and weak writing and language skills (Allen & Mills, 2016; Lundstrom & Baker, 2009). Another significant challenge frequently mentioned in the literature is the complexity of the feedback task, which requires higher-order thinking skills (Er et al., 2021; Zhu & Carless, 2018) and may not be properly handled by all students. If not addressed, these challenges can result in superficial feedback or impede the effective implementation and uptake of peer feedback.
Many theoretical models and frameworks of peer feedback have been proposed in the literature (e.g., Panadero & Lipnevich, 2022; Wu & Schunn, 2023), which may help to tackle these challenges. These models and frameworks aim to clarify how students engage in peer feedback activities, how they analyze and process feedback, and how such feedback from peers is incorporated into the revised works of students. However, there is a need for greater clarity regarding the operationalization of these models and frameworks in real educational contexts. Identifying factors that can influence the processes and outcomes of peer feedback could help teachers implement effective peer feedback activities in their classrooms (Cui et al., 2022). It is particularly important to examine how the unique characteristics of students and learning environments impact their engagement during peer feedback processes and how this engagement affects their learning outcomes. While several systematic reviews have been conducted in the field of peer feedback (e.g., Topping, 2021; Zhang et al., 2021), they differ from the present review study in terms of scope and focus. Our systematic review takes a comprehensive approach to examine the role of students' characteristics, learning environment, learning processes, and outcomes of online peer feedback in higher education.
2 Conceptualizing the review
We adopted Biggs’ (2003) model as the basis for conceptualizing our review. Biggs’ model provided a framework that helped identify the critical dimensions to be addressed in our review, ultimately yielding practical results for teachers. This model entails four dimensions, including (a) student characteristics, (b) learning environment, (c) learning processes and activities, and (d) learning outcomes, which fit well with the aim of our peer feedback study. In Biggs’ (2003) model, students’ characteristics refer to prior knowledge, abilities, intelligence, personality, and background, which together represent students' incoming personal learning influences. These characteristics differ from one person to another, inevitably resulting in different performance. In the case of peer feedback, students' characteristics such as attitude, motivation, and gender may affect peer feedback processes and outcomes (e.g., Lane et al., 2018). The learning environment includes different features such as instructional mode, subject area, course structure, and learning tasks. Although the literature confirms the impacts of online learning environments on peer feedback performance (e.g., Lin, 2016, 2018a; Noroozi & Mulder, 2017), it does not specify how different elements of learning environments can influence the design and implementation of peer feedback. There is a need to provide an overview of the impacts of different elements of online learning environments on students’ peer feedback performance. Learning processes and activities explain how students approach learning and what strategies and techniques they follow to learn. It is necessary to identify and understand the learning processes involved in peer feedback engagement to better understand student behavior. Finally, learning outcomes are the last dimension of Biggs’ model.
Providing an overview of the learning outcomes obtained through online peer feedback can help teachers know for what purposes and for what kinds of learning outcomes online peer feedback can assist them. In general, the learning outcomes attained by students can be classified into three overarching domains, a classification rooted in Bloom's Taxonomy (1956). First, the affective domain covers the feelings, perceptions, and emotions that students experience when engaging with online peer feedback. Second, the cognitive domain covers knowledge acquisition and the cultivation of intellectual proficiencies throughout the learning process; it encompasses six progressively complex levels, starting from foundational knowledge and comprehension, extending to application, analysis, and synthesis, and culminating in evaluation. Third, behavioral outcomes pertain to the observable actions, behaviors, or tangible responses that students exhibit as a result of their engagement with online peer feedback. Grounding students’ learning outcomes from online peer feedback in Bloom’s Taxonomy (1956) furnishes an organized framework for understanding these outcomes and enriches their interpretation with pedagogical insights.
By taking all four dimensions of Biggs’ (2003) model into account, our systematic review provides a general framework for teachers on how to effectively account for students’ characteristics within an optimal learning environment, engage students in desirable peer feedback activities, and achieve intended learning outcomes. The following research questions are formulated to achieve the main goal of this review study:
- RQ1. What are the students’ characteristics that influence online peer feedback in higher education?
- RQ2. How do the conditions of the learning environment impact online peer feedback in higher education?
- RQ3. What are the learning processes and activities that influence online peer feedback in higher education?
- RQ4. How does online peer feedback influence the learning outcomes in higher education?
3 Method
We first followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) method (Moher et al., 2009) to systematically review the literature. Then, we used a quality appraisal strategy to refine the identified publications (Theelen et al., 2019). Finally, we developed a coding scheme based on Biggs’ (2003) model to analyze the publications included in the final selection.
3.1 Search strategy
To find relevant publications, Web of Science (WOS) and Scopus were first selected as the main databases, since these two cover almost all relevant publications. Second, we defined our search query: (improve* OR develop* OR foster* OR promot* OR support* OR enhance* OR train*) AND (“peer feedback” OR “peer review” OR “peer assessment” OR “peer learning”) AND (“higher education*” OR university* OR college* OR academy* OR “tertiary* education*”) AND (online* OR electronic* OR internet* OR computer* OR “e-learning*” OR virtual* OR “web*-based”). All publications from WOS and Scopus were imported into EndNote X9.0 reference management software for further analysis.
3.2 Inclusion and exclusion criteria
To screen the identified publications, three primary inclusion and exclusion criteria were first applied: (1) only peer-reviewed publications in the English language were included; (2) only publications from 2000 to 2023 were included; and (3) only empirical articles were included. This means that book chapters, proceedings, reports, dissertations, and conceptual articles were excluded. In the second phase of screening, we selected only empirical studies with intervention designs to obtain more valid and reliable findings. This means that non-experimental studies, analytical research, and studies that only reported qualitative or descriptive results were excluded. We also focused exclusively on studies undertaken in higher education contexts; therefore, studies in the context of K-12 education were excluded. In addition, we only included studies conducted in online learning environments, which means that other learning settings such as blended, hybrid, or face-to-face education were excluded.
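Purely as an illustration (the screening in this review was carried out manually in EndNote, not programmatically), the primary criteria above can be expressed as a simple record filter. The field names (peer_reviewed, language, year, pub_type) are hypothetical:

```python
# Hypothetical record filter mirroring the three primary inclusion criteria.
# All field names are illustrative, not part of any real database export.
def meets_primary_criteria(record: dict) -> bool:
    return (
        record.get("peer_reviewed", False)           # criterion 1: peer-reviewed
        and record.get("language") == "English"      # criterion 1: English only
        and 2000 <= record.get("year", 0) <= 2023    # criterion 2: 2000-2023
        and record.get("pub_type") == "empirical"    # criterion 3: empirical articles
    )

# Example: a 1999 dissertation fails on two grounds (year and type).
print(meets_primary_criteria(
    {"peer_reviewed": True, "language": "English",
     "year": 1999, "pub_type": "dissertation"}))     # → False
```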
3.3 Identification of relevant publications
The initial search yielded a total of 2221 papers (WOS: N=639, Scopus: N=1582). After an initial screening, 368 articles were removed as duplicates. Then, 1362 publications did not meet the secondary inclusion criteria, which left us with 491 papers for full-text screening. Full-text screening resulted in the exclusion of 386 publications because they were not conducted in higher education contexts or in online learning settings. Finally, 105 studies were left for quality appraisal.
3.4 Quality appraisal
We used the quality appraisal framework proposed by Theelen et al. (2019), which includes a checklist for the critical appraisal of both quantitative and qualitative studies. Each publication received a score for each checklist question, ranging from zero (not mentioned) to three (extensively mentioned); if the mean score was two or higher, the article met the required quality for inclusion. We found that 22 studies did not meet the minimum criteria, leaving 83 studies for the final analysis (Table 1). The stages of our screening and selection process are illustrated in Fig. 1.
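The screening and appraisal counts reported above can be sanity-checked with simple arithmetic:

```python
# Sanity check of the screening and appraisal counts (Sections 3.3-3.4).
identified = 639 + 1582               # WOS + Scopus search results
after_duplicates = identified - 368   # 368 duplicates removed
full_text = after_duplicates - 1362   # 1362 failed secondary inclusion criteria
appraised = full_text - 386           # 386 excluded at full-text screening
included = appraised - 22             # 22 excluded by quality appraisal

print(identified, full_text, appraised, included)  # → 2221 491 105 83
```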
3.5 Included publications
Out of the final pool of 83 selected publications for analysis (Table 2), the largest number in a single year were published in 2020 (N=12, 15%).
These papers were published in a wide range of scholarly journals, from writing research to information technology. We found 8 publications (10%) in Interactive Learning Environments, 7 publications (8%) each in Computers and Education and in Assessment and Evaluation in Higher Education, 5 publications (6%) in the Internet and Higher Education, and 3 publications (4%) in Computers in Human Behavior. The selected publications were geographically diverse, with 30 publications from Taiwan (36%), followed by 11 from the Netherlands (13%), 9 from the United States (11%), 6 from China (7%), 4 from Iran (5%), and 3 from Spain (4%).
The most common research design among the selected publications was experimental design (N=49, 59%), followed by quasi-experimental design (N=25, 30%). The study context varied from medicine to statistics, but studies in education science contexts were found to be dominant (N=21, 25%). In terms of online platforms, the selected publications used Wiki (N=5, 6%), Blackboard (N=4, 5%), Brightspace (N=3, 4%), Facebook (N=3, 4%), KnowCat (N=2, 2%), and mobile apps (N=2, 2%). This diversity suggests that online peer feedback has been implemented in various online learning environments, and researchers have investigated the impact of different platforms on the peer feedback process and outcomes.
3.6 Analytic strategy
A coding scheme was developed based on Biggs’ (2003) model to thematically analyze the included publications and address the research questions (Table 3). The coding scheme consisted of four dimensions: students' characteristics, learning environment conditions, learning processes and activities, and learning outcomes. All 83 publications were analyzed and coded with this scheme in ATLAS.ti 9 (Friese, 2019), and inter-rater reliability between the two coders was examined by randomly selecting and coding a sample of papers. Cohen's kappa indicated substantial agreement between the two coders (κ=0.84, p<0.001), suggesting high consistency and reliability of the coding. We followed a deductive approach to group the identified codes: we started from a theoretical lens to categorize basic codes and observations, and during coding this approach confirmed or rejected the propositions, allowing for a structured analysis (Moradian et al., 2014; Rauss & Pourtois, 2013).
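For readers who wish to reproduce this kind of reliability check, Cohen's kappa can be computed from two coders' label sequences as follows. The labels below are invented for illustration and are not data from this study:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of items on which the two coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal label frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to ten text fragments
a = ["cognitive", "affective", "behavioral", "cognitive", "affective",
     "cognitive", "behavioral", "affective", "cognitive", "cognitive"]
b = ["cognitive", "affective", "behavioral", "cognitive", "cognitive",
     "cognitive", "behavioral", "affective", "cognitive", "affective"]
print(round(cohens_kappa(a, b), 2))  # → 0.68
```

Note that kappa is not the raw percentage of agreement; it discounts agreement expected by chance, which is why κ = 0.84 is stronger evidence of reliability than an 84% raw-agreement figure would be.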
4 Results
4.1 RQ1. What are the students’ characteristics that influence online peer feedback in higher education?
We identified 36 codes for students’ characteristics that influence online peer feedback in higher education. We categorized the codes into three main categories: demographic characteristics, academic background, and personality and psychological features (Table 4).
Demographic characteristics
Eight of the reviewed publications explored the role of student demographic characteristics, including gender (Noroozi et al., 2020, 2022), language (Culver et al., 2022), parental education (Culver et al., 2022), and race/ethnicity (Culver et al., 2022). Culver et al. (2022) found that language, parental education, and race/ethnicity did not predict students’ performance in a peer-reviewed lab activity. Among the demographic characteristics, other studies reported gender as having a more significant influence on online peer feedback activities. In particular, there were significant differences between females and males in terms of negative sentiment comments (Lane et al., 2018) and peer feedback quality (Slee & Jacobs, 2017). Female students tended to produce higher-quality feedback (Noroozi et al., 2020, 2022; Slee & Jacobs, 2017) while providing negative comments with more caution (Lane et al., 2018). However, male students produced higher-quality argumentative essays than females based on peer feedback (Noroozi et al., 2020). A possible explanation is that females tend to be more collaborative and communicative, which may translate into more detailed and constructive feedback; they may also be more likely to engage in social comparison and evaluation processes, which could enhance their ability to provide feedback that is sensitive to the needs and perspectives of others. Males, on the other hand, may be more competitive and goal-oriented, which may motivate them to improve their writing and argumentation skills in response to feedback; they may also be more confident in their writing abilities and more willing to take risks, which could lead to greater creativity and effectiveness in their writing. As a result, while the impact of demographic variables on peer feedback may vary depending on the specific characteristic being considered, gender appears to be a consistently significant factor.
Academic background
Seven of the reviewed publications explored the role of students' academic backgrounds, including their online education experience, the type of high school they graduated from (Altinay, 2016), education level (Slee & Jacobs, 2017), field of study, feedback experience (Cheng & Hou, 2015), presentation ability (Day et al., 2021), and writing proficiency (Jiang & Yu, 2014; Yang & Meng, 2013). There were no significant differences in the mean grades of students at different education levels (Slee & Jacobs, 2017). Additionally, there were no significant differences in the argumentation ability and conceptual understanding of students in different fields of study. However, there were meaningful differences between higher education students depending on the type of high school (Science High School, Vocational High School, Social Science High School, Anatolian High School, or Regular High School) they graduated from and their experience with distance education, specifically in collaborative learning or peer learning. Graduates of science high schools and participants with distance education experience reported more positive perceptions and experiences of online peer learning and assessment during collaborative learning (Altinay, 2016). Overall, the reviewed studies suggest that students' academic backgrounds can influence their peer feedback processes and outcomes in online learning settings, although the impact may vary depending on the specific aspect of the academic background being considered.
Personality and psychological features
Seven of the reviewed publications explored the role of personality and psychological features, including emotions (Cheng et al., 2014), epistemic beliefs (Noroozi & Hatami, 2019; Tsai & Liang, 2007), motivation (Tseng & Tsai, 2010), perceptions (Day et al., 2021; Jiang & Yu, 2014), and self-efficacy (Day et al., 2021; Tseng & Tsai, 2010). Studies have reported varying results on the impact of these features on peer feedback processes and outcomes. For example, Cheng et al. (2014) found that students' participation in the peer assessment activity was influenced by their emotional responses: students who experienced positive emotions were more likely to participate actively and provide high-quality feedback, whereas students who experienced negative emotions were more likely to avoid the activity or provide superficial feedback. Tsai and Liang (2007) showed that students with more constructivist-oriented epistemic beliefs might benefit more from peer feedback. Tseng and Tsai (2010) found that students with higher intrinsic motivation tended to have greater confidence in evaluating peers' work, receiving peers' opinions, and reacting to peers' feedback. Day et al. (2021) noted that students' perceptions of peer feedback can affect their engagement and motivation to improve their presentation skills. Overall, the reviewed studies suggest that students with different personality traits behave in different ways when receiving peer feedback, and thus achieve different outcomes.
4.2 RQ2. What are the learning environment conditions that influence online peer feedback in higher education?
In total, we identified 39 codes representing learning environment conditions that influence online peer feedback in higher education. We grouped these codes into two main categories including learning platform and learning setting (Table 5).
Learning platform
The reviewed publications on online peer feedback have explored a variety of learning platforms for implementing peer feedback, including Expertiza (Hoffman, 2019), Blackboard (Ismaeel, 2020), Wiki (Al Abri et al., 2021; Xiao & Lucking, 2008), the Adobe Connect program (Altinay, 2016), Google Apps (Slee & Jacobs, 2017), Google Docs, Sakai VLE, and Sakai Wiki (Canham, 2018), and Calibrated Peer Review software (Culver et al., 2022). Peer feedback on wikis, for example, was shown to facilitate improvements in essay writing and in the quality of peer feedback content (Al Abri et al., 2021; Gielen & De Wever, 2015; Xiao & Lucking, 2008). Calibrated Peer Review software was developed to offer a student-centered approach to process-based writing while minimizing the role of instructors in providing feedback (Culver et al., 2022). Moreover, the quality of the online learning environment, the collaborative and socially constructive effort of peers, and the assessment of the resulting progress were found to be important for enhancing the motivation and involvement of students in learning and skills development (Pifarré et al., 2014). Additionally, the visualization of group awareness information in the KnowCat platform positively influenced students' collaborative behavior (Pifarré et al., 2014). Overall, the type of learning platform used to implement online peer feedback is an important component of peer feedback processes and outcomes in online learning settings.
Learning setting
The reviewed publications on online peer feedback have explored features of the environment in which peer feedback is implemented, including context, team, and learning culture. Altinay (2016) found meaningful differences between contexts in terms of peer learning: students in the arts and sciences context perceived a collaborative peer learning task more positively than students in the communication, engineering, and technology contexts, due to differences in task complexity, disciplinary culture, and prior experience. Moreover, learners' experiences and perceptions of the online learning culture were essential in creating quality education through peer feedback (Donia et al., 2021; Pham et al., 2020). Furthermore, Agrawal and Rajapakse (2018) found that diverse teams with members from different academic disciplines provided more valuable feedback. The effectiveness of peer feedback in mixed academic teams may be influenced by a variety of factors, including the communication skills of team members and their ability to take each other's perspectives and engage in critical reflection. Overall, the features of the environment, including context, team composition, and learning culture, are important factors for the quality of peer feedback processes and outcomes in online learning settings.
4.3 RQ3. What are the learning processes and activities that constitute online peer feedback in higher education?
We identified a total of 107 codes that represented the learning processes and activities of online peer feedback implementation. We grouped the codes into three main categories: content, feedback activity design, and technology. These categories lie at the heart of learning and teaching with technology (Koehler & Mishra, 2009) (Table 6).
Content
The reviewed publications on online peer feedback have explored the type and quality of feedback that influence peer feedback processes. In terms of feedback type, Çevik (2015) found that both assessors and assessees improved their problem-solving skills. Regarding feedback quality, Tsai et al. (2002) found a positive relationship between the quality of peer feedback received and assessees' performance. Students who perceived peer feedback as accurate and useful were more likely to use the comments from peers to improve their reviewed work (Wang et al., 2019). The quality of the feedback is therefore crucial in shaping peer feedback processes. Feedback should be rich in content and exhibit good features such as being affective, constructive, timely, and detailed, and containing problem identification and problem justification (Taghizadeh Kerman et al., 2022a). The better the quality of the feedback, the more likely students are to take it seriously and act on it to improve their work.
Feedback activity design
The reviewed publications on online peer feedback have explored various design considerations related to peer feedback processes in higher education. These considerations include whether peer feedback should be voluntary or obligatory (Liu et al., 2019), whether it should be given anonymously or not (Lane et al., 2018; Lin, 2018a), the number of rounds of peer feedback (Chen et al., 2020; Lai et al., 2020), and the role students take in peer feedback, such as assessor or assessee (Day et al., 2021; Çevik, 2015). For example, findings indicate that voluntary peer feedback can lead to more accurate scores (peer rater accuracy) for the final task (Liu et al., 2019), and a collaborative team of reviewers can produce higher-quality feedback than individual reviewers (Mandala et al., 2018). Peer feedback can also improve problem-solving skills and reasoning abilities for both assessors and assessees (Çevik, 2015; Patchan et al., 2018). Furthermore, more rounds of peer assessment can lead to improved writing performance and validity of peer scores (Liang & Tsai, 2010). Peer feedback training has also been found to have positive effects on writing improvement (Jiang & Yu, 2014) and text revisions (Yang & Meng, 2013), although no significant increases were observed in student assessment knowledge when participating in peer assessment training (Hoffman, 2019). Peer scoring and commenting tasks as part of peer feedback activity can improve students' performance (Chen et al., 2020; Hsia et al., 2016; Xiao & Lucking, 2008). When online peer feedback is provided anonymously, it has demonstrated the potential to enhance students' essay writing performance, as evidenced in the context of EFL learning (Al Abri et al., 2021), high-quality cognitive feedback (Liu et al., 2019), and constructive feedback (Basheti et al., 2010).
Anonymity in online peer feedback can be useful because it encourages honesty and openness, reduces bias and social pressure, and promotes constructive feedback focused on helping the recipient improve. Female peer reviewers were found to be more influenced by anonymity than their male counterparts, producing more negative comments in their feedback (Lane et al., 2018). However, Liu and Zhang (2017) found no significant differences between anonymous and identified discussion groups in terms of writing quality. Moreover, the use of worked examples, including a typical answer model of a high-quality argumentative essay, has been found to improve the quality of argumentative essay writing and facilitate the acquisition of domain-specific knowledge (Latifi et al., 2020; Valero Haro et al., 2019). Overall, feedback activity design considerations can significantly influence the outcomes of online peer feedback in higher education.
Technology
The reviewed publications on online peer feedback have explored various technological innovations that can facilitate peer feedback processes, including synchronous or asynchronous online discussions (e.g., Liu et al., 2017; Zheng et al., 2018), video peer assessment (Ge, 2019), and video annotation (Lai et al., 2020; Lai, 2016). Synchronous peer assessment discussions were found to elicit interaction between basic and advanced cognitive dimensions, which may be valuable in developing cognitive abilities, improving writing (Liu et al., 2017; Zheng et al., 2018), and promoting affective and meta-cognitive feedback quality, meta-cognitive awareness, and self-efficacy (Zheng et al., 2018). Additionally, asynchronous discussion environments were shown to improve students' argumentation performance and conceptual understanding. The use of video feedback and video annotation was found to be effective in improving e-learners' translation performance and the effectiveness of online peer assessment (Ge, 2019; Lai et al., 2020; Lai, 2016). Furthermore, reviewed publications have shown that technology-supported learning strategies can lead to improved learning. These strategies include mobile-supported peer assessment (Chang & Lin, 2020; Kuo et al., 2017), blog-supported peer feedback (Rahmany et al., 2013; Yeh et al., 2019), Facebook-based online peer assessment with micro-teaching (Lin, 2016), and web-based alternatives (Ismaeel, 2020). For instance, the use of mobile phones in peer assessment can promote students' learning interests, motivation, and self-efficacy (Kuo et al., 2017). Blog-supported peer feedback can improve students' speaking and writing skills (Yeh et al., 2019; Rahmany et al., 2013). The use of argumentative peer feedback scripts and text-based digital learning modules can enhance the quality of students' written argumentative essays (Noroozi & Hatami, 2019; Noroozi et al., 2016).
Additionally, feedback and feedforward support in the form of prompts can improve peer learning processes, argumentative essay quality, and domain-specific learning (Latifi et al., 2021). Overall, the affordances of technology in online peer feedback can enhance the effectiveness and efficiency of peer feedback processes, leading to improved learning outcomes.
4.4 RQ4. What are the learning outcomes of online peer feedback in higher education?
We identified a total of 165 codes representing learning outcomes of online peer feedback implementation in higher education. We categorized these outcomes into three categories based on Bloom’s Taxonomy (1956): cognitive outcomes (104 codes; see Table 7), behavioral outcomes (16 codes; see Table 8), and affective outcomes (45 codes; see Table 9).
Cognitive outcomes
Cognitive outcomes relate to the acquisition of knowledge, comprehension, application, analysis, synthesis, and evaluation (Bloom, 1956); the findings below are reported under these six categories.
Knowledge
Twenty studies explored knowledge outcomes such as domain-specific or domain-general knowledge (e.g., Latifi et al., 2020, 2021; Noroozi & Hatami, 2019) and assessment knowledge (Hoffman, 2019). These studies have found that various approaches, such as a combination of worked examples and scripting (Latifi et al., 2021; Valero Haro et al., 2019), guided peer feedback (Noroozi & Mulder, 2017), feedback and peer feedforward support (Latifi et al., 2021), mobile-supported learning (Chang & Lin, 2020), the use of awareness tools in KnowCat (Pifarré et al., 2014), and the rating-plus-qualitative-feedback mode (Hsia et al., 2016), can facilitate the acquisition of domain-specific or domain-general knowledge.
Comprehension
Five studies investigated comprehension outcomes in online peer feedback (e.g., Gielen & De Wever, 2015; Zhan, 2020). The selected studies have identified comprehension outcomes such as conceptual understanding, elaboration (Gielen & De Wever, 2015), and the ability to justify (Zhan, 2020). Peer feedback activities in asynchronous discussion environments were found to promote students' conceptual understanding, while structured peer assessment was shown to improve the quality and focus of peer feedback elaborations (Gielen & De Wever, 2015) and students' ability to justify their arguments with credible evidence (Zhan, 2020).
Application
Fifty studies have explored application outcomes in online peer feedback, including writing (e.g., Culver et al., 2022; Latifi et al., 2021), feedback performance (e.g., Chen et al., 2020; Day et al., 2021), problem-solving (e.g., Chang et al., 2015; Çevik, 2015), and dance performance (Hsia et al., 2016). The reviewed publications have demonstrated that various approaches, such as structured peer assessment (Tsai & Chuang, 2013), argumentative peer feedback scripts (e.g., Noroozi & Hatami, 2019; Noroozi et al., 2020), and an online discourse community (Luhach, 2020), can improve argumentative essay writing. Additionally, online peer feedback with Total Quality Management (TQM) (Chang et al., 2015), the roles students take in peer feedback (assessors and assessees) (Çevik, 2015), and peer learning experiences (Altinay, 2016) have been shown to facilitate problem-solving within an active, social process.
Analysis
Six studies investigated analysis outcomes in online peer feedback. These studies have identified various analysis outcomes, such as argumentation skills, reflective thinking (Chen et al., 2009; Pham et al., 2020), and critical thinking (e.g., Altinay, 2016; Zhan, 2020). Chen et al. (2009) and Pham et al. (2020) showed that online peer feedback can enhance students' reflective thinking skills. Additionally, Liu et al. (2001) and Zhan (2020) demonstrated that online peer feedback can promote students' critical thinking abilities.
Synthesis
Three studies explored synthesis outcomes in online peer feedback (Chang et al., 2015; Liu et al., 2001). The reviewed publications on online peer feedback have identified various synthesis outcomes, such as design skills (Chang et al., 2015) and planning skills (Liu et al., 2001). Chang et al. (2015) found that using online peer feedback with TQM can enhance design skills, while Liu et al. (2001) demonstrated that web-based peer assessment can promote planning skills among students.
Evaluation
Three studies explored evaluation outcomes in online peer feedback (e.g., Hoffman, 2019; Liu et al., 2019). The reviewed publications on online peer feedback have evaluated various outcomes, such as assessment skills (Liu et al., 2019) and meta-cognitive awareness (Liu et al., 2001; Zheng et al., 2018). Liu et al. (2019) found that students who participated in voluntary group feedback provided more accurate scores (i.e., peer rater accuracy) than those in the compulsory group. Zheng et al. (2018) showed that synchronous discussion had a significant positive impact on improving meta-cognitive awareness. Liu et al. (2001) also found that structured peer assessment can enhance students' monitoring and regulation.
Behavioral outcomes
Behavioral outcomes refer to the level of student engagement, communication, and teamwork in learning activities that results from involvement in peer feedback activities. These outcomes are categorized into engagement, communication, and teamwork (Table 8).
Engagement
Seven studies explored learners’ engagement in online peer feedback (e.g., Lin, 2019; Yuan & Kim, 2017). Research suggests that using the rating-plus-qualitative-feedback mode (Hsia et al., 2016) and collaborative review (Mandala et al., 2018) can enhance students' participation in online learning activities. Additionally, Cheng et al. (2014) found that students who responded more frequently tended to participate more actively and express more positive emotions in response to their peers' positive comments or neutral questions. Similarly, Su et al. (2022) showed that using group awareness tools can enhance student engagement with online peer feedback in collaborative language learning activities.
Communication
Three studies, including Altinay (2016), Lai (2016), and Lai et al. (2020), examined the impact of online peer feedback on learners' communication skills. These studies found that the number of feedback rounds combined with video annotation (Lai, 2016; Lai et al., 2020) and collaborative learning (Altinay, 2016) were particularly effective in promoting the development of communication skills.
Teamwork
Three studies, including Chang et al. (2015), Donia et al. (2021), and Altinay (2016), examined the impact of online peer feedback on learners' teamwork skills. While online peer feedback with TQM and collaborative learning were found to improve teamwork skills according to Chang et al. (2015) and Altinay (2016), respectively, Donia et al. (2021) found no significant direct effect on teamwork.
Affective outcomes
Affective outcomes refer to the quality of students’ perceptions of their learning resulting from online peer feedback implementation. To identify the aspects of affective learning outcomes in online peer feedback, students’ satisfaction, perceptions, motivation, attitude, self-efficacy, sense of autonomy, and confidence have been examined (Table 9).
Satisfaction
Satisfaction was measured in five studies and revealed participants’ positive evaluation of online peer feedback implementation (e.g., Donia et al., 2021; Noroozi & Mulder, 2017). For example, some studies measured students’ satisfaction with the digital learning module with guided peer feedback (Noroozi & Mulder, 2017), anonymity (Liu et al., 2017), and the rating-plus-qualitative-feedback (Xiao & Lucking, 2008). These studies suggest that students generally have high satisfaction with online peer feedback implementation when provided with certain conditions, such as guided feedback, anonymity, and rating-plus-qualitative-feedback mode.
Perception
Twelve studies explored students’ experiences of learning with online peer feedback, including the perceived collaborative task (Mandala et al., 2018), perceived fairness (Lin, 2018a), perceived usefulness (Kuo et al., 2017), perceived learning outcomes (e.g., Lin, 2016, 2018a; Noroozi & Mulder, 2017), and perceived ease of use (Kuo et al., 2017; Ge, 2019). For example, some studies showed that students in the anonymous group (Lin, 2016, 2018a; Basheti et al., 2010), with guided peer feedback (Noroozi & Mulder, 2017), and with video peer assessment (Ge, 2019) perceived that they had learned more from peer feedback activities compared to other groups.
Motivation
Eight studies explored students’ motivation in online peer feedback settings (e.g., Chen et al., 2020; Noroozi & Mulder, 2017). For example, some studies (Chen et al., 2020; Hsia et al., 2016) measured students' motivation with the rating-plus-qualitative-feedback mode of peer feedback and found that students expressed higher motivation when provided with this type of feedback. Kuo et al. (2017) found that mobile-supported peer feedback also increased student motivation, while Noroozi and Mulder (2017) found that the digital learning module with guided peer feedback improved student motivation. Overall, these studies suggest that certain conditions in online peer feedback settings can increase student motivation.
Attitude
Fourteen studies explored students’ attitudes toward online peer feedback (e.g., Noroozi & Hatami, 2019; Wang et al., 2019). For example, reviewed publications showed that online peer feedback with TQM (Chang et al., 2015), mobile-supported peer assessment (Kuo et al., 2017), scripting (Noroozi & Hatami, 2019), guided peer feedback (Noroozi & Mulder, 2017), blog-supported feedback (Rahmany et al., 2013), and structured peer assessment (Wang et al., 2019) led to attitudinal change towards online peer feedback.
Self-efficacy
Five studies measured students’ self-efficacy after online peer feedback (e.g., Ismaeel, 2020; Zheng et al., 2018). For example, reviewed publications showed that synchronous discussion (Zheng et al., 2018), mobile-supported peer feedback (Kuo et al., 2017), structured peer assessment (Wang & Wu, 2008), and web-based alternatives (Ismaeel, 2020) have positive effects on students’ academic self-efficacy.
Confidence
Confidence has been rarely examined in relation to peer feedback. Altinay (2016) found that online peer feedback programs increase students' confidence by empowering them to take ownership of their learning.
5 Discussion
In this section, we discuss the main elements of online peer feedback: students' characteristics, learning environments, learning processes and activities, and learning outcomes. Under each main element, we also explain its more detailed dimensions.
Researchers have explored various aspects of students' characteristics in relation to online peer feedback, including personality traits, emotions, epistemic beliefs, motivation, perceptions, and self-efficacy. These factors shape how students give and receive feedback, as well as their engagement and learning outcomes in online peer feedback activities. Reviewed publications showed how demographic characteristics, such as age and gender, can influence students' engagement and outcomes in online peer feedback activities. While gender has been a primary focus in many studies examining the relationship between demographic factors and online peer feedback, the effects of other factors, such as age, nationality, and language, have not been extensively explored. The studies that have examined them, however, suggest that demographic factors beyond gender can also play a role in shaping students' behaviors and outcomes in online peer feedback activities. Numerous studies have explored the relationship between academic backgrounds and outcomes in online peer feedback activities. For instance, Cho and Schunn (2007) found that students' prior experience with peer feedback was related to their feedback quality and learning outcomes in an online writing task. Similarly, Li et al. (2021) found that students' educational level and prior knowledge were related to their perceptions and use of peer feedback in online learning environments. These findings suggest that academic backgrounds, including factors such as high school graduation, educational level, prior experience, and knowledge, are critical considerations that scholars, educators, and educational designers must take into account when implementing peer feedback in online learning environments.
Our study aligns with previous research that emphasizes the significance of students' characteristics in the online peer feedback processes (Banihashem et al., 2023; Noroozi et al., 2022). Given the significant impact of students' characteristics, including their academic backgrounds, on their engagement with online peer feedback activities, educators, scholars, and instructional designers must recognize and address these factors (Li et al., 2021). Previous research has shown that students' prior experience, educational level, and knowledge are critical factors when implementing online peer feedback activities (Cho & Schunn, 2007; Li et al., 2021). Therefore, designing feedback activities that are tailored to students' specific needs and backgrounds may enhance their engagement, motivation, and learning outcomes. Additionally, recognizing the diversity of students' academic backgrounds and providing opportunities for peer feedback in different formats and languages may help create a more inclusive and equitable learning environment (Li et al., 2021).
Our review revealed that various platforms have been used to implement online peer feedback, with Wiki, Blackboard, KnowCat, Facebook, and mobile apps being the most commonly utilized (Li et al., 2021). This finding is consistent with prior research that emphasizes the significance of learning technologies in the implementation of online peer feedback (e.g., Chang & Lin, 2020; Gielen & De Wever, 2015). The choice of technology can have a significant impact on the effectiveness of online peer feedback activities, as different platforms may have different features, functionalities, and affordances that influence students' engagement and learning outcomes (Noroozi & Hatami, 2019; Shang, 2019). As such, it is essential to consider the characteristics and affordances of the technology when designing and implementing online peer feedback activities to ensure optimal outcomes for students' learning and performance (Li et al., 2021). For example, a Wiki platform may be more suitable for collaborative writing tasks, while a mobile app may be more effective for providing feedback on multimedia projects.
The learning environment in online peer feedback is not limited to the platform used but also includes other factors such as culture, faculty, and teamwork, which can influence learning outcomes (Donia et al., 2021; Pham et al., 2020). Cultural factors such as language proficiency and communication styles can impact the effectiveness of online peer feedback activities, highlighting the importance of ensuring that feedback prompts and instructions are clear and easily understood by all students. Additionally, faculty support, including training and guidance on how to provide and receive feedback, can enhance students' engagement and the quality of their feedback. Students may receive different levels of support in different learning communities or settings, which can affect their actions and reactions during the online peer feedback process, ultimately leading to varying learning outcomes (Kuo et al., 2017). For example, students in a supportive and collaborative learning community may be more likely to engage actively in the feedback process and provide constructive feedback to their peers. In contrast, students in a competitive and individualistic learning community may be more likely to focus on their own performance and provide less constructive feedback to their peers. Additionally, factors such as the level of guidance and scaffolding provided by the instructor, the type of feedback prompts, and the overall design of the online peer feedback activities can also impact students' actions and reactions during peer feedback processes. Therefore, it is essential to consider the broader learning context when designing and implementing online peer feedback activities to ensure that they are effective in different cultural and institutional settings (Donia et al., 2021; Kuo et al., 2017).
Online peer feedback activities should provide flexibility, support, and guidance to address these cultural factors (Li et al., 2019a, b). Anonymity, instructor modeling, cooperative environment, guidelines, and examples are strategies to encourage cross-cultural peer feedback (Li et al., 2019a, b). Overall, cultural sensitivity is key to designing effective peer feedback for diverse learners as culture profoundly impacts students’ expectations and engagement in such activities (Hofstede, 2001; Li et al., 2019a, b). Educators can use strategies such as clear guidelines, cross-cultural communication, supportive learning environments, culturally responsive pedagogy, forming diverse peer feedback groups, cultural competence promotion, and critical reflection to overcome cultural hurdles and promote participation in online peer feedback (Golonka & Lance, 2020).
We also analyzed online peer feedback from three dimensions: content, feedback activity design, and technology. Within the content dimension, the type of feedback provided by peers has received particular attention from scholars (Van Zundert et al., 2010). Studies have explored the impact of different types of feedback, such as corrective, elaborative, and directive feedback, on learning outcomes and student motivation. Understanding the impact of feedback type on learning can help educators design effective online peer feedback activities that promote student learning and engagement. Previous research has also found that the type, features, and quality of feedback provided in online peer feedback activities can predict students' success. For example, a study by Taghizadeh Kerman et al. (2022b) found that the quality of feedback provided by peers was positively associated with students' writing performance. Similarly, a study by Patchan et al. (2016) found that the quality of feedback, including its specificity, clarity, and detail, was a significant predictor of students' writing improvement in online peer feedback activities.
Within the feedback activity design dimension of online peer feedback, scholars have explored the effectiveness of various strategies and methods for implementing peer feedback activities, including the number of peer feedback rounds, reviewer characteristics, and training. For example, a study by Topping (2017) found that increasing the number of peer feedback rounds improved the quality and quantity of feedback provided by peers. Other studies have explored the impact of reviewer characteristics, such as experience and expertise, on the effectiveness of online peer feedback activities. Additionally, studies have shown that providing training for students on how to give and receive feedback can improve the quality of feedback provided in online peer feedback activities (Latifi et al., 2020; Min, 2006; Noroozi et al., 2019).
In the field of educational technology, various methods have been explored for implementing educational strategies with the aid of technology. These include video annotation, video peer assessment, different types of discussions and support, as well as various scaffolding techniques (Noroozi & Hatami, 2019). Among these, peer feedback processes and activities are considered to be of significant importance because they help students express their opinions, write more effectively, reflect on their knowledge, and achieve deeper learning (Noroozi & Hatami, 2019). Through this student-led approach, students may also develop higher-order thinking skills by taking on the tasks and responsibilities of assessors. Despite the potential benefits of peer feedback, empirical research has identified several problems related to the reluctance to include peer feedback in instructional practices and the learning process (Zhu & Carless, 2018). To address these issues, it is important to establish a safe environment by clearly communicating the goals of peer assessment and training assessors to provide constructive feedback and scaffolding (Topping, 1998). Educators should encourage thorough discussion of evaluation criteria before peer evaluation occurs, and they should intervene if feedback or marking is deemed unsatisfactory (Topping, 1998). The activities and processes discussed above can help achieve these goals (Noroozi & Hatami, 2019).
Our analysis reveals that online peer feedback is utilized for different learning purposes, including cognitive, behavioral, and affective outcomes (e.g., Latifi et al., 2020; Lin, 2018a; Noroozi & Hatami, 2019). Based on Bloom's classification, the primary cognitive outcomes resulting from the implementation of online peer feedback were in the application category, such as writing performance, feedback performance, and problem-solving (e.g., Hsia et al., 2016; Latifi et al., 2020). Researchers focused on the engagement of students in the peer feedback process as the primary behavioral outcome, which was influenced by various learning mechanisms and strategies, such as guided peer feedback, feedback mode, and anonymous conditions (e.g., Latifi et al., 2021; Noroozi & Mulder, 2017). In terms of affective outcomes, researchers mainly investigated perceptions of and attitudes towards peer feedback (e.g., Chang et al., 2015; Lin, 2018a; Noroozi & Hatami, 2019).
To achieve the desired goal, it is crucial to adopt appropriate educational strategies. For instance, to acquire skills in argumentative essay writing, structured peer assessment, a combination of worked examples and scripting, argumentative peer feedback scripts, mixed feedback and peer feedforward support, and online discourse communities can be effective (e.g., Noroozi et al., 2020; Tsai & Chuang, 2013; Valero Haro et al., 2019). These educational approaches create opportunities for students to prepare and learn more, discuss, think, and reflect on the criteria of argumentative writing by providing formulae, procedures, and examples of desirable works. To increase student participation in the online peer feedback process, educational approaches such as the rating-plus-qualitative-feedback mode and collaborative review are useful because they motivate students to take the peer feedback process more seriously and get involved in it (Hsia et al., 2016; Mandala et al., 2018). To improve students' attitudes towards peer feedback, instructional approaches such as argumentative peer feedback, mobile peer assessment, online peer feedback with TQM, anonymous conditions, guided peer feedback, blogging, and accurate and specific feedback can be effective (e.g., Chang et al., 2015; Lin, 2018a).
In summary, in higher education, it is essential for educators and educational designers to choose appropriate educational design principles and keep educational goals in mind while designing and implementing online peer feedback. Ignoring other aspects of educational goals and their effects may diminish the effectiveness of the educational technique. Therefore, it is crucial to consider the different learning purposes (cognitive, behavioral, and affective) and adopt appropriate educational strategies to achieve the desired result (e.g., Noroozi et al., 2011, 2016, 2020; Rahmany et al., 2013; Valero Haro et al., 2019).
6 A conceptual framework to guide the use of online peer feedback
Developing a conceptual framework to guide the integration of online peer feedback within higher education can help ensure that instructors deploy strategic approaches that align with specific learning objectives. Drawing on our findings concerning the fundamental dimensions of online peer feedback, we propose an evidence-based conceptual framework, illustrated in Fig. 2.
Assessing students’ characteristics represents the crucial first step in the incorporation of online peer feedback within higher education. Gaining insights into students' distinctive qualities, encompassing their pre-existing knowledge, skill sets, and attitudes toward peer feedback, serves as a compass for educators to implement online peer feedback that is more tailored to students’ needs, preferences, and abilities. For example, knowing that students have limited experience with online peer feedback may convince educators to provide more guidance and support during the peer feedback process. On the other hand, if students have a high level of experience with online peer feedback, a more independent and self-directed approach may be found appropriate by educators. Similarly, if students have negative attitudes towards peer feedback, it may be necessary to use instructional approaches that focus on building trust and promoting a positive feedback culture. Such approaches provide students with more control and autonomy in the feedback process, as well as opportunities for collaboration and peer support.
In the second step, a successful implementation of online peer feedback requires a good understanding of learning environment conditions such as learning settings (context, team, and culture) and the learning platform. Studies have shown that the context of learning plays a role in online peer feedback. For example, students within the arts and sciences context perceived online peer feedback more positively compared to students in the communication, engineering, and technology contexts, which is related to differences in task complexity, disciplinary culture, and prior experience (Altinay, 2016). In addition, the type of learning platform should be considered in the implementation of online peer feedback in higher education, as different learning platforms offer distinct arrays of functionalities for facilitating online peer feedback. It is important for educators and designers to regularly and critically reflect on the most appropriate online platform for peer feedback, especially as technologies continue to rapidly change and develop. While selecting an appropriate platform is important, it should not be the primary consideration. Instead, educators and designers should prioritize defining clear learning objectives and determining the specific needs and characteristics of their students. This will enable them to select a platform that is most appropriate for achieving their goals. In addition, it is important to stay informed about new and innovative technologies, such as AI, that may have the potential to enhance the peer feedback process. By regularly reflecting on and evaluating the effectiveness of different online platforms and technologies, educators and designers can make informed decisions about which tools and approaches are most appropriate for their students and learning objectives. This can help to ensure that the peer feedback process remains current, effective, and engaging for students.
Furthermore, in the final step, activities and processes should be determined according to the students' characteristics and learning objectives. This will ensure that the peer feedback process is tailored to the specific needs of the students and is designed to promote positive learning outcomes. By taking into account the students' characteristics and learning objectives, educators and designers can select appropriate activities and processes that will engage and motivate their students, promote effective feedback, and facilitate learning.
A foundational understanding of whom the peer feedback system is intended to serve and for what purpose is critical to ensuring that the peer feedback process is meaningful and relevant. By taking a student-centered approach and considering students' characteristics and learning objectives, educators and designers can establish peer feedback settings tailored to the specific needs of their students to guide them towards achieving learning outcomes. Moreover, when students are involved in peer feedback processes, it is important to consider their perspectives and experiences. Students should have a voice in the development of the peer feedback process and be involved in the selection of activities and processes that are most effective for their learning. This will help to promote student engagement and motivation and ensure that the peer feedback process is effective, relevant, and ethical. As Noroozi et al. (2011, 2016) suggest, objectives play a key role in determining what types of activities and strategies are needed to collect feedback effectively. However, it is also important to consider the ethical perspective when designing and implementing peer feedback processes. When peers are involved in online peer feedback situations, students need to know what happens with the feedback that is provided and received. Human values such as privacy, equality, and responsibility can be considered crucial in providing feedback in online situations. Therefore, educators and designers should ensure that appropriate measures are in place to protect students' privacy, promote equality in feedback provision and reception, and foster a responsible and constructive feedback culture. By integrating the student-centered approach and the ethical perspective, educators and designers can design effective and relevant peer feedback processes that promote positive learning outcomes while also being responsible and ethical.
The implementation of peer feedback in higher education should be tailored to the needs of students and fit with the educational objectives. For instance, projects with different goals, such as promoting cognitive, behavioral, and affective learning outcomes, may require different activities and methods at different stages of the feedback process. As such, the steps in our conceptual framework should be considered in a hierarchical manner, taking into account the specific learning goals and the needs of the students. This approach is supported by previous research, such as Noroozi et al. (2012, 2016), who have emphasized the importance of aligning the goals of the feedback process with the desired learning outcomes. By doing so, it is possible to design and implement peer feedback activities that are effective in promoting learning and development among students in higher education.
7 Conclusions, limitations, and suggestions for future research and practice
This systematic review utilized Biggs' (2003) model of online peer feedback to guide the analysis by focusing on the four dimensions of effective online peer feedback. The review provides a comprehensive overview of the current state of implementation of online peer feedback in technology-mediated learning environments and identifies gaps and areas for future research. The review emphasizes the importance of considering cultural differences, learner characteristics, and appropriate technologies in designing effective online peer feedback practices. The review also highlights the need for future research to focus on specific dimensions of online peer feedback to gain a more nuanced understanding of how each dimension affects learning outcomes. Overall, this review contributes to the field of online peer feedback and helps educators and researchers develop more effective approaches to enhance learning outcomes.
There are several limitations to this review that should be acknowledged. First, the review only included empirical studies to ensure the reporting of authentic findings, which may have excluded some noteworthy reviews and conceptual papers. Second, while the selected literature databases cover the most relevant publications, some studies not indexed in these databases may have been missed. Third, there may be a publication bias, where studies with null findings are not published, which could affect the generalizability of our findings. Therefore, caution should be exercised when interpreting our results. Fourth, our study only investigated online peer feedback in higher education and did not examine its use and impact in K-12 educational contexts. Thus, our findings may not be generalizable to all modes of educational contexts. Future research could explore how online peer feedback in higher education differs in its use and impact compared to K-12 educational environments. Fifth, the review only focused on studies published in English, which may have excluded relevant studies published in other languages. Finally, the review only included articles published between 2000 and 2023, which may have excluded relevant studies published before 2000.
Future research should explore several areas to enhance our understanding of online peer feedback and optimize its implementation in higher education. First, investigating the impact of different feedback types, such as text-based comments and audio or video feedback, could provide insights into which types are most effective in promoting learning. Second, exploring the use of peer feedback as a formative assessment tool could help students identify areas for improvement and make progress toward their learning goals. Third, examining the effects of emotions and of different delivery methods on learning outcomes could help identify factors that influence the effectiveness of online peer feedback. Fourth, exploring gamification and other motivational techniques to enhance engagement, together with developing best practices for efficient feedback processes, could improve feedback quality. Fifth, incorporating virtual and augmented reality technologies to create immersive feedback experiences could enhance engagement and provide more effective feedback. Sixth, using blockchain technology to enhance the credibility and transparency of feedback could help ensure that feedback is fair and accurate. Seventh, the emergence of new technologies such as ChatGPT holds great potential to support online peer feedback and essay writing (Farrokhnia et al., 2023; Banihashem et al., 2022a, 2022b). Future research could therefore investigate the potential role of AI and machine learning in enhancing the quality and relevance of peer feedback, exploring how AI-powered tools can support students in providing personalized and constructive feedback to their peers. Additionally, the ethical implications of using AI-powered tools in online peer feedback should be investigated to ensure that these tools are used responsibly and ethically.
Furthermore, research could focus on integrating human values such as privacy, equality, and responsibility into the design and implementation of online peer feedback processes. This could include developing guidelines and best practices that address the ethical dimensions of feedback provision and reception. Research could also investigate how to promote a responsible and constructive feedback culture in online settings, and how to ensure that students are adequately prepared to provide and receive feedback in a responsible and ethical manner. By integrating this ethical perspective into online peer feedback, educators and designers can help ensure that these processes are not only effective and relevant but also responsible and ethical, contributing to the advancement of Responsible AI and AI ethics in education. Eighth, exploring the use of online peer feedback in interdisciplinary and cross-cultural contexts could optimize its implementation. Finally, this study concentrated on offering a comprehensive overview of the current state of online peer feedback implementation in higher education. We achieved this by conducting a systematic review centered exclusively on empirical studies with robust methodologies, ensuring reliable and valid results. As a next step, we propose conducting a meta-analysis to examine the effect size of implementing online peer feedback in higher education. These areas of research could lead to more personalized, effective, and innovative approaches to online peer feedback.
Data availability
The data supporting this study’s findings are available from the corresponding author upon request.
References
Agrawal, A., & Rajapakse, D. C. (2018). Perceptions and practice of peer assessments: an empirical investigation. International Journal of Educational Management, 32(6), 975–989. https://doi.org/10.1108/IJEM-05-2016-0085
Al Abri, A., Al Baimani, S., & Al Bahlani, S. (2021). The role of web-based peer feedback in advancing EFL essay writing. Computer-Assisted Language Learning Electronic Journal (CALL-EJ), 22(1), 374–390. https://doi.org/10.29140/call-ej.v22i1.420
Al Qunayeer, H. S. (2020). Supporting postgraduates in research proposals through peer feedback in a Malaysian university. Journal of Further and Higher Education, 44(7), 956–970. https://doi.org/10.1080/0309877x.2019.1627299
Altinay, Z. (2016). Evaluating peer learning and assessment in online collaborative learning environments. Behaviour & Information Technology, 36(3), 312–320. https://doi.org/10.1080/0144929X.2016.1232752
Allen, D., & Mills, A. (2016). The impact of second language proficiency in dyadic peer feedback. Language Teaching Research, 20(4), 498–513. https://doi.org/10.1177/1362168814561902
Banihashem, S. K., Farrokhnia, M., Badali, M., & Noroozi, O. (2022b). The impacts of constructivist learning design and learning analytics on students’ engagement and self-regulation. Innovations in Education and Teaching International, 59(4), 442–452. https://doi.org/10.1080/14703297.2021.1890634
Banihashem, S. K., Noroozi, O., van Ginkel, S., Macfadyen, L. P., & Biemans, H. J. (2022a). A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educational Research Review, 100489. https://doi.org/10.1016/j.edurev.2022.100489
Banihashem, S. K., Noroozi, O., Biemans, H. J., & Tassone, V. C. (2023). The intersection of epistemic beliefs and gender in argumentation performance. Innovations in Education and Teaching International, 1–19. https://doi.org/10.1080/14703297.2023.2198995
Bayat, M., Banihashem, S. K., & Noroozi, O. (2022). The effects of collaborative reasoning strategies on improving primary school students’ argumentative decision-making skills. The Journal of Educational Research, 1–10. https://doi.org/10.1080/14703297.2023.2198995
Basheti, I. A., Ryan, G., Woulfe, J., & Bartimote-Aufflick, K. (2010). Anonymous Peer Assessment of Medication Management Reviews. American Journal of Pharmaceutical Education, 74(5), 1–8. https://doi.org/10.5688/AJ740577
Bellhäuser, H., Liborius, P., & Schmitz, B. (2022). Fostering Self-Regulated Learning in Online Environments: Positive Effects of a Web-Based Training With Peer Feedback on Learning Behavior. Frontiers in Psychology, 13, 813381. https://doi.org/10.3389/fpsyg.2022.813381
Biggs, J. B. (2003). Teaching for quality learning at university: What the student does (2nd ed.). Buckingham: Open University Press
Brill, J. M. (2016). Investigating peer review as a systemic pedagogy for developing the design knowledge, skills, and dispositions of novice instructional design students. Educational Technology Research and Development, 64(4), 681–705. https://doi.org/10.1007/s11423-015-9421-6
Canham, N. (2018). Comparing Web 2.0 applications for peer feedback in language teaching: Google Docs, the Sakai VLE, and the Sakai Wiki. Writing & Pedagogy, 9(3), 429–456. https://doi.org/10.1558/wap.32352
Chang, C. Y.-h. (2015). Teacher modeling on EFL reviewers’ audience-aware feedback and affectivity in L2 peer review. Assessing Writing, 25, 2–21. https://doi.org/10.1016/j.asw.2015.04.001
Chang, C., & Lin, H.-C. K. (2020). Effects of a mobile-based peer-assessment approach on enhancing language-learners’ oral proficiency. Innovations in Education and Teaching International, 57(6), 668–679. https://doi.org/10.1080/14703297.2019.1612264
Chang, S. H., Yu, L. C., Kuo, Y. K., Mai, Y. T., & Chen, J. De. (2015). Applying online peer assessment with total quality management to elevate project-based learning performance. Journal of Baltic Science Education, 14(3), 379–390. https://doi.org/10.33225/JBSE/15.14.379
Chen, H. L., & Liu, C. Y. (2023). The effects of web-based peer assessment and peer feedback quality on students’ performances in a financial market course. TechTrends, 67, 664–675. https://doi.org/10.1007/s11528-023-00856-8
Chen, I. C., Hwang, G. J., Lai, C. L., & Wang, W. C. (2020). From design to reflection: Effects of peer-scoring and comments on students’ behavioral patterns and learning outcomes in musical theater performance. Computers & Education, 150. https://doi.org/10.1016/J.COMPEDU.2020.103856
Chen, N. S., Wei, C. W., Wu, K. T., & Uden, L. (2009). Effects of high level prompts and peer assessment on online learners’ reflection levels. Computers & Education, 52(2), 283–291. https://doi.org/10.1016/J.COMPEDU.2008.08.007
Cheng, K.-H., Hou, H.-T., & Wu, S.-Y. (2014). Exploring students’ emotional responses and participation in an online peer assessment activity: A case study. Interactive Learning Environments, 22(3), 271–287. https://doi.org/10.1080/10494820.2011.649766
Cheng, K. H., & Hou, H. T. (2015). Exploring students’ behavioural patterns during online peer assessment from the affective, cognitive, and metacognitive perspectives: a progressive sequential analysis. Technology, Pedagogy and Education, 24(2), 171–188. https://doi.org/10.1080/1475939X.2013.822416
Cho, K., & Schunn, C. D. (2007). Scaffolded writing and rewriting in the discipline: A web-based reciprocal peer review system. Computers & Education, 48(3), 409–426. https://doi.org/10.1016/j.compedu.2005.01.004
Cho, K., Schunn, C. D., & Wilson, R. W. (2006). Validity and reliability of scaffolded peer assessment of writing from instructor and student perspectives. Journal of Educational Psychology, 98(4), 891. https://doi.org/10.1037/0022-0663.98.4.891
Cui, Y., Schunn, C. D., & Gai, X. (2022). Peer feedback and teacher feedback: a comparative study of revision effectiveness in writing instruction for EFL learners. Higher Education Research & Development, 41(6), 1838–1854. https://doi.org/10.1080/07294360.2021.1969541
Culver, K., Bowman, N. A., Youngerman, E., Jang, N., & Just, C. L. (2022). Promoting equitable achievement in STEM: lab report writing and online peer review. The Journal of experimental education, 90(1), 23–45. https://doi.org/10.1080/00220973.2020.1799315
Day, I. N. Z., Saab, N., & Admiraal, W. (2021). Online peer feedback on video presentations: type of feedback and improvement of presentation skills. Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2021.1904826
De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2011). Assessing collaboration in a wiki: The reliability of university students’ peer assessment. The Internet and Higher Education, 14(4), 201–206. https://doi.org/10.1016/j.iheduc.2011.07.003
Çevik, Y. (2015). Assessor or assessee? Investigating the differential effects of online peer assessment roles in the development of students’ problem-solving skills. Computers in Human Behavior, 52, 250–258. https://doi.org/10.1016/j.chb.2015.05.056
Donia, M. B., Mach, M., O’Neill, T. A., & Brutus, S. (2021). Student satisfaction with use of an online peer feedback system. Assessment & Evaluation in Higher Education, 47(2), 269–283. https://doi.org/10.1080/02602938.2021.1912286
Er, E., Dimitriadis, Y., & Gašević, D. (2021). A collaborative learning approach to dialogic peer feedback: a theoretical framework. Assessment & Evaluation in Higher Education, 46(4), 586–600. https://doi.org/10.1080/02602938.2020.1786497
Farrokhnia, M., Banihashem, S. K., Noroozi, O., & Wals, A. (2023). A SWOT analysis of ChatGPT: Implications for educational practice and research. Innovations in Education and Teaching International, 1–15. https://doi.org/10.1080/14703297.2023.2195846
Friese, S. (2019). Qualitative data analysis with ATLAS.ti. Sage Publication.
Ge, Z. G. (2019). Exploring the effect of video feedback from unknown peers on e-learners’ English-Chinese translation performance. Computer Assisted Language Learning, 35(1–2), 169–189. https://doi.org/10.1080/09588221.2019.1677721
Gielen, M., & De Wever, B. (2015). Structuring peer assessment: Comparing the impact of the degree of structure on peer feedback content. Computers in Human Behavior, 52, 315–325. https://doi.org/10.1016/j.chb.2015.06.019
Golonka, L. D., & Lance, T. S. (2020). Cultural Factors in Online Learning. In L. D. Golonka & T. S. Lance (Eds.), Online learning across a random family (1st ed., pp. 69–80). Routledge.
Gorham, T., Majumdar, R., & Ogata, H. (2023). Analyzing learner profiles in a microlearning app for training language learning peer feedback skills. Journal of Computers in Education, 1–16. https://doi.org/10.1007/s40692-023-00264-0
Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
Havard, B., Podsiad, M., & Valaitis, K. (2023). Peer Assessment Collaboration Evaluation: An Innovative Assessment Tool for Online Learning Environments. TechTrends, 67(6), 1428–1439. https://doi.org/10.1007/s11528-022-00832-8
Hoffman, B. (2019). The influence of peer assessment training on assessment knowledge and reflective writing skill. Journal of Applied Research in Higher Education, 11(4), 863–875. https://doi.org/10.1108/JARHE-01-2019-0004
Hofstede, G. (2001). Culture’s consequences: Comparing values, behaviors, institutions, and organizations across nations. Sage Publications.
Hsia, L. H., Huang, I., & Hwang, G. J. (2016). Effects of different online peer-feedback approaches on students’ performance skills, motivation and self-efficacy in a dance course. Computers & Education, 96, 55–71. https://doi.org/10.1016/J.COMPEDU.2016.02.004
Hu, G., & Lam, S. T. E. (2010). Issues of cultural appropriateness and pedagogical efficacy: Exploring peer review in a second language writing class. Instructional Science, 38(4), 371–394. https://doi.org/10.1007/s11251-008-9086-1
Huisman, B., Saab, N., Van Driel, J., & Van Den Broek, P. (2018). Peer feedback on academic writing: undergraduate students’ peer feedback role, peer feedback perceptions and essay performance. Assessment & Evaluation in Higher Education, 43(6), 955–968. https://doi.org/10.1080/02602938.2018.1424318
Ismaeel, D. A. (2020). Alternative web-based assessment and academic self-efficacy of pre-service student teachers. International Journal of Web-Based Learning and Teaching Technologies, 15(4), 66–81. https://doi.org/10.4018/IJWLTT.20201001.OA1
Jiang, J., & Yu, Y. (2014). The Effectiveness of Internet-based Peer Feedback Training on Chinese EFL College Students’ Writing Proficiency. International Journal of Information and Communication Technology Education, 10(3), 34–46. https://doi.org/10.4018/IJICTE.2014070103
Jin, X., Jiang, Q., Xiong, W., Feng, Y., & Zhao, W. (2022). Effects of student engagement in peer feedback on writing performance in higher education. Interactive Learning Environments, 1–14. https://doi.org/10.1080/10494820.2022.2081209
Ko, Y., Issenberg, S. B., & Roh, Y. S. (2022). Effects of peer learning on nursing students' learning outcomes in electrocardiogram education. Nurse Education Today, 108, 105182. https://doi.org/10.1016/j.nedt.2021.105182
Koehler, M., & Mishra, P. (2009). What is technological pedagogical content knowledge (TPACK)? Contemporary issues in technology and teacher education, 9(1), 60–70.
Kuo, F.-C., Chen, J.-M., Chu, H.-C., Yang, K.-H., & Chen, Y.-H. (2017). A Peer- Assessment Mobile Kung Fu Education Approach to Improving Students' Affective Performances. International Journal of Distance Education Technologies., 15(1), 1–14. https://doi.org/10.4018/IJDET.2017010101
Ku, H. Y., & Lohr, L. (2003). A case study of a peer assessment strategy for developing self-regulated learning. Educational Technology Research and Development, 51(2), 5–22. https://doi.org/10.1007/BF02504501
Lai, C. Y. (2016). Training nursing students’ communication skills with online video peer assessment. Computers and Education, 97, 21–30. https://doi.org/10.1016/j.compedu.2016.02.017
Lai, C. Y., Chen, L. J., Yen, Y. C., & Lin, K. Y. (2020). Impact of video annotation on undergraduate nursing students’ communication performance and commenting behaviour during an online peer-assessment activity. Australasian Journal of Educational Technology, 36(2), 71–88. https://doi.org/10.14742/AJET.4341
Lane, J. N., Ankenman, B., & Iravani, S. (2018). Insight into Gender Differences in Higher Education: Evidence from Peer Reviews in an Introductory STEM Course. Service Science, 10(4), 442–456. https://doi.org/10.1287/SERV.2018.0224
Latifi, S., Noroozi, O., & Talaee, E. (2020). Worked example or scripting? Fostering students’ online argumentative peer feedback, essay writing and learning. Interactive Learning Environments, 1–15. https://doi.org/10.1080/10494820.2020.1799032
Latifi, S., Noroozi, O., & Talaee, E. (2021). Peer feedback or peer feedforward? Enhancing students’ argumentative peer learning processes and outcomes. British Journal of Educational Technology, 52(2), 768–784. https://doi.org/10.1111/bjet.13054
Li, H., Xiong, Y., Hunter, C. V., Guo, X., & Tywoniw, R. (2019a). Does peer assessment promote student learning? A meta-analysis. Assessment & Evaluation in Higher Education, 45(2), 193–211. https://doi.org/10.1080/02602938.2019.1620679
Li, N., Zhao, Y., Huang, X., & Tan, X. (2019b). The impact of personality traits on peer feedback in online learning environment. International Journal of Emerging Technologies in Learning, 14(19), 77–91. https://doi.org/10.3991/ijet.v14i19.10605
Li, N., Huang, X., Zhao, Y., & Tan, X. (2021). Understanding college students' perceptions and use of peer feedback in online learning environments: The roles of prior knowledge and educational level. Educational Technology Research and Development, 69(1), 251–272. https://doi.org/10.1007/s11423-020-09959-8
Liang, J. C., & Tsai, C. C. (2010). Learning through science writing via online peer assessment in a college biology course. The Internet and Higher Education, 13(4), 242–247. https://doi.org/10.1016/J.IHEDUC.2010.04.004
Lin, C.-J. (2019). An online peer assessment approach to supporting mind-mapping flipped learning activities for college English writing courses. Journal of Computers in Education, 6(3), 385–415. https://doi.org/10.1007/S40692-019-00144-6
Lin, G.-Y. (2016). Effects that Facebook-based Online Peer Assessment with Micro-teaching Videos Can Have on Attitudes toward Peer Assessment and Perceived Learning from Peer Assessment. Eurasia Journal of Mathematics, Science and Technology Education, 12(9), 2295–2307. https://doi.org/10.12973/EURASIA.2016.1280A
Lin, G. Y. (2018a). Anonymous versus identified peer assessment via a Facebook-based learning application: Effects on quality of peer feedback, perceived learning, perceived fairness, and attitude toward the system. Computers and Education, 116, 81–92. https://doi.org/10.1016/j.compedu.2017.08.010
Lin, J.-W. (2018b). Effects of an online team project-based learning environment with group awareness and peer evaluation on socially shared regulation of learning and self-regulated learning. Behaviour & Information Technology, 37(5), 445–461. https://doi.org/10.1080/0144929X.2018.1451558
Lin, H.-S., Hong, Z.-R., Wang, H.-H., & Lee, S.-T. (2011). Using Reflective Peer Assessment to Promote Students' Conceptual Understanding through Asynchronous Discussions. Educational Technology & Society, 14(3), 178–189.
Liu, E. Z. F., Lin, S. S. J., Chiu, C. H., & Yuan, S. M. (2001). Web-based peer review: The learner as both adapter and reviewer. IEEE Transactions on Education, 44(3), 246–251. https://doi.org/10.1109/13.940995
Liu, J., Guo, X., Gao, R., Fram, P., Ling, Y., Zhang, H., & Wang, J. (2019). Students’ learning outcomes and peer rating accuracy in compulsory and voluntary online peer assessment. Assessment & Evaluation in Higher Education, 44(6), 835–847. https://doi.org/10.1080/02602938.2018.1542659
Liu, E. Z. F., & Lee, C. Y. (2013). Using peer feedback to improve learning via online peer assessment. Turkish Online Journal of Educational Technology, 12(1), 187–199.
Liu, X., Li, L., & Zhang, Z. (2017). Small group discussion as a key component in online assessment training for enhanced student learning in web-based peer assessment. Assessment & Evaluation in Higher Education, 43(2), 207–222. https://doi.org/10.1080/02602938.2017.1324018
Lowell, V. L., & Ashby, I. V. (2018). Supporting the development of collaboration and feedback skills in instructional designers. Journal of Computing in Higher Education, 30(1), 72–92. https://doi.org/10.1007/s12528-018-9170-8
Luhach, S. (2020). Recreating Discourse Community for Appropriating HOCs in Law Undergraduates’ Academic Writing. IAFOR Journal of Education, 8(4), 151–170. https://doi.org/10.22492/ije.8.4.12
Lundstrom, K., & Baker, W. (2009). To give is better than to receive: The benefits of peer review to the reviewer’s own writing. Journal of Second Language Writing, 18, 30–43. https://doi.org/10.1016/j.jslw.2008.06.002
Maringe, F. (2010). Leading learning: Enhancing the learning experience of university students through anxiety auditing. Education, Knowledge, and Economy, 4, 15–31. https://doi.org/10.1080/17496891003696470
Mandala, M., Schunn, C., Dow, S., Goldberg, M., Pearlman, J., Clark, W., & Mena, I. (2018). Impact of collaborative team peer review on the quality of feedback in engineering design projects. International Journal of Engineering Education, 34(4), 1299–1313.
Min, H. T. (2006). The effects of trained peer review on EFL students’ revision types and writing quality. Journal of Second Language Writing, 15(2), 118–141. https://doi.org/10.1016/j.jslw.2006.01.003
Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., Altman, D., Antes, G., Atkins, D., Barbour, V., Barrowman, N., Berlin, J. A., Clark, J., Clarke, M., Cook, D., D’Amico, R., Deeks, J. J., Devereaux, P. J., Dickersin, K., Egger, M., Ernst, E., … Tugwell, P. (2009). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLOS Medicine, 6(7), e1000097. https://doi.org/10.1371/JOURNAL.PMED.1000097
Moradian, A., Kalli, A., Sweredoski, M. J., & Hess, S. (2014). The top-down, middle-down, and bottom-up mass spectrometry approaches for characterization of histone variants and their post-translational modifications. Proteomics, 14(4-5), 489–497. https://doi.org/10.1002/pmic.201300256
Mulyati, Y., & Hadianto, D. (2023). Enhancing Argumentative Writing Via Online Peer Feedback-Based Essay: A Quasi-Experiment Study. International Journal of Instruction, 16(2), 195–212. https://doi.org/10.29333/iji.2023.16212a
Nelson, M. M., & Schunn, C. D. (2009). The nature of feedback: How different types of peer feedback affect writing performance. Instructional science, 37, 375–401. https://doi.org/10.1007/s11251-008-9053-x
Noroozi, O., Banihashem, S. K., Biemans, H. J., Smits, M., Vervoort, M. T., & Verbaan, C. L. (2023). Design, implementation, and evaluation of an online supported peer feedback module to enhance students’ argumentative essay quality. Education and Information Technologies, 1–28. https://doi.org/10.1007/s10639-023-11683-y
Noroozi, O., Banihashem, S. K., Taghizadeh Kerman, N., Parvaneh Akhteh Khaneh, M., Babayi, M., Ashrafi, H., & Biemans, H. J. A. (2022). Gender differences in students’ argumentative essay writing, peer review performance and uptake in online learning environments. Interactive Learning Environments, 1-18. https://doi.org/10.1080/10494820.2022.2034887
Noroozi, O., Biemans, H. J. A., Busstra, M. C., Mulder, M., & Chizari, M. (2011). Differences in learning processes between successful and less successful students in computer-supported collaborative learning in the field of human nutrition and health. Computers in Human Behavior, 27(1), 309–318. https://doi.org/10.1016/j.chb.2010.08.009
Noroozi, O., Biemans, H., & Mulder, M. (2016). Relations between scripted online peer feedback processes and quality of written argumentative essay. Internet and Higher Education, 31, 20–31. https://doi.org/10.1016/j.iheduc.2016.05.002
Noroozi, O., & Hatami, J. (2019). The effects of online peer feedback and epistemic beliefs on students’ argumentation-based learning. Innovations in Education and Teaching International, 56(5), 548–557. https://doi.org/10.1080/14703297.2018.1431143
Noroozi, O., Hatami, J., Bayat, A., van Ginkel, S., Biemans, H. J., & Mulder, M. (2020). Students’ online argumentative peer feedback, essay writing, and content learning: Does gender matter? Interactive Learning Environments, 28(6), 698–712. https://doi.org/10.1080/10494820.2018.1543200
Noroozi, O., Hatami, J., Latifi, S., & Fardanesh, H. (2019). The effects of argumentation training in online peer feedback environment on process and outcomes of learning. Journal of Educational Sciences, 26(2), 71–88. https://doi.org/10.22055/EDUS.2019.28694.2763
Noroozi, O., & Mulder, M. (2017). Design and evaluation of a digital module with guided peer feedback for student learning biotechnology and molecular life sciences, attitudinal change, and satisfaction. Biochemistry and Molecular Biology Education, 45(1), 31–39. https://doi.org/10.1002/bmb.20981
Noroozi, O., Weinberger, A., Biemans, H. J. A., Mulder, M., & Chizari, M. (2012). Argumentation-Based Computer Supported Collaborative Learning (ABCSCL): A synthesis of 15 years of research. Educational Research Review, 7(2), 79–106. https://doi.org/10.1016/j.edurev.2011.11.006
Panadero, E., & Alonso-Tapia, J. (2013). Self-assessment: Theoretical and practical connotations. When it happens, how is it acquired and what to do to develop it in our students. Electronic Journal of Research in Educational Psychology, 11(2), 551–576. https://doi.org/10.14204/ejrep.30.12200
Panadero, E., & Lipnevich, A. (2022). A review of feedback models and typologies: Towards an integrative model of feedback elements. Educational Research Review, 35. https://doi.org/10.1016/j.edurev.2021.100416
Patchan, M. M., Schunn, C. D., & Clark, R. J. (2018). Accountability in peer assessment: Examining the effects of reviewing grades on peer ratings and peer feedback. Studies in Higher Education, 43(12), 2263–2278. https://doi.org/10.1080/03075079.2017.1320374
Patchan, M. M., Schunn, C. D., & Correnti, R. J. (2016). The nature of feedback: how peer feedback features affect students’ implementation rate and quality of revisions. Journal of Educational Psychology, 108(8), 1098–1120. https://doi.org/10.1037/edu0000103
Pereira, J., Echeazarra, L., Sanz-Santamaría, S., & Gutiérrez, J. (2014). Student-generated online videos to develop cross-curricular and curricular competencies in nursing studies. Computers in Human Behavior, 31, 580–590. https://doi.org/10.1016/j.chb.2013.06.011
Pham, T. N., Lin, M., Trinh, V. Q., & Bui, L. T. P. (2020). Electronic peer feedback, EFL academic writing and reflective thinking: Evidence from a Confucian context. Sage Open, 10(1), 215824402091455.
Pifarré, M., Cobos, R., & Argelagós, E. (2014). Incidence of group awareness information on students’ collaborative learning processes. Journal of Computer Assisted Learning, 30(4), 300–317. https://doi.org/10.1111/JCAL.12043
Pifarré, M., & Cobos, R. (2010). Promoting metacognitive skills through peer scaffolding in a CSCL environment. Journal of Computer Assisted Learning, 5, 237–253. https://doi.org/10.1007/s11412-010-9084-6
Rahmany, R., Sadeghi, B., & Faramarzi, S. (2013). The effect of blogging on vocabulary enhancement and structural accuracy in an EFL context. Theory and Practice in Language Studies, 3(7), 1288–1298.
Rauss, K., & Pourtois, G. (2013). What is bottom-up and what is top-down in predictive coding? Frontiers in psychology, 4, 276. https://doi.org/10.3389/fpsyg.2013.00276
Ritzhaupt, A. D., & Kumar, P. (2015). The impact of peer feedback on communication skills in online discussions. Journal of Educational Computing Research, 53(1), 31–50. https://doi.org/10.1177/0735633115592429
Shang, H.-F. (2019). Exploring online peer feedback and automated corrective feedback on EFL writing performance. Interactive Learning Environments, 1–13. https://doi.org/10.1080/10494820.2019.1629601
Sadegh, T. (2022). Leveraging Regulative Learning Facilitators to Foster Student Agency and Knowledge (Co-) Construction Activities in CSCL Environments. International Journal of Online Pedagogy and Course Design, 12(1), 1–22. https://doi.org/10.4018/IJOPCD.293209
Shi, M. (2019). The effects of class size and instructional technology on student learning performance. The International Journal of Management Education, 17(1), 130–138. https://doi.org/10.1016/j.ijme.2019.01.004
Simonsmeier, B. A., Peiffer, H., Flaig, M., & Schneider, M. (2020). Peer Feedback Improves Students’ Academic Self-Concept in Higher Education. Research in Higher Education, 61(6), 706–724. https://doi.org/10.1007/S11162-020-09591-Y/FIGURES/2
Slee, N. J., & Jacobs, M. H. (2017). Trialling the use of Google Apps together with online marking to enhance collaborative learning and provide effective feedback. F1000Research, 4, 177. https://doi.org/10.12688/f1000research.6520.2
Su, Y., Ren, J., & Song, X. (2022). The effects of group awareness tools on student engagement with peer feedback in online collaborative writing environments. Interactive Learning Environments, 1–15. https://doi.org/10.1080/10494820.2022.2131833
Taghizadeh Kerman, N., Noroozi, O., Banihashem, S. K., Karami, M., & Biemans, H. J. A. (2022a). Online peer feedback patterns of success and failure in argumentative essay writing. Interactive Learning Environments, 1–13. https://doi.org/10.1080/10494820.2022.2093914
Taghizadeh Kerman, N., Noroozi, O., Banihashem, S. K., & Biemans, H. J. A. (2022b). The effects of students' perceived usefulness and trustworthiness of peer feedback on learning satisfaction in online learning environments. 8th International Conference on Higher Education Advances (HEAd’22), Universitat Politecnica de Valencia. https://doi.org/10.4995/HEAd22.2022.14445
Theelen, T., van der Slikke, R. M. A., de Mul, M., & van der Steen, J. (2019). Quality appraisal of systematic reviews on end-of-life care: A systematic review. Palliative medicine, 33(2), 179–188. https://doi.org/10.1177/0269216318820464
Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249–276. https://doi.org/10.3102/00346543068003249
Topping, K. (2017). Peer assessment: Learning by judging and discussing the work of other learners. Interdisciplinary Education and Psychology, 1(1), 1–17. https://doi.org/10.31532/interdiscipeducpsychol.1.1.007
Topping, K. J. (2021). Digital peer assessment in school teacher education and development: A systematic review. Research Papers in Education, 1–27. https://doi.org/10.1080/02671522.2021.1961301
Tran, O. T. T., & Pham, V. P. H. (2023). The effects of online peer feedback on students’ writing skills during corona virus pandemic. International Journal of Instruction, 16(1), 881–896. https://doi.org/10.29333/iji.2023.16149a
Tsai, C.-C., & Liang, J.-C. (2007). The development of science activities via on-line peer assessment: The role of scientific epistemological views. Instructional Science, 37(3), 293–310. https://doi.org/10.1007/s11251-007-9047-0
Tsai, C. C., Lin, S. S. J., & Yuan, S. M. (2002). Developing science activities through a networked peer assessment system. Computers & Education, 38(1–3), 241–252. https://doi.org/10.1016/S0360-1315(01)00069-0
Tsai, Y.-C., & Chuang, M.-T. (2013). Fostering revision of argumentative writing through structured peer assessment. Perceptual and Motor Skills, 116(1), 210–221. https://doi.org/10.2466/10.23.PMS.116.1.210-221
Tseng, S. C., & Tsai, C. C. (2010). Taiwan college students’ self-efficacy and motivation of learning in online peer assessment environments. The Internet and Higher Education, 13(3), 164–169. https://doi.org/10.1016/j.iheduc.2010.01.001
Valero Haro, A., Noroozi, O., Biemans, H. J. A., & Mulder, M. (2019). The effects of an online learning environment with worked examples and peer feedback on students’ argumentative essay writing and domain-specific knowledge acquisition in the field of biotechnology. Journal of Biological Education, 53(4), 390–398. https://doi.org/10.1080/00219266.2018.1472132
Valero Haro, A., Noroozi, O., Biemans, H. J. A., Mulder, M., & Banihashem, S. K. (2023). How does the type of online peer feedback influence feedback quality, argumentative essay writing quality, and domain-specific learning? Interactive Learning Environments, 31(3), 387–405. https://doi.org/10.1080/10494820.2022.2023624
Van Zundert, M., Sluijsmans, D., & Van Merriënboer, J. (2010). Effective peer assessment processes: Research findings and future directions. Learning and Instruction, 20(4), 270–279. https://doi.org/10.1016/j.learninstruc.2009.08.004
Wang, J., Gao, R., Guo, X., & Liu, J. (2019). Factors associated with students’ attitude change in online peer assessment – a mixed methods study in a graduate-level course. Assessment & Evaluation in Higher Education, 45(5), 714–727. https://doi.org/10.1080/02602938.2019.1693493
Wang, S. L., & Wu, P. Y. (2008). The role of feedback and self-efficacy on web-based learning: The social cognitive perspective. Computers & Education, 51(4), 1589–1598. https://doi.org/10.1016/j.compedu.2008.03.00
Wihastyanang, W. D., Kusumaningrum, S. R., Latief, M. A., & Cahyono, B. Y. (2020). Impacts of providing online teacher and peer feedback on students’ writing performance. Turkish Online Journal of Distance Education, 21(2), 178–189. https://doi.org/10.17718/tojde.728157
Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist, 52(1), 17–37. https://doi.org/10.1080/00461520.2016.1207538
Wu, S.-Y., Hou, H.-T., & Hwang, W.-Y. (2012). Exploring students’ cognitive dimensions and behavioral patterns during a synchronous peer assessment discussion activity using instant messaging. The Asia-Pacific Education Researcher, 21(3).
Wu, Y., & Schunn, C. D. (2023). Passive, active, and constructive engagement with peer feedback: A revised model of learning from peer feedback. Contemporary Educational Psychology, 73, 102160. https://doi.org/10.1016/j.cedpsych.2023.102160
Xiao, Y., & Lucking, R. (2008). The impact of two types of peer assessment on students’ performance and satisfaction within a Wiki environment. Internet and Higher Education, 11(3–4), 186–193. https://doi.org/10.1016/j.iheduc.2008.06.005
Yang, Y.-F., & Meng, W.-T. (2013). The effects of online feedback training on students’ text revision. Language Learning & Technology, 17(2), 220–238.
Yeh, H.-C., Tseng, S.-S., & Chen, Y.-S. (2019). Using online peer feedback through blogs to promote speaking performance. Educational Technology & Society, 22(1), 1–14.
Yu, F.-Y., & Liu, Y.-H. (2009). Creating a psychologically safe online space for a student-generated questions learning activity via different identity revelation modes. British Journal of Educational Technology, 40(6), 1109–1123. https://doi.org/10.1111/j.1467-8535.2008.00905.x
Yuan, J., & Kim, C. (2017). The effects of autonomy support on student engagement in peer assessment. Educational Technology Research and Development, 66(1), 25–52. https://doi.org/10.1007/s11423-017-9538-x
Zakharova, A., Evers, K., & Chen, S. (2022). Optimal scaffolding method for resume writing in the supplementary online writing course. Interactive Learning Environments, 1–15. https://doi.org/10.1080/10494820.2022.2043382
Zhan, Y. (2020). What matters in design? Cultivating undergraduates’ critical thinking through online peer assessment in a Confucian heritage context. Assessment & Evaluation in Higher Education, 46(4), 615–630. https://doi.org/10.1080/02602938.2020.1804826
Zhang, H., Liao, A. W. X., Goh, S. H. L., Yoong, S. Q., Lim, A. X. M., & Wang, W. (2021). Effectiveness and quality of peer video feedback in health professions education: A systematic review. Nurse Education Today, 105203. https://doi.org/10.1016/j.nedt.2021.105203
Zheng, L., Cui, P., Li, X., & Huang, R. (2018). Synchronous discussion between assessors and assessees in web-based peer assessment: Impact on writing performance, feedback quality, meta-cognitive awareness and self-efficacy. Assessment & Evaluation in Higher Education, 43(3), 500–514. https://doi.org/10.1080/02602938.2017.1370533
Zhu, Q., & Carless, D. (2018). Dialogue within peer feedback processes: clarification and negotiation of meaning. Higher Education Research and Development, 37(4), 883–897. https://doi.org/10.1080/07294360.2018.1446417
Ethics declarations
Conflict of Interest: None
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Kerman, N.T., Banihashem, S.K., Karami, M. et al. Online peer feedback in higher education: A synthesis of the literature. Educ Inf Technol 29, 763–813 (2024). https://doi.org/10.1007/s10639-023-12273-8