Computers & Education 156 (2020) 103935

Design-based research on integrating learning technology tools into higher education classes to achieve active learning

Yi-Hsuan Wang
Department of Educational Technology, Tamkang University, Room ED 614, No. 151, Ying-Chuan Road, Tamsui, Taipei Hsien, Taiwan
E-mail addresses: [email protected], [email protected]

Keywords: Evaluation methodologies; Improving classroom teaching; Pedagogical issues

Abstract: This study adopted the design-based research (DBR) method to explore whether an integrated technology-enhanced learning mode could help college teachers design more interactive classes and assist undergraduate students in achieving active learning. The study lasted for 4 semesters and consisted of 3 research phases, in which the issues of how to use learning technology tools to promote active learning through course participation and peer assessment were explored. The participants of the whole study included about 458 2nd- and 3rd-year undergraduate students. After 3 iterative DBR phases, it is concluded that individual use of IRS tools plays a role in assisting students to improve their learning retention, whereas students in the cooperative IRS activities were able to produce and reach meaningful learning conclusions through course interaction. Peer assessment with guiding assessment descriptions and evaluation rubrics using the Google suite was evaluated positively by most students; it facilitated advanced-level cognitive knowledge acquisition and helped the students to produce constructive peer feedback. Future work for the next phases of the DBR study is also discussed.

1. Introduction

1.1. Motivation of the study: the teaching and learning scenarios in college

In higher education, lectures are the main approach that most teachers adopt in the classroom (Weimer, 2008). The main characteristic of lectures is that the targeted knowledge is transmitted to learners directly within the limited class time; however, this one-way instruction, which lacks interaction and student participation, is less effective and also results in silence in the classroom. Teachers might find that students tend to forget a learning concept taught just 5 min earlier because the lack of interaction makes them passive in their learning. Researchers (Abdullah, Bakar, & Mahbob, 2012; Bonwell & Eison, 1991) have indicated the importance of course participation and of being active learners. Active learning is a student-centered approach; it is iterative, dialogical and collaborative, and it needs to be intentional and well designed (European University Association, 2019). When students are engaged in the activities of reading, writing, discussing or problem solving in courses, instead of sitting in class listening to the teacher, they learn more through seeking as well as receiving information. This can help them to achieve higher-order skills, moving from the basic cognitive knowledge levels of remember and understand to the advanced levels of apply, analyze, evaluate and create (Anderson & Krathwohl, 2001).

Adopting learning technology in courses to facilitate teaching could bring about better course interaction.
https://doi.org/10.1016/j.compedu.2020.103935
Received 23 July 2019; Received in revised form 14 May 2020; Accepted 20 May 2020; Available online 29 May 2020
0360-1315/© 2020 Elsevier Ltd. All rights reserved.

Previous research has indicated that the interactive response system (IRS) is a mature tool for facilitating instruction in higher education (Han & Finkelstein, 2013; Li & Wong, 2020), and students using an IRS have been reported as becoming more motivated and participative in exchanging learning thoughts (Blasco-Arcas, Buil, Hernández-Ortega, & Sese, 2013). Meanwhile, studies have revealed that peer assessment is a common course activity that instructors often adopt to encourage students to share their learning thoughts, especially in higher education (Liu & Carless, 2006; Yung, 2012). In the activity of peer assessment, students are asked to give feedback on the work or performance of students of equal status; they can come to understand their classmates' ideas during the peer assessment process and learn the strengths and weaknesses of their own performance when giving comments to others (Van Gennip, Segers, & Tillema, 2010). At the same time, researchers (Liu & Carless, 2006) have indicated the problematic issues of reliability, perceived expertise, power relations, and the time available for doing peer assessment. It has also been noticed that anonymity in peer assessment is a factor that could be further explored (Raes, Vanderhoven, & Schellens, 2013), since some researchers have found that anonymity makes students more participative in learning and more active in providing learning feedback for peers because they do not need to reveal their identity (Brady, Seli, & Rosenthal, 2013).

1.2. Research purpose and questions

This study explores how to integrate mature and readily available learning technology tools to facilitate course interaction and to support better peer assessment activities. The purpose of the study is not only to look for significant effects or to test a single design, but to see how various learning technology tools could be integrated into educational scenarios to facilitate teaching and learning. Hence, design-based research (DBR) was conducted; this is a research method that aims to explore teaching and learning problems in real educational scenarios and to propose possible solutions through a long-term iterative experiment design, bridging the gap between theory and real educational problems (Collins, 1992; Juuti & Lavonen, 2012; Wang & Hannafin, 2005). In this study, I, as a researcher and also a course instructor, aimed to manage the research processes and to design and implement interventions to refine and improve the course in terms of both theory and practice. I also aimed to explore whether learning technology could be well integrated into a higher education scenario to help college teachers design more interactive classes, and to determine whether learning technology could enhance the teaching and learning model through rolling amendments to fulfill the needs of the higher education scenario. Further aims of the study were to verify and improve active learning theory, and to assist students in performing better in peer assessment activities. The following research questions were addressed:

1. How can DBR be conducted to help instructors achieve technology-based classroom learning?
2. What learning technology tools could be integrated into undergraduate courses to promote active learning and peer assessment activities? And how?
3. What didactic role did learning technology play in supporting teachers' instruction in higher educational settings?

1.3. Research structure and outcome

This article is structured as follows.
First, a review of the related literature and the underpinning theory and practice issues is provided. Then, the DBR processes are presented, and the iterative research design and refinements of each DBR stage are described. Lastly, a technology-enhanced learning model is proposed according to the three phases of DBR. The major outcome of this study is the identification of design principles for teachers and practitioners who aim to promote students' active learning, to run interactive undergraduate courses, and to help their students learn better with the convenient technology available in educational settings.

2. Literature review

2.1. Design-based research

Researchers (Wang & Hannafin, 2005) have defined DBR as "A systematic but flexible methodology aimed to improve educational practices through iterative analysis, design, development, and implementation, based on collaboration among researchers and practitioners in real-world settings and leading to contextually-sensitive design principles and theories" (pp. 6-7). DBR is often described as a long-term research endeavor involving iterative observation, design, implementation, and redesign to arrive at practical solutions to educational problems. It is important to invite teachers as co-investigators and to address and analyze classroom questions so as to understand the reasons for failure and success, and to take steps to refine the design in both theoretical and practical aspects after DBR (Collins, 1992). Easterday, Rees Lewis, and Gerber (2014) provide a clear definition of the process of conducting DBR, which consists of six iterative phases: focusing on the problem, understanding the problem, defining goals, conceiving the outline of a solution, building the solution, and testing the solution. The whole DBR process is not carried out in a linear sequence, but rather includes recursive and iterative nested processes, and the possible solutions resulting from the DBR processes include new technological products, educational products, technology-enhanced processes, and programs (McKenney & Reeves, 2012; Shattuck & Anderson, 2013; Majgaard, Misfeldt, & Nielsen, 2011; Weitze & Ørngreen, 2014). For example, Kim, Suh, and Song (2015) proposed a science learning curriculum through the DBR processes and provided a deeper look at learners' experience of a series of science learning activities. Barab, Squire, Wang, and Hannafin (2004) indicated that DBR can derive evidence-based claims from real learning contexts and settings through these iterative processes. Besides, a design theory can be constructed after a series of DBR processes. The first step would be identifying the relevant independent and dependent variables that might affect the

innovation. Then, in the second step, the researcher should explore and specify how the variables interact with each other to produce the success or failure of different designs (Collins, 1992). In summary, when designing DBR, researchers should first target the audience, learning domains, and contexts, and survey whether there are existing solutions (the focus and understanding phases). Then, a possible solution to the problems should be sketched out (the define and conceive phases) and all the relevant research variables identified. Lastly, the researchers should implement and evaluate the solutions (the build and test phases) to specify the results and to construct both practical and theoretical refinements.

2.2. Theory: active learning

Active learning is an instructional method which emphasizes that learners should do and think about meaningful learning activities. An active learner takes a minute to ponder learned content, shares feedback with peers or instructors, and discusses ideas from the course outside of class (Carr, Palmer, & Hagel, 2015). Active learning refers to a broad range of behaviors; in this study, it involves students participating in the learning process and thinking about what they are doing, such as asking questions, contributing to discussions and giving comments after peers' presentations. Research (Grunert, 1997) has found that students do not learn much when they are sitting in class listening to teachers; rather, when they are engaged and participate in the activities of reading, writing, discussing or problem solving, they learn more and are able to acquire higher-order learning skills. Active learning can be incorporated into the classroom in several ways, including class discussion, posing questions, and short writing exercises (Bonwell & Eison, 1991). For example, activities such as encouraging students to consolidate their notes during lectures and having brief demonstrations followed by class discussion are good methods to promote active learning. Studies (EUA coordinators, 2019) have indicated that students take more responsibility for their performance and see the course as being more valuable when they are invited to actively participate in the learning interaction.

In higher education classes, many teachers adopt lectures as their main way of transmitting knowledge, as this is convenient, cost-effective and can reach numerous listeners at once. However, researchers have also found that learners might feel bored and confused after 10-20 min of a lecture, and that subsequently little of the lecture can be recalled (Bonwell & Eison, 1991). Learners who listen to lectures are passive in acquiring knowledge; this phenomenon is particularly true for Eastern-style classrooms, where students listen to teachers and show respect for authority and seniority without raising questions (Haarms, Holtzman, Xue, & Darbyshire, 2018). In contrast, discussion is a good way to promote questioning and critical thinking skills and to achieve active learning (Bonwell & Eison, 1991). Technology can allow teachers to hold discussion activities or to facilitate immediate interaction in a big class. For example, the interactive response system (IRS) enables both teachers and students to exchange information, and a learning system that supports peer assessment activities helps classmates to express their opinions to their peers and to generate and collect feedback regarding their performance.
The following sections review the practical learning technology tools that support active learning.

2.3. Practice: IRS-enhanced course interaction

An IRS can immediately deliver learners' feedback to instructors, help teachers gain real-time perceptions of the students' understanding of the course, and facilitate students' cooperative learning (Kietzig & Orjuela-Laverde, 2015; Wang & Tahir, 2020). Research has provided evidence that the use of an IRS in higher education classes can sustain learners' learning motivation (Bruff, 2009; Wang, 2016; Hwang & Chen, 2019). Besides, some studies have proposed possible benefits of integrating various teaching and learning strategies with an IRS; for example, Jones, Antonenko, and Greenwood (2012) found that using peer instruction with IRS activities is more effective than lectures and than individual participation in the IRS activity. Kietzig and Orjuela-Laverde (2015) demonstrated that using the IRS open-ended question function fostered students' learning interaction and active participation. Daniel and Tivener (2016) indicated that group-clicker participants worked towards consensus and tended to agree with the final clicker vote. Wang (2016) suggested that the IRS with the learner-as-leader strategy benefited those who acted as leaders in taking the initiative to learn the course content, and also engaged the students who acted as learners in concentrating on the course. Hwang and Chen (2019) used an IRS with the collective issue-quest strategy to support college students' learning in a flipped classroom, and found that the learners' learning achievement and collective efficacy improved. Besides, Caldwell (2007) indicated that the anonymity characteristic of the IRS improved students' participation, and it has been suggested that the issue of how instructors can take advantage of anonymity in IRS activities could be further explored (Cain & Robinson, 2008). The above research supports the claim that using an IRS has positive learning effects in higher education, and the findings serve as references for curricular optimization and didactic adjustment in the current study.

2.4. Learning technology for peer assessment

Topping (1998, 2003) defined peer assessment as an educational arrangement that encourages students to give feedback on the work or performance of students of equal status according to certain criteria and standards. Research has shown that the peer assessment method can have positive effects on learners' learning performance (Topping, 2003), because students can understand classmates' ideas during the peer assessment process and learn about their own strengths and weaknesses from the comments on their performance (Van Gennip et al., 2010). Peer assessment is an important element of designing learning interaction because it triggers students' participation during the course; hence, it is an effective way to promote active learning, problem solving and collaborative learning (Kollar & Fischer, 2010). Various studies have reported teachers' perceptions of using peer assessment activities during teaching, such as in writing instruction (Lynch & Golen, 1992) and self-regulated learning design (Brown & Harris, 2013; Topping,

2003). Panadero and Brown (2015) explored primary, secondary, and higher education teachers' perceptions of using peer assessment during course teaching, and revealed that teachers had positive experiences of using peer assessment in class, while teachers in some studies (Panadero & Brown, 2015; Panadero, Jonsson, & Strijbos, 2016) reflected that using peer assessment in their courses required more time to prepare the course settings. Gielen and Wever (2015) indicated that using learning technology tools to support peer assessment activities might be a possible way to overcome this dilemma, since learners can do the peer assessment activity in a more flexible way with the support of the learning technology, and can also feel much more confident and motivated as a result of the online peer assessment activity. For example, Raes, Vanderhoven, and Schellens (2013) used a classroom response technology tool to achieve peer assessment anonymity in a face-to-face authentic classroom setting, and revealed that the classroom response technology supported accurate assessment, and that the anonymity in peer assessment in higher education settings made the learners feel more comfortable and positive about peer assessment. Simonsmeier, Peiffer, Flaig, and Schneider (2020) adopted a web-based feedback system in a higher education classroom for learners' academic writing, and found that structured online peer feedback was effective in promoting interaction between students and supporting learning.

2.5. Issues of peer assessment

Teachers should be aware of several issues when integrating peer assessment activities into courses, such as students being reluctant to give grades to their peers (Lynch & Golen, 1992), scoring their peers' performance unfairly (Vanderhoven, Raes, Montrieux, Rotsaert, & Schellens, 2015), and the issues of reliability, perceived expertise, power relations and time available for doing peer assessments (Liu & Carless, 2006). Of these issues, those related to anonymity during peer assessment should be carefully considered. According to the research findings (Van Gennip et al., 2010), learners give fairer peer assessments with less peer pressure in environments in which the students' real identity is not revealed, since students' peer pressure increases in face-to-face scenarios (Latané & Wolf, 1981). Vanderhoven et al. (2015) revealed that anonymous peer assessment can be achieved through using learning technology tools, but the researchers also noted that the validity of anonymous peer assessment should be considered during the activity. Vanderhoven et al. (2015) mentioned the importance of teachers controlling the peer assessment process: teacher-led peer assessment could prevent learners from giving unfairly high scores to their friends. Besides, Vanderhoven et al. (2015) suggested that, in order to prevent students from giving higher grades to their friends, the learning technology tools can be set up so that learners remain anonymous when giving feedback but can be identified by the instructor afterwards. Secondly, the issue of whether anonymity is important in higher education could be further explored.
Researchers (Liu & Hsu, 2016) have indicated that college students prefer to do peer assessment activities anonymously, but another study pointed out that anonymous assessors in peer assessment for university students are not so necessary, since university-level students were honest and accurate in their peer assessments (Panadero & Brown, 2015). Panadero et al. (2016) even indicated that if students know where their feedback came from, then more face-to-face interaction occurs between the assessed and the assessors. Thirdly, despite classroom response technology having an advantage in terms of reducing students' pressure, research has also pointed out that peer assessment which gives scores on peers' performance without providing argumentation or justification for the scores might be superficial (Hattie, 2003). Strijbos and Sluijsmans (2010) suggested that written feedback gives more constructive reflections and information than only giving scores. Besides, Gielen and Wever (2015) revealed that providing students with additional structured forms, such as a content checklist for the assessment, is a useful approach to increase the quality of peer assessment feedback. Teachers are encouraged to lead discussion of the surface content from peer feedback, and also to provide guidance and peer review rubrics to encourage learners to produce detailed explanations of their peers' performance (Wu & Schunn, 2020). Further research could implement a structured peer feedback template to elicit more constructive feedback (Gielen & Wever, 2015), and issues related to whether various kinds of feedback matter, such as peer versus teacher feedback, comparisons of learners' perceptions of feedback, and the influence of feedback collected through various procedures, such as traditional, non-traditional or anonymous contexts, could be further explored (Wu & Schunn, 2020).

Fig. 1. The iterative DBR process of this study.

3. Methodology

The DBR method was adopted to conduct the experiment, in which various research designs in different research phases were integrated into the iterative cycle. In this study, the research method and data presentation were separated into three phases, and for each phase, how the practice and theory were modified and adjusted in order to achieve better teaching results is presented. Firstly, in the preparation phase, a preliminary literature review was conducted with the purpose of identifying the teaching problems for college teachers and reviewing the related learning theory and potential learning technology tools for addressing the possible learning problems. After reviewing the literature, in Phase 1, the IRS tools were adopted for course interaction and peer assessment. Problems and issues of using the IRS to assist peer assessment were found, and hence Phase 2 focused on using the learning technology tools to support a more fluent peer assessment activity and discussed the issues of anonymity. Lastly, the researcher integrated the findings of Phases 1 and 2 into the design of Phase 3. The third stage of the study focused on how to use learning technology tools with teaching and learning strategies to achieve better course interaction and to encourage the students to offer constructive feedback during the peer assessment activity. Fig. 1 presents the sequence and research topics of each phase in the iterative DBR study.

3.1. The course, participants, research tools, experiment period, and sub-research purposes of each phase

Two courses were chosen for conducting the DBR research. The purpose of one course was to help second-year college students acquire the concepts of video production and the abilities of video editing and creation; the other was for third-year college students and introduced empirical examples of how educational institutions or enterprises adopt e-learning strategies according to their different purposes. The two chosen courses originally adopted lectures as the main instructional approach, and the students were required to form groups to hand in a project proposal as their mid-term assignment and to present their final products at the end of the semester. Besides, each student in the course was asked to perform peer assessment of both the mid-term and final assignments and to provide feedback to the reporters. The researcher used questionnaires, a self-learning diary after the classes, IRS scores, and delayed tests for collecting the research data. The whole DBR study lasted for four semesters, and the sub-research purposes of each DBR phase were defined individually. The detailed information regarding the participants and the period of each research phase is presented in Table 1.

3.2. DBR procedure

This study followed the DBR processes in each phase to define the problems, design the experiment, facilitate and evaluate the method, and specify the findings, and then to give suggestions for the next phase. The whole DBR process is not carried out in a linear sequence, but includes recursively and iteratively nested processes. In this study, the data were collected and analyzed according to the above-mentioned DBR processes, and the results of each stage are presented in the following section. The detailed information of each DBR phase is summarized in Table 5 to indicate how the study aligned with the iterative DBR processes.
Table 1
The information of DBR in each phase.
Preparing stage. Experiment period: 2015.02-2015.06. Participants/Course: second-year students. Research tools: course observation. Research variables: none.
Phase 1. Experiment period: 2015.09-2016.01. Participants/Course: third-year students / e-learning (n = 115). Research tools: questionnaires, learning diary, IRS scores. Research variables: the learning technology tools (IRS) for course interaction; group learning for course interaction. Research purpose: to understand how to use learning technology tools to promote course interaction and the peer assessment activity.
Phase 2. Experiment period: 2016.02-2016.06. Participants/Course: second-year students / video production (n = 108). Research tools: questionnaires, learning diary. Research variables: the learning technology tools (Google Sheets and Forms) for the peer assessment activity. Research purpose: to understand how to adopt learning technology tools to explore the issues of anonymous and non-anonymous peer assessment activities.
Phase 3. Experiment period: 2016.09-2017.01 and 2017.02-2017.06. Participants/Course: third-year students / e-learning (n = 120) and second-year students / video production (n = 115). Research tools: questionnaires, IRS scores, delayed tests. Research variables: various ways (group and individual) of participating in course interaction; re-design of the rubrics in the peer assessment forms. Research purpose: to understand how to integrate learning and teaching strategies into learning technology tools to elicit better course interaction and constructive feedback during the activity.

3.3. The learning technology tools used in the study: IRS tools and the Google educational service

The study adopted several learning technology tools, according to the various research purposes, to facilitate teaching and learning in the different DBR phases. In Phase 1, two kinds of IRS tools were chosen to run the course interaction. For the multiple-choice question

interaction, the IRS tool developed by the Kahoot! AS team (Johan Brand, Jamie Brooker, and Morten Versvik) from the Norwegian University of Science and Technology was used (Fig. 2-b); meanwhile, for the open-ended question interaction, Menti, developed by the Mentimeter team from a Swedish company, was adopted (Fig. 2-c). These two IRS tools were used to collect students' qualitative and quantitative ideas in the classes. The process of using the IRS started with the instructor running the activity by projecting questions onto the screen; the students then used a PIN code (provided by the teacher) to access the website (Fig. 2-a) and share their thoughts regarding the course issues. The IRS would immediately present statistical results of the students' answers, or organized textual paragraphs, on the screen. In Phases 2 and 3, the Google suite for education (Google for Education), an e-platform developed by Google, was adopted for hosting the peer assessment activity. The Google suite for education is a service that provides several Google products to educators, with which teachers with appropriate instructional design can create learning activities to engage students whenever and wherever they want and on any device. The products adopted in the study were Google Forms (Fig. 2-d) and Google Sheets (Fig. 2-e), with which the learners can immediately and synchronously create, co-author, and peer-edit textual comments through web applications. Besides, the main reason for adopting the above learning technology tools was that no installations or downloads were required, and the tools were free. The instructional design strategy adopted these tools for course assessment: the IRS tools were used for evaluating whether the learners were learning during the course lectures, and the Google suite for education was set up for various kinds of peer assessment activity, such as immediate or anonymous peer assessment. Moreover, the tools were also used to assist the teacher and students in creating an interactive atmosphere in class.

3.4. Data collection

In this study, IRS scores, delayed tests, questionnaires, students' weekly learning diaries, and course observations were used to collect both qualitative and quantitative data. The items in the IRS were designed by the course instructor to gauge the learners' acquisition of the targeted knowledge. There were seven to nine multiple-choice question items in one IRS activity, and the full score of each test was 100. The delayed tests were conducted in Phase 3; their questions were the same as the IRS questions, but the sequence was changed, and the delayed tests were conducted two weeks after the IRS activity. The items in the questionnaires included multiple-choice questions and open-ended questions, designed according to the research purpose. There were seven to nine multiple-choice question items in one questionnaire, based on a 5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, strongly disagree), to collect learners' feedback on using the learning technology tools in the class. Meanwhile, the open-ended questions asked the learners to share their feelings, perceived advantages, and suggestions regarding the IRS and peer assessment activity, as a way of eliciting qualitative feedback that could not be generated from the quantitative questionnaires.
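As a concrete illustration of the quantitative analysis just described, the short sketch below computes the per-item average and sample standard deviation of the kind reported in the questionnaire tables (e.g., Tables 2-a and 2-c). It is a minimal example only, not the study's actual analysis script (the study used SPSS); the item labels and response values are hypothetical placeholders.

```python
# Minimal sketch of the descriptive statistics reported for the 5-point Likert items
# (Avg. and S.D. per questionnaire item). Hypothetical data; the study itself used SPSS.
from statistics import mean, stdev

# Each key is a questionnaire item; each list holds 5-point Likert responses
# (5 = strongly agree ... 1 = strongly disagree) from the class.
responses = {
    "Q1 Good learning interaction with the teacher (LL)": [5, 4, 5, 4, 5, 4, 4, 5],
    "Q2 Engaged in the course through the IRS activity (CP)": [4, 4, 5, 4, 3, 5, 4, 4],
}

for item, scores in responses.items():
    # stdev() gives the sample standard deviation, matching the S.D. columns in the tables.
    print(f"{item}: Avg. = {mean(scores):.2f}, S.D. = {stdev(scores):.3f}")
```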
During the course, the learners were also asked to write a weekly learning diary recording their learning performance and learning reflections. The requirement for the learning diary was a short paragraph (from 50 to 100 words) about their weekly learning experience and feedback. The students were encouraged to use this diary to communicate with the instructor, and the instructor also referred to the learners' feedback from the diary to adjust the course schedule and progress.

The data from the questionnaires and weekly learning diaries were analyzed according to the research questions. SPSS for Windows was used as the main software for statistical analysis.

Fig. 2. The learning technology tools adopted in the study.

For the qualitative data from the weekly learning diaries and the open-ended questions in the questionnaires, each participant was given a code; for example, E-1-23451 represents the data from learner 23451 of Phase 1 in the e-learning course, and P-3-24515 represents the data from learner 24515 of Phase 3 in the video production course. The weekly learning diaries and the responses to the open-ended questions were translated into raw data for each participant by the instructor, and the raw data were then re-coded into different themes to understand the learners' perceptions of the use of the IRS and the peer-assessment activity. The themes were designed according to the research questions and included the students' course feedback, IRS activity feedback, and cooperative learning reflections on the IRS activities. The coding categories included general feedback on the course, advantages or inconveniences of using the tools for learning, and suggestions for improving the course. Each coding theme aimed to answer the research questions, and the final qualitative data were organized and displayed as reduced data from which the findings for each question could be highlighted.

4. Results: Phase 1

4.1. Defining and designing

Before Phase 1, a course observation was conducted. It was found that lecturing was the main teaching method, and there was not much interaction between the teacher and students or among the students; thus, some students showed little interest in learning. The e-learning platform was the only learning technology tool adopted in the class, and it was used for handing in homework assignments and sharing course materials. After analyzing the above course situation, a review of the related literature was performed, and it was found that the use of IRS tools can have positive effects in assisting learning interaction (Blasco-Arcas, Buil, Hernández-Ortega, & Sese, 2013). Hence, in Phase 1, it was decided to adopt a learning technology tool, namely the IRS, to facilitate and promote course interaction, and also to examine whether integrating learning technology could promote students' motivation to participate in the course activity and achieve active learning. Two experiments were implemented in Phase 1. In experiment design 1-1, IRS tools were adopted to facilitate course interaction, and in experiment design 1-2, the IRS tools were further used for conducting peer assessment activities.

4.2. Facilitating and evaluating

In experiment design 1-1, the instructor ran the class through both lectures and IRS interaction. The course proceeded as follows: first, the teacher spent 30-40 min introducing important concepts via a lecture, and an IRS activity was held after each topic. The IRS activity included multiple-choice and open-ended question interaction (Fig. 3). The purpose of the multiple-choice question interaction was to help the teacher understand how well the students had acquired the target knowledge, while the purpose of the open-ended question interaction was to encourage them to share their ideas and opinions on the questions raised by the instructor. For example, a multiple-choice question might ask learners to choose, from four candidate options, the item that was not included in the ADDIE (analysis, design, development, implementation, evaluation) process. An open-ended question might ask students to give an example of how a company implemented the ADDIE mode to develop e-learning materials, or to explain the bottom-up and top-down e-learning modes.
The students were asked to type a paragraph as their response using the IRS interface; after they submitted their answers, the teacher and peers could read each student's feedback synchronously and a follow-up discussion could be conducted. To evaluate the research results of Phase 1, the students' feedback on participating in the IRS activity was collected through questionnaires and their weekly learning diaries. According to the data analysis, the students gave quite positive feedback regarding participation in the IRS activity in class, and they indicated that the use of IRS tools improved their engagement in the course (Table 2-a, Q2, Avg = 4.39, S.D. = 0.61) and their interaction with the teacher (Table 2-a, Q1, Avg = 4.57, S.D. = 0.55). Their comments included, "I enjoyed sharing my ideas with my classmates" (E-1-30435), and "I read and learned from others' comments and checked what difference there was between mine and others'" (E-1-30195). Besides, most students (about 87%) expressed satisfaction with learning with the IRS tools, and some of the positive feedback collected is shown in Table 2-b.

In experiment design 1-2, the teacher facilitated peer assessment activities by integrating an IRS tool (Kahoot!).

Fig. 3. Pictures of integrating IRS tools into the course.

Table 2a
The questionnaire results of experiment 1-1 (n = 115).
1. I have good learning interaction with the teacher through participating in the IRS activity. (LL)  Avg. 4.57, S.D. 0.548
2. I am engaged in the course through participating in the IRS activity. (CP)  Avg. 4.39, S.D. 0.610
3. The use of the IRS promotes the efficiency of course discussion. (CP)  Avg. 4.01, S.D. 0.786
4. The use of the IRS promotes my willingness to participate in course discussion. (FIRS)  Avg. 3.96, S.D. 0.751
5. In order to perform well in the IRS activity, I concentrate on the course. (FIRS)  Avg. 4.16, S.D. 0.650
*5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, and strongly disagree).
*Categories of the question items: Learning interaction (LL), Course participation (CP), Feedback on IRS (FIRS).

In the peer assessment activity, the students were separated into groups and each group had to make a presentation about their project; the students were then asked to immediately evaluate their peers' performance according to the evaluation criteria set up by the instructor. Each group was able to see its total score after its presentation. The researcher evaluated the experiment design through questionnaires and the students' self-learning diaries, and found that most students agreed that the IRS was a convenient way to give and receive peer feedback immediately after the presentation, and that they liked using the IRS tool to conduct peer assessment during the presentations (Table 2-c, Q2, Avg = 4.17, S.D. = 0.903). The quantitative findings were supported by qualitative data from the learners' self-learning diaries, such as "Using Kahoot! to do the evaluation was convenient and quick; I liked it" (E-1-30898) and "It's good. I listened to my peers' reports and gave grades directly. I was more concentrated on listening to the reports" (E-1-30153). However, it was also noticed that the IRS tool used showed only quantitative results (numbers) for the peer evaluation (students received scores from their peers but did not know the reasons for the scoring); some students wanted more detailed feedback on their performance, and some doubted the feedback since there was no way to confirm where it came from ("I don't know who gave me the grades, and I want to ask someone for advice to know how I can improve myself" (E-1-30344)) (Table 2-d). The researcher cross-analyzed the data from the learners' questionnaires and learning diaries of experiment designs 1-1 and 1-2, and then organized them into three categories, course participation (CP), learning interaction (LL) and feedback on the IRS tool (FIRS), in Table 2-e.

4.3. Specifying and suggesting

After Phase 1, it was concluded that integrating IRS tools into the course promotes interaction between the instructor and the students, and that the students also showed better motivation to participate in the course. Besides, the IRS was suitable and convenient for the peer assessment activity; however, it was noticed that the design of the peer assessment activity could be further improved. Hence, according to the findings of Phase 1, it was decided to focus on the mechanism of using learning technology tools to support the peer assessment activity, and also to examine whether anonymous peer assessment would make students more willing to participate in the activity so as to promote active learning.

5. Results: Phase 2
5.1. Defining and designing

After conducting Phase 1 and identifying possible learning issues, Phase 2 aimed to integrate the learning technology tools from the Google suite for education into the peer assessment activity to achieve a more fluent peer assessment process. Besides, the issue of anonymity in peer assessment was also explored. Two experiment designs were conducted in Phase 2. In experiment design 2-1, a non-anonymous strategy was integrated into the peer assessment, while in experiment design 2-2, an anonymous strategy was adopted.

Table 2b
The qualitative feedback of experiment 1-1.
Course participation (CP)
• I was more concentrated on the course. E-1-30039
• It promoted my course participation and it was really interesting. E-1-30401
• It promoted my course participation. E-1-30960
• It was exciting and I was engaged in the course. E-1-31003
Learning interaction (LL)
• The way of course interaction was really good and convenient. E-1-30997
• It was great to share ideas with my classmates. E-1-30435
• I read and learned from others' comments and checked what difference there was between mine and others'. E-1-30195
• I enjoyed sharing my ideas with my classmates. E-1-30435
Feedback on IRS (FIRS)
• I liked the tool because I was able to share my ideas with classmates immediately. E-1-30260

Table 2c
The questionnaire results of experiment 1-2 (n = 115).
1. I like to use paper-based sheets to conduct the peer assessment evaluation. (FIRS)  Avg. 3.44, S.D. 0.836
2. I like to use the IRS (Kahoot!) to conduct the peer assessment evaluation. (FIRS)  Avg. 4.17, S.D. 0.903
3. I think it is good and fair to use the IRS tool (Kahoot!) to evaluate peers' performance. (FIRS)  Avg. 3.55, S.D. 0.882
4. I concentrated on peers' presentations using the IRS tool. (CP)  Avg. 3.94, S.D. 0.656
5. I think there is a need to use the IRS (Kahoot!) to conduct the peer assessment evaluation. (FIRS)  Avg. 3.31, S.D. 0.788
*5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, and strongly disagree).
*Categories of the question items: Learning interaction (LL), Course participation (CP), Feedback on IRS (FIRS).

Table 2d
The qualitative feedback of experiment 1-2.
Course participation (CP)
• It's good. I listened to my peers' reports and gave grades directly. I was more concentrated on listening to the reports. E-1-30153
Learning interaction (LL)
• I liked it because it was immediate, and I wanted to know who gave me these scores but without showing my identification. E-1-30484
Feedback on IRS (FIRS)
• Using Kahoot! to do the evaluation was convenient and quick; I liked it. E-1-30898
• I don't know who gave me the grades, and I want to ask someone for advice to know how I can improve myself. E-1-30344

5.2. Facilitating and evaluating

Research (Panadero et al., 2016) has indicated that more face-to-face interaction occurs between feedback assessees and assessors if the students know where their feedback comes from. Hence, in experiment 2-1, the peer assessment activity was conducted via Google Sheets, and the fields in the sheets included the names of the persons giving and receiving the feedback, the rating levels, and qualitative comments (Fig. 4-a). The peer assessment proceeded as follows: first, the teacher drew lots to determine the order of presentation, and the groups made their project presentations following that sequence. Then, each student gave feedback to the reporting groups using the Google Sheets (a web-based sheet), in which they were encouraged to give scores and comments on their peers' learning performance, and each student could read their peers' feedback (non-anonymous) immediately. Experiment design 2-1 was evaluated through the students' weekly learning diaries. It was found that most students agreed that using Google Sheets was a good way to give and receive peer feedback immediately after the presentation; however, many students mentioned that they would prefer to do the peer evaluation anonymously, since they felt pressure about giving negative comments to their good friends or dared not express their true feelings. The feedback they gave is summarized in Table 3-a.

Following the findings of experiment 2-1, the instructor redesigned the process of the peer assessment activity and chose another Google educational service tool, Google Forms, to support the peer assessment activity. In experiment 2-1, the forms for the peer assessment activity were presented in one web-based sheet, in which all the students could co-edit the sheet and write comments simultaneously (Fig. 4-a); in experiment 2-2, using Google Forms, the forms contained multiple-choice questions and open-ended questions.
The students accessed the Google form through a web link, typed their personal information first, and then evaluated their peers' performance by choosing a rating level through the multiple-choice questions and adding qualitative comments. After filling in the form for the whole class, the students submitted it; the instructor then obtained the peer assessment results of all the students by name, and could show the evaluation results to the whole class without displaying the evaluators' names (Fig. 4-b). The immediate feedback files were bundled and delivered to the assessed team if needed (Fig. 4-c). Experiment design 2-2 was evaluated using questionnaires, and it was found that the students were satisfied with the anonymous evaluation, and that summarizing peers' feedback after the activity was helpful for self-reflection (Table 3-b). Besides, it was noticed that most students reflected that the comments from the course instructor after the performance were very helpful (Table 3-b, item Q5), but the average score for the item asking whether their peers' comments were helpful was a little lower (Table 3-b, item Q4), which might indicate that the learners thought the comments from peers were less useful for improving their work.

The data from the quantitative feedback of the questionnaires and the peer assessment forms were therefore cross-analyzed, and it was found that students liked to give comments anonymously since they were able to share positive and negative feedback bravely. Besides, the mechanism of half-anonymity, that is, anonymous to peers while the teacher could know who gave the comments, was effective, although it was found that most of the comments on the peer assessment forms were superficial, such as "the work was very good," "the work was creative," "excellent work," or "the team members were well prepared." These findings echoed the reflections in the qualitative data, in which some learners revealed that the comments from their peers were not very serious, for example, "I read the feedback and some comments were made in a joking manner. It was a pity that not everyone gave thoughtful peer feedback" (P-2-31225), and hence the comments were not helpful as a reference (Table 3-c). After Phase 2, the learners also gave their feedback on the various peer assessment approaches, and it showed that the anonymous approaches using the IRS (Kahoot!) and the Google form were the most welcomed by the students, while the non-anonymous approach and traditional paper-based peer assessment were less welcomed. The researcher cross-analyzed the data from the learners' questionnaires and learning diaries of experiment designs 2-1 and 2-2, and then organized them into two categories, Feedback on anonymity (FA) and Peer assessment activity (PAA), in Table 3-d.
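The half-anonymous mechanism described above can be pictured with a short sketch: assessor names stay on the instructor's copy, while the bundle delivered to each assessed team has the names stripped. This is an illustrative outline only, operating on hypothetical exported form records; it is not Google Forms or Sheets API code, and the field names are invented for the example.

```python
# Illustrative sketch of the half-anonymity mechanism: peers see anonymous feedback,
# while the instructor keeps a named copy. Records stand in for exported form rows;
# field names and data are hypothetical, and no Google API is used here.
from collections import defaultdict

form_rows = [
    {"assessor": "Student A", "assessed_group": "Group 1", "rating": 4, "comment": "Clear structure."},
    {"assessor": "Student B", "assessed_group": "Group 1", "rating": 5, "comment": "Creative examples."},
    {"assessor": "Student A", "assessed_group": "Group 2", "rating": 3, "comment": "Add more detail."},
]

# Instructor's copy keeps assessor names, so careless or unfair grading can be traced afterwards.
instructor_copy = list(form_rows)

# Bundle delivered to each assessed team: assessor names stripped before delivery.
bundles = defaultdict(list)
for row in form_rows:
    bundles[row["assessed_group"]].append({"rating": row["rating"], "comment": row["comment"]})

for group, feedback in bundles.items():
    avg_rating = sum(f["rating"] for f in feedback) / len(feedback)
    print(group, f"average rating = {avg_rating:.2f}", [f["comment"] for f in feedback])
```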

Table 2e
Data summary of Phase 1.
Course participation (CP)
Qualitative findings: The learners' course participation was promoted through the IRS and peer assessment activity.
Quantitative findings: The average score of whether learners had good course participation was 4.11 ((Table 2-a Q2: 4.39 + Table 2-a Q3: 4.01 + Table 2-c Q4: 3.94) / 3 = 4.11).
Learning interaction (LL)
Qualitative findings: The learners shared ideas with classmates, and the immediacy of the IRS facilitated the interaction for peer assessment.
Quantitative findings: The average score of whether learners had good learning interaction was 4.26 ((Table 2-a Q1: 4.57 + Table 2-c Q4: 3.94) / 2 = 4.26).
Feedback on IRS (FIRS)
Qualitative findings: The tools were welcomed by the learners for immediate interaction, but the information provided by the current IRS was not enough.
Quantitative findings: The average score of learners' feedback on using the IRS tool in the class was 3.75 ((Table 2-a Q4: 3.96 + Table 2-a Q5: 4.16 + Table 2-c Q3: 3.55 + Table 2-c Q5: 3.31) / 4 = 3.75).
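The category scores in Table 2-e are simple cross-item averages of the item means already reported in Tables 2-a and 2-c; the few lines below reproduce that arithmetic as a check.

```python
# Reproducing the cross-item averaging behind Table 2-e from the item means in Tables 2-a and 2-c.
categories = {
    "Course participation (CP)": [4.39, 4.01, 3.94],       # Table 2-a Q2, Q3; Table 2-c Q4
    "Learning interaction (LL)": [4.57, 3.94],              # Table 2-a Q1; Table 2-c Q4
    "Feedback on IRS (FIRS)":    [3.96, 4.16, 3.55, 3.31],  # Table 2-a Q4, Q5; Table 2-c Q3, Q5
}

for name, item_means in categories.items():
    # Rounded to two decimals these correspond to the 4.11, 4.26 and 3.75 summarized in Table 2-e.
    print(f"{name}: {sum(item_means) / len(item_means):.3f}")
```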

Fig. 4. The interface of integrating the Google suite for education tools for students.

Table 3a
The qualitative feedback of experiment 2-1.
Feedback on IRS (FIRS)
• It is efficient to use the Google tool for peer assessment. P-2-6026
• I think using the Google tool for peer assessment was great, but some classmates did not give feedback regarding the performance. P-2-31506
Feedback on anonymity (FA)
• There were pros and cons to doing the peer evaluation with personal information; however, as a feedback provider, I did not dare to type my real comments, but if I were a feedback receiver, I would know who gave me the comments. P-2-30260
• I didn't want to share my true thoughts if the forms were not anonymous, and it gave me more space to express my true feelings if it was anonymous. P-2-30930
• I think it is stressful to give feedback that is not anonymous, and some did not take the peer assessment seriously since the comments were anonymous. P-2-35029
• I think it is objective to give comments anonymously. Non-anonymous peer assessment would break our friendship. P-2-30459
• I think giving feedback non-anonymously creates stress, and hence we don't dare to write down our real feelings. I think giving feedback anonymously would be better for helping us get true comments. P-2-30103

Table 3b
The questionnaire results of experiment 2-2 (n = 108).
1. I like to use the Google tool for anonymous peer assessment evaluation. (FA)  Avg. 4.12, S.D. 0.758
2. I can give my comments objectively no matter whether in a non-anonymous or anonymous way. (FA)  Avg. 3.67, S.D. 1.03
3. I can give my comments objectively in an anonymous way. (FA)  Avg. 3.98, S.D. 1.00
4. The comments from peers were useful for improving my work. (PAA)  Avg. 3.82, S.D. 0.797
5. The comments from teachers were helpful for improving my work. (PAA)  Avg. 4.36, S.D. 0.768
6. The peers did not give comments seriously because of the anonymous way. (FA)  Avg. 2.67, S.D. 0.907
7. It was good for self-reflection to summarize the comments from the peer assessment activity. (PAA)  Avg. 4.05, S.D. 0.806
*5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, and strongly disagree).
*Categories of the question items: Feedback on anonymity (FA), Peer assessment activity (PAA).

5.3. Specifying and suggesting

After Phase 2, the redesigned peer assessment process and the use of various learning technology tools resolved the anonymity issues of peer assessment. It was concluded that using anonymous strategies in the peer assessment activity, such as via the Google form tools, was evaluated positively by most students. The peer assessment activity gave students opportunities to reflect through giving and receiving comments from peers, which in some ways provided students with chances to achieve self-reflection. However, the issue of how to design the tables of the Google forms to elicit more constructive feedback needed further consideration.

5.4. Defining and designing

After conducting the study in Phases 1 and 2, it could be confirmed from the students' self-evaluated opinions that adopting the IRS and Google suite for education service tools in the course promoted their learning interaction and benefited the peer assessment activity. In Phase 3, the aim was to further add various teaching and learning strategies for using the learning technology tools to encourage advanced learning interaction and to assist students in giving thoughtful peer feedback. Hence, two learning experiments were conducted in Phase 3. In experiment design 3-1, two modes, group use and individual use, were adopted for integrating the IRS into course interaction. In experiment design 3-2, the instructor improved the design of the tables in the peer assessment forms: students were not only asked to give scores and comments, but the teacher also included a detailed evaluation rubric and categorization in the forms to encourage students to write focused feedback by narrowing down their assessment of their peers' performance according to the evaluation rubric.

5.5. Facilitating and evaluating

In Phase 3, the course was conducted in the same way as in Phases 1 and 2, in which the instructor adopted IRS tools for class interaction and the anonymous strategy in the peer assessment activity. Since several studies have found that group use of the IRS benefits learning (Daniel & Tivener, 2016; Wang, 2017), in experiment 3-1 two modes of IRS interaction were integrated into the course. In the group IRS mode, the students participated in the IRS activity by sharing a single IRS account within a group and were encouraged to discuss the answers with group members (Fig. 5), while in the individual IRS mode, the students were asked to use their own accounts in the IRS activity and to make their own decisions regarding the answers. The researcher used the scores of the IRS activities, delayed tests and questionnaires for the data analysis. The question items in the IRS activities and delayed tests were the same and were chosen based on the content of the lecture course, but the sequence of the questions was changed. The delayed tests were conducted 2 weeks after the IRS activities, and four scores were collected, namely IRS activity 1, IRS activity 2, Delayed test 1 and Delayed test 2. According to the results of the paired samples t-test on the four scores (Table 4-a), the learners who used the group IRS mode did not show a significant difference between the IRS activities and the delayed tests, while the learners with individual use of the IRS showed a significant improvement from IRS activity 1 to delayed test 1 (t = -2.398, p = .020).
This indicates that learners may have better learning retention of the learning content with the individual IRS mode (Wang, 2018). Besides, the learners in the individual mode had higher agreement that the IRS activity helped with learning acquisition (Table 4-b, Q5). It was also noticed that the students had a preference for the shared-clicker activities (Table 4-b, Q1): they were not just sitting in the classroom, but had to communicate based on what they had just learned and to respond immediately to their group members in order to produce the answers and reach meaningful learning conclusions during the cooperative IRS activities.

In experiment 3-2, the teacher redesigned the fields in the Google forms to include three evaluation rubrics, the richness of the content, the content design, and the quality of the presentation, and the students then had to give meaningful feedback accordingly in the peer assessment activity. Besides, in this phase the instructor also encouraged the learners to summarize the feedback from their peers and to respond to the feedback in order to improve their own work; this was an optional exercise.

Table 3c
The qualitative feedback on experiment 2-2.
Feedback on anonymity (FA)
• Adopting anonymous evaluation made us speak out our thoughts bravely and helped us to generate more ideas; then we can make improvements accordingly. P-2-31779
• It was great that the comments were presented anonymously for us, but the teacher could know who gave the comments. P-2-31688
Peer assessment activity (PAA)
• It was great to share my real thoughts about the performance with my peers. P-2-31910
• It was great that the comments were presented anonymously for us, but the teacher could know who gave the comments. P-2-31688
• I read the feedback and some comments were made in a joking manner. It was a pity that not everyone gave thoughtful peer feedback. P-2-31225
• I liked to read constructive feedback from my peers; however, most of the feedback was not useful. P-2-31029
• I think the way of doing peer assessment was good, because I could know their thoughts and improve the work accordingly, but some comments were just superficial. P-2-1415
• Some comments were not professional. P-2-31464
• I learned from peers' comments, but some comments were not worth referencing. P-2-37056
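To make the redesigned form of experiment 3-2 concrete, the sketch below shows one possible way to represent a rubric-guided peer-assessment entry, using the three rubrics named above and the per-rubric sub-scores a student later suggested. The class and field names are hypothetical and only illustrate the structure; they do not describe the actual Google form.

```python
# Hypothetical sketch of a rubric-guided peer-assessment entry, reflecting the three
# rubrics of experiment 3-2 plus the per-rubric sub-scores suggested by a student.
# Class and field names are invented for illustration; this is not the actual form.
from dataclasses import dataclass, field

RUBRICS = ("richness of the content", "content design", "quality of presentation")

@dataclass
class PeerFeedback:
    assessor: str                                   # visible to the instructor only (half-anonymity)
    assessed_group: str
    sub_scores: dict = field(default_factory=dict)  # rubric -> 1..5 sub-score
    comments: dict = field(default_factory=dict)    # rubric -> focused written comment

    def overall(self) -> float:
        """Average of the rubric sub-scores."""
        return sum(self.sub_scores.values()) / len(self.sub_scores)

entry = PeerFeedback(
    assessor="Student C",
    assessed_group="Group 4",
    sub_scores=dict(zip(RUBRICS, (4, 5, 3))),
    comments={"quality of presentation": "Slides were dense; slow the pace in the demo section."},
)
print(entry.assessed_group, round(entry.overall(), 2))
```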

Table 3d
Data summary of Phase 2.
Feedback on anonymity (FA)
Qualitative findings: The learners preferred to do the peer evaluation anonymously to express their true feelings about peers' work.
Quantitative findings: The students liked to do peer assessment anonymously, and the average score was 4.12 (Table 3-b Q1: 4.12).
Peer assessment activity (PAA)
Qualitative findings: The students reflected that some comments on the peer assessment forms were superficial.
Quantitative findings: The students revealed that summarizing peers' comments was useful, and the average score was 4.05 (Table 3-b Q7: 4.05). The average score for whether peers' comments were helpful was 3.82 (Table 3-b Q4: 3.82).

Fig. 5. The students discussing the IRS and peer assessment activity in the course.

Table 4a
Paired samples t-test on learning performance scores (Wang, 2018).
(a) Group mode (n = 67):
IRS activity 1: Mean 70.4981, S.D. 12.234; Delayed test 1: Mean 72.4138, S.D. 18.1185; t = -0.688, p = .494
IRS activity 2: Mean 83.8095, S.D. 13.7845; Delayed test 2: Mean 85.7500, S.D. 15.1496; t = -0.860, p = .393
(b) Individual mode (n = 53):
IRS activity 1: Mean 64.8526, S.D. 16.09305; Delayed test 1: Mean 72.6531, S.D. 15.91324; t = -2.398, p = .020*
IRS activity 2: Mean 86.8217, S.D. 12.36469; Delayed test 2: Mean 90.3488, S.D. 10.76786; t = -1.374, p = .177

Table 4b
The questionnaire results of experiment 3-1 (Wang, 2018).
1. I like the way of IRS participation. (SIRS)  Group (n = 67): Avg. 4.44, S.D. 0.616; Individual (n = 53): Avg. 3.98, S.D. 0.753
3. The IRS activities promoted interaction between peers. (SIRS)  Group: Avg. 4.41, S.D. 0.586; Individual: Avg. 3.82, S.D. 0.806
5. The participation in the IRS activities helped me understand the learning content. (SIRS)  Group: Avg. 4.11, S.D. 0.785; Individual: Avg. 4.07, S.D. 0.580
7. I felt stressed about participating in the IRS activities. (SIRS)  Group: Avg. 2.54, S.D. 0.997; Individual: Avg. 2.73, S.D. 0.915
9. The IRS activities gave me a strong impression of the learning questions. (SIRS)  Group: Avg. 3.44, S.D. 1.01; Individual: Avg. 4.18, S.D. 0.576
*5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, and strongly disagree).
*Categories of the question items: Strategy with IRS (SIRS).

Experiment 3-2 was evaluated through questionnaires, and it was found that the average scores of the items regarding whether the peer comments were helpful had improved (Phase 3: Table 4-c Q1 to Q5; Phase 2: Table 3-b Q2 to Q6). Although the differences were not significant, this might still indicate that adding detailed evaluation rubrics as scaffolding to guide the peer evaluations was helpful in eliciting students' detailed comments. Some comments from the students echoed these findings, revealing that they learned from peers' feedback; however, it was still noticed that some students did not think their classmates paid much attention to the peer assessment work. One student suggested that "The forms could be improved through adding detailed scoring items to let classmates give sub-scores for each evaluation rubric" (P-3-30343). Furthermore, it was quite encouraging to find that most of the learners agreed that summarizing peers' feedback as a reference for modification was very helpful (Table 4-c Q6, Avg = 4.06, S.D. = 0.64). The researcher cross-analyzed the data from the learners' questionnaires and learning diaries of experiment designs 3-1 and 3-2, and then organized them into two categories, Strategy with IRS (SIRS) and Peer assessment activity (PAA), in Tables 4-d and 4-e.
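The paired comparison reported in Table 4-a (each learner's IRS activity score against the same learner's delayed test score two weeks later) can be sketched as follows. The study ran this analysis in SPSS; the SciPy call below is only an illustration, and the score lists are hypothetical placeholders.

```python
# Sketch of the paired-samples t-test reported in Table 4-a, using SciPy instead of the
# SPSS analysis used in the study. The score lists are hypothetical placeholders for the
# same learners' IRS activity 1 scores and their delayed test 1 scores two weeks later.
from scipy import stats

irs_activity_1 = [65, 70, 55, 80, 60, 75, 50, 68, 72, 58]
delayed_test_1 = [72, 74, 60, 85, 70, 78, 62, 70, 80, 66]

t_stat, p_value = stats.ttest_rel(irs_activity_1, delayed_test_1)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A negative t with p < .05 would mirror the individual-mode result in Table 4-a
# (t = -2.398, p = .020), i.e. delayed-test scores significantly higher than the IRS scores.
```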

Table 4c
The questionnaire results of experiment 3-2 (Phase 3, experiment 3-2, n = 115; Phase 2, experiment 2-2, n = 108).
1. I can give my comments objectively no matter whether in a non-anonymous or anonymous way. (FA) Phase 3: Avg. 3.98, S.D. 1.047; Phase 2: Avg. 3.67, S.D. 1.03; t = 1.73, p = .08
2. I can give my comments objectively in an anonymous way. (FA) Phase 3: Avg. 4.08, S.D. 1.089; Phase 2: Avg. 3.98, S.D. 1.00; t = .54, p = .58
3. The comments from peers were useful for improving my work. (FA) Phase 3: Avg. 3.94, S.D. .639; Phase 2: Avg. 3.82, S.D. .797; t = .91, p = .36
4. The comments from teachers were helpful for improving my work. (PAA) Phase 3: Avg. 4.35, S.D. .653; Phase 2: Avg. 4.36, S.D. .768; t = -.13, p = .89
5. The peers did not give comments seriously because of the anonymous way. (FA) Phase 3: Avg. 2.85, S.D. .872; Phase 2: Avg. 2.67, S.D. .907; t = 1.13, p = .26
6. It was good for self-reflection to summarize the comments from the peer assessment activity. (PAA) Phase 3: Avg. 4.06, S.D. .639; Phase 2: Avg. 4.05, S.D. .806; t = .08, p = .93
*5-point Likert scale (from 5 to 1: strongly agree, agree, neutral, disagree, and strongly disagree).
*Categories of the question items: Feedback of anonymity (FA), Peer assessment activity (PAA).

5.6. Specifying and suggesting

After Phase 3, the researcher found that students preferred to experience the IRS activity in groups because they enjoyed the group discussion and learned from peers' opinions while finding the correct answers. They acted as active learners in thinking about the learning content, finding answers to questions, and interacting well during the IRS activity. The results also confirmed the possible benefits of using the redesigned forms to elicit students' meaningful feedback in the peer assessments; however, despite the improvement in the overall average questionnaire scores, it was found that the current design of the peer assessment mechanism is still not sufficient to initiate students' deep reflection. It is therefore suggested that the next phase of the DBR study should give students a chance to blind-review their comments a second time to help them refine their peer feedback, keeping the suitable comments and improving the more generic ones so that the presenters can enhance their performance accordingly. In addition, in the next phase the instructor could ask the learners to summarize the feedback from peers and to provide their reflections on the received feedback in order to improve their own work. Moreover, students could be assessed on the quality of the feedback they give to their peers as a component of their course grade, to encourage all students to be more thoughtful about the feedback process.

6. Brief summary of the results

After the three phases of data analysis, the processes and results of each phase are outlined according to the DBR structure in Table 5. Moreover, an overview of the focus of each iterative process over the four semesters, from the perspectives of educational practice (learning technology tools) and educational theory, is summarized in Fig. 6.

7. Discussion

To answer the first research question, the DBR process was conducted in three phases to explore how learning technology tools can be used to formulate good higher education settings. The data analysis, the research design and re-design, and how the learning technology tools can promote active learning in terms of course participation and peer assessment were explored through the iterative DBR phases. To answer the second research question, it is suggested that free learning technology tools such as Kahoot!
and Menti were useful learning technologies that could be adopted to arouse course interaction, participation and learning retention. Teachers could use these tools alternately to host immediate question-and-answer interaction in their classes. A peer assessment activity could then be conducted to encourage students to read and learn from peers' work by structuring their thoughts while giving comments. Learning technology tools such as Google Sheets and Google Forms could be applied in courses to conduct immediate anonymous and non-anonymous peer assessment. The results of the study also indicated that the non-anonymous design of the peer assessment activity made some students reluctant to reveal their real comments on their peers' performance; most students were more accepting of the anonymous peer assessment.

Table 4d
The qualitative feedback of experiment 3-2.
Feedback on anonymity (FA)
• I learned a lot from my classmates' feedback. Because of the anonymity, the classmates gave us very true comments directly. P-3-01536
Peer assessment activity (PAA)
• I think it was really a good way to conduct peer assessment. P-3-01536
• We modified the work according to our peers' comments. I think it was helpful and I'd like to know my classmates' opinions about our work. P-3-31860
• Some classmates were very serious about the peer assessment work, but some were not. P-3-31845
• The teacher's feedback was helpful for us, but I didn't think the feedback from the classmates was helpful. P-3-31829
• The forms could be improved through adding detailed scoring items to let classmates giving sub-scores for each evaluation rubric. P-3-30343
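One practical detail behind the arrangement in which comments are anonymous to peers but known to the teacher is simply to keep the full Google Form export on the instructor's side and share a copy with identifying columns removed. The sketch below is an assumed workflow rather than the study's actual script, and the column names are illustrative.

```python
# Minimal sketch (assumed workflow, not the study's scripts) of a "half-anonymous"
# setup: the instructor keeps the full Google Form export, while peers receive
# the comments with identifying columns removed and the row order shuffled.
import pandas as pd

full = pd.read_csv("peer_assessment_responses.csv")  # includes respondent identity

# Columns assumed to identify the respondent in this hypothetical export.
identifying = ["Timestamp", "Email Address", "student_id"]

anonymous = (
    full.drop(columns=[c for c in identifying if c in full.columns])
        .sample(frac=1, random_state=0)   # shuffle so response order reveals nothing
        .reset_index(drop=True)
)

# The instructor retains `full`; only the anonymized sheet is shared back
# with the presenting teams.
anonymous.to_csv("feedback_for_students.csv", index=False)
```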

Table 4e
Data summary of phase 3.
Strategy with IRS. Qualitative findings: The students preferred to experience the IRS activity via groups. Quantitative findings: The students had a preference for the group IRS mode; the average IRS score for groups was 4.32 and for individuals it was 3.96 (Group: (Table 4–b Q1: 4.44 + Q3: 4.41 + Q5: 4.11)/3 = 4.32; Individual: (Table 4–b Q1: 3.98 + Q3: 3.82 + Q5: 4.07)/3 = 3.96). The individual IRS mode was helpful for learning retention (Table 4–a (b), p = .02).
Peer assessment activity. Qualitative findings: The students reflected that summarizing peers' feedback was helpful, while adding evaluation rubrics to help peers generate useful feedback was still needed. Quantitative findings: The students revealed that summarizing peers' comments was useful (Table 4–c Q6: 4.06); the students' feedback indicated that peers' comments were helpful (Table 4–c Q3: 3.94).

This revealed that anonymity was the key reason why students were willing to honestly express their feedback on their peers. The current research results support Liu and Hsu's (2016) finding that anonymity is needed in such activities for college students, but they are contrary to Panadero and Brown's (2015) findings. The differing results might stem from a cultural difference whereby Taiwanese students are perhaps shyer about expressing their opinions than European or American students, since Asian students might be afraid of losing face or feel pressure from teachers and the overall academic environment (Haarms et al., 2018). The anonymous comment mechanism gave students a chance to hide their identity and to express their true feelings without being afraid of being laughed at for asking or proposing low-level questions and comments.
To answer the last research question, learning technology tools with appropriate instructional design can help teachers to host an interactive course and encourage students to become active learners. A model for using learning technology tools in a higher education scenario is presented in Fig. 7. The model indicates that the interaction between instructors and students, and between peers, is important, since interaction promotes active learning, and active learners are more likely to internalize the knowledge and achieve high-order learning ability. In Fig. 7, the ways and tools of integrating the learning technology into class are categorized by the targeted learning knowledge dimension. According to the revised Bloom's taxonomy of educational objectives (Anderson & Krathwohl, 2001), the two-dimensional framework consists of the knowledge dimension and the cognitive process dimension. The knowledge dimension consists of four types, factual, conceptual, procedural and metacognitive, and the cognitive process dimension includes, from lowest to highest, remember, understand, apply, analyze, evaluate and create. It is suggested that for the cognitive process levels of remember and understand, individual use of the IRS by learners is more effective for achieving better learning retention. For the levels of apply and analyze and above, cooperative group-use strategies in the IRS activity and the peer assessment activity are better suited to facilitating advanced cognitive knowledge level acquisition. In addition, a pre-organized peer assessment activity with the Google Form is suggested to help students reflect, give more constructive peer feedback, and achieve advanced knowledge acquisition.
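Read as a decision aid, the model in Fig. 7 can be summarized as a mapping from cognitive process levels to suggested activities. The snippet below is one possible schematic restatement of that mapping, offered as an interpretation rather than a specification published with the study; the tool names echo those used in the courses described above.

```python
# Schematic restatement of the Fig. 7 model (an interpretation, not the authors'
# published specification): suggested activities per cognitive process level.
from typing import Dict, List

SUGGESTED_STRATEGIES: Dict[str, List[str]] = {
    "remember":   ["individual IRS questions (e.g., Kahoot!, Menti)"],
    "understand": ["individual IRS questions with immediate feedback"],
    "apply":      ["cooperative (group) IRS questions with peer discussion"],
    "analyze":    ["cooperative IRS questions", "in-class group discussion"],
    "evaluate":   ["peer assessment via Google Forms with evaluation rubrics"],
    "create":     ["team projects reviewed through structured peer feedback"],
}

def suggest(level: str) -> List[str]:
    """Return the suggested activities for a given cognitive process level."""
    return SUGGESTED_STRATEGIES.get(level.lower(), [])

if __name__ == "__main__":
    for level in ("remember", "apply", "evaluate"):
        print(level, "->", "; ".join(suggest(level)))
```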
In sum, when integrating educational learning technology tools into teaching, whether the tools are convenient and free to use, and whether they are easy for teachers and students to learn, are important considerations, because the purpose of integrating learning technology into courses is to use the tools to assist with learning the target knowledge rather than to learn how to use the tools.

8. Conclusion

After conducting the iterative DBR phases, the researcher worked through the process of identifying a problem at the educational site, finding solutions from both theory review and practice, and then verifying the ideas and redesigning the research again. The most challenging aspect of the study was applying teaching theory with learning technology tools to the practical classroom scenario, because the classroom environment is dynamic and the interaction between students and instructors changes constantly. During the trial-and-error process of the DBR, the study focused on how to use learning technology tools to promote course interaction, achieve better peer assessment activities, and produce constructive peer feedback. Students' feedback, such as the learning diaries, was quite important for the teacher, since it helped the instructor to adjust the course progress and activity design immediately every week. It was found that the learners were willing to share their learning thoughts, whether a sense of accomplishment or a learning dilemma, through the learning diaries, since they believed that their feedback would be referenced by the teacher and that their comments and suggestions might lead to changes in the next class, thus helping the course meet their expectations. Although it took a great deal of time for the teacher to read and digest the students' feedback and to adjust and redesign the course activities every week, the time spent made it possible to build two-way communication between the teacher and the learners, resulting in a win-win situation that facilitated the use of the learning technology tools in the higher education class setting.
After conducting the study, it can be concluded that the ways in which learning technology tools are integrated into courses vary according to the knowledge dimension. At the beginning of a course or in the introductory lessons, the individual use of IRS tools plays the role of interaction facilitator to assist students in acquiring knowledge at the basic level; then, with the cooperative IRS activities, students are able to produce and reach meaningful learning conclusions. When the conceptual knowledge or lecture study finishes, peer assessment is an effective activity for facilitating more advanced cognitive knowledge level acquisition. Learning technology tools such as the Google suite are effective in terms of achieving both non-anonymous and half-anonymous peer assessment activities, and a predesigned peer assessment form with detailed evaluation rubrics could promote and scaffold students' elicitation of meaningful feedback and achievement of higher-level knowledge acquisition.

Table 5
The outline and processes of the study according to the DBR structure.

Fig. 6. An overview of the iterative process in DBR.

Fig. 7. A model for integrating learning technology tools into higher education classes.

8.1. Future work

After the three-phase DBR process, it is suggested that teachers could include peer assessment as a formal component of the course grade, and explore how to enhance the assessment mechanism and the guidance design during the peer assessment process. Further research could also focus on assisting learners in providing more accountable talk during the IRS interactions, and on running in-group and between-group learning assessment activities to help students enhance their higher-order thinking. Meanwhile, it could examine how feedback receivers modify their performance according to the feedback from the in-group and between-group assessment activities, and how more constructive comments can be generated. Lastly, research issues regarding students' perceptions at various educational levels and in learning various subjects with similar learning technology tools for course interaction could be studied.

8.2. Limitations

The limitations of the study are as follows. First, the study was performed by one teacher, the courses conducted in this study were lecture-based courses with the purpose of encouraging students to present team projects at the end of the semester, and the target learners were adult learners; thus, the results cannot be generalized to other course instructors, other learning subjects, or younger learners. Second, the data collected in the study are mostly the students' self-reported opinions on the use of the learning technology tools rather than direct evidence of how effectively they learned with the tools. Third, because the DBR was conducted in a real classroom learning scenario, there was no control group with which to compare the learning outcomes before and after the treatment.

Author contributions

Yi-Hsuan Wang: Conceptualization, Methodology, Data curation, Formal analysis, Writing, Visualization and Project administration.

Acknowledgements

This research project is jointly funded by the Ministry of Science and Technology in Taiwan (107-2511-H-032-004-MY2) and by the Ministry of Education in Taiwan (PED1080017). The author would like to thank the Ministry of Science and Technology and the Ministry of Education for their support.

References

Abdullah, M. Y., Bakar, N. R. A., & Mahbob, M. H. (2012). Student's participation in classroom: What motivates them to speak up? Social and Behavioral Sciences, 51, 516–522.
Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives (Complete ed.). New York: Longman.
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1–14.
Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., & Sese, F. J. (2013). Using clickers in class: The role of interactivity, active collaborative learning and engagement in learning performance. Computers & Education, 62, 102–110.
Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom (ASHE-ERIC Higher Education Report No. 1). Washington, DC.
Brady, M., Seli, H., & Rosenthal, J. (2013). "Clickers" and metacognition: A quasi-experimental comparative study about metacognitive self-regulation and use of electronic feedback devices. Computers & Education, 65, 56–63.
Brown, G. T. L., & Harris, L. R. (2013). Student self-assessment. In J. H. McMillan (Ed.), The SAGE handbook of research on classroom assessment (pp. 367–393). Thousand Oaks, CA: Sage.
Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco, CA: Jossey-Bass.
Cain, J., & Robinson, E. (2008). A primer on audience response systems: Current applications and future considerations. American Journal of Pharmaceutical Education, 72(4), 77.
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9–20.
Carr, R., Palmer, S., & Hagel, P. (2015). Active learning: The importance of developing a comprehensive measure. Active Learning in Higher Education, 16, 173–186.
Collins, A. (1992). Towards a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). Berlin: Springer.
Daniel, T., & Tivener, K. (2016). Effects of sharing clickers in an active learning environment. Educational Technology & Society, 19(3), 260–268.
Easterday, M. W., Rees Lewis, D., & Gerber, E. M. (2014). Design-based research process: Problems, phases, and applications. In Proceedings of the International Conference of the Learning Sciences, June 23-27, 2014, Colorado, USA (pp. 317–324). ISLS. Available online at: http://www.tandfonline.com/10.1080/23735082.2017.1286367.
Gielen, M., & Wever, B. D. (2015). Scripting the role of assessor and assessee in peer assessment. Computers & Education, 88, 370–386.
Grunert, J. (1997). The course syllabus: A learning-centered approach. Bolton, MA: Anker Publishing Co., Inc.
Haarms, R., Holtzman, J., Xue, T., & Darbyshire, D. (2018).
Chinese students' cultural and behavioural differences among domestic and internationally oriented educational institutions. International Journal of Psychology and Educational Studies, 5(2), 30–38.
Han, H. J., & Finkelstein, A. (2013). Understanding the effects of professors' pedagogical development with Clicker Assessment and Feedback technologies and the impact on students' engagement and learning in higher education. Computers & Education, 65, 64–76.
Hattie, J. (2003). Teachers make a difference: What is the research evidence? In Australian Council for Educational Research Annual Conference on Building Teacher Quality (pp. 1–17). Auckland: University of Auckland.
Hwang, G.-J., & Chen, P.-Y. (2019). Effects of a collective problem-solving promotion-based flipped classroom on students' learning performances and interactive patterns. Interactive Learning Environments.
Jones, M. E., Antonenko, P. D., & Greenwood, C. M. (2012). The impact of collaborative and individualized student response system strategies on learner motivation, metacognition, and knowledge transfer. Journal of Computer Assisted Learning, 28, 477–487.
Juuti, K., & Lavonen, J. (2012). Design-based research in science education: One step towards methodology. Nordic Studies in Science Education, 2(2), 54–68.
Kietzig, A. M., & Orjuela-Laverde, M. C. (2015). Increasing students engagement in class using an open-ended students response system. In Proceedings of the Canadian Engineering Education Association Conference, June 8-11. University of Calgary.
Kim, P., Suh, E., & Song, D. (2015). Development of a design-based learning curriculum through design-based research for a technology-enabled science classroom. Educational Technology Research and Development, 63(4), 575–602.
Kollar, I., & Fischer, F. (2010). Peer assessment as collaborative learning: A cognitive perspective. Learning and Instruction, 20(4), 344–348. https://doi.org/10.1016/j.learninstruc.2009.08.005
Krathwohl, D., Bloom, B., & Masia, B. (1956). Taxonomy of educational objectives. Handbook II: Affective domain. New York: David McKay.
Latané, B., & Wolf, S. (1981). The social impact of majorities and minorities. Psychological Review, 88, 438–453.
Li, K. C., & Wong, B. T. M. (2020). The use of student response systems with learning analytics: A review of case studies (2008-2017). International Journal of Mobile Learning and Organization, 14(1), 69–79.
Liu, N. F., & Carless, D. (2006). Peer feedback: The learning element of peer assessment. Teaching in Higher Education, 11, 279–290.
Liu, N. T., & Hsu, T. C. (2016). Employing the evidence of eye movement to explore the reasons of different results of peer assessment. In Proceedings of the 20th Global Chinese Conference on Computers in Education 2016. Hong Kong: The Hong Kong Institute of Education.
Lynch, D. H., & Golen, S. (1992). Peer evaluation of writing in business communication classes. Journal of Education for Business, 68(1), 44–48. https://doi.org/10.1080/08832323.1992.10117585
Majgaard, G., Misfeldt, M., & Nielsen, J. (2011). How design-based research, action research and interaction design contributes to the development of designs for learning. Designs for Learning, 4(2), 8–21.
McKenney, S. E., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.
Panadero, E., & Brown, G. T. L. (2015). Higher education teachers' assessment practices: Formative espoused but not yet fully implemented. Paper presented at the Fifth Assessment in Higher Education Conference, Birmingham, UK.
Panadero, E., Jonsson, A., & Strijbos, J. W. (2016). Scaffolding self-regulated learning through self-assessment and peer assessment: Guidelines for classroom implementation. In D. Laveault & L. Allal (Eds.), Assessment for learning: Meeting the challenge of implementation.
European University Association. (2019). Promoting active learning in universities. European University Association asbl.
Raes, A., Vanderhoven, E., & Schellens, T. (2013). Increasing anonymity in peer assessment by using classroom response technology within face-to-face higher education. Studies in Higher Education, 40(1), 178–193. https://doi.org/10.1080/03075079.2013.823930
Shattuck, J., & Anderson, T. (2013). Using a design-based research study to identify principles for training instructors to teach online. International Review of Research in Open and Distance Learning, 14(5), 186–210.
Simonsmeier, B. A., Peiffer, H., Flaig, M., & Schneider, M. (2020). Peer feedback improves students' academic self-concept in higher education. Research in Higher Education.
Strijbos, J. W., & Sluijsmans, D. (2010). Unravelling peer assessment: Methodological, functional, and conceptual developments. Learning and Instruction, 20, 265–269.

Topping, K. (2003). Self and peer assessment in school and university: Reliability, validity and utility. In M. Segers, F. Dochy, & E. Cascallar (Eds.), Optimising new modes of assessment: In search of qualities and standards (Vol. 1, Innovation and Change in Professional Education). Dordrecht: Springer.
Topping, K. J. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68, 249–276. https://doi.org/10.2307/1170598
Van Gennip, N. A. E., Segers, M. S. R., & Tillema, H. H. (2010). Peer assessment as a collaborative learning activity: The role of interpersonal variables and conceptions. Learning and Instruction, 20(4), 280–290. https://doi.org/10.1016/j.learninstruc.2009.08.010
Vanderhoven, E., Raes, A., Montrieux, H., Rotsaert, T., & Schellens, T. (2015). What if pupils can assess their peers anonymously? A quasi-experimental study. Computers & Education, 81, 123–132. https://doi.org/10.1016/j.compedu.2014.10.001
Wang, A. I., & Tahir, R. (2020). The effect of using Kahoot! for learning: A literature review. Computers & Education, 149.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53, 5–23.
Wang, Y. H. (2016). Could a mobile-assisted learning system support a flipped classroom for classical Chinese learning? Journal of Computer Assisted Learning, 32(5), 391–502.
Wang, Y. H. (2017). The effectiveness of integrating teaching strategies into IRS activities to facilitate learning and teaching. Journal of Computer Assisted Learning, 33(1), 35–50.
Wang, Y. H. (2018). Interactive response system (IRS) for college students: Individual versus cooperative learning. Interactive Learning Environments, 26(7), 943–957.
Weimer, M. (2008). Active learning advocates and lectures. Retrieved April 7, 2017 from http://www.facultyfocus.com/articles/teaching-and-learning/active-learning-advocates-and-lectures/.
Weitze, C. L., & Ørngreen, R. (2014). The global classroom model: Simultaneous campus- and home-based education using video conferencing. Electronic Journal of e-Learning, 12(2), 215–226.
Wu, Y., & Schunn, C. D. (2020). From feedback to revisions: Effects of feedback features and perceptions. Contemporary Educational Psychology, 60, 101826.
Yung, K. W. H. (2012). Peer assessment in higher education: A reflection on the experience of an English language instructor in Hong Kong.

Dr. Wang received her Ph.D. degree from the Institute of Information Systems and Applications at National Tsing Hua University. Her research interests include game-based learning, interactive e-learning content design, and technology-enhanced language learning.

