To appear in Proc. of the Intnl. Conf. on Computers in Education 2004, Melbourne, Australia, Nov. 30th - Dec. 3rd 2004.

Using an Electronic Voting System to Promote Active Reflection on Coursework Feedback

Quintin Cutts, Department of Computing Science, University of Glasgow, Scotland, [email protected]
Angela Carbone, School of Information Management Systems, Monash University, Australia, [email protected]
Kelsey van Haaster, School of Computer Science and Software Engineering, Monash University, Australia, [email protected]
Abstract: Many lecturers use coursework as the primary mechanism for providing students with feedback on their learning. However, against the models of Laurillard and Kolb, which view learning as a cyclical process, they provide little or no scaffolding to support effective assimilation of the feedback by the students. This paper proposes a pedagogical script for using an electronic voting system (EVS) to promote the necessary assimilation, based on the generation of discussion found in Mazur's Peer Instruction method. The script's use in three case studies is described. Staff and students found the sessions beneficial compared with traditional remediation mechanisms. Over three-quarters of the final session was spent by students working on and discussing the misunderstandings apparent in their coursework.
Feedback and reflection are essential to learning, and have been represented in many educational theories and frameworks. For example, Laurillard's dialogue model of learning (Laurillard, 2002) embodies feedback in the communications between the teacher and learner, and reflection on this feedback in their thought processes; Kolb's learning cycle (Kolb, 1984) depends on feedback and subsequent reflection generated from the results of actively working with the material under study; and collaborative learning (Matthews, 1996) depends on the interplay amongst learners and between teacher and learner, again embodying both feedback and reflection. These models view learning as a cyclic process, which may require many iterations of the communication, processing, feedback and reflection loop before the learning is successful.
Figure 1: A simplified version of Laurillard's dialogue model

In this paper, a simplified version of Laurillard's model will be used to represent the learning process, as shown in (Fig. 1). In a typical learning situation, a teacher (T) imparts knowledge, asks questions and demonstrates skills to the learner (L), represented by (1) in the diagram. The learner then processes, engages with, and reflects upon the material received (2). The learner subsequently responds to the teacher (3), based upon their current understanding derived from the processing of step (2). Finally, the teacher uses the information in (3) to assess the current position
of the student's understanding in relation to the intended learning outcomes (4). If necessary, he/she will reformulate and re-present the material, thereby embarking on a new cycle of the process (1).

Failing to make use of coursework feedback

In many university courses, a principal use of this cyclic process is the setting of coursework by the lecturer (1), which the students complete in their own time, processing the material of the course to construct a submission (2). They hand this work to the lecturer (3), who assesses it summatively with a mark and/or formatively with feedback for the students. During the assessment of each submission, the lecturer determines the student's progress in the course and offers advice deemed necessary for that progress (4). Handing the submissions back to the students constitutes the start (1) of a new cycle of the process. The expectation is that the students read the feedback carefully, working out how to adjust their current understanding so it is in line with the feedback, and discussing it with the lecturer or other students if they cannot reach a satisfactory understanding on their own.

A weakness of this approach to coursework is that whilst the initial communication (1) by the lecturer in setting the work demands active engagement by the students, since they must construct a result, the second communication of type (1) by the lecturer, the feedback, is much more passive in nature. At worst, the students simply ignore it; in most cases, they may read it and even attempt to learn it, but they are required to do little in the way of active processing that would really help adjust the misunderstandings that the lecturer has highlighted. Even when the lecturer takes time in a lecture to present misunderstandings, the format is usually passive. Additionally, the students often completed the work some time in the past, making it difficult for them to re-engage with their original work.

Improving reflection and remediation on coursework feedback using EVS

This paper attempts to address the lack of active student engagement with coursework feedback by proposing a particular style of use of an electronic voting system (EVS). Generally, the use of an EVS in classes enables every student to submit an answer to a multiple-choice question set by the lecturer. Each student has a handset that transmits their response to a central computer, which in turn collates all responses and displays them to the group, typically as a bar chart. Whilst the availability of EVSs is increasing rapidly, there is relatively little written on how to make good use of them in teaching. Successful styles of EVS use require a sound educational rationale, along with a pedagogical script that outlines the general format of use (Draper & Brown, 2004). In the literature, there are a growing number of such scripts (e.g. Dufresne et al., 1996; Mazur, 1997; Wit, 2003; Draper et al., 2001). A script for working with coursework feedback is introduced over the next two sets of bullet points, where an EVS-enabled feedback session is designed specifically to encourage students to engage with feedback derived from coursework completed outside lectures and tutorials. The third set of bullet points outlines the proposed benefits.

The steps of the script carried out before the EVS session are as follows:
- The lecturer sets an exercise, either written or on-line, (1) in the model of (Fig. 1).
- Students work on the exercise, in a specific session or in their own time, (2).
- The submission represents the student's understanding of the material at this stage, (3).
- The lecturer marks the submissions, making a list of misconceptions that occur repeatedly, (4).
- An EVS session is developed containing one or more questions related to each misconception. A key aspect of the questions is that the students must engage deeply with the misunderstood concepts in order to answer.

During the feedback session, the following steps are carried out for each major misunderstanding. This format is largely derived from the Peer Instruction and Class-wide Discussion methods of Mazur and Dufresne respectively (Mazur, 1997; Dufresne et al., 1996).
- A question associated with the misunderstanding is asked using the EVS, (1).
- The students attempt the question posed, forcing them to re-engage with the relevant subject matter (2).
- They use the EVS to respond, with the collated responses initially presented back to the lecturer only, (3).
- The lecturer reviews the responses, deciding on one of three remediation options (sketched in code after this list). If most students answer incorrectly, the lecturer shows the result and provides some remedial instruction, followed by another question on the topic to check understanding. If most answer correctly, the lecturer can again show the result and then move on, directing the incorrect responders to supporting materials. If the class is split (between 30% and 70% correct, according to Mazur (1997)), the lecturer uses a discussion between peers: students work in small groups of 2-4, attempting to persuade each other of the correctness of their answer, followed by a re-vote, the responses from which can be used by the lecturer to determine the next course of action. Students should see the results of the votes only after the discussion.
- The students reflect on the discussion and write down notes about what they have learned.
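The decision rule above is concrete enough to sketch in code. The following Java fragment (Java being the subject of the second case study below) is a minimal illustration only, not part of the PRS or QRS software: the names RemediationAdvisor and decide are ours, and the vote counts in main are invented.

    // Minimal sketch: choose a remediation option from collated EVS votes,
    // using the 30-70% split suggested by Mazur (1997). Illustrative only.
    public class RemediationAdvisor {

        enum Action { MOVE_ON, PEER_DISCUSSION, LECTURER_REMEDIATION }

        // votes[i] is the number of students choosing option i;
        // correctOption is the index of the right answer.
        static Action decide(int[] votes, int correctOption) {
            int total = 0;
            for (int v : votes) total += v;
            double fractionCorrect = (double) votes[correctOption] / total;
            if (fractionCorrect > 0.7) {
                // Most answered correctly: show the result, point the
                // incorrect responders at supporting material, move on.
                return Action.MOVE_ON;
            } else if (fractionCorrect >= 0.3) {
                // Split class: peer discussion in groups of 2-4, then re-vote.
                return Action.PEER_DISCUSSION;
            } else {
                // Most answered incorrectly: remedial instruction, then
                // another question on the same topic to check understanding.
                return Action.LECTURER_REMEDIATION;
            }
        }

        public static void main(String[] args) {
            int[] votes = {3, 9, 8};              // collated bar-chart counts
            System.out.println(decide(votes, 1)); // 9/20 correct -> PEER_DISCUSSION
        }
    }

A re-vote after discussion simply feeds the new counts through the same rule.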
It should be noted that the structure of the communication between students can be important. For example, students should be dissuaded from discussing their first vote on an issue, as this can reduce the openness of the subsequent discussion. Communication groups are best set up so that friends are not working together, although this has to be balanced against the drawbacks of forcing students to work with strangers, particularly when cultural issues may be present.

We propose that the benefits of using this technique are as follows:
- The students are made more aware of the importance of engaging with feedback.
- More iterations of the learning cycle of (Fig. 1) take place, providing opportunities to enhance learning.
- Teaching effort is directed to areas demonstrated to be required by students.
- The lecturer's involvement may lead to a better understanding of how the misconceptions have arisen.
- Committing to an answer and then defending their own viewpoints and/or accommodating the viewpoints of others results in deeper understanding of the concepts by students (Dufresne et al., 1998). The to-and-fro nature of the discussion between peers represents a number of additional cycles around the dialogue loop.
- The peer discussion also helps those students who originally answered the question correctly, because the need to articulate their reasoning to others is likely to deepen their understanding. In effect, they are supporting the learning activities of others, and to do this convincingly they must work to understand the material thoroughly.

Three case studies

The script outlined above is the result of a study of three EVS-enabled sessions, which aimed both to evaluate the script and to evolve it through experience. The sessions are described in chronological order, and issues arising in each are identified. All sessions depended on written or on-line tests carried out during the semester, the prime motivation of which was to provide feedback to the students on their learning. The tests also carried marks, to encourage students to take them seriously. The 'PRS' electronic voting system (GTCO, 2004) was used for all sessions, along with the QRS software front-end developed at the University of Glasgow (QRS, 2004). The lecturers were supported in setting up and running the EVS during the sessions by an experienced user. The aim here was to maximise the learning opportunity by enabling the lecturer to focus on the presentation of the questions, the interpretation of the responses, and the consequent remediation options, without having to worry about the operation of the technology. Once these skills have been mastered, the lecturer can take on the driving of the technology as well.
This approach to introducing EVS use is discussed in detail in (Draper & Brown, 2004). Reports for all three sessions are available on-line (QRS, 2004).

Session 1

The first and third sessions were in a first-year undergraduate unit titled Web-based Information Systems, covering concepts of Web design and development. Twenty-two students were enrolled in the unit, which comprises two weekly lectures and a 2-hour tutorial session per week. During the semester, two test papers are issued, each worth 10% of the unit's total mark. Both tests consist of 15 MCQs and some short-answer questions. In previous years, tutors marked the papers and returned them to the students, and the lecturer provided students with solutions to the unit's test papers on a website. Typically, some discussion of the solutions was offered in lectures or tutorials, but this usually required minimal input from the students and offered little opportunity for student reflection. In the semester under study, after each test, tutors marked the papers and collated the results. As an example, a summary of the results for Test 1 is provided in (Tab. 1). The collated results provided feedback on how students answered each question, highlighting those concepts that students grasped and those which they misunderstood.
For example, (Tab. 1) illustrates that in the MCQ section of the test all 20 students answered question 7 correctly, but that only 9 out of 20 responded correctly to questions 10 and 14. The shaded cells highlight the questions in which the majority of students misunderstood either the question or the concept behind it. A set of 9 EVS questions was constructed, each one based on a different topic in the unit shown by the test responses to be poorly understood. In this first session, the lecturer did not have a clear plan to follow after viewing the response to each question, beyond knowing that the general options of peer discussion, lecturer remediation, or moving on were all available.
MCQ question           1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
# correct responses   17  14  19  17  16  18  20  17  17   9  14  19  14   9  12
Short-answer part     1a    1b    1c    2a    2b    3a    3b    4a    4b
Average mark         2.8  0.87  1.68  2.71  3.72  0.81  0.87   1.7   2.7
Total mark             4     3     2     5     6     3     3     2     7

Table 1: Results of Test 1, with problem areas shaded.
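Identifying these problem areas from the collated marks amounts to a simple scan over the per-question counts. The sketch below is our own illustration, assuming a plain majority threshold; run on the Table 1 MCQ data, it flags questions 10 and 14. It is not a tool that was used in the study.

    // Minimal sketch: flag MCQs misunderstood by a majority of the class,
    // the criterion used to shade Table 1. Illustrative code only.
    public class ProblemAreas {
        public static void main(String[] args) {
            int classSize = 20;
            int[] correct = {17, 14, 19, 17, 16, 18, 20, 17, 17,
                              9, 14, 19, 14,  9, 12};   // from Table 1
            for (int q = 0; q < correct.length; q++) {
                if (2 * correct[q] < classSize) {       // majority wrong
                    System.out.println("MCQ " + (q + 1) + ": only "
                        + correct[q] + "/" + classSize + " correct");
                }
            }
        }
    }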
In the session itself, 13 of the 22 enrolled students were present. The lecturer was surprised to find that a couple of questions that were poorly answered during the test were answered correctly during the EVS session. The obvious interpretation is that only the better students had attended the session. Peer discussion was used after the responses to two questions. Class-wide discussion was used after four questions, either to talk through those answered wrongly, or to assist incorrect responders when most students answered correctly. The following issues concerning discussion were noted: students were observed to have discussed their first vote, which is likely to reduce their personal commitment to their answer; and the students saw the result of the first vote, so when a large majority voted for only one or two options, those voting differently may have been discouraged from arguing for their answer.
Session 2
The second session was in Programming 2 with Java, the second programming unit in a first-year undergraduate Computing course. Sixty-seven students were enrolled in the unit, which comprises 2 hours of lectures and a 2-hour tutorial session per week. A single unit test is held during the unit, worth 20% of the total unit mark. The feedback session was based on broadly the same format as Session 1, and the EVS questions were derived from the summary statistics produced from the unit test, delivered on-line using Vista (Vista, 2004). In this case, the students' responses pointed to a major confusion between two core programming concepts, both of which were required in an upcoming practical assignment, so effective remediation on these concepts was urgently needed.
The format of the EVS questions was designed to home in on the particular aspects of the concepts that the students did not understand. For example, a question about interfaces, one of the two misunderstood concepts, is given below:
Which of the following is true about interfaces?
  a. An interface does not contain any concrete methods
  b. Interfaces create new types
  c. Interfaces allow us to use existing types in a new way

  1. All of the above
  2. a, b, not c
  3. a, c, not b
  4. b, c, not a
  5. None of the above
  6. I don't know what an interface is
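For readers outside the unit, the fragment below is our own minimal illustration of why all three statements hold under the Java semantics current in 2004 (interfaces carry no concrete method bodies); it is not taken from the unit's materials.

    // (a) An interface declares methods but contains no concrete bodies.
    interface Priced {
        int price();                        // no implementation here
    }

    // (c) An existing style of class gains a new use by implementing
    //     the interface, without changing how it is otherwise written.
    class Book implements Priced {
        private final int pence;
        Book(int pence) { this.pence = pence; }
        public int price() { return pence; }
    }

    public class InterfaceDemo {
        public static void main(String[] args) {
            // (b) The interface is itself a new type: a Book can be
            //     referred to through a variable of type Priced.
            Priced p = new Book(499);
            System.out.println(p.price());  // prints 499
        }
    }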
The lecturer prepared a script in advance on how to work with the EVS responses. The PRS/QRS system was used to analyse the answers to this style of question on-the-fly, producing a graph showing how many students thought each of statements a, b and c was true or false. Hence, for each EVS question, the lecturer had three remediation decisions to make, one for the collated response to each individual statement; the decoding behind this analysis is sketched below.
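The decoding itself is a fixed mapping from each combination option to truth values for a, b and c. The sketch below is our own illustration of the idea, not the actual PRS/QRS implementation; the vote counts are invented.

    // Minimal sketch: turn combination votes (options 1-5 above) into
    // per-statement tallies, one count for each of statements a, b, c.
    public class StatementTally {

        // TRUTH[option][s]: does combination option 1-5 assert statement s?
        // Statements are indexed a=0, b=1, c=2; option 6 ("don't know")
        // carries no information about the statements and is skipped.
        static final boolean[][] TRUTH = {
            {true,  true,  true },   // 1. all of the above
            {true,  true,  false},   // 2. a, b, not c
            {true,  false, true },   // 3. a, c, not b
            {false, true,  true },   // 4. b, c, not a
            {false, false, false},   // 5. none of the above
        };

        public static void main(String[] args) {
            int[] votes = {5, 3, 4, 2, 1, 5};   // counts for options 1-6
            int[] agree = new int[3];
            int committed = 0;                  // voters choosing options 1-5
            for (int opt = 0; opt < 5; opt++) {
                committed += votes[opt];
                for (int s = 0; s < 3; s++) {
                    if (TRUTH[opt][s]) agree[s] += votes[opt];
                }
            }
            char[] names = {'a', 'b', 'c'};
            for (int s = 0; s < 3; s++) {
                System.out.println("Statement " + names[s] + ": " + agree[s]
                    + " of " + committed + " committed voters think it true");
            }
        }
    }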
When the vote on a particular statement, a, b or c, was split, the discussion option was used, followed by a re-vote on just that statement. When a statement was well understood, the lecturer agreed to post web material for the few incorrect responders, and when most of the class responded incorrectly, the lecturer discussed the statement in detail.

Session 3

The questions were developed in the same way as for Session 1. The lecturer, however, adjusted the pedagogical script used within the session, based on the experience of the two earlier uses, as follows:
- the structure of the session was explained clearly to students at the beginning, along with how it would be educationally beneficial;
- the students were seated in groups of three, and were expected to work together during the session;
- they were instructed not to discuss their first vote, so that it would represent their understanding alone;
- only the lecturer viewed the result of the first vote, to avoid a student's choice being biased by seeing the group's response;
- the lecturer was clear about when the different remediation options would be adopted;
- finally, the students were given time to record anything they had learned while answering and discussing each question.
Results
The style of evaluation is in line with Draper's Integrative Evaluation (Draper et al., 1996), using the following techniques to determine the effectiveness of the sessions: minute papers (Angelo & Cross, 1993) completed by the students at the end of the first and third sessions; an on-line questionnaire after the second session; interviews with the lecturers; and measurement by an observer of the time spent presenting, answering and discussing questions during each session. (Tab. 2) shows summary statistics about the sessions.
Evaluations, either from minute papers or from the on-line questionnaire, were received from 45 students. Only one negative evaluation was received, from a student in the second session. This session faced significant technical problems due to data-projection issues and was shortened as a result. The technical issues were unsettling for the students and the lecturer, and most students commented on them. Although timing data was not collected for this session, and despite the difficulties, it was still included in the study because the evaluation data gained from the session fed into the development of the next.
                                                 Session 1   Session 2   Session 3
Number of students                                      13          40          10
Total EVS votes                                         14           6          14
Votes to train the students to use the EVS               1           2           0
Votes followed by discussion and re-vote                 4           1           6
Votes >70% correct                                       4           1           6
Votes between 30 and 70% correct                         5           2           6
Votes <30% correct                                       4           1           2
Total time of session in minutes                        52          30          58
% time lecturer presenting                              52         n/a          24
% time students working individually                    17         n/a          26
% time students discussing together                     12         n/a          34
% time discussion between students and lecturer         19         n/a          16

Table 2: Summary data from sessions
Eleven questions over all three sessions were followed by discussion and a re-vote. Analysis of data not provided in (Tab. 2) indicates that the second vote never demonstrated poorer understanding than the first: in two questions, the result was similar for both votes; in seven, the second vote moved up one of the categories shown in (Tab. 2) compared with the first; and in two, the first vote was in the <30% correct category and the second in the >70% category. In the final session, 5 of the 6 questions using a re-vote resulted in the students responding in the top category. Hence discussion appears to have been beneficial.
The evaluation process collected qualitative responses from the participants. In line with other studies of EVS use, the students enjoyed seeing how others responded, they liked the anonymity of their responses,
and they saw how the responses gave the lecturer data on their misunderstandings. Students indicated that the increased activity was beneficial: "Entertaining ... It got us thinking and active, which is always good for our level of concentration." On the ability to adjust the lecture to the needs of the students: "I don't think there are many responses in any lecture. The lecturer, therefore, cannot adjust the contents or pace to fit the ability of students. [With the EVS] the lecturer can explain more about the part that most students do not understand."

Over the progression of the sessions, the time spent with the lecturer speaking alone halved, and the students appeared increasingly willing to enter into discussion. Student comments include: "learned more from discussion with others, better than study alone"; "much better than just taking the lecture notes by the lecturer"; "everyone is paying attention"; "it was a better way of finding the right answer to the questions I found I chose wrongly"; and finally "helps understanding by discussing all alternatives instead of just giving the correct answer". One of the lecturers commented: "The EVS was successful at probing students to think, commit and reflect on their responses. Once the discussions started and gathered momentum, more and more students engaged with the questions and their underlying concepts."

Useful formative feedback for the lecturers was provided in comments such as: "better not seeing responses until after the discussion"; "I didn't always have time to think about the question". Many students in Session 2 viewed it more as a survey than as an aid to their learning, indicating that the objectives of the session needed to be outlined more clearly. A number of students commented that it took too much time. These are all matters that were addressed in subsequent sessions, and that can be rectified as the lecturer gains experience.

Conclusions

The pedagogical script for using an EVS presented here has been demonstrated to be effective in encouraging students to engage with the feedback derived from coursework. The study of the three sessions has highlighted some important aspects that should be considered when adopting the approach:
- Motivating the students: the purpose of the session should be made clear before or at the start of the session.
- Session attendance: the responses in the session may not match those in the coursework, depending on which students attend the session compared with those who submitted the original coursework.
- Following the script: to ensure that a comprehensive discussion occurs, students should not discuss their first response with others, and they should not see the result of the first vote until after the discussion. This latter point is relegated to a footnote in (Mazur, 1997), but has been significant in these uses.
- Encouraging the students to work in the session: when students were encouraged to get their notes and books out, the discussions became more heated. The students should be encouraged to write up their learning. One student commented, apparently surprised: "I was in a lecture and I was studying".
- Remediation for those in the minority: when a question is answered mostly correctly, it is not worth using the whole class's time on remediation. A strategy is required to assist those students responding incorrectly, or else the effort they expended on engaging with the activity may be devalued in their minds.
- Scaffolding EVS use: before this type of technology becomes mainstream, lecturers need technical assistance in using it, and professional development to refine their scripts and teaching, to ensure that the maximum educational value is added to the students' on-campus learning experience.

Further work will aim to minimise the time spent on question answering and maximise the time spent on student reflection and discussion. On the issue of time, however, students and perhaps staff need to be aware of the value of spending as much time as is needed to get the core concepts of a discipline thoroughly understood.

Acknowledgements

This work was partially supported by a grant from the Visiting Scholars Scheme of the Faculty of Information Technology at Monash University. The authors thank the students who completed the evaluations that helped validate and improve this method of EVS use, and Selby Markham for improving its description here.
References

Angelo, T.A., & Cross, K.P. (1993). Minute Paper. In Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco: Jossey-Bass, 148-153.
Draper, S.W., Brown, M.I., Henderson, F.P., & McAteer, E. (1996). Integrative evaluation: an emerging role for classroom studies of CAL. Computers and Education, 26(1-3), 17-32.
Draper, S.W., Cargill, J., & Cutts, Q.I. (2001). Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18(1), 13-23.
Draper, S.W., & Brown, M.I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, in press.
Dufresne, R.J., Gerace, W.J., Leonard, W.J., Mestre, J.P., & Wenk, L. (1996). Classtalk: a classroom communication system for active learning. Journal of Computing in Higher Education, 7, 3-47.
GTCO (2004). GTCO CalComp, Columbia, Maryland. http://www.gtcocalcomp.com
Kolb, D.A. (1984). Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall.
Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technology, 2nd Ed. London: RoutledgeFalmer.
Matthews, R.S. (1996). Collaborative learning: creating knowledge with students. In Menges, R.J., Weimer, M., & Associates, Teaching on Solid Ground. San Francisco: Jossey-Bass.
Mazur, E. (1997). Peer Instruction: A User's Manual. Upper Saddle River, NJ: Prentice-Hall.
QRS (2004). http://www.dcs.gla.ac.uk/~quintin/QRS
Vista (2004). http://www.webct.com
Wit, E. (2003). Who wants to be... The use of a personal response system in statistics teaching. MSOR Connections, 3(2), 5-11.


