Proceedings of the 23rd annual ascilite conference: Who's learning? Whose technology?

Design and evaluation of an e-learning environment to support the development and refinement of clinical reasoning and decision-making

Justin Newton Scanlan, Faculty of Health Sciences, University of Sydney
Catherine McLoughlin, SIMERR ACT, Australian Catholic University
Nicola Hancock, Faculty of Health Sciences, University of Sydney

Emerging paradigms of clinical reasoning are tending to veer away from linear models and discrete clinical competencies towards generic professional skills and decision-making processes. In the context of the present study, occupational therapy students have previously reported that they do not receive enough support from the university or their peers during fieldwork placements, when they are expected to demonstrate clinical reasoning skills. Supervisors have observed that occupational therapy students, as novices, have difficulty in demonstrating strong clinical reasoning skills in the fieldwork setting. In this situation, the end-user (i.e. the patient or client) may not receive the optimal level of care, and it is therefore imperative to scaffold students' reasoning skills to prepare them as working professionals. This paper explores the design and evaluation of a moderated online forum to support the development and refinement of clinical reasoning (a form of critical thinking) skills in occupational therapy students undertaking fieldwork placements. An innovative analytic content-based instrument derived from current models of clinical reasoning is applied to a corpus of data to measure students' skills and, on the basis of the results obtained, to suggest ways of enhancing the online environment to support emerging decision-making skills among novice practitioners.

Keywords: online asynchronous discussion; clinical reasoning; critical thinking; occupational therapy; health professional education; instrument development

Introduction: Context of the study

Over recent years, the health care service system has undergone rapid and substantial change (Higgs & Hunt, 1999). Service users are demanding that their unique circumstances be acknowledged and considered in the clinical reasoning process. This changing health care context has prompted the review of how clinical reasoning skills are developed and applied by students of the health professions. In the undergraduate occupational therapy curriculum in this study, students in the third year of their four-year course engage in thirteen weeks of fieldwork in consecutive blocks of six and seven weeks. These fieldwork placements present students with the opportunity to apply their clinical reasoning skills in a real-life setting, taking responsibility for their own group of clients. It is therefore important for students to receive support in their application of these skills. Students in previous years have complained that during these fieldwork placements they did not receive enough support from their peers or university staff to help them put the clinical reasoning process into practice. The expansion of web-based teaching tools opened new opportunities and new ways to provide support to these students. On the whole, occupational therapy students appreciate the convenience and flexibility provided by web-based teaching tools (Scanlan & Hancock, 2005). The use of web-based tools to allow students to provide support to each other during fieldwork was a way of handing over ownership of this technology to the users.
With structure provided by educators, students are able to utilise this technology to support one another and learn together in their collective application of clinical reasoning skills. It was considered that the provision of support via online asynchronous discussions would be one way of supporting students to achieve the learning outcomes associated with this fieldwork, some of which included:

- demonstration of the importance of the client/service user perspective in occupational therapy practice
- utilisation of best practice and evidence based practice to plan, implement and evaluate relevant occupational therapy services in collaboration with the supervisor
- demonstration of an awareness of the value and importance of lifelong learning in their professional development
- development of professional reasoning skills and a professional persona (e.g. values, confidence, skills and accurate self-assessment)
- reflection upon fieldwork experiences demonstrating a deeper level of understanding of their significance compared to previous placements/years.

This paper reviews the implementation and effectiveness of these online asynchronous discussions in supporting students to apply their clinical reasoning skills whilst on fieldwork. It explores how our students use technology to support their learning and how educators may learn to use this technology to more effectively meet the learning needs of students undertaking fieldwork placements.

Literature review

Critical overview of the links between critical thinking and clinical reasoning

Clinical reasoning is a form of critical thinking employed in the context of health care service provision. The development of critical thinking skills lies at the core of all educational programs at the tertiary level, and this is especially important in the development of student health professionals who will enter a rapidly changing and demanding healthcare environment. Francis Bacon defined critical thinking as "the skillful application of a repertoire of validated general techniques for deciding the level of confidence you should have in a proposition in the light of the available evidence" (Austhink, 2006). One of the most pressing issues in education is to discover how to support intellectually productive interaction and foster critical thinking and higher forms of cognition, such as those competencies outlined by Brookfield (1987) and by Hager, Sleet, Logan and Hooper (2003). These competencies can include: making reasoned decisions in problematic situations; adapting to change; reasoning and thinking critically; collaborating productively in groups or teams; learning independently; seeing multiple perspectives; and solving problems. When considering higher-order thinking, theorists may differ in the definitions they offer, but agree that it means the capacity to go beyond the information given, to adopt a critical stance, to evaluate, and to have metacognitive awareness and problem-solving capacities. Having the capacity to be an autonomous thinker and make reasoned judgements is the quality that most often emerges in the literature discussing higher order thinking (Lipman, 1991; Paul, 1993). Much current debate surrounds how to create optimal conditions in online environments for productive interactions that lead to higher order cognition and enable learners to develop as independent thinkers.
Most research on computer mediated conferencing has been positive about its potential and capacity to provide a social and supportive climate for learning (Garrison, Anderson & Archer, 2000; Jonassen & Kwon, 2001). Recent research on forms of productive interaction in online environments is linked to socio-cultural theory (McLoughlin & Luca, 2006), as this has been found to be robust and flexible in accounting for group and individual processes in computer conferencing environments. The theoretical basis for a great deal of research on thinking is derived from a cluster of theories relating to communicative, socially-based practices in learning. The recognition that learning and everyday cognition are tied to language use has influenced theorists to pay close attention to the influence of social processes on learning and to socio-cultural theory (Resnick, Levine & Teasley, 1991; Coles, 1995).
According to socio-cultural theory, dialogue in a learning setting plays an important part in helping learners to internalise ideas and knowledge from the social plane. Learning is advanced when tasks are pitched just beyond the learners' level of independent ability but still within their reach with outside support or assistance. In order to advance the learner towards more complex forms of understanding, scaffolding can be provided by peers and others. Whereas much of the research applying Vygotsky's work has been based on the asymmetric interactions of teachers and learners, contemporary research is also investigating the interactions in more symmetrical learning environments involving learners working collaboratively (McAteer, Tolmie, Duffy & Corbett, 1997). The interactions that occur among peers in computer conferences are legitimate forms of scaffolding that offer opportunities and support for cognitive development. When learners have to explain ideas to each other, a more explicit and organised understanding can result (Repman, 1993). This form of co-construction leading to cognitive change is critical to the development of higher order thinking processes.

Reflection is a key element of clinical practice, interaction and learning and has been emphasised by many theorists and practitioners. Dewey (1933) first emphasised the importance of reflection based on experience. The seminal works of Schön (1983, 1987, 1995) suggested that reflection, the ability to engage in a process of continuous self-evaluative learning, was a crucial feature of professional practice. He was opposed to professional training models of 'Technical Rationality', which involved giving participants materials to apply later in the world of professional practice. Reflection requires "restructuring, theories of phenomena, or ways of framing the problem" (Schön, 1987, p. 35), and Schön (1987) saw 'knowing-in-action', 'reflection-on-action' (after the event) and 'reflection-in-action' (during the event) as "increasingly complex components of reflective practice" (p. 123). The cultivation of reflective abilities has become a critical element of education for students of the health professions.

Clinical reasoning in the health professions

Traditional methods of clinical decision making in the health professions have been criticised for failing to take into consideration the unique context of the individual. Such methods include pattern recognition (relying on a strong knowledge base to determine intervention given the client's clinical presentation) and hypothetico-deductive reasoning (refining hypotheses through further investigation using an "if... then" progression) (Higgs & Jones, 2000). Recent developments in theory about clinical reasoning in the health professions suggest that education needs to be tailored to support the development of students into professionals capable of interactional clinical reasoning and decision making (Higgs & Hunt, 1999). Higgs and Hunt refer to this as the "interactional professional". The interactional professional is responsive to the unique needs of the client in their own unique context. Higgs and Hunt (1999) provide an operational definition of the interactional professional as a health professional who "combines the key notions of competence, reflection, problem solving and professionalism, with three other practice concepts, social responsibility, interaction and situational leadership" (p. 15).
Stated simply, clinical reasoning refers to the thinking and decision making processes that are integral to clinical practice. More specifically, clinical reasoning refers to the process of reflective inquiry. Mattingly and Fleming (1994) further explain that clinical reasoning is a way of thinking and reasoning that "involves deliberation about what an appropriate action is in this particular case, with this particular client, at this particular time" (pp. 9-10). The integration of the concept of the "interactional professional" with previously devised models of clinical reasoning led to the development of a new model of clinical reasoning for the health professions (Higgs & Jones, 2000). This model (presented in Figure 1) underpins the fieldwork curriculum offered to students in the current study and informed the primary learning outcomes for students in the case study described in this paper. The model integrates the different demands and expectations of modern health care services. Each loop of the model represents "data input, data interpretation (or re-interpretation) and problem formulation (or re-formulation) to achieve a progressively broader and deeper understanding of the clinical problem" (Higgs & Jones, 2000, p. 11). The model also integrates six elements critical to the clinical reasoning process, described as:

- cognition or reflective enquiry
- a strong discipline specific knowledge base
- metacognition, which provides the integrative element between cognition and knowledge
- mutual decision making, or the role of the client or patient in the decision making process
- contextual interaction, or the interactivity between the decision makers and the situation or environment of the reasoning process, and
- task impact, or the influence of the nature of the clinical problem or task on the reasoning process. (Higgs & Jones, 2000, p. 10)

Figure 1: Model of clinical reasoning for the health professions (Reprinted by permission from Higgs & Jones, 2000, p. 11)

E-learning environments to support clinical reasoning skills

The use of online asynchronous discussions was considered to be a useful way to mobilise support from peers for occupational therapy students undertaking fieldwork placements. Peer support had previously been identified as an under-utilised resource in this context. In 2004, asynchronous online discussions were introduced to the second of two third year fieldwork placements. In this format, students were required to respond to trigger questions focused on their experiences of the placement: how they found the supervisory style, their feelings about their performance and their supervisor's assessment of their practice, as well as reflection upon difficult or challenging situations. There were no specific, structured opportunities to discuss clinical problems or engage in peer to peer learning surrounding clinical reasoning. In 2005, a new format was introduced. In the first, second and final weeks of fieldwork, students were engaged in similar descriptive and reflective discussions, but in the middle weeks of the placement, students were grouped into five clusters according to fieldwork setting and engaged in clinical case discussions. It was expected that these clinical discussions would assist students in the development and refinement of their skills in clinical reasoning. The structure for the clinical case discussions was presented to the students as follows:

Engage in a case discussion about a client in an area similar to that of your placement area. Some students will choose to open a discussion by posting a "client outline", according to these specific criteria: age; diagnosis; assets/strengths; limitations; occupational performance problems; further information required; and ideas for treatment plan. These client outlines will be the basis of further discussion. Each student is expected to provide additional suggestions by way of at least two postings. Postings this week should cover such areas as: things you might have attempted with similar clients; suggestions for further assessments; general discussion about treatment options for clients with this particular diagnosis; and suggestions for specific treatment interventions.

As this was the first implementation of clinical case discussion forums, the format for ongoing discussion was presented with minimal structure. This was intended to give the students some freedom to utilise the forums in the ways that best met their needs.
It was expected that the opportunity to engage in discussions with peers and receive feedback would strengthen clinical reasoning through collaborative action. Discussions were moderated by fieldwork educators from the university and weekly contributions from students were compulsory.
Data analysis
Method and instrument development
Ethics approval for this study was obtained from the Human Research Ethics Committee of the first and third authors' university. De-identified transcripts from the clinical discussion forums from the first six-week fieldwork period were collated for analysis. Initially, the discussions were analysed using Murphy's (2004) instrument for analysis of critical thinking in online asynchronous discussions. A pilot analysis of a random selection of 140 postings was conducted by the first and third authors. This pilot revealed significant limitations of this instrument in this context. Most significantly, this tool was found to be unable to accurately identify a critical feature of clinical reasoning, that is, the exploration of the individual's unique context and situation as described by Higgs and Jones (2000).
Following the completion of the pilot study, it was determined that the important elements of clinical reasoning evidenced in the discussions needed to be captured more accurately. Whilst a number of authors have discussed methodological approaches for increasing inter-rater agreement (Oriogun & Cook, 2003) and for the analysis of online interactivity (Fahy, 2005; Oriogun, 2003), no methods were found that allowed for the analysis of clinical reasoning in the health care context. In this study, discussion messages were used as artefacts for the analysis of clinical reasoning processes. A hybrid instrument was therefore developed, integrating elements from the instrument for analysis of critical thinking (Murphy, 2004) and the model of clinical reasoning for the health professions (Higgs & Jones, 2000).
Murphy's instrument contains five broad categories and is summarised in Table 1.
Table 1: Instrument for the analysis of critical thinking in online asynchronous discussions (Murphy, 2004)
Recognise: Recognising or identifying an existent issue, dilemma, problem, etc.
Understand: Exploring related evidence, knowledge, research, information and perspectives
Analyse: Seeking in depth clarification, organising known information and dissecting the issue, dilemma or problem into its fundamental components
Evaluate: Critiquing and judging information, knowledge or perspectives
Create: Producing new knowledge, perspectives, or strategies and implementing them or acting on them
The hybrid instrument retains three broad categories from Murphy's instrument, namely understand, analyse and evaluate. Recognise was removed because, in the clinical environment, the health professional is presented with a problem that the client, their family, or another person has already recognised. The clinical decision making process therefore typically begins with the development of understanding.
The understand category was retained, but expanded in the hybrid instrument to more fully analyse the required considerations that must be made by the health professional or student to fully understand the client's unique circumstances. Following Higgs and Jones' (2000) model, these elements of understanding are labelled as clinical problem, knowledge, client's input and environment.
The create category was replaced by two further categories. These categories are metacognitive reasoning and decision making. As presented in the Higgs and Jones (2000) model, clinical reasoning involves deepening and widening understanding, analysis and evaluation that incorporates metacognitive elements to guide decision making. Metacognitive reasoning is considered to be a crucial step prior to effective decision making and is therefore included as a separate category in the hybrid instrument. This hybrid instrument is presented in Table 2.
Two additional codes were also used in the analysis. The first of these codes, "no response", was used where the message did not contain an element of clinical reasoning. Most often these messages were to express gratitude to other students for the provision of information, but there were also messages where
731
ascilite 2006, The University of Sydney
students discussed individual frustrations not related to specific clinical issues. The other code was "moderator comments", which indicated input from a university staff member.
Table 2: Hybrid instrument for the analysis of health professional clinical reasoning
Understand - Clinical problem (U-P): "The influence of the nature of the clinical problem or task" (Higgs & Jones, 2000, p. 10)
Understand - Knowledge (U-K): Diagnostic and profession-specific knowledge
Understand - Client's input (U-I): Input from the client in the decision making process, including their preferences
Understand - Environment (U-E): "The interactivity between the decision makers and the situation or environment of the reasoning process" (Higgs & Jones, 2000, p. 10), as well as the influence of the client's unique environment
Analyse (A): "Seeking in depth clarification, organising known information and dissecting the issue, dilemma or problem into its fundamental components" (Murphy, 2004)
Evaluate (E): "Critiquing and judging information, knowledge or perspectives" (Murphy, 2004)
Metacognitive reasoning (M): The integrative element between knowledge and information gathered through reflective enquiry and deep understanding of the unique situation and context of the client
Decision making (D): Sound judgements and decisions on intervention strategies
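For readers who prefer to see the coding scheme in executable form, the structure of the hybrid instrument can be expressed as a small lookup table. The sketch below is purely illustrative: the codes and category labels follow Table 2, but all identifiers (including the NR and MC shorthand for the two additional codes and the broad_category helper) are our own and are not part of the study's published method.

```python
# Illustrative representation of the hybrid instrument (Table 2).
# The NR/MC shorthand for the two additional codes is our own invention.
HYBRID_INSTRUMENT = {
    "U-P": ("Understand", "Clinical problem"),
    "U-K": ("Understand", "Knowledge"),
    "U-I": ("Understand", "Client's input"),
    "U-E": ("Understand", "Environment"),
    "A": ("Analyse", "In-depth clarification and dissection of the problem"),
    "E": ("Evaluate", "Critiquing and judging information or perspectives"),
    "M": ("Metacognitive reasoning",
          "Integration of knowledge with the client's unique situation"),
    "D": ("Decision making", "Judgements and decisions on intervention strategies"),
    "NR": ("No response", "No element of clinical reasoning present"),
    "MC": ("Moderator comments", "Input from a university staff member"),
}


def broad_category(code: str) -> str:
    """Collapse a specific code (e.g. 'U-P') into its broad category."""
    return HYBRID_INSTRUMENT[code][0]


print(broad_category("U-I"))  # Understand
```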
A total of 263 individual messages were analysed. These consisted of full threads drawn from discussion forums in all clinical areas. Postings were coded by the first author using the hybrid instrument as described above. The use of a single rater is acknowledged as a limitation of this exploratory study. The unit of analysis was selected as the syntactic unit of the whole message as it has been suggested to be more reliable and efficient than other semantic units of analysis (Murphy & Ciszewska-Carr, 2005). Messages were coded according to the different elements of clinical reasoning present. As clinical reasoning is a process, it is neither possible nor useful to rate the "highest" level of reasoning achieved. From the 263 original messages, a total of 314 (using the single broad category of "Understand") and 416 (using the more specific categories within "Understand") codes were assigned.
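As a concrete illustration of why two different totals arise (314 codes when the four understand elements are collapsed into the single broad "Understand" category, and 416 when they are counted separately), the following sketch tallies a tiny invented corpus in both ways. The message data and identifiers are hypothetical and are not drawn from the study.

```python
# Map each specific code to its broad category (NR/MC shorthand as above).
BROAD = {"U-P": "Understand", "U-K": "Understand", "U-I": "Understand",
         "U-E": "Understand", "A": "Analyse", "E": "Evaluate",
         "M": "Metacognitive reasoning", "D": "Decision making",
         "NR": "No response", "MC": "Moderator comments"}

# Invented examples: the set of codes assigned to each whole message
# (the whole message being the unit of analysis).
coded_messages = [
    {"U-P", "U-K", "E"},         # understanding of problem and knowledge, plus evaluation
    {"U-K", "U-I", "U-E", "A"},  # broad understanding plus analysis
    {"NR"},                      # e.g. a thank-you message
]

specific_total = sum(len(msg) for msg in coded_messages)
# Collapsing the U-* codes counts "Understand" at most once per message,
# which is why the broad total is smaller than the specific total.
broad_total = sum(len({BROAD[code] for code in msg}) for msg in coded_messages)

print(specific_total, broad_total)  # 8 5
```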
Results
Table 3 presents the results of the analysis, where the broad category of "Understand" was used in analysis. Although limited, this method of data representation allows for broad comparison to the results of other studies investigating critical thinking in online asynchronous discussions.
Table 3: Clinical reasoning in messages by broad categories and clinical area
Clinical area: Understand; Analyse; Evaluate; Metacognitive reasoning; Decision making; No response; Moderator comments
Acute (n=85): 76%; 5%; 13%; 2%; 0%; 5%; 0%
Project (n=84): 44%; 10%; 8%; 11%; 3%; 23%; 1%
Rehabilitation / Community (n=91): 83%; 3%; 10%; 0%; 0%; 0%; 4%
Paediatrics (n=55): 67%; 4%; 16%; 4%; 2%; 7%; 0%
Mental health (n=79): 56%; 5%; 10%; 13%; 3%; 13%; 2%
TOTAL (n=314): 65%; 5%; 11%; 6%; 2%; 10%; 2%
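A brief sketch of how a breakdown like Table 3 could be produced from a coded corpus is shown below, assuming that each percentage represents a category's share of the codes assigned within that clinical area (consistent with the rows summing to roughly 100%). The records and values are invented for illustration; they are not the study data.

```python
from collections import Counter, defaultdict

# Invented example records: (clinical area, broad categories coded in one message).
coded = [
    ("Acute", {"Understand", "Evaluate"}),
    ("Acute", {"Understand"}),
    ("Project", {"Analyse", "Metacognitive reasoning"}),
    ("Project", {"No response"}),
]

# Pool the broad-category codes assigned within each clinical area.
codes_by_area = defaultdict(Counter)
for area, categories in coded:
    codes_by_area[area].update(categories)

# Report each category's share of the codes assigned in that area.
for area, counts in codes_by_area.items():
    total = sum(counts.values())
    for category, count in counts.most_common():
        print(f"{area} (n={total}): {category} {count / total:.0%}")
```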
Further detail about the clinical reasoning demonstrated in the messages is revealed when the "Understand" category is further separated into its constituent elements. Figure 2 presents how many messages included consideration of the various elements required for deep understanding of the clinical situation. Figure 3 illustrates the frequency of consideration of each individual element of the understanding category.
Figure 2: Number of understand elements considered by clinical area. The chart shows, for Acute (n=47), Project (n=32), Rehabilitation / Community (n=59), Paediatrics (n=30), Mental Health (n=35) and the total sample (n=203), the proportion of messages in which one, two, three or all four of the understand elements were considered.
Figure 3: Elements of Understand category by clinical area. The chart shows, for Acute (n=72), Project (n=44), Rehabilitation / Community (n=85), Paediatrics (n=46), Mental Health (n=58) and the total sample (n=305), the frequency with which each element of the understand category was considered: U-P (Understand - Clinical Problem), U-K (Understand - Knowledge), U-I (Understand - Client's Input) and U-E (Understand - Environment).
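The per-message breakdown underlying Figure 2 could, under the same assumptions as the sketches above, be derived by counting how many of the four understand elements appear among the specific codes assigned to each message. Again, the example messages are invented and the identifiers are our own.

```python
from collections import Counter

UNDERSTAND_ELEMENTS = {"U-P", "U-K", "U-I", "U-E"}

# Invented examples: specific codes assigned to individual messages.
messages = [
    {"U-P", "U-K", "E"},
    {"U-P", "U-K", "U-I", "U-E"},
    {"A"},  # no understand element; not counted in the Figure 2 breakdown
]

# Distribution of the number of understand elements per message, counting only
# messages that contain at least one understand element.
element_counts = Counter(
    len(msg & UNDERSTAND_ELEMENTS)
    for msg in messages
    if msg & UNDERSTAND_ELEMENTS
)
print(element_counts)  # Counter({2: 1, 4: 1}) for this toy corpus
```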
Discussion: who's learning and where are they learning?

These data demonstrate that the provision of opportunity for clinical discussions was not enough to support students in their development and demonstration of clinical reasoning skills. The expectation that the collaborative learning environment would provide sufficient scaffolding for this process was not realised. As with other studies investigating critical thinking, there was a predominance of responses in the "understand" category (Murphy, 2004). Nevertheless, student discussions did demonstrate consideration of a wide range of contextual factors integral to the development of individualised interventions that consider the unique circumstances of the individual. This is highlighted in Figure 3. Traditional models of clinical decision making have focussed upon pattern recognition and hypothesis testing. These models rely on broad understanding of the clinical problem and profession-specific knowledge, but do not consider the input of the client or environmental factors. Whilst there was a predominance of messages including the element of knowledge, peer involvement in the discussion regularly reinforced the importance of considering the opinion of the client and their environmental context. This was a clear advantage of this method of peer-supported learning.

The hybrid instrument for the analysis of clinical reasoning also proved very useful in this study. The instrument revealed significant and important information about the quality of clinical reasoning engaged in by the students, both in individual messages and overall. The results shown in Figures 2 and 3 highlight the additional utility of the hybrid instrument over the generic instrument (Murphy, 2004) used in the pilot. This allows us to analyse the quality of students' understanding of the clinical situation (most especially in terms of breadth), which was the primary shortcoming of the original instrument.

Clinical reasoning in different settings

Different clinical settings appear to influence the clinical reasoning processes demonstrated by students. In the acute hospital setting, there is a typical pattern of short admissions with a focus upon discharge planning. In this setting there is often limited opportunity to conduct detailed assessments. The significant focus upon "understanding" in these discussions is therefore not unexpected, as students and practitioners focus on gathering information to assist in pattern recognition and the provision of the most appropriate equipment and services. Student discussions reflected this, with many messages containing suggestions of services or equipment provided to similar clients. This context may have limited students' ability to analyse or critique clinical information. This was verbalised by students, who expressed that they felt pressured to make quick decisions without fully understanding the individual situations of their clients.

Students engaged in project placements act as consultants. They are provided with broad objectives and are required to design and implement strategies to achieve their goals. These settings tend to challenge students to find their role and to discover how their professional skills can be put into practice. Discussions within the project forum were less skewed towards a simple understanding focus. Students demonstrated higher than average levels of analysis, evaluation and metacognitive reasoning. This can be considered an artefact of their need to discover and evaluate their role and contribution in these settings. Ill-structured problems such as those encountered in some project-based fieldwork placements have been reported to assist in the development of critical thinking in a variety of student populations (Cheung & Hew, 2004; Cheung, Tan, & Hung, 2005). Nevertheless, students did seem to be most frustrated in these settings, reflected by the high number of messages rated as "no response". These messages tended to consist of students venting their frustrations associated with the 'ill-defined' nature of the placement.

Similar to project-based fieldwork, mental health settings appeared to promote greater levels of evaluation and metacognitive reasoning in student messages. Clients of mental health services tend to have multiple needs, and their complex presentations tend to thwart any attempt to apply model solutions. These circumstances challenge students to develop a comprehensive understanding of their clients' unique circumstances and perspectives.
This was highlighted by the relatively even distribution of the different elements of understanding and, most especially, the high level of input from the client themselves.

Lessons about e-learning and e-teaching: future directions

This paper examined the use of online asynchronous discussion forums to support the development and refinement of clinical reasoning in a group of third year occupational therapy students undertaking fieldwork placements. A hybrid instrument to assist in the analysis of clinical reasoning in discussions was also developed and trialled. Our hypothesis that student collaboration in the clinical discussion forums would provide suitable scaffolding for the demonstration of the full process of clinical reasoning was not supported. Further development of structures and academic scaffolding around the clinical discussions will be required to assist students in their development and application of clinical reasoning skills in real-life settings. The initial instructions may have encouraged students to take a more superficial approach when making suggestions to their peers. Although other information within the unit encouraged students to explore for evidence in the literature and to analyse and critique perspectives, the instructional message did not include explicit reference to these requirements.
Additional structures could include the use of a clinical reasoning proforma to guide student messages and further explanation of the purpose of the discussions as forums in which to refine and develop clinical reasoning skills.

Lessons learnt from this study (such as the need for significant amounts of structure and educator-implemented scaffolding to develop and refine clinical reasoning skills in the discussion forums) have applicability well beyond the education of occupational therapists and other health professionals. Findings regarding this method of e-teaching may allow this technology to be used more effectively in a range of scenario-based educational environments.

References

Austhink. (23 July, 2006). Tim van Gelder's critical thinking on the web. Retrieved 30 July, 2006, from http://www.austhink.org/critical/
Brookfield, S. D. (1987). Developing critical thinkers. San Francisco: Jossey-Bass.
Cheung, W. S., & Hew, K. F. (2004). Evaluating the extent of ill-structured problem solving process among pre-service teachers in an asynchronous online discussion and reflection log learning environment. Journal of Educational Computing Research, 30(3), 197-227.
Cheung, W. S., Tan, S. C., & Hung, D. (2005). Investigating problem solving with computer-supported collaborative learning. In AARE 2004 conference papers [Conference of the Australian Association for Research in Education, 28 November - 2 December 2004], compiled by P. L. Jeffrey. Melbourne: Australian Association for Research in Education. Retrieved 30 July, 2006 from http://www.aare.edu.au/04pap/che04032.pdf
Coles, M. J. (1995). Critical thinking, talk and a community of inquiry in the primary school. Language and Education, 9(3), 161-177.
Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process. Lexington, Mass.: D.C. Heath.
Fahy, P. J. (2005). Two methods for assessing critical thinking in computer-mediated communications (CMC) transcripts. International Journal of Instructional Technology and Distance Learning, (March). Retrieved 30 August, 2006 from http://www.itdl.org/Journal/Mar_05/article02.htm
Garrison, D. R., Anderson, T. & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. Retrieved 30 July, 2006 from http://communitiesofinquiry.com/documents/Critical_Inquiry_model.pdf
Hager, P., Sleet, R., Logan, P. & Hooper, M. (2003). Teaching critical thinking in undergraduate science courses. Science and Education, 12(3), 303-313.
Higgs, J., & Hunt, A. (1999). Rethinking the beginning practitioner: Introducing the 'interactional professional'. In J. Higgs & H. Edwards (Eds.), Educating beginning practitioners: Challenges for health professional education (pp. 10-18). Oxford: Butterworth-Heinemann.
Higgs, J., & Jones, M. (2000). Clinical reasoning in the health professions. In J. Higgs & M. Jones (Eds.), Clinical reasoning in the health professions (2nd ed., pp. 3-14). Oxford: Butterworth-Heinemann.
Jonassen, D. & Kwon, H. (2001). Communication patterns in computer mediated versus face-to-face group problem solving. Educational Technology Research and Development, 49(1), 35-51.
Lipman, M. (1991). Thinking in education. Cambridge: Cambridge University Press.
Mattingly, C., & Fleming, M. H. (1994). Clinical reasoning: Forms of inquiry in a therapeutic practice. Philadelphia: FA Davis.
McAteer, E., Tolmie, A., Duffy, C., & Corbett, J. (1997). Computer mediated communication as a learning resource. Journal of Computer Assisted Learning, 13(4), 219-227.
McLoughlin, C. & Luca, J. (2006). Applying situated learning theory to the creation of learning environments to enhance socialisation and self-regulation. In A. Herrington & J. Herrington (Eds.), Authentic learning environments in higher education (pp. 194-213). Hershey, PA: Idea Group.
Murphy, E. (2004). An instrument to support thinking critically about critical thinking in online asynchronous discussions. Australasian Journal of Educational Technology, 20(3), 295-318.
Murphy, E., & Ciszewska-Carr, J. (2005). Contrasting syntactic and semantic units in the analysis of online discussions. Australasian Journal of Educational Technology, 21(4), 546-566.
Oriogun, P. K. (2003). Towards understanding online learning levels of engagement using the SQUAD approach to CMC discourse. Australasian Journal of Educational Technology, 19(3), 371-387. Retrieved 30 August, 2006 from http://www.ascilite.org.au/ajet/ajet19/oriogun.html
Oriogun, P. K. & Cook, J. (2003). Transcript reliability cleaning percentage: An alternative interrater reliability measure of message transcripts in online learning. The American Journal of Distance Education, 17(4), 221-234.
Paul, R. W. (1993). Critical thinking: What, why, and how. In C. A. Barnes (Ed.), Critical thinking: Educational imperative, 77, 3-24. San Francisco: Jossey-Bass.
Repman, J. (1993). Collaborative computer based learning: Cognitive and affective outcomes. Journal of Educational Computing Research, 9(2), 149-163.
Resnick, L. B., Levine, J. M., & Teasley, S. D. (Eds). (1991). Perspectives on socially shared cognition. Washington: American Psychological Association.
Scanlan, J. & Hancock, N. (2005). Unpublished evaluation data.
Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.
Schön, D. A. (1987). Educating the reflective practitioner. San Francisco, CA: Jossey-Bass.
Schön, D. A. (1995). Knowing in action: The new scholarship requires a new epistemology. Change, 27(6), 27-34.

Acknowledgement

The authors wish to thank and acknowledge the support received from the ascilite community mentoring program.

Author contact details

Justin Newton Scanlan, Faculty of Health Sciences, University of Sydney, PO Box 170, Lidcombe, NSW 1825, Australia. Email: [email protected]
Catherine McLoughlin, Coordinator, SIMERR, ACT, National Centre for Science, ICT and Mathematics in Rural and Regional Australia (SIMERR, ACT), School of Education, Australian Catholic University, PO Box 256, Dickson, ACT 2602, Australia. Email: [email protected]
Nicola Hancock, Faculty of Health Sciences, University of Sydney, PO Box 170, Lidcombe, NSW 1825, Australia. Email: [email protected]

Copyright © 2006 Scanlan, J. N., McLoughlin, C., Hancock, N.

The author(s) assign to ascilite and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author(s) also grant a non-exclusive licence to ascilite to publish this document on the ascilite web site (including any mirror or archival sites that may be developed) and in electronic and printed form within the ascilite Conference Proceedings. Any other usage is prohibited without the express permission of the author(s). For the appropriate way of citing this article, please see the frontmatter of the Conference Proceedings.
