Laura April McEwen
Queen's University
Publications
Featured research published by Laura April McEwen.
Educational Research and Evaluation | 2001
Christina De Simone; Richard F. Schmid; Laura April McEwen
University students are too often challenged by limited skills in application, investigation, relational thinking, and communication of ideas. In this study, we combined three tools that can potentially support and foster students’ development in these areas: student collaboration, concept mapping, and electronic technologies. The participants were 26 students in two intact classes on learning theories. In groups of three to five, students were asked to generate three concept maps and accompanying prose over the term on three major issues in the field of learning. Data gathered through interviews, questionnaires, and student-generated concept maps indicated that students enjoyed concept mapping for its organizational and relational properties but preferred sharing their maps and dialoguing with one another in a synchronous mode, where immediate feedback and flow of thinking could be maintained while constructing the maps. Moreover, they disliked the redundancy of producing both prose and concept-map outputs, suggesting that while concept mapping can be an arena for generating and broadly structuring ideas, prose can be a means of communicating those ideas in a form that is familiar to most people. This is particularly important for teachers and students who have difficulty navigating maps alone.
Academic Medicine | 2015
Laura April McEwen; Jane Griffiths; Karen Schultz
The use of portfolios in postgraduate medical residency education to support competency development is increasing; however, the processes by which these assessment systems are designed, implemented, and maintained are emergent. The authors describe the needs assessment, development, implementation, and continuing quality improvement processes that have shaped the Portfolio Assessment Support System (PASS) used by the postgraduate family medicine program at Queen’s University since 2009. Their description includes the impetus for change and the contextual realities that guided the effort, as well as the processes used for selecting assessment components and developing strategic supports. The authors discuss the identification of impact measures at the individual, programmatic, and institutional levels and the ways the department uses these to monitor how PASS supports competency development, scaffolds residents’ self-regulated learning skills, and promotes professional identity formation. They describe the “academic advisor” role and provide an appendix covering the portfolio elements. Reflection elements include learning plans, clinical question logs, confidence surveys, and reflections on continuity of care and significant incidents. Learning module elements cover the required online bioethics, global health, and consult-request modules. Assessment elements cover each resident’s research project, clinical audits, presentations, objective structured clinical exam and simulated office oral exam results, field notes, entrustable professional activities, multisource feedback, and in-training evaluation reports. Document elements capture the resident’s continuing medical education activities, including a procedures log, an attendance log, and patient demographic summaries. The authors aim to support others who are engaged in systematic portfolio design or who may adapt aspects of PASS for their local programs.
CBE-Life Sciences Education | 2009
Laura April McEwen; dik Harris; Richard F. Schmid; Jackie Vogel; Tamara L. Western; Paul M. Harrison
This article offers a case study of the evaluation of a redesigned and redeveloped laboratory-based cell biology course. The course was a compulsory element of the biology program, but the laboratory had become outdated and was inadequately equipped. With the support of a faculty-based teaching improvement project, the teaching team redesigned the course and re-equipped the laboratory, using a more learner-centered, constructivist approach. The focus of the article is on the project-supported evaluation of the redesign rather than the redesign per se. The evaluation involved aspects well beyond standard course assessments, including the gathering of self-reported data from the students concerning both the laboratory component and the technical skills associated with the course. The comparison of pre- and post-course data gave the teaching team valuable information on course design issues and skill acquisition. It is argued that the evaluation process was an effective use of the scarce resources of the teaching improvement project.
American Journal of Surgery | 2016
Ayca Toprak; Ulemu Luhanga; Sarah A. Jones; Andrea Winthrop; Laura April McEwen
BACKGROUND: The Surgical Procedure Feedback Rubric (SPR) is a tool for documenting resident intraoperative performance and providing targeted feedback to support learning in a competency-based model of surgical education. It differs from other assessment tools in that it defines performance criteria at increasing levels of complexity through behavioral anchors, thereby embedding standards of performance within the tool. This study explores aspects of the validity of the SPR as an assessment tool. METHODS: A 14-month observational study was conducted in 2 surgical training programs. The factor structure of the SPR was examined using exploratory factor analysis, and its discriminative ability was examined using analysis of variance. RESULTS: The SPR measures 3 factors: operating room preparation, technical skill, and intrinsic competencies. Analysis of variance demonstrated the utility of the SPR in discriminating between residents’ intraoperative performances by postgraduate training year. CONCLUSIONS: This study contributes to the validity argument for the SPR by providing evidence of construct and discriminative validity.
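The statistical workflow named in the methods, an exploratory factor analysis of rubric item scores followed by an analysis of variance across postgraduate years, can be sketched as below. This is a minimal illustration under assumed conditions, not the authors’ analysis code: the item names, the number of items, the five-point scale, and the use of scikit-learn and SciPy are all assumptions made for the example.

    # Minimal sketch (hypothetical data and item names, not the study's code):
    # exploratory factor analysis of SPR item scores, then a one-way ANOVA
    # comparing total scores across postgraduate training years.
    import numpy as np
    import pandas as pd
    from scipy.stats import f_oneway
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_obs = 120                                   # observed procedures
    items = [f"item_{i}" for i in range(1, 10)]   # hypothetical rubric items

    # One row per observed procedure: item scores on a 1-5 behavioural-anchor
    # scale plus the resident's postgraduate year (PGY 1-5).
    scores = pd.DataFrame(rng.integers(1, 6, size=(n_obs, len(items))), columns=items)
    scores["pgy"] = rng.integers(1, 6, size=n_obs)

    # Exploratory factor analysis with three latent factors, mirroring the
    # three-factor structure reported for the SPR.
    fa = FactorAnalysis(n_components=3, random_state=0).fit(scores[items])
    loadings = pd.DataFrame(fa.components_.T, index=items,
                            columns=["factor_1", "factor_2", "factor_3"])
    print(loadings.round(2))

    # One-way ANOVA: do mean total SPR scores differ by training year?
    scores["total"] = scores[items].sum(axis=1)
    groups = [g["total"].to_numpy() for _, g in scores.groupby("pgy")]
    f_stat, p_value = f_oneway(*groups)
    print(f"ANOVA across PGY levels: F = {f_stat:.2f}, p = {p_value:.3f}")

The sketch only shows the mechanics of extracting factor loadings and testing group differences; in the study itself, the extracted factors were interpreted as operating room preparation, technical skill, and intrinsic competencies.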
International Journal of Information Communication Technologies and Human Development | 2010
Christopher DeLuca; Laura April McEwen
Assessment for learning (AFL) is a highly effective strategy for promoting student learning, development, and achievement in higher education (Falchikov, 2003; Kirby & Downs, 2007; Nicol & Macfarlane-Dick, 2006; Rust, Price, & O’Donovan, 2003; Vermunt, 2005). However, because AFL relies on continuous monitoring of student progress through instructor feedback, peer collaboration, and student self-assessment, enacting it within large-group learning formats is challenging. This paper considers how technology can be leveraged to promote AFL in higher education. Drawing on data from students and instructors, as well as recommendations from an external instructional design consultant, it documents the process of pairing technology with AFL in a large-group pre-service teacher education course at one Canadian institution. Recommendations for improving the web-based component of the course are highlighted to give instructors practical suggestions for evaluating their own web-based platforms and improving their use of technology in support of AFL. The paper concludes with a discussion of areas for continued research on the effectiveness of this pairing of assessment theory and technology.
Canadian Journal of Higher Education | 2009
dik Harris; Laura April McEwen
Academic Medicine | 2014
Karen Schultz; Laura April McEwen; Jane Griffiths
Canadian Family Physician | 2015
Ivy Oandasan; Douglas Archibald; Louise Authier; Kathrine Lawrence; Laura April McEwen; Maria Palacios; Marie Parkkari; Heidi Plant; Steve Slade; Shelley Ross
Canadian Family Physician | 2016
Jane Griffiths; Ulemu Luhanga; Laura April McEwen; Karen Schultz; Nancy Dalgarno