Susan McCahan
University of Toronto
Publications
Featured research published by Susan McCahan.
Shock Compression of Condensed Matter - 2001: 12th APS Topical Conference | 2002
David L. Frost; Fan Zhang; Susan McCahan; Stephen Burke Murray; Andrew J. Higgins; Marta Slanik; Marc Casas‐Cordero; Chayawat Ornthanalai
Particle momentum effects from the detonation of a spherical heterogeneous charge consisting of a packed bed of inert particles saturated with a liquid explosive have been investigated experimentally and numerically. When such a charge is detonated, an interesting feature of the subsequent flow field is the interplay between the decaying air blast wave and the rapidly expanding cloud of particles. Measurements with a cantilever gauge show that the particle momentum flux provides the primary contribution of the multiphase flow to the near‐field impulse applied to a nearby small structure. To determine the impulse from the particle momentum flux on the structure, a novel particle streak gauge was developed to measure the rate of particle impacts at various locations. The trends of the experimental results are reproduced using an Eulerian two‐fluid model for the gas‐particle flow and a finite‐element model for the structural response of the cantilever gauge.
ACM Conference on Hypertext | 2010
Chirag Variawa; Susan McCahan
It has been suggested that the teaching of engineering content should include integration across subject matter and contextualization of the material. However, contextualization can create a barrier to accessibility when there is a disconnect between a student's background experience and the context chosen by the instructor. This can have a particular impact when contextualization is used in the process of assessing student learning. This study investigates the types of words that cause difficulty for students. The assessment shows that there are terms used on engineering tests that present possible barriers for students, such that the test may in part be assessing the student's cultural knowledge or vocabulary rather than engineering competency. We propose a number of strategies for remediating this situation, including the use of hypertext to mitigate this type of accessibility barrier.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2018
Bahar Memarian; Susan McCahan
Freeform comments as a means of providing formative feedback on engineering problems are examined from the perspective of feedback providers (i.e., assessors). The aim of this research is to collect and analyze assessors' ratings of the usability of this type of task. Two course topics with different error loads in the sample solutions were used as the basis for the work. Assessors were divided into two groups: Group 1 received an evaluation package containing first-year mechanics students' test solutions with a high error load (Error1=32), while Group 2 received first-year circuits students' test solutions with a low error load (Error2=11). Assessment time was held constant (ttot=20 min). A standard instrument for usability was utilized. Analysis of the survey data from the assessors (n1=11, n2=19) revealed some significant differences between the two groups. In particular, Group 1 reported a lower degree of perceived consistency in marking relative to Group 2.
Proceedings of the Canadian Engineering Education Association | 2017
Gayle Lesmond; Nikita Dawe; Susan McCahan; Lisa Romkey
The shift towards outcomes-based assessment in higher education has necessitated the exploration and development of valid measurement tools. Given this trend, the current project seeks to develop a set of generic analytic rubrics for the purpose of assessing learning outcomes in the core competency areas of design, communication, teamwork, problem analysis and investigation. This paper will provide an update on the original paper presented at CEEA 2015, in which the approach to rubric development for communication, design and teamwork was discussed. The current paper will detail the process of testing the communication, design and teamwork rubrics. In particular, it will report on the progress achieved in shadow testing, where teaching assistants and/or course instructors with grading experience (“assessors”) are asked to evaluate samples of student work using selected rows from the rubrics. The results of shadow testing will be presented.
Proceedings of the Canadian Engineering Education Association | 2011
Brian Frank; Susan Fostaty-Young; Susan McCahan; Peter Wolf; Peter Ostafichuck; K. Christopher Watts; Nasser Saleh
Proceedings of the Canadian Engineering Education Association | 2011
Susan McCahan; Grant Allen; Lisa Romkey
119th ASEE Annual Conference and Exposition | 2012
Susan McCahan; Holly K. Ault; Edmund Tsang; Mark R. Henderson; Spencer P. Magleby; Annie Soisson
Proceedings of the Canadian Engineering Education Association | 2012
James A. Kaupp; Brian Frank; Robert W. Brennan; Susan McCahan; Lata Narayanan; Peter Ostafichuck; Nariman Sepehri; K. Christopher Watts
Proceedings of the Canadian Engineering Education Association | 2011
Susan McCahan; Lisa Romkey
2016 ASEE Annual Conference & Exposition | 2016
Nikita Dawe; Lisa Romkey; Susan McCahan; Gayle Lesmond