Publication


Featured research published by Ann W. Frye.


Medical Teacher | 2012

Program evaluation models and related theories: AMEE Guide No. 67

Ann W. Frye; Paul A. Hemmer

This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes—intended and unintended—associated with their programs.


Medical Education | 2007

Lessons learned from complementary and integrative medicine curriculum change in a medical school

Moshe Frenkel; Ann W. Frye; Tracie Finkle; David Yzaguirre; Robert J. Bulik; Victor S. Sierpina

Objectives This paper describes a pilot study that examined lessons learned from the introduction of complementary and alternative medicine (CAM) elements into a medical school curriculum.


Academic Medicine | 2004

Investigating the use of negatively phrased survey items in medical education settings: common wisdom or common mistake?

Thomas J. Stewart; Ann W. Frye

Background. Attitude surveys in medical education often combine negative items with positive items, a “common wisdom” strategy to counteract response sets. A body of research in other fields has demonstrated that negatively phrased items affect reliability and validity by introducing measurement artifact into scores. The authors investigated the effect of negatively phrased items in the Medical School Learning Environment Survey (MSLES) with data from six medical student cohorts at the University of Texas Medical Branch, Galveston, Texas. Method. This study describes the impact of negatively phrased items in the MSLES through analysis of item and scale means and comparisons of coefficient alpha values. Results. Findings indicate that negatively phrased items performed differently than the positively phrased items. Negatively phrased items were associated with lower scale reliability. Conclusion. The authors conclude, as did earlier studies, that negatively phrased items introduce an artifact into attitude measurement. The “common wisdom” practice of routinely including negative items should be employed with care.
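
The reliability comparison at the heart of this study rests on coefficient (Cronbach's) alpha, which grows as items covary relative to the variance of the total scale score. As a minimal illustrative sketch (simulated Likert-style responses and function names of our own choosing, not the MSLES items or the study's data), alpha can be computed separately for positively and negatively phrased subscales in Python:

import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses; random data, so alpha will be near zero.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 10)).astype(float)

pos_alpha = cronbach_alpha(responses[:, :5])   # columns 0-4: positively phrased items
neg_alpha = cronbach_alpha(responses[:, 5:])   # columns 5-9: negatively phrased items
print(f"alpha(positive) = {pos_alpha:.2f}, alpha(negative) = {neg_alpha:.2f}")

A lower alpha for the negatively phrased subset, computed this way on real scale data, would mirror the pattern the authors report.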


Medical Education | 2010

Effects of comprehensive educational reforms on academic success in a diverse student body

Steven A. Lieberman; Michael A. Ainsworth; Gregory K. Asimakis; Lauree Thomas; Lisa D. Cain; Melodee G. Mancuso; Jeffrey P. Rabek; Ni Zhang; Ann W. Frye

Medical Education 2010; 44: 1232–1240


Academic Medicine | 2003

Introduction of patient video clips into computer-based testing: effects on item statistics and reliability estimates.

Steven A. Lieberman; Ann W. Frye; Stephanie D. Litwins; Karen A. Rasmusson; John R. Boulet

Purpose. Using patient video clips to evaluate examinees’ skills in interpreting physical examination findings is possible with computer-based testing, but the psychometric properties of video-based questions are unknown. Method. We developed parallel test questions incorporating video clips or text descriptions of abnormal neurologic findings and administered them to 106 fourth-year medical students finishing their Neurology Clerkship. Results. Overall, video-based questions showed difficulty and discrimination comparable to those of analogous text-based questions. Preliminary studies indicated similar reliability with text- and video-based questions. Conclusions. The inclusion of patient video clips in computer-based testing is feasible from technical, practical, and psychometric perspectives. Further study is needed to gather validity evidence for this novel question format.
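
The item statistics compared here are the classical ones: difficulty (the proportion of examinees answering an item correctly) and discrimination (commonly a corrected item-total point-biserial correlation). A minimal sketch of those computations, assuming a 0/1 response matrix and using illustrative names of our own (not the study's code or data):

import numpy as np

def item_stats(scores):
    """Classical difficulty and discrimination for a 0/1 (n_examinees, n_items) matrix."""
    difficulty = scores.mean(axis=0)  # proportion correct per item (the p-value)
    total = scores.sum(axis=1)
    # Corrected item-total correlation: each item against the total of the *other* items.
    discrimination = np.array([
        np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
        for j in range(scores.shape[1])
    ])
    return difficulty, discrimination

# Hypothetical data sized like the study's sample (106 examinees); not the real scores.
rng = np.random.default_rng(0)
scores = (rng.random((106, 20)) < 0.7).astype(float)
p, r = item_stats(scores)
print(p.round(2), r.round(2))

Similar distributions of p and r across the video- and text-based item sets would correspond to the comparability the study reports.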


Academic Medicine | 2002

Incorporating simulators in a standardized patient exam.

Bernard M. Karnath; Ann W. Frye; Mark D. Holden

OBJECTIVE Using simulated patients during a clinical skills exam that involves many students has the advantage of standardizing the delivery of historical data. One major disadvantage is the inability to standardize the physical exam findings. We designed a simulated patient exam that incorporates simulated abnormal physical exam findings. DESCRIPTION The simulated patient exam case was divided into three separate stations: (1) the simulated patient's history, (2) the simulated physical exam, and (3) the presentation station. Dyspnea was chosen as the chief complaint because of the broad differential of possible cardiac and pulmonary auscultatory findings. In the first station, students obtained historical data from the standardized simulated patient. Students were graded on their ability to ask appropriate historical questions. Trained observers were used to verify the numbers of historical cues obtained by the students. The second station consisted of simulated physical exam findings. Students first measured the blood pressure on a commercially available blood pressure simulator arm from the Medical Plastics Laboratory, Inc., Gatesville, TX. Students then auscultated an abnormal digital heart sound and pulmonary sound from a small auscultation transducer developed by Andries Acoustics, Spicewood, TX. Students also palpated a simulated pulse from a newly developed pulse transducer. Digital cardiopulmonary sounds and pulse data were recorded onto a CD-ROM disc and transmitted to the small transducers via a CD-ROM disc player. Students used their own stethoscopes to auscultate cardiopulmonary sounds from the small transducers. In the second station, students were graded on their ability to accurately measure a blood pressure, identify abnormal cardiopulmonary digital sounds, and describe a peripheral pulse. In the third station, students presented the historical data and physical exam findings to a faculty member and then provided a differential diagnosis list based on their key findings from the other two stations. A total of 171 students completed the simulated patient exam, each in 45 minutes. DISCUSSION In our simulated patient exam, students were evaluated not only on their data-gathering skills for key historical findings but also on their ability to correctly identify key physical exam findings such as abnormal cardiopulmonary sounds. Key physical exam findings were then integrated into the clinical decision-making process, which was presented in the faculty presentation station. Simulated patients with abnormal cardiopulmonary findings can be used for testing purposes. However, cardiac auscultatory abnormalities such as the ventricular S3 gallop are difficult to find and usually occur in a decompensated state such as heart failure. Other physical exam findings such as pulmonary crackles and wheezes also occur in decompensated conditions. Therefore, the use of simulators during a simulated patient exam offers the possibility of introducing several abnormal physical exam findings without having an unstable patient present in an exam setting. Further, the use of simulated physical exam findings allows for complete standardization of a clinical-simulated patient exam.


Academic Medicine | 2007

Creating sustainable curricular change: Lessons learned from an alternative therapies educational initiative

Victor S. Sierpina; Robert J. Bulik; Constance D. Baldwin; Moshe Frenkel; Susan M. Gerik; Diedra Walters; Ann W. Frye

The authors describe the process by which a curriculum was developed to introduce complementary and alternative medicine topics at multiple levels, from health professional students to faculty, as part of a five-year project funded by a grant from the National Institutes of Health at the University of Texas Medical Branch in Galveston, Texas, from 2001 to 2005. The curriculum was based on four educational goals that embrace effective communication with patients, application of sound evidence, creation of patient-centered therapeutic relationships, and development of positive perspectives on wellness. The authors analyze the complex and challenging process of gaining acceptance for the curriculum and implementing it in the context of existing courses and programs. The developmental background and context of this curricular innovation at this institution are described, with reference to parallel activities at other academic health centers participating in the Consortium of Academic Health Centers for Integrative Medicine. The authors hold that successful curricular change in medical schools must follow sound educational development principles. A well-planned process of integration is particularly important when introducing a pioneering curriculum into an academic health center. The process at this institution followed six key principles for the successful accomplishment of curriculum change: leadership, cooperative climate, participation by organization members, politics, human resource development, and evaluation. The authors provide details about six analogous elements used to design and sustain the curriculum: collaboration, communication, demonstration, evaluation, evolution, and dissemination.


Academic Medicine | 2002

Effect of curriculum reform on students' preparedness for clinical clerkships: a comparison of three curricular approaches in one school.

Ann W. Frye; M D Carlo; Stephanie D. Litwins; Bernard M. Karnath; Christine A. Stroup-Benham; Steven A. Lieberman

As medical schools revise preclinical curricula to emphasize active learning, clinical relevance of the basic sciences, and early clinical experiences, critical evaluation of the results of the changes is important. Such changes in preclinical curricula are expected to help students develop better skills in communication, interpersonal relationships, critical thinking, and other areas essential to the practice of medicine, resulting in better preparation to begin clinical clerkships.

How does changing foundational aspects of preclinical curricula affect students’ preparedness for clinical work? How can that be assessed? Performance on USMLE Step 1 is certainly the most visible outcome of preclinical education. Although Step 1 is commonly taken just before clinical clerkships are undertaken, its scores are not likely to reflect the effects of all curricular changes. Changes such as adopting small-group, problem-based learning (PBL) or early clinical experiences might be expected to affect noncognitive aspects of students’ performance beyond the cognitive outcomes measured by Step 1 scores. Scores on knowledge-based examinations are not likely to be useful measures of students’ preparedness for noncognitive elements of clinical clerkships, such as cross-disciplinary teamwork or patient communication, in which procedural knowledge must be applied in clinical tasks.

Might students’ preclinical course performances predict their readiness for clinical clerkships? Studies of preclinical course performances as predictors of clerkship performance, such as those by Baciewicz et al. and Roop and Pangaro, tend to demonstrate a relationship between those measures and students’ clinical course examination scores or grades. We felt, however, that preclinical course grades had not been shown to be a sensitive measure of readiness for the noncognitive demands of clinical training. While students are frequently asked to evaluate course objectives, instructional delivery, and other curriculum features, they are not often asked how well their curriculum has prepared them to undertake the next training level. Fincher, Lewis, and Kuske used interns’ self-assessments to examine their preparedness in competencies required to begin the intern year, including history and physical examination, patient diagnosis and management, and interpersonal skills. We adopted a similar approach to study important noncognitive outcomes of preclinical curriculum change.

Over the past seven years, the University of Texas Medical Branch (UTMB) implemented stepwise preclinical curricular reform. In 1995, a PBL track featuring self-directed learning in small groups and early clinical experiences opened to 24 students, chosen by lottery from approximately twice that number of volunteers per class, and ran parallel to the traditional didactic curriculum (TC). The PBL track’s student assessment procedures relied heavily on essay tests, standardized-patient (SP) examinations, and evaluation of small-group work; the TC assessments relied predominantly on multiple-choice questions (MCQs), with less use of SP examinations. In 1998, the TC was replaced with the Integrated Medical Curriculum (IMC), a hybrid curriculum combining the problem-based, small-group, self-directed aspects of the PBL track with some didactic teaching. The hybrid IMC retained the TC’s heavy reliance on MCQs for cognitive assessment, with some SP-based examinations, but added the PBL track’s small-group assessment. The PBL track, meanwhile, remained essentially unchanged. All three tracks featured early clinical experiences, but the PBL track’s emphasis was heavier than that of the TC or IMC.

The curriculum labels used in this study (“PBL,” “traditional,” “hybrid IMC”) may unintentionally call attention to each curriculum’s instructional characteristics more than to the curriculum features more relevant to this study. Our use of these labels references all features of each curriculum, including the amount of early clinical experience and the array of assessment methods.

UTMB’s curriculum evolution process provided an uncommon opportunity to examine, within a single institution, the effects of three distinct preclinical curricula on students’ perceptions of their preparedness for clinical training. To that end, we developed a clinical-preparedness survey and administered it to students as they finished their preclinical curriculum. We hypothesized that if differences were found among students’ self-assessments of preparedness for clinical training, those differences would correspond to the differing emphases of the three preclinical curricula.


Academic Medicine | 2008

Comprehensive changes in the learning environment: subsequent Step 1 scores of academically at-risk students.

Steven A. Lieberman; Ann W. Frye; Lauree Thomas; Jeffrey P. Rabek; Garland D. Anderson

Background During the past 10 years at our institution, a number of changes have been instituted in the learning environment, including instructional techniques, assessment methods, academic support, and explicit board preparation. Method The authors studied the Step 1 performance of students with MCAT scores of 20 to 25 in our former and current curricula. Effect sizes were calculated for score improvement using adjusted means from ANCOVA with covariates of MCAT and age. Results The overall effect size was 0.48, with larger effects seen for underrepresented minority students overall (d = 0.64) and African American students especially (d = 0.77), representing medium to large effects. Overall failure rates decreased by two thirds. Conclusions Comprehensive changes in the learning environment were followed by substantial improvement in Step 1 performance among academically at-risk students.
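
The reported effect sizes come from ANCOVA-adjusted group means. A hedged sketch of that computation in Python with statsmodels (simulated data and illustrative column names, not the study's records; using the model's residual standard deviation as the pooled SD is an approximation):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student records: Step 1 score, curriculum era, MCAT total, age.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "curriculum": np.repeat(["former", "current"], n // 2),
    "mcat": rng.integers(20, 26, n).astype(float),
    "age": rng.normal(24, 2, n),
})
df["step1"] = (180 + 8 * (df["curriculum"] == "current")
               + 2.5 * df["mcat"] + rng.normal(0, 16, n))

# ANCOVA: regress Step 1 score on curriculum, adjusting for MCAT and age.
model = smf.ols("step1 ~ C(curriculum) + mcat + age", data=df).fit()

# Adjusted (least-squares) means: predict each group at the covariate means.
at_means = pd.DataFrame({
    "curriculum": ["former", "current"],
    "mcat": [df["mcat"].mean()] * 2,
    "age": [df["age"].mean()] * 2,
})
adjusted = model.predict(at_means)

# Cohen's d: adjusted mean difference over a pooled SD (here the residual SD).
d = (adjusted[1] - adjusted[0]) / np.sqrt(model.mse_resid)
print(f"adjusted means: {adjusted.round(1).tolist()}, effect size d = {d:.2f}")

Dividing the difference in adjusted means by a pooled SD yields a Cohen's d on the same scale as the 0.48 the authors report.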


Teaching and Learning in Medicine | 2002

Clinical Performance Assessment and Interactive Video Teleconferencing: An Iterative Exploration

Robert J. Bulik; Ann W. Frye; Michael R. Callaway; Cecilia M. Romero; Diedra Walters

Background: The direct observation of students in authentic settings by faculty provides valuable feedback on performance and helps ensure mastery of clinical skills. Description: We explored the use of interactive video technology (IVT) as a way of involving community preceptors as raters on a clinical performance exam for 3rd-year students after their family medicine clerkship. Family medicine preceptors, from locations in their communities, observed students on campus conduct interviews and physical exams of standardized patients and then interacted with them during their case presentations. Evaluation: We chose an action research approach to this project and conducted four independent trials. Interviews and observations were structured around three areas of concern: human, technical, and institutional. Conclusions: We feel confident in recommending IVT as a viable option for involving community preceptors in high-stakes testing and with other campus-based activities. We also report on the value of IVT in faculty development activities.

Collaboration


Dive into Ann W. Frye's collaborations.

Top Co-Authors (all at the University of Texas Medical Branch):

Steven A. Lieberman
Robert J. Bulik
Bernard M. Karnath
Stephanie D. Litwins
Gregory K. Asimakis
Jeffrey P. Rabek
Michael A. Ainsworth
Victor S. Sierpina
Diedra Walters
Karen Szauter