Publication


Featured research published by Curtis A. Olson.


Journal of Continuing Education in the Health Professions | 2007

Gaps between knowing and doing: understanding and assessing the barriers to optimal health care.

Lorna J. Cochrane; Curtis A. Olson; Suzanne Murray; Martin Dupuis; Tricia R. Tooman; Sean M. Hayes

Introduction: A significant gap exists between science and clinical practice guidelines, on the one hand, and actual clinical practice, on the other. An in‐depth understanding of the barriers and incentives contributing to the gap can lead to interventions that effect change toward optimal practice and thus to better care. Methods: A systematic review of English language studies involving human subjects and published from January 1998 to March 2007 yielded 256 articles that fulfilled established criteria. The analysis was guided by two research questions: How are barriers assessed? and What types of barriers are identified? The studies abstracted were coded according to 33 emerging themes; placed into seven categories that typified the barriers; grouped as to whether they involved the health care professional, the guideline, the scientific evidence, the patient, or the health system; and organized according to relationship pattern between barriers. Results: The results expand our understanding of how multiple factors pose barriers to optimal clinical practice. The review reveals increasing numbers of behavioral and system barriers. Quantitative survey-type assessments continue to dominate barrier research; however, an increasing number of qualitative and mixed‐method study designs have emerged recently. Discussion: The findings establish the evolution of research methodologies and emerging barriers to the translation of knowing to doing. While many studies are methodologically weak, there are indications that designs are becoming more aligned with the complexity of the health care environment. The review provides support for the need to examine multiple factors within the knowledge‐to‐action process.


Evaluation & the Health Professions | 2010

Commitment to Practice Change: An Evaluator’s Perspective

Marianna B. Shershneva; Min-fen Wang; Gary C. Lindeman; Julia N. Savoy; Curtis A. Olson

A commitment to practice change (CTC) approach may be used in educational program evaluation to document practice changes, examine the educational impact relative to the instructional focus, and improve understanding of the learning-to-change continuum. The authors reviewed various components and procedures of this approach and discussed some practical aspects of its application using an example of a study evaluating a presentation on menopausal care for primary care physicians. The CTC approach is a valuable evaluation tool, but it requires supplementation with other data to have a complete picture of the impact of education on practice. From the evaluation perspective, the self-reported nature of the CTC data is a major limitation of this method.


Journal of Continuing Education in the Health Professions | 2011

Factors contributing to successful interorganizational collaboration: The case of CS2day

Curtis A. Olson; Jann T. Balmer; George Mejicano

Continuing medical education's transition from an emphasis on dissemination to changing clinical practice has made it increasingly necessary for CME providers to develop effective interorganizational collaborations. Although interorganizational collaboration has become commonplace in most sectors of government, business, and academia, our review of the literature and experience as practitioners and researchers suggest that the practice is less widespread in the CME field. The absence of a rich scholarly literature on establishing and maintaining interorganizational collaborations to provide continuing education to health professionals means there is little information about how guidelines and principles for effective collaboration developed in other fields might apply to continuing professional development in health care and few models of successful collaboration. The purpose of this article is to address this gap by describing a successful interorganizational CME collaboration—Cease Smoking Today (CS2day)—and summarizing what was learned from the experience, extending our knowledge by exploring and illustrating points of connection between our experience and the existing literature on successful interorganizational collaboration. In this article, we describe the collaboration and the clinical need it was organized to address, and review the evidence that led us to conclude the collaboration was successful. We then discuss, in the context of the literature on effective interorganizational collaboration, several factors we believe were major contributors to success. The CS2day collaboration provides an example of how guidelines for collaboration developed in various contexts apply to continuing medical education and a case example providing insight into the pathways that lead to a collaboration's success.


Journal of Continuing Education in the Health Professions | 2011

Peering inside the clock: Using success case method to determine how and why practice-based educational interventions succeed

Curtis A. Olson; Marianna B. Shershneva; Michelle Horowitz Brownstein

Introduction: No educational method or combination of methods will facilitate implementation of clinical practice guidelines in all clinical contexts. To develop an empirical basis for aligning methods to contexts, we need to move beyond “Does it work?” to also ask “What works for whom and under what conditions?” This study employed Success Case Method to understand how 3 performance improvement CME activities contributed to implementation of tobacco cessation practice guidelines in 9 outpatient practices. Methods: Success criteria were applied to clinical data from 93 practices, generating a pool of 14 success cases; 9 were recruited into the study. We conducted semistructured telephone interviews with 1 to 4 informants in each practice. Individual case reports were developed summarizing changes made, what was done to effect the changes, relevant contextual factors, and contributions of the educational interventions to change. A cross‐case analysis followed. Results: Twenty informants were interviewed. Practice changes varied in number and degree. Implementation mechanisms included acquisition of new knowledge and skills, making improving cessation practice an active goal, engaging the clinical team, adopting a more proactive approach with smokers, and making smokers and clinical practice performance more visible. Contextual factors influencing the implementation process were also identified. Discussion: The study shows that (1) the appropriate target of an educational intervention may be a team rather than an individual, (2) implementing even relatively simple practice guidelines can be a complex process, and (3) change requires scientific and practical knowledge. A richer understanding of implementation mechanisms and contextual factors is needed to guide educational planning.


Journal of Continuing Education in the Health Professions | 2012

Twenty predictions for the future of CPD: Implications of the shift from the update model to improving clinical practice

Curtis A. Olson

There has been a remarkable transformation of the continuing professional development (CPD) field over the past several years. I am referring, of course, to the change from an emphasis on the update model of CPD, which relied heavily on dissemination of scientific evidence, to a more systematic and concerted effort to deliver educational interventions that improve clinical practice. There are many drivers behind this change, including changes in accreditation requirements, new reimbursement models, ever improving information technology, public demands for transparency, the abundant evidence that dissemination alone is rarely effective at changing practice, and the growing acceptance of the notion that continuing growth in the costs of health care is unsustainable. The implications of this change in emphasis for the CPD enterprise are profound; it has and will continue to have substantial and enduring consequences. Facilitating change in human behavior in the context of a complex organizational system is very different from fostering learning at the level of the individual clinician in an educational environment. These implications will demand sustained attention from researchers, practitioners, and policy makers. In this editorial, I describe several current trends and make predictions about the future, all of which stem from the shift from updates to improving practice.


Journal of Interprofessional Care | 2014

Examining the intersections between continuing education, interprofessional education and workplace learning

Simon Kitto; Joanne Goldman; Madeline H. Schmitt; Curtis A. Olson

The aims of this themed issue are to increase our understanding of how three distinct, yet overlapping, fields – continuing education (CE), interprofessional education (IPE) and workplace learning ...


Advances in Health Sciences Education | 2012

Didactic CME and practice change: don't throw that baby out quite yet.

Curtis A. Olson; Tricia R. Tooman

Skepticism exists regarding the role of continuing medical education (CME) in improving physician performance. The harshest criticism has been reserved for didactic CME. Reviews of the scientific literature on the effectiveness of CME conclude that formal or didactic modes of education have little or no impact on clinical practice. This has led some to argue that didactic CME is a highly questionable use of organizational and financial resources, and a cause of lost opportunities for physicians to engage in meaningful learning. The authors' current program of research has forced them to reconsider the received wisdom regarding the relationship between didactic modes of education and learning, and the role frank dissemination can play in bringing about practice change. The authors argued that the practice of assessing and valuing educational methods based only on their capacity to directly influence practice reflects an impoverished understanding of how change in clinical practice actually occurs. Drawing on case study research, examples were given of the functions didactic CME served in the interest of improved practice. Reasons were then explored as to why the contribution of didactic CME is often missed or dismissed. The goal was not to advocate for a return to the status quo ante where lecture-based education is the dominant modality, but rather to acknowledge both the limits and potential of this longstanding approach to delivering continuing education.


Journal of Continuing Education in the Health Professions | 2013

Evolving Health Care Systems and Approaches to Maintenance of Certification

R. Van Harrison; Curtis A. Olson

This supplemental issue to the Journal of Continuing Education in the Health Professions (JCEHP) on maintenance of certification (MOC) is sponsored by the American Board of Medical Specialties (ABMS). It provides a highly useful overview of the programmatic efforts of the ABMS in the United States, the Royal College of Physicians and Surgeons of Canada (RCPSC), and the General Medical Council (GMC) of the United Kingdom to maintain and document over time the competence of physicians. Unlike most issues of JCEHP, the primary focus of this issue is on regulatory policies and practices adopted at the national level. Two points of intersection of MOC with continuing education are (1) the development and validation of approaches to assessing physician knowledge and competence, and (2) facilitating practice-based learning. Perhaps the most fundamental convergence is the shared goal of improving patient safety, enhancing health care quality, and reducing costs. Researchers, practitioners, and policymakers in continuing education in the health professions need to be aware of the history, current status, and future directions for MOC in these three countries. MOC is the subject of an increasing number of research studies, creating a unique opportunity to


Journal of Continuing Education in the Health Professions | 2014

Survey Burden, Response Rates, and the Tragedy of the Commons

Curtis A. Olson

Surveys are an essential tool in the repertoire of educational researchers and evaluators. However, overuse of surveys—whether they be end-of-course evaluations, needs assessments, or opinion polls—can undermine their utility. In this editorial, I argue that indiscriminate use of surveys may be undercutting their effectiveness as a data collection approach by creating survey fatigue and lowering response rates. Concerted efforts are needed to husband and optimize the use of an endangered resource: the cooperation of the survey populations we seek to describe and understand. In the analysis that follows, I focus on self-administered online surveys, but my observations can be applied to other modes as well, including interviews, mail questionnaires, and focus groups.


Journal of Continuing Education in the Health Professions | 2013

Evaluations of Educational Interventions: Getting Them Published and Increasing Their Impact

Curtis A. Olson; Lori L. Bakken

Advancing the evidence base that informs educational research and practice must include studies with both randomized and nonrandomized designs.1–3 Although experimental designs produce results with a high level of internal validity, randomization and strict controls are not always feasible when evaluating the impact of an educational intervention, and experimental study designs are not suited for answering all evaluation questions, especially when the aim is to produce practical knowledge for immediate use.4 Many reports of evaluations of educational interventions—especially those employing nonrandomized designs, which have methods and reporting conventions that are less standardized—are submitted to JCEHP with significant shortcomings and, as a result, go unpublished or require extensive revisions. In some cases, this reflects shortcomings in the study design; in others, the problem lies more with how the study was initially written up. This is unfortunate for several reasons, not the least of which is the time and effort authors invest in preparing manuscripts for submission. There is currently little guidance specifically for reporting evaluations of continuing education interventions. There are guidelines on reporting innovations in medical school curricula,5 quality improvement projects in health care organizations,6 nonrandomized evaluations of public health interventions,1 and evaluation studies of health promotion

Collaboration


Dive into Curtis A. Olson's collaborations.

Top Co-Authors

Marianna B. Shershneva | University of Wisconsin-Madison

David C. Thomas | Icahn School of Medicine at Mount Sinai

Patricia K. Kokotailo | University of Wisconsin-Madison

Robert Morrow | Montefiore Medical Center