Publications


Featured research published by John D. Voss.


Journal of General Internal Medicine | 2003

Randomized controlled trial of education and feedback for implementation of guidelines for acute low back pain.

Joel M. Schectman; W. Scott Schroth; Dante Verme; John D. Voss

OBJECTIVE: The effect of clinical guidelines on resource utilization for complex conditions with substantial barriers to clinician behavior change has not been well studied. We report the impact of a multifaceted guideline implementation intervention on primary care clinicians' utilization of radiologic and specialty services for the care of acute low back pain.

DESIGN: Physician groups were randomized to receive guideline education and individual feedback, supporting patient education materials, both, or neither. The impact on guideline adherence and resource utilization was evaluated during the 12-month periods before and after implementation.

PARTICIPANTS: Fourteen physician groups comprising 120 primary care physicians and associate practitioners from 2 group-model HMO practices.

INTERVENTIONS: Guideline implementation used an education/audit/feedback model with local peer opinion leaders. The patient education component included written and videotaped materials on the care of low back pain.

MAIN RESULTS: The clinician intervention was associated with an absolute increase in guideline-consistent behavior of 5.4% in the intervention group versus a decline of 2.7% in the control group (P=.04). The patient education intervention produced no significant change in guideline-consistent behavior, but it was also poorly adopted. Patient characteristics, including duration of pain, prior history of low back pain, and number of visits during the illness episode, were strong predictors of service utilization and guideline-consistent behavior.

CONCLUSIONS: Implementation of an education- and feedback-supported acute low back pain care guideline for primary care clinicians was associated with an increase in guideline-consistent behavior. Patient education materials did not enhance guideline effectiveness. Implementation barriers could limit the utility of this approach in usual care settings.


International Journal of Medical Informatics | 2005

Determinants of physician use of an ambulatory prescription expert system

Joel M. Schectman; John B. Schorling; Mohan M. Nadkarni; John D. Voss

PURPOSE: To determine whether physician experience with and attitude toward computers are associated with adoption of a voluntary ambulatory prescription-writing expert system.

METHODS: A prescription expert system was implemented in an academic internal medicine residency training clinic, and physician utilization was tracked electronically. A physician attitude and behavior survey (response rate 89%) was conducted six months after implementation.

RESULTS: There was wide variability in system adoption and degree of usage, though 72% of physicians reported predominant usage (≥50% of prescriptions) of the expert system six months after implementation. Self-reported and measured technology usage were strongly correlated (r=0.70, p<0.0001). Variation in use was strongly associated with physician attitudes toward system efficiency and effect on quality, but not with prior computer experience, level of training, or satisfaction with their primary care practice. Non-adopters felt that electronic prescribing was more time consuming and were also more likely to believe that their patients preferred handwritten prescriptions.

CONCLUSION: A voluntary electronic prescription system was readily adopted by a majority of physicians who believed it would have a positive impact on the quality and efficiency of care. However, dissatisfaction with system capabilities among both adopters and non-adopters suggests the importance of user education and expectation management following system selection.


Academic Medicine | 2005

The Clinical Health Economics System Simulation (CHESS): a teaching tool for systems- and practice-based learning.

John D. Voss; Mohan M. Nadkarni; Joel M. Schectman

Academic medical centers face barriers to training physicians in the systems- and practice-based learning competencies needed to function in the changing health care environment. To address these problems, the authors at the University of Virginia School of Medicine developed the Clinical Health Economics System Simulation (CHESS), a computerized, team-based, quasi-competitive simulator to teach the principles and practical application of health economics. CHESS simulates treatment costs to patients and society as well as physician reimbursement. It is scenario based, with residents grouped into three teams, each playing CHESS under a different reimbursement model (fee-for-service or capitated). Teams view scenarios and select from two or three treatment options that are medically justifiable yet have different potential cost implications. CHESS displays physician reimbursement and patient and societal costs for each scenario, as well as costs and income summarized across all scenarios and extrapolated to a physician's entire patient panel. Learners are asked to explain these findings and may change treatment options and other variables, such as panel size and case mix, to conduct sensitivity analyses in real time. In evaluations completed in 2003, 68 resident and faculty CHESS participants (94%) at 19 U.S. residency programs preferred CHESS to a traditional lecture-and-discussion format for learning about medical decision making, physician reimbursement, patient costs, and societal costs. Ninety-eight percent reported increased knowledge of health economics after viewing the simulation. CHESS demonstrates the potential of computer simulation to teach health economics and other key elements of practice- and systems-based competencies.
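
To make the fee-for-service versus capitation contrast concrete, here is a minimal sketch of the kind of calculation a CHESS-style scenario displays. The function names and dollar figures are hypothetical illustrations, not the simulator's actual code or data:

```python
# Toy comparison of annual physician income under fee-for-service vs.
# capitation, extrapolated to a patient panel. All numbers are invented
# for illustration; CHESS's real scenario values are not published here.

def fee_for_service_income(visits_per_patient: float, fee_per_visit: float,
                           panel_size: int) -> float:
    # Fee-for-service: the physician is paid for each service delivered,
    # so income scales with utilization.
    return visits_per_patient * fee_per_visit * panel_size

def capitated_income(per_member_per_month: float, panel_size: int) -> float:
    # Capitation: a fixed payment per enrolled patient per month,
    # independent of how many services are delivered.
    return per_member_per_month * 12 * panel_size

panel = 1500  # hypothetical panel size; CHESS lets learners vary this
print(f"Fee-for-service: ${fee_for_service_income(3.2, 85.0, panel):,.0f}/yr")
print(f"Capitation:      ${capitated_income(22.0, panel):,.0f}/yr")
```

Varying `panel` or the per-visit assumptions mirrors the real-time sensitivity analyses the abstract describes.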


The American Journal of the Medical Sciences | 2004

Can Prescription Refill Feedback to Physicians Improve Patient Adherence?

Joel M. Schectman; John B. Schorling; Mohan M. Nadkarni; John D. Voss

Background: Although adherence to long-term drug therapy is an important issue, the means to facilitate its assessment and improvement in clinical practice remain a challenge.

Objective: To evaluate the impact of prescription refill feedback and adherence education provided to primary care physicians.

Methods: We provided 83 resident and attending physicians at a university-based general internal medicine practice with refill adherence reports on each of 340 diabetic patients. An educational session on adherence assessment and improvement techniques was held, and all physicians received a written outline on this topic. Physician attitude toward the intervention and the 6-month change in refill adherence (doses filled/doses prescribed) of their patient panels were assessed. A nonrandomized comparison group of patients receiving hypertension medications, for whom the physicians did not receive feedback, was also evaluated.

Results: The overall improvement in mean refill adherence was not significant (83.9% vs 86.0%, P = 0.18). The educational session was attended by 53% of the physicians. The patient refill adherence of physicians attending the educational session improved by 5.0% (P < 0.0009), with no significant change among patients of physicians not attending the session. There was no adherence change among patients for whom physicians did not receive refill feedback data, regardless of educational session attendance.

Conclusions: Patients of physicians who received refill feedback and attended an educational session improved their refill adherence. If these results are replicated in a randomized trial, broad implementation of this approach could have a substantial public health impact, given the ubiquity of prescription claims data.
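
The refill adherence measure above is defined as doses filled divided by doses prescribed, which can be computed directly from prescription claims. A minimal sketch, assuming a hypothetical claim record format (the paper does not specify how its claims data were structured):

```python
from dataclasses import dataclass

@dataclass
class RefillClaim:
    # Hypothetical claim fields; real claims data would need mapping to these.
    patient_id: str
    doses_filled: int      # doses actually dispensed in the measurement window
    doses_prescribed: int  # doses the prescriber intended for the same window

def refill_adherence(claims: list[RefillClaim]) -> float:
    """Refill adherence = total doses filled / total doses prescribed."""
    prescribed = sum(c.doses_prescribed for c in claims)
    filled = sum(c.doses_filled for c in claims)
    return filled / prescribed if prescribed else 0.0

# Example: 270 of 300 prescribed doses filled gives 90% adherence.
claims = [RefillClaim("p1", 180, 200), RefillClaim("p1", 90, 100)]
print(f"{refill_adherence(claims):.1%}")  # -> 90.0%
```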


American Journal of Medical Quality | 2004

The Effect of Physician Feedback and an Action Checklist on Diabetes Care Measures

Joel M. Schectman; John B. Schorling; Mohan M. Nadkarni; Jason A. Lyman; Mir S. Siadaty; John D. Voss

The objective was to evaluate whether physician feedback accompanied by an action checklist improved diabetes care process measures. Eighty-three physicians in an academic general medicine clinic were provided a single feedback report showing the most recent date and result of diabetes care measures (glycosylated hemoglobin [A1c], urine microalbumin, serum creatinine, lipid levels, retinal examination) as well as recent diabetes medication refills with calculated dosing and adherence for 789 patients. An educational session regarding the feedback and adherence information was provided. The physicians were asked to complete a checklist accompanying the feedback on each of their patients, indicating requested actions with respect to follow-up, testing, and counseling. The physicians completed 82% of patient checklists, requesting actions consistent with patient needs on the basis of the feedback. Of the physicians, 93% felt the patient information and intervention format to be useful. The odds of urine microalbumin testing, serum creatinine, lipid profile, A1c, and retinal examination increased in the 6 months after the feedback. The increase was sustained at 1 year only for microalbumin testing and retinal examinations. There was no significant change in refill adherence for the group overall after the feedback, although adherence did improve among patients of physicians attending the educational session. No significant change was noted in lipid or A1c levels during the study period. In conclusion, a simple physician feedback tool with an action checklist can be both helpful and popular for improving rates of adherence to diabetes care guidelines. More complex interventions are likely required to improve diabetes outcomes.
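
"The odds of ... testing increased" refers to comparing the odds of a process measure (for example, microalbumin testing) after feedback versus before. A minimal worked sketch with invented counts, since the abstract reports only the direction of change:

```python
def odds_ratio(events_after: int, n_after: int,
               events_before: int, n_before: int) -> float:
    """Odds ratio of an event after vs. before an intervention."""
    odds_after = events_after / (n_after - events_after)
    odds_before = events_before / (n_before - events_before)
    return odds_after / odds_before

# Hypothetical: 420 of 789 patients had microalbumin testing after feedback,
# versus 320 of 789 before. OR > 1 means the odds of testing increased.
print(f"OR = {odds_ratio(420, 789, 320, 789):.2f}")  # -> OR = 1.67
```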


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2010

Supporting physicians' practice-based learning and improvement (PBLI) and quality improvement through exploration of population-based medical data.

Leigh A. Baumgart; Ellen J. Bass; Jason A. Lyman; Sherry Springs; John D. Voss; Gregory F. Hayden; Martha A. Hellems; Tracey R. Hoke; Katharine A. Schlag; John B. Schorling

Participating in self-assessment activities may stimulate improvement in practice behaviors. However, it is unclear how best to support the development of self-assessment skills, particularly in the health care domain. Exploration of population-based data is one method of enabling health care providers to identify deficiencies in overall practice behavior that can motivate quality improvement initiatives. At the University of Virginia, we are developing a decision support tool that integrates and presents population-based patient data to health care providers, covering both clinical outcomes and non-clinical measures (e.g., demographic information). By enabling users to separate their direct impact on clinical outcomes from factors outside their control, we may enhance the self-assessment process.


IEEE Transactions on Human-Machine Systems | 2015

Effect of Pooled Comparative Information on Judgments of Quality

Leigh A. Baumgart; Ellen J. Bass; John D. Voss; Jason A. Lyman

Quality assessment is the focus of many health care initiatives. Yet it is not well understood how the type of information presented in decision support tools affects the accuracy, consistency, and reliability of the quality judgments physicians make from data. Pooled comparative information could allow physicians to judge the quality of their practice by making comparisons with other practices or with specific populations of patients. In this study, resident physicians were provided with varying types of information derived from pooled patient datasets: quality component measures at the individual and group level, a qualitative interpretation of the quality measures using percentile rank, and an aggregate composite quality score. Thirty-two participants viewed 30 quality profiles consisting of information applicable to the practice of 30 deidentified resident physicians. Those provided with quality component measures and a qualitative interpretation of the measures (rankings) judged quality of care more similarly to experts and were more internally consistent than participants who were provided with quality component measures alone. Reliability between participants was significantly lower for those who were provided with a composite quality score than for those who were not.
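
As an illustration of the information types the study compares, the sketch below derives a percentile rank and an equal-weight composite from component quality measures. The weighting scheme and all values are hypothetical, since the paper's actual scoring formula is not given here:

```python
from statistics import mean

def percentile_rank(value: float, peer_values: list[float]) -> float:
    """Percentage of peers scoring at or below this value (0-100)."""
    return 100.0 * sum(v <= value for v in peer_values) / len(peer_values)

def composite_score(components: dict[str, float]) -> float:
    """Equal-weight composite of component measures, each on a 0-1 scale."""
    return mean(components.values())

# Hypothetical component quality measures for one resident physician.
components = {"A1c tested": 0.82, "lipids tested": 0.74, "retinal exam": 0.61}
peers = [0.55, 0.60, 0.68, 0.72, 0.75, 0.80]  # peers' composite scores

me = composite_score(components)
print(f"composite = {me:.2f}, percentile rank = {percentile_rank(me, peers):.0f}")
```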


Diabetes Care | 2002

The Association Between Diabetes Metabolic Control and Drug Adherence in an Indigent Population

Joel M. Schectman; Mohan M. Nadkarni; John D. Voss


Medical Care | 2002

Predictors of medication-refill adherence in an indigent rural population.

Joel M. Schectman; Viktor E. Bovbjerg; John D. Voss


Academic Medicine | 2008

Changing conversations: teaching safety and quality in residency training.

John D. Voss; Natalie B. May; John B. Schorling; Jason A. Lyman; Joel M. Schectman; Andrew M.D. Wolf; Mohan M. Nadkarni; Margaret Plews-Ogan

Collaboration


John D. Voss's top co-authors:

Jason A. Lyman
University of Virginia Health System

Ye Chen
University of Virginia

Ellen J. Bass
Applied Science Private University