
Publication


Featured research published by Nicholas Hartman.


American Journal of Medical Quality | 2016

Delphi Method Validation of a Procedural Performance Checklist for Insertion of an Ultrasound-Guided Internal Jugular Central Line.

Nicholas Hartman; Mary Wittler; Kim Askew; David E. Manthey

Placement of ultrasound-guided central lines is a critical skill for physicians in several specialties. Improving the quality of care delivered surrounding this procedure demands rigorous measurement of competency, and validated tools to assess performance are essential. Using the iterative, modified Delphi technique and experts in multiple disciplines across the United States, the study team created a 30-item checklist designed to assess competency in the placement of ultrasound-guided internal jugular central lines. Cronbach α was 0.94, indicating an excellent degree of internal consistency. Further validation of this checklist will require its implementation in simulated and clinical environments.


Postgraduate Medical Journal | 2017

Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment

Nicholas Hartman; Mary Wittler; Kim Askew; Brian Hiestand; David E. Manthey

Purpose of the study: Tools created to measure procedural competency must be tested in their intended environment against an established standard in order to be validated. We previously created a checklist for ultrasound-guided internal jugular central venous catheter (US IJ CVC) insertion using the modified Delphi method. We sought to further validate the checklist tool for use in an educational environment.

Study design: This is a cohort study involving 15 emergency medicine interns being evaluated on their skill in US IJ CVC placement. We compared the checklist tool with a modified version of a clinically validated global rating scale (GRS) for procedural performance.

Results: The correlation between the GRS tool and the checklist tool was excellent, with a correlation coefficient (Pearson's r) of 0.90 (p<0.0001).

Conclusions: This checklist represents a useful tool for measuring procedural competency.


Journal for Healthcare Quality | 2016

A Multidisciplinary Self-Directed Learning Module Improves Knowledge of a Quality Improvement Instrument: The HEART Pathway.

Nicholas Hartman; Erin N. Harper; Lauren M. Leppert; Brittany M. Browning; Kim Askew; David E. Manthey; Simon A. Mahler

We created and tested an educational intervention to support implementation of an institution-wide QI project (the HEART Pathway) designed to improve care for patients with acute chest pain. Although online learning modules have been shown to be effective in imparting knowledge regarding QI projects, it is unknown whether these modules are effective across specialties and healthcare professions. Participants, including nurses, advanced practice clinicians, house staff, and attending physicians (N = 486), were enrolled in an online, self-directed learning course exploring the key concepts of the HEART Pathway. The module was completed by 97% of enrollees (469/486), and 90% passed on the first attempt (422/469). Of the 469 learners, 323 completed the pretest, learning module, and posttest in the correct order. Mean test scores improved significantly, from 74% on the pretest to 89% on the posttest. Following the intervention, the HEART Pathway was used for 88% of patients presenting to our institution with acute chest pain. Our data demonstrate that this online, self-directed learning module can improve knowledge of the HEART Pathway across specialties, paving the way for more efficient and informed care for acute chest pain patients.


Journal of Emergency Medicine | 2015

Faculty Prediction of In-Training Examination Scores of Emergency Medicine Residents: A Multicenter Study

Amer Z. Aldeen; Erin Quattromani; Kelly Williamson; Nicholas Hartman; Natasha B. Wheaton; Jeremy Branzetti

Background: The Emergency Medicine In-Training Examination (EMITE) is one of the few validated instruments for medical knowledge assessment of emergency medicine (EM) residents. The EMITE is administered only once annually, with results available just 2 months before the end of the academic year. An earlier predictor of EMITE scores would be helpful for educators to institute timely remediation plans. A previous single-site study found that only 69% of faculty predictions of EMITE scores were accurate.

Objective: The goal of this article was to measure the accuracy with which EM faculty at five residency programs could predict EMITE scores for resident physicians.

Methods: We asked EM faculty at five different residency programs to predict the 2014 EMITE scores for all their respective resident physicians. The primary outcome was prediction accuracy, defined as the proportion of predictions within 6% of the actual scores. The secondary outcome was prediction precision, defined as the mean deviation of predictions from the actual scores. We assessed faculty background variables for correlation with the two outcomes.

Results: One hundred and eleven faculty participated in the study (response rate 68.9%). Mean prediction accuracy for all faculty was 60.0%. Mean prediction precision was 6.3%. Participants were slightly more accurate at predicting scores of noninterns compared to interns. No faculty background variable correlated with the primary or secondary outcomes. Eight participants predicted scores with high accuracy (>80%).

Conclusions: In this multicenter study, EM faculty possessed only moderate accuracy at predicting resident EMITE scores. A very small subset of faculty members is highly accurate.


Western Journal of Emergency Medicine | 2018

3 for the Price of 1: Teaching Chest Pain Risk Stratification in a Multidisciplinary, Problem-based Learning Workshop

William D. Alley; Cynthia A. Burns; Nicholas Hartman; Kim Askew; Simon A. Mahler

Introduction: Chest pain is a common chief complaint among patients presenting to health systems and often leads to complex and intensive evaluations. While these patients are often cared for by a multidisciplinary team (primary care, emergency medicine, and cardiology), medical students usually learn about the care of these patients in a fragmented, single-specialty paradigm. The present and future care of patients with chest pain is multidisciplinary, and the education of medical students on the subject should be as well. Our objective was to evaluate the effectiveness of a multidisciplinary, problem-based learning workshop to teach third-year medical students about risk assessment for patients presenting with chest pain, specifically focusing on acute coronary syndromes.

Methods: To create an educational experience consistent with multidisciplinary team-based care, we designed a multidisciplinary, problem-based learning workshop to provide medical students with an understanding of how patients with chest pain are cared for in a systems-based manner to improve outcomes. Participants included third-year medical students (n=219) at a single, tertiary care, academic medical center. Knowledge acquisition was tested in a pre-/post-retention test study design.

Results: Following the workshop, students achieved a 19.7% (95% confidence interval [CI] [17.3–22.2%]) absolute increase in scores on post-testing as compared to pre-testing. In addition, students maintained an 11.1% (95% CI [7.2–15.0%]) increase on a retention test vs. the pre-test.

Conclusion: A multidisciplinary, problem-based learning workshop is an effective method of producing lasting gains in student knowledge about chest pain risk stratification.


The Clinical Teacher | 2018

Educational priorities of students in the entrustable professional activity era

Roy E. Strowd; Allison McBride; Jon Goforth; Joseph Cristiano; Nicholas Hartman; Gregory S. Waters; James Beardsley; James E. Johnson; Kim Askew

The Association of American Medical Colleges (AAMC) guidelines on the entrustable professional activities (EPAs) expected of graduating medical students were recently published. Although perceptions of educators, residents and programme directors have been described, the voice of senior medical students is lacking.


MedEdPORTAL | 2018

Satisfaction Academy: A Novel Residency Curriculum to Improve the Patient Experience in the Emergency Department

Jonah Gunalda; Kathleen Hosmer; Nicholas Hartman; Lane Smith; Bradley Chapman; Warren Jones; Michael Irick; Manoj Pariyadath

Introduction: Patient satisfaction is a key indicator of health care value and an increasingly important metric used to assess emergency physician performance and, often, reimbursement. To our knowledge, there is no standardized curriculum within emergency medicine (EM) residency programs that focuses on the patient experience in EM.

Methods: Our novel resident curriculum is an organized approach to enhancing patient-centered care by optimizing the patient experience. It spans the academic year, with key topics organized into a quarterly time line. Topics include physician courtesy and respect, pain management, discussion of diagnostic and therapeutic interventions, timely communication, and delivery of quality care. Each quarter has three components: introduction/didactics, an interactive workshop, and stories and reflection. The instructional methods used include didactic lectures, role-playing, and group reflection and storytelling.

Results: Of 44 participants, 54.5% completed a preintervention survey, and 45.5% completed a postintervention survey. The surveys consisted of 5-point Likert scales measuring degree of agreement with statements that reflected desired behaviors and/or attitudes. On the postintervention survey, participants gave scores indicating general agreement with desired behaviors, including sitting at the bedside, acknowledging all persons in the room, and giving an anticipated disposition, as well as with feeling more knowledgeable about patient satisfaction.

Discussion: Our Satisfaction Academy has filled a significant gap related to enhancing the patient experience. This curriculum is generalizable to other EM residency programs, and the interactive peer-to-peer format is both engaging and customizable.


Western Journal of Emergency Medicine | 2017

This Article Corrects: “Trends in NRMP Data from 2007–2014 for U.S. Seniors Matching into Emergency Medicine”

David E. Manthey; Nicholas Hartman; Aileen Newmyer; Jonah Gunalda; Brian Hiestand; Kim Askew

[This corrects the article on p. 105 in vol. 18, PMID: 28116018.]


Journal of Ultrasound in Medicine | 2017

Derivation of a Performance Checklist for Ultrasound-Guided Arthrocentesis Using the Modified Delphi Method

Derek Kunz; Manoj Pariyadath; Mary Wittler; Kim Askew; David E. Manthey; Nicholas Hartman

Arthrocentesis is an important skill for physicians in multiple specialties. Recent studies indicate a superior safety and performance profile for this procedure using ultrasound guidance for needle placement, and improving quality of care requires a valid measurement of competency using this modality.


Journal of Emergency Medicine | 2016

A Novel Tool for Assessment of Emergency Medicine Resident Skill in Determining Diagnosis and Management for Emergent Electrocardiograms: A Multicenter Study

Nicholas Hartman; Natasha B. Wheaton; Kelly Williamson; Erin Quattromani; Jeremy Branzetti; Amer Z. Aldeen

Collaboration


Dive into Nicholas Hartman's collaborations.

Top Co-Authors

Kim Askew
Wake Forest University

Natasha B. Wheaton
Roy J. and Lucille A. Carver College of Medicine