
Publication


Featured research published by Luan Lawson.


Prehospital Emergency Care | 2002

Automated external defibrillation by very young, untrained children

Luan Lawson; Juan A. March

For patients with sudden cardiac death (SCD), the time interval to defibrillation is the main determinant of survival. As such, the American Heart Association has attempted to promote public-access defibrillation (PAD). Previous studies have shown that automated external defibrillators (AEDs) can be used successfully by untrained adults. Objective: To determine whether very young, untrained children could use AEDs. Methods: Third-grade students from an elementary school, a convenience sample of volunteers, participated in this study. They were given no formal training, but were shown how to peel the backing off the electrode pads, like a sticker. Students were then given a mock code situation using a training manikin, and the time to delivery of the first shock was recorded. Students were then trained during a 2-minute one-on-one review of the process with an instructor, and the study was repeated. Data were analyzed using a paired Student's t-test comparing pre- and post-training times. Results: Thirty-one children participated in the study, with a median age of 9 years. For untrained children, the mean time to delivery of the first shock was 59.3 ± 13.6 seconds (95% CI = 54.3 to 64.3). Following training, the mean time to delivery of the first shock was 35.2 ± 6.0 seconds (95% CI = 33.0 to 37.4; p = 0.001). Conclusion: Although this study suggests that even very young, untrained children can successfully perform automated external defibrillation, training significantly decreases the time to delivery of the first shock.
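
A minimal sketch of the analysis this abstract describes: a paired Student's t-test with 95% confidence intervals around each mean. The per-student times were not published, so the data below are simulated to roughly match the reported means and standard deviations; the helper mean_ci and the random seed are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of the paired pre/post comparison described above.
# Raw per-student times are not published; we simulate data with roughly
# the reported means and SDs purely to illustrate the analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 31                              # children in the study
pre = rng.normal(59.3, 13.6, n)     # untrained time to first shock (s)
post = rng.normal(35.2, 6.0, n)     # post-training time to first shock (s)

# Paired Student's t-test: the same students are measured before and after.
t_stat, p_value = stats.ttest_rel(pre, post)

def mean_ci(x, level=0.95):
    """Mean and two-sided t-based confidence interval, as in the abstract."""
    m, se = x.mean(), stats.sem(x)
    half = se * stats.t.ppf((1 + level) / 2, len(x) - 1)
    return m, m - half, m + half

for label, x in [("pre", pre), ("post", post)]:
    m, lo, hi = mean_ci(x)
    print(f"{label:>4}: mean = {m:.1f} s, 95% CI [{lo:.1f}, {hi:.1f}]")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```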


Academic Medicine | 2017

Health Systems Science Curricula in Undergraduate Medical Education: Identifying and Defining a Potential Curricular Framework.

Jed D. Gonzalo; Michael Dekhtyar; Stephanie R. Starr; Jeffrey Borkan; Patrick Brunett; Tonya L. Fancher; Jennifer Green; Sara Jo Grethlein; Cindy J. Lai; Luan Lawson; Seetha Monrad; Patricia S. O’Sullivan; Mark D. Schwartz; Susan E. Skochelak

Purpose The authors performed a review of 30 Accelerating Change in Medical Education full grant submissions and an analysis of the health systems science (HSS)-related curricula at the 11 grant recipient schools to develop a potential comprehensive HSS curricular framework with domains and subcategories. Method In phase 1, to identify domains, grant submissions were analyzed and coded using constant comparative analysis. In phase 2, a detailed review of all existing and planned syllabi and curriculum documents at the grantee schools was performed, and content in the core curricular domains was coded into subcategories. The lead investigators reviewed and discussed drafts of the categorization scheme, collapsed and combined domains and subcategories, and resolved disagreements via group discussion. Results Analysis yielded three types of domains: core, cross-cutting, and linking. Core domains included health care structures and processes; health care policy, economics, and management; clinical informatics and health information technology; population and public health; value-based care; and health system improvement. Cross-cutting domains included leadership and change agency; teamwork and interprofessional education; evidence-based medicine and practice; professionalism and ethics; and scholarship. One linking domain was identified: systems thinking. Conclusions This broad framework aims to build on the traditional definition of systems-based practice and highlight the need for medical and other health professions schools to better align education programs with the anticipated needs of the systems in which students will practice. HSS will require a critical investigation into existing curricula to determine the most efficient methods for integration with the basic and clinical sciences.


Academic Emergency Medicine | 2009

RVU Ready? Preparing Emergency Medicine Resident Physicians in Documentation for an Incentive‐based Work Environment

Kelly Carter; Brian C. Dawson; Kori L. Brewer; Luan Lawson

OBJECTIVES The emergency medicine (EM) job market is increasingly focused on incentive-based reimbursement, which is largely based on relative value units (RVUs) and is directly related to documentation of patient care. Previous studies have shown a need to improve resident education in documentation. The authors created a focused educational intervention on billing and documentation practices to meet this identified need. The hypothesis of this study was that the intervention would increase the RVUs generated by EM resident physicians and the average amount billed per patient. METHODS The authors used a quasi-experimental study design. The educational intervention included a 1-hour lecture on documentation and billing, biweekly newsletters, and case-specific feedback from the billing department for EM resident physicians. RVUs and charges generated per patient were recorded for all second- and third-year resident physicians for a 3-month period prior to the educational intervention and for a 3-month period following the intervention. Pre- and postintervention data were compared using Student's t-test and repeated-measures analysis of variance, as appropriate. RESULTS The evaluation and management (E/M) chart levels billed during each phase of the study were significantly different (p < 0.0001). The total number of RVUs generated per hour increased from 3.17 in the first phase to 3.71 in the second phase (p = 0.0001). During the initial 3-month phase, the average amount billed per patient seen by a second- or third-year resident was 282.82 USD, which increased to 301.94 USD in the second phase (p = 0.0004). CONCLUSIONS The educational intervention positively affected resident documentation, resulting in greater RVUs/hour and greater billing performance in the study emergency department (ED).
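
The pre/post comparison described above can be sketched as a repeated-measures ANOVA with study phase as the within-subject factor. Everything below is an illustrative assumption (resident IDs, a cohort of 20, the per-phase RVU spread); statsmodels' AnovaRM is one standard way to fit such a model, not necessarily the authors' tooling.

```python
# Hypothetical sketch of a repeated-measures ANOVA on RVUs/hour by phase.
# Resident IDs, cohort size, and values are simulated for illustration.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
residents = [f"R{i:02d}" for i in range(20)]

# Long format: one mean RVUs/hour observation per resident per study phase.
df = pd.DataFrame({
    "resident": residents * 2,
    "phase": ["pre"] * 20 + ["post"] * 20,
    "rvu_per_hour": np.concatenate([
        rng.normal(3.17, 0.4, 20),   # pre-intervention phase
        rng.normal(3.71, 0.4, 20),   # post-intervention phase
    ]),
})

# Repeated-measures ANOVA: phase is the within-subject factor.
result = AnovaRM(df, depvar="rvu_per_hour", subject="resident",
                 within=["phase"]).fit()
print(result)
```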


Academic Medicine | 2017

Priority Areas and Potential Solutions for Successful Integration and Sustainment of Health Systems Science in Undergraduate Medical Education

Jed D. Gonzalo; Elizabeth G. Baxley; Jeffrey Borkan; Michael Dekhtyar; Richard E. Hawkins; Luan Lawson; Stephanie R. Starr; Susan E. Skochelak

Educators, policy makers, and health systems leaders are calling for significant reform of undergraduate medical education (UME) and graduate medical education (GME) programs to meet the evolving needs of the health care system. Nationally, several schools have initiated innovative curricula in both classroom and workplace learning experiences to promote education in health systems science (HSS), which includes topics such as value-based care, health system improvement, and population and public health. However, the successful implementation of HSS curricula across schools is challenged by issues of curriculum design, assessment, culture, and accreditation, among others. In this report of a working conference using thematic analysis of workshop recommendations and experiences from 11 U.S. medical schools, the authors describe seven priority areas for the successful integration and sustainment of HSS in educational programs, and associated challenges and potential solutions. In 2015, following regular HSS workgroup phone calls and an Accelerating Change in Medical Education consortium-wide meeting, the authors identified the priority areas: partner with licensing, certifying, and accrediting bodies; develop comprehensive, standardized, and integrated curricula; develop, standardize, and align assessments; improve the UME to GME transition; enhance teachers’ knowledge and skills, and incentives for teachers; demonstrate value added to the health system; and address the hidden curriculum. These priority areas and their potential solutions can be used by individual schools and HSS education collaboratives to further outline and delineate the steps needed to create, deliver, study, and sustain effective HSS curricula with an eye toward integration with the basic and clinical sciences curricula.


Western Journal of Emergency Medicine | 2015

Correlation of the NBME Advanced Clinical Examination in EM and the National EM M4 exams

Katherine M. Hiller; Emily S. Miller; Luan Lawson; David A. Wald; Michael S. Beeson; Corey Heitz; Thomas K. Morrissey; Joseph B. House; Stacey Poznanski

Introduction Since 2011, two online, validated exams for fourth-year emergency medicine (EM) students have been available (the National EM M4 Exams). In 2013 the National Board of Medical Examiners offered the Advanced Clinical Examination in Emergency Medicine (EM-ACE). All of these exams are now in widespread use; however, there are no data on how they correlate. This study evaluated the correlation between the EM-ACE exam and the National EM M4 Exams. Methods From May 2013 to April 2014 the EM-ACE and one version of the EM M4 exam were administered sequentially to fourth-year EM students at five U.S. medical schools. Data collected included institution, gross and scaled scores, and version of the EM M4 exam. We performed Pearson’s correlation and random-effects linear regression. Results 303 students took the EM-ACE and version 1 (V1) or 2 (V2) of the EM M4 exams (279 and 24, respectively). The mean percent correct was as follows: EM-ACE 74.8 (SD 8.83), V1 83.0 (SD 6.41), V2 78.5 (SD 7.70). Pearson’s correlation coefficient for V1/EM-ACE was 0.51 (0.42 scaled) and for V2/EM-ACE was 0.59 (0.41 scaled). The coefficient of determination for V1/EM-ACE was 0.72 and for V2/EM-ACE was 0.71 (0.86 and 0.49 for scaled scores). The R-squared values were 0.25 and 0.30 (0.18 and 0.13, scaled), respectively. There was a significant cluster effect by institution. Conclusion There was a moderate positive correlation between student scores on the EM-ACE exam and the National EM M4 Exams.
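
A minimal sketch of the two analyses this abstract names: Pearson's correlation (whose square gives an R-squared value) and a random-effects regression that absorbs the reported cluster effect by institution via a random intercept. The scores below are simulated, and the variable names and the choice of statsmodels' mixedlm are assumptions for illustration, not the authors' actual code.

```python
# Hypothetical sketch: Pearson correlation plus a random-intercept model
# grouping students by institution. All values are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)
n = 303                                   # students, as in the abstract
df = pd.DataFrame({
    "institution": rng.integers(1, 6, n).astype(str),  # five schools
    "em_m4": rng.normal(83.0, 6.4, n),    # EM M4 percent correct (simulated)
})
# EM-ACE scores loosely tied to EM M4 scores, plus noise (illustrative).
df["em_ace"] = 0.7 * df["em_m4"] + rng.normal(0, 7, n)

# Pearson's r; r squared is the coefficient of determination.
r, p = stats.pearsonr(df["em_m4"], df["em_ace"])
print(f"r = {r:.2f}, r^2 = {r**2:.2f}, p = {p:.2g}")

# Random-intercept (random-effects) model to handle clustering by school.
fit = smf.mixedlm("em_ace ~ em_m4", df, groups=df["institution"]).fit()
print(fit.summary())
```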


Academic Medicine | 2016

The Teachers of Quality Academy: A Learning Community Approach to Preparing Faculty to Teach Health Systems Science.

Elizabeth G. Baxley; Luan Lawson; Herbert G. Garrison; Danielle S. Walsh; Suzanne Lazorick; Donna Lake; Jason Higginson

Problem Although efforts to integrate health systems science (HSS) topics, such as patient safety, quality improvement (QI), interprofessionalism, and population health, into health professions curricula are increasing, the rate of change has been slow. Approach The Teachers of Quality Academy (TQA), Brody School of Medicine at East Carolina University, was established in January 2014 with the dual goal of preparing faculty to lead frontline clinical transformation while becoming proficient in the pedagogy and curriculum design necessary to prepare students in HSS competencies. The TQA included the completion of the Institute for Healthcare Improvement Open School Basic Certificate in Quality and Safety; participation in six 2-day learning sessions on key HSS topics; completion of a QI project; and participation in three online graduate courses. Outcomes Twenty-seven faculty from four health science programs completed the program. All completed their QI projects. Nineteen (70%) have been formally engaged in the design and delivery of the medical student curriculum in HSS. Early into their training, TQA participants began to apply new knowledge and skills in HSS to the development of educational initiatives beyond the medical student curriculum. Next Steps Important next steps for TQA participants and program planners include further incorporation as faculty advisors and contributors to the full implementation of the longitudinal HSS curriculum; expanded involvement with the Leaders in Innovative Care Scholars student leadership distinction track; continued in-depth evaluation of the impact of TQA participation on patient care, teaching, and role modeling; and the recruitment of the next cohort of TQA participants.


Western Journal of Emergency Medicine | 2015

Medical Student Performance on the National Board of Medical Examiners Emergency Medicine Advanced Clinical Examination and the National Emergency Medicine M4 Exams.

Katherine M. Hiller; Joseph B. House; Luan Lawson; Stacey Poznanski; Thomas K. Morrissey

Introduction In April 2013, the National Board of Medical Examiners (NBME) released an Advanced Clinical Examination (ACE) in emergency medicine (EM). In addition to this new resource, CDEM (Clerkship Directors in EM) provides two online, high-quality, internally validated examinations. National usage statistics are available for all three examinations; however, it is currently unknown how students entering an EM residency perform compared to the entire national cohort. This information may help educators interpret examination scores of both EM-bound and non-EM-bound students. Objectives The objective of this study was to compare EM clerkship examination performance between students who matched into an EM residency in 2014 and students who did not. Comparisons were made using the EM-ACE and both versions of the National fourth-year medical student (M4) EM examinations. Methods In this retrospective multi-institutional cohort study, the EM-ACE and either version 1 (V1) or version 2 (V2) of the National EM M4 examination were given to students taking a fourth-year EM rotation at five institutions between April 2013 and February 2014. We collected examination performance, including the scaled EM-ACE score and percent correct on the EM M4 exams, along with 2014 NRMP Match status. Student's t-tests were performed on the examination averages of students who matched in EM compared with those who did not. Results A total of 606 students from five different institutions took both the EM-ACE and one of the EM M4 exams; 94 (15.5%) students matched in EM in the 2014 Match. The mean scores for EM-bound students on the EM-ACE, V1, and V2 of the EM M4 exams were 70.9 (n=47, SD=9.0), 84.4 (n=36, SD=5.2), and 83.3 (n=11, SD=6.9), respectively. Mean scores for non-EM-bound students were 68.0 (n=256, SD=9.7), 82.9 (n=243, SD=6.5), and 74.5 (n=13, SD=5.9). There was a significant difference in mean scores between EM-bound and non-EM-bound students for the EM-ACE (p=0.05) and V2 (p<0.01), but not V1 (p=0.18), of the National EM M4 examination. Conclusion Students who successfully matched in EM performed better on all three exams at the end of their EM clerkship.
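
Because the abstract reports each group's mean, SD, and size, the t-tests can be approximately reproduced from the summary statistics alone; scipy's ttest_ind_from_stats does exactly this. A minimal sketch, assuming pooled (equal) variances, which may differ from the authors' exact procedure:

```python
# Recompute two-sample t-tests from the summary statistics in the abstract.
from scipy.stats import ttest_ind_from_stats

# (mean, SD, n) for EM-bound vs. non-EM-bound students, per exam.
exams = {
    "EM-ACE":   ((70.9, 9.0, 47), (68.0, 9.7, 256)),
    "EM M4 V1": ((84.4, 5.2, 36), (82.9, 6.5, 243)),
    "EM M4 V2": ((83.3, 6.9, 11), (74.5, 5.9, 13)),
}

for name, (em, non_em) in exams.items():
    t, p = ttest_ind_from_stats(*em, *non_em)   # pooled-variance t-test
    print(f"{name}: t = {t:.2f}, p = {p:.3f}")
```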


International Journal of Emergency Medicine | 2011

Transvaginal evisceration progressing to peritonitis in the emergency department: a case report

Luan Lawson; Leigh Patterson; Kelly Carter

Background: Abdominal pain is a common complaint among emergency department patients, making it essential to identify those with life-threatening etiologies. We report on the rare finding of atraumatic transvaginal bowel evisceration in a patient presenting to the emergency department with the primary complaint of abdominal pain. Case description: A 63-year-old female presented ambulatory to the emergency department with abdominal pain and a foreign body sensation in her vagina after coughing. Physical exam demonstrated evisceration of her small bowel through her vagina. During her clinical course, she rapidly deteriorated from appearing well without abdominal tenderness to hypotensive with frank peritonitis. Conclusion: This case demonstrates the need to perform a thorough physical exam on all patients with abdominal pain and details the management of vaginal evisceration. It also highlights the difficulty of appropriately triaging patients with complaints not easily assessed in triage. In an era of emergency department crowding, emergency physicians should reevaluate nursing education on triaging abdominal pain to prevent delays in caring for well-appearing patients who have underlying life-threatening illnesses.


Western Journal of Emergency Medicine | 2018

The National Clinical Assessment Tool for Medical Students in the Emergency Department (NCAT-EM)

Julianna Jung; Douglas Franzen; Luan Lawson; David E. Manthey; Matthew Tews; Nicole M. Dubosh; Jonathan Fisher; Marianne Haughey; Joseph B. House; Arleigh Trainor; David A. Wald; Katherine M. Hiller

Introduction Clinical assessment of medical students in emergency medicine (EM) clerkships is a highly variable process that presents unique challenges and opportunities. Currently, clerkship directors use institution-specific tools with unproven validity and reliability that may or may not address the competencies valued most highly in the EM setting. Standardization of assessment practices and development of a common, valid, specialty-specific tool would benefit EM educators and students. Methods A two-day national consensus conference was held in March 2016 in the Clerkship Directors in Emergency Medicine (CDEM) track at the Council of Residency Directors in Emergency Medicine (CORD) Academic Assembly in Nashville, TN. The goal of this conference was to standardize assessment practices and to create a national clinical assessment tool for use in EM clerkships across the country. Before the consensus-building activities, conference leaders synthesized the literature, articulated major themes and questions pertinent to clinical assessment of students in EM, clarified the issues, and outlined the consensus-building process. Results The first day of the conference was dedicated to developing consensus on these key themes in clinical assessment. The second day was dedicated to discussing and voting on proposed domains to be included in the national clinical assessment tool. A modified Delphi process was initiated after the conference to reconcile questions and items that did not reach an a priori level of consensus. Conclusion The final tool, the National Clinical Assessment Tool for Medical Students in Emergency Medicine (NCAT-EM), is presented here.


Western Journal of Emergency Medicine | 2015

Correlation of the National Emergency Medicine M4 Clerkship Examination with USMLE Examination Performance

Luan Lawson; Davis Musick; Kori L. Brewer

Introduction Assessment of medical students' knowledge in clinical settings is complex yet essential to the learning process. Clinical clerkships use various types of written examinations to objectively test medical knowledge within a given discipline. Within emergency medicine (EM), a new national standardized exam was developed to test medical knowledge in this specialty. Evaluation of the psychometric properties of a new examination is an important issue to address during test development and use. Studies have shown that student performance on selected standardized exams will reveal students' strengths and/or weaknesses, so that effective remedial efforts can be implemented. Our study sought to address these issues by examining the association of scores on the new EM national exam with other standardized exam scores. Methods From August 2011 to April 2013, average National EM M4 examination scores of fourth-year medical students taken at the end of a required EM clerkship were compiled. We examined the correlation of the National EM M4 examination with the scores of initial attempts of the United States Medical Licensing Examination (USMLE) Step 1 and Step 2 Clinical Knowledge (CK) examinations. Correlation coefficients and 95% confidence intervals of the correlation coefficients are reported. We also examined the association between the National EM M4 examination score, final grades for the EM rotation, and USMLE Step 1 and Step 2 CK scores. Results 133 students were included in the study and achieved a mean score of 79.5 (SD 8.0) on the National EM M4 exam, compared to a national mean of 79.7 (SD 3.89). The mean USMLE Step 1 score was 226.8 (SD 19.3), and the mean USMLE Step 2 CK score was 238.5 (SD 18.9). National EM M4 examination scores showed moderate correlation with both USMLE Step 1 (correlation coefficient = 0.50; 95% CI [0.28–0.67]) and USMLE Step 2 CK (correlation coefficient = 0.47; 95% CI [0.25–0.65]). Students scoring below the median on the National EM M4 exam also scored well below their colleagues on the USMLE exams. Conclusion The moderate correlation of the National EM M4 examination with USMLE Step 1 and Step 2 CK scores supports the use of the CDEM National EM M4 examination as an effective means of assessing medical knowledge in fourth-year medical students. Identification of students scoring lower on standardized exams allows effective remedial efforts to be undertaken throughout the medical education process.
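
One standard way to attach a 95% confidence interval to a Pearson correlation coefficient is the Fisher z-transform, sketched below with the r values and sample size from this abstract. The paper does not state how its intervals were computed, so this sketch's output need not match the published intervals exactly.

```python
# Fisher z-transform confidence interval for Pearson's r (one common method;
# assumed here for illustration, not necessarily the authors' approach).
import numpy as np
from scipy import stats

def pearson_ci(r, n, level=0.95):
    """Two-sided CI for Pearson's r via the Fisher z-transform."""
    z = np.arctanh(r)                        # map r into z-space
    se = 1.0 / np.sqrt(n - 3)                # standard error of z
    half = se * stats.norm.ppf((1 + level) / 2)
    return np.tanh(z - half), np.tanh(z + half)

for label, r in [("Step 1", 0.50), ("Step 2 CK", 0.47)]:
    lo, hi = pearson_ci(r, n=133)            # n = 133 students, per abstract
    print(f"EM M4 vs USMLE {label}: r = {r:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```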

Collaboration


Dive into Luan Lawson's collaborations.

Top Co-Authors

Julianna Jung

Johns Hopkins University School of Medicine

Kori L. Brewer

East Carolina University
