Jason L. Morris
University of Alabama at Birmingham
Publications
Featured research published by Jason L. Morris.
The American Journal of the Medical Sciences | 2014
Joshua Coker; Analia Castiglioni; F. Stanford Massie; Stephen W. Russell; Terrance Shaneyfelt; Lisa L. Willett; Carlos A. Estrada; Ryan R. Kraemer; Jason L. Morris; Martin Rodriguez
Background: Current evaluation tools for medical school courses are limited by the scope of the questions asked and may not fully engage students in thinking about areas to improve. The authors sought to explore whether a technique used to study consumer preferences would elicit specific and prioritized information for course evaluation from medical students. Methods: Using the nominal group technique (4 sessions), 12 senior medical students prioritized and weighed expectations and topics learned in a 100-hour advanced physical diagnosis course (4-week course; February 2012). Students weighted their top 3 responses (top = 3, middle = 2, bottom = 1). Results: Before the course, the 12 students identified 23 topics they expected to learn; the top 3 were reviewing sensitivity/specificity and high-yield techniques (18.5% of total weight), improving diagnosis (13.8%), and reinforcing usual and less well-known techniques (13.8%). After the course, students generated 22 topics learned; the top 3 were practicing and reinforcing advanced maneuvers (25.4%), gaining confidence (22.5%), and learning the evidence (16.9%). The authors observed no differences in the priority of responses before and after the course (P = 0.07). Conclusions: In a physical diagnosis course, medical students elicited specific and prioritized information using the nominal group technique. The course met student expectations regarding education on the evidence-based physical examination, building skills and confidence in the proper techniques and maneuvers, and experiential learning. This novel use for curriculum evaluation may be applied to other courses, especially comprehensive and multicomponent courses.
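As a minimal sketch of the scoring scheme the abstract describes (each student weights their top 3 responses as top = 3, middle = 2, bottom = 1, and each topic is reported as a percentage of the total weight), the following assumes hypothetical, made-up response data rather than the study's actual responses:

from collections import Counter

def ngt_scores(rankings):
    """rankings: one list per student, up to 3 topics ordered top to bottom.
    Returns each topic's share of the total weight, as a percentage."""
    weights = Counter()
    for ranked in rankings:
        # Assign 3 points to the top choice, 2 to the middle, 1 to the bottom.
        for topic, w in zip(ranked, (3, 2, 1)):
            weights[topic] += w
    total = sum(weights.values())
    return {t: round(100 * w / total, 1) for t, w in weights.most_common()}

# Illustrative only; these topics and rankings are invented.
print(ngt_scores([
    ["high-yield techniques", "improving diagnosis", "learn the evidence"],
    ["improving diagnosis", "high-yield techniques", "gaining confidence"],
]))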
Journal of General Internal Medicine | 2011
Deepa Bhatnagar; Jason L. Morris; Martin Rodriguez; Robert M. Centor; Carlos A. Estrada; Lisa L. Willett
In this series, a clinician extemporaneously discusses the diagnostic approach (regular text) to sequentially presented clinical information (bold). Additional commentary on the diagnostic reasoning process (italics) is integrated throughout the discussion.
Journal of General Internal Medicine | 2013
J. William Schleifer; Robert M. Centor; Gustavo R. Heudebert; Carlos A. Estrada; Jason L. Morris
In this series, a clinician extemporaneously discusses the diagnostic approach (regular text) to sequentially presented clinical information (bold). Additional commentary on the diagnostic reasoning process (italics) is integrated throughout the discussion.
Journal of Graduate Medical Education | 2016
John L. Musgrove; Jason L. Morris; Carlos A. Estrada; Ryan R. Kraemer
Background: Published clinical problem-solving exercises have emerged as a common tool to illustrate aspects of the clinical reasoning process. The specific clinical reasoning terms mentioned in such exercises are unknown. Objective: We identified which clinical reasoning terms are mentioned in published clinical problem-solving exercises and compared them with the clinical reasoning terms given high priority by clinician educators. Methods: A convenience sample of clinician educators prioritized a list of clinical reasoning terms (whether to include each term, and the weight percentage among the top 20 terms). The authors then electronically searched for the terms in the text of published reports in 4 internal medicine journals between January 2010 and May 2013. Results: The top 5 clinical reasoning terms ranked by educators were dual-process thinking (weight percentage = 24%), problem representation (12%), illness scripts (9%), hypothesis generation (7%), and problem categorization (7%). The top clinical reasoning terms mentioned in the text of the 79 published reports were context specificity (n = 20, 25%), bias (n = 13, 17%), dual-process thinking (n = 11, 14%), illness scripts (n = 11, 14%), and problem representation (n = 10, 13%). Context specificity and bias were not ranked highly by educators. Conclusions: Some core concepts of modern clinical reasoning theory ranked highly by educators are mentioned explicitly in published clinical problem-solving exercises. However, some highly ranked terms were not used, and some terms used were not ranked by the clinician educators. Efforts to teach clinical reasoning to trainees may benefit from a common nomenclature of clinical reasoning terms.
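A hedged sketch of the tally the Methods section describes: counting how many reports mention each clinical reasoning term. The term list and report texts below are placeholders, not the study's actual search tool or data:

def count_term_mentions(reports, terms):
    """Return, for each term, the number of report texts that mention it."""
    counts = {term: 0 for term in terms}
    for text in reports:
        lowered = text.lower()
        for term in terms:
            # Count each report at most once per term, case-insensitively.
            if term.lower() in lowered:
                counts[term] += 1
    return counts

# Invented snippets standing in for full report texts.
reports = ["...dual-process thinking and illness scripts guided the case...",
           "...context specificity shaped the final diagnosis..."]
terms = ["dual-process thinking", "illness scripts", "context specificity"]
print(count_term_mentions(reports, terms))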
Journal of Hospital Medicine | 2014
Nancy M. Tofil; Jason L. Morris; Dawn Taylor Peterson; Penni Watts; Chad Epps; Kathy Harrington; Kevin Leon; Caleb Pierce; Marjorie Lee White
The American Journal of the Medical Sciences | 2010
Mihaela Stefan; J. Matthew Blackwell; Kamau M. Crawford; Samuel Cykert; Johanna Martinez; Sun Wu Sung; Scott A. Holliday; Michael Landry; Nancy LaVine; Nathan Lerfald; Jason L. Morris; Sandra B. Greene
The American Journal of the Medical Sciences | 2010
Mihaela Stefan; J. Matthew Blackwell; Kamau M. Crawford; Johanna Martinez; Sun Wu Sung; Scott A. Holliday; Michael Landry; Nancy LaVine; Nathan Lerfald; Jason L. Morris; Alexandra C. Greene; Samuel Cykert
Archive | 2015
Carlos A. Estrada; Jason L. Morris; Birmingham VAMC; Chad Miller; Analia Castiglioni
Archive | 2013
Carlos A. Estrada; Robert M. Centor; Jason L. Morris; Ryan R. Kraemer; Amanda Vick; Lisa L. Willett; Starr Steinhilber; Chad Miller; Deepa Bhatnagar; Jeff Kohlwes
Archive | 2011
Amanda Vick; Ryan R. Kraemer; Jason L. Morris; Lisa L. Willett; Robert M. Centor; Carlos A. Estrada; J. Martin Rodriguez