Publication


Featured research published by Marianne M. Green.


Academic Medicine | 2009

Selection Criteria for Residency: Results of a National Program Directors Survey

Marianne M. Green; Paul Jones; J. X. Thomas

Purpose To assess the relative importance of criteria used for residency selection in 21 medical specialties given current available data and competitiveness of specialties. Method In 2006, questionnaires were distributed to 2,528 program directors in university hospital or university-affiliated community hospital residency programs across 21 medical specialties. Responses were recorded using a five-point Likert scale of importance. Mean values for each item were calculated within and across all specialties. Mean scores for item responses were used to create rank orders of selection criteria within the specialties. To facilitate comparisons, specialties were grouped according to the percentages of positions filled with U.S. medical school graduates. Results The overall response rate was 49%. With the data from all specialties pooled, the top five selection criteria were (1) grades in required clerkships, (2) United States Medical Licensing Examination (USMLE) Step 1 score, (3) grades in senior electives in specialty, (4) number of honors grades, and (5) USMLE Step 2 Clinical Knowledge (CK) score. Conclusions The top academic selection criteria are based on clinical performance, with the exception of USMLE Step 1 score. Indicators that reflect excellence in clinical performance are valued across the specialties by residency program directors regardless of competitiveness within the specialty. USMLE Step 2 CK ranks higher in the less competitive specialties, whereas research experience is more prominent in the most competitive specialties. The Medical Student Performance Evaluation was ranked lowest of all criteria by the program directors.
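
As a rough illustration of the ranking procedure described in this abstract, the sketch below computes mean importance scores from 5-point Likert responses and orders criteria by those means. The criterion names, specialties, and ratings are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch: ranking residency selection criteria by mean
# importance on a 5-point Likert scale, within and across specialties.
# All names and values below are invented for illustration.
import pandas as pd

responses = pd.DataFrame({
    "specialty": ["IM", "IM", "Surgery", "Surgery"],
    "clerkship_grades": [5, 4, 5, 5],
    "usmle_step1": [4, 5, 5, 4],
    "mspe": [3, 2, 2, 3],
})

# Mean importance of each criterion within each specialty.
within_specialty = responses.groupby("specialty").mean(numeric_only=True)
print(within_specialty)

# Pooled means across specialties, converted to a rank order (1 = most important).
pooled = responses.drop(columns="specialty").mean()
rank_order = pooled.rank(ascending=False).astype(int).sort_values()
print(rank_order)
```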


Academic Medicine | 2009

Defining professionalism from the perspective of patients, physicians, and nurses.

Marianne M. Green; Amanda Zick; Gregory Makoul

Purpose Although professionalism has always been a core value in medicine, it has received increasingly explicit attention over the past several years. Unfortunately, the terms used to explain this competency have been rather abstract. This study was designed to identify and prioritize behaviorally based signs of medical professionalism that are relevant to patients, physicians, and nurses. Method The qualitative portion of this project began in 2004 with a series of 22 focus groups held to explore behavioral signs of professionalism in medicine. Separate groups were held with patients, inpatient nurses, outpatient nurses, resident physicians, and attending physicians from different specialties, generating a total of 68 behaviorally based items. In 2004–2006, quantitative data were collected through national patient (n = 415) and physician leader (n = 214) surveys and a statewide nurse (n = 237) survey that gauged the importance these groups attach to the behaviors as signs of professionalism and determined whether they are in a position to observe these behaviors in the clinical setting. Results The surveys of patients, physician leaders, and nurses provided different perspectives on the importance and visibility of behavioral signs of professionalism. Most of the behaviors were deemed very important signs of professionalism by at least 75% of patients, physicians, and/or nurses; far fewer were considered observable in the clinical setting. Conclusions This study demonstrates that it is possible and instructive to define professionalism in terms of tangible behaviors. Focusing on behaviors rather than attributes may facilitate discussion, assessment, and modeling of professionalism in both medical education and clinical care.


Medical Teacher | 2016

Status of portfolios in undergraduate medical education in LCME-accredited US medical schools

Jason Chertoff; Ashleigh Wright; Maureen Novak; Joseph Fantone; Amy Fleming; Toufeeq Ahmed; Marianne M. Green; Adina Kalet; Machelle Linsenmeyer; Joshua Jacobs; Christina Dokter; Zareen Zaidi

Aim: We sought to investigate the number of US medical schools utilizing portfolios, the format of portfolios, information technology (IT) innovations, the purpose of portfolios, and their ability to engage faculty and students. Methods: A 21-question survey regarding portfolios was sent to the 141 LCME-accredited US medical schools. Results: The response rate was 50% (71/141); 47% of respondents (33/71) reported that their medical school used portfolios in some form. Of those, 7% reported the use of paper-based portfolios and 76% used electronic portfolios. Forty-five percent reported portfolio use for formative evaluation only, 48% for both formative and summative evaluation, and 3% for summative evaluation alone. Seventy-two percent developed a longitudinal, competency-based portfolio. The most common feature of portfolios was reflective writing (79%). Seventy-three percent allowed access to the portfolio off-campus, 58% allowed the use of tablets and mobile devices, and 9% involved social media within the portfolio. Eighty percent and 69% agreed that the portfolio engaged students and faculty, respectively. Ninety-seven percent reported that the portfolios used at their institution have room for improvement. Conclusion: While there is significant variation in the purpose and structure of portfolios in the medical schools surveyed, most schools using portfolios reported a high level of engagement with students and faculty.


JAMA | 2017

Teaching Medical Students About Conflicts of Interest

Diane B. Wayne; Marianne M. Green; Eric G. Neilson

A long-standing ethos surrounds the practice of medicine. In that ethos, physicians cannot fulfill their healing purpose without showing a high level of professionalism toward patients. It is part of medicine’s social contract, a contract through which public scrutiny typically tells physicians how well they are doing and how well they have been taught. Medical educators within the span of modern memory still believe that the careful selection of students maximizes the likelihood that new learners, in addition to acquiring medical knowledge and skills, can understand and adopt traditional values of professionalism. Matriculating students harboring some sense of social responsibility are thought to be more inclined to embrace these values, particularly if they are shared by like-minded peers. Such principles include service, competency, altruism, integrity, promoting the public good, transparency, and accountability.


Academic Medicine | 2016

Feasibility and Outcomes of Implementing a Portfolio Assessment System Alongside a Traditional Grading System

Celia Laird O'Brien; Sandra M. Sanguino; J. X. Thomas; Marianne M. Green

Purpose Portfolios are a powerful tool to collect and evaluate evidence of medical students’ competence across time. However, comprehensive portfolio assessment systems that are implemented alongside traditional graded curricula at medical schools in the United States have not been described in the literature. This study describes the development and implementation of a longitudinal competency-based electronic portfolio system alongside a graded curriculum at a relatively large U.S. medical school. Method In 2009, the authors developed a portfolio system that served as a repository for all student assessments organized by competency domain. Five competencies were selected for a preclerkship summative portfolio review. Students submitted reflections on their performance. In 2014, four clinical faculty members participated in standard-setting activities and used expert judgment and holistic review to rate students’ competency achievement as “progressing toward competence,” “progressing toward competence with some concern,” or “progressing toward competence pending remediation.” Follow-up surveys measured students’ and faculty members’ perceptions of the process. Results Faculty evaluated 156 portfolios and showed high levels of agreement in their ratings. The majority of students achieved the “progressing toward competence” benchmark in all competency areas. However, 31 students received at least one concerning rating, which was not reflected in their course grades. Students’ perceptions of the system’s ability to foster self-assessment were mixed. Conclusions The portfolio review process allowed faculty to identify students with a concerning rating in a behavioral competency who would not have been identified in a traditional grading system. Identification of these students allows for intervention and early remediation.
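
The abstract reports that the four faculty reviewers "showed high levels of agreement in their ratings" but does not say how agreement was quantified. The sketch below shows one simple check, pairwise percent agreement on the three-level rating scale, using invented ratings; it is an assumption about the analysis, not the authors' method.

```python
# Illustrative only: pairwise percent agreement among four raters on a
# three-level portfolio rating. Ratings are invented placeholders.
from itertools import combinations

# ratings[rater] is the list of ratings that rater gave, one per student.
ratings = {
    "rater_a": ["progressing", "progressing", "some_concern"],
    "rater_b": ["progressing", "progressing", "some_concern"],
    "rater_c": ["progressing", "some_concern", "some_concern"],
    "rater_d": ["progressing", "progressing", "pending_remediation"],
}

for r1, r2 in combinations(ratings, 2):
    pairs = list(zip(ratings[r1], ratings[r2]))
    agreement = sum(a == b for a, b in pairs) / len(pairs)
    print(f"{r1} vs {r2}: {agreement:.0%} agreement")
```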


Academic Medicine | 2016

Academic Performance of Students in an Accelerated Baccalaureate/MD Program: Implications for Alternative Physician Education Pathways

Marianne M. Green; Leah J. Welty; J. X. Thomas; Raymond H. Curry

Purpose Over one-third of U.S. medical schools offer combined baccalaureate/MD (BA/MD) degree programs. A subset of these truncate the premedical phase, reducing total time to the MD degree. Data comparing educational outcomes of these programs with those of conventional pathways are limited. Method The authors reviewed demographic characteristics and medical school performance of all 2,583 students entering Northwestern University Feinberg School of Medicine from 1999 to 2013, comparing students in the Honors Program in Medical Education (HPME), an accelerated seven-year program, versus non-HPME medical students. They evaluated Alpha Omega Alpha (AOA) selection, quintile performance distribution from the Medical Student Performance Evaluation, United States Medical Licensing Examination (USMLE) scores, and Match outcomes. Results A total of 560 students (21.7%) entered through the HPME. HPME students were on average 2.2 years younger and less likely (15/537 [2.8%] versus 285/1,833 [15.5%]) to belong to a racial/ethnic group underrepresented in medicine. There were no significant differences in AOA selection, quintile performance distribution, or USMLE scores. More HPME students entered internal medicine (161/450 [35.8%] versus 261/1,265 [20.6%]), and fewer chose emergency medicine (25/450 [5.6%] versus 110/1,265 [8.7%]) and obstetrics–gynecology (9/450 [2.0%] versus 67/1,265 [5.3%]). Conclusions The academic performances of medical students in the two programs studied were equivalent. Accelerated BA/MD programs might play a role in ameliorating the length and cost of a medical education. The academic success of these students absent the usual emphasis on undergraduate GPA and Medical College Admission Test scores supports efforts to redefine medical student selection criteria.
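
The abstract reports specialty-choice counts (for example, internal medicine: 161/450 HPME versus 261/1,265 non-HPME) but does not name the statistical test used for these comparisons. As a sketch, one standard way to compare such proportions is a chi-square contingency test, shown below with the counts reported above; this is an assumed analysis, not necessarily the authors' method.

```python
# Sketch: chi-square test comparing the proportion of HPME vs. non-HPME
# graduates entering internal medicine, using counts from the abstract.
from scipy.stats import chi2_contingency

hpme_im, hpme_total = 161, 450
non_hpme_im, non_hpme_total = 261, 1265

table = [
    [hpme_im, hpme_total - hpme_im],              # HPME: IM vs. other specialties
    [non_hpme_im, non_hpme_total - non_hpme_im],  # non-HPME: IM vs. other specialties
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2g}")
```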


Virtual Mentor | 2012

Standardizing and Improving the Content of the Dean’s Letter

Marianne M. Green; Sandra M. Sanguino; J. X. Thomas

For the dean's letter to be valuable, it should be an objective and unabridged summary of the student's performance without obscuring or eliminating the very information that might predict difficulty in residency.


JAMA | 2017

Comparison of Content on the American Board of Internal Medicine Maintenance of Certification Examination With Conditions Seen in Practice by General Internists

Bradley M. Gray; Jonathan L. Vandergrift; Rebecca S. Lipner; Marianne M. Green

Importance Success on the internal medicine (IM) examination is a central requirement of the American Board of Internal Medicine’s (ABIM’s) Maintenance of Certification (MOC) program. Therefore, it is important to understand the degree to which this examination reflects conditions seen in practice, one dimension of content validity, which focuses on the match between content in the discipline and the topics on the examination questions. Objective To assess whether the frequency of questions on IM-MOC examinations was concordant with the frequency of conditions seen in practice. Design, Setting, and Participants The 2010-2013 IM-MOC examinations were used to calculate the percentage of questions for 186 medical condition categories from the examination blueprint, which balances examination content by considering importance and frequency of conditions seen in practice. Nationally representative estimates of conditions seen in practice by general internists were estimated from the primary diagnosis for 13 832 office visits (2010-2013 National Ambulatory Medical Care Surveys) and 108 472 hospital stays (2010 National Hospital Discharge Survey). Exposures Prevalence of conditions included on the IM-MOC examination questions. Main Outcomes and Measures The outcome measure was the concordance between the percentages of IM-MOC examination questions and the percentages of conditions seen in practice during either office visits or hospital stays for each of 186 condition categories (eg, diabetes mellitus, ischemic heart disease, liver disease). The concordance thresholds were 0.5 SD of the weighted mean percentages of the applicable 186 conditions seen in practice (0.74% for office visits; 0.51% for hospital stays). If the absolute differences between the percentages of examination questions and the percentages of office visit conditions or hospital stay conditions seen were less than the applicable concordance threshold, then the condition category was judged to be concordant. Results During the 2010-2013 IM-MOC examination periods, 3600 questions (180 questions per examination form) were administered and 3461 questions (96.1%) were mapped into the 186 study conditions (mean, 18.6 questions per condition). Comparison of the percentages of 186 categories of medical conditions seen in 13 832 office visits and 108 472 hospital stays with the percentages of 3461 questions on IM-MOC examinations revealed that 2389 examination questions (69.0%; 95% CI, 67.5%-70.6%) involving 158 conditions were categorized as concordant. For concordance between questions and office visits only, 2010 questions (58.08%; 95% CI, 56.43%-59.72% of all examination questions) involving 145 conditions were categorized as concordant. For concordance between questions and hospital stays only, 1456 questions (42.07%; 95% CI, 40.42%-43.71% of all examination questions) involving 122 conditions were categorized as concordant. Conclusions and Relevance Among questions on IM-MOC examinations from 2010-2013, 69% were concordant with conditions seen in general internal medicine practices, although some areas of discordance were identified.
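
To make the concordance rule concrete, the sketch below applies the decision described above (a condition category is concordant when the absolute difference between its share of examination questions and its share of practice falls below 0.5 SD of the practice percentages) to a few invented condition categories. The survey weighting and the full 186-category blueprint are not reproduced here.

```python
# Minimal sketch of the concordance classification described in the abstract.
# Percentages are invented; the study used 186 condition categories and
# weighted national survey estimates, which this sketch omits.
import statistics

exam_pct = {"diabetes mellitus": 3.1, "ischemic heart disease": 2.4, "liver disease": 1.0}
visit_pct = {"diabetes mellitus": 4.0, "ischemic heart disease": 1.8, "liver disease": 0.3}

# Threshold: 0.5 SD of the percentages of conditions seen in practice.
threshold = 0.5 * statistics.pstdev(visit_pct.values())

concordant = {
    condition: abs(exam_pct[condition] - visit_pct[condition]) < threshold
    for condition in exam_pct
}
print(f"threshold = {threshold:.2f} percentage points")
print(concordant)
```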


Academic Medicine | 2018

What Is the Relationship Between a Preclerkship Portfolio Review and Later Performance in Clerkships?

Celia Laird O’Brien; J. X. Thomas; Marianne M. Green

Purpose Medical educators struggle to find effective ways to assess essential competencies such as communication, professionalism, and teamwork. Portfolio-based assessment provides one method of addressing this problem by allowing faculty reviewers to judge performance based on a longitudinal record of student behavior. At the Feinberg School of Medicine, the portfolio system measures behavioral competence using multiple assessments collected over time. This study examines whether a preclerkship portfolio review is a valid method of identifying problematic student behavior affecting later performance in clerkships. Method The authors divided students into two groups based on a summative preclerkship portfolio review in 2014: students who had concerning behavior in one or more competencies and students progressing satisfactorily. They compared how students in these groups later performed on two clerkship outcomes as of October 2015: final grades in required clerkships and performance on a clerkship clinical composite score. They used Mann–Whitney tests and multiple linear regression to examine the relationship between portfolio review results and clerkship outcomes. They used USMLE Step 1 scores to control for knowledge acquisition. Results Students with concerning behavior preclerkship received significantly lower clerkship grades than students progressing satisfactorily (P = .002). They also scored significantly lower on the clinical composite score (P < .001). Regression analysis indicated that concerning behavior was associated with lower clinical composite scores, even after controlling for knowledge acquisition. Conclusions The results show that a preclerkship portfolio review can identify behaviors that impact clerkship performance. A comprehensive portfolio system is a valid way to measure behavioral competencies.
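
The abstract names the two analyses used: Mann–Whitney tests comparing clerkship outcomes between the groups, and multiple linear regression of the clinical composite score on the concern indicator while controlling for USMLE Step 1. The sketch below runs both on simulated placeholder data to show the shape of the analysis; it is not the study's data or code.

```python
# Sketch of a Mann-Whitney group comparison and a linear regression with a
# Step 1 covariate, on simulated placeholder data.
import numpy as np
from scipy.stats import mannwhitneyu
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
concern = rng.integers(0, 2, n)                   # 1 = concerning preclerkship rating
step1 = rng.normal(230, 15, n)                    # USMLE Step 1 score
composite = 70 + 0.05 * step1 - 3 * concern + rng.normal(0, 4, n)

# Group comparison of the clinical composite score.
u_stat, p_value = mannwhitneyu(composite[concern == 1], composite[concern == 0])
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p_value:.3f}")

# Regression controlling for knowledge acquisition (Step 1).
X = sm.add_constant(np.column_stack([concern, step1]))
model = sm.OLS(composite, X).fit()
print(model.params)  # [intercept, concern effect, Step 1 effect]
```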


Journal of General Internal Medicine | 2015

Maintaining Competence in General Internal Medicine

Marianne M. Green

To The Editor: As Chair of the American Board of Internal Medicine (ABIM) Internal Medicine Board and former chair of the Internal Medicine Exam Writing Committee, I appreciate Dr. Feldman’s recent editorial on the ABIM Maintenance of Certification (MOC) program.1 While I cannot comment on specific exam questions due to ABIM’s examination non-disclosure policy, I can attest that ABIM exams are carefully constructed by practicing internists and are intended to cover content that the exam committees have determined a qualified candidate should be able to recognize without consulting medical resources or references. Additionally, general internists serve on the Internal Medicine exam committee and collectively decide whether content is appropriate for the practicing internist. All new questions undergo rigorous review by physician experts, followed by pretesting without risk to examinees. More information on ABIM’s exam development process is available at http://www.abim.org/about/examInfo/developed.aspx.

That said, ABIM agrees with many of the concerns Dr. Feldman raises, which echo what we have heard from other diplomates. Since the editorial’s publication, ABIM issued a statement announcing changes to the MOC program in response to diplomate feedback and in recognition that MOC needs to better meet diplomates’ needs.2 The statement also expresses ABIM’s commitment to work closely with diplomates to enhance the MOC program through various channels, including in-person meetings and workshops, diplomate surveys and a blog at http://transforming.abim.org/.

Among the announced changes is an update to the Internal Medicine MOC examination blueprint to make the exam more reflective of what physicians in practice are doing. Through a structured review process, ABIM is collecting diplomate input on the updated blueprint, which will be implemented for the Fall 2015 MOC exam, with other specialties to follow. Additionally, to make the secure exam a more helpful feedback mechanism to inform physicians’ ongoing learning, ABIM is introducing enhanced exam score reports with the Spring 2015 MOC exam administration. These score reports were developed with feedback from physician focus groups, and will provide exam takers with more detailed exam performance data on their strengths and weaknesses. ABIM also recently launched Assessment 2020, an initiative to work with the broader health care community to define what competencies physicians will need as the field of medicine continues to evolve. The Assessment 2020 website at http://assessment2020.abim.org/ features a blog on which we welcome feedback, as well as a section on exam enhancements in research and development.3

As Dr. Feldman noted, all physicians are committed to lifelong learning. MOC needs to be a value-added process that helps us keep up in our rapidly changing fields and improve the quality of care we provide to our patients. I look forward to working with the community to develop an MOC program that better meets the needs of physicians across the diversity of internal medicine practice.

Collaboration


Dive into Marianne M. Green's collaboration.

Top Co-Authors

J. X. Thomas (Loyola University Chicago)
Amanda Zick (Northwestern University)