Publication


Featured research published by Kenneth M. Ludmerer.


Annals of Internal Medicine | 2000

Time and Medical Education

Kenneth M. Ludmerer

From a broad perspective, there have always been two dimensions to medical education: pedagogic principles and the institutional environment in which those principles are implemented. The two are closely related because adequate financial resources are necessary to allow medical education to proceed on a high plane. In the United States, educators realized as early as the 1870s the importance of conducting medical education in an environment of inquiry and discovery, where students learned by doing and where full-time faculty engaged in research as well as teaching. However, the financial resources to establish such a system proved difficult to acquire, which explains why this approach did not become generalized in the United States until nearly a half century later (1). Many of the ingredients of a rich learning environment are immediately discernible: an up-to-date library, well-equipped medical school and hospital laboratories, an ample patient population, and a large cadre of dedicated, talented faculty. More subtle, but perhaps most important, is the presence of sufficient time. Time is indispensable for every learning objective to be met. Formal medical education must be long enough to allow a sufficient exposure to the facts and principles of medicine. Faculty must be provided sufficient time to teach learners and interact with them on a personal level, and students and house officers must be allowed enough time with patients to observe the natural history of disease, become independent problem solvers, and acquire the art of medicine, such as good communication skills. The success of U.S. medical education in the 20th century has resulted in no small part from the conscientious effort of educators to make the requisite time available. This essay reviews features of the time dimension in medical education. It also indicates how time has become medical education's most threatened commodity during the current era of managed care. The importance of time needs to be kept clearly in focus by those who wish to preserve the quality of U.S. medical education in the century ahead.

The Length of Study

Medical education has always encountered a dilemma: the proper length of time for medical study. To protect society, medical educators have been obligated to ensure that physicians be properly grounded in theory and technique. However, because it is impossible for an individual to learn all of medicine, even with a lifetime of study, at some point formal education must end and physicians must enter practice. Through the 1870s, medical education in the United States was surprisingly short: two 16-week terms of lectures, the second term repeating the content of the first, were the norm. As medical knowledge grew exponentially, the length of time required for medical study began to increase. Sixteen-week terms became 9-month sessions, and 2-year courses evolved into 3-year curricula, which then became 4-year curricula. By the early 1900s, the basic structure of current undergraduate medical education (2 years of basic science instruction followed by 2 years of clinical clerkships) was in place (1). By World War I, it was apparent that even 4 years of medical school was insufficient for the preparation of a physician. Medical knowledge, techniques, and practice were growing and changing too rapidly. Accordingly, the internship, a 1-year period of hospital education following receipt of the MD degree, became standard for all physicians, including most who were entering general practice.
In addition, further training became necessary for those who wished to enter specialty practice or pursue an academic career. For these purposes, the residency, a several-year hospital experience following internship, became the customary vehicle. After World War II, as specialty careers eclipsed general practice in popularity, the residency became standard for virtually all physicians, and many physicians embarked on 2- or 3-year fellowships after the completion of a residency to acquire training in a subspecialty. Thus, by the 1960s, physicians were typically engaging in 3 to 7 or more years of preparation after graduation from medical school, a pattern that has persisted through the present (2). From the 1960s forward, the size of the formal educational box has remained unchanged, despite the continued exponential growth of medical knowledge, a series of reports decrying the crowding of the undergraduate curriculum (3), and the well-known rigors of graduate medical education. There have been occasional attempts to lengthen the period of training still further. For instance, in 1959 Stanford Medical School increased its undergraduate curriculum to 5 years. However, such efforts have not become widespread, and there appears to be little chance of any major increase in the length of training in the foreseeable future.

What allowed the existing duration of formal medical training to work was the change in medical education a century ago from a substantive to a procedural emphasis. In the proprietary era, medical education emphasized the inculcation of facts through rote memorization. In the latter part of the 19th century, the objective of medical education became producing problem-solvers and critical thinkers. Through laboratory work in scientific subjects and hospital work with real responsibility for patient care during the clinical years, it was anticipated that learners would develop the power of critical reasoning, the capacity to generalize, and the ability to find out and evaluate information for themselves. Of course, there remained a huge amount of factual information that physicians needed to know, and the tension between the procedural and substantive approaches in medical education never abated. Nevertheless, with the new emphasis on reasoning skills, it mattered far less if a physician had not encountered every clinical situation during his or her formal education. A properly trained physician, who had become skilled in problem solving and dealing with clinical unknowns, was well-equipped to handle the innumerable uncertainties of day-to-day practice (1). The history of U.S. medical education has been one of striving to achieve, but not realizing, the educational ideal of teaching reasoning skills. Each generation of medical educators has reaffirmed its commitment to this principle while acknowledging the failure to accomplish the objective fully. Nevertheless, patients and the public have been generally well served. The outstanding reputation of U.S. medical practice in the 20th century owes much to a system of medical education that has produced physicians who not only know facts and understand theory but also can solve problems and deal with clinical uncertainty.

Time To Teach

A distinctive characteristic of the proprietary school was its faculty-centered environment. A course of instruction based almost wholly on didactic lectures may not have been conducive to good learning, but it did represent efficient use of the faculty's time.
When faculty were not lecturing, they were seeing private patients. Indeed, the typical medical professor of that period derived the preponderance of his income from professional fees collected as a medical or surgical consultant. A central tenet of the revolution in U.S. medical education was that faculty should be genuine university professors; that is, the bulk of their work should be in teaching and research. To accomplish this, reformers held that faculty should be retained on one or another version of the full-time system so they would be freed from having to practice medicine to generate their salaries. No one argued this more passionately than Abraham Flexner, the noted educational reformer. In 1930 he wrote, "If the scientific budget of a clinical department is once dependent upon the earnings of the clinical staff, that staff will in all probability have to earn the requisite amount: by doing what it is interested in, if it can, by doing other things, should that become necessary" (4). As a consequence, learners became the central focus of the faculty's attention in the new system of medical education implemented in the early 20th century. Laboratory instruction, clinical clerkships, seminars, small-group conferences, individual tutorials, and personalized instruction became the hallmark of medical education in the United States. Such an environment was expensive to provide, but it did allow faculty to concentrate at last on the needs of learners. For instance, at the University of Michigan before World War II, the average faculty member devoted about 60% of his time to teaching or the preparation for teaching (5). At schools that were less research-intensive, an even greater percentage of the faculty's time went into teaching.

From the beginning of the modern era, faculty had important activities besides teaching to pursue, most notably research. During the first half of the 20th century, and much more dramatically after the establishment of the National Institutes of Health in the late 1940s, research gradually supplanted teaching as the main faculty activity at many medical schools. Nevertheless, good teaching was still widely found, partly because there were more faculty, particularly in the clinical departments, to share the duties. This resulted from the fact that clinical faculty were under relatively little economic pressure to see patients. If they chose to spend time with students or house officers, they could freely do so. One extraordinary aspect of medical education during the managed care era has been the increasing diversion of faculty time into private practice. Faced with lower reimbursement rates from third-party payers, many schools have begun placing pressure on their full-time clinical instructors to see more patients. Schools have begun to measure the clinical productivity of their faculty (defined as the amount of clinical income a faculty member generates) and implement ne…


Perspectives in Biology and Medicine | 2011

Abraham Flexner and medical education.

Kenneth M. Ludmerer

The Flexner Report had its roots in the recognition in the mid-19th century that medical knowledge is not something fixed but something that grows and evolves. This new view of medical knowledge led to a recasting of the goal of medical education as that of instilling the proper techniques of acquiring and evaluating information rather than merely inculcating facts through rote memorization. Abraham Flexner, a brilliant educator, had the background to understand and popularize the meaning of this new view of education, and he took the unprecedented step of relating the developments in medical education to the ideas of John Dewey and the progressive education movement. Although the Flexner Report is typically viewed as a historical document (due to an understandable tendency to refer only to the second half of the report, where Flexner provides his famous critiques of the medical schools that existed at the time), this article argues that the Flexner Report is actually a living educational document of as much significance to medical educators today as in Flexner's time. The article analyzes Flexner's discussion of medical education and shows that his message (the importance of academic excellence, professional leadership, proper financial support, and service and altruism) is timeless, as applicable to the proper education of physicians today and tomorrow as in the past.


The American Journal of Medicine | 1985

Renal failure, stroke, and death in an elderly woman with rheumatoid arthritis

Kenneth M. Ludmerer; John M. Kissane

Stenographic reports of weekly clinicopathologic conferences held in Barnes and Wohl Hospitals are published in each issue of the Journal. Members of the Departments of Internal Medicine, Radiology, and Pathology of the Washington University School of Medicine participate jointly in these conferences. Kenneth M. Ludmerer, M.D., and John M. Kissane, M.D., are the editors of this feature.


Annals of Internal Medicine | 2000

The Creation of Time to Heal

Kenneth M. Ludmerer

In November 1999, Oxford University Press published my book Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care (1). The purpose of this book is twofold. First, it provides a comprehensive account of the evolution of medical education, medical schools, and teaching hospitals in the United States from the turn of the 20th century to the present; second, it describes the negative impact of the marketplace on the way in which physicians learn and practice medicine in the era of managed care. The overarching thesis is that academic health centers are highly threatened by managed care, government cutbacks, and the loss of vision among some medical leaders. This situation carries ominous implications for the future of medical education and medical care in the United States. Fortunately, the past bears heavily on the present, and the book also shows how an understanding of the past offers constructive solutions to our current problems. The creation of Time to Heal illustrated many important points about the writing of medical history, and for these reasons, the editors of Annals of Internal Medicine requested an account of how the book came into being.

The Writing of Medical History

Medical history has a long tradition in Europe, but in the United States it began as an academic discipline in 1929, with the opening of the Institute of the History of Medicine at the Johns Hopkins University School of Medicine. The founder and first director of the Institute, William Welch, was a noted pathologist and the medical school's first dean, not a historian. However, he recruited outstanding European historians, such as Henry Sigerist and Owsei Temkin, to teach at the Institute, which quickly became the major center for research and writing in the history of medicine in the United States. Since World War II, study of the history of medicine has grown impressively in the United States; departments and graduate programs have been created at many medical schools and universities around the country. An indication of this growth is that the Institute of the History of Medicine, although still an outstanding department, is no longer considered the center of medical history in the United States. Medical historians have their own professional society, the American Association for the History of Medicine, and 400 to 500 registrants attend its annual meeting. Although most of today's professional medical historians hold PhD rather than MD degrees, most physician-historians have pursued graduate study in history at some point in their careers. Serious study of history requires a discipline and conceptual framework as sophisticated and demanding as that of any of the biomedical sciences. In its earlier periods, the history of medicine emphasized the intellectual development of medicine (the internalist approach). Medical historians focused on the growth of scientific knowledge, the evolution of medical and surgical practices, and the biographies of prominent medical scientists and practitioners. In the 1960s, the main focus of medical history, like that of its close cousin, the history of science, shifted to the social, economic, and political context of medicine (the externalist approach). In recent years, however, medical historians have come to recognize that the internalist and externalist approaches are in fact complementary, and some of the best recent work in the field has incorporated both perspectives (2). Time to Heal was written with this ideal in mind.
Because historians attempt to understand the past on its own terms, much work in medical history has lacked lessons for contemporary readers. However, a few recent medical historians, most notably Daniel Fox (3, 4) and Rosemary Stevens (5, 6), have written important books that illuminate contemporary issues of medical practice and contribute to the current debate over health care policy. Time to Heal was influenced by these works, and from the beginning I sought to write a book that addressed present-day anxieties about health care among patients, professionals, and the public.

The Creation of Time to Heal

Origins

In 1985, my book on medical education, Learning to Heal: The Development of American Medical Education, was published (7). Learning to Heal examined the creation of the U.S. system of medical education from the Civil War through World War I. At that time, I thought my work in medical education was done. However, in 1986 and 1987, the need for another book became apparent to me as the managed care movement began to spread rapidly. Many medical schools and teaching hospitals were no longer receiving enough clinical income to fully support their educational and research programs. More subtle and more important, the learning environment for medical students and house officers was eroding, and professional values in medical practice were being marginalized. Accordingly, in the spring of 1988, I began new research in an effort to understand what had happened and what might be done about it. My objective was not merely to chronicle events but also to provide a rich sociologic analysis of the evolution of U.S. academic health centers and the problems that they were facing. This work resulted in Time to Heal.

My historical work has been heavily influenced by the fact that I am both a physician and a historian. As a clinician and educator, I have never stopped seeing patients or teaching medical students and residents. My personal experiences as a physician and teacher made me sensitive to the emerging threats to medical education and practice in a way that otherwise would not have been possible. Every day on the wards, I could see that learners had less time to learn, teachers had less time to teach, and the temptation to cut corners in patient care was growing, all consequences of the rapid increase in the throughput of patients (seeing more and more patients for shorter lengths of time) in recent years. I could also see that academic health centers were becoming more commercial places where the relief of suffering often seemed subordinate to the capture of market share. At its most fundamental level, Time to Heal was driven by a deep concern that medical practice and education were being compromised, that academic health centers were losing their moorings, and that something needed to be done about this while there was still time. Although it was obvious from an insider's standpoint that patient safety was being threatened, the public was not yet aware of the danger. Thus, it was important to me that Time to Heal be a book for general readers, not only academic readers. I thought that the public might care about this story because everyone at some time needs medical care. My deepest hope was that Time to Heal would not only inspire professional leaders into action but also arouse public sentiment on behalf of protecting medical education.

Execution

Historical scholarship, like scholarship in any field, cannot be done on the fly.
Abundant time is needed to sift through sources, reflect, analyze, interpret, synthesize, organize, and write. Time to Heal would not have been possible without the extraordinary gift of time provided by the Department of Medicine at Washington University, which allowed me to undertake a large project whose success was not guaranteed. Nor would the project have been possible without the department's generosity in allowing books to count as my scholarship. The financial support of several private foundations also made the project possible; it helped pay for my protected time and expenses incurred by the project. From 1988 to 1992, I was engaged mainly in research, obtaining and analyzing primary data from a representative sample of about one quarter of U.S. academic medical centers. The most important sources were unpublished records of medical schools, hospitals, faculty members, administrators, students, and various private and public organizations. These records and sources provided many concrete examples and much rich detail that were otherwise unobtainable. In general, records became particularly voluminous after 1965, creating one of the daunting problems of researching contemporary history. For example, the minutes and agenda items of the Executive Council of the Association of American Medical Colleges from 1932 to 1956 were contained in one storage box; the records from 1957 to 1991 required 42 boxes.

Although historians depend heavily on primary sources for data, the richness of their interpretations generally reflects their conceptual framework. Thus, I also worked hard to understand the intellectual and social context in which medical education evolved. I wanted to explore the changes in medical education and practice that resulted from the internal development of medicine, particularly the increasing reductionism (molecular level of analysis) of medical knowledge. However, I also wished to interpret medical education in its external context: higher education in the United States, the evolving health care delivery system, and the major cultural trends of the 20th century. To accomplish these objectives, I read widely in medical, social, cultural, and intellectual history and medical sociology. In addition, friends and colleagues asked questions or raised issues that I might not otherwise have thought about. In retrospect, the research went more quickly than I would have predicted. However, the organization and writing, which began in 1992, proved more difficult than I had imagined. It was no small task to provide a comprehensive, panoramic account of U.S. medical education in one readable volume. It was especially challenging to carry the story through the present and write a book that addresses many of the major anxieties of today's health care world. Indeed, friends would speak of the seemingly eternal last two chapters of the book, which in themselves added about 3 years to the project. The…


Journal of the History of Medicine and Allied Sciences | 2015

The History of Medicine in Medical Education

Kenneth M. Ludmerer

The history of medicine occupies a curious position in the United States today. Teaching and scholarship in the field (and in the history of all the health sciences, more broadly) is thriving throughout much of the university, particularly at faculties of arts and sciences, but also at schools of nursing, public health, and social work. Attendance at annual meetings of the American Association for the History of Medicine continues to be robust and the quality of the presentations, outstanding. Readers of the Journal of the History of Medicine and Allied Sciences, the Bulletin of the History of Medicine, and other publications focusing on the field have benefited from the continued high quality of articles, and competition to publish in those journals remains keen. Yet, as David S. Jones, Jeremy A. Greene, Jacalyn Duffin, and John Harley Warner have observed in this issue of the Journal, the history of medicine has barely a presence at the location where a priori it might be most expected to thrive: North American medical schools. As the authors observe, the precarious position of the history of medicine in medical education is not new. For generations, historians of medicine have struggled to make the case for history in medical education. For a moment, in the s and s, it appeared that they might succeed. The history of medicine had established a foothold at nineteen North American medical schools, boasting formal departments, full-time professors, and graduate programs. Forty-seven percent of U.S. medical schools offered some form of historical teaching in the s and  percent of U.S. medical schools and  percent of Canadian schools in . Financial support to the field came from the Josiah Macy, Jr. Foundation, and important alliances were established with the National Library of Medicine, the New York Academy of Medicine, the College of Physicians of Philadelphia, and other institutions. Yet this foothold was transitory. In the s, bioethics (and the medical humanities more broadly) displaced the history of medicine in the…


Nursing History Review | 1999

Learning to Heal: The Development of American Medical Education

Kenneth M. Ludmerer

The development of American medical education involved a conceptual revolution in how medical students should be taught. With the introduction of laboratory and hospital work, students were expected to be active participants in their learning process, and the new goal of medical training was to foster critical thinking rather than the memorization of facts. In Learning to Heal, Kenneth Ludmerer offers the definitive account of the rise of the modern medical school and the shaping of the medical profession.


The New England Journal of Medicine | 2006

American Medical Education 100 Years after the Flexner Report

David M. Irby; Molly Cooke; William Sullivan; Kenneth M. Ludmerer


Archive | 1999

Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care

Kenneth M. Ludmerer


Archive | 2005

Time to Heal

Kenneth M. Ludmerer


The American Historical Review | 1974

Genetics and American Society: A Historical Appraisal

Theodosius Dobzhansky; Kenneth M. Ludmerer

Collaboration


Dive into Kenneth M. Ludmerer's collaboration.

Top Co-Authors

John M. Kissane (Washington University in St. Louis)

David M. Irby (University of California)

Molly Cooke (University of California)