Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Patricia J. Hicks is active.

Publications


Featured research published by Patricia J. Hicks.


Academic Pediatrics | 2014

Domain of Competence: Practice-Based Learning and Improvement

Ann E. Burke; Bradley Benson; Robert Englander; Carol Carraccio; Patricia J. Hicks

From the Wright State University Boonshoft School of Medicine, Dayton, Ohio (Dr Burke); Department of Internal Medicine and Pediatrics, University of Minnesota Medical School, Minneapolis, Minn (Dr Benson); Association of American Medical Colleges, Washington, DC (Dr Englander); Competency-Based Assessment, The American Board of Pediatrics, Chapel Hill, NC (Dr Carraccio); and Department of Pediatrics, Children's Hospital of Philadelphia, Philadelphia, Pa (Dr Hicks) The views expressed in this report are those of the authors and do not necessarily represent those of the Accreditation Council for Graduate Medical Education, the American Board of Pediatrics, the Association of Pediatric Program Directors, or the Academic Pediatric Association. The authors declare that they have no conflict of interest. Publication of this article was supported by the American Board of Pediatrics Foundation and the Association of Pediatric Program Directors. Address correspondence to Ann E. Burke, MD, Wright State University Boonshoft School of Medicine, One Children’s Plaza, Dayton Children’s Hospital, Department of Pediatrics, Dayton, OH 45404 (e-mail: [email protected]).


The Journal of Pediatrics | 2010

Pediatrics milestones: a developmental approach to the competencies.

Robert Englander; Patricia J. Hicks; Bradley Benson

In 2009, the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Pediatrics (ABP) partnered to create the Pediatric Milestone Project. One of the goals of this project was to reframe the 6 competencies in the context of the specialty, identifying markers of achievement (ie, milestones) along the path of residency training. They invited Carol Carraccio, MD, to be the project leader, and her first task was to establish an Advisory Board and a Working Group. Advisory Board members were recruited from the parent organizations as well as from the pool of national leaders in medical education, both from within and outside of pediatrics. The selected Working Group, composed of members of the Association of Pediatric Program Directors (APPD), one member of the Medicine Pediatrics Program Directors’ Association, two representatives of the ACGME, and a resident member, was charged with the following tasks: (1) redefine the competencies within the context of pediatrics; (2) set performance standards by level of training for achievement of the milestones; and (3) identify tools that could be embraced by the whole of the pediatric community to assess performance. The purpose of this report is to briefly describe the process used by the Working Group to accomplish the first charge of developing the milestones. We highlight some of the key lessons learned throughout the past year and outline future steps on the path to completion of our charge. We hope that other specialty groups will find this approach useful in their own work toward the ultimate goal of the Milestone Project: the ability to assess achievement of educational/curricular goals and program effectiveness.


Academic Pediatrics | 2012

The pediatrics milestones: a continuous quality improvement project is launched; now the hard work begins!

Robert Englander; Ann E. Burke; Susan Guralnick; Bradley Benson; Patricia J. Hicks; Stephen Ludwig; Daniel J. Schumacher; Lisa Johnson; Carol Carraccio

From the Association of American Medical Colleges, Washington, DC (Dr Englander); Department of Pediatrics, Dayton Children’s Medical Center and the Wright State University Boonshoft School of Medicine, Dayton, Ohio (Dr Burke); Winthrop University Hospital, Winthrop, NY (Dr Guralnick); Departments of Pediatrics and Internal Medicine, University of Minnesota Amplatz Children’s Hospital and the University of Minnesota School of Medicine, Minneapolis, MN (Dr Benson); Department of Pediatrics, The Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA (Drs Hicks and Ludwig); Department of Pediatrics, Boston Children’s Hospital/Boston Medical Center and Boston University School of Medicine, Boston, Mass (Dr Schumacher); Dartmouth Institute for Health Policy and Clinical Practice Center for Leadership and Improvement, Hanover, NH (Ms Johnson); and American Board of Pediatrics, Chapel Hill, NC (Dr Carraccio) Address correspondence to Robert Englander, MD, MPH, Association of American Medical Colleges, 2450 N Street NW, Washington, DC 20037 (e-mail: [email protected]).


Medical Teacher | 2016

Medical education practice-based research networks: Facilitating collaborative research.

Alan Schwartz; Robin Young; Patricia J. Hicks; For Appd Learn

Background: Research networks formalize and institutionalize multi-site collaborations by establishing an infrastructure that enables network members to participate in research, propose new studies, and exploit study data to move the field forward. Although practice-based clinical research networks are now widespread, medical education research networks are rapidly emerging.
Aims: In this article, we offer a definition of the medical education practice-based research network, a brief description of networks in existence in July 2014 and their features, and a more detailed case study of the emergence and early growth of one such network, the Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network (APPD LEARN).
Methods: We searched for extant networks through the peer-reviewed literature and the world-wide web.
Results: We identified 15 research networks in medical education founded since 2002, with membership ranging from 8 to 120 programs. Most focus on graduate medical education in primary care or emergency medicine specialties.
Conclusions: We offer four recommendations for the further development and spread of medical education research networks: increasing faculty development, obtaining central resources, studying networks themselves, and developing networks of networks.


Academic Pediatrics | 2010

Resident work duty hour requirements: medical educators' perspectives.

Ann E. Burke; Jerry L. Rushton; Susan Guralnick; Patricia J. Hicks

From the Association of Pediatric Program Directors (Drs Burke, Rushton, Guralnick, Hicks); Department of Pediatrics, Wright State University Boonshoft School of Medicine, Dayton, Ohio (Dr Burke); Department of Pediatrics, Indiana University School of Medicine, Indianapolis, Ind (Dr Rushton); Winthrop University Hospital, Mineola, NY (Dr Guralnick); and The Children’s Hospital of Philadelphia, Department of Pediatrics, the University of Pennsylvania School of Medicine, Philadelphia, Pa (Dr Hicks) Address correspondence to Ann E. Burke, MD, Dayton Children’s Medical Center, Medical Education Department, One Children’s Plaza, Dayton, Ohio 45419 (e-mail: [email protected]).


Academic Pediatrics | 2014

Domain of Competence: Personal and Professional Development

Patricia J. Hicks; Daniel Schumacher; Susan Guralnick; Carol Carraccio; Ann E. Burke

From the Department of Pediatrics, Perelman School of Medicine at the University of Pennsylvania, The Children’s Hospital of Philadelphia, Philadelphia, Pa (Dr Hicks); Boston Combined Residency Program in Pediatrics, Pediatric Emergency Medicine, Boston Medical Center, Boston, Mass (Dr Schumacher); Office of Graduate Medical Education and Student Affairs, and Department of Pediatrics, Winthrop University Hospital, Mineola, NY (Dr Guralnick); Competency-Based Assessment, The American Board of Pediatrics, Chapel Hill, NC (Dr Carraccio); and Wright State University Boonshoft School of Medicine, Dayton, Ohio (Dr Burke) The views expressed in this report are those of the authors and do not necessarily represent those of the Accreditation Council for Graduate Medical Education, the American Board of Pediatrics, the Association of Pediatric Program Directors, or the Academic Pediatric Association. The authors declare that they have no conflict of interest. Publication of this article was supported by the American Board of Pediatrics Foundation and the Association of Pediatric Program Directors. Address correspondence to Patricia J. Hicks, MD, MHPE, The Children’s Hospital of Philadelphia, 34th & Civic Center Blvd, Philadelphia, PA 19104 (e-mail: [email protected]).


Academic Pediatrics | 2014

Domain of Competence: Patient Care

Daniel J. Schumacher; Robert Englander; Patricia J. Hicks; Carol Carraccio; Susan Guralnick

From the Boston Combined Residency Program in Pediatrics, Pediatric Emergency Medicine, Boston Medical Center, Boston, Mass (Dr Schumacher); Association of American Medical Colleges, Washington, DC (Dr Englander); Department of Clinical Pediatrics, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pa (Dr Hicks); Competency-Based Assessment, American Board of Pediatrics, Chapel Hill, NC (Dr Carraccio); and Office of Graduate Medical Education and Student Affairs, and Department of Pediatrics, Winthrop University Hospital, Mineola, NY (Dr Guralnick) The views expressed in this report are those of the authors and do not necessarily represent those of the Accreditation Council for Graduate Medical Education, the American Board of Pediatrics, the Association of Pediatric Program Directors, or the Academic Pediatric Association. The authors declare that they have no conflict of interest. Publication of this article was supported by the American Board of Pediatrics Foundation and the Association of Pediatric Program Directors. Address correspondence to Daniel J. Schumacher, MD, MEd, One Boston Medical Center Place, Boston, MA 02118 (e-mail: [email protected]).


Journal of Hospital Medicine | 2014

Front-line ordering clinicians: matching workforce to workload.

Evan S. Fieldston; Lisa B. Zaoutis; Patricia J. Hicks; Susan Kolb; Debra L. Geiger; Paula M. Agosto; Jan P. Boswinkel; Louis M. Bell

Background: Matching workforce to workload is particularly important in healthcare delivery, where an excess of workload for the available workforce may negatively impact processes and outcomes of patient care and resident learning. Hospitals currently lack a means to measure and match dynamic workload and workforce factors.
Objectives: This article describes our work to develop and obtain consensus for use of an objective tool to dynamically match the front-line ordering clinician (FLOC) workforce to clinical workload in a variety of inpatient settings.
Methods: We undertook development of a tool to represent hospital workload and workforce based on literature reviews, discussions with clinical leadership, and repeated validation sessions. We met with physicians and nurses from every clinical care area of our large, urban children's hospital at least twice.
Results: We successfully created a tool in a matrix format that is objective and flexible and can be applied to a variety of settings. We presented the tool in 14 hospital divisions and received widespread acceptance among physician, nursing, and administrative leadership. The hospital uses the tool to identify gaps in FLOC coverage and guide staffing decisions.
Discussion: Hospitals can better match workload to workforce if they can define and measure these elements. The Care Model Matrix is a flexible, objective tool that quantifies the multidimensional aspects of workload and workforce. The tool, which uses multiple variables that are easily modifiable, can be adapted to a variety of settings.


Academic Pediatrics | 2014

The Pediatrics Milestones: Pursuit of a National System of Workplace-Based Assessment Through Key Stakeholder Collaboration

Patricia J. Hicks; Alan Schwartz; Stephen G. Clyman; David G. Nichols

From the Department of Clinical Pediatrics, The Children’s Hospital of Philadelphia, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pa (Dr Hicks); Department of Medical Education, Department of Pediatrics, University of Illinois, Chicago, Ill (Dr Schwartz); Center for Innovation at the National Board of Medical Examiners (Dr Clyman); and American Board of Pediatrics (Dr Nichols) The views expressed in this report are those of the authors and do not necessarily represent those of the Accreditation Council for Graduate Medical Education, the American Board of Pediatrics, the Association of Pediatric Program Directors, or the Academic Pediatric Association. The authors declare that they have no conflict of interest. Publication of this article was supported by the American Board of Pediatrics Foundation and the Association of Pediatric Program Directors. Address correspondence to Patricia J. Hicks, MD, MHPE, Children’s Hospital of Philadelphia, 34th & Civic Center Blvd, 12NW96, Philadelphia, PA 19104 (e-mail: [email protected]).


Academic Medicine | 2016

The pediatrics milestones assessment pilot: Development of workplace-based assessment content, instruments, and processes

Patricia J. Hicks; Melissa J. Margolis; Sue E. Poynter; Christa N. Chaffinch; Rebecca Tenney-Soeiro; Teri L. Turner; Linda A. Waggoner-Fountain; Robin Lockridge; Stephen G. Clyman; Alan Schwartz

Purpose: To report on the development of content and user feedback regarding the assessment process and utility of the workplace-based assessment instruments of the Pediatrics Milestones Assessment Pilot (PMAP).
Method: One multisource feedback instrument and two structured clinical observation instruments were developed and refined by experts in pediatrics and assessment to provide evidence for nine competencies based on the Pediatrics Milestones (PMs) and chosen to inform residency program faculty decisions about learners’ readiness to serve as pediatric interns in the inpatient setting. During the 2012–2013 PMAP study, 18 U.S. pediatric residency programs enrolled interns and subinterns. Faculty, residents, nurses, and other observers used the instruments to assess learner performance through direct observation during a one-month rotation. At the end of the rotation, data were aggregated for each learner, milestone levels were assigned using a milestone classification form, and feedback was provided to learners. Learners and site leads were surveyed and/or interviewed about their experience as participants.
Results: Across the sites, 2,338 instruments assessing 239 learners were completed by 630 unique observers. Regarding end-of-rotation feedback, 93% of learners (128/137) agreed the assessments and feedback “helped me understand how those with whom I work perceive my performance,” and 85% (117/137) agreed they were “useful for constructing future goals or identifying a developmental path.” Site leads identified several benefits and challenges to the assessment process.
Conclusions: PM-based instruments used in workplace-based assessment provide a meaningful and acceptable approach to collecting evidence of learner competency development. Learners valued feedback provided by PM-based assessment.

Collaboration


Dive into Patricia J. Hicks's collaborations.

Top Co-Authors

Ann E. Burke

Wright State University


Daniel J. Schumacher

Cincinnati Children's Hospital Medical Center


Susan Guralnick

Winthrop-University Hospital


Alan Schwartz

University of Illinois at Chicago


Sue E. Poynter

Cincinnati Children's Hospital Medical Center


Alan L. Schwartz

Washington University in St. Louis
