
Publications


Featured research published by William L. Roberts.


Medical Education Online | 2011

Relationships between high-stakes clinical skills exam scores and program director global competency ratings of first-year pediatric residents.

Erik E. Langenau; Gina Pugliano; William L. Roberts

Background: Responding to mandates from the Accreditation Council for Graduate Medical Education (ACGME) and the American Osteopathic Association (AOA), residency programs have developed competency-based assessment tools. One such tool is the American College of Osteopathic Pediatricians (ACOP) program directors’ annual report. High-stakes clinical skills licensing examinations, such as the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), also assess competency in several clinical domains. Objective: The purpose of this study is to investigate the relationships between program director competency ratings of first-year osteopathic residents in pediatrics and COMLEX-USA Level 2-PE scores from 2005 to 2009. Methods: The sample included all 94 pediatric first-year residents who took COMLEX-USA Level 2-PE and whose training was reviewed by the ACOP for approval between 2005 and 2009. Program director competency ratings and COMLEX-USA Level 2-PE scores (domain and component) were merged and analyzed for relationships. Results: Biomedical/biomechanical domain scores were positively correlated with overall program director competency ratings. Humanistic domain scores were not significantly correlated with overall program director competency ratings, but did show moderate correlation with ratings for interpersonal and communication skills. The six ACGME or seven AOA competencies assessed by the ACOP program directors’ annual report could not be recovered empirically by principal component analysis; instead, three factors were identified, accounting for 86% of the variance in competency ratings. Discussion: A few significant correlations were noted between COMLEX-USA Level 2-PE scores and program director competency ratings. Exploring relationships between different clinical skills assessments is inherently difficult because of the heterogeneity of tools used and the overlap of constructs within the AOA and ACGME core competencies.
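
A minimal sketch of the kind of principal component analysis described in the Results above, using an invented residents-by-competencies rating matrix rather than the study's data; only the sample size (94 residents) and the 86% variance figure are taken from the abstract, and the seven-competency layout is an assumption.

```python
# Illustrative only: PCA of program-director competency ratings.
# The ratings matrix below is randomly generated, not the study's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_residents, n_competencies = 94, 7            # assumed layout: 7 AOA competencies
ratings = rng.normal(size=(n_residents, n_competencies))

# Standardize the ratings, then count the components needed to reach ~86% of variance.
z = StandardScaler().fit_transform(ratings)
pca = PCA().fit(z)
cumulative_variance = np.cumsum(pca.explained_variance_ratio_)
n_factors = int(np.searchsorted(cumulative_variance, 0.86) + 1)
print("components needed for 86% of the variance:", n_factors)
```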


Academic Medicine | 2009

Patient note fabrication and consequences of unprofessional behavior in a high-stakes clinical skills licensing examination

Jeanne M. Sandella; William L. Roberts; Laurie A. Gallagher; John R. Gimpel; Erik E. Langenau; John R. Boulet

Background: The National Board of Osteopathic Medical Examiners (NBOME) administers the Comprehensive Osteopathic Medical Licensing Examination USA Level 2-PE (COMLEX-USA Level 2-PE) and has developed a process that links the competencies of written communication and professionalism by tracking fabrication in the postencounter SOAP (Subjective, Objective, Assessment, Plan) note exercise. Method: A process used to identify potential SOAP note fabrication was implemented in the 2007–2008 test cycle for the COMLEX-USA Level 2-PE. Results: A total of 3,753 candidates took the COMLEX-USA Level 2-PE in the 2007–2008 test cycle. Forty-eight candidates were screened, and the NBOME’s Subcommittee on SOAP Note Fabrication made failure decisions on eight, leading to a failure rate of 0.2% based on fabrication review. Conclusions: The NBOME has adopted the stance that postencounter note fabrication represents unprofessional behavior. Screening for and failing candidates who exhibit unprofessional behavior enhances the validity of the examination.
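
The 0.2% figure in the Results follows directly from the two counts reported above; a quick check of the arithmetic:

```python
# Failure rate reported above: 8 fabrication-based failures
# out of 3,753 candidates in the 2007-2008 test cycle.
candidates, failures = 3753, 8
print(f"failure rate: {failures / candidates:.2%}")  # about 0.21%, reported as 0.2%
```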


Advances in Health Sciences Education | 2013

Investigation of Standardized Patient Ratings of Humanistic Competence on a Medical Licensure Examination Using Many-Facet Rasch Measurement and Generalizability Theory.

Xiuyuan Zhang; William L. Roberts

Humanistic doctor–patient interaction has been measured for eight years using the Global Patient Assessment (GPA) tool in the national osteopathic clinical skills medical licensure examination. Standardized patients (SPs) apply the GPA tool to rate examinees’ competence in doctor–patient communication, interpersonal skills, and professionalism. Many-Facet Rasch Measurement was employed to evaluate the overall functioning of the GPA rating scale and to estimate measurement errors associated with characteristics of SP raters and medical case presentations. Generalizability theory was applied to investigate variance components corresponding to each facet of interest. For the 2010–2011 testing cycle, 50,090 SP ratings were analyzed. Although SP raters varied in leniency/stringency of rating, SPs differentiated the six GPA aspects in difficulty and utilized a reasonable range of the 9-point scale. Reliability indices showed sufficient examinee separation from the Rasch model (0.94) and sufficient dependability from the generalizability analysis for both raw scores (0.83) and transformed Rasch scores (0.97). Results indicate that medical students’ humanistic competence can be reliably measured through the GPA tool in the simulated environment. These measurement models supplement other means of observation and quality control with valuable information about the psychometric quality of SP ratings of humanistic competence.
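
As a worked illustration of the dependability coefficients quoted above, the sketch below computes a generalizability-theory phi coefficient for a simplified person × case design with the SP rater treated as confounded with case. The variance components and number of cases are made up for illustration; they are not the study's estimates, and the actual analysis used a more elaborate facet structure.

```python
# Illustrative G-theory dependability (phi) for a simplified person x case design.
# All variance components below are hypothetical.
sigma2_person = 0.55      # examinee variance (object of measurement)
sigma2_case = 0.10        # case/SP facet variance
sigma2_residual = 0.60    # person x case interaction + error
n_cases = 12              # assumed number of scored encounters

# Absolute error variance for a mean score over n_cases encounters.
absolute_error = (sigma2_case + sigma2_residual) / n_cases
phi = sigma2_person / (sigma2_person + absolute_error)
print(f"dependability (phi) for {n_cases} cases: {phi:.2f}")
```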


Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2011

Relationship between standardized patient checklist item accuracy and performing arts experience

Erik E. Langenau; Caitlin Dyer; William L. Roberts; André F. De Champlain; Donald P. Montrey; Jeanne M. Sandella

Introduction: It is not known whether a standardized patient’s (SP’s) performing arts background could affect his or her accuracy in recording candidate performance on a high-stakes clinical skills examination, such as the Comprehensive Osteopathic Medical Licensing Examination Level 2 Performance Evaluation. The purpose of this study is to investigate the differences in recording accuracy of history and physical checklist items between SPs who identify themselves as performing artists and SPs with no performing arts experience. Methods: Forty SPs identified themselves as being performing artists or nonperforming artists. A sample of SP live examination ratings was compared with a second set of ratings obtained after video review (N = 1972 SP encounters) over 40 cases from the 2008–2009 testing cycle. Differences in SP checklist recording accuracy were tested as a function of performing arts experience. Results: Mean overall agreement rates, both uncorrected and corrected for chance agreement, were very high (0.94 and 0.79, respectively, at the overall examination level). There was no statistically significant difference between the two groups with respect to any of the mean accuracy measures: history taking (z = −0.422, P = 0.678), physical examination (z = −1.453, P = 0.072), and overall data gathering (z = −0.812, P = 0.417) checklist items. Conclusion: Results suggest that SPs with or without a performing arts background complete history taking and physical examination checklist items with high levels of accuracy. Therefore, SPs with and without performing arts experience can be recruited for high-stakes SP-based clinical skills examinations without sacrificing examination integrity or scoring accuracy.
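
The two agreement measures in the Results above (uncorrected and corrected for chance) are commonly computed as raw percent agreement and Cohen's kappa. A minimal sketch with toy checklist marks, not the study's encounter data:

```python
# Illustrative only: agreement between live SP checklist marks and a second
# set of marks obtained from video review, using invented binary data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

live  = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])   # live SP checklist marks
video = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 1])   # marks after video review

raw_agreement = np.mean(live == video)             # uncorrected agreement
kappa = cohen_kappa_score(live, video)             # corrected for chance agreement
print(f"raw agreement: {raw_agreement:.2f}, kappa: {kappa:.2f}")
```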


Medical Education Online | 2012

Clinical skills assessment of procedural and advanced communication skills: performance expectations of residency program directors.

Erik E. Langenau; Xiuyuan Zhang; William L. Roberts; Andre F. DeChamplain; John R. Boulet

Background: High-stakes medical licensing programs are planning to augment and adapt current examinations to be relevant for a two-decision-point model for licensure: entry into supervised practice and entry into unsupervised practice. Identifying which skills should be assessed at each decision point is therefore critical for informing examination development, and gathering input from residency program directors is important. Methods: Using data from previously developed surveys and expert panels, a web-delivered survey was distributed to 3,443 residency program directors. For each of the 28 procedural and 18 advanced communication skills, program directors were asked which clinical skills should be assessed, by whom, when, and how. Descriptive statistics were collected, and intraclass correlations (ICCs) were calculated to determine consistency across specialties. Results: Among the 347 respondents, program directors reported that all advanced communication and some procedural tasks are important to assess. The following procedures were considered ‘important’ or ‘extremely important’ to assess: sterile technique (93.8%), advanced cardiovascular life support (ACLS) (91.1%), basic life support (BLS) (90.0%), interpretation of electrocardiogram (89.4%) and blood gas (88.7%). Program directors reported that most clinical skills should be assessed at the end of the first year of residency (or later) and not before graduation from medical school. Only a minority of skills were considered important to assess before the start of residency training: demonstration of respectfulness (64%), sterile technique (67.2%), BLS (68.9%), ACLS (65.9%) and phlebotomy (63.5%). Discussion: Results from this study suggest that procedural skills such as cardiac resuscitation, sterile technique, and phlebotomy would be amenable to assessment at the end of medical school, but that most procedural and advanced communication skills would be better assessed at the end of the first year of residency training or later. Conclusions: Gathering data from residency program directors provides support for developing new assessment tools in high-stakes licensing examinations.
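
The Methods above mention intraclass correlations for consistency across specialties. The sketch below computes a two-way consistency ICC for single ratings, ICC(C,1), from a made-up skills-by-specialty table of mean importance ratings; neither the table nor the skill rows reflect the survey data.

```python
# Illustrative only: consistency ICC across specialty groups on hypothetical data.
import numpy as np

# rows = clinical skills, columns = specialty groups,
# cells = mean importance rating from that specialty's program directors (invented).
ratings = np.array([
    [4.8, 4.7, 4.9],   # sterile technique
    [4.6, 4.5, 4.7],   # ACLS
    [4.5, 4.6, 4.6],   # BLS
    [3.2, 3.0, 3.4],   # phlebotomy
    [2.1, 2.4, 2.0],   # another procedural skill (hypothetical)
])
n, k = ratings.shape
grand = ratings.mean()

# Two-way ANOVA decomposition with one observation per cell.
ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)
ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)
ss_total = np.sum((ratings - grand) ** 2)
ms_rows = ss_rows / (n - 1)
ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))

# ICC(C,1): consistency of single "raters" (here, specialty groups).
icc = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
print(f"ICC(C,1) across specialties: {icc:.2f}")
```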


Advances in Health Sciences Education | 2010

Effect of first-encounter pretest on pass/fail rates of a clinical skills medical licensure examination

William L. Roberts; Danette W. McKinley; John R. Boulet

Due to the high-stakes nature of medical exams, it is prudent for test agencies to critically evaluate test data and control for potential threats to validity. For the typical multiple-station performance assessments used in medicine, it may take time for examinees to become comfortable with the test format and administrative protocol. Since each examinee in the rotational sequence starts with a different task (e.g., a simulated clinical encounter), those who are administered non-scored pretest material at their first station may have an advantage compared with those who are not. The purpose of this study is to investigate whether pass/fail rates differ across the sequence of pretest encounters administered during the testing day. First-time takers were grouped by the sequential order in which they were administered the pretest encounter. No statistically significant difference in fail rates was found between examinees who started with the pretest encounter and those who were administered it later in the sequence. Results indicate that current examination administration protocols do not present a threat to the validity of test score interpretations.
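
A minimal sketch of the kind of comparison described above: a chi-square test of pass/fail counts by the position of the unscored pretest encounter. The counts are invented for illustration and are not the study's data, and the abstract does not state which test the authors used.

```python
# Illustrative only: pass/fail counts by pretest-encounter position (hypothetical).
import numpy as np
from scipy.stats import chi2_contingency

# rows = pretest position group (first encounter vs. later in the sequence)
# columns = [passed, failed]
counts = np.array([
    [480, 20],     # pretest was the examinee's first encounter (hypothetical)
    [2850, 115],   # pretest appeared later in the sequence (hypothetical)
])
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```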


The Journal of the American Osteopathic Association | 2010

Five-year summary of COMLEX-USA Level 2-PE examinee performance and survey data.

Erik E. Langenau; Caitlin Dyer; William L. Roberts; Crystal Wilson; John R. Gimpel


The Journal of the American Osteopathic Association | 2012

Frequency of Specific Osteopathic Manipulative Treatment Modalities Used by Candidates While Taking COMLEX-USA Level 2-PE

Erik E. Langenau; Caitlin Dyer; William L. Roberts


Advances in Health Sciences Education | 2012

Modeling relationships between traditional preadmission measures and clinical skills performance on a medical licensure examination

William L. Roberts; Gina Pugliano; Erik E. Langenau; John R. Boulet


The Journal of the American Osteopathic Association | 2011

Competency-based classification of COMLEX-USA cognitive examination test items

Erik E. Langenau; Gina Pugliano; William L. Roberts
