Publication


Featured research published by Norman B. Berman.


Academic Medicine | 2005

Multi-institutional development and utilization of a computer-assisted learning program for the pediatrics clerkship: The CLIPP project

Leslie H. Fall; Norman B. Berman; Sherilyn Smith; Christopher B. White; Jerold C. Woodhead; Ardis L. Olson

Computer-assisted instruction (CAI) holds significant promise for meeting the current challenges of medical education by providing consistent, high-quality teaching materials regardless of training site. The Computer-assisted Learning in Pediatrics Project (CLIPP) was created over three years (2000–2003) to meet this potential through multi-institutional development of interactive Internet-based patient simulations that comprehensively teach the North American core pediatrics clerkship curriculum. Project development adhered to four objectives: (1) comprehensive coverage of the core curriculum; (2) a uniform approach to CAI pedagogy; (3) multi-institutional development by educators; and (4) extensive evaluation by users. Pediatrics clerkship directors from 30 institutions worked in teams to develop a series of 31 patient case simulations. An iterative process of case content and pedagogy development, case authoring, peer review, and pilot-testing ensured that the needs of clerkship directors and medical students were met. Fifty medical schools in the United States and Canada are presently using CLIPP. More than 8,000 students have completed over 98,000 case sessions, with an average of 2,000 case sessions completed per week at this time. Each CLIPP case has been completed by more than 3,000 students. The current cost of CLIPP development is approximately $70 per student user, or $6 per case session. The project's success demonstrates that multi-institutional development and implementation of a peer-reviewed comprehensive CAI learning program by medical educators is feasible and provides a useful model for other organizations to develop similar programs. Although CAI development is both time-consuming and costly, the initial investment decreases significantly with broad use over time.


Academic Medicine | 2009

Integration strategies for using virtual patients in clinical clerkships.

Norman B. Berman; Leslie H. Fall; Sherilyn Smith; David A. Levine; Christopher G. Maloney; Michael Potts; Benjamin Siegel; Lynn Foster-Johnson

Purpose: To explore students' perceptions of virtual patient use in the clinical clerkship and to develop a framework for evaluating the effects of different integration strategies on students' satisfaction and perceptions of learning effectiveness with this innovation. Method: A prospective, multi-institutional study was conducted at six schools' pediatric clerkships to assess the impact of integrating Web-based virtual patient cases on students' perceptions of their learning during 2004–2005 and 2005–2006. Integration strategies were designed to meet the needs of each school, and integration was scored for components of virtual patient use and elimination of other teaching methodologies. A student survey was developed, validated, and administered at the end of the clerkship to 611 students. Data were analyzed using confirmatory factor analysis and structural equation modeling. Results: A total of 545 students (89%) completed the survey. Overall student satisfaction with the virtual patients was high; students reported that they were more effective than traditional methods. The structural model demonstrated that elimination of other teaching methodologies was directly associated with perceived effectiveness of the integration strategies. A higher use score had a significant negative effect on perceived integration, but a positive effect on perceived knowledge and skills gain. Students' positive perceptions of integration directly affected their satisfaction and perception of the effectiveness of their learning. Conclusions: Integration strategies balancing the use of virtual patients with elimination of some other requirements were significantly associated with students' satisfaction and their perceptions of improved knowledge and skills.


Medical Teacher | 2011

A collaborative model for developing and maintaining virtual patients for medical education

Norman B. Berman; Leslie H. Fall; Alexander W. Chessman; Michael Dell; Valerie J. Lang; Shou Ling Leong; L. James Nixon; Sherilyn Smith

There is great interest in using computer-assisted instruction in medical education, but broad adoption of computer-assisted instruction materials is difficult to achieve. We describe a successful model for the development and maintenance of a specific type of computer-assisted instruction, virtual patients, in medical education. The collaborative model's seven key components are described and compared to other models of diffusion of innovation and curriculum development. The collaborative development model that began in one medical discipline has now been extended to two additional disciplines through partnerships with their respective clerkship director organizations. We believe that the ability to achieve broad use of virtual patients, and to successfully transition the programs from grant funding to financial self-sufficiency, resulted directly from the collaborative development and maintenance process. This process can be used in other learning environments and for the development of other types of computer-assisted instruction programs.


Medical Teacher | 2015

How we developed and piloted an electronic key features examination for the internal medicine clerkship based on a US national curriculum

Kirk A. Bronander; Valerie J. Lang; L. James Nixon; Heather Harrell; Regina A. Kovach; Susan Hingle; Norman B. Berman

Background: Key features examinations (KFEs) have been used to assess clinical decision making in medical education, yet there are no reports of an online KFE based on a national curriculum for the internal medicine clerkship. What we did: The authors developed and pilot-tested an electronic KFE based on the US Clerkship Directors in Internal Medicine core curriculum. Teams, with expert oversight and peer review, developed key features (KFs) and cases. Evaluation: The exam was pilot-tested at eight medical schools with 162 third- and fourth-year medical students, of whom 96 (59.3%) responded to a survey. While most students reported that the exam was more difficult than a multiple-choice question exam, 61 (83.3%) students agreed that it reflected problems seen in clinical practice, and 51 (69.9%) students reported that it more accurately assessed the ability to make clinical decisions. Conclusions: The development of an electronic KFE is a time-intensive process. A team approach offers built-in peer review and accountability. Students, although not familiar with this format in the US, recognized it as authentically assessing clinical decision making for problems commonly seen in the clerkship.


BMC Medical Education | 2018

Development and initial validation of an online engagement metric using virtual patients

Norman B. Berman; Anthony R. Artino

Background: Considerable evidence in the learning sciences demonstrates the importance of engagement in online learning environments. The purpose of this work was to demonstrate feasibility and to develop and collect initial validity evidence for a computer-generated dynamic engagement score based on student interactions in an online learning environment, in this case virtual patients used for clinical education. Methods: The study involved third-year medical students using virtual patient cases as a standard component of their educational program at more than 125 accredited US and Canadian medical schools. The engagement metric algorithm included four equally weighted components of student interactions with the virtual patient. We developed a self-report measure of motivational, emotional, and cognitive engagement and conducted confirmatory factor analysis to assess the validity of the survey responses. We gathered additional validity evidence through educator reviews, factor analysis of the metric, and correlations between student use of the engagement metric and self-report measures of learner engagement. Results: Confirmatory factor analysis substantiated the hypothesized four-factor structure of the survey scales. Educator reviews demonstrated a high level of agreement with content and scoring cut-points (mean Pearson correlation 0.98; mean intra-class correlation 0.98). Confirmatory factor analysis yielded an acceptable fit to a one-factor model of the engagement score components. Correlations of the engagement score with self-report measures were statistically significant and in the predicted directions. Conclusions: We present initial validity evidence for a dynamic online engagement metric based on student interactions in a virtual patient case. We discuss potential uses of such an engagement metric, including better understanding of student interactions with online learning, improving engagement through instructional design, and interpretation of learning analytics output.


BMC Medical Education | 2017

An investigation of professionalism reflected by student comments on formative virtual patient encounters

Ting Dong; William Kelly; Meredith Hays; Norman B. Berman; Steven J. Durning

Background: This study explored the use of virtual patient generated data by investigating the association between students' unprofessional patient summary statements, entered during an online virtual patient case, and detection of their future unprofessional behavior. Method: At the USUHS, students complete a number of virtual patient encounters, including a patient summary, to meet the clerkship requirements of Internal Medicine, Family Medicine, and Pediatrics. We reviewed the summary statements of 343 students who graduated in 2012 and 2013. Each statement was rated with regard to four features: Unprofessional, Professional, Equivocal (could be construed as unprofessional), and Unanswered (students did not enter a statement). We also combined Unprofessional and Equivocal into a new category to indicate a statement receiving either rating. We then examined the associations between students' scores on these categories (i.e., whether they received a particular rating or not) and the Expertise and Professionalism scores reflected by a postgraduate year one (PGY-1) program director (PD) evaluation form. The PD forms contained 58 Likert-scale items designed to measure the two constructs (Expertise and Professionalism). Results: The inter-rater reliability of statement coding was high (Cohen's kappa = .97). Receiving an Unprofessional or Equivocal rating was significantly correlated with a lower Expertise score (r = −.19, P < .05) as well as a lower Professionalism score (r = −.17, P < .05) during PGY-1. Conclusion: Most schools rely on incident reports and review of routine student evaluations to identify professionalism lapses. Unprofessionalism reflected in student entries may provide additional markers foreshadowing subsequent unprofessional behavior.


Advances in Health Sciences Education | 2008

Computer-Assisted Instruction in Clinical Education: a Roadmap to Increasing CAI Implementation

Norman B. Berman; Leslie H. Fall; Christopher G. Maloney; David A. Levine


Academic Medicine | 2016

The Role for Virtual Patients in the Future of Medical Education

Norman B. Berman; Steven J. Durning; Martin R. Fischer; Sören Huwendiek; Marc M. Triola


Academic Medicine | 2017

In Reply to Robison et al and White et al

Norman B. Berman; Steven J. Durning; Martin R. Fischer; Soeren Huwendiek; Marc M. Triola


Journal of Medical Internet Research | 2016

Using Foreign Virtual Patients With Medical Students in Germany: Are Cultural Differences Evident and Do They Impede Learning?

Jens Walldorf; Tina Jähnert; Norman B. Berman; Martin R. Fischer
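The engagement-metric abstract above notes that the score combined four equally weighted components of student interaction with the virtual patient. A minimal sketch of such an equally weighted composite, assuming inputs already normalized to [0, 1] and using hypothetical component names (the abstract does not enumerate them):

```python
# Sketch of an equally weighted engagement score with four components,
# as described in the BMC Medical Education 2018 abstract. The component
# names below are illustrative assumptions, not taken from the paper.

def engagement_score(components: dict[str, float]) -> float:
    """Average four engagement components, each pre-normalized to [0, 1]."""
    expected = {"time_on_case", "summary_length", "questions_answered", "notes_taken"}
    if set(components) != expected:
        raise ValueError(f"expected exactly the components {sorted(expected)}")
    for name, value in components.items():
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"{name} must lie in [0, 1]")
    # Equal weighting reduces to a simple mean of the four components.
    return sum(components.values()) / len(components)

score = engagement_score({
    "time_on_case": 0.8,
    "summary_length": 0.6,
    "questions_answered": 1.0,
    "notes_taken": 0.4,
})
# score is the mean of the four values: (0.8 + 0.6 + 1.0 + 0.4) / 4 = 0.7
```

A real implementation would also need a normalization step turning raw interaction logs into the [0, 1] components, which the abstract leaves to the full paper.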

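The professionalism study above reports inter-rater reliability of the statement coding as Cohen's kappa = .97. Kappa is the raters' observed agreement corrected for the agreement expected by chance; a small self-contained sketch of the standard formula (not the authors' implementation):

```python
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two raters."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must rate the same non-empty set of items")
    n = len(rater_a)
    # Observed agreement: fraction of items the raters coded identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(freq_a[cat] * freq_b[cat] for cat in freq_a) / (n * n)
    # (The degenerate case p_chance == 1, where both raters always use the
    # same single category, is left unhandled in this sketch.)
    return (p_observed - p_chance) / (1 - p_chance)

# Two raters coding eight statements as Professional (P), Unprofessional (U),
# or Equivocal (E); they disagree on one item.
kappa = cohens_kappa(["P", "P", "U", "P", "E", "P", "P", "U"],
                     ["P", "P", "U", "P", "P", "P", "P", "U"])
```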
Collaboration


Dive into Norman B. Berman's collaborations.

Top Co-Authors

Sherilyn Smith
University of Washington

Steven J. Durning
Uniformed Services University of the Health Sciences

David A. Levine
Morehouse School of Medicine

Anthony R. Artino
Uniformed Services University of the Health Sciences