Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Min Soon Kim is active.

Publication


Featured research published by Min Soon Kim.


Health Information and Libraries Journal | 2013

Information needs and information-seeking behaviour analysis of primary care physicians and nurses: a literature review

Martina A. Clarke; Jeffery L. Belden; Richelle J. Koopman; Linsey M. Steege; Joi L. Moore; Shannon M. Canfield; Min Soon Kim

Background: The increase in the adoption of electronic health records (EHRs) has contributed to physicians and nurses experiencing information overload. To address this problem, an assessment of the information needs of physicians and nurses will assist in understanding what they view as useful information for making patient care more efficient.

Objective: To analyse studies that assessed the information needs and information-seeking behaviour of physicians and nurses in a primary care setting, in order to develop a better understanding of what information to present to physicians when they are making clinical decisions.

Method: A literature review was conducted with a comprehensive search in PubMed, CINAHL and Scopus, together with examination of references from relevant papers and hand-searched articles, to identify articles applicable to this review.

Results: Of the papers reviewed, the most common information needs found among physicians and nurses were related to diagnoses, drugs and treatment/therapy. Colleagues remain a preferred information source among physicians and nurses; however, a rise in Internet usage is apparent.

Conclusion: Physicians and nurses need access to the Internet and job-specific resources to find practitioner-oriented information. In addition, effective usage of resources is important for improving patient care.


Health Informatics Journal | 2016

Health information needs, sources, and barriers of primary care patients to achieve patient-centered care: A literature review

Martina A. Clarke; Joi L. Moore; Linsey M. Steege; Richelle J. Koopman; Jeffery L. Belden; Shannon M. Canfield; Susan E. Meadows; Susan G. Elliott; Min Soon Kim

To synthesize findings from previous studies assessing information needs of primary care patients on the Internet and other information sources in a primary care setting. A systematic review of studies was conducted with a comprehensive search in multiple databases including OVID MEDLINE, CINAHL, and Scopus. The most common information needs among patients were information about an illness or medical condition and treatment methods, while the most common information sources were the Internet and patients’ physicians. Overall, patients tend to prefer the Internet for the ease of access to information, while they trust their physicians more for their clinical expertise and experience. Barriers to information access via the Internet include the following: socio-demographic variables such as age, ethnicity, income, education, and occupation; information search skills; and reliability of health information. Conclusion: Further research is warranted to assess how to create accurate and reliable health information sources for both Internet and non-Internet users.


Journal of the American Board of Family Medicine | 2015

Physician Information Needs and Electronic Health Records (EHRs): Time to Reengineer the Clinic Note

Richelle J. Koopman; Linsey M. Steege; Joi L. Moore; Martina A. Clarke; Shannon M. Canfield; Min Soon Kim; Jeffery L. Belden

Background: Primary care physicians face cognitive overload daily, perhaps exacerbated by the form of electronic health record documentation. We examined physician information needs in preparing for clinic visits, focusing on past clinic progress notes. Methods: This study used cognitive task analysis with 16 primary care physicians in the scenario of preparing for office visits. Physicians reviewed simulated acute and chronic care visit notes. We collected field notes and records of document highlighting and review, and we audio-recorded cognitive interviews conducted while on task, with subsequent thematic qualitative analysis. Member checks included the presentation of findings to the interviewed physicians and their faculty peers. Results: The Assessment and Plan section was most important and usually reviewed first. The History of the Present Illness section could provide supporting information, especially if in narrative form. Physicians expressed frustration with the Review of Systems section, lamenting that the forces driving note construction did not match their information needs. Repetition of information contained in other parts of the chart (eg, medication lists) was identified as a source of note clutter. A workflow that included a patient summary dashboard made some elements of past notes redundant and therefore a source of clutter. Conclusions: Current ambulatory progress notes present more information to the physician than necessary, and in an antiquated format. It is time to reengineer the clinic progress note to match the workflow and information needs of its primary consumer.


Journal of Evaluation in Clinical Practice | 2014

Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR)

Martina A. Clarke; Jeffery L. Belden; Min Soon Kim

Rationale, aims and objectives: The goal of this study is to determine usability gaps between expert and novice primary care doctors when using an electronic health record (EHR).

Methods: Usability tests using video analyses with a triangular method approach were conducted to analyse usability gaps between 10 novice and seven expert doctors. Doctors completed 19 tasks, using a think-aloud strategy, based on an artificial but typical patient visit note. Each usability session lasted approximately 20 minutes, with only the participant and the facilitator present in the testing room. A mixed-methods approach was used, including four sets of performance measures, the System Usability Scale (SUS) and a debriefing session with participants.

Results: While most expert doctors completed tasks more efficiently and gave a higher SUS score than novice doctors (novice 68, expert 70, out of a perfect score of 100), percent task success rates were comparable across all 19 tasks (74% for the expert group, 78% for the novice group; P = 0.98).

Conclusion: This study found a lack of expertise among doctors with more experience using the system: although expert doctors had been using the system longer, their proficiency did not increase with EHR experience. These results may help improve the EHR training programme, which may in turn increase doctors' performance when using an EHR. They may also assist EHR vendors in improving the user interface, which may help reduce errors caused by poor usability of the system.
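Several of the studies above report System Usability Scale (SUS) scores on a 0-100 scale (e.g. novice 68 vs expert 70). As a reference for how such scores are derived, here is a minimal sketch of standard SUS scoring: ten 1-5 Likert items, alternating positively and negatively worded, with item contributions reversed accordingly and the raw sum scaled to 0-100. The example responses are illustrative, not data from the studies.

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score from the ten
    standard 1-5 Likert responses. Items alternate in wording:
    odd-numbered items are positively worded, even-numbered items
    negatively worded, so their contributions are reversed."""
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        # Positively worded items (index 0, 2, ...): response - 1
        # Negatively worded items (index 1, 3, ...): 5 - response
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# A respondent answering 4 on every positive item and 2 on every
# negative item yields a score near the "high marginal" band
# reported in these studies.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```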


Applied Clinical Informatics | 2014

Determining Primary Care Physician Information Needs to Inform Ambulatory Visit Note Display

Martina A. Clarke; Linsey M. Steege; Joi L. Moore; Richelle J. Koopman; Jeffery L. Belden; Min Soon Kim

Background: With the increase in the adoption of electronic health records (EHRs) across the US, primary care physicians are experiencing information overload. The purpose of this pilot study was to determine the information needs of primary care physicians (PCPs) as they review clinic visit notes, to inform EHR display.

Method: Data were collected during semi-structured interviews with 15 primary care physicians, with a third-party observer present to control bias. Physicians reviewed major sections of an artificial but typical acute and chronic care visit note to identify the note sections relevant to their information needs. Statistical methods used were McNemar-Mosteller's test and Cochran's Q.

Results: Physicians identified History of Present Illness (HPI), Assessment, and Plan (A&P) as the most important sections of a visit note. In contrast, they largely judged the Review of Systems (ROS) to be superfluous. There was also a statistical difference in physicians' highlighting among all seven major note sections in acute (p = 0.00) and chronic (p = 0.00) care visit notes.

Conclusion: The A&P and HPI sections were most frequently identified as important, which suggests that physicians may have to pick out a few key sections from a long, unnecessarily verbose visit note. The ROS is viewed by doctors as mostly unneeded, although it can contain information needed for patient care when other sections of the visit note, such as the HPI, lack the relevant information. Future studies should include producing a display that provides only relevant information, to increase physician efficiency at the point of care. Research is also needed on moving the A&P to the top of visit notes rather than the bottom, since these are usually the first sections physicians refer to, and reviewing from top to bottom may cause cognitive load.


International Conference of Design, User Experience, and Usability | 2013

Addressing human computer interaction issues of electronic health record in clinical encounters

Martina A. Clarke; Linsey M. Steege; Joi L. Moore; Jeffery L. Belden; Richelle J. Koopman; Min Soon Kim

Electronic Health Records (EHRs) are known to reduce medical errors and store comprehensive patient information, and they also impact the physician-patient interaction during clinical encounters. This study reviewed the literature to (1) identify the most common challenges to patient-physician relations while using an EHR during a clinical visit, (2) discuss limitations of the research methodologies employed, and (3) suggest future research directions related to addressing human computer interaction issues when physicians use an EHR in clinical encounters.


JMIR Human Factors | 2016

How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study

Martina A. Clarke; Jeffery L. Belden; Min Soon Kim

Background Electronic health records (EHRs) with poor usability present steep learning curves for new resident physicians, who are already overwhelmed in learning a new specialty. This may lead to error-prone use of EHRs in medical practice by new resident physicians. Objective The study goal was to determine learnability gaps between expert and novice primary care resident physician groups by comparing performance measures when using EHRs. Methods We compared performance measures after two rounds of learnability tests (November 12, 2013 to December 19, 2013; February 12, 2014 to April 22, 2014). In Rounds 1 and 2, 10 novice and 6 expert physicians, and 8 novice and 4 expert physicians participated, respectively. Laboratory-based learnability tests using video analyses were conducted to analyze learnability gaps between novice and expert physicians. Physicians completed 19 tasks, using a think-aloud strategy, based on an artificial but typical patient visit note. We used quantitative performance measures (percent task success, time-on-task, mouse activities), a system usability scale (SUS), and qualitative narrative feedback during the participant debriefing session. 
Results There was a 6-percentage-point increase in novice physicians’ task success rate (Round 1: 92%, 95% CI 87-99; Round 2: 98%, 95% CI 95-100) and a 7-percentage-point increase in expert physicians’ task success rate (Round 1: 90%, 95% CI 83-97; Round 2: 97%, 95% CI 93-100); a 10% decrease in novice physicians’ time-on-task (Round 1: 44s, 95% CI 32-62; Round 2: 40s, 95% CI 27-59) and a 21% decrease in expert physicians’ time-on-task (Round 1: 39s, 95% CI 29-51; Round 2: 31s, 95% CI 22-42); a 20% decrease in novice physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 6-13; Round 2: 7 clicks, 95% CI 4-12) and a 39% decrease in expert physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 5-11; Round 2: 3 clicks, 95% CI 1-10); a 14% decrease in novice physicians’ mouse movements (Round 1: 9247 pixels, 95% CI 6404-13,353; Round 2: 7991 pixels, 95% CI 5350-11,936) and a 14% decrease in expert physicians’ mouse movements (Round 1: 7325 pixels, 95% CI 5237-10,247; Round 2: 6329 pixels, 95% CI 4299-9317). The SUS measure of overall usability demonstrated only minimal change in the novice group (Round 1: 69, high marginal; Round 2: 68, high marginal) and no change in the expert group (74, high marginal, in both rounds). Conclusions This study found differences in novice and expert physicians’ performance, demonstrating that physicians’ proficiency increased with EHR experience. Our study may serve as a guideline to improve current EHR training programs. Future directions include identifying usability issues faced by physicians when using EHRs, through a more granular task analysis to recognize subtle usability issues that would otherwise be overlooked.
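The abstract above deliberately mixes two kinds of change: task success rates (already percentages) are compared in percentage points, while raw quantities such as time-on-task are compared in percent. A small sketch, using the novice figures reported above, makes the distinction concrete:

```python
# Percentage-point change applies to quantities already expressed as
# percentages (e.g. task success rates); percent change applies to raw
# quantities (e.g. time-on-task in seconds, mouse clicks, pixels).

def pct_point_change(before, after):
    """Difference between two percentages, in percentage points."""
    return after - before

def pct_change(before, after):
    """Relative change of a raw quantity, in percent."""
    return (after - before) / before * 100

# Novice task success: 92% -> 98% is a 6-percentage-point increase.
print(pct_point_change(92, 98))      # → 6
# Novice time-on-task: 44 s -> 40 s is about a 9% decrease
# (reported as ~10% in the abstract).
print(round(pct_change(44, 40), 1))  # → -9.1
```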


International Conference of Design, User Experience, and Usability | 2014

Usability Improvement of a Clinical Decision Support System

Frederick Thum; Min Soon Kim; Nicholas Genes; Laura Rivera; Rosemary Beato; Jared Soriano; Joseph Kannry; Kevin M. Baumlin; Ula Hwang

This paper focuses on improving the usability of an electronic health record (EHR) embedded clinical decision support system (CDSS) targeted to treat pain in elderly adults. CDSS have the potential to impact provider behavior. Optimizing CDSS-provider interaction and usability may enhance CDSS use. Five CDSS interventions were developed and deployed in test scenarios within a simulated EHR that mirrored typical Emergency Department (ED) workflow. Provider feedback was analyzed using a mixed methodology approach. The CDSS interventions were iteratively designed across three rounds of testing based upon this analysis. Iterative CDSS design led to improved provider usability and favorability scores.


International Conference on Human-Computer Interaction | 2015

What Learnability Issues Do Primary Care Physicians Experience When Using CPOE?

Martina A. Clarke; Jeffery L. Belden; Min Soon Kim

Objective: To determine learnability gaps between expert and novice primary care physicians when using computerized physician order entry (CPOE). Method: Two rounds of lab-based usability tests using video analyses with a triangular method approach were conducted to analyze learnability gaps between ten novice and six expert physicians. Results: Between rounds one and two, there was a 14-percentage-point increase in novice physicians’ task success rate (p = 0.29) and an 11-percentage-point increase in expert physicians’ task success rate (p = 0.64); an 8% decrease in novice physicians’ time on task (p = 0.83) and a 12% decrease in expert physicians’ time on task (p = 0.47); a 17% decrease in novice physicians’ mouse clicks (p = 0.97) and a 20% decrease in expert physicians’ mouse clicks (p = 0.80); and a 5% increase in novice physicians’ mouse movements (p = 0.67) and an 8% decrease in expert physicians’ mouse movements (p = 0.99). Conclusion: Future directions include identifying usability issues faced by physicians when using the EHR through subtask analysis.


Biomedical Engineering Systems and Technologies | 2015

Comparing Computerized Physician Order Entry Usability between Expert and Novice Primary Care Physicians

Martina A. Clarke; Jeffery L. Belden; Min Soon Kim

Objectives: To examine usability gaps between expert and novice primary care physicians when using computerized provider order entry (CPOE). Methods: To analyze usability gaps between ten novice and seven expert physicians, usability tests involving video analysis with a triangular method approach were conducted. Results: While most novice physicians completed tasks less proficiently and provided a lower System Usability Scale (SUS) score than expert physicians, the ‘percent task success rate’ result (t(8) = 2.31, p = 0.98) was not significant between the physician groups on all five tasks. Seven common and four unique usability issues were identified between the two physician groups. Three themes emerged during analysis: user interface issues, ambiguous terminologies, and training and education issues. Discussion and Conclusion: This study identified varying usability issues for users of CPOE with different expertise. Two additional iterations of usability data collection are under way to uncover comprehensive usability issues and measure learnability.

Collaboration


Dive into Min Soon Kim's collaboration.

Top Co-Authors

Linsey M. Steege

University of Wisconsin-Madison


Frederick Thum

Icahn School of Medicine at Mount Sinai


Jared Soriano

Icahn School of Medicine at Mount Sinai
