Simona Carini
University of California, San Francisco
Publications
Featured research published by Simona Carini.
Journal of Biomedical Informatics | 2011
Samson W. Tu; Mor Peleg; Simona Carini; Michael Bobak; Jessica Ross; Daniel L. Rubin; Ida Sim
Formalizing eligibility criteria in a computer-interpretable language would facilitate eligibility determination for study subjects and the identification of studies on similar patient populations. Because such formalization is extremely labor-intensive, we transform the problem from one of fully capturing the semantics of criteria directly in a formal expression language to one of annotating free-text criteria in a format called ERGO annotation. The annotation can be done manually, or it can be partially automated using natural-language processing techniques. We evaluated our approach in three ways. First, we assessed the extent to which ERGO annotations capture the semantics of 1000 eligibility criteria randomly drawn from ClinicalTrials.gov. Second, we demonstrated the practicality of the annotation process in a feasibility study. Finally, we demonstrated the computability of ERGO annotation by using it to (1) structure a library of eligibility criteria, (2) search for studies enrolling specified study populations, and (3) screen patients for potential eligibility for a study. This work thus offers a new and practical method for incrementally capturing the semantics of free-text eligibility criteria in computable form.
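To make the idea concrete, here is a minimal Python sketch of what an annotation layered over a free-text criterion might look like; the field names and structure are illustrative assumptions, not the actual ERGO annotation schema defined in the paper.

```python
# Illustrative sketch only: ERGO annotation's real schema is defined in the
# paper, not reproduced here. This hypothetical structure shows the general
# idea of layering structured fields over a free-text criterion.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Constraint:
    attribute: str               # e.g., "age"
    comparator: str              # e.g., ">="
    value: str                   # e.g., "18"
    unit: Optional[str] = None   # e.g., "years"

@dataclass
class ErgoAnnotation:
    source_text: str             # the original free-text criterion
    polarity: str                # "inclusion" or "exclusion"
    noun_phrase: str             # the clinical concept being constrained
    constraints: List[Constraint] = field(default_factory=list)

criterion = ErgoAnnotation(
    source_text="Patients aged 18 years or older with type 2 diabetes",
    polarity="inclusion",
    noun_phrase="type 2 diabetes",
    constraints=[Constraint("age", ">=", "18", "years")],
)
```

A structure of this kind can be indexed and queried, which is what makes annotated criteria usable for searching studies and screening patients.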
Journal of Biomedical Informatics | 2004
Ida Sim; Ben Olasov; Simona Carini
Randomized controlled trials (RCTs) are one of the least biased sources of clinical research evidence, and are therefore a critical resource for the practice of evidence-based medicine. With over 10,000 new RCTs indexed in Medline each year, knowledge systems are needed to help clinicians translate evidence into practice. Common ontologies for RCTs and other domains would facilitate the development of these knowledge systems. However, no standard method exists for developing domain ontologies. In this paper, we describe a new systematic approach to specifying and evaluating the conceptual content of ontologies. In this method, called competency decomposition, the target task for an ontology is hierarchically decomposed into subtasks and methods, and the ontology content is specified by identifying the domain information required to complete each of the subtasks. We illustrate the use of this competency decomposition approach for the content specification and evaluation of an RCT ontology for evidence-based practice.
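As a rough illustration of competency decomposition, the following Python sketch models a task tree whose leaves record the domain information the ontology must supply; the task names and fields are hypothetical, not drawn from the paper.

```python
# Hypothetical sketch of competency decomposition: the target task is broken
# into subtasks, and each leaf records the domain information an ontology
# must supply to complete it. Names are illustrative inventions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    required_information: List[str] = field(default_factory=list)
    subtasks: List["Task"] = field(default_factory=list)

    def content_specification(self) -> List[str]:
        """Collect the information requirements of all leaf subtasks."""
        if not self.subtasks:
            return self.required_information
        spec: List[str] = []
        for sub in self.subtasks:
            spec.extend(sub.content_specification())
        return spec

appraise_rct = Task("Appraise an RCT for evidence-based practice", subtasks=[
    Task("Assess internal validity",
         required_information=["allocation concealment", "blinding"]),
    Task("Assess applicability",
         required_information=["eligibility criteria", "study setting"]),
])
print(appraise_rct.content_specification())
```

The union of the leaf requirements is, in effect, the content specification against which an ontology's coverage can be evaluated.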
Conference of the Centre for Advanced Studies on Collaborative Research | 2008
Maria-Elena Hernandez; Sean M. Falconer; Margaret-Anne D. Storey; Simona Carini; Ida Sim
Searching and comparing information from semi-structured repositories is an important but cognitively complex activity for internet users. The typical web interface displays results as a textual list, which does little to help the user compare results or gain an overview across a series of iterative queries. In this paper, we propose a new interactive, lightweight technique that uses multiple synchronized tag clouds to support iterative visual analysis and filtering of query results. Although tag clouds are frequently available in web interfaces, they are typically used to provide an overview of key terms in a set of results; thus far they have not been used to present semi-structured information in support of iterative queries. We evaluated our proposed design in a user study that presented typical search and comparison scenarios to users trying to understand heterogeneous clinical trials from a leading repository of scientific information. The study gave us valuable insights into the challenges that semi-structured data collections pose, and indicated that our design may ease cognitively demanding browsing of semi-structured information.
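The core interaction can be sketched in a few lines of Python: choosing a term in one cloud filters the result set, and every cloud's term frequencies are recomputed over what remains. The sample records and field names below are invented for illustration.

```python
# Minimal sketch of the synchronized-tag-cloud idea. Selecting a term in one
# facet filters the result set; every facet's term counts are then recomputed
# over the surviving results, so all clouds update together.
from collections import Counter

trials = [
    {"condition": "diabetes", "intervention": "drug", "phase": "III"},
    {"condition": "diabetes", "intervention": "behavioral", "phase": "II"},
    {"condition": "cancer", "intervention": "drug", "phase": "III"},
]

def clouds(results, facets):
    """One term-frequency Counter per facet; font size would scale with count."""
    return {f: Counter(r[f] for r in results) for f in facets}

facets = ["condition", "intervention", "phase"]
print(clouds(trials, facets))                        # initial overview
selected = [t for t in trials if t["condition"] == "diabetes"]
print(clouds(selected, facets))                      # all clouds re-synchronize
```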
Journal of Biomedical Informatics | 2014
Ida Sim; Samson W. Tu; Simona Carini; Harold P. Lehmann; Brad H. Pollock; Mor Peleg; Knut M. Wittkowski
To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes - the science of clinical research - are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe's modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study's scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlie the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research.
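Because OCRe is distributed as an OWL 2 ontology, it can in principle be inspected with a standard toolkit. The sketch below assumes a local copy of the OCRe OWL file and the third-party owlready2 package; the file name is a placeholder, not an official path.

```python
# Hedged sketch, assuming a local copy of the OCRe OWL file. The path below
# is a placeholder; substitute wherever the file actually lives.
import os
from owlready2 import get_ontology

onto = get_ontology("file://" + os.path.abspath("OCRe.owl")).load()

# Walk the classes (e.g., the study design typology) and print any
# rdfs:comment annotations, which often carry textual definitions.
for cls in onto.classes():
    print(cls.name, list(cls.comment))
```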
Contemporary Clinical Trials | 2013
Dina Roumiantseva; Simona Carini; Ida Sim; Todd H. Wagner
OBJECTIVE We examine the extent to which ClinicalTrials.gov is meeting its goal of providing oversight and transparency of clinical trials with human subjects. METHODS We analyzed the contents of the ClinicalTrials.gov database as of June 2011, comparing interventions, medical conditions, and trial characteristics by sponsor type. We also conducted a detailed analysis of incomplete data. RESULTS Among trials with only government sponsorship (N=9252), 36% were observational and 64% interventional; in contrast, almost all (90%) industry-only sponsored trials were interventional. Industry-only sponsored interventional trials (N=30,036) were most likely to report a drug intervention (81%), followed by biologics (9%) and devices (8%). Government-only interventional trials (N=5886) were significantly more likely to test behavioral interventions (28%) and procedures (13%) than industry-only trials (p<0.001). The medical conditions most frequently studied in industry-only trials were cancer (19%), cardiovascular conditions (12%), and endocrine/metabolic disorders (11%). Government-only funded trials were more likely to study mental health (19% vs. 7% for industry, p<0.001) and viral infections, including HIV (15% vs. 7% for industry, p<0.001). Government-funded studies were also significantly more likely to be missing data about study design and intervention arms in the registry. For all studies, we report ambiguous and contradictory data entries. CONCLUSIONS Industry-sponsored studies differ systematically from government-sponsored studies in study type, choice of interventions, conditions studied, and completeness of submitted information. Imprecise study design information, incomplete coding of conditions, out-of-date or unspecified enrollment numbers, and other missing data continue to hinder robust analyses of trials registered in ClinicalTrials.gov.
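As a hedged illustration of the kind of sponsor-type comparison reported above, the following Python snippet runs a chi-square test on a 2x2 contingency table; the government row is derived from the reported N and percentage, while the industry split is invented purely for demonstration.

```python
# Illustrative recomputation of a "more likely than" comparison. The counts
# are NOT the paper's data: the government row approximates 28% of N=5886,
# and the industry row is a hypothetical split of N=30,036.
from scipy.stats import chi2_contingency

#         behavioral  non-behavioral
table = [[1648, 4238],      # government-only interventional (approximate)
         [1502, 28534]]     # industry-only interventional (hypothetical)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.3g}")  # a tiny p-value supports the comparison
```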
Systematic Reviews | 2016
Ian J Saldanha; Christopher H. Schmid; Joseph Lau; Kay Dickersin; Jesse A. Berlin; Jens Jap; Bryant T Smith; Simona Carini; Wiley Chan; Berry de Bruijn; Byron C. Wallace; Susan Hutfless; Ida Sim; M. Hassan Murad; Sandra A. Walsh; Elizabeth J. Whamond; Tianjing Li
Background: Data abstraction, a critical systematic review step, is time-consuming and prone to errors. Current standards for approaches to data abstraction rest on a weak evidence base. We developed the Data Abstraction Assistant (DAA), a novel software application designed to facilitate the abstraction process by allowing users to (1) view study article PDFs juxtaposed to electronic data abstraction forms linked to a data abstraction system, (2) highlight (or “pin”) the location of the text in the PDF, and (3) copy relevant text from the PDF into the form. We describe the design of a randomized controlled trial (RCT) that compares the relative effectiveness of (A) DAA-facilitated single abstraction plus verification by a second person, (B) traditional (non-DAA-facilitated) single abstraction plus verification by a second person, and (C) traditional independent dual abstraction plus adjudication, to ascertain the accuracy and efficiency of abstraction.
Methods: This is an online, randomized, three-arm, crossover trial. We will enroll 24 pairs of abstractors (i.e., a sample size of 48 participants), each pair comprising one less and one more experienced abstractor. Pairs will be randomized to abstract data from six articles, two under each of the three approaches. Abstractors will complete pre-tested data abstraction forms using the Systematic Review Data Repository (SRDR), an online data abstraction system. The primary outcomes are (1) the proportion of abstracted data items that constitute an error (compared with an answer key) and (2) the total time taken to complete abstraction (by the two abstractors in the pair, including verification and/or adjudication).
Discussion: The DAA trial uses a practical design to test a novel software application as a tool to help improve the accuracy and efficiency of the data abstraction process during systematic reviews. Findings from the DAA trial will provide much-needed evidence to strengthen current recommendations for data abstraction approaches.
Trial registration: The trial is registered at the National Information Center on Health Services Research and Health Care Technology (NICHSR) under Registration # HSRP20152269: https://wwwcf.nlm.nih.gov/hsr_project/view_hsrproj_record.cfm?NLMUNIQUE_ID=20152269&SEARCH_FOR=Tianjing%20Li. All items from the World Health Organization Trial Registration Data Set are covered at various locations in this protocol. Protocol version and date: this is version 2.0 of the protocol, dated September 6, 2016. As needed, we will communicate any protocol amendments to the Institutional Review Boards (IRBs) of Johns Hopkins Bloomberg School of Public Health (JHBSPH) and Brown University. We will also make appropriate as-needed modifications to the NICHSR website in a timely fashion.
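For illustration only, here is a Python sketch of the allocation logic the design implies: each abstractor pair receives a randomized ordering of the three approaches and abstracts two articles under each. The trial's actual randomization procedure is specified in its protocol, not here, and the balancing scheme below is a simplification.

```python
# Hedged sketch of the three-arm crossover allocation described above.
# 24 pairs x 6 articles; each pair works two articles under each approach,
# in a randomized order. Simple random sequence choice is shown; the real
# trial may balance sequences more carefully.
import random
from itertools import permutations

approaches = ["A: DAA + verification",
              "B: traditional + verification",
              "C: dual abstraction + adjudication"]
sequences = list(permutations(approaches))   # the 6 possible orderings

rng = random.Random(2016)                    # fixed seed for reproducibility
assignments = {}
for pair_id in range(1, 25):
    order = rng.choice(sequences)
    # two articles per approach, in the randomized order -> six articles total
    assignments[pair_id] = [appr for appr in order for _ in range(2)]

print(assignments[1])
```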
BMC Medical Informatics and Decision Making | 2010
Svetlana Kiritchenko; Berry de Bruijn; Simona Carini; Joel D. Martin; Ida Sim
AMIA Joint Summits on Translational Science Proceedings | 2010
Jessica Ross; Samson W. Tu; Simona Carini; Ida Sim
American Medical Informatics Association Annual Symposium | 2008
Berry de Bruijn; Simona Carini; Svetlana Kiritchenko; Joel D. Martin; Ida Sim
AMIA Joint Summits on Translational Science Proceedings | 2010
Ida Sim; Simona Carini; Samson W. Tu; Rob Wynden; Brad H. Pollock; Shamim A. Mollah; Davera Gabriel; Herbert K. Hagler; Richard H. Scheuermann; Harold P. Lehmann; Knut M. Wittkowski; Meredith Nahm; Suzanne Bakken