
Publications


Featured research published by Katherine V Chew.


Journal of the Medical Library Association | 2009

A case study: Planning a statewide information resource for health professionals: An evidence-based approach

Erinn E Aspinall; Katherine V Chew; Linda A. Watson; Mary Parker

QUESTION: What is the best approach for implementing a statewide electronic health library (eHL) to serve all health professionals in Minnesota? SETTING: The research took place at the University of Minnesota Health Sciences Libraries. METHODS: In January 2008, the authors began planning a statewide eHL for health professionals following the five-step process for evidence-based librarianship (EBL): formulating the question, finding the best evidence, appraising the evidence, assessing costs and benefits, and evaluating the effectiveness of resulting actions. MAIN RESULTS: The authors identified best practices for developing a statewide eHL for health professionals relating to the audience or population served, information resources, technology and access, funding model, and implementation and sustainability. These practices were compared against the mission of the eHL project to develop recommendations that would drive strategic directions. CONCLUSION: EBL can guide the planning process for a statewide eHL, but findings must be tailored to the local environment to address information needs and ensure long-term sustainability.


Evidence Based Library and Information Practice | 2016

E-Journal Metrics for Collection Management: Exploring Disciplinary Usage Differences in Scopus and Web of Science

Katherine V Chew; Mary Schoenborn; James A. Stemper; Caroline Lilyard

Objective – The purpose was to determine whether a relationship exists between journal downloads and either faculty authoring venue or citations to these faculty, or whether a relationship exists between journal rankings and local authoring venues or citations. A related purpose was to determine whether any such relationship varied between or within disciplines. A final purpose was to determine whether specific tools for ranking journals or indexing authorship and citation were demonstrably better than alternatives.

Methods – Multiple years of journal usage, ranking, and citation data for twelve disciplines were combined in Excel, and the strength of the relationships was determined using rank correlation coefficients.

Results – The results illustrated marked disciplinary variation in the degree to which faculty decisions to download a journal article can be used as a proxy to predict which journals they will publish in or which journals will cite faculty’s work. While journal access requests show moderate to strong relationships with the journals in which faculty publish, as well as with journals whose articles cite local faculty, the data suggest that Scopus may be the better resource for finding such information for these journals in the health sciences, and Web of Science may be the better resource for all other disciplines analyzed. The same can be said for the ability of external ranking mechanisms to predict faculty publishing behaviours. Eigenfactor is more predictive for both authoring and citing-by-others across most of the representative disciplines in the social sciences as well as the physical and natural sciences. With the health sciences, no clear pattern emerges.

Conclusion – Collecting and correlating authorship and citation data allows patterns of use to emerge, resulting in a more accurate picture of use activity than the commonly used cost-per-use method. To find the best information on authoring activity by local faculty for subscribed journals, use Scopus. To find the best information on citing activity by faculty peers for subscribed titles, use Thomson Reuters’ customized Local Journal Use Reports (LJUR), or limit a Web of Science search to the local institution. The Eigenfactor and SNIP journal quality metrics can better inform selection decisions and are publicly available. Given the trend toward more centralized collection development, it is still critical to obtain liaison input no matter what datasets are used for decision making. This evidence of value can be used to defend any local library “tax” that academic departments pay, as well as to promote services that help faculty demonstrate their research impact.
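The analysis described above rests on rank-correlating per-journal download counts with local authoring and citation counts. The abstract says only that rank correlation coefficients were used; the sketch below assumes Spearman’s rho and uses invented journal-level numbers purely for illustration, not the study’s data.

```python
# Minimal illustration of the technique, not the study's actual analysis.
# Assumptions: Spearman's rho as the rank correlation; all counts are made up.
from scipy.stats import spearmanr

# Hypothetical per-journal data for one discipline.
journals = ["J1", "J2", "J3", "J4", "J5", "J6"]
downloads = [5400, 3100, 2800, 900, 450, 120]        # usage (download requests)
faculty_articles = [35, 18, 22, 4, 6, 1]             # local faculty authoring counts
citations_to_faculty = [410, 260, 150, 30, 55, 8]    # citations to local faculty

# Correlate download ranks with authoring venue and with citing activity.
rho_auth, p_auth = spearmanr(downloads, faculty_articles)
rho_cite, p_cite = spearmanr(downloads, citations_to_faculty)

print(f"downloads vs. faculty authoring:     rho={rho_auth:.2f} (p={p_auth:.3f})")
print(f"downloads vs. citations to faculty:  rho={rho_cite:.2f} (p={p_cite:.3f})")
```

Run per discipline, coefficients like these make it possible to compare how well usage predicts authoring or citing behaviour across fields, which is the disciplinary comparison the abstract reports.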


Archive | 2010

Serving Multiple Stakeholders: Crafting a “blended” scorecard at the University of Minnesota Health Sciences Libraries

Katherine V Chew; Erinn E Aspinall


Archive | 2013

User-defined valued metrics for electronic journals

Katherine V Chew; James A. Stemper; Caroline Lilyard; Mary Schoenborn


Minnesota Library Association Annual Conference | 2017

Health in the Headlines: Critical Evaluation to Combat “Fake News”

Katherine V Chew; Elizabeth Kiscadne


Medical Library Association Annual Meeting | 2017

Health Fact or Fiction: Utilizing an iPad Flashcard App to Engage and Educate Fair Attendees

Katherine V Chew


Archive | 2016

Adventures in Bibliometrics: Research Impact and the CTSI

Katherine V Chew; Caitlin Bakker


Archive | 2016

Evaluative Bibliometrics Meet the CTSI

Caitlin Bakker; Katherine V Chew


Medical Library Association Annual Meeting: Librarians Without Limits | 2015

Breaking Into Uncharted Territory: Collaborating On NIH Public Access Policy Compliance with the Sponsored Projects Administration

Katherine V Chew


Expanding the Assessment Toolbox: Blending Old and New Assessment Practices | 2015

E-Journal Metrics: Exploring Disciplinary Differences

Katherine V Chew; Mary Schoenborn

Collaboration


Dive into Katherine V Chew's collaborations.

Top Co-Authors

Mary Parker

University of Minnesota
