Megan Oakleaf
Syracuse University
Publications
Featured research published by Megan Oakleaf.
portal: Libraries and the Academy | 2008
Megan Oakleaf
The culture of assessment in higher education requires academic librarians to demonstrate the impact of information literacy instruction on student learning. As a result, many librarians seek to gain knowledge about the information literacy assessment approaches available to them. This article identifies three major assessment approaches: (1) fixed-choice tests, (2) performance assessments, and (3) rubrics. It maps the theoretical and educational assumptions on which these options are grounded and charts the dangers and opportunities of each assessment approach.
Journal of Documentation | 2009
Megan Oakleaf
Purpose – The aim of this paper is to present the Information Literacy Instruction Assessment Cycle (ILIAC), to describe the seven stages of the ILIAC, and to offer an extended example that demonstrates how the ILIAC increases librarian instructional abilities and improves student information literacy skills.
Design/methodology/approach – Employing survey design methodology, the researcher and participants use a rubric to code artifacts of student learning into pre-set rubric categories. These categories are assigned point values and statistically analyzed to evaluate students and examine interrater reliability and validity.
Findings – By engaging in the ILIAC, librarians gain important data about the information behavior of students and a greater understanding of student strengths and weaknesses. The ILIAC encourages librarians to articulate learning outcomes clearly, analyze them meaningfully, celebrate learning achievements, and diagnose problem areas. In short, the ILIAC results in improved student learning...
portal: Libraries and the Academy | 2009
Megan Oakleaf; Neal K. Kaske
Librarians throughout higher education must assess information literacy; however, many are overwhelmed by the prospect of selecting the best assessment for their campus context. This article presents six questions to aid librarians in surmounting this challenge. Are we ready to conduct an information literacy assessment? Why are we conducting this assessment? What are the stakeholder needs? Will the assessment tell us what we want to know? What are the costs of this assessment? What are the institutional implications of this assessment? Armed with the answers to these questions, librarians will be well positioned to make informed assessment choices.
Evidence Based Library and Information Practice | 2007
Megan Oakleaf
Objective - Every day, librarians make decisions that affect the provision of library products and services. To make good decisions, librarians must be equipped with reliable and valid data. Unfortunately, many library processes generate vast quantities of unwieldy information that is ill-suited to the evidence based decision-making (EBDM) practices librarians strive to employ. As a result, librarians require tools that facilitate the translation of unmanageable facts and figures into data that can be used to support decision-making. One such tool is the rubric. Rubrics provide at least four major benefits to librarians seeking to use EBDM strategies and merit further investigation. To this end, this study examined 1) librarians’ ability to use rubrics as a decision facilitation tool, 2) barriers that might prevent effective rubric usage, and 3) training topics that address potential barriers.
Methods - This study investigated librarians’ use of rubrics as an EBDM tool to improve an online information literacy tutorial. The data for the study came from student responses to open-ended questions embedded in LOBO, an online information literacy tutorial used by first-year students in English 101 at North Carolina State University (NCSU). Fifteen academic librarians, five instructors, and five students applied rubrics to transform students’ textual responses into quantitative data; this data was statistically analyzed for reliability and validity using Cohen’s kappa. Participant comment sheets were also examined to reveal potential hurdles to effective rubric use.
Results - Statistical analysis revealed that a subset of participants in this study were able to achieve substantially valid results. On the other hand, some librarian participants were unable to achieve an expert level of validity. Non-expert participants alluded to roadblocks that interfered with their ability to provide quality data using rubrics.
Conclusions - Participant feedback can be categorized into six barriers that may explain why some participants could not attain expert status: 1) difficulty understanding an outcomes-based approach, 2) tension between analytic and holistic rubric structures, 3) failure to comprehend rubric terms, 4) disagreement with rubric assumptions, 5) difficulties with data artifacts, and 6) difficulties understanding local library context and culture. Each of these barriers can be addressed through training, and topics to maximize the usefulness of a rubric approach to EBDM are suggested.
Australian Academic & Research Libraries | 2011
Megan Oakleaf
This paper provides an overview of the process undertaken in the US during 2009/10 in developing a major report on the value of academic libraries. A summary of the key findings and recommendations from the report are also provided. While very much focused on the US situation, the author feels the findings may well have resonance elsewhere, including Australia.
College & Research Libraries | 2015
Megan Oakleaf
Megan Oakleaf is an Associate Professor in the iSchool at Syracuse University; e-mail: [email protected].
College & Research Libraries | 2011
Megan Oakleaf
Do the Right (Write) Thing: Engaging in Academic Library Value Research
Last fall, ACRL published the Value of Academic Libraries Comprehensive Research Review and Report. Since then, many librarians have cited the report’s literature review; even more have commented on the variety of recommendations and the breadth of the research agenda laid out in the report. The literature review captures our past efforts to explore the return-on-investment and impact of academic libraries; the recommendations and research agenda give direction to our future work in articulating and increasing academic library value. Although the report is a static document, the library value conversation can be dynamic. The report can serve as a foundation for a lively professional and scholarly dialogue, but how might librarians engage and develop that dialogue? Certainly, ACRL can take a role in the library value conversation; it is already doing so by commencing a major initiative around academic library value issues, complete with presentations, partnerships, professional development offerings, and grant proposals. But librarians, individually and in concert with others, can also engage rigorously in the value conversation. Librarians and library science faculty can collaborate; in addition, librarians can seek research partnerships with other higher education stakeholders, including institutional researchers, higher education associations, and grant funders. Large-scale, rigorous research studies can be initiated whenever possible. Such studies are often perceived as “objective”, apolitical, and generalizable to multiple academic library contexts. They can also deliver the holy grail of “statistical significance.” However, large-scale studies represent...
Information and Learning Science | 2018
Megan Oakleaf
Purpose - This paper describes the need for academic libraries to demonstrate and increase their impact on student learning and success. It highlights the data problems present in existing library value correlation research and suggests a pathway to surmounting existing data obstacles. The paper advocates the integration of libraries into institutional learning analytics systems to gain access to more granular student learning and success data. It also suggests using library-infused learning analytics data to discover and act upon new linkages that may reveal library value in an institutional context.
Design/methodology/approach - The paper describes a pattern pervasive in existing academic library value correlation research and identifies major data obstacles to future research in this vein. The paper advocates learning analytics as one route to access more usable and revealing data. It also acknowledges several challenges to the suggested approach.
Findings - This paper describes learning analytics as it may...
Journal of the Association for Information Science and Technology | 2009
Megan Oakleaf
The Journal of Academic Librarianship | 2014
Megan Oakleaf