Jennifer J. Kaplan
University of Georgia
Publications
Featured research published by Jennifer J. Kaplan.
Journal of Statistics Education | 2009
Jennifer J. Kaplan; Diane G. Fisher; Neal Rogness
Language plays a crucial role in the classroom. The use of specialized language in a domain can cause a subject to seem more difficult to students than it actually is. When words that are part of everyday English are used differently in a domain, these words are said to have lexical ambiguity. Studies in other fields, such as mathematics and chemistry education, suggest that in order to help students learn vocabulary, instructors should exploit the lexical ambiguity of the words. The study presented here is a pilot study, the first in a sequence of studies designed to understand the effects of, and develop techniques for exploiting, lexical ambiguities in the statistics classroom. In particular, this paper describes the meanings of five statistical terms most commonly held by students entering an undergraduate statistics course.
CBE- Life Sciences Education | 2011
Kevin C. Haudek; Jennifer J. Kaplan; Jennifer K. Knight; Tammy M. Long; John E. Merrill; Alan Munn; Ross H. Nehm; Michelle K. Smith; Mark Urban-Lurain
Concept inventories, consisting of multiple-choice questions designed around common student misconceptions, are designed to reveal student thinking. However, students often have complex, heterogeneous ideas about scientific concepts. Constructed-response assessments, in which students must create their own answer, may better reveal students’ thinking, but are time- and resource-intensive to evaluate. This report describes the initial meeting of a National Science Foundation–funded cross-institutional collaboration of interdisciplinary science, technology, engineering, and mathematics (STEM) education researchers interested in exploring the use of automated text analysis to evaluate constructed-response assessments. Participants at the meeting shared existing work on lexical analysis and concept inventories, participated in technology demonstrations and workshops, and discussed research goals. We are seeking interested collaborators to join our research community.
Journal of Statistics Education | 2010
Jennifer J. Kaplan; Diane G. Fisher; Neal Rogness
Language plays a crucial role in the classroom. The use of specialized language in a domain can cause a subject to seem more difficult to students than it actually is. When words that are part of everyday English are used differently in a domain, these words are said to have lexical ambiguity. Studies in other fields, such as mathematics and chemistry education, suggest that in order to help students learn vocabulary, instructors should exploit the lexical ambiguity of the words. The study presented here is the second in a sequence of studies designed to understand the effects of, and develop techniques for exploiting, lexical ambiguities in statistics classrooms. In particular, this paper looks at five statistical terms and the meanings of these terms most commonly expressed by students at the end of an undergraduate statistics course.
PLOS ONE | 2012
Michelle K. Smith; Seanna L. Annis; Jennifer J. Kaplan; Frank Drummond
Blueberry growers in Maine attend annual Cooperative Extension presentations given by university faculty members. These presentations cover topics such as how to prevent plant disease and how to monitor for insect pests. In 2012, in order to make the sessions more interactive and promote learning, clicker questions and peer discussion were incorporated into the presentations. Similar to what has been shown at the undergraduate level, after peer discussion, more blueberry growers gave correct answers to multiple-choice questions than when answering independently. Furthermore, because blueberry growers are characterized by diverse levels of education, experience in the field, etc., we were able to determine whether demographic factors were associated with changes in performance after peer discussion. Taken together, our results suggest that clicker questions and peer discussion work equally well with adults from a variety of demographic backgrounds without disadvantaging a subset of the population, and that they provide an important learning opportunity to the least formally educated members. Our results also indicate that clicker questions with peer discussion were viewed as a positive addition to university-related informal science education sessions.
Journal of Statistics Education | 2014
Jennifer J. Kaplan; John Gabrosek; Phyllis Curtiss; Christopher Malone
Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies and discusses four misconceptions prevalent in student understanding of histograms. In addition, it presents pre- and post-test results on an instrument designed to measure the extent to which the misconceptions persist after instruction. The results presented indicate not only that the misconceptions are commonly held by students prior to instruction, but also that they persist after instruction. Future directions for teaching and research are considered.
Numeracy | 2011
Alla Sikorskii; Vince Melfi; Dennis Gilliland; Jennifer J. Kaplan; Suzie Ahn
Development, psychometric testing, and the results of the administration of a quantitative literacy (QL) assessment to undergraduate students are described. Three forms were developed covering a wide range of skills, contexts, and quantitative information presentation formats. Following item generation and revision based on preliminary testing and cognitive interviewing, a total of 3,701 consented undergraduate students at Michigan State University completed one of the three forms. Two of the forms contained 14 multiple-choice items, and one form contained 17 multiple-choice items. All forms were completed by students in less than 30 minutes. Evidence of validity and reliability was obtained for the three forms. Unidimensionality of the underlying construct was established using confirmatory factor analysis. Correlations with ACT and university mathematics placement test scores ranged from .41 to .67, and correlations with the Lipkus numeracy scale ranged from .40 to .45. Cronbach's alphas for the three forms were near or exceeded .70. Comparison of student QL performance according to demographic characteristics revealed gender differences, with males scoring higher than females. These gender differences persisted even after controlling for ACT composite scores. Race/ethnicity differences were significant in unadjusted analysis, but did not persist over and above ACT composite scores in the adjusted analyses. The three newly developed forms of QL assessment will need to be further tested in the future to determine if they capture the effects of interventions that aim to improve QL.
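The reliability statistic reported in this abstract, Cronbach's alpha, can be computed from item-level scores as alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores). The abstract does not include its analysis code; the sketch below is a minimal illustration of that formula in plain Python, using a small set of hypothetical 0/1 responses (not data from the study).

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a set of respondents' item scores.

    scores: one row per respondent; each row is a list of item scores
    (0/1 for multiple-choice items, as on the QL forms described above).
    """
    k = len(scores[0])                       # number of items
    items = list(zip(*scores))               # transpose: one tuple per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical responses from four students on a 3-item form
data = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(round(cronbach_alpha(data), 3))  # → 0.75
```

For dichotomous items such as these, this formula reduces to the KR-20 coefficient; values at or above roughly .70, as reported for the three QL forms, are conventionally taken to indicate acceptable internal consistency.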
The American Statistician | 2015
Allison Amanda Moore; Jennifer J. Kaplan
Program assessment is used by institutions and/or departments to prompt conversations about the status of student learning and make informed decisions about educational programs. It is also typically required by accreditation agencies, such as the Southern Association of Colleges and Schools (SACS) or the Western Association of Schools & Colleges (WASC). The cyclic assessment process includes four steps: establishing student learning outcomes, deciding on assessment methods, collecting and analyzing data, and reflecting on the results. The theory behind the choice of assessment methods and the use of rubrics in assessment is discussed. A description of the experiences of a Department of Statistics at a large research university during their process of developing an assessment plan for the undergraduate statistics major is provided. The article concludes with the lessons learned by the department as they completed the assessment development process. Supplementary materials for this article are available online. [Received December 2014. Revised July 2015]
Numeracy | 2018
Jennifer J. Kaplan; Neal Rogness
Instructional inattention to language poses a barrier for students in entry-level science courses, in part because students may perceive a subject as difficult solely based on a lack of understanding of the vocabulary. In addition, the technical use of terms that have different everyday meanings may cause students to misinterpret statements made by instructors, leading to an incomplete or incorrect understanding of the domain. Terms that have different technical and everyday meanings are said to have lexical ambiguity, and statistics, as a discipline, has many lexically ambiguous terms. This paper presents a cyclic process for designing activities to address lexical ambiguity in statistics. In addition, it describes three short activities designed to have high impact on student learning associated with two different lexically ambiguous words or word pairs in statistics. Preliminary student-level data are used to assess the efficacy of the activities, and future directions for the development of activities and for research about lexical ambiguity in statistics in particular, and STEM in general, are discussed.
Teaching Statistics | 2012
Jennifer J. Kaplan; Neal Rogness; Diane G. Fisher
Statistics Education Research Journal | 2014
Jennifer J. Kaplan; Neal Rogness; Diane G. Fisher