Publication


Featured research published by Marsha C. Lovett.


Cognitive Psychology | 1996

History of Success and Current Context in Problem Solving. Combined Influences on Operator Selection.

Marsha C. Lovett; John R. Anderson

Problem solvers often have multiple operators available to them but must select just one to apply. We present three experiments that demonstrate that solvers use at least two sources of information to make operator selections in the building sticks task (BST): information from their past history of using the operators and information from the current context of the problem. Specifically, problem solvers are more likely to use an operator the more successful it has been in the past and the closer it takes the current state to the goal state. These two effects, respectively, represent the learning and performance processes that influence solvers’ operator selections. A computational model of BST problem solving, developed within the ACT-R theory (Anderson, 1993), provides the unifying framework in which both types of processes can be integrated to predict solvers’ selection tendencies.
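As a rough illustration of these two influences, the sketch below (not the paper's ACT-R model; the function name, weights, and numbers are invented for illustration) scores each candidate operator by combining its past success rate with how much closer applying it would bring the current state to the goal:

def operator_score(successes, attempts, current_gap, resulting_gap,
                   w_history=0.5, w_context=0.5):
    # History: how often this operator has succeeded in the past
    success_rate = successes / attempts if attempts else 0.5  # uninformed prior
    # Context: proportional reduction in distance to the goal if applied now
    progress = (current_gap - resulting_gap) / current_gap if current_gap else 0.0
    return w_history * success_rate + w_context * progress

# Example: two hypothetical BST operators evaluated in the same state
scores = {
    "undershoot": operator_score(successes=6, attempts=10, current_gap=20, resulting_gap=5),
    "overshoot":  operator_score(successes=3, attempts=10, current_gap=20, resulting_gap=12),
}
preferred = max(scores, key=scores.get)  # "undershoot": better history and smaller remaining gap

Under these made-up numbers the "undershoot" operator is preferred because it has both the better history and the greater progress toward the goal; the actual model derives such preferences from ACT-R's learning and conflict-resolution mechanisms rather than fixed weights.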


Journal of Interactive Media in Education | 2008

The Open Learning Initiative: Measuring the Effectiveness of the OLI Statistics Course in Accelerating Student Learning.

Marsha C. Lovett; Oded Meyer; Candace Thille

The Open Learning Initiative (OLI) is an open educational resources project at Carnegie Mellon University that began in 2002 with a grant from The William and Flora Hewlett Foundation. OLI creates web-based courses that are designed so that students can learn effectively without an instructor. In addition, the courses are often used by instructors to support and complement face-to-face classroom instruction. Our evaluation efforts have investigated OLI courses’ effectiveness in both of these instructional modes – stand-alone and hybrid. This report documents several learning effectiveness studies that were focused on the OLI-Statistics course and conducted during Fall 2005, Spring 2006, and Spring 2007. During the Fall 2005 and Spring 2006 studies, we collected empirical data about the instructional effectiveness of the OLI-Statistics course in stand-alone mode, as compared to traditional instruction. In both of these studies, in-class exam scores showed no significant difference between students in the stand-alone OLI-Statistics course and students in the traditional instructor-led course. In contrast, during the Spring 2007 study, we explored an accelerated learning hypothesis, namely, that learners using the OLI course in hybrid mode will learn the same amount of material in a significantly shorter period of time, with equal learning gains, as compared to students in traditional instruction. In this study, results showed that OLI-Statistics students learned a full semester’s worth of material in half as much time and performed as well or better than students learning from traditional instruction over a full semester. Editor: Stephen Godwin (Open University, UK). Reviewers: Tim de Jong (Open University, NL), Elia Tomadaki (Open University, UK), and Stephen Godwin (Open University, UK). Interactive elements: A demonstration of the StatTutor statistics tutorial is available for playback from http://jime.open.ac.uk/2008/14/stattutor_tour/. The demonstration is in Flash format.


The American Statistician | 2000

Applying Cognitive Theory to Statistics Instruction

Marsha C. Lovett; Joel B. Greenhouse

This article presents five principles of learning, derived from cognitive theory and supported by empirical results in cognitive psychology. To bridge the gap between theory and practice, each of these principles is transformed into a practical guideline and exemplified in a real teaching context. It is argued that this approach of putting cognitive theory into practice can offer several benefits to statistics education: a means for explaining and understanding why reform efforts work; a set of guidelines that can help instructors make well-informed design decisions when implementing these reforms; and a framework for generating new and effective instructional innovations.


Archive | 2007

Thinking with data

Marsha C. Lovett; Priti Shah

Recent science reform efforts and standards documents advocate that students develop scientific inquiry practices, such as the construction and communication of scientific explanations. This chapter focuses on 7th grade students’ scientific explanations during the enactment of a project-based chemistry unit where the construction of scientific explanations is a key learning goal. During the unit, we make the explanation framework explicit to students and include supports or scaffolds in both the student and teacher materials to facilitate students’ understanding and construction of scientific explanations. Results from the enactment show significant learning gains for students on all components of scientific explanation (i.e., claim, evidence, and reasoning). Although students’ explanations were stronger at the end of the instructional unit, we also found that students still had difficulty differentiating between appropriate and inappropriate evidence for some assessment tasks. We conjecture that students’ ability to use appropriate data as evidence depends on the wording of the assessment task, students’ content knowledge, and their understanding of what counts as evidence. Having students construct scientific explanations can be an important tool to help make students’ thinking visible for both researchers and teachers.


Journal of Experimental Psychology: Learning, Memory, and Cognition | 1994

Effects of solving related proofs on memory and transfer in geometry problem solving.

Marsha C. Lovett; John R. Anderson

Three experiments investigate the relationship between memory and problem solving in the domain of geometry theorem proving. In Experiment 1, Ss’ memories for an original problem-solving episode were interfered with retroactively by solving a 2nd problem that had the same diagram, but no memory effects were observed that depended on the 2nd problem’s logical similarity to the original. Results suggest that the diagram is the basis for geometry problem-solving memories. Experiments 2 and 3 investigated problem-solving memories in use by examining Ss’ transfer to a 3rd (test) problem. As with the memory results, transfer was reduced when the first 2 problems had the same diagram relative to when they had 2 different diagrams. Transfer was reduced most in the condition with the greatest proportion of memory-interfering steps. Results suggest that the structure and quality of problem-solving memories affect problem-solving transfer.


Cognitive, Affective, & Behavioral Neuroscience | 2009

Neural correlates of arithmetic calculation strategies

Miriam Rosenberg-Lee; Marsha C. Lovett; John R. Anderson

Recent research into math cognition has identified areas of the brain that are involved in number processing (Dehaene, Piazza, Pinel, & Cohen, 2003) and complex problem solving (Anderson, 2007). Much of this research assumes that participants use a single strategy; yet, behavioral research finds that people use a variety of strategies (LeFevre et al., 1996; Siegler, 1987; Siegler & Lemaire, 1997). In the present study, we examined cortical activation as a function of two different calculation strategies for mentally solving multidigit multiplication problems. The school strategy, equivalent to long multiplication, involves working from right to left. The expert strategy, used by “lightning” mental calculators (Staszewski, 1988), proceeds from left to right. The two strategies require essentially the same calculations, but have different working memory demands (the school strategy incurs greater demands). The school strategy produced significantly greater early activity in areas involved in attentional aspects of number processing (posterior superior parietal lobule, PSPL) and mental representation (posterior parietal cortex, PPC), but not in a numerical magnitude area (horizontal intraparietal sulcus, HIPS) or a semantic memory retrieval area (lateral inferior prefrontal cortex, LIPFC). An ACT-R model of the task successfully predicted BOLD responses in PPC and LIPFC, as well as in PSPL and HIPS.
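The contrast between the two strategies can be made concrete with a small sketch (purely illustrative; the example problem, digit bookkeeping, and working-memory measure below are assumptions, not the study's materials or model):

def school_right_to_left(a, b):
    # Long multiplication: compute every partial product first, then add them at the end,
    # so all partial-product digits must be held until the final sum.
    partials = []
    for power, digit in enumerate(str(b)[::-1]):          # ones digit first
        partials.append(a * int(digit) * 10 ** power)
    held_digits = sum(len(str(p)) for p in partials)
    return sum(partials), held_digits

def expert_left_to_right(a, b):
    # "Lightning calculator" style: maintain a single running total, starting from the
    # most significant digit, so only that total needs to be held at any point.
    total, held_digits = 0, 0
    for power, digit in enumerate(str(b)):                # leading digit first
        place = len(str(b)) - 1 - power
        total += a * int(digit) * 10 ** place
        held_digits = max(held_digits, len(str(total)))
    return total, held_digits

print(school_right_to_left(47, 68))   # (3196, 7): same answer, more digits held
print(expert_left_to_right(47, 68))   # (3196, 4)

Both routines produce the identical product; only the order of operations, and hence this crude digit-count proxy for working-memory load, differs, which is the contrast manipulated in the imaging study.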


Memory & Cognition | 2001

Awareness and working memory in strategy adaptivity

Christian D. Schunn; Marsha C. Lovett; Lynne M. Reder

To further the understanding of the mechanisms of strategy choice, in three experiments, we investigate the role of explicit awareness and working memory in strategy adaptivity. Experiment 1 provided correlational evidence that individual differences in strategy adaptivity to changing base rates are related to individual differences in awareness of those changes but appear not to be related to individual differences in working memory capacity. Experiment 2 replicated the role of awareness, and the results suggest that awareness at the time of the base-rate change, rather than afterwards, is related to increased strategy adaptivity. Experiment 3 measured working memory capacity using a different procedure and manipulated working memory load with a dual-task procedure; again, no apparent role of working memory capacity in strategy adaptivity was found. This juxtaposition of findings presents a challenge for existing models of strategy choice.


Intelligent Tutoring Systems | 1998

Cognitive Task Analysis in Service of Intelligent Tutoring System Design: A Case Study in Statistics

Marsha C. Lovett

Cognitive task analysis involves identifying the components of a task that are required for adequate performance. It is thus an important step in ITS design because it circumscribes the curriculum to be taught and provides a decomposition of that curriculum into the knowledge and subskills students must learn. This paper describes several different kinds of cognitive task analysis and organizes them according to a taxonomy of theoretical/empirical × prescriptive/descriptive approaches. Examples are drawn from the analysis of a particular statistical reasoning task. The discussion centers on how different approaches to task analysis provide different perspectives on the decomposition of a complex skill and compares these approaches to more traditional methods.
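As a sketch of what such a decomposition might look like when written down (the subskills below are hypothetical examples for a statistics task, not the paper's actual analysis), the curriculum can be represented as a nested mapping from skills to subskills:

# Hypothetical decomposition of a data-analysis skill into teachable subskills.
task_decomposition = {
    "analyze a data set to answer a question": {
        "identify the relevant variables": [
            "classify each variable as categorical or quantitative",
        ],
        "select an appropriate display or summary": [
            "map the variable types to candidate displays",
        ],
        "interpret the output": [
            "describe the pattern in the display",
            "relate the pattern back to the original question",
        ],
    },
}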


Cognitive Systems Research | 2002

Modeling selective attention: Not just another model of Stroop (NJAMOS)

Marsha C. Lovett

The Stroop effect has been studied for more than sixty years, and yet it still defies a complete theoretical account. The model presented here offers a new approach that integrates several explanations of the Stroop phenomenon into a hybrid model. Because this model is built within the ACT-R cognitive architecture (Anderson & Lebiere, 1998), it applies a generic, pre-specified set of mechanisms for learning and performance to the particulars of the Stroop task. Besides fitting a variety of already published experimental results, the model offers the potential to capture strategic variation in what is typically considered a low-level attentional phenomenon.


Behavioral and Brain Sciences | 1998

ACT-R: A higher-level account of processing capacity

John R. Anderson; Christian Lebiere; Marsha C. Lovett; Lynne M. Reder

We present an account of processing capacity in the ACT-R theory. At the symbolic level, the number of chunks in the current goal provides a measure of relational complexity. At the subsymbolic level, limits on spreading activation, measured by the attentional parameter W, provide a theory of processing capacity, which has been applied to performance, learning and individual differences data.
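For reference, this capacity account rests on ACT-R's activation equation (as in Anderson & Lebiere, 1998; shown schematically here), in which the fixed source activation $W$ is divided among the $n$ chunks of the current goal:

$$A_i = B_i + \sum_{j=1}^{n} W_j\, S_{ji}, \qquad W_j = \frac{W}{n}$$

As the number of goal elements $n$ (relational complexity) grows, each source contributes less activation to retrieving relevant chunks, which is how a fixed $W$ yields capacity limits in performance, learning, and individual differences.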

Collaboration


Dive into Marsha C. Lovett's collaborations.

Top Co-Authors

John R. Anderson (Carnegie Mellon University)
Lynne M. Reder (Carnegie Mellon University)
Christian Lebiere (Carnegie Mellon University)
Andreas Karatsolis (Massachusetts Institute of Technology)
Kevin A. Gluck (Carnegie Mellon University)
Stacie Rohrbach (Carnegie Mellon University)
Suguru Ishizaki (Carnegie Mellon University)