Robert C. delMas
University of Minnesota
Publication
Featured research published by Robert C. delMas.
Journal of Statistics Education | 2002
Robert C. delMas
Similarities and differences in the articles by Rumsey, Garfield, and Chance are summarized. An alternative perspective on the distinction between statistical literacy, reasoning, and thinking is presented. Based on this perspective, an example is provided to illustrate how literacy, reasoning, and thinking can be promoted within a single topic of instruction. Additional examples of assessment items are offered. I conclude with implications for statistics education research that stem from the incorporation of recommendations made by Rumsey, Garfield, and Chance into classroom practice.
TESOL Quarterly | 2006
Martha Bigelow; Robert C. delMas; Kit Hansen; Elaine Tarone
In this exploratory study, we examine the role of literacy in the acquisition of second-language (L2) oral skills through a partial replication of Jenefer Philp's (2003) study of recasts in native speaker (NS)-non-native speaker (NNS) interaction. The principal research question was the following: Is the ability to recall a recast related to the learner's alphabetic print literacy level? The participants in the study were eight first-language (L1) speakers of Somali with limited formal schooling, who were grouped according to scores on L1 and L2 literacy measures. Procedures involved interactive tasks in which participants received and recalled recasts on their grammatically incorrect interrogative sentences. Unlike Philp's more educated participants, our overall less educated participants showed no significant effects for recast length or, as a group, for number of changes in the recasts. This suggests that findings on the oral L2 processing of more educated L2 learners may not hold for the oral L2 processing of less educated learners. Within our less educated population, the more literate group recalled all recasts significantly better than the less literate group when correct and modified recalls were combined. Literacy level was also significantly related to the ability to recall recasts with two or more (2+) changes, with the more literate group doing better than the less literate group. Theoretical implications of these findings are discussed.
Journal of Career Development | 1998
Shari L. Peterson; Robert C. delMas
Following the successful application of Bandura's (1977) theory of self-efficacy to the treatment of career indecision (Betz & Hackett, 1981; Hackett & Betz, 1981), Taylor and Betz (1983) developed the Career Decision-Making Self-Efficacy (CDMSE) scale. The CDMSE identifies the extent to which students have confidence (self-efficacy) in their ability to engage in educational and occupational planning and decision-making. The Taylor and Betz (1983) study identified five subscales: Self-Appraisal, Occupational Information, Goal Selection, Planning, and Problem-Solving, each consisting of ten items. Using principal components factor analysis with varimax rotation, the total variance accounted for by factor scores of the five subscales equaled 52%. However, because most of the items had relatively large loadings on more than one factor, the structure was not clear-cut, and the scale may therefore represent a single, rather large general factor. Research has provided support for the validity of the CDMSE instrument (Betz & Hackett, 1986; Robbins, 1985; Taylor & Popma, 1990), and the concept has been widely adapted (Brown, Lent, & Larkin, 1989; Lent, Brown, & Larkin, 1984, 1986, 1987; Lent, Larkin, & Brown, 1989; Nevill & Schlecker, 1988; Peterson, 1993a, 1993b; Rotberg, Brown, & Ware, 1987; Shelton, 1990; Stumpf & Brief, 1987). However, the nature of career decision-making self-efficacy as a construct is still being explored.
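For readers unfamiliar with the analysis mentioned above, the following is a minimal sketch, on simulated data, of a principal components analysis of a 50-item scale followed by a varimax rotation of five retained components. It is illustrative only: the item structure, sample size, and loadings are invented, not the CDMSE data or the authors' analysis.

```python
# Illustrative sketch only: simulated responses, not the CDMSE data.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 300, 50, 5

# Simulate a rough five-factor structure (ten items per hypothetical subscale).
true_loadings = np.zeros((n_items, n_factors))
for f in range(n_factors):
    true_loadings[f * 10:(f + 1) * 10, f] = rng.uniform(0.5, 0.8, 10)
scores = rng.normal(size=(n_respondents, n_factors))
X = scores @ true_loadings.T + rng.normal(scale=0.7, size=(n_respondents, n_items))

# Principal components on the item correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

# Proportion of total variance captured by the five retained components
# (the analogue of the 52% figure reported for the CDMSE).
print("variance explained:", round(eigvals[:n_factors].sum() / n_items, 2))

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix (standard iterative SVD algorithm)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (LR**3 - (gamma / p) * LR @ np.diag((LR**2).sum(axis=0))))
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return L @ R

rotated = varimax(loadings)
# Items with large loadings on more than one rotated component would suggest
# the "not clear-cut" structure described above.
```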
Archive | 2007
Joan Garfield; Robert C. delMas; Beth Chance
Recent science reform efforts and standards documents advocate that students develop scientific inquiry practices, such as the construction and communication of scientific explanations. This paper focuses on 7th-grade students' scientific explanations during the enactment of a project-based chemistry unit where the construction of scientific explanations is a key learning goal. During the unit, we make the explanation framework explicit to students and include supports, or scaffolds, in both the student and teacher materials to facilitate students' understanding and construction of scientific explanations. Results from the enactment show significant learning gains for students on all components of scientific explanation (i.e., claim, evidence, and reasoning). Although students' explanations were stronger at the end of the instructional unit, we also found that students still had difficulty differentiating between appropriate and inappropriate evidence for some assessment tasks. We conjecture that students' ability to use appropriate data as evidence depends on the wording of the assessment task, students' content knowledge, and their understanding of what counts as evidence. Having students construct scientific explanations can be an important tool to help make students' thinking visible for both researchers and teachers.
Middle School Students' Use of Appropriate and Inappropriate Evidence in Writing Scientific Explanations
The National Research Council (1996) and the American Association for the Advancement of Science (1993) call for scientific literacy for all. All students need knowledge of scientific concepts and inquiry practices required for personal decision making, participation in societal and cultural affairs, and economic productivity. Science education should support students' development toward competent participation in a science-infused world (McGinn & Roth, 1999). This type of participation should be obtainable for all students, not just those who are educated for scientific professions. Consequently, we are interested in supporting all students in learning scientific concepts and inquiry practices. By scientific inquiry practices, we mean the multiple ways of knowing that scientists use to study the natural world (National Research Council, 1996). Key scientific inquiry practices called for by national standards documents include asking questions, designing experiments, analyzing data, and constructing explanations (American Association for the Advancement of Science, 1993; National Research Council, 1996). In this study, we focus on analyzing data and constructing explanations. These practices are essential not only for scientists but for all individuals. On a daily basis, individuals need to evaluate scientific data provided to them in written form, such as newspapers and magazines, as well as spoken through television and radio. Citizens need to be able to evaluate those data to determine whether the claims being made based on the data and reasoning are valid. This type of data evaluation, like other scientific inquiry practices, depends both on a general understanding of how to evaluate data and on an understanding of the science content. In this study we explore when students use appropriate evidence and when they use inappropriate evidence to support their claims.
Our work focuses on an 8-week project-based chemistry curriculum designed to support 7th-grade students in using evidence and constructing scientific explanations. We examine the characteristics of these students' explanations, their understanding of the content knowledge, and the assessment tasks to unpack what may be influencing students' use of evidence.
Our Instructional Model for Scientific Explanations
In our work, we examine how students construct scientific explanations using evidence. We use a specific instructional model for evidence-based scientific explanations as a tool for both classroom practice and research. We provide both teachers and students with this model to make the typically implicit framework of explanation explicit to both groups. Our instructional model for scientific explanation uses an adapted version of Toulmin's (1958) model of argumentation and builds on previous science educators' research on students' construction of scientific explanations and arguments (Bell & Linn, 2000; Jiménez-Aleixandre, Rodríguez, & Duschl, 2000; Lee & Songer, 2004; Sandoval, 2003; Zembal-Saul et al., 2002). Our explanation framework includes three components: a claim (similar to Toulmin's claim), evidence (similar to Toulmin's data), and reasoning (a combination of Toulmin's warrants and backing). The claim makes an assertion or conclusion that addresses the original question or problem. The evidence supports the student's claim using scientific data. These data can come from an investigation that students complete or from another source, such as observations, reading material, or archived data. The data need to be both appropriate and sufficient to support the claim. Appropriate data are relevant to the question or problem and relate to the given claim. Data are sufficient when they include the quantity necessary to convince someone of a claim. The reasoning is a justification that links the claim and evidence and shows why the data count as evidence to support the claim by using the appropriate scientific principles.
Kuhn (1993) argues that argument, or in our case scientific explanation, is a form of thinking that transcends the particular content to which it refers. Students can construct scientific explanations across different content areas. Although an explanation model, such as Toulmin's, can be used to assess the structure of an explanation, it cannot determine the scientific accuracy of the explanation (Driver, Newton & Osborne, 2000). Instead, both the domain-general explanation framework and the domain-specific context of the assessment task determine the correctness of the explanation. Consequently, in both teaching students about explanation and assessing students' construction of explanations, we embed the scientific inquiry practice in a specific context.
Student Difficulties Constructing Explanations
Prior research in science classrooms suggests that students have difficulty constructing high-quality scientific explanations in which they articulate and defend their claims (Sadler, 2004). For example, students have difficulty understanding what counts as evidence (Sadler, 2004) and using appropriate evidence (Sandoval, 2003; Sandoval & Reiser, 1997). Instead, students will draw on data that do not support their claim. Consequently, we are interested in whether students use appropriate evidence to support their claim or whether they draw on evidence that is not relevant.
Students' claims also do not necessarily relate to their evidence. Instead, students often rely on their personal views instead of evidence to draw conclusions (Hogan & Maglienti, 2001). Students have a particularly difficult time reasoning from primary data, especially when measurement error plays an important role (Kanari & Millar, 2004). Students can recognize variation in data and use characteristics of data in their reasoning, but their ability to draw final conclusions from those data can depend on the context. Masnick, Klahr, and Morris (this volume) concluded that young students who poorly understood the context of the investigation had difficulty interpreting data, particularly when the interpretation of those data contradicted their prior beliefs. Students will likely discount data if the data contradict their current theory (Chinn & Brewer, 2001), and they will only consider data if they can come up with a mechanism for the pattern of data (Koslowski, 1996). When students evaluate data, more general reasoning strategies interact with domain-specific knowledge (Chinn & Brewer, 2001). Whether students use appropriate or inappropriate evidence may depend on their prior understanding of a particular content area or task.
Students also have difficulty providing the backing, or what we refer to as reasoning, for why they chose the evidence (Bell & Linn, 2000) in their written explanations. Other researchers have shown that classroom discussions tend to be dominated by claims with little backing to support them (Jiménez-Aleixandre, Rodríguez & Duschl, 2000). Our previous work supports these ideas. We found that middle school students had the most difficulty with the reasoning component of scientific explanations (McNeill, Lizotte, Krajcik & Marx, in review; McNeill et al., 2003). Although students' reasoning improved over the course of the 6-8 week instructional unit, it was consistently of lower quality than their claims or evidence. Students' reasoning often just linked their claim and evidence and less frequently articulated the scientific principles that allowed them to make that connection.
Similar to students' ability to evaluate and use data, providing accurate reasoning is related to students' understanding of the content. Students with stronger content knowledge provide stronger reasoning in their scientific explanations (McNeill et al., in review). Previous research with students has found that their success at completing scientific inquiry practices is highly dependent on their understanding of both the content and the scientific inquiry practices (Metz, 2000). Both domain-specific and general reasoning are essential for students' effective evaluation of data and construction of scientific explanations. Although previous work has shown that students have difficulty with components of scientific explanations, there has been little research unpacking exactly when …
Journal of Statistics Education | 2012
Andrew Zieffler; Jiyoon Park; Joan Garfield; Robert C. delMas; Audbjorg Bjornsdottir
This paper reports on an instrument designed to assess the practices and beliefs of instructors of introductory statistics courses across the disciplines. Funded by a grant from the National Science Foundation, this project developed, piloted, and gathered validity evidence for the Statistics Teaching Inventory (STI). The instrument consists of 50 items in six parts and is administered online. The development of the instrument and the gathering and analysis of validity evidence are described. Plans and suggestions for use of the STI are offered.
Journal of College Student Retention: Research, Theory and Practice | 2001
Shari L. Peterson; Robert C. delMas
A path model was constructed mapping the effect of career decision-making self-efficacy (CDMSE) and degree utility on persistence of underprepared college students. The path model accounted for 21 percent of the variance in intent to persist and 27 percent of the variance in student persistence. The final structural model adds to the literature on student persistence in several ways. First, it suggests the importance of Degree Utility for this population: Students who believed college would provide employment opportunities and better careers were more likely to persist. Second, it confirms that CDMSE has a direct effect on social and academic integration and an indirect effect on persistence. Implications for research include developing and testing interventions to enhance CDMSE. Implications for practice include providing career counseling and advising that identifies the connection between employment opportunities and academic course, program, or degree completion and engaging in practices that increase CDMSE.
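As a rough illustration of the kind of path model described above, the sketch below estimates path coefficients through a sequence of least-squares regressions on simulated data. The variable names (cdmse, degree_utility, integration, intent, persist), the assumed structure, and the treatment of persistence as a continuous outcome are all hypothetical simplifications, not the study's model or data.

```python
# Minimal sketch of path-coefficient estimation via sequential regressions,
# using simulated data and hypothetical variable names (not the study's data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
cdmse = rng.normal(size=n)
degree_utility = rng.normal(size=n)
# Assumed structure: CDMSE -> integration -> intent -> persistence,
# with degree utility also feeding intent to persist.
integration = 0.5 * cdmse + rng.normal(size=n)
intent = 0.3 * integration + 0.4 * degree_utility + rng.normal(size=n)
persist = 0.5 * intent + rng.normal(size=n)
df = pd.DataFrame(dict(cdmse=cdmse, degree_utility=degree_utility,
                       integration=integration, intent=intent, persist=persist))

# One regression per endogenous variable; coefficients play the role of path weights.
m_integration = smf.ols("integration ~ cdmse", data=df).fit()
m_intent = smf.ols("intent ~ integration + degree_utility", data=df).fit()
m_persist = smf.ols("persist ~ intent", data=df).fit()

# R-squared values are analogous to the "variance accounted for" figures
# reported for intent to persist and persistence.
print(round(m_intent.rsquared, 2), round(m_persist.rsquared, 2))

# Indirect effect of CDMSE on persistence = product of the path coefficients.
indirect = (m_integration.params["cdmse"] *
            m_intent.params["integration"] *
            m_persist.params["intent"])
print("indirect effect of CDMSE on persistence:", round(indirect, 3))
```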
Learning, Media and Technology | 2012
Hosun Kang; Mary Lundeberg; Bjørn H. K. Wolter; Robert C. delMas; Clyde Freeman Herreid
This study investigated gender differences in science learning between two pedagogical approaches: traditional lecture and narrative case studies using personal response systems (‘clickers’). Thirteen instructors of introductory biology classes at 12 different institutions across the USA and Canada used two types of pedagogy (Clicker Cases and traditional lecture) to teach eight topic areas. Three different sets of multiple regression analysis were conducted for three separate dependent variables: posttest score, change in score from posttest to final, and transfer score. Interactions between gender and pedagogical approach were found across the three analyses. Women either performed better with Clicker Cases, or about the same with either instructional method, but men performed markedly better with lectures in most topic areas. Our results suggest that men and women experience two pedagogical approaches—Clicker Cases and lectures—differently, and that Clicker Cases are more favorable for women than for men.
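The sketch below shows, on simulated data, how a gender-by-pedagogy interaction of the kind reported above can be tested in a multiple regression. The variable names, the crossover pattern built into the simulation, and the single-model specification are assumptions for illustration, not the authors' actual analyses.

```python
# Sketch of testing a gender-by-pedagogy interaction in multiple regression.
# Simulated data and hypothetical variable names only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 600
gender = rng.choice(["female", "male"], size=n)
pedagogy = rng.choice(["clicker_case", "lecture"], size=n)
pretest = rng.normal(50, 10, size=n)

# Build in a crossover pattern: men gain more under lecture, women do as well
# or better under Clicker Cases (mirroring the direction of the reported result).
boost = np.where((gender == "male") & (pedagogy == "lecture"), 5.0,
        np.where((gender == "female") & (pedagogy == "clicker_case"), 2.0, 0.0))
posttest = 0.6 * pretest + boost + rng.normal(0, 8, size=n)
df = pd.DataFrame(dict(gender=gender, pedagogy=pedagogy,
                       pretest=pretest, posttest=posttest))

# The gender:pedagogy term carries the interaction of interest.
model = smf.ols("posttest ~ pretest + gender * pedagogy", data=df).fit()
print(model.summary().tables[1])
```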
Erlbaum | 2007
Robert C. delMas; Yan Liu
Archive | 2014
Robert C. delMas; Joan Garfield; Andrew Zieffler
This chapter describes the development of students’ thinking as they experienced an innovative introductory statistics curriculum that replaced traditional content and methods with an approach based on simulation and resampling. The methods employed in the curriculum were based on a framework for inference that had students specify a chance model, draw repeated samples of simulated data, create a distribution of summary measures, and use the distribution to evaluate a claim. Students used TinkerPlots™ software to resample simulated data from chance processes and models, as well as to explore the distribution of summary measures. The software incorporates many features of a "Monte Carlo Workbench" (see Biehler, 1997a) that allows students to visualize the entire modeling process. Problem-solving interviews were conducted with five students after five weeks of the curriculum. These interviews revealed that students were beginning to develop an understanding of important concepts underlying the process of statistical inference. The results suggest that students are able to create and use appropriate chance models and simulations to draw statistical inferences after only a few weeks of instruction in an introductory course. The interviews also suggest that TinkerPlots™ provides students with a memorable, visual medium to support the development of their thinking and reasoning.
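The curriculum itself used TinkerPlots™, but the four-step inference framework it is built on (specify a chance model, draw repeated samples of simulated data, build a distribution of summary measures, use that distribution to evaluate a claim) can be sketched in a few lines of code. The scenario below (60 heads in 100 flips of a possibly fair coin) is invented purely to illustrate the framework.

```python
# Minimal sketch of the simulation-based inference framework described above:
# chance model -> repeated simulated samples -> distribution of a summary
# measure -> evaluate a claim. The example scenario is invented.
import numpy as np

rng = np.random.default_rng(3)

observed_heads = 60          # hypothetical observed result
n_flips = 100
n_trials = 10_000

# 1. Specify a chance model: a fair coin.
# 2. Draw repeated samples of simulated data under that model.
simulated_heads = rng.binomial(n=n_flips, p=0.5, size=n_trials)

# 3. The simulated summary measures form the reference distribution.
# 4. Evaluate the claim: how often does the chance model alone produce a result
#    at least as extreme as the observed one?
p_value = np.mean(simulated_heads >= observed_heads)
print(f"Proportion of simulated trials with >= {observed_heads} heads: {p_value:.3f}")
```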
Exceptional Children | 2017
Pyung-Gang Jung; Kristen L. McMaster; Robert C. delMas
We examined effects of research-based early writing intervention delivered within a data-based instruction (DBI) framework for children with intensive needs. We randomly assigned 46 students with and without disabilities in Grades 1 to 3 within classrooms to either treatment or control. Treatment students received research-based early writing intervention within a DBI framework for 30 min, 3 times per week, for 12 weeks. Control students received business-as-usual writing instruction. We measured writing performance using curriculum-based measures (CBM) and the Woodcock-Johnson III Tests of Achievement (WJ III). We found significant treatment effects on CBM outcomes (Hedges' g = 0.74 to 1.36). We also found a significant interaction between special education status and condition on the WJ III favoring treatment students with disabilities (Hedges' g = 0.45 to 0.70). Findings provide preliminary support for using a combination of research-based intervention and DBI with students with intensive writing needs.
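For readers less familiar with the effect sizes reported above, the sketch below shows the standard computation of Hedges' g (a standardized mean difference scaled by a small-sample correction). The group means, standard deviations, and sample sizes in the example call are invented, not values from the study.

```python
# Sketch of computing Hedges' g (standardized mean difference with a
# small-sample correction); the example group statistics are invented.
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g: Cohen's d scaled by the small-sample correction factor J."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    df = n_t + n_c - 2
    j = 1 - 3 / (4 * df - 1)      # Hedges' correction for small samples
    return d * j

# Hypothetical CBM-style scores for a treatment and a control group.
print(round(hedges_g(mean_t=24.0, mean_c=18.5, sd_t=7.0, sd_c=7.5, n_t=23, n_c=23), 2))
```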