Publication


Featured research published by Taylor Martin.


Annals of Biomedical Engineering | 2007

Comparison of Student Learning in Challenge-based and Traditional Instruction in Biomedical Engineering

Taylor Martin; Stephanie Rivale; Kenneth R. Diller

This paper presents the results of a study comparing student learning in an inquiry-based and a traditional course in biotransport. Collaborating learning scientists and biomedical engineers designed and implemented an inquiry-based method of instruction that followed learning principles presented in the National Research Council report “How People Learn” (HPL). In this study, the intervention group was taught a core biomedical engineering course in biotransport following the HPL method. The control group was taught by traditional didactic lecture methods. A primary objective of the study was to identify instructional methods that facilitate the early development of adaptive expertise (AE). AE requires a combination of two types of engineering skills: subject knowledge and the ability to think innovatively in new contexts. Therefore, student learning in biotransport was measured in two dimensions: a pre- and post-test measured knowledge acquisition in the domain and the development of innovative problem-solving abilities. HPL and traditional students’ test scores were compared. Results show that HPL and traditional students made equivalent knowledge gains, but that HPL students demonstrated significantly greater improvement in innovative thinking abilities. We discuss these results in terms of their implications for improving undergraduate engineering education.


The Journal of the Learning Sciences | 2013

Learning Analytics and Computational Techniques for Detecting and Evaluating Patterns in Learning: An Introduction to the Special Issue

Taylor Martin; Bruce Sherin

The learning sciences community’s interest in learning analytics (LA) has been growing steadily over the past several years. Three recent symposia on the theme (at the American Educational Research Association 2011 and 2012 annual conferences, and the International Conference of the Learning Sciences 2012), organized by Paulo Blikstein, led to the meeting of learning scientists working in this area and ultimately generated the proposal for this special issue. In the two years that we have worked on putting together this special issue, the task of writing an introduction has become both much simpler and significantly more difficult. On the one hand, many of the trends that are driving the increasing attention to LA, such as big data and the cloud, have become so prominent that we can count on readers to have some familiarity with them. Thus, we do not need to start at the beginning in our discussion of LA for the Journal of the Learning Sciences (JLS) audience. On the other hand, the scope of the field and the potential applications have grown tremendously in this short time. The result is that, if anything, we have fallen further behind. Although the educational data mining and LA communities have produced a steady stream of interesting results, work in education has far to go in


International Conference on Design of Communication | 2015

HART: the human affect recording tool

Jaclyn Ocumpaugh; Ryan S. Baker; Ma. Mercedes T. Rodrigo; Aatish Salvi; Ani Aghababyan; Taylor Martin

This paper evaluates the Human Affect Recording Tool (HART), a Computer Assisted Direct Observation (CADO) application that facilitates scientific sampling. HART enforces an established method for systematic direct observation in Educational Data Mining (EDM) research, the Baker Rodrigo Ocumpaugh Monitoring Protocol [25] [26]. This examination provides insight into the design of HART for rapid data collection for both formative classroom assessment and educational research. It also discusses the possible extension of these tools to other domains of affective computing and human computer interaction.


The Journal of the Learning Sciences | 2015

Learning Fractions by Splitting: Using Learning Analytics to Illuminate the Development of Mathematical Understanding

Taylor Martin; Carmen Petrick Smith; Nicole Forsgren; Ani Aghababyan; Philip Janisiewicz; Stephanie Baker

The struggle with fraction learning in kindergarten through Grade 12 in the United States is a persistent problem and one of the major stumbling blocks to succeeding in higher mathematics. Research into this problem has identified several areas where students commonly struggle with fractions. While there are many theories of fraction learning, none of the research on these theories employs samples large enough to test theories at scale or nuanced enough to demonstrate how learning unfolds over time during instructional activities based on these theories. The work reported here uses learning analytics methods with fine-grained log data from an online fraction game to unpack how splitting (i.e. partitioning a whole into equal-sized parts) impacts learning. Study 1 demonstrated that playing the game significantly improved students’ fraction understanding. In addition, a cluster analysis suggested that exploring splitting was beneficial. Study 2 replicated the learning results, and a cluster analysis showed that compared to early game play, later game play showed more optimal splitting strategies. In addition, in looking at the types of transitions that were possible between a student’s early cluster categorization and later cluster categorization, we found that some types of transitions were more beneficial for learning than others.


Learning Analytics and Knowledge | 2013

Nanogenetic learning analytics: illuminating student learning pathways in an online fraction game

Taylor Martin; Ani Aghababyan; Jay Pfaffman; Jenna Olsen; Stephanie Baker; Philip Janisiewicz; Rachel S. Phillips; Carmen Petrick Smith

A working understanding of fractions is critical to student success in high school and college math. Therefore, an understanding of the learning pathways that lead students to this working understanding is important for educators to provide optimal learning environments for their students. We propose the use of microgenetic analysis techniques including data mining and visualizations to inform our understanding of the process by which students learn fractions in an online game environment. These techniques help identify important variables and classification algorithms to group students by their learning trajectories.


Journal of Pre-College Engineering Education Research | 2013

Student Learning in Challenge-Based Engineering Curricula

Leema K. Berland; Taylor Martin; Pat Ko; Stephanie Baker Peacock; Jennifer Rudolph; Chris Golubski

In recent years, there has been a demand to teach engineering in high schools, particularly using a challenge-based curriculum. Many of these programs have the dual goals of teaching students the engineering design process (EDP), and teaching to deepen their understanding and ability to apply science and math concepts. Using both quantitative and qualitative methods, this study examines whether a high school design engineering program accomplishes each of the two goals. During the 2010–2011 school year, over 100 students enrolled in the same design engineering course in seven high schools. Evidence of learning and application of the EDP is accomplished by triangulating student interviews with pre-/post-tests of EDP-related questions and a survey of design engineering beliefs. To determine whether students could apply science and math concepts, we examined content test questions to see if students used science and math ideas to justify their engineering work, and triangulated these results with student interviews. The results are mixed, implying that although there is some learning, application is inconsistent.


The Journal of the Learning Sciences | 2003

Representations That Depend on the Environment: Interpretative, Predictive, and Praxis Perspectives on Learning

Daniel L. Schwartz; Taylor Martin

Suchman’s book has stirred debates over the past 15 years that have produced both interesting work and questionable polemics. As one colleague stated, “Discussing Suchman has become something of an indoor sport around here.” Rather than revisit the various arguments, we will export Suchman’s idea that plans, and more generally representations, arise in and depend on situated activity. We will lift this idea from its methodological roots and apply it three times, each time according to a different criterion of social-scientific knowledge. Ideally, this exercise will help us use Suchman’s idea to inform diverse characterizations of learning and instruction. To promote her idea that plans arise in and depend on situated activity, Suchman argues that plans are insufficient to specify situated action, and, therefore, they are not the sole mechanism for producing or regulating that action. Like the legal system, there is no number of laws or plans that can possibly anticipate all the contingencies (which is why we have judges to interpret new contingencies and congress to make new laws). Suchman also raises the attendant question of how people revise their understanding when their knowledge or plans prove inadequate. To continue the analogy, when the laws of a legal system fail because of unanticipated (and therefore unrepresented) contingencies, the laws cannot easily self-correct (hence judges and congress).


Journal of Pre-College Engineering Education Research | 2015

Changes in Teachers' Adaptive Expertise in an Engineering Professional Development Course.

Taylor Martin; Stephanie Baker Peacock; Pat Ko; Jennifer Rudolph

Although the consensus seems to be that high-school-level introductory engineering courses should focus on design, this creates a problem for teacher training. Traditionally, math and science teachers are trained to teach and assess factual knowledge and closed-ended problem-solving techniques specific to a particular discipline, which is unsuited for teaching design skills for open-ended problems that may involve multiple engineering disciplines. Instead, engineering teacher training should use the more fluid framework of adaptive expertise, which values the ability to apply knowledge in innovative ways as well as recall facts and solve problems using conventional techniques. In this study, we examined a 6-week program to train math/science teachers to teach high school design engineering. For each curriculum unit, we had a pre-/post-test to assess the teachers’ factual knowledge and ability to solve typical problems (termed “efficiency”) and their ability to apply their knowledge to reason through open-ended problems (termed “innovation”). In addition, we conducted a pre-/post-test to see whether teachers’ attitudes and beliefs related to adaptive expertise changed over the course of the program.


Archive | 2017

Measuring Computational Thinking Development with the FUN! Tool

Sarah Brasiel; Kevin Close; Soojeong Jeong; Kevin Lawanto; Phil Janisiewicz; Taylor Martin

Computational thinking (CT) has received growing attention recently, with calls for it to be developed in children of all ages. With the creation of K-12 computer science standards by the Computer Science Teacher Association, states such as Massachusetts and Washington are leading the nation in adopting these standards into their school systems. This seems somewhat premature, when there are so few measures of computational thinking or computer programming skills that can be applied easily in a K-12 setting to assess outcomes of such state-wide initiatives. Through funding from the National Science Foundation, we developed an analysis tool to efficiently capture student learning progressions and problem-solving activities while coding in Scratch, a popular visual programming language developed by MIT Media Lab. Our analysis tool, the Functional Understanding Navigator! or FUN! tool, addresses the need to automate processes to help researchers efficiently clean, analyze, and present data. We share our experiences using the tool with Scratch data collected from three different week-long summer Scratch Camps with students in grades 5 to 8. Based on our preliminary analyses, we share important considerations for researchers interested in educational data mining and learning analytics in the area of assessing computational thinking. We also provide links to the publicly available FUN! tool and encourage others to participate in a community developing new measures of computational thinking and computer programming.


Learning at Scale | 2016

Macro Data for Micro Learning: Developing the FUN! Tool for Automated Assessment of Learning

Taylor Martin; Sarah Brasiel; Soojeong Jeong; Kevin Close; Kevin Lawanto; Phil Janisiewicz

Digital learning environments are becoming more common for students to engage in both during and outside of school. With the immense amount of data now available from these environments, researchers need tools to process, manage, and analyze the data. Current methods used by many education researchers are inefficient, yet without data science experience, the tools used in other professions are not accessible to them. In this paper, we describe a tool we created called the Functional Understanding Navigator! (FUN! Tool). We have used this tool for different research projects, which has allowed us to (1) organize our workflow process from start to finish, (2) record log data of all of our analyses, and (3) provide a platform to share our analyses with others through GitHub. This paper extends and improves existing work in educational data mining and learning analytics.

Collaboration


Taylor Martin's top co-authors:

Kenneth R. Diller (University of Texas at Austin)

Pat Ko (University of Texas at Austin)

Tom Benton (University of Texas at Austin)

Stephanie Rivale (University of Texas at Austin)