Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where David Furcy is active.

Publication


Featured research published by David Furcy.


Artificial Intelligence | 2004

Lifelong planning A*

Sven Koenig; Maxim Likhachev; David Furcy

Heuristic search methods promise to find shortest paths for path-planning problems faster than uninformed search methods. Incremental search methods, on the other hand, promise to find shortest paths for series of similar path-planning problems faster than is possible by solving each path-planning problem from scratch. In this article, we develop Lifelong Planning A* (LPA*), an incremental version of A* that combines ideas from the artificial intelligence and the algorithms literature. It repeatedly finds shortest paths from a given start vertex to a given goal vertex while the edge costs of a graph change or vertices are added or deleted. Its first search is the same as that of a version of A* that breaks ties in favor of vertices with smaller g-values, but many of the subsequent searches are potentially faster because it reuses those parts of the previous search tree that are identical to the new one. We present analytical results that demonstrate its similarity to A* and experimental results that demonstrate its potential advantage in two different domains if the path-planning problems change only slightly and the changes are close to the goal.
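To make the algorithm concrete, the following Python sketch implements LPA*'s first search (the vertex-update rule and the main loop), following the published pseudocode. It is a readability-first sketch, not the paper's implementation: the graph is a plain dict of edge costs, the priority queue is a dict scanned with min, and the heuristic defaults to zero so the example is self-contained.

import math

INF = math.inf

def lpa_star(graph, start, goal, h=lambda s: 0):
    """First search of LPA* on a graph given as graph[u] = {v: cost}."""
    nodes = set(graph)
    for edges in graph.values():
        nodes.update(edges)
    preds = {v: {} for v in nodes}          # reverse adjacency with costs
    for u, edges in graph.items():
        for v, c in edges.items():
            preds[v][u] = c
    g = {v: INF for v in nodes}             # best cost found so far
    rhs = {v: INF for v in nodes}           # one-step lookahead cost
    rhs[start] = 0

    def key(s):                             # ties favor smaller g/rhs values
        m = min(g[s], rhs[s])
        return (m + h(s), m)

    queue = {start: key(start)}             # the locally inconsistent vertices

    def update_vertex(u):
        if u != start:
            rhs[u] = min((g[p] + c for p, c in preds[u].items()), default=INF)
        queue.pop(u, None)
        if g[u] != rhs[u]:                  # re-queue only while inconsistent
            queue[u] = key(u)

    while queue and (min(queue.values()) < key(goal) or rhs[goal] != g[goal]):
        u = min(queue, key=queue.get)       # O(n) scan; a real heap is faster
        del queue[u]
        if g[u] > rhs[u]:                   # overconsistent: settle the vertex
            g[u] = rhs[u]
            for s in graph.get(u, {}):
                update_vertex(s)
        else:                               # underconsistent: reset, then repair
            g[u] = INF
            update_vertex(u)
            for s in graph.get(u, {}):
                update_vertex(s)
    return g[goal]

print(lpa_star({"a": {"b": 1, "c": 4}, "b": {"c": 1}}, "a", "c"))  # prints 2

The queue holds exactly the locally inconsistent vertices, those with g(s) != rhs(s). In a persistent implementation, g, rhs, and the queue are kept between searches; when edge costs change, calling update_vertex on the affected endpoints and re-running the loop repairs only the invalidated part of the previous search tree, which is where the speedup over re-running A* from scratch comes from.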


AI Magazine | 2004

Incremental heuristic search in AI

Sven Koenig; Maxim Likhachev; Yaxin Liu; David Furcy

Incremental search reuses information from previous searches to find solutions to a series of similar search problems potentially faster than is possible by solving each search problem from scratch. This is important because many AI systems have to adapt their plans continuously to changes in (their knowledge of) the world. In this article, we give an overview of incremental search, focusing on LIFELONG PLANNING A*, and outline some of its possible applications in AI.


Artificial Intelligence | 2006

Maximizing over multiple pattern databases speeds up heuristic search

Robert C. Holte; Ariel Felner; Jack Newton; Ram Meshulam; David Furcy

A pattern database (PDB) is a heuristic function stored as a lookup table. This paper considers how best to use a fixed amount (m units) of memory for storing pattern databases. In particular, we examine whether using n pattern databases of size m/n instead of one pattern database of size m improves search performance. In all the state spaces considered, the use of multiple smaller pattern databases reduces the number of nodes generated by IDA*. The paper provides an explanation for this phenomenon based on the distribution of heuristic values that occur during search.
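The paper's combination rule is easy to state in code: look the state up in each pattern database and take the maximum of the estimates, which remains admissible because every individual lookup is admissible. A minimal sketch, with toy stand-in tables instead of real puzzle pattern databases:

def make_heuristic(pdbs):
    """pdbs: list of (project, table) pairs; project(state) maps a state to
    its abstract pattern, and table[pattern] is an admissible cost estimate."""
    def h(state):
        # Each lookup is a lower bound on the true cost, so their maximum
        # is too, and it dominates any single pattern database.
        return max(table[project(state)] for project, table in pdbs)
    return h

# Toy usage: two tiny stand-in "databases" over 2-character states.
pdb1 = (lambda s: s[0], {"a": 2, "b": 0})
pdb2 = (lambda s: s[1], {"x": 1, "y": 3})
h = make_heuristic([pdb1, pdb2])
print(h("ay"))  # max(2, 3) = 3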


international conference on unconventional computation | 2017

Scaled Tree Fractals Do not Strictly Self-assemble

Kimberly Barth; David Furcy; Scott M. Summers; Paul Totzke

A \emph{pier fractal} is a discrete self-similar fractal whose generator contains at least one \emph{pier}, that is, a member of the generator with exactly one adjacent point. Tree fractals and pinch-point fractals are special cases of pier fractals. In this paper, we study \emph{scaled pier fractals}, where a \emph{scaled fractal} is the shape obtained by replacing each point in the original fractal by a $c \times c$ block of points, for some $c \in \mathbb{Z}^+$. We prove that no scaled discrete self-similar pier fractal strictly self-assembles, at any temperature, in Winfree's abstract Tile Assembly Model.
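For concreteness, here is what the scaling operation looks like in a short Python sketch, with a shape represented as a set of integer points (the representation is an illustration, not code from the paper):

def scale(points, c):
    """Replace each point (x, y) of a discrete shape by a c x c block of
    points, producing the scaled shape."""
    return {(c * x + dx, c * y + dy)
            for (x, y) in points
            for dx in range(c)
            for dy in range(c)}

# Scaling a 2-point shape by c = 2 yields 2 * (2 x 2) = 8 points.
print(sorted(scale({(0, 0), (1, 0)}, 2)))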


technical symposium on computer science education | 2008

Sorting out sorting: the sequel

David Furcy; Thomas L. Naps; Jason Wentworth

Ronald Baecker's Sorting Out Sorting (SOS) set the stage for much of what has followed in the evolution of algorithm visualization (AV). That period of evolution has now spanned over a quarter century, and we have learned much about how to effectively use AV. This paper addresses how we can incorporate that knowledge into a new rendition of SOS, which we call SOS - The Sequel. In this sequel we attempt to transform Baecker's original video into a highly interactive multimedia learning resource delivered over the Web using Macromedia Flash. The paper describes the design and use of this new resource and reports on a small empirical study designed to measure its effectiveness.


web intelligence | 2011

Designing Effective Heterogeneous Teams for Multiagent Routing Domains

David Furcy; George Thomas

Many realistic problem domains are composed of heterogeneous tasks distributed in a physical environment. Even though the distribution of skills among the members of a heterogeneous team has a significant influence on its effectiveness, little is known about how to design effective heterogeneous teams. In this paper, we develop a graph-search approach to tackle this team design problem in the context of multiagent routing, a generalizable domain in which heterogeneous, randomly located tasks must be completed in overall minimum time (or makespan) given an a priori distribution of their heterogeneity, a fixed team size, and a limited budget. First, we develop complete and optimal search algorithms. Second, we show that dominance-based pruning significantly increases the size of problems that can be solved optimally. Third, we introduce an anytime algorithm called TD-BR that uses beam search with restarts in order to scale up to much larger problems. We evaluate our algorithms empirically in two ways: first, we predict the performance of the teams using a team performance metric called task coverage and show that our algorithms produce high-coverage teams; second, we test a subset of these teams in simulation by allocating the teams to various task sets and measuring their makespan. We show that our teams perform well when compared to an ideal homogeneous team, and outperform heterogeneous teams created by other methods. Our main contributions are thus new algorithmic tools for designers of heterogeneous teams in robotics and other domains where modular construction and refitting of robots is possible.
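The "beam search with restarts" idea from the abstract can be sketched generically: keep only the best few partial solutions at each search depth, and restart with a wider beam to trade time for solution quality. The sketch below shows that generic pattern, not the paper's actual TD-BR algorithm; expand, score, and is_goal are hypothetical problem-specific callbacks.

import heapq

def beam_search(initial, expand, score, is_goal, width):
    """Keep only the 'width' best partial solutions at each depth."""
    beam = [initial]
    while beam:
        children = [c for state in beam for c in expand(state)]
        goals = [c for c in children if is_goal(c)]
        if goals:
            return min(goals, key=score)    # best complete solution this depth
        beam = heapq.nsmallest(width, children, key=score)
    return None                             # beam pruned away every candidate

def anytime_beam_search(initial, expand, score, is_goal, max_width=64):
    """Restart with a doubled beam width, yielding each solution found, so
    the search can be interrupted at any time with the best result so far."""
    width = 1
    while width <= max_width:
        solution = beam_search(initial, expand, score, is_goal, width)
        if solution is not None:
            yield width, solution
        width *= 2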


Algorithmica | 2017

Optimal Program-Size Complexity for Self-Assembled Squares at Temperature 1 in 3D

David Furcy; Samuel Micka; Scott M. Summers



AI Magazine | 2009

AAAI 2008 Workshop Reports

Sarabjot Singh Anand; Razvan C. Bunescu; Vitor R. Carvalho; Jan Chomicki; Vincent Conitzer; Michael T. Cox; Virginia Dignum; Zachary Dodds; Mark Dredze; David Furcy; Evgeniy Gabrilovich; Mehmet Göker; Hans W. Guesgen; Haym Hirsh; Dietmar Jannach; Ulrich Junker; Wolfgang Ketter; Alfred Kobsa; Sven Koenig; Tessa A. Lau; Lundy Lewis; Eric T. Matson; Ted Metzler; Rada Mihalcea; Bamshad Mobasher; Joelle Pineau; Pascal Poupart; Anita Raja; Wheeler Ruml; Norman M. Sadeh



international joint conference on artificial intelligence | 2005

Limited discrepancy beam search

David Furcy; Sven Koenig




Collaboration


Dive into David Furcy's collaborations.

Top Co-Authors

Sven Koenig

University of Southern California

Scott M. Summers

University of Wisconsin–Oshkosh

George Thomas

University of Wisconsin–Oshkosh

Maxim Likhachev

Carnegie Mellon University

Thomas L. Naps

University of Wisconsin–Oshkosh

Yaxin Liu

Georgia Institute of Technology

Samuel Micka

Montana State University

Ariel Felner

Ben-Gurion University of the Negev