
Publication


Featured research published by Jonathan Eisenmann.


EvoMUSART'13: Proceedings of the Second International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design | 2013

Inverse mapping with sensitivity analysis for partial selection in interactive evolution

Jonathan Eisenmann; Matthew R. Lewis; Richard E. Parent

Evolutionary algorithms have shown themselves to be useful interactive design tools. However, current algorithms only receive feedback about candidate fitness at the whole-candidate level. In this paper we describe a model-free method, using sensitivity analysis, which allows designers to provide fitness feedback to the system at the component level. Any part of a candidate can be marked by the designer as interesting (i.e. having high fitness). This has the potential to improve the design experience in two ways: (1) The finer-grain guidance provided by partial selections facilitates more precise iteration on design ideas so the designer can maximize her energy and attention. (2) When steering the evolutionary system with more detailed feedback, the designer may discover greater feelings of satisfaction with and ownership over the final designs.
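The component-level feedback idea in this abstract can be illustrated with a minimal, self-contained sketch. All names and the toy phenotype model below are hypothetical, and the finite-difference sensitivity estimate stands in for the paper's sensitivity analysis; this is an illustration of the concept, not the authors' implementation:

```python
import random

random.seed(0)

# Toy genome: 4 parameters in [0, 1]; phenotype: 4 "parts", each a
# weighted mix of one gene and the genome mean (hypothetical model).
def phenotype(genome):
    return [0.7 * genome[i] + 0.3 * sum(genome) / len(genome)
            for i in range(len(genome))]

def sensitivity(genome, part, eps=1e-3):
    """Finite-difference sensitivity of one phenotype part to each gene."""
    base = phenotype(genome)[part]
    sens = []
    for i in range(len(genome)):
        g = list(genome)
        g[i] += eps
        sens.append(abs(phenotype(g)[part] - base) / eps)
    return sens

def mutate_preserving(genome, selected_part, rate=0.5):
    """Mutate genes in inverse proportion to their influence on the
    part the designer marked as interesting (high fitness)."""
    sens = sensitivity(genome, selected_part)
    top = max(sens)
    child = []
    for gene, s in zip(genome, sens):
        step = rate * (1.0 - s / top)  # influential genes move least
        child.append(min(1.0, max(0.0, gene + random.uniform(-step, step))))
    return child

parent = [0.2, 0.8, 0.5, 0.3]
# Designer marks part 1 as interesting; gene 1 dominates that part,
# so it is perturbed least while the rest of the genome explores.
child = mutate_preserving(parent, selected_part=1)
```

The point of the sketch is the division of labor the abstract describes: the designer gives feedback on a part, and sensitivity analysis maps that partial selection back onto the genes that matter for it.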


Archive | 2011

Interactive Evolution for Designing Motion Variants

Jonathan Eisenmann; Matthew R. Lewis; Bryan Cline

We present an intuitive method for novice users to interactively design custom populations of stylized, heterogeneous motion, from one input motion. The user sets up lattice deformers which are used by a genetic algorithm to manipulate the animation channels of the input motion and create new motion variants. Our interactive evolutionary design environment allows the user to traverse the available space of possibilities, presents the user with populations of motion, and gradually converges to a satisfactory set of solutions. Each generated motion can undergo a filtering process subject to user-specified, high-level metrics to produce a result crafted to fit the designer’s interest. We demonstrate application to both character animation and particle systems.
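A minimal sketch of the idea of evolving motion variants from a single input motion. The setup below is hypothetical: the paper uses user-placed lattice deformers driven by a genetic algorithm, which is approximated here by a few evolvable control-point offsets blended smoothly across one animation channel:

```python
import math
import random

random.seed(1)

# One input motion: a single animation channel sampled over 24 frames
# (hypothetical stand-in for a full set of animation channels).
frames = [math.sin(2 * math.pi * t / 24) for t in range(24)]

def make_genome(n_points=4):
    """A genome of control-point offsets, analogous to lattice handles."""
    return [random.uniform(-0.3, 0.3) for _ in range(n_points)]

def deform(channel, genome):
    """Offset each frame by a linear blend of neighboring control points."""
    n = len(channel)
    out = []
    for t, v in enumerate(channel):
        x = t / (n - 1) * (len(genome) - 1)
        i = min(int(x), len(genome) - 2)
        w = x - i
        out.append(v + (1 - w) * genome[i] + w * genome[i + 1])
    return out

# A heterogeneous population of motion variants from the one input motion;
# in the interactive system, user selections would guide which genomes
# survive into the next generation.
population = [deform(frames, make_genome()) for _ in range(5)]
```

Each variant preserves the overall shape of the input motion while differing in style, which is the property the interactive evolutionary search exploits.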


Leonardo | 2016

Spatiotemporal Ideation and Generation with Interactive Evolutionary Design

Jonathan Eisenmann; Matthew R. Lewis; Richard E. Parent

Interactive evolutionary design tools enable human intuition and creative decision-making in high-dimensional design domains while leaving technical busywork to the computer. Current evolutionary algorithms for interactive design tools accept only feedback about entire design candidates, not their parts, which can lead to user fatigue. This article describes several case studies in which designers used an enhanced interactive evolutionary design tool with region-of-interest feedback for character animation tasks. This enhanced interactive evolutionary design tool is called the Interactive Design with Evolutionary Algorithms and Sensitivity (IDEAS) tool. Designers’ feedback and narratives about their experiences with the tool show that interactive evolutionary algorithms can be made suitable for the ideation and generation of digital assets, even in time-varying domains.


European Conference on Applications of Evolutionary Computation | 2011

Creating choreography with interactive evolutionary algorithms

Jonathan Eisenmann; Benjamin Schroeder; Matthew R. Lewis; Richard E. Parent

Directing a group behavior towards interesting and complex motion can and should be intuitive, iterative, and often participatory. Toward this end, we present a choreographic system that enables designers to explore a motion space based on a parametric model of behaviors. Designers may work with the system by moving back and forth through two complementary stages: first, using an evolutionary algorithm to traverse the space of behavior possibilities, allowing designers to emphasize desired kinds of motion while leaving room for an element of the unexpected, and second, using selected behaviors to direct the group motion of simple performing creatures. In the second stage, evolved group motion behaviors from the first stage are used alongside existing high-level parametric rules for local articulated motion.


Genetic and Evolutionary Computation Conference | 2013

Trace selection for interactive evolutionary algorithms

Jonathan Eisenmann; Matthew R. Lewis; Richard E. Parent

This paper presents a selection method for use with interactive evolutionary algorithms and sensitivity analysis in spatiotemporal domains. Recent work in the field has made it possible to give feedback to an interactive evolutionary system with a finer granularity than the typical wholesale selection method. This development allows the user to drive the evolutionary search more precisely by selecting a part of a phenotype to indicate fitness. The method has the potential to alleviate the human-fatigue bottleneck, so it seems ideally suited for domains that vary in both space and time, such as character motion or cloth simulation, where evaluation times are long. However, no evolutionary interface has yet been developed that allows parts of time-varying phenotypes to be selected. We present a selection interface that should be fast and intuitive enough to minimize the interaction bottleneck in evolutionary algorithms that receive feedback at the phenotype-part level.


Proceedings of SPIE | 2010

Matte Painting in Stereoscopic Synthetic Imagery

Jonathan Eisenmann; Richard E. Parent

While there have been numerous studies concerning human perception in stereoscopic environments, rules of thumb for cinematography in stereoscopy have not yet been well-established. To that end, we present experiments and results of subject testing in a stereoscopic environment similar to that of a theater (i.e., a large flat screen without head-tracking). In particular, we wish to empirically identify thresholds at which different types of backgrounds, referred to in the computer animation industry as matte paintings, can be used while still maintaining the illusion of seamless perspective and depth for a particular scene and camera shot. In monoscopic synthetic imagery, any type of matte painting that maintains proper perspective lines, depth cues, and coherent lighting and textures saves in production costs while still maintaining the illusion of an alternate cinematic reality. However, in stereoscopic synthetic imagery, a 2D matte painting that worked in monoscopy may fail to provide the intended illusion of depth because the viewer has added depth information provided by stereopsis. We intend to observe two stereoscopic perceptual thresholds in this study, which will provide practical guidelines indicating when to use each of three types of matte paintings. We ran subject tests in two virtual testing environments, each with varying conditions. Data were collected showing how users' choices matched the correct responses, and the resulting perceptual threshold patterns are discussed below.


Proceedings of SPIE | 2009

Stereoscopy in cinematographic synthetic imagery

Jonathan Eisenmann; Richard E. Parent

In this paper we present experiments and results pertaining to the perception of depth in stereoscopic viewing of synthetic imagery. In computer animation, typical synthetic imagery is highly textured and uses stylized illumination of abstracted material models by abstracted light source models. While there have been numerous studies concerning stereoscopic capabilities, conventions for staging and cinematography in stereoscopic movies have not yet been well-established. Our long-term goal is to measure the effectiveness of various cinematography techniques on the human visual system in a theatrical viewing environment. We would like to identify the elements of stereoscopic cinema that are important in terms of enhancing the viewer's understanding of a scene, as well as providing guidelines for the cinematographer relating to storytelling. In these experiments we isolated stereoscopic effects by eliminating as many other visual cues as is reasonable. In particular, we aim to empirically determine what types of movement in synthetic imagery affect the perceptual depth sensing capabilities of our viewers. Using synthetic imagery, we created several viewing scenarios in which the viewer is asked to locate a target object's depth in a simple environment. The scenarios were specifically designed to compare the effectiveness of stereo viewing, camera movement, and object motion in aiding depth perception. Data were collected showing the error between the user's choice and the actual depth value, and patterns were identified that relate the test variables to the viewer's perceptual depth accuracy in our theatrical viewing environment.


International Conference on Evolutionary Computation | 2016

Interactive Evolutionary Design of Motion Variants

Jonathan Eisenmann; Matthew R. Lewis; Bryan Cline


Archive | 2014

Interactive Evolutionary Design with Region-of-Interest Selection for Spatiotemporal Ideation & Generation

Jonathan Eisenmann


IJCCI | 2009

Interactive Evolutionary Design of Motion Variants

Jonathan Eisenmann; Matthew R. Lewis; Bryan Cline
