Christian Noon
Iowa State University
Publications
Featured research published by Christian Noon.
Proceedings of SPIE | 2013
Christian Noon; Joseph Holub; Eliot Winer
Performing high quality 3D visualizations on mobile devices, while tantalizingly close in many areas, remains a difficult task. This is especially true for 3D volume rendering of digital medical images. Solving this would give medical personnel a powerful tool to diagnose and treat patients and to train the next generation of physicians. This research focuses on performing real-time volume rendering of digital medical images on iOS devices using custom-developed GPU shaders for orthogonal texture slicing. An interactive volume renderer was designed and developed with several new features, including dynamic modification of render resolutions, an incremental render loop, a shader-based clipping algorithm to support OpenGL ES 2.0, and an internal backface culling algorithm for properly sorting rendered geometry with alpha blending. The application was developed using several application programming interfaces (APIs): OpenSceneGraph (OSG) as the primary graphics renderer, iOS Cocoa Touch for user interaction, and DCMTK for DICOM I/O. The developed application rendered volume datasets of over 450 slices at 50-60 frames per second, depending on the specific model of the iOS device. All rendering is done locally on the device, so no Internet connection is required.
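The orthogonal texture slicing approach composites a stack of view-aligned slices back to front using alpha blending. A minimal CPU sketch of the "over" operator that such fragment shaders apply per pixel (function name and array layout are illustrative assumptions, not the paper's code):

```python
import numpy as np

def composite_slices(slices, alphas):
    """Back-to-front alpha compositing of volume slices.

    slices: list of (H, W) arrays of emitted color intensity.
    alphas: list of (H, W) arrays of per-voxel opacity in [0, 1].
    Returns the composited (H, W) image (the "over" operator).
    """
    image = np.zeros_like(slices[0], dtype=float)
    for color, alpha in zip(slices, alphas):  # iterate back to front
        image = color * alpha + image * (1.0 - alpha)
    return image
```

On the GPU this blend runs per fragment as each textured slice is drawn, which is why the backface culling and sorting described above matter: the operator is order-dependent.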
ASME 2011 World Conference on Innovative Virtual Reality | 2011
Brandon Newendorp; Christian Noon; Joe Holub; Eliot Winer; Stephen B. Gilbert; Julio de la Cruz
In order to adapt to an ever-changing set of threats, military forces need to find new methods of training. The prevalence of commercial game engines combined with virtual reality (VR) and mixed reality environments can prove beneficial to training. Live, virtual and constructive (LVC) training combines live people, virtual environments and simulated actors to create a better training environment. However, integrating virtual reality displays, software simulations and artificial weapons into a mixed reality environment poses numerous challenges. A mixed reality environment known as The Veldt was constructed to research these challenges. The Veldt consists of numerous independent displays, along with movable walls, doors and windows. This allows The Veldt to simulate numerous training scenarios. Several challenges were encountered in creating this system. Displays were precisely located using the tracking system, then configured using VR Juggler. The ideal viewpoint for each display was configured based on the expected location from which users would view it. Finally, the displays were accurately aligned to the virtual terrain model. This paper describes how the displays were configured in The Veldt, as well as how it was used for two training scenarios.
design automation conference | 2009
Christian Noon; Eliot Winer
Many high fidelity analysis tools, including finite element analysis and computational fluid dynamics, have become an integral part of the design process. However, these tools were developed for detailed design and are inadequate for conceptual design due to their complexity and turnaround time. With the development of more complex technologies and systems, decisions made earlier in the design process have become crucial to product success. Therefore, one possible alternative to high fidelity analysis tools for conceptual design is metamodeling. Metamodels generated from high fidelity analysis datasets of previous design iterations show large potential to represent the overall trends of the dataset. To determine which metamodeling techniques were best suited to handle high fidelity datasets for conceptual design, an implementation scheme for incorporating Polynomial Response Surface (PRS) methods, Kriging approximations, and Radial Basis Function Neural Networks (RBFNN) was developed. This paper presents the development of a conceptual design metamodeling strategy. Initially, high fidelity legacy datasets were generated from FEA simulations. Metamodels were then built upon the legacy datasets. Finally, metamodel performance was evaluated based upon several dataset conditions, including various sample sizes, dataset linearity, interpolation within a domain, and extrapolation outside a domain.
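Of the three techniques compared, a polynomial response surface is the simplest: it is a least-squares fit of a polynomial basis to the legacy dataset, after which new design points can be evaluated instantly. A one-variable, second-order sketch (illustrative only; the paper's metamodels are multivariate):

```python
import numpy as np

def fit_prs(X, y):
    """Fit a second-order polynomial response surface
    y ~ b0 + b1*x + b2*x^2 by least squares (one input for brevity)."""
    A = np.column_stack([np.ones_like(X), X, X**2])  # polynomial basis matrix
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def eval_prs(coeffs, x):
    """Evaluate the fitted surface at a new design point x."""
    return coeffs[0] + coeffs[1] * x + coeffs[2] * x**2
```

Once `fit_prs` has consumed the expensive FEA samples, `eval_prs` costs only a few multiplications, which is what makes metamodels attractive for interactive conceptual design.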
ASME 2010 World Conference on Innovative Virtual Reality | 2010
Christian Noon; Brandon Newendorp; Eliot Winer; Jim Oliver
Virtual reality (VR) applications are used in many areas of academic and industrial research, including engineering and the biomedical and geosciences. These applications generally focus on creating a VR environment to enhance user experience. One of the main challenges VR application developers face is making objects within the environment move in a natural, realistic manner. Many commercial packages and programming libraries exist to help generate complex animations, including physics engines, game engines, and modeling software such as Autodesk Maya. All of these tools are very useful but have many disadvantages when applied to VR applications, as they were not designed for VR development. To address these issues, a VR application programming interface (API) was developed to help VR developers create and visualize natural, complex animations for VR-based systems utilizing OpenSceneGraph. This API, called the Animation Engine 2.0, was built in a manner animators and developers are already familiar with, integrating control points and keyframes for controlling animations. The system is time-based to scale to any size of VR system, which enables support for different time interpolations as well as the incorporation of acceleration into animations to create behavioral events such as a boing, bounce, or surge. In this paper, the Animation Engine API is presented along with its integration into a VR aircraft carrier application.
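The time-based design described above can be sketched as a normalized interpolation: elapsed time is mapped to a parameter t in [0, 1], so playback speed is independent of frame rate, and an optional easing curve adds acceleration. A minimal Python sketch (the actual engine is a C++ OpenSceneGraph API; the names here are hypothetical):

```python
def interpolate(start, end, t, ease=None):
    """Time-based keyframe interpolation.

    t is elapsed time divided by the animation duration; clamping it
    to [0, 1] means a slow cluster node simply skips frames rather
    than changing the animation's wall-clock length.
    """
    t = max(0.0, min(1.0, t))
    if ease == "ease_in_out":            # smoothstep acceleration curve
        t = t * t * (3.0 - 2.0 * t)
    return start + (end - start) * t
```

Swapping the easing function is how behaviors like a bounce or surge can be layered onto the same keyframe data without touching the keyframes themselves.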
ASME-AFM 2009 World Conference on Innovative Virtual Reality | 2009
Christian Noon; Ruqin Zhang; Eliot Winer; James H. Oliver; Brian Gilmore; Jerry Duncan
Currently, new product concepts are evaluated by developing detailed virtual models with Computer Aided Design (CAD) tools followed by evaluation analyses (e.g., finite element analysis, computational fluid dynamics, etc.). Due to the complexity of these evaluation methods, it is generally not possible to model and analyze each of the ideas generated throughout the conceptual design phase of the design process. Thus, promising ideas may be eliminated based solely on insufficient time to model and assess them. Additionally, the analysis performed is usually of much higher detail than needed for such early assessment. By eliminating the time-consuming CAD complexity, engineers could spend more time evaluating additional concepts. To address these issues, a software framework, the Advanced Systems Design Suite (ASDS), was created. The ASDS incorporates a PC user interface with an immersive virtual reality (VR) environment to ease the creation and assessment of conceptual design prototypes individually or collaboratively in a VR environment. Assessment tools incorporate metamodeling approximations and immersive visualization to evaluate the validity of each concept. In this paper, the ASDS framework and interface along with specifically designed immersive VR assessment tools such as state saving, dynamic viewpoint creation, and animation playback are presented alongside a test case example of redesigning a Boeing 777 in the conceptual design phase.
48th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference | 2007
Ruqin Zhang; Christian Noon; Eliot Winer; Jim Oliver; Brian Gilmore; Jerry Duncan
Currently, new product concepts are evaluated by developing detailed virtual part and assembly models with traditional Computer Aided Design (CAD) tools followed by appropriate analyses (e.g., finite element analysis, computational fluid dynamics, etc.). The creation of these models and analyses is tremendously time consuming. If a number of different conceptual configurations have been determined, it may not be possible to model and analyze each of them. Thus, promising concepts might be eliminated based solely on insufficient time to assess them. In addition, the virtual models and analyses performed are usually of much higher detail and accuracy than what is needed for such early assessment. By eliminating the time-consuming complexity of a CAD environment and incorporating qualitative assessment tools, engineers could spend more time evaluating additional concepts that were previously abandoned due to time constraints. In this paper, a software framework, the Advanced Systems Design Suite (ASDS), for creating and evaluating conceptual design configurations in an immersive virtual reality environment is presented. The ASDS allows design concepts to be quickly modeled, analyzed, and visualized. It incorporates a PC user interface with an immersive virtual reality environment to ease the creation and assessment of conceptual design prototypes. The development of the modeling and assessment tools is presented along with a test case to demonstrate the usability and effectiveness of the framework.
Proceedings of SPIE | 2012
Christian Noon; Eric Foo; Eliot Winer
This research focuses on performing interactive, real-time volume raycasting in a large clustered graphics environment using custom GPU shaders for composite volume raycasting with trilinear interpolation. Working in this type of environment presents unique challenges due to its distributed nature and the inherently required synchronization of data and operations across the cluster. Invoking custom vertex and fragment shaders in a non-thread-safe manner becomes increasingly complex in a large clustered graphics environment. Through the use of an abstraction layer, all rendering contexts are split up with no changes to the volume raycasting core. The volume raycasting core is therefore completely independent of the computing platform. The application was tested on a 6-wall immersive VR system with 96 graphics contexts coming from 48 cluster nodes. Interactive framerates of 60 frames per second were produced on 512x512x100 volumes, and an average of 30 frames per second for a 512x512x1000 volume. The use of custom configuration files allows the same code to be highly scalable from a single-screen VR system to a fully immersive 6-sided wall VR system. Through the code abstraction, the same volume raycasting core can be implemented on any type of computing platform, including desktop and mobile.
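Each ray step in composite raycasting samples the volume at a fractional position, which requires trilinear interpolation of the eight surrounding voxels. A CPU reference sketch of that sampling step (the paper performs this in GPU fragment shaders; the function name is an assumption):

```python
import numpy as np

def trilinear(volume, x, y, z):
    """Sample a 3D scalar volume at fractional coordinates by
    blending the eight neighboring voxels along each axis in turn."""
    x0, y0, z0 = int(x), int(y), int(z)
    x1, y1, z1 = x0 + 1, y0 + 1, z0 + 1
    fx, fy, fz = x - x0, y - y0, z - z0
    # Blend along x for each of the four (y, z) voxel pairs.
    c00 = volume[x0, y0, z0] * (1 - fx) + volume[x1, y0, z0] * fx
    c10 = volume[x0, y1, z0] * (1 - fx) + volume[x1, y1, z0] * fx
    c01 = volume[x0, y0, z1] * (1 - fx) + volume[x1, y0, z1] * fx
    c11 = volume[x0, y1, z1] * (1 - fx) + volume[x1, y1, z1] * fx
    # Blend along y, then along z.
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz
```

GPU texture hardware performs this same blend for free on a 3D texture fetch, which is one reason the shader-based raycaster can sustain interactive frame rates.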
ASME 2010 World Conference on Innovative Virtual Reality | 2010
Christian Noon; Brandon Newendorp; Ruqin Zhang; Eliot Winer; Jim Oliver; Jerry Duncan; Brian Gilmore
Conceptual design involves generating hundreds to thousands of concepts and combining the best of all the concepts into a single idea to move forward into detailed design. With the current tools available, design teams usually model a small number of concepts and analyze them using traditional Computer-Aided Design (CAD) analysis tools. The creation and validation of concepts using CAD packages is extremely time consuming, and unfortunately not all concepts can be evaluated. Thus, promising concepts can be eliminated based on insufficient time and resources to use the tools available. Additionally, these virtual models and analyses are usually of much higher fidelity than what is needed at such an early stage of design. To address these issues, a desktop and immersive virtual reality (VR) framework, the Advanced Systems Design Suite (ASDS), was created to foster rapid geometry creation and concept assessment using a unique creation approach that does not require precise mating and dimensioning constraints during the geometry creation phase. The ASDS system removes these precision constraints by using 3D manipulation tools to build concepts and providing a custom, easy-to-use measurement system when precise measurements are required. In this paper, the ASDS framework along with a unique and intuitive measurement system is presented for large vehicle conceptual design.
ASME-AFM 2009 World Conference on Innovative Virtual Reality | 2009
Brandon Newendorp; Christian Noon; Chiu-Shui Chan; Eliot Winer; James H. Oliver
This paper presents a scenegraph animation application programming interface (API), known as the Animation Engine, which was constructed for software developers to easily perform smooth transitions and manipulations of scenegraph nodes. A developer can use one line of code to specify the property, end state, and number of frames to describe the animation; the Animation Engine handles the rest in the background. The goal of the Animation Engine is to provide a simple API that integrates into existing applications with minimal effort. Additionally, techniques to improve virtual reality (VR) application performance on a large computer cluster are presented. These techniques include maintaining high frame rates with 4096 × 4096 pixel textures, eliminating extraneous network traffic, and reducing long model loading times. To demonstrate the Animation Engine and the development techniques, an application known as the Virtual Universe was created. The Virtual Universe, designed to run in a six-walled CAVE, allows users to freely explore a set of space-themed environments. The architecture and development techniques for writing a stable immersive VR application on a large computer cluster, in addition to the creation of the Animation Engine, are presented in this paper.
12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2008
Andrew Koehring; Christian Noon; Ruqin Zhang; Eliot Winer; James H. Oliver; Brian Gilmore; Jerry Duncan
Software packages used in the engineering design process have become increasingly complex. Computer aided design (CAD) and finite element analysis (FEA) tools are capable of generating high fidelity models and simulations that have become indispensable components of any design. However, a fair amount of experience and time is required to effectively use such software. When designing at the conceptual level, a high level of accuracy is not needed; rapid concept generation and evaluation is the primary focus. Unfortunately, few tools exist that successfully suit these needs. The Advanced Systems Design Suite (ASDS) is an application which allows a user to quickly design 3D conceptual models and perform both qualitative and quantitative assessments. This quantitative feedback is provided through the use of metamodels which, once constructed, can be evaluated in real time. In this paper, two different metamodeling techniques are applied: Polynomial Response Surface (PRS) and Polynomial Chaos Expansion (PCE). Experiments were carried out using various models in order to determine which is most suitable for a conceptual design and assessment application. Both third-order PRS and PCE with second-order interaction effects were found to yield positive results when generated from as few as thirty data points.
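For intuition, a one-dimensional PCE reduces to least-squares regression onto a Hermite polynomial basis in a standard-normal input variable. A sketch using NumPy's probabilists' Hermite routines (illustrative; the paper's expansions are multivariate with interaction effects):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander, hermeval

def fit_pce(xi, y, degree):
    """Fit a 1D polynomial chaos expansion y(xi) ~ sum_k c_k He_k(xi),
    where He_k are probabilists' Hermite polynomials and xi is a
    standard-normal input, via least-squares regression."""
    Psi = hermevander(xi, degree)   # basis matrix, shape (n, degree + 1)
    coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return coeffs

def eval_pce(coeffs, xi):
    """Evaluate the expansion at new input values."""
    return hermeval(xi, coeffs)
```

Because the Hermite basis is orthogonal under the Gaussian measure, the expansion's coefficients also carry statistics directly: c_0 is the response mean, and the remaining squared coefficients (suitably weighted) give its variance.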