
Publication


Featured research published by Loren Olson.


International Conference on Multimedia and Expo | 2004

A gesture-driven multimodal interactive dance system

Gang Qian; Feng Guo; Todd Ingalls; Loren Olson; Jodi James; Thanassis Rikakis

In this paper, we report a real-time, gesture-driven interactive system with multimodal feedback for the performing arts, especially dance. The system consists of two major parts: a gesture recognition engine and a multimodal feedback engine. The gesture recognition engine provides real-time recognition of the performer's gestures based on 3D marker coordinates from a marker-based motion capture system. According to the recognition results, the multimodal feedback engine produces associated visual and audio feedback for the performer. This interactive system is simple to implement and robust to errors in the 3D marker data. Satisfactory interactive dance performances have been successfully created and presented using the reported system.
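As a rough illustration only (not the authors' actual implementation), the two-part pipeline described above, a recognition engine consuming 3D marker coordinates and a feedback engine mapping recognized gestures to audiovisual responses, could be sketched as follows; all names, thresholds, and feedback labels here are hypothetical:

```python
# Illustrative sketch of a recognize-then-respond loop. The paper's real
# recognition engine uses a trained model over motion-capture data; this
# toy rule just checks how high a hypothetical wrist marker is raised.

def recognize_gesture(markers):
    """Classify a pose from 3D marker coordinates (x, y, z per marker)."""
    wrist_height = markers["wrist"][2]  # z coordinate, metres
    return "arms_raised" if wrist_height > 1.5 else "neutral"

# Hypothetical mapping from recognized gestures to multimodal feedback.
FEEDBACK = {
    "arms_raised": {"visual": "burst_particles", "audio": "swell_chord"},
    "neutral": {"visual": "idle_scene", "audio": "ambient_drone"},
}

def feedback_for(markers):
    """Run one frame: recognize the gesture, look up the response."""
    return FEEDBACK[recognize_gesture(markers)]

frame = {"wrist": (0.2, 0.1, 1.8)}  # one motion-capture frame
print(feedback_for(frame))
```

A real system would run this loop at capture frame rate and smooth recognition results over time to stay robust to marker dropouts.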


ACM Multimedia | 2006

A real-time, multimodal biofeedback system for stroke patient rehabilitation

Yinpeng Chen; Weiwei Xu; Richard Isaac Wallis; Hari Sundaram; Thanassis Rikakis; Todd Ingalls; Loren Olson; Jiping He

This paper presents a novel real-time, multimodal biofeedback system for stroke patient therapy. The problem is important because traditional mechanisms of rehabilitation are monotonous and do not incorporate detailed quantitative assessment of recovery beyond traditional clinical schemes. We have been working on developing an experiential media system that integrates task-dependent physical therapy and cognitive stimuli within an interactive, multimodal environment. The environment provides a purposeful, engaging, visual and auditory scene in which patients can practice functional therapeutic reaching tasks while receiving different types of simultaneous feedback indicating measures of both performance and results. This paper makes two contributions: (a) the identification of features and goals for the functional task, and (b) the development of sophisticated feedback mechanisms (auditory and visual) that match the semantics of action of the task.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2006

Novel Design of Interactive Multimodal Biofeedback System for Neurorehabilitation

He Huang; Yinpeng Chen; Weiwei Xu; Hari Sundaram; Loren Olson; Todd Ingalls; Thanassis Rikakis; Jiping He

A previous design of a biofeedback system for neurorehabilitation in an interactive multimodal environment has demonstrated the potential of engaging stroke patients in task-oriented neuromotor rehabilitation. This report explores a new concept and alternative designs for multimedia-based biofeedback systems. In this system, the new interactive multimodal environment was constructed with an abstract presentation of movement parameters. Scenery images or pictures, together with their clarity and orientation, are used to reflect the arm's movement and its position relative to the target, instead of an animated arm. The multiple biofeedback parameters were classified into hierarchical levels according to the importance of each movement parameter to performance. New quantified measurements for these parameters were developed to assess the patient's performance both in real time and offline. These parameters were represented by combined visual and auditory presentations with various distinct musical instruments. Overall, the objective of the newly designed system is to explore what information to feed back, and how, in an interactive virtual environment in order to enhance the sensorimotor integration that may facilitate the efficient design and application of virtual-environment-based therapeutic intervention.


Interaction Design and Children | 2013

Digital culture creative classrooms (DC3): teaching 21st century proficiencies in high schools by engaging students in creative digital projects

David Tinapple; John Sadauskas; Loren Olson

Children and young adults are immersed in digital culture, but most are unfamiliar with the computational thinking behind the latest tools and technologies. There are few opportunities in secondary school curricula for students to learn such practices, but we believe that skills such as computational thinking, creative coding, collaboration, innovation, and information literacy can be taught highly effectively by using aesthetic challenges as motivation. In other words, by engaging students in creative digital arts projects, they are naturally driven to acquire the many new skills needed to effectively use and understand the computational tools and techniques involved in creating digital and interactive projects. In this paper, we outline a project-based digital arts curriculum through which novice middle and high school students are intrinsically motivated to learn and apply science, technology, engineering, and mathematics (STEM) skills and computational thinking.


ACM Multimedia | 2009

SMALLab: a mixed-reality environment for embodied and mediated learning

Aisling Kelliher; David Birchfield; Ellen Campana; Sarah Hatton; Mina C. Johnson-Glenberg; Christopher Martinez; Loren Olson; Philippos Savvides; Lisa Tolentino; Kelly Phillips; Sibel Uysal

In this video presentation, we introduce the Situated Multimedia Arts Learning Lab [SMALLab], a mixed-reality learning environment that supports interactive engagement through full-body 3D movements and gestures within a collaborative, computationally mediated space. The video begins by describing the holistic approach to embodied and mediated learning developed by our transdisciplinary research team, grounded in understandings derived from research in the learning sciences, digital media, and human-computer interaction. We then outline the three core tenets of effective learning exemplified by our research -- embodiment, multimodality, and collaboration. The video next demonstrates the design and functionality of the physical and digital components of SMALLab. We conclude by illustrating our partner collaborations with K-12 teachers and students through four scenarios depicting Geography, Physics, Language Arts, and Chemistry learning modules.


Archive | 2009

Experiential Media Systems – The Biofeedback Project

Yinpeng Chen; Hari Sundaram; Thanassis Rikakis; Todd Ingalls; Loren Olson; Jiping He

Experiential media systems are real-time, physically grounded multimedia systems in which the user is both the producer and consumer of meaning. These systems require embodied interaction on the part of the user to gain new knowledge. In this chapter, we present our efforts to develop a real-time, multimodal biofeedback system for stroke patients. It is a highly specialized experiential media system in which the knowledge imparted concerns a functional task: the ability to reach and grasp an object. There are several key ideas in this chapter: we show how to derive critical motion features for the reaching task using a biomechanical model. We then determine the formal progression of the feedback and its relationship to action. We show how to map movement parameters onto auditory and visual parameters in real time. We develop novel validation metrics for spatial accuracy, opening, flow, and consistency. Our real-world experiments with unimpaired subjects show that we are able to communicate key aspects of motion through feedback. Importantly, they demonstrate that the messages encoded in the feedback can be parsed by the unimpaired subjects.
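To make the idea of mapping movement parameters onto auditory and visual parameters concrete, here is a minimal hypothetical sketch; the chapter's actual features, ranges, and mappings are not reproduced, and every name and number below is an invented example:

```python
# Hypothetical feature-to-feedback mapping for a reaching task.
# Raw movement features are normalized to [0, 1] and reused directly
# as audio/visual control parameters.

def normalize(value, lo, hi):
    """Clamp a raw feature into [0, 1] given an expected range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def map_to_feedback(hand_speed, distance_to_target, hand_opening):
    """Map reach features to feedback controls (all ranges are made up).

    hand_speed in m/s, distance_to_target in m, hand_opening (grip
    aperture) in cm.
    """
    return {
        # faster reach -> faster musical tempo
        "music_tempo": normalize(hand_speed, 0.0, 1.0),
        # image sharpens as the hand nears the target
        "image_clarity": 1.0 - normalize(distance_to_target, 0.0, 0.5),
        # chord fills out as the hand opens for the grasp
        "harmony_richness": normalize(hand_opening, 2.0, 10.0),
    }

print(map_to_feedback(0.5, 0.1, 6.0))
```

In a real system each mapping would be tuned with therapists so that deviations from a desired trajectory produce perceptually salient, interpretable changes in the feedback.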


Frontiers in Education Conference | 2013

The digital culture degree: A competency-based interdisciplinary program spanning engineering and the arts

Thanassis Rikakis; David Tinapple; Loren Olson

This paper describes the Digital Culture BA degree: an engineering-arts undergraduate curriculum that combines competency-based education (CBE) and knowledge-oriented education (KOE) structures and related pull-push approaches. The degree has been offered for three years at Arizona State University, has 200 enrolled students, and continues to grow. The degree embeds nine knowledge-oriented concentrations, each offered by a relevant participating department, within an interdisciplinary CBE context. The CBE part of the degree provides customized access to 40 interdisciplinary digital culture courses from 12 different academic units by connecting these courses through a set of core competencies. Access to courses is not determined by fixed prerequisites but rather by holding one of several possible combinations of lower-level competencies. This flexible curriculum is attractive to students, promotes integrative collaborative learning that inspires innovation, and prepares the type of engineering-arts experts and complex problem solvers currently needed in the creative industries. This type of degree also presents several important challenges for educators and administrators. To address these challenges, we developed project-based assessment approaches, custom web-based software for advising a very diverse student body, and online tools for facilitating peer critique and feedback in large creative classrooms.
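The access rule described above, where a course unlocks when a student holds any one of several competency combinations rather than a fixed prerequisite chain, might be modeled as follows; the course and competency names are invented for illustration and do not come from the actual degree:

```python
# Hypothetical model of competency-based course access.
# Each course lists alternative sets of lower-level competencies;
# satisfying any one set grants access.

COURSE_UNLOCKS = {
    "Creative Coding II": [
        {"programming_basics", "visual_design"},
        {"programming_basics", "sound_design"},
    ],
}

def can_enroll(student_competencies, course):
    """True if the student holds at least one qualifying combination."""
    return any(combo <= student_competencies  # subset test
               for combo in COURSE_UNLOCKS[course])

print(can_enroll({"programming_basics", "sound_design"}, "Creative Coding II"))  # True
```

Representing each alternative as a set makes the check a simple subset test, which is what lets such a curriculum stay flexible as new competency paths are added.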


2009 Virtual Rehabilitation International Conference | 2009

Visual feedback for Mixed Reality stroke rehabilitation

Nicole Lehrer; Loren Olson

The Mixed Reality Rehabilitation project is a real-time multimedia system for upper extremity rehabilitation of stroke survivors through task-oriented physical therapy. This poster describes the visual feedback designed to inform subjects of the spatial aspects of their movement as a reaching task is performed.


ACM Multimedia | 2006

The design of a real-time, multimodal biofeedback system for stroke patient rehabilitation

Yinpeng Chen; He Huang; Weiwei Xu; Richard Isaac Wallis; Hari Sundaram; Thanassis Rikakis; Todd Ingalls; Loren Olson; Jiping He


International Conference of the IEEE Engineering in Medicine and Biology Society | 2005

Interactive Multimodal Biofeedback for Task-Oriented Neural Rehabilitation

He Huang; Todd Ingalls; Loren Olson; Kathleen J. Ganley; Thanassis Rikakis; Jiping He

Collaboration


Top co-authors of Loren Olson:

Todd Ingalls, Arizona State University
Jiping He, Arizona State University
David Tinapple, Arizona State University
Hari Sundaram, Arizona State University
Yinpeng Chen, Arizona State University
He Huang, North Carolina State University
John Sadauskas, Arizona State University
Weiwei Xu, Arizona State University
Aisling Kelliher, Carnegie Mellon University