
Publication


Featured research published by Victor A. Mateevitsi.


IEEE Virtual Reality Conference | 2014

Omegalib: A multi-view application framework for hybrid reality display environments

Alessandro Febretti; Arthur Nishimoto; Victor A. Mateevitsi; Luc Renambot; Andrew E. Johnson; Jason Leigh

In the domain of large-scale visualization instruments, hybrid reality environments (HREs) are a recent innovation that combines the best-in-class capabilities of immersive environments with those of ultra-high-resolution display walls. HREs create a seamless 2D/3D environment that supports both information-rich analysis and virtual-reality simulation exploration at a resolution matching human visual acuity. Co-located research groups in HREs tend to work on a variety of tasks during a research session (sometimes in parallel), and these tasks require 2D data views, 3D views, linking between them, and the ability to bring in (or hide) data quickly as needed. In this paper we present Omegalib, a software framework that facilitates application development on HREs. Omegalib is designed to support dynamic reconfigurability of the display environment, so that areas of the display can be interactively allocated to 2D or 3D workspaces as needed. Compared to existing frameworks and toolkits, Omegalib makes it possible to run multiple immersive applications on a cluster-controlled display system, dynamically route different input sources to applications, and optionally redirect rendering results to a distributed compositing manager. Omegalib supports pluggable front-ends to simplify the integration of third-party libraries like OpenGL, OpenSceneGraph, and the Visualization Toolkit (VTK). We present examples of applications developed with Omegalib for the 74-megapixel, 72-tile CAVE2™ system, and show how a hybrid reality environment proved effective in supporting the work of a co-located research group in the environmental sciences.
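The "dynamic reconfigurability" the abstract describes can be pictured as run-time allocation of display tiles to 2D or 3D workspaces. The sketch below is purely illustrative: the grid size, class names, and `allocate` API are assumptions, not Omegalib's actual interface.

```python
# Illustrative sketch (not Omegalib's API): tiles of a display wall are
# allocated at run time to 2D or 3D workspaces.
class DisplayWall:
    def __init__(self, cols, rows):
        # every tile starts unassigned
        self.tiles = {(c, r): None for c in range(cols) for r in range(rows)}

    def allocate(self, mode, region):
        """Assign a rectangular block of tiles to a '2d' or '3d' workspace."""
        c0, r0, c1, r1 = region
        for c in range(c0, c1 + 1):
            for r in range(r0, r1 + 1):
                self.tiles[(c, r)] = mode

    def count(self, mode):
        return sum(1 for m in self.tiles.values() if m == mode)

wall = DisplayWall(cols=6, rows=3)   # 18 tiles
wall.allocate("3d", (0, 0, 3, 2))    # left 4x3 block: immersive view
wall.allocate("2d", (4, 0, 5, 2))    # right 2x3 block: 2D analysis views
```

Re-running `allocate` with a different region models a group interactively re-carving the wall mid-session, which is the workflow the paper motivates.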


Future Generation Computer Systems | 2016

SAGE2: A collaboration portal for scalable resolution displays

Luc Renambot; Thomas Marrinan; Jillian Aurisano; Arthur Nishimoto; Victor A. Mateevitsi; Krishna Bharadwaj; Lance Long; Andrew E. Johnson; Maxine D. Brown; Jason Leigh

In this paper, we present SAGE2, a software framework that enables local and remote collaboration on Scalable Resolution Display Environments (SRDEs). An SRDE can be any configuration of displays, ranging from a single monitor to a wall of tiled flat-panel displays. SAGE2 creates a seamless ultra-high-resolution desktop across the SRDE. Users can wirelessly connect to the SRDE with their own devices in order to interact with the system. Many users can simultaneously use a drag-and-drop interface to transfer local documents and show them on the SRDE, use a mouse pointer and keyboard to interact with existing content on the SRDE, and share their screens so that they are viewable to all. SAGE2 can be used in many configurations and supports many communities working with various types of media and high-resolution content, from research meetings to creative sessions to education. SAGE2 is browser-based, utilizing a web server to host content, WebSockets for message passing, and HTML with JavaScript for rendering and interaction. Recent web developments, with the emergence of HTML5, have allowed browsers to use advanced rendering techniques without requiring plug-ins (canvas drawing, WebGL 3D rendering, native video playback, etc.). One major benefit of browser-based software is that there are no installation requirements for users and it is inherently cross-platform: a user simply needs a web browser on the device they wish to use as an interaction tool for the SRDE. This considerably lowers the barrier to entry for meaningful collaboration sessions.
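The message-passing pattern the abstract describes (a server holds the wall's state and relays every client's actions to all connected browsers) can be sketched in a few lines. Everything below is a hypothetical in-memory stand-in for illustration; the class and method names are not SAGE2's actual API, and a real deployment would move messages over WebSockets.

```python
import json

# Minimal sketch of a shared-display broadcast hub (hypothetical names,
# not SAGE2's API): one shared wall state, every message relayed to all.
class DisplayHub:
    def __init__(self):
        self.clients = {}   # client id -> list of received messages
        self.content = []   # items currently shown on the wall

    def connect(self, client_id):
        self.clients[client_id] = []
        # a newly connected browser first receives the current wall state
        self.clients[client_id].append(
            json.dumps({"type": "sync", "content": self.content}))

    def share(self, client_id, item):
        # e.g. a drag-and-dropped document; broadcast to everyone
        self.content.append(item)
        msg = json.dumps({"type": "add", "from": client_id, "item": item})
        for inbox in self.clients.values():
            inbox.append(msg)

hub = DisplayHub()
hub.connect("laptop")
hub.connect("tablet")
hub.share("laptop", "slides.pdf")
```

The design choice worth noting is that late joiners get a full state sync on connect, so every browser renders the same wall regardless of when it connected.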


Augmented Human International Conference | 2013

Sensing the environment through SpiderSense

Victor A. Mateevitsi; Brad Haggadone; Jason Leigh; Brian Kunzer; Robert V. Kenyon

Recent scientific advances allow the use of technology to expand the number of forms of energy that can be perceived by humans. Smart sensors can detect hazards that human senses are unable to perceive, for example radiation. This fusing of technology with human perception enables exciting new ways of perceiving the world around us. In this paper we describe the design of SpiderSense, a wearable device that projects the wearer's near environment onto the skin and allows for directional awareness of surrounding objects. The millions of sensory receptors that cover the skin present opportunities for conveying alerts and messages. We discuss the challenges and considerations of designing similar wearable devices.
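The core of such a device is a mapping from each range sensor's obstacle distance to a tactile intensity on the skin. The linear ramp and all constants below are assumptions for illustration, not the published SpiderSense design.

```python
# Illustrative distance-to-vibration mapping for a SpiderSense-style
# wearable (assumed ramp and thresholds, not the paper's design).
MAX_RANGE_CM = 200   # beyond this, no feedback
MIN_RANGE_CM = 20    # at or below this, full-strength feedback

def vibration_intensity(distance_cm):
    """Map obstacle distance to a 0.0-1.0 vibration strength."""
    if distance_cm >= MAX_RANGE_CM:
        return 0.0
    if distance_cm <= MIN_RANGE_CM:
        return 1.0
    # linear ramp: closer obstacle -> stronger vibration
    span = MAX_RANGE_CM - MIN_RANGE_CM
    return (MAX_RANGE_CM - distance_cm) / span

# one reading per body-mounted sensor direction
readings = {"front": 50, "left": 180, "right": 250, "back": 20}
feedback = {d: round(vibration_intensity(r), 2) for d, r in readings.items()}
```

Because each sensor drives its own actuator, directionality falls out for free: the wearer feels where the vibration is, not a symbolic code.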


IEEE International Conference on High Performance Computing, Data, and Analytics | 2012

Scalable Visual Queries for Data Exploration on Large, High-Resolution 3D Displays

Khairi Reda; Andrew E. Johnson; Victor A. Mateevitsi; Catherine Offord; Jason Leigh

As the scale and complexity of data continue to grow at unprecedented rates, scientists are increasingly relying on large, high-resolution displays to visualize and analyze scientific datasets. Recent studies have demonstrated the effectiveness of these displays in supporting cognitively demanding data analysis and sensemaking tasks. While there has been an abundance of research on rendering algorithms for large, high-resolution displays, far less effort has gone into designing interactive visual analytic interfaces to effectively leverage these displays in visual exploration and sensemaking scenarios involving large collections of data. In this paper, we present an interactive visual analytics application for the exploration of large trajectory datasets. Our application utilizes large, high-resolution 3D display environments to simultaneously visualize and juxtapose a large number of trajectories. It also integrates a scalable visual query technique, which can be used to quickly formulate and verify hypotheses, encouraging scientists to contemplate multiple competing theories before drawing conclusions. We evaluate our design within the context of a behavioral ecology case study. We also share our observations from a pilot user study to provide insights on how scientists might utilize large display environments in visual exploration and sensemaking scenarios.
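A toy version of the "visual query" idea: a scientist brushes a region of interest, and the system filters a large trajectory set down to those passing through it. The region shape, data, and function names are invented for illustration; the paper's actual technique is more sophisticated.

```python
# Illustrative trajectory filter for a brushed query region
# (invented data and API, not the paper's implementation).
def passes_through(trajectory, region):
    """True if any point of the trajectory falls inside an axis-aligned box."""
    xmin, ymin, xmax, ymax = region
    return any(xmin <= x <= xmax and ymin <= y <= ymax for x, y in trajectory)

trajectories = {
    "ant_1": [(0, 0), (1, 1), (2, 2)],
    "ant_2": [(5, 5), (6, 5), (7, 4)],
    "ant_3": [(0, 3), (1, 2), (2, 1)],
}
roi = (0.5, 0.5, 2.5, 2.5)   # brushed query box
matches = sorted(k for k, t in trajectories.items() if passes_through(t, roi))
```

Re-brushing the box and re-running the filter is the "quickly formulate and verify hypotheses" loop the abstract describes, scaled down to a few lines.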


International Symposium on Visual Computing | 2011

The OmegaDesk: towards a hybrid 2D and 3D work desk

Alessandro Febretti; Victor A. Mateevitsi; Dennis Chau; Arthur Nishimoto; Brad McGinnis; Jakub Misterka; Andrew E. Johnson; Jason Leigh

OmegaDesk is a device that allows for seamless interaction between 2D and 3D content. To develop this hybrid device, a new form of operating system is needed to manage and display heterogeneous content. In this paper we address the hardware and software requirements for such a system, as well as the challenges involved. A set of heterogeneous applications has been successfully developed on OmegaDesk. These allowed us to develop a set of guidelines to drive future investigations into 2D/3D hybridized viewing and interaction.


EURASIP Journal on Image and Video Processing | 2013

A human-computer collaborative workflow for the acquisition and analysis of terrestrial insect movement in behavioral field studies

Khairi Reda; Victor A. Mateevitsi; Catherine Offord

The study of insect behavior from video sequences poses many challenges. Despite the advances in image processing techniques, the current generation of insect tracking tools is only effective in controlled lab environments and under ideal lighting conditions. Very few tools are capable of tracking insects in outdoor environments where the insects normally operate. Furthermore, the majority of tools focus on the first stage of the analysis workflow, namely the acquisition of movement trajectories from video sequences. Far less effort has gone into developing specialized techniques to characterize insect movement patterns once acquired from videos. In this paper, we present a human-computer collaborative workflow for the acquisition and analysis of insect behavior from field-recorded videos. We employ a human-guided video processing method to identify and track insects from noisy videos with dynamic lighting conditions and unpredictable visual scenes, improving tracking precision by 20% to 44% compared to traditional automated methods. The workflow also incorporates a novel visualization tool for the large-scale exploratory analysis of insect trajectories. We also provide a number of quantitative methods for statistical hypothesis testing. Together, the various components of the workflow provide end-to-end quantitative and qualitative methods for the study of insect behavior from field-recorded videos. We demonstrate the effectiveness of the proposed workflow with a field study on the navigational strategies of Kenyan seed harvester ants.
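The human-in-the-loop tracking idea can be sketched as follows: an automatic tracker proposes positions frame by frame, and a human corrects frames where the track drifts or detection fails. All numbers, names, and the propagation rule below are invented for illustration and are not the paper's algorithm.

```python
# Toy sketch of human-guided tracking (invented API, not the paper's
# workflow): detections propagate forward; human corrections override.
def track(frames, corrections):
    """Return one position per frame, preferring human corrections,
    then automatic detections, then the last known position."""
    positions = []
    current = None
    for i, detected in enumerate(frames):
        if i in corrections:          # human fixes this frame
            current = corrections[i]
        elif detected is not None:    # automatic detection succeeded
            current = detected
        # else: detection failed; keep last known position
        positions.append(current)
    return positions

# frame 2: detection failed; frame 3: tracker jumped to a wrong spot
frames = [(0, 0), (1, 1), None, (9, 9), (4, 4)]
corrected = track(frames, corrections={3: (3, 3)})
```

The design point is that the human only touches the frames the tracker gets wrong, which is how a collaborative workflow can beat fully automatic tracking on noisy field video.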


Analytical Cellular Pathology | 2014

Scalable Adaptive Graphics Environment: A Novel Way to View and Manipulate Whole-Slide Images

Victor A. Mateevitsi; Bruce Levy

The Scalable Adaptive Graphics Environment (SAGE) was developed at the University of Illinois at Chicago's (UIC) Electronic Visualization Laboratory (EVL) to facilitate collaborative efforts that require the sharing of data-intensive information for analysis. SAGE is a cross-platform, community-driven, open-source visualization and collaboration tool that enables users to access, display, and share a variety of data-intensive information, in a variety of resolutions and formats, from multiple sources, on tiled display walls of arbitrary size. SAGE walls have had the ability to display digital-cinema animations, high-resolution images, high-definition videoconferences, presentation slides, documents, spreadsheets, and computer screens; however, there was no way to display and manipulate histologic whole-slide images (WSIs). Our desire was to create a tool to permit the importation, display, and manipulation of WSIs in the SAGE environment.


Collaborative Computing | 2014

SAGE2: A new approach for data intensive collaboration using Scalable Resolution Shared Displays

Thomas Marrinan; Jillian Aurisano; Arthur Nishimoto; Krishna Bharadwaj; Victor A. Mateevitsi; Luc Renambot; Lance Long; Andrew E. Johnson; Jason Leigh


Augmented Human International Conference | 2014

The health bar: a persuasive ambient display to improve the office worker's well being

Victor A. Mateevitsi; Khairi Reda; Jason Leigh; Andrew E. Johnson


Insectes Sociaux | 2013

Context-dependent navigation in a collectively foraging species of ant, Messor cephalotes

Catherine Offord; Khairi Reda; Victor A. Mateevitsi

Collaboration



Top Co-Authors

Arthur Nishimoto (University of Illinois at Chicago)
Jason Leigh (University of Hawaii at Manoa)
Alessandro Febretti (University of Illinois at Chicago)
Andrew E. Johnson (University of Illinois at Chicago)
Luc Renambot (University of Illinois at Chicago)
Khairi Reda (University of Illinois at Chicago)
Bruce Levy (University of Illinois at Chicago)
Jillian Aurisano (University of Illinois at Chicago)
Krishna Bharadwaj (University of Illinois at Chicago)