
Publication


Featured research published by Daisuke Ochi.


IEEE Virtual Reality Conference | 2015

Live streaming system for omnidirectional video

Daisuke Ochi; Yutaka Kunita; Akio Kameda; Akira Kojima; Shinnosuke Iwaki

NTT Media Intelligence Laboratories and DWANGO Co., Ltd. have jointly developed a virtual reality system that gives users an immersive experience of visiting a remote site. The system lets users watch video content wherever they want by using interactive streaming technology that selectively streams the user's viewing section at a high bitrate within a limited network bandwidth. Applying this technology to omnidirectional video allows users to experience a feeling of presence through an intuitive head-mounted display. The system has also been released on a commercial platform and has successfully streamed a real-time event. A demonstration is planned in which the details of the system and the streaming-service results obtained with it will be presented.
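The region-selective streaming idea above can be sketched as a simple bandwidth split over a tiled encoding. The tile names, the 4:1 quality ratio, and the budget below are illustrative assumptions, not the paper's actual parameters.

```python
# Sketch of viewport-dependent bitrate allocation for tiled
# omnidirectional video: tiles inside the user's viewing section
# get a higher bitrate than tiles outside it, while the total
# stays within the network bandwidth budget.

def allocate_bitrates(tiles, viewport, total_kbps, hi_ratio=4.0):
    """Split a bandwidth budget so that viewport tiles receive
    hi_ratio times the bitrate of non-viewport tiles."""
    inside = [t for t in tiles if t in viewport]
    outside = [t for t in tiles if t not in viewport]
    # Solve: hi_ratio * unit * len(inside) + unit * len(outside) = total_kbps
    unit = total_kbps / (hi_ratio * len(inside) + len(outside))
    rates = {t: hi_ratio * unit for t in inside}
    rates.update({t: unit for t in outside})
    return rates

tiles = [f"tile{i}" for i in range(8)]
viewport = {"tile0", "tile1"}          # tiles the user is looking at
rates = allocate_bitrates(tiles, viewport, total_kbps=4000)
```

As the user's head turns, the viewport set changes and the allocation is recomputed, so bits always concentrate where the user is looking.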


IEEE Global Conference on Consumer Electronics | 2012

Mobile and multi-device interactive panorama video distribution system

Hideaki Kimata; Daisuke Ochi; Akio Kameda; Hajime Noto; Katsuhiko Fukazawa; Akira Kojima

Conveying the reality of an event site over a broadband network is expected to be a key future video service. We propose a video distribution system that provides such a realistic user experience in mobile and home environments, building on recent trends in wireless broadband networks (LTE and Wi-Fi) and multi-touch wide-screen mobile terminals. In this system, the user can watch high-quality video of a desired area within a huge panorama video representing the whole scene, while interactively moving and zooming the viewing area under the limited bandwidth of a wireless network. In addition, multi-device video streaming to TV and mobile terminals is supported. The user can experience more sensational interactive viewing with higher-quality video and sound while keeping a stress-free multi-touch interface to control the viewing area.


ACM Multimedia | 2014

HMD Viewing Spherical Video Streaming System

Daisuke Ochi; Yutaka Kunita; Kensaku Fujii; Akira Kojima; Shinnosuke Iwaki; Junichi Hirose

We propose a video streaming system that lets users view their favorite sections of video recorded by a spherical (360-degree) camera through an HMD (head-mounted display). Although spherical video streaming tends to consume a great deal of bitrate because of its huge image area, our system keeps the bitrate reasonable by assigning a higher bitrate only to the user's viewing area, not to the area outside it. Technical demos of the system with an Oculus Rift HMD will be performed to demonstrate that it enables users to view images at a bitrate of about 2.5 Mbps.


International Conference on Computer Graphics and Interactive Techniques | 2017

GazeSphere: navigating 360-degree-video environments in VR using head rotation and eye gaze

Yun Suen Pai; Benjamin Tag; Megumi Isogai; Daisuke Ochi; Kai Kunze

Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface for transitioning between them. We propose GazeSphere, a navigation system that provides a seamless transition between locations in 360-degree-video environments through orbit-like motion, via head rotation and eye-gaze tracking. The significance of this approach is threefold: 1) it allows navigation and transition through spatially continuous 360-degree-video environments; 2) it leverages the human proprioceptive sense of rotation for locomotion that is intuitive and mitigates motion sickness; and 3) it uses eye tracking for a completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, as a game mechanic, and for virtual tours.
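The orbit-like motion can be illustrated with a toy mapping from head yaw to a camera position circling a point of interest; the function and its parameters are hypothetical illustrations, not the paper's implementation.

```python
import math

def orbit_position(center, radius, yaw_rad):
    """Map head yaw (radians) to a camera position on a circular
    orbit around a point of interest, so that turning the head
    moves the viewpoint along the orbit instead of translating it."""
    cx, cy = center
    return (cx + radius * math.cos(yaw_rad),
            cy + radius * math.sin(yaw_rad))

# Turning the head sweeps the camera around the target point:
pos = orbit_position(center=(0.0, 0.0), radius=2.0, yaw_rad=0.0)
```

Because the viewpoint stays at a fixed distance from the target, rotation of the head produces only orbital motion, which is the property the paper credits with reducing motion sickness.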


ACM Multimedia | 2015

Dive into Remote Events: Omnidirectional Video Streaming with Acoustic Immersion

Daisuke Ochi; Kenta Niwa; Akio Kameda; Yutaka Kunita; Akira Kojima

We propose a system that can provide the physical presence of remote events through a head-mounted display (HMD) and headphones. It can stream omnidirectional video at a high bitrate within a limited network bandwidth by not sending regions that users are not viewing. It can also reproduce binaural sound by convolving head-related transfer functions with angular-region-wise separated signals. Technical demos of the system using an Oculus Rift HMD with headphones will be performed to let users experience the visual and acoustic immersion it provides.
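The binaural reproduction step — convolving each angular region's separated signal with a head-related impulse response pair for that direction and summing — can be sketched as follows; the array shapes and names are assumptions, not the authors' code.

```python
import numpy as np

def binauralize(region_signals, hrirs):
    """Mix angular-region-wise separated signals into a binaural pair.

    region_signals: list of mono float arrays, one per angular region.
    hrirs: list of (left_ir, right_ir) impulse-response pairs, one per
           region's direction (time-domain counterparts of HRTFs).
    Returns a (2, n) array: left and right ear signals.
    """
    length = max(len(s) + len(h[0]) - 1
                 for s, h in zip(region_signals, hrirs))
    out = np.zeros((2, length))
    for sig, (hl, hr) in zip(region_signals, hrirs):
        out[0, :len(sig) + len(hl) - 1] += np.convolve(sig, hl)
        out[1, :len(sig) + len(hr) - 1] += np.convolve(sig, hr)
    return out

# One region, an impulse signal, and a toy HRIR pair:
ears = binauralize([np.array([1.0])],
                   [(np.array([0.5]), np.array([0.25]))])
```

In the real system the HRIR pair for each region would be selected according to the listener's head orientation, so the sound field rotates with the HMD.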


ACM Multimedia | 2012

A study on making camera trajectory from panorama watching manipulation

Daisuke Ochi; Hideaki Kimata; Hajime Noto; Akira Kojima

We propose a new interactive panorama video delivery system that enables users not only to select their favorite parts of an event site but also to watch them repeatedly with a stable, user-selected camera trajectory. In this paper, we present a system that builds a favorable and stable camera trajectory, as a scenario, from a user's manipulations of a tablet device, in a format that can be easily shared with others. We also evaluate the validity of the camera trajectories obtained with our stabilization technique.
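The abstract does not detail the stabilization technique; as a stand-in, a minimal moving-average smoother over a recorded pan trajectory illustrates the kind of processing involved in turning jittery touch manipulations into a stable camera path.

```python
def smooth_trajectory(points, window=5):
    """Moving-average smoothing of an (x, y) viewing-area trajectory
    recorded from touch manipulations. A simplified stand-in for the
    paper's stabilization technique, which is not described in the
    abstract; window size is an arbitrary choice."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

# A jittery pan left-right becomes a much flatter path:
traj = smooth_trajectory([(0, 0), (10, 0), (0, 0), (10, 0), (0, 0)])
```

The smoothed trajectory, serialized as a scenario, is what makes repeated playback stable and shareable.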


ACM Multimedia | 2012

Scenario-driven interactive panorama video delivery: promptly watch and share enjoyable parts of an event

Daisuke Ochi; Hideaki Kimata; Hajime Noto; Akira Kojima

We propose a scenario-driven interactive panorama video delivery system that allows users to repeatedly watch the enjoyable parts of an event and share them with others. It provides functions for interactively watching parts of a panorama video, for scenario making (with a user-selected camera trajectory), and for delivery that allows users to share their panorama-watching experiences. In this technical demo, we present a system that creates favorable and stable camera trajectories as scenarios from intuitive user manipulations of a tablet device, in a format that can be easily shared with others.


International Symposium on Wearable Computers | 2017

face2faceVR: using AR to assist VR in ubiquitous environment usage

Yun Suen Pai; Megumi Isogai; Daisuke Ochi; Hideaki Kimata; Kai Kunze

As virtual reality (VR) usage becomes more popular, one of the issues that still prevents VR from being used in a more ubiquitous manner, unlike augmented reality (AR), is spatial awareness. Generally, there are two forms of such awareness: recognizing the environment and recognizing other people around us. We propose face2faceVR, an easy-to-use implementation of AR tracking that assists VR in recognizing other nearby VR users. The contributions of this work are the following: 1) it is compatible with mobile VR technology, which already caters to wider adoption; 2) it does not require a networked or shared virtual environment; and 3) it is an inexpensive implementation without any additional peripherals or hardware.


International Conference on Computer Graphics and Interactive Techniques | 2017

Partial plane sweep volume for deep learning based view synthesis

Kouta Takeuchi; Kazuki Okami; Daisuke Ochi; Hideaki Kimata

We propose a partial plane sweep volume that serves as a more suitable input format for deep-learning-based view synthesis approaches. Our approach makes it possible to synthesize higher-quality images with fewer learning iterations while keeping the number of depth planes unchanged.
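A plane sweep volume stacks the source image warped to a series of candidate depth planes; a *partial* volume restricts each plane to a region of interest, shrinking the input tensor while keeping the number of depth planes. A toy sketch for fronto-parallel cameras, where the per-plane warp reduces to a horizontal disparity shift — the shift model and all names are simplifying assumptions, not the authors' formulation:

```python
import numpy as np

def partial_plane_sweep_volume(src_image, disparities, region):
    """Build a partial plane sweep volume from one source view.

    src_image: (H, W) grayscale image.
    disparities: one integer pixel shift per candidate depth plane
                 (fronto-parallel simplification of the homography warp).
    region: (y0, y1, x0, x1) crop defining the partial volume.
    Returns a (num_planes, y1-y0, x1-x0) array.
    """
    y0, y1, x0, x1 = region
    planes = []
    for d in disparities:
        shifted = np.roll(src_image, d, axis=1)  # warp to depth plane
        planes.append(shifted[y0:y1, x0:x1])     # keep only the region
    return np.stack(planes)

vol = partial_plane_sweep_volume(np.zeros((4, 6)), [0, 1, 2], (1, 3, 2, 5))
```

The depth-plane axis is preserved, so a network consuming the volume sees the same depth hypotheses over a smaller spatial extent.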


International Conference on Computer Graphics and Interactive Techniques | 2017

CleaVR: collaborative layout evaluation and assessment in virtual reality

Yun Suen Pai; Benjamin Tag; Megumi Isogai; Daisuke Ochi; Hideaki Kimata; Kai Kunze

Layout planning is a process often used in architectural interior design, factory production planning, and so on. We present CleaVR, a system that provides the user with an immersive virtual reality environment that accurately visualizes a layout plan in three dimensions, as shown in Figure 1(a). The user can freely orbit around the design to observe it from every angle for accurate evaluation and assessment. The implemented gesture-recognition system means that no physical buttons are required, allowing complete immersion with intuitive controls. These controls allow the user to pan around the environment, pinch to pick and place objects, and swipe the view to switch into first-person view, as shown in Figure 1(b). With our system, architects, engineers, designers, and even sports analysts may approach their target environment through a multi-user, multi-view tool with full control of the virtual environment purely by intuitive gesture controls.

Collaboration


Dive into Daisuke Ochi's collaborations.

Top Co-Authors

Akira Kojima

Nippon Telegraph and Telephone

Shiro Ozawa

Nippon Telegraph and Telephone