Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kaan Aksit is active.

Publication


Featured research published by Kaan Aksit.


IEEE/OSA Journal of Display Technology | 2012

Portable 3D Laser Projector Using Mixed Polarization Technique

Kaan Aksit; Osman Eldes; Selvan Viswanathan; Mark O. Freeman; Hakan Urey

This paper introduces a new twist on stereoscopic displays, one that has similarities to existing methods in that it utilizes both polarization and color to present different stereo 3D perspectives to each eye, but, by combining the use of polarization and color, it avoids weaknesses associated with previous methods. This new method is named Mixed Polarization 3D. Color imbalance artifacts associated with anaglyph 3D methods are avoided by alternating the colors presented to each eye. Flicker, associated with polarization-sequential 3D, or the need to increase the frame rate to at least 120 Hz to avoid this perceived flicker, is avoided in Mixed Polarization 3D by presenting both eyes with 3D information in every single frame. The method is particularly aimed at scanned laser projectors, where all three primary colors (R, G, B) are already polarized and simultaneously displayed. Like other polarization-based approaches, it requires a polarization-preserving screen and inexpensive passive polarization glasses. The 3D display needs just a single handheld mobile projector coupled with an active polarization rotator, so the image registration problems associated with two projectors are avoided.
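As a rough illustration of the idea, the sketch below schedules which primaries go to each polarization state on a per-frame basis; the exact color split and alternation pattern are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of a mixed-polarization frame schedule (assumption: the exact
# color split and alternation pattern are illustrative, not from the paper).
from itertools import cycle


def frame_schedule(num_frames):
    """Yield, per frame, which primaries are shown in each polarization state.

    State 'A' passes through the left-eye filter, state 'B' through the right.
    The split alternates every frame so that, over two frames, each eye has
    seen all three primaries (avoiding the color imbalance of anaglyph 3D),
    while every single frame still carries parallax for both eyes
    (avoiding the flicker of polarization-sequential 3D).
    """
    splits = cycle([({"R"}, {"G", "B"}), ({"G", "B"}, {"R"})])
    for frame, (state_a, state_b) in zip(range(num_frames), splits):
        yield frame, {"A": sorted(state_a), "B": sorted(state_b)}


if __name__ == "__main__":
    for frame, states in frame_schedule(4):
        print(frame, states)
```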


Ubiquitous Computing | 2015

Head-mounted mixed reality projection display for games production and entertainment

Daniel Kade; Kaan Aksit; Hakan Urey; Oğuzhan Özcan

This research presents a mixed reality (MR) application that is designed to be usable during a motion capture shoot and supports actors in performing their task. Through our application, we allow an actor to see and explore a digital environment without occluding their field of vision. A prototype was built by combining a retroreflective screen covering the surrounding walls with a headband consisting of a laser scanning projector and a smartphone. The built-in sensors of the smartphone provide navigation capabilities in the digital world. The presented system was demonstrated in an earlier publication. Here, we extend those research results with our advances and discuss the potential use of our prototype in gaming and entertainment applications. To explore this potential use case, we built a gaming application using our MR prototype and tested it with 45 participants. In these tests, we use head movements as rather unconventional game controls. According to the user tests and their feedback, our prototype shows potential for gaming applications as well. Our MR prototype is of special interest because it is lightweight, allows freedom of movement, and is a low-cost, stand-alone mobile system. Moreover, the prototype also allows for 3D vision by mounting additional hardware.


Conference on Advances in Computer Entertainment Technology | 2014

Head-worn mixed reality projection display application

Kaan Aksit; Daniel Kade; Oğuzhan Özcan; Hakan Urey

The main goal of this research is to develop a mixed reality (MR) application to support motion capture actors. This application allows seeing and exploring a digital environment without occluding the actor's visual field. A prototype is built by combining a retro-reflective screen covering the surrounding walls with a headband consisting of a laser scanning projector and a smartphone. The built-in sensors of the smartphone provide navigation capabilities in the digital world. The integrated system has some unique advantages, which are collectively demonstrated for the first time: (i) providing a fixed field of view (50° in diagonal), fixed retinal images at full resolution, and distortion-free images that are independent of the screen distance and shape; (ii) presenting different perspectives to the users as they move around or tilt their heads; (iii) allowing a focus-free and calibration-free display even on non-flat surfaces using laser scanning technology; (iv) enabling multiple users to share the same screen without crosstalk due to the use of retro-reflectors; and (v) producing high-brightness pictures with a projector of only 15 lm, thanks to a high-gain retro-reflective screen. We demonstrate a lightweight, comfortable-to-wear, and low-cost head-mounted projection display (HMPD) which acts as a stand-alone mobile system. Initial informal functionality tests have been performed successfully. The prototype can also be used as a 3D stereo system with the same hardware by additionally mounting polarized glasses and an active polarization rotator, while maintaining all of the advantages listed above.
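The abstract notes that the smartphone's built-in sensors provide navigation in the digital world. As a minimal sketch of that idea (assuming the phone exposes yaw/pitch/roll angles; the prototype's actual sensor-fusion pipeline is not described here), head orientation can be turned into a view rotation for the rendered environment:

```python
# Minimal sketch: map head orientation to a view rotation for rendering
# (assumption: the phone reports yaw/pitch/roll in radians; the prototype's
# actual sensor fusion is not described in the abstract).
import numpy as np


def view_rotation(yaw, pitch, roll):
    """Build a 3x3 rotation matrix (yaw about Y, pitch about X, roll about Z)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz


# Rotate the default view direction (-Z) by the current head pose before rendering.
forward = view_rotation(np.deg2rad(30), np.deg2rad(-5), 0.0) @ np.array([0.0, 0.0, -1.0])
print(forward)
```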


Optics Letters | 2014

Super stereoscopy technique for comfortable and realistic 3D displays

Kaan Aksit; Amir Hossein Ghanbari Niaki; Erdem Ulusoy; Hakan Urey

Two well-known problems of stereoscopic displays are the accommodation-convergence conflict and the lack of natural blur for defocused objects. We present a new technique, which we name Super Stereoscopy (SS3D), to provide a convenient solution to these problems. Regular stereoscopic glasses are replaced by SS3D glasses, which deliver at least two parallax images per eye through pinholes equipped with light-selective filters. The pinholes generate blur-free retinal images so as to enable correct accommodation, while the delivery of multiple parallax images per eye creates an approximate blur effect for defocused objects. Experiments performed with cameras and human viewers indicate that the technique works as desired. When two pinholes equipped with color filters are used per eye, the technique can be applied on a regular stereoscopic display simply by uploading new content, without requiring any change in display hardware, drivers, or frame rate. Apart from some tolerable loss in display brightness and a decrease in the natural spatial resolution limit of the eye due to the pinholes, the technique is quite promising for comfortable and realistic 3D vision, especially for displaying close objects that cannot be displayed and comfortably viewed on regular 3DTV and cinema screens.
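For the two-pinhole, color-filter variant, the content update amounts to packing two parallax views per eye into complementary color channels. The sketch below assumes one red-filtered and one cyan-filtered (G+B) pinhole per eye; the actual filter choice and view spacing used in the paper may differ.

```python
# Minimal sketch of composing SS3D content for one eye (assumption: one pinhole
# carries a red filter and the other a cyan (G+B) filter; the paper's actual
# filter assignment and parallax baseline may differ).
import numpy as np


def compose_ss3d_eye(view_a, view_b):
    """Merge two parallax views (H x W x 3, floats in [0, 1]) into one frame.

    The red channel is taken from view_a (seen through the red-filtered pinhole)
    and the green/blue channels from view_b (seen through the cyan-filtered one),
    so each eye receives two slightly different perspectives, approximating
    retinal blur for defocused objects.
    """
    out = np.empty_like(view_a)
    out[..., 0] = view_a[..., 0]    # R from the first parallax view
    out[..., 1:] = view_b[..., 1:]  # G and B from the second parallax view
    return out


left_eye_frame = compose_ss3d_eye(np.random.rand(4, 4, 3), np.random.rand(4, 4, 3))
print(left_eye_frame.shape)
```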


3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video | 2011

Light engine and optics for HELIUM3D auto-stereoscopic laser scanning display

Kaan Aksit; Selim Olcer; Erdem Erden; Vc Kishore; Hakan Urey; Eero Willman; Hadi Baghsiahi; Se Day; David R. Selviah; F. Anibal Fernandez; Phil Surman

This paper presents a laser-based auto-stereoscopic 3D display technique and a prototype utilizing a dual-projector light engine. The solution described is able to form dynamic exit pupils under the control of a multi-user head-tracker. A recently completed prototype provides a glasses-free solution for a single user at a fixed position. By the end of the prototyping phase, it is expected to support multiple users through integration of the pupil tracker and the spatial light modulator.


Optics Express | 2013

Dynamic exit pupil trackers for autostereoscopic displays

Kaan Aksit; Hadi Baghsiahi; Phil Surman; Selim Ölçer; Eero Willman; David R. Selviah; Se Day; Hakan Urey

This paper describes the first demonstrations of two dynamic exit pupil (DEP) tracker techniques for autostereoscopic displays. The first DEP tracker forms an exit pupil pair for a single viewer in a defined space with low intraocular crosstalk, using a pair of moving shutter glasses located within the optical system. A display prototype using the first DEP tracker is constructed from a pair of laser projectors, pupil-forming optics, moving shutter glasses at an intermediate pupil plane, an image relay lens, and a Gabor superlens based viewing screen. The left- and right-eye images are presented time-sequentially to a single viewer and seen as a 3D image without wearing glasses; the viewer can move within a region of 40 cm × 20 cm in the lateral plane and 30 cm along the axial axis. The second DEP optics can move the exit pupil location dynamically in a much larger 3D space by using a custom spatial light modulator (SLM) forming an array of shutters. Simultaneous control of multiple exit pupils in both the lateral and axial axes is demonstrated for the first time and provides a viewing volume with an axial extent of 0.6-3 m from the screen within a lateral viewing angle of ± 20° for multiple viewers. This system has acceptable crosstalk (< 5%) between the stereo image pairs. In this novel version of the display, the optical system is used as an advanced dynamic backlight for a liquid crystal display (LCD). This has advantages in terms of overall display size, as there is no requirement for an intermediate image, and in image quality.
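To make the second (shutter-array) approach concrete, the sketch below maps a tracked eye position to the shutter indices to open, assuming a simple thin-lens-style geometric mapping with hypothetical dimensions; the real pupil-forming optics described in the paper are considerably more involved.

```python
# Minimal sketch of steering an exit pupil with a shutter-array SLM
# (assumption: a simple linear geometric mapping from tracked eye position to
# shutter index, with hypothetical dimensions; the actual optics are more complex).
def shutters_to_open(eye_x_m, eye_z_m, num_shutters=64,
                     slm_width_m=0.1, focal_len_m=0.05, pupil_width_m=0.01):
    """Return the shutter indices to open for one tracked eye.

    eye_x_m : lateral eye position relative to the optical axis (metres)
    eye_z_m : axial distance from the screen (metres)
    The lateral angle to the eye is mapped to a position on the SLM plane and
    widened to cover the desired exit-pupil width at that viewing distance.
    """
    slm_x = -focal_len_m * eye_x_m / eye_z_m            # thin-lens-style mapping
    half = 0.5 * pupil_width_m * focal_len_m / eye_z_m  # half-width on the SLM
    pitch = slm_width_m / num_shutters
    lo = int((slm_x - half + slm_width_m / 2) / pitch)
    hi = int((slm_x + half + slm_width_m / 2) / pitch)
    return [i for i in range(lo, hi + 1) if 0 <= i < num_shutters]


# Two viewers at different lateral and axial positions each get their own exit pupil.
print(shutters_to_open(0.10, 1.0), shutters_to_open(-0.25, 2.0))
```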


Photonics | 2012

Novel 3D displays using micro-optics and MEMS

Hakan Urey; Kaan Aksit; Osman Eldes

A portable projector can be built using a MEMS raster scanner and laser light sources. Three recently developed technologies are reviewed: a mixed-polarization based stereoscopic display, a Gabor superlens based autostereoscopic display, and interaction using retroreflectors.


SID Symposium Digest of Technical Papers | 2011

48.4: Beam Forming for a Laser Based Auto-stereoscopic Multi-Viewer Display

Hadi Baghsiahi; David R. Selviah; Eero Willman; Anibal Fernández; Se Day; Kaan Aksit; Selim Olcer; Aref Mostafazadeh; Erdem Erden; Velichappattu C. Kishore; Hakan Urey; Phil Surman

Abstract: An auto-stereoscopic back-projection display using an RGB multi-emitter laser illumination source and micro-optics to provide a wider view is described. The laser optical properties and the speckle due to the optical system configuration and its diffusers are characterised.

Index Terms: 3D display, laser scanning, pupil tracking, auto-stereoscopic, laser display, laser speckle, light engine, backlight, liquid crystal, LCoS, microlens, diffuser.

1. Introduction: The goal of this project is to develop a multi-viewer, multi-user auto-stereoscopic display. The display is being developed within the European Union-funded HELIUM3D (High Efficiency Laser-based Multi-user Multi-modal 3D Display) project. The collaborators are University College London, De Montfort University, Koc University, Philips, Nanjing University, Barco, Fraunhofer HHI, and Technische Universiteit Eindhoven. The laser-based 3D display consists of three subsections: a multi-user head-tracker; projection optics and a transmission screen (called the transfer screen in this project); and a light engine, shown in Figure 1. The light engine produces time-multiplexed scanned views of the displayed image content. Three lasers (red, green, and blue) are employed as the light sources, and the laser beams are combined and shaped into a light line which is scanned across the surface of a Liquid Crystal on Silicon (LCoS) modulation device to form the image.


3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video | 2014

Super stereoscopy 3D glasses for more realistic 3D vision

Kaan Aksit; Amir Hossein Ghanbari Niaki; Osman Eldes; Hakan Urey

This paper introduces a major new twist on stereoscopic displays, in which users suffer less from the accommodation-vergence conflict thanks to improved monocular parallax. Our method provides two different views to each eye by using special apertures equipped with color filters. The design can be embedded into conventional stereoscopic glasses or special contact lenses. Subjective tests verified that the accommodation-vergence conflict is avoided to a large degree. The technique is also applicable to multi-view 3DTV displays in general.


Optics Express | 2013

Multi-view autostereoscopic projection display using rotating screen

Osman Eldes; Kaan Aksit; Hakan Urey

Collaboration


Dive into Kaan Aksit's collaborations.

Top Co-Authors

Se Day

University College London

Phil Surman

Nanyang Technological University

Eero Willman

University College London

Hadi Baghsiahi

University College London
