KOÇ UNIVERSITY

GRADUATE SCHOOL OF SCIENCES & ENGINEERING

ELECTRICAL AND ELECTRONICS ENGINEERING

PhD THESIS DEFENSE BY SHOAIB REHMAN SOOMRO

 

Title: Augmented Reality 3D Display and Light Field Imaging Systems Based on Passive Optical Surfaces

 

Speaker: Shoaib Rehman Soomro

 

Time: February 19, 2018, 10:00 AM

 

Place: ENG 208

Koç University

Rumeli Feneri Yolu

Sariyer, Istanbul

Thesis Committee Members:

Prof. Hakan Ürey (Advisor, Koç University)

Prof. Murat Tekalp (Koç University)

Asst. Prof. Sedat Nizamoğlu (Koç University)

Prof. Levent Onural (Bilkent University)

Asst. Prof. Onur Ferhanoğlu (Istanbul Technical University)

 

 

Abstract:

3D imaging and display techniques are widely explored for augmented reality (AR) applications to provide realistic content capture and visualization. Display technologies are quite advanced, but emerging AR headsets usually have small screens, limited transparency, and a narrow field of view. On the other hand, existing 3D imaging technologies are either limited to one or two perspective views or require large camera-array installations for multi-perspective capture. This dissertation presents new display and imaging techniques in which micro-structured passive optical surfaces are developed and then combined with mobile projectors and cameras to achieve novel display and imaging modalities. Three types of novel optical surfaces are developed: (i) the transparent AR screen, a transparent retroreflective surface developed for augmented reality and automotive head-up display (HUD) applications; (ii) the 3D imaging surface, a handheld lens-array sheet used for mobile light-field capture with a regular camera; and (iii) the integrated dual-purpose screen, which combines the first two surfaces to achieve 3D display and 3D imaging on the same surface simultaneously.

The transparent AR screen provides high optical gain within a small viewing window and enables low-lumen head-mounted laser projectors to produce bright content. High see-through performance is achieved by partially covering the surface with retroreflective microspheres. The screen transparency and optical gain are varied by changing the retroreflective fill factor, and three versions of the screen providing 50%, 75%, and 90% transparency are developed. The optical design and fabrication flow as well as the retroreflective and transmission characteristics of the screens are presented. The transparent retroreflective AR screen was first used with a pair of head-mounted laser pico-projectors for stereoscopic 3D display. We achieved good stereoscopic vision, with crosstalk between the two eyes as low as 1% and brightness up to 1,000 cd/m² using 30-lumen projectors. For automotive 3D HUD applications, we performed a user study on simulated optical collimation and virtual image perception at different screen distances. The study measured visual acuity under a see-through condition and a simulated-collimation condition, relating visual acuity to the HUD screen distance and the amount of accommodation-convergence (AC) conflict. We showed that visual acuity drops slightly (from 20/20 to 20/25) when the HUD screen is placed between the driver and the real scene. Under the simulated-collimation condition, an inverse relation between the amount of AC conflict and visual acuity is observed, with minimal effect on visual acuity and viewing comfort when the AC conflict is less than 0.85 diopters or when the screen distance is greater than 100 cm.
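
For context, one common way to quantify the AC conflict (the example distances below are illustrative and are not taken from the study) is the difference in dioptric demand between the accommodation distance, set by the physical screen, and the vergence distance, set by the rendered virtual image:

\[ \Delta_{AC} \;=\; \left|\, \frac{1}{d_{\text{screen}}} \;-\; \frac{1}{d_{\text{virtual}}} \,\right| \quad \text{diopters, with distances in meters.} \]

For example, a screen at 1 m and a virtual image rendered at 5 m give |1/1 − 1/5| = 0.8 diopters, just under the 0.85-diopter comfort limit reported above; moving the screen farther from the viewer, toward the virtual image distance, reduces the conflict further.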

The 3D imaging surface introduces a light-field imaging platform that turns a regular camera into a multi-perspective 3D capture system. The surface contains an array of lenses and tracking markers to capture 3D perspective views of the scene. The camera image provides an array of sub-images, each corresponding to a different perspective view. The location of the mobile surface is determined by tracking distinct markers attached to its corners. A resolution analysis is performed to understand the spatial-angular resolution tradeoff and to optimize the spatial resolution of each sub-image. We developed a computational imaging tool for real-time light-field reconstruction and showed that digital image refocusing and partial-occlusion problems can be handled using our surface.
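
As a minimal illustration of the reconstruction step (a sketch only, not the thesis software; the function names, the fixed tile-grid assumption, and the shift parameter are hypothetical), the sub-images can be cut out of a rectified camera frame and digitally refocused with the standard shift-and-add method:

```python
import numpy as np

def extract_subimages(image, rows, cols):
    """Split a rectified camera frame into a (rows x cols) grid of
    perspective sub-images, one per lens in the array."""
    h, w = image.shape[0] // rows, image.shape[1] // cols
    return np.array([[image[r*h:(r+1)*h, c*w:(c+1)*w] for c in range(cols)]
                     for r in range(rows)])

def shift_and_add_refocus(subimages, shift):
    """Shift-and-add light-field refocusing: translate each sub-image
    proportionally to its offset from the array center and average.
    'shift' (pixels per lens step) selects the synthetic focal plane."""
    rows, cols = subimages.shape[:2]
    rc, cc = (rows - 1) / 2.0, (cols - 1) / 2.0
    acc = np.zeros(subimages.shape[2:], dtype=np.float64)
    for r in range(rows):
        for c in range(cols):
            dy = int(round((r - rc) * shift))
            dx = int(round((c - cc) * shift))
            acc += np.roll(subimages[r, c], (dy, dx), axis=(0, 1))
    return acc / (rows * cols)
```

Sweeping the shift parameter moves the synthetic focal plane through the scene, and averaging many perspective views partially suppresses foreground occluders, which corresponds to the refocusing and partial-occlusion handling described above.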

Lastly, a novel integrated dual-purpose screen is introduced for simultaneous display and imaging using the reflections off the screen. The screen mainly consists of patterned retroreflective microspheres as the top layer and an array of curved mirrors as the bottom layer, for 3D display and multi-perspective imaging, respectively. Simultaneous display and imaging is achieved with an intermediate polarization-selective layer, using polarization multiplexing to separate the projected and the captured light. A telepresence demonstrator is built in which user-1 wears a head-mounted projector and camera in front of the integrated dual-purpose screen, while user-2 wears a VR headset and can watch user-1 in real time from arbitrary perspectives. The display and imaging characteristics of the developed prototype were also evaluated. While mobile 3D telepresence is not possible with existing head-mounted AR and VR displays, our proposed solution offers a unique alternative and can be the future of mobile telepresence.
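
As a toy illustration of the polarization-multiplexing idea (a sketch under simplifying assumptions, not the actual layer design in the thesis), the polarization-selective layer can be modeled as an ideal linear polarizer that transmits one linear state toward the imaging layer and rejects the orthogonal state carrying the projected display light:

```python
import numpy as np

def linear_polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its transmission
    axis at angle theta (radians) from the x axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c*c, c*s], [c*s, s*s]])

# Illustrative Jones vectors: projected (display) light polarized along x,
# imaging-path light polarized along y.
display_light = np.array([1.0, 0.0])
imaging_light = np.array([0.0, 1.0])

# The intermediate layer is modeled as a y-oriented polarizer: it transmits
# the imaging-path polarization toward the curved-mirror layer and blocks
# (in the real screen, redirects) the orthogonal display polarization.
layer = linear_polarizer(np.pi / 2)

print(np.linalg.norm(layer @ display_light) ** 2)  # ~0.0 -> display light rejected
print(np.linalg.norm(layer @ imaging_light) ** 2)  # ~1.0 -> imaging light transmitted
```

Because the two channels occupy orthogonal polarization states, the projected content and the captured scene light can share the same screen surface without mixing, which is the separation principle described above.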