The Engineering Reality of Virtual Reality 2020

Conference Keywords: Virtual and Augmented Reality Systems; Virtual Reality UI and UX; Emergent Augmented Reality Platforms; Virtual and Augmented Reality in Education, Learning, Gaming, Art

ERVR 2020 Call for Papers PDF

Wednesday January 29, 2020

KEYNOTE: Imaging Systems and Processing

Session Chairs: Kevin Matherson, Microsoft Corporation (United States) and Dietmar Wueller, Image Engineering GmbH & Co. KG (Germany)
8:50 – 9:30 AM
ISS Room

This session is jointly sponsored by: The Engineering Reality of Virtual Reality 2020, Imaging Sensors and Systems 2020, and Stereoscopic Displays and Applications XXXI.

Abstract: Medical imaging is used extensively worldwide to visualize the internal anatomy of the human body. Because medical imaging data is traditionally displayed on separate 2D screens, an intermediary or a well-trained clinician is needed to translate the locations of structures in the imaging data to their actual locations in the patient's body. Mixed reality can solve this problem by visualizing the internal anatomy in the most intuitive manner possible: projecting it directly onto the actual organs inside the patient. At the Incubator for Medical Mixed and Extended Reality (IMMERS) at Stanford, we are connecting clinicians and engineers to develop techniques for visualizing medical imaging data directly overlaid on the relevant anatomy inside the patient, making navigation and guidance both simpler and safer for the clinician. In this presentation I will talk about different projects we are pursuing at IMMERS and go into detail about a project on mixed reality neuronavigation for non-invasive brain stimulation treatment of depression. Transcranial magnetic stimulation is a non-invasive brain stimulation technique that is used increasingly to treat depression and a variety of neuropsychiatric diseases. To be effective, the clinician needs to stimulate specific brain networks accurately, which requires accurate stimulator positioning. At Stanford we have developed a method that allows the clinician to “look inside” the brain to see functional brain areas using a mixed reality device, and I will show how we are currently using this method to perform mixed reality-guided brain stimulation experiments.


ISS-189
Mixed reality guided neuronavigation for non-invasive brain stimulation treatment, Christoph Leuze, Stanford University (United States)

Christoph Leuze is a research scientist in the Incubator for Medical Mixed and Extended Reality at Stanford University, where he focuses on techniques for visualizing MRI data using virtual and augmented reality devices. He published BrainVR, a virtual reality tour through his brain, and works closely with clinicians on techniques to visualize and register medical imaging data to the real world using optical see-through augmented reality devices such as the Microsoft HoloLens and the Magic Leap One. Prior to joining Stanford, he worked on high-resolution brain MRI measurements at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, for which he was awarded the Otto Hahn Medal by the Max Planck Society for outstanding young researchers.


10:00 AM – 3:30 PM Industry Exhibition - Wednesday

10:10 – 10:30 AM Coffee Break

Augmented Reality in Built Environments

Session Chairs: Raja Bala, PARC (United States) and Matthew Shreve, Palo Alto Research Center (United States)
10:30 AM – 12:40 PM
IMAWM Room

This session is jointly sponsored by: The Engineering Reality of Virtual Reality 2020, and Imaging and Multimedia Analytics in a Web and Mobile World 2020.


10:30 IMAWM-220
Augmented reality assistants for enterprise, Matthew Shreve and Shiwali Mohan, Palo Alto Research Center (United States)

11:00 IMAWM-221
Extra FAT: A photorealistic dataset for 6D object pose estimation, Jianhang Chen1, Daniel Mas Montserrat1, Qian Lin2, Edward Delp1, and Jan Allebach1; 1Purdue University and 2HP Labs, HP Inc. (United States)

11:20 IMAWM-222
Space and media: Augmented reality in urban environments, Luisa Caldas, University of California, Berkeley (United States)

12:00 ERVR-223
Active shooter response training environment for a building evacuation in a collaborative virtual environment, Sharad Sharma and Sri Teja Bodempudi, Bowie State University (United States)

12:20 ERVR-224
Identifying anomalous behavior in a building using HoloLens for emergency response, Sharad Sharma and Sri Teja Bodempudi, Bowie State University (United States)



12:40 – 2:00 PM Lunch

2:00 – 3:10 PM PLENARY: VR/AR Future Technology

3:10 – 3:30 PM Coffee Break

Visualization Facilities

3:30 – 4:10 PM
SD&A Room

This session is jointly sponsored by: The Engineering Reality of Virtual Reality 2020, and Stereoscopic Displays and Applications XXXI.


3:30 SD&A-265
Immersive design engineering, Bjorn Sommer, Chang Lee, and Savina Toirrisi, Royal College of Art (United Kingdom)

3:50 SD&A-266
Using a random dot stereogram as a test image for 3D demonstrations, Andrew Woods, Wesley Lamont, and Joshua Hollick, Curtin University (Australia)



KEYNOTE: Visualization Facilities

4:10 – 5:10 PM
SD&A Room

This session is jointly sponsored by: The Engineering Reality of Virtual Reality 2020, and Stereoscopic Displays and Applications XXXI.

The keynote will be co-presented by Derek Van Tonder and Andy McCutcheon.

Abstract: With all the hype and excitement surrounding Virtual and Augmented Reality, many people forget that while powerful technology can change the way we work, the human factor seems to have been left out of the equation for many modern-day solutions. For example, most modern Virtual Reality HMDs completely isolate the user from their external environment, causing a wide variety of problems. "See-Through" technology is still in its infancy. In this submission we argue that the importance of the social factor outweighs the headlong rush towards better and more realistic graphics, particularly in the design, planning and related engineering disciplines. Large-scale design projects are never the work of a single person, but modern Virtual and Augmented Reality systems forcibly channel users into single-user simulations, with only very complex multi-user solutions slowly becoming available. In our presentation, we will present three different Holographic solutions to the problems of user isolation in Virtual Reality, and discuss the benefits and downsides of each new approach.


ERVR-295
Social holographics: Addressing the forgotten human factor, Derek Van Tonder and Andy McCutcheon, Euclideon Holographics (Australia)

Derek Van Tonder is a senior business development manager specializing in B2B product sales and project management with Euclideon Holographics in Brisbane, Australia. Van Tonder began his career in console game development in 2001 with the South African company I-Imagine. He was subsequently a senior developer with Pandemic Studios, a senior engine programmer with Tantalus Media and then Sega Studios in Australia, and a lecturer in game programming at Griffith University in Brisbane. In 2010, he founded Bayside Games to pursue development of an iOS game called "Robots Can't Jump," written from scratch in C++. In 2012 he joined Euclideon Pty Ltd, transitioning from leading software development to technical business development. In 2015 he joined Taylors, applying VR technology to urban development and managing an international team of developers to create applications using the most advanced VR/AR technologies available. Currently Van Tonder is involved with several projects, including a Safe Site Pty Ltd project developing a new immersive training software platform, and a CSIRO Data61 Robotics and Autonomous Systems Group project to produce a Windows port of the "Wildcat" robotics software framework. Wildcat, a software platform being developed by CSIRO's Data61 organization, functions as the "brains" of a range of different robotics platforms.

Andy McCutcheon is a former Special Forces commando who transitioned into commercial aviation as a pilot after leaving the military in 1990. He dovetailed his specialised skill set to become one of the world’s most recognisable celebrity bodyguards, working with some of the biggest names in music and film before moving to Australia in 2001. In 2007, he pioneered the first new alcohol beverage category in 50 years with his unique patented ‘Hard Iced Tea,’ which was subsequently sold in 2013. He is the author of two books and is currently the Global Sales Manager, Aerospace & Defence, for Brisbane-based Euclideon Holographics, named ‘Best Technology Company’ in 2019.


5:30 – 7:00 PM EI 2020 Symposium Interactive Posters Session

5:30 – 7:00 PM Meet the Future: A Showcase of Student and Young Professionals Research

Thursday January 30, 2020

Flourishing Virtual & Augmented Worlds

Session Chairs: Margaret Dolinsky, Indiana University (United States) and Ian McDowall, Intuitive Surgical / Fakespace Labs (United States)
8:45 – 10:10 AM
ERVR Room

8:45
Conference Welcome

8:50 ERVR-337
Using virtual reality for spinal cord injury rehabilitation, Marina Ciccarelli, Susan Morris, Michael Wiebrands, and Andrew Woods, Curtin University (Australia)

9:10 ERVR-338
Heads-up LiDAR imaging with sensor fusion, Yang Cai, CMU (United States)

9:30 ERVR-339
Enhancing lifeguard training through virtual reality, Lucas Wright1, Lara Chunko2, Kelsey Benjamin3, Emmanuelle Hernandez-Morales4, Jack Miller5, Melynda Hoover5, and Eliot Winer5; 1Hamilton College, 2University of Colorado, 3Prairie View A&M University, 4University of Puerto Rico, and 5Iowa State University (United States)

9:50 ERVR-340
Transparent type virtual image display using small mirror array, Akane Temochi and Tomohiro Yendo, Nagaoka University of Technology (Japan)



10:10 – 10:50 AM Coffee Break

Experiencing Virtual Reality

Session Chairs: Margaret Dolinsky, Indiana University (United States) and Ian McDowall, Intuitive Surgical / Fakespace Labs (United States)
10:50 AM – 12:30 PM
ERVR Room

10:50 ERVR-360
Designing a VR arena: Integrating virtual environments and physical spaces for social, sensorial data-driven virtual experiences, Ruth West1, Eitan Mendelowitz2, Zach Thomas1, Christopher Poovey1, and Luke Hillard1; 1University of North Texas and 2Mount Holyoke College (United States)

11:10 ERVR-361
Leaving the windows open: Indeterminate situations through composite 360-degree photography, Peter Williams1 and Sala Wong2; 1California State University, Sacramento and 2Indiana State University (United States)

11:30 ERVR-362
User experience evaluation in virtual reality based on subjective feelings and physiological signals (JIST-first), YunFang Niu1, Danli Wang1, ZiWei Wang1, Fang Sun2, Kang Yue1, and Nan Zheng1; 1Institute of Automation, Chinese Academy of Sciences and 2Liaoning Normal University (China)

11:50 ERVR-363
Interactive multi-user 3D visual analytics in augmented reality, Wanze Xie1, Yining Liang1, Janet Johnson1, Andrea Mower2, Samuel Burns2, Colleen Chelini2, Paul D'Alessandro2, Nadir Weibel1, and Jürgen Schulze1; 1University of California, San Diego and 2PwC (United States)

12:10 ERVR-364
CalAR: A C++ engine for augmented reality applications on Android mobile devices, Menghe Zhang, Karen Lucknavalai, Weichen Liu, and Jürgen Schulze, University of California, San Diego (United States)



12:30 – 2:00 PM Lunch

Developing Virtual Reality

Session Chairs: Margaret Dolinsky, Indiana University (United States) and Ian McDowall, Intuitive Surgical / Fakespace Labs (United States)
2:00 – 3:00 PM
ERVR Room

2:00 ERVR-380
Development and evaluation of immersive educational system to improve driver’s risk prediction ability in traffic accident situation, Hiroto Suto1, Xingguo Zhang2, Xun Shen2, Pongsathorn Raksincharoensak2, and Norimichi Tsumura1; 1Chiba University and 2Tokyo University of Agriculture and Technology (Japan)

2:20 ERVR-381
WARHOL: Wearable holographic object labeler, Matthew Shreve, Bob Price, Les Nelson, Raja Bala, Jin Sun, and Srichiran Kumar, Palo Alto Research Center (United States)

2:40 ERVR-382
RaViS: Real-time accelerated view synthesizer for immersive video 6DoF VR, Daniele Bonatto, Sarah Fachada, and Gauthier Lafruit, Université Libre de Bruxelles (Belgium)




Important Dates
Call for Papers Announced 1 April 2019
Journal-first Submissions Due 15 Jul 2019
Abstract Submission Site Opens 1 May 2019
Review Abstracts Due (refer to For Authors page):
· Early Decision Ends 15 Jul 2019
· Regular Submission Ends 30 Sept 2019
· Extended Submission Ends 14 Oct 2019
Final Manuscript Deadlines
· Manuscripts for Fast Track 25 Nov 2019
· All Manuscripts 10 Feb 2020
Registration Opens 5 Nov 2019
Early Registration Ends 7 Jan 2020
Hotel Reservation Deadline 6 Jan 2020
Conference Begins 26 Jan 2020


 
View 2019 Proceedings
View 2018 Proceedings
View 2017 Proceedings
View 2016 Proceedings

Conference Chairs
Margaret Dolinsky, Indiana University (United States) and Ian E. McDowall, Fakespace Labs, Inc. (United States) 

Program Committee
Dirk Reiners, University of Arkansas at Little Rock (United States); Jürgen Schulze, University of California, San Diego (United States); Andrew Woods, Curtin University (Australia)