|Laura Waller, Head of UC Berkeley's Computational Imaging Lab
Monday, January 30, 2017 at 2:00 PM
Berkeley's Computational Imaging Lab develops new methods for optical imaging, with a specific focus on measuring and controlling wave effects in microscopes and cameras.
In her presentation, Giga-scale 3D computational microscopy, Waller describes computational imaging methods for fast capture of gigapixel-scale 3D intensity and phase images in a commercial microscope. The experimental setups employ illumination-side and detection-side coding of angle (Fourier) space with simple hardware and fast acquisition. The result is high-resolution reconstructions across a large field-of-view, achieving a high space-bandwidth-time product. Experimentally, the system achieves real-time 3D and phase imaging with digital aberration correction and mitigation of scattering effects, using sparsity-constrained nonlinear optimization methods.
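To give a flavor of the sparsity-constrained optimization mentioned above, here is a minimal sketch of iterative shrinkage-thresholding (ISTA), a standard proximal-gradient solver for sparsity-regularized least squares. This is an illustrative toy on a linear model, not Waller's actual reconstruction pipeline; the matrix `A`, signal sizes, and regularization weight are all made up for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm: shrinks values toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=3000):
    """Minimize ||Ax - y||^2 + lam * ||x||_1 by proximal gradient descent."""
    step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)  # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2 * A.T @ (A @ x - y)              # gradient of the data term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy example: recover a 3-sparse signal from underdetermined measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 81]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = ista(A, y)
```

The sparsity prior is what lets 40 measurements determine 100 unknowns; in computational microscopy the same idea allows many image parameters to be recovered from comparatively few coded captures.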
Laura Waller is the Ted Van Duzer Endowed Assistant Professor of Electrical Engineering and Computer Sciences (EECS) at UC Berkeley. She is a Senior Fellow at the Berkeley Institute of Data Science, with affiliations in Bioengineering and Applied Sciences & Technology. Previously, she was a Postdoctoral Researcher and Lecturer of Physics at Princeton University from 2010-2012 and received B.S., M.Eng. and Ph.D. degrees in EECS from the Massachusetts Institute of Technology (MIT) in 2004, 2005 and 2010, respectively. She is the recipient of the Moore Foundation Data-Driven Investigator Award, Bakar Fellowship, Carol D. Soc Distinguished Graduate Mentoring Award, NSF CAREER Award and Packard Fellowship for Science and Engineering.
|Gordon Wetzstein, Leader of the Stanford Computational Imaging Group
Tuesday, January 31, 2017 at 2:00 PM
The Stanford Computational Imaging Group is an interdisciplinary group focused on advanced imaging, microscopy, and display systems. In his talk, VR 2.0: Making Virtual Reality better than Reality, Wetzstein discusses virtual reality, the new medium that provides unprecedented user experiences. Eventually, VR/AR systems will redefine communication, entertainment, education, collaborative work, simulation, training, telesurgery, and basic vision research. In all of these applications, the primary interface between the user and the digital world is the near-eye display. While today’s VR systems struggle to provide natural and comfortable viewing experiences, next-generation computational near-eye displays have the potential to provide visual experiences that are better than the real world. In this talk, Wetzstein explores the frontiers of VR systems engineering.
Gordon Wetzstein is an Assistant Professor of Electrical Engineering and, by courtesy, of Computer Science at Stanford University. He is the leader of the Stanford Computational Imaging Group, an interdisciplinary research group focused on advancing imaging, microscopy, and display systems. At the intersection of computer graphics, machine vision, optics, scientific computing, and perception, Prof. Wetzstein's research has a wide range of applications in next-generation consumer electronics, scientific imaging, human-computer interaction, remote sensing, and many other areas. Prior to joining Stanford in 2014, Prof. Wetzstein was a Research Scientist in the Camera Culture Group at the MIT Media Lab. He received a Ph.D. in Computer Science from the University of British Columbia in 2011 and graduated with Honors from Bauhaus-Universität Weimar, Germany before that. His doctoral dissertation focused on computational light modulation for image acquisition and display and won the Alain Fournier Ph.D. Dissertation Annual Award. He organized the IEEE 2012 and 2013 International Workshops on Computational Cameras and Displays, founded displayblocks.org as a forum for sharing computational display design instructions with the DIY community, and presented a number of courses on Computational Displays and Computational Photography at ACM SIGGRAPH. Gordon is the recipient of an NSF CAREER Award, and he won best paper awards at the International Conference on Computational Photography (ICCP) in 2011 and 2014, as well as a Laval Virtual Award in 2005.
|Brian Cabral, Director of Engineering at Facebook
Wednesday, February 1, 2017 at 2:00 PM
Cabral and his team develop Facebook Surround 360, an open, high-quality 3D-360 video capture system. Cabral's plenary topic is titled, Designing VR video camera systems. Unlike traditional digital video camera systems, which are fairly linear and composed of a single streaming optical and digital pipeline, VR video capture systems are composed of multiple, possibly homogeneous, optical and digital components, all of which must operate as if they were one seamless optical system. The design of VR video cameras requires a whole new set of technologies and engineering approaches. The arrangement of cameras, optical choices, and SNR, all of which play important roles in any camera design, become far more complex for a VR camera and require tight coupling to the computational system components.
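One small piece of that coupling between camera arrangement and computation can be sketched concretely: stitching a ring of cameras into an equirectangular panorama requires deciding, per output column, which neighboring cameras contribute and with what blend weight. The sketch below assumes a simplified model (identical cameras spaced evenly around a horizontal ring, linear blending by yaw) and is not the actual Surround 360 pipeline.

```python
import numpy as np

def blend_weights(width, n_cams):
    """For each column of an equirectangular panorama, find the two
    neighboring ring cameras and linear blend weights between them.
    Assumes n_cams identical cameras evenly spaced around a horizontal ring."""
    yaw = (np.arange(width) + 0.5) / width * 2 * np.pi  # pixel yaw in [0, 2*pi)
    spacing = 2 * np.pi / n_cams                        # angle between cameras
    pos = yaw / spacing                                 # fractional camera index
    left = np.floor(pos).astype(int) % n_cams
    right = (left + 1) % n_cams
    w_right = pos - np.floor(pos)   # 0 at the left camera, 1 at the right one
    return left, right, 1.0 - w_right, w_right

# Example: an 8-column panorama stitched from a 4-camera ring.
left, right, wl, wr = blend_weights(8, 4)
```

A real system additionally handles per-camera lens distortion, parallax between viewpoints, and stereo disparity, which is why the optical layout and the reconstruction software have to be designed together.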
Brian Cabral is Director of Engineering at Facebook specializing in computational photography, computer vision, and computer graphics. He is the holder of numerous patents (filed and issued) and leads the Surround 360 VR camera team. He has published a number of diverse papers in the area of computer graphics and imaging, including the pioneering Line Integral Convolution algorithm. Brian's interests include computational photography, computer graphics and image processing hardware and software, numerical computation, differential geometry, hardware and software architecture, computational geometry, and statistical learning.
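The Line Integral Convolution algorithm mentioned above visualizes a vector field by smearing a noise texture along the field's streamlines. The following is a minimal, unoptimized sketch of the core idea (fixed-length unit steps, nearest-pixel sampling, wrap-around boundaries); the published algorithm uses more careful streamline integration and filtering.

```python
import numpy as np

def lic(vx, vy, noise, length=8):
    """Minimal Line Integral Convolution: each output pixel averages the
    noise texture along the local streamline of the field (vx, vy)."""
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for y0 in range(h):
        for x0 in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):          # trace forward and backward
                x, y = float(x0), float(y0)
                for _ in range(length):
                    i, j = int(round(y)) % h, int(round(x)) % w
                    total += noise[i, j]
                    count += 1
                    dx, dy = vx[i, j], vy[i, j]
                    n = np.hypot(dx, dy)
                    if n < 1e-9:              # stop at a critical point
                        break
                    x += sign * dx / n        # unit step along the field
                    y += sign * dy / n
            out[y0, x0] = total / count
    return out

# Example: smear random noise along a circular (vortex) vector field.
rng = np.random.default_rng(1)
ys, xs = np.mgrid[0:32, 0:32]
vx, vy = -(ys - 15.5), xs - 15.5
noise = rng.random((32, 32))
img = lic(vx, vy, noise)
```

Because each output pixel is an average along a streamline, the result shows coherent streaks following the flow while variation across streamlines is preserved.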