IMPORTANT DATES
Dates currently being confirmed; check back.

2022
Call for Papers Announced 2 May

Journal-first (JIST/JPI) Submissions
∙ Submission Site Opens 2 May
∙ Journal-first (JIST/JPI) Submissions Due 1 Aug
∙ Final Journal-first Manuscripts Due 28 Oct

Conference Paper Submissions
∙ Abstract Submission Opens 1 June
∙ Priority Decision Submission Ends 15 July
∙ Extended Submission Ends 19 Sept
∙ FastTrack Conference Proceedings Manuscripts Due 25 Dec
∙ All Outstanding Proceedings Manuscripts Due 6 Feb 2023

Registration Opens 1 Dec
Demonstration Applications Due 19 Dec
Early Registration Ends 18 Dec

2023
Hotel Reservation Deadline 6 Jan
Symposium Begins 15 Jan


Electronic Imaging 2023

SC10: 3D Imaging Systems Hardware and its Calibration
Instructor: Kevin J. Matherson, Microsoft Corporation
Level: Intermediate (was Advanced in 2022)
Duration: 4 hours
Course Date/Time: Sunday 15 January 13:30 - 17:45 (Eastern Standard Time)
Prerequisites: Basic understanding of linear algebra.

Benefits:
This course enables the attendee to:

  • Describe fundamental principles of depth technology and 3D imaging.
  • Understand the trade-offs among currently available depth system technologies and determine which will best match a particular application.
  • Understand the key components of various depth technologies: optics, illuminators, sensors.
  • Explain concepts and design considerations for depth cameras: stereo, active stereo, structured light, and time of flight.
  • Describe passive and active depth camera calibration.
  • Compare time-of-flight imaging to triangulation-based approaches.
  • Understand methods of benchmarking depth cameras.
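As background for the calibration topics above, camera calibration estimates the parameters of the pinhole projection model. The sketch below is illustrative only and is not course material; the focal-length and principal-point values are hypothetical.

```python
# Illustrative sketch (not course material): the pinhole model whose
# parameters intrinsic calibration estimates. fx, fy are focal lengths
# in pixels; (cx, cy) is the principal point. All values are hypothetical.

def project(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates to pixel coordinates."""
    X, Y, Z = point_cam
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v

# A point 2 m in front of the camera and 0.2 m to the right lands
# 80 px right of the principal point for an 800 px focal length:
print(project((0.2, 0.0, 2.0), fx=800.0, fy=800.0, cx=640.0, cy=360.0))
# (720.0, 360.0)
```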

Course Description:
Camera modules are now commonplace, integrated in devices ranging from mobile phones to automobiles. CMOS image sensor technology and advances in image processing technology, as well as advances in packaging and interconnect technology, have created the ability to use cameras in applications that were unheard of just a few years ago. Emerging applications of cameras include the Internet of Things (IoT), biometrics, augmented reality, machine vision, medicine, and 3D imaging. There are a variety of different methods for three-dimensional imaging, each with its own strengths and weaknesses. This course provides an overview of the main methods for depth imaging: stereo, active stereo, structured light, and time of flight. Following a review of 2D camera fundamentals, the course compares the various architectures of depth cameras. Next, the course covers camera calibration, a crucial step in machine vision that is required for measurements of the environment. Finally, the course provides an overview of depth camera benchmarking, which is more complex than testing 2D cameras.
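The triangulation-based and time-of-flight approaches mentioned above each reduce to a simple core relation. The sketch below is illustrative only (not course material); the focal length, baseline, disparity, and timing values are made-up examples.

```python
# Illustrative sketch (not course material): the core depth relations
# behind rectified stereo triangulation and direct time of flight.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulation for a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def tof_depth(round_trip_s: float, c: float = 299_792_458.0) -> float:
    """Direct time of flight: Z = c * t / 2 (light travels out and back)."""
    return c * round_trip_s / 2.0

# Hypothetical values: 700 px focal length, 10 cm baseline, 35 px
# disparity gives 2 m; a 20 ns round trip gives roughly 3 m.
print(stereo_depth(700.0, 0.10, 35.0))  # 2.0
print(tof_depth(20e-9))                 # 2.99792458
```

Note how stereo depth error grows with range (disparity shrinks as 1/Z), while ToF precision depends on timing resolution rather than baseline, one of the trade-offs the course compares.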

Intended Audience:
Engineers, researchers, managers, students, and those who want to understand 3D imaging systems and their characterization. The course assumes a basic knowledge of linear algebra. No prior optics or image science knowledge is assumed.

Kevin Matherson is a director of optical engineering at Microsoft Corporation working on advanced optical technologies for AR/VR, machine vision, and consumer products. Prior to Microsoft, he participated in the design and development of compact cameras at HP and has more than 15 years of experience developing miniature cameras for consumer products. His primary research interests focus on sensor characterization, optical system design and analysis, and the optimization of camera image quality. Matherson holds a master's degree and a PhD in optical sciences from the University of Arizona.

 

 

Course Fees:

              Until 25 December   Starting 26 December
Member             $305                 $355
Non-member         $330                 $380
Student            $95                  $120

 

Discounts are given for multiple classes. See the Registration page for details and to register.
