Advanced Driver Assistance Systems (ADAS) and autonomous vehicle technologies have disrupted the traditional automotive industry, with the potential to increase safety and reduce the cost of car ownership. Among the challenges are those of sensing the environment in and around the vehicle. Infrared camera sensing is seeing rapid growth and adoption in the industry, and its applications and illumination architecture options continue to evolve. This course provides the foundation on which to build near-infrared camera technologies for automotive applications.
The seminar will begin with a review of infrared basics – the electromagnetic spectrum, spectral irradiance, night vision and eye safety. The content includes in-depth calculations for infrared camera illumination and eye safety, with a focus on interior driver monitoring and exterior machine vision. Participants will receive insights into rolling- and global-shutter imagers, wavelength selection, use of secondary optics, continuous vs. pulsed IRED operation, thermal design, power consumption, eye safety certification and HMI considerations. Also included are a brief review of iris recognition, face recognition and cabin monitoring, and a discussion of trends and challenges facing optical sensing in autonomous vehicles.
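As a flavor of the illumination sizing the seminar works through, the sketch below applies the inverse-square law to estimate how many IREDs a driver-monitoring camera might need. All numbers, names, and the simplified on-axis model are illustrative assumptions for this write-up, not course material.

```python
import math


def irradiance_at_target(radiant_intensity_w_sr: float, distance_m: float) -> float:
    """On-axis irradiance (W/m^2) from a point-like emitter, via the inverse-square law."""
    return radiant_intensity_w_sr / distance_m ** 2


def ireds_required(required_irradiance_w_m2: float,
                   per_ired_intensity_w_sr: float,
                   distance_m: float) -> int:
    """Smallest number of IREDs whose combined on-axis irradiance meets the requirement."""
    per_ired = irradiance_at_target(per_ired_intensity_w_sr, distance_m)
    return math.ceil(required_irradiance_w_m2 / per_ired)


# Illustrative driver-monitoring numbers (assumed, not from the course):
# the camera needs ~2 W/m^2 at the face, each IRED emits 0.9 W/sr,
# and the driver's face sits about 0.8 m from the emitter.
n = ireds_required(2.0, 0.9, 0.8)
print(n)  # -> 2
```

A real design would also fold in beam pattern, secondary optics, duty cycle for pulsed operation, and the eye-safety limits (AEL/MPE) that cap how much power may reach the eye – all of which the seminar covers in depth.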
Learning Objectives
By participating in this course, you will be able to:
- Give examples of market forces, regulation and technology in the ecosystem
- Explain the electromagnetic spectrum, spectral irradiance, night vision and eye safety
- Specify infrared camera requirements with an understanding of the key variables
- Apply driver monitoring calculations for illumination, eye safety and power consumption
- Perform exterior IR camera calculations for illumination, eye safety and power consumption
- Define iris recognition, face recognition, cabin monitoring and gesture recognition
- Identify challenges and opportunities for sensing trends in ADAS and AV
Who Should Attend
Mechanical, electrical, lead, and application engineers, along with heads of innovation and BOM family owners, will benefit from this course. Those involved in driver monitoring, exterior IR cameras for machine vision, and other IR camera applications will also gain valuable sensing insights.
Prerequisites
An undergraduate engineering degree or a strong technical background is highly recommended. Basic knowledge of college algebra and physics, and an awareness of infrared camera applications in ADAS and autonomous vehicles, will be beneficial.
You must complete all course contact hours and successfully pass the learning assessment to obtain CEUs.
Course Outline
- ADAS and Autonomous Sensing
- Market forces shaping industry
- Regulation landscape
- From guidelines to regulation – typical path
- ADAS – NCAP & IIHS rating
- NHTSA Autonomous Driving Guideline
- NHTSA Autonomous Levels
- Human vs. Autonomous Driver
- LIDAR, Camera, RADAR – overview
- Sensor Fusion – need for holistic view
- The importance of HMI for Level 3
- Camera and Image Processing Basics
- Image Sensor Array
- Aperture size
- Thin lens optics
- FOV, range, image size, resolution
- Frame rate, shutter speed and image accuracy
- Rolling and global shutter camera
- NIR sensitivity of CMOS cameras
- Infrared Camera Topics
- IR illumination sources and efficiency
- IR illumination calculations (number of infrared LEDs)
- Eye Safety for LEDs
- IEC and ANSI standards
- Example of limit calculation (AEL, MPE)
- Certification
- Hazard zone intrusion – Design Strategies
- Driver Monitoring with NIR
- System requirements
- Wavelength selection
- Image sensor selection
- Use of secondary optics
- Continuous vs. pulsed IRED operation
- Thermal design
- Power consumption
- Wavelength shift
- Eye safety – certification
- Human Machine Interface considerations
- Exterior camera with NIR
- Need for application – sensor fusion gaps
- System requirements
- Power consumption challenges
- Illumination concepts
- Integration into vehicle headlamps and tail lamps
- Use of simulation software
- Other infrared camera applications and trends
- Iris recognition
- Face recognition
- Cabin monitoring
- Mood lighting
- Gesture recognition
- Autonomous Vehicles and Optical Sensors
- Trends and challenges
- Artificial intelligence
- New sensing technologies
- Working with startups, Tier 1 and Tier 2 suppliers, OEMs and other ecosystem partners
Instructors
Rajeev Thakur
Rajeev Thakur is currently Director of Automotive Programs at Velodyne Lidar, responsible for building Velodyne's LIDAR business in the automotive market. In this role he supports OEM customers in selecting, designing in, and launching Velodyne's wide LIDAR portfolio for autonomous vehicles and ADAS functions. Prior to this, he was at OSRAM Opto Semiconductors as Regional Marketing Manager for infrared product management and business development in the NAFTA automotive market, with a focus on LIDAR, driver monitoring, night vision, blind spot detection and other ADAS applications. He has been in the Detroit automotive industry since 1990, working for companies such as Bosch, Johnson Controls and Chrysler, and has concept-to-launch experience in occupant sensing, seating and powertrain sensors. He holds a master's degree in Manufacturing Engineering from the University of Massachusetts, Amherst, and a bachelor's degree in Mechanical Engineering from Guindy Engineering College in Chennai, India. He is a licensed professional engineer and holds a number of patents on occupant sensing. He is also a member of the SAE Active Safety Standards development committee and a reviewer for IEEE Transactions on Intelligent Transportation Systems.