Robotics for Autonomous Vehicle Systems Bootcamp C2012

The Robotics for AV Systems Bootcamp was developed by SAE International and Clemson University, with industry guidance from Argo AI. This rigorous, twelve-week, virtual-only experience is conducted by leading experts from industry and academia. You’ll develop a deep technical understanding of how autonomous systems are built by learning to program a mobile robot hands-on with ROS, Gazebo, and Python. Following an introduction to Mechatronics and Kinematics, group assignments will be established and the hardware will be shipped to a location of your choice, allowing you to put these principles into practice in the comfort of your home or office. You’ll learn and apply core autonomy concepts and techniques such as SLAM; Localization; Navigation/Path Planning; Perception: LiDAR, Cameras, Vision and Visual Intelligence; and Machine Learning.

The Robotics for AV Systems Bootcamp consists of asynchronous videos you’ll work through at your own pace throughout each week, followed by a live-online synchronous experience each Friday. The videos are led by Dr. Venkat Krovi, Michelin Endowed SmartState Chair Professor of Vehicle Automation at Clemson University. The live sessions are taught by Jeff Blackburn, who joins us from Ansys and brings an extensive background in software engineering. Given the rigorous nature of this program, optional office hours will be available each week for added assistance.

The program culminates in a two-week project that gives you the opportunity to apply everything you’ve learned by navigating your robot through an at-home obstacle course and working through a series of set tasks designed to deepen your hands-on experience. Upon successful completion, SAE International will award you a Certificate of Achievement demonstrating mastery in Robotics for AV Systems.



Who Should Attend

This program is designed for working engineers: recent graduates and newly hired mechanical, electrical, and computer science engineers entering industry to support autonomous vehicle system development. Anyone interested in learning more about the field of Autonomous Vehicles is also encouraged to attend. Expect to spend an estimated 10-15 hours per week on the program, including the 1.5-hour live session on Fridays, the self-paced videos, and homework assignments.

Prerequisites
  • B.S. in Mechanical, Software, or Electrical Engineering, or Computer Science
  • Interest in working on autonomous vehicles
  • Some coding ability in C or Python
  • Coursework in linear algebra, statistics

Testimonials

“I really appreciated the support received throughout the bootcamp. The structure of the program is well thought out and covers a broad range of topics that is perfect for someone new to the field of autonomous devices.” -Robotics Bootcamp Attendee 

“The Bootcamp accomplished exactly what I set out for. After completing all the lectures and the assignments, capped with the group Capstone project, it gave me a very good overall understanding of the Robotics topics that enables me to start contributing at my job right away.” -Robotics Bootcamp Attendee 

“Being new to the robotics field, the bootcamp helped me get ramped up quickly and introduced me to different aspects of robotics. I am more confident in meetings where robotics is being discussed.” -Robotics Bootcamp Attendee

“Excellent introductory course to AV Robotics, with plenty of simulation and hands-on work.” -Robotics Bootcamp Attendee 

You must complete all course contact hours and successfully pass the learning assessment to obtain CEUs.



Course Outline

Week One: Course Introduction
• Demonstrate basic Linux (Ubuntu) command line use
• Describe how Robotics is used in Autonomous Vehicles
• Set up a Git repository and explore the Robot Operating System framework with a simple example
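
To give a flavor of what that simple ROS example can look like, here is a minimal publisher node in Python (a sketch only; the node and topic names are illustrative, not the course’s actual exercise, and it assumes a roscore is already running):

    # Minimal ROS publisher node: says hello on a topic once per second.
    import rospy
    from std_msgs.msg import String

    def main():
        rospy.init_node("hello_robot")                 # register this node with the ROS master
        pub = rospy.Publisher("chatter", String, queue_size=10)
        rate = rospy.Rate(1)                           # publish at 1 Hz
        while not rospy.is_shutdown():
            pub.publish(String(data="hello from my first ROS node"))
            rate.sleep()

    if __name__ == "__main__":
        main()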

Week Two: Mechatronics
• Describe the Sense-Think-Act framework and its relevance in an autonomous system (see the sketch below)
• Outline basic topology & commands of the Robot Operating System
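
As a toy illustration of Sense-Think-Act (not course material), the loop below “senses” a simulated range reading, “thinks” with a simple threshold, and “acts” by printing a motor command:

    import random

    def sense():
        return random.uniform(0.0, 2.0)       # fake distance-to-obstacle reading, in metres

    def think(distance):
        return "stop" if distance < 0.5 else "drive"

    def act(command):
        print("actuator command:", command)   # a real robot would drive its motors here

    for _ in range(10):                       # one iteration per control cycle
        act(think(sense()))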

Week Three: Re-introduction/Kinematics
• Define Reactive control systems
• Understand homogeneous transformations (see the sketch below)
• Install Gazebo and explore the backend of the simulation framework
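
The homogeneous-transformation objective can be previewed in a few lines of NumPy; the rotation angle and translation below are arbitrary example values:

    import numpy as np

    def transform_2d(theta, tx, ty):
        # Planar homogeneous transform: rotate by theta, then translate by (tx, ty).
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, tx],
                         [s,  c, ty],
                         [0,  0,  1]])

    T = transform_2d(np.pi / 2, 1.0, 0.0)     # rotate 90 degrees, shift 1 m along x
    p = np.array([1.0, 0.0, 1.0])             # the point (1, 0) in homogeneous coordinates
    print(T @ p)                              # -> approximately [1, 1, 1], i.e. the point (1, 1)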

Week Four: Vehicle Architecture and Kinematics
• Explain different Wheeled Mobile Robot architectures
• Define Differential & Ackermann drive types
• Explain forward/inverse kinematics and motion models (see the sketch below)
• Use Git for collaboration in a team setting
• Set up a vehicle model in Gazebo and implement longitudinal control
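
As a preview of the motion-model idea, here is a differential-drive forward-kinematics step in plain Python; the wheel radius, wheel separation, and wheel speeds are made-up example numbers, not the course robot’s parameters:

    import math

    R, SEP = 0.033, 0.16                  # wheel radius and wheel separation in metres (illustrative)

    def step(x, y, theta, wl, wr, dt):
        v = R * (wl + wr) / 2.0           # forward speed from the two wheel angular rates
        w = R * (wr - wl) / SEP           # yaw rate
        return (x + v * math.cos(theta) * dt,
                y + v * math.sin(theta) * dt,
                theta + w * dt)

    pose = (0.0, 0.0, 0.0)                # start at the origin, facing along x
    for _ in range(100):                  # one second of motion at 100 Hz, turning slightly left
        pose = step(*pose, wl=5.0, wr=5.5, dt=0.01)
    print(pose)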

Week Five: Sensors/Perception: Lidar
• Describe the role of sensing systems in autonomous navigation.
• Implement lateral control for wall following
• Use OpenCV to implement a line follower
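
One common way to build a line follower with OpenCV, sketched on a synthetic image: threshold the dark line, take the centroid of the region nearest the robot, and steer proportionally to its offset. The image and gain below are illustrative only:

    import cv2
    import numpy as np

    # Synthetic camera frame: a light floor with a dark tape line slightly right of centre.
    frame = np.full((240, 320, 3), 200, dtype=np.uint8)
    frame[:, 180:200] = 30

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)   # keep only dark pixels
    roi = mask[-80:, :]                        # look only at the rows nearest the robot

    m = cv2.moments(roi)
    if m["m00"] > 0:
        cx = m["m10"] / m["m00"]               # centroid column of the line
        error = cx - roi.shape[1] / 2.0        # pixel offset from the image centre
        print("steering command:", -0.005 * error)   # proportional steering (arbitrary gain)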

Week Six: Sensors/Perception: Cameras, Vision and Visual Intelligence
• Explain how a camera works and how depth is calculated with a stereo camera (see the sketch below)
• Explore other vision-based sensors
• Implement a lane keeping controller on the car-like robot with OpenCV and ROS
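
The stereo-depth calculation reduces to one formula: depth Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity in pixels. The numbers below are illustrative:

    def stereo_depth(f_px, baseline_m, disparity_px):
        # Triangulation for a rectified stereo pair: Z = f * B / d.
        return f_px * baseline_m / disparity_px

    print(stereo_depth(f_px=700.0, baseline_m=0.12, disparity_px=35.0))   # -> 2.4 metres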

Week Seven: Perception: Machine Learning
• Explain basic machine learning concepts
• Understand the difference between traditional perception and ML
• Implement a ROS wrapper around trained models to detect and classify objects using Keras
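
A sketch of what such a ROS wrapper might look like, assuming a pretrained Keras MobileNetV2 classifier and an image topic named /camera/image_raw (both are illustrative choices, not the course’s exact setup):

    import cv2
    import rospy
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image
    from tensorflow.keras.applications.mobilenet_v2 import (
        MobileNetV2, decode_predictions, preprocess_input)

    bridge = CvBridge()
    model = MobileNetV2(weights="imagenet")    # pretrained classifier, downloaded on first use

    def on_image(msg):
        # Convert the ROS image to an array, resize to the network input, and log the top label.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="rgb8")
        x = cv2.resize(frame, (224, 224))[None].astype("float32")
        label = decode_predictions(model.predict(preprocess_input(x)), top=1)[0][0][1]
        rospy.loginfo("detected: %s", label)

    rospy.init_node("keras_classifier")
    rospy.Subscriber("/camera/image_raw", Image, on_image, queue_size=1)
    rospy.spin()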

Week Eight: Localization
• Describe different Bayesian approaches to localization (see the sketch below)
• Understand costmaps and how to use them in the Navigation Stack
• Use the Navigation Stack in ROS to plan a path to a goal while performing obstacle avoidance 
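
The Bayesian idea behind localization can be shown with a discrete (histogram) filter over a toy world; the five-cell cyclic corridor, motion slip, and sensor noise below are all made up for illustration:

    import numpy as np

    world = np.array([1, 0, 0, 1, 0])        # 1 = a doorway the robot can sense, 0 = plain wall
    belief = np.full(5, 0.2)                 # start completely uncertain about which cell we are in

    def predict(belief):
        # Motion step: intend to move one cell right, but slip and stay put 20% of the time.
        return 0.8 * np.roll(belief, 1) + 0.2 * belief

    def update(belief, z):
        # Measurement step: weight cells consistent with the observation, then normalize.
        likelihood = np.where(world == z, 0.9, 0.1)
        posterior = likelihood * belief
        return posterior / posterior.sum()

    for z in [1, 0, 0]:                      # observations taken after successive moves
        belief = update(predict(belief), z)
    print(belief)                            # the belief concentrates on cells matching 1, 0, 0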

Week Nine: SLAM
• Define SLAM & differentiate several SLAM approaches
• Explore different localization algorithms
• Implement SLAM for mapping and use the Navigation Stack in ROS to localize and plan the robot motion
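
Once a map exists and the robot is localized, commanding motion through the Navigation Stack often amounts to sending a goal to the move_base action server; the frame and coordinates below are illustrative:

    import actionlib
    import rospy
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node("send_nav_goal")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()                       # wait until the Navigation Stack is running

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"       # goal expressed in the map built by SLAM
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.5         # 1.5 m along the map x-axis (illustrative)
    goal.target_pose.pose.orientation.w = 1.0      # face along the map x-axis
    client.send_goal(goal)
    client.wait_for_result()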

Week Ten: Navigation/Path Planning
• Discuss the relative uses and merits of greedy and non-greedy path planning approaches
• Use the Navigation Stack in ROS to extract waypoints for robot motion and implement a pure pursuit controller to navigate through them. 
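
The heart of a pure pursuit controller fits in a few lines: given a lookahead point expressed in the robot’s frame, compute the curvature of the arc through it and, for a car-like robot, the corresponding steering angle. The wheelbase and example point are arbitrary:

    import math

    def pure_pursuit_steering(x_look, y_look, wheelbase=0.3):
        d2 = x_look ** 2 + y_look ** 2      # squared distance to the lookahead point
        curvature = 2.0 * y_look / d2       # curvature of the arc passing through the point
        return math.atan(curvature * wheelbase)

    print(pure_pursuit_steering(1.0, 0.2))  # small left steering command, in radians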

Week Eleven & Twelve: Final Project
• Virtual Hands-On Workshop
• Participants will work virtually, in teams or individually, with a TurtleBot3 Burger
• Participants will implement exercises on the hardware that will test their skills in:
  • Working with Lidar and Camera data
  • Object Detection using pre-trained neural networks
  • Generating maps and localizing within that environment
  • Planning and control



Materials & Requirements

Equipment requirements:

  • An installation of Ubuntu 16.04 (installation options are provided in the course)
  • Headset and webcam for online audio and video conferencing
  • A joystick is not required, but one is useful for teleoperation

Instructors

Dr. Venkat Krovi, Jeff Blackburn, and Huzefa Kagalwala

Prof. Venkat N. Krovi (FASME, SM IEEE) is currently the Michelin Endowed SmartState Chair Professor of Vehicle Automation at Clemson University – International Center for Automotive Research. His research focuses on intelligent modulation of distributed physical-power-interactions (motions/forces) between humans and connected and autonomous systems to unlock the “power of the many.” His research activities focus on the life-cycle treatment (design, modeling, analysis, control, implementation, and verification) of a new generation of systems for realizing Connected-Autonomy synergy, with applications in vehicle automation, plant automation, and defense arenas. He currently serves as the Editor-in-Chief of the ASME Journal of Mechanisms and Robotics and was the Founding EiC of the SAE Journal of Connected and Automated Vehicles. He has also taken significant leadership roles within multiple professional societies (ASME, IEEE) and currently serves on the Executive Committee of the IEEE Robotics and Automation Society. Further details are available from http://cecas.clemson.edu/armlab-cuicar.


Jeff Blackburn is the Senior Product Sales Manager for Ansys Autonomy, the world’s largest supplier of simulation software. Prior to joining Ansys, Jeff worked on developing autonomous vehicle research platforms at Dataspeed, was a founding member of Metamoto, which developed a massively scalable cloud-based simulation platform, and was the North American ADAS and Autonomous Vehicle subject matter expert for Siemens / Tass PLM Software, Inc. He has also held positions in controls and systems engineering with National Instruments, Takata, Fanuc Robotics, and Rockwell Automation. Jeff has organized and presented at numerous technical forums. He has been issued twenty-one U.S. patents, primarily in the area of occupant safety. Jeff holds a B.S. in Engineering and a J.D. from the University of Akron.


Huzefa Kagalwala is currently working as an ADAS Simulations Engineer at the Ford Motor Company. Huzefa has a keen interest in all aspects of mobility. Having dabbled in diesel engine emissions calibration during his time at Mahindra & Mahindra, India, he decided to switch to making vehicles safer via autonomy during his Master’s in Automotive Engineering at Clemson University. His work as a simulations engineer makes him proficient in all aspects of the autonomy stack, including perception, localization, planning, and control. From kick-starting his career in the SAEINDIA BAJA competition to becoming an instructor for SAE, Huzefa’s association with SAE has been lifelong!


