Autonomous Vehicles and AI: How Self-Driving Cars Work
Isabel Johnson

Autonomous vehicles, or self-driving cars, rely on artificial intelligence (AI) to interpret data, make decisions, and control actuation systems, all in real time and without human input. For over a decade, it’s been clear that self-driving cars are the future, but with projects like Tesla’s Autopilot and NASA’s Mars Perseverance rover, and autonomous vehicle innovations from companies like Waymo, Transdev with its ParkShuttle, and Zoox with its robotaxi, that future is already here. Autonomous vehicles may well become the dominant mode of transportation in the not-so-distant future, even for mass consumers, thanks to their convenience and their capacity for increased safety and efficiency on the road. So how exactly do they operate independently while upholding these standards?

The Role of Artificial Intelligence in Self-Driving Cars

AI—particularly machine learning (ML) algorithms—is often described as the “brain” of an autonomous vehicle, mimicking human decision-making by interpreting data and recognizing patterns to perform real-time driving maneuvers. Physical objects and surrounding environmental data are detected through various onboard sensors and classified through neural networks. This information is then used by AI algorithms, which work with the vehicle’s embedded systems, to make real-time navigation decisions based on changing road conditions. This process is increasingly powered by Edge AI, where AI models run directly on the vehicle itself rather than relying on cloud computing. This approach is especially useful for autonomous vehicles, as reducing dependency on external networks conserves bandwidth and improves response times.

Core Technologies in Autonomous Vehicles

Autonomous vehicles rely on various core technologies to enable them to perceive their surroundings, interpret data, and make real-time decisions. These include:

  • Sensors, including cameras, LiDAR, radar, and ultrasonic sensors, are critical in detecting and classifying objects in the vehicle’s environment. Live data streams from these devices are transmitted to Electronic Control Units (ECUs), where they are processed and interpreted to negotiate tasks, i.e., to coordinate which driving actions need to be taken and in which order, including steering, braking, and acceleration.
  • Neural networks, typically Convolutional Neural Networks (CNNs), classify and segment objects into different categories, like other vehicles, traffic lights, and pedestrians. CNNs learn spatial hierarchies of features through stacked convolutional layers, letting them recognize objects in the vehicle’s environment at multiple scales (a minimal sketch appears after this list). This classification is used to inform the vehicle’s prediction and planning systems, guiding navigation and driving decisions.
  • Navigation is done by combining real-time localization with high-definition maps and GPS to create highly accurate path planning. AI algorithms are used to pre-scan the route to the destination and dynamically adjust for localized traffic and road conditions.
  • Decision-making AI algorithms use the data from sensors, neural networks, and navigation technologies to dictate how the car moves in real time. A hierarchical tree determines the appropriate action based on all the factors mentioned above (see the toy decision sketch after this list). This could mean accelerating when entering a freeway to match the surrounding traffic, braking at stop signs, red lights, or in traffic, and steering to follow curves or perform lane changes. These actuation commands are performed by embedded systems: ECUs and other microcontroller-based systems run on a Real-Time Operating System (RTOS) to ensure that actions are taken at the correct time intervals.
  • Memory enables data storage, retrieval, and high-speed access for the large volume of information processed in autonomous vehicles. Volatile memory, like Dynamic Random-Access Memory (DRAM), temporarily stores sensor data as it is processed in real time. Graphics Double Data Rate (GDDR) memory provides the high-bandwidth transfers that neural network processing demands. Flash memory manages map storage and system logs for navigation purposes. Electrically Erasable Programmable Read-Only Memory (EEPROM) provides persistent storage, even when the car is off, for configuration, calibration, and update data gathered during testing.
[Diagram: core technologies of an autonomous vehicle]
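
To make the neural network step concrete, here is a minimal PyTorch sketch of the kind of classifier described above. The three object classes, the 64x64 input size, and the layer sizes are illustrative assumptions, not a production perception network.

```python
# A minimal image-classification CNN, illustrating the kind of network
# described above. Classes, input size, and layer sizes are illustrative.
import torch
import torch.nn as nn

class TinyPerceptionCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        # Stacked convolutions learn spatial hierarchies: edges -> parts -> objects
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)            # (N, 32, 16, 16) for 64x64 RGB input
        return self.classifier(x.flatten(1))

model = TinyPerceptionCNN()
frame = torch.rand(1, 3, 64, 64)        # stand-in for one camera frame
logits = model(frame)
label = ["vehicle", "traffic light", "pedestrian"][logits.argmax(dim=1).item()]
print(label)
```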
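The hierarchical decision step can likewise be sketched as a priority-ordered set of rules. The thresholds and scene fields below are illustrative assumptions; real planners weigh far richer state, but the ordering (safety first, then traffic rules, then traffic flow) is the core idea.

```python
# A toy, rule-based "hierarchical tree" for choosing a driving action.
# Thresholds and scene fields are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Scene:
    obstacle_distance_m: float   # nearest obstacle ahead, from fused sensor data
    signal: str                  # "green", "yellow", "red", or "none"
    speed_limit_mps: float
    ego_speed_mps: float

def choose_action(scene: Scene) -> str:
    # Highest-priority rules first: safety overrides everything else.
    if scene.obstacle_distance_m < 10.0:
        return "brake"                       # imminent obstacle: stop
    if scene.signal in ("red", "yellow"):
        return "brake"                       # obey traffic signals
    if scene.ego_speed_mps < scene.speed_limit_mps - 2.0:
        return "accelerate"                  # match traffic flow, e.g. when merging
    return "maintain"                        # default: hold speed and lane

print(choose_action(Scene(50.0, "green", 27.0, 20.0)))  # -> "accelerate"
```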

Safety and Testing

Autonomous vehicles must meet rigorous safety and reliability standards. Generative AI can be used to create realistic simulations that allow the cars to be tested in a controlled environment. This can include synthetic environment generation to mimic realistic road environments and weather conditions, and scenario generation for randomized variables like pedestrians or other cars making lane changes. Triangulation algorithms are used to assess which errors are occurring, and why, feeding constant feedback to the car’s decision-making algorithms for adjustment.
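
As a rough illustration, scenario generation can be as simple as seeded randomization over actors and conditions. The fields and value ranges below are illustrative assumptions, not a real simulation API; seeding makes any failing scenario reproducible.

```python
# A toy scenario generator: randomized actors and weather for simulation runs.
# Field names and value ranges are illustrative assumptions.
import random

def generate_scenario(seed: int) -> dict:
    rng = random.Random(seed)            # seeded so failures can be replayed
    return {
        "weather": rng.choice(["clear", "rain", "fog", "snow"]),
        "pedestrians": rng.randint(0, 5),
        "lane_change_vehicles": rng.randint(0, 3),
        "time_of_day": rng.choice(["day", "dusk", "night"]),
    }

for i in range(3):
    print(generate_scenario(i))
```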

Two machine learning paradigms are used for autonomous vehicle testing: supervised learning and unsupervised learning. Supervised learning uses labeled datasets to correctly map inputs to outputs when training the AI models. This focuses on controlled variables that can train the model on object recognition and behavior prediction. Once a model has been trained to output “normal” responses to planned stimuli, unsupervised learning is used to find patterns in unlabeled datasets. This trains the vehicle’s algorithms to cluster raw data on their own, without instruction, testing their responses and anomaly detection capabilities.
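
The sketch below contrasts the two paradigms on synthetic data using scikit-learn. The two features (stand-ins for closing speed and distance) and the brake/no-brake labels are illustrative assumptions; real pipelines train on sensor-derived features.

```python
# Supervised vs. unsupervised learning on synthetic data.
# Features and labels are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Supervised: labeled examples map inputs to known outputs ("brake" or not).
X_labeled = rng.random((100, 2))                     # e.g. [closing speed, distance]
y = (X_labeled[:, 0] > X_labeled[:, 1]).astype(int)  # 1 = brake
clf = LogisticRegression().fit(X_labeled, y)
print(clf.predict([[0.9, 0.1]]))                     # fast and close -> brake

# Unsupervised: cluster unlabeled data; outlying clusters flag anomalies.
X_unlabeled = rng.random((100, 2))
clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X_unlabeled)
print(np.bincount(clusters))                         # cluster sizes
```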

While generative AI plays a crucial role in developing and testing the models behind autonomous driving, vehicle validation doesn’t end there. Once models are deployed, validating real-time communication between vehicle components becomes essential to ensure reliable operation. This includes debugging how ECUs exchange data over the Controller Area Network (CAN) bus and validating the accuracy of sensor data transmitted via I2C and SPI protocols. Tools like host adapters and protocol analyzers allow engineers to simulate active CAN transmissions, monitor bus traffic for insights into task negotiations, and troubleshoot communication between embedded systems and peripherals to validate sensor data and controller performance.
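
As a rough illustration of this kind of bus-level work, the sketch below uses the open-source python-can library (not a Total Phase API) to transmit one frame and then monitor traffic, assuming a Linux SocketCAN interface named can0. The arbitration ID and payload are placeholders.

```python
# Minimal CAN transmit-and-monitor sketch using python-can over SocketCAN.
# Interface name, arbitration ID, and payload are illustrative assumptions.
import can

with can.interface.Bus(channel="can0", interface="socketcan") as bus:
    # Act as an active node: transmit one frame onto the bus.
    msg = can.Message(arbitration_id=0x123, data=[0x01, 0x02], is_extended_id=False)
    bus.send(msg)

    # Monitor traffic: read frames and log IDs/payloads for analysis.
    for _ in range(10):
        frame = bus.recv(timeout=1.0)      # returns None if the bus stays quiet
        if frame is not None:
            print(f"ID=0x{frame.arbitration_id:03X} data={frame.data.hex()}")
```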

How Total Phase Tools Support Autonomous Vehicle Testing

The Komodo CAN Duo Interface is an essential tool for debugging CAN systems. It is a powerful dual-channel USB-to-CAN adapter and CAN bus analyzer capable of active CAN data transmission and non-intrusive CAN bus monitoring. It records all CAN bus traffic while acting as an active node and provides real-time visibility into CAN bus data.

The Promira Serial Platform is our most advanced serial device with applications for I2C or SPI master/slave emulation and eSPI protocol analysis. It supports I2C master and slave speeds up to 3.4 MHz and SPI master and slave speeds up to 80 MHz and 20 MHz, respectively; supports Dual and Quad SPI; and offers Gigabit Ethernet and High-Speed USB connectivity options.
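
For a flavor of what sensor validation over I2C looks like in practice, the sketch below performs a generic register read using the open-source smbus2 library on Linux (not the Promira API). The bus number, device address, and register are hypothetical.

```python
# Generic Linux I2C register read via smbus2, e.g. to spot-check a sensor ID.
# Bus number, device address, and register are hypothetical placeholders.
from smbus2 import SMBus

I2C_BUS = 1          # e.g. /dev/i2c-1
SENSOR_ADDR = 0x48   # hypothetical 7-bit sensor address
WHO_AM_I_REG = 0x0F  # hypothetical identification register

with SMBus(I2C_BUS) as bus:
    device_id = bus.read_byte_data(SENSOR_ADDR, WHO_AM_I_REG)
    print(f"Sensor reported ID: 0x{device_id:02X}")
```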

The Beagle I2C/SPI Protocol Analyzer is a high-performance bus monitoring solution that provides real-time data capture and display of I2C and SPI protocol-level decoded data packets. It non-intrusively monitors I2C up to 4 MHz and non-intrusively monitors SPI up to 24 MHz, with bit-level timing down to 20 ns resolution and nearly limitless capture.

Do you have any questions on how our tools can support autonomous vehicle testing? For more information, please contact us at sales@totalphase.com.