Aionda

2026-01-14

Mercedes-Benz CLA Integrates NVIDIA DRIVE for AI-Centric Driving

Mercedes-Benz CLA evolves into an AI server with NVIDIA DRIVE, featuring 508 TOPS performance and centralized SDV architecture.


The era of checking engine specifications first upon entering a car showroom is over. Now, the first question consumers will ask is, "How many TOPS (Tera Operations Per Second) of computing power does this car have?" With the integration of NVIDIA DRIVE AV software into its entry-level CLA sedan, Mercedes-Benz has declared the evolution of the automobile from a mere means of transportation into an "AI server on wheels." This collaboration transcends a simple component supply contract, symbolizing a complete shift in the automotive industry's power axis from mechanical engineering to AI computational capability.

Intelligence Beyond Hardware: The Power of the '3-Computer' Architecture

At the core of the new Mercedes-Benz CLA is the realization of a "Software-Defined Vehicle (SDV)" based on the NVIDIA DRIVE platform. While conventional cars featured a fragmented structure where dozens of independent Electronic Control Units (ECUs) separately controlled windows, engines, and brakes, the Mercedes-Benz CLA integrates all these processes into a single massive "brain" via NVIDIA’s centralized computing architecture.

To implement this, NVIDIA employs a so-called "three-computer" strategy. The first is the "DGX" supercomputer in the data center, used for training AI models. The second is an "Omniverse"-based simulation computer that allows the car to drive trillions of virtual kilometers before hitting actual roads. The third is the "DRIVE AGX" installed directly in the vehicle. These three computers form a massive loop, exchanging data.
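The loop described above can be sketched in a few lines of Python. This is a minimal illustrative model of the train → simulate → deploy → collect cycle, under the assumption that each stage hands its output to the next; all class and function names here are invented for illustration and are not NVIDIA APIs.

```python
# Illustrative sketch of the "three-computer" development loop.
# Names (DrivingModel, train_in_datacenter, etc.) are assumptions,
# not actual NVIDIA interfaces.

from dataclasses import dataclass
from typing import List


@dataclass
class DrivingModel:
    """Stand-in for an AI driving model that improves with data."""
    version: int = 0
    training_km: float = 0.0


def train_in_datacenter(model: DrivingModel, fleet_logs: List[str]) -> DrivingModel:
    """Computer 1 (DGX-class): retrain the model on collected fleet data."""
    model.version += 1
    model.training_km += 1000.0 * len(fleet_logs)
    return model


def validate_in_simulation(model: DrivingModel, scenarios: int) -> bool:
    """Computer 2 (Omniverse-class): drive virtual kilometers before release."""
    model.training_km += scenarios * 10.0
    return model.version > 0  # release only a model that has been trained


def deploy_to_vehicle(model: DrivingModel) -> List[str]:
    """Computer 3 (DRIVE AGX-class): run in the car, collect new logs."""
    return [f"log_v{model.version}_{i}" for i in range(3)]


# One iteration of the loop: train -> simulate -> deploy -> collect.
model = DrivingModel()
logs: List[str] = ["initial_fleet_log"]
model = train_in_datacenter(model, logs)
if validate_in_simulation(model, scenarios=100):
    logs = deploy_to_vehicle(model)
print(model.version, len(logs))  # 1 3
```

The point of the sketch is the closed loop: logs produced by the in-vehicle computer become the training data for the next data-center iteration, with simulation acting as a gate between them.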

The system installed in the CLA boasts an overwhelming computational performance of 508 TOPS, more than dozens of standard laptops combined. This is organically coupled with a total of 30 multimodal sensors, including LiDAR, radar, and cameras. Unlike Tesla’s vision-only approach, which relies solely on cameras, Mercedes-Benz and NVIDIA utilize all sensory organs for "safety redundancy." Particularly at complex urban intersections or in adverse weather, while the AI makes decisions, a separate safety stack called "Halos" runs a second monitoring layer designed to catch errors before they can lead to accidents.
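The redundancy idea can be made concrete with a small sketch: a learned planner proposes an action, and an independent rule-based monitor, fed by the other sensor modalities, can veto it. This is a toy model of the dual-monitoring concept only; the names, thresholds, and logic are assumptions for illustration, not the actual Halos design.

```python
# Toy sketch of dual monitoring: an AI planner proposes, an independent
# rule-based safety monitor can override. All names and thresholds are
# illustrative assumptions, not the real Halos stack.

from dataclasses import dataclass


@dataclass
class SensorView:
    """Fused view from redundant modalities: camera, radar, LiDAR."""
    camera_clear: bool
    radar_distance_m: float
    lidar_distance_m: float


def ai_planner(view: SensorView) -> str:
    """Learned policy: may be overconfident when one sensor is degraded."""
    return "proceed" if view.camera_clear else "brake"


def safety_monitor(view: SensorView, action: str) -> str:
    """Independent rule-based check: vetoes 'proceed' if either ranging
    sensor reports an obstacle inside a hard safety margin."""
    margin_m = 5.0
    obstacle_close = min(view.radar_distance_m, view.lidar_distance_m) < margin_m
    if action == "proceed" and obstacle_close:
        return "brake"  # override the AI decision
    return action


# Camera reports a clear road, but radar and LiDAR both see a close
# obstacle (e.g. the camera is blinded by fog or glare): the monitor wins.
view = SensorView(camera_clear=True, radar_distance_m=3.0, lidar_distance_m=3.2)
action = safety_monitor(view, ai_planner(view))
print(action)  # brake
```

The design choice being illustrated is that the monitor does not share the AI's perception path: even if the learned stack is confidently wrong, an independent, simpler layer with its own sensor inputs can still force a safe fallback.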

'Point-to-Point' Autonomous Driving: Testing the Limits of L2

The highlight of this commercialization is the Level 2 (L2) "Point-to-Point" driver assistance system. Moving beyond simple lane-keeping on highways, the AI assists in the entire process from the doorstep to the destination parking lot. The fact that the ability to recognize traffic lights, stop and start autonomously, and perform unprotected left turns at complex intersections is included in the mass-market CLA model is highly significant.

However, the reality is more nuanced. While point-to-point driving is technically possible, legal liability remains tied to the "Level 2" category. Drivers must keep their hands on the wheel and eyes on the road, and they bear responsibility for any accidents resulting from system errors. The reason for not immediately entering Level 3 (the stage where drivers are freed from the obligation to monitor the road) despite having high-performance hardware like 508 TOPS is more about the weight of responsibility than technical deficiency.

Furthermore, it remains uncertain whether the CLA can reach full Level 3 status through Over-the-Air (OTA) software updates alone. Level 3 and above require hardware redundancy so that the system can fail over immediately in case of a fault. The industry remains skeptical about whether the entry-level CLA’s hardware configuration fully meets these requirements, with open questions around the inclusion of expensive LiDAR sensors and whether the compute headroom can handle the variability of real-world roads.

Disruptor of the Supply Chain, and a New Order

The union of NVIDIA and Mercedes-Benz is shaking the foundations of the automotive supply chain. While Tier 1 suppliers like Bosch and ZF previously created value centered on hardware components, their offerings are now being integrated atop NVIDIA’s reference architecture.

This shift is likely to force a new experience on consumers: the "subscription economy." Instead of buying all features at once, users might subscribe to autonomous driving functions or upgrade performance as needed. Through this, Mercedes-Benz aims to transition to a "software business model" that generates continuous revenue even after the vehicle sale. While this is an opportunity for manufacturers, it could increase uncertainty regarding vehicle maintenance costs for consumers.

Developers are also facing a new era. Automotive software development has moved beyond embedded coding into a sophisticated field of AI engineering that integrates Large Language Models (LLMs) and visual intelligence. Proficiency in handling NVIDIA’s platform will become a core competency for future automotive engineers.

FAQ

Q: What is the biggest difference compared to Tesla’s FSD (Full Self-Driving)? A: Tesla adheres to a "vision-only" approach using only cameras and leverages vast amounts of driving data as its strength. In contrast, the Mercedes-Benz and NVIDIA system adopts multimodal sensor fusion, including LiDAR and radar, to increase perception accuracy, and adds hardware safety logic that operates independently of the AI stack to maximize reliability.

Q: It is a Level 2 system; why is such high computing performance (508 TOPS) necessary? A: It is not just for performing current functions but for future scalability. Point-to-point driving requires immense calculation because it must recognize and judge numerous objects in real-time. Additionally, high-spec chipsets are installed in advance to ensure that the hardware does not become a bottleneck when improving functions through future OTA updates.

Q: If I purchase a CLA equipped with this feature, can I sleep or look at my smartphone while driving? A: Absolutely not. This system is Level 2, which is "driver assistance." All system decisions must be made under the driver's supervision, and legally, the driver must always monitor the road and be prepared to intervene immediately.

