Decomposing the Autonomous Mobility Stack

Updated: Aug 29, 2019

The autonomous driving industry is at a maturity level comparable to that of the traditional automotive industry 100 years ago. Henry Ford's Model T factory manufactured not only the car itself but also the wheels and most other components and parts in-house. One hundred years later, a couple of mega-suppliers and hundreds of smaller suppliers provide 70% of the components of a typical vehicle, and car makers only make 30% in-house.

The combined revenue of just the 10 largest suppliers rose to $315 billion in 2017, and the revenue of the largest 100 suppliers is close to $800 billion worldwide, according to Automotive News. And it makes sense: parts under the hood usually do not enable a vehicle manufacturer to differentiate itself from a competitor. In other words, the end customer - i.e. the owner/operator - usually does not care about, or even notice, whether the radar sensor for the adaptive cruise control system was manufactured by Bosch, Denso, or someone else, as long as it is reliable and it works.

The trend in the autonomous driving industry continues to go in the opposite direction. For example, Waymo developed and is building its lidar in-house, Cruise bought a lidar company in 2017, and just recently Aurora also acquired a lidar company. The reasons are simple: practically everybody (except probably one person) thinks that lidar is critical for developing the safest and most reliable self-driving system. In other words, companies think they will have a competitive advantage by having what they believe are key technologies in-house. On the flip side, this perceived short-term advantage is very uncertain - there are 70 lidar companies (not including the ones in China), and it is unclear which technology will prevail in a couple of years. And again, as a passenger in a robo-taxi in a couple of years, my concern is not the performance of an individual component, as long as the car provides a safe and comfortable ride to my destination.

Mid- and long-term, the (mostly venture-funded) full-stack autonomous driving companies - i.e. the ones that attempt to make most or all components, both hardware and software, in-house - will notice that their costs and complexity explode.

The founder of a (relatively small) full-stack self-driving company recently told me that they "can afford to develop the full stack in-house because their addressable market will be so large". I am not so sure that he fully understands economies of scale. A large and profitable market attracts more competitors, ultimately lowering prices, and a supplier delivering to 10 manufacturers can obviously offer a lower cost. Again, that is what led to automotive suppliers contributing more than 70% of the value of a conventional vehicle.

Recently, previously unseen partnerships have been forming in self-driving. Volkswagen gave up its contract with Aurora and invested in Argo instead, even contributing its autonomous vehicle subsidiary AID to Argo as part of the deal. BMW and Daimler are joining their self-driving divisions, and rumor has it that Audi will join as well.

We think this is just an intermediate step. Ultimately the self-driving industry will see the same transition. I call this decomposing the autonomous mobility stack. The whole stack is too large, too complex, too expensive, too resource-intensive for most companies to develop in-house. It involves too many different disciplines and skills.

The Autonomous Mobility Stack is made up of five major groups: hardware, off-board software and data, development methodologies, on-board software, and the product they combine into.

  1. The hardware stack comprises a vehicle platform, often customized or customizable for a specific application, which contains interfaces to the actuators - i.e. the drivetrain and the braking and steering systems - and to the electronics. Further components include the vehicle computer(s), communication systems both within the vehicle and to the cloud, plus data recording and storage components. Sensors include GNSS, motion sensors, lidar, radar, camera, and sometimes ultrasonic.

  2. Off-board software and data includes maps with their various layers (see e.g. the Lanelet 2 paper for details). Maps need to be created, annotated with meta-information, updated, and distributed - also in parts - while maintaining consistency over the whole map database. Highly autonomous vehicles will often be operated in a fleet, which requires fleet management, fleet routing, teleoperation, and self- and remote-diagnostics for driverless vehicles. The sensors of one autonomous vehicle generate up to 64 Gb/s (i.e. 8 GB/s, 480 GB/min, roughly 28 TB/hour, 560 TB/day, or 200 PB/year). Fleets obviously create a multiple of this amount. All this data needs to be recorded, stored, annotated, analyzed, and managed. Software developers need a software development environment that enables productivity; tools include playback of data, visualization of data, and the ability to simulate data on various levels.
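The data-rate figures above can be sanity-checked with a little arithmetic. Note that the ~20 operating hours per day used below is my own assumption, chosen so the numbers line up with the article's rounding; it is not stated in the text:

```python
# Sanity check of the per-vehicle sensor data-rate figures quoted above.
BITS_PER_BYTE = 8

gb_per_s = 64 / BITS_PER_BYTE          # 64 Gb/s -> 8 GB/s
gb_per_min = gb_per_s * 60             # 480 GB per minute
tb_per_hour = gb_per_s * 3600 / 1000   # 28.8 TB per hour (rounded to 28 above)
tb_per_day = 28 * 20                   # 560 TB per assumed ~20-hour operating day
pb_per_year = tb_per_day * 365 / 1000  # ~204 PB, i.e. on the order of 200 PB/year

print(gb_per_s, gb_per_min, round(tb_per_hour, 1), tb_per_day, round(pb_per_year))
```

At these rates even a modest fleet quickly reaches exabyte-scale storage needs per year, which is why data management is its own discipline in the stack.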

  3. Various methodologies are applied during the development process, including system design, software architecture design, hardware design, and interaction design. Tests have to be developed on all levels, including software unit, regression, integration, SIL, HIL, and vehicle tests. Other aspects include functional safety, security, regulations, homologation, verification, and validation.

  4. The on-board software stack consists of an operating system (hopefully a real-time one for safety-critical systems, and not just Linux), which itself consists of a kernel, a scheduler, and drivers. On top of the OS sits (at least in a well-designed system) a software framework, which abstracts much of the complexity of the underlying components, e.g. the OS, the computer hardware, the sensor interfaces, data recording, playback, visualization, and the middleware. It should also provide support for safety, security, and diagnostics. ROS, the Robot Operating System, is the de facto standard framework, and the collection of ROS 2 design articles provides a more detailed overview of the components that go into a robotic framework. Here at Apex.AI, we have developed a commercial, soon-to-be safety-certified fork of ROS 2, which we call Apex.OS. On top of the framework sit the algorithmic components. Perception processes information from sensors into a concise model of the environment. Localization locates the vehicle relative to the lane, the road, and the world as represented by the maps. Scene understanding derives a semantic understanding of the perceived world. Driving decisions are made based on a set of goals and constrained by the environment; the vehicle's desired motion is then planned and sent to the vehicle's actuators via the controller. Many of these algorithmic components are implemented using modern Artificial Intelligence (AI) techniques, which can achieve human-like accuracy but come with new requirements for many parts of the stack.
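The algorithmic data flow described above - perception, localization, planning, control - can be sketched as one tick of a minimal loop. All class names, function names, and thresholds below are hypothetical placeholders for illustration; this is not the Apex.OS or ROS 2 API:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obstacle_distance_m: float  # distance to the nearest obstacle ahead

def perceive(lidar_ranges_m):
    """Perception: condense raw sensor readings into an environment model."""
    return Detection(obstacle_distance_m=min(lidar_ranges_m))

def localize(gnss_position_m, map_origin_m):
    """Localization: position the vehicle relative to the map frame."""
    return gnss_position_m - map_origin_m

def plan(detection, speed_limit_mps):
    """Planning: choose a target speed constrained by the environment."""
    # Stop if an obstacle is within a (hypothetical) 30 m safety horizon.
    return 0.0 if detection.obstacle_distance_m < 30.0 else speed_limit_mps

def control(target_speed_mps, current_speed_mps, gain=0.5):
    """Control: a trivial proportional command toward the target speed."""
    return gain * (target_speed_mps - current_speed_mps)

# One tick of the loop: an obstacle 12 m ahead forces a deceleration command.
det = perceive([55.0, 12.0, 80.0])
cmd = control(plan(det, speed_limit_mps=13.9), current_speed_mps=10.0)
print(det.obstacle_distance_m, cmd)
```

In a real system each of these stages is its own set of nodes communicating over the framework's middleware, with diagnostics and safety monitoring layered throughout; the sketch only shows the direction of the data flow.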

  5. All components need to be integrated into a product, which here refers to the application that interfaces with the user/operator. The application is adapted for its intended use, tuned for the capabilities of the stack supporting it, and released. System integration for highly autonomous operation is a collaborative endeavor with multiple players, but with a well-designed stack it can move quickly through definition, tuning, test, and release.

The enabler for a modular and decomposed stack is a universally accepted architecture. ROS has given us a standardized and open architecture with an open-source implementation, managed by a foundation. We extended the model of ROS to the application stack and recently co-founded the Autoware Foundation, which develops a functional architecture for self-driving and builds a reference implementation of that architecture completely in open source. Over 35 companies and organizations have already joined the Autoware Foundation. Please join this powerful group to help build a standard.

Waymo would not be ahead of everybody else if it had waited for the decomposition of the stack, but that is the burden of the first mover. Everybody else trying to catch up will be faster and cheaper by supporting the stack decomposition, helping to establish standards, and choosing the right partners - with the hope that it will not take 100 years to reach the next level of maturity in the autonomous industry.

Stay tuned.




