We mentioned back in January that more than 20 of our clients are developing technology needed to further enable self-driving cars. With California recently joining other states that allow driverless cars, we want to explore the impacts of this technology's adoption.
As we move toward a market disruption of the transportation industry, professional drivers have reason for concern. Goldman Sachs predicted trucker job losses of 25,000 per month as self-driving trucks roll out. McKinsey Global Institute estimates that 1.5 million jobs might be lost in trucking over the next 10 years. The International Transport Forum proposed that 2 million American and European truckers could be directly displaced by 2030.
Not just luxury vehicles
By integrating data from onboard sensors, vehicles can exhibit various levels of autonomy, ranging from intervening in speed or steering to controlling the entire drive:
- Level 1: driver assistance. Based on sensor data, the vehicle controls either speed or steering to adapt cruising speed to the surrounding traffic, or to avoid collision or lane departure.
- Level 2: partial automation. The vehicle executes control over both steering and acceleration.
- Level 3: conditional automation. The vehicle executes all aspects of dynamic driving; the human driver responds to requests for intervention.
- Level 4: high automation. The vehicle executes all aspects of dynamic driving, even if the human driver ignores the intervention request.
- Level 5: full automation. The vehicle undertakes all aspects of driving.
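The taxonomy above lends itself to a simple enum. This is a purely illustrative sketch (the names and the helper function are our own, not from any automotive standard library):

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving automation levels, as summarized in the list above."""
    DRIVER_ASSISTANCE = 1       # vehicle controls speed OR steering
    PARTIAL_AUTOMATION = 2      # vehicle controls speed AND steering
    CONDITIONAL_AUTOMATION = 3  # vehicle drives; human must respond to requests
    HIGH_AUTOMATION = 4         # vehicle drives even if human ignores requests
    FULL_AUTOMATION = 5         # vehicle handles all aspects of driving

def requires_human_fallback(level: AutomationLevel) -> bool:
    """At levels 1-3, a human driver must remain ready to take over."""
    return level <= AutomationLevel.CONDITIONAL_AUTOMATION
```

One useful property of modeling the levels as an ordered enum: the dividing line between "driver must stay engaged" and "vehicle handles its own fallback" becomes a single comparison.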
Tesla is responsible for much of the discourse about self-driving cars. With features such as enhanced autopilot and the ability to respond to being summoned, Tesla vehicles command a premium, one that consumers are willing to pay, and for which they're willing to wait.
Although at first only found in luxury vehicles, many level 1 and 2 features are now offered in $20,000 cars. A 2017 Toyota Corolla, for example, comes with a backup camera with alerts, lane departure detection and intervention, adaptive cruise control, and collision avoidance.
There are about 3.5 million professional truck drivers in the US. Combined with other transportation and delivery drivers, our country employs between 10 and 15 million professional drivers. On-highway trucking will probably be the first industry to put fully autonomous vehicles on public roads.
Almost two years ago, an autonomous semi from Otto, a company that Uber acquired, drove itself and 2,000 cases of Budweiser beer 150 miles down Interstate 25 (with a driver in the back of the cab and an escort of police cars and company staff in nearby vehicles). Companies continue to develop the prototypes and algorithms needed to enable autonomous fleets of delivery and transportation vehicles.
A coalition of companies (including Uber, Lyft, and Zipcar) wants self-driving cars in metropolitan areas, and has officially announced a set of shared mobility principles. The coalition holds that autonomous vehicles in dense urban areas should only be operated in fleets. In other words, if you live in San Francisco, they don't want you to own a self-driving car, but they want you to pay to ride in one of theirs.
As is often the case, technology overtaking a job class (taxi drivers, for example) begets new needs and positions. California's DMV requires that self-driving cars link to remote operators who monitor them and can steer them to the side of the road, just to clear the roadway, not to avoid accidents.
Need for more technology
McKinsey Global Institute estimates that fully self-driving cars could be a decade away, based on the need for more software development in key areas:
- Object analysis: Identifying and analyzing objects is critical for autonomous vehicles. A stationary motorcycle and a bicyclist riding on the side of the street are very different, and correctly predicting their behavior is crucial. Integrating sensor data from lidar, radar, and camera vision is also complicated.
- Decision-making: Building and executing a thorough situational training library to enable deep machine learning is a difficult, time-consuming project that needs extensive testing and validation.
- Fail-safe mechanism: Designing a way for an autonomous vehicle to fail without endangering its passengers or others is imperative. Testing and planning for every possible software state and outcome is impossible, so we must be prepared to have a safety net for when the machine can’t operate.
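The fail-safe idea can be sketched in a few lines: when perception degrades below some confidence level, the vehicle abandons normal driving and executes a minimal-risk maneuver. This is a hypothetical illustration; the threshold, function names, and action strings are our own inventions, not any vendor's API:

```python
# Hypothetical fail-safe fallback: if sensor confidence degrades,
# execute a minimal-risk maneuver instead of continuing to drive.
CONFIDENCE_THRESHOLD = 0.8  # assumed value, for illustration only

def choose_action(perception_confidence: float, shoulder_clear: bool) -> str:
    """Pick a (highly simplified) driving action."""
    if perception_confidence >= CONFIDENCE_THRESHOLD:
        return "continue_driving"
    # Degraded perception: fall back to a minimal-risk maneuver.
    if shoulder_clear:
        return "pull_over"        # steer to the shoulder and stop
    return "controlled_stop"      # stop in lane with hazard lights on
```

The design point this sketch makes: because every possible software state can't be enumerated, the fallback branch must be reachable from any state and must not itself depend on the degraded sensors.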
Other potentially game-changing computational paradigms now emerging will eventually replace GPUs in this space. GPUs are, after all, graphics processing units, designed primarily for 3D graphics in computer gaming and the like. It just happens that 3D graphics requires massively parallel computing, as do most modern AI algorithms (e.g., machine learning via deep neural nets to enable multi-sensor computer vision for object detection).
The key hardware requirement is the ability to run huge (configurable) inference networks very quickly, which next-generation hardware will do (and is already doing) better than GPUs.
In this burgeoning era of early adoption, much could change and reshape how we work, live, and move ourselves and our goods around the globe. Careers could die and new ones emerge. Companies could reshape supply chains and logistics. We non-professional drivers might reclaim our commuting time, and our roads could become safer as accident rates drop.
Interested in learning more on this topic?
Follow us: @EarlyGrowthFS
Check out Part 1 & Part 2!
We’d love to hear what you think – firstname.lastname@example.org