Chief Business Officer and Co-Founder at Recogni.
The auto industry is undergoing a major shift from human-driven cars to autonomous vehicles (AVs). As a result, the transportation industry as we have always known it will change completely. Technology companies are spearheading this evolution, and they are the same firms that have already demonstrated a rate of innovation unmatched by the incumbent automotive original equipment manufacturers (OEMs).
Tesla, for one, surprised the auto industry by introducing its Autopilot system back in 2015. The system has evolved since then and can now maneuver on highways autonomously, though it still requires an attentive driver. Many other automakers have developed similar systems for autonomous highway driving with a human present.
The next logical step toward full autonomy is automated driving on urban streets with a driver present (known as Level 2+ autonomy, per the Society of Automotive Engineers). Driving on city streets is vastly more complicated than highway driving, and much of that added complexity stems from the visual perception problem.
When you drive, your brain uses billions of neurons to process the plethora of surrounding visual information, and it must juggle significantly more variables on urban streets than on highways. Current AV solutions are built on legacy technology that limits their computational capability; they can enable highway automation, but they cannot perceive the world the way humans do, especially in urban settings.
Two major issues stand out in urban driving that the AV industry must address: recognizing the phase of traffic lights and understanding the intent of vulnerable road users (VRUs) such as pedestrians. It is not enough merely to identify VRUs and traffic lights; AVs must also understand what they are doing. The fidelity of detail needed to comprehend that intent, especially under harsh lighting conditions, is enormous and requires a massive amount of real-time computation.
The AV must be equipped with a platform that can understand the phase of a traffic light from a distance of more than 150 meters under any lighting conditions. That platform must not only identify VRUs in its field of view but also comprehend what each pedestrian or cyclist plans to do. Imagine the compute these tasks demand compared with cruising on the highway at a constant speed with little interruption. Current solutions on the market are constrained from a processing standpoint and cannot solve these urban automation problems.
From a technical standpoint, an AV system must above all be robust under any condition, regardless of the weather or the lighting around the car. This is critical because accidents can occur when lighting conditions overwhelm a vehicle's perception system. To prevent such accidents, vehicles must carry high-resolution cameras with high dynamic range.
To support all of this, an AV needs a visual perception platform with far more compute and efficiency than today's incumbent solutions offer. Unlike current products, it must be purpose-built to solve the unique challenges of urban vehicle automation outlined above. Relying on legacy technology such as the GPU is not an option; innovation must occur.
The solution must deliver data-center-class computational capability while consuming a minuscule amount of power, so the battery is used to drive the AV, not to sense the environment. With such a computationally efficient system, traditional auto OEMs can horizontally integrate the needed technology into their vehicles to enable urban driving automation. This is the only way automotive OEMs can compete with the technology companies instead of playing catch-up.
The auto industry is experiencing an evolution in vehicle autonomy, and companies are racing to equip their vehicles with the most advanced self-driving solutions. Today, automated highway driving still requires a human present behind the wheel. The next logical step is automated urban driving, a problem vastly more complex than driving on the highway. To keep pace, OEMs must source and horizontally integrate a visual perception platform capable of enabling a car to drive itself on city roads.
Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.