Last year, when Intel completed its acquisition of autonomous driving software company Mobileye, the company announced that it would soon build a fleet of fully autonomous vehicles for testing. Now, the first of the 100-car fleet is in operation in Jerusalem, demonstrating the capabilities of its technology and safety model.
In the testing, the cars will demonstrate the power of the Mobileye software approach and technology, prove that the Responsibility-Sensitive Safety (RSS) model increases safety, and integrate key learnings into products and customer projects, according to Professor Amnon Shashua, senior vice president at Intel Corporation and chief executive officer and chief technology officer of Mobileye.
The key differentiator in the system launched by Intel and Mobileye, according to Shashua, is that they have targeted a vehicle that "gets from point A to point B faster, smoother and less expensively than a human-driven vehicle; can operate in any geography; and achieves a verifiable, transparent 1,000 times safety improvement over a human-driven vehicle without the need for billions of miles of validation testing on public roads."
During the initial phase, the vehicles in the fleet will feature only cameras, in a 360° configuration. Each vehicle uses 12 cameras, with eight providing long-range surround view and four used for parking. The goal, according to Intel, is to prove that it can "create a comprehensive end-to-end solution from processing only the camera data."
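As a point of reference, that camera complement can be modeled as a simple configuration. The sketch below is purely illustrative; Intel has not published camera IDs, mounting positions, or an API, so every name here is an assumption:

```python
from dataclasses import dataclass
from enum import Enum


class CameraRole(Enum):
    LONG_RANGE_SURROUND = "long-range surround view"
    PARKING = "parking"


@dataclass(frozen=True)
class Camera:
    camera_id: int
    role: CameraRole


# Per the article: 12 cameras per vehicle, eight for long-range
# surround view and four for parking. IDs and ordering are invented.
CAMERA_CONFIG = (
    [Camera(i, CameraRole.LONG_RANGE_SURROUND) for i in range(8)]
    + [Camera(i, CameraRole.PARKING) for i in range(8, 12)]
)

assert len(CAMERA_CONFIG) == 12
```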
An "end-to-end" autonomous vehicle solution, according to Shashua, is a solution consisting of a "surround view sensing state capable of detecting road users, drivable paths and the semantic meaning of traffic signs/lights; the real-time creation of HD-maps as well as the ability to localize the AV with centimeter-level accuracy; path planning (i.e., driving policy); and vehicle control."
This "camera-only" phase is the company’s strategy for achieving what it refers to as "true redundancy" of sensing, with "true redundancy" referring to a sensing system consisting of multiple independently-engineered sensing systems, each of which can support fully autonomous driving on its own. This is in contract to fusing raw sensor data from multiple sources together early in the process, which in practice results in a single sensing system, according to Intel.
Radar and LiDAR will be added to the vehicles in the coming weeks as a second phase of development; the synergies among the various sensors can then be used to increase the "comfort" of driving, suggests Intel.
Mobileye’s proprietary software is designed using artificial intelligence-based reinforcement learning techniques and was trained offline to optimize an "assertive, smooth and human-like driving style." To enable the system to understand the boundary where assertive driving becomes unsafe, the AI system is governed by a formal safety envelope in the form of the aforementioned RSS model.
RSS is a model that formalizes the "common sense principles" of what it means to drive safely into a set of mathematical formulas that a machine can understand. These cover things like safe following and merging distances, right of way, and caution around occluded objects. If the AI-based software proposes an action that would violate one of these principles, the RSS layer rejects the decision.
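For a sense of what such a formalized principle looks like, the published RSS paper ("On a Formal Model of Safe and Scalable Self-driving Cars," Shalev-Shwartz, Shammah, and Shashua) derives a minimum safe longitudinal following distance from a response time and worst-case acceleration and braking bounds. The sketch below implements that one rule, plus a simple rejection check in the spirit of the RSS layer; the parameter defaults are illustrative, not Mobileye’s calibrated values:

```python
def rss_safe_longitudinal_distance(
    v_rear: float,              # rear (ego) vehicle speed, m/s
    v_front: float,             # front vehicle speed, m/s
    rho: float = 0.5,           # ego response time, s (illustrative)
    a_max_accel: float = 3.0,   # max ego acceleration during response, m/s^2
    a_min_brake: float = 4.0,   # minimum braking the ego then applies, m/s^2
    a_max_brake: float = 8.0,   # maximum braking the front car might apply, m/s^2
) -> float:
    """Minimum safe following distance per the RSS longitudinal rule.

    Worst case assumed: the front car brakes as hard as possible while
    the ego car accelerates for its full response time, then brakes at
    only its guaranteed minimum rate.
    """
    v_after_response = v_rear + rho * a_max_accel
    d_min = (
        v_rear * rho
        + 0.5 * a_max_accel * rho ** 2
        + v_after_response ** 2 / (2 * a_min_brake)
        - v_front ** 2 / (2 * a_max_brake)
    )
    return max(d_min, 0.0)


def rss_longitudinal_check(gap_m: float, v_rear: float, v_front: float) -> bool:
    """Accept a proposed action only if the resulting gap stays safe."""
    return gap_m >= rss_safe_longitudinal_distance(v_rear, v_front)


# Example: ego at 20 m/s, 10 m behind a car doing 15 m/s.
print(rss_safe_longitudinal_distance(20.0, 15.0))  # required gap, ~54 m here
print(rss_longitudinal_check(10.0, 20.0, 15.0))    # False: RSS rejects
```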
"Put simply, the AI-based driving policy is how the AV gets from point A to point B; RSS is what prevents the AV from causing dangerous situations along the way," wrote Shashua. "RSS enables safety that can be verified within the system’s design without requiring billions of miles driven by unproven vehicles on public roads."
The current fleet implements this view of the appropriate safety envelope, but Intel and Mobileye have shared the approach publicly and aim to collaborate on an industry-led standard that is technology neutral.
On the computing end, the fleet is powered by four Mobileye EyeQ4 systems-on-chip (SoCs). Each SoC delivers 2.5 tera-operations per second (TOPS) while consuming 6 W of power. The next generation, EyeQ5, targets fully autonomous vehicles, and engineering samples of that product are due later this year, according to Intel.
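As a quick sanity check on those figures, the per-chip efficiency and the per-vehicle totals follow directly from the quoted numbers (simple arithmetic only; no additional specifications are assumed):

```python
tops_per_soc = 2.5        # tera-operations per second, as quoted
watts_per_soc = 6.0       # power per EyeQ4, as quoted
socs_per_vehicle = 4

print(tops_per_soc / watts_per_soc)        # ~0.42 TOPS per watt per SoC
print(socs_per_vehicle * tops_per_soc)     # 10.0 TOPS of compute per vehicle
print(socs_per_vehicle * watts_per_soc)    # 24.0 W total for the four SoCs
```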
Looking forward, Intel’s goal, in support of its automaker customers, is to bring this system to series production in L4/L5 vehicles by 2021.
View the full Intel blog post.
James Carroll
Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.