Nvidia simulator and Safety Force Field and other news from GTC

This week I am at the Nvidia GPU Technology Conference, which has become a significant conference for machine learning, robots and robocars.

Here is my writeup on a couple of significant announcements from Nvidia -- a new simulation platform and a "safety force field" minder for robocar software, along with radar localization and Volvo parking projects.

News from Nvidia GTC

Comments

2019-Apr-12
ZF coPILOT debuted today, available in 2021. Needs two winters of testing?

Not a breakthrough, but the free addition of Intel's APB to AEB will be bundled with the tri-cam.

Intel Newsroom

Using Autonomous Vehicle Technology to Make Roads Safer Today
Technologies Developed for Fully Autonomous Vehicles Can Improve the Advanced Driver Assistance Systems Already in Wide Use
Editorial
January 8, 2019

By Professor Amnon Shashua

Safety has always been our North Star. We view it as a moral imperative to pursue a future with autonomous vehicles (AV), but to not wait for it when we have the technology to help save more lives today.

We fundamentally also believe that everything we do must scale, and we constantly search for the best ways to match our technology to market needs. Founded on the idea that we could use computer vision technology to help save lives on the road, Mobileye became a pioneer in advanced driver assistance systems (ADAS). These capabilities are now scaling up to become the building blocks for a fully autonomous vehicle.

The same is also true in reverse. New technologies developed specifically for AVs are enabling greater scale of advanced driving assistance systems and bringing a new level of safety to roads.

AV Technology Raises ADAS to the Next Level

There are five commonly accepted levels of vehicular autonomy. (Zero is no autonomy.) ADAS systems fall into levels 1 and 2, while levels 3 to 5 are degrees of autonomy ranging from autonomy in some circumstances to full autonomy with no human intervention.

While level 1 and 2 cars can be bought today, cars with varying degrees of autonomy are still in development. We know self-driving cars are technically possible. But the true challenge to get them out of the lab and onto the roads lies in answering more complex questions, like those around safety assurance and societal acceptance. To that end, we have been innovating around the more difficult enablers of AV technology such as mapping and safety.

This technology envelope that we’ve designed around the AV will take ADAS to the next level.

At Mobileye, we developed Road Experience Management™ (REM™) technology to crowdsource the maps needed for AVs – what we call the global Roadbook™. We are now harnessing those maps to improve the accuracy of ADAS features. An example of this is the work that Volkswagen and Mobileye are continuing in their efforts to materialize an L2+ proposition combining the front camera and Roadbook technologies, and leveraging the previously announced data harvesting asset. The ongoing development activity is targeting a broad-operational-envelope L2+ product addressing mass-market deployment.

[Image: Professor Amnon Shashua, Intel senior vice president and president and chief executive officer of Mobileye, at CES 2019]
We also developed the technology-neutral Responsibility-Sensitive Safety (RSS) mathematical approach to safer AV decision-making, which is gaining traction as industry and governments alike have announced plans to adopt RSS for their AV programs and help us work toward development of an industry standard for AV safety. For example, China ITS Alliance – the standards body under the China Ministry of Transportation – has approved a proposal to use RSS as the framework for its forthcoming AV safety standard; Valeo adopted RSS for its AV program and agreed to collaborate on industry standards; and Baidu announced a successful open-source implementation of RSS in Project Apollo.

"Today, we are taking RSS technology back into our ADAS lab and proposing its use as a proactive augment to automatic emergency braking (AEB). We call this automatic preventative braking (APB). Using formulas to determine the moment when the vehicle enters a dangerous situation, APB would help the vehicle return to a safer position by applying small, barely noticeable preventative braking instead of sudden braking to prevent a collision.

If APB were installed in every vehicle using an affordable forward-facing camera, we believe this technology can eliminate a substantial proportion of front-to-rear crashes resulting from wrong driving decision-making. And if we add surround camera sensing and the map into the equation so that preventative braking can be applied in more situations, we can hope to eliminate nearly all collisions of this nature."
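As a rough, unofficial sketch of the idea described above, here is how the published RSS longitudinal safe-distance formula could drive a gentle, preventative deceleration rather than a last-moment emergency stop. The parameter values and the simple threshold-plus-margin trigger are my own assumptions for illustration, not Mobileye's APB implementation.

```python
# Sketch of RSS-based "automatic preventative braking": compute the RSS
# safe longitudinal gap and, when the actual gap shrinks below it, request
# a small comfortable deceleration instead of waiting for emergency AEB.
# All parameter values are illustrative only.

def rss_safe_gap(v_rear, v_front, rho=0.5,
                 a_accel_max=3.0, a_brake_min=4.0, a_brake_max_front=8.0):
    """RSS minimum safe following distance in metres.

    v_rear, v_front: speeds of the following and lead vehicle (m/s)
    rho: response time of the following vehicle (s)
    a_accel_max: worst-case acceleration of the rear car during rho (m/s^2)
    a_brake_min: minimum braking the rear car is guaranteed to apply (m/s^2)
    a_brake_max_front: maximum braking the lead car might apply (m/s^2)
    """
    v_after_rho = v_rear + rho * a_accel_max
    d = (v_rear * rho
         + 0.5 * a_accel_max * rho ** 2
         + v_after_rho ** 2 / (2 * a_brake_min)
         - v_front ** 2 / (2 * a_brake_max_front))
    return max(d, 0.0)

def preventative_brake_command(gap, v_rear, v_front, margin=2.0,
                               gentle_decel=1.0):
    """Return a small deceleration request (m/s^2) once the gap dips below
    the safe distance plus a margin; 0.0 means no intervention."""
    if gap < rss_safe_gap(v_rear, v_front) + margin:
        return gentle_decel  # barely noticeable braking
    return 0.0

# Example: closing on a slower car at 30 m/s while it does 25 m/s.
print(rss_safe_gap(30.0, 25.0))                      # required safe gap (~100 m)
print(preventative_brake_command(20.0, 30.0, 25.0))  # gentle braking engaged
```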

Please just don't post press releases.

The time needed to develop and test APB for L2 or L2+ ADAS and to incorporate the technology could affect the EyeQ6 feature-set completion deadline.

"Autonomous Vehicle platform software team supporting Mobileye’s EyeQ and FPGA solutions"
San Jose group.

EyeC is most likely an ultra-wideband 79 GHz radar (see Yole).

" new 79 GhZ ... used for Simultaneous Localization And Mapping (SLAM) providing accurate distance information to detected objects in real time. It would be helpful to complement geo-localization technologies for autonomous driving especially in urban canyon condition where GNSS technologies show some accuracy issue. Another advantage of 79 GHz Radar is the mitigation of interference issues that could happen when the streets will be loaded with Radars embedded in the cars."

Mobileye's hiring focus is on "sensors" and "peripheral devices".

"integration of bleeding edge sensors and other peripheral devices into the hardware & software platform" JR0100377

Since EyeC is for redundancy, and thus a future breakthrough of totally separate systems, I cannot see any current breakthrough possible. RSS in ADAS L2 is all I see.

RSS live fleet testing predates May 2018 per Intel press. RSS was announced in October 2017, and research began in mid-2016 ("we have been working on RSS 2 1/2 years"), so it is hard to determine how much BMW saw early on, given the Jul 2016 tie-up. Vision Zero APB appeared in Dec 2018 and was suggested for ADAS in Jan 2019. Not sure what a breakthrough could be.

BMW had to have had advance insight into RSS, one would think.

Toyota and SoftBank's launch of Monet Technologies, with Honda now on board, and another pending investment in Uber, could propel Waymo to new partnerships.

A Google purchase of HERE instead of a HERE IPO?

A Google Maps platform with REM is a thought, but since Intel chose AWS to host REM, that's the end of that discussion, not to mention that ADAS is not involved. As the map companies all jockey to be relevant, survival seems pinned to SDVs. Why not tether their existence to both safety and SDVs? But I'm not sure how.

Nothing of help in the Japan SIP-adus conference notes about high-definition digital road maps.

The 2019 SDV arena is being dominated by mapping news, which suggests an industry in flux over the localization platform. Has sensor fusion become an issue? Open-source data has limits. Did ZF stumble across issues, or is the flying-drone world entering the field?

“ZF AD Environment”

"Airbus provides its unique, highly precise Ground Control Points (GCPs), which serve as an independent data source to improve and validate accuracy. Based on an aerial and space-borne approach, they complement ZF's semantic maps and will be integrated as foundation layers into the “ZF AD Environment” – an enhanced HD maps solution ZF will present soon – where all information needed for autonomous driving will be implemented in a cloud-based system."
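Purely as a thought experiment on what GCPs as "foundation layers" in a cloud-based HD map might look like, a tiny sketch follows; the layer names, fields and tolerance are invented and are not ZF's or Airbus's actual data model.

```python
# Invented sketch of a layered HD-map tile in which space-derived ground
# control points (GCPs) form a foundation layer under the semantic map.
from dataclasses import dataclass, field

@dataclass
class GroundControlPoint:
    point_id: str
    x: float          # metres east in the tile frame
    y: float          # metres north in the tile frame
    sigma_m: float    # stated absolute accuracy, e.g. < 0.10 m

@dataclass
class MapTile:
    tile_id: str
    # Foundation layer: independently surveyed reference points.
    gcp_layer: list = field(default_factory=list)
    # Semantic layer: lane geometry, signs, etc. (plain dicts here).
    semantic_layer: list = field(default_factory=list)

    def validate_against_gcps(self, tolerance_m=0.2):
        """Toy accuracy check: flag semantic landmarks that drift more than
        tolerance_m from the GCP they claim to correspond to."""
        gcps = {g.point_id: g for g in self.gcp_layer}
        drifted = []
        for lm in self.semantic_layer:
            g = gcps.get(lm.get("gcp_ref"))
            if g and ((lm["x"] - g.x) ** 2 + (lm["y"] - g.y) ** 2) ** 0.5 > tolerance_m:
                drifted.append(lm["name"])
        return drifted

tile = MapTile(
    "T_001",
    gcp_layer=[GroundControlPoint("gcp_1", 100.00, 200.00, 0.08)],
    semantic_layer=[{"name": "street_light_4", "gcp_ref": "gcp_1",
                     "x": 100.35, "y": 200.10}],
)
print(tile.validate_against_gcps())  # ['street_light_4'] – drift of ~0.36 m
```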

With ZF well along with the “ZF AD Environment” and coPILOT, and surely Nvidia Mapstream, how does Bosch respond?

Nicolas Peter, speaking at the Shanghai Auto Show, poured cold water on rumors that BMW was about to deepen its alliance with Daimler, saying BMW has no plans to develop a compact vehicle with its rival.

“We have no plans to develop a smaller car together with a German competitor,” Peter said.

The leak about Apple's lidar ambitions almost sounds reactionary [seeking a "revolutionary design" and worrying about form factor in 2019].

Radar imaging satellites like TerraSAR-X are able to acquire images with very high absolute geo-location accuracy, due to the availability of precise orbit information. By using multiple stereo images in a radargrammetric process, so-called Ground Control Points (GCPs) can be extracted. GCPs are precisely measured landmarks giving the exact position on the earth. These GCPs are derived from pole-like structures along the road, e.g. street lights, signs or traffic lights, since these objects have a high backscatter in the radar image and are therefore easily identifiable in multiple images. By using a stack of multiple TerraSAR-X images, a dense point cloud of GCPs with an accuracy of less than 10 centimeters can be automatically extracted.

However, in order to make use of this high positional accuracy for the use case of autonomous driving, the link between landmarks like street lights identified from mobile mapping data and the coordinates of the respective GCP needs to be established. The goal of this project is to find and implement an algorithm for the automatic matching of 3D point clouds from GCPs extracted by radar space geodesy and in-situ LIDAR mobile mapping data derived from a car acquisition. A precise matching process would enable the generation of an accurate data basis, an indispensable foundation for highly automated and autonomous driving.
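A toy sketch of that matching problem, under heavy simplifications (translation only, no outliers, tiny invented point sets; a real pipeline would use something like ICP with robust outlier rejection on far larger clouds): nearest-neighbour association between lidar-derived pole positions and the GCP cloud, then the mean residual as the correction to apply to the mobile-mapping data.

```python
# Align pole-like landmarks from mobile-mapping lidar with satellite-derived
# ground control points (GCPs) by estimating a rigid translation.
# Coordinates below are invented for illustration.
import math

GCPS = [(100.00, 200.00), (112.40, 203.10), (125.80, 199.50)]   # surveyed poles
LIDAR = [(100.85, 199.40), (113.25, 202.50), (126.65, 198.90)]  # same poles, biased

def match_and_estimate_shift(lidar_pts, gcps, gate=2.0, iterations=3):
    """Nearest-neighbour matching with a distance gate, then the translation
    (mean residual) that moves the lidar landmarks onto their GCPs."""
    shift = (0.0, 0.0)
    for _ in range(iterations):
        residuals = []
        for lx, ly in lidar_pts:
            x, y = lx + shift[0], ly + shift[1]
            gx, gy = min(gcps, key=lambda g: math.hypot(g[0] - x, g[1] - y))
            if math.hypot(gx - x, gy - y) <= gate:
                residuals.append((gx - x, gy - y))
        if not residuals:
            break
        dx = sum(r[0] for r in residuals) / len(residuals)
        dy = sum(r[1] for r in residuals) / len(residuals)
        shift = (shift[0] + dx, shift[1] + dy)
    return shift

print(match_and_estimate_shift(LIDAR, GCPS))  # ≈ (-0.85, +0.60): the lidar map's bias
```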

Nowhere near a breakthrough.

The Carmera and ZF/Airbus PoCs now seem to have been rolled out too early for press.
