Safe and Agile Verification Framework - Autonomous Vehicles

Sensor emulation has long been used in automotive engineering to speed up product development and generate consistent results while many design changes are underway. In this post, we discuss why sensor emulation in autonomous vehicles is so important for successful resource planning, on-time product release, safety compliance, and demonstrating that the ODD (Operational Design Domain) is satisfied.

[Figure: Retrospect Testbed-Agnostic Verification Framework]

First, looking at Retrospect’s ADS Testbed-Agnostic Verification Framework in the figure above, we see the Emulation Levels depicted in the red box, from Level 0 to Level 3. Here’s a quick description of each level:

  • Emulation Level 0 – At this level, we are creating the sensor inputs at the hardware level. Examples of Level 0 sensor emulation would include a projected video image in front of a camera, a robotic platform shaking an inertial measurement unit (IMU), or a simulation of radar waves acting on an antenna.

  • Emulation Level 1 – At Level 1, the raw analog output or digital data from the sensor is generated, for example pixel-format data or point-cloud data.

  • Emulation Level 2 – This is the level that would likely have some filtering and processing relating to the specific autonomous detection function, such as detecting the green light from a traffic signal. Emulated sensor models may include white noise error and time delays.

  • Emulation Level 3 – Level 3 is where the “sensing” function ends and all the necessary sensor information has been provided. For example, an “ideal” sensor that passes information from a simulation engine to the autonomous software stack would be an Emulation Level 3 sensor. Latency and static or transient error models may be part of Emulation Level 3 inputs.
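
To make Level 3 concrete, here is a minimal Python sketch of an “ideal” sensor that passes ground truth from a simulation engine to the ADS stack, with an optional fixed latency and static Gaussian position error. The data format, field names, and default values are illustrative assumptions, not part of any real tool:

```python
import random
from dataclasses import dataclass

@dataclass
class GroundTruthObject:
    """Hypothetical ground-truth record produced by a simulation engine."""
    object_id: int
    lateral_m: float       # lateral offset from the ego trajectory, meters
    longitudinal_m: float  # distance ahead of the ego vehicle, meters

def emulate_level3(objects, latency_s=0.05, pos_noise_std_m=0.1):
    """Emulation Level 3 sketch: forward ground truth to the ADS stack,
    optionally adding a fixed latency and static Gaussian position error."""
    emulated = []
    for obj in objects:
        emulated.append(GroundTruthObject(
            object_id=obj.object_id,
            lateral_m=obj.lateral_m + random.gauss(0.0, pos_noise_std_m),
            longitudinal_m=obj.longitudinal_m + random.gauss(0.0, pos_noise_std_m),
        ))
    # The latency offset models when this frame becomes visible to the ADS.
    return {"timestamp_offset_s": latency_s, "objects": emulated}
```

Setting `pos_noise_std_m=0.0` recovers a perfectly “ideal” sensor, which is often a useful first configuration before layering on error models.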

It’s important to note that while these Emulation Levels are abstract, they are useful concepts for discussing a particular sensor or sensing mode. They aren’t rigid categories to be taken too literally; precision isn’t the goal.

The Emulation Level says nothing about the state of development or verification completion for an ADS. It also says nothing about the “fidelity” of the emulated sensor outputs with respect to the intended or desired scenario. Different interpretations of Emulation Level can coexist; what really matters is that everyone on the team agrees on the point at which the emulated sensor will interface with the rest of the ADS. If there is any disagreement on that point, the next step cannot be completed.

The next step after determining the sensor Emulation Level is to prescribe the desired ADS output response when a certain emulated sensor condition exists. In short: Write requirements! Make sure to write the requirements in terms of the emulated sensor signals. Here’s an example of an Emulation Level 3 requirement:

[REQ-01]       The [ADS_Trajectory_Stop_Point] shall be less than or equal to [15 m] within [100 ms] if a [VRU] is present within [1 m laterally] of the [ADS_Trajectory_Path] and [15 m longitudinally] [in front of the EGO].

So, in the example of REQ-01, the conditional criteria of the requirement, “if a [VRU] is present within [1 m laterally] of the [ADS_Trajectory_Path] and [15 m longitudinally] [in front of the EGO],” reads like a requirement that could be tested on the track or in a software-in-the-loop (SIL) tool using the emulated sensor signal. This is exactly what Emulation Level 3 is trying to do: it translates from a real-world or ground-truth frame to an internal world-model frame and generally reads like a scenario description.
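
Because the requirement is written in terms of emulated sensor signals, it translates naturally into a SIL pass/fail check. The following Python sketch shows what such a check for REQ-01 might look like; the function name, parameters, and the vacuous-pass convention are our illustrative assumptions, not any particular test framework:

```python
def check_req_01(stop_point_m, response_time_ms, vru_lateral_m, vru_longitudinal_m):
    """Sketch of a SIL pass/fail check for REQ-01 (names are illustrative).

    If a VRU is within 1 m laterally of the planned path and within 15 m
    longitudinally ahead of the ego, the trajectory stop point must be
    <= 15 m and the response must arrive within 100 ms.
    """
    triggered = abs(vru_lateral_m) <= 1.0 and 0.0 < vru_longitudinal_m <= 15.0
    if not triggered:
        return True  # condition not met; requirement is vacuously satisfied
    return stop_point_m <= 15.0 and response_time_ms <= 100.0
```

A SIL harness would feed this check with the emulated VRU position (the Level 3 signal) and the observed ADS outputs, logging a failure whenever it returns False.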

But what about lower levels of emulation? It is difficult to give a meaningful example of a lower Emulation Level without expertise in a specific sensor technology, so we won’t go that deep here. Instead, here are two examples of Emulation Level 2 requirements, followed by a description of the process for getting to lower levels from there:

[REQ-02]       The [VRU_Object_Track] shall be initialized within [50 ms] if a [REFLECTIVITY CLASS 2 PEDESTRIAN] is present within [2 m laterally] of the [ADS_Trajectory_Path] and [30 m longitudinally] [in front of the EGO] and [VISIBILITY CLASS 2 ENVIRONMENT] conditions exist.

[REQ-03]       The [VRU_Object_Track] shall be initialized within [80 ms] if a [REFLECTIVITY CLASS 2 PEDESTRIAN] is present within [2 m laterally] of the [ADS_Trajectory_Path] and [30 m longitudinally] [in front of the EGO] and [VISIBILITY CLASS 3 ENVIRONMENT] conditions exist.

The very important implication in REQ-02 and REQ-03 is that specific terms like “REFLECTIVITY CLASS 2 PEDESTRIAN” and “VISIBILITY CLASS 3 ENVIRONMENT” are 1) defined, and 2) defined in a way that relates to the performance of the emulated signal. For example, with a camera, the emulated sensor signal might have poorer color saturation and a longer pixel “rise distance” (more blurriness) in a “VISIBILITY CLASS 3 ENVIRONMENT” than in a CLASS 2 or CLASS 1 ENVIRONMENT. Color saturation and pixel “rise distance” can be bounded and specified into meaningful brackets that the team finds helpful for further development. So, really, the most important aspect of going down to lower levels of sensor emulation is ensuring that requirements on the emulated sensor signal can trace to higher levels of emulation, and eventually up to the highest levels of real-world scenarios.
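
As a sketch of how such brackets might be encoded, the following Python maps hypothetical camera-signal metrics (color saturation and pixel rise distance) to VISIBILITY CLASS labels. The thresholds are purely illustrative and not taken from any standard; a real team would derive them from sensor characterization data:

```python
# Hypothetical brackets relating camera-signal quality metrics to
# VISIBILITY CLASS labels. Thresholds are illustrative only.
VISIBILITY_CLASSES = [
    # (class label, min color saturation [0-1], max pixel rise distance [px])
    (1, 0.80, 2.0),
    (2, 0.60, 4.0),
    (3, 0.40, 8.0),
]

def classify_visibility(saturation, rise_distance_px):
    """Return the best (lowest-numbered) visibility class whose bounds the
    emulated camera signal satisfies, or None if it satisfies none."""
    for cls, min_sat, max_rise in VISIBILITY_CLASSES:
        if saturation >= min_sat and rise_distance_px <= max_rise:
            return cls
    return None
```

Encoding the brackets this way makes a term like “VISIBILITY CLASS 3 ENVIRONMENT” traceable: a Level 2 requirement can be checked directly against the measured properties of the emulated signal.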

This brings us to our third and final step: the Operational Design Domain, or ODD.

The definition of the ODD is currently receiving a lot of attention, for good reason. One important reason is that a well-defined ODD is necessary for safety-standards compliance: the notation used to define conditions in safety requirements and safety testing has to be clearly understood and quantifiable, as discussed in this paper regarding ODD and Safety Argumentation. Another important reason is that the ODD language does not need to be unique to every developer; it is improved by a common definition across the industry. SAE Best Practices for ODD, ISO/SAE AWI PAS 22736, and ASAM OpenODD are all examples of industry consortia working on the ODD definition.

The ODD definition won’t be re-hashed here. You can find our slides from the ASAM OpenODD Ideation Workshop here, but the main point is this: The ODD must be able to translate all the way down to Emulation Level 0 interfaces with the ADS (bottom left corner).

[Figure: ODD language]

Consider the figure above. If the ODD language is not connecting down to the ADS interfaces in the lower left, like sensor inputs and the output at the wheels, then the rubber is still not meeting the road, so to speak. If the ODD language does not include descriptions of the ADS inputs and outputs, then something else will have to. It’s important for developers, integrators, providers, and safety certifying bodies to have a common understanding of the types of environments the ADS was designed and tested for, and to be able to independently replicate and prove how the ADS performs in specific conditions.

To clearly illustrate how the Emulation Levels help connect the scenario description to the specific sensor requirements, consider the same figure from above overlaid with the Emulation Levels:

[Figure: ODD language overlaid with Emulation Levels]

The goal is to show that the effort to describe real-world conditions consistently and clearly in an easy-to-understand language, while beneficial, only re-defines Scenario Descriptions (such as ASAM’s OpenSCENARIO) if it does not also describe low-level ADS sensor inputs and actuator outputs.

So how do we tie all of this together? In order to verify the safe performance of an ADS, everyone needs to be able to understand what the ADS was designed for, what the ADS design requirements are, what the safety requirements are, and to be able to measure the ADS performance. The high risk of product liability with the ADS requires more than just the developers knowing what is safe and unsafe for the ADS.

With a shared understanding of Emulation Levels, as shown in the Retrospect ADS Testbed-Agnostic Verification Framework, developers can communicate with outside parties and across development teams about how their part interacts with and meets the safety requirements.

Through the use of Emulation Level definitions, the development and testing processes can be planned in parallel, shortening development effort and hitting milestones on time. Resources can be distributed to the toughest problems identified across the product, instead of always going to the latest bottleneck that appears to be slowing everyone down.

If you would like help with completing ADS safety requirements or test cases, setting up Emulation Levels for speeding up development time, or would like to begin preparing to meet the evolving safety standards of the future, please contact us!

For more articles like this, be sure to follow us on LinkedIn and sign up for our newsletter. We will be asking for your input on future ADS safety topics you would like to see covered. Please comment or contact us about what you found helpful and what you would like explained further.

Michael Woon