Humanising Autonomy: ODDs Need a People Parameter: how humans are caught in the blind spot of urban mobility systems

Posted by STEER on October 14, 2020

Operational Design Domains (ODDs), Operational Domains (ODs) or even Operational Design Conditions (ODCs), as some experts refer to them, have been top of mind for mobility industry mavens since AVs entered the public consciousness. Their descriptions are nearly as numerous as the ODDs themselves, and they have been hailed as everything from the key to the rollout of autonomous vehicles, to critical to public safety, to, more colloquially, a bit of a pain to define. That’s not for lack of trying – SAE J3016 defines an ODD as:

Operating conditions under which a given driving automation system, or feature thereof, is specifically designed to function, including, but not limited to, the environmental, geographical and time-of-day restrictions, and/or the requisite presence or absence of certain traffic or roadway characteristics.

Recently, the British Standards Institution (BSI) launched PAS 1883, which provides the “requirements for the minimum hierarchical taxonomy for specifying an [ODD]”. Both examples are encouraging signs of the industry’s mission to prioritise safety, protecting passengers, drivers and pedestrians alike.
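To make this concrete, an ODD can be thought of as structured data describing where and when a feature is designed to operate. The sketch below is a simplified illustration only; the attribute names are assumptions loosely inspired by the kind of hierarchical taxonomy PAS 1883 calls for, not the standard’s actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: simplified attribute names, not the PAS 1883 schema.
@dataclass
class ODDSpecification:
    """A toy, human-readable ODD description for a single deployment."""
    road_types: List[str] = field(default_factory=lambda: ["urban_street"])      # scenery
    max_speed_kph: float = 50.0
    weather: List[str] = field(default_factory=lambda: ["clear", "light_rain"])  # environment
    time_of_day: List[str] = field(default_factory=lambda: ["daylight"])
    # Note how VRUs are typically folded into a generic "dynamic objects" list:
    dynamic_objects: List[str] = field(
        default_factory=lambda: ["vehicle", "pedestrian", "cyclist"]
    )
```

Even in this toy form, the imbalance discussed in the rest of this piece is visible: pedestrians and cyclists appear only as entries in a generic object list.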

The question is: do current ODD discussions go far enough? Says Mark Cracknell, Head of Technology, Zenzic: “Given the criticality of the ODD to ensuring the safe deployment of CAM, the ability to account for and include vulnerable road users is at the heart of what it means to deploy CAM safely and securely.” The majority of ODD factors relate to road conditions and scenarios, such as time of day, weather or terrain, before expanding to the finer details of what those conditions affect – glare on the camera, incorrect or outdated mapping detail, misleading water-filled potholes, reaction to downed power lines, and so on.

These “edge cases” are near endless, but not nearly enough of them include considerations for vulnerable road users. In January 2019, Philip Koopman, co-founder of Edge Case Research, co-authored How Many Operational Design Domains, Objects, and Events?, a four-page bulleted list of scenarios, factors and manoeuvres to consider when classifying the environments autonomous vehicles should operate within.

Data collection regarding road conditions is nowhere near complete, though the industry’s enthusiasm is steadily pushing this work forward. However, for standards and regulations that are primarily meant to protect public safety, there is a lack of focus on the “pedestrian parameter”, or vulnerable road users (VRUs). In most studies looking at safety for autonomous vehicles and automated driving, VRUs are consistently lumped into the “dynamic object” classification, defined as anything moving with intention, including other vehicles or drones, their operators, pedestrians and animals. Often this classification does not go far enough and can result in dangerous oversights, especially for VRUs. It does not account for the different interactions between vehicles and VRUs, address Right of Way (RoW) expectations that prioritise safety, or give sufficient context to the artificial intelligence (AI) involved. Even if the choice is made to include VRUs within the dynamic object category, further fleshing out of this specific actor is needed.

Current ODD parameters do not account for cultural considerations that may influence how VRUs behave in a road environment, nor do they consider differently abled persons or situational context as essential human factors that affect the performance of AI systems. Given this complexity, special consideration for VRUs is paramount, requiring a separate category, or parameter, in ODD classification.
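One way to picture what such a separate parameter could carry is to compare it with a flat “dynamic object” entry. The field names below are hypothetical, intended only to illustrate the kind of human-specific context that would need fleshing out, not a proposed standard.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of a dedicated VRU parameter, separate from generic dynamic objects.
@dataclass
class VRUParameter:
    category: str                       # e.g. "pedestrian", "cyclist", "wheelchair_user"
    right_of_way_context: str           # e.g. "zebra_crossing", "shared_space", "informal_crossing"
    mobility_profile: Optional[str]     # e.g. "reduced_mobility", "child", "older_adult"
    local_norms: List[str]              # cultural/behavioural expectations, e.g. "mid_block_crossing_common"
    situational_context: Optional[str]  # e.g. "outside_bar_at_night", "school_zone_at_dismissal"
```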

Already defined ODDs should be representative of the varied and complex situations that arise in cities – a cyclist turning left in a blind spot, or a pedestrian stepping into the road without looking in front of a shopping street or near bars. With this distinction in mind, how does the industry move forward to ensure that people are appropriately accounted for when building new urban mobility systems with autonomous technology?

Quintessential human characteristics to build out the “people parameter”
ODDs and autonomous vehicles’ AI are inextricably linked – for higher levels of autonomy, manufacturers must be able to indicate which ODDs the vehicle has been designed and tested for, defining when and where it can safely operate. Across the stack, AI is used for vision, path prediction and often decision making, intermixed with numerous algorithms. For this to be safe, the vehicle needs to be able to detect where its ODD ends and when the human should take over the car again, also referred to as attribute awareness. An example of this is Tesla’s auto-summon parking feature. In theory, the car should be able to sense when it can and cannot offer auto-summon by assessing its environment and deducing whether or not it is in a parking lot, informing the user in the process.
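As a rough illustration of that kind of self-monitoring, the sketch below compares the conditions a vehicle currently perceives against its declared ODD and withholds the feature when any condition falls outside it. This is a deliberately simplified, assumption-laden sketch, not how any production system (Tesla’s included) actually works; the field names and thresholds are invented for illustration.

```python
# Simplified sketch of ODD-boundary awareness: compare perceived conditions
# against the declared ODD and only offer the feature when inside it.
DECLARED_ODD = {
    "road_types": {"parking_lot"},
    "weather": {"clear", "light_rain"},
    "time_of_day": {"daylight"},
    "max_speed_kph": 10.0,
}

def within_odd(perceived: dict, odd: dict = DECLARED_ODD) -> bool:
    return (
        perceived["road_type"] in odd["road_types"]
        and perceived["weather"] in odd["weather"]
        and perceived["time_of_day"] in odd["time_of_day"]
        and perceived["speed_kph"] <= odd["max_speed_kph"]
    )

def next_action(perceived: dict) -> str:
    # Outside the ODD the feature is not offered and the user is informed.
    return "offer_feature" if within_odd(perceived) else "inform_user_unavailable"
```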

With this logic, cars should be able to identify which road they are on (highway versus urban street), and therefore which ODD they are operating within. For many, when it comes to training datasets to recognise vulnerable road users, the variables become innumerable, making them impractical to include in an ODD. However, the industry must work together to develop methods that identify “people” regardless of skin colour, age, hairstyle or any other cultural human attributes. It is important to note that some VRU groups can be specially segmented for edge case testing – those using a wheelchair, for example – as they engage in different behaviours on the road. Currently, this is not being done nearly well enough; a recent study demonstrates how biases in data labelling can lead to inherent biases in the detection mechanisms of autonomous vehicles when sensing dark-skinned pedestrians compared with their light-skinned counterparts.
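One practical way to surface the kind of bias that study describes is to report detection performance per demographic group rather than as a single aggregate figure. A minimal sketch, assuming a labelled evaluation set where each ground-truth pedestrian carries a group label and a flag for whether the detector found them:

```python
from collections import defaultdict

def recall_by_group(annotations):
    """Per-group recall: the share of ground-truth pedestrians the detector found.

    `annotations` is assumed to be an iterable of dicts such as
    {"group": "darker_skin_tone", "detected": True} - a simplified stand-in
    for a real labelled evaluation set.
    """
    found, total = defaultdict(int), defaultdict(int)
    for ann in annotations:
        total[ann["group"]] += 1
        found[ann["group"]] += int(ann["detected"])
    return {group: found[group] / total[group] for group in total}

# A large gap between groups signals that labels or training data need rebalancing
# before the detector can be trusted across the whole population.
```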

Ensuring that AI systems behave in ways that are ethically and societally sound is of particular importance for the mobility industry, and minimising algorithmic bias is something all stakeholders will need to work towards and push for. Explainability, trust, accountability, repeatability and responsibility should be guiding principles when approaching automation. In practice, this means building out edge cases and ODD classifications that prioritise humans and VRUs separately, beyond dynamic objects. This must be applied to every ODD – not just those covering urban environments, such as a parking lot or a city street. Just as all likely scenarios should be accounted for, so must their opposites, in case the vehicle finds itself in a less likely situation. For example, AVs or autonomous trucks trained for operation on a motorway, interstate or freeway should be able to identify, track and respond to humans in that environment. Recent protests on US highways are a prime example of why comprehensive classification is essential to ethical ODD practices.

Shared datasets to protect VRUs
Just as ethical AI practices rely on the mobility industry working collaboratively, so do efforts to build a public dataset for VRUs. Effectively engineering safety-critical functions for AVs means building a set of relevant governance tools, including a robust database that can drive industry, regulators, insurers and other relevant stakeholders towards a consensus on safety. Data sharing plays a key role in this initiative: multi-disciplinary data sharing will shed more light on “edge cases” and will allow AV developers to benchmark their performance in these scenarios in simulation and share best practice. Accounting for these edge cases is necessary for the public safety of VRUs; they are far less safe if a system cannot cope with environmental diversity.
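Such a shared dataset would not need to be exotic; even agreeing a minimal common record for each VRU edge-case scenario would let developers benchmark against the same situations in simulation. The fields below are purely illustrative assumptions, not an existing industry schema.

```python
from dataclasses import dataclass

# Purely illustrative: a minimal common record for a shared VRU edge-case scenario.
@dataclass
class VRUScenarioRecord:
    scenario_id: str        # e.g. "cyclist_left_turn_in_blind_spot_001"
    road_context: str       # e.g. "urban_intersection", "motorway_shoulder"
    vru_type: str           # e.g. "cyclist", "wheelchair_user", "child_pedestrian"
    behaviour: str          # e.g. "crosses_without_looking", "turns_left_from_blind_spot"
    expected_response: str  # e.g. "yield_and_brake", "maintain_lateral_clearance"
    outcome_metric: str     # e.g. "min_time_to_collision_s"
```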

Agreeing common scenarios in which vehicles with higher levels of automation must demonstrate a basic level of operational safety will help form a common safety understanding between industry and regulators, clearing up uncertainties and providing insights to regulators on the evolving state of AV safety. With that insight, shared data pools can help develop policy instruments, avoiding scattered national and international approaches and requirements.

Moving forward as an industry
It’s clear that including people in the dynamic object parameter does not go far enough in prioritising VRU safety. Autonomous and automated driving systems do not account for a wide enough subset of human behaviour, and they assume that edge cases happen rarely, whereas in reality they are everyday occurrences. Introducing ethics into the defining principles for ODDs and working towards shared datasets of vulnerable road user behaviour are both good first steps towards industry consensus on safety guidance for pedestrians and VRUs. Future work will also need to continue to address ODDs from a wide variety of viewpoints, including AV manufacturers, verification and validation stakeholders, and end users, and will need to be based on a common language and taxonomy. Only by using a formalised and agreed-upon ontology will the industry be able to develop ODDs representative of everyday urban scenarios that incorporate the needs of VRUs.

Read the original article on the Humanising Autonomy blog here.