Lizard glances: the Achilles' heel of DMS that rely on head pose

There is growing acknowledgement that Driver Monitoring Systems (DMS) offer a means to identify and prevent driver distraction, through their ability to recognise distracting behaviour and warn drivers. According to the U.S. National Highway Traffic Safety Administration (NHTSA), 13% of distraction-affected fatal crashes in the U.S. in 2019 involved mobile phone use.

The two most recognised types of DMS are those that detect whether drivers' hands are on the steering wheel and those that use cameras to measure driver attention and state. The former typically use torque or capacitive sensors to infer whether hands are on the wheel, while the latter use cameras to track the head and eye movements associated with unsafe driver states. We previously outlined some major issues with steering-based DMS when we highlighted that not all DMS are created equal.

While there is growing acknowledgement that camera-based DMS offer the best way to support driver attention, there are also important differences among camera-based systems, and these have received little attention to date.

A major difference relates to which data from the driver is used to infer where they are looking. It is often assumed that all camera-based DMS track eye movements to determine a driver’s gaze direction. In fact, many DMS instead use the driver’s head movements (i.e., ‘head pose’) to infer gaze direction. Accurately tracking head movement in an unpredictable real-world environment with drivers of varying shapes and sizes is challenging, but tracking eye movements is even tougher.
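
To make that distinction concrete, the short Python sketch below contrasts the two inputs: one function infers gaze from head pose alone, the other combines head pose with eye-in-head rotation. The angle conventions and the simple additive model are illustrative assumptions, not a description of any particular production system.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One frame of driver-facing camera output (angles in degrees).

    Head angles are relative to the vehicle; eye angles are relative to
    the head. 0/0 means facing the road ahead. Conventions are illustrative.
    """
    head_yaw: float
    head_pitch: float
    eye_yaw: float
    eye_pitch: float

def gaze_from_head_pose(f: Frame) -> tuple[float, float]:
    """Head-pose-only DMS: assumes the eyes point wherever the head points."""
    return f.head_yaw, f.head_pitch

def gaze_from_head_and_eyes(f: Frame) -> tuple[float, float]:
    """Eye-tracking DMS: head rotation plus eye-in-head rotation
    (a simplified additive model, for illustration only)."""
    return f.head_yaw + f.eye_yaw, f.head_pitch + f.eye_pitch

# A glance to the right side mirror: head and eyes move together,
# so both approaches agree that the driver is looking well off-road.
mirror_glance = Frame(head_yaw=45.0, head_pitch=-5.0, eye_yaw=10.0, eye_pitch=0.0)
print(gaze_from_head_pose(mirror_glance))      # (45.0, -5.0)
print(gaze_from_head_and_eyes(mirror_glance))  # (55.0, -5.0)
```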

Head pose is often a reliable indicator of where someone is looking. Research with pedestrians using their mobile phones showed that they coordinate their eye and head movements when looking down at their phones. The sheer prevalence of this behaviour led to the term 'text neck' being coined.

Other research shows that this eye-head coordination also occurs in vehicles. Over the past ten years, studies have found that when drivers look toward regions of the vehicle that require large gaze shifts away from the road ahead (e.g., the centre console, side mirror, or their lap), their head mostly moves in the same direction as their eyes.

All of this suggests that detecting head movement is a reliable way to infer where a driver is looking. Given the concerning number of fatalities arising from mobile phone distraction, however, it is worth considering whether head pose on its own is also sufficient for recognising drivers distracted by their phone.

Humans naturally gravitate toward behaviour that minimises effort (i.e., we are lazy). As we found in our research with a level 2 automated vehicle on a test track, drivers using a mobile phone often hold the device near the forward roadway, for example around the steering wheel region. Unlike the text neck behaviour seen in pedestrians, many drivers move their eyes independently of their head when interacting with a mobile phone. This eye-head dis-coordination in distracted driving minimises the effort of splitting attention between the road and the phone. Recent research conclusively shows that this presents a problem for DMS that use only head pose to infer distraction.

In automotive human factors research, these types of eye-head movements are known as 'lizard glances', owing to the way lizards look around their environment with minimal head movement. They contrast with 'owl glances', where the eyes and the head generally move together. Lizard glances are the Achilles' heel of DMS that rely on head movements to infer gaze direction.
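
As a rough illustration of the distinction, the sketch below labels an off-road glance as 'owl' or 'lizard' according to how much of the overall gaze shift is carried by head rotation versus eye rotation. The angles, the 50% threshold and the additive convention are assumptions made for this example only, not how any production DMS classifies glances.

```python
def classify_glance(head_yaw: float, head_pitch: float,
                    eye_yaw: float, eye_pitch: float,
                    head_share_threshold: float = 0.5) -> str:
    """Classify a glance as 'owl' or 'lizard' (angles in degrees).

    Head angles are relative to the vehicle; eye angles are relative
    to the head (illustrative convention).
    """
    head_mag = (head_yaw ** 2 + head_pitch ** 2) ** 0.5
    eye_mag = (eye_yaw ** 2 + eye_pitch ** 2) ** 0.5
    total = head_mag + eye_mag
    if total == 0:
        return "on-road"  # no gaze shift at all
    head_share = head_mag / total
    # Owl glance: the head carries most of the gaze shift.
    # Lizard glance: the eyes carry most of it while the head stays put.
    return "owl" if head_share >= head_share_threshold else "lizard"

# Head turned toward the centre console, eyes following: an owl glance.
print(classify_glance(head_yaw=-30.0, head_pitch=-10.0, eye_yaw=-5.0, eye_pitch=-5.0))
# Head facing the road, eyes dropped to a phone near the wheel: a lizard glance.
print(classify_glance(head_yaw=2.0, head_pitch=-3.0, eye_yaw=0.0, eye_pitch=-25.0))
```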

The safest and most reliable DMS are those that can detect driver eye movements in addition to head movements. Head pose remains an important input for measuring driver gaze direction; for example, it is a useful and necessary surrogate for glances that go beyond 60 degrees horizontally relative to the DMS sensor. However, DMS that are unable to track eye behaviour will fail to protect road users against one of the most dangerous and prolific behaviours a driver can engage in: mobile phone use.
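
As a hedged sketch of how such a hybrid might behave, the example below prefers eye-based gaze whenever the eyes are reliably tracked and the head stays within the 60-degree envelope mentioned above, and falls back to head pose otherwise. The quality score, thresholds and angle conventions are hypothetical.

```python
from typing import Optional

EYE_FALLBACK_HEAD_YAW_DEG = 60.0  # beyond this, the eyes are typically not visible to the sensor

def estimate_gaze(head_yaw: float, head_pitch: float,
                  eye_yaw: Optional[float], eye_pitch: Optional[float],
                  eye_track_quality: float) -> tuple[float, float, str]:
    """Return (gaze_yaw, gaze_pitch, source), using the eyes when possible.

    eye_yaw / eye_pitch are None when the eyes cannot be seen (e.g. occlusion);
    eye_track_quality is a hypothetical 0..1 confidence score.
    """
    eyes_usable = (
        eye_yaw is not None
        and eye_pitch is not None
        and eye_track_quality >= 0.5
        and abs(head_yaw) <= EYE_FALLBACK_HEAD_YAW_DEG
    )
    if eyes_usable:
        # Simplified additive model: gaze = head rotation + eye-in-head rotation.
        return head_yaw + eye_yaw, head_pitch + eye_pitch, "eyes+head"
    # Fall back to head pose alone for extreme rotations or lost eye tracking.
    return head_yaw, head_pitch, "head-only"

# Checking the blind spot: head rotated well past 60 degrees, so head pose stands in.
print(estimate_gaze(head_yaw=75.0, head_pitch=0.0, eye_yaw=None, eye_pitch=None,
                    eye_track_quality=0.0))
# Lizard glance at a phone: eyes are tracked, so the downward gaze is caught.
print(estimate_gaze(head_yaw=2.0, head_pitch=-3.0, eye_yaw=0.0, eye_pitch=-25.0,
                    eye_track_quality=0.9))
```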

OEMs must consider this when designing DMS, and regulatory bodies around the world must account for it in distraction-related safety regulations.