1 Background

1.1 Inclusive Design and Exclusion Calculation

Inclusive design researchers have attempted to understand which characteristics of interactions cause difficulty and exclusion for people in general, and specifically for people with capability impairments [1]. It has been demonstrated that adopting inclusive design tools and processes during the design and development of mainstream products and services can not only improve uptake among those with capability impairments, but also improve the user experience for those who do not consider themselves impaired [2].

One of the tools used to evaluate the inclusivity of a user journey is the Inclusive Design Toolkit’s Exclusion Calculator [3]. It estimates the proportion of the adult UK population who are unable to achieve certain interactions, due to degraded perceptual and motor skill performance, by comparing task demand against capability data collected in the 1996/97 Disability Follow-Up Survey [4]. For example, the calculator can estimate the percentage of UK adults who would be unable to read a small font used on a display by asking whether the task would be possible for someone who can read a newspaper headline, someone who can read a large print book, or someone who can read ordinary newsprint. Using similar comparisons, the calculator helps predict the proportion of the UK adult population excluded by the visual, hearing, thinking, dexterity, reaching and locomotion demands of the interaction required to achieve a goal. This enables an assessment of a specified user journey that yields the percentage of the UK population unable to complete the tasks, either at an individual task level or aggregated across the complete journey. By extension, this population exclusion is a proxy measure for the difficulty that fully able users are also likely to experience.

However, it is recognised that the cognitive part of the exclusion calculator needs further development to allow a more comprehensive assessment of the exclusion caused by the thinking required, particularly the load caused by unfamiliarity in digital interfaces [5]. For some people interacting with technology, it is the learning requirements that cause the most exclusion [6, 7]. Studies of digital interface tasks have shown that these can present an impenetrable barrier, particularly for older and novice users of digital technology [8]. Previous research indicates that experience with particular interaction patterns, combined with the cognitive ability to transfer learning from one interface to another, potentially unrelated, interface, is key to an ‘inclusive’ experience with a new interface [9]. Clearly, learning and trial-and-error approaches relate to the cognitive aspect of interacting with an interface, but the data accessed by the exclusion calculator is not directly comparable [5]. So whilst accurate quantification of cognitive exclusion is not possible with the exclusion calculator, there is evidence that its use can identify potential issues.
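
To make the aggregation step concrete, the sketch below illustrates one plausible way a journey-level estimate could be computed from per-task demand levels. It is a minimal illustration only: the demand scale, the exclusion percentages and the simplifying assumption that the most demanding task dominates are all invented for this example, and do not reproduce the calculator’s actual survey data or method.

```python
# Minimal, hypothetical sketch of per-task and journey-level exclusion.
# The demand levels and percentages below are invented for illustration;
# the real Exclusion Calculator draws on Disability Follow-Up Survey data
# and covers vision, hearing, thinking, dexterity, reach and locomotion.

# Hypothetical % of UK adults excluded at each visual demand level.
VISION_EXCLUSION = {
    "read newspaper headline": 0.5,   # least demanding benchmark
    "read large print book": 2.0,
    "read ordinary newsprint": 6.0,   # most demanding benchmark
}

def task_exclusion(demand: str) -> float:
    """Look up the percentage excluded by a single task's visual demand."""
    return VISION_EXCLUSION[demand]

def journey_exclusion(demands: list[str]) -> float:
    """A journey fails if any task fails, so under the simplifying
    assumption that those excluded at a lower demand level are also
    excluded at every higher level, the hardest task sets the figure."""
    return max(task_exclusion(d) for d in demands)

journey = ["read large print book", "read ordinary newsprint"]
print(f"Estimated exclusion: {journey_exclusion(journey)}% of UK adults")
```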

Some of this evidence was provided during a recent training exercise for corporate clients. A design engineer from the pharmaceutical industry commented that carrying out an exclusion audit over a two-hour period with a group of four trainees had elicited the top three concerns that had taken hundreds of thousands of dollars to identify through conventional user trials. However, further research would clearly be needed to validate this observation.

1.2 Ageing Driver Population

It is well documented that populations in the developed world are ageing, partly as a result of increased life expectancies and partly due to historic population anomalies such as those that precipitated the baby boomer generation. In the UK, for example, the group aged 85 and older is forecast to increase by over 100% from 2016 to 2036 (data drawn from [10]). As a result, for some premium vehicle models sold in developed markets the average age of the buyer can be as high as 70, and one premium vehicle brand has seen the average age of its US buyers rise by ten years in just 13 years: from 51 in 2000 [11] to 61 in 2013 [12].

Ageing is known to affect human capability in vision, hearing, cognition, dexterity and locomotion [3]. A familiar example is ‘age-related long-sightedness’ (presbyopia), which starts to affect people over the age of 40 and makes adjusting the focal length of the eye to near distances more difficult; it can therefore cause exclusion in situations where reading glasses are not worn or varifocal lenses are inappropriate.

An ageing driver population therefore needs vehicle interfaces to be designed inclusively, to ensure that manufacturers remain able to sell their products to as many consumers as possible, particularly in developed markets.

1.3 Automotive HMIs Through the Years

Since the car was invented over 100 years ago, there has been a slow convergence in the placement, direction-of-motion and logic of operation of both primary and secondary driving controls. For example, in the 1930s the UK vehicle manufacturer Alvis positioned the accelerator (throttle) between the clutch and brake pedals (see Fig. 1); since then, the convention of positioning the accelerator to the right of the brake pedal has been consistently applied in production vehicles. It has long been recognised that this consistency allows drivers to leap from one vehicle into another and experience ‘near transfer’ [13]. Near transfer allows a driver to bring the procedural knowledge gained from driver education and experience successfully to bear on a new vehicle. Whilst drivers do expect to make some accommodation to an unfamiliar vehicle, critical primary driving controls would not be expected to change position dramatically, and hitherto, in the main, they have not.

Fig. 1. 1933 Alvis 12 interior showing the worn metal accelerator pedal visible through the steering wheel, between the darker clutch (left) and brake (right) pedals. Picture attribution: Steve Glover, reproduced under the Creative Commons Attribution 2.0 Generic license.

2 Recent Developments in Automotive HMIs

However, there is a trend for some secondary, and some less frequently used primary, driving controls to change location, mode of operation and logic. An example is the handbrake, parking or emergency brake control, which in many markets was fairly consistently located on the centre transmission tunnel, between the driver and the front passenger. It was activated by pulling upwards on a large lever, raising it to an inclined position, with the force applied to the braked wheels proportional to the force exerted on the lever. It was deactivated by pulling up on the lever whilst depressing the end lock button, then lowering the lever back to the horizontal position. The difficulty and exclusion caused by this system were due to the potentially high forces required to pull on the lever to generate sufficient braking force, and the high forces then required to release the brake when someone stronger had previously set it.

The technology that has altered this interaction is the Electronic Parking Brake (EPB) and its sister feature, Hill Start Assist (HSA). This technology applies the braking force with electric power, so the interfaces have changed from mechanical levers to electronic switches, or in some vehicles or usage cases have been made redundant, as the vehicle can activate the brake autonomously. However, switch locations and operating logic have diverged significantly across brands and models (see Fig. 2). Some manufacturers locate the switches on the dashboard (instrument panel), others on the centre console (where the handbrake lever was traditionally located), and others have combined them with gear lever controls [13]. Implementations range from automatic application and release, for example to provide the HSA functionality, to simply mimicking the mechanical system, requiring driver activation and deactivation. Switch operation has also diverged, with some requiring a ‘pull’ and others a ‘push’ to activate the brake. This technology has removed the exclusion and difficulty posed by the biomechanical force required to operate the lever. Instead, it introduces a cognitive task of learning and recalling where the controls are and understanding the modes of operation for each individual vehicle. In addition, there is the challenge of attempting to learn a mental model that applies across all the varieties of EPB a driver may encounter, whilst not becoming reliant on one behaviour that may be inappropriate in another vehicle. This may not be easy for some drivers to do without making significant errors.
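
The scale of this learning demand can be illustrated by enumerating the design choices described above. The sketch below is illustrative only: the categories are drawn from the examples in this section rather than from any survey of production vehicles, but it shows how quickly the number of distinct EPB mental models a driver may encounter can grow.

```python
from itertools import product

# Divergent EPB design dimensions, taken from the examples discussed
# above (illustrative categories, not a survey of production vehicles).
locations = ["dashboard", "centre console", "combined with gear lever"]
switch_actions = ["pull to apply", "push to apply"]
automation = ["automatic apply/release", "driver activation/deactivation"]

# Each combination is a potentially distinct mental model that a driver
# moving between vehicles may need to learn and keep separate.
variants = list(product(locations, switch_actions, automation))
print(f"{len(variants)} possible EPB interaction variants")  # 3 * 2 * 2 = 12
```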

Fig. 2. Divergence of control location for the Electronic Parking Brake (EPB) from the conventional mechanical handbrake lever.

Automotive manufacturers are under considerable market pressure to add features to their vehicles, as consumers are often seduced by the promise of new functionality and marketers look to new features to draw attention and sell the product. However, the HCI community has long recognised that added functionality nearly always brings a reduction in usability. Since the advent of screen-based menu functionality in modern vehicles, the number of features that can be integrated is almost unlimited, and manufacturers have taken advantage of this to provide extensive consumer customisation and functionality. However, the use of such systems often requires some level of digital experience and exposure to interaction patterns from the digital world.

More recently, the increasing complexity of available functions and the consequent desire to integrate them into standalone electronic HMI units have caused secondary controls to diverge in terms of their interaction patterns, locations and the mental models required to operate them. This was most markedly seen with the BMW iDrive HMI system, introduced in 2001 in the E65 7 Series sedan. This system replaced large numbers of discrete buttons on the dashboard and console with a sophisticated haptic rotary and joystick controller in the centre console and an associated display screen at the top of the dashboard. The interface implicitly assumed that drivers would learn to use the system, either from a position of relative digital naivety or, more optimistically, by extension of prior digital interface experience, if they achieved ‘near transfer’.

Since then, partly as a result of increased function automation and the availability of Advanced Driver Assistance Systems (ADAS), some other primary driving controls have also been affected, such as gear shifters and adaptive and conventional cruise controls. The development of further ADAS and autonomous features seems very likely to increase the complexity and diversity of the functions the driver is able to interact with.

The HMI technologies through which these functions are accessed have also evolved, encompassing touchpads, touchscreens, haptic controllers and gestural interfaces, such that there is now, for the first time, a significant divergence in control placement, direction-of-motion and logic of operation, even for features that drivers are familiar with. If new functionality further extends this divergence of control location, interaction and mode of operation, it can only increase the learning demand on a driver.

2.1 Causes of Exclusion for Automotive Heating, Ventilation and Air Conditioning (HVAC) Controls – Manual Electro-Mechanical Type

HVAC controls are considered secondary driving controls: they are used by the driver while driving, but not as part of the primary driving task, and thus should not unnecessarily distract the driver from that task. However, they are sometimes required for more than occupant comfort, for example to ensure that the driver’s view through the windshield/windscreen and other windows is clear and not obscured by frost or misting.

Prior assessments of conventional electro-mechanical manual HVAC controls (see Fig. 3) conducted by the Engineering Design Centre have suggested that difficulty and exclusion are predominantly caused by visual obscuration and poor legibility of labels on the instrument panel surface, by the reach to controls located furthest from the driver, and by the forces required by purely mechanically activated controls. It is believed that most drivers are either sufficiently familiar with the symbols used to label the controls, or are able to work out what the symbols mean from the context of other cues. The frequency of interaction with these controls, however, depends on the functionality required and the variability of the ambient weather conditions. In colder climates, heating controls are needed to operate potentially safety-critical demisting functions, and the system response depends on the vehicle’s engine heating the coolant fluid, so adjustment of the temperature, distribution and fan settings may be required to achieve both comfort and demisting. In hotter climates, the air conditioning, temperature and distribution controls may remain relatively static, with only the fan speed adjusted to accommodate, for example, a hot vehicle that has been parked in the sun. It is this frequency of adjustment that automatic electronic HVAC controls attempt to reduce.

Fig. 3. A typical manual HVAC control layout.

2.2 Causes of Exclusion for Automotive HVAC Controls – Automatic Electronic Type

To reduce the need for the driver to adjust the HVAC controls, in this example of an automatic electronic interface the temperature can be set independently for the driver and the passenger, while the fan speed and distribution (where the air is directed in the vehicle) are controlled automatically. However, this type of HVAC control introduces another significant layer of potential cognitive demand, due to the complexity of the available functions, the modal nature of some of the sub-systems and the unfamiliarity of some of the labels. So whilst temperature adjustment is achieved relatively easily with the appropriate rotary control, adjustment of fan and distribution requires more complicated repeated pressing of small push-button controls. In this particular example (see Fig. 4), reflection issues also impair the legibility of the display symbols and characters.

Fig. 4. An example of an electronic HVAC control layout.

A further trend from some manufacturers has been to integrate the HVAC, audio and satellite navigation functions into one display and control system. This potentially provides more space, improving the legibility of labels and icons, but presents an even greater cognitive challenge in terms of modality and system comprehension.

3 Future Gazing: Advanced Driver Assistance System (ADAS) and Autonomous Feature Integration

If the trend of divergence continues, drivers will be unable to leap into an unfamiliar vehicle (such as a rental or loan car) and drive without re-training to operate even some of the most basic features. Some drivers may be unwilling or unable to make the re-learning transition, and hence become excluded. As ADAS features become more prevalent, and more expected by drivers, this has the potential to cause significant problems both in learning to operate such features and in understanding their operation.

In addition, the risk of becoming accustomed to these helpful features may prove problematic in situations where a feature is no longer available. For example, anecdotal evidence suggests that many drivers are becoming reliant on parking aid auditory warnings, particularly for reverse parking manoeuvres. If they are then faced with driving a vehicle that does not have this feature, and are not conscious of its absence at the time of parking, they may revert to behaviour that depends on the technology to remind them when to stop the vehicle. Extrapolating from this scenario, assistance systems that help the driver at higher speeds have far more dire potential consequences when they are not present or active.

In contrast to this trend, an alternative path does exist. Previous work [14] engaged users who would not normally be consulted in a deep dialogue, identified their specific needs and difficulties, linked those needs with technologies that could help, and developed appropriate interfaces that required very little learning and were highly intuitive. In that instance, the older user group identified unintended speeding, in the context of speed cameras, as an important issue. The resulting solutions, a visual speedometer display showing speed in the context of the speed limit and a haptic accelerator giving a virtual detent at the speed limit, showed that useful technology could be implemented in a usable, unchallenging way without undue cognitive complexity.
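
As an illustration of how simple the interaction logic of such a solution can be, the following sketch models the haptic detent as additional pedal resistance at the speed limit. It is a hypothetical reconstruction: the force model and all constants are invented for this example, not taken from the system described in [14].

```python
# Hypothetical sketch of a haptic accelerator 'virtual detent'.
# The force model and constants are invented for illustration and are
# not taken from the system described in [14].

BASE_RESISTANCE_N = 20.0   # normal pedal spring resistance
DETENT_FORCE_N = 60.0      # extra step felt at the speed limit
OVERSPEED_GAIN = 5.0       # extra newtons per km/h above the limit

def pedal_resistance(speed_kmh: float, limit_kmh: float) -> float:
    """Pedal resistance felt by the driver at a given speed.

    Below the limit the pedal feels normal; at the limit the driver
    feels a distinct step (the detent), which can be pushed through
    deliberately but signals the limit without a glance at the
    speedometer.
    """
    if speed_kmh < limit_kmh:
        return BASE_RESISTANCE_N
    overspeed = speed_kmh - limit_kmh
    return BASE_RESISTANCE_N + DETENT_FORCE_N + OVERSPEED_GAIN * overspeed

for speed in (45, 50, 55):
    print(f"{speed} km/h in a 50 km/h zone: {pedal_resistance(speed, 50):.0f} N")
```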

Given that the world population is predicted to continue to age, full autonomy would appear to offer both a commercial and a humanitarian opportunity to those who struggle with mobility for reasons of disability, confidence, capability or finance. However, the current trajectory of the interfaces providing partial autonomy suggests that a radical rethink is required to ensure that those who stand to gain the most from the ability to travel easily without driving are actually able to benefit from fully autonomous road transportation. Or, perhaps more hopefully, is this a temporary state of affairs before the interaction task is reduced, through full automation, to merely asking a car to take you to your destination?