If you were around in the 1960s and early 1970s, you probably remember The Jetsons.
This popular cartoon series offered a tantalising glimpse into a future where robots cleaned our homes and we went about our daily business in flying cars.
Well, it’s now the future and, while we may have robotic vacuum cleaners, flying cars are still pie in the sky. In fact, the experts are still grappling with how to make self-driving vehicles safe for our roads.
It seems that as soon as there is an advance in that area, we hear about a setback – including some that, sadly, have ended in human deaths.
In Australia, there have been limited trials of autonomous vehicles since 2015, with Adelaide looking into their use as public transport. However, robotic private cars are unlikely to be a common feature on city streets anytime soon.
Recent research conducted by the University of New South Wales (supported by National Seniors) showed that older people welcome existing driver-assistance features in new cars, but don’t fully use or trust them.
Other countries are further advanced with fully autonomous technology, but the vehicles often operate on dedicated tracks or other restricted areas and are closely monitored.
Elsewhere, their widespread use on open roads is not so far away.
Meanwhile, there are plans in Saudi Arabia for 15% of public transport vehicles to be autonomous by 2030.
Anyone familiar with traffic accident statistics in the Middle East might wonder about the wisdom of moving to driverless technology when road deaths are already very high.
Well, one compelling reason is that robotic cars are safer than those driven by humans. In trials comparing rideshare vehicles in San Francisco, those with driverless technology were involved in fewer accidents than those with a person behind the wheel.
Human drivers had a crash rate of 50.5 crashes per million miles (CPMM), while self-driving cars had a rate of 23 CPMM. Furthermore, human drivers were found to be the primary contributors to 69% of their crashes, while robotic cars were to blame for only 10% of their accidents.
But statistics are one thing; perception is something else entirely.
According to academics who specialise in the ethical choices made by machines, humans will be quicker and harsher to judge when deaths occur in driverless cars.
Even if all cars were self-driving, accidents would occur – due to computer malfunctions or an unexpected event such as a person running onto the road.
In the latter scenario, the computer driving the car might have to make a split-second choice between hitting the person on the road in front of it, or swerving into another vehicle, potentially causing the death of its own passengers and the passengers of the other vehicle.
According to the experts, society will have trouble accepting road deaths that are the result of machines making life-or-death decisions – even though we now accept human misjudgment as the cause of many fatalities.
Researchers at the Massachusetts Institute of Technology have been working on this issue for many years. Their Moral Machine website has a “game” in which users are asked to judge how they think a driverless car ought to react in different scenarios.
It comes down to whether the car should risk the lives of its passengers or of other road users – but the number, ages, and genders of the people involved change from scenario to scenario.
The person playing the game is essentially required to weigh the worth of their own friends and family against that of, among others, a dog, a young boy, a girl, a pregnant woman, and an elderly man.
Since it will be impossible to reduce the road toll to zero, these remain decisions that someone, or something, will have to make.
If that’s not scary enough, very soon the same kind of judgment may have to be applied to private car-like vehicles plying the skies above us.