The future dictates that we reassess our ethical and moral values.
I am no philosopher but I do believe that we are entering a period that will bring ethical and moral challenges never seen before or even imagined. Are we prepared or will we bury our collective heads in the sand and await the inevitable?
Technology, Morality & Ethics
Most readers will be aware of the rapidly accelerating technological developments in the ‘self-drive car’ or autonomous vehicle arena. Google’s fleet of test vehicles has travelled over 2.5 million kilometres as the company develops and tests sensors and software. We are probably between five and 20 years away from seeing mass-produced autonomous vehicles on our roads, but are we ready for the social and moral impacts?
Imagine a world where we don’t own a car – we just order one whenever it is required and within five minutes an autonomous vehicle will be at the door. Cars become a service – not a product. Need the children picked up from school? No problem, just order an autonomous vehicle to do the job.
Parking problems will be dramatically reduced because the pools of cars kept by autonomous vehicle suppliers will be located away from populated, congested areas. There will be little need for parking stations – just pick-up and drop-off zones.
Fewer cars will be required because they will be used more efficiently, and pollution will be reduced as people order cars appropriate to their needs rather than relying on a 1,400kg, five-seat, petrol-guzzling monster to move just one or two people around. But if you need a four- or six-seater – no problem, you will be able to order that too.
The automotive industry will be turned on its head as vehicles are purchased by fleet owners rather than individuals. Two-seat vehicles will be commonplace. Hydrogen fuel cell, electric or hybrid-electric drivetrains will become the standard.
Autonomous vehicles always obey the road rules, so we’ll need fewer traffic police. Car insurance costs will drop dramatically because autonomous vehicles will have very few accidents (over 90 per cent of traffic accidents are the result of human error), causing massive disruption in the insurance industry. And healthcare and disability support costs will go down because there will be far fewer injuries.
People with impaired vision will no longer be restricted when it comes to driving, and age, whether young or old, will not be an impediment to autonomous vehicle use.
But with all these benefits come challenges, many of them related to autonomous vehicle software, the decision-making part of the autonomous vehicle.
Software and Morality
Imagine the following scenarios and your likely response to them if you were designing the software for an autonomous vehicle:
- A driver is on a narrow road with a steep and dangerous embankment on either side. A pedestrian jumps out of nowhere in front of the autonomous vehicle. Should the vehicle hit the pedestrian and potentially save the driver, or drive over the embankment and potentially injure or even kill the driver?
- Same scenario, but there are three pedestrians, including two small children. Does the vehicle hit the pedestrians or run off the road with all the inherent risks to the driver?
- Same scenario, but there are five children crossing, you are the vehicle’s driver, and you are the one likely to be injured or killed in avoiding the pedestrians.
You can see where this discussion is going. Most people would have the autonomous vehicle run off the road and save the pedestrians in scenarios one and two. When it comes to scenario three, however, most people will opt to save themselves at the expense of the pedestrians. You can of course imagine any number of variations on these scenarios, but with equally predictable results.
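To make the programmer’s dilemma concrete, here is a minimal, purely illustrative sketch of the two policies in question – a ‘self-sacrificing’ rule that minimises total expected harm and a ‘self-protecting’ rule that shields the occupant. The scenarios, risk numbers and function names are assumptions for illustration only, not anything any manufacturer actually uses.

```python
# Hypothetical sketch only: the pre-programmed choice an engineer must commit to
# before the emergency ever happens. All figures and names are illustrative.

def utilitarian_choice(pedestrians, occupant_risk_if_swerve):
    """Minimise total expected harm: swerve whenever fewer people are put at risk."""
    harm_if_stay = pedestrians              # pedestrians struck if the car holds its line
    harm_if_swerve = occupant_risk_if_swerve  # expected harm to the occupant over the embankment
    return "swerve" if harm_if_swerve < harm_if_stay else "stay"

def self_protective_choice(pedestrians, occupant_risk_if_swerve):
    """Protect the occupant first, regardless of how many pedestrians are at risk."""
    return "stay" if occupant_risk_if_swerve > 0 else "swerve"

# The three scenarios from the list above (pedestrians, assumed occupant risk if swerving).
scenarios = {
    "one pedestrian": (1, 0.8),
    "three pedestrians, two of them children": (3, 0.8),
    "five children, and you are the occupant": (5, 0.8),
}

for name, (peds, risk) in scenarios.items():
    print(f"{name}: utilitarian -> {utilitarian_choice(peds, risk)}, "
          f"self-protective -> {self_protective_choice(peds, risk)}")
```

The point of the sketch is simply that the rule cannot be both at once: the utilitarian version sacrifices the occupant in every scenario, the self-protective version never does, and someone has to decide which rule is written into the software before the car ever leaves the showroom.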
Tragedy of the Commons
Researchers have likened these responses to the ‘tragedy of the commons’ — an economic theory suggesting that, when shared resources are at stake, individuals will act in their own self-interest instead of taking the common good into account, thereby depleting the resource and causing harm to everyone.
“Even if you started off as one of the noble people who are willing to buy a self-sacrificing car, once you realize that most people are buying self-protecting ones, then you are going to reconsider when you’re putting yourself at risk to shoulder the burden of the collective when no one else will.”1
So, who will make the decision on how autonomous vehicle software is programmed? It would seem that we cannot rely on the ethics of individuals for answers as they are driven largely by self-preservation. The result that best serves the common good is probably going to have to be dictated by government.
This quandary is of course not unique to autonomous vehicles. Governments face these issues on a daily basis as they wrestle with the allocation of scarce resources, and few decisions can be more demanding than those made within Medicare and the Pharmaceutical Benefits Scheme.
Morality and Universal Healthcare
In Australia we place a high value on our universal healthcare system, as demonstrated by reactions during the July 2016 election campaign. Labor or Liberal, there was no denying the impact of the Mediscare campaign. It served to warn politicians on either side that they should tread carefully when proposing to erode Medicare services or increase out-of-pocket charges.
This system of universal healthcare, however, is much more than just a budget. Medicare must address questions like: “Do we spend AU$100,000 on a new pharmaceutical for the treatment of a disorder that manifests itself in 90-year-old males, or a similar amount on the treatment of mental illness in young homeless people?”
As we ponder the Tragedy of the Commons, we need to consider not only the morality of our collective selves but, more importantly, our personal ethical stance, and we need to be prepared to debate the gaps between the two.
The ‘Me’ generation is dead; it just doesn’t know it yet. It is time we adjusted our personal ethics to align solidly with the morals we expect of our society. We cannot save both the autonomous vehicle driver and the pedestrians.
Are you ready to make those choices?
Michael Jacobs is a business consultant and columnist for mivision. He was Chief Executive Officer of Eyecare Plus for 10 years until early 2015.