The trial of self-driving Fiat Chrysler Pacifica minivans is being run in Phoenix, Arizona
The promise of truly self-driving cars hitting our roads has finally been realised. In a landmark announcement, Google's Waymo has said it is removing human safety drivers in a trial of autonomous cars in the US.
The company, which is owned by Google's parent company Alphabet, has previously tested its prototype on short, pre-determined routes without a safety driver, but it now plans to ditch them entirely and make a free taxi service available to the public in Phoenix, Arizona over the coming months.
At launch, an employee will ride in the back of the Waymo Fiat Chrysler Pacifica minivans to accompany the passengers, but these employees won't be on hand to take over the controls. Current trials have seen human drivers sitting behind the wheel, ready to take over immediately should the worst happen. After this initial trial, Waymo plans to remove these employees completely.
Eventually the taxi service will be available across an area similar in size to Greater London, and Waymo also believes it will be able to charge fares, which would be the first time the project has made money after almost ten years of heavy investment.
The public will surely be sceptical of the safety of autonomous cars becoming commonplace on public roads, and Waymo acknowledges it's taking an ambitious step. However, its cars have already covered a whopping 3.5 million miles on US public roads, and experts have argued that introducing the technology sooner rather than later could save many more lives in the long run.
In a recent study on the safety of autonomous vehicles, its author, Nidhi Kalra, highlighted that human error is responsible for more than 90% of crashes in the US. The critical thing will be how the public reacts to future incidents involving driverless cars. Kalra said: “A major backlash against a crash caused by even a relatively safe autonomous vehicle could grind the industry to a halt—resulting in potentially the greatest loss of life over time. The right answer [in terms of timing] is probably somewhere in between introducing cars that are just better than average and waiting for them to be nearly perfect.”
In September, Tesla was found to be partly responsible for a fatal incident involving one of its cars in 2016.
A wide range of ethical questions have to be considered when programming an autonomous vehicle. To see the type of moral decisions driverless cars have to make, try this simulator.