Semi-Autonomous Vehicles: Enhancing Road Safety or Paving the Way for New Risks?
In December 2023, Tesla announced a recall of over 2 million vehicles after the US regulator found its Autopilot driver-assistance system to be defective. So, what does this decision mean for the future of semi-autonomous vehicles on our roads?
The Tesla recall
After studying 956 crashes of Tesla vehicles with Autopilot software enabled, a two-year investigation by the US National Highway Traffic Safety Administration found that, in the vast majority of cases, the technology was simply not robust enough to prevent driver misuse or error. This was a particular issue with the part of Tesla's self-driving system known as Autosteer, which is designed to help keep the car in the correct lane of travel.
Tesla has since issued a remedy in the form of a software update for all affected vehicles, which date back to cars purchased in 2015.
What is a semi-autonomous vehicle?
A semi-autonomous vehicle can control some of the main functions involved in driving, such as lane keeping and cruise control via the throttle and brakes, but primary control of the vehicle remains with the driver. Driving automation is commonly graded on a scale from level 0 (no automation) to level 5 (full automation), with most cars, including the Teslas involved in the recall, sitting at level 2 or lower.
Are semi-autonomous vehicles safe?
Naturally, this recent news has sparked debate over whether the technology is safe and fit for use on our roads. Yet one of the primary purposes of autopilot-style functions is to help human drivers navigate and drive safely.
For example, semi-autonomous driving features can keep the vehicle's speed within legal limits and in step with surrounding traffic, and keep the car safely within its lane. With specialised technology keeping them in check, drivers can potentially avoid mistakes caused by human error, especially on longer highway journeys.
Of course, this technology is limited, and it is certainly not foolproof. This is why many companies offering semi-autonomous vehicles, Tesla included, recommend these functions only for highway driving. Urban areas and smaller country roads carry too many potential unknowns and hazards that the computer system may not detect or be able to react to.
To further safeguard Australia's drivers, the National Transport Commission issued a safety assurance system in 2018 designed to support the successful deployment of autonomous vehicles on the country's roads. This includes a self-certification requirement for all automated driving system providers, ensuring they only offer products that meet the safety requirements set by the government.
To err is human
It’s important to remember that we are talking about semi-autonomous vehicles here, and not self-driving ones, and this is where, in many cases, the issues start to arise.
Semi-autonomous vehicles still require input and awareness from the human in the driver's seat. Most accidents in semi-autonomous vehicles happen because the driver fails to take control of the car in time, or through contact with other road users, such as rear-end collisions or side-swipes. After all, humans are unpredictable creatures, so it stands to reason that a computer system may not be able to react to every move made by other vehicles and pedestrians.
There are warning systems in place to remind users to maintain a driving position when using autopilot features, including keeping their hands on the wheel. However, this doesn't guarantee that these warnings are acknowledged or responded to in time; especially on longer highway journeys, some drivers are tempted to let the car do all the work.
Legal implications of semi-autonomous vehicles
So, if a semi-autonomous vehicle gets into an accident, who exactly is at fault?
Unlike with fully self-driving cars, the fault generally lies with the driver, as they are the one in control and expected to intervene should anything go wrong while the autopilot system is in use.
However, in some cases it can be argued that the software developer, or the manufacturer of the car and its components such as sensors and cameras, is to blame. This is rare for semi-autonomous vehicles, though, and such claims normally end up applying to fully self-driving vehicles.
What does the future hold?
As technology continues to advance, we can expect more new cars to be released with semi-automated features, making them an accessible option for more drivers. The likelihood, however, is that we'll see a shift towards fully automated, driverless vehicles, such as the "robotaxis" on the Las Vegas Strip: self-driving taxis that provide a safer mode of transport for those ready to call it quits at the casinos.
Semi-autonomous vehicles in Queensland
At the moment, cars in Queensland sit at automation levels 0 to 2, meaning they are either fully driver-controlled or have semi-autonomous features that still require driver input and supervision. However, we expect the semi-autonomous vehicle market to keep growing in the coming years.
To examine how semi-autonomous and self-driving vehicles work on our roads, the Queensland Government launched the ZOE2 project. These little cars are the first level 4 automated vehicles in Australia. The project is helping to pave the way for legislation and planning ahead of the inevitable rise of fully self-driving cars, and demonstrates how the technology could be used safely and sustainably by both Queensland's industries and its people in the future.
Things are currently in the testing phase, so we’re not sure when this will be fully rolled out. That being said, if you see a white Renault ZOE driving about, don’t be too alarmed if you don’t spot anyone sitting in the driver’s seat!