As semi-autonomous driving vehicles begin to share the roads with human-driven cars and trucks, a number of safety concerns about the interaction between the two have emerged. Common human interactions between drivers, other motorists, bicyclists, and pedestrians that help prevent accidents and mishaps may become more complicated as drivers let machines take the wheel.
Automation Hits the Road
Major vehicle manufacturers like Audi and Tesla have revealed plans to release vehicles that allow a computerized system to take over most of the driving responsibilities on the road. These vehicles are equipped with special features like vehicle-to-vehicle (V2V) communication, radar, lidar sensors, and cameras to help them navigate the roadways while minimizing the risk of a crash. Some of these vehicles do not include traditional components like a brake pedal or steering wheel; however, some automated mass transit vehicles may include an emergency button that stops the vehicle or allows a driver to take control.
Dangers of Automated Vehicles
Las Vegas is now home to pilot programs involving automated vehicles. However, problems have already arisen in this context. On the same day that an autonomous shuttle bus was launched as part of a study on driverless vehicles, it was involved in a collision with another motor vehicle. The shuttle detected the other vehicle and came to a complete stop, but the human-driven vehicle then backed into the occupied automated bus.
While the driver of the other vehicle received a citation for the accident, some critics argue that if a human had been behind the wheel of the shuttle bus, the accident would not have occurred. Critics assert that a typical driver would have honked to alert the other motorist to the impending danger. The incident also raised concerns that human drivers sharing the road with computerized drivers will face additional safety risks, because computers will not respond or interact the way human drivers would under the same circumstances. A further concern is that people may pay less attention when they believe the vehicle is driving itself: drivers may fall asleep or be distracted by their phones. This assumption is mistaken, as even Tesla has stated that its Autopilot is only to be used with a fully attentive driver.
Toyota has expressed concern about the lag time between the automated system being in control and a driver taking over in an emergency. The auto manufacturer said that it is waiting for a legal framework to be established before moving forward with certain driverless technology, such as Level 3 automation, in which the system assumes responsibility for driving when enabled.