Self-driving taxi firm voluntarily recalls software following Phoenix crashes
Waymo, the self-driving car subsidiary of Google’s parent company Alphabet, has decided to recall its self-driving car software. This comes after two of its robotaxis collided with a towed vehicle in Phoenix at the end of last year.
The incidents occurred on December 11, 2023, when a Waymo vehicle hit a backward-facing pickup truck that was being improperly towed. Neither vehicle stopped after the impact, and a second Waymo car struck the same towed truck a few minutes later.
No riders were in either Waymo vehicle at the time, and there were no injuries; both collisions caused only minor vehicle damage. Waymo’s chief safety officer, Mauricio Peña, emphasized these points in a blog post and described the situation as unusual.
Waymo promptly informed the Phoenix Police Department, the Arizona Department of Public Safety, and the National Highway Traffic Safety Administration (NHTSA) about the incidents. Peña explained that the collisions occurred because the persistent orientation mismatch between the towed pickup truck and the tow truck caused the Waymo vehicles to incorrectly predict the towed vehicle’s future motion.
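To illustrate the kind of failure Peña describes, the short Python sketch below shows how a simple constant-velocity predictor that trusts a vehicle’s detected orientation can project a backward-facing, towed pickup onto the wrong trajectory. This is a hypothetical illustration, not Waymo’s actual software; every function and variable name here is an assumption made for the example.

```python
import math

def predict_future_position(x, y, speed, heading_rad, dt):
    """Extrapolate an object's position dt seconds ahead, assuming it
    travels along its detected heading at constant speed (a common
    simplification used here purely for illustration)."""
    return (x + speed * math.cos(heading_rad) * dt,
            y + speed * math.sin(heading_rad) * dt)

# A pickup towed backwards: it actually moves east at 10 m/s, but because
# it faces west, a predictor keyed to the vehicle's orientation might
# assume it will travel west.
actual    = predict_future_position(0.0, 0.0, 10.0, 0.0, 3.0)       # (30.0, 0.0)
predicted = predict_future_position(0.0, 0.0, 10.0, math.pi, 3.0)   # (-30.0, ~0.0)

print("actual position in 3 s:            ", actual)
print("predicted with mismatched heading: ", predicted)
# The roughly 60 m gap between the two points illustrates how a persistent
# orientation mismatch can make a planner expect the towed vehicle to be
# somewhere it is not.
```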
In response, Waymo began updating the software on its fleet in late December and continued into 2024. After consulting with NHTSA, the company decided to submit a voluntary recall report for the software that was running on its fleet at the time of the two collisions.
This is not the first time Waymo’s autonomous vehicles have been involved in crashes with other cars, particularly in San Francisco, where Waymo and Cruise, another autonomous vehicle company, were granted approval to operate more than 500 robotaxis. Cruise, however, temporarily halted operations in October over safety concerns.
In January, Waymo expanded its operations in Arizona, initially offering autonomous rides on freeways exclusively to company employees. Peña emphasized the company’s commitment to a safety-first approach as it serves more riders in more cities, aiming to build trust and maintain transparent communication with riders, community members, regulators, and policymakers.
What software issue caused the self-driving cars to misinterpret the movements of towed vehicles?
The crashes took place on December 11, 2023, when the two Waymo robotaxis struck the improperly towed vehicle. No one was injured in either collision, and neither robotaxi was carrying passengers at the time. Waymo has emphasized that safety is its utmost priority, and it took immediate action to investigate the incidents and mitigate any potential risks.
Following an internal investigation, Waymo identified a software issue that caused the self-driving cars to misinterpret the movements of the towed vehicle. This led to incorrect responses, ultimately resulting in the collisions. The company has since developed a software update to rectify the issue and ensure the safe operation of its autonomous vehicles.
In a statement regarding the recall, Waymo acknowledged the importance of transparency and accountability in the development of autonomous driving technology and reassured the public that it is committed to learning from these incidents to improve its self-driving capabilities and overall safety measures.
Waymo has taken a proactive approach in addressing the issue by voluntarily recalling the affected software. The recall applies to the software version that was running on its fleet during the collisions; Waymo’s engineers have since updated every autonomous vehicle in the fleet to eliminate the identified problem, with testing and validation of the new software as part of the rollout to vehicles operating on public roads.
The decision to recall the self-driving car software showcases Waymo’s commitment to the responsible development and deployment of autonomous driving technology. It also reflects the rigorous quality control measures the company has in place to ensure the safety of both passengers and other road users. By taking immediate action, Waymo demonstrates its dedication to addressing potential risks associated with self-driving cars and continuously improving their capabilities.
Waymo’s decision to recall its self-driving car software sets a positive example for the entire autonomous vehicle industry. It highlights the importance of transparency, accountability, and safety in the development and deployment of self-driving technology. Waymo’s willingness to acknowledge and rectify a software issue demonstrates its commitment to protecting the well-being of its passengers and the public at large.
As Waymo works to enhance the safety and reliability of its autonomous vehicles, it continues to collaborate with regulatory bodies and industry partners to establish comprehensive guidelines and standards for the autonomous driving industry. By sharing insights and data, the company aims to contribute to the collective improvement of self-driving technology across the board.
In conclusion, Waymo’s voluntary recall of its self-driving car software after the incidents in Phoenix underscores its commitment to safety and responsible development. By identifying and rectifying the software issue, Waymo is taking crucial steps toward enhancing the reliability and capabilities of its autonomous vehicles. The recall not only sets an example for the industry but also reinforces the importance of transparency and accountability in the autonomous driving sector.
" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."