Scaled Back Autonomous Vehicle Regulations Increase Dangers
Over the winter, we reported on proposed changes to the federal rules for reporting autonomous vehicle crashes. The administration says the reason it is collecting less data about self-driving car crashes is that we are in an "innovation race" with China and the stakes have never been higher. But the stakes are also high for purchasers of autonomous vehicles and for those who share the road with them.
The changes take effect June 16, 2025.
What’s Changing About AV Crash Reporting?
Introducing the pared-down protections, Secretary of Transportation Sean Duffy declared that safety came first. Then came the "but": manufacturers of autonomous vehicles had told the administration that the current safety reporting requirements were too expensive.
To save those companies money and ostensibly make them more competitive with China, the federal government will no longer require autonomous vehicle manufacturers to report crashes where there was only minor property damage. They also won’t be required to report crashes involving vulnerable users like pedestrians and bicyclists unless the vehicle itself actually collided with the pedestrian or bicyclist.
On the surface, it may seem sensible that minor accidents need not be reported. However, the reporting requirements for autonomous vehicles exist so that regulators and engineers can see patterns that help make those vehicles safer. Understanding what goes wrong with autonomous vehicles requires a large volume of data about the causes of crashes and the conditions under which they occur. That is true regardless of how much damage a given crash does.
The new version of the order also extends the time manufacturers have to submit crash reports in many circumstances.
How is Access to AV Crash Data Changing?
The reduced crash reporting requirements mean that far less data will be available to identify any potential safety risks with autonomous vehicles and improve the safety of these vehicles before they proliferate on US roads. However, that’s not the only way the changes in the standing order put the public at greater risk.
The updated order also allows an autonomous vehicle manufacturer to request “shielding” of certain important data about the crashes. The administration says this change is to protect confidential business information. However, information about the circumstances under which your autonomous vehicle may crash should not be confidential.
Manufacturers will be able to request confidentiality regarding:
- Whether or not the collision occurred while the car was operating in conditions in which it was designed to operate – a critical data point for determining whether the vehicle operates safely when used correctly
- A description of the circumstances of the crash – information that is important for understanding how the collision occurred and for drivers to understand potential risk situations and make adaptations
- Which version of the automated driving software was in use at the time of the collision – a data point that would allow self-driving vehicle owners to know whether a risk is associated with their particular version of the software
But Wait: It May Get Worse
The scaling back of reporting requirements and the hiding of key data points has the potential to make the road to widespread autonomous vehicle use more dangerous – and not just for those operating autonomous vehicles. Unidentified risks could put everyone on the road with autonomous vehicles in danger.
Currently, more than half of U.S. states have some form of regulation governing the use of autonomous vehicles, and these regulations vary significantly from state to state. However, a little-known provision in the pending "big beautiful bill" would prohibit states from enacting any law or regulation governing artificial intelligence models, artificial intelligence systems, or automated decision systems "for 10 years from the date the law was enacted." If the bill becomes law, this provision could also render many existing state laws unenforceable.
Help for Victims of Self-Driving Car Accidents in Massachusetts and New Hampshire
In many ways, autonomous vehicle accidents are like any other motor vehicle accident. Any party whose negligence contributed to the collision may be liable for damages. When an autonomous vehicle is involved, the case may be more complicated because there is a higher likelihood that there may be claims against the manufacturer in addition to any claims against the operator of the vehicle or others on the road.
If you have been involved in a motor vehicle accident with a self-driving car, you should speak to an experienced local car accident lawyer right away. Attorney Kevin P. Broderick has been representing accident victims in Massachusetts and New Hampshire for decades. He has the skills and experience necessary to help you secure the compensation you deserve.
Call us today at 978-459-3085 or fill out our contact form.