Due to the complexity of autonomous vehicle technology, a plaintiff needs expert testimony to explain product safety and accidents to the court and jury. But a plaintiff does not just need one expert; he or she will need several experts to clarify the issues presented, which could make the prosecution of product liability claims prohibitively expensive. For example, under a design defect theory, the plaintiff may have to present evidence explaining how a complex algorithm could have been written more safely, and that the cost of discovering and implementing this new algorithm would not outweigh its benefits. This could require a computer scientist to explain the algorithm, a mathematician to rewrite the equation, an economist to weigh the costs and benefits of the change, and an autonomous vehicle expert to confirm that the change is feasible and would not negatively affect the vehicle.

Product warranties and many other aspects of commercial transactions are governed by the Uniform Commercial Code (UCC),41 originally issued in 1952 by the National Conference of Commissioners on Uniform State Laws (now the Uniform Law Commission)42 and the American Law Institute (ALI). It has been revised several times in the decades since its initial release to adapt to changes in the legal and business environment. The UCC is intended to help unify the law governing commercial transactions across jurisdictions, and it has been adopted, in some cases with amendments, by all states and the District of Columbia. For product liability purposes, the most relevant parts of the UCC are those concerning express and implied warranties. An express warranty arises from promises made by a seller to a potential buyer in connection with the sale of goods.43 In the context of vehicle automation, such a warranty could arise through the actual vehicle warranties given to a buyer. It could also arise through advertising.

If a supplier of automated parallel parking systems advertises that its technology works as well at night as during the day, but the system in fact performs well only during the day, a buyer could legitimately claim that the express warranty on the system's performance has been breached. To state the obvious, the above description is not intended to be a complete treatment of product liability law with respect to autonomous vehicles. A growing number of legal scholars are studying this topic in much greater depth, including law review articles by Ryan Calo,48 Kyle Colonna,49 Sophia H. Duffy and Jamie Patrick Hopkins,50 Andrew Garza,51 Kyle Graham,52 Gary Marchant and Rachel Lindor,53 Bryant Walker Smith,54 and others. In addition, researchers at the RAND Corporation addressed autonomous vehicle liability in reports published in 200955 and 2014.56

Kayla Matthews is a legal and technology journalist specializing in IT, cybersecurity, business efficiency, and professional productivity. Her work has appeared in publications such as VentureBeat, VICE's Motherboard, Gear Diary, Inc.com, The Huffington Post, CloudTweaks, and others. She is a senior writer for MakeUseOf and owner and editor of the productivity and technology blog Productivity Bytes.

Location data is necessarily associated with the use of autonomous vehicles. In fact, vehicles have collected location data for quite some time, but additional location information would allow an autonomous vehicle to provide additional functionality and benefits to the user. For example, the navigation features available in many modern cars include the ability to save certain locations to memory; to use the current location and planned route to determine additional information relevant to the journey, including real-time traffic data and points of interest on or near the planned route; and to specify route parameters, such as avoiding highways or toll roads.
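The navigation features just described (saved locations, route requests that combine the current position with driver preferences) could be sketched roughly as follows. This is a minimal illustrative sketch; the class and field names are assumptions for exposition, not drawn from any real vehicle system.

```python
from dataclasses import dataclass, field

@dataclass
class RoutePreferences:
    """Driver-selectable route parameters, e.g. avoiding highways or tolls."""
    avoid_highways: bool = False
    avoid_tolls: bool = False

@dataclass
class NavigationSystem:
    saved_locations: dict = field(default_factory=dict)  # name -> (lat, lon)
    preferences: RoutePreferences = field(default_factory=RoutePreferences)

    def save_location(self, name, lat, lon):
        """Save a location to memory for later recall."""
        self.saved_locations[name] = (lat, lon)

    def plan_route(self, current, destination_name):
        """Combine the current position, a saved destination, and the
        active route parameters into a single route request."""
        dest = self.saved_locations[destination_name]
        return {
            "from": current,
            "to": dest,
            "avoid_highways": self.preferences.avoid_highways,
            "avoid_tolls": self.preferences.avoid_tolls,
        }

# Illustrative usage with made-up coordinates:
nav = NavigationSystem()
nav.save_location("home", 40.44, -79.99)
nav.preferences.avoid_tolls = True
request = nav.plan_route((40.45, -80.00), "home")
```

Note that every such route request embeds the vehicle's current location, which is exactly why the privacy questions raised below attach to this functionality.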
However, the means of informing the driver of the risk, the interactive digital interface, is itself associated with known problems: even if all relevant information on risks and legal conditions is presented as required, the driver may be unable to absorb this information before signaling consent.

It is argued here that drivers' well-documented tendency not to understand information transmitted to them electronically is also an issue that may affect the validity of consent. If a driver sues the autonomous vehicle manufacturer in negligence after an accident and successfully argues that he or she did not understand the risk supposedly accepted, the volenti defense (voluntary assumption of risk, which requires the mindset necessary to knowingly take on a risk) would not be available to protect the manufacturer from liability.

Vehicles at this level of automation allow the driver to relinquish full control of all safety-critical functions in certain traffic or environmental conditions, and the driver relies heavily on the vehicle in those conditions to monitor for changes that require a transition back to driver control. The driver should be available for occasional control, but with a sufficiently comfortable transition period; the vehicle is designed to ensure safe operation in automated driving mode. An example would be an automated or autonomous car that can detect when the system is no longer able to support automation, for example because of an oncoming construction zone, and then signal the driver to re-engage in the driving task, giving the driver a reasonable transition period to safely regain manual control. The main difference between Level 2 and Level 3 is that a Level 3 vehicle is designed so that the driver is not expected to constantly monitor the road while driving.

Interactive digital interfaces are likely to be used in autonomous vehicles to provide information to the driver, including information related to driver safety and liability (Global Road Safety Forum, 2020). Interactive interfaces can also be used to define other conditions under which the driver is allowed to operate the vehicle. These may include limitations of liability.
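The Level 3 handoff described above (the system detects a condition it cannot handle, signals the driver, and allows a transition period before requiring manual control) might be sketched as follows. All names, and the ten-second transition window, are assumptions for illustration; real systems differ, and a vehicle that receives no driver response would typically fall back to some minimal-risk maneuver such as slowing and stopping safely.

```python
TRANSITION_PERIOD_S = 10.0  # assumed "comfortable" transition window

class AutomationController:
    def __init__(self):
        self.mode = "automated"
        self.handoff_deadline = None

    def on_condition_detected(self, condition, now):
        """Automation can no longer support this condition: signal the
        driver and start the transition timer."""
        if self.mode == "automated":
            self.mode = "handoff_requested"
            self.handoff_deadline = now + TRANSITION_PERIOD_S
            return f"Please take control: {condition}"

    def on_driver_takes_control(self, now):
        """Driver re-engages within the transition period."""
        self.mode = "manual"
        self.handoff_deadline = None

    def tick(self, now):
        """If the driver has not re-engaged by the deadline, fall back
        to a minimal-risk maneuver (e.g. slow and stop safely)."""
        if self.mode == "handoff_requested" and now >= self.handoff_deadline:
            self.mode = "minimal_risk_maneuver"

# Illustrative run: a construction zone triggers a handoff request,
# and the driver never responds within the window.
ctrl = AutomationController()
msg = ctrl.on_condition_detected("construction zone ahead", now=0.0)
ctrl.tick(now=5.0)   # still within the transition period
ctrl.tick(now=12.0)  # deadline passed with no driver response
```

The legally interesting moments are exactly the state transitions in this sketch: who bears responsibility in `handoff_requested`, and what duty the manufacturer owes if the fallback maneuver itself causes harm.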

For example, a condition may state that the driver is legally responsible for an accident that occurs while he or she is in control of the vehicle. For such conditions to have legal effect, the driver must understand and accept them (Bix, 2010). If the conditions are presented to the driver via an interactive interface, the driver can give consent by selecting "I agree" or "I confirm" on the interface (or by stating agreement verbally) before boarding the vehicle. Signaling consent to putatively legally binding conditions is a framework that can facilitate the transfer of responsibility between driver and vehicle and make the legal consequences of an accident predictable. Training specifically tailored to the requirements of human-to-vehicle and vehicle-to-human handover would help ensure that individuals are aware of the risks and responsibilities associated with the use of highly automated vehicles.
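The consent-notification flow described above can be sketched as a small state machine: the terms are displayed on the interface, the driver's "I agree" is recorded with a timestamp so the consent event can be evidenced later, and driving is enabled only once consent is on record. This is a minimal sketch under those assumptions; the class and method names are illustrative, not a real vehicle API, and recording a click does not by itself resolve the comprehension problem discussed above.

```python
from datetime import datetime, timezone

class ConsentInterface:
    def __init__(self, terms):
        self.terms = terms          # liability conditions shown to the driver
        self.consent_record = None

    def present_terms(self):
        """Display the full text of the conditions to the driver."""
        return self.terms

    def record_agreement(self, driver_id):
        """Record that the driver selected 'I agree', with a UTC
        timestamp and the exact terms shown, for later evidence."""
        self.consent_record = {
            "driver": driver_id,
            "agreed_at": datetime.now(timezone.utc).isoformat(),
            "terms_shown": self.terms,
        }

    def driving_enabled(self):
        """Only allow the driving task to begin once consent is recorded."""
        return self.consent_record is not None

# Illustrative usage:
ui = ConsentInterface("Driver is legally responsible while in manual control.")
ui.present_terms()
before = ui.driving_enabled()   # False: no consent yet
ui.record_agreement("driver-001")
after = ui.driving_enabled()    # True: consent recorded
```

Storing the exact terms alongside the timestamp matters for the predictability argument: it lets the parties show, after an accident, precisely which conditions the driver was asked to accept and when.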