Consumer Reports is once again taking Tesla Inc. (NASDAQ: TSLA) to task for using existing owners to test new self-driving features on public roads.
Jake Fisher, senior director of CR’s test center, said, “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.” And those at risk are not just owners and drivers of Tesla vehicles: Using owners as beta testers “worries some road safety experts because other drivers, as well as cyclists and pedestrians, are unaware that they’re part of an ongoing experiment that they did not consent to.”
CR spoke to other companies developing self-driving technology, including Argo AI, Cruise and Waymo, and all said that their safety tests are limited to private tracks or use trained monitors in the vehicles while they are being tested. Tesla did not respond to CR’s request for comment.
MIT professor Bryan Reimer told CR that “while [Tesla’s] drivers may have some awareness of the increased risk that they are assuming, other road users—drivers, pedestrians, cyclists, etc.—are unaware that they are in the presence of a test vehicle and have not consented to take on this risk.”
CR first warned of the risks of using owners as beta testers some five years ago. Not the least of CR’s concerns was Tesla’s decision to call the system Autopilot. An executive for the consumer publication said then:
By marketing their feature as “Autopilot,” Tesla gives consumers a false sense of security. In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology.
The same is true for Tesla’s Full Self-Driving (FSD) marketing term. According to CR, the term gives “a false and deceptive impression that the vehicles can drive without human intervention.”
Reimer and other experts have called on Tesla to add strong driver monitoring systems that “ensure that the person behind the wheel is ready to take control as soon as the car cannot handle a driving task—a step Tesla has been reluctant to take in the past.” Reimer does note, however, the improvements that the company has made over the years: “It is interesting how fast the engineers at Tesla appear to be using data to improve system performance.”
The U.S. Department of Transportation and the National Highway Traffic Safety Administration (NHTSA) also need to overcome their reluctance to “determine what kinds of vehicle software can be used on public roads.” William Wallace, CR’s manager of safety policy, said:
Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step up to get strong, sensible safety rules in place. Otherwise, some companies will just treat our public roads as if they were private proving grounds, with little holding them accountable for safety.