A video of a self-driving Tesla has caused controversy after it appeared to show the car failing to yield to a pedestrian in San Francisco. The seven-second clip, posted on Twitter by Whole Mars Catalog, shows the Tesla edging towards the pedestrian as the person crosses at an intersection. Although the car’s software appears to detect the pedestrian, the car does not comply with the law requiring vehicles to give way to people in a crosswalk. The tweet accompanying the video describes the incident as “bullish” and “exciting”. The latest version of Tesla’s Full Self-Driving beta software, 11.4.1, has been praised by CEO Elon Musk for its improvements.
The video has raised concerns about the safety of self-driving cars, with some Twitter users suggesting that the incident was illegal. However, Whole Mars Catalog has defended the software, arguing that previous versions would have caused the car to stop within the crosswalk, which would have been unsafe for pedestrians. The account also claimed that people who do not live in cities do not understand the challenges of driving in urban areas.
The incident highlights the need for public debate about how self-driving cars are programmed. While human drivers may also ignore the law in similar situations, autonomous vehicles should be programmed to comply with the relevant regulations, since their behaviour is a deliberate design choice rather than an individual lapse. A democratic decision-making process could help ensure that any new norms are rolled out safely and responsibly.