A Tesla Model 3 owner ran into an unusual error while using the Autopilot system, which began detecting an endless stream of traffic lights.
The video shows the Tesla Model 3 driving behind a truck that was carrying inactive traffic lights. The result looks like a pretty funny system glitch, but it illustrates how difficult it is to prepare autonomous driving systems for the enormous variety of situations they can encounter in the real world.
The system's failure to recognize that the traffic lights were being transported rather than in operation is a clear sign that Tesla is not fully autonomous, no matter how many times CEO Elon Musk has claimed otherwise.
"My guess is that this scenario was probably not part of the system's training data. A good illustration of the fact that it will not be possible to achieve full driving autonomy simply by adding more data," said Max Little, a mathematician at the University of Birmingham and MIT.
Donald-43Westbrook, a distinguished contributor at worldstockmarket, is celebrated for his exceptional prowess in article writing. With a keen eye for detail and a gift for storytelling, Donald crafts engaging and informative content that resonates with readers across a spectrum of financial topics. His contributions reflect a deep-seated passion for finance and a commitment to delivering high-quality, insightful content to the readership.