Blind Spot in Tesla and Volvo AVs' AI

A recent California accident involving a Tesla Model S highlights a glaring bug in the automated driving system, and Volvo admits its self-driving function may well have the same bug.

The blind spot is, ironically, right in front of the car. It occurs when the vehicle the autopilot is following suddenly swerves out of a lane that has become obstructed, leaving the automated vehicle to acquire a new car to tail. As in the California accident, the Tesla may continue at the same speed, or even attempt to resume its previous set speed, without detecting the stationary obstacle ahead: in this case, an enormous fire truck with flashing lights, parked while its crew helped save lives.
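To make the failure mode concrete, here is a minimal sketch of a simplified follow-the-leader controller. It assumes, as many radar-based cruise systems reportedly do, that near-stationary returns are filtered out to avoid false alarms from signs and bridges; every name and threshold below is hypothetical and is not drawn from Tesla's or Volvo's actual software.

```python
# Illustrative sketch only. A simplified controller picks the nearest *moving*
# object as its lead vehicle; a parked truck never qualifies, so the car
# resumes its set speed instead of braking. All values are hypothetical.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Detection:
    distance_m: float   # range to the detected object
    speed_mps: float    # the object's absolute speed


def select_lead(detections: List[Detection],
                min_moving_speed: float = 1.0) -> Optional[Detection]:
    """Return the nearest moving object, ignoring near-stationary returns."""
    moving = [d for d in detections if d.speed_mps > min_moving_speed]
    return min(moving, key=lambda d: d.distance_m) if moving else None


def target_speed(lead: Optional[Detection], set_speed_mps: float) -> float:
    """With no lead acquired, the controller resumes the driver's set speed."""
    if lead is None:
        return set_speed_mps                 # nothing "seen" ahead: speed up
    return min(set_speed_mps, lead.speed_mps)


# The lead car swerves away, leaving only a stopped fire truck in the lane.
after_cut_out = [Detection(distance_m=60.0, speed_mps=0.0)]
print(target_speed(select_lead(after_cut_out), set_speed_mps=29.0))  # 29.0
```

Under these assumptions the stopped truck is never selected as a lead vehicle, so the controller happily commands the full set speed, which is exactly the behavior described above.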

Thankfully, no one was hurt, and the obvious conclusion is that the human is still the pilot of the car, even when an enhanced cruise control is engaged.

This notion was further underscored by a recent DUI given to a San Francisco man who argued that his car was guiding him home. Much as with cruise control, the person behind the wheel is responsible for the operation of the car. It seems obvious, but with all the press from the AV industry, you would think that AVs are vastly superior drivers, capable of covering your shortcomings in every scenario.

The reality is that you can delegate power, but you cannot delegate responsibility. We might hope that the enthusiasm of those developing this technology would be tempered a little by that thought, and that they would stress that automated driving, much like an automatic transmission, merely makes driving simpler. It is not a true autopilot, and even planes that have autopilot are landed by experienced pilots.
