Does Tesla need to accept its Autopilot is faulty?

Tesla Model 3.

San Francisco: The Tesla Model 3 sedan that crashed into a semi-truck in March was operating on Autopilot, and the system did not detect the driver’s hands on the steering wheel in the seconds before impact, the media reported, citing preliminary findings from the National Transportation Safety Board (NTSB).

The crash, which happened on March 1 in Florida, killed 50-year-old Tesla driver Jeremy Beren Banner.

“The NTSB’s report did not indicate the Tesla driver was at fault and said the investigation is ongoing. But the news raises more questions about Tesla’s marketing of Autopilot — the company’s semi-autonomous driving software,” CNN reported on Friday.

Despite Tesla CEO Elon Musk regularly defending the technology, critics argue that slapping the “Autopilot” name onto a driver-assistance feature can lull people into a false sense of security, making them less likely to stay fully alert and more vulnerable to a crash.

“Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance,” the report quoted a Tesla spokesperson as saying.

According to the Model 3 owner’s manual, the car “detects your hands by recognising light resistance as the steering wheel turns, or from you manually turning the steering wheel very lightly, without enough force to retake control”. Engaging a turn signal or using any steering wheel button or scroll wheel also counts as the hands being detected.

However, this is not the first fatal crash involving Tesla’s Autopilot.

Earlier, in 2016, Joshua Brown died in a similar crash near Gainesville, Florida, when his Model S sedan, also operating on Autopilot, collided with a semi-trailer truck.
