Tesla’s camera-based driver monitoring system exists; pretending it doesn’t won’t make the roads any safer
Tesla’s FSD Beta program has started to expand to more users. And while the system is only distributed today to drivers with a perfect Safety Score, the advanced driver assistance system is expected to be made available to users with scores of 99 and below in the near future. True to form, the expansion of FSD Beta also brought the predictable wave of complaints and criticism, some of which still refuse to acknowledge that Tesla is now using its vehicles’ onboard cameras to bolster its driver monitoring system.
Just recently, the NHTSA sent a letter to Tesla asking the company to explain why it made improvements to Autopilot without issuing a safety recall. According to the NHTSA, Tesla should have filed a recall notice if the company had discovered a “safety defect” in its vehicles. What the NHTSA missed is that the Autopilot update, which allowed the company’s vehicles to slow down and alert their drivers when an emergency vehicle is detected, was rolled out as a proactive measure, not as a response to a fault.
Consumer Reports weighs in
Weighing in on the question, Consumer Reports argued that at the end of the day, over-the-air software updates don’t really solve Tesla’s main weakness, which is driver oversight. The magazine admitted that the object detection and response of Tesla’s driver assistance system is better than that of comparable systems, but Kelly Funkhouser, who leads connected and automated vehicle testing for Consumer Reports, argued that this is precisely why the magazine has safety concerns about Tesla’s cars.
“In our testing, Tesla continues to perform well in object detection and response compared to other vehicles. It’s actually because the driver assistance system works so well that we are worried about drivers relying on it too much. The most important change Tesla needs to make is to add safeguards, such as an effective direct driver monitoring system, to ensure the driver is aware of their surroundings and able to take over in these types of scenarios,” Funkhouser said.
Jake Fisher, Senior Director of Consumer Reports‘ Auto Test Center, also shared his take on the matter, particularly around some Autopilot accidents involving emergency vehicles parked by the side of the road. “CR’s position is that such accidents can be avoided if there is an effective driver monitoring system, and that is the underlying problem here,” Fisher said, adding that over-the-air software updates are usually not sent out to correct faults.
Tesla’s camera-based DMS
Funkhouser and Fisher’s references to direct driver monitoring systems are interesting, as that very functionality has been gradually rolled out to Tesla vehicles over the past few months. It’s quite strange that Consumer Reports seems to ignore it, given that the magazine has Teslas in its fleet. After all, Tesla has been deploying its camera-based driver monitoring system to its fleet since late May 2021, and the system’s deployment to vehicles equipped with radar was carried out this past quarter.
Tesla’s patch notes for its camera-based driver monitoring feature describe how it works. “The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled,” Tesla noted in its release notes.
What is interesting is that Consumer Reports‘ Jake Fisher was briefed on the feature when it launched last May. In a tweet, Fisher even noted that the camera-based system was not “just about preventing abuse”; it also has “the potential to save lives by preventing distraction.” This shows that Consumer Reports, or at least the head of its Auto Test Center, was fully aware that Tesla’s onboard cameras are now actively used for driver monitoring purposes. This makes his recent comments about Tesla’s lack of driver oversight quite odd.
Legacy or bust?
That being said, Consumer Reports appears to have a narrative prepared for when it does acknowledge the existence of Tesla’s camera-based driver monitoring system. In March, the magazine ran an article criticizing Tesla for its onboard cameras, titled “Tesla’s In-Car Cameras Raise Privacy Concerns.” In the article, the magazine noted that the electric vehicle maker could simply use its onboard cameras to its own advantage.
“We have already seen Tesla blaming the driver for not paying attention immediately after news reports of a crash while a driver was using Autopilot. Now Tesla can use video footage to prove that a driver was distracted rather than addressing the reasons why the driver wasn’t paying attention in the first place,” Funkhouser said.
Given that Consumer Reports seems to criticize Tesla’s use (or alleged non-use, for that matter) of the onboard cameras in its vehicles, it appears the magazine believes that the only effective and safe driver monitoring systems are those used by veteran automakers like General Motors for its Super Cruise system. However, even the advanced eye-tracking technology used by GM for Super Cruise, which Consumer Reports openly praises, has proven susceptible to driver abuse.
This was proven by Car and Driver, when the automotive publication tricked Super Cruise into operating without a driver using a pair of gag glasses with eyes painted on them. One could easily criticize Car and Driver for publicly demonstrating a vulnerability in Super Cruise’s driver monitoring system, but it must be remembered that Consumer Reports also released a comprehensive guide on how to trick Tesla’s Autopilot into running driverless using a series of tricks and a defeat device.
Salivating for the first crash of the FSD Beta
What’s rather unfortunate amid the criticism surrounding the expansion of FSD Beta is the fact that skeptics seem to be salivating for the first crash involving the advanced driver assistance system. Fortunately, Tesla seems to be aware of this, which may be why the beta is only being released to the safest drivers in the fleet. Tesla plans to release the system to drivers with lower Safety Scores, but it wouldn’t be surprising if the company ends up taking an even more cautious approach when it does.
That being said, incidents on the road are inevitable, and one can only hope that when something does happen, it won’t be too easy for an organization like Consumer Reports to run away with a narrative that contradicts what its own executives have publicly acknowledged, such as the potential benefits of Tesla’s camera-based driver monitoring system. After all, Tesla’s FSD suite and Autopilot are designed as safety features, and so far, they already make the company’s fleet of vehicles less prone to road accidents. Over time, and as more and more people participate in the FSD Beta program, Autopilot and Full Self-Driving should only become safer.
Tesla is not above criticism, of course. There are several aspects of the business that deserve to be called out. Service and quality control, as well as the treatment of longtime Tesla customers who purchased FSD on cars with MCU1 units, are just a few of them. However, it is difficult to argue that FSD and Autopilot make roads less safe. Autopilot and FSD have saved many lives already, and they have the potential to save countless more once they are fully developed. So why block their development and deployment?
Don’t hesitate to contact us with news tips. Just send a message to [email protected] to give us a heads up.