Tesla’s recall of more than 2 million of its electric vehicles — an effort to have drivers who use its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.
Tesla, the leading manufacturer of EVs, reluctantly agreed to the recall last week after a two-year investigation by the U.S. National Highway Traffic Safety Administration found that Tesla’s system to monitor drivers was defective and required a fix.
The system sends alerts to drivers if it fails to detect torque from hands on the steering wheel, an approach that experts describe as ineffective.
Government documents filed by Tesla say the online software change will increase warnings and alerts to drivers to keep their hands on the steering wheel. It also may limit the areas where the most commonly used versions of Autopilot can be used, though that isn’t entirely clear in Tesla’s documents.
NHTSA began its investigation in 2021, after receiving 11 reports that Teslas that were using the partially automated system crashed into parked emergency vehicles. Since 2016, the agency has sent investigators to at least 35 crashes in which Teslas that were suspected of operating on a partially automated driving system hit parked emergency vehicles, motorcyclists or tractor trailers that crossed in the vehicles’ paths, causing a total of 17 deaths.
But research conducted by NHTSA, the National Transportation Safety Board and other investigators shows that merely measuring torque on the steering wheel doesn’t ensure that drivers are paying sufficient attention. Experts say night-vision cameras are needed to watch drivers’ eyes to ensure they’re looking at the road.
“I do have concerns about the solution,” said Jennifer Homendy, the chairwoman of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor trailers. “The technology, the way it worked, including with steering torque, was not sufficient to keep drivers’ attention, and drivers disengaged.”
In addition, NHTSA’s investigation found that out of 43 crashes it examined with detailed data available, 37 drivers had their hands on the wheel in the final second before their vehicles crashed, indicating that they weren’t paying sufficient attention.
“Humans are poor at monitoring automated systems and intervening when something goes awry,” said Donald Slavik, a lawyer for plaintiffs in three lawsuits against Tesla over Autopilot. “That’s why the human factors studies have shown a significant delayed response under those conditions.”
Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said it’s widely accepted by researchers that monitoring hands on the steering wheel is insufficient to ensure a driver’s attention to the road.
“It’s a proxy measure for attention and it’s a poor measure of attention,” she said.
A better solution, experts say, would be to require Tesla to use cameras to monitor drivers’ eyes to make sure they’re watching the road. Some Teslas do have interior-facing cameras. But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.
Koopman noted that older Teslas lack such cameras.
Tesla’s recall documents say nothing about increased use of cameras. But the company’s software release notes posted on X, formerly Twitter, say that a camera above the rearview mirror can now determine whether a driver is paying attention and trigger alerts if they aren’t. Tesla, which has no media relations department, didn’t answer emailed questions about the release notes or other recall-related issues.
Tesla’s website says that Autopilot and more sophisticated “Full Self Driving” software cannot drive themselves and that drivers must be ready to intervene.
Experts say that although limiting where Autopilot can operate to controlled access highways would help, it’s unclear whether Tesla will do so with its recall.
In the recall documents it filed with NHTSA, Tesla says its basic Autopilot includes systems called Autosteer and Traffic Aware Cruise Control. The documents say that Autosteer is intended for use on controlled access highways and won’t work when a driver activates it under the wrong conditions. The software update, the documents say, will have “additional checks upon engaging Autosteer and while using the feature outside controlled access highways and when approaching traffic controls.”
Cummings noted that the documents don’t specifically say Tesla will limit the areas where Autopilot can work to limited-access freeways, a practice known as “geofencing.”
“When they say conditions, nowhere does that say geofenced,” she said.
Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that weren’t controlled access highways while testing a Tesla Model S that received the software update. But it’s difficult, she said, to test everything else in the recall because Tesla has been vague on exactly what it’s changing.
Homendy, the chairwoman of the transportation safety board, said she hopes NHTSA has reviewed Tesla’s solution to determine whether it does what the agency intended it to do.
The NTSB, which can make only recommendations, will investigate if it sees a problem with Teslas that received the recall repairs, Homendy said.
Veronica Morales, NHTSA’s communications director, said the agency doesn’t pre-approve recall fixes because federal law puts the burden on the automaker to develop and implement repairs. But she said the agency is keeping its investigation open and will monitor Tesla’s software or hardware fixes to make sure they work by testing them at NHTSA’s research and testing center in Ohio, where it has several Teslas available.
The agency received the software update on its vehicles only a few days ago and has yet to evaluate it, Morales said. The remedy must also address crashes on all roads, including highways, the agency said.
Cummings, a former NHTSA special adviser who is set to be an expert witness for the plaintiff in an upcoming Florida lawsuit against Tesla, said she expects Tesla’s warnings to deter a small number of drivers from abusing Autopilot. But the problems for Tesla, Cummings said, won’t end until it limits where the system can be used and fixes its computer vision system so it better detects obstacles.