This Technology Will Bring Huge Changes to Driving
We can look at a friend and tell from his or her face that the person is tired or preoccupied, or perhaps drunk. Soon, our cars will be able to do the same. Advances in facial recognition technology mean machines can recognize not only different people, but also how they are feeling. This means the next generation of automobiles may contain features that scan drivers' faces for fatigue or other signs of impairment.

Companies, including Boston-based Affectiva, are already making software to help the auto industry integrate such technology. According to CEO Rana el Kaliouby, software that reads emotion is not focused only on drivers, but on passengers, too. Automakers may therefore come to build vehicles that adjust comfort factors such as heat, lighting, and entertainment based on visual cues from their individual occupants, features that could be especially appealing as more autonomous cars hit the roads.

"It's really important technology not only have IQ, but lots of EQ too," said el Kaliouby, speaking on the morning of June 11 at Fortune's CEO Initiative in New York. She added that building empathy into machines is especially important given that humans use words for only 7% of their communications. The other 93%, she said, consists of vocal intonations, expressions, and body language.

While the auto industry appears well suited to integrate emotion-reading software, it is just one business where facial recognition could have a big impact. According to el Kaliouby, the personal care industry could also benefit from the technology. She described a scenario in which a nurse works in tandem with a team of empathetic robots to take care of patients.

Such a scenario, however, may also exacerbate fears of machines replacing humans, especially since the ability to understand emotions is often seen as what differentiates us from robots. But el Kaliouby said this should not be a concern. "It's not a competition between humans and machines. It's more like a partnership," she said, adding that people will always be the ones in charge of the machines.

El Kaliouby also addressed the risk of facial recognition makers perpetuating bias by training their algorithms on a narrow segment of society. She said Affectiva takes pains to avoid this by ensuring its databases are diverse in terms of gender, ethnicity, and age.