Ford: Using Machine Learning To Humanize Vehicles
Imagine a car that understands how you feel in real time and adapts accordingly. It could adjust the cockpit lighting if it senses that you are sad, or take over the wheel if you are too angry to drive. Is this where the market should go?
With the rise of the likes of Alexa, Google Home, and Siri, consumer demand for highly personalized, adaptive products is increasing, and the automotive industry is no exception. “People today want to interact with technology, whether that’s a virtual assistant in their homes or their cars,” said Dr. Rana El Kaliouby, CEO of Affectiva, the global leader in artificial emotional intelligence.1 The research team at Ford spotted this trend early and, in an effort to maintain market share, teamed up with Affectiva to apply machine learning techniques to empathetic vehicles that deliver a personalized experience. According to Dimitar Filev, Executive Technical Leader at Ford, incorporating machine learning is a natural response to growing consumer demand for new features and to the abundance of data made usable by the high computational power of current vehicles.2
A car that understands how you feel
Ford’s product development strategy is built on sentiment analysis software that detects the driver’s emotions inside the car. Using an emotion-detecting algorithm, it can capture in real time a wide range of facial expressions conveying feelings including anger, joy, drowsiness, surprise, and engagement.1 The applications of such technology in the automotive industry are vast and could put Ford at the forefront of building cars that adapt to drivers’ and passengers’ complex cognitive and emotional states. Ford is not alone: companies in other industries, such as Facebook, Google, and Mercedes, are also investing in algorithms to decode their users’ emotions. The market for such affective machine learning solutions is estimated to reach $41 billion by 2022.3
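As a rough illustration of how frame-level emotion classification works, here is a toy sketch in Python. Everything in it is a hypothetical stand-in: the label set, the raw feature scores, and the classifier are assumptions for illustration, not Affectiva’s or Ford’s actual pipeline, which relies on deep networks trained on large labeled face-video datasets.

```python
# Illustrative sketch only: a toy per-frame emotion classifier.
# The feature scores would come from an upstream facial-landmark or
# CNN feature extractor, which is out of scope here.
import numpy as np

EMOTIONS = ["anger", "joy", "drowsiness", "surprise", "engagement"]

def softmax(scores):
    """Turn raw per-emotion scores into a probability distribution."""
    exp = np.exp(scores - np.max(scores))  # subtract max for stability
    return exp / exp.sum()

def classify_frame(feature_scores):
    """Map raw per-emotion scores for one face frame to (label, confidence)."""
    probs = softmax(np.asarray(feature_scores, dtype=float))
    idx = int(np.argmax(probs))
    return EMOTIONS[idx], float(probs[idx])

# Hypothetical scores for one frame: "joy" dominates.
label, confidence = classify_frame([0.2, 2.5, 0.1, 0.3, 1.0])
print(label, round(confidence, 2))
```

A production system would run this loop on a video stream many times per second and smooth the per-frame outputs over time before acting on them.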
In January 2018, Ford revealed a prototype with its first application of cognitive learning, the ‘Buzz Car’. Wearable sensors measure the driver’s blood pressure, and whenever the driver gets excited while speeding up, the car flashes almost 200,000 LEDs on its exterior, channeling this ‘buzz’ into striking animations. After a series of tests, researchers at Ford found that driving is second only to riding roller coasters among activities that excite people,4 ahead of seemingly rejuvenating activities like kissing or watching an episode of Game of Thrones. This raises the pressure on Ford to deliver a more empathetic driving experience, especially since competitors like Mercedes and Tesla are moving in the same direction.
From telling a joke to self-driving
Filev believes Ford’s emotional intelligence technology is becoming increasingly advanced. The company is currently developing a feature that adjusts the car’s atmosphere based on the driver’s mood, changing the climate settings, cockpit lighting, or choice of music to match or shift the captured emotion.5 If a driver is happy, the car could play a happy song; if the driver is sad, it could crack a good joke. In the near future, Ford plans to roll out a car that could take over the wheel in extreme states of anger, drowsiness, stress, or distraction, with the aim of reducing the number of accidents and making driving safer. While using machine and deep learning to produce autonomous vehicles is Ford’s end goal, the empathetic car has many other applications, such as cognition-based suspension and fuel-consumption adjustment, and perceptive acceleration and braking.2
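The mood-to-action logic described above can be sketched as a simple rule table. This is purely illustrative: the emotion labels, cabin actions, and escalation rule below are assumptions, not Ford’s actual (unpublished) logic.

```python
# Hedged sketch of a rule-based mood-to-cabin-action policy.
# All labels and actions are hypothetical placeholders.
CABIN_POLICY = {
    "joy":        {"music": "upbeat playlist", "lighting": "bright"},
    "sadness":    {"music": "tell a joke",     "lighting": "warm"},
    "anger":      {"lighting": "calm blue"},
    "drowsiness": {"climate": "cool air"},
}

# States in which the article suggests the car might take over the wheel.
SAFETY_CRITICAL = {"anger", "drowsiness", "stress", "distraction"}

def respond_to_mood(emotion):
    """Return cabin adjustments for a detected emotion, escalating to
    an autonomous-takeover flag for safety-critical states."""
    action = dict(CABIN_POLICY.get(emotion, {}))
    if emotion in SAFETY_CRITICAL:
        action["handover"] = True  # car takes over the wheel
    return action

print(respond_to_mood("sadness"))     # cracks a joke, warm lighting
print(respond_to_mood("drowsiness"))  # cools the cabin and takes over
```

The design choice worth noting is the separation between comfort actions (music, lighting) and the safety escalation path, so a misread of a benign emotion never triggers a takeover.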
A closer look into the future
Studies show that machine learning technologies that detect human emotion through facial expressions are at best 93.3% accurate.6 That margin of error might be acceptable in applications like advertisement rating, but any cognition-based self-driving feature Ford rolls out must approach 100% accuracy. Ford is creating a very sensitive, high-risk product that could produce a wide range of unfavorable outcomes, and one bad user experience could be detrimental to the brand. Moreover, for many people, driving is about the experience of controlling their own vehicle, and changing that requires a behavioral shift in Ford’s customer base. Consequently, I think Ford should conduct more rigorous market research to validate buy-in from its customers and avoid designing a great product that people do not really want. This applies especially to its short-term goals, where the iterations aim to provide a humanized, luxurious driving experience rather than solving a core issue such as reducing car accidents.
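To see why that accuracy figure is worrying for safety-critical use, a quick back-of-envelope calculation shows how per-inference errors compound over repeated decisions. The inference counts below are assumptions for illustration; only the 93.3% figure comes from the cited study.

```python
# Back-of-envelope: chance of at least one misclassification across
# n independent inferences at a fixed per-inference accuracy.
PER_INFERENCE_ACCURACY = 0.933  # best case reported in the cited study

def prob_at_least_one_error(accuracy, n_inferences):
    """P(one or more errors in n independent inferences)."""
    return 1 - accuracy ** n_inferences

for n in (1, 10, 100):
    p = prob_at_least_one_error(PER_INFERENCE_ACCURACY, n)
    print(f"{n:>3} inferences -> P(at least one error) = {p:.3f}")
```

Even under the (optimistic) independence assumption, at 100 inferences the probability of at least one misread approaches certainty, which is why a raw emotion classifier cannot by itself gate a takeover decision.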
Emotion-based technologies always raise ethical and existential questions about machines taking power, questions I believe apply squarely to Ford and the automotive industry: Do we want machines to understand how we feel and control us accordingly? Is this a breach of privacy? If drivers are about to engage in unlawful acts, will the car lock them in? Does this limit humans’ free will? And if cars can communicate with each other, does that raise security concerns?
1 “Affectiva and Nuance to Bring Emotional Intelligence to AI-Powered Automotive Assistants,” Business Wire, September 6, 2018, https://tinyurl.com/y7ef6dw4, accessed November 2018.
2 Filev, Dimitar. Interview by Medha Agarwal. “Adopting AI in the Enterprise: Ford Motor Company,” O’Reilly, July 20, 2017, https://tinyurl.com/ycl9yf2v, accessed November 2018.
3 Sophie Kleber, “3 Ways AI Is Getting More Emotional,” Harvard Business Review, July 31, 2018, https://tinyurl.com/y8rjkf8z, accessed November 2018.
4 Ford Company, “Ford Media Center,” https://media.ford.com/content/fordmedia/fna/us/en/news/2018/01/23/want-to-feel-good–forget-kissing–football-and-dancing–get-a-s.html, accessed November 2018.
5 Ian Thibodeau, “Ford wants your new car to pick a song – or tell a joke,” Detroit News, February 22, 2017, https://tinyurl.com/y9vg2zd2, accessed November 2018.
6 M. S. Bartlett, G. Littlewort, C. Lainscsek, I. Fasel and J. Movellan, “Machine learning methods for fully automatic recognition of facial expressions and facial actions,” 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583), The Hague, 2004, pp. 592-597 vol.1. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1398364&isnumber=30409
Student comments on Ford: Using Machine Learning To Humanize Vehicles
Very interesting article. I believe that this technology’s biggest application would be in safety. Imagine a car that could detect if a driver is having a heated argument, or falling asleep, and could consequently deploy a safety mechanism to take over and avoid an accident. Sounds like a winner for everyone to me. However, to what extent will machines ever be able to do this effectively? What if I’m driving someone to the hospital in an emergency? Would the car assess my stress level and facial expression as “danger” and take over? You touch on a really interesting point, which is the extent to which we want to surrender control to machines. It’s a slippery slope that we don’t know much about and that could have catastrophic implications.
As someone who has driven a vehicle in some pretty stressful situations, I’d argue that when you’re displaying “danger” is the best time for a machine to take over. As levels of stress go up, human performance declines significantly. By taking over when you are at your worst, the AI driving the car is more likely to prevent an accident by driving by itself.
The potential to establish a link between autonomous cars could fix the issue of emergencies. If autonomous vehicles were able to communicate, they could move out of the way for a vehicle experiencing an emergency, allowing you to get to the hospital faster. This would also have a use case for mechanical failure. I recently had a brake line blow out but had no way to effectively communicate that to other cars in the area as I barreled through a stop sign at 30 MPH. An autonomous vehicle would have been able to send out a danger signal to others in the area and keep them clear, preventing an accident.
Unexpected application of facial/emotional recognition by Ford! I think that combined with information on driving speed/pattern, the technology could have a lot of cool applications. (One great implementation would be in the prevention of drunk driving.) While I think it has potential for more good than harm, you’re right that Ford will have to get over consumers’ negative visceral reaction to having their emotions monitored. Especially for negative emotions, such as anger or sadness, it is hard to predict how an individual will react to different kinds of intervention, and so that poses a challenge for design as well.