As software plays an increasingly important role in defining a car, a growing number of tech companies are eyeing the automotive industry as the next frontier. Apple is reportedly building an EV. Sony surprised us by making one and even testing it in real-world conditions, but the company stressed that it currently has no plans to sell the Vision-S electric car to the public.
Samsung-owned Harman is taking a different path into the sector by focusing on specific elements of the user experience, including advanced driver-assistance systems (ADAS) and connectivity. It's developing features like rearview cameras and augmented reality navigation systems, but the company told Digital Trends that personalization is the real game-changer.
“There are many people who say that lane-keeping assist is a great feature, it keeps them in their lane if they’re not 100% attentive, but there are many others who say that they don’t like it. It reacts in a way that they don’t understand. To get these people motivated, personalization is key. This, for me, will be the major breakthrough. Identifying who is behind the steering wheel, and providing personalized application of the different features via the cloud,” explained Bernhard Pirkl, Harman’s vice president of ADAS, in an interview with Digital Trends.
Pirkl hopped on the ADAS train when the technology was still in its infancy, and he has watched it evolve from a science-fair experiment into one of the most important trends shaping the automotive industry. He envisions a near future in which drivers can customize how the various electronic driving aids (like lane-keeping assist and automatic emergency braking) behave; you'll be able to adjust them to suit your driving style, from loose, occasional interventions to aggressive, frequent takeovers. Better yet, these systems will adjust automatically once you save your preferences. It's like a memory function for electronics, and it's part of a broader shift toward the personalization of in-car technology that Harman calls Experiences Per Mile.
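To make that idea concrete, here is a minimal sketch of how such saved preferences might be represented in software. Everything below, from the InterventionLevel scale to the AdasProfile fields, is a hypothetical illustration rather than Harman's actual schema.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical intervention levels -- a sketch of how "loose" vs.
# "aggressive" assistance could be encoded, not a real standard.
class InterventionLevel(Enum):
    OFF = 0
    LOOSE = 1        # occasional, gentle corrections
    BALANCED = 2
    AGGRESSIVE = 3   # frequent, early takeovers

@dataclass
class AdasProfile:
    """Per-driver ADAS preferences, the 'memory function' saved to the cloud."""
    driver_id: str
    lane_keeping: InterventionLevel = InterventionLevel.BALANCED
    emergency_braking: InterventionLevel = InterventionLevel.AGGRESSIVE
    adaptive_cruise_gap_seconds: float = 2.0  # preferred following distance

# Example: a driver who wants only minimal steering interventions.
profile = AdasProfile(driver_id="driver-42",
                      lane_keeping=InterventionLevel.LOOSE)
```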
Infotainment is part of this shift, too. Even if you and your significant other have identical phones, you've each got your own apps and background images. Why should your car's touchscreen be any different? In some late-model cars, notably most new Audi models, users can already drag and drop icons to rearrange them, like on a smartphone.
Providing tailored content, whether it's a ska playlist or an adaptive cruise control setting, requires figuring out who is behind the wheel. That's where Harman's parent company, Samsung, comes in. It already has facial recognition, gaze tracking, and iris recognition technology in its arsenal; it has offered some of these features in its phones for several years. From there, it's just a matter of embedding them in a car and connecting the system to the cloud.
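Here is an equally rough sketch of how that flow could fit together: recognize the driver, pull their saved profile from the cloud, and hand it to the driver-assistance stack. All of the function names and the in-memory "cloud" store are illustrative assumptions, not a real Harman or Samsung API.

```python
# Stand-in for the cloud profile store (in reality, a remote service).
CLOUD_PROFILES = {
    "driver-42": {"lane_keeping": "loose", "emergency_braking": "aggressive"},
}

def identify_driver(camera_frame: bytes) -> str:
    """Placeholder for in-cabin facial recognition; returns a driver ID."""
    return "driver-42"  # a real system would run a vision model here

def fetch_profile(driver_id: str) -> dict:
    """Look up the driver's saved ADAS preferences."""
    return CLOUD_PROFILES.get(driver_id, {"lane_keeping": "balanced",
                                          "emergency_braking": "aggressive"})

def apply_profile(profile: dict) -> None:
    """Hand the preferences to the (hypothetical) driver-assistance stack."""
    print(f"Lane keeping: {profile['lane_keeping']}, "
          f"emergency braking: {profile['emergency_braking']}")

# When the driver sits down, the car recognizes them and restores their settings.
apply_profile(fetch_profile(identify_driver(b"<camera frame>")))
```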
Offering more personalization options will allow carmakers to differentiate their products as electrification and different degrees of autonomy gradually spread across the industry. Imagine if Apple ends up building a car, whether it’s with Kia or with another company. The model’s main selling points will undoubtedly be its design and its user interface, not its lap time on the Nürburgring or its hand-built V8 engine. Mechanical specifications will matter less than connectivity. “Our car knows you love horror films and has already curated a playlist for your trip to Oregon” will lure more buyers than “our car has a twin-turbocharged, 4.0-liter V8 built with rally-bred technology.”
Harman doesn't build vehicles, so it's up to individual manufacturers to bake these features into your next car. Pirkl is confident that demand for personalization will increase as partially automated systems spread across the industry.
Full autonomy? Check back later
Harman, like a majority of its peers and rivals, doesn't believe full autonomy is around the corner. Pirkl explained that technology developed in the last decade can drive a car better than a human can. It can steer, brake, accelerate, and change gears more smoothly and more accurately. What these systems struggle with is the unexpected.
“Observing a situation, understanding the context, and predicting what will happen in the next five, 10, or 15 seconds is the key challenge,” he explained. Technology like lidar and 5G can help engineers overcome this hurdle.
Picture this admittedly extreme scenario: driving to the bakery for a delicious scone, you approach an intersection and notice the light is green, so you keep your foot down without thinking twice. Meanwhile, a car traveling on the road perpendicular to yours can't stop for its red light due to a brake problem. Your self-driving car might not see the runaway vehicle if signs, trucks, or buildings block the corner of the intersection, and it would be caught off guard. It might avoid the collision, but the maneuver almost certainly wouldn't be smooth. Adding 5G-enabled vehicle-to-vehicle communication to this scenario would allow the runaway car to tell the vehicles around it, including yours, "Watch out, I can't stop, please slow down." Teaching cars to talk to each other promises to make autonomy much safer.
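As a toy illustration, a vehicle-to-vehicle warning like that could boil down to a small broadcast message and a handler on the receiving car. The message fields and the transport below are assumptions made for the sake of the example, not an actual V2X standard.

```python
import json
import time

def build_emergency_warning(vehicle_id: str, lat: float, lon: float,
                            speed_mps: float) -> str:
    """Encode the brake-failure warning the runaway car would broadcast."""
    return json.dumps({
        "type": "EMERGENCY_BRAKE_FAILURE",
        "vehicle_id": vehicle_id,
        "position": {"lat": lat, "lon": lon},
        "speed_mps": speed_mps,
        "timestamp": time.time(),
    })

def handle_incoming_message(raw: str) -> None:
    """What a nearby (possibly self-driving) car might do with the warning."""
    msg = json.loads(raw)
    if msg["type"] == "EMERGENCY_BRAKE_FAILURE":
        # Slow down smoothly before the runaway vehicle is even visible.
        print(f"Warning from {msg['vehicle_id']}: reducing speed ahead of intersection")

# The runaway car broadcasts; your car receives the message and reacts.
handle_incoming_message(build_emergency_warning("car-77", 45.52, -122.68, 22.0))
```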
We’re not there yet. That’s not to say self-driving technology has no future, however.
“What we see is that the focus has shifted more towards people-movers in operational design domains, so in dedicated environments like airports and university campuses, things like that. Or, towards commercial vehicles, where there is much less vehicle dynamics. The situations are far less complex. Autonomy will first come via these business areas, but for privately-owned cars it will take much longer than expected,” summed up Pirkl.