In the next couple of years, you may be able to diagnose your car’s ailments with little more than a smartphone app, thanks to research from engineers at the Massachusetts Institute of Technology. The system works by turning a phone’s microphone and accelerometers into a kind of automotive stethoscope that analyzes the sounds and vibrations made by a car to detect problems, even before the vehicle shows any obvious symptoms.
The MIT app has the potential to save the average driver $125 per year, according to Joshua Siegel, one of the research scientists behind a recent paper describing the app. That savings could be as much as $600 per year for trucks. Gas mileage could also increase by a few percentage points for all vehicles.
“Right now, we’re able to diagnose several common engine and suspension problems,” Siegel told Digital Trends. “Our engine diagnoses include identifying misfires and optimally timing filter changes, while our suspension diagnostics include identifying wheel imbalances and variations in tread depth and tire pressure.”
As with much of today’s emerging tech, Siegel and his team owe a lot to recent advancements in AI algorithms and cloud computation. Thanks to machine learning, the more data the app is able to collect, the more its performance improves.
“For each diagnostic or prognostic, we collect data from the sensor or sensors best able to measure the physical manifestation of a defect, and then train models to look for certain characteristic ‘fingerprints’ of each fault type,” Siegel explained.
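The fingerprint-and-train idea can be sketched in a few lines. This is a hypothetical illustration, not the MIT team’s actual pipeline: it simulates accelerometer traces (a fault adds an extra 120 Hz harmonic), reduces each trace to a coarse spectral “fingerprint,” and classifies new traces with a simple nearest-centroid model. The sample rate, fault signature, and band count are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 1000  # assumed sample rate in Hz (illustrative, not from the paper)

def record(fault, n=1024):
    """Simulate a short sensor trace; a fault adds a 120 Hz harmonic."""
    t = np.arange(n) / FS
    x = np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(n)
    if fault:
        x += 0.5 * np.sin(2 * np.pi * 120 * t)
    return x

def fingerprint(x, bands=8):
    """Spectral 'fingerprint': mean FFT magnitude in coarse frequency bands."""
    mag = np.abs(np.fft.rfft(x))
    feat = np.array([b.mean() for b in np.array_split(mag, bands)])
    return feat / feat.sum()  # normalize so amplitude doesn't dominate

# 'Train' by averaging fingerprints of labeled examples into class centroids.
healthy = np.stack([fingerprint(record(False)) for _ in range(20)])
faulty = np.stack([fingerprint(record(True)) for _ in range(20)])
centroids = {"healthy": healthy.mean(0), "faulty": faulty.mean(0)}

def diagnose(x):
    """Label a new trace by its nearest class centroid."""
    f = fingerprint(x)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))
```

A real deployment would use a far richer feature set and model, but the shape of the approach, label known-good and known-bad recordings, extract spectral features, and match new recordings against learned patterns, is the same.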
To analyze air filters, the app listens to how the car “breathes” to detect how clogged the airflow is. For misfires, it listens for what Siegel called the “characteristic popping sound” that tends to scare the wits out of people in the passenger seat. By measuring vibrations, the app can pick up on wheel imbalances.
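The wheel-imbalance case hinges on a simple physical fact: an imbalanced wheel shakes the car once per revolution, so the vibration frequency scales with road speed. A minimal sketch of that check, with an assumed sample rate and tire circumference (these numbers are illustrative, not from the research), scores how much of the accelerometer’s spectral energy sits at the wheel’s rotation frequency:

```python
import numpy as np

FS = 400  # assumed accelerometer sample rate in Hz (illustrative)

def wheel_hz(speed_mps, tire_circumference_m=2.0):
    """Once-per-revolution frequency of a wheel at a given road speed."""
    return speed_mps / tire_circumference_m

def imbalance_score(accel, speed_mps):
    """Fraction of spectral energy within 1 Hz of the wheel rotation frequency."""
    mag = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), 1 / FS)
    f0 = wheel_hz(speed_mps)
    band = (freqs > f0 - 1.0) & (freqs < f0 + 1.0)
    return mag[band].sum() / mag[1:].sum()  # skip the DC bin

# Demo: at 25 m/s (~90 km/h) the wheel turns at 12.5 Hz. An 'imbalanced'
# trace carries a tone at that frequency; a balanced one is just noise.
rng = np.random.default_rng(1)
t = np.arange(4096) / FS
bad = np.sin(2 * np.pi * 12.5 * t) + 0.2 * rng.standard_normal(4096)
good = 0.2 * rng.standard_normal(4096)
score_bad = imbalance_score(bad, 25.0)
score_good = imbalance_score(good, 25.0)
```

A high score at the speed-matched frequency, and not at others, is the kind of vibration signature the quote describes the phone picking up.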
“We’re able to do this because the sensors in your phone — the same microphone you use to talk, and the same accelerometer your phone uses to rotate the screen when you switch from portrait to landscape — are far more sensitive to small changes than people are,” Siegel said.
The MIT team is currently testing the app internally with promising results, Siegel said. Moving forward, the team will train the app on various vehicle types to add to their data, with the hope that the app will work on any type of consumer vehicle.
A paper detailing the research will be published in the November issue of the journal Engineering Applications of Artificial Intelligence.