Apple may introduce augmented reality functionality into the iPhone’s camera

Report: Apple has up to 1,000 engineers working on AR product for the iPhone

Google isn’t the only company looking to cash in on the augmented reality craze. Business Insider reports that Apple plans to integrate augmented reality tech directly into the iPhone’s camera app.

The new feature would reportedly rely on computer vision: Apple's work-in-progress camera app will be able to identify objects in frame and recognize faces, according to the report. The technology will also be offered to developers in the form of a software development kit.

The most recent news about Apple's augmented reality plans suggests that Apple may have as many as 1,000 engineers working on an AR-related product in Israel, one that could make its way to the next iPhone, at least according to analyst Steven Milunovich and his team at UBS. The Business Insider report also highlights a number of Apple's AR-related purchases, including PrimeSense, a 3D sensing company in Tel Aviv, and RealFace, a facial recognition company also based in Tel Aviv.

The Cupertino, California-based company is far from the first to enter the AR arena. Google's object recognition app, Google Goggles, recognizes landmarks, barcodes, books, and works of art, and parses the text of labels and signage using optical character recognition. And Amazon's Flow app can decode QR codes, scan business cards, and recognize tens of millions of books, DVDs, and packaged products.

Apple’s system sounds nearly as ambitious: an app that can identify objects that users point the iPhone’s camera at in real time. It will rely on machine learning, a type of artificial intelligence that “learns” and improves over time, and a database of 3D objects that Apple will either license or build itself.

Beyond those basics, the project's implications aren't clear. Google's Project Tango, an augmented reality platform, leverages sensors to measure the depth of its surroundings, but the iPhone lacks the hardware necessary to perform that sort of tracking. AppleInsider speculates that Apple's brand of machine-learning-powered object tracking could be used for spatial mapping, and that facial recognition, meanwhile, could be used to apply Snapchat-style filters to people's faces.

Spearheading the project is a team composed of employees from Apple's recent acquisitions. The iPhone maker purchased PrimeSense, the Israeli company behind the motion-tracking hardware in Microsoft's Kinect sensor, in 2013. It bought Metaio and FaceShift in 2015 and Flyby Media in January 2016; all three specialize in virtual reality and AR technologies. And in September, it hired a senior optics manufacturing engineer specializing in heads-up displays, camera systems, and image sensors.

The project builds on another reportedly in development at Apple's Cupertino headquarters: AR glasses. According to Business Insider, Apple is developing a compact pair of AR glasses that connects wirelessly to an iPhone and displays images and other information in the wearer's field of vision.

If anything is certain, it's that an updated camera app will debut far ahead of a headset. Bloomberg reports that Apple has begun talking with suppliers about the glasses and has ordered "small quantities" of displays for testing. The publication pegs 2018 as the product's earliest possible release window.

Until then, Snapchat’s Spectacles will have to do.

Updated on 03-02-2017 by Christian de Looper: Added new Business Insider report saying that Apple had as many as 1,000 engineers working on AR.

Kyle Wiggers
Former Digital Trends Contributor