
You’ll soon be able to control your iPhone and iPad with your eyes

Controlling iPad with eye movement. (Image: Apple)

Apple has announced a bunch of new accessibility features that will arrive later this year for iPhone and iPad owners. Notable among them is the ability to interact with iOS and iPadOS interfaces using eye movement, a capability that has so far only appeared in a similar form on Mac hardware.

The company calls it Eye Tracking, and it’s a system built on the foundations of Dwell Control. So far, Dwell Control has been available as part of the Accessibility Keyboard on macOS, allowing users to execute mouse actions using eye and head gestures.


On the iPhone and iPad, Eye Tracking will require only a few seconds to calibrate and will work using the front camera. Once enabled, it will let users with physical disabilities perform swipes and button presses with their eye movements.

Dwell actions are also available for the Vision Pro headset. On the pricey XR machine, they are bundled as part of the AssistiveTouch system under accessibility settings. On Macs, eye and head gestures can trigger mouse clicks, drag-and-drop, and other core UI controls.

Music Haptics on iPhone. (Image: Apple)

For users with hearing challenges, Apple is adding a feature on iPhones called Music Haptics. Once activated, the Taptic Engine fitted inside the iPhone will produce vibrations in sync with music playback, using a mix of rhythmic taps, textures, and smoother vibrations.

The feature already works with the millions of songs in the Apple Music library. Developers will also be able to use an application programming interface (API) to add vibration-based accessibility feedback, making their apps more inclusive and functionally rewarding for people with hearing issues.
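Apple hasn’t detailed the Music Haptics API in this announcement, so as a rough illustration of what vibration-based feedback looks like for developers today, here is a minimal sketch using the existing Core Haptics framework (not the new Music Haptics API); the function name, timing, and intensity values are purely illustrative.

```swift
import CoreHaptics

// Minimal sketch: a short pattern mixing transient taps with a softer
// continuous vibration, roughly the kind of feedback the article describes.
// Timing and intensity values are illustrative only.
func playSampleHaptics() throws {
    // Bail out on hardware without a haptics-capable Taptic Engine.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    var events: [CHHapticEvent] = []

    // Three sharp, rhythmic taps spaced 0.25 seconds apart.
    for i in 0..<3 {
        events.append(CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: Double(i) * 0.25))
    }

    // A softer, sustained vibration that follows the taps.
    events.append(CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2),
        ],
        relativeTime: 0.9,
        duration: 0.6))

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Presumably, the dedicated Music Haptics API will handle synchronizing patterns like this to a song’s audio automatically; the sketch above only shows the building blocks developers already have.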

Vocal Shortcuts on iPhone. (Image: Apple)

For people living with speech-related difficulties, Apple is adding a couple of new features to its phones and tablets. The first one is Atypical Speech, which relies on machine learning to identify the unique speech signature of a person so that it can help them perform tasks using voice commands.

Next in line is Vocal Shortcuts. It allows users to record custom audio cues and assign them as shortcuts for various on-device tasks, which can be single-step or multi-step. Apple says these features have been “designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.”

Vehicle Motion Cues on iPhone. (Image: Apple)

Another related upcoming feature is Personal Voice. People who find it hard to say or read long sentences can create a Personal Voice using shorter phrases.

Apple has also developed a wellness feature that addresses motion sickness in moving vehicles. Called Vehicle Motion Cues, it shows animated dots on the screen that are dynamically aligned with the vehicle’s directional movement once enabled. The idea is to reduce sensory conflict, making it easier for users to read on-screen content.

Nadeem Sarwar
Google Gemini arrives on iPhone as a native app
The Google extensions feature on iPhone.

Google announced Thursday that it has released a new native Gemini app for iOS that will give iPhone users free, direct access to the chatbot without the need for a mobile web browser.

The Gemini mobile app has been available for Android since February, when the platform transitioned from the older Bard branding. However, iOS users could only access the AI on their phones through the mobile Google app or a web browser. This new app provides a more streamlined way of chatting with the bot, as well as a host of features that are new to iOS.

A must-try Android app has finally arrived on the iPhone
Person holding a phone with Google Gemini Live being shown.

A few days ago, Google Gemini appeared in the Apple App Store for a user in the Philippines, who was even able to download it. We took it as a sign that the new AI assistant would soon make its way to the App Store in the U.S. Well, we were right, as you can now download Gemini as a standalone app on your iPhone, after previously only being able to access it through a browser.

The Gemini app is free to download and has a surprising number of features available. More powerful functions are available for a $20-per-month subscription, but you can try Gemini Advanced out for one month for free. It grants priority access to new features and gives a "1 million token" context window.

A new iPhone may arrive sooner than you think
iPhone SE (2022) held in a man’s hand.

With the recent release of the iPhone 16 models, you may think Apple is done with new releases for a while. Perhaps not, as attention is now shifting to the upcoming year, and we may get another new iPhone sooner than you think. Apple's first new handset, expected in early 2025, will likely be the iPhone SE 4. We now have a clearer idea of when this phone might launch.

According to Korea's Ajunews (via MacRumors), component manufacturer LG Innotek is expected to begin mass production of a camera module that will potentially be used in the iPhone SE 4 as early as next month. The company will supply the front camera module for the budget-friendly phone. The report also stated that camera production often starts about three months before the final phone arrives on the market. A spring 2025 release for the iPhone SE 4 has long been rumored, and the report seems to back this up.
