4 New Accessibility Features Worth Checking Out In iOS 18

Apple's WWDC 2024 event was jam-packed with exciting announcements and new features set to roll out with iOS 18. Beyond the customization galore and the bucketload of AI-powered elements, the new software for iPhones and iPads comes with a few potent accessibility-oriented additions. Apple has long catered to a wide demographic of users and has consistently promoted accessibility settings to ensure fewer roadblocks for anyone using its products.

Though tailor-made for people with disabilities or those who have difficulty interacting with their devices, these four accessibility features in iOS 18 can be used by just about anyone looking to further enhance the experience of using their iPhone or iPad. At the time of writing, these accessibility options are available in the developer beta of iOS 18 and should make their way to a stable release later down the road. With that said, here's everything you need to know about the new batch of inclusive controls for Apple devices.

Eye Tracking

It doesn't get any more futuristic than controlling a device with just your eyes, and that is now possible in iOS and iPadOS 18. To test out Eye Tracking on your device, navigate to Settings > Accessibility > Eye Tracking and get started with the calibration process. You will soon be able to interact with your iPhone or iPad using nothing but your eyes. Glancing at different toggles and parts of your screen prompts your device to select that element, and holding your stare for a bit performs the dwell action, which by default is set to a single tap. You can perform more specific tasks, like summoning Siri or using the volume buttons, through the AssistiveTouch shortcut.
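For the technically curious, the dwell behavior boils down to a simple idea: if your gaze stays on the same element long enough, the system fires the default action. Here's a minimal, purely illustrative Swift sketch of that logic; Apple doesn't expose a public eye-tracking API, so the gaze source and element identifiers here are hypothetical.

```swift
import Foundation

// Conceptual sketch of a dwell timer, assuming a hypothetical gaze source.
// This is not Apple's Eye Tracking feature, which has no public interface;
// it only illustrates the "hold your stare to tap" idea.
final class DwellSelector {
    private let dwellDuration: TimeInterval
    private var currentTarget: String?   // identifier of the element being looked at
    private var gazeStart: Date?

    init(dwellDuration: TimeInterval = 1.0) {
        self.dwellDuration = dwellDuration
    }

    /// Call this whenever the gaze lands on (or leaves) an on-screen element.
    /// Returns the element to "tap" once the gaze has dwelled long enough.
    func gazeMoved(to target: String?) -> String? {
        guard let target else {
            currentTarget = nil
            gazeStart = nil
            return nil
        }
        if target != currentTarget {
            // New element: restart the dwell timer.
            currentTarget = target
            gazeStart = Date()
            return nil
        }
        if let start = gazeStart, Date().timeIntervalSince(start) >= dwellDuration {
            gazeStart = nil              // avoid repeated triggers on the same dwell
            return target                // perform the default action (a single tap)
        }
        return nil
    }
}
```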

In our testing, the feature in its current state is definitely far from perfect. Despite carrying out the calibration process in more than adequate lighting, the iPhone consistently tracked the wrong options on screen. The complementary dwell control, while useful, might take some getting used to. Unlike Samsung's Smart Scroll, Apple's Eye Tracking is clearly meant as an accessibility alternative for controlling your device rather than a flashier way of using a gadget. The feature is more useful on the bigger screen of an iPad, and we hope Apple continues to improve Eye Tracking in later releases of iOS and iPadOS.

Music Haptics

Listening to music is inarguably one of the most therapeutic ways to relax, and with Music Haptics in iOS 18, those who are hard of hearing can enjoy their favorite songs through vibration feedback. The feature uses the Taptic Engine found in iPhones to create vibration patterns that sync to the beat of the song that's playing. To enable Music Haptics on your iPhone, navigate to Settings > Accessibility > Music Haptics and turn the toggle on. Since iPads don't have a Taptic Engine for vibrations, this accessibility feature is exclusive to iPhones. The good part is that no matter which supported iPhone you own, the incredibly precise vibration motor underneath is sure to impress.

Currently, Music Haptics is only supported in select songs on Apple Music, so if Spotify or YouTube Music is your preferred streaming app, you might need to wait until third-party developers incorporate the feature into their apps. Luckily, we found that Music Haptics works across a wide variety of songs on Apple Music, although some titles are better optimized than others. Enabling this setting might also cause your iPhone's battery to drain faster. You can customize the Control Center to add a shortcut to this feature for an easier way to toggle it on or off.
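Apple has said Music Haptics will also be available to developers as an API, though the snippet below isn't that API. It's a rough Core Haptics sketch of the underlying idea: assuming an app has already derived beat timestamps from its own audio analysis, it can schedule one sharp haptic tap per beat.

```swift
import CoreHaptics

// A rough sketch of beat-synced haptics using Core Haptics. This is not the
// Music Haptics API itself; it only illustrates playing transient taps at
// beat timestamps an app has worked out from its own audio analysis.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    // In a real app, keep a reference to the engine so playback isn't cut short.
    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient tap per detected beat.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```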

Vehicle Motion Cues

Motion sickness is the nausea you feel in a moving car, airplane, or ship, and it happens due to a mismatch between sensory inputs, such as what you see versus what you feel. The effects often worsen if you try to use your phone or read a book for a prolonged time in a moving vehicle. With iOS 18, Apple has addressed this inconvenience, and as someone who suffers from extreme motion sickness in vehicles, I was blown away by the effectiveness of this feature.

Once the feature is turned on, you'll notice dots on the screen, mostly scattered along the left and right edges of the display. Using various sensors in the iPhone to detect changes in your movement, iOS animates these dots so they mirror the motion of the vehicle you're in. This means you can stay focused on the content on your screen while the dots trick your brain into largely negating the motion sickness. As with Music Haptics, you can add a Control Center shortcut to toggle Vehicle Motion Cues. Apart from the "On" and "Off" options, you can set the feature to turn on automatically when it detects you are in a moving vehicle.
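Under the hood, the concept is straightforward: read the phone's motion sensors and nudge on-screen elements in response. The Swift sketch below is only an illustration of that idea using Core Motion, not Apple's actual implementation; the scaling factor and callback are made up for the example.

```swift
import CoreMotion

// Conceptual sketch of motion cues: stream the device's motion data and
// translate acceleration into an on-screen offset a view could apply to
// animated dots. Purely illustrative, not how Apple implements the feature.
final class MotionCueTracker {
    private let motionManager = CMMotionManager()

    /// Starts streaming user acceleration and reports a horizontal/vertical
    /// offset (in points) for the dots.
    func start(onUpdate: @escaping (_ xOffset: Double, _ yOffset: Double) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            // Scale acceleration (in g) into a small displacement, moving the
            // dots opposite to the felt force.
            onUpdate(-accel.x * 40, accel.y * 40)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```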

Vocal Shortcuts

Also announced at the keynote was Apple Intelligence and its promise to supercharge Siri, giving it perhaps another shot at being a usable smart assistant. Though Siri has consistently fallen behind the likes of Google Assistant and Alexa, people often overlook the tricks that can help them get the most out of it. With iOS 18, you can pair Siri's ability to launch apps and control a few device settings with the convenience of Vocal Shortcuts.

This new accessibility feature, once set up, lets you carry out your favorite Siri commands without having to invoke the digital assistant with the "Hey Siri" catchphrase. This not only saves valuable time but also cuts down on vocal interactions with your phone. An example helps illustrate the feature: if you regularly launch Netflix on your iPhone, you've so far had to say "Hey Siri" out loud, wait for the assistant to pop up, and then command it to "Open Netflix."

With Vocal Shortcuts, you can bypass this by setting a custom phrase like "Open Netflix." The next time you utter the phrase, your iPhone or iPad will perform the action. Though it still relies on Siri's capabilities, Vocal Shortcuts makes it easier for people with speech difficulties to keep the spoken instructions needed to perform tasks on their device to a minimum.
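For developers wondering where these spoken actions come from, apps expose their own shortcut actions to the system through the App Intents framework, and Siri and Shortcuts can then trigger them by phrase. The Swift sketch below shows that standard mechanism rather than the Vocal Shortcuts setting itself, and the intent name and behavior are hypothetical.

```swift
import AppIntents

// Minimal App Intents sketch of an app-defined action a spoken phrase can
// trigger. The intent name and behavior are hypothetical; this is the standard
// App Shortcuts mechanism, not the Vocal Shortcuts accessibility setting.
struct OpenDownloadsIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Downloads"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // Navigate to the app's downloads screen here.
        return .result()
    }
}

struct ExampleAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenDownloadsIntent(),
            phrases: ["Open downloads in \(.applicationName)"],
            shortTitle: "Open Downloads",
            systemImageName: "arrow.down.circle"
        )
    }
}
```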