While some predicted that this year’s Apple WWDC 2018 conference wouldn’t have much to show, since many of the major announcements had leaked early or were already known about, there was one big surprise: the AI Phone.
Don’t be confused: this new product is still an iPhone, despite the extra A at the start of its name. Rather, the re-branding represents a shift in direction for the line, and hints at where the devices will make their biggest advances in the future. As the name suggests, the AI Phone focuses on its integrated AI, which in this case means the Siri mobile assistant. The upgrade dramatically expands the assistant’s functionality, giving it predictive capabilities that begin to blend with the artificial intelligence of science fiction.
The new “Siri Suggestions” feature will appear right under the iPhone’s search screen, possibly in place of the existing search suggestions. Siri may suggest that you call your mother on her birthday, switch your phone to silent at the movies, set an alarm because you’ve set the same one on previous nights, or open a workout app because you’re at the gym. It aims to automate many of the daily tasks you use your iPhone for and reinforce the routines you already fall into.
Siri will also be able to surface suggestions on the lock screen, notification-style. But instead of reactive alerts about events in the outside world, these are Siri’s proactive attempts to predict what you’ll do next with your phone. Do you regularly order coffee at a certain time of day? Siri might learn the pattern and offer to open your coffee app and place the order for you. Do you need a gym’s mobile app to “scan in” and prove your membership? Siri asks if you’d like to open that app because the phone’s location services say you’ve arrived at the gym.
But perhaps the most impressive feature is a multi-function interface that lets users set custom commands for Siri. This feature is huge, both in functionality and in how uncharacteristic it is for Apple to give users this much control over its pre-set interface. Android hacks and apps have allowed custom commands for voice assistants for years, but now Apple is making the function official, and very user-friendly as well.
On stage, Siri Shortcuts team member Kim Bernett demonstrated the power of the feature. When she told her iPhone she was “heading home”, the phone automatically ran through a series of actions: it launched Apple Maps and set her destination to home, turned on her favorite NPR station, adjusted her home thermostat, turned on her fan, and sent her estimated arrival time to her roommate.
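For developers curious about what powers that demo: based on the SiriKit additions Apple describes for iOS 12, an app can hand Siri a shortcut and present the system’s record-a-phrase screen. The sketch below is a rough, hypothetical Swift example of that flow; the app name, activity type, title, and phrase are invented for illustration, not taken from the keynote.

```swift
import UIKit
import Intents
import IntentsUI

// A settings screen that offers the system "Add to Siri" sheet, where the user
// records a custom phrase (e.g. "Heading home") for one of the app's actions.
// The activity type, title, and phrase below are placeholders, not real values.
class RoutineSettingsViewController: UIViewController, INUIAddVoiceShortcutViewControllerDelegate {

    func presentAddToSiri() {
        let activity = NSUserActivity(activityType: "com.example.routines.heading-home")
        activity.title = "Run my heading-home routine"
        activity.isEligibleForPrediction = true
        activity.suggestedInvocationPhrase = "Heading home"

        let shortcut = INShortcut(userActivity: activity)
        let addShortcutVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        addShortcutVC.delegate = self
        present(addShortcutVC, animated: true)
    }

    // Called when the user finishes recording a phrase (or an error occurs).
    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true)
    }

    // Called when the user backs out without recording a phrase.
    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}
```

From there, saying the recorded phrase to Siri can kick off whatever chain of actions the app has wired up behind that shortcut.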
That’s a lot of options for a single feature, and knowing Apple’s drive for maximum user accessibility, it’s probably limited in some way. That’s where other app developers come in: for the first time, Apple will allow third-party applications to register actions with its mobile voice assistant. This will begin to appear as an “Add to Siri” button in a number of apps.
Tapping the button makes the action you just performed part of Siri’s suggestion service. Just bought a few dozen apples from Amazon, like you do every few weeks? Add it to Siri, and you’ll never have to remember it again: when the usual number of weeks runs out, Siri will prompt you to make the same purchase.
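Under the hood, this kind of suggestion appears to depend on the app “donating” the action to the system each time you perform it, so Siri can spot the repetition. Here’s a minimal, hypothetical Swift sketch of such a donation using the iOS 12 additions to NSUserActivity; the identifiers and product are invented, and a real storefront app might use a custom intent instead.

```swift
import UIKit
import Intents

// Donate the completed action to the system so Siri Suggestions can learn the
// pattern and offer it again later. The activity type and details are placeholders;
// this is not an Amazon API, just an illustrative example.
func donateReorderActivity(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.store.reorder-apples")
    activity.title = "Reorder a dozen apples"
    activity.isEligibleForPrediction = true           // lets Siri surface it as a suggestion
    activity.suggestedInvocationPhrase = "Reorder apples"
    activity.persistentIdentifier = "reorder-apples"  // lets the app delete the donation later

    // Attaching the activity to the visible view controller registers (donates) it.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```

Once enough of these donations pile up, the system has the signal it needs to suggest the reorder on the lock screen at the usual interval.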
Beyond this functionality, it is unclear what kind of access Apple will give third-party developers to its new super-AI software. One thing is clear: Siri is getting very smart, very quickly. But is its automation of your daily routines convenient, or a little eerie? How much will Siri grow in future iPhones? Current-gen AI products are quickly catching up with those of science fiction, and it will be up to consumers to decide whether this AI integration begins to hit a little too close for comfort.