Much of the media coverage of Apple’s Worldwide Developers Conference this week has focused on the HomePod, a seven-inch, off-white speaker that looks rather like a roll of toilet paper. And while it was certainly Apple’s central showpiece in terms of hardware, it somewhat obscured more important developments behind the scenes: namely, Apple’s push toward better artificial intelligence.
The HomePod uses Siri, Apple’s voice-controlled AI assistant, as its central user interface, and Siri is getting an upgrade. The system will now use machine learning and computer vision to dynamically present information relevant to a user’s interests: reporting traffic conditions along the usual morning commute, for example, or suggesting hotels while the user reads about a travel destination. And as Quartz reports, Siri will respond to users with entirely AI-generated sentences rather than canned dialogue.
Delving deeper into the back end, Apple also revealed a new API suite that lets developers incorporate powerful AI into their own apps. Called Core ML, it offers image recognition and natural language processing tools, all of which can run locally on a user’s iPhone, Apple Watch or, presumably, HomePod.
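Apple didn’t walk through code on stage, but a minimal sketch of Core ML’s image recognition, used through the companion Vision framework, might look like the following. The MobileNet model here is an assumption standing in for any .mlmodel file a developer bundles with an app; Xcode generates a Swift class of the same name from it.

```swift
import UIKit
import CoreML
import Vision

// A minimal sketch of on-device image classification with Core ML and Vision.
// "MobileNet" is a placeholder: Xcode generates a Swift class of the same name
// from whatever .mlmodel file is bundled with the app.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: MobileNet().model) else { return }

    // The request wraps the model; all inference happens locally on the device.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    // Vision scales and crops the input to match the model's expected size.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Because nothing leaves the device, the same call works offline, which is presumably how a HomePod would use it.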
As for the new device itself, Siri’s enhanced AI will enable the HomePod to ‘know’ where it’s positioned in a room and adjust its sound accordingly, and to control other connected devices such as lights and thermostats according to the user’s preferences. And given last autumn’s rumors that Apple was working to enable facial recognition for its forthcoming smart home device, the HomePod may turn out to be even smarter than it currently appears.
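That kind of device control runs over Apple’s existing HomeKit framework. As a rough illustration rather than anything Apple demonstrated, the sketch below switches on the first paired lightbulb it finds, assuming the user’s primary home already contains one.

```swift
import HomeKit

// A minimal sketch, assuming a lightbulb accessory is already paired in the
// user's primary home: find its power-state characteristic and switch it on.
final class LightSwitcher: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self  // homes are loaded asynchronously
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }
        for accessory in home.accessories {
            for service in accessory.services where service.serviceType == HMServiceTypeLightbulb {
                let power = service.characteristics.first {
                    $0.characteristicType == HMCharacteristicTypePowerState
                }
                // Writes go through HomeKit, which relays them to the device.
                power?.writeValue(true) { error in
                    print(error.map { "failed: \($0)" } ?? "light switched on")
                }
            }
        }
    }
}
```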
Sources: Quartz, Futurism, The Washington Post