When we first met Apple’s Siri assistant years ago, we learned how to interact with it in a very specific way: we asked it questions and it tried to answer.
Now Siri is changing pretty drastically from that original vision, based on how Apple talked about it at its WWDC developers’ conference on Monday.
Instead of Siri becoming “smarter” — that is, answering more complicated questions and holding more natural conversations with humans — it’s starting to learn more about how we use our iPhones and live our lives, and then making recommendations.
Google’s Duplex project is starting to let humans have natural conversations with artificial intelligence as if one were talking to another person. And you can ask Google or Alexa most questions and get an answer a lot of the time.
Siri, on the other hand, doesn’t seem to have as many answers, and frequently just tries to search the web.
Instead, it’s learning more about you, and letting you train it.
Apple talked about Siri Shortcuts during its WWDC 2018 keynote. It’s a clear result of Apple’s acquisition of a company named Workflow last year. Users will be able to customize the commands that Siri responds to, and developers can build support for those commands into their apps. If you say “Siri, I lost my keys,” for example, it can pinpoint them using an app like Tile.
Also, Siri might know that you’re running late to work since it knows that you typically leave the house at 8 am on Wednesdays, but also that you have a meeting at 9 am. Siri, in that instance, might recommend that you call your boss to let them know you’re running late.
This is where Siri seems to be going. Apple didn’t talk about “making Siri smarter,” in the sense that it’ll be able to answer more questions about the world. But it’s getting smarter about knowing you.
We’ll start to see these Siri changes in iOS 12, macOS Mojave, watchOS 5 and other Apple software releases later this year.