Siri Shortcuts

At WWDC this week, Apple announced a huge milestone for Siri: Siri Shortcuts. Back when Apple acquired the company behind Workflow, a powerful automation tool for iOS, there was much speculation about what would happen to it. Many feared it would disappear, never to be seen again, but over time the Workflow app continued to receive bug-fix updates. At WWDC, Apple showed off Siri Shortcuts alongside a new app called Shortcuts, and it is very clear that this app is the result of the Workflow acquisition. The layout is almost identical, but the key difference this time around is that it is much more deeply integrated with iOS.

The Shortcuts app itself should be pretty handy on its own: you’ll be able to build Automator-style “Shortcuts” that can trigger actions throughout the OS. Apple showed that third parties will also be able to integrate with this, so the benefits begin to snowball into something very interesting. Apple didn’t stop there, however. Combine all of this with the “Siri” side of things: Shortcuts will be able to trigger via custom voice phrases that the user assigns in Siri. Telling Siri “laundry time” could kick off a custom Shortcut the user created, for example. The icing on the cake is that, beyond just voice, Siri will recognize usage patterns for these Shortcuts and offer recommendations for them throughout the OS.
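Based on what was shown, third-party apps will be able to hand Siri a shortcut plus a suggested phrase, and the user records their own trigger phrase on Apple’s “Add to Siri” sheet. A minimal sketch, assuming the iOS 12 `INUIAddVoiceShortcutViewController` flow (the activity type, title, and phrase below are made-up examples, not anything Apple shipped):

```swift
import Foundation
#if canImport(IntentsUI) && canImport(UIKit)
import IntentsUI
import UIKit
#endif

// Hypothetical identifiers for illustration; a real app would declare
// the activity type in Info.plist under NSUserActivityTypes.
let laundryActivityType = "com.example.home.start-laundry"
let laundrySuggestedPhrase = "Laundry time"

#if canImport(IntentsUI) && canImport(UIKit)
// Presents Apple's "Add to Siri" sheet, where the user records
// their own trigger phrase for this shortcut (iOS 12+).
@available(iOS 12.0, *)
func presentAddToSiri(from host: UIViewController) {
    let activity = NSUserActivity(activityType: laundryActivityType)
    activity.title = "Start Laundry"
    activity.suggestedInvocationPhrase = laundrySuggestedPhrase
    let shortcut = INShortcut(userActivity: activity)
    let controller = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    host.present(controller, animated: true)
}
#endif
```

Note the phrase here is only a suggestion shown on the recording screen; the user is free to record whatever phrase they like.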

Some examples of Siri recommendations include Spotlight/Siri search: pull down on the home screen and it will present suggested shortcuts based on relevant context (time, location, etc.). Running late for a meeting? It may offer to text your boss. Going to the gym? It may recognize a gym Shortcut you set up to start a workout and listen to a podcast.

These recommendations can also be helped along by third-party apps, which can “donate” relevant data to Siri for it to analyze and find patterns in. Apple offers various ways to “nudge” Siri in the right direction if you want the user to see this information at certain times, events, and so on, but overall it sounds like it should be pretty capable of learning and offering useful suggestions.
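The “donation” mechanism appears to map onto the iOS 12 additions to `NSUserActivity`: an app marks an activity as eligible for prediction and makes it current each time the user performs the action, and Siri looks for patterns in those donations. A minimal sketch using the gym example above (the activity type, title, and phrase are hypothetical):

```swift
import Foundation

// Hypothetical values for the gym example; a real app would use its
// own reverse-DNS activity type declared in Info.plist.
struct ShortcutDonation {
    let activityType: String
    let title: String
    let suggestedPhrase: String
}

let gymDonation = ShortcutDonation(
    activityType: "com.example.fitness.start-workout",
    title: "Start Gym Workout",
    suggestedPhrase: "Gym time"
)

#if os(iOS)
// Donating is just making the activity "current" with the new
// iOS 12 prediction flags set; Siri then watches for usage patterns.
@available(iOS 12.0, *)
func donate(_ d: ShortcutDonation) -> NSUserActivity {
    let activity = NSUserActivity(activityType: d.activityType)
    activity.title = d.title
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true         // new in iOS 12
    activity.suggestedInvocationPhrase = d.suggestedPhrase
    activity.becomeCurrent()  // the caller must keep a strong reference
    return activity
}
#endif
```

Donating on every genuine use of the feature (rather than once) is what gives Siri the repetition it needs to surface a suggestion at the right time or place.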

Some other places these Shortcuts will come up include the lock screen and the Siri watch face on Apple Watch. This is particularly exciting, as many tasks will be able to be carried out in the background; the user simply needs to invoke them from the watch face or with a trigger phrase.

While some of these details are still unclear, Siri Shortcuts offer a huge leap in capabilities for Siri. What used to be a fairly limited set of SiriKit domains will now be opened pretty wide for third parties to do with as they please. It hasn’t been made clear whether apps will be able to request additional information, such as asking the user for input without opening the app, but if this isn’t available at launch, surely it won’t be far behind.

Since these trigger phrases will be programmed by the user, it sounds like they won’t be fine-grained enough to accept variable inputs. While it would be awesome to, for example, be able to say “order 2 pizzas” and have it parse each word to understand quantity and product, the reality sounds like it will be closer to “order pizza” and the app will guide you through the details.

One curiosity I’m anxious to see is how all of this fits into the Shortcuts app. Workflow is capable of requesting user input for variables, but what if this went a step further and allowed Siri not only to trigger on the phrase, in this case “order pizza”, but also to ask follow-up questions like “what toppings?”, “what size?”, and “how many?”? These would be more difficult, of course, and likely won’t be available immediately, but they would really complete the autonomous experience this concept is creating.

Earlier this year, Google showed off an AI feature, Duplex, that attempted to mimic a human being making phone calls on the user’s behalf. It was intended to book appointments and things of that nature, but ultimately came off a little too uncanny and creepy for a lot of people. While the idea is neat, it pushes the human/AI grey area too far. What Apple could accomplish with these new functions is an alternative to Google’s approach: a way for the user to quickly and effortlessly accomplish the same tasks, but without the creepy phone call.

We’ve already seen Apple integrate ride sharing and restaurant booking into Siri. No phone calls are needed; the user can quickly request what they need and get a result. Shortcuts has the potential to take this to the next level and let third parties build their own options for booking appointments, ordering food, and so on. While this will depend on companies building support, it would sidestep that uncanny valley while getting the same work done. It also leaves far less room for error: a properly developed app can anticipate the problems it may run into, whereas an AI has to hope it can respond when confronted with an unexpected human question.

Overall, Siri Shortcuts is still in its early days. We haven’t even seen the full Shortcuts app yet, so it’s hard to know exactly where Workflow ends and Shortcuts begins. The fact that we have any integration with iOS is a huge leap forward, let alone that third-party apps will be able to tie into not only the Shortcuts app but also Siri voice triggers and the various suggestions throughout the ecosystem. These will all continue to evolve into something much more capable that everyone can benefit from.


Aaron Dippner

Software engineer who loves to nerd out about technology, home automation, gadgets and everything else.
