Apple might bring an offline mode to Siri
According to various surveys and user feedback, Siri lags behind Google Assistant in several areas. Android users seem largely satisfied and can get things done on the go.
Siri typically works by listening for a user command when prompted and sending the anonymized speech data to Apple's servers, which first convert the audio to plain text, then interpret the command and send the result back to the user's iPhone or iPad. Speech recognition is computationally intensive, and it is offloaded from devices to Apple's servers because it cannot always be performed on a device such as an iPhone.
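The server-side round trip described above can be sketched roughly as follows; the function names and return values here are illustrative assumptions, not Apple's actual API.

```python
def handle_utterance(audio: bytes) -> str:
    """Hypothetical server-side Siri flow: audio in, answer out."""
    anonymized = anonymize(audio)          # strip identifying data before upload
    text = speech_to_text(anonymized)      # server step: audio -> plain text
    result = interpret_command(text)       # server step: parse intent, produce answer
    return result                          # result is sent back to the iPhone/iPad

# Stub implementations so the sketch runs end to end
def anonymize(audio: bytes) -> bytes:
    return audio

def speech_to_text(audio: bytes) -> str:
    return audio.decode()

def interpret_command(text: str) -> str:
    return f"OK: {text}"

print(handle_utterance(b"what's the weather"))  # → OK: what's the weather
```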
Apple proposes an onboard system of modules that handles digital assistant queries without connecting to the outside world. The collection of modules includes elements for speech synthesis, dialog processing, phonetic alphabet conversion based on a default vocabulary plus user-created data, and a natural language processing module, among other items.
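As a rough illustration of how such a self-contained module collection might be organized (every name below is hypothetical, inferred from the patent's description, not Apple's code):

```python
class OfflineAssistant:
    """Hypothetical on-device pipeline: no network access required."""

    def __init__(self, default_vocabulary: dict, user_vocabulary: dict):
        # Phonetic conversion draws on a default vocabulary plus user-created data
        self.vocabulary = {**default_vocabulary, **user_vocabulary}

    def to_phonetic(self, word: str) -> str:
        # Phonetic alphabet conversion module
        return self.vocabulary.get(word.lower(), word)

    def process_dialog(self, text: str) -> str:
        # Natural language processing + dialog processing modules (stubbed)
        if "timer" in text.lower():
            return "Timer set"
        return "Sorry, I can't help with that offline"

    def synthesize(self, reply: str) -> str:
        # Speech synthesis module (stubbed: returns text instead of audio)
        return f"[spoken] {reply}"

assistant = OfflineAssistant({"siri": "ˈsɪɹi"}, {})
print(assistant.synthesize(assistant.process_dialog("set a timer")))
# → [spoken] Timer set
```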
Apple’s patent application suggests it wants to bifurcate Siri into two systems, an on-device Siri and a server-side Siri, with the ability to determine which of the two produces a higher “usefulness score” for the user’s spoken request. If the device’s Siri has a higher usefulness score, or is the only available option, it will respond to the user; if the server’s response is available and better, that response will be used instead.
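The arbitration step the patent describes can be sketched as a simple comparison; the `Response` type, the `usefulness_score` field, and the example values are assumptions for illustration, not Apple's implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Response:
    text: str
    usefulness_score: float  # hypothetical 0.0-1.0 score from each Siri backend

def choose_response(on_device: Optional[Response],
                    server: Optional[Response]) -> Optional[Response]:
    """Pick the reply with the higher usefulness score,
    falling back to whichever backend actually answered."""
    if server is None:          # offline, or the server did not respond
        return on_device
    if on_device is None:
        return server
    return on_device if on_device.usefulness_score >= server.usefulness_score else server

# Example: the on-device answer wins when the server's score is lower
local = Response("Timer set for 5 minutes", 0.9)
remote = Response("Here's what I found on the web", 0.6)
print(choose_response(local, remote).text)  # → Timer set for 5 minutes
```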
The keys to making this a reality are on-device processing power and a sufficient database of locally stored knowledge capable of handling user requests. Even though Apple’s devices have the ability to process Siri requests in roughly as much time as it takes to send data over the internet and wait for a response, the company has historically suggested that it didn’t want to burden its mobile devices with the added processing and database storage demands necessary to make that happen.
However, Apple has been increasingly focused on bulking up its AI chips over the past two years, introducing and then dramatically improving the performance of Neural Engines within its A11 and A12 series processors. Adding a local Siri database would likely come with a major update to iOS, such as the expected release of iOS 13 at next June’s Worldwide Developers Conference.
Apple has also reportedly filed another patent for camera integration in the Apple Watch, which could eventually enable Face ID and FaceTime calls on the watch.