You probably expected major Siri upgrades in iOS 18.4, but none are available in Beta 2. Apple might still add some of these improvements in later iOS 18.4 betas; however, according to Mark Gurman, they’ve likely been delayed until iOS 18.5. For now, here are all the Siri features missing from iOS 18.4 Beta 2.
Features Missing in iOS 18.4 Beta 2
1. In-App Actions
Siri is set to become even more capable by performing actions across apps on your behalf. Beyond interpreting your personal data more effectively, it will complete tasks for you, all without requiring you to preconfigure shortcuts.
This advancement is enabled by App Intents, a framework that allows third-party apps to communicate their available functions to the system. For instance, a camera app can inform Siri that it can take a photo, a messaging app can offer to send a text, and a maps app can provide transit directions. These functions can also be combined: for example, you could have Siri extract flight details from a note and send them via Messages without any extra input.
Currently, users must manually create shortcuts and activate them with Siri. However, once this feature is introduced, Siri will automatically recognize app capabilities, eliminating the need for manual setup. Instead of configuring a shortcut in advance, you can simply say, “Send the note with my partner’s flight times to them via Messages.” Siri will complete the task immediately.
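To give a sense of how this works on the developer side, here’s a minimal, hypothetical sketch of an App Intent a messaging app might expose. The intent name, parameters, and dialog are invented for illustration; a real app would wire perform() to its own sending logic.

```swift
import AppIntents

// Hypothetical sketch: an intent a messaging app might expose so Siri
// can send a note's contents without a preconfigured shortcut.
struct SendNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Note"
    static var description = IntentDescription("Sends the text of a note to a contact via Messages.")

    // Parameters Siri can fill in from the user's request.
    @Parameter(title: "Note Text")
    var noteText: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's actual sending logic would go here (placeholder).
        return .result(dialog: "Sent the note to \(recipient).")
    }
}
```

Because the App Intents framework extracts intent metadata at build time, capabilities declared this way are visible to the system without the user having to create a shortcut first.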
2. Personal Context
Among Apple’s upcoming AI features, Personal Context for Siri stands out as one of the most advanced and highly anticipated. It improves Siri’s ability to understand your data, making the assistant significantly more useful for daily tasks. Apple has offered glimpses of its capabilities, and the ultimate goal is to make Siri more intuitive and personalized.
To accomplish this, Apple Intelligence builds a “semantic index” of user data, including photos, files, calendar events, notes, and shared links. When a user asks Siri a question, it can quickly retrieve relevant information from this index.
For instance, users can ask Siri to locate a message containing flight details their mom sent, whether via text or email. Additionally, Siri can check if there’s enough time to travel between meetings by analyzing calendar events, locations, and estimated travel times. If a user has taken a photo of their ID, Siri can extract details from it to assist in filling out forms.
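Apple hasn’t detailed how the semantic index is populated, but apps can already donate content to the on-device index through Core Spotlight. The sketch below is a hypothetical illustration of donating a note; whether Siri’s semantic index draws on these Spotlight donations specifically is an assumption, and the identifiers and attribute values are invented.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Hypothetical sketch: donating a note to the on-device Core Spotlight index.
let attributes = CSSearchableItemAttributeSet(contentType: .text)
attributes.title = "Mom's flight details"
attributes.contentDescription = "Flight AA123, arriving Friday at 6:40 PM"

let item = CSSearchableItem(
    uniqueIdentifier: "note-flight-details",
    domainIdentifier: "notes",
    attributeSet: attributes
)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```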
3. On-screen Awareness
Siri’s on-screen awareness will enable it to recognize and interact with the content displayed on your device. For instance, if a friend shares their new address via text, you can say, “Add this address to their contact card,” and Siri will extract the details directly from your screen. This functionality is powered by the App Intents framework, which lets developers make their app content available to Siri and Apple Intelligence.
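As a rough illustration of the developer side, here’s a hypothetical App Entity a contacts-style app might define so the system can reason about a contact card it is displaying. The type, fields, and query are invented for illustration, and the extra step of associating entities with the app’s current user activity is omitted.

```swift
import AppIntents

// Hypothetical sketch: an entity a contacts app might expose so Siri and
// Apple Intelligence can refer to a contact card shown on screen.
struct ContactCardEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Contact Card"
    static var defaultQuery = ContactCardQuery()

    var id: UUID
    var name: String
    var address: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)", subtitle: "\(address)")
    }
}

// Minimal query so the system can look entities up by identifier.
struct ContactCardQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [ContactCardEntity] {
        // The app's actual lookup would go here (placeholder).
        return []
    }
}
```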