Apple Intelligence: The Next Leap in AI for Your Devices

With the release of the iPhone 16 and 16 Pro, Apple has made its push into generative AI, known as Apple Intelligence, almost official. The iPhone 16 series ships without Apple Intelligence for now, but reporting from Bloomberg’s Mark Gurman points to Apple releasing the first Apple Intelligence features on Oct. 28.
Apple plans to bring Apple Intelligence to all of its operating systems, not just the iPhone.

The hype around Apple’s AI has been building since Apple Intelligence took center stage at the WWDC keynote in June and advanced further at the It’s Glowtime event in early September, where the iPhone 16 lineup was introduced.

Apple launched Apple Intelligence for public testing just days before the iPhone 16 hit stores. That included some of the promised AI writing tools, improvements to Siri, and some “connected” photo library features that, to be honest, I don’t really understand. The full rollout won’t come until later in the fall, and only for the subset of iPhones, iPads, and Macs with the necessary chipsets. Even then, it’ll be a beta you can opt into.

In other words, Apple Intelligence is effectively still a beta: it currently runs only on the public betas of iOS 18.1, iPadOS 18.1, and MacOS Sequoia 15.1.

Apple Intelligence in Beta

Apple is not unveiling its promised AI-powered upgrades all at once. Instead, the company has rolled out a beta version of Apple Intelligence, available only to public testers rather than widely distributed, so it can gather feedback from a small portion of its user base. Apple Intelligence’s suggested writing tools pop up in a document or email while a user is composing one.

The upgrade could prove as transformative as Microsoft’s Copilot, which promises to make ungainly, old-fashioned spreadsheet and presentation software far more streamlined and user-friendly; working in Word, Excel, and PowerPoint starts to feel more like asking a chatbot for help in a conversation.

The public beta has a setting in the Settings app that lets users turn on the Apple Intelligence features on their devices for testing. Apple says getting fully approved can take a while, up to several hours, which aligns with the idea that the public beta is more for testing and isn’t necessarily as streamlined as an actual release.

What we don’t know yet is whether the public release of the Apple Intelligence beta later this year will have a similar opt-in process.

Although some of Apple’s AI tools appear to be genuinely useful, the company is restricting their availability to a subset of its devices in the rollout later this year: iPhone 15 Pro models and later, plus certain iPads and Macs with Apple’s M-series chips. Hence, not everyone will be able to use these tools right away, and how capable they turn out to be in practice remains to be seen.

Apple Intelligence, often referred to as “AI for the rest of us,” integrates into your iPhone, iPad, and Mac to assist you in writing, completing tasks, and expressing yourself. It uses the personal context you have across your Apple devices to generate more specific recommendations for you. Apple also touts the feature as setting a brand-new standard for privacy in AI.

“Apple seems to be using this tactic as a way to distinguish its own AI efforts from those previously announced by competitors,” said Roger Entner, an analyst with Recon Analytics. “As an example, the company explained how Apple Intelligence can understand multiple factors like traffic, your schedule, and your contacts to help you determine whether you can make it to an event on time.”

Device Compatibility and Requirements for Apple Intelligence Beta

Apple Intelligence is currently in beta and available on the iPhone 16 series, the iPhone 15 Pro and 15 Pro Max, and iPads and Macs with an M1 chip or newer. The public release will arrive this fall as part of iOS 18, iPadOS 18, and MacOS Sequoia. However, any eligible device must have Siri enabled and its language set to US English.

Currently, Apple Intelligence is compatible with a list of devices that essentially represents the newest models in Apple’s lineup. The compatible devices are as follows:

  1. iPhone 16, 16 Plus, 16 Pro and 16 Pro Max
  2. iPhone 15 Pro and 15 Pro Max
  3. iPad Air with an M1 or M2 chip
  4. iPad Pro with an M1, M2 or M4 chip
  5. MacBook Air with an M1, M2 or M3 chip
  6. MacBook Pro with an M1, M2 or M3 chip
  7. Mac Mini with an M1 or M2 chip
  8. Mac Studio with an M1 or M2 chip
  9. iMac with an M1 or M3 chip
  10. Mac Pro with an M2 chip

Feature Rollout Timeline and Integration with Third-Party AI Services

Apple plans to release the first batch of Apple Intelligence features in beta next month as part of iOS 18.1, iPadOS 18.1, and MacOS Sequoia 15.1. When it comes to the next batch of features, Apple is fuzzier on the details, saying only that they will arrive in “the months to come.” In the meantime, owners of beta-compatible iPhones, iPads, or Macs can explore the first wave of features, provided they turn on Siri and set their device language to US English.
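
As a rough illustration of that language requirement, here’s a minimal Swift sketch that checks whether a device’s locale is set to US English using standard Foundation APIs. The function name is my own, and nothing here queries Apple Intelligence itself; it only looks at the locale settings mentioned above.

```swift
import Foundation

// Minimal sketch (not an Apple API): it only inspects the locale settings
// that the Apple Intelligence beta requires, not Apple Intelligence itself.
func isDeviceSetToUSEnglish() -> Bool {
    let locale = Locale.current
    let isEnglish = locale.language.languageCode?.identifier == "en"   // device language must be English
    let isUSRegion = locale.region?.identifier == "US"                 // region must be United States
    return isEnglish && isUSRegion
}

print(isDeviceSetToUSEnglish()
      ? "Language requirement met"
      : "Switch the device language to English (US) in Settings")
```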

Apple Intelligence doesn’t operate only on the device (iPhone, iPad, or Mac); it also runs on Apple silicon-powered servers in the cloud, a service Apple calls Private Cloud Compute. Whether your prompt or question is processed on-device or in the cloud depends on its nature. Apple Intelligence is not ChatGPT, nor does it run on OpenAI’s well-known service.
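
To make that on-device versus Private Cloud Compute split easier to picture, here’s a purely hypothetical Swift sketch of how a request could be routed. The types, token threshold, and logic are illustrative assumptions on my part, not Apple’s actual implementation or any published API.

```swift
import Foundation

// Hypothetical illustration only: Apple has not published this API or logic.
// The idea: lightweight prompts are handled on the device, while heavier
// requests are handed off to Apple silicon servers (Private Cloud Compute).
enum ExecutionTarget {
    case onDevice
    case privateCloudCompute
}

struct IntelligenceRequest {
    let prompt: String
    let estimatedTokens: Int      // rough size of the request
    let needsLargeModel: Bool     // e.g. long-form rewriting or complex reasoning
}

func route(_ request: IntelligenceRequest, onDeviceTokenLimit: Int = 1_500) -> ExecutionTarget {
    if request.needsLargeModel || request.estimatedTokens > onDeviceTokenLimit {
        return .privateCloudCompute
    }
    return .onDevice
}

let summary = IntelligenceRequest(prompt: "Summarize this email thread",
                                  estimatedTokens: 800,
                                  needsLargeModel: false)
print(route(summary))   // prints "onDevice" for this small request
```

The real system presumably weighs far more than prompt size, but the basic split, local processing first and the cloud for heavier work, is the behavior Apple has described.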

That said, Apple Intelligence is designed to tap into additional third-party artificial intelligence services, the first of which is OpenAI’s ChatGPT. Owners of Apple devices will be able to use ChatGPT conversationally in Siri as well as in writing-based tools across iOS 18, iPadOS 18, and MacOS Sequoia.

Apple Intelligence’s Core Focus

Apple Intelligence has three main focus areas: writing, images, and Siri. Of these, the writing capabilities appear to be the farthest along and the most useful right now. Apple Intelligence can handle most common writing chores: the tools can proofread text, rewrite it in different versions with different tones and wordings, or summarize sections of it.


The new Image Playground app will use prompt-based image-generation tools to create original images. And with the Apple Intelligence-driven Image Wand, you’ll be able to turn a rough sketch into a polished image that complements the notes around it.

You will also be able to create custom Genmoji with Apple Intelligence, straight from your keyboard. Better still, you can personalize them: pick someone from your Photos library and make a Genmoji that looks like them. Finally, in the Photos app, you’ll be able to make a custom memory movie from the prompts you provide.


Siri is set to undergo a significant transformation, both under the hood and on the surface. Expect a new look, a more robust natural language processing system, and the new ability to understand typed commands (instead of just dictation), which may allow for even more “personalized” context in terms of what the commands mean and how Siri should respond.

Apple is also granting Siri unparalleled power to manage not only your iPhone or iPad, but other connected devices as well. If all this works anywhere close to as well as it sounds, your “Siri as personal assistant” experience should really improve in the coming year.
