Apple announces Apple Intelligence at its keynote launch event

Apple’s announcement of “Apple Intelligence” at this year’s keynote launch marks a significant leap in integrating advanced generative models and personal context into its devices, enhancing usability and productivity.

Set to launch with iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 next month, Apple Intelligence aims to deliver a seamless experience by allowing users to interact with their devices in smarter and more intuitive ways. It will initially be available in US English, with support for other English localisations following in December and broader language options, including Chinese, French, Japanese, and Spanish, arriving over the course of 2025.

Images courtesy of Apple

Apple Intelligence is deeply integrated into the operating systems of iPhones, iPads, and Macs, leveraging the power of Apple’s latest chips to process language, images, and personal context efficiently. Many of the models run directly on-device, keeping personal data private and secure, and requests that require larger server-based models are handled through Apple’s Private Cloud Compute, which extends that privacy protection to the cloud.

Apple Intelligence introduces several new features, such as Writing Tools, which help users refine and summarise text across various apps, including Mail, Notes, and third-party applications. In Photos, users can create customised movies and search for specific moments using natural language. The new Clean Up tool removes distracting objects from photos without affecting the subject, enhancing photo-editing capabilities.
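For developers, Apple has said that Writing Tools appear automatically in standard text views, with new UIKit hooks to tune the experience. The snippet below is a minimal sketch, assuming the iOS 18 additions previewed at WWDC24 (the `writingToolsBehavior` property and `UIWritingToolsBehavior` options); it is illustrative rather than a definitive implementation.

```swift
import UIKit

// A minimal sketch of how a third-party app might tune Writing Tools on
// its own text view. Assumes the iOS 18 UIKit additions previewed at
// WWDC24 (writingToolsBehavior / UIWritingToolsBehavior); standard text
// views pick up Writing Tools automatically without any extra code.
final class NotesViewController: UIViewController {
    private let editor = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        editor.frame = view.bounds
        editor.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        // Opt in to the full inline Writing Tools experience
        // (use .limited for the overlay panel, or .none to opt out).
        editor.writingToolsBehavior = .complete

        view.addSubview(editor)
    }
}
```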

Apple Intelligence also extends to Notes and the Phone app, where users can record, transcribe, and summarise audio. When a call is recorded, all participants are notified, and after the call, the system generates a summary of key points. Mail and notifications are also enhanced, with summaries that prioritise important messages and reduce interruptions by surfacing only time-sensitive information.

Siri is another key beneficiary of Apple Intelligence. It becomes more flexible and intuitive, with a redesigned interface and improved natural language understanding. Siri can follow conversations more fluidly, maintain context across multiple requests, and provide more detailed answers about device features. Users can now switch seamlessly between text and voice commands, allowing them to multi-task with ease.
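Much of Siri’s ability to act across apps is expected to build on Apple’s existing App Intents framework. The sketch below is a hypothetical example of how a third-party app might expose an action Siri could invoke; the intent name and placeholder logic are illustrative and not part of Apple’s announcement.

```swift
import AppIntents

// A hypothetical sketch of how a third-party app might expose an action
// that Siri can invoke through Apple's existing App Intents framework.
// The intent name and its placeholder logic are illustrative only.
struct SummariseNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarise Note"

    @Parameter(title: "Note Title")
    var noteTitle: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would fetch the note and produce a summary here;
        // this placeholder keeps the sketch self-contained.
        return .result(dialog: "Here is a quick summary of \(noteTitle).")
    }
}
```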

Why is this important?

Alongside the new hardware announcements for the iPhone 16 and Apple Watch Series 10, Apple confirmed that more Apple Intelligence features are slated for release in the coming months. These include Image Playground for creating visual content and Image Wand, which transforms rough sketches into polished images. Emoji creation will also evolve, with users able to generate customised Genmoji from text descriptions or photos.

Apple Intelligence is also set to develop over time, with Siri gaining the ability to draw on more personal context for a more tailored experience.

An important aspect of this update is the integration of ChatGPT’s capabilities into Apple Intelligence. Users can access ChatGPT’s knowledge and document-understanding features through Siri and Writing Tools, with strong privacy measures in place. Apple ensures that user data remains private, with IP addresses obscured and no requests stored by OpenAI.

This update is significant because it represents Apple’s push to integrate artificial intelligence seamlessly into everyday tasks. By doing so, Apple is enhancing device functionality, improving user productivity, and maintaining a strong focus on privacy and security—key concerns in today’s digital landscape.

The release of Apple Intelligence is a major step forward for the tech giant in making AI more accessible and practical for users across its ecosystem of new devices.
