Wednesday, September 11, 2024

The New iPhone 16 and Its Apple Intelligence Features Will Help You Write Messages With a Friendly Tone

At its annual product reveal event, Apple unveiled the next generation of iPhones, Apple Watches, and AirPods. The company also showed off new features for Apple Intelligence, its package of generative AI-powered offerings.

Apple CEO Tim Cook began the presentation with the Apple Watch, which the company first announced a decade ago. The latest model, the Apple Watch Series 10, sports the device's largest screen yet, making text more legible at a glance. Apple COO Jeff Williams added that the Series 10 is the company's thinnest and lightest watch to date, and said it will start at $399.

Cook then moved on to AirPods, the company's mega-popular wireless earbuds. In addition to improved audio quality, the next generation of AirPods will feature Active Noise Cancellation. The AirPods Pro, meanwhile, received a health-focused update that will let users test their hearing, monitor themselves for signs of hearing loss, and use their AirPods as a professional-grade hearing aid, although Apple has not yet received clearance from the FDA.

Finally, Cook introduced the iPhone 16 line of phones, which he said has been designed "from the ground up" to take advantage of generative AI. The new models include an "action button" that can be customized to open different apps at different times of the day, and a touch-sensitive control that instantly opens the camera and lets users make adjustments while in the photo app.

Craig Federighi, Apple's SVP of software engineering, showed off the iPhone 16's marquee feature: Apple Intelligence. The hardware within the iPhone 16 is designed to run generative models directly on the device, falling back to cloud compute for more ambitious requests (a hypothetical sketch of this routing appears at the end of this post). One feature sure to appeal to entrepreneurs is summarization, which condenses long texts and emails. Apple Intelligence can also help with outgoing messages: Federighi showed how the on-device models can rewrite texts and emails in a different tone or create custom emojis. Here's more on those features.

Federighi also introduced a new feature called "visual intelligence," which lets users learn more about the world around them through their phone's camera. In one example, an iPhone owner took a photo of a restaurant, and his phone seamlessly pulled up the restaurant's hours and surfaced a link to make a reservation. (This is all similar to what Google offers with its Lens tool.) In another example, a user took a photo of a concert poster, and the phone instantly offered to create a calendar entry for the concert. Federighi said Apple Intelligence will be available in English starting in October, with more languages to follow.
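To make the on-device/cloud split concrete, here is a minimal sketch of how such routing could work in principle. Apple has not published an API for this, so every name below (ModelRoute, RewriteRequest, route) is invented purely for illustration; the real system's decision logic is not public.

```swift
// Hypothetical sketch only: Apple has not published an API for this routing,
// so all types and names here are invented for illustration.

enum ModelRoute {
    case onDevice      // small local model: private and fast
    case privateCloud  // larger server-side model for heavier requests
}

struct RewriteRequest {
    let text: String
    let tone: String   // e.g. "friendly", "professional"
}

// Illustrative heuristic: handle short rewrites locally and send longer,
// more ambitious ones to cloud compute, mirroring the split Apple described.
func route(_ request: RewriteRequest) -> ModelRoute {
    request.text.count < 2_000 ? .onDevice : .privateCloud
}

// Usage example: a short message stays on the device.
let request = RewriteRequest(text: "hey, running late, be there soon",
                             tone: "friendly")
print(route(request))  // onDevice
```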
