Apple’s AI play with a difference: visionOS, Journal, video calling, AirPods



It was never on the cards that Apple would pepper its keynote with the phrase “AI” the way Google and Microsoft recently did, or launch a chatbot simply because everyone else has. But for anyone who feels the company isn’t as focused on artificial intelligence (AI) as the other tech giants, the WWDC 2023 announcements should put those doubts to rest. Apple has made extensive use of AI across the upcoming editions of iOS, iPadOS, and macOS, and in several heavily reworked apps. And then there’s the Apple Vision Pro augmented reality (AR) headset, which also leans on the smarts of neural networks.

Apple has made extensive use of AI across the upcoming editions of iOS, iPadOS, and macOS. (HT photo)

The iPhone’s upcoming operating system, iOS 17 (due for release later this year), reworks quite a few of Apple’s own apps. One of them is the Phone app, specifically the voicemail feature. If you’ve drifted away from voicemail, this might win you back: Live Voicemail transcribes a message in real time as the caller leaves it, and you can pick up the call mid-message if it turns out to be important. Transcription happens entirely on the device, according to Apple.
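Apple hasn’t published how Live Voicemail works internally, but its public Speech framework illustrates the same on-device principle. A minimal sketch, assuming a recorded audio file (the function name and URL here are placeholders, not Apple’s implementation):

```swift
import Speech

// Illustrative sketch only: Live Voicemail's internals are private.
// The public Speech framework shows the same on-device approach.
func transcribeOnDevice(voicemail: URL) {
    // Assumes SFSpeechRecognizer.requestAuthorization has already been granted.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechURLRecognitionRequest(url: voicemail)
    request.requiresOnDeviceRecognition = true   // no server round trip
    request.shouldReportPartialResults = true    // live, incremental transcript

    recognizer.recognitionTask(with: request) { result, _ in
        if let result {
            print(result.bestTranscription.formattedString)
        }
    }
}
```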


If you’re still skeptical, consider the list. Autocorrect now draws on a transformer language model for better word- and sentence-level corrections with improved grammar, and dictation gets a new transformer-based speech recognition model for more accurate voice input. Audio messages in iMessage are transcribed, a new Journal app uses context to make smart suggestions, and StandBy mode surfaces time- and context-relevant information. The next release of iPadOS uses machine learning (ML) to synthesize additional frames for Live Photos on a customizable lock screen, producing a smooth slow-motion effect.

Is this the end of the “ducking” era? You’ll find out soon enough.
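Third-party apps don’t get direct access to the autocorrect model itself, but iOS 17 does let any text field opt in to the new model-driven inline predictions. A minimal UIKit sketch:

```swift
import UIKit

let field = UITextField()
field.autocorrectionType = .yes          // transformer-backed autocorrect
if #available(iOS 17.0, *) {
    field.inlinePredictionType = .yes    // greyed-out inline suggestions
}
```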

Adaptive Audio technology will soon be added to Apple’s AirPods wireless earbuds. The technology uses machine learning to decode the user’s current (and often rapidly changing) environment and dynamically blend Transparency mode with Active Noise Cancellation. On public transport, for instance, just enough transparency (for those specific frequencies) is let through so you don’t miss any announcements.

We’ll see how well Adaptive Audio copes with different noise levels and noise compositions once the feature ships and is in wide use. In theory, if someone comes over to talk to you, you’ll hear them, while much of the noise around you stays blocked out. At least that’s the premise.
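There is no public API for steering AirPods noise control, so the following is pure illustration of the idea: Apple’s real SoundAnalysis framework classifies ambient audio, while the label strings and setNoiseControlBlend are hypothetical stand-ins for blending that actually happens inside the AirPods firmware.

```swift
import SoundAnalysis

// Pure illustration: Apple exposes no public API for AirPods noise control.
// SoundAnalysis (real) classifies ambient audio; the case labels and
// setNoiseControlBlend below are hypothetical stand-ins.
final class AdaptiveBlendObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        switch top.identifier {
        case "speech":
            setNoiseControlBlend(transparency: 0.9)  // let voices through
        case "train", "subway":
            setNoiseControlBlend(transparency: 0.5)  // keep announcements audible
        default:
            setNoiseControlBlend(transparency: 0.1)  // mostly cancel
        }
    }
}

// Hypothetical: the real blending happens inside the AirPods firmware.
func setNoiseControlBlend(transparency: Double) {}
```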

On the visionOS-powered Apple Vision Pro headset, AR relies heavily on AI and machine learning to deliver the immersive, private, and pervasive experiences the platform is aiming for. Optic ID, the AR counterpart of the iPhone’s Face ID biometric authentication technology, uses complex algorithms to process iris data on the device. It unlocks access to apps on visionOS, and authorizes App Store purchases and Apple Pay transactions.
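Third-party developers never touch the iris data itself; on visionOS the existing LocalAuthentication framework simply resolves its generic biometrics policy to Optic ID, just as it resolves to Face ID or Touch ID elsewhere. A minimal sketch:

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// On visionOS this policy is satisfied by Optic ID.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your journal") { success, authError in
        print(success ? "Authenticated" : "Failed: \(String(describing: authError))")
    }
}
```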

Apple confirms that all camera data collected by the Apple Vision Pro headset is also processed on the device. Think about that for a moment: this is data collected by the headset’s 12 cameras, five sensors, and six microphones.

Last but not least is the Journal app launching with iOS 17. As the name suggests, it’s a journaling app, and the tech giant sees it as a health extension to its fitness, sleep, and breathing apps. It uses a battery of algorithms to pull together data from your contacts, photos, music, location, and more to curate personalized suggestions. You retain full control, however, over what the app can access when curating those suggestions.
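Apple also announced a suggestions API so third-party journaling apps can tap into the same curation. A minimal SwiftUI sketch, assuming the JournalingSuggestions framework as it later shipped; the picker runs as a system sheet, so an app only ever sees the single suggestion the user explicitly picks:

```swift
import SwiftUI
import JournalingSuggestions

struct MomentPicker: View {
    @State private var title = ""

    var body: some View {
        JournalingSuggestionsPicker {
            Text("Add a moment")
        } onCompletion: { suggestion in
            // The system sheet runs out-of-process; the app receives
            // only the suggestion the user explicitly chooses.
            title = suggestion.title
        }
    }
}
```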

It was never realistic to expect Apple to announce a chatbot like OpenAI’s ChatGPT or Google’s Bard at WWDC 2023, even though plenty of people, particularly in social media conversations, seemed to expect one. But Apple’s path has always been different: AI is what makes the experience of using apps and features radically smarter, a means to an end rather than the headline act. Given the examples above, it’s safe to say the mission succeeded, at least in terms of laying the groundwork.


