Meta made some big AI moves this week, and a few of them were very expensive. According to Bloomberg, Meta spent a whopping $200 million to poach a single person from Apple: Ruoming Pang, who led development of the large language models behind Apple Intelligence. That news arrived amid a sudden surge in AI talent acquisition, and a handful of OpenAI employees also jumped ship to Meta this week.
In one sense, that should be expected. AI is all the rage these days, and seemingly every company with a big enough war chest is pouring its spare change into becoming the next big thing in chatbots and AI slop. But for Meta in particular, these moves could amount to more than ho-hum spending to get its AI efforts back on track. In fact, they could be a huge boon for one of the most exciting gadget categories around. I'm talking about smart glasses.
AI is clearly being thrown into a lot of things right now (movies, games, web search). And while it's not ideal for every task it's thrown at, there's one area where it could actually make a big impact (and already has begun to). As promising as smart glasses like Meta's Ray-Bans are at this early stage, they're incredibly limited at best and, at worst, feel downright clunky. Many of those drawbacks come down to the UI. Unlike devices with displays, smart glasses have only one real option for native input: a voice assistant. The problem is that many voice assistants stink. They're fine for basic tasks, but ask them for anything more than "play music" and things tend to get choppy fast.
Advances in large language models (LLMs) like the ones powering ChatGPT could change all of that, however. LLMs are inherently good at parsing natural language prompts and are far more capable of handling advanced, multi-step commands. If there's one way to make smart glasses feel more sophisticated right now, improving the voice assistant would be it. Not only that, it may be the *only* way to do so at the moment. As great as a complex UI like Apple's Vision Pro's is (it uses a surprisingly deft mix of eye and hand tracking), cramming all the hardware needed for that kind of interface into a form factor you'd still call "glasses" is a long way off.
For proof of that problem, look no further than Meta's Orion concept, which requires a fairly large compute puck to offload processing from the glasses themselves, since they can't bear the weight (in this case, literal weight) of being a full computer. For now, the problem of pushing capabilities forward while shrinking smart glasses down remains unsolved, and as a result, hardware companies have to get creative about how they approach that conundrum. In this case, the approach is all about AI, and Meta's millions may have given it a leg up this week.
