Meta’s Connect 2024 event, held in September 2024, was nothing short of extraordinary in terms of developments in mixed reality and wearable technology. The event witnessed a slew of new launches that offered a glimpse into the exciting future of artificial intelligence and next-generation computing platforms at Meta. Meta added several new AI features to the Ray-Ban smart glasses, which now boast the ability to remember things. Meta’s CEO, Mark Zuckerberg, also unveiled a fully functioning prototype of Orion, the company’s first consumer-grade, fully holographic augmented reality (AR) glasses. Meta AI with voice, automatic dubbing, and lip-syncing were among the major AI features announced at the conference.

Meta Quest 3S

Meta introduced the all-new Quest 3S headset, powered by the Llama 3.2 model. The Quest 3S is a toned-down version of the Quest 3, priced at around $300, lower than its predecessor, and is available with 128GB or 256GB of internal storage. Meta claimed that the Quest 3S would be ideal for anyone making a first foray into mixed reality or simply looking for an affordable upgrade over the Meta Quest or Meta Quest 2 headsets. Meta has incorporated the redesigned Meta Horizon OS for spatial computing, which provides better support for 2D apps like YouTube, Facebook, and Instagram, a change touted to enhance the immersive experience.

Ray-Ban Meta Smart Glasses

Meta has added new features to its already popular and proven product, the Ray-Ban Meta smart glasses. The upgraded glasses can remember details about the user, and the user can ask them to send messages on WhatsApp and Messenger completely hands-free. Meta also announced that it is adding video support to Meta AI, which will enable the glasses to act as a companion that offers insights while the wearer explores new places. The glasses can also translate foreign speech and let the user hear the translation. A transparent edition that showcases the internal technology will be launched as well. Meta further highlighted a partnership with the app Be My Eyes to help visually impaired people: the smart glasses will be able to see objects and describe them aloud.

Unveiling of the Meta Orion AR Prototype Glasses

Meta has finally revealed its much-anticipated AR glasses, Orion. The glasses use large holographic displays to place 2D and 3D content into the physical world while the wearer sees through the lenses. Meta has also harnessed its AI expertise to bring contextual AI to the device. Unlike bulkier headsets, the glasses have a sleek, lightweight design. Orion is equipped with an input and interaction system that combines voice, eye tracking, and hand tracking to manage tasks in the physical environment. Meta describes Orion as a “product that combines the benefits of a large holographic display and personalised AI assistance in a comfortable, all-day wearable form factor.” The glasses use a new display architecture with lenses made of silicon carbide to place holograms at different depths and sizes. Orion supports voice input to its AI, hand tracking, eye tracking, and a neural interface: a wristband that lets the user control things through neuromuscular signals read from the wrist.

Llama 3.2 AI Model

Llama 3.2 is a collection of AI models comprising medium-sized vision LLMs (11B and 90B) along with lightweight text-only models (1B and 3B) that can run on mobile devices. The major advantage of this free AI model family lies in its visual capabilities, which will enable various use cases in VR and robotics. Zuckerberg pointed out, “This is our first open-source, multimodal, and it’s going to enable a lot of interesting applications that require visual understanding.” He also noted that Llama 3.2 could be a significant asset for the company, given its potential to be leveraged by consumer-facing apps like WhatsApp and Instagram.