META’S LATEST NEXT-GEN RAY-BAN SMART GLASSES CAN MAKE ANYONE A SMART PERSON

Meta’s CEO Mark Zuckerberg addresses the crowd at the Meta Connect Keynote on 9/27/23. Photo by Marcus Siu.

Article by Marcus Siu

Do you remember the scene in the classic 1999 science fiction thriller “The Matrix” when Neo (Keanu Reeves) asks Trinity (Carrie-Anne Moss) if she can fly the B-212 helicopter? She responds “not yet,” calls Tank, the operator, and says, “Tank, I need a pilot program for a B-212 helicopter. … Hurry!” As Tank finds the program and uploads it into Trinity’s mind, she re-opens her eyes and says to Neo, “let’s go!”

Don’t we all wish we could do something magical like that?

As Neo and Trinity fly the helicopter to rescue their friend Morpheus, wouldn’t it be great to experience firsthand, with your own eyes and ears, what they are experiencing? Maybe even in real time?

Well, Meta seems to be heading in that direction with its latest next-gen Ray-Ban smart glasses, except that instead of “Tank” as the operator, the operator is “Meta”…and yes, with the help of Meta A.I., you can now experience someone else’s point of view through the new glasses via live streaming.

Though Meta’s smart glasses won’t let you fly a helicopter on the spot like Trinity did in “The Matrix,” they might let you do simpler things that you couldn’t do in the past. For example, you will soon be able to fix a leaky faucet…but more on that later.

Compared to the first generation of Meta smart glasses, “Ray-Ban Stories,” which Meta introduced two years ago, the latest model is a huge leap forward in technical specs and design.

The next-gen model has an upgraded 12MP camera that produces superior images and 1080p HD video, powered by the Qualcomm Snapdragon AR1 Gen1 platform. It also has all-new custom-designed speakers with twice the bass of the original, 50% higher maximum volume, and improved directional audio with reduced audio leakage, making for better calls, music, and podcasts, even in noisy or windy environments.

The redesigned audio system has a five-microphone array that captures sound exactly as it was experienced, so you can relive the moment anytime you want. The glasses are also a lot lighter, so they are much more comfortable than their predecessor.

Beyond the much-improved physical and technical specifications, though, the real selling point is Meta A.I. These next-gen smart glasses are the very first Meta smart glasses to include it.

“I think that smart glasses are gonna be an important platform for the future, not only because they’re the natural way to put holograms in the world and put digital objects in our physical space, but also because smart glasses are the ideal form factor for you to let an A.I. see what you’re seeing and hear what you’re hearing…” – Mark Zuckerberg

Presently, right out of the box, wearers can ask Meta A.I. basic questions and request recommendations or even translations, much as Google or Alexa would answer, except that you address the assistant as “Meta.” And because the glasses travel with you wherever you go, rather than requiring you to be near a Google Home or Amazon Alexa device, Meta A.I. is far more convenient to use: you can always be connected wherever you go.

But that’s just the beginning of Meta A.I.

Next year, according to Meta’s co-founder and CEO, Mark Zuckerberg, Meta plans to issue a free software update that will make the smart glasses multimodal, meaning the Meta A.I. assistant will understand what you are looking at while you ask your question. For example, Meta A.I. could identify a building, landmark, or monument you are looking at and give you a brief history or description of it, just as a personal tour guide would; translate a sign that isn’t in English; or, as I mentioned earlier, look at a leaky faucet and tell you how to fix it.
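
To make “multimodal” a little more concrete, here is a minimal sketch in Python of what such a request looks like conceptually: one payload that carries both an image (what the glasses see) and a text question, so the assistant can answer based on both. The model name and payload structure are hypothetical placeholders for illustration, not Meta’s actual API.

```python
import base64
import json


def build_multimodal_request(image_bytes: bytes, question: str) -> dict:
    """Hypothetical sketch: pair a camera frame with a spoken question
    in a single request, so the assistant can ground its answer in both."""
    # Encode the frame so it can travel in the same payload as the text.
    image_b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": "hypothetical-vision-assistant",  # placeholder, not a real Meta model name
        "inputs": [
            {"type": "image", "data": image_b64},
            {"type": "text", "data": question},
        ],
    }


if __name__ == "__main__":
    # The leaky-faucet scenario from the article, expressed as one combined query.
    payload = build_multimodal_request(
        b"<camera frame bytes>",  # placeholder for what the glasses would actually capture
        "How do I fix this leaky faucet?",
    )
    print(json.dumps(payload, indent=2)[:300])
```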

Continuous updates will only make it smarter and smarter.

“Multimodal” will be the word to know in next year’s update for the Ray-Ban Meta glasses. Photo by Marcus Siu

Even after seventy-five years, the classic Ray-Ban Wayfarer never seems to go out of style. It was made famous by iconic Hollywood stars such as James Dean in “Rebel Without a Cause,” Audrey Hepburn in “Breakfast at Tiffany’s,” and Tom Cruise in “Risky Business,” not to mention Will Smith and Tommy Lee Jones in the “Men in Black” movie franchise.

Meta added Ray-Ban’s “Headliner” frame to its new smart-glasses lineup for those who prefer a more retro design. Prescription lenses are also available for those who already wear glasses. Overall, there are at least 150 different frame and lens combinations to choose from.

I was lucky enough to experience a full demo of the Ray-Ban Meta glasses at Meta Connect last month. I was quite impressed, and I believe Zuckerberg will probably be correct in predicting that Meta A.I. smart glasses will be the next big platform for the masses.

It’s incredible what the glasses can do in such a compact form: listening to music and podcasts, or just taking phone calls. Considering they weigh only 1.7 ounces, the speakers sound surprisingly clear and accurate, even outdoors. The audio also doesn’t drown out the sounds of the environment around you; it seems to blend naturally into the physical soundscape.

With five microphones, the glasses seamlessly record what you are experiencing as realistically as possible. The multichannel surround comes not just from the left and right but from in front, behind, and to the sides, adding to its immersive realism during playback. I was able to play back my recorded experience and felt that I was reliving it all over again. Talk about déjà vu.

Without even touching a button on the smart glasses, all the wearer has to say is “Meta, shoot photo,” “Meta, shoot video,” or even “Meta, send photo to mom.” The glasses capture exactly what you see through them (your POV) and send the experience on command. As an option, you can also live-stream it in real time to your friends or audience on Instagram or Facebook.

Gone are the days of abruptly fumbling with one’s smartphone, trying to locate the camera (or video) app for that quick photo you don’t want to miss.

Sharing the experience of going down a ski slope, taking a hiking trip, or standing on top of a mountain will bring brand-new immersive experiences to others. Other moments that might be great to document include meeting a person for the first time, going through a petting zoo, or just walking through a crowd of people in the street. The possibilities are endless…though recordings are limited to 60 seconds of HD video. Of course, birthday parties, weddings, graduations, and traditional gatherings of any kind are always welcome as well.

The glasses come with 32GB of storage, which is equivalent to roughly 500 photos or one hundred 30-second HD video clips. The camera shoots in portrait orientation and has video stabilization built in.
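
For a rough sense of why those numbers fit, here is a quick back-of-the-envelope estimate in Python. The per-photo size and video bitrate below are assumptions based on typical 12MP JPEGs and 1080p HD footage, not published Meta figures; the takeaway is simply that 500 photos and a hundred 30-second clips sit comfortably within a 32GB budget.

```python
# Back-of-the-envelope storage estimate; file sizes are assumed typical values,
# not official Meta specifications.
photo_size_mb = 5          # assumed size of a 12MP JPEG, in megabytes
video_bitrate_mbps = 15    # assumed 1080p HD bitrate, in megabits per second
clip_seconds = 30

photos = 500
clips = 100

photo_total_gb = photos * photo_size_mb / 1024
clip_size_gb = video_bitrate_mbps * clip_seconds / 8 / 1024  # megabits -> megabytes -> GB
video_total_gb = clips * clip_size_gb

print(f"~{photo_total_gb:.1f} GB for {photos} photos")          # ~2.4 GB
print(f"~{video_total_gb:.1f} GB for {clips} short clips")      # ~5.5 GB
print(f"~{photo_total_gb + video_total_gb:.1f} GB of the 32 GB available")
```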

Thanks to Meta A.I., exciting times are ahead of us for wannabe helicopter pilots and plumbers.

The many options you can choose from, on display at Meta Connect 2023. Photo by Marcus Siu.

About mlsentertainment

Bay Area photojournalist, Northern California, United States. Promoting the lively film and music scene, mainly around the Bay Area, as well as industry and technology events.