Officially, the Meta AI beta for Ray-Ban Meta Smart Glasses is only available in the US and Canada, but today I noticed that it's available in the UK in my Meta View app, so I'm putting the feature to the test with a stroll in and around London Paddington, close to where our office is located.
Using the glasses' inbuilt camera and an internet connection, I can ask Meta AI questions just as you would with any other generative AI – such as ChatGPT – with the added benefit of providing an image for context. I simply start a prompt with "Hey Meta, look and…"
Long story short, the AI – when it works – is pretty impressive. It wasn't 100% perfect, struggling at times due to its camera limitations and information overload, but I was pleasantly surprised by its capabilities.
Here's a play-by-play of how my experience went.
I'm on an AI adventure.
Stepping outside my office, I began strolling with my Meta AI-powered specs and immediately asked my smart glasses to do two things: identify a tree along the road, and then summarize a sign detailing the area's parking restrictions, which contained a long list of information.
On the tree task it failed straight away – a few exploratory beeps before returning to silence. Great start. But with the sign, Meta AI was actually very helpful, telling me succinctly (and accurately) that I'd need a permit to park there or risk paying a hefty fine – a summary that saved me the five minutes I'd otherwise have spent deciphering the sign myself.
After this mixed success, I continued walking towards Paddington Station. To pass the time, I asked questions about London as if I were a tourist. It provided some interesting facts about Big Ben – reminding me that the name refers to the bell, not the famous clock tower – but it admitted that it couldn't tell me whether King Charles III currently lives in Buckingham Palace, or whether I could meet him.
Admittedly, this is a difficult one to judge even as a human. As far as I can tell he lives at Clarence House, which is close to Buckingham Palace, but I haven't been able to find a definitive answer. So I'll mark this test as inconclusive and appreciate that the AI at least admitted it didn't know rather than hallucinating (the technical term for when an AI makes things up or lies).
I also retried my initial tree test with a different plant. This time the glasses said they believed it was a deciduous tree, although they couldn't say exactly what species I was looking at.
When I got to the station I performed a few more look-and-ask tests. The glasses correctly identified the station as Paddington, and in two out of three tests Meta AI correctly used the departures board to tell me what time the next train was leaving for various destinations. It fumbled the third test, though: it missed the next train to Penzance and instead told me about a later service on a completely different route to Bristol.
Before heading back to the office, I popped into the station shop to use the feature I was most eager to try – asking Meta AI to recommend a dinner I could cook based on the ingredients in front of me. Unfortunately, the abundance of groceries seemed to confuse the AI, and it wasn't able to provide any suggestions. I'll have to see if it fares better with my less crowded fridge at home.
When it's right, it's scary good
On the return trip, I gave the smart glasses one last test. I asked Meta AI to help me navigate the complex Tube map outside the entrance to the London Underground, and this time it gave me the most impressive answer of the bunch.
I asked the glasses a few questions to help me locate different Tube stations among the map's vast collection, and the AI was able to point me to the correct general area each time. After a handful of requests I said, "Hey Meta, look and tell me where Wimbledon is on this map."
The glasses replied that they couldn't see Wimbledon (probably because I was standing too close for the camera to capture the whole map) but said it should be somewhere in the southwest region – which it was. It might not sound like a standout answer, but this level of understanding – being able to accurately assemble an answer from incomplete data – was impressively human-like. It felt like I was talking to a local.
If you have a pair of Ray-Ban Meta smart glasses, I'd recommend seeing if you can access Meta AI. Those of you in the US and Canada certainly can, and those of you elsewhere might be as lucky as me and find the beta available. The easiest way to check is to just say "Hey Meta, look and…" and see how it responds. You can also check the Meta AI settings in the Meta View app.
A glimpse into the AI wearable future
There are plenty of scary realities in our upcoming AI future – just read our very own John Loeffler's excellent teardown of Nvidia's latest press conference – but today's test highlighted the usefulness of wearable AI, especially after the recent disasters of some other gadgets in the space.
For me, the biggest advantage the Ray-Ban Meta Smart Glasses have over something like the Humane AI Pin or the Rabbit R1 – two wearable AI devices that have been panned by just about every tech critic – is that they aren't just an AI companion. They're also open-ear speakers, a wearable camera and, not least, a stylish pair of shades.
Admittedly, I'll be the first to tell you that the Ray-Ban Metas need work in every department other than design.
The open-ear audio performance can't compare to my JBL SoundGear Sense or Shokz OpenFit Air headphones, and the camera isn't as crisp or easy to use as my smartphone's. But the combination of all these features in a single package makes the Ray-Bans at least a little bit nifty.
At this early stage, I'm still not convinced that the Ray-Ban Meta smart glasses are something everyone should own. But if you're desperate to pick up an AI wearable in this early-adopter phase, they're by far the best example I've seen.