Google received a big win in the patent department today regarding Google Glass. Patent application 20130070338 is probably the most important filing for the search giant's wearable tech, because it covers the eyepiece itself. You know, the part the whole Project Glass is centered around. The claim details how the HUD will work and interact with the user, and that's just one of 28 claims. If you have some free time, you can read the full technical document covering all 28 claims here.
I'm sure the folks at Mountain View are sleeping easier knowing that Glass is protected, making it much harder for anyone to come after them for patent infringement. It's a safe bet that Google will waste no time moving forward with the project, considering that it took the Patent Office over a year to process the filing. With the Explorer Edition having just passed through the FCC, this win should certainly put some pep in Google's step.
source: Patent Bolt
Google put everyone with less-than-perfect vision at ease today by announcing that Glass will work with prescription lenses. Google posted a picture of Glass team member Greg Priest-Dorman sporting such a pair. The news is not exactly a surprise, as a prescription-lens-equipped set was spotted in NYC earlier this year, but it's nice to hear something official.
Google said Glass will support both prescription lenses and frames. For the mega-stylish among you, this likely means designer frames will also be a go. Unfortunately, the initial "Explorer Edition" release will not accept the alternate lenses and frames, so if you wear glasses, you'll have to go with contacts or just wait. Google said the prescription version will be available later this year.
Source: Glass on Google+
Google Glass is expected to hit consumers by the end of this year, and we learn more about the next-generation technology all the time. An app by the name of InSight will allow wearers to identify people simply by analyzing what they're wearing, without even needing to see the person's face. In early tests involving 15 volunteers, Google Glass with InSight correctly identified people 93 percent of the time.
The fashion-based recognition system was developed by Srihari Nelakuditi at the University of South Carolina in collaboration with associates at Duke University. It collects photos of a person from web pages, emails, and tweets, then analyzes them for colors, patterns, and textures, which are compiled into a profile for that person. When InSight detects someone you know, their name appears on Google Glass' display.
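To get a feel for how appearance-based matching like this can work, here is a minimal, purely illustrative sketch: it builds a coarse color-histogram "fingerprint" per person from known photos, then identifies an unknown sighting by nearest-histogram match. The function names, the feature choice, and the toy data are assumptions for illustration, not details of the actual InSight system.

```python
# Illustrative sketch of fashion-based matching: coarse color histograms
# as per-person fingerprints, identified by nearest-neighbor lookup.
# (Hypothetical; not the real InSight pipeline.)

from collections import Counter

def color_fingerprint(pixels, levels=4):
    """Quantize RGB pixels into a normalized coarse histogram (levels^3 bins)."""
    step = 256 // levels
    counts = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    total = len(pixels)
    return {bin_: n / total for bin_, n in counts.items()}

def distance(fp_a, fp_b):
    """L1 distance between two normalized histograms."""
    bins = set(fp_a) | set(fp_b)
    return sum(abs(fp_a.get(b, 0) - fp_b.get(b, 0)) for b in bins)

def identify(sighting_pixels, known_people):
    """Return the known name whose stored fingerprint best matches the sighting."""
    fp = color_fingerprint(sighting_pixels)
    return min(known_people, key=lambda name: distance(fp, known_people[name]))

# Toy profiles: Alice wears mostly red, Bob mostly blue.
people = {
    "Alice": color_fingerprint([(250, 10, 10)] * 90 + [(10, 10, 250)] * 10),
    "Bob": color_fingerprint([(10, 10, 250)] * 85 + [(250, 10, 10)] * 15),
}

# A mostly-red sighting should match Alice.
print(identify([(240, 20, 20)] * 80 + [(20, 20, 240)] * 20, people))  # Alice
```

A real system would of course use richer features (patterns and textures, as the researchers describe) and a confidence threshold, but the profile-then-match structure is the same.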
Google Glass is the next generation of wearable, internet-connected devices, and Sergey Brin believes smartphones are “emasculating” compared to Glass. Sergey recently spoke at the Technology, Education and Design (TED) conference in Los Angeles.
He made a few points about how people interact with phones and the outside world:
“Is this the way you’re meant to interact with other people? It’s kind of emasculating. Is this what you’re meant to do with your body? I have a nervous tic. The cell phone is a nervous habit — if I smoked, I’d probably smoke instead; it’d look cooler. But I whip this out and look as if I have something important to do. It really opened my eyes to how much of my life I spent secluding myself away in email.”
We've seen Google push into augmented reality with Google Glass, and Google's Director of Android User Experience, Matias Duarte, fully agrees with that direction. At MWC, Duarte talked about gestures and tangibility being the next step for mobile devices, saying "computers have to work the way people expect, and not the other way around." He also talked about the importance of users interacting with objects the way they do in real life, going beyond just your fingers and the palm of your hand. Google revolutionized the mobile space with Android's open-source nature, and it looks like they're attempting to spearhead the next revolution in mobile computing.
As a side note, Duarte also had a few comments about the customizability of Android, promising that Google would never clamp down on what makes our Android devices unique. To keep up with the rest of the news from MWC, click here for our latest coverage.
A patent application originally filed in August 2011 by Google for what has become Google Glass was published to the USPTO web site today. There has been a buzz around Google’s wearable computer project after a new video was released yesterday showcasing some of the capabilities of the system from the user’s point of view. The new patent provides details about how Google is going about making the magic happen.
Google usually releases some pretty cool videos, but today's is especially cool. It gives you a really good idea of what Project Glass is going to be like for you. We already know what it will look like aesthetically, but other than the skydiving stunt last year, we never really got a good look at what the experience will be like. This two-minute video features many clips from family activities and events, showing how you will interact with Glass using your voice and what types of information will be displayed. If you have even the slightest interest in Project Glass, you need to check out the video. I was already excited about Glass, but now I'm even more excited about the possibilities. Hit the break to see the full video.
At last year's Google I/O, Google promised that beta versions of Glass would be available early this year. Those of you who signed up for one might get a nice box in the mail very shortly, because the Google Glass Explorer Edition just passed through the FCC. Unfortunately, there isn't a lot to go on, but we know it has a Broadcom 2.4GHz 802.11b/g WiFi radio and Bluetooth 4.0 + LE. There is also a reference to an "integral vibrating element that provides audio to the user via contact with the user's head," which probably has something to do with the bone-conduction patent we wrote about last week.
I still wouldn't expect full-blown, consumer-friendly versions to be available until 2014, but I'm sure Google Glass will make another spectacular appearance at this year's Google I/O.
We recently reported that Google Glass is still very much in the development stage, with lots of features and capabilities in flux. Well, one new feature that may or may not make it into the final release of Google's wearable computer is bone-conduction audio. A new patent recently filed by the Mountain View company shows how Google plans to get audio output from the glasses to your eardrum: by rattling it through your skull! For those of you who aren't aware, bone conduction is the process of sending sound waves to the eardrum not through the air, but through a dense material; in this case, your cranium. It's a good solution because you won't have to wear an additional earbud to hear your notifications, music, or phone calls, and it keeps your audio from being broadcast through a speaker for the whole world to hear. We'll have to wait and see whether this feature is just another good idea that will never see the light of day, or whether it will make it into the final commercial version of Google Glass. Here's to hoping for the latter.
Source: US Patent and Trademark Office
Google's latest patent request would make its DIY cyborg kit even crazier. Better, even. Need to dial a number? Just type it on the palm of your hand. That's at least one expected function of a tiny laser projector for Project Glass. The system would project a dial pad or QWERTY keyboard onto your hand, arm, desk, or whatever, and a tiny Kinect-like camera would then interpret your movements. Google, you had me at laser!
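The interesting software step in a setup like that is mapping a detected fingertip position back to a key on the projected pad. Here is a tiny illustrative sketch of just that lookup, assuming the camera has already located the fingertip in the pad's coordinate space; the 3×4 grid layout and coordinate units are my assumptions, not details from Google's patent.

```python
# Hypothetical key lookup for a projected dial pad: given a fingertip
# position (x, y) inside the projection, return the key under it.
# Assumes a standard 3-column x 4-row phone keypad layout.

KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]

def key_at(x, y, pad_w=3.0, pad_h=4.0):
    """Map a fingertip position inside the projected pad to a key, or None."""
    if not (0 <= x < pad_w and 0 <= y < pad_h):
        return None  # touch landed outside the projection
    col = int(x / pad_w * 3)  # which of the 3 columns
    row = int(y / pad_h * 4)  # which of the 4 rows
    return KEYS[row * 3 + col]

# Dialing: a sequence of detected touches becomes digits.
touches = [(0.5, 0.5), (1.5, 0.5), (2.5, 1.5)]
print("".join(key_at(x, y) for x, y in touches))  # 126
```

The hard part in practice would be everything upstream of this function: segmenting the hand, compensating for the projection landing on a curved, moving surface, and deciding when a hover becomes a press.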