MyGlass app lets you control Google Glass from your phone

Like the idea of Google Glass, but don't fancy swiping the side of your specs the whole time? There's an app for that. It's called MyGlass, and it's just been updated to let you control your hi-tech spectacles from your Android-powered mobile.

I know, part of the appeal of Glass is that you don't have to fetch your phone out of your pocket, but endless swiping and speaking commands might get a little tiring. And that's where MyGlass comes in...

Read the full story here... Source: CNET

Lambda Labs launching facial recognition API for Google Glass

Dystopian future, here we come! Google Glass is about to receive access to a new facial recognition API, courtesy of Lambda Labs. The new API should be out within a week, provided that all goes as it should.

Who are Lambda Labs? They're a small startup that released a non-Glass facial recognition API just last year, and they currently have over 1,000 developers using it. Now they're taking that experience and tailoring it specifically to Google Glass apps.

This means the door could soon be open to Glass apps that let you match names with faces, get detailed info about landmarks and much more. Of course, there's also a pretty big limitation here: it doesn't work in real time, a restriction imposed by Google's Mirror API.

The Mirror API doesn't allow live camera data to be streamed to a developer's server. That means you'll need to snap a picture, send it to Lambda, and wait for it to be analyzed. After a few seconds, you'll receive a notification with the results.
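To make that snap-and-wait round trip concrete, here's a minimal Python sketch. Everything in it is hypothetical: Lambda Labs hasn't published its Glass API, so the endpoint URL, field names and response shape below are illustrative assumptions, not the real interface.

```python
import base64
import json

# Hypothetical endpoint -- Lambda Labs' real Glass API details aren't public.
RECOGNIZE_URL = "https://api.lambdalabs.example/v1/recognize"

def build_recognition_request(image_bytes: bytes, api_key: str) -> dict:
    """Package a single snapped photo as an HTTP request description.

    Because the Mirror API can't stream live camera data, the client
    sends one still image and waits for an asynchronous result.
    """
    return {
        "url": RECOGNIZE_URL,
        "headers": {"Authorization": f"Bearer {api_key}"},
        "body": json.dumps(
            {"image": base64.b64encode(image_bytes).decode("ascii")}
        ),
    }

def summarize_matches(response: dict) -> str:
    """Turn a (hypothetical) recognition response into the short text a
    Glassware might push back to the timeline as a notification."""
    matches = response.get("matches", [])
    if not matches:
        return "No matches found."
    best = max(matches, key=lambda m: m.get("confidence", 0.0))
    return f"Best match: {best['name']} ({best['confidence']:.0%} confidence)"
```

The point of the split is that nothing happens live: the request goes out once, and the summary only gets built when Lambda's server (eventually) answers.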

So does this technology mean that a perfect stranger could look at you, snap a picture and receive detailed information about you? Short answer, no. Long answer, it can only take data specifically from Lambda's database, which includes things like pictures of well-known celebrities and important landmarks...

Read the full story here. Source: Android Authority 

Google Glass will have a laser keyboard, patent suggests

Thought you'd seen all Google Glass had to offer? Think again. The augmented reality specs could shoot out a laser keyboard, if the latest patent application is to be believed, CNET reports.

The keyboard would beam out of the arm of the glasses. And if there's no flat surface nearby to shoot onto, just project it onto your hand and start typing tweets or emails. This is some seriously next-generation stuff we're talking about.

It's not even a problem if you've only got one hand free. As well as pressing the virtual buttons, you can input by just moving your hand, with Google Glass's camera interpreting what you're doing.

This is only a patent application, so there's no certainty it'll ever make it into Google Glass, or any other products. But it shows Google has big plans for its cyber specs, and isn't going to limit them to just voice input. I mean, Google's voice search on Android Jelly Bean is ace, but what if you're in a crowded place? Or you want to write something the old-fashioned way? A laser keyboard could be the answer.

Google unveiled its hi-tech spectacles back in April, though it wouldn't be drawn on when we can expect to actually don them ourselves. They let you stay connected to the Internet and bring up info from Google without having to ogle a screen, which is pretty great. Though some augmented reality experts aren't so sure.

They certainly caused a stir though, with Microsoft, Olympus, and Sony all planning rival specs. So they could be the next big thing once tablets have had their day.

Microsoft is also rumoured to be working on a pair that'll work with Xbox and Kinect, which could lead to some interesting possibilities.

[Source: CNET]