Intel made a pair of smart glasses you might actually want to wear out – here’s how they work

Intel's smart glasses close up

Intel has developed a prototype for a pair of smart glasses that are designed to look normal – or at least, normal for a pair of smart glasses.

News of Intel’s latest hardware broke in February, when Bloomberg’s Sarah Frier and Ian King published a report saying the company was looking for a number of investors to take a majority stake in its augmented-reality (AR) unit, which had been developing a pair of eyeglasses that lets you see text and other information in your field of view.

Days after Bloomberg’s report went live, The Verge’s Dieter Bohn posted a video showing off a physical prototype of the Vaunt smart glasses. Here’s a look at how they work:

Intel designed these glasses to be worn in public without feeling like a technophile.

The horn-rimmed camera-less glasses have a very different look from Google Glass. Intel’s Vaunt glasses are designed with minimalistic functionality so the pair only weighs about 50 grams (about a tenth of a pound), and apart from an occasional “red glimmer,” the lens display isn’t visible to anyone on the other side of the glasses.

The lens can display “simple basic information” into the right eye.

The image is called a retinal projection: A red monochrome projector shines an image on a “holographic mirror,” which bounces the image into your eye.

To avoid having text in your line of sight all the time, Intel built the glasses so that the dashboard shows up only when the user glances down at the bottom of the frame.

The second the user looks up, the display disappears. There’s no need for hand gestures.
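The look-down behaviour described above amounts to a simple gaze-angle threshold. The sketch below is a hypothetical illustration of that logic, not Intel’s actual firmware; the -10° cut-off is an assumed value.

```python
def dashboard_visible(gaze_pitch_deg: float, threshold_deg: float = -10.0) -> bool:
    """Show the dashboard only while the wearer glances below the threshold.

    gaze_pitch_deg: vertical gaze angle (0 = straight ahead, negative = down).
    The threshold value here is illustrative, not Intel's real tuning.
    """
    return gaze_pitch_deg <= threshold_deg

# Glancing down toward the bottom of the frame brings up the display...
print(dashboard_visible(-15.0))  # -> True
# ...and looking back up makes it disappear, with no hand gestures needed.
print(dashboard_visible(0.0))    # -> False
```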

The display is shown through a “Vertical-Cavity Surface-Emitting Laser,” which is “so low-power, it’s at the very bottom end of a class-one laser,” according to New Devices Group’s industrial design director Mark Eastwood. That apparently means that the laser is safe for your eyes.

The glasses do need to be fitted to your eyes, however, so there’s enough distance between your eyes and the lens to see what’s displayed.

Intel is developing other prototype Vaunt styles, and developers will soon be able to start using the glasses through Intel’s early access program to create use cases for them.

The Vaunt is going to work with both Android and iPhone. And while the executives in the video do name a few use cases (grocery shopping, choosing between restaurants that are right in front of you), it’s clear that the extent of the Vaunt’s capabilities will be entirely up to software developers.

To learn more about the Vaunt smart glasses, check out The Verge’s video below.



Smart glasses that allow the blind to see could help 150,000 people in the UK alone by transforming the way blind and partially sighted people go about their everyday lives.

These glasses use a frame-mounted camera to project images onto the eyepieces.

RNIB and Oxford University have just received £500,000 in funding, which will enable them to create 100 pairs of smart glasses and test them with 1,000 people.

This will be the first large-scale test of smart glasses and augmented reality for sight enhancement anywhere in the world.  It’s the first step towards getting the glasses made available to everyone who needs them.


Watch the video on YouTube.


Smart Glasses Could Help Blind People Navigate

A pair of “smart glasses” might help blind people navigate an unfamiliar environment by recognizing objects or translating signs into speech, scientists say.

The majority of registered blind people have some residual ability to perceive light and motion. But assistive technologies for the visually impaired have been limited.

The FDA gave approval to the Argus II, a bionic eye that could potentially cure blindness in 15,000 people.

Now, researchers from Oxford University in England are developing a set of sophisticated glasses that use cameras and software to detect objects and display them on the lenses of glasses. The team recently won an award from the Royal Society to continue this work. [Bionic Humans: Top 10 Technologies]

“This is the beginning of a golden age for computer vision,” study researcher Stephen Hicks said in a statement. “The Royal Society’s Brian Mercer Innovation award will allow us to incorporate this research into our glasses to help sight-impaired people deal with everyday situations much more easily.”

Here’s how the smart glasses work: Two small cameras mounted on the corners of the glasses capture two different pictures, just as human eyes do. The spectacles display information from the cameras on transparent LED displays on the lenses, so the wearer can see an enhanced image as well as use their remaining sight. Comparing the distance between the cameras reveals how far the object is from the wearer.
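The distance computation described above is classic stereo triangulation: the farther away an object is, the smaller its apparent shift (disparity) between the two camera images. A minimal sketch of the idea, with illustrative camera parameters rather than the Oxford prototype’s real ones:

```python
def depth_from_disparity(baseline_m: float, focal_length_px: float,
                         disparity_px: float) -> float:
    """Classic stereo triangulation: depth = (baseline * focal length) / disparity.

    baseline_m: distance between the two cameras, in metres (assumed value).
    focal_length_px: camera focal length expressed in pixels (assumed value).
    disparity_px: horizontal shift of the object between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_length_px / disparity_px

# Example: cameras 12 cm apart, 700 px focal length. An object shifted 42 px
# between the left and right images sits about 2 metres from the wearer.
print(depth_from_disparity(0.12, 700, 42))  # -> 2.0
```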

A set of headphones takes text and translates it into speech to provide directions or read signs aloud.

The glasses are also equipped with a compass, a GPS and a gyroscope, a tool that measures the orientation of the glasses.

In the United Kingdom, where the research is taking place, more than 2 million people have impaired vision, and more than 300,000 are registered as blind, due to diseases such as macular degeneration, glaucoma and diabetic retinopathy.

Moving forward, the researchers hope to develop software to provide a range of different functions that testers of the glasses say would be useful.

For example, the glasses could use levels of brightness to show depth. They could detect if a person is present based on his or her movement. In addition, the glasses might be able to read the locations or numbers of buses and provide GPS directions via the headphones.
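The brightness-for-depth idea could work as a simple linear mapping in which nearer objects are rendered brighter on the lens displays. This is a hypothetical sketch under assumed parameters, not the researchers’ actual software:

```python
def brightness_for_depth(depth_m: float, max_depth_m: float = 5.0) -> int:
    """Map object distance to an 8-bit display brightness.

    Nearer objects get higher values so they stand out on the LED lenses;
    anything at or beyond max_depth_m (an assumed cut-off) fades to black.
    """
    clamped = min(max(depth_m, 0.0), max_depth_m)
    return round(255 * (1.0 - clamped / max_depth_m))

print(brightness_for_depth(0.0))  # -> 255 (very close: full brightness)
print(brightness_for_depth(5.0))  # -> 0   (at the cut-off: invisible)
```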

Microsoft files patents for augmented reality smart glasses.

Microsoft has been working on digital glasses that overlay information on top of the user’s view of the world.

A patent applied for by the US tech firm describes how the eyewear could be used to bring up statistics over a wearer’s view of a baseball game or details of characters in a play.

The newly-released document was filed in May 2011 and is highly detailed.

If a product comes to market it could challenge Google’s Project Glass.

Google is planning to deliver its augmented reality glasses to developers early next year and then follow with a release to consumers in 2014.

Smaller firms – such as Vuzix, TTP and Explore Engage – are also working on rival systems.

Although some have questioned how many people would want to wear such devices, a recent report by Juniper Research indicated that the market for smart glasses and other next-generation wearable tech could be worth $1.5bn (£940m) by 2014 and would multiply over following years.

No missed moments

Microsoft’s patent was filed by Kathryn Stone Perez, executive producer of the Xbox Incubation unit which earlier developed the Kinect sensor; and John Tardiff, an audio-video engineer who previously worked at Apple.

It notes that entertainment organisers often provide screens showing information to enhance audiences’ enjoyment of their events. But looking at these displays forces the viewer to turn their head away from the action – for example looking at the scoreboard at a baseball game, or translated lyrics at the side of the stage at an opera.

Microsoft suggests augmented reality headwear would avoid the risk of missing a key moment and also make it possible to see effects otherwise reserved for people watching on TV – for example a computer-drawn line superimposed over an American Football pitch showing the minimum 10-yard distance a team needs to advance the ball.

The patent suggests the key to making this work would be to vary the transparency of the glasses lens.

“[It would be] capable of generating display elements on various portions of a user’s display while remaining portions of the head mounted display are transparent to allow the user to continue to view events or occurrences within the live event,” it says.

“Other alternatives allow the display to become completely opaque, in order for the display to provide, for example, a display of a video such as an instant replay of the live event.”

Anticipated events

Microsoft suggests a wrist-worn computer could be used to operate the device, or alternatively the user might control it through voice commands and by flicking their eyes to a certain spot.

It indicates that most of the processing work – identifying people and other objects in view, and deciding what information to show about them – would likely be carried out by remote computer servers in order to keep the equipment slimline.

The firm adds that many entertainment events follow a set course – such as a character always appearing at the same point in a play – and this could be used to ready information in advance to ensure it is brought up quickly.

Microsoft suggests a wide range of sensors would need to be built into the eyewear – including a microphone, video camera, gyroscope, eye gaze-trackers, infra-red detector and magnetometer as well as wi-fi and/or bluetooth connectivity – to provide the functionality it describes.

The document also describes some of the technologies it could license that have been developed by other firms, suggesting Microsoft has explored the possibility of putting its ideas into practice.

Nitin Bhas, senior analyst at Juniper Research, said he would not be surprised to see the Windows-maker release a device over the coming years.

“We think smart glasses and other head-worn displays will be the next major form-factor for computing with adoption by consumers beginning around late-2014 to 2017,” he told the BBC.

“The devices will help integrate technology into human life, making things like augmented reality more seamless than it is on smartphones at present.

“Compared to other devices we think the adoption rate will be low and price points high in the medium-term, but they will catch on eventually.”