Augmented Reality is more than Virtual Reality

Eye-operated cellphones

As to the feasibility of convenient, versatile display glasses far beyond today's virtual reality goggles, I think we will see first applications, not yet full featured but surprisingly useful, within a surprisingly short time. Just have a look at the very small cell phones that are now equipped with cameras and many other things. Their hardware is already light enough to fit into some (not yet perfect) glasses; it's only built the wrong way. Let's have a look at the possibilities. 

Yet, what we will look at on this page are ideal solutions. Much simpler applications are possible that can easily be realized with current technology. For these, click here.

No joke at all: it is absolutely possible right now, with available technology, to integrate a complete mobile phone into a pair of glasses at fairly convenient size and weight (about 50 grams). Given the increasing habit of wearing micro headsets even despite the inconvenience of the cable, it's quite conceivable that such a device would meet considerable demand. Iris recognition could replace passwords, eye pointing could be used for menus and dialing, and the possible virtual screen size would enable full-featured web surfing, with clicking on links just by looking at them, to mention but a few of the possibilities. Not yet the real thing the book is all about, but a good technology teaser for sure. What should be considered is replacing the traditional camera-plus-processor eye tracker with a light sensor chip with integrated semi-parallel pupil recognition, which would deliver reaction times under a millisecond. Surely this would be an interesting component for many applications. A component integrating at least the recognition into one chip has meanwhile been demonstrated [86]. It runs at up to 800 frames per second and has very low power consumption, already excellently suited for our purpose. Although it needs a second chip as a light sensor, the assembly is just about as small as a true one-chip solution.
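To give an idea of what such a pupil recognition stage has to compute, here is a minimal software sketch (Python with NumPy, purely illustrative; the chip in [86] does this in dedicated hardware, and real devices use more robust detection): find the dark pupil disc in a small sensor frame and return its center.

# Minimal sketch, assuming a small grayscale eye image as a NumPy array.
# The pupil is taken as the centroid of the darkest few percent of pixels.
import numpy as np

def pupil_center(frame, percentile=5):
    """Return (x, y) of the pupil center, or None if no dark region is found."""
    threshold = np.percentile(frame, percentile)
    ys, xs = np.nonzero(frame <= threshold)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

# Example with a synthetic 48x48 frame containing a dark disc as "pupil":
frame = np.full((48, 48), 200.0)
yy, xx = np.mgrid[0:48, 0:48]
frame[(xx - 30)**2 + (yy - 20)**2 < 36] = 10.0
print(pupil_center(frame))   # approximately (30.0, 20.0)

In a chip implementation, this kind of thresholding and centroiding can run in parallel over the whole array, which is what makes sub-millisecond reaction times plausible.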

Laser scanners usually suffer from the problem that the beam will miss the pupil when the user moves his eyes. One method to avoid this would be widening the beam, which is easily possible with the optics shown here, but this wastes most of the light. Another method is to move the assembly, adjusting for the changes mechanically. This may appear delicate, but piezoelectric motors have indeed made such incredible progress in recent years that it is easily conceivable to have a millimeter-small, milligram-light assembly moving the scanner, and even the eye tracker, to follow the eye, using as little as about one milliwatt of power: good piezoelectric micro motors may exhibit an energy efficiency of up to 30%.
With the laser power brought completely into the eye, the display unit can have an incredibly small power consumption as well, even if high brightness is required.
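A back-of-the-envelope estimate makes the milliwatt figure plausible. The numbers below are assumptions for illustration only: a 10 mg moving assembly, 1 mm of travel per eye saccade in 10 ms, five such moves per second, 30% motor efficiency.

# Rough power estimate for a piezo-driven follower; all numbers are
# illustrative assumptions, not measured values.
m = 10e-6          # moving mass: 10 mg, in kg
s = 1e-3           # travel per eye saccade: 1 mm
t = 10e-3          # time allowed per move: 10 ms
rate = 5           # saccade-like moves per second
efficiency = 0.30  # assumed piezo micro motor efficiency

a = 4 * s / t**2          # accelerate over half the travel, brake over the rest
v_peak = a * t / 2        # peak velocity
e_move = m * v_peak**2    # kinetic energy spent accelerating and braking
p_elec = e_move * rate / efficiency
print(f"{v_peak*1000:.0f} mm/s peak, {p_elec*1e6:.1f} microwatts electrical")
# -> about 200 mm/s and some 7 microwatts: far below one milliwatt

Even with generous margins for friction and holding forces, such a mechanical follower stays well inside the stated power budget.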

Example cellphone optics, simplifying suggestions from the book: an eye tracker (left, shown larger than actual size) and a laser unit (right) are combined in the same optical path. The tiny laser scanner mirror is integrated in the center of the eye tracker chip (hardly affecting its function). Just a single, small dichroic or holographic mirror, over 90% transparent, brings the picture into the eye. Additional components (not shown) ensure that the beam always enters the pupil. The entire assembly could be as light as a bauble.
April 2, 2007

While laser scanners can be miniaturized beyond belief, traditional display/mirror assemblies or certain kinds of holographic optics may avoid most of the adaptation requirements if only information display at large virtual distances is required. More about this concept here.

 

Visual flow

The sensors of optical computer mice, already mentioned here for general motion sensing, could be employed for eye trackers as well. They consist of a light sensor array of no more than 48x48 pixels, integrated with ingenious signal processing that delivers exact motion vectors on almost any surface: an ideal device to measure quick eye movements just by watching the iris patterns. Imagine this, together with the usual Hough transform circle detection as a parallel processing unit, fully integrated into a camera chip: a super fast, super low power, one-chip eye tracker. It can be done. March 25, 2008
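As a rough illustration of the kind of computation such a mouse sensor performs in hardware, here is a brute-force block-matching sketch (Python/NumPy, assumed 48x48 frames; a real sensor does this with dedicated parallel logic at thousands of frames per second):

# Minimal sketch: estimate the shift between two consecutive 48x48 frames
# of the iris texture by brute-force block matching (mean squared difference).
import numpy as np

def motion_vector(prev, curr, max_shift=4):
    """Return the (dx, dy) shift that best aligns curr with prev."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            err = np.mean((shifted - prev) ** 2)   # wrap-around at edges ignored
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

Summing these per-frame vectors over time yields the eye rotation; the Hough circle detection mentioned above would add the absolute pupil position, keeping the integrated motion from drifting.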

 

Monocular depth pointing - a light field camera approach

Replacing each pixel of a conventional camera with a small lens addressing many tiny sub-pixels behind it (a tiny camera of its own) allows selecting certain rays out of the image from the large lens by picking the appropriate sub-pixels [99], [105].
Any sub-pixel, together with its lens, selects rays as if seen through a pinhole camera located at a certain position on the main lens. Sub-pixels may be selected to collect light rays as if we had focused the main lens nearer or farther. Sub-pixels may as well be selected to form sub-images seen from different points on the main lens, hence different perspectives. The selection rules may be combined to render an entirely crisp 2D or 3D image, or a 2D or 3D image with almost any focus distance and depth of field desired.
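A minimal sketch of the refocusing selection rule, assuming the light field is given as a set of sub-aperture images indexed by their position (u, v) on the main lens (Python/NumPy, integer-pixel shifts only; real implementations interpolate):

# Synthetic refocusing by shift-and-add: each sub-aperture image is shifted in
# proportion to its lens position (u, v), then all images are averaged.
# 'alpha' selects the synthetic focus plane.
import numpy as np

def refocus(subviews, alpha):
    """subviews: dict mapping (u, v) lens positions to 2D images of equal size."""
    acc = None
    for (u, v), img in subviews.items():
        dx, dy = int(round(alpha * u)), int(round(alpha * v))
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        acc = shifted if acc is None else acc + shifted
    return acc / len(subviews)

With alpha = 0 this simply averages all views (focus as shot); other values of alpha move the synthetic focus plane nearer or farther, and picking a single (u, v) view instead of averaging yields one of the different perspectives mentioned above.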
   

Suggestion: measuring eye focus by the light field camera principle.

Observing the retina through the pupil with an autofocus camera would be the obvious approach. Taking pixel displacements between several sub-camera images might work as well; just a few small sub-cameras at sample positions should be enough. Any sub-camera, however, can only deliver useful pixels where it sees through the pupil. So the light field approach could deliver a pretty simple means of focus detection [105].
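A minimal sketch of the displacement idea, assuming two small sub-camera views of the retina separated along a horizontal baseline (Python/NumPy; the calibration from pixels of displacement to diopters of accommodation is omitted):

# Estimate the displacement between two sub-camera views of the retina along
# their baseline. Zero displacement means the retina is imaged in focus for
# this sub-aperture pair; sign and size indicate the direction and amount of
# defocus, i.e. where the eye is accommodated.
import numpy as np

def retina_disparity(view_a, view_b, max_shift=8):
    """Return the shift (in pixels, along the baseline) that best aligns the views."""
    best, best_err = 0, np.inf
    for dx in range(-max_shift, max_shift + 1):
        err = np.mean((np.roll(view_b, -dx, axis=1) - view_a) ** 2)
        if err < best_err:
            best, best_err = dx, err
    return best

Only the image regions actually seen through the pupil should enter the comparison, as noted above.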

Since we already see the retina this way, we can also use this picture to detect eye motion, replacing the conventional eye tracker with a retina tracker.
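A sketch of such a retina tracker, here using phase correlation between two consecutive retina images to obtain the eye movement as a 2D shift (Python/NumPy, illustrative only; it assumes enough retinal texture, e.g. blood vessels, is visible):

# Phase correlation: the normalized cross-power spectrum of two shifted images
# has an inverse FFT that peaks at the shift between them.
import numpy as np

def retina_shift(prev, curr):
    """Return the (dx, dy) shift of curr relative to prev."""
    cross = np.fft.fft2(curr) * np.conj(np.fft.fft2(prev))
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    if dy > h // 2: dy -= h                   # unwrap to signed shifts
    if dx > w // 2: dx -= w
    return dx, dy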

 

 
