Electronics – Ralph F. Osterhout, John D. Haddick, Robert Michael Lohse, Charles Cella, Robert J. Nortrup, Edward H. Nortrup, Microsoft Technology Licensing LLC

Abstract for AR glasses that allow for event and user action control of applications external to the device

This disclosure relates to an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly, through which the user views a surrounding environment and the displayed content. The eyepiece also provides event and user action control of applications external to the device.

Background for AR glasses that allow for event and user action control of applications external to the device

Field

This disclosure concerns an augmented reality eyepiece and associated control technologies as well as applications for their use.

One embodiment of an eyepiece includes a nano-projector or micro-projector comprising a light source and an LCoS display, a freeform waveguide lens enabling TIR bounces, and a coupling lens. A wedge-shaped optic (translucent corrective lens) is attached to the waveguide lens and allows proper viewing through the lens regardless of whether the projector is turned on or off. The projector may include an RGB LED module that emits field sequential color, where LEDs of different colors are turned on in rapid succession to form a color image that is reflected off the LCoS display. The projector may also include a projection collimator or a polarizing beam splitter.
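
Field sequential color reduces to a timing loop. Below is a minimal, hypothetical sketch (the `panel` and `leds` driver objects are assumptions, not part of this disclosure) of how three color fields might be flashed in succession fast enough for the eye to fuse them into one full-color frame.

```python
import time

FIELD_RATE_HZ = 180            # assumed: three color fields per 60 Hz frame
FIELD_PERIOD = 1.0 / FIELD_RATE_HZ

def show_frame(rgb_frame, panel, leds):
    """Display one full-color frame as three sequential color fields."""
    for channel in ("red", "green", "blue"):
        panel.load_field(rgb_frame[channel])   # put one color field on the LCoS
        led = getattr(leds, channel)
        led.on()                               # illuminate with the matching LED
        time.sleep(FIELD_PERIOD)               # hold for one field period
        led.off()
```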

One embodiment of an eyepiece can include a freeform transparent correction lens, a waveguide lens, and a display lens.

An embodiment may include an optical wedge waveguide that is optimized for the ergonomics of the human head. This allows the eyepiece to wrap around the face.

Another embodiment of an eyepiece might include two freeform optical surfaces and a waveguide, allowing complex optical paths to be folded within a thin prism form factor.

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, an integrated processor that handles the content, and an integrated image source that introduces the content to the optical assembly. In one embodiment, an interactive control element remains fixed with respect to an object in the surrounding environment in response to an interactive control element location command.

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, and an integrated processor that handles the content and introduces it to the optical assembly. An integrated camera facility images the user's body as it interacts with an interactive control element, and the processor removes the portion of the interactive control element that is determined to be co-located with the user's imaged body part.

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element to correct the user's view, an integrated processor to handle content for display, and an integrated image source to introduce the content to the optical assembly. An interactive keyboard control element may be displayed, comprising an input path analyzer and a word matching search facility. Text is input by sliding a pointing device, such as a stylus or finger, across the keyboard input interface. The input path analyzer determines which characters were contacted, and the word matching facility finds the best word match and enters it as text.
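
As a rough illustration of how such an input path analyzer and word matching facility might work (the layout, contact radius, and scoring below are assumptions for the sketch, not the disclosed implementation):

```python
# Toy key layout: key -> (x, y) center position on the projected keyboard.
KEY_POSITIONS = {"h": (5, 1), "e": (2, 0), "l": (8, 1), "o": (8, 0)}

def keys_touched(path, layout, radius=0.6):
    """Reduce a swipe path (list of (x, y) points) to the keys it contacts."""
    touched = []
    for x, y in path:
        for key, (kx, ky) in layout.items():
            if abs(x - kx) <= radius and abs(y - ky) <= radius:
                if not touched or touched[-1] != key:   # drop repeated contacts
                    touched.append(key)
    return touched

def best_match(touched, dictionary):
    """Rank candidate words by letters found, in order, along the path."""
    def score(word):
        it = iter(touched)
        return sum(1 for ch in word if ch in it)  # in-order subsequence count
    return max(dictionary, key=score)

# e.g. best_match(keys_touched(path, KEY_POSITIONS), ["hello", "hole"])
```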

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user's view of the environment, an integrated processor that handles content, and an integrated image source that introduces the content to the optical assembly. An integrated camera facility images an external visual cue, which the integrated processor interprets in order to command the display of content associated with the cue. The visual cue could be a sign in the surrounding environment, with the displayed content comprising an advertisement associated with the sign; the advertisement may be personalized based on the user's preferences. The displayed content may include a projected virtual keyboard, and the visual cue could be a hand gesture. The hand gesture could be a thumb-and-index-finger gesture from a first user hand, with the virtual keyboard projected onto the first hand so that the user can type on the virtual keyboard with a second hand. The hand gesture could also be a combination of thumb-and-index-finger gestures from both user hands, with the virtual keyboard projected between the hands as positioned by the gesture.

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user's view of the environment, an integrated processor that handles content, and an integrated image source that introduces the content to the optical assembly. An integrated camera facility images a gesture, which the integrated processor interprets as a control instruction. The control instruction may manipulate the displayed content or communicate a command to an external device.
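
One plausible shape for such gesture-to-instruction dispatch, sketched with hypothetical gesture names and device handles:

```python
# Map recognized gestures to control instructions (names are illustrative).
COMMANDS = {
    "swipe_left":  lambda ctx: ctx.display.previous_page(),
    "swipe_right": lambda ctx: ctx.display.next_page(),
    "closed_fist": lambda ctx: ctx.external_device.send("pause"),
}

def on_gesture(gesture, ctx):
    """Interpret an imaged gesture as a control instruction."""
    action = COMMANDS.get(gesture)
    if action is not None:
        action(ctx)  # manipulate displayed content or command an external device
```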

A system may include an interactive head-mounted eyepiece through which a user views the surrounding environment and displayed content. The optical assembly includes a corrective element to correct the user's view, an integrated processor to handle content, and an integrated image source to introduce the content to the optical assembly. A tactile control interface mounted on the eyepiece accepts control input from the user through at least one of the user touching the interface and the user being proximate to the interface.

A system may include an interactive head-mounted eyepiece worn by a user. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, and an integrated processor that handles the content. Control instructions are provided to the processor based on detection of a predefined head motion characteristic.

A head motion characteristic could be a nod of the user's head such that it is an overt motion dissimilar to ordinary head movements; the overt motion could be a jerking motion of the head. Control instructions can be used to manipulate the displayed content or to communicate a command to an external device.
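
A nod detector of this kind could be as simple as thresholding pitch angular velocity from an assumed head-mounted gyro; the thresholds below are assumptions for the sketch.

```python
NOD_RATE = 2.5    # rad/s; pitch rate treated as an "overt" jerk (assumed)
WINDOW = 0.5      # seconds within which the down-then-up pair must occur

def detect_nod(samples):
    """samples: list of (timestamp, pitch_rate). True on a quick down-up nod."""
    for i, (t0, rate0) in enumerate(samples):
        if rate0 > NOD_RATE:                    # fast downward pitch...
            for t1, rate1 in samples[i + 1:]:
                if t1 - t0 > WINDOW:
                    break
                if rate1 < -NOD_RATE:           # ...followed by a fast return
                    return True
    return False
```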

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, and an integrated processor that handles the content and introduces it to the optical assembly. In some embodiments, the optical assembly includes an electrochromic layer that adjusts a display characteristic based on the requirements of the displayed content and the surrounding environment. The display characteristic may be brightness, contrast, or the like, and the adjustment may be applied to the area of the optical assembly where content is displayed.
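
As a sketch of such adjustment logic (the lux scale and the `layer` driver call are assumptions), a dimming level can be derived from ambient brightness and the content's contrast requirement, then applied only over the content region:

```python
def dimming_level(ambient_lux, content_contrast_need):
    """Return 0.0 (clear) to 1.0 (dark) for the electrochromic layer."""
    base = min(ambient_lux / 10000.0, 1.0)        # darken more in bright scenes
    return min(1.0, base * content_contrast_need)

def apply_over_content(layer, region, ambient_lux, contrast_need):
    # Restrict the adjustment to the portion of the lens showing content.
    layer.set_region_opacity(region, dimming_level(ambient_lux, contrast_need))
```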

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source that introduces the content to the optical assembly. The eyepiece may also include an adjustable wrap-around extendable arm made of a shape-memory material to secure the position of the eyepiece on the user's head. An extendable arm may extend from an end of an eyepiece arm, and the end of the wrap-around extendable arm may be covered in silicone. The extendable arms may meet and secure to each other, or may independently grasp a portion of the head. The extendable arm may attach to a portion of the head-mounted eyepiece to secure the eyepiece to the user's head, and in certain embodiments may extend telescopically from the end of an eyepiece arm. In other embodiments, at least one wrap-around extendable arm may be detachable from the head-mounted eyepiece, or may be an add-on to it.

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source that introduces the content to the optical assembly. The displayed content may include a local advertisement, where the location of the eyepiece is determined by an integrated location sensor and the local advertisement has relevance to that location. Other embodiments may include a capacitive sensor capable of sensing whether the eyepiece is in contact with human skin; the local advertisement may be sent to the user based on whether such contact is sensed. Local advertisements might also be sent in response to the eyepiece being powered on.

In other embodiments, the local advertisement may be displayed as a banner advertisement or a two-dimensional graphic. The advertisement may be associated with a physical aspect of the surrounding environment, or displayed as an augmented reality advertisement associated with a physical aspect of the environment. An augmented reality advertisement may be two- or three-dimensional, and may be animated or associated with the user's environment. Local advertisements may also be displayed based on the results of a web search conducted by the user. The content of the local advertisement may further be determined from the user's personal information, which may be available to a web application or an advertising facility; the web application, advertising facility, or eyepiece may use the user's information to filter the local advertising. Local advertisements may be cached on a server, where they can be accessed by at least one of an advertising facility, a web application, and the eyepiece, and then displayed to the user.
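
A minimal sketch of that filtering step, assuming ads are cached as dictionaries with `location` and `category` fields (the field names are illustrative):

```python
import math

def distance_km(a, b):
    """Approximate distance between two (lat, lon) points in kilometers."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6371.0
    dy = math.radians(b[0] - a[0]) * 6371.0
    return math.hypot(dx, dy)

def local_ads(cache, location, profile, max_km=1.0):
    """Return cached ads near the eyepiece that match the user's interests."""
    return [
        ad for ad in cache
        if distance_km(ad["location"], location) <= max_km
        and ad["category"] in profile["interests"]
    ]
```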

Another embodiment allows the user to request additional information related to a local advertisement through any eye movement, body movement, or other gesture. A user can also ignore a local advertisement through an eye movement, body movement, or gesture, or by not selecting the advertisement for interaction within a given period of time. The user may also choose not to allow local advertisements by selecting such an option on a graphical user interface, or by disabling such advertisements via a control on the eyepiece.

One embodiment may also include an audio device. The displayed content could also include audio and local advertisements. An integrated location sensor may determine the location of an eyepiece. Local advertisements and audio may also be relevant to that location. A user might hear audio that corresponds with the local ads and displayed content.

One aspect of the interactive head-mounted eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, with a corrective element that corrects the user's view of the surrounding environment, and an optical waveguide with first and second surfaces enabling total internal reflection. The eyepiece may also include an integrated processor for handling content for display to the user and an integrated image source for introducing the content to the optical assembly. In this aspect, displayed content may be introduced into the optical waveguide at an angle of internal incidence that does not result in total internal reflection. The eyepiece therefore also includes a mirrored surface that reflects the displayed content toward the second surface of the optical waveguide. The mirrored surface enables total reflection of the light entering the optical waveguide, or reflection of at least a portion of it; the surface may be 100% mirrored or a lower percentage, depending on the embodiment. In some embodiments, an air gap between the corrective element and the waveguide may be used to cause a reflection of light that enters the waveguide at an angle of incidence that is not conducive to TIR.
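
The TIR cutoff mentioned here follows directly from Snell's law; a small worked example (the refractive indices are assumptions, e.g. n ≈ 1.5 for glass in air):

```python
import math

def critical_angle_deg(n_waveguide=1.5, n_outside=1.0):
    """Minimum internal angle of incidence (from the normal) for TIR."""
    return math.degrees(math.asin(n_outside / n_waveguide))

def undergoes_tir(angle_deg, n_waveguide=1.5, n_outside=1.0):
    return angle_deg > critical_angle_deg(n_waveguide, n_outside)

# For n = 1.5 in air the critical angle is about 41.8 degrees; light arriving
# at a shallower internal angle escapes unless a mirrored surface (or the
# air-gap reflection described above) redirects it.
```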

One aspect of the interactive head-mounted eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly includes a corrective element that corrects the user's view of the environment and an integrated processor that handles content for the user. The eyepiece also includes an integrated image source that introduces the content to the optical assembly from the side of the optical waveguide adjacent to an arm of the eyepiece. The displayed content aspect ratio is between approximately square and approximately rectangular, with the long axis approximately horizontal.

An interactive head-mounted eyepiece has an optical assembly through which a user views the surrounding environment and displayed content, a corrective element that corrects the user's view of the environment, a freeform optical waveguide enabling internal reflections, and a coupling lens that directs an image from an LCoS display to the optical waveguide. The eyepiece also includes an integrated processor for handling content for display and an integrated projector facility for projecting the content to the optical assembly. The projector facility comprises a light source and the LCoS display, where light from the light source is emitted under control of the processor, traverses a polarizing beam splitter where it is polarized, and is reflected off the LCoS display and into the optical waveguide.

Another aspect of the interactive head-mounted eyepiece is an optical assembly through which a user views the surrounding environment and displayed content. The optical assembly includes a corrective element that corrects the user's view of the environment, an optical waveguide enabling internal reflections, and a coupling lens that directs an image from an optical display to the optical waveguide. The eyepiece includes an integrated processor for handling content and introducing it to the optical assembly, and an integrated image source comprising a light source and the optical display. The corrective element may be a see-through correction lens attached to the optical waveguide that enables proper viewing of the surrounding environment whether the image source is on or off. The freeform optical waveguide may include dual freeform surfaces that enable a curvature and a sizing of the waveguide, allowing it to be placed in the frame of the interactive head-mounted eyepiece. The light source may be an RGB LED module that emits light sequentially to form a color image that is reflected off the optical display. The eyepiece may further include a homogenizer through which light from the light source is propagated to ensure that the light beam is uniform, and a collimator that improves the resolution of the light entering the optical waveguide. A surface of the polarizing beam splitter reflects the color image from the LCoS or other optical display into the optical waveguide, with the processor controlling the light source to emit light that traverses the polarizing beam splitter, is polarized, and is reflected off the optical display. The optical display may be an LCoS display or an LCD display, and the image source may be a projector, such as at least one of a microprojector and a nanoprojector.

An apparatus for capturing biometric data is provided in an embodiment. The biometric data may be visual biometric data or audio biometric data. The apparatus includes an optical assembly through which the user views the surrounding environment and displayed content, and a corrective element that corrects the user's view of the environment. An integrated processor handles content for display to the user on the eyepiece, and an integrated image source introduces the content to the optical assembly. Biometric data capture is accomplished with an integrated optical sensor assembly, while audio data capture is accomplished with an integrated microphone array. An integrated communications facility transmits the captured data for remote processing; a remote computing facility interprets and analyzes the captured biometric data, generates display content based on the analysis, and delivers the content to the eyepiece.
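
The end-to-end flow reads as a simple loop; a high-level sketch with hypothetical facility objects (`optical_sensor`, `mic_array`, `comms`, and `display` are assumptions, not disclosed interfaces):

```python
def capture_and_display(optical_sensor, mic_array, comms, display):
    """Capture -> transmit -> remote analysis -> display, as described above."""
    image = optical_sensor.capture()             # visual biometric data
    audio = mic_array.record(seconds=3)          # audio biometric data
    comms.send({"image": image, "audio": audio})
    result = comms.receive()                     # remote facility's interpretation
    display.show(result["content"])              # present returned display content
```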

Another embodiment includes a camera mounted on an eyepiece to obtain biometric images of a person proximate the eyepiece.

Another embodiment provides a method to capture biometric data. The method involves placing an individual near the eyepiece. The eyepiece may allow the wearer to move into a position that allows the capture of biometric data. Once the eyepiece is positioned, it captures biometric information and transmits that data to a facility which stores it in a biometric database. Remote computing facilities are used to interpret the data received and create display content. The display content is then sent back to the user so that it can be displayed on the eyepiece.

Another embodiment provides a method to capture audio biometric data. The method involves placing an individual near the eyepiece. The eyepiece can be adjusted so that the audio biometric data is captured. Once the microphone array is positioned, it captures audio biometric information and transmits it to a facility which stores the audio biometrics data in a database. Remote computing facilities are used to interpret the data received and create display content. The display content is then sent back to the user so that it can be displayed on the eyepiece.

One embodiment of the eyepiece has a see-through correction lens attached to the exterior surface of the optical waveguide, enabling proper viewing of the surrounding environment whether or not there is displayed content. The see-through correction lens may be a prescription lens customized to the user's corrective eyeglass prescription. The see-through correction lens may be polarized, and may attach to at least one of the optical waveguide and a frame of the eyepiece; a polarized correction lens blocks oppositely polarized light reflected from the user's eye. The see-through correction lens may also protect the optical waveguide, and may comprise at least one of an ANSI-certified ballistic material and a polycarbonate material.

One embodiment of an interactive head-mounted eyepiece comprises an eyepiece worn by a user and an optical assembly mounted on the eyepiece, through which the user views a surrounding environment and displayed content. The eyepiece includes an integrated processor that handles the content and an integrated image source that introduces it to the optical assembly. An electrically adjustable lens integrated with the optical assembly adjusts the focus of the displayed content.

One embodiment concerns an interactive head-mounted eyepiece comprising an eyepiece worn by a user, an optical assembly mounted on the eyepiece through which the user views a surrounding environment and displayed content, and a corrective element that corrects the user's view. The interactive head-mounted eyepiece also includes an integrated processor for handling content for display to the user, and an integrated liquid lens that can be adjusted electrically to adjust the focus of the displayed content.

An interactive head-mounted eyepiece worn by a user is another embodiment. The eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and an integrated processor that handles the content. The interactive head-mounted eyepiece also includes an integrated image source that introduces the content to the optical assembly, an electrically adjustable liquid lens that adjusts the focus of the displayed content for the user, and at least one sensor mounted on the eyepiece. An output of the at least one sensor is used to stabilize the displayed content using at least one of optical stabilization and image stabilization.

One embodiment is a method of stabilizing images. The method involves providing an interactive head-mounted eyepiece with an optical assembly, through which a user views the surrounding environment and displayed content, and a camera. The camera images the surrounding environment to capture an object within it. The method includes displaying the content at a fixed location relative to the user's view of the imaged object, sensing vibration and movement of the eyepiece, and stabilizing the displayed content with respect to the user's view via at least one digital technique.
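
One digital technique fitting this description is to track the anchor object's apparent position between frames and shift the displayed content to match; a sketch under those assumptions (coordinates in display pixels, object tracking done elsewhere):

```python
def stabilized_position(anchor_now, anchor_ref, content_ref):
    """Shift content so it stays registered with the tracked object.

    anchor_ref / content_ref: object and content positions at calibration.
    anchor_now: the object's current position as imaged by the camera.
    """
    dx = anchor_now[0] - anchor_ref[0]    # apparent motion from eyepiece shake
    dy = anchor_now[1] - anchor_ref[1]
    return (content_ref[0] + dx, content_ref[1] + dy)
```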

A method of stabilizing images is another embodiment. The method involves providing an interactive head-mounted eyepiece with an optical assembly, a camera, and a processor that handles content and projects it to the optical assembly. The camera images the surrounding environment to capture an object within it. The method includes displaying the content at a fixed location relative to the user's view of the imaged object, sensing vibration and movement of the eyepiece, and stabilizing the displayed content with respect to the user's view of the surrounding environment via at least one digital technique.

One embodiment includes a method of stabilizing images. The method involves providing an interactive head-mounted eyepiece through which a user views the surrounding environment, with an optical assembly that includes an integrated processor for handling content and an integrated image source for introducing the content to the optical assembly. A camera images the surrounding environment to capture an image of an object within it. The method also includes displaying the content, via the optical assembly, at a fixed location relative to the user's view of the imaged object.

An interactive head-mounted eyepiece is another embodiment. It includes an eyepiece worn by the user, an optical assembly mounted on the eyepiece through which the user views the surrounding environment and displayed content, and a corrective element mounted on the eyepiece that corrects the user's view of the environment. The interactive head-mounted eyepiece also includes an integrated processor for handling content for display, an integrated image source for introducing the content to the optical assembly, and at least one sensor mounted on the eyepiece or a camera. An output of the at least one sensor is used to stabilize the displayed content of the optical assembly using at least one digital technique.

An interactive head-mounted eyepiece is one embodiment. It includes an eyepiece worn by the user and an optical assembly mounted on the eyepiece, through which the user views the surrounding environment and displayed content. An integrated processor in the eyepiece handles content for display to the user, an integrated image source introduces the content to the optical assembly, and at least one sensor is mounted on the interactive head-mounted eyepiece. An output of the at least one sensor is used to stabilize the displayed content of the optical assembly using at least one of optical stabilization and image stabilization.

An interactive head-mounted eyepiece is another embodiment. It includes an eyepiece worn by the user, an optical assembly mounted on the eyepiece through which the user views the surrounding environment and displayed content, and an integrated processor for handling content for display. The interactive head-mounted eyepiece also includes an integrated image source for introducing the content to the optical assembly, an electro-optic lens for stabilizing the content for display to the user, and at least one sensor mounted on the eyepiece.

The disclosed aspects include an interactive head-mounted eyepiece worn by a user, comprising an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling the content, and an integrated image source for introducing the content to the optical assembly.

The eyepiece may also include a control device worn on the user's hand, with at least one control element actuated by a digit of the user, which provides a command instruction from the actuation to the processor. The command instruction may be directed at manipulating the content displayed to the user.

An eyepiece could also include a hand motion sensor device that is worn on the hand of the user and provides control commands to the processor.

A bi-directional optical assembly may be included in the eyepiece, through which the user views the surrounding environment and displayed content. The processor handles content for display and sensor information, and uses information from the sensor together with the displayed content to determine the user's line of sight relative to the projected image.

The eyepiece may transmit line-of-sight information to the processor as command instructions.

A hand motion sensing device may be included in the eyepiece to track hand movements within a field of view and provide control instructions to the eyepiece.

One aspect of social networking involves contacting a social networking website with the eyepiece, requesting information about other members of the website using the interactive head-mounted eyepiece, and searching for nearby members using the interactive head-mounted eyepiece.

One aspect of social networking involves contacting a social networking website with the eyepiece and requesting information about other members using the interactive head-mounted eyepiece. A location of the user of the head-mounted interactive eyepiece is signaled, and access to information about the user is then allowed.

One aspect of social networking involves contacting a social networking website with the eyepiece and requesting information about its members using the interactive head-mounted eyepiece.

One aspect of gaming involves contacting an online gaming site with the eyepiece and initiating or joining a game using the interactive head-mounted eyepiece. The user views the game through the interactive head-mounted eyepiece and plays the game by manipulating at least one body-mounted control device using the interactive head-mounted eyepiece.

One aspect of gaming involves contacting an online gaming site with the eyepiece, joining or initiating a game with multiple members using the interactive head-mounted eyepiece, viewing the game content with the optical assembly, and playing the game by manipulating at least one sensor that detects motion.

One aspect of a method of gaming involves contacting an online gaming site with the eyepiece, contacting at least one additional player for a game using the interactive head-mounted eyepiece, and initiating the game using the interactive head-mounted eyepiece. The game is viewed with the optical assembly of the interactive head-mounted eyepiece and played by touching at least one control using the interactive head-mounted eyepiece.

One aspect of augmented vision is a method that includes providing an interactive head-mounted eyepiece with an optical assembly through which the user views the surrounding environment, and scanning the environment with a black silicon short-wave infrared (SWIR) image sensor. The user controls the SWIR sensor through movements, gestures, or commands. The sensor sends at least one visual image to a processor of the eyepiece, and the user views the at least one visual image using the optical assembly.

One aspect of augmented vision is a method that includes providing an interactive head-mounted eyepiece with an optical assembly and a camera. The user views the surrounding environment through the eyepiece and sends information from the camera to the processor.

One aspect of augmented vision is a method that includes providing an interactive head-mounted eyepiece with an optical assembly, through which the user views the surrounding environment and displayed content, and an integrated processor for handling the content for display.

One aspect of the method includes contacting an accessible database using an interactive head-mounted eyepiece through which the user views displayed content and the surrounding environment, requesting information from the accessible database, and viewing the information from the accessible database using the interactive head-mounted eyepiece, without the user touching the head-mounted device.

One aspect of the method involves contacting an accessible database using the eyepiece, requesting information using the eyepiece, displaying the information using an optical facility, and manipulating the information using the processor. The steps of requesting and manipulating the information are accomplished without touching the interactive head-mounted eyepiece.

One aspect of the method involves contacting an accessible database using the eyepiece, with the user requesting information from the accessible database using the interactive head-mounted eyepiece.

One aspect of social networking involves providing the eyepiece, scanning facial features of a nearby person with an optical sensor of the head-mounted eyepiece, extracting a facial profile of the person, contacting a social networking website using the communications facility of the interactive head-mounted eyepiece, and searching a database of the social networking site for a match to the facial profile.

One aspect of social networking involves providing the eyepiece, scanning facial features of a nearby person with an optical sensor of the head-mounted eyepiece, extracting a facial profile of the person, contacting a database using the communications facility of the head-mounted eyepiece, and searching the database for a person matching the facial profile.

One aspect of social networking involves contacting a social networking website with the eyepiece and requesting information about nearby members using the interactive head-mounted eyepiece. Facial features of a nearby person identified as a member are scanned with an optical sensor of the eyepiece, a facial profile of the person is extracted, and at least one additional database is searched for information about the person.

One aspect of augmented vision involves providing the eyepiece and controlling a camera through movements, gestures, or commands of the user. Information from the camera is sent to a processor of the interactive head-mounted eyepiece, and the user views visual images through the optical assembly. The visual images from the camera and optical assembly are an improvement for the user in at least one of clarity, brightness, magnification, and focus.

Another aspect of augmented vision involves providing the eyepiece, controlling a camera through movements of the user's body, sending information from the camera to the processor, and viewing visual images using the optical assembly of the interactive head-mounted eyepiece. The visual images from the camera and optical assembly are an improvement for the user in at least one of clarity, brightness, magnification, and focus.

One aspect of augmented vision involves providing the eyepiece and controlling a camera through movements of the user. Information is sent from the camera to the integrated processor of the interactive head-mounted eyepiece, and an image enhancement technique is applied using computer software and the integrated processor. The user then views visual images through the optical assembly of the interactive head-mounted eyepiece; the visual images from the camera and optical assembly are an improvement for the user in at least one of clarity, brightness, magnification, and focus.

One aspect of facial recognition involves capturing an image of a subject with the eyepiece, converting the image to biometric data, comparing the biometric data to a database of previously collected biometric data, identifying biometric data that matches the previously collected biometric data, and reporting the identification as displayed content.
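
A sketch of the matching step, assuming the captured face image has already been reduced to a fixed-length feature vector (the threshold and vector form are assumptions):

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.6):
    """database: {identity: feature_vector}. Return best match under threshold."""
    best_name, best_dist = None, float("inf")
    for name, vector in database.items():
        d = euclidean(probe, vector)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```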

Another aspect of the system comprises the eyepiece and a face detection facility in association with the integrated processor facility. The face detection facility captures images of faces in the surrounding environment, matches them to stored images in a database, and provides a visual indication of a match, where the visual indication corresponds to the current position of the imaged face in the surrounding environment as part of the projected content. An integrated vibratory actuator in the eyepiece provides a vibration output to alert the user of the match.

One aspect of a method of augmenting vision involves collecting photons with a short-wave infrared sensor mounted on the eyepiece, converting the collected short-wave photons to electrical signals, relaying the signals to the eyepiece for display, collecting biometric data using the sensor, and transferring the biometric data to a database.

Another aspect of object recognition is to capture an image of an object using the eyepiece and analyze it to determine if it has been previously photographed and analyzed.

An eyepiece, according to an aspect of the disclosure, includes a frame adapted to hold a lens and to position an image source facility above the lens. The image source facility includes an LED, a planar illumination facility, and a reflective display. The planar illumination facility is adapted to convert the light beam received from the LED at one side into a top-emitting planar light source, and is positioned to uniformly illuminate the reflective display. The planar illumination facility is further adapted to be substantially transmissive, so that image light reflected from the reflective display passes through the planar illumination facility to a beam splitter. The beam splitter is positioned to receive the image light from the reflective display and to reflect a portion of it onto a mirrored surface. The mirrored surface is positioned and shaped to reflect the image light into an eye of the wearer of the eyepiece, forming an image within the wearer's field of view; the mirrored surface may be partially reflective within the area of image reflection. The reflective display may be a liquid crystal display, such as a liquid crystal on silicon (LCoS), cholesteric liquid crystal, guest-host liquid crystal, polymer dispersed liquid crystal, or phase retardation liquid crystal display, or a bistable display, such as an electrophoretic, electrofluidic, electrowetting, or electrokinetic display, or any combination thereof. The planar illumination facility may have a thickness of approximately 0.5 mm, and a cover glass may be included as a protective cover over the reflective display.

The planar illumination facility may include a wedge-shaped optic that directs light from the LED onto the reflective display; image light from the reflective display is then reflected back through the wedge-shaped optic in a direction toward the polarizing beam splitter. The planar illumination facility may also include an image direction correction optic to further redirect the image light toward the beam splitter.

The planar illumination facility may include an optic whose lower surface has imperfections that redirect a portion of the light from the LED upward to illuminate the reflective display. Image light reflected from the reflective display passes back into the optic and through its lower surface in a direction toward the polarizing beam splitter. A correction optic may be included in the planar illumination facility to correct image dispersion caused by the imperfections.

The planar illumination facility may include a multi-layered optic in which each layer is angled to reflect a portion of the light from the LED upward to illuminate the reflective display. Image light from the reflective display is projected back to the multi-layered optic and passes through it in a direction toward the polarizing beam splitter. A diffuser may be included in the planar illumination facility to increase the cone angle of the image light as it passes through the facility to the beam splitter.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to handle content for display and an integrated image source to introduce the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece may also provide a user interface based on external device type. An external device may be connected to the eyepiece via a communications facility, and an integrated memory facility in the eyepiece may store user interfaces specific to external device types. When the external device is connected, the optical assembly displays the user interface specific to that external device type.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to display content and an image source to introduce the content to an optical system. The user can view the surrounding environment and displayed content through the eyepiece, which has a control interface that is based on an external device type. An external device may be connected to the eyepiece by a communications facility. The integrated memory facility of an eyepiece may also store control schemes specific to that external device type. When the external device is connected, an eyepiece can access a control scheme specific to that external device type.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to display content and an image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece provides a control interface and a user interface based on the type of connected external device. An external device may be connected to the eyepiece by a communications facility, and a memory facility in the eyepiece may store control schemes and user interfaces based on external device types. When the external device is connected, the optical assembly presents the user interface specific to the external device type, and the eyepiece employs the control scheme specific to that type. For example, the external device could be an audio system, the user interface could be an audio controller, and the control scheme could be a head nod.
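
The memory facility described here amounts to a lookup table keyed by device type; a minimal sketch with assumed profile names and eyepiece methods:

```python
# Hypothetical device-type profiles stored in the eyepiece's memory facility.
DEVICE_PROFILES = {
    "audio_system": {"ui": "audio_controls", "control": "head_nod"},
    "drone":        {"ui": "flight_hud",     "control": "body_worn_sensor"},
}

def on_device_connected(device_type, eyepiece):
    profile = DEVICE_PROFILES.get(device_type)
    if profile is not None:
        eyepiece.show_interface(profile["ui"])       # device-specific UI
        eyepiece.enable_control(profile["control"])  # device-specific scheme
```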

An interactive head-mounted eyepiece can include an integrated processor to handle content and an integrated image source to introduce the content to an optical assembly. The eyepiece may also provide sensor-based command and control of external devices, with feedback from the external device to the eyepiece. An external device can be connected to the eyepiece via a communications facility, and a sensor may detect a condition. When the condition is detected, the eyepiece may present a user interface for the command and control of the external device, with feedback from the external device also provided, and the sensor's signal may be presented as displayed content.

An interactive head-mounted eyepiece can include an integrated processor to handle content for display and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece provides user-action-based command and control of external devices. An external device can be connected to the eyepiece via a communications facility, and a user action capture device may detect the user's input and present a user interface for the command and control of the external device. In one embodiment, the user action capture device could be a body-worn sensor device and the external device a drone.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to handle content for display and an image source for introducing the content to an optical assembly. The eyepiece can also provide predictive control of an external device based on event input. An integrated memory facility may record contextual information, such as information about an activity, communication, or event monitored by the eyepiece, and the contextual information may include a location indicating where the activity, communication, or event was recorded. A facility may analyze the contextual information and project a pattern of usage. An external device can be connected to the eyepiece via a communications facility, and when the pattern of usage is detected, the eyepiece may command and control the external device, presenting a command-and-control interface in the eyepiece display.
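
Projecting a usage pattern from the recorded contextual information could be as simple as counting recurrences of an action at a similar time and place; a sketch under those assumptions:

```python
from collections import Counter

def predict_action(history, location, hour, min_count=3):
    """history: (location, hour, action) tuples from the memory facility."""
    counts = Counter(
        action for loc, h, action in history
        if loc == location and abs(h - hour) <= 1   # same place, similar time
    )
    if counts:
        action, n = counts.most_common(1)[0]
        if n >= min_count:
            return action        # e.g. "power_on_audio_system" (illustrative)
    return None
```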

An interactive head-mounted eyepiece can include an integrated processor to display content and an image source to introduce the content to an optical assembly. The eyepiece can also provide event input-based and user-action-based control of eyepiece applications. A user action capture device of the eyepiece may detect user input, and a command-and-control interface for commanding and controlling the eyepiece may be presented in the eyepiece display, accepting input from the user action capture device.

An interactive head-mounted eyepiece can include an integrated processor to handle content and an image source for introducing the content to an optical assembly. The eyepiece may also provide event and user action control of external applications. An external device can be connected to the eyepiece via a communications facility. A user action capture device may detect an input from the user and enable a command-and-control scheme for commanding and controlling an external application; the scheme may use actions captured from the user as input to the external application.

An interactive head-mounted eyepiece with an integrated processor may be used to display content, and an integrated image source may introduce the content to an optical assembly. In some embodiments, the eyepiece may provide user control of, and feedback from, internal and external applications. An external device can be connected to the eyepiece via a communications facility, and a user action capture device may detect an input event. The eyepiece may then present a command-and-control interface that allows for the command and control of both an internal and an external application.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to display content and an image source to introduce the content to the optical assembly. The eyepiece may also provide sensor- and user-action-based control of external devices with feedback. An external device may be connected to the eyepiece via a communications facility, a sensor may detect a condition, and a user action capture device may detect user actions as input. A command-and-control interface may then be presented, providing feedback from the external device.

An interactive head-mounted eyepiece with an integrated processor may be used to display content, and an integrated image source may introduce the content to an optical assembly. In some embodiments, the eyepiece may also provide sensor- and user-action-based control of eyepiece applications with feedback. A sensor can detect a physical quantity and a user action capture device may detect user input; an eyepiece application may then be controlled through a command-and-control interface that accepts both as input and provides feedback from the eyepiece application.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to handle content for display and an image source for introducing the content to an optical assembly. The eyepiece may also provide event-, sensor-, and user-action-based control of applications on external devices, with feedback. A sensor can detect a condition and a user action capture device can detect a user's input. An external device may be connected to the eyepiece by a communications facility, and an internal eyepiece application may detect an event. When the event is detected, the eyepiece may present a command-and-control interface to command and control the operation of an external application; this interface can accept input from at least one of the sensor and the user action capture device, and may provide feedback from the external application.

An interactive head-mounted eyepiece with an integrated processor may be used to display content, and an integrated image source may introduce the content to an optical assembly through which the user views the surrounding environment and the displayed content. In some embodiments, the eyepiece may have a state-triggered interaction with an advertising facility. An object detector might detect activity as input, and a head-mounted camera or eye-gaze detection system may detect eye movement as input. A navigation system controller may connect to the eyepiece, and an e-commerce application may detect an event. The eyepiece may then display a 3D navigation interface for the command and control of a bullseye or target tracking system. The 3D navigation interface can accept input from at least one of the object detector, the head-mounted camera, and the eye-gaze detection system, and feedback may be provided by the advertising facility within the eyepiece.

An interactive head-mounted eyepiece can include an integrated processor to handle content and an image source for introducing the content to an optical assembly. In some embodiments, the eyepiece can also include event and user action capture device control of external applications. An external payment system may be connected to the eyepiece, an inertial motion tracking device may detect finger movements as input, and an email application may detect the receipt of an email as an event. A navigable list of bills may then be displayed, through which the user can send information to the external payment system for payment; the navigable list may accept the finger movements captured by the inertial motion tracking device as input.

An interactive head-mounted eyepiece can include an integrated processor to handle content for display and an image source to introduce the content to an optical assembly. The eyepiece may further provide event-, sensor-, and user-action-based direct control of external devices with feedback. A sensor of the eyepiece may detect a condition, a user action capture device may detect input from the user, and an external device may be connected to the eyepiece. When the condition is detected, the eyepiece may present a command-and-control interface for commanding and controlling the external device; the interface can accept input from at least one of the user action capture device and the sensor, and may present feedback from the external device in the eyepiece.

An interactive head-mounted eyepiece can include an integrated processor to handle content and an integrated image source for introducing the content to an optical assembly. The eyepiece may also include sensor-input-triggered user action capture device control: a user action capture device can identify an event and present a command-and-control interface based on the event, and the interface may accept input from the user.

An interactive head-mounted eyepiece with an integrated processor may be used to display content, with an integrated image source introducing the content to an optical assembly through which the user views the surrounding environment and the displayed content. In some embodiments, the eyepiece may also include sensor-triggered user movement control: when the eyepiece recognizes an event, it may enable the capture of user movements as input.

An interactive head-mounted eyepiece can include an integrated processor to handle content and an integrated image source for introducing the content to an optical assembly. The eyepiece may also include an event- and sensor-triggered command and control facility. At least one sensor can detect an event, a physical quantity, or the like, and when the sensor provides input, an interface may be presented for command and control.

An embodiment of an interactive head-mounted eyepiece includes an integrated processor to handle content and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece may also include sensor-triggered control of eyepiece applications. An input sensor can detect an event or a physical quantity, and an internal application may detect a data feed from a network source. A command scheme for controlling an eyepiece application may then be made available when the internal application detects the data feed and the sensor provides input.

An interactive head-mounted eyepiece can include an integrated processor to handle content and an image source to introduce the content to an optical assembly. The eyepiece may also include event- and sensor-triggered interfaces to external devices. An external device can be connected to the eyepiece via a communications facility, and a sensor may detect an event or a physical quantity and provide input. The eyepiece may then present a command-and-control interface through which the external device is controlled and managed by the user.

An interactive head-mounted eyepiece with an integrated processor may be used to display content, with an integrated image source introducing the content to an optical assembly. The integrated processor may also allow the user to view the surrounding environment and the displayed content. Event-triggered user action control may also be available: when an input device detects a hand gesture, the eyepiece may enable the capture of hand gestures as control input.

These and other systems, methods, and objects will be apparent to those skilled in the art from the following description and the drawings.

All documents referenced herein are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like.

This disclosure concerns eyepiece electro-optics. The eyepiece may include projection optics suitable for projecting an image onto a see-through or translucent lens, enabling the wearer to view the surrounding environment as well as the projected image. The projection optics, also known as a projector, may include an RGB LED module that uses field sequential color, in which a single full-color image is broken down into color fields based on the primary colors red, green, and blue, and each color field is imaged individually by an LCoS optical display 210. As each color field is imaged by the optical display 210, the corresponding LED color is turned on, and displaying these color fields in rapid sequence produces a full-color image. With field sequential color illumination, the projected image can be adjusted for any chromatic aberrations, such as by shifting the red image relative to the blue and/or green image, and so on. After being reflected, the image may be projected into a two-surface freeform waveguide, where the image light engages in total internal reflection (TIR) until it reaches the active viewing area of the lens. A processor, which may include a memory and an operating system, may control the LED light source and the optical display. The projector may also be coupled to or include a display coupling lens, a condenser lens, a polarizing beam splitter, and a field lens.
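
The chromatic-aberration trim that field sequential color permits can be pictured as a per-field pixel offset applied before each field is flashed; a sketch with assumed calibration values and a hypothetical panel driver:

```python
# Assumed calibration: per-color-field pixel offsets that bring the red,
# green, and blue images into registration at the eye.
FIELD_OFFSETS = {"red": (1, 0), "green": (0, 0), "blue": (-1, 0)}

def render_fields(frame, panel):
    for channel in ("red", "green", "blue"):
        dx, dy = FIELD_OFFSETS[channel]
        panel.load_field(frame[channel], offset=(dx, dy))  # shift this field
        panel.flash(channel)            # pulse the matching LED for one field
```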

Referring to FIG. 1, an illustrative embodiment of the augmented reality eyepiece 100 is depicted. Other embodiments of the eyepiece 100 may not include all of the elements depicted in FIG. 1, while further embodiments may include additional or different elements. In an embodiment, the optical elements may be embedded in the arm portions of the frame 102 of the eyepiece. Images may be projected with a projector 108 onto at least one lens 104 disposed in an opening of the frame 102. One or more projectors, such as a nanoprojector, picoprojector, microprojector, femtoprojector, LASER-based projector, or holographic projector, may be disposed in an arm portion of the eyepiece frame 102. In certain embodiments, both lenses 104 are see-through or translucent; in other embodiments, only one lens 104 is translucent while the other is opaque or missing. In certain embodiments, a single projector 108 may be included in the eyepiece 100.

As shown in FIG. 1, the eyepiece 100 may also include at least one articulating earbud 120, a radio transceiver, and a heat sink 114 to absorb heat from the LED light engine and keep it cool. One or more TI OMAP4 (open multimedia applications processors) 112 and a flex cable with RF antenna 110 are also included, all of which are described in further detail herein.

Referring to FIG. 2, the projector 200 may be an RGB projector, according to an embodiment. The projector may include a housing, a heatsink 204, and an RGB LED engine or module 206. The RGB LED engine 206 may include LEDs, dichroics, concentrators, and the like. A digital signal processor (DSP) (not shown) may convert the images or video stream into control signals, such as voltage drops/current modifications, pulse width modulation (PWM) signals, and the like, to control the intensity, duration, and mixing of the LED light. For example, the DSP may control the duty cycle of each PWM signal to control the average current flowing through each LED, generating a variety of colors. A still image processor of the eyepiece may be capable of image enhancement, noise filtering, video stabilization, and face detection. An audio back-end processor of the eyepiece may employ buffering, SRC (sample rate conversion), equalization, and the like.
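
The duty-cycle relationship is easy to make concrete; a sketch of mapping a target color to per-LED PWM duty cycles (the 0.9 ceiling is an assumption, not a disclosed parameter):

```python
def rgb_to_duty_cycles(r, g, b, max_duty=0.9):
    """Map 8-bit color components to per-LED PWM duty cycles (0.0-1.0).

    Average LED current, and hence the brightness of each primary,
    scales with duty cycle, so the mix of duty cycles sets the color.
    """
    return {
        "red":   (r / 255.0) * max_duty,
        "green": (g / 255.0) * max_duty,
        "blue":  (b / 255.0) * max_duty,
    }

# e.g. rgb_to_duty_cycles(255, 128, 0) -> red 0.9, green ~0.45, blue 0.0
```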

The projector 200 may include an optical display 210, such as an LCoS display or an LCD display, and a number of components as shown. In embodiments, the projector 200 may be built around a single-panel LCoS display 210; however, a three-panel design may also be possible. In the single-panel embodiment, the display 210 is illuminated sequentially with red, green, and blue light (also known as field sequential color). In other embodiments, the projector 200 may make use of alternative optical display technologies, such as a backlit liquid crystal display (LCD), a front-lit LCD, a transflective LCD, a transparent liquid crystal micro-display, a quantum dot display, a ferroelectric LCoS (FLCoS), or liquid crystal technology mounted on sapphire.

Any power source can be used to power the eyepiece, including solar power, battery power, and line power. The power source may be embedded in the frame 102 or disposed external to the eyepiece 100 and in electrical communication with the powered elements of the eyepiece 100. For example, a solar energy collector may be attached to the frame 102, to a belt clip, and the like. Battery charging may occur using a wall charger, a car charger, a belt clip, and the like.

The projector 200 may include a condenser lens 214, along with a hollow tapered light tunnel 220 and a diffuser 212, providing vibration-free mounting with the LED light engine 206. The hollow tunnel 220 acts to homogenize the rapidly varying light from the RGB LED light engine and, in one embodiment, may include a silvered coating. The diffuser 212 further homogenizes and mixes the light before it is directed to the condenser lens 214. After leaving the condenser lens 214, the light enters the polarizing beam splitter (PBS) 218, where the LED light is propagated and split into polarization components before being refracted to a field lens 216 and on to the LCoS display. The LCoS display provides the image for the microprojector. The image is reflected from the LCoS display back through the polarizing beam splitter and then reflected ninety degrees, exiting the microprojector 200 at about its middle. The light is then directed to the coupling lens 504, described below.

Summary for AR glasses that allow for event and user action control of applications external to the device

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user’s view of the environment. An integrated processor handles content and introduces it to the optical assembly. An integrated camera facility images an external visual cue, and the integrated processor interprets the cue and commands the display of content associated with it. The visual cue could be a sign in the surrounding environment, in which case the projected content is associated with an advertisement. The advertisement may be personalized based on the preferences of the user. The projected content may include a projected virtual keyboard, and the visual cue could be a hand gesture. The hand gesture could be a thumb-and-index-finger gesture of one hand, with the virtual keyboard projected onto that hand so that the user can type on the virtual keyboard with the second hand. The hand gesture could also be a combination of thumb and index finger gestures from both of the user’s hands, in which case the virtual keyboard is projected between the user’s hands.

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user’s view of the environment. An integrated processor handles content and introduces it to the optical assembly. An integrated camera facility images a gesture, and the integrated processor interprets the gesture as a control instruction. The control instruction can be used to manipulate the displayed content or to communicate a command to an external device.
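
Purely as an illustration of how a recognized gesture might be dispatched either to content manipulation or to an external device, here is a minimal Python sketch; the gesture labels, the gesture table, and both stub functions are hypothetical and not part of the disclosure.

```python
# Hypothetical sketch: mapping recognized gestures to control instructions.
# Gesture labels, handler names, and the external-device stub are assumptions.

def send_to_external_device(command: str) -> None:
    """Stub standing in for the eyepiece's communications facility."""
    print(f"-> external device: {command}")

def manipulate_content(action: str, state: dict) -> None:
    """Stub content manipulation on the integrated processor."""
    if action == "zoom_in":
        state["zoom"] *= 1.25
    elif action == "zoom_out":
        state["zoom"] /= 1.25
    elif action == "dismiss":
        state["visible"] = False

# Which recognized gestures act on displayed content vs. an external device.
GESTURE_TABLE = {
    "pinch_out": ("content", "zoom_in"),
    "pinch_in": ("content", "zoom_out"),
    "swipe_left": ("content", "dismiss"),
    "thumbs_up": ("external", "play"),
    "fist": ("external", "stop"),
}

def dispatch_gesture(gesture: str, state: dict) -> None:
    target, action = GESTURE_TABLE.get(gesture, (None, None))
    if target == "content":
        manipulate_content(action, state)
    elif target == "external":
        send_to_external_device(action)

display_state = {"zoom": 1.0, "visible": True}
for g in ["pinch_out", "thumbs_up", "swipe_left"]:
    dispatch_gesture(g, display_state)
print(display_state)
```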

A system may include an interactive head-mounted eyepiece through which a user views the surrounding environment and displayed content. The optical assembly includes a corrective element to correct the user’s view, an integrated processor to handle content, and an integrated image source to introduce the content to the optical assembly; a tactile control interface is mounted on the eyepiece. The interface accepts control inputs from the user through at least one of the user touching the interface and the user being in proximity to the interface.

A system may include an interactive head-mounted eyepiece worn by the user. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content. An integrated processor handles the content, and control instructions are provided to the processor based on the detection of a predefined head motion characteristic.

The head motion characteristic could be a nod of the user’s head, an overt motion dissimilar to ordinary head movements, such as a jerking motion of the head. The control instructions can be used to manipulate the displayed content or to communicate a command to an external device.
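
A deliberate, jerky nod can be separated from ordinary head motion by looking for a sharp down-then-up spike in a gyroscope’s pitch rate. The sketch below is a minimal illustration of that idea; the thresholds, window size, and sample values are illustrative assumptions, not parameters from the disclosure.

```python
# Hypothetical sketch: distinguishing an overt "control nod" from ordinary
# head motion using gyroscope pitch-rate samples (rad/s).

NOD_RATE_THRESHOLD = 2.0   # ordinary head motion stays well below this
NOD_WINDOW = 10            # samples in which the down-then-up pair must occur

def detect_overt_nod(pitch_rates):
    """Return True if a sharp downward spike is followed quickly by an
    upward spike -- the jerky signature of a deliberate nod."""
    down_index = None
    for i, rate in enumerate(pitch_rates):
        if rate < -NOD_RATE_THRESHOLD:
            down_index = i
        elif rate > NOD_RATE_THRESHOLD and down_index is not None:
            if i - down_index <= NOD_WINDOW:
                return True
    return False

ordinary = [0.1, 0.3, 0.2, -0.2, 0.1, 0.0]   # slow, smooth motion
overt = [0.1, -2.6, -0.4, 0.3, 2.8, 0.2]     # sharp down, then sharp up
print(detect_overt_nod(ordinary))  # False
print(detect_overt_nod(overt))     # True
```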

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content. An integrated processor handles content and introduces it to the optical assembly. In some embodiments, the optical assembly includes an electrochromic layer that adjusts a display characteristic based on the requirements of the displayed content and the surrounding environment. The display characteristic can be brightness, contrast, or the like, and the adjustment can be applied to the area of the optical assembly where content is displayed.
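
As a rough sketch of how such a display characteristic might be driven from the environment, the following fragment maps an ambient-light reading to a brightness setting for the content region; the lux range, the mapping, and the contrast boost are assumptions, not values from the disclosure.

```python
# Hypothetical sketch: adjusting brightness of the displayed-content region
# based on ambient light, in the spirit of the electrochromic layer above.

def region_brightness(ambient_lux: float, needs_high_contrast: bool) -> float:
    """Map ambient light (lux) to a display brightness in [0.0, 1.0]."""
    # Normalize a typical indoor-to-outdoor range (10 lux .. 10,000 lux).
    base = min(max((ambient_lux - 10.0) / (10_000.0 - 10.0), 0.0), 1.0)
    if needs_high_contrast:
        base = min(base + 0.2, 1.0)  # boost the content region when needed
    return base

for lux in (50, 1_000, 8_000):
    print(lux, round(region_brightness(lux, True), 2))
```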

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user views the surrounding environment and displayed content. An integrated image source may introduce the content to the optical assembly. The eyepiece can also include an adjustable wrap-around extendable arm, made of a shape-memory material, that secures the position of the eyepiece on the user’s head. An extendable arm can extend from an end of an eyepiece arm, and the end of the extendable wrap-around arm may be covered with silicone. The extendable arms can secure to one another or may independently grasp a portion of the head. The extendable arm can attach to a portion of the head-mounted eyepiece to secure it to the user’s head, and in certain embodiments can extend telescopically beyond the end of an eyepiece arm. In other embodiments, at least one wrap-around extendable arm can be detached from the head-mounted eyepiece; the extendable arm can also be added on to the head-mounted eyepiece.

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user views the surrounding environment and displayed content. An integrated image source may introduce the content to the optical assembly. The displayed content could include a local advertisement, where the location of the eyepiece is determined by an integrated location sensor, and the local advertisement could have relevance to that location. Other embodiments may include a capacitive sensor that detects whether the eyepiece is in contact with human skin; based on whether such contact is detected, the local advertisement may be sent to the user. Local advertisements might also be sent when the eyepiece is powered on.

In other embodiments, the local advertisement may be displayed as a banner advertisement or a two-dimensional graphic, and may be associated with a physical aspect of the surrounding environment. In yet another embodiment, the advertisement may be displayed as an augmented reality advertisement associated with a physical aspect of the environment; such an augmented reality advertisement can be two- or three-dimensional, can be animated, and can be associated with the user’s view of the environment. Local advertisements can also be displayed to the user based on the results of a web search conducted by the user. Furthermore, the content of the local advertisement can be determined based on the user’s personal information, which may be made available to a web application or an advertising facility; the web application, advertising facility, or eyepiece may use that information to filter the local advertising. Local advertisements may be cached on a server, where they can be accessed by at least one of the advertising facility, the web application, and the eyepiece, and then displayed to the user.
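
The location- and preference-based filtering described above might look something like the following sketch, where the ad records, search radius, and preference fields are all hypothetical; distance is computed with the standard haversine formula.

```python
# Hypothetical sketch: filtering cached local advertisements by eyepiece
# location (from the integrated location sensor) and user preferences.
import math

ADS = [
    {"text": "Coffee shop sale", "lat": 40.7130, "lon": -74.0060, "topic": "food"},
    {"text": "Guitar store demo", "lat": 40.7200, "lon": -74.0100, "topic": "music"},
    {"text": "Tire discount", "lat": 41.0000, "lon": -75.0000, "topic": "auto"},
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def local_ads(lat, lon, interests, radius_km=2.0):
    return [ad for ad in ADS
            if distance_km(lat, lon, ad["lat"], ad["lon"]) <= radius_km
            and ad["topic"] in interests]

print(local_ads(40.7128, -74.0060, {"food", "music"}))
```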

Another embodiment allows the user to request additional information about a local advertisement through an eye movement, body movement, or other gesture. A user can also ignore a local advertisement through an eye movement, body movement, or other gesture, or simply by not selecting the advertisement for interaction within a given period of time. The user may also choose not to allow local advertisements by selecting such an option on a graphical user interface; alternatively, such advertisements may be disabled via a control on the eyepiece.

One embodiment may also include an audio device. The displayed content could also include audio and local advertisements. An integrated location sensor may determine the location of an eyepiece. Local advertisements and audio may also be relevant to that location. A user might hear audio that corresponds with the local ads and displayed content.

One aspect of the interactive head-mounted eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, a corrective element that corrects the user’s view of the surrounding environment, and an optical waveguide with first and second surfaces enabling total internal reflection. An integrated processor may be included in the eyepiece for handling content for display to the user, along with an integrated image source for introducing the content to the optical assembly. In this aspect, displayed content may be introduced into the optical waveguide at an angle of internal incidence that does not result in total internal reflection. The eyepiece therefore includes a mirrored surface that reflects the displayed content towards the second surface of the optical waveguide. The mirrored surface enables either a total reflection of the light entering the optical waveguide or a partial reflection; the surface can be 100% mirrored or mirrored to a lower percentage, depending on the embodiment. In some embodiments, an air gap between the corrective element and the waveguide may cause a reflection of light that enters the waveguide at an angle of incidence not conducive to TIR.
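
For concreteness, the TIR condition the waveguide relies on follows from Snell’s law: light striking a waveguide surface at an internal angle greater than the critical angle is totally internally reflected, while shallower angles (the case the mirrored surface or air gap above addresses) are not. A minimal sketch, assuming illustrative refractive indices rather than values from the disclosure:

```python
# Total-internal-reflection condition: theta_c = arcsin(n_outside / n_guide).
import math

def critical_angle_deg(n_guide: float, n_outside: float) -> float:
    """Critical angle from Snell's law, in degrees."""
    return math.degrees(math.asin(n_outside / n_guide))

theta_c = critical_angle_deg(n_guide=1.5, n_outside=1.0)  # e.g. plastic to air
print(f"critical angle ~ {theta_c:.1f} degrees")

incidence = 55.0  # internal angle of incidence, degrees
print("TIR" if incidence > theta_c else "no TIR (mirrored surface needed)")
```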

One aspect of the interactive head-mounted eyepiece may include an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly includes a corrective element that corrects the user’s view of the environment and an integrated processor that handles content for the user. The eyepiece also includes an integrated image source that introduces content to the optical assembly from the side of the optical waveguide adjacent to an arm of the eyepiece, such that the displayed content aspect ratio is between approximately square and approximately rectangular, with the long axis approximately horizontal.

An interactive head-mounted eyepiece has an optical assembly through which a user views the surrounding environment and displayed content, a corrective element that corrects the user’s view of the environment, a freeform optical waveguide enabling internal reflections, and a coupling lens that directs an image from an LCoS display to the optical waveguide. The eyepiece includes an integrated processor for handling content for display and an integrated projector facility for projecting the content. The projector facility comprises a light source and the LCoS display, where light from the light source is emitted under control of the processor, traverses a polarizing beam splitter where it is polarized, and is then reflected off the LCoS display and into the optical waveguide.

Another aspect of the interactive head-mounted eyepiece is an optical assembly through which a user views the surrounding environment and displayed content. The optical assembly includes a corrective element that corrects the user’s view of the environment, an optical waveguide enabling internal reflections, and a coupling lens that directs an image from an optical display to the optical waveguide. The eyepiece includes an integrated processor for handling content and introducing it to the optical assembly, and an integrated image source comprising a light source and the optical display. The corrective element may be a see-through correction lens attached to the optical waveguide that enables proper viewing of the surrounding environment whether the image source is on or off. The freeform optical waveguide may include dual freeform surfaces that enable a curvature and a sizing of the waveguide, allowing it to be placed in the frame of the interactive head-mounted eyepiece. The light source may be an RGB LED module that emits light sequentially to form a color image that is reflected off the optical display. The eyepiece may further include a homogenizer through which light from the light source propagates to ensure that the beam is uniform, as well as a collimator that improves the resolution of the light entering the optical waveguide. Light from the light source may be emitted under control of the processor and traverse a polarizing beam splitter, where it is polarized before being reflected off the optical display and into the optical waveguide; a surface of the polarizing beam splitter reflects the color image from the optical display into the optical waveguide. The optical display could be an LCoS or an LCD display, and the image source may be a projector, such as a microprojector, nanoprojector, or picoprojector.

An apparatus for capturing biometric data is provided in an embodiment. The biometric data can be either visual or audio. The apparatus includes an optical assembly through which the user views the surrounding environment and displayed content, with a corrective element that corrects the user’s view of the environment. An integrated processor handles content for display to the user on the eyepiece, and an integrated image source introduces the content to the optical assembly. Biometric data capture is accomplished with an integrated optical sensor assembly, while audio data capture is accomplished with an integrated microphone array. An integrated communications facility transmits the captured data for remote processing, where a remote computing facility interprets and analyzes the captured biometric data, generates display content, and delivers the content to the eyepiece.

Another embodiment includes a camera mounted on an eyepiece to obtain biometric images of a person proximate the eyepiece.

Another embodiment provides a method for capturing biometric data. The method involves positioning an individual proximate to the eyepiece; the wearer of the eyepiece may move into a position that allows the capture of the biometric data. Once positioned, the eyepiece captures the biometric data and transmits it to a facility that stores the data in a biometric database. Remote computing facilities interpret the received data and generate display content, which is then delivered back to the user for display on the eyepiece.

Another embodiment provides a method for capturing audio biometric data. The method involves positioning an individual proximate to the eyepiece; the eyepiece can be adjusted so that the audio biometric data can be captured. Once the microphone array is positioned, it captures the audio biometric data and transmits it to a facility that stores the data in an audio biometric database. Remote computing facilities interpret the received data and generate display content, which is then delivered back to the user for display on the eyepiece.
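
The capture, transmit, and display round trip described in the last few paragraphs can be summarized in a short sketch; the payload format and both stubs are assumptions standing in for the microphone array, the communications facility, and the remote computing facility.

```python
# Hypothetical sketch of the biometric capture-and-transmit flow.
import json

def capture_audio_biometric() -> bytes:
    """Stub for the microphone-array capture step."""
    return b"\x00\x01\x02"  # placeholder samples

def transmit_to_remote_facility(payload: dict) -> dict:
    """Stub for the communications facility; a real device would send this
    to the remote computing facility and receive display content back."""
    print("transmitting", json.dumps({k: str(v)[:16] for k, v in payload.items()}))
    return {"display_content": "match: subject #42 (hypothetical)"}

samples = capture_audio_biometric()
response = transmit_to_remote_facility({"kind": "audio", "data": samples.hex()})
print("eyepiece displays:", response["display_content"])
```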

One embodiment of the eyepiece has a see-through correction lens attached to an exterior surface of the optical waveguide, enabling proper viewing of the surrounding environment whether or not there is displayed content. The see-through correction lens may be a prescription lens customized to the user’s corrective eyeglass prescription. It can be polarized, and may attach to at least one of the optical waveguide and a frame of the eyepiece, where the polarized correction lens blocks oppositely polarized light reflected from the user’s eye. The see-through correction lens also protects the optical waveguide, and may include at least one of a ballistic material and an ANSI-certified polycarbonate material.

One embodiment of the interactive head-mounted eyepiece comprises an eyepiece that can be worn by a user and an optical assembly mounted on the eyepiece, through which the user views the surrounding environment and displayed content. The eyepiece includes an integrated processor that handles content and an integrated image source that introduces the content to the optical assembly, as well as an electrically adjustable lens, integrated with the optical assembly, that adjusts the focus of the displayed content.

One embodiment concerns an interactive head-mounted eyepiece. It includes an eyepiece that can be worn by the user and an optical assembly mounted on the eyepiece, with a corrective element that corrects the user’s view. The eyepiece also contains an integrated processor to handle content for display to the user and an integrated, electrically adjustable liquid lens that adjusts the focus of the displayed content.

An interactive head-mounted eyepiece that can be worn by a user is another embodiment. The user views the surrounding environment and displayed content through the optical assembly, and the integrated processor handles the content for display. The eyepiece includes an integrated image source that introduces the content to the optical assembly, an electrically adjustable liquid lens that adjusts the focus of the displayed content for the user, and at least one sensor mounted on the eyepiece, where the output of the at least one sensor is used to stabilize the displayed content using optical stabilization or image stabilization.

One embodiment is a method of stabilizing images. The method involves providing an interactive head-mounted eyepiece with an optical assembly and a camera, through which the user views the surrounding environment and displayed content, and imaging the surrounding environment with the camera to capture an object within it. The method includes displaying the content at a fixed location relative to the user’s view of the imaged object, sensing vibration and movement of the eyepiece, and stabilizing the displayed content with respect to the user’s view via at least one digital technique.

A method of stabilizing images is another embodiment. The method involves providing an interactive head-mounted eyepiece with an optical assembly, a camera, and a processor that handles content and projects it to the optical assembly, and imaging the surrounding environment with the camera to capture an object. The method includes displaying the content at a fixed location relative to the user’s view of the imaged object, sensing vibration and movement of the eyepiece, and stabilizing the displayed content with respect to the user’s view of the surrounding environment via at least one digital technique.

One embodiment includes a method of stabilizing images. The method involves providing an interactive head-mounted eyepiece through which a user views the surrounding environment, where the optical assembly includes an integrated processor to handle content and an integrated image source to introduce the content to the optical assembly. A camera images the environment to capture an image of an object within it. The method also includes displaying the content, via the optical assembly, at a fixed location relative to the user’s view of the imaged object.
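
One simple digital-stabilization approach consistent with these paragraphs is to integrate the sensed angular rates into a head-motion estimate and shift the displayed content in the opposite direction, so it stays registered to the imaged object. The sketch below assumes an illustrative display scale factor and sensor sample period, neither of which comes from the disclosure.

```python
# Hypothetical sketch: digital stabilization by counter-shifting content.

PIXELS_PER_RADIAN = 800.0  # assumed display scale factor
DT = 1.0 / 60.0            # assumed sensor sample period (s)

def stabilized_offsets(yaw_rates, pitch_rates):
    """Yield (dx, dy) pixel offsets that cancel the sensed head motion."""
    yaw = pitch = 0.0
    for wy, wp in zip(yaw_rates, pitch_rates):
        yaw += wy * DT      # integrate angular rate -> angle
        pitch += wp * DT
        # Shift content opposite to the head motion.
        yield (-yaw * PIXELS_PER_RADIAN, -pitch * PIXELS_PER_RADIAN)

yaw_rates = [0.00, 0.05, 0.10, 0.05, 0.00]     # a small shake, rad/s
pitch_rates = [0.00, -0.02, -0.04, -0.02, 0.00]
for dx, dy in stabilized_offsets(yaw_rates, pitch_rates):
    print(f"shift content by ({dx:+.1f}, {dy:+.1f}) px")
```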

An interactive head-mounted eyepiece is another embodiment. It includes an eyepiece that can be worn by the user and an optical assembly mounted on the eyepiece, through which the user views the surrounding environment and displayed content, with a corrective element mounted on the eyepiece that corrects the user’s view of the environment. The eyepiece includes an integrated processor for handling content for display, an integrated image source for introducing the content to the optical assembly, and at least one sensor mounted on the camera or the eyepiece, where the output of the at least one sensor is used to stabilize the displayed content of the optical assembly using at least one digital technique.

An interactive head-mounted eyepiece is one embodiment. It includes an eyepiece that can be worn by the user and an optical assembly mounted on the eyepiece, through which the user views the surrounding environment and displayed content. A processor inside the eyepiece handles content for display to the user, and an integrated image source introduces the content to the optical assembly. At least one sensor is mounted on the eyepiece, and the output of the at least one sensor is used to stabilize the displayed content of the optical assembly using optical stabilization or image stabilization.

An interactive head-mounted eyepiece is another embodiment. It includes an eyepiece that can be worn by the user and an optical assembly mounted on the eyepiece, through which the user views the surrounding environment and displayed content, with an integrated processor for handling the content for display. The eyepiece also includes an integrated image source for introducing the content to the optical assembly, an electro-optic lens that stabilizes the content for display to the user, and at least one sensor mounted on the eyepiece.

The disclosed aspects include an interactive head-mounted eyepiece that a user wears, comprising an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling the content for display, and an integrated image source for introducing the content to the optical assembly.

The eyepiece could also include a control device worn on a hand of the user, with at least one control element actuated by a digit of the user’s hand, that provides a command instruction from the actuation to the processor. The command instruction could be directed at manipulating content for display to the user.

An eyepiece could also include a hand motion sensor device that is worn on the hand of the user and provides control commands to the processor.

A bi-directional optical assembly may be included in the eyepiece, through which the user views the surrounding environment and displayed content. The processor handles the content for display to the user as well as sensor information, and uses information from both the sensor and the displayed content to determine the user’s line of sight relative to the projected images.
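
As a minimal illustration of turning a sensed gaze point into a line-of-sight determination relative to the projected content, the sketch below hit-tests gaze samples against a layout of displayed elements; the layout rectangles and gaze coordinates are hypothetical.

```python
# Hypothetical sketch: resolving line of sight against displayed content.

CONTENT_LAYOUT = {
    "play_button": (100, 100, 160, 140),   # (x0, y0, x1, y1) in display px
    "volume_slider": (100, 160, 300, 180),
}

def hit_test(gaze_xy):
    """Return the displayed element the gaze point falls on, if any."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in CONTENT_LAYOUT.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

for sample in [(120, 115), (250, 170), (400, 400)]:
    print(sample, "->", hit_test(sample))
```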

The eyepiece may transmit the line-of-sight information to the processor as command instructions.

A hand motion sensing device may be included in the eyepiece to track hand gestures within a field of view of the device and provide control instructions to the eyepiece.

One aspect of a method of social networking includes contacting a social networking website using the eyepiece, requesting information about other members of the social networking website using the interactive head-mounted eyepiece, and searching for nearby members of the social networking website using the interactive head-mounted eyepiece.

One aspect of a method of social networking involves contacting a social networking website using the eyepiece, requesting information about other members of the website using the interactive head-mounted eyepiece, signaling the location of the user of the interactive head-mounted eyepiece, and allowing access to information about the user.

One aspect of a method of social networking involves contacting a social networking website using the eyepiece and requesting information about its members using the interactive head-mounted eyepiece.

One aspect of a method of gaming involves contacting an online gaming site using the eyepiece, initiating or joining a game of the online gaming site using the interactive head-mounted eyepiece, viewing the game through the optical assembly of the interactive head-mounted eyepiece, and playing the game by manipulating at least one body-mounted control device connected to the interactive head-mounted eyepiece.

One aspect of a method of gaming involves contacting an online gaming site using the eyepiece, joining or initiating a game of the online gaming site with multiple members using the interactive head-mounted eyepiece, viewing game content with the optical assembly, and playing the game using at least one sensor for detecting motion.

One aspect of a method of gaming is to contact an online gaming site using the eyepiece, contact at least one additional player for a game of the online gaming site using the interactive head-mounted eyepiece, initiate the game using the interactive head-mounted eyepiece, view the game with the optical assembly of the interactive head-mounted eyepiece, and play the game by manipulating at least one control using the interactive head-mounted eyepiece.

One aspect of a method of augmented vision includes providing an interactive head-mounted eyepiece with an optical assembly through which the user views the surrounding environment and displayed content, scanning the surrounding environment with a black silicon short wave infrared (SWIR) image sensor, controlling the SWIR image sensor through the user’s movements, gestures, or commands, sending at least one visual image from the sensor to the processor, and viewing the at least one visual image using the optical assembly.

One aspect of a method of augmented vision includes providing a head-mounted eyepiece with an optical assembly and a camera, through which the user views the surrounding environment, and sending information from the camera to the processor.

One aspect of a method of augmented vision includes providing an interactive head-mounted eyepiece with an optical assembly through which the user views the surrounding environment and displayed content, with an integrated processor for handling the content for display.

One aspect of a method includes contacting an accessible database using an interactive head-mounted eyepiece through which the user views the surrounding environment and displayed content, requesting information from the accessible database, and viewing the information from the accessible database using the interactive head-mounted eyepiece, where these steps are performed without the user touching the interactive head-mounted eyepiece.

One aspect of a method involves contacting an accessible database using the eyepiece, requesting information using the interactive head-mounted eyepiece, displaying the information using the optical facility, and manipulating the information using the integrated processor, where the steps of requesting and manipulating the information are performed without touching the interactive head-mounted eyepiece.

One aspect of a method is to contact an accessible database using the eyepiece, where the user requests information from the accessible website using the interactive head-mounted eyepiece.

One aspect of a method of social networking involves providing the eyepiece, scanning facial features of a nearby person with an optical sensor of the head-mounted eyepiece, extracting a facial profile of the person, contacting a social networking website using the communications facility of the interactive head-mounted eyepiece, and searching a database of the social networking site for a match to the facial profile.

One aspect of a method of social networking involves providing the eyepiece, scanning facial features of a nearby person with an optical sensor of the head-mounted eyepiece, extracting a facial profile of the person, contacting a database using the communications facility of the head-mounted eyepiece, and searching the database for a person matching the facial profile.

One aspect of a method of social networking involves contacting a social networking website using the eyepiece and requesting information about nearby members of the website using the interactive head-mounted eyepiece, where the eyepiece scans facial features of a nearby person identified as a member with an optical sensor, extracts a facial profile of the person, and searches at least one additional database for information about the person.

One aspect of a method of augmented vision includes providing the eyepiece, controlling the camera through movements, gestures, or commands of the user, sending information from the camera to the integrated processor of the interactive head-mounted eyepiece, and viewing visual images with the optical assembly, where the visual images from the camera are an improvement for the user in at least one of focus, brightness, clarity, and magnification.

Another aspect of a method of augmented vision includes providing the eyepiece, controlling the camera through movements of the user, sending information from the camera to the integrated processor, and viewing visual images using the optical assembly of the interactive head-mounted eyepiece, where the visual images from the camera and optical assembly are an improvement for the user in at least one of focus, brightness, clarity, and magnification.

One aspect of a method of augmented vision includes providing the eyepiece, controlling the camera through movements of the user, sending information from the camera to the integrated processor of the interactive head-mounted eyepiece, applying an image enhancement technique using computer software on the integrated processor, and viewing visual images with the optical assembly of the interactive head-mounted eyepiece, where the visual images from the camera and optical assembly are an improvement for the user in at least one of focus, brightness, clarity, and magnification.

One aspect of a method of facial recognition involves capturing an image of a subject using the eyepiece, converting the image to biometric data, comparing the biometric data to a database of previously collected biometric data, identifying biometric data that matches the previously collected biometric data, and reporting the identified match as displayed content.

Another aspect is a system comprising the eyepiece and a face detection facility in association with the integrated processor facility. The face detection facility captures images of faces in the surrounding environment, compares the captured images to stored images in a face detection database, and provides a visual indication of a match, where the visual indication corresponds to the current position of the imaged face in the surrounding environment as part of the projected content. An integrated vibratory actuator is also included in the eyepiece, which provides a vibration output to alert the user to the match.
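
A minimal sketch of that matching flow, assuming a hypothetical feature-vector representation, match threshold, and actuator stub (none of which are specified by the disclosure):

```python
# Hypothetical sketch: compare a captured facial feature vector against
# stored profiles; on a match, mark the face location and fire the actuator.
import math

STORED_PROFILES = {
    "alice": [0.12, 0.80, 0.33, 0.45],
    "bob": [0.90, 0.10, 0.55, 0.20],
}
MATCH_THRESHOLD = 0.15  # max Euclidean distance counted as a match

def vibrate_alert():
    """Stub standing in for the integrated vibratory actuator."""
    print("vibratory actuator: buzz")

def match_face(captured, location_xy):
    for name, stored in STORED_PROFILES.items():
        if math.dist(captured, stored) <= MATCH_THRESHOLD:
            vibrate_alert()
            return {"label": name, "marker_at": location_xy}
    return None

print(match_face([0.13, 0.79, 0.35, 0.44], location_xy=(312, 188)))
```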

One aspect of a method of augmenting vision includes collecting photons with a short wave infrared sensor mounted on the eyepiece, converting the collected photons in the short wave infrared spectrum to electrical signals, relaying the electrical signals to the eyepiece for display, collecting biometric data using the sensor, collecting audio data using an audio sensor, and transferring the collected data to a database.

Another aspect of object recognition is to capture an image of an object using the eyepiece and analyze it to determine if it has been previously photographed and analyzed.

According to an aspect of the invention, an eyepiece includes a frame to which a lens and an image source facility above the lens are attached. The image source facility comprises an LED, a planar illumination facility, and a reflective display. The planar illumination facility is adapted to convert light received from the LED at one of its sides into a top-emitting planar light source, and is positioned to uniformly illuminate the reflective display. The planar illumination facility is further adapted to be substantially transmissive, so that image light reflected from the reflective display passes through it to a beam splitter. The beam splitter is positioned to receive the image light from the reflective display and reflect a portion of it onto a mirrored surface, which is positioned and shaped to reflect the image light into an eye of the wearer of the eyepiece, thereby forming an image within the wearer’s field of view. The mirrored surface may be adapted to be partially reflective within the area of image reflection. The reflective display may be a liquid crystal display such as a liquid crystal on silicon (LCoS), cholesteric liquid crystal, guest-host liquid crystal, polymer-dispersed liquid crystal, or phase-retardation liquid crystal display, or a bistable display such as an electrophoretic, electrofluidic, electrowetting, or electrokinetic display, or any combination thereof. The thickness of the planar illumination facility should not be less than 0.5 mm. A cover glass may be included to protect the reflective display.

The planar illumination facility may include a wedge-shaped optic adapted to direct light from the LED onto the reflective display, which reflects the light back toward the wedge-shaped optic so that it is directed toward the polarizing beam splitter. The planar illumination facility may also include an image direction correction optic to further redirect the image light toward the beam splitter.

The planar illumination facility may include an optic whose lower surface contains imperfections that redirect light from the LED upwards to illuminate the reflective display. Image light reflected from the reflective display then passes back into the optic through its upper surface and out through its lower surface in a direction toward the polarizing beam splitter. A correction optic may be included in the planar illumination facility to correct image dispersion caused by the imperfections.

A multi-layered optic may be used in the planar illumination facility, where each layer has an angled surface that reflects a portion of the light from the LED upwards to illuminate the reflective display. Image light from the reflective display is directed back into the multi-layered optic and passes through it in the direction of the polarizing beam splitter. A diffuser may also be included in the planar illumination facility to increase the cone angle of the image light as it passes through the facility to the beam splitter.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece may provide a user interface based on the type of external device connected to it: an external device may be connected through a communications facility, and an integrated memory facility may store user interfaces specific to external device types, such that when the external device is connected, the optical assembly presents the user interface specific to that device type.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece provides a control scheme based on the type of connected external device: an external device may be connected through a communications facility, and an integrated memory facility may store control schemes specific to external device types, such that when the external device is connected, the eyepiece applies the control scheme specific to that device type.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and the displayed content. The eyepiece provides both a control scheme and a user interface based on the type of connected external device: an external device may be connected through a communications facility, and an integrated memory facility may store control schemes and user interfaces keyed to external device types. When the external device is connected, the optical assembly presents the user interface specific to that device type and the eyepiece applies the corresponding control scheme. For example, the external device may be an audio system, the user interface an audio control panel, and the control scheme a head nod.
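
The per-device-type lookup described in the last three paragraphs amounts to keying stored user interfaces and control schemes on the connected device type. A minimal sketch with an illustrative profile table follows (the audio-system entry mirrors the example above; the other entries and all identifiers are assumptions):

```python
# Hypothetical sketch: the memory facility's per-device-type lookup.

DEVICE_PROFILES = {
    "audio_system": {"ui": "audio_control_panel", "control": "head_nod"},
    "drone": {"ui": "flight_hud", "control": "body_worn_sensors"},
    "payment_system": {"ui": "bill_list", "control": "finger_motion"},
}

def on_device_connected(device_type: str) -> dict:
    """Select the UI and control scheme for a newly connected device."""
    profile = DEVICE_PROFILES.get(device_type)
    if profile is None:
        return {"ui": "generic_panel", "control": "touch_interface"}
    return profile

print(on_device_connected("audio_system"))
# -> {'ui': 'audio_control_panel', 'control': 'head_nod'}
```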

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly, with sensor-based command and control of external devices and feedback from the external device to the eyepiece. An external device may be connected through a communications facility, and a sensor may detect a condition. When the condition is detected, the eyepiece may present a user interface for the command and control of the external device, with feedback from the external device also provided; the sensor may supply a signal that is displayed as content when the condition is detected.

An interactive head-mounted eyepiece can include an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and displayed content. The eyepiece provides user-action-based command and control of external devices: an external device can be connected through a communications facility, a user action capture device may detect an input from the user, and the eyepiece may then present a user interface for the command and control of the external device. In one embodiment, the user action capture device could be a body-worn sensor device and the external device a drone.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly. The eyepiece can provide predictive control of an external device based on event input. An integrated memory facility may record contextual information, which could include information about an activity, communication, or event monitored by the eyepiece, and may include a location indicating where the activity, communication, or event was recorded. A facility may analyze the contextual information and project a usage pattern. An external device can be connected through a communications facility, and when the eyepiece detects a usage pattern involving the external device, it may command and control that device; when the external device is detected, a command and control interface may be presented in the eyepiece.
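
One way to project a usage pattern from recorded contextual information is to count recurring (location, time, action) events and promote an action to a predicted command once it repeats often enough. The sketch below illustrates that idea; the event format, the log contents, and the threshold are assumptions.

```python
# Hypothetical sketch: predictive control from recorded contextual info.
from collections import Counter

PATTERN_THRESHOLD = 3  # repetitions before we call it a usage pattern

event_log = [
    ("home", 19, "turn_on_audio_system"),
    ("home", 19, "turn_on_audio_system"),
    ("office", 9, "open_email"),
    ("home", 19, "turn_on_audio_system"),
]

def predicted_actions(log, location, hour):
    """Return actions that recur at this place and hour often enough."""
    counts = Counter(action for loc, h, action in log
                     if loc == location and h == hour)
    return [a for a, n in counts.items() if n >= PATTERN_THRESHOLD]

print(predicted_actions(event_log, "home", 19))
# -> ['turn_on_audio_system']: the eyepiece could now offer this command.
```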

An interactive head-mounted eyepiece can include an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly. The eyepiece can provide event-input-based and user-action-based control of eyepiece applications: the integrated processor may detect a user input, and a command and control interface for commanding and controlling the eyepiece may be presented in the eyepiece, where the interface accepts input from the user action capture device.

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly. The eyepiece may provide event-based and user-action-based control of external applications: an external device can be connected through a communications facility, a user action capture device may detect an input from the user, and a command and control scheme may then be enabled to command and control an external application, where actions captured from the user serve as input to the external application.

An interactive head-mounted eyepiece with an integrated processor may be used to handle content for display, and an integrated image source may introduce the content to an optical assembly. In embodiments, the eyepiece may provide the user with control of, and feedback from, both internal and external applications: an external device can be connected through a communications facility, a user action capture device may detect an input event, and the eyepiece may present a command and control interface for the command, control, and management of an internal or external application.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to the optical assembly. The eyepiece may provide sensor-based and user-action-based control of external devices with feedback: an external device may be connected through a communications facility, a sensor can detect a condition, and the eyepiece may detect user actions and accept them as input to a command and control interface, which could also provide feedback from the external device.

An interactive head-mounted eyepiece with an integrated processor may be used to handle content for display, and an integrated image source may introduce the content to an optical assembly. In embodiments, the eyepiece may provide sensor-based and user-action-based control of eyepiece applications with feedback: a sensor can detect a physical quantity, a user action capture device may detect an input, and an eyepiece application may then be controlled through a command and control interface, which may also provide feedback from the eyepiece application.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly. The eyepiece may provide event-based, sensor-based, and user-action-based control of applications on external devices with feedback: a sensor can detect a condition, a user action capture device can detect a user input, an external device may be connected through a communications facility, and an internal application may detect an event. When the event is detected, the eyepiece may present a command and control interface for commanding and controlling the operation of the external application, where the interface accepts input from at least one of the sensor and the user action capture device, and may provide feedback from the external application.

An interactive head-mounted eyepiece with an integrated processor may be used to handle content for display, along with an integrated image source for introducing the content to an optical assembly through which the user views the surrounding environment and displayed content. In embodiments, the eyepiece may provide a state-triggered interaction with an advertising facility. An object detector might detect activity as input, and a head-mounted camera or an eye-gaze detection system may detect eye movement as input. A navigation system controller may connect to the eyepiece, and an e-commerce application may detect an event. The eyepiece may then display a 3D navigation interface for the command and control of a bullseye or target tracking system, where the 3D navigation interface accepts input from at least one of the object detector, the head-mounted camera, and the eye-gaze detection system, and feedback may be provided by the advertising facility within the eyepiece.

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly. In embodiments, the eyepiece can provide event-based and user-action-capture-device control of external applications. An external payment system may be connected to the eyepiece, an inertial motion tracking device may detect finger movements as input, and an email application may detect the reception of an email as an event. A navigable list of bills may then be displayed, from which the user can send information to the external payment system for payment, where the navigable list accepts the finger movements captured by the inertial motion tracking device as input.

An interactive head-mounted eyepiece can include an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly. The eyepiece may further provide event- and sensor-based control of external devices with feedback: a sensor may detect a condition, a user action capture device can detect input from the user, and an external device may be connected to the eyepiece. When the condition is detected, the eyepiece may present a command and control interface for commanding and controlling the external device, where the interface accepts input from at least one of the user action capture device and the sensor, and may present feedback from the external device in the eyepiece.

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly. The eyepiece may provide sensor-input-triggered control through a user action capture device: the device can identify an event and present a command and control interface based on the event, where the interface accepts input from the user.

An interactive head-mounted eyepiece with an integrated processor and an integrated image source may be used to handle content for display, allowing the user to view the surrounding environment and the displayed content. In embodiments, the eyepiece may provide sensor-triggered user-movement control: when the eyepiece recognizes an event, it may enable the capture of user movements as input.

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly. The eyepiece may provide an event- and sensor-triggered command and control facility: at least one sensor can detect an event, a physical quantity, or the like, and when the sensor provides input, an interface may be presented for command and control.

An embodiment of the interactive head-mounted eyepiece includes an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly, through which the user views the surrounding environment and displayed content. The eyepiece may provide sensor-triggered control of eyepiece applications: an input sensor can detect an event or a physical quantity, an internal application may detect a data feed from a network source, and a command scheme for controlling an eyepiece application may then be enabled when the application detects the data feed and the sensor provides input.

An interactive head-mounted eyepiece can include an integrated processor for handling content and an integrated image source for introducing the content to an optical assembly. The eyepiece may provide event- and sensor-triggered interfaces to external devices: an external device can be connected through a communications facility, a sensor may detect an event or a physical quantity and provide input, and the eyepiece may present a command and control interface through which the external device is commanded and controlled by the user.

An interactive head-mounted eyepiece with an integrated processor and an integrated image source may be used to handle content for display, allowing the user to view the surrounding environment and the displayed content. Event-triggered user action control may also be provided: an input device that detects hand gestures may trigger the eyepiece to accept hand gestures as input.

These and other systems, methods, objects, features, and advantages of the present disclosure will be apparent to those skilled in the art from the following detailed description and the accompanying drawings.

All documents referenced herein are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like.

This disclosure concerns eyepiece electro-optics. The eyepiece may include projection optics suitable for projecting an image onto a see-through or translucent lens, enabling the wearer to view the surrounding environment as well as the projected image. The projection optics, also known as a projector, may include an RGB LED module that uses field sequential color, in which a single full-color image is broken down into color fields based on the primary colors red, green, and blue, and each color field is imaged individually by an LCoS optical display 210. As each color field is imaged by the optical display 210, the corresponding LED color is turned on; when these color fields are displayed in rapid succession, a full-color image is seen. With field sequential color illumination, the resulting projected image can be adjusted for any chromatic aberrations by shifting the red image relative to the blue and/or green image, and so on. After being reflected, the image may be coupled into a two-surface freeform waveguide, where the image light propagates by total internal reflection (TIR) until it reaches the active viewing area of the lens. The processor, which may include a memory and an operating system, may control the LED light source and the optical display. The projector may also be optically coupled to a display coupling lens, a condenser lens, a polarizing beam splitter, and a field lens.
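
A minimal sketch of the field sequential color loop described above: each frame is split into red, green, and blue fields, the matching LED is lit while each field is shown, and the red field can be shifted slightly to compensate chromatic aberration. The display hook and the shift value are placeholders, not the disclosure’s API.

```python
# Hypothetical sketch: field sequential color with a red-field shift.

FIELD_ORDER = ("red", "green", "blue")
RED_SHIFT_PX = (1, 0)  # assumed lateral shift correcting chromatic aberration

def show_field(color: str, field_image, shift=(0, 0)):
    """Stub standing in for driving the LCoS panel with one color field
    while only the matching LED is lit."""
    print(f"LED {color} on, field {field_image!r} shifted by {shift}")

def display_frame(frame):
    """frame: dict mapping color name -> that color's field image."""
    for color in FIELD_ORDER:
        shift = RED_SHIFT_PX if color == "red" else (0, 0)
        show_field(color, frame[color], shift)

display_frame({"red": "R-field", "green": "G-field", "blue": "B-field"})
```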

Referring to FIG. 1, an illustrative embodiment of the augmented reality eyepiece 100 is depicted. Other embodiments of the eyepiece 100 may not include all of the elements shown in FIG. 1, while still other embodiments may include additional or different elements. In an embodiment, the optical elements may be embedded in the arm portions of the frame 102. Images may be projected with a projector 108 onto at least one lens 104 disposed in an opening of the frame 102. One or more projectors, such as a nanoprojector, picoprojector, microprojector, femtoprojector, LASER-based projector, holographic projector, and the like, may be disposed in an arm portion of the eyepiece frame 102. In certain embodiments, both lenses 104 are see-through or translucent; in other embodiments, one lens 104 may be opaque or missing while the other is transparent. In certain embodiments, a single projector 108 may be included in the eyepiece 100.

As shown in FIG. 1, the eyepiece 100 may also include at least one articulating earbud and radio transceiver 120, as well as a heat sink 114 that absorbs heat from the LED light engine to keep it cool. One or more TI OMAP4 (open multimedia applications processor) chips 112 and a flex cable with RF antenna 110 are also included, all of which are described in further detail herein.

Referring to FIG. 2, the projector 200 may be an RGB projector, according to an embodiment. The projector 200 may include a housing, a heatsink 204, and an RGB LED engine or module 206. The RGB LED engine 206 may contain LEDs, dichroics, concentrators, and the like. A digital signal processor (DSP), not shown, may convert the images or video stream into control signals, such as voltage drops/current modifications, pulse width modulation (PWM) signals, and the like, to control the intensity, duration, and mixing of the LED light. For example, the DSP can control the duty cycle of each PWM signal, which controls the average current flowing through each LED to generate a plurality of colors. The eyepiece’s still image processor may be capable of image enhancement, noise filtering, face detection, and image/video stabilization. The eyepiece’s audio back-end processor may employ equalization, buffering, and sample rate conversion (SRC).
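
The duty-cycle relationship the DSP exploits is simply that the average current through an LED equals its peak drive current times the PWM duty cycle, so per-channel duty cycles set the color mix. A small numeric sketch with illustrative values (the peak currents and duty cycles are assumptions):

```python
# Hypothetical sketch: PWM duty cycle -> average LED current per channel.

PEAK_CURRENT_MA = {"red": 150.0, "green": 150.0, "blue": 150.0}

def average_currents(duty_cycles):
    """duty_cycles: dict of color -> duty in [0.0, 1.0].
    Average current is peak current times duty cycle."""
    return {c: PEAK_CURRENT_MA[c] * d for c, d in duty_cycles.items()}

# A warm white mix: more red, less blue.
print(average_currents({"red": 0.60, "green": 0.45, "blue": 0.30}))
# -> {'red': 90.0, 'green': 67.5, 'blue': 45.0} (mA average per channel)
```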

The projector 200 may include an optical display 210, such as an LCoS or LCD display, and a number of other components, as shown. In embodiments, the projector 200 may be designed with a single-panel LCoS display 210, though a three-panel design is also possible. In the single-panel embodiment, the display 210 is illuminated sequentially with red, green, and blue (aka field sequential color). In other embodiments, the projector 200 may make use of alternative optical display technologies, such as a backlit liquid crystal display (LCD), a front-lit LCD, a transflective LCD, a transparent liquid crystal microdisplay, a quantum dot display, a ferroelectric LCoS (FLCoS) display, and liquid crystal technologies mounted on sapphire.

The eyepiece may be powered by any power source, including solar power, battery power, and line power. The power source may be embedded in the frame 102 or disposed externally to the eyepiece 100 and in electrical communication with the powered elements of the eyepiece 100. For example, a solar energy collector may be placed on the frame 102, on a belt clip, and the like. Battery charging may occur using a wall charger, a car charger, on a belt clip, and the like.

The projector 200 may include the LED light engine 206, mounted to provide vibration-free operation, along with a hollow tapered light tunnel 220, a diffuser 212, and a condenser lens 214. The hollow tunnel 220 helps to homogenize the rapidly-varying light from the RGB LED light engine, and in one embodiment carries a silvered coating. The diffuser 212 further homogenizes and mixes the light before it is directed to the condenser lens 214. The light then leaves the condenser lens 214 and enters the polarizing beam splitter (PBS) 218. In the PBS, the LED light is propagated and split into polarization components before being refracted to the field lens 216 and the LCoS display 210. The LCoS display provides the image for the microprojector. The image is then reflected from the LCoS display back through the polarizing beam splitter, reflected ninety degrees, and leaves the microprojector 200 at about its center. From there the light is led to the coupling lens 504, described below.
