Invented by Ralph F. Osterhout, John D. Haddick, and Robert Michael Lohse; assigned to Microsoft Technology Licensing LLC

The market for AR glasses with predictive control of external devices based on event input is growing rapidly as advances in technology continue to expand the capabilities of augmented reality (AR) devices. These glasses offer a unique and immersive experience by seamlessly integrating virtual elements into the real world, allowing users to interact with their surroundings in an entirely new way.

AR glasses with predictive control of external devices go beyond simple display capabilities. They are equipped with sensors and cameras that detect and interpret real-world events such as hand gestures, voice commands, and eye movements, letting users control external devices such as smartphones, smart home appliances, or even drones with a simple gesture or voice command.

A key feature of these glasses is their predictive control capability. By analyzing the user's behavior and input, the glasses can anticipate the user's intentions and initiate actions accordingly. For example, if a user looks at a specific object and makes a hand gesture toward it, the glasses can predict that the user wants to interact with that object and send a command to the external device to perform a specific action, such as turning on a light or playing a song.

The applications for these glasses are vast and diverse. In healthcare, they can assist surgeons during complex procedures by letting them control medical equipment without physical contact. In manufacturing and logistics, workers can use them to control robots or machinery, improving efficiency and reducing the risk of accidents. In entertainment, they can enhance gaming by allowing users to control virtual characters or objects with gestures or voice commands.

The market for AR glasses with predictive control of external devices is expected to see significant growth in the coming years. According to a report by MarketsandMarkets, the global AR glasses market is projected to reach $73.4 billion by 2024, a compound annual growth rate of 63.3% over the forecast period. Increasing demand for immersive, interactive experiences, coupled with advances in sensor technology and artificial intelligence, is driving adoption across industries.

Challenges remain, however, before these glasses see widespread adoption. The first is developing accurate and reliable gesture recognition algorithms: because the glasses rely heavily on interpreting user gestures and movements, the recognition algorithms must interpret and respond to these inputs accurately in real time. The second is integration with existing devices and systems: because the glasses must communicate with and control external devices, compatibility and interoperability are crucial, and manufacturers need to ensure their glasses connect seamlessly with a wide range of devices and platforms.

In conclusion, the market for AR glasses with predictive control of external devices based on event input is poised for significant growth. With advances in technology and rising demand for immersive experiences, these glasses have the potential to transform industries from healthcare to entertainment, provided that challenges such as accurate gesture recognition and device compatibility are addressed.

The Microsoft Technology Licensing LLC invention works as follows

This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing that content into an optical assembly, through which the user views the surrounding environment as well as the displayed content. The eyepiece also provides predictive control of an external device in response to an event input.

Background for AR Glasses with Predictive Control of External Device Based on Event Input

Field

This disclosure concerns an augmented reality eyepiece and associated control technologies as well as applications for their use.

One embodiment of an eyepiece includes a nano-projector or micro-projector. This may include a light source and an LCoS display, a freeform waveguide lens enabling TIR bounces, and a coupling lens. A wedge-shaped optic (a translucent correction lens) is attached to the waveguide lens and allows proper viewing through the lens whether the projector is turned on or off. An RGB LED module may be included in the projector. An RGB LED module can emit field-sequential color, in which the different colored LEDs are turned on in rapid succession to form a color image that is reflected off the LCoS display. A projection collimator or polarizing beam splitter may also be included in the projector.
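As a rough illustration of field-sequential color, the following Python sketch lights one color LED at a time while the matching color plane is loaded onto the panel; the driver functions here are hypothetical stubs, not a real LCoS API.

```python
import time

# Hypothetical driver hooks; real hardware would expose vendor-specific APIs.
def set_led(channel: str, on: bool) -> None:
    """Stub: switch one LED of the RGB module on or off."""
    print(f"LED {channel}: {'on' if on else 'off'}")

def load_subframe(channel: str, frame: dict) -> None:
    """Stub: load the color plane for `channel` onto the LCoS panel."""
    print(f"LCoS shows {channel} plane of frame {frame['id']}")

def show_field_sequential(frame: dict, fps: int = 60) -> None:
    """Display one frame as three rapid single-color subframes.

    The eye fuses the red, green, and blue subframes into one
    full-color image reflected off the LCoS panel.
    """
    subframe_time = 1.0 / (fps * 3)   # three color fields per frame
    for channel in ("R", "G", "B"):
        load_subframe(channel, frame)
        set_led(channel, True)        # only one color LED lit at a time
        time.sleep(subframe_time)
        set_led(channel, False)

show_field_sequential({"id": 0})
```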

One embodiment of an eyepiece can include a freeform transparent correction lens, a waveguide lens, and a display lens.

An embodiment may include an optical wedge waveguide that is optimized for the ergonomics of the human head. This allows the eyepiece to wrap around the face.

Another embodiment of an eyepiece might include two freeform optical surfaces and a waveguide, which allows complex optical paths to be folded within a thin prism form factor.

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, an integrated processor that handles the content, and an integrated image source that introduces the content to the assembly. In one embodiment, an interactive control element remains fixed relative to an object in the surrounding environment, in response to an interactive control element location command.
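A minimal sketch of how such an element can stay fixed relative to an environment object: each frame, the element is redrawn at the screen projection of a fixed world-space anchor. The pinhole intrinsics and the head-tracker pose below are illustrative assumptions.

```python
import numpy as np

def project_to_screen(world_point, world_to_camera, intrinsics):
    """Project a 3-D anchor point into 2-D display coordinates."""
    p_cam = world_to_camera @ np.append(world_point, 1.0)  # world -> camera
    uvw = intrinsics @ p_cam[:3]                           # camera -> image
    return uvw[:2] / uvw[2]                                # perspective divide

# Assumed pinhole model and pose source for illustration only.
intrinsics = np.array([[500.0,   0.0, 320.0],
                       [  0.0, 500.0, 240.0],
                       [  0.0,   0.0,   1.0]])
anchor = np.array([0.0, 0.0, 2.0])   # object 2 m in front of the user
pose = np.eye(4)                     # head-tracker pose for this frame
print(project_to_screen(anchor, pose, intrinsics))  # -> [320. 240.]
```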

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, an integrated processor that handles the content, and an integrated camera facility that images the user's body as it interacts with the interactive control elements. The processor then removes the portion of the interactive control element determined to be co-located with the user's imaged body part, so the body part appears in front of the element.
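A minimal sketch of that subtraction step, assuming a hand-segmentation mask is available from the camera feed: overlay pixels covered by the imaged hand are made transparent so the hand appears in front of the control element.

```python
import numpy as np

def subtract_occluded(element_rgba: np.ndarray, hand_mask: np.ndarray) -> np.ndarray:
    """Remove the part of a rendered control element covered by the
    user's imaged hand.

    element_rgba: HxWx4 rendered overlay; hand_mask: HxW boolean mask
    from a (hypothetical) hand-segmentation step on the camera feed.
    """
    out = element_rgba.copy()
    out[hand_mask, 3] = 0        # make occluded pixels fully transparent
    return out

overlay = np.full((480, 640, 4), 255, dtype=np.uint8)
mask = np.zeros((480, 640), dtype=bool)
mask[200:300, 250:350] = True    # pretend the hand covers this region
composited = subtract_occluded(overlay, mask)
```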

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user's view of the environment, an integrated processor that handles content for display, and an integrated image source that introduces the content to the optical assembly. An interactive keyboard control element may be displayed, which can include an input path analyzer and a word matching search facility. Text is input by sliding a pointing device, such as a stylus or a finger, across the character keys of the keyboard input interface. The input path analyzer determines which characters were contacted, and the word matching facility finds the best word match and enters it as text.
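A toy sketch of the two stages, with a made-up key layout, a tiny word list, and a crude subsequence score; a real matcher would score the full swipe geometry against each candidate word.

```python
# Hypothetical layout and dictionary, for illustration only.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_POSITIONS = {ch: (x, y) for y, row in enumerate(ROWS)
                 for x, ch in enumerate(row)}
DICTIONARY = ["hello", "help", "halo", "world"]

def keys_along_path(path):
    """Input path analyzer: map each touch sample to its nearest key,
    dropping immediate repeats."""
    keys = []
    for x, y in path:
        nearest = min(KEY_POSITIONS, key=lambda ch:
                      (KEY_POSITIONS[ch][0] - x) ** 2 +
                      (KEY_POSITIONS[ch][1] - y) ** 2)
        if not keys or keys[-1] != nearest:
            keys.append(nearest)
    return keys

def best_word(keys):
    """Word matching facility: favor words whose letters occur, in
    order, among the contacted keys."""
    def score(word):
        it = iter(keys)
        return sum(1 for ch in word if ch in it) / len(word)
    return max(DICTIONARY, key=score)

path = [(5.1, 1.1), (2.2, 0.1), (8.1, 1.0), (9.2, 0.2)]  # traces h-e-l-p
print(best_word(keys_along_path(path)))                  # -> "help"
```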

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user's view of the environment, an integrated processor that handles content, and an integrated image source that introduces the content to the optical assembly. An integrated camera facility images an external visual cue; the integrated processor interprets the cue and commands the display of content associated with it. The visual cue could be a sign in the surrounding environment, with the projected content comprising an advertisement associated with that sign; the advertisement may be personalized based on the user's preferences. The visual cue could also be a hand gesture, with the projected content a virtual keyboard. For example, the gesture could be a thumb-and-index-finger gesture from one hand, with the virtual keyboard projected onto that hand so the user can type on it with the second hand; or it could be a combination of thumb-and-index-finger gestures from both hands, with the virtual keyboard projected between the user's hands according to the gesture.

A system may include an interactive head-mounted eyepiece that a user wears. The optical assembly includes a corrective element that corrects the user's view of the environment, an integrated processor that handles content, and an integrated image source that introduces the content to the optical assembly. An integrated camera facility images a gesture, which the integrated processor interprets as a control instruction. The control instruction can be used to manipulate the displayed content or to communicate a command to an external device.
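A minimal sketch of dispatching a recognized gesture as a control instruction; the gesture labels, the confidence value from the recognizer, and the device commands are illustrative assumptions.

```python
# Hypothetical mapping from gesture labels to control instructions.
COMMANDS = {
    "swipe_left":  lambda: print("content: previous page"),
    "swipe_right": lambda: print("content: next page"),
    "pinch":       lambda: print("external device: toggle light"),
}

def on_gesture(label, confidence, threshold=0.8):
    """Act on a camera-recognized gesture: either manipulate the
    displayed content or send a command to an external device."""
    if confidence >= threshold and label in COMMANDS:
        COMMANDS[label]()

on_gesture("pinch", 0.93)   # -> external device: toggle light
```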

A system may include an interactive head-mounted eyepiece through which a user views the surrounding environment and displayed content. The optical assembly includes a corrective element that corrects the user's view, an integrated processor that handles content, and an integrated image source that introduces the content to the optical assembly; a tactile control interface is mounted on the eyepiece. This interface accepts input from the user through at least one of two means: the user touching the interface, or the user being in proximity to it.
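One plausible way to support both touch and proximity from a single capacitive channel is a two-threshold classifier; the raw-count thresholds below are illustrative assumptions, not values from the disclosure.

```python
def classify(raw_count: int, touch_thresh: int = 800, prox_thresh: int = 300) -> str:
    """Classify a capacitive sensor reading as touch, proximity, or none.

    Higher counts mean a stronger capacitive coupling: a finger on the
    interface reads above touch_thresh, a nearby finger reads between
    the two thresholds.
    """
    if raw_count >= touch_thresh:
        return "touch"
    if raw_count >= prox_thresh:
        return "proximity"
    return "none"

for reading in (120, 450, 900):
    print(reading, "->", classify(reading))
```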

A system may include an interactive head-mounted eyepiece worn by the user. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content. An integrated processor handles the content, and control instructions are provided to the processor based on sensing a predefined head motion characteristic.

The head motion characteristic could be a nod of the user's head, an overt movement dissimilar to ordinary head motions; the overt motion could be a jerking motion of the head. Control instructions can be used to manipulate the displayed content or to communicate with an external device.
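A rough sketch of distinguishing an overt, jerky nod from ordinary head motion by thresholding pitch angular velocity from a (hypothetical) head-mounted gyroscope; the threshold and window values are illustrative assumptions.

```python
def detect_nod(pitch_rates, jerk_thresh=3.0, max_gap=20):
    """Detect an overt nod in a stream of pitch rates (rad/s).

    A nod shows a fast downward spike followed shortly by a fast
    upward spike; slow, ordinary head motion never crosses the
    jerk threshold and is ignored.
    """
    down = up = None
    for i, rate in enumerate(pitch_rates):
        if rate < -jerk_thresh and down is None:
            down = i                       # sharp downward jerk
        elif down is not None and rate > jerk_thresh:
            up = i                         # sharp upward return
            break
    return down is not None and up is not None and (up - down) <= max_gap

samples = [0.1, -0.2, -4.0, -1.0, 0.5, 3.8, 0.2]   # an overt nod
print(detect_nod(samples))   # True -> issue the control instruction
```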

A system may include an interactive head-mounted eyepiece that a user wears. The eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, and an integrated processor that handles the content and introduces it to the optical assembly. In some embodiments, the optical assembly includes an electrochromic layer that adjusts a display characteristic based on the requirements of the displayed content and the surrounding environment. The display characteristic may be brightness, contrast, and the like, and the adjustment may be applied to the area of the optical assembly where content is displayed.
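A minimal sketch of one such adjustment, assuming an ambient-light reading in lux is available; the logarithmic mapping from ambient level to display brightness is an illustrative assumption.

```python
import math

def display_brightness(ambient_lux: float, lo: float = 10.0,
                       hi: float = 10_000.0) -> float:
    """Return a brightness level in [0, 1] that rises with ambient
    light, so displayed content stays readable against a bright
    surrounding environment."""
    lux = min(max(ambient_lux, lo), hi)          # clamp sensor reading
    return math.log(lux / lo) / math.log(hi / lo)

for lux in (15, 300, 8000):
    print(lux, "lux ->", round(display_brightness(lux), 2))
```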

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source may be used to introduce the content to the optical assembly. The eyepiece can also include an adjustable wrap-around extendable arm made of a shape-memory material to secure the position of the eyepiece on the user's head. The extendable arm can extend from an end of an eyepiece arm, and the end of the wrap-around extendable arm may be covered with silicone. The extendable arms can be secured to one another, or may independently grasp a portion of the head. The extendable arm can attach to a portion of the head-mounted eyepiece to secure it to the user's head. In certain embodiments, the extendable arm extends telescopically from the end of an eyepiece arm. In other embodiments, at least one wrap-around extendable arm is detachable from the head-mounted eyepiece, or extendable arms can be added to it.

An embodiment of the eyepiece is a head-mounted interactive eyepiece that a user wears. The eyepiece contains an optical assembly through which the user can view the surrounding environment and displayed content, and an integrated image source may be used to introduce the content to the optical assembly. The displayed content could include a local advertisement, with the location of the eyepiece determined by an integrated location sensor so that the local advertisement is relevant to that location. Other embodiments may include a capacitive sensor that detects whether the eyepiece is in contact with skin; based on whether the sensor detects contact with human skin, the local advertisement may be sent to the user. Local advertisements might also be sent when the eyepiece is powered on.
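The delivery conditions described above reduce to a simple gate; the field names and state structure below are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class EyepieceState:
    powered_on: bool                       # eyepiece turned on
    skin_contact: bool                     # capacitive sensor: being worn
    location: Optional[Tuple[float, float]]  # (lat, lon) from location sensor

def should_deliver_local_ad(state: EyepieceState) -> bool:
    """Send a local ad only when the eyepiece is on, worn, and located."""
    return state.powered_on and state.skin_contact and state.location is not None

print(should_deliver_local_ad(
    EyepieceState(powered_on=True, skin_contact=True,
                  location=(47.6, -122.3))))   # -> True
```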

Other embodiments of local advertising may include a banner advertisement or a two-dimensional graphic, and the advertisement may be associated with a physical aspect of the surrounding environment. In another embodiment, the advertisement is displayed as augmented reality associated with a physical aspect of the environment; such an augmented reality advertisement can be two- or three-dimensional, may be animated, and may be linked to the user's view of the environment. Local advertisements can also be displayed to the user based on the results of the user's web search. The content of the local advertisement can be determined from the user's personal information, which may be available to an advertising facility or a web application; a web application, an advertising facility, or the eyepiece may use that information to filter the local advertising. Local advertisements may be cached on a server, where they can be accessed by at least one of an advertising facility, a web application, and the eyepiece, and then displayed to the user.
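A toy sketch of filtering cached local advertisements against the user's personal information; the ad schema and interest tags are made up for illustration.

```python
# Hypothetical cached ads and user profile, for illustration only.
cached_ads = [
    {"id": 1, "tags": {"coffee"},  "location": "downtown"},
    {"id": 2, "tags": {"fitness"}, "location": "downtown"},
]
user_interests = {"coffee", "books"}

# Keep only ads whose tags overlap the user's stated interests.
relevant = [ad for ad in cached_ads if ad["tags"] & user_interests]
print([ad["id"] for ad in relevant])   # -> [1]
```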

Another embodiment allows the user to request more information about a local advertisement through any eye movement, body movement, or other gesture. A user can also ignore a local advertisement by using any eye movement, body movement, or gesture, or simply by not selecting the advertisement within a specified time. The user can also choose not to allow local advertisements by selecting such an option on a graphical user interface; alternatively, such advertisements may be disabled via an eyepiece control.

One embodiment may also include an audio device, and the displayed content could include local advertisements and audio. An integrated location sensor may determine the location of the eyepiece, and the local advertisements and audio may be relevant to that location, such that the user hears audio corresponding to the displayed content and local advertisements.
