Metaverse – Samuel A. Miller, Paul M. Greco, Brian T. Schowengerdt, Magic Leap Inc

Abstract for “Augmented Reality Systems and Methods with Variable Focus Lens Elements”

An augmented reality display system includes a pair of variable focus lens elements that sandwich a waveguide stack. One of the lens elements is positioned between the waveguide stack and the user's eye and may be configured to correct refractive errors in the focusing of light projected from the waveguide stack to that eye; this lens element may also be configured to provide appropriate optical power to place displayed virtual content on a desired depth plane. The other lens element is positioned between the waveguide stack and the surrounding environment and is configured to provide optical power to compensate for aberrations in the transmission of light from the environment through the waveguide stack and the lens element closest to the eye. An eye-tracking system determines the vergence of the user's eyes, and the optical powers of the lens elements are varied based on the determined vergence.

Background for “Augmented Reality Systems and Methods with Variable Focus Lens Elements”

Field

The present disclosure pertains to optical devices, including augmented reality imaging and visualization systems.

DESCRIPTION OF THE RELATED ART

Modern computing and display technologies have facilitated the development of systems for so-called "virtual reality" or "augmented reality" experiences, in which digitally reproduced images, or portions thereof, are presented to a user in a manner such that they seem to be, or may be perceived as, real. A virtual reality, or "VR", scenario typically presents digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or "AR", scenario typically presents digital or virtual image information as an augmentation to visualization of the actual world around the user. A mixed reality, or "MR", scenario is a type of AR scenario that typically involves virtual objects that are integrated into, and responsive to, the natural world. For example, MR image content may appear to be blocked by, or be perceived as interacting with, objects in the real world.

Referring to FIG. 1, an augmented reality scene 10 is shown. A user of AR technology sees a real-world, park-like setting 20 featuring people, trees, and buildings in the background, and a concrete platform 30. The user also perceives that he or she "sees" "virtual content" such as a robot statue 40 standing upon the real-world platform 30 and a flying cartoon-like avatar character 50 that appears to be a personification of a bumblebee. These elements 40, 50 are "virtual" in that they do not exist in the real world. Producing AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements among other real-world or virtual imagery elements is challenging.

Systems and methods disclosed herein address various challenges related to AR and VR technology.

In some embodiments, a display system is provided. The display system comprises a head-mountable display configured to project light to a viewer to display image information on one or more depth planes. The display comprises one or more waveguides configured to project the light to the viewer; the one or more waveguides are further configured to transmit light from objects in the surrounding environment to the viewer. The display also comprises a first variable focus lens element between the one or more waveguides and a first eye of the viewer, and a second variable focus lens element between the one or more waveguides and the surrounding environment. An eye tracking system is configured to determine the vergence of the viewer's eyes, and the display system is configured to correct a refractive error of the viewer's eyes by varying the optical power of the first and second variable focus lens elements based on the determined vergence.
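To make the relationship between eye tracking and lens control concrete, here is a minimal sketch of the kind of control loop described above. The class and method names (get_vergence_distance_m, set_power_dpt) and the exact sign conventions are assumptions for illustration only, not part of the disclosure.

```python
import time

def run_display_control_loop(eye_tracker, first_lens, second_lens, prescription_dpt_for):
    """Hypothetical control loop tying eye tracking to the two variable focus lenses.

    eye_tracker, first_lens, second_lens: objects assumed to expose
        get_vergence_distance_m() and set_power_dpt(power) methods.
    prescription_dpt_for: callable returning the viewer's corrective power
        (diopters) for a given viewing distance in meters.
    """
    while True:
        # Distance at which the viewer's eyes are currently converged.
        distance_m = eye_tracker.get_vergence_distance_m()

        # Power (diopters, 1/m) that makes content appear at that distance;
        # optical infinity corresponds to 0 dpt.
        depth_plane_dpt = 0.0 if distance_m == float("inf") else 1.0 / distance_m

        # Eye-side (first) lens: place virtual content on the desired depth plane
        # and add the viewer's prescription for this distance (signs illustrative).
        first_power = -depth_plane_dpt + prescription_dpt_for(distance_m)

        # World-side (second) lens: cancel the eye-side power so light from the
        # real world is not aberrated by the correction applied for projected light.
        second_power = -first_power

        first_lens.set_power_dpt(first_power)
        second_lens.set_power_dpt(second_power)
        time.sleep(1 / 60)  # update roughly once per display frame (illustrative)
```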

In some other embodiments, a method is provided for displaying image information on a head-mountable display. The method comprises providing the display mounted on the head of a viewer, the display being configured to display image information on one or more depth planes and comprising one or more waveguides configured to project light to the viewer to display the image information; the one or more waveguides are further configured to transmit light from objects in the surrounding environment to the viewer. The method further comprises determining a vergence point of the viewer's eyes and correcting a refractive error of an eye of the viewer by varying the optical power of a first variable focus lens element disposed between the one or more waveguides and the eye, based on the determined vergence point, and varying the optical power of a second variable focus lens element disposed between the one or more waveguides and the environment surrounding the viewer, based on the determined vergence point.

Example 1: A display system comprising:

a head-mountable display configured to project light to a viewer to display image information on one or more depth planes, the display comprising:

one or more waveguides configured to project the light to the viewer, wherein the one or more waveguides are further configured to transmit light from objects in a surrounding environment to the viewer;

a first variable focus lens element between the one or more waveguides and a first eye of the viewer;

a second variable focus lens element between the one or more waveguides and the surrounding environment; and

an eye tracking system configured to determine the vergence of the viewer's eyes, wherein the display system is configured to adjust the optical power of the first and second variable focus lens elements based on the determined vergence.

Example 2: The display system of Example 1, wherein the display system is configured to vary the optical power of the first and second variable focus lens elements based on the depth plane for displaying the image information.

Example 3: The display system of any of Examples 1-2, wherein the display system is configured to adjust the optical power of the second variable focus lens element in response to the optical power of the first variable focus lens element.

Example 4: The display system of any of Examples 1-3, wherein the one or more waveguides are configured to project divergent light to the viewer to display the image information.

Example 5: The display system of any of Examples 1-4, wherein each waveguide has a fixed optical power.

Example 6: The display system of any of Examples 1-5, further comprising a third variable focus lens element between the one or more waveguides and a second eye of the viewer.

Example 7: The display system of Example 6, further comprising a fourth variable focus lens element between the one or more waveguides and the surrounding environment.

Example 8: The display system of Example 7, wherein the display system is configured to adjust the optical power of the third variable focus lens element to vary a wavefront of the projected light based on the determined vergence.

Example 9: The display system of any of Examples 6-8, wherein the display system is configured to adjust the optical power of the fourth variable focus lens element to vary a wavefront of incoming light from objects in the surrounding environment based on the determined vergence.

Example 10: The display system of any of Examples 1-9, wherein the eye tracking system comprises one or more cameras.

Example 11: The display system of any of Examples 1-10, wherein the optical power of the first and/or second variable focus lens elements is adjusted based on a prescription for correcting the viewer's vision at two or more distances.

Example 12: The display system of any of Examples 1-11, having three or more preset prescription optical powers for each variable focus lens element.

Example 13: The display system of any of Examples 1-12, wherein the number of available prescription optical powers is equal to or less than the total number of depth planes of the display.

Example 14: The display system of any of Examples 1-13, wherein the first and/or second variable focus lens elements comprise a layer of liquid crystal sandwiched between two substrates.

Example 15: The display system of Example 14, wherein the first and/or second variable focus lens elements comprise electrodes for altering the refractive index of the liquid crystal layer upon application of a voltage.

Example 16: The display system of any of Examples 14-15, wherein the substrates comprise glass.

Example 17: The display system of any of Examples 1-16, further comprising an electronic hardware control system configured to adjust the refractive index of the first and/or second variable focus lens elements by application of an electrical current or voltage.

Example 18: The display system of Example 17, wherein the eye tracking system forms a feedback loop to the electronic hardware control system, such that the refractive index of the first and/or second variable focus lens elements is varied based on the determined vergence.

Example 19: A method for displaying image information on a head-mountable display, the method comprising:

providing the display mounted on the head of a viewer, the display being configured to display image information on one or more depth planes and comprising:

one or more waveguides configured to project light to the viewer to display the image information,

wherein the one or more waveguides are further configured to transmit light from objects in a surrounding environment to the viewer;

determining a vergence point of the eyes of the viewer; and

correcting a refractive error of an eye of the viewer by:

varying the optical power of a first variable focus lens element disposed between the one or more waveguides and the eye of the viewer, based on the determined vergence point; and

varying the optical power of a second variable focus lens element disposed between the one or more waveguides and the environment surrounding the viewer, based on the determined vergence point.

Example 20: The method of Example 19, further comprising:

providing a third variable focus lens element and a fourth variable focus lens element, wherein the third variable focus lens element is between the one or more waveguides and the other eye of the viewer, and the fourth variable focus lens element is forward of the third variable focus lens element, between the one or more waveguides and the surrounding environment; and

correcting a refractive error of the other eye of the viewer by varying the optical power of the third and fourth variable focus lens elements based on the determined vergence.

Example 21: The method of Example 20, wherein determining the vergence point comprises using one or more cameras to track the vergence of the eye and the other eye.

Example 22: The method of any of Examples 19-21, wherein the optical powers of the first and second variable focus lens elements are varied simultaneously.

Example 23: The method of any of Examples 19-22, wherein the one or more waveguides comprise diffractive optical elements configured to output divergent light from the waveguides.

Augmented reality (AR) systems may present virtual content to a viewer while still allowing the viewer to see the surrounding environment. Preferably, this content is displayed on a head-mountable display, e.g., as part of eyewear, that projects image information to the viewer's eyes while also transmitting light from the surrounding environment to those eyes.

Many viewers, however, have refractive errors, such as myopia, hyperopia, and presbyopia, that prevent light from properly focusing on their retinas. To clearly see projected image information, these viewers may require lens elements with a particular prescription optical power. In some embodiments, such lens elements may be positioned between a waveguide and the viewer's eye. Unfortunately, these lens elements, and possibly other optically transmissive parts of the display, such as the waveguides, may cause aberrations in the viewer's view of the surrounding environment. In addition, many lens elements have a fixed optical power that may not be able to correct all of the refractive errors a viewer experiences.

In some embodiments, a display system includes first and second variable focus lens elements that sandwich (i.e., are disposed on either side of) a waveguide or a plurality of waveguides. The first lens element may be between the one or more waveguides and the eye of the viewer, and may correct refractive errors in the focusing of light projected from the one or more waveguides to that eye. In some embodiments, the first lens element may also be configured to provide appropriate optical power to place displayed virtual content on a desired depth plane. The second lens element may be between the surrounding environment and the one or more waveguides, and may provide optical power to correct for aberrations in the transmission of light from the environment through the waveguides and the first lens element. In some embodiments, refractive errors in the viewer's other eye may be corrected separately; for example, third and fourth variable focus lens elements may be provided between that eye and the waveguides. The focal length/optical power of the variable focus lens elements may be adjusted such that both the real world and the virtual content are focused onto the retina of the user, allowing the user to view both real and virtual objects with high optical quality.

In some embodiments, the display system includes an eye tracking system configured to determine the vergence of the viewer's eyes. For example, one or more cameras may determine the vergence point of the eyes, which may in turn be used to determine the distance at which the eyes are focused, in order to calculate the appropriate correction for that distance. Different corrections may be needed for different vergence points; for example, the corrective optical power may need to differ when viewing near, intermediate, or far objects. In some embodiments, the variable focus lens elements may provide finer gradations of correction than are available with fixed corrective lenses; for example, there may be two, three, four, five, or more unique corrections for each eye.
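As one way to picture how vergence maps to a focal distance, the sketch below triangulates a fixation distance from the two eyes' gaze angles and an assumed interpupillary distance. The function name, and the assumption that the eye tracker reports inward gaze angles in degrees relative to straight ahead, are illustrative and do not describe any particular eye tracking hardware.

```python
import math

def vergence_distance_m(left_inward_deg: float, right_inward_deg: float,
                        ipd_m: float = 0.063) -> float:
    """Estimate the distance of the fixation point from the convergence of the eyes.

    left_inward_deg / right_inward_deg: how far each eye is rotated inward from
    straight ahead (0 degrees for both corresponds to gazing at optical infinity).
    ipd_m: interpupillary distance in meters (roughly 63 mm for an average adult).
    """
    total_convergence = math.radians(left_inward_deg + right_inward_deg)
    if total_convergence <= 0:
        return math.inf  # parallel lines of sight: fixation at optical infinity
    # Treat the two lines of sight and the interpupillary baseline as a triangle
    # and solve for the distance from the baseline to the fixation point.
    return (ipd_m / 2.0) / math.tan(total_convergence / 2.0)

# Example: each eye rotated about 1.8 degrees inward corresponds to roughly 1 m.
print(round(vergence_distance_m(1.8, 1.8), 2))
```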

Advantageously, the variable focus lens elements may correct the user's vision in place of fixed prescription lenses, whether the user is viewing virtual objects projected by the display from different depth planes or real-world objects at different distances. For users who require near vision correction, the variable focus lens elements may be configured to provide the near vision optical power when viewing virtual or real-world objects at close range. For users who require intermediate distance vision correction, the variable focus lens elements may be configured to provide the intermediate vision optical power when viewing virtual or real-world objects at distances corresponding to the intermediate vision zone. Similarly, for users who require far vision correction, the variable focus lens elements may be configured to provide the far vision optical power when viewing virtual or real-world objects at distances corresponding to the far vision zone. The display system may have access to the user's prescriptions for near, intermediate, and far vision correction, and, in certain embodiments, may adjust the optical power of the variable focus lens elements to match the appropriate prescription while the user is viewing real-world or virtual objects at distances corresponding to the far, intermediate, or near vision zones.
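The following sketch illustrates one possible way such a system could choose among stored near, intermediate, and far prescriptions based on the vergence distance. The zone boundaries (0.5 m and 2.0 m) and the dictionary-based prescription store are assumptions made for illustration, not values taken from the disclosure.

```python
def prescription_for_distance(distance_m: float, prescriptions_dpt: dict) -> float:
    """Pick the preset prescription optical power (in diopters) for a vergence
    distance. prescriptions_dpt is assumed to hold 'near', 'intermediate',
    and 'far' entries."""
    if distance_m < 0.5:          # near vision zone (boundary chosen for illustration)
        return prescriptions_dpt["near"]
    elif distance_m < 2.0:        # intermediate vision zone
        return prescriptions_dpt["intermediate"]
    else:                         # far vision zone
        return prescriptions_dpt["far"]

user_rx = {"near": 2.0, "intermediate": 1.0, "far": 0.0}   # illustrative values
print(prescription_for_distance(0.4, user_rx))   # 2.0 -> near correction applied
print(prescription_for_distance(3.0, user_rx))   # 0.0 -> far correction applied
```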

Advantageously, the first and/or second variable focus lens elements allow the same head-mountable display to be used by multiple users without physically swapping corrective lens elements; the display adapts to the user. In addition, the variable focus lens elements may be configured to provide appropriate optical power to place image information projected from the one or more waveguides on a desired depth plane, e.g., by adjusting the divergence of light propagating from the one or more waveguides to the viewer. This may simplify the design and manufacture of the display, since the same display may be used by different users and fewer dedicated optical structures may be needed to display image information on different depth planes. Moreover, the ability to provide a wide range of corrections in real time may allow more corrections than are practical with conventional corrective glasses, which can improve the clarity and/or acuity with which the viewer sees both the surrounding world and the displayed image information, and may also increase long-term viewer comfort. In addition, because the variable focus lens elements may be set to different prescriptions, the display may adapt to changes in a user's prescription, e.g., by changing the preset corrections programmed into the display system.

Reference will now be made to the figures, in which like reference numerals refer to like parts throughout.

FIG. 2 illustrates an example of a wearable display system 60. The display system 60 includes a display 70, and various mechanical and electronic modules and systems to support the functioning of that display 70. The display 70 may be coupled to a frame 80, which is wearable by a display system user or viewer 90 and which is configured to position the display 70 in front of the eyes of the user 90. The display 70 may be considered eyewear in some embodiments. In some embodiments, a speaker 100 is coupled to the frame 80 and positioned adjacent the ear canal of the user 90 to provide stereo/shapeable audio control. The display system may also include one or more microphones 110 or other devices to detect sound. In some embodiments, the microphone is configured to allow the user to provide inputs or commands to the system 60 (e.g., the selection of voice commands, natural language queries, etc.), and/or may allow audio communication with other persons (e.g., with users of similar display systems). The microphone may further be configured as a peripheral sensor to collect audio data (e.g., sounds from the user and/or environment). In some embodiments, the display system may also include a peripheral sensor 120a, which may be separate from the frame 80 and attached to the body of the user 90 (e.g., on the head, torso, or an extremity). The peripheral sensor 120a may be configured to acquire data characterizing a physiological state of the user 90 in some embodiments; for example, the sensor 120a may be an electrode.

With continued reference to FIG. 2, the display 70 is operatively coupled by a communications link 130, such as a wired lead or wireless connectivity, to a local processing and data module 140, which may be mounted in a variety of configurations, such as fixedly attached to the frame 80, fixedly attached to a helmet or hat worn by the user, embedded in headphones, or otherwise removably attached to the user 90 (e.g., in a backpack-style or belt-coupling-style configuration). Similarly, the sensor 120a may be operatively coupled to the local processing and data module 140 by a communications link 120b, e.g., a wired lead or wireless connectivity. The local processing and data module 140 may comprise a hardware processor, as well as digital memory, such as flash memory or a hard disk drive, both of which may be utilized to assist in the processing, caching, and storage of data. The data may include data a) captured from sensors (which may, e.g., be operatively coupled to the frame 80 or otherwise attached to the user 90), and/or b) acquired and/or processed using the remote processing module 150 (including data relating to virtual content), possibly for passage to the display 70 after such processing or retrieval. The local processing and data module 140 may be operatively coupled by communication links 170, 180 to the remote processing module 150 and the remote data repository 160 such that these remote modules 150, 160 are operatively coupled to each other and available as resources to the local processing and data module 140. In some embodiments, the local processing and data module 140 may include one or more image capture devices, microphones, and inertial measurement units. In some other embodiments, one or more of these sensors may be attached to the frame 80, or may be standalone structures that communicate with the local processing and data module 140 by wired or wireless communication pathways.

With continued reference to FIG. 2, in some embodiments, the remote processing module 150 may comprise one or more processors configured to analyze and process data and/or image information. In some embodiments, the remote data repository 160 may comprise a digital data storage facility, which may be available through the internet or another networking configuration. In some embodiments, the remote data repository 160 may include one or more remote servers, which provide information, e.g., information for generating augmented reality content, to the local processing and data module 140 and/or the remote processing module 150. In some embodiments, all data is stored and all computations are performed in the local processing and data module, allowing fully autonomous use from remote modules.

With reference now to FIG. 3, the perception of an image as being "three-dimensional" may be achieved by providing slightly different presentations of the image to each eye of the viewer. FIG. 3 illustrates a conventional display system for simulating three-dimensional imagery for a user. Two distinct images 190, 200, one for each eye 210, 220, are output to the user. The images 190, 200 are spaced from the eyes 210, 220 by a distance 230 along an optical or z-axis that is parallel to the line of sight of the viewer. The images 190, 200 are flat, and the eyes 210, 220 may focus on the images by assuming a single accommodated state. Such 3-D display systems rely on the human visual system to combine the images 190, 200 to provide a perception of depth and/or scale for the combined image.

It will be appreciated, however, that the human visual system is more complicated, and providing a realistic perception of depth is more challenging. For example, many viewers of conventional "3-D" display systems find such systems to be uncomfortable or may not perceive a sense of depth at all. Without being limited by theory, it is believed that viewers of an object may perceive the object as being "three-dimensional" due to a combination of vergence and accommodation. Vergence movements (i.e., rotation of the eyes so that the pupils move toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) are closely associated with focusing (or "accommodation") of the lenses and pupils of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the "accommodation-vergence reflex," as well as pupil dilation or constriction. Likewise, under normal conditions, a change in vergence will trigger a matching change in accommodation of lens shape and pupil size. As noted herein, many stereoscopic or "3-D" display systems display a scene using slightly different presentations (and, thus, slightly different images) to each eye such that a three-dimensional perspective is perceived by the human visual system. Such systems are uncomfortable for many viewers, however, since they, among other things, simply provide different presentations of a scene with the eyes viewing all the image information at a single accommodated state, and thereby work against the "accommodation-vergence reflex." Display systems that provide a better match between accommodation and vergence may form more realistic and comfortable simulations of three-dimensional imagery.

FIG. 4 illustrates aspects of an approach for simulating three-dimensional imagery using multiple depth planes. With reference to FIG. 4, objects at various distances from the eyes 210, 220 along the z-axis are accommodated by the eyes 210, 220 so that those objects are in focus. The eyes 210, 220 assume particular accommodated states to bring into focus objects at different distances along the z-axis. Consequently, a particular accommodated state may be said to be associated with a particular one of the depth planes 240, with an associated focal distance, such that objects or parts of objects in a particular depth plane are in focus when the eye is in the accommodated state for that depth plane. In some embodiments, three-dimensional imagery may be simulated by providing different presentations of an image for each of the eyes 210, 220, and also by providing different presentations of the image corresponding to each of the depth planes. While the fields of view are shown as separate for clarity of illustration, they may overlap, for example, as the distance along the z-axis increases. In addition, while the contours of the depth planes are shown as flat for ease of illustration, they may be curved in physical space, such that all features in a depth plane are in focus with the eye in a particular accommodated state.

The distance between an object and an eye 210 or 220 may also change the amount of divergence of light from that object, as viewed by that eye. FIGS. 5A-5C illustrate relationships between distance and the divergence of light rays. The distance between the object and the eye 210 is represented by, in order of decreasing distance, R1, R2, and R3. As shown in FIGS. 5A-5C, the light rays become more divergent as the distance to the object decreases; as the distance increases, the light rays become more collimated. Stated another way, the light field produced by a point (an object or a part of an object) has a spherical wavefront curvature that is a function of how far away the point is from the user's eye. The curvature increases as the distance between the object (or part of the object) and the eye 210 decreases. Consequently, the degree of divergence of light rays is also different at different depth planes, with the divergence increasing as the distance between the depth planes and the viewer's eye 210 decreases. While only a single eye 210 is illustrated in FIGS. 5A-5C and other figures herein for clarity of illustration, the discussion regarding the eye 210 applies to both eyes 210 and 220 of a viewer.
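This relationship is often summarized by expressing the wavefront curvature in diopters, the reciprocal of the distance in meters; the short sketch below shows how the divergence falls toward zero (collimated light) as the viewing distance grows. The numbers are generic optics, not specific values from the disclosure.

```python
def wavefront_curvature_dpt(distance_m: float) -> float:
    """Spherical wavefront curvature, in diopters (1/m), of light from a point
    source located distance_m away; larger values mean more divergent light."""
    return 0.0 if distance_m == float("inf") else 1.0 / distance_m

for d in (0.25, 0.5, 1.0, 2.0, float("inf")):
    # 0.25 m -> 4.0 dpt, 1.0 m -> 1.0 dpt, infinity -> 0.0 dpt (collimated)
    print(d, wavefront_curvature_dpt(d))
```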

Without being limited by theory, it is believed that the human eye typically may interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by presenting, to the eye, different images corresponding to each of this limited number of depth planes. The different presentations may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features located on different depth planes, and/or based on observing different image features on different depth planes being out of focus.

FIG. 6 shows an example of a waveguide stack for outputting image information to a user. A display system 250 includes a stack of waveguides, or stacked waveguide assembly, 260 that may be utilized to provide three-dimensional perception to the eye/brain using a plurality of waveguides 270, 280, 290, 300, 310. In some embodiments, the display system 250 is the system 60 of FIG. 2, with FIG. 6 showing some parts of that system 60 in greater detail. For example, the waveguide assembly 260 may be part of the display 70 of FIG. 2. It will be appreciated that the display system 250 may be considered a light field display in some embodiments. The waveguide assembly 260 may also be referred to as an eyepiece.

With continued reference to FIG. 6, the waveguide assembly 260 may also include a plurality of features 320, 330, 340, 350 between the waveguides. In some embodiments, the features 320, 330, 340, 350 may be one or more lenses. The waveguides 270, 280, 290, 300, 310 and/or the plurality of lenses 320, 330, 340, 350 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image injection devices 360, 370, 380, 390, 400 may function as a source of light for the waveguides and may be utilized to inject image information into the waveguides 270, 280, 290, 300, 310, each of which may be configured to distribute incoming light across the waveguide for output toward the eye 210. Light exits an output surface 410, 420, 430, 440, 450 of the image injection devices 360, 370, 380, 390, 400 and is injected into a corresponding input surface 460, 470, 480, 490, 500 of the waveguides 270, 280, 290, 300, 310. In some embodiments, each of the input surfaces 460, 470, 480, 490, 500 may be an edge of a corresponding waveguide, or may be part of a major surface of the corresponding waveguide (that is, one of the waveguide surfaces directly facing the world 510 or the viewer's eye 210). In some embodiments, a single beam of light (e.g., a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 210 at particular angles (and amounts of divergence) corresponding to the depth plane associated with that waveguide. In some embodiments, a single one of the image injection devices 360, 370, 380, 390, 400 may be associated with, and inject light into, a plurality (e.g., three) of the waveguides 270, 280, 290, 300, 310.

In some embodiments, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other embodiments, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display, which may pipe image information to one or more of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors (e.g., different component colors).

In some embodiments, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projector system 520, which comprises a light module 540 that may include a light emitter, such as a light emitting diode (LED). The light from the light module 540 may be directed to, and modified by, a light modulator 530 via a beam splitter 550. The light modulator 530 may be configured to change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310. Examples of spatial light modulators include liquid crystal displays (LCD), including liquid crystal on silicon (LCOS) displays. It will be appreciated that the image injection devices 360, 370, 380, 390, 400 are illustrated schematically and, in some embodiments, may represent different light paths and locations in a common projection system configured to output light into associated ones of the waveguides 270, 280, 290, 300, 310.

In some embodiments, the display system 250 may be a scanning fiber display comprising one or more scanning fibers configured to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more of the waveguides 270, 280, 290, 300, 310 and ultimately to the eye 210 of the viewer. In some embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which is configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. One or more optical fibers may be configured to transmit light from the light module 540 to the one or more waveguides 270, 280, 290, 300, 310. In addition, one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.

A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 540, and the light modulator 530. In some embodiments, the controller 560 is part of the local processing and data module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to any of the schemes described herein. The controller may be a single integral device or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150 (FIG. 2) in some embodiments.
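As a rough illustration of the kind of timing logic such a controller might implement, the sketch below provides image information to the waveguides one depth plane at a time. The interfaces (an inject method and an index-matched list of per-depth-plane frames) are hypothetical, and this frame-sequential arrangement is only one of many possible schemes; it is not presented as the disclosed implementation.

```python
def drive_waveguides(image_injectors, frames_by_depth_plane):
    """Hypothetical frame-sequential scheme: each waveguide is associated with one
    depth plane, and the controller provides that depth plane's image in turn.

    image_injectors: list of injector objects, one per waveguide/depth plane,
        each assumed to expose an inject(image) method.
    frames_by_depth_plane: list of images, index-matched to image_injectors.
    """
    for injector, image in zip(image_injectors, frames_by_depth_plane):
        injector.inject(image)  # provide this depth plane's image information
```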

With continued reference to FIG. 6, the waveguides 270, 280, 290, 300, 310 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 270, 280, 290, 300, 310 may each be planar or curved, with major top and bottom surfaces and edges extending between those surfaces. In the illustrated configuration, the waveguides 270, 280, 290, 300, 310 may each include out-coupling optical elements 570, 580, 590, 600, 610 configured to extract light out of a waveguide by redirecting the light propagating within the waveguide out of the waveguide, to output image information to the eye 210. Extracted light may also be referred to as out-coupled light, and the out-coupling optical elements may also be referred to as light extracting optical elements. An extracted beam of light may be output by the waveguide at locations where the light propagating in the waveguide strikes a light extracting optical element. The out-coupling optical elements 570, 580, 590, 600, 610 may, for example, be gratings with diffractive optical features. While illustrated as disposed at the bottom major surfaces of the waveguides for ease of description and drawing clarity, in some embodiments the out-coupling optical elements 570, 580, 590, 600, 610 may be disposed at the top and/or bottom major surfaces. In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 270, 280, 290, 300, 310. In some other embodiments, the waveguides 270, 280, 290, 300, 310 may be a monolithic piece of material, and the out-coupling optical elements 570, 580, 590, 600, 610 may be formed on a surface and/or in the interior of that piece of material.

With continued reference to FIG. 6, each waveguide is configured to output light to form an image corresponding to a particular depth plane. For example, the waveguide 270 nearest the eye may be configured to deliver collimated light (which was injected into that waveguide 270) to the eye 210. The collimated light may be representative of the optical infinity focal plane. The next waveguide up 280 may be configured to send out collimated light that passes through the first lens 350 (e.g., a negative lens) before reaching the eye 210; that first lens 350 may be configured to create a slight convex wavefront curvature so that the eye/brain interprets light coming from the next waveguide up 280 as coming from a first focal plane closer inward toward the eye 210 from optical infinity. Similarly, the third waveguide up 290 passes its output light through both the first 350 and second 340 lenses before reaching the eye 210; in this way, the eye/brain may interpret light coming from the third waveguide 290 as coming from a second focal plane that is even closer inward toward the person from optical infinity than was light from the next waveguide up 280.

The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510, a compensating lens layer 620 may be disposed at the top of the stack to offset the aggregate power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static. In alternative embodiments, either or both may be dynamic, using electro-active features.
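To make the role of the compensating lens layer 620 concrete, here is a small sketch of the bookkeeping involved: the compensating layer's power is chosen to be equal and opposite to the aggregate power of the lens stack below it, so that light from the world 510 passes through the whole assembly with approximately no net optical power. The specific diopter values are made up for illustration.

```python
# Optical powers (in diopters) of the lenses 350, 340, 330, 320 between the
# waveguides and the eye; negative values indicate negative (diverging) lenses.
lens_stack_dpt = [-0.5, -0.5, -1.0, -1.0]   # illustrative values only

# Aggregate power experienced by world light passing through the whole lens stack.
aggregate_dpt = sum(lens_stack_dpt)          # -3.0 dpt in this example

# The compensating lens layer 620 offsets that aggregate power so that the
# net power applied to light from the world 510 is approximately zero.
compensating_layer_dpt = -aggregate_dpt      # +3.0 dpt
assert abs(aggregate_dpt + compensating_layer_dpt) < 1e-9
```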

In some embodiments, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides may be configured to output images set to the same depth plane, or multiple subsets of the waveguides may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This can provide advantages for forming a tiled image that offers an expanded field of view at those depth planes.

With continued reference to FIG. 6, the out-coupling optical elements 570, 580, 590, 600, 610 may be configured to both redirect light out of their respective waveguides and to output this light with the appropriate amount of divergence or collimation for the particular depth plane associated with the waveguide. As a result, waveguides having different associated depth planes may have different configurations of out-coupling optical elements 570, 580, 590, 600, 610, which output light with a different amount of divergence depending on the associated depth plane. In some embodiments, the light extracting optical elements 570, 580, 590, 600, 610 may be volumetric or surface features, which may be configured to output light at specific angles. For example, the light extracting optical elements 570, 580, 590, 600, 610 may be volume holograms, surface holograms, and/or diffraction gratings. In some embodiments, the features 320, 330, 340, 350 may not be lenses; rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).

In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or "diffractive optical element" (also referred to herein as a "DOE"). Preferably, the DOEs have a sufficiently low diffraction efficiency so that only a portion of the light of a beam is deflected toward the eye 210 at each intersection with the DOE, while the rest continues to travel through the waveguide via TIR. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations, and the result is a fairly uniform pattern of exit emission toward the eye 210 for the particular collimated beam bouncing around within the waveguide.

In some embodiments, one or more DOEs may be switchable between "on" states, in which they actively diffract, and "off" states, in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer-dispersed liquid crystal, in which the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).

In some embodiments, a camera assembly 630 (e.g., a digital camera, including visible light and/or infrared cameras) may be provided to capture images of the eye 210 and/or tissue around the eye 210 to, e.g., detect user inputs and/or monitor the physiological state of the user. As used herein, a camera may be any image capture device. In some embodiments, the camera assembly 630 may include an image capture device and a light source that projects light (e.g., infrared light) to the eye, which light may then be reflected by the eye and detected by the image capture device. In some embodiments, the camera assembly 630 may be attached to the frame 80 (FIG. 2) and may be in electrical communication with the processing modules 140 and/or 150, which may process image information from the camera assembly. In some embodiments, one camera assembly 630 may be utilized for each eye, to separately monitor each eye.

With reference now to FIG. 7, an example of exit beams output by a waveguide is shown. One waveguide is illustrated, but it will be appreciated that other waveguides in the waveguide assembly 260 (FIG. 6) may function similarly, where the waveguide assembly 260 includes multiple waveguides. Light 640 is injected into the waveguide 270 at the input surface 460 of the waveguide 270 and propagates within the waveguide 270 by TIR. At points where the light 640 impinges on the DOE 570, a portion of the light exits the waveguide as exit beams 650. The exit beams 650 are illustrated as substantially parallel but, depending on the depth plane associated with the waveguide 270, they may also be redirected to propagate to the eye 210 at an angle (e.g., forming divergent exit beams). It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with out-coupling optical elements that out-couple light to form images that appear to be set on a depth plane at a large distance (e.g., optical infinity) from the eye 210. Other waveguides or other sets of out-coupling optical elements may output a more divergent exit beam pattern, which would require the eye 210 to accommodate to a closer distance to bring the image into focus on the retina and would be interpreted by the brain as light coming from a distance closer to the eye 210 than optical infinity.

In some embodiments, a full-color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors. FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors. The illustrated embodiment shows depth planes 240a-240f, although more or fewer depths are also contemplated. Each depth plane may have three or more component color images associated with it, including a first image of a first color, G, a second image of a second color, R, and a third image of a third color, B. Different depth planes are indicated in the figure by different numbers for diopters (dpt) following the letters G, R, and B. As examples, the numbers following each of these letters indicate diopters (1/m), or the inverse distance of the depth plane from a viewer, and each box in the figure represents an individual component color image. In some embodiments, to account for differences in the eye's focusing of light of different wavelengths, the exact placement of the depth planes for different component colors may be varied. For example, different component color images for a given depth plane may be placed on depth planes corresponding to different distances from the user. Such an arrangement may increase visual acuity and user comfort, and/or may decrease chromatic aberrations.

In some embodiments, light of each component color may be output by a single dedicated waveguide and, consequently, each depth plane may have multiple waveguides associated with it. In such embodiments, each box in the figure including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in the drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other embodiments, multiple component colors may be output by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.

With continued reference to FIG. 8, in some embodiments, G is the color green, R is the color red, and B is the color blue. In some other embodiments, other colors associated with other wavelengths of light, including magenta and cyan, may be used in addition to, or in place of, one or more of red, green, or blue.

It will be appreciated that references to a given color of light throughout this disclosure encompass light of one or more wavelengths within a range of wavelengths that are perceived by a viewer as being of that color. For example, red light may include light of one or more wavelengths in the range of about 620-780 nm, green light may include light of one or more wavelengths in the range of about 492-577 nm, and blue light may include light of one or more wavelengths in the range of about 435-493 nm.

In some embodiments, the light source 540 (FIG. 6) may be configured to emit light of one or more wavelengths outside the visual perception range of the viewer. In addition, the light redirecting structures of the waveguides of the display 250 may be configured to direct and emit this light out of the display toward the user's eye 210, e.g., for imaging and/or user stimulation applications.

With reference now to FIG. 9A, in some embodiments, light impinging on a waveguide may need to be redirected to in-couple that light into the waveguide. An in-coupling optical element may be used to redirect and in-couple the light into its corresponding waveguide. FIG. 9A illustrates a cross-sectional side view of an example of a plurality, or set, 660 of stacked waveguides, each including an in-coupling optical element. The waveguides may each be configured to output light of one or more different wavelengths, or one or more different ranges of wavelengths. It will be appreciated that the stack 660 may correspond to the stack 260 (FIG. 6), and the illustrated waveguides of the stack 660 may correspond to part of the plurality of waveguides 270, 280, 290, 300, 310, except that light from one or more of the image injection devices 360, 370, 380, 390, 400 is injected into the waveguides from a position that requires the light to be redirected for in-coupling.

The illustrated set 660 includes waveguides 670, 680, and 690. Each waveguide includes an associated in-coupling optical element (which may also be referred to as a light input area on the waveguide), with, e.g., in-coupling optical element 700 disposed on a major surface of waveguide 670, in-coupling optical element 710 disposed on a major surface of waveguide 680, and in-coupling optical element 720 disposed on a major surface of waveguide 690. In some embodiments, one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690, particularly where the one or more in-coupling optical elements are reflective, deflecting optical elements. As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some embodiments, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690. In some embodiments, as discussed above, the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light while transmitting other wavelengths. While illustrated on one side or corner of their respective waveguide 670, 680, 690, the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some embodiments.

As shown, the in-coupling optical elements 700, 710, 720 may be laterally offset from one another. In some embodiments, each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element. For example, each in-coupling optical element 700, 710, 720 may be configured to receive light from a different image injection device 360, 370, 380, 390, 400, as shown in FIG. 6, and may be separated (e.g., laterally spaced apart) from the other in-coupling optical elements 700, 710, 720 such that it substantially does not receive light intended for those other in-coupling optical elements.

Each waveguide also includes associated light distributing elements, with, e.g., light distributing elements 730 disposed on a major surface (e.g., a top major surface) of waveguide 670, light distributing elements 740 disposed on a major surface (e.g., a top major surface) of waveguide 680, and light distributing elements 750 disposed on a major surface (e.g., a top major surface) of waveguide 690. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on a bottom major surface of the associated waveguides 670, 680, 690, respectively. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on both top and bottom major surfaces of the associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750 may be disposed on different ones of the top and bottom major surfaces in different associated waveguides 670, 680, 690, respectively.

The waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material. For example, as illustrated, layer 760a may separate waveguides 670 and 680, and layer 760b may separate waveguides 680 and 690. In some embodiments, the layers 760a and 760b are formed of low refractive index materials, that is, materials having a lower refractive index than the material forming the waveguides 670, 680, 690. Preferably, the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690. Advantageously, the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide). In some embodiments, the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
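The reason a lower-index cladding layer facilitates TIR can be seen from the standard critical-angle relation: light inside the waveguide striking a major surface at an angle from the surface normal greater than the critical angle is totally internally reflected. The sketch below evaluates that relation for an assumed waveguide index of 1.5 and the 0.05/0.10 index differences mentioned above; the indices are illustrative, not values specified by the disclosure.

```python
import math

def critical_angle_deg(n_waveguide: float, n_cladding: float) -> float:
    """Critical angle (degrees from the surface normal) for total internal
    reflection at a waveguide/cladding interface; TIR occurs for steeper angles."""
    return math.degrees(math.asin(n_cladding / n_waveguide))

n_core = 1.50                               # assumed waveguide refractive index
for delta_n in (0.05, 0.10):
    n_clad = n_core - delta_n
    # A larger index difference gives a smaller critical angle, so TIR occurs
    # over a wider range of propagation angles within the waveguide.
    print(delta_n, round(critical_angle_deg(n_core, n_clad), 1))
```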

Preferably, for ease of manufacturing and other considerations, the material forming the waveguides 670, 680, 690 is similar or the same, and the material forming the layers 760a, 760b is similar or the same. In some embodiments, the material forming the waveguides 670, 680, 690 may be different between one or more waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the refractive index relationships noted above.

With continued reference to FIG. 9A, light rays 770, 780, 790 are incident on the set 660 of waveguides. It will be appreciated that the light rays 770, 780, 790 may be injected into the waveguides 670, 680, 690 by one or more image injection devices 360, 370, 380, 390, 400 (FIG. 6).

In some embodiments, the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors. The in-coupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR. In some embodiments, the in-coupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and its associated in-coupling optical element.

For example, in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different wavelengths or ranges of wavelengths. The transmitted ray 780 then impinges on and is deflected by in-coupling optical element 710, which is configured to deflect light of a second wavelength or range of wavelengths. The ray 790 is deflected by in-coupling optical element 720, which is configured to selectively deflect light of a third wavelength or range of wavelengths.

With continued reference to FIG. 9A, the deflected light rays 770, 780, 790 are deflected so that they propagate through a corresponding waveguide 670, 680, 690; that is, the in-coupling optical elements 700, 710, 720 of each waveguide deflect light into that corresponding waveguide to in-couple the light into it. The light rays 770, 780, 790 are deflected at angles that cause the light to propagate through the respective waveguide 670, 680, 690 by TIR, and they propagate through the respective waveguide by TIR until impinging on the waveguide's corresponding light distributing elements 730, 740, 750.

With reference now to FIG. 9B, a perspective view of the example of the plurality of stacked waveguides of FIG. 9A is illustrated. As noted above, the in-coupled light rays 770, 780, 790 are deflected by the in-coupling optical elements 700, 710, 720, respectively, and then propagate by TIR within the waveguides 670, 680, 690, respectively. The light rays 770, 780, 790 then impinge on the light distributing elements 730, 740, 750, respectively. The light distributing elements 730, 740, 750 deflect the light rays 770, 780, 790 so that they propagate toward the out-coupling optical elements 800, 810, 820, respectively.

In some embodiments, the light distributing elements 730, 740, 750 are orthogonal pupil expanders (OPEs). The OPEs may deflect or distribute light to the out-coupling optical elements 800, 810, 820 and, in some embodiments, may also increase the beam or spot size of the light as it propagates to the out-coupling optical elements. In some embodiments, the light distributing elements 730, 740, 750 may be omitted and the in-coupling optical elements 700, 710, 720 may be configured to deflect light directly to the out-coupling optical elements 800, 810, 820. For example, with reference to FIG. 9A, the light distributing elements 730, 740, 750 may be replaced with out-coupling optical elements 800, 810, 820, respectively. In some embodiments, the out-coupling optical elements 800, 810, 820 are exit pupils (EPs) or exit pupil expanders (EPEs) that direct light to a viewer's eye 210 (FIG. 7). The OPEs may be configured to increase the dimensions of the eye box in at least one axis, and the EPEs may increase the eye box in an axis crossing, e.g., orthogonal to, the axis of the OPEs. For example, each OPE may be configured to redirect a portion of the light striking the OPE to an EPE of the same waveguide, while allowing the remaining portion of the light to continue to propagate down the waveguide. Upon impinging on the OPE again, another portion of the remaining light is redirected to the EPE, and the remainder continues to propagate further down the waveguide, and so on. Similarly, upon striking the EPE, a portion of the impinging light is directed out of the waveguide toward the user, and the remaining portion continues to propagate through the waveguide until it strikes the EPE again, at which time another portion is directed out of the waveguide, and so on. Consequently, a single beam of in-coupled light may be "replicated" each time a portion of that light is redirected by an OPE or EPE, thereby forming a field of cloned beams of light, as shown in FIG. 6. In some embodiments, the OPE and/or EPE may be configured to modify the size of the beams of light.

Accordingly, with reference to FIGS. 9A and 9B, in some embodiments, the set 660 of waveguides includes waveguides 670, 680, 690; in-coupling optical elements 700, 710, 720; light distributing elements (e.g., OPEs) 730, 740, 750; and out-coupling optical elements (e.g., EPs) 800, 810, 820 for each component color. The waveguides 670, 680, 690 may be stacked with an air gap/cladding layer between each one. The in-coupling optical elements 700, 710, 720 redirect or deflect incident light (with different in-coupling optical elements receiving light of different wavelengths) into their respective waveguides. The light then propagates at an angle that results in TIR within the respective waveguide 670, 680, 690. In the example shown, light ray 770 (e.g., blue light) is deflected by the first in-coupling optical element 700 and then continues to bounce down the waveguide, interacting with the light distributing element (e.g., OPEs) 730 and then the out-coupling optical element (e.g., EPs) 800, in the manner described earlier. The light rays 780 and 790 (e.g., green and red light, respectively) will pass through the waveguide 670, with light ray 780 impinging on, and being deflected by, in-coupling optical element 710. The light ray 780 then bounces down the waveguide 680 via TIR, proceeding on to its light distributing element (e.g., OPEs) 740 and then the out-coupling optical element (e.g., EPs) 810. Finally, light ray 790 (e.g., red light) passes through the waveguides 670 and 680 to impinge on the light in-coupling optical element 720 of the waveguide 690. The light in-coupling optical element 720 deflects the light ray 790 such that the light ray propagates to the light distributing element (e.g., OPEs) 750 by TIR, and then to the out-coupling optical element (e.g., EPs) 820 by TIR, from which it is out-coupled to the viewer, who also receives the out-coupled light from the other waveguides 670, 680.

FIG. 9C illustrates a top-down plan view of the example of the plurality of stacked waveguides of FIGS. 9A and 9B. As illustrated, the in-coupling optical elements 700, 710, 720 are not vertically aligned; rather, they are preferably non-overlapping (e.g., laterally spaced apart, as seen in the top-down view). This spatial arrangement facilitates the injection of light from different sources into different waveguides on a one-to-one basis, thereby allowing a specific light source to be uniquely coupled to a specific waveguide. In some embodiments, arrangements including non-overlapping, spatially separated in-coupling optical elements may be referred to as a shifted pupil system, and the in-coupling optical elements within these arrangements may correspond to sub-pupils.

With reference now to FIGS. 10A and 10B, in some embodiments, augmented reality display systems, such as those described above, may be configured to modify the wavefront of light (both light carrying image information projected from the display system and light from objects in the real world surrounding the viewer) by adjusting the focal lengths of variable focus lens elements. As discussed above, the augmented reality system may include a plurality of stacked waveguides (e.g., corresponding to the stacked waveguides of FIGS. 9A and 9B, or corresponding to the stacked waveguide assembly 260 of FIG. 6) that project light toward the eye of a user (e.g., the user 90 of FIG. 2). In some other embodiments, the display system may include only a single waveguide; consequently, while plural waveguides are referred to in various parts of this disclosure, it will be appreciated that a single waveguide may be substituted for the plurality of waveguides.

As discussed herein, the light projected from the waveguides may be used to provide virtual, augmented reality image information to the viewer, and may be projected such that the viewer perceives the light as coming from one or more different depths, or distances, from the viewer. Because the display device is optically transmissive, the user can also see real objects in the surrounding environment through it. In some embodiments, the waveguides may have fixed optical power and may be configured to output divergent beams of light, with different amounts of divergence, to give the illusion that the projected light comes from different depths.

It will be appreciated that the fixed optical power of the waveguides assumes that the viewer has a suitable accommodative range to focus the light output by the waveguides. As noted herein, however, some viewers may require corrective lenses to see clearly; as a result, image information output by a waveguide may not be readily in focus for those viewers. In some embodiments, a first variable focus lens element may be provided between the waveguide and the viewer's eye to modify the wavefront of light output by the waveguide and thereby correct the viewer's vision. This first lens element, however, is also in the path of light propagating from the surrounding environment to the viewer, and may therefore modify the wavefront of that light and cause aberrations in the viewer's view of the world. To correct such aberrations, a second variable focus lens element may be disposed on the opposite side of the plurality of stacked waveguides from the first variable focus lens element; that is, the second variable focus lens element may be between the plurality of stacked waveguides and the surrounding real world, so as to modify the wavefront of light coming from real-world objects. The second variable focus lens element may be configured to compensate for aberrations caused by the first variable focus lens element. In some embodiments, the second variable focus lens element may also be configured to compensate for aberrations caused by the waveguides.

“In some embodiments the focus of a second variable focus lens lens element could be opposite or inverted to the first. If the first variable focal lens element has a positive optical potential, the second variable lens element could have a negative optical pot, which may be similar to the aggregate optical power of the first lens element. Other embodiments allow for the compensation of both the optical powers of the first variable focal lens element as well as those of the interconnected waveguides. In these cases, the optical power from the second lens elements could be of opposite magnitude to the combined optical power and waveguides.

In some embodiments, waveguides might not have optical power. For example, waveguides could be configured to emit collimated light. The first variable focus lens elements can be used to adjust the wavefront of light emitted by waveguides in order to provide sufficient divergence to allow image information to be interpreted as being on a specific depth plane. The appropriate amount of divergence for different viewers may differ because the optical power required to place image information on a specific depth plane will be adjusted using a differential to adjust for the viewer’s particular optical prescription. The waveguide stack comprising the first and second variable focal lens elements can be simply formed by one waveguide.

It will be appreciated that the first two variable focus lens element elements can be used for one eye, while the third and fourth lens elements may be used for the other.

Summary for “Augmented Reality Systems and Methods with Variable Focus Lens Elements”


“In some embodiments, a display system is provided. The display system comprises a head-mountable display configured to project light to a viewer to display image information on one or more depth planes. The display includes one or more waveguides configured to project the light to the viewer, and the one or more waveguides are further configured to transmit light from objects in the surrounding environment to the viewer. The display also includes a first variable focus lens element between the one or more waveguides and a first eye of the viewer, and a second variable focus lens element between the one or more waveguides and the surrounding environment. An eye tracking system is configured to determine the vergence of the viewer’s eyes, and the display system is configured to correct refractive errors by varying the optical power of the first and second variable focus lens elements based on the determined vergence.

“In other embodiments, a method is provided for displaying image information on a head-mountable display. The method comprises providing the display mounted on the head of a viewer, the display being configured to display image information on one or more depth planes and comprising one or more waveguides configured to project light to the viewer to display the image information. The one or more waveguides are further configured to transmit light from objects in the surrounding environment to the viewer. The method also comprises determining a vergence point of the viewer’s eyes and correcting a refractive error of an eye of the viewer. The refractive error may be corrected by varying the optical power of a first variable focus lens element positioned between the one or more waveguides and the eye of the viewer based on the determined vergence point, and by varying the optical power of a second variable focus lens element positioned between the one or more waveguides and the environment surrounding the viewer based on the determined vergence point.

“Example 1: A display system comprising:

“a head-mountable display configured to project light to a viewer to display image information on one or more depth planes, the display comprising:

“one or more waveguides configured to project the light to the viewer, wherein the one or more waveguides are further configured to transmit light from objects in a surrounding environment to the viewer;

“a first variable focus lens element between the one or more waveguides and a first eye of the viewer;

“a second variable focus lens element between the one or more waveguides and the surrounding environment; and

“an eye tracking system configured to determine the vergence of the viewer’s eyes, wherein the display system is configured to vary the optical power of the first and second variable focus lens elements based on the determined vergence.

“Example 2: The display system of Example 1, wherein the display system is configured to vary the optical power of the first and second variable focus lens elements based on the depth plane on which the image information is displayed.

“Example 3: The display system of any of Examples 1-2, wherein the display system is configured to vary the optical power of the second variable focus lens element based on the optical power of the first variable focus lens element.

“Example 4: The display system of any of Examples 1-3, wherein the one or more waveguides are configured to project divergent light to the viewer to display the image information.

“Example 5: The display system of any of Examples 1-4, wherein each of the one or more waveguides has a fixed optical power.

“Example 6: The display system of any of Examples 1-5, further comprising a third variable focus lens element between the one or more waveguides and a second eye of the viewer.

“Example 7: The display system of Example 6, further comprising a fourth variable focus lens element between the one or more waveguides and the surrounding environment.

“Example 8: The display system of any of Examples 6-7, wherein the display system is configured to vary an optical power of the third variable focus lens element to modify the wavefront of the projected light based on the determined vergence.

“Example 9: The display system of any of Examples 6-8, wherein the display system is configured to vary an optical power of the fourth variable focus lens element to modify the wavefront of incoming light from objects in the surrounding environment based on the determined vergence.

“Example 10: The display system of any of Examples 1-9, wherein the eye tracking system comprises one or more cameras.

“Example 11: The display system of any of Examples 1-10, wherein the optical power of the first and/or second variable focus lens elements is varied according to a prescription for correcting the viewer’s vision at two or more distances.

“Example 12: The display system of any of Examples 1-11, wherein each of the first and second variable focus lens elements has three or more preset prescription optical powers.

“Example 13: The display system of any of Examples 1-12, wherein the number of available prescription optical powers is equal to or less than the total number of depth planes of the display.

“Example 14: The display system of any of Examples 1-13, wherein the first and/or second variable focus lens elements comprise a layer of liquid crystal sandwiched between two substrates.

“Example 15: The display system of Example 14, wherein the first and/or second variable focus lens elements comprise electrodes for altering the refractive index of the liquid crystal layer upon application of a voltage.

“Example 16: The display system of any of Examples 14-15, wherein the substrates comprise glass.

“Example 17: The display system of any of Examples 1-16, further comprising an electronic hardware control system configured to vary the refractive index of the first and/or second variable focus lens elements by application of an electrical current or voltage.

“Example 18: The display system of Example 17, wherein the eye tracking system forms a feedback loop with the electronic hardware control system such that the refractive index of the first and/or second variable focus lens elements is varied based on the determined vergence.

“Example 19: A method for displaying image information on a head-mountable display, the method comprising:

“providing the display mounted on the head of a viewer, the display being configured to display image information on one or more depth planes and comprising:

“one or more waveguides configured to project light to the viewer to display the image information,

“wherein the one or more waveguides are further configured to transmit light from objects in a surrounding environment to the viewer;

“determining a vergence point of the eyes of the viewer; and

“correcting a refractive error of an eye of the viewer by:

“varying an optical power of a first variable focus lens element positioned between the one or more waveguides and the eye of the viewer based on the determined vergence point; and

“varying an optical power of a second variable focus lens element positioned between the one or more waveguides and the environment surrounding the viewer based on the determined vergence point.”

“Example 20: The method of Example 19, further comprising:

“providing a third variable focus lens element and a fourth variable focus lens element, wherein the third variable focus lens element is between the one or more waveguides and the other eye of the viewer, and the fourth variable focus lens element is between the one or more waveguides and the surrounding environment, directly forward of the third variable focus lens element; and

“correcting a refractive error of the other eye by varying an optical power of the third and fourth variable focus lens elements based on the determined vergence.

“Example 21: The method of Example 20, wherein determining the vergence point comprises using one or more cameras to track the vergence of the eye and the other eye.

“Example 22: The method of any of Examples 19-21, wherein the optical powers of the first and second variable focus lens elements are varied simultaneously.

“Example 23: The method of any of Examples 19-22, wherein the one or more waveguides comprise diffractive optical elements configured to output divergent light from the waveguides.”

“Augmented reality (AR) systems may present virtual content to a viewer while still allowing the viewer to see the surrounding environment. Preferably, this content is displayed on a head-mountable display, e.g., as part of eyewear, that projects image information to the viewer’s eyes while also transmitting light from the surrounding environment to those eyes.

It will be appreciated that many viewers have refractive errors that prevent light from properly focusing on the retinas of their eyes; examples of refractive errors include myopia, hyperopia, and presbyopia. These viewers may require lens elements with a particular prescription optical power to clearly see the projected image information. In some embodiments, such lens elements may be placed between a waveguide and the viewer’s eye. Undesirably, these lens elements, and possibly other optically transmissive parts of the display such as the waveguides, may cause aberrations in the viewer’s view of the surrounding environment. In addition, many lens elements have a fixed optical power that may not be able to correct all of a viewer’s refractive errors.

“In certain embodiments, the display system includes first and second variable focus lens elements that sandwich (are disposed on either side of) a plurality of waveguides. The first lens element may be between the one or more waveguides and the eye of the viewer, and may correct refractive errors in the focusing of light projected from the waveguides to that eye. In some embodiments, the first lens element may also be configured to provide appropriate optical power to place displayed virtual content on a desired depth plane. The second lens element may be between the surrounding environment and the one or more waveguides, and may provide optical power to compensate for aberrations in the transmission of light from the environment through the waveguides and the first lens element. In some embodiments, refractive errors of the viewer’s other eye may be separately corrected; for example, third and fourth variable focus lens elements between that other eye and the waveguides may be used to correct its refractive errors. The focal length/optical power of the variable focus elements may be varied such that both the real world and virtual content are focused onto the retina of the user, allowing the user to view both real and virtual objects with high optical quality.

“In some embodiments, the display system includes an eye tracking system configured to determine the vergence of the viewer’s eyes. One or more cameras may determine the vergence point of the eyes, which in turn may be used to determine the distance at which the eyes are focused and, thereby, the appropriate correction to apply. Different corrections may be required for different vergence points; for example, the correction may need to change depending on whether the viewer is looking at near, intermediate, or far objects. In some embodiments, the variable focus lens elements may provide finer gradations of correction than are available with fixed corrective lenses; for example, there may be two, three, four, five, or more unique corrections for each eye.
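As a rough illustration of how a determined vergence might be turned into a viewing distance (and hence into diopters), the following sketch triangulates the fixation distance from the inward rotation of each eye. The function names, the symmetric-fixation assumption, and all numerical values are illustrative rather than taken from the patent.

```python
import math

def vergence_distance_m(ipd_m: float, left_angle_rad: float, right_angle_rad: float) -> float:
    """Estimate the fixation distance from the inward rotation of each eye.

    ipd_m: interpupillary distance in meters (e.g., ~0.063 m).
    left_angle_rad / right_angle_rad: inward rotation of each eye from
    straight ahead, as might be reported by eye-tracking cameras.
    """
    # Average the two inward rotations; a symmetric fixation point is assumed.
    theta = 0.5 * (left_angle_rad + right_angle_rad)
    if theta <= 0.0:
        return float("inf")  # Eyes parallel: fixation effectively at optical infinity.
    # Simple triangulation: half the IPD over the tangent of the vergence half-angle.
    return (ipd_m / 2.0) / math.tan(theta)

def distance_to_diopters(distance_m: float) -> float:
    """Convert a fixation distance to diopters (inverse meters)."""
    return 0.0 if math.isinf(distance_m) else 1.0 / distance_m

# Example: each eye rotated inward by ~1.8 degrees with a 63 mm IPD.
d = vergence_distance_m(0.063, math.radians(1.8), math.radians(1.8))
print(round(d, 2), "m ->", round(distance_to_diopters(d), 2), "dpt")
```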

The variable focus lens elements may be used to correct the vision of the user, making it unnecessary for the user to wear separate fixed prescription lenses. As discussed herein, the augmented reality display system may be configured to provide different optical powers for virtual objects projected so as to appear at different depths or distances from the user. For users who require near vision correction, the variable focus lens elements may be configured to provide near vision optical power when the user is viewing virtual objects and real-world objects at close range. For users who require intermediate distance vision correction, the variable focus lens elements may be configured to provide intermediate distance optical power when the user is viewing virtual and real-world objects at distances corresponding to the intermediate vision zone. Similarly, for users who require far vision correction, the variable focus lens elements may be configured to provide far vision optical power when the user is viewing virtual or real-world objects at distances corresponding to the far vision zone. In some embodiments, the display system may access the user’s prescriptions for near, intermediate, and far vision correction, and may adjust the optical power of the variable focus lens elements to match the appropriate prescription while the user is viewing real-world or virtual objects at distances corresponding to the near, intermediate, or far vision zones.
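The zone selection described above can be summarized in a few lines. The zone boundaries below (roughly 40 cm and 1 m) and the stored correction values are illustrative assumptions, not figures from the patent:

```python
def select_prescription_power(fixation_diopters: float,
                              near_dpt: float,
                              intermediate_dpt: float,
                              far_dpt: float) -> float:
    """Pick which of the user's stored corrections to apply, based on where
    the eyes are verged.  Zone boundaries here are illustrative only."""
    if fixation_diopters >= 2.5:        # closer than ~40 cm -> near vision zone
        return near_dpt
    elif fixation_diopters >= 1.0:      # ~40 cm to ~1 m -> intermediate vision zone
        return intermediate_dpt
    else:                               # beyond ~1 m -> far vision zone
        return far_dpt

# Example with hypothetical stored corrections for one eye.
print(select_prescription_power(3.0, near_dpt=2.0, intermediate_dpt=1.0, far_dpt=0.0))
```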

“Advantageously, the first and/or second variable focus lens elements allow the same head-mountable display to be used by multiple users without the need to physically exchange corrective lens elements; the display adapts to the user. In addition, the variable focus lens elements may be configured to provide appropriate optical power to place image information projected from the one or more waveguides on a desired depth plane; that is, the variable focus lens elements may be used to adjust the divergence of light propagating from the one or more waveguides to the viewer. This flexibility may simplify the design and manufacture of the display, since the same display may serve different users and fewer optical structures may be needed to display image information on different depth planes. Moreover, the ability to provide a wide range of corrections in real time may allow corrections that are not practical with conventional corrective glasses, which may improve the clarity and/or acuity of the viewer’s view of the world and of the displayed image information, and may also increase long-term viewer comfort. In some embodiments, the variable focus lens elements may be set with different prescriptions, allowing the display to adapt as the user’s prescription changes, e.g., by changing the preset corrections in the display system.

“Reference will now be made to the figures, in which like reference numerals refer to like parts throughout.”

“FIG. 2 illustrates an example of a wearable display system 60. The display system 60 includes a display 70, and various mechanical and electronic modules and systems to support the functioning of that display 70. The display 70 may be coupled to a frame 80, which is wearable by a display system user or viewer 90 and which is configured to position the display 70 in front of the eyes of the user 90. The display 70 may be considered eyewear in some embodiments. In some embodiments, a speaker 100 is coupled to the frame 80 and is configured to be positioned adjacent the ear canal of the user 90 to provide stereo/shapeable sound control. The display system may also include one or more microphones 110 or other devices to detect sound. In some embodiments, the microphone is configured to allow the user to provide inputs or commands to the system 60 (e.g., the selection of voice commands, natural language queries, etc.), and/or may allow audio communication with other persons (e.g., with other users of similar display systems). The microphone may further be configured as a peripheral sensor to collect audio data (e.g., sounds from the user and/or environment). In some embodiments, the display system may also include a peripheral sensor 120a, which may be separate from the frame 80 and attached to the body of the user 90 (e.g., on the head, torso, or an extremity). The peripheral sensor 120a may be configured to acquire data characterizing a physiological state of the user 90 in some embodiments. For example, the sensor 120a may be an electrode.

“With continued reference to FIG. 2, the display 70 is operatively coupled by a communications link 130, such as a wired lead or wireless connectivity, to a local processing and data module 140, which may be mounted in a variety of configurations, such as fixedly attached to the frame 80, fixedly attached to a helmet or hat worn by the user, embedded in headphones, or otherwise removably attached to the user 90 (e.g., in a backpack-style or belt-coupling style configuration). Similarly, the sensor 120a may be operatively coupled to the local processing and data module 140 by a communications link 120b, e.g., a wired lead or wireless connectivity. The local processing and data module 140 may comprise a hardware processor, as well as digital memory such as flash memory or hard disk drives, both of which may be utilized to assist in the processing, caching, and storage of data. The data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), and/or data b) acquired and/or processed using the remote processing module 150 and/or remote data repository 160 (including data relating to virtual content), possibly for passage to the display 70 after such processing or retrieval. The local processing and data module 140 may be operatively coupled by communication links 170, 180 to the remote processing module 150 and remote data repository 160 such that these remote modules 150, 160 are operatively coupled to each other and available as resources to the local processing and data module 140. In some embodiments, the local processing and data module 140 may include one or more image capture devices, microphones, and inertial measurement units. In other embodiments, one or more of these sensors may be attached to the frame 80, or may be standalone structures that communicate with the local processing and data module 140 by wired or wireless communication pathways.

“With continued reference to FIG. 2, in some embodiments the remote processing module 150 may comprise one or more processors configured to analyze and process data and/or image information. In some embodiments, the remote data repository 160 may comprise a digital data storage facility, which may be available through the internet or another networking configuration in a cloud resource configuration. In some embodiments, the remote data repository 160 may include one or more remote servers that provide information, e.g., information for generating augmented reality content, to the local processing and data module 140 and/or the remote processing module 150. In some embodiments, all data is stored and all computations are performed in the local processing and data module, allowing fully autonomous use from the remote modules.

“With reference to FIG. 3, the perception of an image as being “three-dimensional” or “in 3-D” may be achieved by providing slightly different presentations of the image to each eye of the viewer. FIG. 3 illustrates a conventional display system for simulating three-dimensional imagery for a user. Two distinct images 190, 200, one for each eye 210, 220, are output to the user. The images 190, 200 are spaced from the eyes 210, 220 by a distance 230 along an optical or z-axis that is parallel to the line of sight of the viewer. The images 190, 200 are flat, and the eyes 210, 220 may focus on the images by assuming a single accommodated state. Such 3-D display systems rely on the human visual system to combine the images 190, 200 to provide a perception of depth and/or scale for the combined image.

“It will be appreciated, however, that the human visual system is more complicated, and providing a realistic perception of depth is more challenging. For example, many viewers of conventional “3-D” display systems find such systems to be uncomfortable or may not perceive a sense of depth at all. Without being limited by theory, it is believed that viewers of an object may perceive the object as being “three-dimensional” due to a combination of vergence and accommodation. Vergence movements (i.e., rotation of the eyes so that the pupils move toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) are closely associated with focusing (or “accommodation”) of the lenses and pupils of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the “accommodation-vergence reflex,” as well as pupil dilation or constriction. Likewise, under normal conditions, a change in vergence will trigger a matching change in accommodation of lens shape and pupil size. As noted herein, many stereoscopic or “3-D” display systems display a scene using slightly different presentations (and, so, slightly different images) to each eye such that a three-dimensional perspective is perceived by the human visual system. Such systems are uncomfortable for many viewers, however, since they, among other things, simply provide different presentations of a scene while the eyes view all of the image information at a single accommodated state, and thus work against the “accommodation-vergence reflex.” Display systems that provide a better match between accommodation and vergence may form more comfortable and realistic simulations of three-dimensional imagery.

“FIG. 4 illustrates aspects of an approach for simulating three-dimensional imagery using multiple depth planes. With reference to FIG. 4, objects at various distances from the eyes 210, 220 on the z-axis are accommodated by the eyes 210, 220 so that those objects are in focus. The eyes 210, 220 assume particular accommodated states to bring objects at different distances along the z-axis into focus. Consequently, a particular accommodated state may be said to be associated with a particular one of the depth planes 240, with an associated focal distance, such that objects or parts of objects in a particular depth plane are in focus when the eye is in the accommodated state for that depth plane. In some embodiments, three-dimensional imagery may be simulated by providing different presentations of an image for each of the eyes 210, 220, and also by providing different presentations of the image corresponding to each of the depth planes. While the fields of view are shown as separate for clarity of illustration, they may overlap, for example, as distance along the z-axis increases. In addition, while the contours of the depth planes are shown as flat for ease of illustration, it will be appreciated that they may be curved in physical space, such that all features in a depth plane are in focus with the eye in a particular accommodated state.

“The distance between an object and an eye 210 or 220 may also change the amount of divergence of light from that object, as viewed by that eye. FIGS. 5A-5C illustrate relationships between distance and the divergence of light rays. The distance between the object and the eye 210 is represented by, in order of decreasing distance, R1, R2, and R3. As shown in FIGS. 5A-5C, the light rays become more divergent as the distance to the object decreases; as the distance increases, the light rays become more collimated. Stated another way, the light field produced by a point (the object or a part of the object) has a spherical wavefront curvature that is a function of how far away the point is from the eye of the user, and the curvature increases with decreasing distance between the object (or part of the object) and the eye 210. Consequently, at different depth planes the degree of divergence of light rays is also different, with the degree of divergence increasing as the distance between the depth plane and the viewer’s eye 210 decreases. While only a single eye 210 is illustrated in FIGS. 5A-5C and other figures herein for clarity of illustration, it will be appreciated that the discussion regarding the eye 210 may be applied to both eyes 210 and 220 of a viewer.
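The relationship described above is simply that the wavefront curvature, expressed in diopters, is the reciprocal of the distance to the point source. A short sketch (the distances are chosen arbitrarily for illustration):

```python
def wavefront_curvature_dpt(object_distance_m: float) -> float:
    """Spherical wavefront curvature at the eye, in diopters (1/m).

    A nearer point source produces a more strongly curved (more divergent)
    wavefront; as the distance grows, the curvature approaches zero and the
    rays approach collimation.
    """
    return 1.0 / object_distance_m

# Distances R1 > R2 > R3 give increasing curvature/divergence.
for r in (4.0, 1.0, 0.25):   # illustrative distances in meters
    print(f"{r:>5} m -> {wavefront_curvature_dpt(r):.2f} dpt")
```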

“Without being limited by theory, it is believed that the human eye typically may interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of these limited numbers of depth planes. The different presentations may be separately focused by the viewer’s eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features on different depth planes, and/or based on observing different image features on different depth planes being out of focus.

“FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user. The display system 250 includes a stack of waveguides, or stacked waveguide assembly, 260 that may be utilized to provide three-dimensional perception to the eye/brain using a plurality of waveguides 270, 280, 290, 300, 310. In some embodiments, the display system 250 is the system 60 of FIG. 2, with FIG. 6 schematically showing some parts of that system 60 in greater detail. For example, the waveguide assembly 260 may be part of the display 70 of FIG. 2. It will be appreciated that the display system 250 may be considered a light field display in some embodiments. In addition, the waveguide assembly 260 may also be referred to as an eyepiece.

“With continued reference to FIG. 6, the waveguide assembly 260 may also include a plurality of features 320, 330, 340, 350 between the waveguides. In some embodiments, the features 320, 330, 340, 350 may be one or more lenses. The waveguides 270, 280, 290, 300, 310 and/or the plurality of lenses 320, 330, 340, 350 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image injection devices 360, 370, 380, 390, 400 may function as a source of light for the waveguides and may be utilized to inject image information into the waveguides 270, 280, 290, 300, 310, each of which may be configured to distribute incoming light across each respective waveguide, for output toward the eye 210. Light exits an output surface 410, 420, 430, 440, 450 of the image injection devices 360, 370, 380, 390, 400 and is injected into a corresponding input surface 460, 470, 480, 490, 500 of the waveguides 270, 280, 290, 300, 310. In some embodiments, each of the input surfaces 460, 470, 480, 490, 500 may be an edge of a corresponding waveguide, or may be part of a major surface of the corresponding waveguide (that is, one of the waveguide surfaces directly facing the world 510 or the viewer’s eye 210). In some embodiments, a single beam of light (e.g., a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 210 at particular angles (and amounts of divergence) corresponding to the depth plane associated with a particular waveguide. In some embodiments, a single one of the image injection devices 360, 370, 380, 390, 400 may be associated with, and inject light into, a plurality (e.g., three) of the waveguides 270, 280, 290, 300, 310.

“In some embodiments, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other embodiments, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display, which may pipe image information to each of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors (e.g., different component colors).

“In some embodiments, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projector system 520, which comprises a light module 540, which may include a light emitter such as a light emitting diode (LED). The light from the light module 540 may be directed to, and modified by, a light modulator 530, e.g., a spatial light modulator, via a beam splitter 550. The light modulator 530 may be configured to change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310. Examples of spatial light modulators include liquid crystal displays (LCD), including liquid crystal on silicon (LCOS) displays. It will be appreciated that the image injection devices 360, 370, 380, 390, 400 are illustrated schematically and, in some embodiments, may represent different light paths and locations in a common projection system configured to output light into associated ones of the waveguides 270, 280, 290, 300, 310.

“In some embodiments, the display system 250 may be a scanning fiber display comprising one or more scanning fibers configured to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more of the waveguides 270, 280, 290, 300, 310 and ultimately to the eye 210 of the viewer. In some embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which is configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more optical fibers may be configured to transmit light from the light module 540 to the one or more waveguides 270, 280, 290, 300, 310. It will also be appreciated that one or more intervening optical structures may be provided between the scanning fiber or fibers and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.

“A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 540, and the light modulator 530. In some embodiments, the controller 560 is part of the local processing and data module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to, e.g., any of the schemes disclosed herein. In some embodiments, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150 (FIG. 2) in some embodiments.

“With continued reference to FIG. 6, the waveguides 270, 280, 290, 300, 310 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 270, 280, 290, 300, 310 may each be planar or curved, with major top and bottom surfaces and edges extending between those major surfaces. In the illustrated configuration, the waveguides 270, 280, 290, 300, 310 may each include out-coupling optical elements 570, 580, 590, 600, 610 configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 210. Extracted light may also be referred to as out-coupled light, and the out-coupling optical elements may also be referred to as light extracting optical elements. An extracted beam of light may be output by the waveguide at locations at which the light propagating in the waveguide strikes a light extracting optical element. The out-coupling optical elements 570, 580, 590, 600, 610 may, for example, be gratings including diffractive optical features. While illustrated as disposed at the bottom major surfaces of the waveguides for ease of description and drawing clarity, in some embodiments the out-coupling optical elements 570, 580, 590, 600, 610 may be disposed at the top and/or bottom major surfaces. In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 270, 280, 290, 300, 310. In some other embodiments, the waveguides 270, 280, 290, 300, 310 may be a monolithic piece of material, and the out-coupling optical elements 570, 580, 590, 600, 610 may be formed on a surface and/or in the interior of that piece of material.

“With continued reference to FIG. 6, as discussed herein, each waveguide 270, 280, 290, 300, 310 is configured to output light to form an image corresponding to a particular depth plane. For example, the waveguide 270 nearest the eye may be configured to deliver collimated light (which was injected into such waveguide 270) to the eye 210. The collimated light may be representative of the optical infinity focal plane. The next waveguide up 280 may be configured to send out collimated light that passes through the first lens 350 (e.g., a negative lens) before it reaches the eye 210; such first lens 350 may be configured to create a slight convex wavefront curvature so that the eye/brain interprets light coming from that next waveguide up 280 as coming from a first focal plane closer inward toward the eye 210 from optical infinity. Similarly, the third up waveguide 290 passes its output light through both the first 350 and second 340 lenses before reaching the eye 210; in this way, the eye/brain may interpret light coming from the third waveguide 290 as coming from a second focal plane that is even closer inward toward the person from optical infinity than was light from the next waveguide up 280.

The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510, a compensating lens layer 620 may be disposed at the top of the stack to offset the aggregate power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static. In alternative embodiments, either or both may be dynamic, using electro-active features.
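One way to picture the role of the compensating lens layer 620 is as supplying the negative of the summed power of the lens stack below it, so that light from the world sees approximately zero net power. This is a minimal sketch with made-up lens powers; the values are not taken from the patent:

```python
def compensating_lens_power(lens_stack_dpt: list[float]) -> float:
    """Optical power, in diopters, that a world-side compensating layer
    (analogous to layer 620) would need so that world light sees roughly
    zero net power through the lens stack.

    lens_stack_dpt: powers of the lenses between the compensating layer and
    the eye (illustrative values only).
    """
    return -sum(lens_stack_dpt)

stack = [-0.5, -0.5, -1.0, -1.0]       # e.g., four negative lenses in the stack
print(compensating_lens_power(stack))  # -> 3.0 dpt to cancel this particular stack
```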

“In some embodiments, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides may be configured to output images set to the same depth plane, or multiple subsets of the waveguides may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This can provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.

“With continued reference to FIG. 6, the out-coupling optical elements 570, 580, 590, 600, 610 may be configured both to redirect light out of their respective waveguides and to output this light with the appropriate amount of divergence or collimation for the particular depth plane associated with the waveguide. As a result, waveguides having different associated depth planes may have different configurations of out-coupling optical elements 570, 580, 590, 600, 610, which output light with a different amount of divergence depending on the associated depth plane. In some embodiments, the light extracting optical elements 570, 580, 590, 600, 610 may be volumetric or surface features, which may be configured to output light at specific angles. For example, the light extracting optical elements 570, 580, 590, 600, 610 may be volume holograms, surface holograms, and/or diffraction gratings. In some embodiments, the features 320, 330, 340, 350 may not be lenses; rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).

“In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”). Preferably, the DOEs have a sufficiently low diffraction efficiency so that only a portion of the light of the beam is deflected toward the eye 210 with each intersection of the DOE, while the rest continues to move through the waveguide via TIR. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations, and the result is a fairly uniform pattern of exit emission toward the eye 210 for this particular collimated beam bouncing around within the waveguide.
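The "fairly uniform pattern of exit emission" can be pictured with a toy calculation: if the DOE deflects the same small fraction of the remaining beam at each intersection, the out-coupled portions form a slowly decaying geometric series. The efficiency value and bounce count below are arbitrary illustrations, not figures from the patent:

```python
def exit_beam_fractions(diffraction_efficiency: float, num_bounces: int) -> list[float]:
    """Fraction of the original in-coupled beam that exits at each DOE
    intersection, assuming the same small efficiency at every bounce.

    A low efficiency makes the out-coupled fractions decay slowly, which is
    what yields a relatively uniform exit pattern along the waveguide.
    """
    fractions = []
    remaining = 1.0
    for _ in range(num_bounces):
        out = remaining * diffraction_efficiency  # small portion deflected toward the eye
        fractions.append(out)
        remaining -= out                          # the rest continues by TIR
    return fractions

print([round(f, 3) for f in exit_beam_fractions(0.05, 6)])
```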

“In some embodiments, one or more DOEs may be switchable between “on” states in which they actively diffract and “off” states in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer-dispersed liquid crystal, in which microdroplets form a diffraction pattern in a host medium, and the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
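As a loose illustration of that switchable behavior (not a model of any particular device, and without committing to which drive state matches the indices), a toy sketch might simply track whether the droplet and host indices currently match; all index values and the direction of switching below are assumptions:

```python
class SwitchableDOE:
    """Toy model of a polymer-dispersed liquid crystal DOE: it diffracts only
    when the droplet index differs appreciably from the host index."""

    def __init__(self, host_index: float = 1.52,
                 droplet_index_matched: float = 1.52,
                 droplet_index_mismatched: float = 1.70):
        self.host_index = host_index
        self.droplet_index_matched = droplet_index_matched      # matched -> no diffraction
        self.droplet_index_mismatched = droplet_index_mismatched  # mismatched -> diffracts
        self.drive_signal_on = False  # which physical state this corresponds to is device-dependent

    def set_drive_signal(self, on: bool) -> None:
        self.drive_signal_on = on

    def diffracts(self, tolerance: float = 1e-3) -> bool:
        droplet = (self.droplet_index_mismatched if self.drive_signal_on
                   else self.droplet_index_matched)
        return abs(droplet - self.host_index) > tolerance

doe = SwitchableDOE()
print(doe.diffracts())       # False: indices matched, light passes undiffracted
doe.set_drive_signal(True)
print(doe.diffracts())       # True: index mismatch, the pattern diffracts light
```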

“In some embodiments, a camera assembly 630 (e.g., a digital camera, including visible light and/or infrared light cameras) may be provided to capture images of the eye 210 and/or tissue around the eye 210 to, e.g., detect user inputs and/or monitor the physiological state of the user. As used herein, a camera may be any image capture device. In some embodiments, the camera assembly 630 may include an image capture device and a light source that projects light (e.g., infrared light) to the eye, which light may then be reflected by the eye and detected by the image capture device. In some embodiments, the camera assembly 630 may be attached to the frame 80 (FIG. 2) and may be in electrical communication with the processing modules 140 and/or 150, which may process image information from the camera assembly. In some embodiments, one camera assembly 630 may be utilized for each eye, to separately monitor each eye.

“With reference now to FIG. 7, an example of exit beams output by a waveguide is shown. One waveguide is illustrated, but it will be appreciated that other waveguides in the waveguide assembly 260 (FIG. 6) may function similarly, where the waveguide assembly 260 includes multiple waveguides. Light 640 is injected into the waveguide 270 at the input surface 460 of the waveguide 270 and propagates within the waveguide 270 by TIR. At points where the light 640 impinges on the DOE 570, a portion of the light exits the waveguide as exit beams 650. The exit beams 650 are illustrated as substantially parallel but, as discussed herein, they may also be redirected to propagate to the eye 210 at an angle (e.g., forming divergent exit beams), depending on the depth plane associated with the waveguide 270. It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with out-coupling optical elements that out-couple light to form images that appear set on a depth plane at a large distance (e.g., optical infinity) from the eye 210. Other waveguides or other sets of out-coupling optical elements may output an exit beam pattern that is more divergent, which would require the eye 210 to accommodate to a closer distance to bring it into focus on the retina and would be interpreted by the brain as light from a distance closer to the eye 210 than optical infinity.

“In some embodiments, a full-color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors. FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors. The illustrated embodiment shows depth planes 240a-240f, although more or fewer depths are also contemplated. Each depth plane may have three or more component color images associated with it, including: a first image of a first color, G; a second image of a second color, R; and a third image of a third color, B. Different depth planes are indicated in the figure by different numbers for diopters (dpt) following the letters G, R, and B; these numbers indicate diopters (1/m), or the inverse of the distance of the depth plane from the viewer, and each box in the figure represents an individual component color image. In some embodiments, to account for differences in the eye’s focusing of light of different wavelengths, the exact placement of the depth planes for different component colors may be varied. For example, different component color images for a given depth plane may be placed on depth planes corresponding to different distances from the user. Such an arrangement may increase visual acuity and user comfort and/or may decrease chromatic aberrations.

“In some embodiments, light of each component color may be output by a single dedicated waveguide and, consequently, each depth plane may have multiple waveguides associated with it. In such embodiments, each box in the figure including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in the drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other embodiments, multiple component colors may be output by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.

“With continued reference to FIG. 8, in some embodiments, G is the color green, R is the color red, and B is the color blue. In some other embodiments, other colors associated with other wavelengths of light, including magenta and cyan, may be used in addition to or in place of one or more of red, green, or blue.

It will be appreciated that references to a given color of light throughout this disclosure will be understood to encompass light of one or more wavelengths within a range of wavelengths that are perceived by a viewer as being of that color. For example, red light may include light of one or more wavelengths in the range of about 620-780 nm, green light may include light of one or more wavelengths in the range of about 492-577 nm, and blue light may include light of one or more wavelengths in the range of about 435-493 nm.
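Using only the wavelength bands quoted in the preceding paragraph, a small helper could map a wavelength to the color label(s) whose range contains it; the slight overlap between the blue and green bands means some wavelengths match more than one label:

```python
# Approximate wavelength bands given in the text (nanometers).
COLOR_BANDS_NM = {
    "blue": (435, 493),
    "green": (492, 577),
    "red": (620, 780),
}

def perceived_color(wavelength_nm: float) -> list[str]:
    """Return the color label(s) whose band contains the given wavelength."""
    return [name for name, (lo, hi) in COLOR_BANDS_NM.items() if lo <= wavelength_nm <= hi]

print(perceived_color(460))   # ['blue']
print(perceived_color(532))   # ['green']
print(perceived_color(650))   # ['red']
```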

“In some embodiments, the light source 540 (FIG. 6) may be configured to emit light of one or more wavelengths outside the visual perception range of the viewer, for example, infrared and/or ultraviolet wavelengths. In addition, the in-coupling, out-coupling, and other light redirecting structures of the waveguides of the display 250 may be configured to direct and emit this light out of the display toward the user’s eye 210, e.g., for imaging and/or user stimulation applications.

“With reference now to FIG. 9A, in some embodiments, light impinging on a waveguide may need to be redirected to in-couple that light into the waveguide. An in-coupling optical element may be used to redirect and in-couple the light into its corresponding waveguide. FIG. 9A illustrates a cross-sectional side view of an example of a plurality or set 660 of stacked waveguides that each include an in-coupling optical element. The waveguides may each be configured to output light of one or more different wavelengths, or one or more different ranges of wavelengths. It will be appreciated that the stack 660 may correspond to the stack 260 (FIG. 6), and the illustrated waveguides of the stack 660 may correspond to part of the plurality of waveguides 270, 280, 290, 300, 310, except that light from one or more of the image injection devices 360, 370, 380, 390, 400 is injected into the waveguides from a position that requires the light to be redirected for in-coupling.

The illustrated set 660 of stacked waveguides includes waveguides 670, 680, and 690. Each waveguide includes an associated in-coupling optical element (which may also be referred to as a light input area on the waveguide), with, e.g., in-coupling optical element 700 disposed on a major surface of waveguide 670, in-coupling optical element 710 disposed on a major surface of waveguide 680, and in-coupling optical element 720 disposed on a major surface of waveguide 690. In some embodiments, one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690, particularly where the in-coupling optical elements are reflective, deflecting optical elements. As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some embodiments, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690. In some embodiments, as discussed herein, the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light while transmitting other wavelengths of light. While illustrated on one side or corner of their respective waveguide 670, 680, 690, it will be appreciated that the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some embodiments.

“As illustrated, the in-coupling optical elements 700, 710, 720 may be laterally offset from one another. In some embodiments, each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element. For example, each in-coupling optical element 700, 710, 720 may be configured to receive light from a different one of the image injection devices 360, 370, 380, 390, and 400, as shown in FIG. 6, and may be separated (e.g., laterally spaced apart) from the other in-coupling optical elements 700, 710, 720 such that it substantially does not receive light from those other in-coupling optical elements.

Each waveguide also includes associated light distributing elements, with, e.g., light distributing elements 730 disposed on a major surface (e.g., a top major surface) of waveguide 670, light distributing elements 740 disposed on a major surface (e.g., a top major surface) of waveguide 680, and light distributing elements 750 disposed on a major surface (e.g., a top major surface) of waveguide 690. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on a bottom major surface of the associated waveguides 670, 680, 690, respectively. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on both top and bottom major surfaces of the associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750 may be disposed on different ones of the top and bottom major surfaces of different associated waveguides 670, 680, 690, respectively.

“The waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material. For example, as illustrated, layer 760a may separate waveguides 670 and 680, and layer 760b may separate waveguides 680 and 690. In some embodiments, the layers 760a and 760b are formed of low refractive index materials, that is, materials having a lower refractive index than the material forming the immediately adjacent waveguides 670, 680, 690. Preferably, the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690. Advantageously, the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide). In some embodiments, the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
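To see why a lower-index cladding layer supports TIR, the following sketch computes the critical angle at a core/cladding interface; the refractive indices used are illustrative assumptions and are not values given in the patent:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding interface.
    Light hitting the interface at an angle (from the surface normal) larger than
    this stays confined within the waveguide."""
    if n_clad >= n_core:
        raise ValueError("TIR requires the cladding index to be below the core index")
    return math.degrees(math.asin(n_clad / n_core))

# Illustrative indices only: a glass-like waveguide against an air gap, and
# against a low-index solid layer that is 0.10 below the core index.
print(round(critical_angle_deg(1.52, 1.00), 1))   # air cladding
print(round(critical_angle_deg(1.52, 1.42), 1))   # low-index solid cladding
```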

“Preferably, for ease of manufacturing and other considerations, the material forming the waveguides 670, 680, 690 is similar or the same, and the material forming the layers 760a, 760b is similar or the same. In some embodiments, the material forming the waveguides 670, 680, 690 may be different between one or more of the waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the refractive index relationships noted above.

“With continued reference to FIG. 9A, light rays 770, 780, 790 are incident on the set 660 of waveguides. It will be appreciated that the light rays 770, 780, 790 may be injected into the waveguides 670, 680, 690 by one or more image injection devices 360, 370, 380, 390, 400 (FIG. 6).

“In some embodiments, the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors. The in-coupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR. In some embodiments, the in-coupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and its associated in-coupling optical element.

For example, in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different second and third wavelengths or ranges of wavelengths, respectively. The transmitted ray 780 impinges on and is deflected by the in-coupling optical element 710, which is configured to selectively deflect light of the second wavelength or range of wavelengths. The ray 790 is deflected by the in-coupling optical element 720, which is configured to selectively deflect light of the third wavelength or range of wavelengths.

“With continued reference to FIG. 9A, the deflected light rays 770, 780, 790 are deflected so that they propagate through a corresponding waveguide 670, 680, 690; that is, the in-coupling optical elements 700, 710, 720 of each waveguide deflect light into that corresponding waveguide to in-couple the light into it. The light rays 770, 780, 790 are deflected at angles that cause the light to propagate through the respective waveguide 670, 680, 690 by TIR, and they continue to propagate by TIR until impinging on the waveguide’s corresponding light distributing elements 730, 740, 750.

“With reference now to FIG. 9B, a perspective view of an example of the plurality of stacked waveguides of FIG. 9A is illustrated. As noted above, the in-coupled light rays 770, 780, 790 are deflected by the in-coupling optical elements 700, 710, 720, respectively, and then propagate by TIR within the waveguides 670, 680, 690, respectively. The light rays 770, 780, 790 then impinge on the light distributing elements 730, 740, 750, respectively. The light distributing elements 730, 740, 750 deflect the light rays 770, 780, 790 so that they propagate toward the out-coupling optical elements 800, 810, 820, respectively.

“In some embodiments, the light distributing elements 730, 740, 750 are orthogonal pupil expanders (OPEs). In some embodiments, the OPEs both deflect or distribute light to the out-coupling optical elements 800, 810, 820 and also increase the beam or spot size of this light as it propagates to the out-coupling optical elements. In some embodiments, e.g., where the beam size is already of a desired size, the light distributing elements 730, 740, 750 may be omitted and the in-coupling optical elements 700, 710, 720 may be configured to deflect light directly to the out-coupling optical elements 800, 810, 820. For example, with reference to FIG. 9A, the light distributing elements 730, 740, 750 may be replaced with out-coupling optical elements 800, 810, 820, respectively. In some embodiments, the out-coupling optical elements 800, 810, 820 are exit pupils (EPs) or exit pupil expanders (EPEs) that direct light to a viewer’s eye 210 (FIG. 7). It will be appreciated that the OPEs may be configured to increase the dimensions of the eye box in at least one axis, and the EPEs may increase the eye box in an axis crossing, e.g., orthogonal to, the axis of the OPEs. For example, each OPE may be configured to redirect a portion of the light striking the OPE to an EPE of the same waveguide, while allowing the remaining portion of the light to continue to propagate down the waveguide. Upon impinging on the OPE again, another portion of the remaining light is redirected to the EPE, and the remaining portion of that portion continues to propagate further down the waveguide, and so on. Similarly, upon striking the EPE, a portion of the impinging light is directed out of the waveguide toward the user, and a remaining portion of that light continues to propagate through the waveguide until it strikes the EPE again, at which time another portion of the impinging light is directed out of the waveguide, and so on. Consequently, a single beam of in-coupled light may be “replicated” each time a portion of that light is redirected by an OPE or EPE, thereby forming a field of cloned beams of light, as shown in FIG. 6. In some embodiments, the OPE and/or EPE may be configured to modify a size of the beams of light.

“Accordingly, with reference to FIGS. 9A and 9B, in some embodiments, the set 660 of waveguides includes waveguides 670, 680, 690; in-coupling optical elements 700, 710, 720; light distributing elements (e.g., OPEs) 730, 740, 750; and out-coupling optical elements (e.g., EPs) 800, 810, 820 for each component color. The waveguides 670, 680, 690 may be stacked with an air gap/cladding layer between each one. The in-coupling optical elements 700, 710, 720 redirect or deflect incident light (with different in-coupling optical elements receiving light of different wavelengths) into their respective waveguides, and the light then propagates at an angle that results in TIR within the respective waveguide 670, 680, 690. In the example shown, light ray 770 (e.g., blue light) is deflected by the first in-coupling optical element 700, then continues to bounce down the waveguide, interacting with the light distributing elements (e.g., OPEs) 730 and then the out-coupling optical elements (e.g., EPs) 800, in the manner described earlier. Light rays 780 and 790 (e.g., green and red light, respectively) pass through the waveguide 670, with light ray 780 impinging on and being deflected by in-coupling optical element 710. The light ray 780 then bounces down the waveguide 680 via TIR, proceeding on to its light distributing elements (e.g., OPEs) 740 and then the out-coupling optical elements (e.g., EPs) 810. Finally, light ray 790 (e.g., red light) passes through the waveguide 680 to impinge on the light in-coupling optical element 720 of the waveguide 690. The light in-coupling optical element 720 deflects the light ray 790 such that it propagates to the light distributing elements (e.g., OPEs) 750 by TIR, and then to the out-coupling optical elements (e.g., EPs) 820 by TIR. The out-coupling optical elements 820 then out-couple the light ray 790 to the viewer, who also receives the out-coupled light from the other waveguides 670, 680.

“FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B. As discussed herein, the in-coupling optical elements 700, 710, 720 are not vertically aligned; rather, they are preferably non-overlapping (e.g., laterally spaced apart, as seen in the top-down view). This non-overlapping spatial arrangement facilitates the injection of light from different sources into different waveguides on a one-to-one basis, thereby allowing a specific light source to be uniquely coupled to a specific waveguide. In some embodiments, arrangements including non-overlapping, spatially separated in-coupling optical elements may be referred to as a shifted pupil system, and the in-coupling optical elements within these arrangements may correspond to sub-pupils.

“With reference now to FIGS. 10A and 10B, in some embodiments, augmented reality display systems such as those described above may be configured to adjust the wavefront of light (both light carrying image information projected by the augmented reality system and light from objects in the real world surrounding the viewer) by varying the focal lengths of variable focus lens elements. As discussed above, the augmented reality system may include a plurality of stacked waveguides (e.g., corresponding to the plurality 660 of stacked waveguides of FIGS. 9A and 9B, or corresponding to the stacked waveguide assembly 260 of FIG. 6) that project light toward the eye of a user (e.g., the user 90 of FIG. 2). In some other embodiments, only a single waveguide may be provided; while plural waveguides are referred to in the discussion herein, it will be understood that a single waveguide may be substituted for the plurality of waveguides.

“As discussed herein, the light projected by the waveguides may be utilized to provide virtual, augmented reality image information to the viewer. The light may be projected such that the viewer perceives it as coming from a particular depth, or distance, away from the viewer. The display may be optically transmissive, such that the user can see real objects in the surrounding environment through the display. In some embodiments, the waveguides may have fixed optical power and may be configured to output divergent beams of light with different amounts of divergence, to give the impression that the light is coming from different depths.

“It will be appreciated that the fixed optical power of the waveguides assumes that the viewer has an adequate accommodative range to focus the light output by the waveguides. Some viewers, however, may require corrective lenses to see clearly and, as a result, image information projected from a waveguide may not be seen clearly by such viewers. In some embodiments, a first variable focus lens element may be provided between the waveguide and the viewer’s eye to adjust the wavefront of light output by the waveguide so as to correct for the viewer’s refractive error. This first lens element, however, also lies in the path of light propagating from the surrounding environment to the viewer; as a result, the first lens element may modify the wavefront of light from the environment and thereby cause aberrations in the viewer’s view of the world. To correct such aberrations, a second variable focus lens element may be disposed on the opposite side of the plurality of stacked waveguides from the first lens element; that is, the second variable focus lens element may be between the plurality of stacked waveguides and the surrounding real world, to adjust the wavefront of light coming from real-world objects. The second variable focus lens element may be configured to compensate for aberrations caused by the first variable focus lens element. In some embodiments, the second variable focus lens element may also be configured to compensate for aberrations caused by the waveguides.

“In some embodiments, the optical power of the second variable focus lens element may be opposite or inverted relative to that of the first variable focus lens element. For example, if the first variable focus lens element has a positive optical power, the second variable focus lens element may have a negative optical power, which may be of similar magnitude. In some other embodiments, to compensate for both the optical power of the first variable focus lens element and the optical power of the intervening waveguides, the optical power of the second lens element may be opposite in sign and of similar magnitude to the combined optical power of the first lens element and the waveguides.
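A minimal sketch of the compensation rule described above, assuming the goal is roughly zero net power for light arriving from the environment; the function name and the sample values are illustrative only:

```python
def second_lens_power_dpt(first_lens_dpt: float, waveguide_dpt: float = 0.0) -> float:
    """Power, in diopters, for the world-side variable focus lens element so
    that light from the environment sees approximately zero net optical power.

    first_lens_dpt: power currently set on the eye-side lens element.
    waveguide_dpt: aggregate fixed power of the waveguide stack, if any.
    """
    return -(first_lens_dpt + waveguide_dpt)

# Eye-side element at +1.5 dpt with a waveguide stack contributing -0.5 dpt:
print(second_lens_power_dpt(1.5, -0.5))   # -> -1.0 dpt on the world-side element
```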

In some embodiments, the waveguides may not have optical power; for example, the waveguides may be configured to output collimated light. In such embodiments, the first variable focus lens element may be configured to modify the wavefront of light output by the waveguides so as to provide sufficient divergence for the image information to be interpreted as being on a particular depth plane. It will be appreciated that the appropriate amount of divergence may differ for different viewers, since the optical power needed to place image information on a particular depth plane is adjusted by a differential amount to account for each viewer’s particular optical prescription. In such embodiments, the waveguide stack sandwiched between the first and second variable focus lens elements may simply be formed by a single waveguide.
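For this collimated-waveguide case, the divergence that the eye-side element must impart can be thought of as the depth-plane divergence plus a per-viewer differential. The sketch below is illustrative only; the sign conventions and values are assumptions rather than details taken from the patent:

```python
def eye_side_divergence_dpt(depth_plane_dpt: float, prescription_offset_dpt: float) -> float:
    """Wavefront divergence (in diopters) the eye-side variable focus element
    should impart to collimated waveguide output: the divergence associated
    with the target depth plane, shifted by a differential that accounts for
    the viewer's prescription.  Values and sign conventions are illustrative."""
    return depth_plane_dpt + prescription_offset_dpt

# Content intended for a 1.5 dpt depth plane (~0.67 m), with a +0.75 dpt viewer offset:
print(eye_side_divergence_dpt(1.5, 0.75))   # -> 2.25 dpt
```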

It will be appreciated that the first and second variable focus lens elements described above may be utilized for one eye of the viewer, while third and fourth variable focus lens elements, similar to the first and second elements, may be utilized for the viewer’s other eye.
