Invented by Brian Bucknor, Christopher Lopez, Michael Janusz Woods, Aly H. M. Aly, James William Palmer, and Evan Francis Rynk; assigned to Magic Leap, Inc.
The Magic Leap, Inc. invention works as follows: Head-mounted augmented reality (AR) devices can track the pose of a user's head in order to provide a virtual three-dimensional representation of objects in the user's environment. A tracking system that uses electromagnetic (EM) fields can track body or head pose. A handheld input device may include an EM emitter that generates an EM field, and the head-mounted AR device may include an EM sensor that detects that field. The sensor's EM data can be analyzed to determine the location and/or orientation of the sensor, and thereby the pose of the wearer. The EM emitter and sensor can use time division multiplexing or dynamic frequency tuning to operate on multiple frequencies. Voltage gain control can be implemented in the transmitter rather than in the sensor, which allows for smaller and lighter sensor designs. The EM sensor can implement noise cancellation to reduce EM interference caused by nearby audio speakers.
Background for Electromagnetic Tracking with Augmented Reality Systems
The present disclosure relates to systems and methods for localizing the position or orientation of objects within augmented reality systems.
Modern computing and display technologies have facilitated the development of systems for so-called 'virtual reality' (VR) experiences, in which digitally reproduced images, or portions thereof, are presented to a user in a manner wherein they seem to be, or may be perceived as, real.
Head-mounted augmented reality (AR) devices can track the pose of the wearer's head (or another body part) in order to provide a virtual representation of objects in the wearer's environment. An electromagnetic (EM) tracking system can be used to track body gestures or head pose. For example, a handheld input device can include an EM emitter and the head-mounted AR system can include an EM sensor. In some implementations, the EM emitter generates an EM field that the EM sensor can detect. The sensor's EM data can be analyzed to determine the location and/or orientation of the sensor, and thereby the head pose of the wearer. The EM emitter and sensor may use time division multiplexing (TDM) or dynamic frequency tuning to allow the tracking system to operate on multiple frequencies. Voltage gain control can be implemented in the transmitter rather than in the sensor, which allows for smaller and lighter sensor designs. The EM sensor can implement noise cancellation to reduce EM interference caused by nearby audio speakers.
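As a rough illustration of how sensed EM field data maps to position: the field of a small transmitter coil falls off approximately with the cube of distance, so a calibrated amplitude measurement yields a range estimate. The following is a minimal sketch of that relation, not the patented method; the function name and calibration constants are illustrative assumptions.

```python
def estimate_range(measured_amplitude, calib_amplitude, calib_range):
    """Estimate emitter-to-sensor range from EM field amplitude.

    For a small (dipole-like) transmitter coil, field strength falls off
    roughly as 1/r**3, so amplitude1 / amplitude2 == (r2 / r1) ** 3.
    calib_amplitude is the amplitude measured at the known calib_range.
    """
    return calib_range * (calib_amplitude / measured_amplitude) ** (1.0 / 3.0)
```

A full six-degree-of-freedom solver would combine three orthogonal emitter coils with three orthogonal sensor coils (nine coupling measurements) to recover orientation as well as position.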
An embodiment of an electromagnetic tracking system includes an EM field emitter comprising an automatic gain control (AGC) and a transmitter coil, and an EM sensor without an AGC, the EM sensor comprising a sensor coil. The EM tracking system can be incorporated into a head-mounted augmented reality display device.
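One way to read the AGC-on-the-transmitter arrangement: the head-worn sensor only reports received amplitude, and the emitter closes the gain loop by scaling its own drive level, so no gain-control circuitry is needed on the sensor side. A hedged sketch of such a control loop follows; the target, step size, and gain limits are illustrative assumptions, not values from the patent.

```python
def transmitter_agc(tx_gain, rx_amplitude, target=1.0, step=0.1,
                    gain_min=0.05, gain_max=10.0):
    """One step of transmitter-side automatic gain control.

    The head-worn sensor reports rx_amplitude; the emitter nudges its
    drive gain toward the value that holds the received amplitude at
    `target`, clamped to the hardware's usable gain range.
    """
    error = target - rx_amplitude
    new_gain = tx_gain * (1.0 + step * error)
    return min(max(new_gain, gain_min), gain_max)
```

Run per update cycle, the loop raises the gain when the sensor is far from the emitter (weak signal) and lowers it when close, keeping the sensor's front end in its linear range without any sensor-side AGC hardware.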
Details of one or more implementations are provided in the accompanying drawings and the description below. Other features, aspects, and advantages will be apparent from the description, the drawings, and the claims. Neither this summary nor the following detailed description purports to define or limit the scope of the inventive subject matter.
Overview of AR, VR, and Localization Systems
In FIG. 1, an augmented reality scene (4) depicts a park-like setting (6) with people, trees, and buildings in the distance, as well as a concrete platform (1120). In addition to these items, the user of the AR technology also perceives a robot statue (1110) standing upon the real-world platform (1120), and a cartoon-like avatar character (2) flying by, which seems to be a personification of a bumble bee, even though these elements (2, 1110) do not actually exist in the real world. It turns out that the human visual perception system is very complex, and producing a VR/AR technology that facilitates a rich, comfortable presentation of virtual image elements amongst other real or virtual imagery is challenging.
If the display system detects the user's head motions, the data being displayed can be updated to take the change in head pose into account.
By re-rendering the 3D model for each viewpoint, the user can be given the impression of walking around a real object. The head-worn display can present multiple objects within a virtual space (for instance, a rich virtual world), and measurements of head pose (i.e., the position and orientation of the user's head) can be used to re-render the scene to match the user's dynamically changing viewpoint and provide an increased sense of immersion.
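The re-rendering step described above amounts to transforming world-anchored virtual content by the inverse of the measured head pose each frame. A simplified sketch follows, assuming a z-up world and yaw-only head rotation; the function names are illustrative, not from the patent.

```python
import numpy as np

def view_matrix(head_pos, yaw_rad):
    """World-to-head rigid transform for a head at head_pos rotated by
    yaw about the vertical (z) axis: apply the inverse rotation, then
    the inverse translation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])   # head-to-world rotation
    M = np.eye(4)
    M[:3, :3] = R.T                   # inverse rotation
    M[:3, 3] = -R.T @ np.asarray(head_pos, dtype=float)
    return M

def to_head_space(world_point, head_pos, yaw_rad):
    """Re-express a world-anchored point in the current head frame."""
    p = np.append(np.asarray(world_point, dtype=float), 1.0)
    return (view_matrix(head_pos, yaw_rad) @ p)[:3]
```

Because each world-anchored point is transformed by the freshly measured pose every frame, it appears fixed in the world as the head translates or rotates.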
In AR systems, detection or calculation of head pose can help the display system render virtual objects in a way that feels intuitive to the user. Likewise, detection of the position and/or orientation of a real object, such as a handheld device (which also may be referred to as a "totem"), is important: the display system can present display information relevant to the AR system at locations that let the user interact efficiently with certain aspects of the system. As the user's head moves, the virtual objects can be re-rendered as a function of head pose so that they appear to remain stable relative to the real world. At least for AR applications, placement of virtual objects in spatial relation to physical objects can be a nontrivial problem. Head movements complicate the placement of virtual objects in a view of the ambient environment; this is true whether the view is captured as an image of the ambient environment and then projected or displayed to the user, or whether the user perceives the view directly. A head movement changes the user's field of view, which requires an update to where the virtual objects are displayed. Head movements also occur within a large variety of ranges and speeds: speed may vary not only between different head movements, but also within or across the range of a single head movement. Head movement speed may increase (e.g., linearly or non-linearly) from a starting point and decrease as an ending point is reached, with the maximum speed somewhere in between. Rapid head movements may even exceed the ability of the particular display or projection technology to render images that appear uniform and/or in smooth motion to the end user.
Head tracking accuracy and latency (i.e., the elapsed time between when the user moves their head and when the image is updated and displayed to the user) have been challenges for VR and AR systems. It is important that head tracking accuracy is high and that overall system latency is low, especially for systems that fill a substantial portion of the user's visual field with virtual elements. If latency is high, the system can create a mismatch between the user's sensory systems, which can lead to motion sickness. If system latency is high, virtual objects' apparent locations will be unstable when the user moves their head rapidly.
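The latency problem above can be quantified: a virtual object rendered against a stale head pose appears angularly displaced by roughly the head's angular rate times the motion-to-photon latency. A back-of-the-envelope helper (illustrative, not from the patent):

```python
def apparent_error_deg(head_rate_deg_s, latency_ms):
    """Approximate angular misplacement of a world-locked virtual object:
    (head angular rate) x (motion-to-photon latency). At 100 deg/s of
    head rotation and 20 ms of latency, the object lags by ~2 degrees."""
    return head_rate_deg_s * latency_ms / 1000.0
```

This is why both high tracking accuracy and low end-to-end latency are required: either source of error shows up as visible swim of virtual content during head motion.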
Other display systems can also benefit from accurate, low-latency head pose detection. These include head-tracked displays that are mounted on a wall or other surface rather than worn by the user. The head-tracked display acts like a window onto a scene: as a user moves their head relative to the "window", the scene is re-rendered to match the user's changing viewpoint. Another system is a head-worn projection system, in which a head-worn display projects light onto real-world objects.
Also, in order to provide a realistic augmented reality experience, AR systems can be designed to be interactive with the user. For example, multiple users may play a ball game with a virtual ball and/or other virtual objects. One user may "catch" the virtual ball and throw it back to another user. In another embodiment, the first user is provided with a bat (e.g., a real bat communicatively coupled to the AR system) to hit the virtual object. In other embodiments, a virtual user interface may be presented to the AR user so that the user can select one of many options. To interact with the system, the user may use totems, haptic devices, or wearable components.
Detecting the head pose and orientation of the user, as well as the physical location of real objects in space, enables the AR system to display virtual content in an effective and enjoyable manner. These capabilities are important to an AR system but are difficult to achieve. The AR system must recognize the physical location of a real object (e.g., the user's head or hand, a totem, a haptic device, a wearable component, etc.) and correlate the physical coordinates of the real object with virtual coordinates corresponding to one or more virtual objects being displayed to the user. This requires highly accurate sensors and sensor recognition systems that track the position and orientation of one or more objects at rapid rates. Current approaches do not perform localization at the speed and precision required.
There is, therefore, a need for improved localization systems for AR and VR devices.
Examples of AR and VR Systems and Components
Referring to FIGS. 2A-2D, some general componentry options are illustrated. In the sections of the detailed description that follow the discussion of FIGS. 2A-2D, various systems, subsystems, and components are presented for addressing the objectives of providing a high-quality, comfortably-perceived display system for human VR and/or AR.
As shown in FIG. 2A, a user of an AR system (60) is depicted wearing a head-mounted component (58) featuring a frame structure (64) coupled to a display (62) positioned in front of the user's eyes. In the depicted configuration, a speaker (66) coupled to the frame (64) is located adjacent to the user's ear canal (in another embodiment, a second speaker, not illustrated, is positioned adjacent to the user's other ear canal to provide stereo/shapeable audio control). The display (62) is operatively coupled (68), such as by a wired lead or wireless connectivity, to a local processing and data module (70), which may be mounted in a variety of configurations, such as fixedly attached to the frame (64), fixedly attached to a helmet or hat (80) as shown in FIG. 2B, embedded in headphones, removably attached to the torso (82) of the user in a backpack-style configuration as shown in FIG. 2C, or removably attached to the hip of the user in a belt-coupling style configuration as shown in FIG. 2D.