Invented by Eric Whitmire, Kaan Aksit, Michael Stengel, Jan Kautz, David Luebke, and Ben Boudaoud; assigned to Nvidia Corp.
The Nvidia Corp invention works as follows.
The patent describes a gaze tracking system that tracks a driver's eyes as they look through a transparent view field. It includes an opaque frame surrounding the transparent view field, light-emitting devices attached to the opaque frame that emit infrared light onto different regions of the driver's eye, and detectors that measure the intensity of the infrared light reflected from various areas of the eye.

Background for "Driver gaze tracking system for vehicles"
Head-mounted displays (HMDs), which are increasingly popular, use various sensors to create immersive virtual reality experiences. Consumer head-mounted displays geared toward virtual reality use a range of sensors, whether embedded in the headset or grouped into an external unit. These include gyroscopes and accelerometers as well as various optical sensors such as cameras, photodiodes, and/or LEDs. These sensors can detect the wearer of the device, track head orientation and motion, and measure the direction and duration of gaze. Gaze input can enhance the user experience in HMD systems: gaze-supported target acquisition is fast and intuitive, unlike inputs that are decoupled from where the user is looking. The virtual reality (VR), augmented reality (AR), and HMD industries are moving toward gaze tracking as a core component of future HMDs.
Although current prototypes are bulky and costly, growing interest in low-cost eye trackers has encouraged development that has already produced promising results. However, major challenges remain in improving conventional gaze tracking software and hardware. The typical gaze tracker relies on camera-based imaging techniques that consume substantial power, which may not suit low-power mobile solutions. The imaging equipment also requires complex image processing software, adding a costly image processing block to the pipeline. Because such gaze trackers operate on high-dimensional data (high-resolution images), they introduce unwanted latency in the hardware, the communication protocols, and the gaze estimation pipeline.
Various implementations disclosed herein relate to gaze tracking systems used in head-mounted displays. In various implementations, a head-mounted display is disclosed with a lens, a display positioned in front of it in an optical pathway, and one or more photosensors positioned behind the lens. A "photosensor" is any device capable of detecting light; photosensors can be photodiodes or light-emitting diodes operated in a sensing mode.
Various implementations can further include one or more light-emitting diodes (optionally adaptively configurable between illumination and sensing modes) positioned behind the lens in the optical path, adjacent to the photosensor elements.
The system includes an eyepiece with an opaque frame enclosing a transparent view field, a plurality of light-emitting elements attached to the frame to emit infrared light onto an eye looking through the field, and a plurality of photosensor elements coupled to the opaque frame. Each sensor element of the plurality is operationally coupled to at least one light-emitting device of the plurality.
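The emitter/sensor pairing described above can be sketched in code. This is a minimal illustrative model only; the class names, the eye-region labels, and the single-emitter-per-sensor coupling shown here are assumptions for illustration, not details taken from the patent.

```python
# Illustrative sketch of the eyepiece layout (hypothetical names, not the
# patent's actual design).

class Emitter:
    """IR light-emitting element mounted on the opaque frame, aimed at one
    region of the eye."""
    def __init__(self, region):
        self.region = region

class Sensor:
    """Photosensor element operationally coupled to at least one emitter;
    reports reflected IR intensity for that emitter's eye region."""
    def __init__(self, emitter, read_fn):
        self.emitter = emitter
        self._read = read_fn  # stands in for the hardware readout

    def intensity(self):
        return self._read(self.emitter.region)

class Eyepiece:
    """Opaque frame around a transparent view field, carrying the coupled
    emitter/sensor pairs."""
    def __init__(self, sensors):
        self.sensors = sensors

    def sample(self):
        # One measurement frame: reflected intensity per sensor element.
        return [s.intensity() for s in self.sensors]

# Fake readings standing in for actual hardware measurements.
readings = {"pupil": 0.2, "sclera": 0.8, "iris": 0.5}
frame = Eyepiece([Sensor(Emitter(r), readings.get) for r in readings])
print(frame.sample())  # [0.2, 0.8, 0.5]
```

Each call to `sample()` yields one intensity vector, which is the kind of low-dimensional measurement the patent contrasts with high-resolution camera images.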
The method includes: (1) determining, with calibration logic, a set of first measures of the intensity of infrared light reflected by an eye while the eye gazes at a plurality of calibration points during a first period of time, the calibration points corresponding to a number of known gaze directions; and (2) determining a gaze direction for the eye during a second period of time based on second measures of the intensity of infrared light reflected by the eye.
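The two steps above amount to building a lookup from intensity vectors to known gaze directions, then matching later readings against it. The sketch below uses nearest-neighbor matching on the calibration samples; that matching rule, the function names, and all numeric values are assumptions for illustration, since the patent's actual estimation logic is not given here.

```python
import numpy as np

# Hypothetical sketch of the calibrate-then-estimate scheme; nearest-neighbor
# matching is an assumed stand-in for the patent's gaze estimation logic.

def calibrate(intensity_samples, gaze_directions):
    """Store first measures of reflected IR intensity taken while the eye
    fixates calibration points with known gaze directions."""
    return (np.asarray(intensity_samples, dtype=float),
            np.asarray(gaze_directions, dtype=float))

def estimate_gaze(model, intensities):
    """Return the known direction whose calibration intensity vector is
    closest (Euclidean distance) to the current sensor reading."""
    samples, directions = model
    dists = np.linalg.norm(samples - np.asarray(intensities, dtype=float),
                           axis=1)
    return directions[np.argmin(dists)]

# Calibration: 3 points, 4 photosensor channels each (synthetic values).
model = calibrate(
    [[0.9, 0.1, 0.2, 0.1],   # eye gazing left
     [0.2, 0.8, 0.3, 0.2],   # eye gazing center
     [0.1, 0.2, 0.9, 0.3]],  # eye gazing right
    [[-15.0, 0.0], [0.0, 0.0], [15.0, 0.0]],  # (yaw, pitch) in degrees
)

# Second-period reading: matches the "left" calibration sample most closely.
print(estimate_gaze(model, [0.85, 0.15, 0.25, 0.1]))
```

In practice an implementation would likely interpolate between calibration samples rather than snap to the nearest one, but the division into a calibration phase and an estimation phase is the same.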
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
To easily identify the discussion of any particular element or act, the most significant digit or digits of a reference number refer to the number of the figure in which that element is first introduced.