Electronics – Matthew Bailey, Stefan Alexander, Google LLC

Abstract for “Systems, devices and methods to interact with content on head-mounted displays”

“Systems, devices, and methods that enable sophisticated yet discreet interactions with content displayed on a head-mounted display are described. The head-mounted display includes an eye-tracker, and the user also carries or wears a portable wireless interface device, such as a ring, somewhere on their body. Because the portable interface device is wireless, it can adopt a small and discreet form factor. The portable interface device includes an actuator that, when activated by the user, causes the device to wirelessly transmit a signal, such as a radio frequency or sonic signal. A selection operation consists of the user activating the actuator while simultaneously gazing at a displayed object (as determined by the eye-tracker). In response to the selection operation, the head-mounted display presents a visual effect to the user.”

Background for “Systems, devices and methods to interact with content on head-mounted displays”

“Technical Field”

“The present systems, devices, and methods generally relate to interfacing with content displayed on head-mounted displays, and particularly relate to a multi-modal interface that combines eye tracking with a wireless portable interface device.”

“Description of Related Art”

“Wearable Electronic Devices.”

“Electronic devices are now commonplace in most parts of the world. Thanks to advances in integrated circuit technology, many electronic devices are now small enough to be easily carried; such devices are ‘portable.’ Portable electronic devices may include an on-board power supply, such as a battery or other power storage system, and may also be ‘wireless,’ meaning they are designed to operate without any wire connections to non-portable electronic devices. However, an electronic device may still be considered portable even if it is not wireless: a microphone, for example, may be considered a portable electronic device regardless of whether it connects wirelessly or through a wired connection.”

“The portability of electronic devices has created a large industry. There are many types of portable electronic devices, including smartphones, e-book readers, laptop computers, and tablet computers. However, the convenience of carrying a portable electronic device has brought with it the inconvenience of having one's hand(s) encumbered by the device itself. The wearable electronic device addresses this problem: it is portable, but it is worn rather than held.”

A wearable electronic device is any portable electronic device that a user can carry without physically grasping, clutching, or otherwise holding onto the device with their hands. A wearable electronic device may be attached to the user by a strap or band, a clip, an adhesive, a pin and clasp, an article of clothing, tension or elastic support, an interference fit, or some other form of engagement. Examples of wearable electronic devices include electronic wristwatches, electronic armbands, electronic rings, electronic ankle-bracelets, head-mounted electronic display units, hearing aids, and so on.

Because wearable electronic devices are worn on the body, are visible to others, and may be worn for long periods of time, form factor (i.e., size, geometry, and appearance) is an important design consideration.

“Head-Mounted Displays”

“Human-Electronics Interfaces and Devices.”

A human-electronics interface facilitates communication between a person and one or more electronic devices. A human-electronics interface generally comprises one or more electronic interface devices that: a) detect inputs from the user and convert those inputs into electrical signals that can be processed or acted upon by the one or more electronic devices, and/or b) provide outputs (e.g., typically visual, auditory, and/or tactile outputs) from the one or more electronic devices to the user, where the user is able to sense the outputs and understand some information they represent. A human-electronics interface may be one-directional or bidirectional, and a complete interface may make use of multiple interface devices. The computer mouse, for example, is a one-directional (human-to-electronics) interface device: it detects inputs from a user and converts those inputs into electrical signals that can be processed by a computer. A computer's display or monitor is likewise a one-directional (electronics-to-human) interface device: it provides visual outputs from which the user can understand information. Together, the computer mouse and the display form a bidirectional human-computer interface (“HCI”), and the HCI is an example of a human-electronics interface.

A wearable electronic device may function as an interface device if it: a) includes sensors that detect inputs effected by a user, and b) transmits signals based on those inputs to another electronic device. Many different sensor-types and input-types are possible, including tactile sensors (e.g., buttons, switches, or keys) that provide manual control, acoustic sensors that provide voice control, and accelerometers and/or electromyography sensors that provide gestural control.

“Interacting With Head-Mounted Displays”

The display screen of a typical portable electronic device requires that the user dedicate at least one hand to holding and/or orienting the device in order to see, access, receive feedback from, and/or interact with the screen. This can make it difficult for the user to perform other tasks and/or interact with their environment while using the portable electronic device. The problem can be overcome by making the display screen itself wearable, as in a head-mounted display, so the user can view, access, and/or receive feedback from the screen without holding anything. Head-mounted displays have begun to gain acceptance in recent years, and a few recently introduced head-mounted display devices could potentially be adopted widely by consumers.

However, interfaces for interacting with content displayed on head-mounted displays remain unsatisfactory. The challenge with wearable heads-up displays is to provide sophisticated control capability while maintaining an interface mechanism that is discreet, substantially hands-free, and of minimal form factor. Proposed solutions include voice control, touch pads built into the frames of wearable heads-up displays, and remote controls tethered to the display by a wired connection, but none of these solutions satisfies all of the criteria above. Overcoming the technical challenges of interfacing with displayed content is necessary for wearable heads-up displays to gain widespread adoption.

A system that enables a user to interact with content displayed on a head-mounted display may be summarized as including: a head-mounted display with at least one display positioned in the field of view of at least one eye of the user when worn on the user's head, a processor communicatively coupled to the at least one display, a non-transitory processor-readable storage medium communicatively coupled to the processor, and a wireless receiver communicatively coupled to the processor; an eye-tracker to detect that the user is gazing at an object displayed on the at least one display; and a wireless portable interface device that includes at least one actuator. A selection operation may consist of a substantially simultaneous combination of gazing at at least one object displayed on the at least one display and activating the at least one actuator of the portable interface device. In response to the user's selection operation, a visual effect may be displayed on the at least one display of the head-mounted display. The non-transitory processor-readable storage medium may store processor-executable instructions and/or data that, when executed by the processor in response to the wireless receiver wirelessly receiving a signal from the portable interface device, cause the processor to: request current gaze direction data from the eye-tracker; identify the object at which the user is gazing based on the current gaze direction data received from the eye-tracker; and cause the at least one display to display the visual effect on or about that object.

The portable interface device may be batteryless and may include a piezoelectric element communicatively coupled to the actuator and an antenna communicatively coupled to the piezoelectric element, while the wireless receiver of the head-mounted display may include a radio frequency receiver. In this configuration, the actuator, when activated by the user, may mechanically actuate the piezoelectric element; the piezoelectric element may generate an electric signal in response to the mechanical actuation; and the antenna may wirelessly transmit a radio frequency signal in response to the electric signal generated by the piezoelectric element.

The portable interface device may be batteryless and may alternatively include a mechanical resonator physically coupled to the actuator, while the wireless receiver of the head-mounted display may include at least one of a microphone and/or a piezoelectric element tuned to be responsive to a sonic signal. In this configuration, the actuator, when activated by the user, may mechanically actuate the mechanical resonator, and the mechanical resonator may generate the sonic signal in response to the mechanical actuation. The sonic signal may include an ultrasonic signal.

“The object may include at least one object selected from a group consisting of: a menu item, a graphical button, a keyboard key, a notification, one of multiple objects displayed on the at least one display of the head-mounted display, a file, a folder, and an alphanumeric character. The portable interface device may include a wearable device, such as a ring or a wristband. The actuator of the portable interface device may include a button. The eye-tracker may be carried by, and physically coupled to, the head-mounted display.

A method of operating a system that includes a head-mounted display, an eye-tracker, and a wireless portable interface device may be summarized as including: displaying an object on at least one display of the head-mounted display within a field of view of at least one eye of the user; receiving a selection operation from the user, including detecting by the eye-tracker that the user is gazing at the object and receiving by the portable interface device an activation of at least one actuator; wirelessly transmitting a signal by the portable interface device; wirelessly receiving the signal by a wireless receiver of the head-mounted display; and displaying a visual effect on the at least one display in response to the system receiving the selection operation. The head-mounted display may include a processor and a non-transitory processor-readable storage medium, and the non-transitory processor-readable storage medium may store processor-executable instructions and/or data. The method may include executing, by the processor, the processor-executable instructions and/or data to cause the at least one display of the head-mounted display to display the object, and to cause the at least one display to display the visual effect in response to the system receiving the selection operation from the user.

The portable interface device may be batteryless and may include a piezoelectric element communicatively coupled to the actuator and an antenna communicatively coupled to the piezoelectric element, while the wireless receiver of the head-mounted display may include a radio frequency receiver. In this configuration, receiving the activation of the at least one actuator from the user may include receiving, by the piezoelectric element, a mechanical actuation effected by the user; the piezoelectric element may generate an electric signal in response to the mechanical actuation; wirelessly transmitting the signal by the portable interface device may include wirelessly transmitting a radio frequency signal by the antenna in response to the electric signal; and wirelessly receiving the signal may include wirelessly receiving the radio frequency signal by the wireless receiver of the head-mounted display.

The portable interface device may be batteryless and may alternatively include a mechanical resonator physically coupled to the actuator. In this configuration, receiving the activation of the at least one actuator from the user may include receiving, by the mechanical resonator, a mechanical actuation effected by the user; the mechanical resonator may generate a sonic signal in response to the mechanical actuation; wirelessly transmitting the signal by the portable interface device may include transmitting the sonic signal by the mechanical resonator; and wirelessly receiving the signal may include wirelessly receiving the sonic signal by a wireless receiver of the head-mounted display that includes at least one of a microphone and/or a piezoelectric element tuned to be responsive to the sonic signal. The sonic signal generated by the mechanical resonator of the portable interface device may include an ultrasonic signal.

Alternatively, the portable interface device may include an on-board power source and a radio frequency transmitter. In this configuration, wirelessly transmitting the signal by the portable interface device may include wirelessly transmitting, by the radio frequency transmitter, a radio frequency signal in a range of 10 MHz to 10 GHz, and wirelessly receiving the signal may include wirelessly receiving the radio frequency signal by the wireless receiver of the head-mounted display.

A wearable human-electronics interface may be summarized as including: a wearable heads-up display with a wireless receiver; an eye-tracker carried by the wearable heads-up display; and a wearable actuator that, when actuated, wirelessly transmits a signal to the wireless receiver. The wearable heads-up display may further include a processor communicatively coupled to both the wireless receiver and the eye-tracker, and the processor may effect interactions with content displayed on the wearable heads-up display in response to substantially concurrent inputs from both the eye-tracker and the wearable actuator.

“BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS”

“In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements have been arbitrarily enlarged and positioned to improve legibility. The particular shapes of the elements as drawn are not intended to convey information regarding the actual shape of the particular elements and have been selected solely for ease of recognition in the drawings.

“FIG. 1 is an illustrative diagram of a system that enables a user to interact with content displayed on a head-mounted display in accordance with the present systems, devices, and methods.”

“FIG. 2 is an illustrative diagram of a human-electronics interface in which a user wears a system that enables the user to interact with displayed content in accordance with the present systems, devices, and methods.”

“FIG. 3 is a flow-diagram showing a method of operating a system that includes a head-mounted display, an eye-tracker, and a wireless portable interface device in accordance with the present systems, devices, and methods.”

“In the following description, certain specific details are set forth in order to provide a thorough understanding of the various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, and materials. In other instances, well-known structures associated with head-mounted displays and electronic devices have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.

“Unless the context requires otherwise, throughout the specification and claims that follow, the word ‘comprise’ and variations thereof, such as ‘comprises’ and ‘comprising,’ are to be construed in an open, inclusive sense, that is, as ‘including, but not limited to.’”

“Reference throughout this specification to ‘one embodiment’ or ‘an embodiment’ means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment, and the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.”

“As used in this specification and the appended claims, the singular forms ‘a,’ ‘an,’ and ‘the’ include plural referents unless the content clearly dictates otherwise. It should also be noted that the term ‘or’ is generally employed in its broadest sense, that is, as meaning ‘and/or,’ unless the content clearly dictates otherwise.

“The headings and Abstract of the disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.”

“The various embodiments described herein provide systems, devices, and methods for interfacing with content displayed on head-mounted displays. The interface has a minimal form factor yet enables sophisticated control interactions to be performed in a discreet, substantially hands-free manner. This is accomplished with a multi-modal, fully wearable interface that combines substantially concurrent inputs from an eye-tracker and a wireless portable interface device.

“FIG. 1 is an illustrative diagram of a system 100 that enables a user to interact with content displayed on a head-mounted display (“HMD”) 110 in accordance with the present systems and devices. HMD 110 includes at least one display 111 (two such displays are illustrated in FIG. 1) positioned within the user's field of view when HMD 110 is worn on the user's head. Display(s) 111 may employ one or more waveguides, one or more microdisplays, and/or any or all of the display technologies described in U.S. Patent Application Publication 2015-0205134 and U.S. Provisional Patent Application Ser. No. 62/117,316, among other patent applications. HMD 110 also includes a processor 112 communicatively coupled to the at least one display 111 and a non-transitory processor-readable storage medium, or memory, 113 communicatively coupled to processor 112. Memory 113 stores processor-executable instructions and/or data 114 that are executable by processor 112 of HMD 110.

“HMD 110 also includes a receiver 116 (i.e., a wireless receiver, or a wireless transceiver that includes a wireless receiver) to wirelessly receive signals. Receiver 116 is communicatively coupled to processor 112.

“System 100 also includes an eye-tracker 117 to determine the eye position and/or gaze direction of the user; it may be communicatively coupled to processor 112. Eye-tracker 117 includes at least one camera or photodetector to measure light (e.g., visible or infrared light) reflected from at least one eye of the user, and processor 112 may use the measured reflections to determine the eye position and/or gaze direction. Eye-tracker 117 may implement the technology described in U.S. Non-Provisional Patent Application Ser. Nos. 15/167,458 and 15/167,472. In accordance with the present systems, devices, and methods, eye-tracker 117 detects whether the user is gazing at (e.g., staring at, or generally pointing their eye(s) in the direction of) at least one object displayed on display 111. In the illustrated example of FIG. 1, eye-tracker 117 is physically carried by HMD 110, although in alternative implementations eye-tracker 117 may be physically separate from HMD 110.
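
To make the gaze-determination step concrete, the following is a minimal sketch of one common approach (an illustrative assumption, not the method of this disclosure): measure the pupil-to-glint offset from the camera image and map it to a display coordinate through a per-user affine calibration fitted by least squares. All names, coordinates, and values below are hypothetical.

```python
# Illustrative sketch (an assumption, not the patent's method): map a measured
# pupil-minus-glint offset to a gaze point on the display via a per-user
# affine calibration fitted by least squares.
import numpy as np

def fit_calibration(offsets, screen_points):
    """offsets: (N, 2) pupil-minus-glint vectors; screen_points: (N, 2) known
    on-display calibration targets. Returns a (3, 2) affine map."""
    A = np.hstack([np.asarray(offsets), np.ones((len(offsets), 1))])  # [dx, dy, 1]
    M, *_ = np.linalg.lstsq(A, np.asarray(screen_points), rcond=None)
    return M

def gaze_point(M, offset):
    """Apply the fitted map to one offset, giving display coordinates."""
    dx, dy = offset
    return np.array([dx, dy, 1.0]) @ M

# Example: four calibration targets at the display corners.
offsets = [(-0.02, -0.01), (0.02, -0.01), (-0.02, 0.01), (0.02, 0.01)]
targets = [(0, 0), (640, 0), (0, 480), (640, 480)]
M = fit_calibration(offsets, targets)
print(gaze_point(M, (0.0, 0.0)))  # ~[320. 240.], the display center
```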

“System 100 provides a multi-modal interface for interacting with content displayed on a head-mounted display. A first mode of interaction, i.e., via eye position and/or gaze direction, is realized by eye-tracker 117. For a second mode of interaction, system 100 also includes a wireless portable interface device 120 carried by the user. In the illustrated example of FIG. 1, portable interface device 120 is sized and shaped as a ring worn on a finger or thumb of the user, although in other implementations portable interface device 120 may be worn on the wrist or arm, or may adopt a non-annular form that clips, sticks, or otherwise attaches to the user (e.g., a pen with a clip). Portable interface device 120 is physically separate from HMD 110 and includes at least one actuator 121 (e.g., a button, toggle, lever, dial, knob, or other component) that, when activated by the user, causes portable interface device 120 to wirelessly transmit a signal via a wireless signal generator 122.

“Portable interface device 120 may include a portable power source, such as a battery or a supercapacitor (i.e., a capacitor with capacitance on the order of 0.01 F or higher). Alternatively, portable interface device 120 may be ‘batteryless.’ Throughout this specification and the appended claims, the term ‘batteryless’ literally means ‘without any battery or batteries’ (or any equivalent device providing a comparable function, such as a supercapacitor) and is used to indicate that the corresponding device, e.g., portable interface device 120, has no on-board battery or other source of stored (i.e., previously generated) power.

“Portable interface device 120 is also generally referred to herein as a wireless device. Throughout this specification and the appended claims, the term ‘wireless’ literally means ‘without any external wire connections to anything’ and is used to indicate that the corresponding device is not tethered by any external wire connections (or optical fiber connections, cable connections, etc.) to any other electronic device or to any source of electrical power. In other words, when portable interface device 120 is both wireless and batteryless, portable interface device 120 is, absent any actuation (described in more detail below), generally without electric power.

“Wearable electronic devices are usually larger and bulkier than their traditional, non-electronic counterparts (e.g., jewelry), at least in part because wearable electronic devices often require large and bulky components such as an on-board battery. In the present systems, devices, and methods, portable interface device 120 is wireless, which enables the removal of large and bulky electric components (e.g., a battery and a charging port, if batteryless) and thereby enables a compact, minimal form factor that is uncommon among wearable electronic devices. Even when batteryless, portable interface device 120 may still produce electric signals through mechanical actuation of, for example, one or more on-board piezoelectric components.

“In the illustrated example, portable interface device 120 includes only one actuator, a ‘button’ 121, although other implementations may employ a second and/or third actuator. In general, portable interface device 120 carries very few actuators in order to minimize its size. In the illustrated example of FIG. 1, actuator 121 may provide a ‘select’ function in conjunction with the region of display 111 of HMD 110 at which the user is gazing, as determined by eye-tracker 117 and processor 112. Memory 113 of HMD 110 stores processor-executable instructions and/or data 114 that, when executed by processor 112 of HMD 110, cause the at least one display 111 to display at least one object 115 that is responsive to a selection operation performed by the user. In accordance with the present systems, devices, and methods, the selection operation may consist of a substantially simultaneous combination of gazing at the at least one object 115 displayed by the at least one display 111 (as detected by eye-tracker 117) and activating the at least one actuator 121 of portable interface device 120. HMD 110, e.g., processor 112 of HMD 110, may effect the selection operation in response to receiving a wireless ‘selection signal’ 150 at receiver 116, transmitted from wireless signal generator 122 of portable interface device 120. The selection operation may include ‘selecting’ the object 115 that eye-tracker 117 identifies the user as gazing at when wireless selection signal 150 is received at receiver 116. To this end, when receiver 116 of HMD 110 wirelessly receives selection signal 150, processor 112 of HMD 110 executes processor-executable instructions and/or data 114 stored in memory 113 that cause processor 112 to: i) request current gaze direction data from eye-tracker 117, and ii) identify the object 115 at which the user is gazing based on the current gaze direction data received from eye-tracker 117.
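
As a concrete illustration of this signal-handling flow, here is a minimal, hypothetical sketch of the HMD-side logic: on receipt of selection signal 150, request the current gaze point, hit-test it against the displayed objects, and apply a visual effect. The class, method, and attribute names (and the rectangle-based hit test) are assumptions for illustration, not elements of the disclosure.

```python
# Hypothetical sketch of the HMD-side selection logic; names are illustrative
# assumptions, not APIs from the disclosure.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DisplayedObject:
    name: str
    bounds: Tuple[float, float, float, float]  # (x0, y0, x1, y1) display coords

class SelectionController:
    def __init__(self, eye_tracker, display, objects: List[DisplayedObject]):
        self.eye_tracker = eye_tracker   # assumed to provide current_gaze() -> (x, y)
        self.display = display           # assumed to provide highlight(obj)
        self.objects = objects

    def on_selection_signal(self) -> Optional[DisplayedObject]:
        """Invoked when receiver 116 reports wireless selection signal 150."""
        gaze_x, gaze_y = self.eye_tracker.current_gaze()  # i) request gaze data
        target = self._object_at(gaze_x, gaze_y)          # ii) identify object
        if target is not None:
            self.display.highlight(target)                # apply visual effect
        return target

    def _object_at(self, x: float, y: float) -> Optional[DisplayedObject]:
        # Simple bounding-box hit test over the displayed objects.
        for obj in self.objects:
            x0, y0, x1, y1 = obj.bounds
            if x0 <= x <= x1 and y0 <= y <= y1:
                return obj
        return None
```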

In response to the user's selection operation, a visual effect may be displayed on the at least one display 111 of HMD 110. The visual effect may include, for example, highlighting, visually changing, or otherwise modifying the object 115. Depending on the application and/or the interface being used, object 115 may be any displayed image, such as a menu item, a graphical button, a keyboard key, a notification, one of multiple objects displayed by the at least one display 111 of HMD 110, a file, a folder, and/or an alphanumeric character. In the illustrated example of FIG. 1, display 111 shows a representation of a virtual keyboard, and the specific object 115 selected by the user corresponds to a particular key (i.e., letter) of the keyboard: the letter ‘T’ in a typical QWERTY keyboard configuration.

“As an example application, system 100 may enable the user to type by: i) displaying a virtual keyboard on the at least one display 111, and, over a number of repeated instances: ii) detecting which letter the user is gazing at via eye-tracker 117, and iii) selecting that letter when the user activates actuator 121 of portable interface device 120.
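
A sketch of that typing loop might look as follows, assuming a fixed-grid QWERTY layout and a hypothetical eye-tracker object; the key geometry and all names are illustrative assumptions.

```python
# Hypothetical sketch of the gaze-typing loop: on each actuator press, select
# the key currently gazed at. Layout geometry is an illustrative assumption.
QWERTY_ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]

def key_at(gaze_x: float, gaze_y: float, key_w: int = 60, key_h: int = 80):
    """Map a gaze point (display pixels) to a key, assuming each row is drawn
    left-aligned from (0, 0) with fixed-size keys."""
    row, col = int(gaze_y // key_h), int(gaze_x // key_w)
    if 0 <= row < len(QWERTY_ROWS) and 0 <= col < len(QWERTY_ROWS[row]):
        return QWERTY_ROWS[row][col]
    return None  # gaze fell outside the keyboard

typed = []

def on_actuator_press(eye_tracker) -> None:
    """Invoked when the ring's button press arrives over the wireless link."""
    letter = key_at(*eye_tracker.current_gaze())
    if letter is not None:
        typed.append(letter)  # 'select' the gazed-at letter
```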

“As previously described, the form factor of a wearable electronic device is an important consideration that can determine whether or not the device will be adopted by users. The present systems, devices, and methods provide a portable interface device 120 for interacting with content displayed on HMD 110 that achieves a size and bulk comparable to traditional jewelry and accessories, at least in part because portable interface device 120 is wireless and may even be batteryless.

Even though portable interface device 120 may be both wireless and batteryless, it can still wirelessly transmit signals to HMD 110. The present systems, devices, and methods provide two exemplary configurations that enable portable interface device 120 to wirelessly transmit signals despite being wireless and batteryless.

“As a first example, portable interface device 120 may include a piezoelectric element communicatively coupled to actuator 121 and a radio frequency antenna 122 communicatively coupled to the piezoelectric element. When actuator 121 is activated by the user (e.g., pressed, depressed, switched, or similar), actuator 121 mechanically actuates the piezoelectric element, and the piezoelectric element generates an electric signal in response to the mechanical actuation. This electric signal is communicatively routed to radio frequency antenna 122, which in response wirelessly transmits a radio frequency signal 150. Antenna 122, and any associated circuitry, may be tuned to wirelessly transmit radio frequency (or microwave frequency) signals 150 at a specific frequency or within a specific range of frequencies. In this configuration, receiver 116 of HMD 110 includes a radio frequency (or microwave) receiver tuned to receive signals in the range of the signal 150 wirelessly transmitted by antenna 122 of portable interface device 120.

“As a second example, portable interface device 120 may include a mechanical resonator 122 physically coupled to actuator 121. When actuator 121 is activated by the user (e.g., pressed, pushed, depressed, or similar), actuator 121 mechanically actuates (e.g., strikes, impacts, oscillates, vibrates, or similar) mechanical resonator 122, and mechanical resonator 122 generates a sonic, acoustic, or aural signal 150 (which may be an ultrasonic signal) in response to the mechanical actuation. In this configuration, receiver 116 of HMD 110 includes a microphone and/or a piezoelectric element tuned to be responsive to sonic signals in the range of the signal 150 wirelessly transmitted by mechanical resonator 122 of portable interface device 120.
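
As an illustration of how the HMD side might discriminate such a tuned tone from background sound, the sketch below applies a Goertzel filter tuned to an assumed 25 kHz resonant frequency. The specific frequency, sample rate, and threshold are assumptions for illustration, not values from the disclosure.

```python
# Illustrative sketch (an assumption, not the patent's circuitry): detect a
# resonator tone in microphone samples with a Goertzel filter tuned to an
# assumed 25 kHz ultrasonic frequency at a 96 kHz sample rate.
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the power of `samples` at `target_hz` (standard Goertzel)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)        # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def tone_present(samples, sample_rate=96_000, target_hz=25_000, threshold=1e4):
    # Threshold assumes ~unit-amplitude samples over a ~10 ms window.
    return goertzel_power(samples, sample_rate, target_hz) > threshold

# Example: a synthetic 25 kHz burst is detected; silence is not.
rate, f = 96_000, 25_000
burst = [math.sin(2 * math.pi * f * t / rate) for t in range(960)]
print(tone_present(burst), tone_present([0.0] * 960))  # True False
```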

“Thus, the signal 150 that is wirelessly transmitted in the present systems, devices, and methods may take a variety of forms, including radio frequency signals, sonic signals (e.g., ultrasonic signals), optical signals, and/or photonic signals. Generally, a signal that is ‘wirelessly transmitted’ is any signal transmitted without electrically conductive wire connections, i.e., any signal that is not transmitted through an electrically conductive medium.

“FIG. 2 is an illustrative diagram of a human-electronics interface 200 in which a user 201 wears a system that enables the user to interact with displayed content in accordance with the present systems, devices, and methods. The system comprises an HMD 210 and a portable interface device 220, where HMD 210 is substantially similar to HMD 110 of FIG. 1 and portable interface device 220 is substantially similar to portable interface device 120 of FIG. 1. In the illustrated example of FIG. 2, portable interface device 220 is a ring worn on a finger of user 201, although in alternative implementations portable interface device 220 may adopt a different form factor and be worn elsewhere on/by user 201, such as a wristband or armband, or a device that clips, affixes, or otherwise couples to user 201 or to an article of clothing worn by user 201. It is generally advantageous for the actuator (121 in FIG. 1; not visible in FIG. 2) of portable interface device 220 to be inconspicuous and easily accessible to user 201. With portable interface device 220 worn as a ring, user 201 can easily and discreetly activate the actuator using the thumb adjacent the ring. When user 201 activates the actuator, portable interface device 220 wirelessly transmits a signal 250, and a wireless receiver of HMD 210 wirelessly receives signal 250. When HMD 210 receives signal 250 while an eye-tracker of HMD 210 detects that user 201 is gazing at a displayed object that is responsive to a selection operation, the combination of the substantially concurrent gaze and actuation completes the selection operation, and HMD 210 may display a visual effect to user 201 in response.

“FIG. 3 is a flow-diagram showing a method 300 of operating a system in accordance with the present systems, devices, and methods. The system is substantially similar to system 100 of FIG. 1 and includes an HMD (e.g., 110), an eye-tracker (e.g., 117), and a portable interface device (e.g., 120). The portable interface device (120) is physically separate from the HMD (110), is wireless, and, in certain implementations, may be batteryless. Throughout the description of method 300, reference is often made to elements of system 100 of FIG. 1. A person of skill in the art will appreciate that system 100 is cited only as an example and that the methods described herein may be implemented using other systems and devices; the scope of the present systems, devices, and methods should be defined by the appended claims rather than the illustrative examples in this specification. In the description of method 300, references to elements of system 100 of FIG. 1 appear in parentheses to indicate that such references are non-limiting and serve illustrative purposes only.

Method 300 includes three acts 310, 320, and 330, and act 320 includes four sub-acts 321a, 321b, 322, and 323. Those of skill in the art will appreciate that in alternative embodiments certain acts/sub-acts may be omitted and/or additional acts/sub-acts may be added. The illustrated order of the acts/sub-acts is exemplary only and may change in alternative embodiments.

“At 310, at least one display (111) of the HMD (110) displays an object (115) within a field of view of at least one eye of the user. The object (115) is responsive to a selection operation performed by the user and may include, without limitation, a menu item, a graphical button, a keyboard key, a notification, one of multiple objects displayed by the at least one display of the HMD (110), a file, a folder, and/or an alphanumeric character.

“At 320, the system (100) receives a selection operation from the user. As previously described, the selection operation performed by the user may include two substantially concurrent portions: i) the user gazing at the at least one object (115) displayed on the at least one display (111) per 310, and ii) the user activating at least one actuator (121) of the portable interface device (120). The selection operation therefore involves multiple inputs in different modes and communication among different components of the system (100). Act 320 includes sub-acts 321a, 321b, 322, and 323, which collectively define the receipt of the selection operation by and among the components of the system (100).

“At 321a, the eye-tracker (117) of the system (100) detects that the user is gazing at the object (115) displayed at 310. Sub-act 321a provides a first portion of act 320: the first portion of the selection operation, corresponding to the user gazing at the object (115).

“At 321b, the portable interface device (120) of the system (100) receives a second portion of the selection operation: an activation, effected by the user, of at least one actuator (121). The user may activate the portable interface device (120) by activating the at least one actuator (121). Sub-acts 321a and 321b are linked by a horizontal line in FIG. 3 to indicate that they are substantially concurrent (i.e., sub-acts 321a and 321b are performed substantially simultaneously): the user generally activates the actuator (121) per 321b while gazing at the object (115), as determined by the eye-tracker (117) per 321a. This combination of substantially concurrent actions (i.e., gazing and actuating) is an example of a multi-modal selection operation in accordance with the present systems, devices, and methods.

“At 322, a transmitter or signal generator (122) of the portable interface device (120) wirelessly transmits a signal (150) in response to the at least one actuator (121) being activated at 321b.

“At 323, the receiver (116) of the HMD (110) wirelessly receives the signal (150) wirelessly transmitted at 322. Sub-act 321a provided the first portion of act 320; the combination of sub-acts 321b, 322, and 323 provides the second portion of act 320. That is, at 320 the system (100) receives, via the portable interface device (120), the second portion of the selection operation: the ‘user activating at least one actuator’ portion. Receipt of the selection operation per method 300 is now complete.

“At 330, at least one display (111) of the HMD (110) displays a visual effect in response to the system (100) receiving the selection operation per 320. The visual effect may include, without limitation: highlighting the object (115), visually changing or modifying the object (115), changing or modifying displayed content elsewhere on the display (111), and/or replacing displayed content based on the object (115) selected by the user.
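
Pulling acts 310 through 330 together, the following hedged sketch models method 300 in code, with ‘substantially simultaneous’ approximated as the actuation arriving within a short window of a gaze fix on the object. The window length and the component interfaces are illustrative assumptions, not specified by the disclosure.

```python
# Hedged sketch of method 300; component interfaces and the concurrency window
# are illustrative assumptions.
import time

GAZE_WINDOW_S = 0.3  # assumed bound on "substantially simultaneous"

class Method300:
    def __init__(self, display, eye_tracker, receiver):
        self.display = display          # show_object(), show_visual_effect(obj)
        self.eye_tracker = eye_tracker  # last_fix() -> (object_or_None, timestamp)
        self.receiver = receiver        # poll_signal() -> bool (signal 150 seen?)

    def run_once(self):
        obj = self.display.show_object()                      # act 310
        while True:
            gazed_at, gaze_t = self.eye_tracker.last_fix()    # sub-act 321a
            signal_received = self.receiver.poll_signal()     # sub-acts 321b-323
            recent = time.monotonic() - gaze_t < GAZE_WINDOW_S
            if signal_received and gazed_at is obj and recent:  # act 320 done
                self.display.show_visual_effect(obj)          # act 330
                return obj
```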

“The HMD (110) of the system (100) may include a processor (112) communicatively coupled to the at least one display (111) and a non-transitory processor-readable storage medium or memory (113) communicatively coupled to the processor (112). The memory (113) may store processor-executable instructions and/or data (114) that, when executed by the processor (112), cause: i) the at least one display (111) of the HMD (110) to display the object (115) per act 310, and ii) the at least one display (111) of the HMD (110) to display the visual effect to the user per act 330 in response to the system (100) receiving the selection operation from the user per act 320.

“As described in the context of system 100 of FIG. 1, the portable interface device (120) of the system (100) may be implemented in a variety of different ways, and the nature of the portable interface device (120) may influence the details of sub-acts 321b, 322, and 323 of method 300.

“In a first implementation, the portable interface device (120) may be a wireless and batteryless portable interface device that includes a piezoelectric element communicatively coupled to the actuator (121) and a radio frequency antenna (122) communicatively coupled to the piezoelectric element and tuned to wirelessly transmit radio frequency signals (150) at a specific frequency or within a specific range of frequencies, while the receiver (116) of the HMD (110) may include a radio frequency receiver tuned to wirelessly receive radio frequency signals (150) at the specific frequency or within the specific range of frequencies. In this first implementation, at sub-act 321b of method 300 the portable interface device (120) may receive a mechanical actuation, effected by the user via the actuator (121), of the piezoelectric element, and the piezoelectric element may generate an electric signal in response to the mechanical actuation. At sub-act 322, this electric signal may be communicatively routed to the radio frequency antenna (122), which wirelessly transmits the radio frequency signal (150). At sub-act 323, the radio frequency receiver (116) of the HMD (110) may wirelessly receive the radio frequency signal (150).

“In a second implementation, the portable interface device (120) may be a wireless and batteryless portable interface device that includes a mechanical resonator (122) physically coupled to the actuator (121) and tuned to wirelessly transmit a sonic, acoustic, or aural signal (150), such as an ultrasonic signal, at a specific frequency or within a specific range of frequencies, while the receiver (116) of the HMD (110) may include a microphone and/or a piezoelectric element tuned to be responsive to sonic signals (150) at the specific frequency or within the specific range of frequencies. In this second implementation, at sub-act 321b of method 300 the portable interface device (120) may receive a mechanical actuation, effected by the user via the actuator (121), of the mechanical resonator (122). The mechanical resonator (122) may generate the sonic signal (150) in response to the mechanical actuation, whereby the sonic signal (150) is wirelessly transmitted per sub-act 322 of method 300. At sub-act 323, the HMD (110) may wirelessly receive the sonic signal (150) via the tuned microphone and/or piezoelectric element of the receiver (116).

The multi-modal interface described herein (comprising an eye-tracker and an actuator or button in wireless communication with a wearable heads-up display) enables sophisticated control of, and/or interaction with, content displayed on the wearable heads-up display in a discreet, substantially hands-free fashion and with a minimal form factor. The control interactions are sophisticated yet discreet at least in part because the eye-tracker data allows the eyes to perform the pointing and identification tasks that can be cumbersome with hand-controlled interfaces. The eye, or more specifically the gaze thereof, can scan and hone in on specific aspects of displayed content more quickly than a finger- or hand-controlled cursor. However, specifying a selection with the eye alone can prove cumbersome because: a) the user is likely to inadvertently gaze at something that is not intended to be selected, and b) the methods available for doing so, such as a deliberate blink or a prolonged dwell time, are slow and unnatural for the eye to perform. The present systems, devices, and methods take advantage of the versatility and scanning/honing capabilities of the eye/gaze but avoid these specification/selection issues by employing a second input mode, a simple wearable actuator such as a ring-based button, to actuate the specification/selection function. Although the wearable actuator cannot perform sophisticated control interactions on its own, combined with the eye-tracker it provides the best aspects of both interface modes. Like the eye-tracker, the wearable actuator is discreet and virtually hands-free because it does not need to be carried or held in the hand of the user. If a ring-based button is worn on the user's index finger, for example, the user can activate the button simply by using the adjacent thumb, and this action may be performed even while the hand is engaged in another task, such as carrying something or handling paperwork. The wearable actuator is also exceptionally simple and compact, which enables very minimal form factors.

“The various embodiments described herein provide a multi-modal, portable, and fully wearable interface that enables a user to perform sophisticated interactions with content displayed on a wearable heads-up display in an inconspicuous, substantially hands-free fashion. In implementations in which the eye-tracker component is integrated into the wearable heads-up display (even integrated with the projection elements of the wearable heads-up display, as described in the aforementioned U.S. provisional patent applications), a user wearing the multi-modal interface described herein may be substantially indistinguishable from a user wearing conventional eyeglasses and a conventional ring. This is possible at least in part because of: the compact size of the wearable heads-up display, the compact size of the wearable (e.g., ring-based) actuator, the integration of the eye-tracker into the wearable heads-up display, and/or the wireless communication between the wearable actuator and the wearable heads-up display.

“Throughout this specification and the appended claims, infinitive verb forms are often used. Examples include, without limitation: ‘to detect,’ ‘to provide,’ ‘to transmit,’ ‘to communicate,’ ‘to process,’ ‘to route,’ and the like. Unless the specific context requires otherwise, such infinitive verb forms are used in an open, inclusive sense, that is, as ‘to, at least, detect,’ ‘to, at least, provide,’ ‘to, at least, transmit,’ and so on.”

The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as those skilled in the relevant art will recognize. The teachings provided herein can be applied to other portable and/or wearable electronic devices, not only the exemplary devices generally described above.

“For example, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. Insofar as such block diagrams, schematics, and examples contain one or more functions and/or operations, those skilled in the art will understand that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present subject matter may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware is well within the skill of one of ordinary skill in the art in light of this disclosure.

“When logic is implemented as software and stored in memory, logic or information can be stored on any processor-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a processor-readable medium, that is, an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or information can be embodied in any processor-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch instructions from the medium and execute the instructions associated with the logic and/or information.

“In the context of this specification, a ‘non-transitory processor-readable medium’ can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The processor-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, or infrared system, apparatus, or device. More specific examples of a processor-readable medium include a portable computer diskette (magnetic), a compact flash card, a secure digital card, and other portable media.

“The various embodiments described above can be combined to provide further embodiments. To the extent that they are not inconsistent with the specific teachings and definitions herein, the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications, and non-patent publications referred to in this specification that are owned by Thalmic Labs Inc., including U.S. Non-Provisional Patent Application Ser. No. 15/282,535, U.S. Provisional Patent Application Ser. No. 62/236,060, U.S. Patent Application Publication 2015-0205134, U.S. Provisional Patent Application Ser. No. 62/117,316, and U.S. Non-Provisional Patent Application Ser. Nos. 15/167,458 and 15/167,472, are incorporated herein by reference in their entirety. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits, and concepts of the various patents, applications, and publications to provide yet further embodiments.

“These and other changes can be made to the embodiments in light of the above detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.

Summary for “Systems, devices and methods to interact with content on head-mounted displays”

“Technical Field”

“The present systems and devices and methods relate generally to interfacing with content displayed on head mounted displays. They particularly relate to a multi input interface that combines eye track with a wireless portable interface device.

“Description of Related Art”

“Wearable Electronic Devices.”

“Electronic devices are now commonplace in most parts of the world. Electronic devices are now small enough to be easily carried thanks to advances in integrated circuit technology. These electronic devices are ‘portable’. These electronic devices can include on-board power supplies, such as batteries or other power storage system. They may also be ‘wireless? This means that they are designed to work without wire-connections to non-portable electronic devices. However, small, lightweight electronic devices can still be considered portable. A microphone, for example, can be considered portable electronic devices regardless of whether it is wirelessly connected or wired.

“The portability of electronic devices has created a large industry. There are many types of portable electronic devices, including smartphones, ebook readers, laptop computers and tablet computers. The convenience of carrying a portable electronic device around has brought with it the inconvenience of having your hand(s) encumbered. An electronic device can now be worn, and is portable.

A wearable electronic device can be any electronic device that the user can hold onto without having to physically grasp, grip, or clutch it. A strap, band, or band may attach a wearable device to the user. An adhesive clip, pin and clasp, article of clothing, tension, elastic support, interference fit, or other form can also be used. Wearable electronic devices can include electronic wristwatches and electronic armbands, electronic rings or electronic ankle-bracelets. Head-mounted electronic display units, hearing aids and so forth.

Wearable electronic devices are designed to be worn on the body, visible to others and for long periods of times. Form factor (i.e. size, geometry and appearance) is an important design consideration.

“Head-Mounted Displays”

“Human-Electronics Interfaces and Devices.”

A human-electronics interfacing facilitates communication between a person and one or more electronic devices. A human-electronics interfacing device is generally one or more electronic devices that: a) sense inputs from the user and convert them into electrical signals that can then be processed or acted on by the one(s) or more electronics device(s); and/or, b) provide sensory inputs to the user (e.g., usually visual, auditory and/or tactile) from the one(s), where the user can sense the outputs and interpret some information contained in the outputs. An interface between human and electronic devices can be either one-directional or bidirectional. A complete interface might use multiple interface devices. The computer mouse, for example, is a one way interface device. It detects inputs from a user and converts them into electrical signals that can then be processed by the computer. The computer’s monitor or display is a two-way interface device. It provides visual outputs that the user can use to understand the information. The computer mouse and the display form a bidirectional human/computer interface (?HCI?) An example of a human-electronics interfacing is the HCI.

A wearable electronic device can function as an interface device when it: a) contains sensors that detect inputs from a user and b) transmits signals based on those inputs to another electronic device. There are many types of input-types and sensor-types. These include tactile sensors (e.g. buttons, switches or keys) that provide manual control, acoustic devices that provide voice-control, electromyography and accelerometers that provide gestural control and/or electromyography sensors that provide gestural control.

“Interacting With Head-Mounted Displays

Display screens in portable electronic devices require that the user use their hands to hold the device and/or orient it so that they can see, access, get feedback from, or interact with the display screen. The inconvenience of having to hold the device in one’s hands can make it difficult for the user to use the device, and/or interact with their surroundings while using the portable electronic devices. This problem can be overcome by making the display screen of a portable electronic device, like the head-mounted displays, wearable. The user can view, access and/or get feedback from the display screen by making it wearable. Head-mounted displays have gained acceptance in recent years. There are a few recently introduced head mounted display devices that could be widely adopted by consumers.

Interfaces that allow us to interact with content on head-mounted displays are not satisfactory. The challenge with wearable heads-up displays is to provide sophisticated control capabilities while maintaining a discreet, minimally hands-free interface mechanism that has a minimal form factor. There are many options, including voice control, touch pads on the frames of wearable head-up displays, and a remote that can be tethered via wired connection to the display. However, none of these solutions meet all the criteria above. To make wearable heads-up displays more popular, it is necessary to overcome the technical challenges of interfacing with their content.

A system that allows interaction with content displayed on head-mounted displays may include: at most one display that is positioned in the field of view for at least a single eye of a user when the display is mounted on their head; a processor communicatively connected to the processor; a nontransitory processor readable storage medium communicatively linked to the processor. The processor contains processor-executable instructions or data that are executable by the processor. An eye-tracker that detects that the user is looking at least 1 actuator A substantially simultaneous combination of gazing at at least one object on the display and activating at least one actuator of the portable device may be used to select the object. A visual effect can be displayed on at least one of the displays mounted to the head-mounted display in response to the user’s selection operation. Non-transitory processor readable storage media may also store processor-executable instruction and/or data. These instructions, when executed by processors in response to wireless receiver wirelessly receiving signal from the portable device, will cause processor to: Request current gaze direction information from the eye tracker; identify an object at which the user gazes based on current gaze data received from eye-tracker; cause at least one display display to display the visual effects on that object.

The portable interface device can be battery-powered and may include a piezoelectric component communicatively connected to the actuator and an antenna communicatively linked to the piezoelectric elements. A radio frequency receiver may be included in the wireless receiver of the head mounted display. The actuator can mechanically activate the piezoelectric elements when activated by the user. The piezoelectric element can generate an electrical signal in response to mechanical actuation. An antenna can transmit a radio frequency message in response to an electric signal generated by the piezoelectric elements.

The portable interface device can be battery-free and may also include a mechanical resonance that is physically coupled to it. A wireless receiver for the head-mounted display could include at least one of the following: a microphone, and/or a piezoelectric component tuned to respond to a sonic sound. The actuator can mechanically activate the mechanical resonator when it is activated by the user. The mechanical resonator can generate the sonic sound in response to mechanical actuation. An ultrasonic signal can be included in the sonic sound.

“The object can include at least one object from a group consisting: a menu item; a graphical button; a keyboard key; a notification; one of multiple objects displayed on the at-least one display of the head mounted display; a file, folder and an alphanumeric characters. A wearable device that is part of the portable interface device could include one of three types: a ring or a wristband. A button may be included in the actuator of the portable device interface device. The eye-tracker can be carried and physically connected to the head-mounted display.

A method for operating a system that includes a head mounted display, an eye tracker and a wireless interface device. The portable interface devices can wirelessly transmit a signal to the portable device. Wirelessly receiving the signal from the wireless receiver of head-mounted displays, the portable device can then display a visual effect. A processor and a nontransitory processor readable storage medium may be included in the head-mounted display. Non-transitory processing-readable data storage mediums may contain processor-executable instruction and/or information. The method may include the execution by the processor of processor-executable data and/or instructions within the field view of at most one eye of the user to cause the at-least one display on the head-mounted screen to display the object; and at least one display on the head-mounted monitor to display the visual effect as a result of the system receiving the selection command from the user.

The portable interface device can be batteryless, and may include a communicatively-coupled actuator and an antenna communicatively coupling to the piezoelectric elements. A radio frequency receiver may also be included in the wireless receiver of a head-mounted display. This configuration allows the user to activate the portable device’s actuator by activating the device. The portable device can also receive a mechanical actuation by the piezoelectric elements from the user. The piezoelectric element can generate an electric signal in response to the mechanical action. The portable interface device can wirelessly transmit a signal. This may be accomplished by transmitting a radiofrequency signal using the antenna. The wireless reception of the signal may be accomplished by the wireless receiver attached to the head-mounted LCD.

The portable interface device can be batteryless, and may also include a mechanical resonance physically coupled to it. This configuration may allow the portable device to receive from the user an activation of an actuator. It could also include the device receiving a mechanical actuation from the mechanical resonator. The mechanical resonator can generate a sound signal in response to the mechanical actuator. The portable interface device can wirelessly transmit a signal. This could include transmitting the sonic signals by the mechanical resonator. The wireless reception of the signal may be accomplished by wirelessly receiving it by a wireless receiver attached to the head mounted display. This wireless receiver can include at least one of the following: a microphone or a piezoelectric element that is tuned to respond to the sonic signals. An ultrasonic signal may be included in the sonic signal generated from the mechanical resonator on the portable interface device.

The portable interface device could include an onboard power source and a radio frequency transmitter. The portable interface device can wirelessly transmit a signal using the radio frequency transmitter. This radio signal has a frequency range of 10 MHz-10 GHz. The wireless reception of the signal may be accomplished by the wireless receiver attached to the head mounted display.

A wearable human-electronics interface could be described as: a wearable head-up display with a wireless receiver, an eye-tracker that is carried by the wearable head-up display; and a wearable actuator that transmits wireless signals to the wireless receiver. A processor may be added to the wearable head-up display to communicate with both the wireless receiver and the eye tracker. This processor will enable the processor to interact with the content displayed on the wearable head-up display in response both to the concurrent inputs from the eye-tracker as well as the wearable actuator.

“BRIEF DESCRIPTION ABOUT THE VIEWS FROM THE DRAWINGS”

“Identical reference numbers are used to identify elements or acts that are similar in the drawings. Drawings may not be drawn to scale due to the relative sizes and positions of elements. The shapes and angles of elements and angles are not always drawn to scale. Some elements have been arbitrarily enlarged to make it easier to read. The particular shapes of elements drawn do not convey information about the exact shape of the elements and were chosen for their ease of recognition in drawings.

“FIG. 1 is an illustrative diagram of a system that includes a head-mounted display with an eye tracker and a wireless portable interface device in accordance with the present systems, devices, and methods.”

“FIG. 2 is an illustrative diagram of a human-electronics interface in which a user wears a head-mounted display and a portable interface device in accordance with the present systems, devices, and methods.”

“FIG. 3 is a flow-diagram showing a method of operating a system that includes a head-mounted display and a portable interface device in accordance with the present systems, devices, and methods.”

“The following description sets forth specific details in order to provide a thorough understanding of the various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with head-mounted displays and electronic devices have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.”

“Unless the context requires otherwise, throughout the specification and claims that follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.””

“Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment, and the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.”

“As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its broadest sense, that is, as meaning “and/or,” unless the content clearly dictates otherwise.”

“The headings and Abstract of the disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.”

“The various embodiments described herein provide systems, devices, and methods for interfacing with content displayed on head-mounted displays. The interface has a minimal form factor yet enables sophisticated control interactions to be carried out in a discreet, substantially hands-free manner. This is accomplished with a multi-modal, fully wearable interface that combines substantially concurrent inputs from an eye tracker and a wireless portable interface device.”

“FIG. 1 is an illustrative diagram of a system 100 in accordance with the present systems and devices. System 100 includes a head-mounted display (HMD) 110 with at least one display 111 (two such displays are illustrated in FIG. 1) positioned within the user's field of view when HMD 110 is worn on the user's head. Display(s) 111 may employ one or more waveguide(s), one or more microdisplay(s), and/or any or all of the display technologies described in U.S. Patent Application Publication 2015-0205134, U.S. Provisional Patent Application Ser. No. 62/117,316, and the other related U.S. patent applications referenced therein. HMD 110 also includes a processor 112 communicatively coupled to the at least one display 111, and a non-transitory processor-readable storage medium or memory 113 communicatively coupled to processor 112. Memory 113 stores processor-executable instructions and/or data 114 that, when executed by processor 112 of HMD 110, cause HMD 110 to carry out the operations described herein.”

“HMD 110 also includes a receiver 116, e.g., a wireless receiver or wireless transceiver, operable to wirelessly receive signals. Receiver 116 is communicatively coupled to processor 112.”

“System 100 also includes an eye tracker 117 to determine the eye position and/or gaze direction of the user; eye tracker 117 may be communicatively coupled to processor 112. Eye tracker 117 includes at least one camera or photodetector to measure light (e.g., visible light or infrared light) reflected from at least one eye of the user, and processor 112 may use the measured reflections to determine the eye position or gaze direction. Eye tracker 117 may implement the technology described in U.S. patent application Ser. Nos. 15/167,458 and 15/167,472. For the purposes of the present systems, devices, and methods, eye tracker 117 detects whether the user is looking at (e.g., gazing at, or generally pointing the eye in the direction of) at least one object 115 displayed by the at least one display 111. In the illustrated example of FIG. 1, eye tracker 117 is physically carried by HMD 110, although in other implementations the eye tracker may not be physically carried by HMD 110.”
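For illustration only, the step of determining which displayed object the user is gazing at can be sketched as a simple two-dimensional hit test. The patent text does not specify any implementation; the names below (DisplayObject, hit_test) are hypothetical, and the sketch assumes the eye tracker reports a gaze point in display coordinates.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class DisplayObject:
        """A selectable displayed object (e.g., object 115) and its screen rectangle."""
        name: str
        x: float        # left edge, in display coordinates
        y: float        # top edge, in display coordinates
        width: float
        height: float

        def contains(self, gx: float, gy: float) -> bool:
            return (self.x <= gx <= self.x + self.width
                    and self.y <= gy <= self.y + self.height)

    def hit_test(gaze_x: float, gaze_y: float,
                 objects: List[DisplayObject]) -> Optional[DisplayObject]:
        """Return the displayed object containing the gaze point, if any."""
        for obj in objects:
            if obj.contains(gaze_x, gaze_y):
                return obj
        return None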

“System 100 provides a multi-modal interface for interacting with content displayed on a head-mounted display. Eye tracker 117 realizes a first mode of interaction, i.e., via eye position and/or gaze direction. System 100 also includes a wireless portable interface device 120 carried by the user. In the illustrated example of FIG. 1, portable interface device 120 is sized in the form of a ring worn on a finger or thumb of the user, although in other implementations portable interface device 120 may be worn on the wrist or arm, or may adopt a non-annular form factor that clips, sticks, or otherwise attaches to the user (e.g., a pen with a clip). Portable interface device 120 is physically separate from HMD 110 and includes at least one actuator 121 (e.g., a button, toggle, lever, dial, or knob) that, when activated by the user, causes portable interface device 120 to wirelessly transmit a signal from a wireless signal generator 122.”

“Portable interface device 120 may include a portable power source such as a battery or a supercapacitor (i.e., a capacitor with a capacitance of 0.01 F or more). Alternatively, portable interface device 120 may be “batteryless.” Throughout this specification and the appended claims, the term “batteryless” literally means “without any battery or batteries” (or any equivalent device providing a comparable function, such as a supercapacitor) and is used to indicate that the corresponding device, e.g., portable interface device 120, has no on-board source of stored power.”

“Portable interface device 120 is generally referred to herein as a wireless device. Throughout this specification and the appended claims, the term “wireless” literally means “without any external wire-connections to anything” and is used to indicate that the corresponding device is not tethered by any external wire-connections (or optical fiber connections, or cable connections, etc.) to any other electronic device or to any source of electrical power. Thus, if portable interface device 120 is both wireless and batteryless, then portable interface device 120 is, in the absence of any actuation (described in more detail below), typically without electric power.”

“Wearable electronic devices are usually larger and bulkier than corresponding traditional jewelry and accessories, in large part because wearable electronic devices often require large and bulky components such as an on-board battery. In the present systems, devices, and methods, portable interface device 120 is wireless and may be batteryless, which permits the removal of large and bulky electric components (e.g., a battery, and a charging port if batteryless) and enables a compact, minimal form factor that is uncommon among wearable electronic devices. Even when batteryless, portable interface device 120 can still generate electric signals through mechanical actuation of, for example, one or more on-board piezoelectric components.”

“In the illustrated example of FIG. 1, portable interface device 120 includes only one actuator, a “button” 121. Other implementations may include a second and/or third actuator, but in general portable interface device 120 includes very few actuators in order to minimize its size. Actuator 121 may provide a “select” function in relation to whatever content displayed by display 111 of HMD 110 the user is gazing at, as determined by eye tracker 117 and processor 112. Memory 113 of HMD 110 stores processor-executable instructions and/or data 114 that, when executed by processor 112 of HMD 110, cause at least one display 111 to display at least one object 115 that is responsive to a selection operation by the user. In accordance with the present systems, devices, and methods, the selection operation may consist of a substantially concurrent combination of gazing at the at least one object 115 displayed by the at least one display 111 (as detected by eye tracker 117) and activating the at least one actuator 121 of portable interface device 120. HMD 110, e.g., processor 112 of HMD 110, may effect the selection operation in response to receiving a wireless “selection signal” 150 at receiver 116, transmitted from wireless signal generator 122 of portable interface device 120. The selection operation may include “selecting” the object 115 that eye tracker 117 determines the user is gazing at when the wireless selection signal 150 is received at receiver 116. To this end, when wireless receiver 116 of HMD 110 receives wireless signal 150, processor 112 executes processor-executable instructions and/or data 114 stored in memory 113 that cause processor 112 to: i) request current gaze direction data from eye tracker 117; and ii) identify, based on the current gaze direction data received from eye tracker 117, the object 115 at which the user is gazing.”
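As a minimal sketch of the processor-side logic just described (not the patent's actual implementation), the selection operation can be expressed as an event handler that runs when the wireless selection signal arrives: it queries the eye tracker for current gaze data and then identifies the gazed-at object. The eye-tracker interface (current_gaze) is an assumption, and hit_test refers to the earlier sketch.

    class SelectionHandler:
        """Hypothetical sketch of the selection logic: on receipt of wireless
        signal 150, sample the eye tracker and select the gazed-at object."""

        def __init__(self, eye_tracker, objects):
            self.eye_tracker = eye_tracker  # assumed to expose current_gaze() -> (x, y)
            self.objects = objects          # list of DisplayObject (earlier sketch)

        def on_selection_signal(self):
            # i) request current gaze data from the eye tracker
            gaze_x, gaze_y = self.eye_tracker.current_gaze()
            # ii) identify the object the user is gazing at
            target = hit_test(gaze_x, gaze_y, self.objects)
            if target is not None:
                self.select(target)

        def select(self, obj):
            # Placeholder for the responsive visual effect (e.g., highlighting).
            print(f"Selected: {obj.name}")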

In response to the user's selection operation, a visual effect may be displayed on the at least one display 111 of HMD 110. The visual effect may include, for example, highlighting object 115 or otherwise visually altering or modifying object 115. Depending on the application and/or the interface, object 115 may be any displayed image, such as a menu item, a graphical button, a keyboard key, a notification, one of multiple objects displayed by the at least one display 111 of HMD 110, a file, a folder, and/or an alphanumeric character. In the illustrated example of FIG. 1, display 111 shows a representation of a virtual keyboard, and the specific object 115 that the user selects corresponds to a particular key (i.e., letter) on that keyboard. In a typical QWERTY keyboard configuration, the selected key/letter is the letter “T.”

“As an example application, system 100 may enable the user to type by: i) displaying a virtual keyboard on the at least one display 111; and, over multiple instances: ii) detecting, via eye tracker 117, which letter the user is gazing at; and iii) selecting that letter when the user activates actuator 121 of portable interface device 120.”
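A sketch of that typing loop, under the same hypothetical interfaces as the earlier sketches (a button object exposing a non-blocking was_pressed() check is an additional assumption), might look like the following:

    def typing_loop(eye_tracker, button, keyboard_keys):
        """Hypothetical typing loop: each button press selects whichever
        virtual key the user is gazing at; loops forever for illustration."""
        typed = []
        while True:
            if button.was_pressed():                 # actuator 121 activated
                gx, gy = eye_tracker.current_gaze()  # gaze per eye tracker 117
                key = hit_test(gx, gy, keyboard_keys)
                if key is not None:
                    typed.append(key.name)           # e.g., "T"
                    print("".join(typed))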

“As discussed previously, the form factor of a wearable electronic device is an important consideration that can determine whether or not the device is adopted by users. The present systems, devices, and methods provide a portable interface device 120 for interacting with content displayed on HMD 110 that, by virtue of its minimal bulk, can achieve a form factor comparable to that of traditional jewelry and accessories. Portable interface device 120 communicates with HMD 110 wirelessly and may even be batteryless.”

Even when wireless and batteryless, portable interface device 120 can still wirelessly transmit signals to HMD 110. The present systems, devices, and methods provide two example configurations that enable portable interface device 120 to wirelessly transmit signals despite being both wireless and batteryless.

“As a first example, portable interface device 120 may include a piezoelectric element communicatively coupled to actuator 121, and a radio frequency antenna 122 communicatively coupled to the piezoelectric element. When actuator 121 is activated by the user (e.g., pressed, depressed, switched, or similar), actuator 121 mechanically actuates the piezoelectric element. In response to the mechanical actuation, the piezoelectric element generates an electric signal that is communicatively coupled to radio frequency antenna 122, whereupon antenna 122 wirelessly transmits a radio frequency (or microwave frequency) signal 150. Antenna 122, and any associated circuitry, may be tuned to wirelessly transmit signal 150 at a specific frequency or within a specific range of frequencies. In this configuration, receiver 116 of HMD 110 includes a radio frequency (or microwave) receiver tuned to receive signals at the specific frequency, or within the specific range of frequencies, of signal 150 wirelessly transmitted by antenna 122 of portable interface device 120.”

“As a second example, portable interface device 120 may include a mechanical resonator 122 physically coupled to actuator 121. When actuator 121 is activated by the user (e.g., pressed, depressed, switched, or similar), actuator 121 mechanically actuates (e.g., strikes, impacts, oscillates, or vibrates) mechanical resonator 122. In response to the mechanical actuation, mechanical resonator 122 generates a sonic, acoustic, or aural signal 150, which may be an ultrasonic signal, at a specific frequency or within a specific range of frequencies. In this configuration, receiver 116 of HMD 110 includes a microphone and/or a piezoelectric element tuned to respond to sonic signals at the specific frequency, or within the specific range of frequencies, of signal 150 wirelessly transmitted by mechanical resonator 122 of portable interface device 120.”
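On the HMD side, “tuned to respond to sonic signals at a specific frequency” could plausibly be realized in software with a single-bin frequency detector such as the Goertzel algorithm, although the patent does not specify any such implementation. In the sketch below, the 96 kHz sample rate, 25 kHz resonator frequency, and detection threshold are all illustrative assumptions.

    import math

    def goertzel_power(samples, sample_rate, target_freq):
        """Goertzel algorithm: power of a single frequency bin in a block
        of microphone samples (one way to detect a fixed-frequency tone)."""
        n = len(samples)
        k = round(n * target_freq / sample_rate)
        coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
        s_prev, s_prev2 = 0.0, 0.0
        for x in samples:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

    def detect_selection_signal(samples, sample_rate=96_000,
                                resonator_freq=25_000.0, threshold=1e3):
        """Return True if the sample block contains the resonator's tone.
        Frequency and threshold values are illustrative, not from the patent."""
        return goertzel_power(samples, sample_rate, resonator_freq) > threshold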

“Thus, in the present systems, devices, and methods, the wirelessly transmitted signal 150 may take a variety of forms, including radio frequency signals, sonic or ultrasonic signals, and optical or photonic signals. Throughout this specification and the appended claims, a signal that is “wirelessly transmitted” is generally any signal that is transmitted without a conductive (e.g., wired) medium.”

“FIG. 2 is an illustrative diagram of a human-electronics interface 200 in which a user 201 wears a system that enables interaction with displayed content in accordance with the present systems, devices, and methods. The system includes an HMD 210 and a portable interface device 220; HMD 210 is substantially similar to HMD 110 of FIG. 1, and portable interface device 220 is substantially similar to portable interface device 120 of FIG. 1. In the illustrated example of FIG. 2, portable interface device 220 is worn as a ring on a finger of user 201, though alternative implementations may adopt a different form factor and be worn elsewhere on/by user 201, such as on a wristband or armband, or as a device that clips, affixes, or otherwise couples to user 201 or to an article of clothing worn by user 201. It is generally advantageous for the actuator (121 in FIG. 1; not visible in FIG. 2) of portable interface device 220 to be inconspicuous and easily accessible to user 201. With the ring form factor, user 201 can easily activate the actuator of portable interface device 220 using the thumb adjacent to the ring. When user 201 activates the actuator, portable interface device 220 wirelessly transmits a signal 250, and a receiver of HMD 210 wirelessly receives signal 250. If HMD 210 receives signal 250 while the eye tracker of HMD 210 detects that user 201 is gazing at a displayed object that is responsive to a selection operation, then the combination of receiving signal 250 and the substantially concurrent gaze of user 201 effects the selection operation. HMD 210 may display a visual effect to user 201 in response to the selection operation.”

“FIG. 3 is a flow-diagram showing a method 300 of operating a system in accordance with the present systems, devices, and methods. The system is substantially similar to system 100 of FIG. 1 and includes an HMD (e.g., 110), an eye tracker (e.g., 117), and a portable interface device (e.g., 120) that is physically separate from the HMD. The portable interface device is wireless and, in some implementations, may be batteryless. Throughout the description of method 300, reference is often made to elements of system 100 of FIG. 1. A person of skill in the art will appreciate that system 100 is cited only as an example and that the methods described herein may be implemented using other systems and devices; the scope of the present systems, devices, and methods should be defined by the appended claims rather than by the illustrative examples described in this specification. References to elements of system 100 of FIG. 1 appear in parentheses throughout the description of method 300 to indicate the non-limiting, illustrative nature of such references.”

Method 300 includes three acts 310, 320, and 330, and act 320 includes four sub-acts 321a, 321b, 322, and 323. Those of skill in the art will appreciate that in alternative embodiments certain acts/sub-acts may be omitted and/or additional acts/sub-acts may be added. The illustrated order of the acts/sub-acts is shown for exemplary purposes only and may change in alternative embodiments.

“At 310, at least one display (111) of the HMD (110) displays an object (115) within a field of view of at least one eye of the user. The object (115) is responsive to a selection operation by the user and may include, without limitation, a menu item, a graphical button, a keyboard key, a notification, one of multiple objects displayed by the at least one display (111) of the HMD (110), a file, a folder, and/or an alphanumeric character.”

“At 320, the system (100) receives a selection operation from the user. The selection operation by the user may include two substantially concurrent portions: i) the user gazing at the at least one object (115) displayed on the at least one display (111) per 310; and ii) the user activating at least one actuator (121) of the portable interface device (120). Receiving the selection operation therefore involves multiple inputs in different modes and communication among different components of the system (100). Act 320 includes sub-acts 321a, 321b, 322, and 323, which collectively define the reception of the selection operation by and among the components of the system (100).”

“At 321a, the eye tracker (117) of the system (100) detects that the user is gazing at the object (115) displayed at 310. In this way, the system (100) receives, via the eye tracker (117), a first portion of the selection operation: the portion corresponding to the user gazing at the object (115).”

“At 321b, the portable interface device (120) of the system (100) receives, from the user, a second portion of the selection operation: an activation of the at least one actuator (121). The user may activate the portable interface device (120) by activating the at least one actuator (121). Sub-acts 321a and 321b are linked by a horizontal line in FIG. 3 to indicate that they are substantially concurrent (i.e., performed substantially simultaneously): the user generally activates the actuator (121) per 321b while gazing at the object (115), as determined by the eye tracker (117) per 321a. This combination of user actions is an example of a multi-modal selection operation in accordance with the present systems, devices, and methods.”

“At 322, a transmitter or signal generator (122) of the portable interface device (120) wirelessly transmits a signal (150) in response to the at least one actuator (121) being activated at 321b.”

“At 323, the receiver (116) of the HMD (110) wirelessly receives the signal (150) wirelessly transmitted at 322. Whereas sub-act 321a provided the first portion of act 320, the combination of sub-acts 321b, 322, and 323 provides the second portion of act 320: at 323, the system (100) receives, via the portable interface device (120), the second portion of the selection operation, i.e., the portion corresponding to the user activating an actuator (121) of the portable interface device (120). With both portions received, the selection operation of method 300 is complete.”

“At 330, the at least one display (111) of the HMD (110) displays a visual effect in response to receiving the selection operation per 320. The visual effect may include, without limitation: highlighting the object (115), visually changing or modifying the object (115), changing displayed content elsewhere on the display (111), and/or replacing displayed content based on the object (115) selected by the user.”

“As previously described, the HMD (110) of the system (100) may include a processor (112) communicatively coupled to the at least one display (111), and a non-transitory processor-readable storage medium or memory (113) communicatively coupled to the processor (112). The memory (113) may store data and/or instructions (i.e., processor-executable instructions 114) that, when executed by the processor (112), cause: i) the at least one display (111) of the HMD (110) to display the object (115) per act 310; and ii) the at least one display (111) of the HMD (110) to display the visual effect to the user per act 330, in response to the system (100) receiving the selection operation from the user per act 320.”
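Tying the acts together, one hypothetical way to express the overall flow of method 300 in code is as a small loop that waits for the two substantially concurrent inputs. The display and receiver interfaces (show, highlight, signal_received) are assumptions, and hit_test is from the earlier sketch.

    def run_method_300(display, eye_tracker, receiver, objects):
        """Hypothetical end-to-end sketch of method 300. Acts:
        310 display the object(s); 320 receive the two-part selection operation
        (gaze per 321a, signal per 321b/322/323); 330 display a visual effect."""
        display.show(objects)                        # act 310
        while True:
            if receiver.signal_received():           # sub-act 323
                gx, gy = eye_tracker.current_gaze()  # sub-act 321a (gaze)
                target = hit_test(gx, gy, objects)
                if target is not None:               # selection operation complete
                    display.highlight(target)        # act 330 (visual effect)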

“As discussed in the context of system 100 of FIG. 1, the portable interface device (120) of the system (100) may be implemented in a variety of different ways, and the nature of the portable interface device (120) may affect the details of sub-acts 321b, 322, and 323 of method 300.”

“In a first implementation, the portable interface device (120) may be a wireless and batteryless portable interface device that includes a piezoelectric element communicatively coupled to the actuator (121), and a radio frequency antenna (122) communicatively coupled to the piezoelectric element and tuned to wirelessly transmit radio frequency signals (150) at a specific frequency or within a specific range of frequencies. In this implementation, the receiver (116) of the HMD (110) may include a radio frequency receiver tuned to wirelessly receive the radio frequency signals (150). At sub-act 321b of method 300, the portable interface device (120) may receive a mechanical actuation of the piezoelectric element from the user, for example by the actuator (121) mechanically actuating the piezoelectric element when activated by the user. In response to the mechanical actuation, the piezoelectric element may generate an electric signal that communicatively couples to the radio frequency antenna (122), causing the antenna (122) to wirelessly transmit the radio frequency signal (150) per sub-act 322. At sub-act 323, the radio frequency receiver (116) of the HMD (110) may wirelessly receive the radio frequency signal (150).”

“In a second implementation, the portable interface device (120) may be a wireless and batteryless portable interface device that includes a mechanical resonator (122) physically coupled to the actuator (121) and tuned to generate a sonic, acoustic, or aural signal (150), such as an ultrasonic signal, at a specific frequency or within a specific range of frequencies. In this implementation, the receiver (116) of the HMD (110) may include a microphone and/or a piezoelectric element tuned to respond to sonic signals (150) at the specific frequency or within the specific range of frequencies. At sub-act 321b of method 300, the portable interface device (120) may receive a mechanical actuation of the mechanical resonator (122) from the user, for example by the actuator (121) mechanically actuating the mechanical resonator (122) when activated by the user. In response to the mechanical actuation, the mechanical resonator (122) may generate the sonic signal (150), which is wirelessly transmitted per sub-act 322 of method 300. At sub-act 323, the HMD (110) may wirelessly receive the sonic signal (150) via the tuned microphone and/or piezoelectric element of the receiver (116).”

The multi-modal interface described herein (comprising an eye tracker and an actuator or button in wireless communication with a wearable heads-up display) enables sophisticated control of, and/or interaction with, content displayed on the wearable heads-up display in a discreet, substantially hands-free manner and with a minimal form factor. The control interactions are sophisticated and discreet at least in part because eye tracking data enables the eyes to perform the pointing and identifying tasks that can be cumbersome with hand-controlled interfaces. The eye, and more specifically the gaze, can scan across and hone in on specific aspects of displayed content much faster than a finger-dragged or hand-controlled cursor. However, specifying a selection with the eye alone can be cumbersome because: a) the user is likely to inadvertently gaze at something that they do not intend to select; and b) the available mechanisms for doing so, such as a deliberate blink or a prolonged dwell time, are unnatural and impractical for the eye to perform. The present systems, devices, and methods take advantage of the versatility and scanning/honing capabilities of the eye/gaze while avoiding these specification/selection issues by employing a second input mode, a simple wearable actuator such as a ring-based button, to actuate the specification/selection function. While the wearable actuator is not well-suited to performing sophisticated control interactions on its own, in combination with the eye tracker it provides the best aspects of both interfaces. Like the eye tracker, the wearable actuator is discreet and virtually hands-free because it does not need to be carried or held in the user's hand. If a ring-based button is worn on the user's index finger, for example, the user can activate the button simply by using the thumb of the same hand, an action that may be performed even while the hand is engaged in another task such as carrying something or handling paperwork. The wearable actuator is also exceptionally simple and compact, which enables very minimal form factors.

“The various embodiments described herein provide a multi-modal, portable, and fully wearable interface that enables a user to perform sophisticated interactions with content displayed on a wearable heads-up display in an inconspicuous, substantially hands-free manner. In implementations in which the eye tracker is integrated into the wearable heads-up display (even integrated with the projection elements of the wearable heads-up display, as described in the U.S. patent applications referenced above), a user wearing the multi-modal interface described herein may be virtually indistinguishable from a user wearing conventional eyeglasses and a conventional ring. This is made possible, at least in part, by: the compact size of the wearable heads-up display; the compact size of the wearable (e.g., ring-based) actuator; integration of the eye tracker into the wearable heads-up display; and/or wireless communication between the wearable actuator and the wearable heads-up display.”

“Throughout this specification and the appended claims, infinitive verb forms are often used. Examples include, without limitation: “to detect,” “to provide,” “to transmit,” “to communicate,” “to process,” “to route,” and the like. Unless the specific context requires otherwise, such infinitive verb forms are used in an open, inclusive sense, that is, as “to, at least, detect,” “to, at least, provide,” “to, at least, transmit,” and so on.”

The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as will be recognized by those skilled in the relevant art. The teachings provided herein can also be applied to other portable and/or wearable electronic devices, not only the exemplary wearable electronic devices generally described above.

“For instance, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. Insofar as such block diagrams, schematics, and examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present subject matter may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of this disclosure.”

“When logic is implemented as software and stored in memory, logic or information can be stored on any processor-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a processor-readable medium: an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or information can be embodied in any processor-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the medium and execute the instructions associated with the logic and/or information.”

“In the context of this specification, a “non-transitory processor-readable medium” can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The processor-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, or infrared system, apparatus, or device. More specific examples include a portable computer diskette (magnetic, compact flash card, secure digital, or the like) and other portable media.”

“The various embodiments described above can be combined to provide further embodiments. To the extent that they are not inconsistent with the specific teachings and definitions herein, the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications, and non-patent publications referred to in this specification that are commonly owned by Thalmic Labs Inc., including but not limited to U.S. Non-Provisional patent application Ser. No. 15/282,535, U.S. Provisional Patent Application Ser. No. 62/236,060, U.S. Patent Application Publication 2015-0205134, U.S. Provisional Patent Application Ser. No. 62/117,316, and U.S. patent application Ser. Nos. 15/167,458 and 15/167,472, are incorporated herein by reference in their entirety. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits, and concepts of the various patents, applications, and publications to provide yet further embodiments.”

“These and other changes can be made to the embodiments in light of the above detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.”
