Invented by Andreas Connellan, Fergal Corcoran, Pierce Brady, Logitech Europe SA

The market for input devices to be used in an augmented/virtual environment is growing rapidly, driven by the increasing popularity of augmented reality (AR) and virtual reality (VR) technologies. These input devices play a crucial role in enhancing the user experience and enabling seamless interaction within immersive environments. AR and VR have gained significant traction across industries including gaming, entertainment, education, healthcare, and the workplace. These technologies offer users a unique and immersive experience by blending the real and virtual worlds or creating entirely virtual environments. To fully immerse users in these environments, input devices are essential: they allow users to interact with virtual elements and manipulate them as if they were real.

The market for input devices in AR/VR environments includes a wide range of products, such as motion controllers, haptic gloves, eye-tracking devices, and brain-computer interfaces. Motion controllers are among the most common. These handheld devices track the user's movements and translate them into corresponding actions within the virtual world, providing a sense of presence and enabling users to interact with virtual objects, navigate virtual spaces, and perform various tasks.

Haptic gloves are another popular input device. These gloves provide tactile feedback, allowing users to feel and touch virtual objects. By incorporating haptic feedback, users experience a more realistic and immersive environment, enhancing the overall user experience.

Eye-tracking devices are gaining traction as well. These devices track the user's eye movements and enable precise targeting and interaction within the virtual environment, making the user interface more intuitive and natural. They allow users to control the virtual environment simply by looking at objects or areas of interest.

Brain-computer interfaces (BCIs) are a cutting-edge input device with immense potential in the AR/VR market. BCIs enable direct communication between the user's brain and the virtual environment, bypassing traditional input devices. This technology has the potential to revolutionize the way we interact with AR/VR environments, offering a more seamless and intuitive experience.

The market for input devices in AR/VR environments is expected to see significant growth in the coming years. According to a report by MarketsandMarkets, the global market for AR and VR in gaming alone is projected to reach $45.09 billion by 2027, indicating the immense potential for input devices in this sector. The increasing adoption of AR/VR technologies across industries, coupled with advancements in input device technology, is driving this growth, and companies are investing heavily in research and development to create innovative, user-friendly input devices that enhance the overall AR/VR experience.

However, challenges such as high costs, technological limitations, and the need for standardization may hinder the market's growth to some extent. The market also faces competition from alternative input methods, such as voice recognition and gesture control, which may affect the demand for traditional input devices.

In conclusion, the market for input devices to be used in an augmented/virtual environment is expanding rapidly, driven by the growing demand for immersive and interactive experiences. Motion controllers, haptic gloves, eye-tracking devices, and brain-computer interfaces are among the key input devices shaping this market. As AR/VR technologies continue to evolve and become more accessible, demand for innovative, user-friendly input devices is expected to rise, opening up new opportunities for companies operating in this space.

The Logitech Europe SA invention works as follows

The first portion of the housing can be linear in shape and designed to be held in the user's hand while the stylus is in use. An end tip of the first portion can be configured to act as an interface between the stylus device and objects in the AR/VR environment. The second portion of the housing can be non-linear and bend through three dimensions, including a section that bends longitudinally back toward a line collinear with the first portion of the housing. The second portion of the housing can include emitters or sensors configured to facilitate tracking of the stylus in three-dimensional space.

Background for Input device to be used in an augmented/virtual environment

Virtual, mixed, or augmented reality can refer to a wide range of computer-simulated immersive environments. These environments can simulate the physical presence of an individual in the real world or in a virtual one. Computer simulations of these environments may include computer-rendered images presented on a graphic display. The display can be set up as a head-mounted display (HMD) that covers all or part of the user's field of view.

A user can interact with the computer-simulated environment by way of a peripheral or user interface device. In many contemporary AR/VR systems, the most common controller is the pistol grip, which, depending on the system, can be operated with up to six degrees of freedom (DOF). With it, users can perform complex interface operations in a computer-simulated AR/VR environment, such as simulated movement, object manipulation and interaction, and much more. However, the controller's weight, its large tracking features, and its protruding, obtrusive donut-shaped design can make it cumbersome and unwieldy. The pistol-grip shape does reduce fatigue, since users can hold an object in a pistol grip for extended periods, but it permits only coarse movements and ungainly control. There is therefore a need for improved interface devices for virtualized environments, especially for tasks requiring a high level of precision and control.

The housing can include one or more processors and a communications module disposed within the housing. The communications module is controlled by the processor(s) and configured to establish wireless electronic communication between the stylus and a host device. The first portion of the housing can be linear in shape and configured to be held in a user's hand while the stylus is in use. The first portion can include an input element that generates control signals when activated by the user's hand, and an end tip configured to act as an interface between the stylus device and objects in the AR/VR environment. The second portion of the housing can be non-linear and bend through three dimensions, such as the x, y, and z dimensions of a Cartesian coordinate system. A section of this second portion curves longitudinally back toward a line collinear with the first portion of the housing. The second portion of the housing can include emitters or sensors that facilitate tracking of the stylus within the three-dimensional AR/VR environment.

In certain embodiments, the second portion of the housing may include a section that forms a polygonal loop (e.g., a hexagonal loop, a diamond-shaped loop, a circular loop, an octagonal loop, or a spherical or hemispherical shape). In certain cases, the second portion of the housing can have a polygonal cross-section through at least a section of the loop (e.g., a hexagonal cross-section, diamond cross-section, circular cross-section, octagonal cross-section, etc.). The second portion can include a plurality of planar faces, and the plurality of emitters or sensors may be configured on some or all of these planar faces. The planar faces can be oriented in three-dimensional space to face different directions. In certain cases, all or some of the planar faces can be oriented so that at least four of them are visible from any point 360 degrees around the stylus. For example, if the first portion of the housing is substantially linear (and can be configured to be held by a user in a pen-grip configuration), then from any point 360 degrees around and normal to that first portion, at least four planar faces, or their emitters, can be detected. A simplified wheel-and-axle analogy applies: the first, substantially linear portion of the stylus is the axle, and the 360-degree viewing arc corresponds to the orientation of the wheel relative to the axle. In some cases, however, detection can occur from any point around the stylus (e.g., at any location around the stylus). In some cases, the center of gravity can be located at the interface between the first and second portions, as shown in FIG. 6B.
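As a rough illustration of this visibility property, the Python sketch below models the tracking facets as outward-facing normals spaced 30 degrees apart around the stylus axis. The 30-degree spacing and the grazing-angle threshold are assumptions chosen for illustration, not values taken from the patent; a facet counts as visible from a viewing direction when its normal faces the viewer.

```python
import math

# Illustrative model (assumed geometry, not from the patent): 12 facet
# normals spaced 30 degrees apart around the stylus axis, viewed in the
# plane normal to the substantially linear first portion (the "axle").
FACET_ANGLES = [math.radians(30 * i) for i in range(12)]

def visible_facets(view_angle_deg: float) -> int:
    """Count facets whose normal faces a viewer at the given axial angle."""
    theta = math.radians(view_angle_deg)
    view = (math.cos(theta), math.sin(theta))
    count = 0
    for phi in FACET_ANGLES:
        normal = (math.cos(phi), math.sin(phi))
        # A facet is visible when its normal has a positive component
        # toward the viewer; the small margin excludes grazing angles.
        if view[0] * normal[0] + view[1] * normal[1] > 1e-6:
            count += 1
    return count

# Check the design goal: at least four facets visible from any angle.
worst_case = min(visible_facets(a) for a in range(360))
print(worst_case)  # 5
```

In this simplified two-dimensional model, at least five facets face the viewer from every axial angle, comfortably meeting the at-least-four goal; a real housing would additionally be checked against occlusion by the user's hand and the loop's three-dimensional bend.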

In certain embodiments, a device that can be used in an AR/VR environment includes a housing with a non-linear second portion that bends to traverse three dimensions. The second portion also contains a number of emitters or sensors configured to facilitate tracking of the input device within the three-dimensional environment. In certain cases, the second portion may be bent longitudinally in a direction parallel to the first portion. In certain embodiments, the second portion of the housing may form a hexagonal loop and include a hexagonal cross-section through at least some of the loop. The second portion of the housing can include a plurality of planar faces, with the emitters or sensors configured on these planar faces. Each of the planar faces can be oriented differently from the others in three-dimensional space. In some cases the planar faces are oriented so that at least four planar surfaces are visible from any angle 360 degrees around the input device. In some embodiments, the input device can be a stylus. The input device's center of gravity can be located at the interface of the first and second portions, so that the device balances unsupported between a user's thumb and forefinger when the hand is in the position in which the input device is used. In some implementations the input device may be balanced longitudinally or latitudinally about the interface between the second portion and the first portion.
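The balance condition described above is simply a moment (torque) balance about the interface between the two portions. The sketch below uses made-up masses and distances (not values from the patent) to show how the rear portion's center of mass would be placed so that the combined center of gravity falls exactly at the joint where the thumb and forefinger rest:

```python
# Illustrative moment balance (all numbers assumed, not from the patent).
# Distances are measured from the interface (joint) between the two
# portions: negative = toward the pen tip, positive = toward the loop.
front_mass, front_com = 12.0, -45.0   # grams, mm (linear pen-like portion)
rear_mass = 18.0                      # grams (looped tracking portion)

# For the combined center of gravity to sit at the joint (0 mm), the
# two portions' moments about the joint must cancel:
#   front_mass * front_com + rear_mass * rear_com = 0
rear_com = -front_mass * front_com / rear_mass
combined = (front_mass * front_com + rear_mass * rear_com) / (front_mass + rear_mass)
print(rear_com, combined)  # 30.0 0.0
```

With the net moment about the grip point equal to zero, the stylus can rest balanced and unsupported between the thumb and forefinger, as the embodiment describes.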

The housing of an input device for an AR/VR environment may also include a first portion and a second portion. The first portion may be configured to be held in the user's hand while the input device is in use. The second portion may be non-linear, bent through three dimensions, and include a section having a hexagonal shape. In some cases, the second portion's planar faces can each be oriented differently from one another, and may be oriented so that four or more of them are visible from any given point around the input device.

Embodiments of this invention generally relate to control devices configured to work with AR/VR systems. Some embodiments provide a stylus with an improved design architecture that has better ergonomic and tracking properties.

In order to explain the embodiments of this invention, many examples and details are given in the following description. It will be evident to a person skilled in the art that certain embodiments may be practiced without some of these specific details, or with equivalents or modifications.

To provide a high-level, broad understanding of certain aspects of the disclosure, a non-limiting summary of various embodiments follows. As described above, the pistol-grip controllers commonly used in many AR/VR products are not ergonomically friendly. Traditional pistol-grip controllers are inarticulate and can hinder users from performing precise operations, owing both to their bulky, cumbersome design and to the biomechanics of a user's hand in the pistol-grip position, which does not generally facilitate high-precision movements. The invention provides a novel input system that gives a user a higher level of control and precision with less fatigue, and that can adapt to the various tracking protocols used in modern AR/VR systems. Some embodiments include a stylus that can be held like an ordinary pen, that has a balanced center of gravity designed to stay in line with the user's wrist when in use, and that includes a tracking section designed to prevent light obstruction during use. This makes for a high-precision input device that is ergonomically sound.

In certain embodiments, an input device may be a stylus configured to operate in an augmented/virtual reality (AR/VR) environment, which includes a housing having bottom and top portions, as shown, for example, in FIGS. 3A-7. The housing can include one or more processors and a communication module that establishes wireless electronic communication between the stylus device and at least one host computing device. In some cases the bottom portion of the housing is linear and designed to be held in the user's hand while the stylus device is in use. The bottom portion can have one or more input elements that generate control signals when activated by the user's hand, and an end tip that acts as an interface for the stylus device to interact with objects in the AR/VR environment. The top portion can be formed as a three-dimensional hexagonal loop and include a hexagonal cross-section through a portion of it. The angle between the planes of the hexagonal cross-section and/or loop may be 30 degrees, but the arrangement is not necessarily symmetric. When the stylus is in use, the hexagonal loop can bend downward, parallel to the user's wrist, which prevents the user's line of sight from being obscured. The hexagonal loop may include multiple planar faces oriented in different directions in three-dimensional space. On some or all of the planar faces, a number of emitters or sensors can be placed to facilitate tracking of the stylus in the three-dimensional AR/VR environment. The type of tracking infrastructure used by an AR/VR system may determine whether emitters (such as infrared LEDs) and/or detectors (such as infrared detectors) are used. The planar faces can be oriented so that at least four of them are visible (e.g., their LED emissions are detectable) from any axial angle 360 degrees around the orientation of the stylus (e.g., around the sides of the stylus), although in some embodiments facets may be configured on different portions of the hexagonal loop to allow at least three or four to be visible from anywhere around the stylus. In some implementations, the center of gravity can be located at the interface between the top and bottom portions. The top portion of the housing is compact enough not to extend past the user's wrist (e.g., it does not hang off the edge of the hand) when the stylus is in use. The bottom portion of the housing can be gripped by the user in a pen-grip position. A person of ordinary skill in the art would, with this disclosure, understand the many variations, modifications, and alternative embodiments of the housing.

The following explanations may help you better understand the present disclosure:

As used herein, the term "computer simulation" refers to any visual, immersive computer-simulated environment, including virtual reality, augmented reality, and mixed reality. As used herein, the term "virtual reality" or "VR" may include a computer-simulated setting that mimics a real or imaginary environment. The user can simulate their physical presence in that environment by interacting with its objects and settings. VR environments can include a video game; a simulation of a medical procedure, such as a surgical or physiotherapy procedure; an interactive digital mock-up of a designed feature (such as a computer-aided design); an educational simulation, such as an e-learning simulation; or another simulation. The simulated environment can be two- or three-dimensional.

As used herein, the term "augmented reality" or "AR" may include the use of rendered images presented in conjunction with a real-world view. AR environments can include architectural applications for visualizing buildings in the real world, medical applications that overlay additional information for a patient during therapy or surgery, and gaming environments that provide an augmented simulation before entering a virtual reality environment.

As used herein, the term "mixed reality" or "MR" may include the use of virtual objects, rendered as images, in conjunction with a real-world view of an environment, where the virtual objects can interact with the real-world environment. The embodiments described below can be used in AR, VR, or MR environments.

As used herein, the term "real-world environment" refers to the physical world (also referred to as the physical environment). A "real-world arrangement", with respect to an object such as a body part or a user interface device, may refer to the arrangement of that object in the real world, relative to some reference point. The term "arrangement" can refer to the position (location and orientation) of an object. Position can be described in terms of a global or local coordinate system.

As used herein, the term "rendered images" or "graphical images" may include images generated by a computer and shown to the user as part of a virtual environment. Images may be displayed in two or three dimensions. Displays disclosed herein can present images of a real-world environment by, for instance, allowing the user to view the real-world environment directly and/or by presenting one or more images of it (captured, for example, by a camera).

As used herein, the term "head-mounted display" or "HMD" may refer to a display that renders images for a user. An HMD may have a graphic display positioned in front of all or part of the user's field of vision. The display can be transparent, semi-transparent, or opaque. The HMD can be part of a headset, and its graphical display may be controlled by display driver circuitry.

As used herein, the term "electrical circuitry" or "circuitry" may refer to, or include, any of the following hardware or software components, including but not limited to: a processor, memory, combinational logic circuits, passive electrical components, interfaces, etc. In certain embodiments, the circuitry can include one or more virtual machines that provide the described functionality. In certain embodiments, the circuitry can include passive components, e.g., combinations of transistors and transformers, that provide the described functionality. In certain embodiments, the circuitry, or the functions associated with the system, can be implemented using one or more software or firmware modules. In some embodiments, circuitry may include logic that is at least partially operable in hardware. Electrical circuitry can be centralized or distributed, including being distributed across devices such as a networked computer, a remote server, a cloud computing system, or a peripheral device.

As used herein, the term "processor(s)" or "processing resource(s)" refers to one or more units of processing, such as an ASIC, CPU, GPU, programmable logic device (PLD), microcontroller, FPGA, microprocessor, DSP, or other suitable component. A processor can be configured using machine-readable instructions stored in a memory. The processor can be centralized or distributed, including distributed across various devices, such as a cloud-based system, a networked computer (including a remote system), or a peripheral device. The processor can be disposed in one or more devices, such as a stylus, an HMD, and/or a user interface device.

As used herein, the term "computer-readable medium/media" may include conventional non-transient memory, such as random access memory (RAM), optical media, hard drives, flash drives, memory cards, floppy disks, optical drives, and/or combinations thereof. While one or more memories can be physically located in the same place as the system itself, they may also be located remotely from the host system and communicate with one or more processors via a network. When more than one memory is used, a first memory can be located in the same physical location as the host system, and additional memories can be located in remote physical locations. The physical location(s) of the one or more memories can vary, and one or more memories can also be implemented as "cloud memory", i.e., accessed or partially accessed via a network.

Click here to view the patent on Google Patents.