Metaverse – Veniamin Milevski, Peter Vincent Boesen, Bragi GmbH

Abstract for “Variable computing engine to create interactive media using biometrics”

“A method and system for implementing interactive media content are provided. The interactive media content is received for communication to a user through at least wireless earpieces. The wireless earpieces measure user biometrics. A user condition is associated with the biometrics. The user condition determines whether branching patterns in the interactive media content are modified. The interactive content can be either a story or a game.

Background for “Variable computing engine to create interactive media using biometrics”

“Audiobooks are generally less popular in English-speaking countries, and Bantam Books’ variable-branch books have been less popular than other story formats. These interactions have largely been supplanted by the internet gaming industry, alone or in conjunction with other industries, and the art form is now less popular. There is a need for a new way for users to interact with a storyline and even influence its ending.

“Therefore, it is a primary feature, object, or advantage of this invention to improve over the state of the art.”

It is an additional object, feature, or advantage to provide a system that detects biometric parameters of users and incorporates them into branching patterns.

“It is an additional object, feature, and advantage to modify the branching patterns of the game based on user biometric inputs.”

“Another object or feature is to reduce the intensity of a story in order to provide a protective function in the event that the user experiences excessive stress.

“Yet another feature, object, or advantage is to raise the action or associated stress levels of the story based on biometric data fed into the engine, relative to a desired level of interaction within the story or branch.”

“An additional object, feature or advantage is to protect users by intentionally calming them through a modification of branches if biometric sensors detect a problem level of interaction.”

One or more of these objects, features, or advantages will become apparent from the following description. It is to be understood that not every embodiment need include each of these objects, features, or advantages; different embodiments may have different objects, features, or advantages.

“A system and method of implementing interactive media content are provided according to one aspect. The interactive media content is received for communication to a user through at least wireless earpieces. The wireless earpieces measure user biometrics. A user condition is associated with the biometrics. The user condition determines whether branching patterns in the interactive media content will be modified. A second embodiment includes wireless earpieces with a processor that executes a set of instructions and a memory that stores the instructions, where the instructions carry out the above-described method.

“A virtual reality system is described in another aspect. A virtual reality headset displays a virtual environment to the user. Wireless earpieces are also part of the virtual reality system. They receive interactive media content from the virtual reality headset, communicate the interactive media content to a user through at least the wireless earpieces, determine user biometrics using biometric sensors of the wireless earpieces, and modify branching patterns of the interactive media content in response to a user condition.

“The illustrative embodiments offer a system, method, and wireless earpieces that allow a user to interact with a gaming system. For simplicity’s sake, a ‘game’ may refer to the applicable story, game, multiplayer interface, or any other media content. The wireless earpieces can sense biometrics, which may include the user’s pulse, blood pressure, and temperature. The biometrics may also be used to determine stress levels as well as other parameters, factors, characteristics, or measured conditions. The user may also interact with the game using voice, audio, tactile, and gesture input.

“The game may be set up to use a user profile that manages how interactions and biometrics are controlled by the game. In one embodiment, different game variables, branches, decisions, or other attributes may be implemented based on the user biometrics. Variables such as branching patterns and potential story lines can be altered based on the user biometrics. Based on the biometrics, a branch can be used to soothe, upset, calm, or otherwise affect the user to reach a desired level of interaction. For example, user preferences may be set to calm a child so that the storyline becomes calming as it progresses. The game can be customized for each user and may be made to please, terrorize, or keep the player guessing. The game may include audio, visual, tactile, and olfactory media.
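
As a rough, non-authoritative illustration of the branch selection described above, the following sketch selects a calming or intensifying branch from a measured pulse and a per-user target range stored in a profile. All names and threshold values here are hypothetical; the patent does not specify an implementation.

```python
# Minimal sketch of biometric-driven branch selection; the profile
# fields and branch names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class UserProfile:
    target_pulse_low: int = 60    # bpm below which the story may intensify
    target_pulse_high: int = 100  # bpm above which the story should calm

def choose_branch(pulse_bpm: int, profile: UserProfile, branches: dict) -> str:
    """Pick a branch based on where the pulse falls in the target range."""
    if pulse_bpm > profile.target_pulse_high:
        return branches["calming"]       # protective function under stress
    if pulse_bpm < profile.target_pulse_low:
        return branches["intensifying"]  # raise the action level
    return branches["neutral"]

branches = {"calming": "quiet_forest_path",
            "intensifying": "dragon_chase",
            "neutral": "village_market"}
print(choose_branch(112, UserProfile(), branches))  # -> quiet_forest_path
```

A child-focused profile, for example, could simply set both thresholds low so that the calming branch is almost always preferred.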

“In one embodiment, the game may be played utilizing a digital content device (e.g., an e-book, a tablet, a cell phone, or another mobile device). The user may access the media content through this device. For example, an interactive book can communicate with the wireless earpieces to receive input from the user and communicate information. In another embodiment, the game may utilize a virtual reality system that includes a virtual reality headset. The interactive book or virtual reality headset can communicate with the wireless earpieces wirelessly or via a wired connection. For example, the virtual reality headset may be connected to the wireless earpieces using magnetic connectors; wires extending from the virtual reality headset may magnetically couple to the wireless earpieces and utilize induction for communications and power transfer. In another example, the interactive book or virtual reality headset may include a frame that connects to the wireless earpieces. In another embodiment, a wireless protocol, standard, or connection, such as Bluetooth, NFMI, or Wi-Fi, may be utilized.

The wireless earpieces can be worn in the user’s ears. The wireless earpieces can fit at least partially into the user’s external auditory canal. The ear canal provides a rich area for biometric measurements as well as stabilization of the wireless earpieces, so the wireless earpieces can be used during rigorous physical activities that require stability. The wireless earpieces are designed for extended wear and the collection of valuable information through their sensors. The wireless earpiece sensors may include accelerometers, gyroscopes, microphones, cameras or imaging devices, contact/conductivity sensors, pulse oximeters, and so forth. Important measurements taken by the sensors may include pulse rate, blood oxygenation, microphone output, position/orientation, location, temperature, altitude, cadence, calorie expenditure, and so forth.”

“The wireless earpieces can include any number of sensor arrays that capture information about the user. This large amount of data may allow the configuration of the wireless earpieces to be adjusted dynamically for the user. The wireless earpieces can also learn over time from historical data and user input. An alert may be sent to the user with information about the current status of the configuration adjustment process.

“The illustrative embodiments can be used for entertainment, scientific, educational, or commercial purposes. Virtual reality headsets, such as those made by Google, HTC, Samsung, Oculus, Sony, Microsoft, and others, can present any number of three-dimensional visualizations to users. The illustrative embodiments minimize the mass problems associated with bulky audio or headphone systems. The characteristics of the angular momentum associated with the user’s head do not change significantly, decreasing torque, neck strain, and other problems that may be associated with virtual reality systems.

The wireless earpieces can include any number of sensors that communicate with the components, systems, and sensors of the interactive story to enhance the user’s experience. In one embodiment, the sensors of the wireless earpieces may include accelerometers, gyroscopes, magnetometers, and optical sensors. The sensors can collect data that may be used to determine the user’s characteristics, location, stress levels, mental health, and other factors. The data can be used to improve the user’s experience within a virtual reality game. The sensors also provide data that improve the measurements made by the virtual reality headset.

To provide better three-dimensional spatial sound imaging, the precise location and position of the user may be used. The sensors can also detect biometric information, such as heart rate, blood pressure, skin physiology, or other biometric data. This data can be used to determine whether the user feels safe, fatigued, or stressed while playing a virtual reality game. The game interactions can be modified based on the user’s condition; for example, different branches, sub-branches, or endings may be used. The wireless earpieces can also communicate with the interactive story, virtual reality headset, or other devices to play music, block or filter sound, amplify sounds, and so forth.

The wireless earpieces can be used for everyday activities, such as exercise, phone calls, travel, and so forth. The wireless earpieces can also be used as the audio interface for interactive books or virtual reality systems, making those systems less expensive and lighter than ones with other audio components. Because the wireless earpieces are small, the strain and weight imposed on the user are reduced. The various games may also include a separate power source or battery that can power the wireless earpieces on the go.

“FIG. 1 is a pictorial representation of a communication environment 100 in accordance with an illustrative embodiment. The wireless earpieces 102 can be configured to communicate with one another and with one or more wireless devices 104.

“The wireless earpieces 102 may be worn by a user 106. They are shown separately from their position within the ears of the user 106 for purposes of visualization. A block diagram of the wireless earpieces 102 is shown in FIG. 2 to further illustrate the components and operation of the wireless earpieces 102. The first and second wireless earpieces are shown together; however, a single wireless earpiece may also be used.

“In certain applications, temporary adhesives or securing mechanisms (e.g., clamps, straps, lanyards, extenders, chargers, etc.) may be used to ensure that the wireless earpieces 102 remain in the ears of the user 106 during strenuous and physically demanding activities. The wireless earpieces 102 may be used during swimming, virtual reality simulations, interactive gaming, team sports, biking, hiking, parachuting, and other activities. The wireless earpieces 102 may be used to play music or audio, make and receive phone calls or other communications, determine ambient environmental conditions (e.g., temperature, altitude, location, speed, heading, etc.), read user biometrics (e.g., heart rate, motion, temperature, sleep, blood oxygenation, voice output, calories burned, forces experienced, etc.), and receive feedback or instructions from the user. The logic of the wireless earpieces 102 may allow for the dynamic configuration of components, such as microphones and speakers, according to conditions within the communication environment 100.

“The position of the wireless earpieces 102 may be determined by the VR headset 110, the other wireless devices 104, and the interactive book 105. The position information of the wireless earpieces 102 may be used to determine the proximity of the devices within the communication environment 100, for example, to determine the distance between the devices. In one embodiment, the distance information may be used to determine whether the wireless earpieces 102 are being worn simultaneously (e.g., whether they are exposed to similar environmental conditions, noise, etc.). Multiple sets of wireless earpieces may also communicate simultaneously, for example, with the VR headset 110 and the interactive book 105.

“In one embodiment, the wireless earpieces 102 and their associated sensors (whether internal or external) may be configured to take a number of measurements or log information during normal use. The sensor measurements can be used to extrapolate other measurements, factors, and conditions applicable to the user 106. For example, the sensors can monitor the user’s heartbeat and EKG to identify the user’s unique patterns or characteristics. The user 106 or another party can configure the wireless earpieces 102 and the interactive book 105 directly or through a connected application (e.g., a mobile application with a graphical user interface) to store and share data, audio, images, and other information. Examples of standard usage include recording and detecting a heartbeat, setting biometrics and user input for implementing variable branching or branching decisions within a game, setting noise thresholds and microphone sensitivity, and setting gestures/inputs to perform an action (e.g., opening an application, playing music, providing audio feedback, actively participating in a conversation, listening to music, and so forth).

“The wireless earpieces 102 can also determine the user’s status, condition, or other information utilizing the biometric data from the sensors and user input. The sensors can determine the user’s emotional and physical state. The memory of the wireless earpieces 102 can store information utilized to determine the user information. For example, the wireless earpieces 102 may store default values, thresholds, data, and other information for determining the user information. The user information may also be associated with any number of categories (e.g., happy, stressed, scared, content, etc.).
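
To make the category determination concrete, the following is a minimal sketch of how stored default values and thresholds might map raw biometrics onto the categories named above. The cutoff values and the voice-pitch feature are illustrative assumptions, not values from the patent.

```python
# Hypothetical threshold table mapping biometrics to a user condition.
def classify_condition(pulse_bpm: float, skin_temp_c: float,
                       voice_pitch_ratio: float) -> str:
    """Return 'scared', 'stressed', 'happy', or 'content'.
    voice_pitch_ratio = current vocal pitch / stored baseline pitch."""
    if pulse_bpm > 110 and voice_pitch_ratio > 1.3:
        return "scared"    # pulse spike plus sharply raised vocal pitch
    if pulse_bpm > 95 or skin_temp_c > 37.5:
        return "stressed"
    if voice_pitch_ratio > 1.1:
        return "happy"     # mildly elevated pitch at a resting pulse
    return "content"

print(classify_condition(118, 37.0, 1.4))  # -> scared
```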

The wireless earpieces 102 may be configured or reconfigured during the initial setup process, during regular usage, or upon a request by the user. In one embodiment, the wireless earpieces 102 may take baseline readings to determine whether the communication environment 100 is noisy or quiet, and may also determine information about the emotional and physical state of the user 106 (e.g., anxious, scared, happy, calm, stressed). The communication environment 100 may include the user’s house, a library, a roadway, a work area, an office, a school, a business, or a venue for sports or other activities. In one embodiment, default configurations for the microphones and speakers may be set based on the location of the wireless earpieces 102. The wireless earpieces 102 may be further adjusted to the best configuration of the microphones and speakers based on real-time noise levels at the determined location.

The wireless earpieces 102 may contain any number of sensors 112 and logic for measuring and determining user biometrics, such as heart rate, pulse rate, skin conduction, and blood oxygenation, as well as voice and audio output. The sensors 112 can also determine the user’s position, velocity, impact levels, and other information. The sensors 112 can further receive input from the user and translate it into commands or selections made across the personal area network. For example, the wireless earpieces 102 may detect voice commands, head motions, finger taps, gestures, and motions. The wireless earpieces 102 may measure user input and convert it into internal commands or external commands that can be sent to mobile devices, computers, or other wearable devices. For example, the wireless earpieces 102 can detect a head movement or voice command of the user 106, which may be used to increase or decrease the volume in response to noise levels within the communication environment 100.
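
The translation of sensed input into commands could be as simple as a lookup table. The sketch below shows one hypothetical organization of such a mapping; none of these gesture or command names come from the patent.

```python
# Hypothetical mapping of detected user inputs to earpiece commands.
COMMAND_MAP = {
    ("tap", 1): "play_pause",
    ("tap", 2): "next_branch_choice",
    ("head_nod", 1): "confirm_selection",
    ("head_shake", 1): "reject_selection",
    ("voice", "louder"): "volume_up",
    ("voice", "quieter"): "volume_down",
}

def translate_input(kind: str, detail) -> str | None:
    """Convert a sensed gesture or voice input into a command string,
    or None if the input is unrecognized."""
    return COMMAND_MAP.get((kind, detail))

assert translate_input("tap", 2) == "next_branch_choice"
```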

“The sensors 112 can make all of the measurements regarding the user 106, or they can communicate with any number of other sensory devices (e.g., the interactive book 105, etc.) in the communication environment 100 to measure information and data about the user 106 as well as the communication environment 100 itself. In one embodiment, the communication environment 100 may represent all or a portion of a personal area network. The wireless earpieces 102 may be used to control, communicate with, manage, or interact with a number of other wearable devices or electronics, such as smart glasses, helmets, smart glass, watches, wrist bands, chest straps, implants, displays, clothing, other wireless earpieces, and so forth. A personal area network is a network for data transmissions among devices, such as personal computing, communications, camera, vehicle, entertainment, and medical devices. A personal area network may use any combination of wired, wireless, or hybrid configurations and may be stationary or dynamic. For example, the personal area network may use wireless protocols and standards such as INSTEON, IrDA, Wireless USB, or Bluetooth. In one embodiment, the personal area network may move with the user 106.

“In other embodiments, the communication environment 100 can include any number of components that communicate directly or indirectly with one another through a wireless (or wired) connection, signal, or link. The communication environment 100 can include one or more networks and network components and devices, such as signal extenders, routers, signal generators, intelligent network devices, and computing devices.

“Communications within the communication environment 100 can occur via the network 120, directly between devices such as the wireless earpieces 102 and the interactive book 105, or indirectly through a network such as a Wi-Fi network. The network 120 can communicate with or include wireless networks, such as Wi-Fi, cellular (e.g., 3G, 4G, 5G, PCS, GSM), Bluetooth, or other radio frequency networks. It may also communicate with any number of hard-wired networks, such as coaxial or fiber-optic networks. Communications within the communication environment 100 may be managed by one or more service or network providers (e.g., secure, public, private, etc.).

“The wireless earpieces 102 can play, display, or communicate any number of alerts to indicate the status of the interactive gaming within the communication environment 100. For example, one or more alerts may indicate branch adjustments made based on the user’s state. The alert can be communicated to the user 106 or to another device or individual with administrative rights (e.g., a parent, a caregiver, etc.). The alert may be presented using specific tones, verbal acknowledgments, tactile feedback, or other forms of communicated messages. For example, an alert may be played at each stage of the game. An alert can also be sent to the interactive book 105.

“In some embodiments, the wireless earpieces 102 may vibrate, flash, play a tone or other sound, or give other indications of the game’s automatic self-configuration process in order to prompt further user actions, such as providing feedback or making a manual adjustment. The wireless earpieces 102 can also be used to perform any number of related steps. The wireless earpieces 102 may also communicate an alert to other devices that shows up as a notification, message, or other indicator of the need for configuration/re-configuration or a changed status of the configuration process, such as an audio alert that ‘the story has been changed to calm your child.’”

“The wireless earpieces 102 may include logic to automatically implement a gaming configuration in response to wireless earpiece setup, start-up, connection to a gaming device (e.g., the interactive book 105, the virtual reality headset 110, etc.), condition changes (e.g., location, activities, etc.), or other conditions and factors within the communication environment 100. The wireless device 104 can also communicate with the wireless earpieces 102 to receive feedback from the user 106 if the user disagrees with certain gaming variables.

“The wireless device 104 may include an application that displays information and instructions to the user 106 based on the configuration of the game.

“In one embodiment, the wireless device 104 can use short-range or long-range wireless communications to communicate directly with the wireless earpieces 102 via a wireless signal, or with other devices of the communication environment 100. For example, the wireless device 104 may include a Bluetooth or cellular transceiver within its embedded logical components. The wireless signal may be Bluetooth, Wi-Fi, Zigbee, Ant+, near-field magnetic induction (NFMI), or another short-range wireless communication.

“The wireless device 104 can represent any number of wireless or wired electronic communications or computing devices, such as smart phones, laptops, desktop computers, control systems, displays, gaming devices, music players, personal digital assistants, and so forth. The wireless device 104 may communicate using near field communications, Bluetooth, Wi-Fi, wireless Ethernet, or other means of communication. For example, the wireless device 104 may be a touch-screen cellular phone that communicates with the wireless earpieces 102 using Bluetooth communications. The wireless device 104 can implement any number of operating systems, kernels, instructions, or applications that make use of the sensor data from the wireless earpieces 102. For example, the wireless device 104 may represent any number of Android, iOS, Windows, or other systems and devices. The wireless device 104 and the wireless earpieces 102 can execute any number of applications that use the user input, proximity data, and other feedback from the wireless earpieces 102 to initiate, authorize, or perform configurations and the associated tasks.

“As noted above, the layout of the components within the wireless earpieces 102 and the limited space available in products of this small size may affect where the sensors 112 and other components are positioned. The positions of the sensors 112 may vary on each of the wireless earpieces 102 based on the model, version, and manufacturing process.

“The interactive book 105 can represent any number of electronic devices. In one embodiment, the interactive book 105 may be a book that includes sensors, a transceiver, processing components, and a user interface (e.g., touchscreen, buttons, scroll wheels, virtual buttons, etc.). The interactive book 105 may also represent any number of e-books, tablets, personal computing devices, electronic readers, or cell phones. The interactive book 105 can execute any number of steps or processes utilizing digital logic, firmware, or software (e.g., applications, operating systems, kernels, instruction sets, etc.), alone or in combination.

“The interactive book 105 can communicate with the wireless earpieces 102, the wireless device 104, and/or the network 120 simultaneously or separately. For example, the interactive book 105 can receive biometric readings and user input directly from the wireless earpieces 102 during communication of media content, including interactive stories, games, communications, and so forth. The wireless earpieces 102 and the interactive book 105, individually or together, can perform the described processes for utilizing user input and biometric information. For example, the wireless earpieces 102 may use a branching pattern or algorithm to determine the variables and decisions implemented by the devices of the communication environment 100, including the interactive book 105, and the applicable information may be communicated between the wireless earpieces 102 and the interactive book 105.

“In another example, the user input and biometrics can be communicated by the wireless earpieces 102 for use by the interactive book 105. For example, based on the biometric readings, the wireless earpieces 102 may determine the physical and emotional state of the user 106, and the relevant information may then be communicated to the interactive book 105. The interactive book 105 can make any number of branching decisions and implement variables based on the user input, biometrics, and user state. The story or game may be modified to achieve a desired level of interaction or user state. In one embodiment, the interactive book 105 may be used to treat individuals with disabilities. For example, if the wireless earpieces 102 detect that the user is overly stressed, a more relaxing storyline may be substituted; this determination can be made based on biometrics, vocalizations, head movements, and other user-detectable information. The interactive book 105 and the wireless earpieces 102 may similarly reduce the intensity of the storyline to achieve the desired effects on the user.

“FIG. 2 is a pictorial representation of a virtual reality system in accordance with an illustrative embodiment. The virtual reality system can include any number of components, devices, and systems. In one embodiment, the virtual reality system may include the wireless earpieces 102, which can include a left earpiece and a right earpiece, a virtual reality headset 110 with a strap 114, and a processing system 130. The processing system 130 can communicate with the virtual reality headset 110 and the wireless earpieces 102.

In one embodiment, the processing system 130 may process media content communicated through the VR headset 110 and the wireless earpieces 102. The processing system 130 can also use biometric readings from the user to amend, change, or otherwise modify decisions, branches, or other content of a virtual reality game. In other embodiments, the wireless earpieces 102 may process all of the biometric data from the user in order to adjust the game or other media content presented by the virtual reality system. The processing system 130 may represent a gaming system (e.g., Xbox, PlayStation, Nintendo systems, or similar devices), a personal computer, a server, a cloud interface device, or other similar devices. The processing system 130 can receive any number of disks, cartridges, or other storage media, and may also access content through one or more networks or connections.

“The wireless earpieces 102 can play the audio associated with the virtual content presented by the virtual reality headset 110. The wireless earpieces may also be configured to play music or audio, make and receive phone calls or other communications, determine ambient environmental readings (e.g., temperature, altitude, location, speed, heading, etc.), and read user biometrics (e.g., heart rate, motion, sleep, blood oxygenation, calories burned, etc.). The user biometrics and other readings can be used to make decisions regarding, or to determine, the media content presented by the virtual reality system.

The virtual reality headset 110 simulates physical presence in places in the real world or an imagined world, allowing the user to interact with that environment. Virtual reality may also be referred to as immersive multimedia and can be used to create sensory experiences that include sight, sound, touch, smell, and taste. The virtual reality headset 110 may be powered by a battery, a power plug, or another connection (e.g., a USB connection to a gaming or computing device). The virtual reality headset 110 can also communicate (send and receive) data using a wired or wireless connection to any number of computing, communications, or entertainment devices.

The visor 113 can be used to display visual and graphical information to the user. The visor 113 may include one or more displays (e.g., liquid crystal displays, organic LED displays) or projectors (direct, indirect, or refractive) for displaying information to the eyes of the user. The virtual reality headset 110 may also include touch screens, smell interfaces, or tasting interfaces. The shape and size of the virtual reality headset 110, the visor 113, and the strap 114 may vary by the make, model, manufacturer, and user configuration of the virtual reality headset 110, such as those made by Google, HTC, Sony, Oculus, Samsung, Durovis, Valve, Avegant, and other companies. In other embodiments, the visor 113 may be transparent, glass, and/or see-through, allowing the user to interact with the real world while using the virtual reality system; for example, the virtual reality system may display augmented reality content to the user.

The strap 114 is a band that runs between the sides of the visor 113 to secure the virtual reality headset 110 to the user’s head. The strap 114 can be made of any of a variety of materials, such as nylon, cotton, rubber, plastic, or the like. The strap 114 can include buckles, loops, or other adjustment mechanisms for fitting the virtual reality headset 110 to the user’s head. Some virtual reality headsets are more helmet-like or include additional structural components (e.g., straps, arms, extensions, etc.) for attaching the virtual reality headset 110 to the user’s head during regular and vigorous use.

“With regard to the wireless earpieces 102, sensor measurements may refer to measurements made by one or both of the wireless earpieces 102. For example, the wireless earpieces may determine that the sensor signal for the pulse oximeter of the right wireless earpiece has become very noisy and may instead use the sensor signal from the left wireless earpiece as the primary measurement. The wireless earpieces may also switch back and forth between the left and right pulse oximeters in response to the noise levels, allowing the clearest possible sensor signal to be used at any given time. In one embodiment, the wireless earpieces 102 may switch sensor measurements in response to the sensor measurements exceeding or falling below a specified threshold.
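
A minimal sketch of the left/right failover just described might look like the following; the noise metric and the 0.2 threshold are illustrative assumptions rather than values from the patent.

```python
# Hypothetical left/right pulse-oximeter selection based on signal noise.
def select_pulse_signal(left: dict, right: dict,
                        noise_threshold: float = 0.2) -> float:
    """Each dict holds {'bpm': ..., 'noise': ...}; return the reading
    from whichever earpiece currently has the cleaner signal."""
    if left["noise"] <= right["noise"] and left["noise"] < noise_threshold:
        return left["bpm"]        # left signal is clean enough
    if right["noise"] < noise_threshold:
        return right["bpm"]       # fall back to the right earpiece
    # Both signals are noisy; average them as a last resort.
    return (left["bpm"] + right["bpm"]) / 2.0

print(select_pulse_signal({"bpm": 72.0, "noise": 0.35},
                          {"bpm": 70.5, "noise": 0.05}))  # -> 70.5
```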

“The user may also be carrying or wearing any number of sensor-enabled devices, such as pacemakers, heart rate monitors, smart glasses, smart watches, or bracelets (e.g., Fitbit, Apple Watch, etc.). Other sensory devices may also be worn by, attached to, or integrated with the user. The wireless earpieces 102 may receive data and information from these external sensor devices. In another embodiment, the data and information from the external sensor devices may be used to further process the information communicated by the wireless earpieces 102 to a wireless device. The media content can be modified based on the user readings to better satisfy the user or third parties observing the user.

“The sensors of the wireless earpieces 102 may be positioned at mirror-image locations on the left and right earpieces. A number of colored light emitting diodes (LEDs) may be positioned to provide information and data, such as heart rate and respiratory rate. The data gathered by the LED arrays may be sampled alone or combined with data from other sensors, and the sensor readings may be enhanced or strengthened with the additional data.

“As shown, the wireless earpieces 102 may be physically or wirelessly linked to the virtual reality headset 110. The wireless earpieces 102 can receive user input and commands for use by any device of the virtual reality system. As previously noted, the wireless earpieces 102 may be referred to as a pair (both earpieces) or individually (a single wireless earpiece), and this description may apply to the components and functionality of each of the wireless earpieces 102 individually or collectively.

“In one embodiment, the wireless earpieces 102 play audio corresponding to the virtual reality content displayed by the virtual reality headset 110. The wireless earpieces may also provide additional biometric and user data that can be used by the virtual reality headset 110 and any connected computing, entertainment, or communications devices, systems, or components.

“In certain embodiments, the virtual reality headset 110 may act as a logging tool for receiving information, data, or measurements made by the wireless earpieces 102. For example, the virtual reality headset 110 may be worn by the user to receive data from the wireless earpieces 102 in real time. The virtual reality headset 110 can be used to display, store, and synchronize data to and from the wireless earpieces 102. For example, the virtual reality headset 110 may display the user’s pulse, blood oxygenation, distance traveled, and calories burned. The virtual reality headset 110 and the wireless earpieces 102 can have any number of electrical configurations, shapes, and colors, and may include various circuitry, connections, and other components.

In one embodiment, the wireless earpieces 102 may include a battery 208, a logic engine including a processor 210, a memory 212, a user interface 214, a physical interface 215, a transceiver 216, and sensors 112. The virtual reality headset 110 may include a battery 218, a memory 220, an interface 222, and a sensor or sensors 224. The battery 208 is a power storage device configured to power the wireless earpieces 102; likewise, the battery 218 is a power storage device configured to power the virtual reality headset 110. In other embodiments, the batteries 208 and 218 may represent a fuel cell, a thermal electric generator, a piezoelectric charger, a solar charger, an ultra-capacitor, or other existing or developing power storage technologies.

The logic engine 210 controls the operation and functionality of the wireless earpieces 102. The logic engine 210 may include circuitry, chips, and other digital logic, and may also include programs, scripts, and instructions. The logic engine 210 may be implemented as hardware, software, firmware, or any combination thereof. In one embodiment, the logic engine 210 may include one or more processors, or may represent an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA). The logic engine 210 may use the sensors 112 to determine biometric data, readings, and other information, and may use this information to inform the user of the associated biometrics (e.g., audibly, through an application of a connected device, tactilely, etc.).

“The logic engine 210 may also process user input to determine commands implemented by the wireless earpieces 102 or sent to another device via the transceiver 216. The user input may be detected by the sensors 112 to determine the specific actions to be taken. In one embodiment, the logic engine 210 may implement a macro allowing the user to associate input sensed by the sensors 112 with specific commands. The logic engine 210 can make any number of branching decisions based on the decisions available within a story, game, or other interactive media content. In one embodiment, the logic engine 210 may use artificial intelligence to create new branches or threads for the user. For example, the logic engine 210 may use biometric data from the sensors 112 to determine whether the user’s response is as expected, and may use any number of training or gaming scenarios to achieve the desired user responses.
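
One way to picture the check of expected versus measured response is the sketch below; the expected pulse deltas and tolerance are hypothetical, and a real engine would track many more signals.

```python
# Hypothetical check of whether the user's measured response to a branch
# matches the response the logic engine expected.
EXPECTED_PULSE_DELTA = {"dragon_chase": 20,        # expected excitement
                        "quiet_forest_path": -10}  # expected calming

def response_as_expected(branch: str, pulse_before: float,
                         pulse_after: float, tolerance: float = 8.0) -> bool:
    """True if the pulse change is within `tolerance` bpm of the change
    the branch was expected to produce."""
    expected = EXPECTED_PULSE_DELTA[branch]
    return abs((pulse_after - pulse_before) - expected) <= tolerance

# The chase raised the pulse by only 5 bpm, far less than expected, so
# the engine might create or select a more intense branch next.
print(response_as_expected("dragon_chase", 70, 75))  # -> False
```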

In one embodiment, a processor is circuitry or logic enabled to control execution of a set of instructions. The processor may include one or more digital signal processors, central processing units, microprocessors, or other devices suitable for controlling an electronic device, including executing software, instructions, and programs, processing information and converting it to and from signals, and performing other related tasks.

“The memory 212 is a hardware element, device, or recording medium configured to store data for subsequent retrieval or access. The memory 212 may be static or dynamic memory. In one embodiment, the memory 212 and the logic engine 210 may be integrated. The memory may use any type of volatile or non-volatile storage techniques and media. The memory 212 can store information related to the status of the user, the wireless earpieces 102, the virtual reality headset 110, and other peripherals, such as a wireless device, a smart case for the wireless earpieces, smart watches, smart phones, and so on. In one embodiment, the memory 212 may store instructions and programs for controlling the user interface 214, which may include one or more LEDs or other light-emitting components, speakers, tactile generators, and so forth. The user input information associated with each command may also be stored in the memory 212.

The memory 212 may also store the interactive media content, such as stories or games, implemented by the virtual reality platform. The memory 212 can also store user profiles, including the measurements associated with desired user states (e.g., pulse, voice characteristics, blood chemistry, etc.). For example, the sensors 112 of the wireless earpieces 102 may measure adrenaline levels, including indirectly, and the interactive media content may be selected so that hormones and other chemicals, such as adrenaline, stay within a specified range or do not exceed one or more thresholds.
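
As a toy illustration of keeping an indirectly estimated chemical level inside a profile-defined range, consider the following hypothetical selector. The proxy formula, weights, and range limits are assumptions made for the example.

```python
# Hypothetical content selection keeping an adrenaline proxy in range.
# Adrenaline is estimated indirectly from pulse and skin conductance.
def adrenaline_proxy(pulse_bpm: float, skin_conductance: float) -> float:
    """Crude 0..1 arousal estimate; the weights are illustrative only."""
    return min(1.0, 0.6 * (pulse_bpm - 60) / 60 + 0.4 * skin_conductance)

def pick_segment(proxy: float, low: float = 0.2, high: float = 0.7) -> str:
    if proxy > high:
        return "calming_segment"     # above range: wind the story down
    if proxy < low:
        return "exciting_segment"    # below range: raise the stakes
    return "continue_current_arc"    # within range: no change needed

print(pick_segment(adrenaline_proxy(pulse_bpm=130, skin_conductance=0.8)))
# -> calming_segment
```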

The transceiver 216 is a component comprising both a transmitter and a receiver, which may be combined and share common circuitry in a single housing. The transceiver 216 may communicate using Bluetooth, Wi-Fi, ZigBee, Ant+, near field communications, wireless USB, infrared, ultra-wideband, cellular (3G, 4G, 5G, PCS, GSM, etc.), or other suitable radio frequency standards, networks, protocols, or communications. The transceiver 216 may also be a hybrid transceiver that supports a number of different communications. The transceiver 216 may communicate with the virtual reality headset 110 or other systems using wired interfaces (e.g., wires, traces, etc.), NFC, or Bluetooth communications.

The components of the wireless earpieces 102 may be electrically connected using any number of wires, contact points, leads, busses, and so forth. In addition, the wireless earpieces 102 may include any number of computing and communications components, devices, or elements, such as motherboards, busses, chips, circuits, sensors, ports, interfaces, connectors, adapters, transceivers, displays, antennas, and other similar components. The physical interface 215 allows for connection and communication with the virtual reality headset 110 or other electrical components.

“The physical interface 215 can include any number of pins, arms, or connectors for electrically interfacing with external devices, charging or synchronization devices, or other components. For example, the physical interface 215 may be a micro USB port. In one embodiment, the physical interface 215 is a magnetic interface that automatically couples to contacts or an interface of the virtual reality headset 110. In another embodiment, the physical interface 215 may include a wireless connector for charging the wireless earpieces 102 without a physical connection to a charging device.

The user interface 214 can be used to receive commands, instructions, or input through touch (haptics), voice commands, or predefined motions. The user interface 214 may be used to control the other functions of the wireless earpieces 102. The user interface 214 may include an LED array, one or more touch-sensitive buttons or portions, a miniature screen or display, optical sensors, or other input/output components. The user interface 214 may be controlled by the user or based on commands received from the virtual reality headset 110 or another linked wireless device.

“In one embodiment, the user may provide feedback by tapping the user interface 214 once, twice, three times, or any number of times. Similarly, a swiping motion may be used across or in front of the user interface 214 (e.g., the exterior surface of the wireless earpieces 102) to implement a predefined action. Swiping motions in any number of directions may be associated with specific activities, such as playing music, fast forwarding, pausing, rewinding, or activating a digital assistant (e.g., Siri, Cortana, or other smart assistants). The swiping motions may also be used to control actions and functionality of the virtual reality headset 110 or other external devices (e.g., a smart TV, a camera array, a smart watch, etc.). The user may also provide input through head movements or based on the user’s position or location. For example, the user may use voice commands, head gestures, or touch commands to change the content displayed by the virtual reality headset 110.

“The sensors 112 may include pulse oximeters, accelerometers, magnetometers, inertial sensors, photodetectors, miniature cameras, chemical sensors (e.g., for blood, sweat, excretions, odors, etc.), and other similar instruments for detecting location, orientation, motion, and so forth. The sensors 112 may also be used to gather optical images, data, and measurements and to determine acoustic noise levels, ambient noise, and other environmental conditions. The sensors 112 may provide measurements or data that may be used to filter images or select images for display by the virtual reality headset 110. For example, motion or sound may be detected to command the virtual reality headset 110 to display images from the user’s left side; although motion or sound may be used, any number of triggers may be used to send commands to the virtual reality headset 110. The sensors 112 can detect the user’s emotional and physical reactions to the media content. The measurements of the sensors 112 can be correlated with any number of stress levels or emotional or physical states. For example, the logic engine 210 may compare the measured data against baseline readings associated with various user conditions and states to determine the condition and status of the user.

The virtual reality headset 110 may include components similar to those of the wireless earpieces 102, such as a battery 218, a memory 220, and sensors 224. The virtual reality headset 110 may include a logic engine 226 for executing and implementing the processes and functions described herein. The battery 218 may be embedded in the frame of the virtual reality headset 110 and may have extra capacity that can be used to charge the wireless earpieces 102. In addition, the virtual reality headset 110 may include a power adapter, interface, or amplifiers. All or portions of the logic engine 226, the sensors 224, the user interface 222, the display 228, and the transceiver 233 may be integrated with the frame and/or lenses of the virtual reality headset 110.

The user interface 222 of the virtual reality headset 110 may include a touch interface or display indicating the status of the virtual reality headset 110. For example, an external LED light may indicate the battery status of the virtual reality headset 110 as well as the connected wireless earpieces 102, the connection status (e.g., linked to the wireless earpieces 102, the wireless device 104, etc.), the download/synchronization status (e.g., synchronizing, complete, last synchronization, etc.), or other similar information. The user interface 222 may also include optical sensors that track the user’s eye movements and facial gestures in order to perform motions or actions based on the sensed input.

The display 228 may be embedded in the lenses of the virtual reality headset 110, or may represent one or more projectors that project content directly or indirectly to the user’s eyes. For example, the display 228 may be a transparent organic light emitting diode (OLED) display used to display content. The projectors of the display 228 may use any number of wavelengths or light sources to display data, images, or other content to the user. In another embodiment, the display 228 may be transparent or semi-transparent, allowing the virtual reality system to function as an augmented reality system; for example, the virtual reality system may be used in any number of real-world environments. The content displayed by the display 228 can be modified or reconfigured in response to the user’s reactions (e.g., to attain desired emotional and physical states, conditions, or measurements).

Display actions may also be performed using an LED array of the user interface 222. For example, the LED array may be activated when someone or something is in a cyclist’s blind spot. In another embodiment, device status indications emitted by the LED array of the wireless earpieces 102 may be activated by the user interface 222 of the virtual reality headset 110. The user interface 222 may include a physical interface that charges the battery 218; the physical interface may be part of the user interface 222 or a separate interface. For example, the user interface 222 may include a hardware interface, such as a port or connector, for connecting the virtual reality headset 110 to a power supply or other electronic device. The user interface 222 can be used for both charging and communication with externally connected devices. For example, the user interface 222 may be a mini-USB, micro-USB, or other miniature standard connector. A wireless inductive charging system may also be used to replenish power to the wireless earpieces 102, and inductive charging may likewise be used to charge the virtual reality headset 110.

“In another embodiment, the virtual reality headset 110 may also include sensors 224 for detecting the location, orientation, movement, and biometrics of the user as well as proximity to the wireless earpieces 102. For example, the sensors 224 may include optical sensors, cameras, and microphones for capturing images and other content about the user (e.g., eye movements, body motions, expressions, arm and leg positioning, etc.) and the user’s periphery (e.g., to the sides, behind, etc.). The virtual reality headset 110 can detect any number of wavelengths and spectra to provide distinct images, enhancements, data, and content. The virtual reality headset 110 may also include an LED array, galvanic or other touch sensors, a battery, a solar charger, actuators, and so forth.

“The original packaging of the wireless earpieces 102 and the virtual reality headset 110 may include peripheral devices, such as charging cords, power adapters, inductive charging adapters, solar chargers, solar cells, batteries, smart case covers, and transceivers (e.g., Wi-Fi, cellular, etc.).”

“In one embodiment, the virtual reality system may include an interactive book 105. The interactive book 105 may include any of the components of the wireless earpieces 102 or the virtual reality headset 110. For example, the interactive book 105 may include a battery, a memory, a logic engine/processor, a user interface, physical sensors, and transceivers. In one embodiment, the interactive book 105 may include one or more pages that can be navigated by the user. The pages may include interactive components, such as touch screens, interactive elements, styluses, or any other components or appendages for user interaction. The interactive book 105 may display different content based on the story or game being implemented. The interactive book 105 may also include one or more speakers, vibration components, dynamic braille interfaces, or other communications components for bi-directional communication with the user.

“As described, the interactive book 105 can alter the communicated content based on the measurements of the sensors 112 and the user status as processed and stored by the logic engine 210. The interactive media content may be stored by the interactive book 105 or the wireless earpieces 102. The interactive book 105 can communicate with any or all of the devices of the virtual reality system via a wireless or wired connection. For example, the interactive book 105 may communicate with the wireless earpieces 102 using a Bluetooth connection. In one embodiment, the wireless earpieces 102 may communicate audio associated with the interactive media content displayed by the interactive book 105 to the user.

“All or portions of the virtual reality system may function independently, simultaneously, as a networked group, or using direct communications. The wireless earpieces 102, the virtual reality headset 110, and the interactive book 105 may be used individually or jointly to achieve the purposes, functionality, features, and methods described herein.

“FIG. 3 is a flowchart of a process for implementing interactive media content in accordance with an illustrative embodiment. In one embodiment, the process of FIG. 3 may be executed by one or more wireless earpieces, such as the wireless earpieces 102 of FIG. 1. The process of FIG. 3 may be used to communicate information between the wireless earpieces, an interactive book, and/or a virtual reality headset. The wireless earpieces can synchronize playback of audio content with the interactive media content. The wireless earpieces can measure the user’s biometrics to determine the user’s emotional and physical state. The wireless earpieces can also detect the position and location of the user, such as the angle, position, motion, and location of the user’s head, in order to present accurate virtual information. The wireless earpieces may communicate with any number of electronic devices, such as smart cases, computing devices, and entertainment devices, to perform the process of FIG. 3.”

First, the wireless earpieces receive interactive media content for communication to a user (step 302). The wireless earpieces can communicate with electronic devices, such as a virtual reality headset, an interactive book, a virtual reality processing system, or other electronic devices, to receive the interactive media content. The interactive media content can include stories, games, communications, role-playing scenarios, trainings, and so forth. The interactive media content may be selected for playback using any number of inputs, such as file selections or icon selections, and may be selected by the user or by one or more remote devices. During step 302, the interactive media content is communicated to the user by the virtual reality system or the wireless earpieces.

“Next, the wireless earpieces measure the user biometrics utilizing the wireless earpieces (step 304). The user biometrics, such as pulse rate, blood pressure, and temperature, may be measured using any number of sensors. In another embodiment, the user biometrics may be determined using any number of other devices in communication with the wireless earpieces. For example, the wireless earpieces may communicate with cell phones, smart watches, smart bands, headgear, vehicle components, virtual reality components, and so forth.

“Next, the wireless earpieces determine a user condition associated with the user biometrics (step 306). The user condition may be determined using information associated with the user biometrics. Any number of biometric thresholds may be used to determine the user condition. For example, voice characteristics, heart rate, and blood pressure may be used to determine whether the user is afraid. The wireless earpieces may also run the user through a scenario or training program that associates the user’s biometrics with known conditions, including emotional and physical states. In other embodiments, the wireless earpieces may periodically prompt the user to indicate the user’s status so that the user’s biometrics may be associated with that status. A database may be used to track defaults, programs, custom user biometrics, baseline readings, and thresholds for determining the user’s state.

“Next, the wireless earpieces modify the branching patterns in the interactive media content based on the user condition (step 308). In one embodiment, during step 308 the wireless earpieces or the virtual reality system may select one of several available branching patterns to apply based on the user condition. The branching patterns may represent any number of variables, conditions, parameters, or outcomes that may be applied to the interactive media content. The interactive media content may be modified by the wireless earpieces directly or based upon a command, instruction, application, feedback, or other information communicated by the wireless earpieces. For example, an interactive book or virtual reality system may apply the branching patterns to respond to the user condition and status determined from the biometric data and other information.
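
Putting steps 302 through 308 together, a schematic control loop might look like the sketch below. Every class and function here is an illustrative stand-in; the patent defines no programming interface.

```python
# Hypothetical end-to-end loop over steps 302-308 of FIG. 3.
class Earpieces:
    def __init__(self, pulses):
        self.pulses = iter(pulses)
    def measure_biometrics(self):            # step 304: measure biometrics
        return {"pulse": next(self.pulses, 75)}
    def play(self, segment):
        print("playing:", segment)

def classify(bio):                           # step 306: user condition
    return "stressed" if bio["pulse"] > 100 else "content"

def run_session(segments, earpieces):
    for segment in segments:                 # step 302: content received
        bio = earpieces.measure_biometrics()
        if classify(bio) == "stressed":      # step 308: modify branching
            segment += "_calm_variant"
        earpieces.play(segment)

run_session(["intro", "chase", "finale"], Earpieces([80, 120, 90]))
```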

“In one embodiment, before the process of FIG. 3 is performed, the wireless earpieces may be linked to the virtual reality headset. For example, the user may magnetically attach the wireless earpieces to the virtual reality headset; connectors on each side of the virtual reality headset may correspond to the right and left wireless earpieces. The magnetic contacts may couple the two devices and provide data communication. The virtual reality headset may have a significantly larger power capacity than the wireless earpieces. In another embodiment, connectors and ports may allow a physical connection between the wireless earpieces and the virtual reality headset. The virtual reality headset and the wireless earpieces may also communicate using an inductive connection. In another embodiment, the wireless earpieces may communicate with the virtual reality headset using short-range communications, such as Bluetooth, NFMI, or ANT+. The wireless earpieces may be connected to the frame, connectors, extensions, or arms of the virtual reality headset.

The wireless earpieces can also synchronize the playback of audio and visual content (together, media content) with the virtual reality headset. The wireless earpieces can use any number of sensors to determine the location and velocity (e.g., linear, angular, etc.) of the user, the position of the user and the user’s head, and biometric conditions (e.g., heart rate, blood oxygenation, temperature, etc.). This and other information can be communicated to adjust audio effects, such as the volume, tuning, balance, and fade, of the audio content. The wireless earpieces can also send or receive commands for managing and synchronizing the audio content with the visual content.

“The information can be coordinated based on the user’s actions, conditions, position, location, and so forth. For example, to make the environment seem more realistic, specific three-dimensional sounds corresponding to the user’s left and right ears may be played by the wireless earpieces. The volume and other effects may also be adjusted to match the orientation of the user’s avatar within the virtual environment. The audio content may include flags, timestamps, or other synchronization information used to align playback. Synchronization of the audio and visual content helps ensure the user does not become disoriented, motion sick, or otherwise adversely affected.
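
A crude sketch of per-ear volume adjustment from head orientation, in the spirit of the balance and fade adjustments described above, is shown below; the constant-power panning law is a simplifying assumption, not a method taken from the patent.

```python
# Hypothetical left/right gain computation from head yaw, so a sound
# source stays anchored in the virtual environment as the head turns.
import math

def ear_gains(source_angle_deg: float, head_yaw_deg: float):
    """Return (left_gain, right_gain) in 0..1 using constant-power panning.
    Angles are measured clockwise from straight ahead."""
    relative = math.radians(source_angle_deg - head_yaw_deg)
    pan = math.sin(relative)                  # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)  # constant-power panning law
    right = math.sin((pan + 1) * math.pi / 4)
    return round(left, 3), round(right, 3)

# Sound dead ahead, head turned 90 degrees right -> sound is now on the left.
print(ear_gains(source_angle_deg=0, head_yaw_deg=90))  # -> (1.0, 0.0)
```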

“The wireless earpieces then communicate the audio content associated with the visual content presented by the virtual reality headset to the user. The audio content can be played based on the synchronization information shared between the virtual reality headset and the wireless earpieces. The wireless earpieces can play different content depending on the virtual reality environment the user is experiencing, using different sounds, volumes, and audio effects. The user can experience a unique virtual environment with corresponding sounds, without bearing the extra weight or other forces imposed by larger sound systems.

“The illustrative embodiments provide a system, method, and wireless earpiece(s) for self-configuration automatically or based upon environmental conditions, detected activities, thresholds, or user input. The illustrative embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident code, etc.), or an embodiment combining software and hardware aspects, all of which may generally be referred to herein as a ‘circuit,’ ‘module,’ or ‘system.’”

“Computer program code for carrying out operations of the embodiments may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the ‘C’ programming language or similar languages. The program code may execute entirely on the user’s computer, partly on the user’s computer as a standalone software package, partly on the user’s computer and partly on a remote computer, or entirely on the remote computer or server. The remote computer may be connected to the user’s computer through any type of network, such as a local area network, a personal area network, or a wide area network, or the connection may be made to an external computer through the Internet using an Internet Service Provider.

“FIG. 4 illustrates sensors 112 of the wireless earpieces in accordance with an illustrative embodiment. The sensors 112 may include one or more inertial sensors 244, such as magnetometers, accelerometers, and/or gyroscopes. The inertial sensors 244 may be used to track the position and movements of the user’s head. The sensors 112 may also include one or more biometric sensors, such as temperature sensors, pulse oximeters, perspiration sensors, or other types of sensors. The biometric information can be used to track physiological indicators of stress, which may be associated with different parts of a story arc, particular characters, game actions, or other elements within a book or game. Also shown are one or more microphones 243. The microphones 243 can detect the voice of the user within the environment, but may also be used to detect subtle sounds, such as breathing patterns or utterances, that indicate the user’s emotional reaction to an element of a story or game. Additional information may be obtained from the environment or the user through one or more cameras. Any number of sensors may be used, and the sensors need not be limited to those shown.

“As explained previously, information detected by the one or more sensors may suggest different branching patterns, and different branching patterns may be used for different purposes. Branching patterns can be used to decrease intensity; for instance, where a user is prone to becoming overly excited, to the point of emotional or physical problems, branching patterns that are less likely to excite the user may be selected. Branching patterns can also be used to intentionally increase the intensity of a user’s experience. For example, if story elements cause an increased heart rate (detected with heart rate sensors of one or more of the earpieces), increased perspiration (detected with a moisture sensor of one or more of the earpieces), or shallower breathing (detected with a microphone), then branches that support this response can be taken. Different models can be used to predict how a user will react to different branches. These models may be based on past reactions, either to the same interactive media content or to other interactive media content. The models may include neural networks, fuzzy logic algorithms, and genetic algorithms, or may be statistical in nature. Different branching patterns may have different weightings associated with an inferred emotional response. For example, if certain branches are associated with greater excitement across a population, such as based on biometric data including heart rate, those branches may be given higher weightings. When deciding between two branches in an interactive story or game, the branch most closely associated with the desired emotional response (e.g., the higher weighting where a stronger emotional response is sought) may be chosen. It should be understood that different models may be used depending on the interactive story or content and the inferred emotions to be increased or decreased. It should also be understood that the objectives need not be binary: one emotional response may be increased while another is decreased, or a response may be decreased for a time before being increased again.
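
The weighting scheme described above might be realized as something like the following sketch; the prior weights, learning rate, and normalization are illustrative assumptions rather than details from the patent.

```python
# Hypothetical weighted branch choice with a population-derived prior and
# a per-user update from the observed heart-rate response.
BRANCH_WEIGHTS = {"dragon_chase": 0.8,     # population prior: exciting
                  "village_market": 0.3}   # population prior: mild

def choose_exciting_branch(weights: dict) -> str:
    """When the goal is more excitement, pick the branch most strongly
    associated with an elevated emotional response."""
    return max(weights, key=weights.get)

def update_weight(weights: dict, branch: str, pulse_delta: float,
                  lr: float = 0.05) -> None:
    """Nudge a branch's weight toward the user's actual reaction, where
    pulse_delta is the bpm change observed while playing the branch."""
    observed = max(0.0, min(1.0, pulse_delta / 30.0))  # normalize to 0..1
    weights[branch] += lr * (observed - weights[branch])

print(choose_exciting_branch(BRANCH_WEIGHTS))    # -> dragon_chase
update_weight(BRANCH_WEIGHTS, "dragon_chase", pulse_delta=6.0)
print(round(BRANCH_WEIGHTS["dragon_chase"], 3))  # weight drifts toward 0.2
```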

“The illustrative embodiments are not limited to those described herein. In particular, the illustrative embodiments allow for many variations in how embodiments can be applied. This description is intended as a guideline and explanation; it is not meant to be exhaustive nor to limit the disclosure to the specific forms described. Other alternatives and exemplary aspects may be included in the disclosure. This description only illustrates some embodiments, processes, or methods of the invention. Any modifications, substitutions, and/or additions that fall within the scope and spirit of the disclosure may be made. As can be seen, the disclosure achieves at least the intended goals.

“The detailed description of the invention in the preceding paragraphs is limited to a few embodiments and is not meant to limit its scope. The claims set forth some of the disclosed embodiments with greater specificity.

Summary for “Variable computing engine to create interactive media using biometrics”

“Audiobooks are generally less popular in English-speaking countries. Bantam Books’ variable branch books have been less popular than other story formats. These interactions have been largely replaced by the internet gaming industry, either alone or in conjunction with other industries. This art form is now less popular. It is time to create a new way for users to interact with the storyline and even influence its ending.

“Therefore it is a primary feature, object, or advantage of this invention to improve the state of art.”

It is an additional object, feature, or advantage to provide a system that detects biometric parameters of users and includes them in the branching patterns.

“It is an additional object, feature, and advantage to modify the branching patterns of the game based on user biometric inputs.”

“Another object or feature is to reduce the intensity of a story in order to provide a protective function in the event that the user experiences excessive stress.”

“Yet another feature, object, or advantage is to raise the action or associated stress levels of the story based on biometric data fed into the engine, relative to a desired level of interaction within the story or branch.”

“An additional object, feature or advantage is to protect users by intentionally calming them through a modification of branches if biometric sensors detect a problem level of interaction.”

It will be clear from the following description that one or more of the objects, features, and advantages may be included in a given embodiment. Not every embodiment need include each of these features, advantages, and objects; different embodiments could have different features or advantages.

“A system and method of implementing interactive media content are provided according to one aspect. At least wireless earpieces can be used to receive interactive media content. The wireless earpieces measure user biometrics. The user condition is associated with the biometrics. The user condition determines whether branching patterns in the interactive media content will be modified. Another embodiment includes wireless earpieces with a processor that executes a set of instructions and a memory that stores the set of instructions. These instructions execute the above-described method.

“A virtual reality system is described in another aspect. A virtual reality headset is used to display a virtual environment to the user. Wireless earpieces are also part of the virtual reality system. They receive interactive media content from the virtual reality headset, communicate it to the user via at least the wireless earpieces, determine user biometrics using biometric sensors of the wireless earpieces, and modify branching patterns in response to the user’s condition.

“The illustrative embodiments offer a system, method, and wireless earpieces that allow a user to interact with a gaming system. For simplicity’s sake, a “game” may refer to the applicable story, game, multiplayer interface, or any other media content. The wireless earpieces can sense biometrics. Biometrics can include the user’s pulse, blood pressure, and temperature. They also may be used to determine stress levels, as well as other parameters, factors, characteristics, or measured conditions. The user can also interact with the game using voice, audio, tactile input, and gestures.

“The game could be set up to use a user profile to manage how interactions and biometrics are handled by the game. In one embodiment, different game variables, branches, decisions, or other attributes can be implemented based on user biometrics. Variables such as branching patterns and potential story lines can be altered based on user biometrics. Based on the biometrics, a branch can be used to soothe, upset, calm, or otherwise affect the user to reach a desired level of interaction. For example, user preferences may be set to calm a child so that the storyline becomes more calming as it progresses. The game can be customized for each user: it can be made to please, terrify, or keep the player guessing. The game can include audio, visual, tactile, and olfactory media.
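
As a purely illustrative sketch of such a profile, the following snippet shows how stored preferences might steer branch selection toward a desired state, with a protective ceiling. All field names and values are assumptions, not part of the patent.

```python
# Hypothetical user-profile sketch: preferences that steer branching toward
# a desired state (e.g., calming a child as a bedtime story progresses).
profile = {
    "user": "child_1",
    "desired_state": "calm",        # soothe, excite, keep-guessing, ...
    "max_heart_rate_bpm": 110,      # protective ceiling
    "allowed_media": ["audio", "visual", "tactile"],
}

def branch_goal(profile: dict, heart_rate_bpm: float) -> str:
    """Decide whether upcoming branches should calm or follow preferences."""
    if heart_rate_bpm > profile["max_heart_rate_bpm"]:
        return "calm"               # protective function overrides preference
    return profile["desired_state"]

print(branch_goal(profile, 125))  # -> calm
```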

“In one embodiment, the game may include a digital content device (e.g., an e-book reader, a tablet, or a cell phone). The user may access the media content through this device. An interactive book can communicate with the wireless earpieces in order to receive input from the user and communicate information. Another embodiment of the game could include a virtual reality system with a headset. The interactive book or virtual reality headset can communicate with the wireless earpieces wirelessly or via a wired connection. For example, the virtual reality headset could be connected to the wireless earpieces using magnetic connectors: wires that extend from the virtual reality headset can magnetically couple to the earpieces and use induction for communications and power transfer. As another example, the interactive book or virtual reality headset may have a frame that connects to the wireless earpieces. Another embodiment may use a wireless protocol, standard, or connection, such as Bluetooth, NFMI, or Wi-Fi.

The wireless earpieces can be worn in the user’s ears, fitting at least partially into the user’s external auditory canal. The ear canal provides a rich area for biometric measurements as well as stabilization of the wireless earpieces, so the wireless earpieces can be used during rigorous physical activities that require stability. The wireless earpieces are designed for extended wear and the collection of valuable information using their sensors. Wireless earpiece sensors may include accelerometers, gyroscopes, microphones, cameras or imaging devices, contact/conductivity sensors, pulse oximeters, and so forth. Important measurements taken by the sensors may include pulse rate, blood oxygenation, microphone readings, position/orientation, location, temperature, altitude, cadence, calorie expenditure, and so forth.”

“The wireless earpieces can include any number of sensor arrays that capture information about the user. This large amount of data could allow the configuration of the wireless earpieces to be adjusted dynamically. The wireless earpieces can also learn from historical data and user input over time. An alert may be sent to the user with information about the current status of the configuration adjustment process.

“The illustrated embodiments can be used for entertainment, scientific, educational, or commercial purposes. Virtual reality headsets, such as those made by Google, HTC, Samsung, Oculus, Sony, Microsoft, and others, can present any number of three-dimensional visualizations to users. These embodiments minimize the existing mass problems associated with bulky audio or headphone systems: the angular momentum associated with the user’s head does not change significantly, decreasing the effects of torque, neck strain, and other problems that could be associated with virtual reality systems.

The wireless earpieces can include any number of sensors that communicate with the components, systems, and sensors of the interactive story to enhance the user’s experience. In one embodiment, the sensors of the wireless earpieces may include accelerometers, gyroscopes, magnetometers, and optical sensors. The sensors can collect data that may be used to determine the user’s characteristics, location, stress level, mental state, and other factors. The data can be used to improve the user’s experience in a virtual reality game. The sensors also provide data that improves the measurements made by the virtual reality headset.

To provide better three-dimensional spatial sound imaging, the precise location and position of the user may be used. The sensors can also be used to detect biometric information, such as heart rate, blood pressure, skin physiology, or any other biometric data. This data can be used to determine whether the user feels safe while playing a virtual reality game or is becoming fatigued or stressed. Game interaction can also be modified based on the user’s condition: different branches, sub-branches, or endings can be used. The wireless earpieces can communicate with the interactive story, virtual reality headset, or other devices to play music, block or filter sound, amplify sounds, and so forth.

The wireless earpieces can be used for everyday activities, such as exercise, phone calls, travel, and so forth, and can also serve as the audio interface of an interactive book or virtual reality system. The system is therefore less expensive and lighter than systems with dedicated audio components. Because the wireless earpieces are small, strain and weight on the user are reduced. The various games may also include a separate power source or battery that can be used to power the wireless earpieces on the go.

“FIG. 1 is a pictorial representation of a communication environment 100 in accordance with an illustrative embodiment. The wireless earpieces 102 can be configured to communicate with one another and with one or more wireless devices 104.

“The wireless earpieces 102 may be worn by a user 106; they are shown separate from their position within the ears of the user 106 for purposes of visualization. FIG. 2 shows a block diagram of the wireless earpieces 102 to further illustrate their components and operation. The first and second wireless earpieces are shown together; however, only a single wireless earpiece is required.

“In certain applications, temporary adhesives or securing mechanisms (e.g., straps, clamps, lanyards, extenders, chargers, etc.) may be used to ensure that the wireless earpieces 102 remain in the user’s ears during strenuous and physically demanding activities. For example, the wireless earpieces may be used during swimming, virtual reality simulations, interactive gaming, biking, hiking, parachuting, team sports, and other activities. The wireless earpieces may be used to play music or audio, make and receive phone calls or other communications, determine ambient environmental conditions (e.g., temperature, altitude, location, speed, heading, etc.), read user biometrics (e.g., heart rate, motion, temperature, sleep, blood oxygenation, voice output, calories burned, forces experienced, etc.), and receive feedback or instructions from the user. The logic of the wireless earpieces 102 may allow for the dynamic configuration of components, such as microphones and speakers, according to conditions in the communication environment 100.

“The position of the wireless earpieces 102 could be determined relative to the VR headset 110, other wireless devices 104, and the interactive book 105. The position information of the wireless earpieces may be used to determine the proximity of the devices within the communication environment 100; for example, it may be used to determine the distance between the devices. In one embodiment, distance information is used to determine whether the wireless earpieces 102 are being worn simultaneously (e.g., whether they should be exposed to similar environmental conditions, noise, etc.). Multiple sets of wireless earpieces can also communicate simultaneously in one embodiment; for example, multiple wireless earpieces can communicate simultaneously with the VR headset 110 and the interactive book 105.

“In one embodiment, the wireless earpieces 102 and their associated sensors (whether external or internal) can be set up to record a variety of measurements or log data during normal use. The sensor measurements can be used to extrapolate other measurements, factors, and conditions relevant to the user 101. For example, the sensors can monitor the user’s heartbeat and EKG to identify the user’s unique patterns or characteristics. The user 101 or another person can configure the wireless earpieces 102 and interactive book 105, either directly or through a connected application (e.g., a mobile application with a graphical user interface), to store and share data, audio, and images. Examples of standard usage include recording and detecting a heartbeat, setting biometrics and user input for implementing variable branching or branching decisions in a game, setting noise thresholds or microphone sensitivity, setting gestures/inputs to perform an action (such as opening an application, playing music, or providing audio feedback), actively participating in a conversation, listening to or playing music, and so forth.

“The wireless earpieces 102 can also determine the user’s status, condition, or other information using biometric data from the sensors and user input. The sensors can determine the user’s emotional and physical state. The memory of the wireless earpieces 102 can store information that allows the user information to be determined; for example, the wireless earpieces 102 may store default values, thresholds, data, and other information used to determine the user information. The user information can also be associated with any number of categories (e.g., happy, stressed, scared, or content).
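
A minimal sketch of how stored thresholds might map biometric readings to the named categories follows; the thresholds and category rules are invented for illustration and are not specified by the patent.

```python
# Hypothetical sketch: associating biometric readings with a user condition
# using stored defaults and thresholds, as the earpiece memory might hold.
def classify_condition(pulse_bpm: float, skin_conductance: float) -> str:
    """Map two readings to one of the example categories from the text."""
    if pulse_bpm > 120 and skin_conductance > 0.8:
        return "scared"
    if pulse_bpm > 100:
        return "stressed"
    if pulse_bpm < 70 and skin_conductance < 0.3:
        return "content"
    return "happy"

print(classify_condition(pulse_bpm=115, skin_conductance=0.5))  # -> stressed
```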

The wireless earpieces may be configured or reconfigured during the initial setup process, during regular usage, or upon a request by the user. In one embodiment, the wireless earpieces 102 may take baseline readings to determine whether the communication environment 100 is noisy or quiet, as well as to read information about the emotional and physical state of the user 101 (e.g., anxious, scared, happy, calm, stressed). The communication environment 100 could include the user’s house, a library, a roadway, a work area, an office, a school, a business, or a venue for sports and other activities. One embodiment may set default configurations for the microphones and speakers based on the location of the wireless earpieces. The wireless earpieces 102 may be further adjusted to the best configuration of the microphones and speakers based on real-time noise levels at the location.
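
For illustration, a hypothetical sketch of such baseline-driven configuration: a measured noise floor selects a default microphone/speaker configuration. The threshold and settings are assumptions, not values from the patent.

```python
# Hypothetical sketch: taking a baseline noise reading and selecting a
# default microphone/speaker configuration for the current location.
QUIET_DB = 45.0  # invented threshold for a "quiet" environment

def configure_audio(ambient_db: float) -> dict:
    """Return a microphone/speaker configuration for the measured noise floor."""
    if ambient_db <= QUIET_DB:   # e.g., a library or bedroom
        return {"mic_gain": "low", "speaker_volume": 3, "noise_cancel": False}
    return {"mic_gain": "high", "speaker_volume": 7, "noise_cancel": True}

print(configure_audio(ambient_db=62.0))  # noisy roadway -> louder profile
```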

The wireless earpieces may contain any number of sensors 112, as well as logic for measuring and determining user biometrics such as heart rate, pulse rate, skin conduction, and blood oxygenation, along with voice and audio output. The sensors 112 can also determine the user’s position, velocity, impact level, and other information. The sensors 112 can also receive input from the user and translate it into commands or selections made across the personal area network. For example, the wireless earpieces may detect voice commands, head motions, finger taps, gestures, and other motions. The wireless earpieces may measure user input and convert it into internal or external commands that can be sent to mobile devices, computers, or other wearable devices. For example, the wireless earpieces 102 can detect the head movements and voice commands of the user 101, which may be used to adjust for the noise levels in the communication environment 100 or to increase or decrease volume.
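
A minimal sketch of such input translation, assuming a simple lookup from sensed events to commands; all event and command names are hypothetical.

```python
# Hypothetical sketch: translating sensed user input (taps, head motions,
# voice) into internal or external commands via a lookup table.
COMMANDS = {
    ("tap", 2): "pause_story",
    ("head_nod", 1): "accept_branch",
    ("head_shake", 1): "reject_branch",
    ("voice", "volume up"): "volume_up",
}

def translate(event_type: str, detail) -> str | None:
    """Map a sensed gesture or utterance to a command, if one is defined."""
    return COMMANDS.get((event_type, detail))

print(translate("head_nod", 1))  # -> accept_branch
```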

“The sensors 112 can make all measurements regarding the user 101, or can communicate with any number of other sensory devices (e.g., the interactive book 105, etc.) in the communication environment 100 to measure information and data about the user 101 as well as the communication environment 100 itself. In one embodiment, the communication environment 100 could represent all or part of a personal area network. The wireless earpieces 102 can be used to control, communicate with, manage, and interact with a variety of wearable devices or electronics, such as smart glasses, helmets, watches, wrist bands, chest straps, implants, displays, clothing, and other wireless earpieces. A personal area network allows data transmissions among devices such as personal computing, communications, camera, vehicle, entertainment, and medical devices. The personal area network can use any combination of wireless, wired, or hybrid configurations, and may be stationary or dynamic. For example, the personal area network could use wireless protocols and standards such as INSTEON, IrDA, Wireless USB, or Bluetooth. In one embodiment, the personal area network can move with the user 101.

“In other embodiments, the communication environment 100 can include any number of devices or components that communicate directly or indirectly with one another through a wireless (or wired) connection, signal, or link. The communication environment 100 can include one or more networks and network components and devices, such as signal extenders, routers, signal generators, intelligent network devices, and computing devices.

“Communications within the communication environment 100 can occur through a network 120, directly between devices such as the wireless earpieces 102 and the interactive book 105, or indirectly through a network, such as a Wi-Fi network. The network 120 can communicate with or include wireless networks such as Wi-Fi, cellular (e.g., 3G, 4G, 5G, PCS, GSM, etc.), Bluetooth, or other radio frequency networks. It may also communicate with any number of hard-wired networks, such as coaxial or fiber-optic networks. Communications within the communication environment 100 can be managed by one or more service or network providers (e.g., secure, public, or private providers).

“The wireless earpieces 102 can play, communicate, and use any number of alerts to indicate the status of interactive gaming in the communication environment 100. For example, one or more alerts could indicate branch adjustments based on the user’s state. The alert can be sent to the user 101 or to another device or individual with administrative rights (e.g., a parent, caregiver, etc.). The alert can be presented using specific tones, verbal acknowledgments, tactile feedback, or other forms. For example, an alert could be played at each stage of a game. An alert can also be sent to the interactive book 105.

“In some embodiments, the wireless earpieces 102 may vibrate, flash, play a tone, or make other sounds to indicate the game’s automatic self-configuration process, prompting the user to take further actions, such as providing feedback or making a manual adjustment. The wireless earpieces 102 can also perform any number of related steps. The wireless earpieces 102 may also communicate an alert to other devices that shows up as a notification, message, or other indicator of the need for configuration or re-configuration, or of a changed status of the configuration process, such as an audio alert that “the story has been changed to calm your child.”

“The wireless earpieces 102 may have logic to automatically implement gaming configuration upon setup, start-up, connection to a gaming device (e.g., the interactive book 105, virtual reality headset 110, etc.), condition changes (e.g., location, activities, etc.), or other conditions and factors of the communication environment 100. The wireless device 104 can communicate with the wireless earpieces 102 and the user 106 to receive feedback, such as when the user disagrees with certain gaming variables.

“The wireless device 104 may contain an application that displays instructions and information to the user 106 based on the configuration of the game.

“In one embodiment, the wireless device 104 can use short-range or long-range wireless communications to communicate directly with the wireless earpieces 102 via a wireless signal, or with other devices in the communication environment 100. For example, the wireless device 104 may include embedded logical components containing a Bluetooth or cellular transceiver. The wireless signal could be Bluetooth, Wi-Fi, Zigbee, Ant+, near-field magnetic induction (NFMI), or another short-range wireless communication.

“The wireless device 104 can represent any number of wireless or wired electronic communications or computing devices, such as smart phones, laptops, desktop computers, control systems, displays, gaming devices, music players, or personal digital assistants. The wireless device 104 may communicate using near-field communications, Bluetooth, Wi-Fi, wireless Ethernet, or other means. For example, the wireless device 104 could be a touch-screen cellular phone that communicates with the wireless earpieces 102 via Bluetooth communications. The wireless device 104 can use any number of operating systems, kernels, instructions, or applications that make use of the sensor data from the wireless earpieces 102; for example, the wireless device 104 could run Android, iOS, Windows, or another system. The wireless device 104 and wireless earpieces 102 can execute any number of applications that use the user’s input, proximity data, and other feedback from the wireless earpieces 102 to initiate, authorize, or perform configurations and the associated tasks.

“As noted above, the layout of the components within the wireless earpieces 102, and the limited space available in products of such small size, could affect the placement of the sensors 112 and other components. The positions of the sensors 112 may differ based on the model, version, and manufacturing process of each wireless earpiece 102.

“The interactive book 105 can represent any number of electronic devices. In one embodiment, the interactive book 105 could be a book that includes sensors, a transceiver, a user interface (e.g., touchscreen, buttons, scroll wheels, virtual buttons, etc.), and processing components. The interactive book 105 can also represent any number of e-books, tablets, personal computing devices, electronic readers, and cell phones. The interactive book 105 can execute any number of steps or processes using digital logic, firmware, or software (e.g., applications, operating systems, kernels, instruction sets, etc.), in any combination.

“The interactive book 105 can communicate with the wireless earpieces 102, the wireless device 104, and/or the network 120 separately or simultaneously. The interactive book 105 can receive biometric readings and user input directly from the wireless earpieces 102 during communication of media content, including interactive stories, games, communications, and so forth. The wireless earpieces 102 and the interactive book 105 can perform the described processes for using user input and biometric information either individually or together. For example, the wireless earpieces may use a branching pattern or an algorithm to determine variables and make decisions that are implemented by the devices of the communication environment 100, including the interactive book 105. The relevant information may then be communicated between the wireless earpieces 102 and the interactive book 105.

“In another example, the user input and biometrics can be communicated by the wireless earpieces 102 for use by the interactive book 105. Based on the biometric readings, the wireless earpieces 102 could determine the physical and emotional state of the user 106, and the interactive book 105 may then receive the relevant information. The interactive book 105 can make any number of branching decisions and implement variables based on the user input, biometrics, or user state. The story or game may be modified to achieve a desired level of interaction or a desired user state. The interactive book 105 may also be used to treat individuals with disabilities. For example, if the wireless earpieces 102 detect that the user is stressed, a more relaxing storyline may be substituted; this can be done based on biometrics, vocalizations, head movements, and other user-detectable information. The interactive book 105 and wireless earpieces 102 can likewise reduce the intensity of a storyline that is having similar effects on the user.

“FIG. 2 is a pictorial representation of a virtual reality system in accordance with an illustrative embodiment. Any number of components and systems can be included in the virtual reality system. In one embodiment, the virtual reality system could include the wireless earpieces 102, which can include a left earpiece and a right earpiece, and a virtual reality headset 110 with a strap 114. A processing system 130 could also be included. The processing system 130 can communicate with the virtual reality headset 110 and the wireless earpieces 102.

In one embodiment, the processing system 130 may process media content sent through the VR headset 110 and wireless earpieces 102. The processing system 130 can also use biometric readings from the user to modify, amend, or otherwise change decisions, branches, or other content in a virtual reality game. In other embodiments, the wireless earpieces 102 process all of the biometric data from the user in order to adjust the game or other media content presented by the virtual reality system. The processing system 130 could represent a gaming system (such as an Xbox, PlayStation, or Nintendo system), a personal computer, a server, a cloud interface device, or other similar devices. The processing system 130 can accept any number of storage media, such as disks or cartridges, and may also access content through one or more networks or connections.

“The wireless earpieces 102 can play audio associated with the virtual content presented by the virtual reality headset 110. The wireless earpieces may also be set up to play music or audio, make and receive phone calls or other communications, determine ambient environmental readings (e.g., temperature, altitude, location, speed, heading, etc.), and read user biometrics (e.g., heart rate, motion, sleep, blood oxygenation, calories burned, etc.). The user biometrics and other readings can be used to make decisions about or determine the media content of the virtual reality system.

The virtual reality headset 110 simulates physical presence in places in the real or an imagined world and allows the user to interact with that environment. Virtual reality, also known as immersive multimedia, can be used to create sensory experiences that include sight, sound, touch, smell, and taste. The virtual reality headset 110 is powered by a battery, power plug, or other connection (e.g., a USB connection to a gaming or computing device). The virtual reality headset 110 can also communicate data (send or receive) using a wired or wireless connection with any number of computing, communications, or entertainment devices.

The visor 113 can be used to display visual and graphical information to the user. The visor 113 may include one or more displays (e.g., liquid crystal displays or organic LED displays) or projectors (direct, indirect, or refractive) for displaying information to the user’s eyes. The virtual reality headset 110 can also include touch screens, smell interfaces, and tasting interfaces. The shape and size of the virtual reality headset 110, the visor 113, and the strap 114 can vary by the make, model, manufacturer, and user configuration of the virtual reality headset 110, including those made by Google, HTC, Sony, Oculus, Samsung, Durovis, Valve, Avegant, and other companies. In other embodiments, the visor 113 can be transparent, glass, and/or see-through, allowing the user to interact with the real world while using the virtual reality system; for example, the virtual reality system can be used to show augmented reality content to the user.

The strap 114 is a band that runs between the sides of the visor 113 and secures the virtual reality headset 110 to the user’s head. The strap 114 can be made of any of a variety of materials, such as nylon, cotton, rubber, plastic, or the like. The strap 114 can have buckles, loops, or other adjustment mechanisms for fitting the virtual reality headset 110 to the user’s head. Some virtual reality headsets look more like helmets or have additional structural components (e.g., straps, arms, extensions, etc.) for attaching the virtual reality headset 110 to the user’s head during regular and vigorous use.

“With regard to the wireless earpieces 102, sensor measurements could refer to measurements made using one or both wireless earpieces. For example, the wireless earpieces may determine that the sensor signal for the pulse oximeter of the right wireless earpiece has become very noisy, and in that case use the sensor signal from the pulse oximeter of the left wireless earpiece as the primary measurement. The wireless earpieces may also switch back and forth between the right and left pulse oximeters in response to noise levels, allowing the clearest possible sensor signal to be used at any given time. In one embodiment, the wireless earpieces 102 switch sensor measurements when the measurements exceed or fall below a threshold.
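
For illustration, a hypothetical sketch of this left/right fallback, assuming each reading carries a noise estimate; the metric and threshold are invented, not taken from the patent.

```python
# Hypothetical sketch: falling back from a noisy right-earpiece pulse
# oximeter signal to the left one, keeping the clearest signal primary.
NOISE_THRESHOLD = 0.25  # invented noise ceiling in [0, 1]

def primary_pulse(left: dict, right: dict) -> float:
    """Each reading: {'bpm': float, 'noise': float in [0, 1]}."""
    if right["noise"] > NOISE_THRESHOLD and left["noise"] <= right["noise"]:
        return left["bpm"]   # right signal too noisy; use the left earpiece
    return right["bpm"]

print(primary_pulse({"bpm": 72, "noise": 0.1},
                    {"bpm": 75, "noise": 0.4}))  # -> 72
```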

“The user could also be wearing or carrying any number of sensor-enabled devices, such as pacemakers, heart rate monitors, smart glasses, smart watches, or bracelets (e.g., Apple Watch, Fitbit, etc.), and other sensory devices may be worn by, attached to, or integrated with the user. The wireless earpieces 102 may receive data and information from these external sensor devices. In another embodiment, the data and information from the external sensor devices may be used to further process the information sent by the wireless earpieces 102 to a wireless device. The media content can be modified based on the user readings to better satisfy the user or third parties observing the user.

“The sensors of the wireless earpieces 102 may be placed at enantiomeric (mirrored) locations. A number of colored light-emitting diodes (LEDs) may be placed to provide information and data, such as heart rate and respiratory rate. Data gathered by the LED arrays can be used as sample data or combined with data from the other sensors; sensor readings can be enhanced or strengthened with this additional data.

“As shown, the wireless earpieces 102 may be physically or wirelessly linked to the virtual reality headset 110. The wireless earpieces 102 can receive user input and commands for use on any device of the virtual reality system. As previously mentioned, the wireless earpieces 102 may be referred to as a pair (wireless earpieces) or individually (wireless earpiece), and this description may apply to the components and functionality of each of the wireless earpieces 102 individually or collectively.

“In one embodiment, the wireless earpieces 102 play audio corresponding to the virtual reality content displayed by the virtual reality headset 110. The wireless earpieces may also provide additional biometric and user data that can be used by the virtual reality headset 110 or by connected computing, entertainment, or communications devices, systems, or components.

“In certain embodiments, the virtual reality headset 110 can act as a logging tool that receives information, data, or measurements from the wireless earpieces 102. For example, the virtual reality headset 110 could be worn by the user to receive data from the wireless earpieces 102 in real time. The virtual reality headset 110 can thus be used to display, store, and synchronize data to and from the wireless earpieces 102. For example, the virtual reality headset 110 may display the user’s pulse, blood oxygenation, distance traveled, and calories burned. The virtual reality headset 110 and wireless earpieces 102 can have any number of electrical configurations, shapes, and colors, and may include various connections and circuitry.

In one embodiment, the wireless earpieces 102 may include a battery 208, a logic engine including a processor 210, a memory 212, a user interface 214, a physical interface 215, a transceiver 216, and sensors 112. The virtual reality headset 110 may include a battery 218, a memory 220, an interface 222, and sensors 224. The battery 208 is a power storage device that powers the wireless earpieces 102; likewise, the battery 218 is a power storage device that powers the virtual reality headset 110. In other embodiments, the batteries 208 and 218 could represent a fuel cell, thermal electric generator, piezoelectric charger, solar charger, ultra-capacitor, or other existing or emerging power storage technologies.

The logic engine 210 controls the operation and functionality of the wireless earpieces 102. The logic engine 210 can include circuitry, chips, or other digital logic, and may also include programs, scripts, or instructions. The logic engine 210 can be implemented in hardware, software, firmware, or any combination thereof. In one embodiment, the logic engine 210 can include one or more processors; it could also be an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA). The logic engine 210 may use measurements from the sensors 112 to determine biometric data, readings, and other information. This information may be used by the logic engine 210 to inform the user about the biometrics (e.g., audibly, through an application of a connected device, tactilely, etc.).

“The logic engine 210 may also process user input to determine commands implemented by the wireless earpieces 102 or sent to the virtual reality headset 110 through the transceiver 216. The user input may be detected by the sensors 112 to determine the specific actions to be taken. In one embodiment, the logic engine 210 can implement a macro allowing the user to associate input detected by the sensors 112 with commands. The logic engine 210 can make any number of branching decisions based on the decisions available in a story, game, or other interactive media content. The logic engine 210 may also use artificial intelligence to create new branches or threads for the user. For example, the logic engine 210 may use biometric data from the sensors 112 to determine whether the user’s response is as expected, and may use any number of training and gaming scenarios to achieve the desired user responses.

In one embodiment, a processor is circuitry or logic that controls the execution of a set of instructions. The processor may include one or more digital signal processors, central processing units, microprocessors, or other devices suitable for controlling an electronic device, including executing software, instructions, and programs, processing information, converting information to and from signals, and performing other related tasks.

“The memory 212 is a hardware element, device, or recording medium configured to store data for subsequent retrieval or access. The memory 212 can be static or dynamic memory, and in one embodiment the memory 212 and the logic engine 210 can be integrated. The memory may use any type of volatile or non-volatile storage techniques and mediums. The memory 212 can store information related to the status of a user, the wireless earpieces 102, the virtual reality headset 110, and other peripherals, such as a wireless device, a smart case for the wireless earpieces, a smart watch, a smart phone, and so forth. In one embodiment, the memory 212 may store instructions and programs for controlling the user interface 214, which could include one or more LEDs or other light-emitting components, speakers, tactile generators, and so forth. The user input information associated with each command may also be stored in the memory 212.

The memory 212 could also store interactive media content, such as the stories or games presented by the virtual reality system. The memory 212 can also store user profiles, including the measurements associated with desired user states (e.g., pulse, voice characteristics, blood chemistry, etc.). For example, the sensors 112 of the wireless earpieces 102 may measure adrenaline, including indirectly, and the interactive media content can be chosen so that hormones and other chemicals, such as adrenaline, stay within a specific range or do not exceed one or more thresholds.
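
A minimal sketch of such range-keeping, under the assumption that each candidate story segment carries a predicted arousal score; the scores and bounds are hypothetical stand-ins for the indirectly measured chemistry the text describes.

```python
# Hypothetical sketch: filtering candidate story segments so that an
# (indirectly measured) arousal proxy stays inside the profile's range.
def allowed_segments(segments: list[dict], lo: float, hi: float) -> list[dict]:
    """Keep segments whose predicted arousal stays within [lo, hi]."""
    return [s for s in segments if lo <= s["predicted_arousal"] <= hi]

segments = [
    {"name": "ambush", "predicted_arousal": 0.9},
    {"name": "campfire talk", "predicted_arousal": 0.3},
]
print(allowed_segments(segments, lo=0.1, hi=0.6))  # -> campfire talk only
```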

The transceiver 216 is both a transmitter and a receiver, which may be combined to share common circuitry in a single housing. The transceiver 216 can communicate using Bluetooth, Wi-Fi, ZigBee, Ant+, near-field communications, wireless USB, infrared, ultra-wideband, cellular (3G, 4G, 5G, PCS, GSM, etc.), or other radio frequency standards, networks, or protocols suitable for communications. The transceiver 216 may also be a hybrid transceiver that supports a number of different communications. For example, the transceiver 216 may communicate with the virtual reality headset 110 or other systems using wired interfaces (e.g., wires, traces, etc.) as well as NFC or Bluetooth communications.

The components of the wireless earpieces 102 can be electrically connected using any number of wires, contact points, leads, or busses. The wireless earpieces 102 may also include any number of computing and communications components, devices, or elements, such as busses, motherboards, chips, circuits, sensors, connectors, adapters, transceivers, displays, antennas, and other similar components. The physical interface 215 allows for connection and communication with the virtual reality headset 110 or other electrical components.

“The physical interface 215 can include any number of pins, arms, or connectors for electrically interfacing with external devices, charging or synchronization devices, or other components. For example, the physical interface 215 could be a micro-USB port. In one embodiment, the physical interface 215 is a magnetic interface that automatically couples to contacts or an interface of the virtual reality headset 110. In another embodiment, the physical interface 215 could include a wireless connector for charging the wireless earpieces 102 without a physical connection to a charging device.

The user interface 214 can be used to receive commands, instructions, or input through touch (haptics), voice commands, or predefined motions. The user interface 214 can also be used to control the wireless earpieces. The user interface 214 could include an LED array, one or more touch-sensitive buttons or portions, a miniature screen or display, optical sensors, or other input/output elements. The user interface 214 may be controlled by the user directly or by commands received from the virtual reality headset 110 or another wireless device.

“In one embodiment, the user may provide feedback by tapping the user interface 214 once, twice, three times, or any number of times. Similarly, a swiping motion may be used across or in front of the user interface 214 (e.g., the exterior surface of the wireless earpieces 102) to perform a predetermined action. Swiping in specific directions may be associated with specific actions, such as play music, pause, fast forward, rewind, or activate a digital assistant (e.g., Siri, Cortana, or another smart assistant). Swiping can also be used to control the actions and functionality of the virtual reality headset 110 or other external devices (e.g., a smart TV, camera array, smart watch, etc.). The user can also provide input through head movements or based on the user’s position; for example, the user can use voice commands, head gestures, or touch commands to change the content displayed by the virtual reality headset 110.

“The sensors 112 may include pulse oximeters, accelerometers, magnetometers, inertial sensors, photodetectors, miniature cameras, chemical sensors (e.g., for blood, sweat, excretions, odors, etc.), and other similar instruments for detecting location, orientation, movement, and so forth. The sensors 112 can also be used to gather optical images, data, and measurements, and to determine acoustic noise levels, ambient noise, and other environmental conditions. The sensors 112 can provide measurements or data that may be used to filter or select images for display by the virtual reality headset 110. For example, motion or sound detected on the user’s left side may be used to command the virtual reality headset 110 to display images to the left; although motion or sound may be used, any number of triggers can be used to send commands to the virtual reality headset 110. The sensors 112 can also detect the user’s emotional and physical reactions to the media content. The sensor measurements can be correlated with any number of stress levels or emotional and physical states; for example, the logic engine 210 may compare the measured data against baseline readings for various user conditions and states to determine the current condition and status of the user.
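
For illustration, a hypothetical sketch of such baseline comparison: live readings are compared against stored per-user baselines, and a large deviation flags an elevated state. The deviation rule and figures are assumptions, not values from the patent.

```python
# Hypothetical sketch: comparing live readings against per-user baselines
# to estimate the user's state, as the logic engine 210 might.
BASELINE = {"pulse_bpm": 68.0, "breath_rate": 14.0}  # invented baselines

def deviation_state(pulse_bpm: float, breath_rate: float) -> str:
    """Flag an elevated state when readings drift far above baseline."""
    pulse_dev = (pulse_bpm - BASELINE["pulse_bpm"]) / BASELINE["pulse_bpm"]
    breath_dev = (breath_rate - BASELINE["breath_rate"]) / BASELINE["breath_rate"]
    if pulse_dev > 0.3 or breath_dev > 0.3:
        return "elevated"   # candidate for a calming branch
    return "baseline"

print(deviation_state(pulse_bpm=92, breath_rate=15))  # -> elevated
```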

The virtual reality headset 110 may include components similar to those of the wireless earpieces 102, such as the battery 218, the memory 220, and the sensors 224. The virtual reality headset 110 may include a logic engine 226 that executes and implements the functions and processes described herein. The battery 218 may be embedded in the frame of the virtual reality headset 110 and may have additional capacity that can be used to charge the wireless earpieces 102. The virtual reality headset 110 may also include a power adapter, interface, or amplifiers. All or portions of the logic engine 226, sensors 224, user interface 222, display 228, and transceiver 233 may also be included in the frame and/or lenses.

The user interface 222 of the virtual reality headset 110 may include a touch interface or display that indicates the status of the virtual reality headset 110. For example, an external LED light may indicate the battery status of the virtual reality headset 110 and of the connected wireless earpieces 102, the connection status (e.g., linked to the wireless earpieces 102, a wireless device, etc.), the download/synchronization status (e.g., synchronizing, complete, last synchronization, etc.), or other similar information. The user interface 222 could also include optical sensors that track the user’s eye movements and facial gestures in order to perform actions based on the sensed input.

The display 228 could be embedded in the lenses of the virtual reality headset 110, or may represent one or more projectors that project content directly to the user’s eyes. For example, the display 228 could be a transparent organic light-emitting diode (OLED) lens used to display content. The projectors of the display 228 may use any number of wavelengths or light sources to display data or images to the user. In another embodiment, the display 228 can be transparent or semi-transparent, allowing the system to function as an augmented reality system; the virtual reality system can then be used in any real-world environment. The content shown by the display 228 can be modified or reconfigured according to the user’s responses (e.g., to attain desired physical and emotional states, conditions, or measurements).

The LED array of the user interface 222 can also be used for display actions; for example, it could be activated when someone or something is in the cyclist’s blind spot. In another embodiment, device status indicators may be emitted from the LED array of the wireless earpieces 102 when activated by the user interface 222 of the virtual reality headset 110. The battery 218 may be charged through a physical interface of the user interface 222; the physical interface can be part of the user interface 222 or an independent interface. For example, the user interface 222 could include a hardware interface, such as a connector or port, through which the virtual reality headset 110 can be connected to a power supply or any other electronic device. The user interface 222 can be used for both charging and communication with externally connected devices, and could include a mini-USB, micro-USB, or other miniature standard connector. A wireless inductive charging system can be used to initially replenish power to the wireless earpieces 102, and inductive charging can also be used to charge the virtual reality headset 110.

“In another embodiment, the virtual reality headset 110 could also include sensors 224 for detecting the location, orientation, movement, and biometrics of the user, as well as proximity to the wireless earpieces 102. The sensors 224 may include optical sensors, cameras, and microphones for capturing images and other content about the user (e.g., eye movements, body movements, expressions, arm and leg positioning, etc.) and the user’s periphery (e.g., to the sides, behind, etc.). The virtual reality headset 110 can detect any number of wavelengths or spectra and provide distinct images, enhancements, data, and content. The virtual reality headset 110 could also include an LED array, galvanic or other touch sensors, a battery, a solar charger, actuators, and so forth.

“The original packaging of the wireless earpieces 102 and virtual reality headset 110 may include peripheral devices such as charging cords, power adapters, inductive charging adapters, solar chargers, solar cells, batteries, smart case covers, and transceivers (e.g., Wi-Fi, cellular, etc.).”

“In one embodiment, the virtual reality system may include an interactive book 105. The interactive book 105 may include any of the components of the wireless earpieces 102 or virtual reality headset 110; for example, it may include a battery, memory, logic engine/processor, user interface, physical interface, sensors, and transceiver. In one embodiment, the interactive book 105 may include one or more pages that can be navigated by the user. The pages may include interactive components, such as touch screens, interactive elements, styluses, or any other component or appendage that allows the user to interact with them. The interactive book 105 may display different content based on the story or game being presented. The interactive book 105 could also include one or more speakers, vibrators, dynamic braille interfaces, or other communications components that allow bi-directional communication with the user.

“As described, the interactive book 105 can alter the communicated content based on the measurements from the sensors 112 or the user status as processed and stored by the logic engine. The interactive media content can be stored by the interactive book 105 or the wireless earpieces 102. The interactive book 105 can communicate with any of the devices of the virtual reality system through a wireless or wired connection; for example, the interactive book 105 can communicate with the wireless earpieces 102 using a Bluetooth connection. In one embodiment, the wireless earpieces 102 may communicate the audio associated with the interactive media content displayed by the interactive book 105 to the user.

“All or portions of the virtual reality system can function independently, simultaneously, as a networked group, or using direct communications. The user can use the wireless earpieces 102, virtual reality headset 110, and interactive book 105 individually or together to achieve the purposes, functionality, features, and methods described herein.

“FIG. 3 illustrates one embodiment of a process for implementing interactive media content. The process of FIG. 3 may be executed by one or more wireless earpieces, such as the wireless earpieces 102 of FIG. 1, and may be used to communicate information between the wireless earpieces, an interactive book, and/or a virtual reality headset. For example, the wireless earpieces can synchronize audio content with the interactive media content. The wireless earpieces can measure the user’s biometrics to determine the user’s emotional and physical state. The wireless earpieces can also detect the position and location of the user, including the angle, motion, and orientation of the user’s head, in order to display precise virtual information. The wireless earpieces can communicate with any number of electronic devices, such as smart cases, computing devices, and entertainment devices, to complete the process of FIG. 3.”

The process may begin with the wireless earpieces receiving interactive media content for communication to the user (step 302). The wireless earpieces can communicate with electronic devices, such as a virtual reality headset, interactive book, or virtual reality processing system, to receive the interactive media content. The interactive media content can include stories, games, communications, role-playing scenarios, training exercises, and so forth. The interactive media content may be selected for playback using any number of inputs, such as file selections or icon selections, and may be selected by the user or by one or more remote devices. During step 302, the interactive media content can be communicated to the user by the virtual reality system or the wireless earpieces.

“Next, the wireless earpieces measure the user biometrics utilizing the wireless earpieces (step 304). The user biometrics, such as pulse rate, blood pressure, and temperature, can be measured using any number of sensors. In another embodiment, the user biometrics may be determined using any number of other devices in communication with the wireless earpieces; for example, the wireless earpieces can communicate with cell phones, smart watches, smart bands, headgear, vehicle components, virtual reality components, and so forth.

“Next, the wireless earpieces determine the user condition associated with the user biometrics (step 306). The user condition may be determined from information associated with the user biometrics. Any number of thresholds for the distinct biometrics may be used to determine the user condition; for example, voice characteristics, heart rate, blood pressure, and heartbeat may be used to determine whether the user is afraid. The wireless earpieces may be programmed to run through a scenario or training program that associates the user’s biometrics with known conditions, including emotional and physical states. In other embodiments, the wireless earpieces may periodically prompt the user to indicate the user’s status so that the applicable user biometrics can be associated with that status. A database of defaults, programs, custom user biometrics, baseline readings, and thresholds may be used to determine the user’s state.

“Next, the wireless earpieces modify the branching patterns in the interactive media content based on the user’s condition (step 308). In one embodiment, the wireless earpieces or virtual reality system can select one of several available branching patterns to apply based on the user’s condition during step 308. The branching patterns can represent any number of variables, conditions, parameters, or outcomes that may be used in the interactive media content. The interactive media content can be modified by the wireless earpieces or implemented based upon a command, instruction, application, feedback, or other information received from the wireless earpieces. For example, an interactive book or virtual reality system may use the branching patterns to respond to the user’s condition and status, as determined from the biometric data and other information.
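
A compact, purely illustrative sketch of the whole loop of FIG. 3: communicate content (step 302), measure biometrics (step 304), determine the condition (step 306), and modify branching (step 308), using stand-in functions and data that the patent does not itself specify.

```python
# Hypothetical end-to-end sketch of the four steps of FIG. 3.
def run_story(node, read_biometrics, classify, pick_branch):
    while node:
        print("playing:", node["audio"])         # step 302: communicate content
        condition = classify(read_biometrics())  # steps 304/306: measure, classify
        node = pick_branch(node, condition)      # step 308: modify branching

story = {"audio": "intro.ogg",
         "calm": {"audio": "gentle_end.ogg", "calm": None, "tense": None},
         "tense": {"audio": "cliffhanger.ogg", "calm": None, "tense": None}}

run_story(story,
          read_biometrics=lambda: {"pulse": 95},                      # stub sensor
          classify=lambda b: "tense" if b["pulse"] > 90 else "calm",  # stub rule
          pick_branch=lambda n, c: n[c])                              # follow branch
```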

“In one embodiment of the process of FIG. 3, the wireless earpieces can be linked with the virtual reality headset. For example, the user may magnetically attach the wireless earpieces to the virtual reality headset: connectors on each side of the virtual reality headset can correspond to the left and right wireless earpieces, and magnetic contacts may physically connect the devices while providing data communication. The virtual reality headset may have a significantly higher power capacity, so both devices can also communicate and share power using an inductive connection. In another embodiment, connectors and ports may allow a physical connection between the wireless earpieces and the virtual reality headset. In yet another embodiment, the wireless earpieces may communicate with the virtual reality headset using short-range communications, such as Bluetooth, NFMI, or ANT+. The wireless earpieces can connect to the frame, connectors, extensions, or arms of the virtual reality headset.

The wireless earpieces can also synchronize the audio content with the visual content (together, the media content) of the virtual reality headset. The wireless earpieces can use any number of sensors to determine the location of the user, the velocity (e.g., linear, angular, etc.) and position of the user and the user’s head, and biometric conditions (e.g., heart rate, blood oxygenation, temperature, etc.). This and other information can be used to adjust the audio effects communicated to the user, such as volume, tuning, balance, and fade. The wireless earpieces can also transmit or receive commands for managing and synchronizing the audio content with the visual content.

“The information can be coordinated based upon the user’s actions, conditions, position, and location. For example, specific three-dimensional sounds corresponding to the user’s left and right ears can be played in the wireless earpieces to make the environment seem more real, and the volume and other effects can be adjusted to match the orientation of the user’s avatar within the virtual environment. The audio content can include flags, timestamps, or other information for synchronizing playback. Synchronization of the audio and visual content helps ensure the user is not disoriented, motion sick, or otherwise adversely affected.
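
For illustration, a hypothetical panning sketch shows how left/right channel gains might track the angle between the user’s head heading and a virtual sound source. The equal-power crossfade used here is a standard audio technique, not one disclosed by the patent.

```python
# Hypothetical sketch: adjusting left/right channel gains from the angle
# between the user's head heading and a virtual sound source, keeping
# three-dimensional audio aligned with the avatar's orientation.
import math

def pan_gains(head_yaw_deg: float, source_yaw_deg: float):
    """Return (left_gain, right_gain) for a source at source_yaw_deg."""
    rel = math.radians(source_yaw_deg - head_yaw_deg)
    pan = max(-1.0, min(1.0, math.sin(rel)))   # -1 hard left, +1 hard right
    left = math.cos((pan + 1) * math.pi / 4)   # equal-power crossfade
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

print(pan_gains(head_yaw_deg=0.0, source_yaw_deg=90.0))  # source to the right
```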

“The wireless earpieces communicate the audio content to the user as the associated visual content is received through the virtual reality headset. The audio content can be played based on synchronization information exchanged between the virtual reality headset and the wireless earpieces. The wireless earpieces can play different content, with different sounds, volumes, and audio effects, depending on the virtual reality environment the user is in. As a result, the user can experience a unique virtual environment with immersive sound, without the added weight and forces imposed by larger sound systems.

“The illustrative embodiments provide a system, method, and wireless earpiece(s) for self-configuration, whether automatic or based upon environmental conditions, detected activities, thresholds, or user input. The illustrative embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, etc.), or an embodiment combining software and hardware aspects, which may generally be referred to as a “circuit,” “module,” or “system.”

“Computer program code for performing operations of the embodiments can be written in any possible combination of one or more programming languages, including Java, Smalltalk, or C++, and traditional procedural programming languages like the “C” programming language or similar languages. The program code can execute entirely on a user’s computer, partially on the user’s computer as a standalone software package, partially on the user’s computer and partly at a remote location, or entirely on the remote server or computer. The remote computer can be connected to the user’s machine through any network type, such as a local area network, personal area network, or wide area network, or it may be connected to an external computer via the Internet using an Internet Service Provider.

“The sensors of the wireless earpieces are shown in further detail. One or more sensors 112 could include inertial sensors, such as magnetometers, accelerometers, and/or gyroscopes. The inertial sensors 244 may be used for tracking the position and movements of the head. One or more biometric sensors may be included in the sensors 112, such as temperature sensors, pulse oximeters, perspiration sensors, or other types. The biometric information can be used to track physiological indicators related to stress, which may be associated with different parts of a story arc, particular characters, game actions, or other elements within a book or game. Also shown are one or more microphones 243. The microphones 243 can detect the voice of a user within an environment, but they may also be used to detect subtle sounds, such as breath patterns or utterances, that indicate a user’s emotional reaction to an element in a storybook. Additional information may be obtained from the environment or the user through one or more cameras. Any number of sensors may be present, and they need not be limited to those shown.

“As explained previously, information detected by one or more of the sensors could suggest different branching patterns. Different branching patterns can be used for different reasons. Branch patterns can be used to decrease intensity: for instance, if a user is prone to becoming overly excited, to the point of emotional or physical problems, branch patterns that are less likely to excite the user may be chosen. Branch patterns can also be used to intentionally increase the intensity of a user’s experience. For example, if story elements cause an increase in heart rate (detected with one or more earpieces’ heart rate sensors), increased sweating (detected with a moisture sensor on one or more earpieces), or shallower breathing (detected with a microphone), then branches can be taken to reinforce this. Different models can be used to predict how a user will react to different branches. These models may be based on past reactions, either to the same interactive media or to other interactive media. Models may include neural networks, fuzzy logic algorithms, and genetic algorithms; models can also be statistical in nature. Different branch patterns might be given different weightings depending on the inferred emotional response. If certain branches are associated with more excitement in a population, such as based on biometric data like heart rate, they may be given higher weightings. When deciding between two branches in an interactive story or game, the one most closely associated with the desired emotional response may be chosen (e.g., higher weightings eliciting a stronger emotional response). It should be understood that different models can be used depending on the interactive story or interactive content and the inferred emotions to be increased or decreased. It is also important to understand that the objectives need not be binary: it may be desirable to increase one emotional response and decrease another, or to decrease a response for a while before increasing it again.

“The illustrative embodiments are not limited to those described herein. In particular, the illustrative embodiments allow for many variations in how embodiments can be applied. This description is intended as a guideline and explanation; it is not meant to be exhaustive nor to limit the disclosure to the specific forms described. Other alternatives and exemplary aspects may be included in the disclosure. This description only illustrates some embodiments, processes, or methods of the invention. Any modifications, substitutions, and/or additions that fall within the scope and spirit of the disclosure may be made. As can be seen, the disclosure achieves at least the intended goals.

“The detailed description of the invention in the preceding paragraphs is limited to a few embodiments and is not meant to limit its scope. The claims set forth some of the disclosed embodiments with greater specificity.
