Invented by Erik Mowery Simpson, Kenneth Simpson, Circlesx LLC
The Circlesx LLC invention works as follows
Implementations of various computer methods pair a computerized ball device, which acts as a mobile computer, with waveguide eyeglasses or contacts. The device records the user's environment and projects light into the waveguides, allowing the user to see embedded light structures while viewing the real world. The computerized ball device can also be docked into a drone cradle to build a database map while the user is not using it for a specific task, or attached to a wristband for mobility. The device can also couple projected light structures so that multiple users view the same content, creating an environment of trust. The device breaks away from the traditional design for head-mounted virtual and mixed realities that combines the camera with the headset.

Background for Computer ball device for mixed, virtual, or augmented realities
Field of Invention
Implementations of various methods couple a computerized ball, which is a mobile computing device, to waveguide eyeglasses or contacts, allowing a user to view their world as mixed reality, virtual reality, or augmented reality and enabling group collaboration through the coupled device. The device can also be docked into a drone cradle, which builds an environment map for the user while the device is not otherwise in use, or attached to a wristband for mobility. The device breaks away from the traditional head-mounted virtual and mixed reality design, in which the central processing unit or camera is built into the eyeglasses themselves; adding a camera to a standard eyeglass violates social contracts by recording the user as well as bystanders. Because the mobile computer device decouples from the waveguide, a third-person view of the user is possible in addition to the first-person view. The computerized ball device connects standard-shaped eyeglasses to the device (user to ball, or device to group). The design of the device is superior to other implementations of mixed reality and virtual reality technology, which rely on bulky head-mounted apparatus that adds weight, disturbs social contracts, biomechanical movement, and weight distribution, and does not allow both first-person and third-person viewpoints on the waveguide display. The technology can be used by a family to share a group technology experience, and by commercial and industrial work teams to share technology experiences in a natural way, allowing humans to work together and collaborate better. The technology extends natural human interaction without drawing attention away from the shared human experience. Current deployments of computerized devices focus on the individual experience at the expense of group interaction; the computerized ball with eyeglasses on the head allows for an enhanced group experience. The computerized device can be docked to a drone capable of mapping a database for the user, or placed in a group setting. The computerized ball has an internal CPU and server, as well as a network connection to a remote server.
Description of Related Art
The following examples and descriptions are not admitted to be prior art merely because they appear in this section.
These technologies include Microsoft's HoloLens (a trademark of Microsoft, Inc.), Oculus Rift VR (a trademark of Facebook, Inc. by assignment), Sony's PlayStation VR (a trademark of Sony, Inc.), HTC Vive (a trademark of HTC, Inc.), Google Glass (a trademark of Google, Inc.), Samsung's Gear VR (a trademark of Samsung, Inc.), Razer's OSVR (a trademark of Razer, Inc.), and Magic Leap. These technologies are impressive and helpful, but they do not allow humans to interact with each other face-to-face in a group while using the technology. They promise to improve productivity for humans, but they also isolate users from face-to-face interaction. The model with a camera in the eyeglasses has been rejected by humans because they fear being recorded or watched. Some supporters of the camera-on-eyeglasses model claim that the benefits of the technology outweigh the costs of being recorded. Objections also arise because other members of a group cannot share the same information, creating a barrier of unfair information advantages. The method's implementation allows users to "group" or "pair" multiple sets of eyeglasses using the mobile ball computing device. The "grouped" or "paired" user experience provides comfort to all users within a group because they share the same information, so no social contract is violated, such as recording someone without their permission. Although customization of the user experience can be useful for an individual, technology implementations have reduced the ability of humans to communicate with each other through eye-to-eye contact. Prior art has generally limited the ability to use both first-person and third-person viewpoints at once.
Systems and methods that provide virtual or augmented reality experiences have been implemented, but they do so in a way that violates the traditional social contract between humans:
The current systems of technology that integrate computing into everyday life are largely accepted by users who can trust each other. It is not a problem that CPUs, desktops, laptops, or mobile computers are isolated, because it is obvious when one user records another; in standard social settings this is prohibited because it breaches a social contract between humans. Google Glass, Microsoft HoloLens, and other virtual or augmented reality devices have not been widely adopted because they breach social contracts between people. Information is power, and if humans are aware that they hold different levels of information, they will not interact freely. This has created a rift between humans and technology. In addition, virtual and augmented reality require mapping the user's surroundings so that virtual objects can interact with real objects that have been mapped to a database. The implementations of methods so far have used non-sharing technology, which pits one user against another, causes low adoption, and violates human social contracts. The proposed invention allows for the third-person omniscient perspective: it allows users not only to watch movies but also to be part of them alongside actors and actresses.
The inventions of systems and methods for accelerating the adoption of mixed reality, augmented reality, and virtual reality are directed toward a mobile computer ball device that pairs with standard-shaped glasses or contact lenses. The mobile ball has the capability to map and record environments while pairing with other users within a private group. The shared device can be used to build social trust, so that users benefit equally from the group's information; this stands in contrast to the risk of destroying trust by using the technology in a way that creates a biased or distorted information advantage. The shared device can also display a first-person or third-person omniscient perspective with artificial-intelligence feedback. This allows a user not only to listen to music but to actually be part of a band, and not only to watch movies but to actually be in them alongside actors and actresses.
A group of users are communicatively connected to the device in order to set up settings for group interaction. The group of users is able to control the device independently. A plurality of users are then connected through the ball to the local application server or remote application server, and from there to various application functions by the ball CPU, application host, or network server. The ball CPU is capable not only of pairing users for data transmission and electromagnetic light, but also of mapping the user's surroundings to interact with the application server. The device can be used privately, like a smartphone, or it can be paired with others to reduce social stress. Users can use shared or private applications to ease social contract stress. These include calendars, photos, cameras, videos, maps, and weather. They may also pair with other users for group purposes…
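As a minimal sketch of the pairing flow described above (the class names, fields, and pairing semantics are illustrative assumptions, not an API defined in the patent), a grouped ball-device session might be modeled like this:

```python
# Hypothetical sketch of grouping users to a shared ball-device session.
# Class and method names are illustrative assumptions, not the patent's API.
from dataclasses import dataclass, field


@dataclass
class User:
    user_id: str
    eyewear: str  # e.g. "waveguide_glasses" or "contact_lens"


@dataclass
class BallSession:
    session_id: str
    members: list[User] = field(default_factory=list)
    shared_apps: set[str] = field(default_factory=set)

    def pair(self, user: User) -> None:
        """Add a user to the group; every member sees the same shared content."""
        if user not in self.members:
            self.members.append(user)

    def enable_app(self, app_name: str) -> None:
        """Apps enabled on the session are visible to all paired members."""
        self.shared_apps.add(app_name)


if __name__ == "__main__":
    session = BallSession(session_id="family-room")
    session.pair(User("alice", "waveguide_glasses"))
    session.pair(User("bob", "contact_lens"))
    session.enable_app("calendar")
    session.enable_app("maps")
    print([u.user_id for u in session.members], sorted(session.shared_apps))
```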
In one embodiment, the application server utilizes artificial intelligence algorithms to more efficiently provide the services or applications needed by the user after the device scans the common area of the group, or of the private user, that has been paired with the device. The application server will show through the ball device the most frequently used apps, as well as recommend more efficient applications based on data patterns for the user or group.
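A minimal sketch of that usage-pattern recommendation, assuming a simple frequency count stands in for the unspecified artificial-intelligence algorithms (the function name and launch log are hypothetical):

```python
# Hypothetical frequency-based recommendation: surface the most-used apps
# for a user or group, standing in for the unspecified AI algorithms.
from collections import Counter


def recommend_apps(launch_log: list[str], top_n: int = 3) -> list[str]:
    """Return the top_n most frequently launched apps from a launch history."""
    counts = Counter(launch_log)
    return [app for app, _ in counts.most_common(top_n)]


if __name__ == "__main__":
    history = ["maps", "camera", "maps", "weather", "calendar", "maps", "camera"]
    print(recommend_apps(history))  # ['maps', 'camera', 'weather']
```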
In another embodiment, the application server analyzes the biomechanical movements of users for activities such as athletics and music, and analyzes the user's movement when teaching or conferencing. Due to the placement of the camera, the projection, and the waveguide lens, the visualization can be viewed from both a first-person and a third-person omniscient perspective. The CPU ball device can then provide an augmented reality assistant to the CPU ball user. The augmented reality partner may be a professional, such as John McEnroe in tennis, Nick Saban in football, Julia Child in cooking, Beethoven in piano, or Harrison Ford in Star Wars. The ball CPU will analyze the user's movement, provide customized feedback, and interact with other projected human objects and holograms based on the artificial intelligence inherent in the augmented reality.
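One way to picture the movement analysis is a comparison of captured joint angles against a reference motion (a sketch under the assumption that pose data arrives as joint angles; the joint names, reference values, and tolerance are illustrative, not from the patent):

```python
# Hypothetical comparison of a user's joint angles against a reference motion,
# producing simple corrective feedback. Angles are in degrees.
REFERENCE_SWING = {"shoulder": 95.0, "elbow": 160.0, "wrist": 20.0}


def movement_feedback(user_angles: dict[str, float],
                      reference: dict[str, float] = REFERENCE_SWING,
                      tolerance_deg: float = 10.0) -> list[str]:
    """Flag joints whose angle deviates from the reference by more than the tolerance."""
    notes = []
    for joint, target in reference.items():
        measured = user_angles.get(joint)
        if measured is None:
            continue
        delta = measured - target
        if abs(delta) > tolerance_deg:
            direction = "increase" if delta < 0 else "decrease"
            notes.append(f"{joint}: {direction} angle by about {abs(delta):.0f} degrees")
    return notes


if __name__ == "__main__":
    print(movement_feedback({"shoulder": 80.0, "elbow": 158.0, "wrist": 40.0}))
```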
In another embodiment, the drone-docked CPU ball will scan the area around the user and enter the data into a database. The drone-mounted CPU ball can scan the user's area and navigate through hallways, stairs, rooms, and other places within a set radius, while also tracking the movements of the user. The ball CPU device will be able to access the database from the application server, whether it is local or remote. This speeds up processing times and reduces memory leakage.
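A minimal sketch of the scanning loop filling a tile database (the tile size, keys, and in-memory storage are assumptions; the text only specifies that scan data is entered into a local or remote database):

```python
# Hypothetical map-tile store filled by periodic scans from the drone-docked ball.
# Tiles are keyed by integer grid coordinates derived from scan positions (meters).
import math
import time

TILE_SIZE_M = 10.0  # assumed tile edge length in meters


def tile_key(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a world position in meters to its containing tile's grid key."""
    return (math.floor(x_m / TILE_SIZE_M), math.floor(y_m / TILE_SIZE_M))


class MapTileDatabase:
    def __init__(self) -> None:
        self.tiles: dict[tuple[int, int], dict] = {}

    def record_scan(self, x_m: float, y_m: float, geometry: bytes) -> None:
        """Insert or refresh the tile covering the scanned position."""
        self.tiles[tile_key(x_m, y_m)] = {
            "geometry": geometry,
            "scanned_at": time.time(),
        }


if __name__ == "__main__":
    db = MapTileDatabase()
    for x, y in [(2.0, 3.0), (14.0, 3.0), (14.5, 3.2)]:  # simulated drone path
        db.record_scan(x, y, geometry=b"...point cloud...")
    print(sorted(db.tiles.keys()))  # [(0, 0), (1, 0)]
```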
In another embodiment, multiple users with the CPU cube or ball device can lock their devices to a shared group session over the network when they are working in a group. These bounded group locking features are designed to increase trust in the technology between users without violating social contracts. This implementation allows all users to see the same recording through the planar-waveguide lens, which prevents asymmetrical information that would violate social trust contracts.
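To make the bounded group locking concrete, here is a small sketch (the locking semantics shown, where every paired device renders one shared view while locked, are an assumption; the names are hypothetical):

```python
# Hypothetical group lock: while locked, all paired devices receive the same
# view state, preventing asymmetric information within the group.
class GroupShare:
    def __init__(self, member_ids: list[str]) -> None:
        self.member_ids = list(member_ids)
        self.locked = False
        self.shared_view = None

    def lock(self) -> None:
        """Bound the group: only the shared view may be displayed."""
        self.locked = True

    def set_view(self, content_id: str) -> None:
        self.shared_view = content_id

    def view_for(self, member_id: str, requested: str) -> str:
        """Return the content a member actually sees."""
        if self.locked and self.shared_view is not None:
            return self.shared_view  # everyone sees the same recording
        return requested


if __name__ == "__main__":
    group = GroupShare(["alice", "bob"])
    group.lock()
    group.set_view("shared-recording-01")
    print(group.view_for("alice", requested="private-feed"))       # shared-recording-01
    print(group.view_for("bob", requested="shared-recording-01"))  # shared-recording-01
```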
In another embodiment, the computer ball projector projects electromagnetic waves and reflective light onto the eyeglasses. The light then passes through the aqueous humor, lens, and vitreous humor onto the photoreceptors of the retina. The images that provide the augmented reality experience are projected within the structures of the eyeglasses or contact lenses at variable reactive depths. The optic nerve then carries the resulting impulses to the brain. The pupil and lens adjust to regulate the light transmission. These features enhance the user's visual experience, not only of the natural world but also of the CPU-generated objects.
People use contact lenses or eyeglasses because the power of the cornea and the length of the eye are often mismatched; optometry calls this mismatch refractive error. When someone is near-sighted, the cornea does not project the image all the way back to the retina; when someone is far-sighted, the image is focused beyond the retina. To correct nearsightedness, glasses are made thicker at the edges and thinner in the center, which diverges the light further and projects images back toward the retina. To correct farsightedness, glasses are made thicker in the middle and thinner at the edges, which converges the light sooner so that images are projected onto the retina with greater accuracy. The trick is to use more or less glass: the eye is tricked by moving the light to different focal points.
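As a standard optics illustration of moving the focal point (a textbook relation, not language from the patent), the corrective power of a lens in diopters is the reciprocal of its focal length in meters, and the thin-lens equation relates object and image distances:

```latex
P = \frac{1}{f}, \qquad \frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}
% Worked example: for a myopic eye with a far point of 0.5 m, a corrective lens
% must image distant objects (d_o -> infinity) at d_i = -0.5 m, giving
% f = -0.5 m and P = 1/f = -2 diopters: a diverging lens, thicker at the edges.
```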
The mobile CPU ball projects coherent laser light and electromagnetic waves. Holographic images are formed when the transmitted light is refracted within the structures of the glasses or contacts; that is, the laser image projected onto the head-mounted eyeglasses or contacts is refracted to create the holographic images.
The discussion below is directed at certain specific implementations. It should be understood only to enable a person of ordinary skill in the art to make and use any subject matter defined by the patent claims, whether issued now or in the future.
The following paragraphs give a short summary of the various techniques described in this document, such as those illustrated in FIG. 1. In FIG. 1, the ball CPU device 140 is mounted to a drone charging platform 170. The ball CPU 140 can gather information from a fixed distance from the user 120 to transmit to and complete a map tile database 160 using the wireless network 150. Map tiles are transferred from the database 160 to the ball device 140 via the wireless network. To limit memory leakage, the CPU ball device 140 uses only the map tiles within the immediate radius of the user 120. The ball CPU device 140 and drone platform 170 scan the user's 120 environment periodically to analyze changes in the map tiles. Images can also be projected to the user by laser light; these could include, but are not limited to, an athlete 175, a musician 180, a chef 190, or any other service user.
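A minimal sketch of keeping only tiles near the user resident on the device (the radius value, names, and tile keying are illustrative assumptions; FIG. 1 only describes culling to the user's immediate surroundings):

```python
# Hypothetical culling of map tiles to those within a radius of the user,
# limiting the working set held in the ball device's memory.
import math


def tiles_near_user(all_tiles: dict[tuple[float, float], dict],
                    user_pos: tuple[float, float],
                    radius_m: float = 100.0) -> dict[tuple[float, float], dict]:
    """Keep only tiles whose center lies within radius_m of the user."""
    ux, uy = user_pos
    return {
        center: tile
        for center, tile in all_tiles.items()
        if math.hypot(center[0] - ux, center[1] - uy) <= radius_m
    }


if __name__ == "__main__":
    tiles = {(0.0, 0.0): {"id": "a"}, (100.0, 50.0): {"id": "b"}, (400.0, 0.0): {"id": "c"}}
    nearby = tiles_near_user(tiles, user_pos=(10.0, 10.0))
    print([t["id"] for t in nearby.values()])  # ['a', 'b']
```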
The embodiment shown in FIG. 2 illustrates how multiple users can be paired so that social contracts are not broken: images are projected to the group rather than consumed by one user 230 in a way that would create asymmetrical information with respect to the user 220. The user 230 can call the ball processor device 270 into a centrally accessible common area for the user 230, or for the group 230 and 220, to use with head-mounted waveguide glasses 240 and contacts 210 that act as waveguides to receive the laser and infrared light 280 from the ball device 270. The ball CPU can then transmit infrared and laser light to the head-mounted waveguide apparatuses 240 or 210, which allows visualization of the embedded photons in the waveguide. The separation of the CPU ball 270 from the head-mounted glasses 240 or contact lenses 210 has another benefit: the network and radio-frequency electronics are farther away from the head-mounted waveguide, so there is less radio-frequency and heat radiation from the electronics into brain tissue. The CPU ball 270 can also be attached in a wrist-mounted configuration 290 by the user 230 for increased mobility. If the users 230 and 220 agree to pair with the ball device 270, the system allows the users to view only the same information. This adds to trust in the system and uses the technology to ease the social contract burdens that are violated when a user has reason to think they are being recorded.
The embodiment shown in FIG. 3A depicts the network-based ball CPU device 330 projecting infrared and laser light 370 onto the user's 310 head-mounted contacts or glasses 320.
The embodiment shown in FIG. 3B depicts the mobile network-based ball CPU projector 340, which projects laser light and infrared to the head-mounted contact lens or glass 350, corresponding to the contacts or glasses 320 of FIG. 3A. The light is internally refracted within the lens 350 to create holographic images.