Invented by Adam Leeper, John Ullman, Cheng Yang, Peter Tan, Google LLC

The market for multiplayer augmented reality experiences has been growing rapidly in recent years. With the rise of augmented reality technology, game developers and tech companies have been exploring new ways to create immersive, interactive experiences that allow players to interact with each other in real time.

One of the main drivers of this growth is the increasing popularity of mobile gaming. As smartphones and tablets have become more powerful, they have become the primary gaming devices for many people. This has led to a surge in demand for mobile games that offer multiplayer experiences, and augmented reality has emerged as a key technology for delivering them.

Another factor driving growth is the increasing availability of AR headsets and other hardware. Companies like Microsoft, Google, and Magic Leap have been investing heavily in AR technology, and as a result there is now a wide range of AR devices on the market. This has opened up new opportunities for game developers and other companies to create immersive AR experiences that can be enjoyed by multiple players at the same time.

One of the most exciting aspects of this market is the potential for social interaction. AR games and experiences can bring people together in new ways, allowing them to share experiences and connect with each other in ways that were previously impossible. This has led to the development of a wide range of social AR experiences, from multiplayer games to virtual hangouts and social networks.

There are also challenges. One of the biggest is the need for high-quality content: AR experiences require a high level of technical expertise and creativity to create, and the market could become saturated with low-quality or derivative content if developers do not invest in truly innovative and engaging experiences. Another is the need for robust infrastructure: multiplayer AR experiences require high-speed internet connections and powerful servers to deliver a smooth, seamless experience, and a lack of investment in these areas could hold the market back.

Despite these challenges, the market for multiplayer augmented reality experiences is poised for significant growth in the coming years. As AR technology continues to evolve and become more accessible, we can expect a wide range of new multiplayer experiences that bring people together in innovative ways. Whether you are a gamer, a social butterfly, or simply someone who loves to explore new technologies, this is a market to watch.

The Google LLC invention works as follows

The systems and methods described herein provide co-presence within an augmented reality space. The method can include controlling first and second computing devices to detect at least one plane associated with a scene of an augmented reality environment generated for a physical space; receiving, from the first computing device, a selection of a location in the scene; and generating a marker corresponding to the location.

Background for Multiplayer augmented reality experiences

Content can be displayed in different ways to users who access an immersive virtual reality environment (e.g., a VR space). For example, content can be displayed in an application accessible from the VR space, or on or inside virtual objects within the VR space. Multiple users may want to interact with this content simultaneously.

A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, or hardware installed on the system that, in operation, causes the system to perform the actions. A computer program can be configured to perform particular operations or actions by virtue of including instructions that, when executed by a data processing apparatus, cause the apparatus to perform the actions.

In one aspect, a computer-implemented method provides co-presence within an augmented reality environment. The method can include controlling first and second computing devices to detect at least one plane associated with a scene of the augmented reality environment generated for a physical space. The method can also include receiving, at the first computing device, a selection of a first location in the scene and a selection of a second location in the scene, and generating first and second reference markers corresponding to the two locations. The method can also include receiving, at the second computing device, a selection of the first location in the scene and a selection of the second location in the scene, and generating a reference frame centered on the first reference marker and indicating a direction toward the second reference marker. The method can also include providing the reference frame to the first and second computing devices to establish co-presence within the augmented reality environment. Establishing co-presence may include generating, for the scene, a registration of the first computing device relative to the second computing device. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the method.
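As a rough, non-authoritative sketch of this flow, the Python snippet below shows how a session might collect each device's two selections and let the first pair seed the shared reference frame. All names here (ReferenceMarker, SessionState, make_frame) are hypothetical illustrations, not from the patent:

```python
# Minimal sketch (not from the patent) of the co-presence flow described above.
from dataclasses import dataclass, field

@dataclass
class ReferenceMarker:
    position: tuple  # 3D point in the device's world coordinates

@dataclass
class SessionState:
    markers: dict = field(default_factory=dict)  # device_id -> (marker_a, marker_b)
    reference_frame: object = None               # shared frame, once generated

def submit_selections(state, device_id, first_loc, second_loc, make_frame):
    """Record a device's two selected locations. The first pair of selections
    seeds the shared reference frame; later devices simply adopt it."""
    state.markers[device_id] = (ReferenceMarker(first_loc), ReferenceMarker(second_loc))
    if state.reference_frame is None:
        # Centered on the first marker, oriented toward the second (see the
        # frame-construction sketch later in this document).
        state.reference_frame = make_frame(first_loc, second_loc)
    return state.reference_frame  # supplied to the device to establish co-presence
```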

Implementations can include one or more of the following features. In some implementations, the reference frame is generated based on a detected pose of the first computing device that selected the first and second locations and a determined intersection between the second location and the detected plane. In response to receiving, at a third computing device, a selection of the first location in the scene and a selection of the second location in the scene, the reference frame may be provided to the third computing device to establish co-presence within the augmented reality environment. Establishing co-presence may include generating, for the scene, registrations of the third computing device relative to the first computing device and of the third computing device relative to the second computing device. In some implementations, receiving, at the second computing device, a selection of the first location in the scene and a selection of the second location in the scene includes automatically detecting the first and second reference markers at the second computing device.
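The pose-plus-intersection step described above is typically performed with a raycast (a term defined later in this document). Below is a minimal sketch in Python with NumPy, assuming the device pose yields a ray origin and direction in world coordinates; the function name and the representation of the plane as a point and a normal are illustrative assumptions, not from the patent:

```python
import numpy as np

def raycast_to_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect a ray (e.g. cast from the device camera pose through a
    selected screen point) with a detected plane; returns the 3D hit point
    or None if there is no usable intersection."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-6:  # ray is parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:              # intersection lies behind the camera
        return None
    return ray_origin + t * ray_dir
```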

In some implementations, a third computing device receiving a selection of the first location and a selection of the second location is given access to an application based on a stored application state. In some implementations, the first location represents a first physical feature in the physical environment and the second location represents a second physical feature in the physical environment, and the first and second physical features are agreed upon between users associated with the first computing device and the second computing device.

The method can further include displaying the first reference marker corresponding to the first location at the first computing device and displaying the second reference marker corresponding to the second location at the first computing device. In some implementations, prompts from a display device connected to the first computing device trigger the receiving of the selections of the first location and the second location at the first computing device.

An application state can be stored and associated with the reference frame for use with the co-presence. The first computing device or the second computing device may then select the first and second locations to access the application based on the stored application state. In some implementations, the co-presence can be established without using position data of the first or second computing devices. The described techniques can be implemented in hardware, as a method or process, or as computer software on a computer-accessible medium.
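As one hedged illustration of pairing a stored application state with the reference frame, the sketch below serializes both to a file so that a device that later re-selects the same two physical locations could restore the session. The file format and function names are assumptions made for illustration only:

```python
import json

def save_session(path, frame_origin, frame_axes, app_state):
    """Persist the shared reference frame together with the application state,
    so co-presence and the application can later be restored."""
    with open(path, "w") as f:
        json.dump({"origin": list(frame_origin),
                   "axes": [list(a) for a in frame_axes],
                   "app_state": app_state}, f)

def load_session(path):
    """Restore a previously saved frame and application state."""
    with open(path) as f:
        return json.load(f)
```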

In another aspect, a computer program product is tangibly embodied on a non-transitory computer-readable storage medium and includes instructions that, when executed by at least one computing device, cause the computing device to control first and second computing devices to detect at least one plane associated with a scene of an augmented reality environment generated for a physical space, and to receive, at the first computing device, a selection of a first location in the scene and a selection of a second location in the scene. A reference frame can be generated, centered on the first reference marker and pointing in the direction of the second marker. The reference frame can be generated at least in part based on the at least one plane, the first location, and the second location. The reference frame can be provided to the first and second computing devices to establish co-presence within the augmented reality environment.

Implementations can include one or more of the following features. In some implementations, the computing device receives, from a second computing device, a selection of the first location in the scene and a selection of the second location in the scene. In some implementations, the reference frame can be used to restore the co-presence of the first and second computing devices after a lost connection to the augmented reality environment. In some implementations, the reference frame can be used to restore the co-presence of the first and second computing devices in response to a change in a location associated with the augmented reality environment.

In another general aspect, a system is described. The system includes at least one processor and a memory storing instructions that, when executed, cause the system to perform operations including: controlling first and second computing devices to detect at least one plane associated with a scene of an augmented reality environment generated for a physical space; receiving, at the first computing device, a selection of a first location in the scene and a selection of a second location in the scene; generating first and second reference markers corresponding to the two locations; receiving, at the second computing device, a selection of the first location and a selection of the second location; generating a reference frame centered on the first reference marker and indicating a direction toward the second reference marker; and providing the reference frame to the first and second computing devices to establish co-presence within the augmented reality environment.

Implementations may include any or all of the following features. In some implementations, the reference frame is generated based on a detected pose of the first computing device that selected the first and second locations and a determined intersection between the second location and the detected plane.

The system can provide the reference frame to a third computing device in response to receiving the selections of the first and second locations from the third computing device, allowing co-presence to be established within the augmented reality environment. The co-presence can be established by generating, for the scene, a registration between the third computing device and the first computing device.

In some implementations, the established co-presence is used to gain access to an application within the augmented reality environment, and a stored application state is associated with the reference frame. In some implementations, the reference frame can be re-established by the first computing device, the second computing device, or another computing device selecting the first and second locations to gain access to an application based on the stored state.

In some implementations, prompts from a display device connected to the first computing device can trigger the selections of the first location and the second location in the scene. In some implementations, the first location represents a first physical feature of the physical environment and the second location represents a second physical feature of the physical environment, and the first and second physical features are agreed upon between users associated with the first computing device and the second computing device. In some implementations, the co-presence can be established without relying on position data from the first computing device and the second computing device.

The details of one or more implementations are set forth in the drawings and the description below. Other features will be apparent from the description, drawings, and claims.

When two or more users want to take part in an AR experience (e.g., a game, application, or setting), a physical and virtual environment can be defined to provide a convenient and functional experience for users accessing the augmented reality. A reference frame generated by the described systems and methods can be used, in addition to the physical environment, to provide each computing device with knowledge of how the computing devices are related to each other when accessing the AR space within the same physical area. The reference frame may be generated by one computing device and then shared with other users who access the AR environment from their own computing devices. Those users can then use this mechanism to create, move, draw, or otherwise modify content in the AR environment. Virtual objects can be displayed and interacted with by other users in the shared AR environment.

Accordingly, the disclosed embodiments create a frame of reference that can be shared across a number of computing devices. A reference frame can be created by one computing device and quickly adopted by another computing device. Two or more users who wish to access an AR environment may use their computing devices to view virtual content and AR objects in the same physical location. Two users, for example, may agree on two physical points within a scene of an AR environment (e.g., a room or space). The systems and methods described herein may use the two locations agreed upon in the physical environment, combined with a detectable plane (e.g., a floor or other plane) associated with the AR environment, to generate a reference frame that can be adopted by and provided to any number of users who wish to access the AR content and environment. In some implementations, the plane described in this document may be predefined for a specific physical or virtual environment. In some implementations, the plane may instead be detected at runtime, as described below.
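The patent does not spell out the exact construction, but one plausible realization of such a frame, sketched below in Python with NumPy, centers the frame on the first agreed point, points one axis toward the second point projected into the plane, and aligns another axis with the plane normal. The plane is assumed to be given by its unit normal:

```python
import numpy as np

def make_reference_frame(p_first, p_second, plane_normal):
    """Build an orthonormal frame: origin at the first selected point, one
    axis toward the second point (projected into the plane), one axis "up"
    along the plane normal, and a third completing a right-handed basis."""
    up = plane_normal / np.linalg.norm(plane_normal)
    forward = p_second - p_first
    forward -= np.dot(forward, up) * up        # project onto the plane
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)              # completes the basis
    rotation = np.column_stack([forward, up, right])  # world-from-frame rotation
    return p_first, rotation                   # (origin, orientation)
```

Because both devices select the same two physical points and observe the same plane, each can compute this frame independently and arrive at the same origin and axes, which is what makes the frame shareable without exchanging device position data.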

In some implementations, the physical space used for the AR experience can provide elements that allow all users to interact and share content without lengthy configuration tasks and settings. A user can choose two locations within a scene using a computing device; how the two locations are selected is described in detail below. The user can select, for example, a corner on the left and a corner on the right of a scene. The two selected locations, along with a predefined (or detected) plane, can be used with the systems and methods described herein to create a reference frame that users will use to navigate within an AR environment.

In some implementations, the predefined (or detected) plane can represent a plane on a table or floor surface that has been detected by one or more computing devices or tracking devices in an AR environment. The plane can be detected by sensors, cameras, or a combination of both. In some implementations, the plane can be selected based on a surface, such as a floor, table, or object associated with a specific physical room. In certain implementations, the systems and methods described in this document may detect the plane; in others, the coordinates of the plane may be provided to them. In the AR environment, the plane can be rendered to allow computing devices to access reference frames from other computing devices.
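As a hedged stand-in for plane detection, the sketch below fits a plane to a set of 3D feature points (e.g., points sampled from a floor or tabletop) by least squares. Production AR frameworks use more elaborate detection; this only illustrates the point-plus-normal representation used in the earlier sketches:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points, returning a
    point on the plane (the centroid) and the plane's unit normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]  # direction of least variance is the plane normal
    return centroid, normal
```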

As used herein, the term co-presence refers to a virtual reality (VR) or AR experience in which two or more users operate in the same physical area. The term pose describes the position and orientation of a computing device camera. The term raycast describes the process of finding a 3D object by intersecting a cast ray with detected geometry. The term shared reference frame refers to a frame that all users in a VR or AR environment are assumed to share.
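Given the frame origin and rotation from the construction sketched earlier, adopting the shared reference frame amounts to transforming points between a device's world coordinates and the shared frame. A minimal illustration, with assumed helper names that are not from the patent:

```python
import numpy as np

def world_to_shared(point_world, frame_origin, frame_rotation):
    """Express a world-space point in the shared reference frame, so that
    all devices agree on the coordinates of virtual content."""
    return frame_rotation.T @ (point_world - frame_origin)

def shared_to_world(point_shared, frame_origin, frame_rotation):
    """Inverse transform: place shared-frame content back into a device's
    own world coordinates for rendering."""
    return frame_rotation @ point_shared + frame_origin
```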

FIG. 1 is a diagram showing an example of a user exploring shared content in a multiplayer AR environment 100. A first user 102 wears a head-mounted display (HMD) 104 while holding a computing device 106. As indicated by lines 105, user 102 is viewing the environment 100.
