Invented by Marcus Hammond, Marc Wimmershoff, Timothy David Kentley-Klay, Zoox Inc

The market for external control of an autonomous vehicle is a rapidly growing industry that holds immense potential for the future of transportation. As self-driving technology continues to advance, the need for external control systems becomes increasingly important to ensure the safety and efficiency of autonomous vehicles. External control refers to the ability to remotely monitor and control an autonomous vehicle's operations. This can include functions such as route planning, speed control, emergency braking, and even remote parking. The market for external control systems is driven by the desire to enhance the capabilities of autonomous vehicles and address potential safety concerns.

One of the key drivers of this market is the need for improved safety measures. While autonomous vehicles are designed to be highly reliable and safe, there may still be situations where external intervention is necessary. For example, in the event of a sudden obstacle or a malfunctioning sensor, an external control system can provide real-time assistance to ensure the vehicle's safety. This capability is particularly important during the transition phase when autonomous and human-driven vehicles share the road.

Another significant factor driving the market for external control is the potential for increased efficiency and productivity. With external control systems, fleet operators can remotely manage and optimize the operations of their autonomous vehicles. This includes monitoring vehicle performance, optimizing routes, and scheduling maintenance. By having real-time access to vehicle data and control, operators can ensure that their autonomous vehicles are operating at peak efficiency, reducing costs and improving overall productivity.

The market for external control systems is also fueled by the growing demand for autonomous vehicle services. Ride-hailing companies, delivery services, and public transportation providers are increasingly looking to incorporate autonomous vehicles into their operations. External control systems can provide these companies with the ability to remotely manage and control their autonomous fleets, ensuring smooth operations and a seamless customer experience.

Furthermore, the market for external control systems is attracting a wide range of players, including technology companies, automotive manufacturers, and startups. These companies are investing in research and development to develop advanced external control systems that can meet the evolving needs of the autonomous vehicle market. Additionally, partnerships and collaborations between different stakeholders are becoming more common, as companies seek to leverage their respective expertise and resources to accelerate the development and adoption of external control systems.

However, the market for external control systems also faces challenges and concerns. One of the main concerns is cybersecurity. Because external control systems rely on wireless communication and data exchange, there is a risk of unauthorized access and potential hacking. Ensuring robust cybersecurity measures will be crucial to maintaining the integrity and safety of external control systems.

In conclusion, the market for external control of an autonomous vehicle is a dynamic and promising industry. With the increasing adoption of autonomous vehicles, external control systems play a vital role in enhancing safety, improving efficiency, and enabling new business models. As technology continues to advance and stakeholders collaborate, the market for external control systems is poised for significant growth and innovation in the coming years.

The Zoox Inc invention works as follows

Remote control of a vehicle, such as an autonomous vehicle, may at times be more reliable and/or efficient. Such control may, however, require processes to ensure the safety of nearby persons and objects. This disclosure includes aspects such as using onboard sensors to detect objects in an area and altering remote commands according to those objects, e.g., by reducing the maximum speed of the vehicle based on its distance from detected objects. In some of the examples presented here, remote control can also be accomplished by using objects within the environment to direct the vehicle.

Background for External Control of an Autonomous Vehicle

Vehicles are increasingly replacing or supplementing manual controls with automatic controls. Semi-autonomous vehicles may help a driver perform certain functions, such as collision avoidance or braking. Fully autonomous vehicles may turn passengers into passive participants as they are transported to their destination. This increased automation can make tasks that are routine in conventional vehicles more challenging. For example, the absence of a steering wheel in certain implementations can make it more difficult to navigate the vehicle through congested or unmapped areas.

The following detailed description is directed to systems and processes for controlling an autonomous vehicle within an environment. Unlike traditional automobiles, some autonomous vehicles may not be easily controlled by passengers. As a non-limiting example, some autonomous vehicles may lack onboard navigation controls such as steering wheels, transmission controls, or acceleration and braking controls. Nevertheless, there may be situations where it is desirable to navigate the vehicle manually or on demand. When the vehicle is in an area that is particularly crowded or unmapped, like a parking lot or service location, it might be easier to let a human operator control the vehicle manually. For example, a technician might need to move an autonomous vehicle precisely into a charging or repair area.

According to the implementations of this disclosure, external control of an autonomous vehicle can be performed using a remote controller communicating with a receiver located on the vehicle. Sensors on the vehicle can be used to control the vehicle safely while it is being remotely controlled. The sensors can be used, for example, to identify objects within the environment and prevent the vehicle from moving closer to them than a predetermined range. In other embodiments, the sensors can be configured to identify a particular object in the surrounding environment, like a person, and then determine from that object an input for controlling the autonomous vehicle. In one example, the proximity of a person to the vehicle could cause it to move a certain distance or to a certain location. These and other examples are described in greater detail below.

In some implementations, the vehicle can receive commands indicating the direction and/or the speed at which it should proceed. The vehicle can also detect the presence of an obstacle nearby. Depending on the distance or position of the object, the vehicle may be controlled to move in the commanded direction at the commanded speed, may proceed at a lower speed, or may ignore the command signal altogether. Altering command signals in this way can be used, for example, to prevent the vehicle from accidentally contacting the object.
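As a rough illustration, the sketch below shows one way this accept / slow down / ignore decision could be expressed. The threshold values, names, and the specific reduced-speed rule are our own assumptions for illustration; the patent does not prescribe them.

```python
from typing import Optional

# Hypothetical thresholds (meters); the patent does not specify values.
D_IGNORE = 1.0  # inside this range, commands toward the object are ignored
D_SLOW = 6.0    # inside this range, commands are executed at reduced speed

def apply_remote_command(commanded_speed: float,
                         obstacle_distance: float) -> Optional[float]:
    """Return the speed to execute, or None if the command is ignored."""
    if obstacle_distance <= D_IGNORE:
        return None  # too close: ignore the command entirely
    if obstacle_distance <= D_SLOW:
        return commanded_speed * 0.5  # proceed, but at a lower speed (assumed rule)
    return commanded_speed  # no nearby obstacle: execute as commanded
```

Returning None rather than a speed of zero makes it explicit that the command itself is being disregarded, not executed slowly.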

In other implementations, the vehicle could interpret a person's position as a command signal. The vehicle, for example, may detect the presence of a person or object near the vehicle and move away from it. As a person approaches the vehicle, the vehicle can move in a certain direction or over a certain distance to maintain a separation and/or an orientation relative to that person. The vehicle can also maintain the separation by moving closer: when sensor information indicates that a control object, for example a person, has moved away from the vehicle, the vehicle can move in a certain direction or over a certain distance to restore the separation and/or the orientation relative to that person.
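A minimal sketch of this distance-maintaining behavior, assuming a simple two-dimensional coordinate frame; the geometry and names below are illustrative, not taken from the patent:

```python
import math

def standoff_move(vehicle_xy, person_xy, target_distance):
    """Return a (dx, dy) displacement that restores the target distance
    between the vehicle and the person; (0, 0) if no direction is defined."""
    dx = vehicle_xy[0] - person_xy[0]
    dy = vehicle_xy[1] - person_xy[1]
    current = math.hypot(dx, dy)
    if current == 0.0:
        return (0.0, 0.0)  # vehicle and person coincide: no defined direction
    error = target_distance - current    # positive -> retreat, negative -> follow
    ux, uy = dx / current, dy / current  # unit vector pointing away from the person
    return (ux * error, uy * error)
```

If the person steps toward the vehicle, the returned displacement points away from the person; if the person backs off, it points toward the person, so the same rule produces both the retreating and the following behavior described above.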

In some examples, where a position or motion of an object is interpreted as a control command, the object can first be authenticated. The vehicle may authenticate the object, or a device associated with it, for example by using a wireless protocol. In other implementations, a person or a device can otherwise be recognized by the autonomous vehicle as an entity that is permitted to control it.
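The disclosure describes the wireless protocol only in general terms, so the following is merely an assumed sketch of how an authentication gate might sit in front of the gesture-control path; the device identifiers and function names are hypothetical:

```python
# Hypothetical allowlist populated during an earlier wireless pairing step.
AUTHORIZED_DEVICE_IDS = {"technician-fob-01", "depot-controller-07"}

def is_authorized_controller(device_id: str) -> bool:
    """Only previously authenticated devices may command the vehicle."""
    return device_id in AUTHORIZED_DEVICE_IDS

def handle_control_object(device_id: str, inferred_command):
    """Treat motion from unauthenticated objects as obstacle data only."""
    if not is_authorized_controller(device_id):
        return None  # not a recognized controller: do not act on its motion
    return inferred_command
```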

The features and methods described herein can be used by systems such as autonomous drones, fully or semi-autonomous vehicle systems, or other systems to assist in manual navigation, for example for collision avoidance. Other uses for the methods and systems described herein will be apparent from the disclosure. The following sections describe these and other features, implementations, and applications in greater detail with reference to FIGS. 1-8.

Turning to FIG. 1, an environment 100 includes an autonomous vehicle 102 and a person 104. According to some implementations, the autonomous vehicle 102 could be a Level 5 vehicle, meaning that all seats are passenger seats and there is no driver's seat, because the vehicle is fully autonomous. The U.S. National Highway Traffic Safety Administration issued the Level 5 classification in 2016; it describes a vehicle capable of performing all safety-critical functions for the entire trip, without the driver or an occupant having to take control at any point. Such a vehicle can be configured to perform all functions from start to finish, including parking, and may therefore be unoccupied. The systems and methods described in this document can be used to control any ground-borne vehicle, including conventional cars in which all vehicular controls are performed by the driver, as well as partially autonomous vehicles that can be controlled autonomously, without driver assistance or attention, during certain driving conditions. Such a partially autonomous vehicle might require driver assistance and/or attention during other driving situations, for instance when operating on city streets or in urban areas, or at least during some parking functions.

The example vehicle 102 can include a chassis or frame on which a body is mounted. The body can give the vehicle any form, including a van, a sport utility vehicle, a crossover vehicle, a truck, a bus, an agricultural vehicle, a construction vehicle, etc. The vehicle 102 can be powered by one or more internal combustion engines, one or more electric motors, or a combination thereof. The example vehicle 102 can also have any number of tires, wheels, and/or tracks; for example, it could include four wheels.

The vehicle 102 can also include vehicle sensors 106, which sense objects near the vehicle, such as those in the environment 100. In some examples, the vehicle sensors 106 may include sensors configured to identify objects within the environment 100. The vehicle sensors 106 can include, for instance, one or more light detection and ranging (LIDAR) sensors, one or more cameras (RGB (intensity), grayscale (intensity), depth, stereo, infrared, etc.), one or more radio detection and ranging (RADAR) sensors, ultrasonic transducers, and microphones to detect sounds in the surrounding environment. Other sensors may be included in the autonomous vehicle 102, such as sensors that detect tire pressure, tread depth, or tire temperature, and brake sensors that detect brake temperature and wear. Wheel encoders, inertial measurement units (IMUs), accelerometers, and magnetometers are further examples of vehicle sensors 106. The vehicle sensors 106 may be configured to transmit sensor data indicative of sensed objects, such as the person 104, as signals to systems associated with the autonomous vehicle 102. Other types of sensors and sensor data are also contemplated.
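As a purely illustrative aside, the "sensor data indicative of sensed objects" might be represented along these lines; every field name here is an assumption rather than something specified in the patent:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DetectedObject:
    object_id: int                    # track identifier assigned by perception
    classification: str               # e.g. "person", "vehicle", "unknown"
    position_xy: Tuple[float, float]  # position in the vehicle frame, meters
    distance_m: float                 # range from the vehicle, meters
    source: str                       # e.g. "lidar", "camera", "radar"
```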

In certain implementations, information from the sensors may be used by a perception system associated with the autonomous vehicle 102 to understand attributes of the surrounding environment and/or to act on those attributes. In normal operation, the autonomous vehicle 102 may use attributes of its environment to control its movement, e.g., to alter a trajectory. A controller within the autonomous vehicle may, for example, control the vehicle to avoid a detected object, e.g., by adjusting the steering and acceleration of the autonomous vehicle 102. In some cases, however, the normal controls for operating the vehicle are not available or not practical. The perception system, which navigates the vehicle relative to objects, may rely on one or more stored maps of the area. If the autonomous vehicle 102 is in an unmapped area or if maps are unavailable, it may be difficult for it to navigate, and it may be advantageous to allow manual control of the autonomous vehicle 102. Manual control may also be more effective in certain environments. As a non-limiting example, movement of the autonomous vehicle within enclosed spaces such as parking garages or service locations may be better handled by manual controls than by computer-generated controls.

The example in FIG. 1 shows a sequence of four scenarios in which the vehicle 102 is instructed, by a user via a remote controller 108, to move in a first direction. Specifically, FIG. 1 illustrates a first scenario 112, in which the vehicle is at a position V0. A second scenario 114 shows the vehicle moving in a direction 110, as shown by the arrow, to a position V1, and a third scenario 116 shows the vehicle moving further in the direction 110, to a position V2. In one example, the user input may be the same in each scenario, for instance a command to advance the vehicle in the direction 110 at maximum speed. In the embodiment shown in FIG. 1, the person 104 remains stationary at position P0 throughout.

While the person 104 may appear as the only object in the environment, other objects may also be present, whether they are animate or not. As will be apparent from the disclosure below, certain aspects of the disclosure can be applied to moving objects in the environment 100, including the person 104.

Such sensor data may be used to determine a distance of the autonomous vehicle 102 from the person 104, e.g., a linear distance between V0 and P0 in the first scenario 112. In other implementations, the sensor data can be used to determine the position of the person in a two- or three-dimensional coordinate system associated with the vehicle 102 and/or the environment 100.

In the example shown in FIG. 1, inputs from the remote controller 108 direct the autonomous vehicle to move in the direction of the arrow 110, and the person 104 is located in that direction. If the autonomous vehicle 102 were to continue in this direction, it could strike the person 104. In some implementations, however, sensor data from the vehicle sensors 106 is used to identify the person and avoid the collision. Sensor data associated with the person is acquired by the sensors, and this data is then used to calculate the distance between the person 104 and the vehicle 102. While the person 104 remains outside a threshold distance, there is no concern, and the vehicle 102 moves in accordance with the commands entered at the remote controller 108. As the vehicle moves closer to the person 104, the commands from the controller 108 can be altered, for example by reducing the maximum speed; and as the distance between the vehicle and the person 104 decreases further, the controller 108's commands may be ignored altogether.

This arrangement is also illustrated in FIG. 1. Specifically, a first region or zone 120 extends a distance d1 from the vehicle, while a second region or zone 122 is adjacent to the first region 120, on the side opposite the vehicle, and extends to a distance d2 from the vehicle. In this example, when objects such as the person 104 are outside both zones (i.e., at a distance greater than d2), commands from the remote controller are used to control the autonomous vehicle, for example to control speed or steering. The scenarios 112 and 114 show embodiments where the person 104 is located outside of both regions 120, 122. The graph 124 shows that at positions V0, V1, and V2, the vehicle can travel at up to a maximum speed Vmax. Vmax can be a predetermined speed that is suitable for the autonomous vehicle to navigate in the environment 100. In some implementations, Vmax may be significantly lower than the top speed of the autonomous vehicle. The autonomous vehicle 102, for example, may be capable of traveling at posted speed limits, such as up to 120 km/h; however, in some implementations, the environment 100 could be a service or manufacturing area, testing facility, parking lot, garage, or another covered area where much lower speeds are appropriate.

As the vehicle 102 moves closer to the person 104, as shown in scenario 116, the person is at a distance from the vehicle, namely the distance between V2 and P0, that is less than d2. Sensor data from the vehicle sensors 106 indicates that the person is in the second region 122. The graph 124 shows that the maximum speed at which the vehicle can proceed is lower than Vmax because of the person's proximity. When a command signal is sent from the remote controller 108, the vehicle will continue in the direction 110, but at a lower speed than when the person 104 was outside the region 122. The vehicle will still move at the speed specified by the remote controller if that speed is slower than the maximum speed associated with the position of the object.

The vehicle continues to travel in the direction indicated by the arrow 110 until the person is at the threshold distance d1 from the vehicle, which is at the boundary between the first region 120 and the second region 122. In the fourth scenario 118, further forward movement of the vehicle 102 is prevented because the person 104 has been detected at the first region 120. Any commands from the remote controller 108 that would cause the vehicle 102 to move further in the direction of the arrow 110 are overridden or ignored as long as the person is in or at the first region 120. The remote controller can still be used to move the vehicle laterally or in reverse (where possible), but the vehicle will not continue moving toward the person. The graph 124 shows that at position V3, the maximum speed in the direction of the arrow 110 is zero.
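One way such per-direction filtering could work, sketched under the assumption of a planar command vector and a known unit bearing toward the object; the names and geometry are ours, not the patent's:

```python
def filter_command(cmd_xy, object_bearing_xy, object_distance, d1=1.0):
    """Zero the component of the commanded motion that points toward the
    object once the object is within the inner threshold d1 (meters)."""
    if object_distance > d1:
        return cmd_xy  # object outside region 120: no restriction
    bx, by = object_bearing_xy                # unit vector toward the object
    toward = cmd_xy[0] * bx + cmd_xy[1] * by  # component aimed at the object
    if toward <= 0.0:
        return cmd_xy  # command already lateral or away from the object
    # Remove only the toward-object component; lateral/reverse motion remains.
    return (cmd_xy[0] - toward * bx, cmd_xy[1] - toward * by)
```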

As shown in the graph 124, the maximum vehicle velocity decreases linearly as relative movement between the vehicle 102 and an object (e.g., the person 104) causes the distance between the object and the vehicle to change from d2 to d1 within the second region 122. The decrease in speed may take a different shape or form in other embodiments. For example, the maximum speed may be reduced in discrete increments between d2 and d1, or according to a curve, e.g., such that the rate of reduction is slower when the object is near d2 than when it is nearer d1. Other aspects can be varied in addition to the rate of reduction. The distances d1 and d2 can be changed depending on the application. In a parking area, like a garage or parking lot, the distance d1 may be smaller because parking spaces are close together and there may be more obstacles, such as walls. In some examples, d1 can be anywhere from 1 foot to 5 feet, and d2 can be between 5 and 10 feet greater than d1.
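To make the shape of the graph 124 concrete, here is a hedged sketch of both the linear taper and one possible curved alternative in which the reduction is gentle near d2 and steep near d1; the value of V_MAX and the particular curve are assumptions, not values from the patent:

```python
V_MAX = 2.0  # assumed environment speed cap (m/s); the patent gives no value

def max_speed_linear(d, d1, d2):
    """Linear taper as drawn in graph 124: V_MAX beyond d2, zero at d1."""
    if d >= d2:
        return V_MAX
    if d <= d1:
        return 0.0
    return V_MAX * (d - d1) / (d2 - d1)

def max_speed_curved(d, d1, d2):
    """A curved alternative: reduction is gentle near d2, steep near d1."""
    if d >= d2:
        return V_MAX
    if d <= d1:
        return 0.0
    y = (d2 - d) / (d2 - d1)  # 0 at the outer edge, 1 at the inner edge
    return V_MAX * (1.0 - y * y)
```

The executed speed would then be the lesser of the commanded speed and the applicable limit, consistent with the passage above.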
