Invented by Timothy David Kentley-Klay, Rachad Youssef Gamara, Zoox Inc

The market for systems that configure active lighting in an autonomous vehicle to indicate its direction is growing rapidly as demand for self-driving cars continues to rise. With advances in technology and an increasing focus on safety, these systems play a crucial role in the smooth operation of autonomous vehicles on the road. Active lighting systems in autonomous vehicles are designed to communicate the vehicle's intentions and movements to other road users, including pedestrians, cyclists, and other drivers. These systems use a combination of LED lights, sensors, and advanced algorithms to provide clear, intuitive signals that make it easier for others to understand the vehicle's direction.

One key benefit of these systems is their ability to adapt to different driving scenarios. For example, when the autonomous vehicle is turning left, the active lighting system can project a left-turn signal onto the road surface, making it more visible to pedestrians and cyclists. Similarly, when the vehicle is changing lanes or merging onto a highway, the system can indicate the intended direction, reducing the chance of confusion or accidents.

The market for these systems is driven by several factors. First, the increasing adoption of autonomous vehicles by major automotive manufacturers and technology companies has created a need for advanced safety features, and active lighting systems are seen as an essential component in safely integrating autonomous vehicles into existing traffic. Second, government regulations and safety standards are pushing demand: many countries are actively working on legislation to regulate autonomous vehicles, and active lighting systems are likely to become a mandatory requirement to protect all road users. Furthermore, growing concern for pedestrian and cyclist safety is also driving the market. Active lighting systems can significantly enhance the visibility of autonomous vehicles, especially in low-light conditions or adverse weather, reducing the risk of accidents and improving overall safety.

In terms of market players, several automotive lighting manufacturers and technology companies are actively developing and commercializing these systems. Companies such as Osram, Hella, and Valeo are investing heavily in research and development to create innovative active lighting solutions for autonomous vehicles.

The market for systems that configure active lighting in autonomous vehicles is expected to see significant growth in the coming years. According to a report by MarketsandMarkets, the global market for automotive lighting is projected to reach $34.65 billion by 2022, with active lighting systems playing a crucial role in this growth.

In conclusion, the market for systems that configure active lighting in autonomous vehicles to indicate their direction is expanding rapidly. These systems are essential for safely integrating autonomous vehicles into existing traffic and for improving visibility and communication between autonomous vehicles and other road users. With the increasing adoption of autonomous vehicles and the focus on safety, demand for these systems is expected to grow substantially in the coming years.

The Zoox Inc. invention works as follows:

Systems and apparatus may be configured for actively controlled light emissions from a robotic vehicle. A light emitter (or emitters) of the robotic vehicle may be configured to indicate a direction of travel of the robotic vehicle and/or to display information (e.g., a greeting, a notice, a message, a graphic, passenger/customer/client content, vehicle livery, customized livery) using one or more colors of emitted light (e.g., orange for a first direction and purple for a second direction), one or more sequences of emitted light (e.g., a moving image/graphic), or positions of the light emitter(s) on the robotic vehicle (e.g., symmetrically positioned light emitters). The robotic vehicle does not necessarily have a front and back (e.g., a trunk/hood) and may be configured to travel in two directions, one opposite the other (e.g., a second direction opposite the first direction), with the light emitters indicating which of the two directions is in use.
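
As a rough illustration of how such a direction-indicating configuration might be expressed, the sketch below maps each direction of travel to an emitted color, a light sequence, and a set of emitter positions. The colors (orange/purple) come from the summary above, but every name (EmitterPattern, select_pattern, frame and position labels) is hypothetical and not taken from the patent.

```python
# Hypothetical sketch of a direction-indicating light-emitter configuration.
# Names and values are illustrative only; the patent does not specify this API.
from dataclasses import dataclass

@dataclass
class EmitterPattern:
    color: str            # e.g. "orange" for the first direction, "purple" for the second
    sequence: list[str]   # ordered frames of a moving image/graphic
    positions: list[str]  # which symmetrically placed emitters participate

# One pattern per direction of travel; the vehicle is bidirectional, so there
# is no fixed "front" or "back" -- only a first and a second direction.
DIRECTION_PATTERNS = {
    "first_direction": EmitterPattern("orange", ["frame_a", "frame_b"], ["end_1_left", "end_1_right"]),
    "second_direction": EmitterPattern("purple", ["frame_b", "frame_a"], ["end_2_left", "end_2_right"]),
}

def select_pattern(direction_of_travel: str) -> EmitterPattern:
    """Return the emitter pattern for the vehicle's current direction of travel."""
    return DIRECTION_PATTERNS[direction_of_travel]

if __name__ == "__main__":
    print(select_pattern("first_direction"))
```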

Background for System for configuring active lighting in an autonomous vehicle to indicate its direction

Autonomous vehicles, such as those designed to transport passengers in urban environments, may face many situations in which they need to inform other vehicles and people of their operational intent, for example the direction in which the vehicle is currently driving or is about to drive. Prospective passengers may also be unsure which vehicle is meant to serve their transportation needs: if several autonomous vehicles are present, the passenger who booked a ride may want to know which one is intended for them.

Accordingly, systems, apparatus, and methods are needed that can communicate the operational status and intent of robotic vehicles.

Various embodiments or examples can be implemented in many ways, including as a system, a method, an interface, software or firmware, logic or circuitry, or a set of executable instructions on a non-transitory computer-readable medium. Program instructions may be transmitted over optical, electronic, wireless, or other communication links and then stored on a non-transitory computer-readable medium. Examples of non-transitory computer-readable media include, but are not limited to, electronic memory such as DRAM, SRAM, ROM, EEPROM, and flash memory. One or more non-transitory computer-readable media can be distributed across a variety of devices, and the disclosed processes can be carried out in any order.

A detailed description of one or more examples is provided below, along with accompanying figures. The detailed description is not limited to any one example; rather, the scope of the invention is limited only by the claims, and numerous alternatives, modifications, and equivalents are encompassed. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided only as examples, and the described techniques may be practiced according to the claims without some or all of these details. To avoid obscuring the description, technical material that is known in the technical fields related to the examples has not been described in detail.

FIG. 1 shows one example of an implementation of an active safety system in an autonomous vehicle 100. The environment 190 can include objects that could potentially collide with the autonomous vehicle 100, including static or dynamic objects, or objects that present a danger to passengers (not shown) riding in the autonomous vehicle 100 or to the autonomous vehicle 100 itself. For example, an object 180 (e.g., an automobile) approaching the autonomous vehicle 100 may give rise to a potential collision 187 (e.g., by rear-ending the autonomous vehicle 100).

The autonomous vehicle system 101 may use sensor data 132 from a sensor system and autonomous vehicle location data 139 (e.g., generated by a localizer system of the autonomous vehicle 100) to sense the environment 190, detect the object 180, and take action to mitigate or avoid a collision between the object 180 and the autonomous vehicle 100. Sensor data 132 can include, but is not limited to, data representing a sensor signal (e.g., a signal generated by a sensor of the sensor system) and may be indicative of the environment 190 external to the autonomous vehicle 100. The autonomous vehicle location data 139 can include, but is not limited to, data representing a location of the autonomous vehicle 100 in the environment 190, such as position and orientation data (e.g., a local position or local pose), map data (e.g., from one or more map tiles), and data generated by a global positioning system (GPS) and an inertial measurement unit (IMU). The sensor system of the autonomous vehicle 100 can include the GPS, the IMU, or both.
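
The sketch below shows one way the two inputs described above (sensor data 132 and vehicle location data 139 combining local pose, map tiles, and GPS/IMU output) might be represented as simple data containers. All field names are assumptions chosen for readability; the patent does not define these structures.

```python
# Illustrative containers for sensor data 132 and vehicle location data 139.
# Field names are assumptions for readability, not the patent's own structures.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorData:
    lidar_returns: list = field(default_factory=list)   # raw sensing signals
    camera_frames: list = field(default_factory=list)
    radar_tracks: list = field(default_factory=list)

@dataclass
class VehicleLocationData:
    local_pose: tuple                      # (x, y, heading) in a local frame
    map_tiles: list                        # one or more map tiles covering the area
    gps_fix: Optional[tuple] = None        # raw GPS position, if available
    imu_state: Optional[dict] = None       # raw IMU measurements, if available

# Example: the localizer fuses GPS/IMU and map data into a local pose.
location = VehicleLocationData(local_pose=(12.3, -4.5, 1.57), map_tiles=["tile_042"])
print(location)
```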

The autonomous vehicle system 101 may include, but is not limited to, hardware, software, firmware, logic, circuitry, and computer-executable instructions embodied in a non-transitory computer-readable medium, or any combination of the foregoing. The autonomous vehicle system 101 can access one or more data stores, including but not limited to an object types data store 119. The object types data store 119 may contain data representing object types associated with object classifications for objects detected in the environment 190. For example, objects having an object classification of pedestrian may be associated with a variety of pedestrian object types (e.g., a "standing" pedestrian object, among others).
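
To make the idea of an object types data store concrete, here is a hypothetical layout mapping object classifications to static/dynamic subtypes of the kind described later in this section. The dictionary contents and the helper name are invented for illustration only.

```python
# Hypothetical layout for an object types data store (119): classifications
# mapped to subclasses/subsets such as static vs. dynamic variants.
OBJECT_TYPES_STORE = {
    "pedestrian": ["pedestrian_static", "pedestrian_dynamic"],   # e.g. standing vs. walking
    "school_bus": ["school_bus_static", "school_bus_dynamic"],   # parked vs. moving
    "automobile": ["automobile_static", "automobile_dynamic"],
}

def subtypes_for(classification: str) -> list:
    """Return the subclasses/subsets stored for an object classification."""
    return OBJECT_TYPES_STORE.get(classification, [])

print(subtypes_for("school_bus"))
```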

The path calculator 112 may be configured to generate data representing a trajectory of the autonomous vehicle 100 (e.g., trajectory 105), based on data representing a location of the autonomous vehicle 100 (e.g., local pose data in the vehicle location data 139) and other data. The path calculator 112 can be configured to generate future trajectories to be executed by the autonomous vehicle 100. In some examples, the path calculator 112 can be implemented in, or be part of, a planner system of the autonomous vehicle 100. In other examples, the path calculator 112 or the planner system can calculate data associated with a predicted motion of an object in the environment and determine a predicted object path associated with that motion. In some examples, the object path can constitute the predicted object path; in other examples, the object path can constitute a predicted object trajectory. In still other examples, the object path in the environment can represent a predicted object trajectory that may be identical or similar to the predicted object path.
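
The text does not specify how a predicted object path is computed; as one simple assumption, the sketch below extrapolates an object's motion at constant velocity over a short horizon. The function name, horizon, and time step are all hypothetical.

```python
# A toy sketch of predicting an object path by constant-velocity extrapolation.
# The patent does not prescribe any particular motion model; this is only one
# simple assumption a path calculator (112) or planner could start from.
def predict_object_path(position, velocity, horizon_s=3.0, dt=0.5):
    """Return predicted (x, y) points for an object over a short horizon."""
    x, y = position
    vx, vy = velocity
    return [(x + vx * t, y + vy * t)
            for t in [i * dt for i in range(1, int(horizon_s / dt) + 1)]]

# Example: an automobile at (10, 0) moving at 5 m/s toward the origin along x.
print(predict_object_path((10.0, 0.0), (-5.0, 0.0)))
```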

The object data calculator 114 may be configured to calculate data representing the location of the object 180 in the environment 190, data representing an object track associated with the object 180, and data representing an object classification associated with the object 180. The object data calculator 114 can calculate the data representing the object location, the data representing the object track, and the data representing the object classification using the sensor data 132. In some examples, the object data calculator 114 can be implemented in, or constitute, a perception system (or a portion thereof) configured to receive the data representing the sensor signal.
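
A rough sketch of what an object data calculator might produce from time-ordered detections: an object location, an object track (static vs. dynamic), and an object classification. The detection format and the 0.5 m movement threshold are assumptions made only for this example.

```python
# Sketch of an object data calculator (114): derive location, object track
# (static vs. dynamic), and classification for a detected object. The
# classification is taken from an upstream label; the patent leaves the
# actual method unspecified.
import math

def calculate_object_data(detections):
    """detections: time-ordered list of dicts with 'position' and 'label'."""
    latest = detections[-1]
    location = latest["position"]
    # Track is dynamic if the object has moved appreciably between detections.
    dx = location[0] - detections[0]["position"][0]
    dy = location[1] - detections[0]["position"][1]
    track = "dynamic" if math.hypot(dx, dy) > 0.5 else "static"
    classification = latest["label"]   # e.g. "pedestrian", "automobile"
    return location, track, classification

obs = [{"position": (10.0, 0.0), "label": "automobile"},
       {"position": (8.0, 0.0), "label": "automobile"}]
print(calculate_object_data(obs))
```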

The object classification determinator 118 may be configured to access the data representing object types 119 (e.g., subclasses of object classifications, or subsets of object classifications) and to compare the data representing the object track and the data representing the object classification with the data representing the object types in order to determine an object type. As one example, a detected object having an object classification of "car" may have an object type of "school bus". An object type may in turn have additional subclasses or subsets: a school bus that is parked can have an additional subclass of "static" (e.g., the school bus is not moving), whereas a school bus in motion can have an additional subclass of "dynamic" (e.g., the school bus is moving).
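
The following sketch illustrates the comparison described above: the determinator looks up the subtypes stored for an object classification and selects the one matching the observed object track. The store layout and type names are hypothetical and mirror the static/dynamic subclasses mentioned in the text.

```python
# Sketch of an object classification determinator (118): pick an object type
# by combining the object classification with the observed object track.
EXAMPLE_TYPES_STORE = {
    "school_bus": ["school_bus_static", "school_bus_dynamic"],
    "pedestrian": ["pedestrian_static", "pedestrian_dynamic"],
}

def determine_object_type(classification, track, types_store):
    """Return the stored subtype whose suffix matches the observed track."""
    for subtype in types_store.get(classification, []):
        if subtype.endswith(track):
            return subtype
    return f"{classification}_{track}"   # fall back to a composed type name

print(determine_object_type("school_bus", "static", EXAMPLE_TYPES_STORE))   # a parked school bus
print(determine_object_type("pedestrian", "dynamic", EXAMPLE_TYPES_STORE))  # a moving pedestrian
```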

The collision predictor 116 may be configured to use the data representing the object type, the data representing the trajectory of the object, and the data representing the trajectory of the autonomous vehicle 100 to predict a collision (e.g., collision 187) between the autonomous vehicle 100 and the object 180.
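
As an illustration of this prediction step, the sketch below flags a potential collision when the vehicle trajectory and the predicted object path pass within a threshold distance at the same time step and the object type is dynamic. The threshold and the per-step comparison are assumptions, not the patent's method.

```python
# Sketch of a collision predictor (116): flag a potential collision when the
# vehicle trajectory and the predicted object path pass within a threshold
# distance at the same time step, and the object type warrants concern.
import math

def predict_collision(vehicle_traj, object_path, object_type, threshold_m=2.0):
    if not object_type.endswith("dynamic"):
        return False   # here, only moving objects are treated as collision threats
    for (vx, vy), (ox, oy) in zip(vehicle_traj, object_path):
        if math.hypot(vx - ox, vy - oy) < threshold_m:
            return True
    return False

vehicle = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)]
obj = [(10.0, 0.0), (6.0, 0.0), (4.5, 0.0)]
print(predict_collision(vehicle, obj, "automobile_dynamic"))  # True
```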

A kinematics calculator 115 can be configured to calculate data representing one or more scalar and/or vector quantities associated with the motion of the object 180 in the environment 190, including but not limited to velocity, speed, acceleration, deceleration, momentum, local pose, and force. Data from the kinematics calculator 115 can be used to compute other data, including but not limited to data representing an estimated time to impact between the object 180 and the autonomous vehicle 100 and data representing a distance between the object 180 and the autonomous vehicle 100. In some examples, the kinematics calculator 115 can be configured to predict a likelihood that other objects in the environment 190 (e.g., cars, pedestrians, bicycles, motorcycles, etc.) are in an alert or controlled state, as opposed to being un-alert, out of control, or impaired (e.g., drunk). For example, the kinematics calculator 115 can be configured to estimate a probability that other agents (e.g., drivers or riders of other vehicles) are behaving rationally (e.g., based on the motion of the object they are driving or riding), which may influence the behavior of the autonomous vehicle 100, versus behaving irrationally (e.g., based on erratic motion of the object they are driving or riding). Sensor data received over time can be used to predict or estimate a future position of an object relative to a current or future trajectory of the autonomous vehicle 100, and a planner system of the autonomous vehicle 100 can be configured to implement vehicle maneuvers that are extra cautious and/or to activate a safety system of the autonomous vehicle 100.
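
One simple way to derive an estimated time to impact from the kinematic quantities listed above is shown below: divide the separation distance by the closing speed. This is a textbook approximation, not a formula given in the patent, and the function name and frame conventions are assumptions.

```python
# Sketch of a kinematics calculator (115): estimate closing speed and time
# until impact from relative position and velocity. Purely illustrative.
import math

def time_to_impact(rel_position, rel_velocity):
    """Estimate seconds until impact along the line between vehicle and object."""
    distance = math.hypot(*rel_position)
    # Closing speed: component of relative velocity directed toward the vehicle.
    closing_speed = -(rel_position[0] * rel_velocity[0] +
                      rel_position[1] * rel_velocity[1]) / max(distance, 1e-6)
    if closing_speed <= 0:
        return float("inf")   # the object is not approaching
    return distance / closing_speed

# Object 20 m behind, approaching at 5 m/s (e.g. a potential rear-end collision).
print(time_to_impact((-20.0, 0.0), (5.0, 0.0)))  # -> 4.0 seconds
```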

A safety system activator 120 can be configured to activate one or more safety systems of the autonomous vehicle 100 when a collision is predicted by the collision predictor 116 and/or when other safety-related events occur (e.g., an emergency maneuver by the vehicle, such as hard braking or rapid acceleration). The safety system activator 120 may be configured to activate an interior safety system, an exterior safety system, a drive system (e.g., causing the drive system to execute an emergency maneuver to avoid the collision), or any combination of the foregoing. For example, the drive system 126 may receive data configured to cause a steering system to alter the trajectory of the vehicle 100 from an initial trajectory 105 to a collision-avoidance trajectory 105a.
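
A minimal sketch of that activation logic follows: if a collision is predicted, engage the interior and exterior safety systems and, when an avoidance trajectory is available, hand it to the drive system. The print statements stand in for subsystem interfaces the text does not describe.

```python
# Sketch of a safety system activator (120): when a collision is predicted,
# dispatch to interior, exterior, and drive systems. The print statements are
# stand-ins for real subsystem interfaces, which are not specified here.
def activate_safety_systems(collision_predicted, collision_avoidance_trajectory=None):
    if not collision_predicted:
        return
    print("interior safety system: activated")
    print("exterior safety system: activated (e.g., signaling via light emitters)")
    if collision_avoidance_trajectory is not None:
        # e.g., cause drive system 126 to steer from trajectory 105 to 105a
        print(f"drive system: emergency maneuver onto trajectory {collision_avoidance_trajectory}")

activate_safety_systems(True, collision_avoidance_trajectory="105a")
```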

FIG. 2A shows one example of a flow diagram 200 for implementing an active safety system in an autonomous vehicle. At a stage 202 of flow diagram 200, data representing a trajectory 203 of an autonomous vehicle 100 in an environment external to the vehicle (e.g., environment 190) may be received (e.g., from a planner system of the autonomous vehicle 100).

At a stage 204, object data for an object (e.g., automobile 180) disposed in the environment (e.g., environment 190) can be calculated. Sensor data 205 can be accessed at the stage 204 to calculate the object data. The object data can include, but is not limited to, data representing the location of the object in the environment, an object track associated with the object (e.g., static for an object that is not in motion and dynamic for an object in motion), and an object classification associated with the object (e.g., pedestrian, dog, cat, bicycle, motorcycle, automobile, truck, etc.). The stage 204 can output one or more types of data associated with the object, including but not limited to data representing the object location 207 in the environment, data representing the object track 209, and data representing the object classification 211.

At a stage 206, a predicted path of the object in the environment can be calculated. For example, the stage 206 may receive the data representing the object location 207 and may process that data to generate data representing the predicted object path 213.

At a stage 208, data representing object types 215 can be accessed. At a stage 210, data representing an object type may be determined using the data representing the object track 209, the data representing the object classification 211, and the data representing the object types 215. Examples of an object type can include, but are not limited to, a pedestrian object type having a static object track (e.g., the pedestrian is not in motion), an automobile object type having a dynamic object track (e.g., the automobile is in motion), and an infrastructure object type having a static object track (e.g., a traffic sign or a lane marker). The stage 210 can output data representing the object type 217.

At a stage 212, a collision between the autonomous vehicle and the object can be predicted using the determined object type 217, the autonomous vehicle trajectory 203, and the predicted object path 213. For example, a collision may be predicted based in part on the determined object type 217 due to the object having a dynamic object track (e.g., the object is in motion), the trajectory of the autonomous vehicle potentially conflicting with the predicted path of the object (e.g., the trajectories could intersect or otherwise interfere with each other), and the object having an object classification 211 indicating that the object poses a collision threat (e.g., the object is classified as a bicycle, among other classifications).

At a stage 214, a safety system of the autonomous vehicle can be activated when the collision is predicted (e.g., at the stage 212). The stage 214 can activate one or more safety systems of the autonomous vehicle, including one or more interior safety systems, one or more exterior safety systems, and one or more drive systems (e.g., steering, propulsion, braking). The stage 214 can cause (e.g., by communicating data and/or signals) a safety system activator to activate one or more of the safety systems of the autonomous vehicle 100.
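
To tie stages 202 through 214 together, here is a compact end-to-end sketch of flow 200 using the same toy logic as the earlier sketches. Every function body, name, and threshold is an assumption chosen only so the example runs; it is not the patent's algorithm.

```python
# Compact end-to-end sketch of flow 200 (stages 202-214) with trivial
# stand-in logic for each stage.
import math

def flow_200(vehicle_trajectory, detections):
    # Stage 204: object location, track, and classification from sensor data.
    loc = detections[-1]["position"]
    moved = math.hypot(loc[0] - detections[0]["position"][0],
                       loc[1] - detections[0]["position"][1])
    track = "dynamic" if moved > 0.5 else "static"
    classification = detections[-1]["label"]
    # Stage 206: predicted object path (constant-velocity extrapolation).
    step = (loc[0] - detections[0]["position"][0], loc[1] - detections[0]["position"][1])
    predicted_path = [(loc[0] + step[0] * t, loc[1] + step[1] * t) for t in range(1, 4)]
    # Stages 208-210: determine the object type from classification and track.
    object_type = f"{classification}_{track}"
    # Stage 212: predict a collision if paths pass close and the object is dynamic.
    collision = track == "dynamic" and any(
        math.hypot(vx - ox, vy - oy) < 2.0
        for (vx, vy), (ox, oy) in zip(vehicle_trajectory, predicted_path))
    # Stage 214: activate safety systems when a collision is predicted.
    if collision:
        print("activating interior, exterior, and drive safety systems")
    return object_type, collision

print(flow_200([(0.0, 0.0), (3.0, 0.0), (6.0, 0.0)],
               [{"position": (10.0, 0.0), "label": "automobile"},
                {"position": (8.0, 0.0), "label": "automobile"}]))
```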

FIG. 2B depicts a flow diagram 250 that is another example of implementing an active safety system in an autonomous vehicle 100. At a stage 252 of flow diagram 250, data representing a trajectory 253 of the autonomous vehicle 100 may be received (e.g., from a planner system of the autonomous vehicle 100).

At a stage 254, the location of an object in the environment can be determined. Sensor data 255 can be processed (e.g., by a perception system) to determine object data associated with an object (e.g., object 180) disposed in the environment (e.g., environment 190). The object data may be determined using the sensor data 255 accessed at the stage 254 and can include, but is not limited to, data representing the location of the object in the environment, an object track associated with the object (e.g., static for an object that is not in motion and dynamic for an object in motion), an object classification associated with the object (e.g., pedestrian, dog, cat, bicycle, motorcycle, automobile, truck, etc.), and an object type associated with the object. The stage 254 can output data associated with the object, including but not limited to data representing the object location 257 in the environment, data representing the object track 261 associated with the object, data representing the object classification 263, and data representing the object type associated with the object.
