Autonomous Vehicles – Mahsa Ghafarianzadeh, Benjamin Sapp, Zoox Inc

Abstract for “Detecting blocked objects”

“A system and method for determining whether a stationary vehicle is a blocking vehicle in order to enhance control of an autonomous vehicle. The autonomous vehicle may send sensor data to a perception engine, which can detect a stationary vehicle within the environment. The perception engine can respond to this detection by determining feature values from the sensor data (e.g., features of the stationary vehicle, of other objects, and of the environment itself). The autonomous vehicle may input these feature values into a machine learning model that determines a probability that the stationary vehicle is a blocking vehicle, and then use that probability to generate a trajectory to control the vehicle's motion.

Background for “Detecting blocked objects”

“Stationary objects, such as vehicles stopped on the roadway or other stationary objects, can interfere with the operation of an autonomous vehicle. A stationary vehicle, such as a car, may be blocking the autonomous vehicle (for example, by being double-parked or incapacitated) or may merely be stopped. Limited sensor visibility may make it difficult to distinguish such vehicles, because determining whether a vehicle is blocking may depend on what is in front of it, which may be outside the sensors' view.

Environmental cues can increase the complexity of the detection. Two examples: a vehicle stopped at an intersection with a red light could nonetheless be a blocking vehicle rather than merely waiting for the traffic light to change, and a vehicle stopped at an intersection with a green light might be waiting in a line of traffic to turn rather than being a blocking vehicle. Incorrectly detecting a blocking vehicle can cause additional problems, such as an inability to return to the original lane.

As mentioned above, “blocking objects” (including blocking vehicles) are objects that prevent an autonomous vehicle from proceeding along a route or path. Double parking is common in urban areas, and double-parked vehicles must be identified as blocking vehicles and treated differently from non-blocking stationary vehicles. For example, the autonomous vehicle might be instructed to wait for a merely stopped vehicle to move, whereas it could be instructed to navigate around a double-parked vehicle. General rules, such as treating all stationary cars at green lights as blocking vehicles, are often incorrect and/or inadequate for the safe operation of an autonomous vehicle, and they do not reflect how humans drive. This disclosure relates to techniques (e.g., machines, programs, processes) for determining whether a stationary vehicle or object is a blocking object, and for controlling an autonomous vehicle based on that determination. The techniques described herein include a machine learning (ML) model that receives sensor data and determines whether a stationary vehicle is a blocking vehicle. This determination is not made according to a conditional rule (e.g., if it is true that there is a green light and that the vehicle has remained stopped, then indicate that the vehicle is blocking). Instead, the techniques described herein can determine the probability that a stationary vehicle is a blocking vehicle using an ML model that operates on sensor data.

In some cases, the techniques described herein include receiving raw and/or processed sensor data (e.g., sensor data processed by another machine-learning model or by software/hardware modules of the autonomous vehicle), determining that a vehicle in the roadway is stationary, and determining feature values from the sensor data (e.g., values indicating features such as the speed, brake-light state, height, and/or size of the stationary vehicle; traffic flow data; and classifications of the stationary vehicle and/or objects in its vicinity). In some cases, an ML model may then use these feature values to determine the probability that the stationary vehicle is a blocking vehicle.

“In some cases, an autonomous vehicle may also include a perception engine and/or a planner. The perception engine may include one or more ML models, as well as other computer-executable instructions, for detecting, identifying, classifying, and/or tracking objects from sensor data collected by the autonomous vehicle. The perception engine may include the ML model that determines whether the stationary vehicle is a blocking vehicle (referred to hereinafter as the “BV ML model”); although the BV ML model is discussed in particular, the discussion applies to ML models generally. The planner may include one or more ML algorithms, models, etc. for route planning, trajectory planning, decision evaluation, etc. The planner may generate a trajectory for controlling the motion of the autonomous vehicle using data from other components, such as the perception engine, the sensors, data received from other vehicles, and/or data received over a network connection.

The techniques described herein improve the operation of an autonomous vehicle by increasing detection accuracy over previous solutions (e.g., solutions using conditional rules). The techniques prevent the autonomous vehicle from unnecessarily remaining stopped behind a blocking vehicle, which reduces the vehicle's power consumption and minimizes wasted compute cycles. The techniques can also prevent the autonomous vehicle from unnecessarily changing lanes or re-routing when a stationary vehicle is not blocking, which likewise reduces power consumption and wasted compute cycles. These techniques can also improve the safety of autonomous vehicle operation, both for passengers and for other entities in the environment, because they allow the vehicle to perceive a situation more accurately. For example, the techniques could prevent the autonomous vehicle from changing lanes only to need to return to its original lane because of a line of vehicles it had not yet detected, and they may allow the vehicle to anticipate that doors of a non-blocking stationary vehicle may open, thus avoiding collisions.

A blocking vehicle, as used herein, is any stationary vehicle on a drivable road surface that hinders the progress of other vehicles. Some stationary vehicles are not blocking vehicles. For example, a vehicle that stops on the road surface in response to a red traffic light, or that waits for another vehicle in front of it to make progress, may not be considered a blocking vehicle. A blocking vehicle could be a double-parked vehicle, a delivery vehicle that has been parked to make a delivery, a stopped police car, an incapacitated vehicle, a vehicle whose driver has vacated it, etc. It is difficult to distinguish between a stationary vehicle that is not blocking and one that is blocking; in some cases, the only observable difference may be the length of time the vehicle has been stopped. The primary difference between a non-blocking stationary vehicle and a blocking vehicle is that the blocking vehicle cannot or will not move where other vehicles could otherwise make progress. The blocking vehicle can also be described as a stationary vehicle that is not following the generally accepted rules of the road, for example a vehicle that is not following a lane or not adhering to traffic laws. These techniques improve the operation of autonomous vehicles by preventing them from remaining stopped behind a blocking vehicle for an unreasonable period of time.

“Example Scenario”

“FIG. 1A illustrates an example scenario 100 in which an autonomous vehicle 102 approaches a roadway junction 104 that includes a traffic light 106. The autonomous vehicle 102 could be an autonomous vehicle configured to operate according to the U.S. National Highway Traffic Safety Administration Level 5 classification, meaning the vehicle can perform all safety-critical functions for the entire trip without the driver (or an occupant) being required to control it at any time. In other cases, however, the autonomous vehicle 102 could be fully or partially autonomous and have any level of classification currently in use or planned for the future. Moreover, in some cases, the techniques described herein for identifying blocking vehicles can also be used by non-autonomous vehicles.

“The autonomous vehicle 102 may receive sensor data from one or more sensors of the autonomous vehicle 102, which it may use to determine a trajectory for controlling its motion. The autonomous vehicle 102 could include a planner that receives data from the sensors, the perception engine, and/or other software and hardware modules. This data could include: a position of the autonomous vehicle 102 determined by a global positioning system (GPS) sensor; data about objects in the vicinity of the autonomous vehicle 102; route data that specifies a destination of the vehicle; global map data that identifies characteristics of roadways (e.g., features detectable in different sensor modalities that are useful for localizing the autonomous vehicle); and local map data that identifies characteristics detected in close proximity to the vehicle. The planner may use this data to generate a trajectory for controlling the motion of the vehicle. The autonomous vehicle 102 may also include a perception engine that receives sensor data from one or more sensors of the autonomous vehicle 102, determines perception data from the sensor data, and transmits the perception data to the planner. The planner can use the perception data to localize the autonomous vehicle 102 on the global map, to determine one or more trajectories, and/or to control the motion of the vehicle to traverse a path or route. For example, the planner might determine a route for the autonomous vehicle 102 from a first location to a second location, generate potential trajectories for controlling the motion of the vehicle (e.g., every 1 microsecond, every half second), and select one of the potential trajectories to guide the vehicle along the route; the selected trajectory may be used to generate drive control signals that are transmitted to drive components of the autonomous vehicle 102.

“The perception engine could include one or more ML models and/or other computer-executable instructions for detecting, identifying, segmenting, classifying, and/or tracking objects from sensor data collected by the autonomous vehicle 102. For example, the perception engine might detect an object in the environment and classify it (e.g., as a semi-truck, pickup truck, passenger vehicle, child, dog, or ball). The perception engine can also track the object's position, velocity, acceleration, and/or heading (historical, current, and/or predicted).

The perception engine can determine whether a vehicle is stationary based at least in part on data from sensors of the autonomous vehicle. For example, the perception engine may receive sensor data, classify an object in the environment of the autonomous vehicle 102 as a vehicle, and determine that the velocity of that vehicle does not meet a predetermined threshold velocity (e.g., a sensed velocity no greater than 0.1 meters per second or 0.05 meters per second). A vehicle can include, among other things, a passenger vehicle, a delivery truck, or any other machine used to transport people or objects. The perception engine might classify the detected vehicle as a stationary vehicle upon determining that the vehicle's velocity does not meet the predetermined threshold velocity, and in some cases it may additionally require that another condition be satisfied, such as a duration of time, a traffic light state, a sensed distance from a junction, etc. For example, the perception engine might classify a detected vehicle as a stationary vehicle upon determining from the sensor data that the vehicle is moving slower than a predetermined threshold velocity and has been doing so for a predetermined period of time (e.g., the vehicle has been stopped for 20 seconds).
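As a rough illustration of the thresholding just described, the following is a minimal sketch (not the patent's implementation) of classifying a tracked object as a stationary vehicle; the TrackedObject fields, the class labels, the 0.1 m/s threshold, and the 20-second dwell time are assumptions drawn from the examples in this section.

```python
# Minimal sketch: flag a tracked object as a stationary vehicle when its
# sensed speed stays below a threshold for a minimum dwell time.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    classification: str        # e.g. "passenger_vehicle", "delivery_truck", "bicycle"
    speed_mps: float           # current sensed speed in meters per second
    stopped_duration_s: float  # how long the track has been below the speed threshold

VEHICLE_CLASSES = {"passenger_vehicle", "delivery_truck", "bicycle"}  # assumed labels
STATIONARY_SPEED_THRESHOLD_MPS = 0.1   # e.g. 0.1 m/s or 0.05 m/s
MIN_STOPPED_DURATION_S = 20.0          # e.g. stopped for 20 seconds

def is_stationary_vehicle(obj: TrackedObject) -> bool:
    """True if the tracked object should be treated as a stationary vehicle."""
    if obj.classification not in VEHICLE_CLASSES:
        return False
    return (obj.speed_mps < STATIONARY_SPEED_THRESHOLD_MPS
            and obj.stopped_duration_s >= MIN_STOPPED_DURATION_S)

# Example: a passenger vehicle crawling at 0.05 m/s that has been stopped 25 s.
print(is_stationary_vehicle(TrackedObject("passenger_vehicle", 0.05, 25.0)))  # True
```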

“Returning to the example scenario 100 of FIG. 1A, the autonomous vehicle 102 may approach the junction that contains the traffic light 106 and may come across an object it classifies as a vehicle (vehicle 108), marked with a question mark in the figure. A technical difficulty arises when the perception engine does not have enough information to determine whether the stationary vehicle 108 is merely stopped (i.e., a non-blocking stationary vehicle) because it is obstructed by another object or a legal constraint (e.g., a stop light), or whether it is blocking the road. A blocking vehicle, as used herein, is any vehicle that is stopped or moving at a speed less than a predetermined threshold velocity and that impedes the progress of other vehicles. A blocking vehicle could be, for example, a double-parked vehicle, a delivery truck whose goods are being unloaded, a stopped police vehicle, an incapacitated vehicle (e.g., one with a failing drive system or a flat tire), a vehicle whose occupant has vacated it, a meter-reading vehicle, etc. A “drivable road surface,” as used herein, may include those portions of a roadway that are used under normative driving conditions.

“A blocking vehicle differs from a non-blocking stationary vehicle in that it is stopped where other vehicles could otherwise make progress (or is otherwise stopped contrary to generally accepted driving rules), whereas a non-blocking stationary vehicle does not hinder other vehicles (e.g., it is stopped at a red traffic light, an object has entered the roadway, or another vehicle is stopped in front of it). A blocking vehicle may also differ from a non-blocking stationary vehicle in that it blocks the trajectory of the autonomous vehicle 102 (e.g., it occupies at least part of the same lane as the autonomous vehicle 102, or it coincides with a path and/or trajectory of the autonomous vehicle 102).

“The perception engine might not be able to resolve the ambiguity of whether a detected vehicle is a non-blocking stationary vehicle or a blocking vehicle, and the planner may therefore not have sufficient data to generate a trajectory that moves the autonomous vehicle 102 appropriately for the scenario.”

“FIG. 1B corresponds to the example scenario 100 of FIG. 1A and reflects a further complication. FIG. 1B shows the limited and imperfect view of the scenario that is available to the autonomous vehicle 102 through its sensor data. Although future sensor and perception advancements will likely increase the portion of a scenario reflected in sensor data, it is likely that the autonomous vehicle 102 will never have access to 100% of the states, objects, and other information present in, or having an effect on, a scenario. FIG. 1B shows the example scenario 100 as reflected in sensor data received by the planner of the autonomous vehicle 102. A combination of global map data and GPS data could indicate to the autonomous vehicle 102 that a junction is 100 m in front of the vehicle, and sensor data may also indicate that the autonomous vehicle 102 is in a lane authorized to enter the junction. The autonomous vehicle 102 may also receive sensor data from which it determines that vehicles 110 and 112 are stationary vehicles.

“Due to limitations of the sensor(s), the autonomous vehicle 102 might not receive sensor data that would allow it to detect vehicles 114-122. Vehicles 114-122 might, for example, be out of a field of view, beyond the reliable operational range of a sensor, or occluded. Some examples may employ conditional rules that define the conditions under which a blocking vehicle indication will be output. The conditional rules can be pre-programmed by an administrator or programmer. A hard-coded rule might specify, for example, that the perception engine output a blocking vehicle indication when a detected vehicle has been classified as stationary and remains stationary for a predetermined time.

However, these hard-coded rules might not account for all possible scenarios. The example hard-coded rule above might work in a case where the stationary vehicle 108 is incapacitated and is therefore truly a blocking vehicle. As FIG. 1A shows, however, there could be a long line of vehicles in front of the stationary vehicle 108 that the autonomous vehicle 102 is unable to detect. In a scenario such as scenario 100, a blocking vehicle indication produced by such a rule could cause the planner to operate the vehicle incorrectly or even dangerously. For example, the planner could initiate a lane change, assuming that the autonomous vehicle 102 can return to the original lane after passing the stationary vehicle 108 that the perception engine identified as a blocking vehicle; the long line of vehicles in front of the stationary vehicle 108 may then make it difficult for the autonomous vehicle 102 to return to its original lane. Classifying the stationary vehicle 108 as a blocking vehicle in this scenario is a false positive, and false positives can disrupt the operation of the autonomous vehicle 102. Rates of false positives and/or false negatives may simply be too high with hard-coded rules.

“Furthermore, adding a new feature to such rules (e.g., checking whether the stationary vehicle's hazard lights are flashing) would require substantial time and human coding. This is undesirable, as it could take the autonomous vehicle out of operation or require human reconfiguration, and it may also require multiple iterations of developing and testing new configurations of the perception engine.

“In some cases, the techniques described herein may instead probabilistically determine whether a stationary vehicle is a blocking vehicle via an ML model, referred to herein as the BV ML model.

“Example Process”

“FIG. 2 is a pictorial flow diagram of an example process 200 for determining, at an autonomous vehicle, whether a stationary vehicle is a blocking vehicle and for controlling the autonomous vehicle based on that determination. At operation 202, the example process 200 may include receiving sensor data from one or more sensors of an autonomous vehicle 204. The sensor data can additionally or alternatively be received from another vehicle, from a remote sensor (e.g., a weather station, a traffic control service, an emergency service), and/or from sensors located in infrastructure (e.g., sensors located on light posts or buildings).

“At operation 206, the example process 200 might include detecting a stationary vehicle 208 from the sensor data. This could include detecting an object, classifying it as a vehicle, and determining that its speed does not meet a threshold. The threshold could be predetermined (e.g., the vehicle is moving no faster than 0.05 meters per second) or relative (e.g., the vehicle is moving at less than 20% of the average speed of traffic). Operation 206 can additionally or alternatively include determining whether other features derived from the sensor data satisfy one or more conditions, for example that the vehicle is at least a certain distance from a junction, that the vehicle is of a specific type (e.g., bicycle, passenger vehicle), or that a traffic light is green.

“Operation 206 could include detecting all stationary vehicles within “sight” of the sensors of the autonomous vehicle 204 (i.e., all vehicles within a field of view of one or more sensors). Operation 206 can additionally or alternatively include detecting stationary vehicles within a predetermined distance; for example, the autonomous vehicle 204 might detect stationary vehicles within 50 meters. Limiting the distance at which the autonomous vehicle 204 detects stationary vehicles conserves storage and processing resources, while still allowing the planner of the autonomous vehicle 204 to make good decisions about the trajectory used to control the vehicle.

“At operation 208, the example process 200 could include determining feature values 210 based at least in part on the sensor data. The feature values 210 may correspond to features upon which a blocking vehicle machine learning (BV ML) model relies to determine whether the stationary vehicle 208 is a blocking vehicle, and they may be determined by a perception engine of the autonomous vehicle 204. The autonomous vehicle 204 might attempt to determine feature values 210 for at least a subset of the features for which the BV ML model is configured. The table below shows example feature values 210 determined by the perception engine; these correspond to features upon which the BV ML model could rely to determine whether the stationary vehicle 208 is a blocking vehicle. In this example, some feature values 210 were not determined by the perception engine or were not available from the sensor data (e.g., “blocked by another object,” “other object behavior”). These and other features are discussed with regard to FIGS. 3A-3F.”

“Feature                        Feature Value
Distance from junction          14.3 meters
Brake lights on                 0
SV speed                        0.001 m/s
Traffic flow normality          0.87
Blocked by another object       ? (not determined)
Person near vehicle             0
Other object behavior           ? (not determined)
Type of agent                   Passenger vehicle

“While several examples of features have been listed (and are discussed further with reference to FIGS. 3A-3F), the number and types of features are not limited to those listed. Many other features are contemplated, such as object bounding-box size, object color, object size, object height, a speed and/or direction of travel, object yaw (e.g., relative to the orientation of the autonomous vehicle), a lane identification (e.g., left, center, right), a GPS location, a detected logo and/or text associated with the stationary vehicle, etc. Any of these features may be represented by Boolean values (e.g., indicating whether the feature is present), numbers (e.g., real numbers such as a detected speed), text (e.g., a classification), or any other suitable representation of data.

In some cases, the feature values determined by the perception engine may include: a speed of the stationary vehicle 208; a traffic signal state (e.g., a traffic light state, the existence of a traffic sign); traffic flow data and a relation of the stationary vehicle's data to the traffic flow data (e.g., noisy radar sensor data that may indicate other vehicles moving around the stationary vehicle 208); proximity of persons to the stationary vehicle 208; a classification of the stationary vehicle 208 (e.g., police car, passenger vehicle, delivery truck); a door open/closed condition; and related data.
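To make the idea of feature values concrete, here is a purely illustrative sketch of packing perception-engine outputs into a fixed-order vector for a BV ML model; the feature names, the encoding, and the use of NaN for values the perception engine could not determine are all assumptions rather than the patented format.

```python
# Illustrative only: map a possibly-incomplete feature dict to a fixed model input order.
from typing import Dict, List, Optional

FEATURE_ORDER = [
    "distance_from_junction_m",
    "brake_lights_on",            # 1.0 / 0.0
    "sv_speed_mps",
    "traffic_flow_normality",     # e.g. a percentile-like score in [0, 1]
    "blocked_by_another_object",
    "person_near_vehicle",
    "agent_is_passenger_vehicle",
]
MISSING = float("nan")  # many tree libraries treat NaN as "value unavailable"

def encode_features(values: Dict[str, Optional[float]]) -> List[float]:
    """Return feature values in the model's expected order, NaN where unknown."""
    return [MISSING if values.get(name) is None else float(values[name])
            for name in FEATURE_ORDER]

# Example mirroring the table above: two features could not be determined.
example = {
    "distance_from_junction_m": 14.3,
    "brake_lights_on": 0.0,
    "sv_speed_mps": 0.001,
    "traffic_flow_normality": 0.87,
    "blocked_by_another_object": None,   # unknown
    "person_near_vehicle": 0.0,
    "agent_is_passenger_vehicle": 1.0,
}
print(encode_features(example))
```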

“At operation 212, the example process 200 may include determining, using a BV ML model 214 and the feature values 210, whether the stationary vehicle 208 is a blocking vehicle. The feature values 210 may be input to the BV ML model 214, which, depending on its configuration, may output a probability that the stationary vehicle is a blocking vehicle. The BV ML model 214 can include a decision tree or any arrangement thereof, such as a random forest or a boosted ensemble of decision trees; a directed acyclic graph (DAG), where nodes are organized as a Bayesian network; a deep learning algorithm (e.g., artificial neural networks, deep belief networks, deep stacking networks, recurrent neural networks (RNNs)); etc. In some cases, the BV ML model 214 includes a gradient-boosted decision tree.

“For example, if the BV ML model 214 contains a decision tree, it may output a negative or positive number indicating whether the stationary vehicle 208 is a blocking vehicle. The BV ML model 214 could include multiple decision trees whose outputs may be weighted; one decision tree might be weighted so that it outputs -1.63 or +1.63, while another decision tree could be weighted so that it outputs +0.76 or -0.76. The perception engine can sum the outputs of all the decision trees to calculate the probability that the stationary vehicle 208 is a blocking vehicle.
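The paragraph above describes summing weighted per-tree outputs to obtain a probability. A common way to map such a boosted-tree score onto (0, 1) is a logistic (sigmoid) link; the sketch below assumes that mapping, which the text itself does not specify.

```python
# Sketch (assumption, not the patented implementation): sum signed, weighted
# per-tree scores and squash the total through a logistic function.
import math

def blocking_probability(tree_scores):
    """Sum boosted-tree outputs and map the score to a probability in (0, 1)."""
    score = sum(tree_scores)               # e.g. +1.63 + (-0.76) + ...
    return 1.0 / (1.0 + math.exp(-score))  # logistic / sigmoid link

# Two trees from the example: one votes "blocking", one votes "not blocking".
print(blocking_probability([+1.63, -0.76]))  # ~0.70 -> leans toward blocking
```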

“Where the BV ML model 214 contains a neural network, the BV ML model 214 may include an input layer, one or more hidden layers of nodes, and an output layer. The input layer of nodes can be configured to receive the one or more feature values and activate the hidden layers, and the output layer can be configured to receive stimuli from the hidden layers and output an indication according to the most activated node(s). The output layer might contain two nodes: a positive indication and a negative indication that the stationary vehicle is a blocking vehicle. In other words, the output may reflect both a confidence that the stationary vehicle is a blocking vehicle and a confidence that it is not. Where the BV ML model 214 includes a decision tree, the decision tree could be a classification tree that outputs whether or not the detected vehicle is classified as a blocking vehicle.

In some cases, the BV ML model 214 may be generated from labeled data. For example, the BV ML model 214 could include a deep learning model that learns to predict whether a stationary vehicle 208 is a blocking vehicle from input sample feature values, where each sample's feature values are associated with a label indicating whether they came from a scenario in which the stationary vehicle was a blocking or non-blocking vehicle (i.e., a ground-truth label). FIG. 2 depicts the BV ML model 214 as a decision tree, with shaded nodes representing nodes reached by pushing the input feature values through the tree. The illustrated decision tree might output a weighted probability of 0.623 that the stationary vehicle 208 is a blocking vehicle. Positive values could indicate that the stationary vehicle 208 is a blocking vehicle, while negative values could indicate that it is not, although any sign convention may be used.

“At operation 216, the example process 200 might include transmitting the probability determined by the perception engine to the planner, which determines a trajectory for controlling the autonomous vehicle, as shown at 218 and 220. To generate a trajectory for controlling the autonomous vehicle 204, the planner might use the probability that the stationary vehicle 208 is a blocking vehicle. FIG. 2 shows two examples of trajectories the planner might generate in alternative scenarios.

“In example scenario 218, the perception engine might have indicated that the stationary vehicle 208 is a blocking vehicle. The planner of the autonomous vehicle 204 might respond to this indication by generating a trajectory 222 that causes the autonomous vehicle 204 to merge into another lane. The perception engine may also provide the planner with a classification of the blocking vehicle (e.g., police car, meter reader, delivery vehicle). The classification could be a semantic label in some cases, and this semantic label could be included among the feature values.”

“In example scenario 220, the perception engine may have indicated that the stationary vehicle 208 is not a blocking vehicle (i.e., the stationary vehicle 208 is a non-blocking stationary vehicle). The planner of the autonomous vehicle 204 may respond to this indication by generating a trajectory 234 that causes the autonomous vehicle 204 to continue moving forward in the same direction.

“In some cases, the planner and/or the perception engine may determine to transmit a signal requesting remote assistance. For example, the autonomous vehicle 204 might transmit a signal to a remote computing device, which may have greater computing power, allowing the remote computing device to determine the probability that the stationary vehicle 208 is a blocking vehicle or allowing a human operator to input that information. The planner might transmit the signal to the remote computing device if the probability does not meet a threshold (e.g., 0.25), which may indicate low confidence that the stationary vehicle 208 is or is not a blocking vehicle. In some cases, the autonomous vehicle 204 may transmit feature values and/or sensor data to the remote computing device.

“Example Features”

“FIGS. 3A-3F show a range of features whose values the BV ML model could use to determine whether a stationary vehicle is a blocking vehicle. The perception engine may determine the feature values from sensor data. The operations described below can also be performed as part of operation 212 of FIG. 2.”

“FIG. 3A shows an example scenario 300 in which an autonomous vehicle 302 approaches a stationary vehicle 304 near a junction 306 that includes a traffic signal 308 (in this example, a traffic light). The autonomous vehicle 302 could correspond to the autonomous vehicle 204, and that discussion is pertinent here. The autonomous vehicle 302 could detect that vehicle 304 is stationary using a perception engine, according to any of the techniques described herein. The autonomous vehicle 302 can respond to detecting that the vehicle 304 is stationary by determining feature values from sensor data. For example, the autonomous vehicle 302 might determine the condition of the lights of the stationary vehicle 304 (e.g., brake lights on/off, hazard lights on/off), a distance between the stationary vehicle 304 and the junction 306, and/or a state of the traffic signal 308 (e.g., red light, yellow light, green light, or the presence or absence of a traffic sign). The autonomous vehicle 302 can additionally or alternatively determine the height and/or size of the stationary vehicle 304, from which the BV ML model may determine the likelihood that the stationary vehicle is a blocking vehicle. In some cases, a feature value could indicate the relative heights of the autonomous vehicle 302 and the stationary vehicle 304 (e.g., whether the autonomous vehicle 302 is taller than the stationary vehicle 304).

In some cases, the perception engine may additionally or alternatively identify text and/or a logo associated with the stationary vehicle 304 as a feature value. For example, a machine-learning algorithm may determine that the stationary vehicle 304 has a pizza delivery sign or a taxi service sign (e.g., on top of the vehicle), or may determine that text and/or a logo is associated with the vehicle (e.g., UPS text and/or logo, Uber text and/or logo). The text and/or logo can be used to increase the confidence of a classification of the type of the stationary vehicle 304 (e.g., delivery vehicle, public transportation vehicle).

“The perception engine may, in some cases, additionally or alternatively determine traffic flow data from the sensor data. Traffic flow data can include data about additional objects that the perception engine classifies as vehicles (e.g., vehicles 314, 316, and 318). This data could include a velocity 320 of another vehicle, a distance 322 between that vehicle and a preceding and/or following vehicle, etc.

“For example, FIG. 3B illustrates an example distribution 324 of traffic flow data. This distribution could reflect the frequency of different detected feature values related to other objects (e.g., a histogram or other distribution), such as a distribution of vehicle speeds determined by the perception engine. Example distribution 324 shows that the perception engine determined that most of the detected vehicles move at speeds between 55 and 110 km/h. Although FIG. 3B shows velocity, it is understood that any other feature that may be determined per vehicle could also be represented as a distribution, such as distance between vehicles, door open/closed state, person-near-vehicle, lane of the vehicle, etc., although some features (e.g., a traffic light indicator) might not be unique to each vehicle and may not be represented in this way. Feature values that are compared to the frequency distribution of other vehicles' values are referred to herein as traffic flow data.

“The perception engine can use a frequency distribution of the traffic flow data to produce a feature value that indicates at least a percentile of the traffic flow data associated with the stationary vehicle 304. The feature value may additionally include characteristics of the distribution (e.g., whether there is a long tail, whether the tail is eliminated by restricting the traffic flow data reflected in the distribution to vehicles of specific lane(s), the width of the distribution, the height of the distribution) and whether the traffic flow data associated with the stationary vehicle 304 lies within a tail of the distribution. FIG. 3B shows thresholds 326 and 328, which may indicate the location of a quartile and/or percentile (e.g., the 95th and 5th percentiles, respectively). The perception engine may use these thresholds 326 and 328 to determine whether a velocity or another feature value associated with the stationary vehicle 304 lies within a tail of the distribution (of feature values of other observed vehicles). Additionally or alternatively, the perception engine might determine whether the velocity or another feature value associated with the stationary vehicle 304 falls inside or outside two standard deviations from the mean of a Gaussian fit to the distribution. Regardless of the method, the perception engine can determine that the velocity or another feature of the stationary vehicle 304 is abnormal if it lies outside the normal range of the distribution.
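The percentile-based tail test described above can be sketched in a few lines; the function name and the 5th/95th percentile thresholds are illustrative assumptions consistent with the example thresholds 326 and 328.

```python
# Sketch: flag a stationary vehicle's speed as abnormal when it falls in a
# tail of the observed traffic-flow distribution.
import numpy as np

def is_speed_abnormal(other_vehicle_speeds_mps, stationary_vehicle_speed_mps,
                      low_pct=5.0, high_pct=95.0):
    """True if the speed lies outside the [low_pct, high_pct] percentile band."""
    speeds = np.asarray(other_vehicle_speeds_mps, dtype=float)
    lo, hi = np.percentile(speeds, [low_pct, high_pct])
    return stationary_vehicle_speed_mps < lo or stationary_vehicle_speed_mps > hi

# Traffic mostly flowing at 15-30 m/s; a vehicle at 0.001 m/s is in the lower tail.
flow = [15.2, 18.9, 22.4, 25.0, 27.3, 30.1, 16.8, 21.5]
print(is_speed_abnormal(flow, 0.001))  # True -> candidate blocking-vehicle cue
```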

“In this way, the perception engine could use the traffic flow data to identify the data corresponding to the stationary vehicle 304 as abnormal. The stationary vehicle 304 is more likely to fall into the lower tail, but in some cases the upper tail could be used to identify an erratic or otherwise abnormal vehicle.

“In some cases, the perception engine might generate traffic flow data for vehicles of the same type or general class as the stationary vehicle 304. For example, if the stationary vehicle 304 is classified as a bicycle, the perception engine might generate traffic flow data for other bicycles it has detected. As another example, where the stationary vehicle 304 is classified as a passenger car, the perception engine might generate traffic flow data for other objects classified as vehicles, passenger cars, and/or motor vehicles.

“FIG. 3C shows an alternate or additional feature 330 for which the perception engine might determine a feature value. The perception engine of the autonomous vehicle 302 may generate a feature value that describes the behavior of other objects relative to the stationary vehicle 304. For example, the perception engine might store a track 334 of another vehicle 332 and output the track as a feature value.

The perception engine can additionally or alternatively output an indication of what the track represents as a feature value; for example, the indication could be that vehicle 332 is changing lanes or remaining stationary. Similar to the traffic flow data, the perception engine can relay the frequency with which other objects repeat a behavior. This frequency might be restricted to objects in the same lane as the autonomous vehicle 302, or it may be weighted more heavily for behavior that originated within the lane of the autonomous vehicle 302. For example, vehicles 336 and 338 could be parked vehicles in a different lane; limiting the frequency of behavior exhibited by other vehicles to the lane of the autonomous vehicle 302 can therefore increase the accuracy of the trajectory the planner generates in response to the frequency. If the determination were not restricted to the lane of the autonomous vehicle 302, the feature value might indicate that two vehicles (66% of observed vehicle behavior) remained stationary while one vehicle (33%) passed the stationary vehicle. By constraining the determination to the lane of the autonomous vehicle 302, the feature value could instead indicate that one vehicle (100%) passed the stationary vehicle, as shown in the sketch after this paragraph.
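The lane-restriction example above can be expressed as a small frequency computation. The sketch below uses assumed names (ObservedBehavior, behavior strings) and reproduces the 66%/33% versus 100% outcome described in the text.

```python
# Sketch: how often did other vehicles pass vs. stay behind, optionally
# restricted to the ego vehicle's lane?
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ObservedBehavior:
    lane_id: int    # lane the observed vehicle occupies
    behavior: str   # e.g. "passed_stationary_vehicle", "remained_stationary"

def behavior_frequencies(observations: List[ObservedBehavior],
                         ego_lane_id: Optional[int] = None) -> dict:
    """Fraction of each behavior, optionally limited to the ego vehicle's lane."""
    if ego_lane_id is not None:
        observations = [o for o in observations if o.lane_id == ego_lane_id]
    counts = Counter(o.behavior for o in observations)
    total = sum(counts.values()) or 1
    return {b: n / total for b, n in counts.items()}

obs = [
    ObservedBehavior(lane_id=2, behavior="remained_stationary"),       # parked, other lane
    ObservedBehavior(lane_id=2, behavior="remained_stationary"),       # parked, other lane
    ObservedBehavior(lane_id=1, behavior="passed_stationary_vehicle"), # ego lane
]
print(behavior_frequencies(obs))                 # unrestricted: ~66% / ~33%
print(behavior_frequencies(obs, ego_lane_id=1))  # lane-restricted: 100% passed
```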

“In an alternate or additional example, the autonomous vehicle 302 may detect all stationary vehicles, not restricted to one lane, within the range of the sensors or within a predetermined threshold distance (e.g., 50 m, 100 m). Information about stationary vehicles in other lanes can be used to anticipate how other vehicles might react, such as another vehicle planning a route around a double-parked vehicle and into the lane of the autonomous vehicle. A feature value could reflect the locations of these other stationary vehicles and/or any of their other feature values.

“FIG. 3D shows an alternative or additional feature 340. The shaded portion of FIG. 3D shows a region 342 that is occluded from at least one sensor of the autonomous vehicle 302 (e.g., by surfaces of the stationary vehicle 304). The perception engine of the autonomous vehicle 302 may produce a feature value that indicates the probability that an occluded object 344 exists in front of the stationary vehicle 304 or elsewhere within the occlusion region 342.

In some cases, the perception engine might determine that the sensor data contains a residual that could indicate the existence of the occluded object 344. The residual could include a portion of an image and/or video that suggests the existence of an object but is not classifiable, a SONAR/RADAR anomaly, and so forth. Using RADAR as an example, the perception engine might determine whether the occluded object 344 exists by determining the portion of the RADAR data attributable to the stationary vehicle 304 and/or other environmental objects (e.g., the roadway); the residual RADAR data could then indicate that the occluded object 344 is present. For example, the RADAR data could include returns that indicate the existence of an object at a distance from the autonomous vehicle 302 greater than the distance between the autonomous vehicle and the stationary vehicle 304. In certain instances, these reflections could reach a sensor of the autonomous vehicle 302 by refraction from the occluded object 344 under the chassis of the stationary vehicle 304 and/or via a nearby object (e.g., a wall).
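As a highly simplified illustration of the residual idea, the sketch below treats radar returns that cannot be attributed to known objects, and that lie beyond the stationary vehicle, as evidence of a possible occluded object; the attribution mask and the resulting score are assumptions rather than the patent's method.

```python
# Sketch: fraction of radar returns that are unattributed AND beyond the
# stationary vehicle, usable as one feature value hinting at an occluded object.
import numpy as np

def occlusion_residual(return_ranges_m: np.ndarray,
                       attributed_mask: np.ndarray,
                       stationary_vehicle_range_m: float) -> float:
    """Score in [0, 1]: unattributed returns lying beyond the stationary vehicle."""
    if return_ranges_m.size == 0:
        return 0.0
    unattributed = return_ranges_m[~attributed_mask]
    beyond = unattributed[unattributed > stationary_vehicle_range_m]
    return beyond.size / return_ranges_m.size

ranges = np.array([8.0, 8.2, 14.5, 15.1, 15.3])            # meters
attributed = np.array([True, True, False, False, False])   # first two: the stationary vehicle
print(occlusion_residual(ranges, attributed, stationary_vehicle_range_m=9.0))  # 0.6
```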

“Some examples may determine a distance between the stationary vehicle 304 and the occluded object 344 as a feature value. This could include cases where the occluded object 344 is not within a direct “view” of any sensor, as well as examples where the occluded object 344 is at least partially within “view” of the sensors.

“FIG. 3E shows an alternate or additional feature for which the perception engine might determine a value. The perception engine of the autonomous vehicle 302 may generate a feature value that indicates the presence of a person 348 near the stationary vehicle 304. The feature value could indicate a distance between the person 348 and the stationary vehicle 304, whether the person 348 is leaving or returning to the stationary vehicle 304, how many people are within a threshold distance of the stationary vehicle 304, and/or whether any door or other aperture of the stationary vehicle 304 is open. In some cases, the feature values can also indicate a yaw 350 of the stationary vehicle 304, which may be measured relative to the pose of the autonomous vehicle 302 or relative to the direction of the lane.

“FIG. 3F shows an alternate or additional feature 352 for which the perception engine might determine a value. The feature value generated by the perception engine of the autonomous vehicle 302 could indicate classifications of objects near the stationary vehicle (e.g., cone, flagger, flare) and/or a meta-classification describing a group of objects (e.g., delivery, construction zone, incapacitated vehicle).

The perception engine can determine one or more feature values (e.g., a distance of 15 meters from the stationary vehicle to the junction, a classification of “delivery truck,” etc.), and the BV ML model can push these and other feature values through its nodes to determine whether the stationary vehicle is a blocking vehicle.

“Example Architecture”

“FIG. 4 illustrates an example architecture that includes a vehicle system 402. The vehicle system 402 could represent at least a portion of the autonomous vehicle 204 and/or 302. In some cases, this architecture can be used to control an autonomous vehicle that encounters a stationary vehicle.

“In some cases, the vehicle system 402 could include processor(s) 404 and/or memory 406. FIG. 4 shows these elements in combination, but it is understood that they could be separate elements of the vehicle system 402 and that portions of the system might be implemented as software and/or hardware in certain examples.

“Processor(s) 404 may include a uniprocessor system including one processor or a multiprocessor system including several processors (e.g., two, four, eight, or any other number). The processor(s) 404 may be any processor capable of executing instructions. In various implementations, the processor(s) 404 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs); in multiprocessor systems, the processors 404 may implement different ISAs. The processor(s) 404 may include a central processing unit (CPU), a graphics processing unit (GPU), Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), and/or a combination thereof.

“The example vehicle system 402 could include memory 406. In some examples, the memory 406 may include a non-transitory computer-readable medium configured to store executable instructions/modules, data, and/or data items accessible by the processor(s) 404. In various implementations, the non-transitory computer-readable medium may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. The non-transitory computer-readable medium may store program instructions and data that implement the desired operations. In other implementations, program instructions and/or data may be received, sent, or stored on different types of computer-accessible media, such as non-transitory readable media or similar media, separate from the non-transitory computer-readable medium. A non-transitory computer-readable medium can include storage media such as flash memory (e.g., solid state memory) or magnetic or optical media (e.g., a disk) coupled to the example vehicle system 402 via an input/output (“I/O”) interface 408.

“Furthermore, though shown as a single unit in FIG. 4, it is understood that the processor(s) 404 and memory 406 could be distributed among multiple computing devices of the vehicle, among multiple data centers, etc.

“In some cases, the input/output (I/O) interface 408 may be configured to coordinate I/O traffic between the processor(s) 404, the memory 406, the network interface 410, the sensor(s) 412, the I/O devices 414, the drive system 416, and/or other hardware of the vehicle system 402. The I/O devices 414 can include display(s), internal and/or external speakers, and/or passenger input device(s), among other things. The I/O interface 408 may perform data transformations, such as timing or protocol conversions, to transform data signals from one component (e.g., the non-transitory computer-readable medium) into a format suitable for another component (e.g., the processor(s) 404). The I/O interface 408 may support devices attached through various peripheral bus standards, such as the Universal Serial Bus (USB) standard, the Peripheral Component Interconnect (PCI) standard, or a variant thereof. In some implementations, the function of the I/O interface 408 may be split into separate components, such as a north bridge and a south bridge. In some cases, some or all of the functionality of the I/O interface 408, such as the interface to the memory 406, may be integrated directly into the processor(s) 404 and/or one or more other components of the vehicle system 402.

“The example vehicle system 402 could include a network interface 410 configured to establish a communication connection (i.e., the “network”) between the vehicle system 402 and one or more other devices. For example, the network interface 410 can be used to exchange data between the vehicle system 402 and another vehicle 418 via a first network 420, and/or between the vehicle system 402 and a remote computing system 422 via a second network 424. The network interface 410 can enable wireless communication with another vehicle 418 and/or a remote computing device 422, and can support communication via wireless general data networks (e.g., Wi-Fi networks) and/or telecommunications networks (e.g., satellite networks, cellular communication networks).

“In some cases, the sensor data described herein may be received by a first vehicle and transmitted to a second vehicle. The perception engine of the second vehicle may incorporate sensor data from the first vehicle; for example, the first vehicle's sensor data could be used to fill in a feature value not available to the second vehicle and/or to determine weights for feature values determined from the second vehicle's own sensor data.

The sensor(s) 412 of the example vehicle system 402 may be configured, for example, to localize the vehicle system 402, to detect objects in the environment, and to sense movement of the vehicle system 402. The sensor(s) 412 can also sense environmental data (e.g., ambient temperature, pressure, humidity) and/or conditions of the interior of the vehicle system 402 (e.g., passenger count, interior temperature, noise level). The sensor(s) 412 may include one or more lidar sensors and one or more cameras, which may include RGB cameras, intensity cameras, depth cameras, and stereo cameras.

“The example vehicle system 402 could include a perception engine 426, a BV ML model 428, and a planner 430.

“The perception engine 426 may include instructions stored in memory 406 that, when executed, configure the processor(s) 404 to receive sensor data from the sensor(s) 412 as input and to output perception data. This data could include, for example, a pose (e.g., position and orientation) of an object in the environment of the example vehicle system 402, an object track associated with the object (e.g., the historical position, velocity, and acceleration of the object over a period of time such as 5 seconds), and/or an object classification (e.g., pedestrian, bicyclist, vehicle). In some examples, the perception engine 426 might be configured to predict more than one object trajectory; for example, the perception engine 426 can predict multiple object trajectories using probabilistic determinations and/or multi-modal distributions.

“The perception engine 426 may further include instructions stored in memory 406 that, when executed, configure the processor(s) 404 to receive sensor data from the sensor(s) 412 as input and to output an indication that a stationary vehicle has been detected from the sensor data, along with feature values, which may also be stored in memory 406. This may include instructions configuring the processor(s) 404 to calculate, for example, a distance between the stationary vehicle and a traffic light from images and/or lidar point clouds. The perception engine 426 may transmit the feature values to the BV ML model 428.

“The BV ML model 428 may include instructions stored in memory 406 that, when executed by the processor(s) 404, configure the processor(s) 404 to receive feature values associated with elements of the environment in which the vehicle system 402 is present and to determine whether a stationary vehicle is a blocking vehicle. The BV ML model 428 may include a decision tree and/or a deep-learning algorithm with nodes through which feature values may be pushed to determine an output.

“The planner 430 may receive, from the perception engine 426, the probability that a stationary vehicle is a blocking vehicle, in addition to any other information the planner 430 uses for generating a trajectory (e.g., object classifications, object tracks, vehicle pose). The planner 430 and/or the perception engine 426 may, in some cases, transmit a blocking vehicle indication, based at least in part on the probability determined by the perception engine 426, via the network interface 410 to the remote computing device 422 via network 424 and/or to another vehicle 418 via network 420. The other vehicle 418 can use this indication as a feature value in certain cases, for example if it encounters a stationary vehicle at the same location indicated by the vehicle system 402. This could include temporarily modifying a global map to include the blocking vehicle indication, where the global map is accessible via a network to a fleet of vehicles.
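Purely as an illustration of sharing a blocking-vehicle indication with a fleet, the sketch below packages the indication with a location, probability, and expiry time so it could be attached to a global map as a temporary annotation; all field names and the TTL policy are assumptions, not part of the disclosure.

```python
# Sketch: a shareable blocking-vehicle indication with a time-to-live so a
# temporary global-map annotation eventually expires.
from dataclasses import dataclass, asdict
import json, time

@dataclass
class BlockingVehicleIndication:
    latitude: float
    longitude: float
    probability: float          # output of the BV ML model
    classification: str         # e.g. "delivery_vehicle"
    timestamp_s: float
    ttl_s: float = 600.0        # assumed: annotation expires after 10 minutes

    def is_expired(self, now_s: float) -> bool:
        return now_s > self.timestamp_s + self.ttl_s

indication = BlockingVehicleIndication(37.7749, -122.4194, 0.91,
                                       "delivery_vehicle", time.time())
payload = json.dumps(asdict(indication))   # what might be sent over the network
print(payload, indication.is_expired(time.time()))
```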

“In some cases, the perception engine and/or the BV ML model 428 may be located at another vehicle 418 and/or at a remote computing device 422. A perception engine located at another vehicle 418 and/or at the remote computing device 422 may coordinate with the perception engine 426. For example, the other vehicle 418 and/or the remote computing device 422 may determine one or more feature values and/or the probability. If the other vehicle 418 and/or the remote computing device 422 determines one or more feature values, it may transmit those feature values to the vehicle system 402 via networks 420 and/or 424, and the perception engine 426 may include the received feature values among the feature values it provides to the BV ML model 428. If the BV ML model 428 is located at another vehicle 418 and/or the remote computing device 422, the other vehicle 418 and/or the remote computing device 422 may receive feature values from the vehicle system 402 via networks 420 and/or 424, use them to determine the probability that the stationary vehicle is a blocking vehicle, and transmit this probability to the planner 430 of the vehicle system 402.

“In some cases, the remote computing device 422 could include a teleoperations device. The teleoperations device may receive sensor data and/or one or more feature values indicating whether the stationary vehicle is a blocking vehicle, and it may display a representation of the sensor data and/or the one or more feature values in order to receive input from a remote operator (teleoperator), such as confirmation or identification that the stationary vehicle is or is not a blocking vehicle. The teleoperations device may include an interface for receiving such input from the teleoperator (e.g., an indication that the stationary vehicle is a blocking vehicle), and it may respond to the autonomous vehicle and/or additional autonomous vehicles by corroborating the indication or identifying it as a false negative.

“In some cases, a teleoperator may input a feature value at the remote computing device 422, which may be transmitted to the vehicle system 402 for use by the BV ML model 428, or may be input to a BV ML model 428 located at the remote computing device 422.”

The planner 430 may include instructions stored in memory 406 that, when executed by the processor(s) 404, configure the processor(s) 404 to generate data representative of a trajectory for the example vehicle system 402, for example from data representing the location of the vehicle system 402 in its environment and other data, such as local pose data and the probability that the stationary vehicle is a blocking vehicle. In some examples, the planner 430 may substantially continuously (e.g., every 1 or 2 milliseconds, though any receding horizon time is contemplated) generate a plurality of potential trajectories with which to control the example vehicle system 402 and select one to control the vehicle. The selection may be based at least in part on the current route, the probability that the stationary vehicle is a blocking vehicle, the current vehicle trajectory, and/or detected object trajectory data. The planner 430 can transmit the selected trajectory to the drive system 416 to control the example vehicle system 402 according to the selected trajectory.
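A very rough sketch of how a planner might fold the blocking probability into trajectory selection is shown below, here by penalizing candidates that wait behind the stationary vehicle in proportion to P(blocking); the candidate set, cost terms, and penalty weight are invented for illustration and do not represent the actual planner.

```python
# Sketch: pick the lowest-cost candidate trajectory, letting P(blocking)
# raise the cost of "hold in lane and wait" style candidates.
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    name: str          # e.g. "hold_in_lane", "merge_left"
    base_cost: float   # comfort/progress cost from the usual planner metrics
    waits_behind_stationary_vehicle: bool

def select_trajectory(candidates: List[Candidate], p_blocking: float) -> Candidate:
    """Penalize waiting behind the stationary vehicle in proportion to P(blocking)."""
    def cost(c: Candidate) -> float:
        penalty = 5.0 * p_blocking if c.waits_behind_stationary_vehicle else 0.0
        return c.base_cost + penalty
    return min(candidates, key=cost)

candidates = [Candidate("hold_in_lane", 1.0, True),
              Candidate("merge_left", 2.0, False)]
print(select_trajectory(candidates, p_blocking=0.9).name)   # merge_left
print(select_trajectory(candidates, p_blocking=0.05).name)  # hold_in_lane
```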

“In some cases, the BV ML model 428 and/or the planner 430 may include specialized hardware, such as a processor suited to running the perception engine (e.g., a graphics processing unit or an FPGA).

“Example Process”

“FIG. 5 illustrates a flow diagram of an example process 500 for generating a BV ML model from labeled samples of sensor data.

“At operation 502, the example process 500 could include receiving a sample that contains sensor data and a label indicating either a stationary non-blocking vehicle or a stationary blocking vehicle, according to any of the techniques described herein. Thousands, or even tens of millions, of samples may be received, each with sensor data and a label (e.g., “blocking” or “not blocking”).

“At operation 504, the example process 500 might include determining feature values for the sample based at least in part on the sample's sensor data, according to any of the techniques described herein. In one example, the feature values determined from the sensor data could include a representation of a feature used to train the BV ML model, such as a numeric value representing a green light. The feature value determination can be repeated for all samples received at operation 502.

“At operation 506, the example process 500 may include generating a BV ML model that outputs a probability that a stationary vehicle is a blocking vehicle, determined at least in part from the one or more feature values and the label associated with each sample. Depending on the type of ML model used (e.g., decision tree(s), deep-learning model), this may include generating nodes, connection weights, node layers, etc. that map input feature values to a label. At runtime, the generated BV ML model may receive a set of feature values from the perception engine and output an indication of whether the stationary vehicle is a blocking vehicle. In some cases, the planner may use this indication to generate a trajectory for the autonomous vehicle.
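For concreteness, the following is an illustrative training sketch using scikit-learn's gradient-boosted trees, since the text mentions gradient-boosted decision trees as one option; the library choice, the toy samples, and the hyperparameters are assumptions and not part of the disclosure.

```python
# Sketch: fit a gradient-boosted decision tree on labeled feature vectors
# ("blocking" = 1, "non-blocking" = 0) and query it at runtime.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Toy samples: [distance_from_junction_m, brake_lights_on, sv_speed_mps,
#               traffic_flow_normality]; labels are the ground truth from operation 502.
X = np.array([
    [14.3, 0, 0.001, 0.87],   # double-parked delivery truck -> blocking
    [ 2.0, 1, 0.000, 0.10],   # first in line at a red light -> not blocking
    [30.0, 0, 0.010, 0.95],   # stopped mid-block, traffic flowing around -> blocking
    [ 1.5, 1, 0.000, 0.05],   # queued behind another vehicle -> not blocking
])
y = np.array([1, 0, 1, 0])

bv_model = GradientBoostingClassifier(n_estimators=50, max_depth=3)
bv_model.fit(X, y)

# Runtime: the perception engine hands the model a new feature vector.
new_features = np.array([[12.0, 0, 0.002, 0.80]])
p_blocking = bv_model.predict_proba(new_features)[0, 1]
print(f"P(blocking) = {p_blocking:.2f}")
```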

For example, a strongly positive indication that the stationary vehicle is a blocking vehicle (e.g., a value equal to or exceeding 1) might cause the planner to generate a trajectory that merges the autonomous vehicle into another lane. A lower-valued positive indication (e.g., a value less than 1) might cause the planner to determine a trajectory in which the vehicle stays in place for a few seconds longer before re-evaluating, or to transmit a request for remote teleoperations assistance. The exact values at which the planner performs different actions depend on the planner's configuration.
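The paragraph above notes that the exact values at which the planner takes different actions depend on its configuration. The sketch below shows one assumed thresholding scheme over a summed blocking-vehicle score (which, unlike a probability, can exceed 1); the cut-off values and action names are illustrative only.

```python
# Hedged sketch of score thresholding; cut-offs are planner-configuration choices.
MERGE_SCORE = 1.0   # strong positive indication -> go around the blocking vehicle

def choose_action(bv_score: float) -> str:
    if bv_score >= MERGE_SCORE:
        return "merge_into_adjacent_lane"
    if bv_score > 0.0:
        # weaker positive indication: hold briefly, re-evaluate, possibly ask for help
        return "hold_reevaluate_or_request_teleoperation"
    return "wait_behind_non_blocking_vehicle"

for score in (1.63, 0.4, -0.9):
    print(score, choose_action(score))
```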

“FIG. 6 illustrates a flow diagram of an example process 600 for detecting blocking vehicles. One or more processors or other components of an autonomous vehicle could perform the operations of example process 600.

“At operation 602, the example process 600 could include receiving sensor data 604 from at least one sensor 412, according to any of the techniques described herein. For example, the perception engine 426 may receive the sensor data 604.

“At operation 606, the example process 600 could include detecting a stationary vehicle in the environment of the autonomous vehicle based at least in part on the sensor data 604, according to any of the techniques described herein. This could include, for example, detecting an object in the environment of the autonomous vehicle, classifying the object as a vehicle, determining the vehicle's speed, and determining that the vehicle is not moving faster than a predetermined speed. This could also include determining whether the vehicle is preventing the autonomous vehicle from following a previously generated trajectory and/or whether the vehicle is impeding another vehicle in the autonomous vehicle's environment. One or more of these operations could include inputting the sensor data 604 into a stationary vehicle detector 608, which might include one or more machine-learning algorithms of the perception engine 426 or other components thereof. A stationary vehicle indication 610 may be generated in some cases (e.g., by changing a register or flag value, or by sending a command to another component of the perception engine 426).

“At operation 612, the example process 600 could include determining one or more feature values 614, according to any of the techniques described herein. One example of determining the one or more feature values 614 is (1) detecting one, more, or all of the vehicles on the roadway and (2) determining a value that indicates the speed of the stationary vehicle relative to the other speeds. This feature value could include an indication that the stationary vehicle's speed is unusual compared to other vehicles and/or a traffic flow data distribution indicating the speed of the stationary vehicle and the speeds of other vehicles. In FIG. 6, a collection of components of the perception engine 426 (e.g., machine-learning algorithms that perform object detection, object classification, and object tracking) is referred to as a feature generator 616, which can determine the feature values 614 based at least in part on the sensor data 604 and/or the stationary vehicle indication 610.

“In some cases, operation 612 might also include providing the one or more feature values 614 as input to an ML model (e.g., BV ML model 428), according to any of the techniques described herein.”

“At operation 618, the example process 600 may include receiving, from the ML model, an indication 620 of whether the stationary vehicle is a blocking vehicle or a non-blocking vehicle (i.e., the BV indication 620 in FIG. 6), according to any of the techniques described herein. The indication 620 could include a label (e.g., “blocking vehicle” or “non-blocking vehicle”) and/or a probability. In one example, the perception engine 426 may receive the indication 620 from the BV ML model 428, and the perception engine 426 may transmit the indication 620 to the planner 430 using any of the techniques described herein. In some cases, the planner 430 may also receive the sensor data 604 and/or data received from the perception engine 426, such as object classifications, object tracks, etc.

“At operation 622, the example process 600 could include generating a trajectory 624 for controlling the motion of the autonomous vehicle, according to any of the techniques described herein. For example, the planner 430 could generate candidate trajectories based at least in part on the indication 620 and then select one of the candidate trajectories to control the autonomous vehicle. The planner 430 can transmit the selected trajectory to the drive system 416 of the autonomous vehicle.

“FIG. 7 illustrates a flow diagram of an example process 700 for detecting blocking vehicles. The operations of example process 700 can be performed by one or more processors or other components of an autonomous vehicle, as described below, or can be carried out by a remote computing system, such as a remote autonomous vehicle and/or a teleoperations device.

“At operation 702, example process 700 could include receiving sensor data according to any one of the techniques described herein.”

“At operation 704 the example process 700 might identify a stationary car (i.e. the?YES?) According to any of these techniques, arm in the flow diagram). If a stationary vehicle isn’t identified, process 700 could be reverted to operation 702.

“At operation 706, an example process 700 could include determining one of several feature values based at least partially on the sensor data according to any of these techniques. All of the feature value discussed herein can be included. One example of the feature values could be determining (706A) traffic flow data, which indicates speeds of one or several vehicles detected using sensor data. This could include the speed of the stationary vehicle, and/or other speeds.

“At operation 708 the example process 700 could include providing one or more feature value to a machine learning model according to any of these techniques.”

“At operation 701, the example process 700 could include the outputting, by machine-learning models, of a probability that the stationary car is a blocking device, according to any one of the techniques described herein.

“At operation 712 the example process 700 could include controlling the vehicle based at least partially on the probability according to any one of the techniques described herein. This could include controlling the vehicle so that it passes a blocking vehicle (712A) or waiting for a nonblocking vehicle (712B )).”).

“Example Clauses”

“A. “A.

“B. “B.

“C. The autonomous car of paragraph A orB, in which detection of the stationary vehicle also includes detecting one or several other stationary vehicles within a predetermined distance from the autonomous vehicle.

“D. An autonomous vehicle described in paragraphs A-C. In which the one or more feature value further includes at least a speed of a stationary vehicle and a traffic sign state.

“E. Any autonomous vehicle described in paragraphs A-D. In which case, the blocking vehicle is any object detected by the perception engines that blocks at least one autonomous vehicle or another vehicle’s progress.”

Summary for “Detecting blocked objects”


In some cases, an autonomous vehicle may include a perception engine and/or a planner. The perception engine may include one or more ML models and other computer-executable instructions for detecting, identifying, classifying, and/or tracking objects from sensor data captured by the autonomous vehicle. The perception engine may also include the ML model that determines whether a stationary vehicle is a blocking vehicle (referred to hereinafter as the "BV ML model"), although the discussion applies to ML models more generally. The planner may include one or more ML algorithms, models, etc. for route planning, trajectory planning, evaluating decisions, and so on. The planner may generate a trajectory for controlling the motion of the autonomous vehicle using data received from other components, such as the perception engine, the sensors, other vehicles, and/or a network connection.

The techniques described herein improve the operation of an autonomous vehicle by increasing detection accuracy over previous solutions (e.g., conditional rules). The techniques prevent the autonomous vehicle from waiting unnecessarily behind a blocking vehicle, which reduces the vehicle's power consumption and wasted compute cycles. They likewise prevent the autonomous vehicle from unnecessarily changing lanes or re-routing when a stationary vehicle is not a blocking vehicle, again reducing power consumption and wasted compute cycles. Because the techniques allow the vehicle to perceive a situation more accurately, they can also improve the safety of operation, both for passengers and for other entities in the environment. For example, the techniques may prevent the autonomous vehicle from changing lanes only to need to return to its original lane because of a line of vehicles it had not yet perceived, and may allow it to anticipate that doors of non-blocking stationary vehicles could open, thus avoiding collisions.

As used herein, a blocking vehicle is a stationary vehicle on a drivable road surface that impedes the progress of other vehicles. Not all stationary vehicles are blocking vehicles. For example, a vehicle that has stopped on the road surface because of a red light, or because it is waiting for another vehicle ahead of it to make progress, may not be a blocking vehicle. A blocking vehicle could be a double-parked vehicle, a delivery vehicle parked to make a delivery, a vehicle whose driver has vacated it, a stopped police car, an incapacitated vehicle, etc. Distinguishing a non-blocking stationary vehicle from a blocking vehicle is difficult; in many cases the only observable difference may be how long the vehicle has been stopped. The principal difference is that a blocking vehicle cannot or will not move even when other vehicles could otherwise make progress. A blocking vehicle may also be described as a stationary vehicle that is not following the generally accepted rules of the road, for example, a vehicle that is not keeping to a lane or not adhering to traffic signals. The techniques described herein improve the operation of autonomous vehicles by preventing them from being stopped behind a blocking vehicle for an unnecessarily long period of time.

Example Scenario

FIG. 1A illustrates an example scenario 100 in which an autonomous vehicle 102 approaches a roadway junction 104 that includes a traffic light 106. The autonomous vehicle 102 may be configured to operate according to the U.S. National Highway Traffic Safety Administration Level 5 classification, meaning the vehicle can perform all safety-critical functions for the entire trip without the driver (or an occupant) being required to control it. In other examples, however, the autonomous vehicle 102 may be fully or partially autonomous and may have any classification level currently in use or defined in the future. In some cases, the techniques described herein for identifying blocking vehicles may also be used by non-autonomous vehicles.

The autonomous vehicle 102 may receive sensor data from one or more sensors of the autonomous vehicle 102 and use that sensor data to calculate a trajectory for controlling the motion of the vehicle. The autonomous vehicle 102 may include a planner that receives data from the sensors, the perception engine, and/or other software and hardware modules. This data could include: a position of the autonomous vehicle 102 determined by a global-positioning system (GPS) sensor; data about objects in the vicinity of the autonomous vehicle 102; route data that specifies a destination of the vehicle; global map data that identifies characteristics of roadways (e.g., features detectable by different sensor modalities that are useful for localizing the autonomous vehicle); and local map data that identifies characteristics in close proximity to the vehicle. The planner may use this data to generate a trajectory for controlling the motion of the vehicle. The autonomous vehicle 102 may also include a perception engine that receives sensor data from one or more sensors of the vehicle, determines perception data from that sensor data, and transmits the perception data to the planner. The planner may use the perception data to localize the autonomous vehicle 102 on the global map, to determine one or more trajectories, and to control the motion of the vehicle to traverse a path or route. For example, the planner may determine a route for the autonomous vehicle 102 from a first location to a second location, generate potential trajectories for controlling the motion of the vehicle (e.g., covering the next fraction of a second to a few seconds of travel), and select one potential trajectory with which to guide the vehicle along the route; the selected trajectory may be used to generate drive control signals that are transmitted to drive components of the autonomous vehicle 102.
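As a rough illustration of the planning cycle described above, the following Python sketch generates a few candidate trajectories from perception data and selects one to hand to the drive components. Every name and number in it (the `Trajectory` record, the candidate offsets, the cost terms) is a hypothetical placeholder, not an interface from this disclosure.

```python
"""Minimal sketch of one receding-horizon planning cycle (illustrative only)."""
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Trajectory:
    # (x, y) waypoints over a short future horizon, in vehicle-centric meters
    waypoints: List[Tuple[float, float]]
    cost: float = 0.0


def generate_candidates(speed_mps: float, horizon_s: float = 2.0) -> List[Trajectory]:
    """Generate simple straight candidates at a few lateral offsets (stay or shift lane)."""
    candidates = []
    for lateral_offset in (-3.5, 0.0, 3.5):
        pts = [(speed_mps * t, lateral_offset) for t in (0.5, 1.0, 1.5, horizon_s)]
        candidates.append(Trajectory(waypoints=pts))
    return candidates


def select_trajectory(candidates: List[Trajectory],
                      obstacles: List[Tuple[float, float]]) -> Trajectory:
    """Score candidates by clearance to perceived obstacles and lateral deviation."""
    for traj in candidates:
        clearance = min(abs(wx - ox) + abs(wy - oy)
                        for wx, wy in traj.waypoints
                        for ox, oy in obstacles) if obstacles else 100.0
        traj.cost = abs(traj.waypoints[0][1]) + 10.0 / (clearance + 1e-3)
    return min(candidates, key=lambda t: t.cost)


if __name__ == "__main__":
    # One planning cycle: a stationary object 15 m ahead in the current lane.
    best = select_trajectory(generate_candidates(speed_mps=8.0), obstacles=[(15.0, 0.0)])
    print("selected lateral offset:", best.waypoints[0][1])
```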

The perception engine may include one or more ML models and/or other computer-executable instructions for detecting, identifying, segmenting, classifying, and/or tracking objects from sensor data collected by the autonomous vehicle 102. For example, the perception engine might detect an object in the environment and classify it (e.g., semi-truck, pickup truck, passenger vehicle, child, dog, ball). The perception engine may also track the object's position, velocity, acceleration, and/or heading (historical, current, and/or predicted).

The perception engine may determine that a vehicle is stationary based at least in part on sensor data from sensors of the autonomous vehicle. For example, the perception engine may receive sensor data, classify an object in the environment of the autonomous vehicle 102 as a vehicle, and determine that the velocity of that vehicle does not exceed a predetermined threshold velocity (e.g., 0.1 meters per second, 0.05 meters per second). A vehicle may include, among other things, any device used to transport people or objects, such as a passenger vehicle or a delivery truck. The perception engine may classify the detected vehicle as a stationary vehicle upon determining that the vehicle's velocity does not meet (or exceed) the predetermined threshold velocity, alone or in combination with another condition such as a duration of time, a traffic light state, a sensed distance from a junction, etc. For example, the perception engine might classify a detected vehicle as stationary upon determining from the sensor data that the vehicle has been moving slower than the predetermined threshold velocity for a predetermined period of time (e.g., the vehicle has been effectively stopped for 20 seconds).
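A minimal sketch of that stationary-vehicle check, assuming a hypothetical `TrackedObject` record produced by the perception engine; the threshold values simply mirror the examples above.

```python
from dataclasses import dataclass


@dataclass
class TrackedObject:
    classification: str             # e.g. "passenger_vehicle", "bicycle", "pedestrian"
    speed_mps: float                # current estimated speed
    seconds_below_threshold: float  # how long the speed has stayed below the threshold


def is_stationary_vehicle(obj: TrackedObject,
                          speed_threshold_mps: float = 0.1,
                          min_duration_s: float = 20.0) -> bool:
    """Classify a tracked object as a stationary vehicle."""
    if obj.classification not in {"passenger_vehicle", "delivery_truck", "bicycle"}:
        return False
    # Stationary = speed below the threshold for at least the required duration.
    return (obj.speed_mps < speed_threshold_mps
            and obj.seconds_below_threshold >= min_duration_s)


# Example: a passenger vehicle effectively stopped for 25 seconds.
print(is_stationary_vehicle(TrackedObject("passenger_vehicle", 0.02, 25.0)))  # True
```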

Returning to the example scenario 100 in FIG. 1A, the autonomous vehicle 102 may approach the junction containing the traffic light 106 and encounter an object it classifies as a vehicle 108 (marked with a question mark in the figure). A technical difficulty arises when the perception engine does not have enough information to determine whether the stationary vehicle 108 is merely stopped (i.e., a non-blocking stationary vehicle, for example one obstructed by another object or by a legal constraint such as a stoplight) or whether it is blocking the roadway. As used herein, a blocking vehicle is any vehicle that is stopped, or moving at a speed below the predetermined threshold velocity, and that impedes the progress of other vehicles. A blocking vehicle could be, for example, a double-parked vehicle, a delivery truck whose goods are being unloaded, a stopped police vehicle, an incapacitated vehicle (e.g., one with a failed drive system or a flat tire, a vehicle involved in an accident, or a vehicle whose occupant has vacated it), a meter-reading vehicle, etc. A "drivable road surface," as used herein, may include those portions of a roadway used under normative driving conditions.

A blocking vehicle differs from a non-blocking stationary vehicle in that it remains stopped even when other vehicles could otherwise make progress, whereas a non-blocking stationary vehicle is stopped for a reason consistent with generally accepted driving rules (e.g., a red traffic light, an object that has entered the roadway, another vehicle stopped in front of it) and therefore does not itself impede other vehicles. A blocking vehicle may also block a trajectory of the autonomous vehicle 102; that is, it may occupy at least part of the same lane as the autonomous vehicle 102, or otherwise coincide with a path and/or trajectory of the autonomous vehicle 102.

The perception engine alone might not be able to resolve the ambiguity of whether a detected vehicle is a non-blocking stationary vehicle or a blocking vehicle, in which case the planner may not have sufficient data to generate a trajectory that moves the autonomous vehicle 102 appropriately for the scenario.

FIG. 1B corresponds to FIG. 1A and reflects another complication of the example scenario 100: the limited and imperfect view of the scenario that is available to the autonomous vehicle 102 through its sensor data. Future sensor and perception advancements will likely increase the portion of a scenario reflected in sensor data, but it is unlikely that the autonomous vehicle 102 will ever have access to 100% of the states, objects, and other information present in, or having an effect on, a scenario. FIG. 1B depicts the example scenario 100 as reflected in the sensor data received by the planner of the autonomous vehicle 102. A combination of global map data and GPS data could indicate to the autonomous vehicle 102 that a junction lies 100 meters ahead, and sensor data may further indicate that the autonomous vehicle 102 is in a lane authorized to enter the junction. The autonomous vehicle 102 may also receive sensor data from which it determines that vehicles 110 and 112 are stationary vehicles.

Due to limitations of the sensor(s), the autonomous vehicle 102 might not receive sensor data that would allow it to detect vehicles 114-122. Vehicles 114-122 might, for example, be outside a sensor's field of view, beyond a sensor's reliable operational distance, or occluded. Some examples may rely on conditional rules that define the conditions under which a blocking-vehicle indication is output. Such conditional rules may be pre-programmed by a programmer or administrator. For instance, a hard-coded rule might specify that the perception engine outputs a blocking-vehicle indication when a detected vehicle has been classified as stationary and remains stationary for a predetermined duration of time.
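For illustration only, such a brittle hard-coded rule might look like the sketch below (the thresholds and argument names are hypothetical); the paragraphs that follow explain why a learned model is preferred over rules of this kind.

```python
def rule_based_blocking_indication(is_stationary: bool,
                                   seconds_stationary: float,
                                   light_state: str) -> bool:
    """Hard-coded conditional rule: stationary long enough at a green light => blocking.

    This is the kind of pre-programmed rule the disclosure argues is often
    incorrect (e.g., a long, unseen queue ahead can make a non-blocking vehicle
    look blocking), which motivates the probabilistic BV ML model instead.
    """
    return is_stationary and seconds_stationary > 30.0 and light_state == "green"


# A car waiting in an unseen queue at a green light gets misclassified as blocking:
print(rule_based_blocking_indication(True, 45.0, "green"))  # True (a false positive)
```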

However, such hard-coded rules cannot account for every scenario. The example rule above would work where the stationary vehicle 108 is incapacitated and is therefore truly a blocking vehicle. But as FIG. 1A shows, there could instead be a long line of vehicles in front of the stationary vehicle 108 that the autonomous vehicle 102 cannot detect. In scenario 100, outputting a blocking-vehicle indication to the planner could cause the autonomous vehicle 102 to operate incorrectly or even dangerously. For example, the planner might initiate a lane change on the assumption that the autonomous vehicle 102 can return to its original lane after passing the vehicle the perception engine identified as blocking, while the undetected line of vehicles in front of the stationary vehicle 108 makes it difficult for the autonomous vehicle 102 to return to that lane. Classifying the stationary vehicle 108 as a blocking vehicle in this scenario would be a false positive, and false positives can disrupt the operation of the autonomous vehicle 102. Hard-coded rules may therefore produce unacceptably high rates of false positives and/or false negatives.

Furthermore, adding a new feature to a hard-coded rule set (e.g., checking whether the stationary vehicle's hazard lights are flashing) would require substantial human coding time, could take the autonomous vehicle out of operation or require human reconfiguration, and might require multiple iterations of developing and testing new configurations of the perception engine.

In some cases, the techniques described herein may instead probabilistically determine whether a stationary vehicle is a blocking vehicle via an ML model, referred to herein as the BV ML model.

Example Process

FIG. 2 is a pictorial flow diagram of an example process 200 for determining, at an autonomous vehicle, whether a stationary vehicle is a blocking vehicle in order to control the autonomous vehicle. At operation 202, the example process 200 may include receiving sensor data from one or more sensors of an autonomous vehicle 204. In some cases the sensor data may additionally or alternatively be received from another vehicle, from a remote sensor (e.g., a weather station, a traffic control service, an emergency service), and/or from sensors located in infrastructure (e.g., sensors located on light posts or buildings).

At operation 206, the example process 200 may include detecting a stationary vehicle 208 from the sensor data. This could include detecting an object, classifying it as a vehicle, and determining that its speed does not exceed a threshold. The threshold could be absolute (e.g., the vehicle is moving no faster than 0.05 meters per second) or relative (e.g., the vehicle is moving at less than 20% of the average speed of surrounding traffic). Operation 206 may additionally or alternatively include determining whether other features derived from the sensor data satisfy one or more conditions, for example that the vehicle is within a certain distance of a junction, that the vehicle is of a specific type (e.g., bicycle, passenger vehicle), or that a traffic light is green.

Operation 206 could include detecting all stationary vehicles within "sight" of the sensors of the autonomous vehicle 204 (i.e., all vehicles within a field of view of one or more sensors). Operation 206 may additionally or alternatively detect stationary vehicles within a predetermined distance; for example, the autonomous vehicle 204 might detect stationary vehicles within 50 meters. Limiting the distance at which the autonomous vehicle 204 detects stationary vehicles conserves storage and processing resources while still allowing the planner of the autonomous vehicle 204 to make good decisions about the trajectory used to control the vehicle.
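The absolute/relative speed thresholds and the distance cutoff described above could be combined roughly as in this sketch; `Detection` and its fields are hypothetical stand-ins for perception-engine output, and the specific numbers come from the examples in the text.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    classification: str
    speed_mps: float
    range_m: float            # distance from the autonomous vehicle


def stationary_vehicles(detections: List[Detection],
                        abs_threshold_mps: float = 0.05,
                        rel_fraction: float = 0.2,
                        max_range_m: float = 50.0) -> List[Detection]:
    """Keep vehicles that are effectively stopped (absolute or relative test) and nearby."""
    vehicles = [d for d in detections if d.classification == "vehicle"]
    moving_speeds = [d.speed_mps for d in vehicles if d.speed_mps > abs_threshold_mps]
    avg_traffic = sum(moving_speeds) / len(moving_speeds) if moving_speeds else 0.0
    return [d for d in vehicles
            if d.range_m <= max_range_m
            and (d.speed_mps <= abs_threshold_mps
                 or (avg_traffic > 0.0 and d.speed_mps < rel_fraction * avg_traffic))]


detections = [Detection("vehicle", 0.01, 14.3), Detection("vehicle", 13.0, 30.0),
              Detection("pedestrian", 1.2, 5.0)]
print(len(stationary_vehicles(detections)))  # 1
```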

At operation 208, the example process 200 could include determining feature values 210 based at least in part on the sensor data. The feature values 210 may correspond to features that a blocking-vehicle machine-learning (BV ML) model is configured to use in determining whether the stationary vehicle 208 is a blocking vehicle, and they may be determined by a perception engine of the autonomous vehicle 204. The autonomous vehicle 204 might attempt to determine feature values 210 for at least a subset of the features the BV ML model is configured to receive. The table below shows example feature values 210 determined by the perception engine, corresponding to features the BV ML model could rely on to determine whether the stationary vehicle 208 is a blocking vehicle. In this example, some feature values 210 were not determined by the perception engine or were not available from the sensor data (e.g., "blocked by another object" and "other object behavior"). These and other features are discussed with regard to FIGS. 3A-3F.

Feature                      Feature Value
Distance from junction       14.3 meters
Brake lights on              0
SV speed                     0.001 m/s
Traffic flow normality       0.87
Blocked by another object    ?
Person near vehicle          0
Other object behavior        ?
Type of agent                "passenger vehicle"
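The feature vector in the table might be represented as a record with optional entries for features the perception engine could not determine; this is an illustrative encoding, not a format defined by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BlockingVehicleFeatures:
    distance_from_junction_m: Optional[float]
    brake_lights_on: Optional[bool]
    sv_speed_mps: Optional[float]
    traffic_flow_normality: Optional[float]   # e.g. percentile of SV speed vs. traffic
    blocked_by_other_object: Optional[bool]   # None = could not be determined
    person_near_vehicle: Optional[bool]
    other_object_behavior: Optional[str]      # None = could not be determined
    agent_type: Optional[str]


features = BlockingVehicleFeatures(14.3, False, 0.001, 0.87, None, False, None,
                                   "passenger vehicle")
```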

While several example features are listed here and with reference to FIGS. 3A-3F, the number and types of features are not limited to those discussed. Many other features are contemplated, such as an object bounding-box size, object color, object size, object height, a speed and/or direction of travel, an object yaw (e.g., relative to the orientation of the autonomous vehicle), a lane identification (e.g., left, center, right), a GPS location, detected text and/or a logo associated with the stationary vehicle, etc. A feature value may be represented as a Boolean value (e.g., on/off), a real number (e.g., a detected speed), text (e.g., a classification), or any other suitable representation of data.

In some cases, the feature values determined by the perception engine may include: a speed of the stationary vehicle 208; a traffic signal state (e.g., a traffic light state, the existence of a traffic sign); traffic flow data and a relation of data associated with the stationary vehicle 208 to the traffic flow data; sensor data noise (e.g., noisy radar returns) that may indicate other vehicles are moving around the stationary vehicle 208; the proximity of a person to the stationary vehicle 208; a classification of the stationary vehicle 208 (e.g., police car, passenger vehicle, delivery truck); a door open/closed state; and so on.

At operation 212, the example process 200 may include determining, using a BV ML model 214 and the feature values 210, whether the stationary vehicle 208 is a blocking vehicle. The feature values 210 may be input to the BV ML model 214, which, depending on its configuration, may output a probability that the stationary vehicle 208 is a blocking vehicle. The BV ML model 214 can include one or more decision trees or any arrangement thereof, such as a random forest or a boosted ensemble of decision trees; a directed acyclic graph (DAG), for example one whose nodes are organized as a Bayesian network; a deep-learning algorithm (e.g., an artificial neural network, a deep belief network, a deep stacking network, a recurrent neural network (RNN)); etc. In some examples, the BV ML model 214 includes a gradient-boosted decision tree.

For example, where the BV ML model 214 includes decision trees, each tree may output a positive or negative value indicating whether the stationary vehicle 208 is a blocking vehicle. The BV ML model 214 could include multiple decision trees whose outputs are weighted; one decision tree might be weighted so that it outputs -1.63 or +1.63, while another might be weighted so that it outputs -0.76 or +0.76. The perception engine can sum the outputs of all the decision trees to compute a value indicating the probability that the stationary vehicle 208 is a blocking vehicle.
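The summed tree outputs behave like a margin that can be squashed into a probability. The sketch below uses a logistic transform, which is one common choice and an assumption here rather than something specified by the disclosure.

```python
import math
from typing import List


def blocking_probability(tree_outputs: List[float]) -> float:
    """Sum weighted per-tree votes and map the margin to a probability in [0, 1]."""
    margin = sum(tree_outputs)               # e.g. +1.63 - 0.76 + 0.20
    return 1.0 / (1.0 + math.exp(-margin))   # logistic squashing (assumed, not mandated)


print(round(blocking_probability([+1.63, -0.76, +0.20]), 3))  # ~0.745
```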

Where the BV ML model 214 includes a neural network, it may contain an input layer, one or more hidden layers of nodes, and an output layer. The input layer may receive the one or more feature values and activate the hidden layers, and the output layer may receive stimuli from the hidden layers and output an indication according to which of its nodes is most activated. For example, the output layer might contain two nodes: a positive indication and a negative indication that the stationary vehicle 208 is a blocking vehicle. In other examples, the output may combine a confidence that the stationary vehicle is a blocking vehicle with a confidence that it is not. Where a decision tree is used, it could be a classification tree that outputs a label indicating whether or not the detected vehicle is a blocking vehicle.

In some cases, the BV ML model 214 may be generated (i.e., trained) from labeled data. For example, the BV ML model 214 could include a deep-learning model that learns to predict whether a stationary vehicle 208 is a blocking vehicle from input sample feature values, where each sample is associated with a label indicating whether it came from a scenario in which the stationary vehicle was a blocking vehicle or a non-blocking vehicle (i.e., a ground-truth label). FIG. 2 depicts the BV ML model 214 as a decision tree, with shaded nodes representing the nodes reached for the input feature values. The illustrated decision tree might output a weighted value of 0.623 indicating the probability that the stationary vehicle 208 is a blocking vehicle. In this example, positive values indicate that the stationary vehicle 208 is a blocking vehicle and negative values indicate that it is not, although any sign convention could be used.

At operation 216, the example process 200 may include transmitting the probability determined by the perception engine to the planner, which determines a trajectory for controlling the autonomous vehicle 204, as shown at 218 and 220. The planner may use the probability that the stationary vehicle 208 is a blocking vehicle to generate a trajectory for controlling the autonomous vehicle 204. FIG. 2 shows two example trajectories the planner might generate in two alternative scenarios.

In example scenario 218, the perception engine has indicated that the stationary vehicle 208 is a blocking vehicle. The planner of the autonomous vehicle 204 may respond to this indication by generating a trajectory 222 that causes the autonomous vehicle 204 to merge into another lane. In some cases, the perception engine may also provide the planner with a classification of the blocking vehicle (e.g., police car, meter reader, delivery vehicle). The classification could be a semantic label, and that semantic label could be included as one of the feature values.

In example scenario 220, the perception engine has indicated that the stationary vehicle 208 is not a blocking vehicle (i.e., it is a non-blocking stationary vehicle). The planner of the autonomous vehicle 204 may respond to this indication by generating a trajectory 234 that causes the autonomous vehicle 204 to continue forward in the same lane.

In some cases, the planner and/or the perception engine may determine to transmit a request for remote assistance. For example, the autonomous vehicle 204 might transmit a signal to a remote computing device that has greater computing power, allowing the remote computing device to determine the probability that the stationary vehicle 208 is a blocking vehicle, or allowing a human operator to input that determination. If the probability does not meet a threshold (e.g., 0.25), or otherwise indicates low confidence that the stationary vehicle 208 is or is not a blocking vehicle, the planner might transmit such a signal to the remote computing device. In some cases, the autonomous vehicle 204 may also transmit the feature values and/or sensor data to the remote computing device.

Example Features

FIGS. 3A-3F illustrate a variety of features for which the BV ML model could use values to determine whether a stationary vehicle is a blocking vehicle. The perception engine may determine the corresponding feature values from sensor data, and the operations described below may be performed as part of operation 212 of FIG. 2.

FIG. 3A shows an example scenario 300 in which an autonomous vehicle 302 approaches a stationary vehicle 304 near a junction 306 that includes a traffic signal 308 (in this example, a traffic light). The autonomous vehicle 302 may correspond to the autonomous vehicle 204, and that discussion applies equally here. The autonomous vehicle 302 may detect that the vehicle 304 is stationary using its perception engine, according to any of the techniques described herein, and may respond to that detection by determining feature values from the sensor data. For example, the autonomous vehicle 302 might determine a state of the lights of the stationary vehicle 304 (e.g., brake lights on/off, hazard lights on/off), a distance between the stationary vehicle 304 and the junction 306, and/or a state of the traffic signal 308 (e.g., red light, yellow light). The autonomous vehicle 302 may also determine a height and/or size of the stationary vehicle 304, which the BV ML model may use in determining the likelihood that the stationary vehicle is a blocking vehicle. In some cases, a feature value may indicate whether the autonomous vehicle 302 is taller than the stationary vehicle 304.

In some cases, the perception engine may additionally or alternatively identify text and/or a logo associated with the stationary vehicle 304 as a feature value. For example, a machine-learning algorithm may determine that the stationary vehicle 304 carries a pizza-delivery sign (e.g., on top of the vehicle) or a taxi-service sign, or that detected text and/or a logo is associated with the vehicle (e.g., UPS text and/or logo, Uber text and/or logo). The detected text and/or logo can be used to increase confidence in a classification of the type of the stationary vehicle 304 (e.g., delivery vehicle, public transport vehicle).

The perception engine may, in some cases, additionally or alternatively determine traffic flow data from the sensor data. Traffic flow data can include data about additional objects that the perception engine classifies as vehicles (e.g., vehicles 314, 316, and 318), such as a velocity 320 of a vehicle, a distance 322 between a vehicle and a preceding or following vehicle, etc.

For example, FIG. 3B illustrates an example distribution 324 of traffic flow data. Such a distribution may reflect the frequency of different detected feature values related to other objects (e.g., as a histogram or other distribution), such as a distribution of the vehicle speeds determined by the perception engine. In the example distribution 324, the perception engine has determined that most of the detected vehicles are moving at speeds between 55 and 110 km/h. Although FIG. 3B depicts velocity, it is understood that any other feature that may be unique to each vehicle could likewise be represented as a distribution, such as distance between vehicles, door open/closed state, person-near-vehicle, or the lane of each vehicle; features that are not unique to each vehicle, such as a traffic light indication, may be less suited to this treatment. Feature values that are compared against the frequency distribution of other vehicles' values are referred to herein as traffic flow data.

The perception engine can use a frequency distribution of traffic flow data to produce a feature value that includes at least a percentile of the traffic flow data associated with the stationary vehicle 304. The feature value may also include characteristics of the distribution itself (e.g., whether there is a long tail, whether the tail is eliminated by limiting the traffic flow data to vehicles in specific lane(s), the width or height of the distribution) and/or whether the traffic flow data associated with the stationary vehicle 304 lies within a tail of the distribution. FIG. 3B shows thresholds 326 and 328, which may correspond to locations of a quartile and/or percentile (e.g., the 95th and 5th percentiles, respectively). The perception engine may use the thresholds 326 and/or 328 to determine whether a velocity or other feature value associated with the stationary vehicle 304 lies within a tail of the distribution of feature values of the other observed vehicles. Additionally or alternatively, the perception engine might determine whether the velocity or other feature value associated with the stationary vehicle 304 falls within or outside two standard deviations of a Gaussian mean. Regardless of the method, the perception engine can treat the velocity or other feature of the stationary vehicle 304 as abnormal if it determines that the value falls outside the normal range.
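A rough sketch of that "traffic flow normality" feature: compute the percentile of the stationary vehicle's speed within the observed speed distribution and flag it as abnormal if it falls in a tail. The percentile bounds mirror the 5th/95th-percentile example above; everything else is an illustrative assumption.

```python
from bisect import bisect_left
from typing import List, Tuple


def traffic_flow_normality(sv_speed: float,
                           other_speeds: List[float],
                           low_pct: float = 5.0,
                           high_pct: float = 95.0) -> Tuple[float, bool]:
    """Return (percentile of the stationary vehicle's speed, abnormal?)."""
    speeds = sorted(other_speeds)
    rank = bisect_left(speeds, sv_speed)
    percentile = 100.0 * rank / len(speeds)
    abnormal = percentile <= low_pct or percentile >= high_pct
    return percentile, abnormal


# Observed traffic mostly moves 15-30 m/s; the stationary vehicle moves ~0 m/s.
pct, abnormal = traffic_flow_normality(0.001, [15.0, 18.0, 22.0, 25.0, 27.0, 30.0])
print(pct, abnormal)  # 0.0 True -> the stationary vehicle's speed lies in the lower tail
```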

In this way, the perception engine can use traffic flow data to identify whether the data corresponding to the stationary vehicle 304 is abnormal. The stationary vehicle 304 is more likely to fall in the lower tail of a speed distribution, but in some cases the upper tail could be used, for example, to identify an erratic or speeding vehicle.

In some cases, the perception engine may generate traffic flow data only for vehicles of the same type or general class as the stationary vehicle 304. For example, if the stationary vehicle 304 is classified as a bicycle, the perception engine might generate traffic flow data for the bicycles it has detected; if the stationary vehicle 304 is classified as a passenger vehicle, the perception engine might generate traffic flow data for other objects classified as vehicles, passenger vehicles, and/or motor vehicles.

FIG. 3C shows an alternate or additional feature 330 for which the perception engine might determine a feature value. The perception engine of the autonomous vehicle 302 may generate a feature value that describes the behavior of other objects relative to the stationary vehicle 304. For example, the perception engine might store a track 334 of another vehicle 332 and output that track as a feature value.

Additionally or alternatively, the perception engine can output an indication of a behavior reflected by the track, for example an indication that the vehicle 332 is changing lanes to pass the stationary vehicle 304 or is remaining stationary behind it. Similar to the traffic flow data, the perception engine can relay the frequency with which other objects repeat a behavior. This frequency might be restricted to objects in the same lane as the autonomous vehicle 302, or behaviors originating in the lane of the autonomous vehicle 302 might be weighted more heavily. For example, vehicles 336 and 338 could be parked vehicles in a different lane; limiting the behavior frequency to vehicles in the same lane as the autonomous vehicle 302 can therefore increase the accuracy of the trajectory the planner generates in response to the frequency. If the determination were not restricted to the lane of the autonomous vehicle 302, the feature value might indicate that two vehicles (66% of observed vehicle behavior) remained stationary while one vehicle (33%) passed the stationary vehicle; constraining the determination to the lane of the autonomous vehicle 302, the feature value could instead indicate that one vehicle (100%) passed the stationary vehicle.
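The lane-restricted behavior frequency described above could be computed as in the following sketch; the `ObservedBehavior` record, action names, and lane identifiers are hypothetical.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ObservedBehavior:
    lane_id: int
    action: str   # e.g. "passed_stationary_vehicle", "remained_stationary"


def behavior_frequency(observations: List[ObservedBehavior],
                       ego_lane_id: int,
                       action: str = "passed_stationary_vehicle") -> float:
    """Fraction of same-lane vehicles exhibiting the given behavior."""
    same_lane = [o for o in observations if o.lane_id == ego_lane_id]
    if not same_lane:
        return 0.0
    return sum(o.action == action for o in same_lane) / len(same_lane)


obs = [ObservedBehavior(1, "passed_stationary_vehicle"),   # same lane as the ego vehicle
       ObservedBehavior(2, "remained_stationary"),         # parked, different lane
       ObservedBehavior(2, "remained_stationary")]
print(behavior_frequency(obs, ego_lane_id=1))  # 1.0 (100% of same-lane vehicles passed)
```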

In an alternate or additional example, the autonomous vehicle 302 may detect all stationary vehicles within range of its sensors, or within a predetermined threshold distance (e.g., 50 meters, 100 meters), rather than only those in its own lane. Information about stationary vehicles in other lanes can be useful for planning around how other vehicles might react, for example, anticipating another vehicle routing around a double-parked vehicle and into the lane of the autonomous vehicle. The feature values could reflect the locations of these other stationary vehicles and/or any of the other feature values discussed herein for them.

FIG. 3D shows an alternative or additional feature 340. The shaded portion of FIG. 3D represents a region 342 that is occluded from at least one sensor of the autonomous vehicle 302 (e.g., by surfaces of the stationary vehicle 304). The perception engine of the autonomous vehicle may produce a feature value indicating a probability that an occluded object 344 exists in front of the stationary vehicle 304, within the occlusion region 342.

In some cases, the perception engine might determine that the sensor data contains a residual that could indicate the existence of the occluded object 344. The residual could include a portion of an image or video that suggests an object exists but cannot be classified, a SONAR or RADAR anomaly, and so forth. Using RADAR as an example, the perception engine might determine whether the occluded object 344 exists by determining the portion of the RADAR data attributable to the stationary vehicle 304 and/or other environmental objects (e.g., the roadway); the residual RADAR data could then indicate that the occluded object 344 is present. For example, the RADAR data could include returns indicating the existence of an object at a distance from the autonomous vehicle 302 greater than the distance between the autonomous vehicle and the stationary vehicle 304. In certain instances, these returns could be reflections reaching a sensor of the autonomous vehicle 302 from the occluded object 344 under the chassis of the stationary vehicle 304 and/or via a nearby object (e.g., a wall).
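As a toy illustration of that residual check, the sketch below attributes radar range returns to known objects and treats leftover returns lying beyond the stationary vehicle as evidence of an occluded object; the tolerance and scoring are invented for the example.

```python
from typing import List


def occluded_object_likelihood(radar_ranges_m: List[float],
                               known_object_ranges_m: List[float],
                               tolerance_m: float = 1.0) -> float:
    """Fraction of radar returns not explained by known objects and lying farther away.

    A large residual fraction beyond the stationary vehicle can be provided to the
    BV ML model as a feature hinting that something is hidden in the occlusion region.
    """
    if not radar_ranges_m:
        return 0.0
    max_known = max(known_object_ranges_m, default=0.0)
    unexplained = [r for r in radar_ranges_m
                   if all(abs(r - k) > tolerance_m for k in known_object_ranges_m)]
    residual_beyond = [r for r in unexplained if r > max_known]
    return len(residual_beyond) / len(radar_ranges_m)


# Known: stationary vehicle ~12 m, roadway clutter ~5 m. Several returns near 18 m remain.
print(occluded_object_likelihood([5.1, 12.0, 11.8, 18.2, 18.5, 17.9], [5.0, 12.0]))  # 0.5
```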

Some examples may include, as a feature value, a distance between the stationary vehicle 304 and the occluded object 344. This could include cases where the object 344 is not within a direct sensor "view" as well as cases where the object 344 is at least partially within "view" of the sensors.

FIG. 3E shows an alternate or additional feature 346 for which the perception engine might determine a feature value. The perception engine of the autonomous vehicle 302 may generate a feature value indicating the presence of a person 348 near the stationary vehicle 304. The feature value could indicate a distance between the person 348 and the stationary vehicle 304, whether the person 348 is leaving or returning to the stationary vehicle 304, how many people are within a threshold distance of the stationary vehicle 304, and/or whether a door or other aperture of the stationary vehicle 304 is open. In some cases, the feature values can also indicate a yaw 350 of the stationary vehicle 304, measured relative to the pose of the autonomous vehicle 302 or relative to the direction of the lane.

FIG. 3F shows an alternate or additional feature 352 for which the perception engine might determine a feature value. The feature value generated by the perception engine of the autonomous vehicle 302 could indicate classifications of detected objects (e.g., cone, flagger, flare) and/or a meta-classification describing a group of objects (e.g., delivery, construction zone, incapacitated vehicle).

In operation, the perception engine determines one or more such feature values (e.g., "15 meters from the stationary vehicle to the junction," "delivery truck," etc.) and provides them to the BV ML model, which pushes the feature values through its nodes to determine whether the stationary vehicle is a blocking vehicle.

Example Architecture

FIG. 4 illustrates an example architecture including a vehicle system 402. The vehicle system 402 could represent at least a portion of the autonomous vehicle 204 and/or 302. In some cases, this architecture may be used to control an autonomous vehicle that encounters a stationary vehicle.

In some cases, the vehicle system 402 could include processor(s) 404 and/or memory 406. FIG. 4 illustrates these elements in combination, but it is understood that they could be distinct elements of the vehicle system 402 and that portions of the system might be implemented in software or hardware in certain examples.

Processor(s) 404 may include a uniprocessor system with one processor or a multiprocessor system with several processors (e.g., two, four, eight, or any other number). The processor(s) 404 may be any suitable processor capable of executing instructions. For example, in various implementations the processor(s) 404 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs); in multiprocessor systems, each processor 404 may implement a different ISA. The processor(s) 404 may include a central processing unit (CPU), a graphics processing unit (GPU), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and/or a combination thereof.

The example vehicle system 402 could include memory 406. In some examples, the memory 406 may include a non-transitory computer-readable medium configured to store executable instructions/modules, data, and/or data items accessible by the processor(s) 404. In various implementations, the non-transitory computer-readable medium may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/flash-type memory, or any other type of memory. The non-transitory computer-readable medium may store program instructions and data that implement the desired operations. In other implementations, program instructions and/or data may be received, sent, or stored on different types of computer-accessible media, such as storage media or memory media including magnetic or optical media (e.g., a disk) or flash memory (e.g., solid-state memory), coupled to the example vehicle system 402 via the input/output (I/O) interface 408.

Furthermore, though shown as single units in FIG. 4, it is understood that the processor(s) 404 and the memory 406 could be distributed among multiple computing devices of the vehicle and/or among multiple data centers, etc.

In some cases, the I/O interface 408 may be configured to coordinate I/O traffic between the processor(s) 404, the memory 406, the network interface 410, the sensor(s) 412, the I/O devices 414, the drive system 416, and/or other hardware of the vehicle system 402. The I/O devices 414 may include display(s), internal and/or external speakers, and/or passenger input device(s), among others. In some cases, the I/O interface 408 may perform protocol, timing, or other data transformations to convert data signals from one component (e.g., the non-transitory computer-readable medium) into a format suitable for use by another component (e.g., the processor(s) 404). The I/O interface 408 may include support for devices attached through various types of peripheral buses, such as the Universal Serial Bus (USB) standard, the Peripheral Component Interconnect (PCI) bus standard, or a variant thereof. In some implementations, the function of the I/O interface 408 may be split into two or more separate components, such as a north bridge and a south bridge. In some cases, some or all of the functionality of the I/O interface 408, such as the interface to the memory 406, may be incorporated directly into the processor(s) 404 and/or one or more other components of the vehicle system 402.

The example vehicle system 402 could include a network interface 410 configured to establish communication links (i.e., "networks") between the vehicle system 402 and one or more other devices. For example, the network interface 410 may be used to exchange data between the vehicle system 402 and another vehicle 418 via a first network 420, and/or between the vehicle system 402 and a remote computing system 422 via a second network 424. The network interface 410 can enable wireless communication with another vehicle 418 and/or the remote computing device 422, and may support communication via wireless general data networks (e.g., Wi-Fi networks) and/or telecommunications networks (e.g., satellite networks, cellular communication networks).

In some cases, the sensor data discussed herein may be received at a first vehicle and transmitted from the first vehicle to a second vehicle. The perception engine of the second vehicle may incorporate the sensor data received from the first vehicle, for example to fill in a feature value that is not otherwise available to the second vehicle and/or to weight feature values determined from the second vehicle's own sensor data.

The sensor(s) 412 of the example vehicle system 402 may be configured, for example, to localize the vehicle system 402, to detect objects in the environment, to sense movement of the vehicle system 402, to sense environmental data (e.g., ambient temperature, pressure, humidity), and/or to sense conditions of the interior of the vehicle system 402 (e.g., passenger count, interior temperature, noise level). The sensor(s) 412 may include, among others, one or more lidar sensors and one or more cameras (e.g., RGB cameras, intensity cameras, depth cameras, stereo cameras).

The example vehicle system 402 could include a perception engine 426, a BV ML model 428, and a planner 430.

The perception engine 426 may include instructions stored in the memory 406 that, when executed by the processor(s) 404, configure the processor(s) 404 to receive sensor data from the sensor(s) 412 as input and to output perception data. The perception data could include, for example, a pose (e.g., position and orientation) of an object in the environment of the example vehicle system 402, an object track associated with the object (e.g., the historical position, velocity, acceleration, and/or heading of the object over a period of time such as 5 seconds), and/or an object classification associated with the object (e.g., pedestrian, bicyclist, vehicle). In some examples, the perception engine 426 may be configured to predict more than one trajectory for an object, for example using probabilistic determinations or multi-modal distributions of predicted positions, trajectories, etc.

The perception engine 426 may also include instructions that configure the processor(s) 404 to detect a stationary vehicle from the sensor data and to determine feature values from the sensor data, according to any of the techniques discussed herein. For example, this may include instructions that configure the processor(s) 404 to calculate a distance between a stationary vehicle and a traffic light from images and/or a cloud of lidar points. The perception engine 426 may transmit the feature values to the BV ML model 428.

The BV ML model 428 may include instructions stored in the memory 406 that, when executed by the processor(s) 404, configure the processor(s) 404 to receive feature values associated with elements of the environment in which the vehicle system 402 is present and to determine whether a stationary vehicle is a blocking vehicle. The BV ML model 428 may include a decision tree ensemble and/or a deep-learning algorithm whose nodes the feature values are pushed through to determine the output.

The planner 430 may receive, from the perception engine 426, the probability that a stationary vehicle is a blocking vehicle, along with any additional information the planner 430 uses to generate a trajectory (e.g., object classifications, object tracks, vehicle pose). In some cases, the planner 430 and/or the perception engine 426 may transmit a blocking-vehicle indication, based at least in part on the probability determined by the perception engine 426, over the network interface 410 to the remote computing device 422 via the network 424 and/or to another vehicle 418 via the network 420. The other vehicle 418 may use this indication as a feature value in some cases, for example if it encounters a stationary vehicle at the same location as that indicated by the perception engine 426 of the vehicle system 402. This could also include temporarily annotating a global map with the blocking-vehicle indication, where the global map is accessible over a network to a fleet of vehicles.
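The sharing step described above might look roughly like the following sketch; the message schema and the idea of an expiring annotation are illustrative assumptions, not a protocol defined by the disclosure.

```python
import json
import time


def blocking_vehicle_message(lat: float, lon: float, p_blocking: float,
                             ttl_s: float = 600.0) -> str:
    """Build a small, expiring annotation describing a suspected blocking vehicle.

    Another vehicle (or a shared global-map service) receiving this could use it
    as an additional feature value when it encounters a stationary vehicle at
    roughly the same location.
    """
    return json.dumps({
        "type": "blocking_vehicle_indication",
        "location": {"lat": lat, "lon": lon},
        "probability": p_blocking,
        "expires_at": time.time() + ttl_s,   # temporary annotation: blockages clear
    })


print(blocking_vehicle_message(37.7749, -122.4194, 0.91))
```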

In some cases, the perception engine and/or the BV ML model 428 may be located at another vehicle 418 and/or at the remote computing device 422. A perception engine located at another vehicle 418 or at the remote computing device 422 may coordinate with the perception engine 426 of the vehicle system 402. For example, the other vehicle 418 and/or the remote computing device 422 may determine one or more feature values and/or the probability. Where the other vehicle 418 and/or the remote computing device 422 determines one or more feature values, it may transmit those feature values to the vehicle system 402 via the networks 420 and/or 424, and the perception engine 426 may incorporate them. Where the BV ML model 428 is located at the other vehicle 418 and/or the remote computing device 422, that device may receive feature values from the vehicle system 402 via the networks 420 and/or 424, use them to determine the probability that the stationary vehicle is a blocking vehicle, and transmit that probability back to the planner 430 of the vehicle system 402.

In some cases, the remote computing device 422 could include a teleoperations device. The teleoperations device may receive the sensor data and/or one or more feature values, display information about them, and receive input from a remote operator (teleoperator) confirming or identifying whether the stationary vehicle is or is not a blocking vehicle. The teleoperations device may include an interface for receiving such input from the teleoperator, and the resulting indication may corroborate the determination made by the autonomous vehicle and/or additional autonomous vehicles, or identify that determination as a false positive or false negative.

In some cases, a remote operator may input a feature value into the remote computing device 422, which may be transmitted to the vehicle system 402 for use by the BV ML model 428 or input into a copy of the BV ML model 428 located at the remote computing device 422.

The planner 430 may include instructions stored in the memory 406 that, when executed by the processor(s) 404, configure the processor(s) 404 to generate data representative of a trajectory of the example vehicle system 402, for example using data representing a location of the vehicle system 402 in its environment and other data such as local pose data and the probability that a stationary vehicle is a blocking vehicle. In some examples, the planner 430 may substantially continuously (e.g., every 1 to 2 milliseconds, though any interval is contemplated) generate a plurality of potential trajectories with which to control the example vehicle system 402 and select one with which to control the vehicle. The selection may be based at least in part on the current route, the probability that the stationary vehicle is a blocking vehicle, the current vehicle trajectory, and/or detected object trajectory data. The planner 430 may transmit the selected trajectory to the drive system 416 to control the example vehicle system 402 according to that trajectory.

In some cases, the perception engine 426, the BV ML model 428, and/or the planner 430 may run on specialized hardware, such as a graphics processing unit or an FPGA.

Example Process

FIG. 5 illustrates a flow diagram of an example process 500 for generating (e.g., training) a BV ML model for detecting blocking vehicles.

At operation 502, the example process 500 could include receiving a sample that contains sensor data and a label indicating either a stationary non-blocking vehicle or a stationary blocking vehicle, according to any of the techniques described herein. Thousands, or even tens of millions, of such samples may be received, each containing sensor data and a label (e.g., "blocking" or "non-blocking").

At operation 504, the example process 500 might include determining feature values for the sample based at least in part on the sample's sensor data, according to any of the techniques described herein. For example, this could include determining, from the sensor data, a value for each feature the BV ML model is being trained on (e.g., a numeric value representing a green light). The feature-value determination can be repeated for each sample received at operation 502.

At operation 506, the example process 500 may include generating a BV ML model that outputs a probability that a stationary vehicle is a blocking vehicle, determined at least in part from the one or more feature values and the label associated with each sample. Depending on the type of ML model used (e.g., decision tree(s), a deep-learning model), this may include generating nodes, connection weights, node layers, and/or layer types that map input feature values to a label. At runtime, the generated BV ML model may receive a set of feature values from the perception engine and output an indication of whether the stationary vehicle is a blocking vehicle, and the planner may use this indication to generate a trajectory for the autonomous vehicle.
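The training loop of operations 502-506 might be sketched as below, using scikit-learn and fabricated samples purely for illustration; the real feature extraction and model family are whatever the deployed BV ML model actually uses, and all field names here are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier


def extract_features(sample: dict) -> list:
    """Operation 504: turn one labeled sample's sensor data into feature values."""
    return [sample["distance_from_junction_m"],
            float(sample["brake_lights_on"]),
            sample["sv_speed_mps"],
            sample["traffic_flow_normality"]]


# Operation 502: labeled samples (1 = blocking, 0 = non-blocking). Values are made up.
samples = [
    {"distance_from_junction_m": 14.3, "brake_lights_on": False, "sv_speed_mps": 0.001,
     "traffic_flow_normality": 0.87, "label": 1},
    {"distance_from_junction_m": 2.0, "brake_lights_on": True, "sv_speed_mps": 0.0,
     "traffic_flow_normality": 0.10, "label": 0},
    {"distance_from_junction_m": 28.0, "brake_lights_on": False, "sv_speed_mps": 0.0,
     "traffic_flow_normality": 0.93, "label": 1},
    {"distance_from_junction_m": 1.2, "brake_lights_on": True, "sv_speed_mps": 0.01,
     "traffic_flow_normality": 0.07, "label": 0},
]

# Operation 506: fit a model that maps feature values to a blocking probability.
X = np.array([extract_features(s) for s in samples])
y = np.array([s["label"] for s in samples])
bv_ml_model = GradientBoostingClassifier(n_estimators=100, max_depth=2).fit(X, y)

# Runtime use: the perception engine passes current feature values to the model.
print(bv_ml_model.predict_proba([extract_features(samples[0])])[0, 1])
```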

For example, for a high-valued indication that the stationary vehicle is a blocking vehicle (e.g., a probability meeting or exceeding an upper threshold), the planner might generate a trajectory that merges the autonomous vehicle into another lane. For a lower-valued positive indication (e.g., a probability below that threshold), the planner might instead determine a trajectory that keeps the vehicle in place for a few more seconds before re-evaluating, and/or transmit a request for remote teleoperations assistance. The exact values at which the planner takes different actions depend on the planner's configuration.
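Since the disclosure leaves the exact cut-offs to the planner's configuration, the following sketch uses made-up thresholds simply to show the shape of that decision.

```python
def planner_action(p_blocking: float,
                   merge_threshold: float = 0.8,
                   wait_threshold: float = 0.5) -> str:
    """Map the BV ML model's probability to a coarse planner action (thresholds invented)."""
    if p_blocking >= merge_threshold:
        return "merge_into_adjacent_lane"
    if p_blocking >= wait_threshold:
        return "hold_position_and_reevaluate_or_request_teleop"
    return "wait_behind_vehicle"


for p in (0.95, 0.62, 0.10):
    print(p, "->", planner_action(p))
```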

FIG. 6 illustrates a flow diagram of an example process 600 for detecting blocking vehicles. The operations of example process 600 could be performed by one or more processors or other components of an autonomous vehicle.

At operation 602, the example process 600 could include receiving sensor data 604 from one or more sensors 412, according to any of the techniques described herein. For example, the perception engine 426 may receive the sensor data 604.

At operation 606, the example process 600 could include detecting a stationary vehicle in the environment of the autonomous vehicle based at least in part on the sensor data 604, according to any of the techniques described herein. This could include, for example, detecting an object in the environment of the autonomous vehicle, classifying it as a vehicle, determining the vehicle's speed, and determining that the speed is below a predetermined threshold speed. This could also include determining whether the vehicle prevents the autonomous vehicle from following a previously generated trajectory and/or whether the vehicle impedes another vehicle in the autonomous vehicle's environment. One or more of these operations could include inputting the sensor data 604 into a stationary vehicle detector 608, which might include one or more machine-learning algorithms of the perception engine 426 or other components thereof. In some cases, a stationary vehicle indicator 610 may be generated (e.g., by changing a register or flag value, or by transmitting a command to another component of the perception engine 426).

“At operation 602, the example 600 could include the determination of one or more feature value 614 according to any of these techniques. One example of how to determine the one or several feature values 614 is (1) detecting one, more or all of the vehicles on the road, and (2) determining a value that indicates the speed of the stationary car and other speeds. This feature value could include an indication that the stationary vehicle’s speed may be unusual compared to other vehicles, and/or traffic flow data distribution indicating the speed and speeds of other vehicles. A collection of components from the perception engine 426 is referred to as a feature generator 616 in FIG. 6. (e.g., machine-learning algorithms that perform object detection, object classification and object tracking) can determine the feature value 614 based at least partially on the sensor data 604 or the stationary vehicle indicator 610.

“In some cases, operation 612 might also include providing one or more feature value 614 as input into an ML model (e.g. BV ML Model 428), according any of the techniques herein.”

“At operation 618, example process 600 may include receiving, from the ML model, an indication 620 of whether the stationary vehicle is a blocking vehicle or a non-blocking vehicle (i.e., BV indication 620 in FIG. 6), according to any of the techniques discussed herein. The indication 620 could include a label (e.g., “blocking vehicle” or “non-blocking vehicle”) and/or a probability. As one example, the perception engine 426 may receive the indication 620 from the BV ML model 428, and/or the perception engine 426 may transmit the indication 620 to the planner 430, using any of the techniques discussed herein. In some cases, the planner 430 may also receive the sensor data 604 and/or data received from the perception engine 426, such as object classifications, object tracks, etc.
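One possible shape for the BV indication passed from the perception engine to the planner is a small message carrying both the label and the underlying probability, as sketched below. The field names, the decision threshold, and the dataclass representation are assumptions for illustration.

```python
# A possible shape for the BV indication handed to the planner: a label plus
# the underlying probability. Field names and threshold are assumptions.
from dataclasses import dataclass

@dataclass
class BlockingVehicleIndication:
    object_id: int      # track id of the stationary vehicle
    label: str          # "blocking vehicle" or "non-blocking vehicle"
    probability: float  # model's probability that the vehicle is blocking

def make_indication(object_id: int, p_blocking: float,
                    decision_threshold: float = 0.5) -> BlockingVehicleIndication:
    label = ("blocking vehicle" if p_blocking >= decision_threshold
             else "non-blocking vehicle")
    return BlockingVehicleIndication(object_id, label, p_blocking)
```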

“At operation 622, example process 600 may include generating a trajectory 624 to control the motion of the autonomous vehicle, according to any of the techniques discussed herein. The planner 430 may generate candidate trajectories based, at least in part, on the indication 620 and then select one of the candidate trajectories to control the autonomous vehicle. The planner 430 can transmit the selected trajectory to the drive system 416 of the autonomous vehicle.
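A toy sketch of this candidate-selection step is shown below: candidate trajectories are scored in light of the BV indication (using a structure like the one sketched earlier), and the cheapest one is sent to the drive system. The candidate representation, the cost terms, and the drive-system interface are all invented here for illustration.

```python
# Toy sketch of operation 622: penalize candidates that conflict with the BV
# indication and hand the best one to the drive system. The candidate dicts,
# cost terms, and drive_system.execute() interface are assumptions.
def select_trajectory(candidates, indication, drive_system):
    def cost(candidate):
        c = candidate["base_cost"]
        # Waiting behind a vehicle the model says is blocking gets penalized.
        if indication.label == "blocking vehicle" and candidate["action"] == "wait":
            c += 10.0 * indication.probability
        # Merging around a vehicle that is probably not blocking gets penalized.
        if indication.label == "non-blocking vehicle" and candidate["action"] == "lane_change":
            c += 10.0 * (1.0 - indication.probability)
        return c

    best = min(candidates, key=cost)
    drive_system.execute(best)
```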

“FIG. 7 illustrates a flow chart of an example process 700 for detecting blocking vehicles. The operations of example process 700 can be performed by one or more processors or other components of an autonomous vehicle, as described below, or they can be performed remotely by computing systems such as those of another autonomous vehicle and/or a teleoperations device.

“At operation 702, example process 700 could include receiving sensor data according to any one of the techniques described herein.”

“At operation 704, example process 700 may include identifying a stationary vehicle (i.e., the “YES” arm in the flow diagram), according to any of the techniques discussed herein. If a stationary vehicle is not identified, process 700 may return to operation 702.

“At operation 706, example process 700 may include determining one or more feature values based, at least in part, on the sensor data, according to any of the techniques discussed herein. These may include any of the feature values discussed herein. As one example, determining the feature values could include determining (706A) traffic flow data indicating the speeds of one or more vehicles detected from the sensor data, which could include the speed of the stationary vehicle and/or the speeds of other vehicles.

“At operation 708, example process 700 may include providing the one or more feature values to a machine-learning model, according to any of the techniques discussed herein.”

“At operation 710, example process 700 may include outputting, by the machine-learning model, a probability that the stationary vehicle is a blocking vehicle, according to any of the techniques discussed herein.

“At operation 712, example process 700 may include controlling the autonomous vehicle based, at least in part, on the probability, according to any of the techniques discussed herein. This could include controlling the vehicle to pass a blocking vehicle (712A) or to wait for a non-blocking vehicle to move (712B).”

“Example Clauses”

“A.

“B.

“C. The autonomous vehicle of paragraph A or B, wherein detecting the stationary vehicle further includes detecting one or more other stationary vehicles within a predetermined distance of the autonomous vehicle.

“D. The autonomous vehicle of any of paragraphs A-C, wherein the one or more feature values further include at least a speed of the stationary vehicle and a traffic sign state.

“E. The autonomous vehicle of any of paragraphs A-D, wherein the blocking vehicle is any object detected by the perception engine that blocks the progress of the autonomous vehicle or another vehicle.”

Click here to view the patent on Google Patents.

How to Search for Patents

A patent search is the first step toward getting your patent. You can perform a Google Patents search or a USPTO search. “Patent pending” refers to a product covered by a filed patent application. You can search the USPTO’s Public PAIR system to find the patent application. After the patent office approves the application, you can perform a patent number lookup to locate the issued patent, and your product is then patented. You can also use the USPTO search engine; see below for details. You can also get help from a patent lawyer. Patents in the United States are granted by the United States Patent and Trademark Office (USPTO), which also reviews trademark applications.

Are you interested in similar patents? These are the steps to follow:

1. Brainstorm terms to describe your invention, based on its purpose, composition, or use.

Write down a brief, but precise description of the invention. Don’t use generic terms such as “device”, “process,” or “system”. Consider synonyms for the terms you chose initially. Next, take note of important technical terms as well as keywords.

Use the questions below to help you identify keywords or concepts.

  • What is the purpose of the invention? Is it a utilitarian device or an ornamental design?
  • Is the invention a way to create something or perform a function? Is it a product?
  • What is the invention made of? What is its physical composition?
  • What is the invention used for?
  • What technical terms and keywords describe the invention’s nature? A technical dictionary can help you locate the right terms.

2. Use these terms to search for relevant Cooperative Patent Classifications (CPCs) with the Classification Search Tool. If you are unable to find the right classification for your invention, scan the class schemas (class schedules) and try again. If you don’t get any results from the Classification Text Search, consider substituting synonyms for the words you used to describe your invention.

3. Check the CPC Classification Definition to confirm the CPC classification you found. If the selected classification title has a blue box with a “D” at its left, the hyperlink leads to a CPC classification definition. CPC classification definitions help you determine the scope of the classification so that you can choose the most relevant one. These definitions may also include search tips or other suggestions that are helpful for further research.

4. Use the Patents Full-Text and Image databases to retrieve patent documents that include the CPC classification you selected. By focusing on the abstracts and representative drawings, you can narrow your search to the most relevant patent publications.

5. Review this selection of patent publications closely for any similarities to your invention. Pay close attention to the claims and the specification. Refer to the references cited by the applicant and the patent examiner to find additional relevant patents.

6. Retrieve published patent applications that match the CPC classification you chose in Step 3. Using the same search strategy as in Step 4, narrow your results to the most relevant patent applications by reviewing the abstracts and representative drawings on each page. Then examine the published patent applications carefully, paying special attention to the claims and the other drawings.

7. You can find additional US patent publications by keyword searching in the AppFT or PatFT databases, by classification searching of non-US patents as described below, and by using web search engines to find non-patent-literature disclosures about inventions. Here are some examples:

  • Add keywords to your search. Keyword searches may turn up documents that were poorly categorized or whose classifications were missed in Step 2. US patent examiners, for example, often supplement their classification searches with keyword searches. Consider using technical engineering terminology rather than everyday words.
  • Search for foreign patents using the CPC classification. Re-run the search using international patent office search engines such as Espacenet, the European Patent Office’s worldwide database of over 130 million patent publications. Other national patent databases are also available.
  • Search non-patent literature. Inventions can be disclosed in many non-patent publications. It is recommended that you search journals, books, websites, technical catalogs, conference proceedings, and other print and electronic publications.

To review your search, you can hire a registered patent attorney to assist. A preliminary search helps you prepare to discuss your invention and related prior art with the attorney, and it keeps the attorney from spending unnecessary time and money on patenting basics.

Download patent guide file – Click here