Invented by Carl Wellington, Colin Green, Adam Milstein, Uatc LLC
The Uatc LLC invention works as follows. A traffic signal analysis system for an autonomous vehicle (AV) can receive image data via one or more cameras. The image data includes an upcoming traffic signaling system at an intersection. The system can access a signal map that is specific to the traffic signaling system and determine the action the AV should take through the intersection. The system can use the matching signal map to generate a template for the traffic signaling system, and determine the first and second subsets of traffic signals that are applicable to the action. The system can dynamically analyze the first and second subsets to determine the state of the traffic signaling system for the action, and then generate an output for the AV that indicates the state of the upcoming traffic signals for the action.
Background for Traffic Signal Analysis System
Traffic accidents cause more than one million deaths worldwide each year; in the United States alone, they are responsible for over 30,000 deaths. Distracted driving, drunk driving, driver incompetence or impairment, hazardous roads and weather conditions, and long commutes all continue to contribute to the high number of traffic-related injuries and deaths. The advent of autonomous vehicle technologies, along with continual advances in machine learning and artificial intelligence, may circumvent some of the unfortunate factors that lead to traffic accidents.
The main concern about autonomous vehicles on public roads is their ability to make reliable and safe decisions in complex situations. An autonomous vehicle will encounter many situations in which it must make decisions that could, however unlikely, lead to the loss of human life. Traffic signaling systems at road intersections can range from a simple three-bulb face to complex arrangements of yielding and directional signals. To advance public acceptance, autonomous vehicles must make decisions that are safe, reliable, skilled, and responsible at all such intersections.
A traffic signal analysis system is provided to enable an autonomous vehicle to make decisions at traffic intersections. In the examples provided here, the signal analysis system can be integrated into the autonomous vehicle's control system or provided as a dedicated, independent component. The signal analysis system can monitor or receive image data from one or more cameras and identify traffic signaling systems, including any number of traffic lights, within the image data. In some implementations, the system can determine the pass-through action for the autonomous vehicle at the intersection. The signal analysis system can, for example, determine whether the autonomous vehicle is to turn left, turn right, or make a U-turn at the intersection based on current route data and/or navigation data.
Accordingly, in many examples, the system can perform localization to determine the pose of the autonomous vehicle (e.g., position, bearing, and/or direction) and then access a matching signal map (e.g., stored in the memory of the autonomous vehicle or of the signal analysis system, or in a remote system or memory accessible by the signal analysis system over a network) that includes information about the characteristics of the traffic signaling system. In many cases, the signal analysis system may use the matching signal map to create a template of the traffic signaling system from the image data. The signal template may define at least one region of interest where the traffic signal faces are expected to appear in the image. Based on the image data and the characteristic information, the signal analysis system can identify the state of the traffic signaling system for the pass-through action and generate an output signal for the autonomous vehicle that indicates that state. For example, the signal analysis system can determine that the traffic signaling system is green for the pass-through action and generate an output signal indicating the green status. The signal analysis system can transmit the output signal to the autonomous vehicle's control system, which then performs the pass-through action accordingly.
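The localization-and-map-matching step above can be sketched as follows. This is a minimal illustration, not the patent's actual method: the data shapes, field names, and the "nearest intersection ahead" heuristic are all assumptions made for the example.

```python
import math
from dataclasses import dataclass

# Hypothetical signal-map records: face locations are kept in map
# coordinates so a template can later be projected into the image.
@dataclass
class SignalFace:
    face_id: str
    position: tuple            # (x, y) in map coordinates
    bulbs: tuple               # e.g. ("red", "yellow", "green")

@dataclass
class SignalMapEntry:
    intersection_id: str
    location: tuple            # (x, y) of the intersection
    faces: list                # list of SignalFace

def match_signal_map(entries, pose, max_range=150.0):
    """Pick the signal-map entry for the nearest intersection ahead.

    pose = (x, y, heading_rad).  An entry counts as "ahead" when the
    bearing from the vehicle to the intersection is within 90 degrees
    of the vehicle's heading -- a simplification of true map matching.
    """
    x, y, heading = pose
    best, best_dist = None, max_range
    for entry in entries:
        dx, dy = entry.location[0] - x, entry.location[1] - y
        dist = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx)
        # wrap the angle difference into [-pi, pi]
        diff = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
        if dist < best_dist and abs(diff) < math.pi / 2:
            best, best_dist = entry, dist
    return best
```

A pose facing an intersection 40 m ahead would match that entry, while an intersection behind the vehicle is ignored even if it is closer.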
In certain aspects, the system can determine the state of the traffic signaling system for the pass-through action by dynamically analyzing the region(s) of interest defined by the generated template as the autonomous vehicle approaches the intersection. The signal analysis system can dynamically update and analyze the regions of interest to determine the current state of the traffic signals for the pass-through action. The traffic signaling system may include multiple faces. The signal analysis system can identify and access the matching traffic signal map to determine the characteristics of the signaling system (e.g., the number of faces and their locations, the signal subsets, the representations of each lens or bulb in each subset, and the right-of-way characteristics). The signal analysis system can then generate the template from these characteristics and project it onto the image data to be analyzed.
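The projection of the template onto the image data can be illustrated with a simplified pinhole-camera model. Everything here is an assumption for the sketch (focal length, image size, vehicle-relative face coordinates); the point is only that each region of interest moves and grows as the vehicle approaches the face.

```python
def project_regions(faces_rel, focal=1000.0, image_size=(1920, 1080),
                    face_half=0.5):
    """Project vehicle-relative face positions into image-space regions.

    faces_rel maps face_id -> (forward, left, up) in meters, vehicle
    frame.  Returns face_id -> (u_min, v_min, u_max, v_max) pixel boxes
    under a simple pinhole model.
    """
    cx, cy = image_size[0] / 2, image_size[1] / 2
    rois = {}
    for fid, (fwd, left, up) in faces_rel.items():
        if fwd <= 0:                    # behind the camera: no region
            continue
        u = cx - focal * left / fwd     # image x grows to the right
        v = cy - focal * up / fwd       # image y grows downward
        half = focal * face_half / fwd  # apparent size scales with 1/distance
        rois[fid] = (u - half, v - half, u + half, v + half)
    return rois
```

Re-running this projection on every frame as the pose updates is one way to realize the "dynamically update the regions of interest" behavior described above.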
As used in this document, a "subset" of a traffic signaling system corresponds directly to the right-of-way for a specified pass-through action (e.g., left turn, straight-through progression, right turn, or U-turn). A subset can be displayed across several signal faces in the traffic signaling system. For example, one subset can direct straight-through progression, but its lights may appear on several signal faces. A light (e.g., a red light) that directs straight-through progression may also direct another pass-through action (e.g., a left turn). Accordingly, left-turn actions may have their own subset of signals directing the right-of-way (e.g., green and yellow left-turn arrows), but they may also depend on another subset.
In some examples, the system can identify and analyze a subset that corresponds to a pass-through action for the autonomous vehicle, and then identify the state for that pass-through action (e.g., stop, go, or yield to vehicles). In some examples, the signal analysis system analyzes the subset using a probabilistic matching operation on portions of the subset, such as each bulb or group of bulbs within a traffic-light face, to determine the state of each traffic-light face. As described in this document, the matching operation can be used to resolve occlusions, identify and resolve faulty signals, analyze multiple subsets when a pass-through action requires it (e.g., when a left or right turn depends on a second subset), and so forth.
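A toy version of such a per-bulb probabilistic match might look like the following. The scoring scheme and confidence threshold are assumptions; the patent does not specify this formulation. The "unknown" fallback shows one way an occluded or faulty face could be handed to downstream logic for conservative handling.

```python
def face_state(bulb_scores, min_confidence=0.5):
    """Resolve a traffic-light face state from per-bulb match scores.

    bulb_scores maps each bulb ("red", "yellow", "green") to a
    non-negative score from a template-matching step (how strongly
    that bulb appears lit).  Scores are normalized into a probability
    distribution; if the best state is not confident enough (e.g. the
    face is occluded or a bulb is faulty), the function returns
    "unknown" rather than guessing.
    """
    total = sum(bulb_scores.values())
    if total == 0:
        return "unknown", 0.0
    probs = {bulb: s / total for bulb, s in bulb_scores.items()}
    state, p = max(probs.items(), key=lambda kv: kv[1])
    if p < min_confidence:
        return "unknown", p
    return state, p
```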
Among other benefits described herein, these examples achieve a technical effect of increasing the safety, reliability, and trustworthiness of autonomous vehicles as they pass through intersections.
One or more of the examples described herein provide that methods, techniques, and actions performed by a computing device can be performed programmatically or as computer-implemented methods. As used in this document, "programmatically" means through the use of code or computer-executable instructions. These instructions may be stored in one or more memory resources of the computing device. A programmatically performed step may or may not be automatic.
One or more of the examples described in this document can be implemented using programmatic engines, modules, or components. A programmatic engine, module, or component can include a program, a subroutine, a portion of a program, or a software or hardware component capable of performing a stated task. A module or component may exist independently on a hardware component, or may be an element or process shared by other modules, programs, or machines.
Some examples herein may require the use of computing devices, including memory and processing resources. One or more of the examples described in this document may be implemented, in whole or in part, on computing devices such as servers, desktop computers, smartphones or cellular phones, laptop computers, printers, digital picture frames, network equipment (e.g., routers), and tablet devices. Memory, processing, and network resources can be used to establish, use, or perform any example described in this document (including the performance of a method or the implementation of a system).
Furthermore, one or more of the examples described herein can be implemented through the use of instructions that are executable by one or more processors. These instructions can be carried on a computer-readable medium. The machines shown in the figures provide examples of processing resources and computer-readable media on which instructions for implementing the disclosed examples may be carried and/or executed. The numerous machines shown in the examples include processors and various forms of memory for storing data and instructions. Computer-readable media include permanent memory storage devices, such as the hard drives of personal computers and servers. Computer storage media can also include CD or DVD units, flash memory, and magnetic memory. Computers, terminals, and network-enabled devices (e.g., mobile devices such as cell phones) are all examples of machines and devices that use processors, memory, and instructions stored on computer-readable media. Examples can also be implemented as computer programs, or as a computer-usable carrier medium capable of carrying such a program.
Numerous examples of autonomous vehicles (AVs) are referred to herein. An autonomous vehicle is one that is automated with respect to steering and propulsion. There are different levels of autonomy: some vehicles allow automation only in certain scenarios, such as on highways, and require a driver to be present; more advanced autonomous vehicles can drive without any human assistance from inside or outside the vehicle. Such vehicles are often required to make advance decisions about how to operate given the challenging environment around the vehicle.
FIG. 1 is a block diagram illustrating an autonomous vehicle according to example implementations. In the example of FIG. 1, a control system 100 may be used to autonomously operate a vehicle 10 within a geographic area for various purposes, including transport services (e.g., transport of people, delivery services, and so on). In the examples given, an autonomously driven vehicle can operate without human control. In the context of automobiles, for example, an autonomously driven vehicle can steer, accelerate, shift, brake, and operate lighting components. Some variations recognize that an autonomous vehicle can be operated either manually or autonomously.
In one implementation, the system 100 uses specific sensor resources to intelligently control the vehicle 10 during the most common driving scenarios. The control system 100, for example, can drive the vehicle by autonomously accelerating, steering and braking it as the vehicle moves towards a destination. The control system can plan routes and perform vehicle control actions, such as braking, steering and accelerating, using sensor data, along with other inputs, including transmissions from human operators or other vehicles.
In the example shown in FIG. 1, the control system 100 comprises a computer or processing system that processes sensor data obtained by the vehicle with respect to the road segment on which the vehicle operates. The sensor data can be used to determine the actions the vehicle should take in order to reach its destination. In some variations, the control system 100 may include additional functionality, such as wireless communication capability for sending or receiving wireless communications to and from one or more remote sources. When controlling the vehicle 10, the control system 100 can issue instructions and data (shown as commands 85) that programmatically control various electromechanical interfaces of the vehicle 10. The commands 85 are used to control operational aspects of the vehicle 10, including propulsion and steering.
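A minimal sketch of what such a command record might look like is given below. The field names and ranges are invented for illustration; the patent only says that commands 85 control operational aspects such as propulsion and steering. Bounding each actuator value before dispatch is one plausible safety measure, not something the text specifies.

```python
from dataclasses import dataclass

# Hypothetical shape for the control commands ("commands 85") issued to
# the vehicle's electromechanical interfaces.  Fields are illustrative.
@dataclass
class Command:
    throttle: float   # 0.0 .. 1.0
    brake: float      # 0.0 .. 1.0
    steering: float   # wheel angle in radians, left positive

def clamp_command(cmd, max_steer=0.6):
    """Bound each actuator value to its safe range before dispatch."""
    return Command(
        throttle=min(max(cmd.throttle, 0.0), 1.0),
        brake=min(max(cmd.brake, 0.0), 1.0),
        steering=min(max(cmd.steering, -max_steer), max_steer),
    )
```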
Examples acknowledge that urban driving environments pose significant challenges for autonomous vehicles. The behavior of pedestrians, cyclists and other vehicles, for example, can be affected by geographic region, locality and even the country. The way in which drivers react to pedestrians, cyclists, and other vehicles also varies by geographical region and locality.
The autonomous vehicle 10 may be equipped with multiple types of sensors 101, 103, 105 that combine to give a computerized perception of the environment and space around the vehicle. The control system 100 may also be used within the autonomous vehicle 10 to receive sensor data and control the electromechanical interfaces that operate the vehicle on roads.
More specifically, the sensors 101, 103, 105 work together to provide a complete sensor view of the vehicle 10 and to gather information about the surrounding area, including potential hazards in the forward direction of the vehicle. The sensors 101, 103, 105 may include, for example, multiple sets of camera sensors 101 (video cameras, stereoscopic pairs of cameras, depth-perception cameras, or long-range cameras), remote-detection sensors 103 (such as radar or LIDAR), proximity or touch sensors, and/or sonar sensors (not illustrated).
Each of the sensors 101, 103, 105 can communicate with the control system 100 via a corresponding sensor interface 110, 112, 114. Each of the sensor interfaces 110, 112, 114 can comprise hardware and/or other logical components coupled to the respective sensor. The sensors 101, 103, 105, for example, can include a video and/or stereoscopic camera set that continuously generates image data of the environment of the vehicle 10. The sensor interfaces 110, 112, 114 can include dedicated processing resources, such as a field-programmable gate array ("FPGA").
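The sensor-interface pattern described above can be sketched as a thin wrapper that hands timestamped readings from each sensor to the control system. The class and field names are illustrative assumptions, not the patent's architecture.

```python
# Minimal sketch: each interface wraps one sensor and delivers
# timestamped readings; the control system polls all of them.
class SensorInterface:
    def __init__(self, sensor_id, read_fn):
        self.sensor_id = sensor_id
        self._read = read_fn      # callable returning the latest raw reading

    def poll(self, timestamp):
        """Return one timestamped measurement from the wrapped sensor."""
        return {"sensor": self.sensor_id, "t": timestamp, "data": self._read()}

class ControlSystem:
    def __init__(self, interfaces):
        self.interfaces = interfaces

    def gather(self, timestamp):
        """Collect one reading from every attached sensor interface."""
        return [iface.poll(timestamp) for iface in self.interfaces]
```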