Autonomous Vehicles – Michael R. James, Danil V. Prokhorov, Toyota Motor Corp

Abstract for “Autonomous vehicle interaction and external environment”

Arrangements relate to the interaction of an autonomous vehicle with its external environment. Such interactions can take place in various ways. One example involves a non-verbal gesture made by a human in the external environment. The non-verbal gesture can be identified, and, based at least in part on the identified non-verbal gesture, a future driving maneuver can be determined. The autonomous vehicle can then be caused to implement the future driving maneuver. Another example involves detecting the external environment of the autonomous vehicle to identify a person therein (e.g., a human pedestrian, a human bicyclist, or a human driver of another vehicle). The location of the identified person can be determined, along with whether the identified person is potentially relevant to a future driving maneuver of the autonomous vehicle. Responsive to such a determination, the autonomous vehicle can send a directional message to the identified person.

Background for “Autonomous vehicle interaction and external environment”

There are various ways in which a human driver can interact with a vehicle's environment. For instance, the driver may communicate with pedestrians or other drivers using non-verbal gestures, such as waving a hand or nodding the head. Such gestures can convey different meanings: they can express an emotion, request information, or indicate an intended action. A non-verbal gesture can, for example, signal the driver's intention to proceed in a certain direction, or invite another driver or pedestrian to take a particular action. Likewise, pedestrians or human drivers of other vehicles in the environment can communicate with the driver using non-verbal gestures, for example, to indicate that the driver should proceed with a maneuver or to request that the driver take a particular action.

In one respect, the present disclosure is directed to a method of interaction between an autonomous vehicle and its external environment. The method can include detecting the external environment of the autonomous vehicle to identify a person therein. The method can also include determining the location of the identified person, and determining whether the identified person is potentially relevant to a future driving maneuver of the autonomous vehicle. Responsive to determining that the identified person is potentially relevant to the future driving maneuver, the method can include sending a directional message from the autonomous vehicle to the identified person.

In another respect, the present disclosure is directed to a system for interaction between an autonomous vehicle and its external environment. The system can include an autonomous vehicle, which can include a sensor system. The sensor system can detect the environment external to the autonomous vehicle to identify a person therein, and can determine the location of the identified person.

The system can also include a processor on board the vehicle, programmed to initiate executable operations. The executable operations can include determining whether the identified person is potentially relevant to a future driving maneuver of the autonomous vehicle. Responsive to determining that the identified person is potentially relevant to the future driving maneuver, the executable operations can include sending a directional message from the autonomous vehicle to the identified person.

In still another respect, the present disclosure is directed to a computer program product for interaction between an autonomous vehicle and its external environment. The computer program product comprises a computer-readable storage medium having program code embodied therewith, the program code executable by a processor to perform a method. The method can include detecting the external environment of the autonomous vehicle to identify a person therein, determining the location of the identified person, and determining whether the identified person is potentially relevant to a future driving maneuver of the autonomous vehicle. Responsive to such a determination, the method can include sending a directional message from the autonomous vehicle to the identified person.

In a further respect, the present disclosure is directed to a method of interaction between an autonomous vehicle and its external environment. The method can include detecting a non-verbal gesture of a human located in the external environment, and identifying the detected non-verbal gesture. The method can also include determining a future driving maneuver based at least in part on the identified non-verbal human gesture, and causing the autonomous vehicle to implement the future driving maneuver.

In yet another respect, the present disclosure is directed to a system of interaction between an autonomous vehicle and its external environment. The system can include an autonomous vehicle, which can include a sensor system configured to detect a non-verbal gesture made by a human located in the external environment.

The system can also include a processor programmed to initiate executable operations. The executable operations can include identifying the detected non-verbal gesture and determining a future driving maneuver based at least in part on the identified non-verbal human gesture. The executable operations can further include causing the autonomous vehicle to implement the future driving maneuver.

In still a further respect, the present disclosure is directed to a computer program product for interaction between an autonomous vehicle and its external environment. The computer program product comprises a computer-readable storage medium having program code embodied therewith, the program code executable by a processor to perform a method. The method can include detecting a non-verbal gesture of a human in the external environment, identifying the detected non-verbal gesture, determining a future driving maneuver based at least in part on the identified non-verbal human gesture, and causing the autonomous vehicle to implement the future driving maneuver.

This detailed description relates to the interaction between an autonomous vehicle and its external environment. Such interaction can include the autonomous vehicle sending a message to one or more intended recipients in the external environment. Alternatively or additionally, such interaction can include the autonomous vehicle determining one or more future driving maneuvers based at least in part on non-verbal gestures detected in the external environment. This detailed description relates to systems, methods, and computer program products that incorporate such features. In at least some instances, such systems, methods, and computer program products can improve the safety and/or performance of an autonomous vehicle.

Detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are intended only as exemplary. Specific structural and functional details are therefore not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the aspects herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting, but rather to provide an understandable description of possible implementations. Various embodiments are shown in FIGS. 1-7, but the embodiments are not limited to the illustrated structure or application.

It will be appreciated that, for simplicity and clarity of illustration, reference numerals are repeated where appropriate to indicate corresponding or analogous elements. Numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, those of ordinary skill in the art will understand that the embodiments described herein can be practiced without these specific details.

Referring to FIG. 1, an example of a vehicle 100 is shown. As used herein, "vehicle" means any form of motorized transport. In one or more implementations, the vehicle 100 can be an automobile. While arrangements are described herein with respect to automobiles, it will be understood that embodiments are not limited to automobiles. In one or more implementations, the vehicle 100 can be any other form of motorized transport, such as a watercraft or an aircraft.

According to arrangements herein, the vehicle 100 can be an autonomous vehicle. As used herein, "autonomous vehicle" means a vehicle that can operate in an autonomous mode. "Autonomous mode" means that one or more computing systems are used to navigate and/or maneuver the vehicle along a travel route without input from a human driver. In one or more arrangements, the vehicle 100 can be fully automated. The vehicle 100 can be configured to switch between an autonomous mode and a manual mode; such switching can be implemented in any suitable manner, now known or later developed. "Manual mode" means that a majority of the navigation and/or maneuvering of the vehicle along a travel route is performed by a human driver.

The vehicle 100 can have an associated longitudinal axis 101, which can be the central axis of the vehicle 100. The vehicle 100 can also have an associated longitudinal direction. "Longitudinal direction" means any direction that is substantially parallel to and/or co-linear with the longitudinal axis 101.

The vehicle 100 can include various elements, some of which may be part of an autonomous driving system. Some of the possible elements of the vehicle 100 are shown in FIG. 1 and will be described below. It will be understood that it is not necessary for the vehicle 100 to include all of the elements shown in FIG. 1 or described herein. The vehicle 100 can have any combination of the elements shown in FIG. 1, can have additional elements beyond those shown, and may omit one or more of the elements shown in FIG. 1. Further, while the elements are shown as being located within the vehicle 100 in FIG. 1, it will be understood that one or more of these elements can be located external to the vehicle 100, and the elements may be physically separated by large distances.

The vehicle 100 can include one or more processors 110. "Processor" means any component or group of components configured to execute any of the processes described herein, or any form of instructions to carry out such processes or cause such processes to be performed. The processor 110 may be implemented with one or more general-purpose and/or special-purpose processors. Examples of suitable processors include microprocessors, microcontrollers, central processing units, array processors, vector processors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), programmable logic arrays (PLAs), application-specific integrated circuits (ASICs), and controllers. The processor 110 can include at least one hardware circuit (e.g., an integrated circuit) configured to carry out instructions contained in program code. In arrangements in which there is a plurality of processors 110, such processors can work independently from each other, or one or more processors can work in combination with each other. In one or more arrangements, the processor 110 can be a main processor of the vehicle 100. For instance, the processor 110 can be an engine control unit.

The vehicle 100 can include one or more data stores 115 for storing one or more types of data. The data store 115 can include volatile and/or non-volatile memory. Examples of suitable data stores 115 include RAM (Random Access Memory), flash memory, ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives, any other suitable storage medium, or any combination thereof. The data store 115 can be a component of the processor 110, or the data store 115 can be operatively connected to the processor 110 for use thereby. The term "operatively connected," as used throughout this description, can include direct or indirect connections, including connections without direct physical contact.

The vehicle 100 can include an autonomous driving module 120. The autonomous driving module 120 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The autonomous driving module 120 can be a component of the processor 110, or the autonomous driving module 120 can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The autonomous driving module 120 can include instructions (e.g., program logic) executable by the processor 110. Such instructions can include instructions to execute various vehicle functions and/or to transmit data to, receive data from, interact with, and/or control one or more systems of the vehicle 100 (e.g., one or more of the vehicle systems 160). Alternatively or additionally, the data store 115 may contain such instructions.

The vehicle 100 can include a human signal recognition module 121. The human signal recognition module 121 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The human signal recognition module 121 can be a component of the processor 110, or it can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The human signal recognition module 121 can be configured to detect, analyze, assess, identify, and/or interpret non-verbal gestures of human beings to determine their meaning. In one or more arrangements, the human signal recognition module 121 can also be configured to detect, analyze, assess, identify, and/or interpret verbal gestures of human beings to determine their meaning. The human signal recognition module 121 can include instructions (e.g., program logic) executable by the processor 110. Such instructions can include instructions to analyze non-verbal and/or verbal human gestures to determine their meaning. Alternatively or additionally, the data store 115 may contain such instructions.

In one or more arrangements, the human signal recognition module 121 can be operatively connected to one or more gesture libraries 116. The one or more gesture libraries 116 can be included in one or more of the data stores 115. The gesture library 116 can include a set of predefined human gestures. "Gesture" means a form of non-verbal communication in which visible human bodily actions and/or movements communicate a message. Gestures can be used instead of, or in combination with, speech or other vocal communication. Gestures include movements of the hands, fingers, and arms, as well as movements of the head, face, eyes, lips, mouth, or other parts of the body. "Predefined human gesture" means a human gesture that has an associated meaning. "Set of predefined human gestures" means one or more predefined human gestures. Each of the predefined human gestures in the gesture library 116 can have an associated meaning.

The identification of a detected non-verbal human gesture can be performed in any suitable manner. For instance, a non-verbal gesture can be identified using computer vision and/or image processing techniques. In one or more arrangements, the human signal recognition module 121 can compare a detected non-verbal human gesture to the set of predefined human gestures in the gesture library 116 to determine whether there is a match. "Match" or "matches" means that the detected non-verbal human gesture and one of the predefined human gestures in the set are identical. In some embodiments, "match" or "matches" can also mean that the detected non-verbal human gesture and one of the predefined human gestures in the set are substantially identical. For instance, the detected non-verbal human gesture and one of the predefined human gestures can match within a predetermined probability (e.g., at least about 85%, at least about 90%, or greater) or confidence level.
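The matching step described above can be sketched as a nearest-template search with a confidence threshold. This is a minimal illustrative sketch, not the disclosed implementation: the gesture names, the flattened feature-vector encoding, cosine similarity as the comparison measure, and the 0.90 threshold are all assumptions introduced here for illustration.

```python
def cosine_similarity(a, b):
    """Similarity in [0, 1] for non-negative feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical gesture library 116: gesture name -> feature vector
# (e.g., flattened pose keypoints). Entries are illustrative only.
GESTURE_LIBRARY = {
    "wave_through": [0.9, 0.1, 0.3],
    "stop_palm":    [0.1, 0.9, 0.2],
    "head_nod":     [0.2, 0.2, 0.9],
}

def match_gesture(detected, library=GESTURE_LIBRARY, threshold=0.90):
    """Return (name, score) of the best match at or above threshold, else (None, score)."""
    best_name, best_score = None, 0.0
    for name, template in library.items():
        score = cosine_similarity(detected, template)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score
```

A detected gesture identical to a library template scores 1.0 and matches; a gesture that resembles no template closely enough falls below the threshold and returns no match, mirroring the "substantially identical within a predetermined probability" language above.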

The vehicle 100 can include an external communication module 122. The external communication module 122 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The external communication module 122 can be a component of the processor 110, or it can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The external communication module 122 can be configured to determine an appropriate manner of sending directional messages to one or more recipients (e.g., pedestrians and/or human drivers) in the external environment of the vehicle 100. Such messages can relate to, for example, future driving maneuvers of the vehicle 100. The external communication module 122 can include instructions (e.g., program logic) executable by the processor 110, including instructions to determine an appropriate method of directional communication, such as visual communication and/or audial communication. Alternatively or additionally, the data store 115 may contain such instructions.

The vehicle 100 can include a sensor system 125. The sensor system 125 can include one or more sensors. "Sensor" means any device, component, and/or system that can detect, determine, assess, measure, quantify, and/or sense something. The one or more sensors can be configured to detect, determine, assess, measure, quantify, and/or sense in real-time. As used herein, "real-time" means a level of processing responsiveness that a user or system senses as sufficiently immediate for a particular process or determination to be made, or that enables the processor to keep up with some external process.

In arrangements in which the sensor system 125 includes a plurality of sensors, the sensors can work independently from each other, or one or more of the sensors can work in combination with each other. The sensor system 125 can be operatively connected to the processor 110, the data store 115, and/or the autonomous driving module 120.

The sensor system 125 can include any suitable type of sensor. For example, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense information about the vehicle 100 itself. Alternatively or additionally, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense information about the environment in which the vehicle 100 is located, including information about objects in that environment. Such objects may be stationary or moving. Alternatively or additionally, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense the location of the vehicle 100. Various examples of these and other types of sensors are described herein; however, it will be understood that the embodiments are not limited to the particular sensors described.

The sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense position and orientation changes of the vehicle 100, such as those based on inertial acceleration. In one or more arrangements, the sensor system 125 can include accelerometers, a GPS unit, and/or other suitable sensors. The sensor system 125 can also include sensors that can monitor one or more internal systems of the vehicle 100 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature sensor, a coolant temperature sensor, etc.).

The sensor system 125 can include one or more environment sensors 126. The environment sensors 126 can be configured to detect, determine, assess, measure, quantify, and/or sense objects in at least a portion of the external environment of the vehicle 100. Various examples of the environment sensors 126 are described herein; however, it will be understood that the embodiments are not limited to the particular sensors described.

In one or more arrangements, at least some of the environment sensors 126 can use, at least in part, radio signals (e.g., RADAR-based sensors). The one or more radio-based sensors can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, the lateral direction, and/or other direction(s)).

In one or more arrangements, at least some of the environment sensors 126 can use, at least in part, lasers. For instance, one or more of the environment sensors 126 can be, or can be included in, a laser rangefinder or a LIDAR. Such devices can include a laser source and/or laser scanner configured to emit a laser, and a detector configured to detect reflections of the laser. The laser rangefinder or LIDAR can be configured to operate in a coherent or an incoherent detection mode. The one or more laser-based sensors can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, the lateral direction, and/or other direction(s)).
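A range-and-bearing return from a rangefinder or LIDAR can be resolved into the longitudinal and lateral components mentioned above by simple trigonometry. This sketch assumes a particular sensor frame (bearing measured from the longitudinal axis 101, positive toward the vehicle's left); the disclosure does not specify a frame, so this convention is an assumption for illustration.

```python
import math

def decompose_return(range_m, bearing_deg):
    """Convert a range (meters) and bearing (degrees from the longitudinal
    axis 101) into a longitudinal (forward) and lateral offset relative to
    the vehicle. Frame convention is an illustrative assumption."""
    theta = math.radians(bearing_deg)
    longitudinal = range_m * math.cos(theta)
    lateral = range_m * math.sin(theta)
    return longitudinal, lateral
```

For example, an object 10 m dead ahead decomposes to a 10 m longitudinal offset and zero lateral offset, while an object directly abeam decomposes to a purely lateral offset.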

In one or more arrangements, at least some of the environment sensors 126 can use, at least in part, ultrasound. Such sensors can include an ultrasound source configured to emit ultrasonic signals and a detector configured to detect reflections of the ultrasonic signals. The one or more ultrasound-based environment sensors 126 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, the lateral direction, and/or other direction(s)). Such detecting can be based on a characteristic (e.g., the intensity) of a reflected ultrasonic signal.
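One standard way an ultrasonic sensor of the kind described can estimate distance is time-of-flight: the pulse travels to the object and back, so the one-way distance is half the round-trip path. The disclosure mentions detection based on signal characteristics such as intensity; the time-of-flight calculation below is a common, generic approach shown only for illustration.

```python
SPEED_OF_SOUND_M_S = 343.0  # in dry air at ~20 degrees C; varies with temperature

def echo_distance_m(round_trip_s, speed=SPEED_OF_SOUND_M_S):
    """Distance to the reflecting object from the echo's round-trip time.
    The pulse travels out and back, so the one-way distance is half the path."""
    return speed * round_trip_s / 2.0
```

A 20 ms echo, for instance, corresponds to an object roughly 3.4 m away under these assumptions.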

In some arrangements, the sensor system 125, the processor 110, and/or one or more of the modules 120, 121, 122 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, one or more aspects, characteristics, and/or properties of a detected object. For example, the sensor system 125, the processor 110, and/or one or more of the modules 120, 121, 122 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the size, relative size, length, width, and/or height of a detected object.

The sensor system 125 can include other types of sensors as well. The processor 110 and/or one or more of the modules 120, 121, 122 can be operable to control movements of one or more of the sensors of the sensor system 125. Any of the sensors described herein can be provided in any suitable location with respect to the vehicle 100. For instance, one or more sensors can be located within the vehicle 100, and/or one or more sensors can be located on the exterior of the vehicle 100.

The vehicle 100 can include a camera system 127. In one or more arrangements, the camera system 127 can be a part of the sensor system 125. The camera system 127 can include one or more cameras 128. "Camera" means any device, component, and/or system that can capture visual data. "Visual data" includes video and/or image information/data. The visual data can be in any suitable form.

In one or more arrangements, one or more of the cameras 128 can include a lens (not shown) and an image capture element (not shown). The image capture element can be any suitable type of image capturing device or system, including, for example, an area array sensor, a charge-coupled device (CCD) sensor, a complementary metal-oxide-semiconductor (CMOS) sensor, or a linear array sensor, just to name a few possibilities. The image capture element may capture images at any suitable wavelength on the electromagnetic spectrum, and may capture color images and/or grayscale images.

In one or more arrangements, one or more of the cameras 128 can be externally facing. "Externally facing" means a camera that is oriented, positioned, configured, operable, and/or arranged to capture visual data from at least a portion of the external environment of the vehicle 100. The one or more cameras 128 can be located in any suitable portion of the vehicle 100. For instance, one or more of the cameras 128 can be located within the vehicle 100, and/or one or more of the cameras 128 can be located on the exterior of the vehicle 100.

One or more of the cameras 128 can be fixed in a position that does not change relative to the vehicle 100. Alternatively or additionally, one or more of the cameras 128 can be movable so that visual data can be captured from different portions of the external environment of the vehicle 100. Such cameras 128 can be movable in any suitable manner: for instance, they can be rotatable about one or more axes, pivotable about one or more axes, slidable, and/or extendable. In one or more arrangements, the movable cameras 128 can have any suitable range of motion, including, for example, substantially spherical, substantially hemispherical, substantially circular, and/or substantially linear.

One or more of the cameras 128 can have zoom-in and/or zoom-out capabilities. For instance, one or more of the cameras 128 can zoom in on an object detected in the external environment of the vehicle 100. If the detected object is a person (e.g., a pedestrian), one or more of the cameras 128 can zoom in on at least a portion of the person's body to capture visual data relating to non-verbal gestures.

In certain arrangements, one or more of the cameras 128 can zoom in on at least a portion of a person's body to capture visual data relating to verbal gestures. For instance, one or more of the cameras 128 can zoom in on a person's mouth to capture such visual data. "Verbal gesture" means a form of audible communication in which a person speaks or otherwise causes a sound to be made. Verbal gestures include any word, phrase, sentence, or other utterance made by a person. A verbal gesture may be used in combination with one or more non-verbal gestures.

The movement of one or more of the cameras 128 can be controlled by the camera system 127, the sensor system 125, the processor 110, and/or any of the modules 120, 121, 122.

The human signal recognition module 121 and/or the processor 110 can be configured to analyze visual data captured by the camera system 127 to detect and/or identify a non-verbal gesture made by a human (e.g., a human pedestrian or human driver) located in the external environment. The identification of a detected non-verbal human gesture can be performed in any suitable manner. For example, a captured non-verbal human gesture can be compared to the set of predefined human gestures in the gesture library 116 to determine whether there is a match.

The vehicle 100 can include an input system 130 for receiving input from a vehicle occupant (e.g., a driver or a passenger). Any suitable input system 130 can be used, including, for example, a keypad, a display, a touch screen, a multi-touch screen, a joystick, a trackball, a microphone, and/or combinations thereof.

The vehicle 100 can include an output system 135 for presenting information to the driver or a passenger. The output system 135 can include a display, as described above. Alternatively or additionally, the output system 135 can include an earphone, a microphone, and/or a speaker. Some components of the vehicle 100 may serve as both a component of the input system 130 and a component of the output system 135.

The vehicle 100 can include one or more microphones (not shown). The one or more microphones can be part of the sensor system 125. "Microphone" means any device, component, and/or system that can capture audial data. "Audial data" means any data/information that is perceptible to the human sense of hearing. The audial data can be in any suitable form. For instance, the sensor system 125 can capture verbal gestures in the form of audial data.

In one or more arrangements, one or more of the microphones can be externally facing. In this context, "externally facing" means a microphone that is oriented, positioned, configured, operable, and/or arranged to capture audial data from at least a portion of the external environment of the vehicle 100. The one or more microphones can be located in any suitable portion of the vehicle 100. For instance, one or more of the microphones can be located within the vehicle 100, and/or one or more of the microphones can be located on the exterior of the vehicle 100. In arrangements with a plurality of microphones, the microphones can be distributed about the vehicle 100 in any suitable manner, and a plurality of microphones can form a microphone array.

One or more of the microphones can be fixed in a position that does not change relative to the vehicle 100. Alternatively or additionally, one or more of the microphones can be movable so that audial data can be captured from different portions of the external environment of the vehicle 100. Such microphones can be movable in any suitable manner: for instance, they can be rotatable about one or more axes, pivotable about one or more axes, slidable, and/or extendable. The movable microphones can have any suitable range of motion. The one or more microphones and/or their movement can be controlled by the sensor system 125, the processor 110, and/or any of the modules 120, 121, 122.

The human signal recognition module 121 and/or the processor 110 can be configured to analyze audial data captured by the sensor system 125 (e.g., by the one or more microphones) and/or visual data captured by the sensor system 125 and/or the camera system 127 to detect and/or identify one or more verbal gestures made in the external environment. For instance, the human signal recognition module 121 and/or the processor 110 can analyze visual data of a person's lip movements to identify words spoken by a human in the external environment. Alternatively or additionally, the human signal recognition module 121 and/or the processor 110 can analyze audial data to identify words spoken by a person in the external environment and/or sounds generated by a human in that environment (e.g., the honking of another vehicle's horn). The identification of a detected verbal human gesture can be performed in any suitable manner. For example, a captured verbal gesture can be compared to a set of predefined human verbal gestures in a verbal library or dictionary (not shown) to determine whether there is a match.
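The final comparison step above (recognized words against a verbal library) can be sketched as a simple normalized lookup. This is an illustrative sketch only: the phrases, their associated meanings, and the whitespace/case normalization are assumptions, and a real system would sit downstream of an actual speech-recognition component.

```python
# Hypothetical verbal library: recognized phrase -> associated meaning.
# Entries are illustrative assumptions, not taken from the disclosure.
VERBAL_LIBRARY = {
    "go ahead": "yield_to_speaker",
    "after you": "yield_to_speaker",
    "wait": "hold_position",
}

def match_verbal_gesture(utterance, library=VERBAL_LIBRARY):
    """Normalize the recognized text (case, whitespace) and return its
    associated meaning, or None if the utterance is not in the library."""
    key = " ".join(utterance.lower().split())
    return library.get(key)
```

For example, a recognized utterance of "Go Ahead" (in any casing or spacing) would resolve to the same meaning, while an unrecognized phrase yields no match.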

In one or more arrangements, the human signal recognition module 121 can be operatively connected to one or more verbal libraries or dictionaries. The one or more verbal libraries or dictionaries can be included in one or more of the data stores 115, and can include a set of predefined human verbal gestures.

In one or more arrangements, the human signal recognition module 121 can include voice recognition technology to analyze, assess, identify, and/or interpret audial data captured from the external environment. The human signal recognition module 121 can include any suitable hardware and/or software to support voice recognition.

The vehicle 100 can include an external communication system 145. "External communication system" means one or more elements, one or more devices, one or more components, one or more systems, and/or combinations thereof that are used to communicate with one or more intended recipients in the external environment of the vehicle 100. More particularly, "external communication system" means one or more elements, devices, components, systems, and/or combinations thereof that are used to send directional communications to one or more intended recipients in the external environment of the vehicle 100. "Directional communication" means one or more messages sent to a region of the external environment that includes the location of an intended recipient. The message can be sent in any suitable form, including visual and/or audial forms.

The message can be of any suitable type. For instance, the message can relate to a future driving maneuver of the vehicle 100. Examples of future driving maneuvers include turning right, turning left, or continuing straight. The message can also include an instruction or indication related to a future driving maneuver; for example, the message can indicate that a pedestrian may cross the road before the vehicle 100 makes a turn onto that road.

The external communication system 145 can be controlled by the processor 110 and/or the autonomous driving module 120. In one or more arrangements, when the vehicle 100 is in the autonomous mode, the processor 110 and/or any of the modules 120, 121, 122 can control the external communication system 145 to communicate with one or more intended recipients in the external environment, such as by sending a directional message.

The external communication system 145 can include a visual communication system 146 and/or an audial communication system 150. "Visual communication system" means one or more elements, one or more devices, one or more components, one or more systems, and/or combinations thereof that are used to send directional visual messages to one or more intended recipients. "Visual message" means any communication that is perceptible to the human sense of sight. "Directional visual message" means a visual message sent to a region of the external environment of the vehicle in a direction that is substantially aligned with the location of an intended recipient in that environment. "Recipient" means any person located in the external environment; for example, the recipient can be a pedestrian, a bicyclist, and/or a driver or occupant of another vehicle.

The visual message can be a textual representation and/or a graphical representation. In one or more arrangements, the textual representation of the visual message can be brief and/or simple. Examples of such textual representations include “I See You,” “Go Ahead,” “Turning Right,” and “Stopping.” The graphical representation of the visual message can also be simple. For example, an arrow can indicate the direction in which the vehicle 100 will turn. As another example, a red stop sign can indicate that the vehicle 100 is stopping. In some arrangements, the visual message can include both textual and graphical representations.
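The selection of a textual and/or graphical representation for a given future driving maneuver can be sketched as follows. This is a minimal illustration only; the maneuver names, message strings, and glyphs are assumptions for the sketch, not part of the disclosure:

```python
# Hypothetical sketch: mapping a future driving maneuver to a brief
# textual and graphical visual message. All names and strings below are
# illustrative assumptions.

VISUAL_MESSAGES = {
    "turn_right": {"text": "Turning Right", "glyph": "→"},
    "turn_left": {"text": "Turning Left", "glyph": "←"},
    "stopping": {"text": "Stopping", "glyph": "STOP"},
    "yield_to_pedestrian": {"text": "Go Ahead", "glyph": "WALK"},
}

def select_visual_message(maneuver: str) -> dict:
    """Return the textual and graphical representation for a maneuver,
    falling back to a generic acknowledgement if none is defined."""
    return VISUAL_MESSAGES.get(maneuver, {"text": "I See You", "glyph": "•"})
```

A display 147 could then render the `text` field, the `glyph` field, or both, depending on the arrangement.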

“In one or more arrangements, the visual communication system 146 can include one or more displays 147, one or more projectors 148, and/or one or more humanoid figures 149. Each of these portions of the visual communication system 146 is described below. The one or more displays 147, the one or more projectors 148, and/or the one or more humanoid figures 149 can be controlled, at least in part, by the processor 110, the autonomous driving module 120, the human signal recognition module 121, and/or the external communication module 122.

“In one or more arrangements, the visual communication system 146 can include one or more displays 147. “Display” means a component or group of components that present visual data. The display 147 can be any suitable type of display, such as a liquid crystal display (LCD) or a light emitting diode (LED) display, just to name a few possibilities. The one or more displays 147 can have or be set to a high contrast output so that an intended recipient can perceive the visual message on the display 147 under various lighting conditions, including in bright sunlight.

“The one or more displays 147 can include an externally facing display. “Externally facing display” means a display that faces the external environment of the vehicle. The one or more displays 147 can be located in any suitable portion of the vehicle 100. For instance, one or more displays 147 can be located within the vehicle 100, and/or one or more displays 147 can be located on the exterior of the vehicle 100. As an example, one or more displays 147 can be located on a door, a fender panel, the trunk, the roof, a wing mirror, and/or any other exterior portion of the vehicle 100. A plurality of displays 147 can be provided and distributed about the vehicle 100 in any suitable manner.

“One or more of the displays 147 can be fixed in position relative to the vehicle 100. Alternatively or additionally, one or more of the displays 147 can be movable so that visual data can be selectively presented to different areas of the external environment of the vehicle 100. Such displays 147 can be movable in any suitable manner. For instance, the displays 147 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. In such arrangements, the displays 147 can have any suitable range of motion, including substantially spherical, substantially hemispherical, circular, and/or linear.

“FIG. 2 shows an arrangement in which a plurality of displays 147 are integrated into various body panels of the vehicle 100. FIG. 2 also shows an arrangement in which a display 147 is mounted on the roof of the vehicle 100. Such a display 147 can be rotatable about an axis 181 in any suitable manner. Alternatively or additionally, the display 147 can be movable between a deployed position (as shown in FIG. 2) and a stowed position in any suitable manner.

“In one or more arrangements, the visual communication system 146 can include one or more projectors 148. “Projector” means a component or group of components that can project data onto a surface. In one or more arrangements, the projector 148 can project visual data, such as a light, an image, and/or a video. The projector 148 can project the visual data onto any suitable surface in the external environment of the vehicle 100, such as a paved surface, a portion of a structure, or a sign. In one or more arrangements, the projector 148 can be a laser or other light emitting element, device, component, and/or system. The projector 148 can emit light at any suitable wavelength of the electromagnetic spectrum.

The one or more projectors 148 can be located in any suitable portion of the vehicle 100. For instance, one or more projectors 148 can be located within the vehicle 100, and/or one or more projectors 148 can be located on the exterior of the vehicle 100. As an example, one or more projectors 148 can be located on a door, a fender panel, the trunk, the roof, a wing mirror, and/or any other exterior portion of the vehicle 100. A plurality of projectors 148 can be provided and distributed about the vehicle 100 in any suitable manner.

“One or more of the projectors 148 can be fixed in position relative to the vehicle 100. Alternatively or additionally, one or more of the projectors 148 can be movable so that visual data can be selectively presented to different areas of the external environment of the vehicle 100. Such projectors 148 can be movable in any suitable manner. For instance, the projectors 148 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. In such arrangements, the projectors 148 can have any suitable range of motion, including substantially spherical, substantially hemispherical, circular, and/or linear.

“FIG. 3 shows an arrangement in which a projector 148 is mounted on the roof of the vehicle 100. The projector 148 can be rotatable about an axis (e.g., axis 182) in any suitable manner. Alternatively or additionally, the projector 148 can be movable between a deployed position and a stowed position in any suitable manner. The projector 148 can be moved to send a directional visual message to a person (e.g., person 301) in the external environment 300. In the arrangement shown in FIG. 3, the projected visual message is an arrow indicating a right-hand turn.
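The rotation needed to aim a roof-mounted, axis-rotatable element (a projector 148, display 147, or speaker 152) toward a located recipient can be sketched as below. The local planar coordinate frame and the function name are illustrative assumptions, not taken from the disclosure:

```python
import math

def aim_angle(vehicle_xy, vehicle_heading_deg, target_xy):
    """Rotation (degrees) about a vertical axis needed for a roof-mounted
    element to face target_xy, relative to the vehicle's current heading.
    Coordinates are in a local planar frame (e.g., meters east, meters
    north); heading 0 means facing north, positive angles are clockwise."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # bearing of target, 0 = north
    # Wrap the relative rotation into (-180, 180] so the element takes
    # the shorter way around its axis.
    return (bearing - vehicle_heading_deg + 180.0) % 360.0 - 180.0
```

A negative result would command a counter-clockwise rotation about the axis, a positive result a clockwise one.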

“In one or more arrangements, the visual communication system 146 can include one or more humanoid figures 149. “Humanoid figure” means a controllable object that has at least some resemblance to at least a portion of a human body. For instance, the humanoid figure 149 can resemble an entire human body, a human body from the waist up, or a human body from the chest up. In one or more arrangements, the humanoid figure 149 can have a lifelike, natural, and/or realistic appearance. The appearance of the humanoid figure 149 can vary from one arrangement to another, and the humanoid figure 149 may or may not be anatomically correct. In one or more arrangements, the humanoid figure 149 can look substantially like a human being. FIG. 4 shows an arrangement in which the humanoid figure 149 does not look like a human being but includes several features associated with a human being (e.g., a head 405, an arm 410, a hand 415, and/or one or more fingers 417).

“In one or more arrangements, the humanoid figure 149 can be at least partially electro-mechanical and/or mechanical. In one or more arrangements, the humanoid figure 149 can be a robot. In one or more arrangements, the humanoid figure 149 can be a controllable mannequin. In one or more arrangements, the humanoid figure 149 can be a controllable toy.

“The one or more humanoid figures 149 can be configured to imitate one or more human gestures or movements. For instance, one or more humanoid figures 149 can be configured to point one or more fingers, make a sign with the fingers and/or hand (e.g., an OK sign, a thumbs up sign, and/or a stop sign), nod the head, and/or wave a hand, just to name a few possibilities. The humanoid figure 149 can be articulable in any suitable manner to perform a desired gesture.

“The one or more humanoid figures 149 can be located in any suitable portion of the vehicle 100 such that they can be seen by an intended recipient in the external environment. For instance, one or more humanoid figures 149 can be located within the vehicle 100, such as in the front passenger compartment. Alternatively or additionally, one or more humanoid figures 149 can be located on the exterior of the vehicle 100, such as on a door, a fender panel, the trunk, the roof, a wing mirror, and/or any other exterior portion of the vehicle 100.

“FIG. 4 shows an arrangement in which a humanoid figure 149 is located in a front passenger area 400 of the vehicle 100. In one or more arrangements, an upper body portion 420 of the humanoid figure 149 can be movable (e.g., rotatable about an axis) to face the appropriate side of the vehicle 100. Alternatively or additionally, the humanoid figure 149 can be controlled so that a head 405, an arm 410, a hand 415, and/or one or more fingers 417 are moved to send a visual message, such as waving a hand as shown in FIG. 4. A window of the vehicle 100 can be opened so that the intended recipient can see the humanoid figure 149 from outside the vehicle.

“As noted above, the external communication system 145 can also include an audial communication system 150. “Audial communication system” means one or more elements, devices, components, systems, or combinations thereof configured to send a directional audial message to one or more intended recipients (e.g., a pedestrian or a driver) in the external environment of an autonomous vehicle. “Audial message” means any communication that is perceptible to the human sense of hearing.

“In some arrangements, the audial message can be brief and/or simple. Examples of such audial messages include “I See You,” “Go Ahead,” and “Turning Right.” While the audial message can form words, it can alternatively be one or more sounds or groups of sounds.

“The audial communication system 150 can include one or more speakers 152. “Speaker” means one or more elements, devices, components, systems, and/or combinations thereof that produce sound in response to an audio signal input. Examples of speakers include electroacoustic transducers, sound chips, and sound cards. Each speaker can have one or more audio output channels (not shown) operatively connected thereto. “Audio output channel” means any suitable device, component, or structure for carrying audio signals.

“In one or more arrangements, the one or more speakers 152 can be configured as directional sound speakers. Directional sound speakers are configured to produce sound fields that spread less than those of conventional speakers. Such speakers can be beneficial because they allow the message to be delivered more directly to the intended recipient while reducing the possibility of the message being misinterpreted by unintended recipients.
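With a plurality of speakers 152 distributed about the vehicle, choosing which directional speaker to drive for a given recipient reduces to picking the speaker whose mounting orientation is closest to the recipient's bearing. A minimal sketch, assuming speakers are described by their mounting bearings in the vehicle frame (an illustrative representation, not from the disclosure):

```python
def pick_speaker(speaker_bearings, target_bearing):
    """Return the index of the speaker whose mounting bearing (degrees,
    vehicle frame) is closest to the bearing of the intended recipient."""
    def angular_diff(a, b):
        # Smallest absolute difference between two bearings, in [0, 180].
        return abs((a - b + 180.0) % 360.0 - 180.0)
    return min(range(len(speaker_bearings)),
               key=lambda i: angular_diff(speaker_bearings[i], target_bearing))
```

For instance, with speakers mounted at the front, right, rear, and left of the vehicle, a recipient just aft of the right side would be served by the right-side speaker.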

“The one or more speakers 152 can be configured to provide audial data to one or more recipients in the external environment of an autonomous vehicle. The one or more speakers 152 can be located in any suitable portion of the vehicle 100. For instance, one or more speakers 152 can be located within the vehicle 100, and/or one or more speakers 152 can be located on the exterior of the vehicle 100. As an example, one or more speakers 152 can be located on a door, a fender panel, the trunk, the roof, a wing mirror, and/or any other exterior portion of the vehicle 100. A plurality of speakers 152 can be provided and distributed about the vehicle 100 in any suitable manner.

“One or more of the speakers 152 can be fixed in position relative to the vehicle 100. Alternatively or additionally, one or more of the speakers 152 can be movable so that audial data can be selectively presented to different areas of the external environment of the vehicle 100. Such speakers 152 can be movable in any suitable manner. For instance, the speakers 152 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. In such arrangements, the speakers 152 can have any suitable range of motion, including substantially spherical, substantially hemispherical, circular, and/or linear.

“FIG. 5 shows an arrangement in which a plurality of speakers 152 are integrated into various body panels of the vehicle 100. FIG. 5 also shows an arrangement in which a speaker 152 is mounted on the roof of the vehicle 100. Such a speaker 152 can be rotatable about an axis (e.g., axis 181) in any suitable manner.

“The vehicle 100 can include one or more vehicle systems 160. Several examples of the one or more vehicle systems 160 are shown in FIG. 1. The vehicle 100 can include more, fewer, or different systems. It should be noted that, although particular vehicle systems are defined separately, each or any of the systems or portions thereof can be combined or segregated via hardware and/or software within the vehicle 100.

The vehicle 100 can include a propulsion system 162. The propulsion system 162 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to provide powered motion for the vehicle 100. The propulsion system 162 can include an engine and an energy source.

The engine can be any suitable type of engine or motor, now known or later developed. For instance, the engine can be an internal combustion engine, an electric motor, or a Stirling engine, just to name a few possibilities. In some instances, the propulsion system can include multiple types of engines. For example, a gas-electric hybrid vehicle can include a gasoline engine and an electric motor.

The energy source can be any suitable source of energy that can be used to at least partially power the engine. The engine can be configured to convert the energy source into mechanical energy. Examples of energy sources include gasoline, diesel, and propane. Alternatively or additionally, the energy source can include fuel tanks, batteries, capacitors, and/or flywheels. In some embodiments, the energy source can also provide energy for other systems of the vehicle 100.

“The vehicle 100 can include wheels, tires, and/or tracks. Any suitable type of wheels, tires, and/or tracks can be used. The wheels, tires, and/or tracks of the vehicle 100 can be configured to rotate differentially with respect to other wheels, tires, and/or tracks of the vehicle 100. The wheels, tires, and/or tracks can be made of any suitable material.

“The vehicle 100 can include a braking system 164. The braking system 164 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to decelerate the vehicle 100. As an example, the braking system 164 can use friction to slow the wheels/tires. Alternatively or additionally, the braking system 164 can convert the kinetic energy of the wheels/tires into an electric current.

“The vehicle 100 can include a steering system 166. The steering system 166 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to adjust the heading of the vehicle 100.

The vehicle 100 can include a throttle system 168. The throttle system 168 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to control the operating speed of the engine/motor of the vehicle 100.

“The vehicle 100 can include a transmission system 170. The transmission system 170 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to transmit mechanical power from the engine/motor of the vehicle 100 to the wheels/tires. For instance, the transmission system 170 can include a gearbox, a clutch, a differential, and/or drive shafts. If the transmission system 170 includes drive shafts, the drive shafts can include one or more axles configured to be coupled to the wheels/tires.

The vehicle 100 can include a signaling system 172. The signaling system 172 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to provide illumination for the driver of the vehicle 100 and/or to provide information about one or more aspects of the vehicle 100. For instance, the signaling system 172 can provide information about the vehicle’s presence, position, size, and direction of travel, and/or the driver’s intentions regarding direction and speed of travel. The signaling system 172 can include headlights, taillights, brake lights, hazard lights, and turn signal lights.

“The vehicle 100 can include a navigation system 174. The navigation system 174 can include one or more mechanisms, devices, elements, components, systems, applications, and/or combinations thereof, now known or later developed, configured to determine the geographic location of the vehicle 100 and/or to determine a travel route for the vehicle 100.

“The navigation system 174 can include one or more mapping applications to determine a travel route for the vehicle 100. For instance, a driver or passenger can input an origin and a destination, and the mapping application can determine one or more suitable travel routes between the origin and the destination. A travel route can be selected based on one or more parameters, such as the shortest travel distance or the shortest travel time, just to name a few possibilities. In some arrangements, the navigation system 174 can be configured to update the travel route dynamically while the vehicle 100 is in operation.

“The navigation system 174 can include a global positioning system, a local positioning system, or a geolocation system. The navigation system 174 can be implemented with any one of a number of satellite positioning systems, such as the United States Global Positioning System (GPS), the Russian GLONASS system, the European Galileo system, the Chinese COMPASS system, the Indian Regional Navigational Satellite System, any system that uses satellites from a combination of satellite systems, or any satellite system developed in the future. Further, the navigation system 174 can use Transmission Control Protocol (TCP) and/or a Geographic Information System (GIS) and location services.

“The navigation system 174 can include a transceiver configured to estimate the position of the vehicle 100 with respect to the Earth. For example, a GPS transceiver can determine the vehicle’s latitude, longitude, and/or altitude. The navigation system 174 can also use other systems (e.g., laser-based localization systems and/or inertial-aided GPS) to determine the location of the vehicle 100.
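Given two GPS fixes (e.g., the vehicle's position and a previously reported position of a located person), the great-circle distance between them can be computed with the standard haversine formula. This is a generic geodesy sketch, not something specified in the disclosure:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude
    fixes, using the haversine formula and a spherical Earth model."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

One degree of latitude corresponds to roughly 111 km, which provides a quick sanity check on the result.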

“Alternatively or additionally, the navigation system 174 can be based on access point geolocation services, such as those using the W3C Geolocation Application Programming Interface (API). With such a system, the location of the vehicle 100 can be determined by consulting location information servers using, for example, an Internet protocol (IP) address, a Wi-Fi or Bluetooth Media Access Control (MAC) address, radio-frequency identification (RFID), a Wi-Fi connection location, or device GPS and Global System for Mobile Communications (GSM)/code division multiple access (CDMA) cell IDs. It will be understood that the manner in which the geographic location of the vehicle 100 is determined will depend on how the particular location tracking system operates.

“The processor 110 and/or the autonomous driving module 120 can be operatively connected to the vehicle systems 160 and/or individual components thereof to allow communication with them. Referring to FIG. 1, the processor 110 and/or the autonomous driving module 120 can be in communication to send and/or receive information from the various vehicle systems 160 to control the movement, speed, maneuvering, heading, direction, etc. of the vehicle 100. The processor 110 and/or the autonomous driving module 120 can control some or all of these vehicle systems 160, and thus the vehicle 100 can be partially or fully autonomous.

“The processor 110 and/or the autonomous driving module 120 can be operable to control the navigation and/or maneuvering of the vehicle 100 by controlling one or more of the vehicle systems 160 and/or components thereof. For instance, when operating in an autonomous mode, the processor 110 and/or the autonomous driving module 120 can control the direction and/or speed of the vehicle 100. The processor 110 and/or the autonomous driving module 120 can cause the vehicle 100 to accelerate (e.g., by increasing the supply of fuel to the engine), decelerate (e.g., by decreasing the supply of fuel to the engine and/or by applying the brakes), and/or change direction (e.g., by turning the front wheels). As used herein, “cause” or “causing” means to make, force, compel, direct, command, instruct, and/or enable an event or action to occur, or at least be in a state where such event or action may occur.
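The speed-control relationship described above can be sketched as a simple decision function that maps the current and target speeds to throttle and brake commands. The command names and the dictionary interface are illustrative assumptions, not the disclosed control interface:

```python
def speed_command(current_mps, target_mps, tol=0.1):
    """Return a hypothetical actuator command that moves the vehicle's
    speed toward target_mps: increase fuel to accelerate, decrease fuel
    and apply brakes to decelerate, or hold when within tolerance."""
    if target_mps > current_mps + tol:
        return {"throttle": "increase_fuel", "brake": "release"}
    if target_mps < current_mps - tol:
        return {"throttle": "decrease_fuel", "brake": "apply"}
    return {"throttle": "hold", "brake": "hold"}
```

In a real controller these discrete commands would be replaced by continuous setpoints sent to the throttle system 168 and braking system 164 via the actuators 140.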

The vehicle 100 can include one or more actuators 140. The actuators 140 can be any element or combination of elements operable to modify, adjust, and/or alter one or more of the vehicle systems 160 or components thereof responsive to receiving signals or other inputs from the processor 110 and/or the autonomous driving module 120. Any suitable actuator can be used. For instance, the one or more actuators 140 can include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and/or piezoelectric actuators, just to name a few possibilities.

According to arrangements herein, the vehicle 100 can be configured for interaction between the vehicle 100 and the external environment. For instance, the vehicle 100 can be configured to activate at least a portion of the external communication system 145 to send a directional message to a human recipient in the external environment. Alternatively or additionally, the vehicle 100 can be configured to recognize non-verbal human signals and to take a driving action in response.

“Now that the various potential systems, elements, and/or components of the vehicle 100 have been described, various methods of interaction between an autonomous vehicle and the external environment will now be described. Referring to FIG. 6, an example of a method of interaction with an external environment for an autonomous vehicle is shown. Various possible steps of method 600 will now be described. The method 600 illustrated in FIG. 6 may be applicable to the embodiments described above in relation to FIGS. 1-5, but it is understood that the method 600 can be carried out with other suitable systems and arrangements. Moreover, the method 600 may include other steps that are not shown here, and the method 600 is not limited to including every step shown in FIG. 6. The steps illustrated as part of the method 600 are also not limited to this particular chronological order; some of the steps may be performed in a different order than shown, and/or at least some of the steps may occur simultaneously.

“At block 605, the external environment of the vehicle 100 can be detected to identify one or more people therein. “Identify one or more people” means that a detected object is determined to be a human being. The external environment can be detected by one or more sensors, such as the environment sensor(s) 126 and/or the camera system 127. In some cases, the detection of the environment can be continuous or at any suitable interval. The identification and/or detection of people can be performed in any suitable manner using one or more human recognition technologies. Examples of human recognition technologies include facial recognition (e.g., face detection), body recognition, and/or iris recognition. A human can also be recognized using computer vision, template matching, or any other visual data processing technology.
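The decision step of block 605 — deciding which detected objects count as identified people — can be sketched as a filter over sensor detections. The detection record fields and the confidence threshold are hypothetical, standing in for whatever the underlying recognition technology reports:

```python
# Hypothetical sketch of block 605: filtering sensor detections down to
# those identified as people. The "class" and "confidence" fields are
# assumed outputs of a recognition pipeline, not from the disclosure.

def identify_people(detections, threshold=0.8):
    """Return the detections classified as human with at least the given
    confidence (e.g., from facial, body, or iris recognition)."""
    return [d for d in detections
            if d.get("class") == "human" and d.get("confidence", 0.0) >= threshold]
```

Detections below the threshold are dropped rather than acted upon, which biases the system toward not sending directional messages to misclassified objects.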

“In one or more arrangements, the presence of a person can be assumed without directly identifying or detecting the person. For instance, if it is determined that another vehicle is not an autonomous vehicle, or is an autonomous vehicle operating in a manual mode, then the vehicle 100 can assume that the other vehicle has a human driver. As an example, the vehicle 100 can send a request to the other vehicle inquiring about its operational mode. If no response is received (e.g., after a predetermined amount of time), it can be assumed that the other vehicle has a human driver. Likewise, if a response is received indicating that the other vehicle is operating in a manual mode, it can be assumed that the other vehicle has a human driver.
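The query-and-timeout assumption described above can be sketched as follows. The response format and the mode strings are illustrative assumptions for a hypothetical vehicle-to-vehicle query, not a defined protocol:

```python
def assume_human_driver(response, timeout_elapsed):
    """Decide whether to assume the other vehicle has a human driver.
    `response` is the reply to a hypothetical operational-mode query
    (a dict with a "mode" key), or None if no reply has been received;
    `timeout_elapsed` is True once the predetermined waiting period has
    passed with no reply."""
    if response is None:
        # No reply within the predetermined window: assume a human driver.
        return timeout_elapsed
    # A reply indicating manual mode also implies a human driver.
    return response.get("mode") == "manual"
```

Note the conservative default: silence is treated the same as an explicit "manual" reply once the timeout expires.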

Summary for “Autonomous vehicle interaction and external environment”

There are many ways that a human driver can interact with the vehicle’s environment. The driver might communicate with pedestrians or other drivers using non-verbal gestures. Drivers may wave their hands or use other gestures to communicate with pedestrians and other drivers. Drivers may also nod their heads. These gestures are used to communicate different emotions, request information or indicate an action. A non-verbal gesture, for example, can be used to signal the driver’s intention to go in a certain direction or to let another driver or pedestrian take a particular action. Other pedestrians or human drivers in the environment can communicate with the driver using non-verbal gestures. These gestures are used to communicate various things. For example, they can indicate that the vehicle driver should continue with the next maneuver or request that the driver take a particular action.

“In one respect, the present disclosure is directed to a method of interaction between an autonomous vehicle and an external environment. The method can include detecting the external environment of the autonomous vehicle to identify a person therein. The method can also include locating the identified person. The method can further include determining whether the identified person is potentially related to a future driving maneuver of the autonomous vehicle. Responsive to determining that the identified person is potentially related to the future driving maneuver, the method can include sending a directional message from the autonomous vehicle.

“In another respect, the present disclosure is directed to a system of interaction between an autonomous vehicle and an external environment. The system can include an autonomous vehicle, and the autonomous vehicle can include a sensor system. The sensor system can detect the external environment of the autonomous vehicle to identify a person therein. The sensor system can also locate the identified person.

“The system can also include a processor onboard the vehicle. The processor can be programmed to initiate executable operations. The executable operations can include determining whether the identified person is potentially related to a future driving maneuver of the autonomous vehicle. Responsive to determining that the identified person is potentially related to the future driving maneuver, the executable operations can include sending a directional message from the autonomous vehicle.

“In another respect, the present disclosure is directed to a computer program product for interaction between an autonomous vehicle and an external environment. The computer program product includes a computer-readable storage medium having program code embodied therewith. The program code is executable by a processor to perform a method. The method can include detecting the external environment of the autonomous vehicle to identify a person therein. The method can also include locating the identified person. The method can further include determining whether the identified person is potentially related to a future driving maneuver of the autonomous vehicle. Responsive to determining that the identified person is potentially related to the future driving maneuver, the method can include sending a directional message from the autonomous vehicle.

“In another respect, the present disclosure is directed to a method of interaction between an autonomous vehicle and an external environment. The method can include detecting a non-verbal gesture made by a human in the external environment. The method can also include identifying the non-verbal gesture. The method can further include determining a future driving maneuver based at least in part on the identified non-verbal human gesture. The method can include causing the autonomous vehicle to implement the future driving maneuver.

“In a further respect, the present disclosure is directed to a system of interaction between an autonomous vehicle and an external environment. The system can include an autonomous vehicle, and the autonomous vehicle can include a sensor system configured to detect a non-verbal gesture made by a human in the external environment.

“The system can also include a processor. The processor can be programmed to initiate executable operations. The executable operations can include identifying the non-verbal gesture. The executable operations can also include determining a future driving maneuver based at least in part on the non-verbal human gesture. The executable operations can further include causing the autonomous vehicle to implement the future driving maneuver.

“In yet another respect, the present disclosure is directed to a computer program product for interaction between an autonomous vehicle and an external environment. The computer program product includes a computer-readable storage medium having program code embodied therewith. The program code is executable by a processor to perform a method. The method can include detecting a non-verbal gesture made by a human in the external environment. The method can also include identifying the non-verbal gesture. The method can further include determining a future driving maneuver based at least in part on the non-verbal human gesture. The method can include causing the autonomous vehicle to implement the future driving maneuver.

This detailed description relates to the interaction between an autonomous vehicle and its external environment. Such interaction can include the autonomous vehicle sending a message to one or more intended recipients in the external environment. Alternatively or additionally, such interaction can include the autonomous vehicle determining one or more future driving maneuvers based, at least in part, on non-verbal gestures detected in the external environment. This detailed description relates to systems, methods, and computer program products that incorporate such features. In at least some instances, such systems, methods, and computer program products can improve the safety and/or performance of an autonomous vehicle.

“Detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are intended only as exemplary. Specific structural and functional details are therefore not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art how to employ the features herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of possible implementations. Various embodiments are shown in FIGS. 1-7, but the embodiments are not limited to the illustrated structure or application.”

It will be appreciated that, for simplicity and clarity of illustration, reference numerals have been repeated where appropriate to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, those of ordinary skill in the art will understand that the embodiments described herein can be practiced without these specific details.

Referring to FIG. 1, an example of a vehicle 100 is shown. As used herein, “vehicle” means any form of motorized transport. In one or more implementations, the vehicle 100 can be an automobile. While arrangements are described herein with respect to automobiles, it will be understood that embodiments are not limited to automobiles. In one or more implementations, the vehicle 100 can be any other suitable form of motorized transport, such as a watercraft or an aircraft.

“According to arrangements herein, the vehicle 100 can be an autonomous vehicle. As used herein, “autonomous vehicle” means a vehicle that is configured to operate in an autonomous mode. “Autonomous mode” means that one or more computing systems are used to navigate and/or maneuver the vehicle along a travel route with minimal or no input from a human driver. In one or more arrangements, the vehicle 100 can be fully automated. The vehicle 100 can also be configured to switch between an autonomous mode and a manual mode. Such switching can be implemented in any suitable manner, now known or later developed. “Manual mode” means that a majority of the navigation and/or maneuvering of the vehicle along a travel route is performed by a human driver.

“The vehicle 100 can have an associated longitudinal axis 101, which can be the central axis of the vehicle 100. The vehicle 100 can also have an associated longitudinal direction. “Longitudinal direction” means any direction that is substantially parallel and/or co-linear with the longitudinal axis 101.

The vehicle 100 can include various elements, some of which may be part of an autonomous driving system. Some of the possible elements of the vehicle 100 are shown in FIG. 1 and will be described below. It will be understood that it is not necessary for the vehicle 100 to have all of the elements shown in FIG. 1 or described herein. The vehicle 100 can have any combination of the various elements shown in FIG. 1. Further, the vehicle 100 can have additional elements to those shown in FIG. 1, or it may not include one or more of the elements shown in FIG. 1. While the various elements are shown as being located within the vehicle 100 in FIG. 1, it will be understood that one or more of these elements can be located external to the vehicle 100. Further, the elements shown may be physically separated by large distances.

The vehicle 100 can include one or more processors 110. "Processor" means any component or group of components that are configured to execute any of the processes described herein, or any form of instructions to carry out such processes or cause such processes to be performed. The processor 110 may be implemented with one or more general-purpose and/or one or more special-purpose processors. Examples of suitable processors include microprocessors, microcontrollers, and other circuitry that can execute software. Further examples of suitable processors include, but are not limited to, a central processing unit (CPU), an array processor, a vector processor, a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA), an application specific integrated circuit (ASIC), programmable logic circuitry, and a controller. The processor 110 can include at least one hardware circuit (e.g., an integrated circuit) configured to carry out instructions contained in program code. In arrangements in which there is a plurality of processors 110, such processors can work independently from each other, or one or more processors can work in combination with each other. In one or more arrangements, the processor 110 can be a main processor of the vehicle 100. For instance, the processor 110 can be an engine control unit (ECU).

The vehicle 100 can include one or more data stores 115 for storing one or more types of data. The data store 115 can include volatile and/or non-volatile memory. Examples of suitable data stores 115 include RAM (Random Access Memory), flash memory, ROM (Read Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives, or any other suitable storage medium, or any combination thereof. The data store 115 can be a component of the processor 110, or the data store 115 can be operatively connected to the processor 110 for use thereby. The term "operatively connected," as used throughout this description, can include direct or indirect connections, including connections without direct physical contact.

The vehicle 100 can include an autonomous driving module 120. The autonomous driving module 120 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The autonomous driving module 120 can be a component of the processor 110, or the autonomous driving module 120 can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The autonomous driving module 120 can include instructions (e.g., program logic) executable by the processor 110. Such instructions can include instructions to execute various vehicle functions and/or to transmit data to, receive data from, interact with, and/or control the vehicle 100 or one or more systems thereof (e.g., one or more of the vehicle systems 160). Alternatively or in addition, the data store 115 may contain such instructions.

The vehicle 100 can include a human signal recognition module 121. The human signal recognition module 121 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The human signal recognition module 121 can be a component of the processor 110, or the human signal recognition module 121 can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The human signal recognition module 121 can be configured to detect, analyze, assess, and/or interpret non-verbal gestures made by a human being to determine their meaning. The human signal recognition module 121 can also be configured to detect, analyze, assess, and/or interpret verbal gestures made by a human being to determine their meaning. The human signal recognition module 121 can include instructions (e.g., program logic) executable by the processor 110. Such instructions can include instructions to analyze non-verbal and/or verbal human gestures to determine their meaning. Alternatively or in addition, the data store 115 may contain such instructions.

In one or more arrangements, the human signal recognition module 121 can be operatively connected to one or more gesture libraries 116. The one or more gesture libraries 116 can be included in one or more of the data stores 115. The gesture library 116 can include a set of predefined human gestures. "Gesture" means a form of non-verbal communication in which a message is communicated by visible human bodily actions and/or movements. Gestures can be used in place of or in combination with speech or other vocal communication. Gestures include movements of the hands, fingers, and arms, as well as movements of the head, face, eyes, lips, mouth, or other parts of the body. "Predefined human gesture" means a human gesture that has a meaning associated with it. "Set of predefined human gestures" means one or more predefined human gestures. Each of the predefined human gestures in the gesture library 116 can have an associated meaning.

The identification of non-verbal human gestures can be performed in any suitable manner. For instance, a non-verbal human gesture can be identified using computer vision and/or image processing techniques. In one or more arrangements, the human signal recognition module 121 can compare a detected non-verbal human gesture to the set of predefined human gestures in the gesture library 116 to determine whether there is a match. "Match" or "matches" means that the detected non-verbal human gesture and one of the predefined human gestures in the set of predefined human gestures are identical. In some embodiments, "match" or "matches" can also mean that the detected non-verbal human gesture and one of the predefined human gestures in the set of predefined human gestures are substantially identical. For instance, the detected non-verbal human gesture and one of the predefined human gestures in the set can match within a predetermined probability (e.g., at least about 85%, at least about 90%, or greater) or confidence level.
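The patent describes this matching only in functional terms. As an illustrative sketch (not part of the disclosed arrangements), the comparison against the gesture library 116 with a predetermined confidence threshold could look like the following; the gesture feature vectors and the cosine-similarity measure are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical gesture descriptor: a feature vector that might be
# extracted from visual data (e.g., pose keypoints or joint angles).
@dataclass
class PredefinedGesture:
    name: str
    meaning: str
    features: tuple

def similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def match_gesture(detected_features, gesture_library, threshold=0.85):
    """Return the best-matching predefined gesture from the library,
    or None if no candidate meets the predetermined confidence level
    (here, at least about 85%)."""
    best, best_score = None, 0.0
    for gesture in gesture_library:
        score = similarity(detected_features, gesture.features)
        if score > best_score:
            best, best_score = gesture, score
    return best if best_score >= threshold else None
```

A detected gesture that is substantially identical to a library entry clears the threshold and returns that entry; an unrelated gesture returns no match.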

The vehicle 100 can include an external communication module 122. The external communication module 122 can be implemented as computer-readable program code that, when executed by a processor, implements the various processes described herein. The external communication module 122 can be a component of the processor 110, or the external communication module 122 can be executed on and/or distributed among other processing systems to which the processor 110 is operatively connected.

The external communication module 122 can be configured to determine the appropriate direction in which to send a message to one or more intended recipients (e.g., pedestrians, bicyclists, and/or human drivers) located in the external environment of the vehicle 100. Such messages can relate to, for example, a future driving maneuver of the vehicle 100. The external communication module 122 can include instructions (e.g., program logic) executable by the processor 110. Such instructions can include instructions to determine the appropriate mode of directional communication, such as visual communication and/or audial communication. Alternatively or in addition, the data store 115 may contain such instructions.
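Determining the direction of a directional communication reduces to computing the recipient's bearing relative to the vehicle's longitudinal axis. A minimal sketch of that geometry, assuming planar (x, y) positions and a heading in degrees (none of which are specified in the disclosure), might be:

```python
import math

def bearing_to_recipient(vehicle_pos, vehicle_heading_deg, recipient_pos):
    """Bearing from the vehicle to a recipient, relative to the
    vehicle's longitudinal axis, in degrees in (-180, 180].
    0 means straight ahead; negative values are to the right.
    Positions are (x, y) in a common ground-plane frame."""
    dx = recipient_pos[0] - vehicle_pos[0]
    dy = recipient_pos[1] - vehicle_pos[1]
    absolute = math.degrees(math.atan2(dy, dx))
    # Wrap the relative angle into (-180, 180].
    return (absolute - vehicle_heading_deg + 180.0) % 360.0 - 180.0
```

The resulting relative bearing could then be used to aim a display, projector, or speaker toward the recipient's location.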

The vehicle 100 can include a sensor system 125. The sensor system 125 can include one or more sensors. "Sensor" means any device, component, and/or system that can detect, determine, assess, measure, quantify, and/or sense something. The one or more sensors can be configured to detect, determine, assess, measure, quantify, and/or sense in real-time. As used herein, "real-time" means a level of processing responsiveness that a user or system senses as sufficiently immediate for a particular process or determination to be made, or that enables the processor to keep up with some external process.

In arrangements in which the sensor system 125 includes a plurality of sensors, the sensors can work independently from each other, or two or more of the sensors can work in combination with each other. The sensor system 125 can be operatively connected to the processor 110, the data store 115, the autonomous driving module 120, and/or other elements of the vehicle 100.

The sensor system 125 can include any suitable type of sensor. For example, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense information about the vehicle 100. Alternatively or in addition, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense information about the external environment in which the vehicle 100 is located, including information about objects in such environment. Such objects can be stationary or moving. Alternatively or in addition, the sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense the location of the vehicle 100. Various examples of these and other types of sensors will be described herein. It will be understood that the embodiments are not limited to the particular sensors described.

The sensor system 125 can include one or more sensors configured to detect, determine, assess, measure, quantify, and/or sense position and orientation changes of the vehicle 100, such as, for example, based on inertial acceleration. In one or more arrangements, the sensor system 125 can include accelerometers, GPS sensors, and/or other suitable sensors. The sensor system 125 can include sensors that can monitor one or more internal systems of the vehicle 100 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature sensor, a coolant temperature sensor, etc.).

The sensor system 125 can include one or more environment sensors 126. The environment sensors 126 can be configured to detect, determine, assess, measure, quantify, and/or sense objects in at least a portion of the external environment of the vehicle 100. Various examples of the environment sensors 126 will be described herein. However, it will be understood that the embodiments are not limited to the particular sensors described.

In one or more arrangements, one or more of the environment sensors 126 can use, at least in part, radio signals (e.g., RADAR-based sensors). The one or more radio-based sensors can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, a lateral direction, and/or other direction(s)).

In one or more arrangements, one or more of the environment sensors 126 can use, at least in part, lasers. For instance, one or more of the environment sensors 126 can be, or can be included as part of, a laser rangefinder or a LIDAR. Such devices can include a laser source and/or laser scanner configured to emit a laser, and a detector configured to detect reflections of the laser. The laser rangefinder or LIDAR can be configured to operate in a coherent or an incoherent detection mode. The one or more laser-based sensors can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, a lateral direction, and/or other direction(s)).

In one or more arrangements, one or more of the environment sensors 126 can use, at least in part, ultrasound. Such sensors can include an ultrasound source configured to emit ultrasonic signals and a detector configured to detect reflections of the ultrasonic signals. The one or more ultrasound-based environment sensors 126 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the presence of one or more objects in the external environment of the vehicle 100, the position of each detected object relative to the vehicle 100, and/or the distance between each detected object and the vehicle 100 in one or more directions (e.g., in the longitudinal direction, a lateral direction, and/or other direction(s)). Such detecting can be based on one or more characteristics (e.g., the intensity) of a reflected ultrasonic signal.
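The RADAR-, laser-, and ultrasound-based sensors above all determine distance from a reflected signal. As a simple illustration of the underlying time-of-flight principle (the disclosure does not specify any particular computation), the one-way range is half the round-trip path traveled by the pulse:

```python
# Approximate speed of sound in air at 20 °C, in meters per second.
SPEED_OF_SOUND_M_S = 343.0

def ultrasonic_range(echo_delay_s, speed_of_sound=SPEED_OF_SOUND_M_S):
    """Estimate the distance to an object from the elapsed time between
    emitting an ultrasonic pulse and detecting its reflection. The pulse
    travels to the object and back, so the one-way distance is half the
    round-trip path: d = v * t / 2."""
    if echo_delay_s < 0:
        raise ValueError("echo delay must be non-negative")
    return speed_of_sound * echo_delay_s / 2.0
```

For example, an echo delay of 20 ms corresponds to an object roughly 3.43 m away. RADAR and LIDAR use the same relation with the speed of light in place of the speed of sound.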

In some arrangements, the sensor system 125, the processor 110, and/or one or more of the modules 120, 121, 122 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, one or more aspects, characteristics, and/or properties of a detected object. For example, the sensor system 125, the processor 110, and/or one or more of the modules 120, 121, 122 can be configured to detect, determine, assess, measure, quantify, and/or sense, directly or indirectly, the size, relative size, length, width, height, movement, and/or other characteristics of a detected object.

The sensor system 125 can include other types of sensors. The sensor system 125, the processor 110, and/or one or more of the modules 120, 121, 122 can be operable to control movements of one or more of the sensors of the sensor system 125. It should be noted that any of the sensors described herein can be provided in any suitable location with respect to the vehicle 100. For instance, one or more sensors can be located within the vehicle 100, and one or more sensors can be located on or exposed to the exterior of the vehicle 100.

The vehicle 100 can include a camera system 127. In one or more arrangements, the camera system 127 can be a part of the sensor system 125. The camera system 127 can include one or more cameras 128. "Camera" means any device, component, and/or system that can capture visual data. "Visual data" includes video and/or image information/data. The visual data can be in any suitable form.

In one or more arrangements, one or more of the cameras 128 can include a lens (not shown) and an image capture element (not shown). The image capture element can be any suitable type of image capturing device or system, including, for example, an area array sensor, a charge coupled device (CCD) sensor, a complementary metal oxide semiconductor (CMOS) sensor, or a linear array sensor, just to name a few possibilities. The image capture element can capture images in any suitable wavelength on the electromagnetic spectrum. The image capture element can capture color images and/or grayscale images.

In one or more arrangements, one or more of the cameras 128 can be externally facing. "Externally facing" means a camera that is oriented, positioned, configured, operable, and/or arranged to capture visual data from at least a portion of the external environment of the vehicle 100. The one or more cameras 128 can be located in any suitable portion of the vehicle 100. For instance, one or more of the cameras 128 can be located within the vehicle 100, one or more of the cameras 128 can be located on the exterior of the vehicle 100, and/or one or more of the cameras 128 can be located so as to be exposed to the exterior of the vehicle 100.

The position of one or more of the cameras 128 can be fixed such that its position does not change relative to the vehicle 100. One or more of the cameras 128 can be movable so that its position can change to allow visual data from different portions of the external environment of the vehicle 100 to be captured. The movement of such cameras 128 can be achieved in any suitable manner. For instance, the cameras 128 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. In one or more arrangements, the cameras 128 can have any suitable range of motion, including, for example, substantially spherical, substantially hemispherical, substantially circular, and/or substantially linear.

One or more of the cameras 128 can have zoom in and/or zoom out capabilities. For instance, one or more of the cameras 128 can zoom in on an object detected in the external environment of the vehicle 100. If the detected object is a person (e.g., a pedestrian or a driver of another vehicle), one or more of the cameras 128 can zoom in on at least a portion of the person's body to capture visual data relating to non-verbal gestures made by the person.

In certain arrangements, one or more of the cameras 128 can zoom in on at least a portion of the person's body to capture visual data relating to verbal gestures. For instance, one or more of the cameras 128 can zoom in on a person's mouth to capture visual data of lip movements. "Verbal gesture" means a form of audible communication in which a person speaks or otherwise causes a sound to be made. Verbal gestures can include any word, phrase, sentence, or other utterance made by a person. A verbal gesture can be used in place of or in combination with one or more non-verbal gestures.

The camera system 127, the one or more cameras 128, and/or movements of the one or more cameras 128 can be controlled by the processor 110, the sensor system 125, and/or any of the modules 120, 121, 122.

The human signal recognition module 121 and/or the processor 110 can be configured to analyze visual data captured by the camera system 127 to identify one or more non-verbal gestures made by a human being located in the external environment (e.g., a human pedestrian or a human driver of another vehicle). The identification of a non-verbal human gesture can be performed in any suitable manner. As an example, a captured non-verbal human gesture can be compared to the set of predefined human gestures in the gesture library 116 to determine whether the captured gesture matches one of the predefined human gestures.

The vehicle 100 can include an input system 130 for receiving input from a vehicle occupant (e.g., a driver or a passenger). Any suitable input system 130 can be used, including, for example, a keypad, a display, a touch screen, a multi-touch screen, a button, a joystick, a trackball, a microphone, and/or combinations thereof.

The vehicle 100 can include an output system 135 for presenting information to the driver or a passenger. The output system 135 can include a display, as described above. Alternatively or in addition, the output system 135 can include an earphone and/or a speaker. Some components of the vehicle 100 can serve as both a component of the input system 130 and a component of the output system 135.

The vehicle 100 can include one or more microphones (not shown). The one or more microphones can be a part of the sensor system 125. "Microphone" means any device, component, and/or system that can capture audial data. "Audial data" means any data/information that is perceptible by the human sense of hearing. The audial data can be in any suitable form. For instance, verbal gestures can be captured by the sensor system 125.

In one or more arrangements, one or more of the microphones can be externally facing. In this context, "externally facing" means a microphone that is oriented, positioned, configured, operable, and/or arranged to capture audial data from at least a portion of the external environment of the vehicle 100. The one or more microphones can be located in any suitable portion of the vehicle 100. For instance, one or more of the microphones can be located within the vehicle 100, one or more of the microphones can be located on the exterior of the vehicle 100, and/or one or more of the microphones can be located so as to be exposed to the exterior of the vehicle 100. A plurality of microphones can be distributed about the vehicle 100 in any suitable manner. In one or more arrangements, a plurality of microphones can be provided in a microphone array.

The position of one or more of the microphones can be fixed such that its position does not change relative to the vehicle 100. One or more of the microphones can be movable so that its position can change to allow audial data from different portions of the external environment of the vehicle 100 to be captured. The movement of such microphones can be achieved in any suitable manner. For instance, the microphones can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. The microphones can have any suitable range of motion. The one or more microphones and/or their movements can be controlled by the sensor system 125, the processor 110, and/or any of the modules 120, 121, 122.

The human signal recognition module 121 and/or the processor 110 can be configured to analyze audial data captured by the sensor system 125 (e.g., by the one or more microphones) and/or visual data captured by the sensor system 125 and/or the camera system 127 to identify one or more verbal gestures made in the external environment. For instance, the human signal recognition module 121 and/or the processor 110 can be configured to analyze visual data of lip movements to identify one or more words spoken by a human being in the external environment. Alternatively or in addition, the human signal recognition module 121 and/or the processor 110 can be configured to analyze audial data to identify one or more words spoken by a human being in the external environment and/or one or more sounds caused by a human being in the external environment (e.g., the honking of another vehicle's horn). The identification of a verbal human gesture can be performed in any suitable manner. For instance, a captured verbal gesture can be compared to a set of predefined human verbal gestures in a verbal library (not shown) to determine whether the captured verbal gesture matches one of the predefined human verbal gestures.

In one or more arrangements, the human signal recognition module 121 can be operatively connected to one or more verbal libraries or dictionaries. The one or more verbal libraries or dictionaries can be included in one or more of the data stores 115. The verbal libraries or dictionaries can include a set of predefined human verbal gestures.
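As with the gesture library, the disclosure leaves the verbal lookup unspecified. One minimal sketch, assuming the verbal library maps predefined phrases to meanings and that transcription has already been performed upstream (e.g., by voice recognition), is a normalized phrase comparison:

```python
def normalize(phrase):
    """Lower-case a transcribed phrase and strip punctuation so that
    minor transcription differences do not prevent a match."""
    cleaned = "".join(c for c in phrase.lower() if c.isalnum() or c.isspace())
    return cleaned.split()

def match_verbal_gesture(transcribed, verbal_library):
    """Compare a transcribed utterance to a verbal library mapping
    predefined verbal gestures to their meanings; return the associated
    meaning, or None if no predefined verbal gesture matches."""
    words = normalize(transcribed)
    for phrase, meaning in verbal_library.items():
        if normalize(phrase) == words:
            return meaning
    return None
```

The library contents here are invented for illustration; a production system would likely use fuzzy or probabilistic matching rather than exact phrase equality.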

In one or more arrangements, the human signal recognition module 121 can include voice recognition technology to analyze, assess, identify, and/or interpret audial data captured from the external environment. The human signal recognition module 121 can include any suitable hardware and/or software to support voice recognition.

The vehicle 100 can include an external communication system 145. "External communication system" means one or more elements, one or more devices, one or more components, one or more systems, and/or combinations thereof that are used to communicate with one or more intended recipients in the external environment of the vehicle 100. More particularly, "external communication system" means one or more elements, one or more devices, one or more components, one or more systems, and/or combinations thereof that are used to send a directional communication to one or more intended recipients in the external environment of the vehicle 100. "Directional communication" means one or more messages sent to a region of the external environment that includes the location of the intended recipient. The message can be sent in any suitable form, including audial and/or visual forms.

The message can be of any suitable type. For instance, the message can relate to a future driving maneuver of the vehicle 100. Examples of future driving maneuvers include turning right, turning left, or continuing straight. Alternatively or in addition, the message can include an instruction or indication relating to a future driving maneuver. For example, the message can indicate that a pedestrian may cross the road before the vehicle 100 makes a turn onto the road.

The external communication system 145 can be controlled by the processor 110 and/or the autonomous driving module 120. When the vehicle 100 is operating in an autonomous mode, the processor 110 and/or any of the modules 120, 121, 122 can control the external communication system 145 so that the vehicle 100 can communicate with one or more intended recipients in the external environment, such as by sending a directional message.

The external communication system 145 can include a visual communication system 146 and/or an audial communication system 150. "Visual communication system" means one or more elements, one or more devices, one or more components, one or more systems, and/or combinations thereof that are used to send a directional visual message to one or more intended recipients. "Visual message" means any communication that is perceptible by the human sense of sight. "Directional visual message" means a visual message sent to a region of the external environment of the vehicle in a direction that is substantially aligned with the location of an intended recipient in that environment. The "recipient" can be any person located in the external environment, such as a pedestrian, a bicyclist, and/or a driver or occupant of another vehicle.

The visual message can be a textual representation and/or a graphical representation. In one or more arrangements, the textual representation of the visual message can be brief and/or simple. Examples of such textual representations include: "I See You," "Go Ahead," "Turning Right," "Turning Left," and "Stopping." The graphical representation of the visual message can also be simple. For example, an arrow can be used to indicate the direction in which the vehicle 100 will turn. As another example, a red stop sign can indicate that the vehicle 100 is stopping. In some arrangements, the visual message can include both textual and graphical representations.
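The mapping from a planned maneuver to its textual and/or graphical representation can be sketched as a simple lookup. The maneuver names and graphic identifiers below are assumptions made for the example, not part of the disclosure:

```python
# Hypothetical table mapping a planned maneuver to the textual and
# graphical representations of the corresponding visual message.
MESSAGE_TABLE = {
    "turn_right": ("Turning Right", "arrow_right"),
    "turn_left": ("Turning Left", "arrow_left"),
    "stop": ("Stopping", "stop_sign"),
    "yield_to_pedestrian": ("Go Ahead", "walk_symbol"),
}

def compose_visual_message(maneuver, use_text=True, use_graphic=True):
    """Build the visual message for a future driving maneuver, using a
    textual representation, a graphical representation, or both."""
    text, graphic = MESSAGE_TABLE[maneuver]
    return {
        "text": text if use_text else None,
        "graphic": graphic if use_graphic else None,
    }
```

The returned message could then be routed to a display 147 or projector 148 aimed at the intended recipient.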

In one or more arrangements, the visual communication system 146 can include one or more displays 147, one or more projectors 148, and/or one or more humanoid figures 149. Each of these components of the visual communication system 146 will be described in turn below. The one or more displays 147, the one or more projectors 148, and/or the one or more humanoid figures 149 can be controlled, at least in part, by the processor 110, the autonomous driving module 120, the human signal recognition module 121, and/or the external communication module 122.

In one or more arrangements, the visual communication system 146 can include one or more displays 147. "Display" means a component or a group of components that present visual data. The display 147 can be any suitable type of display. For instance, the display 147 can be a liquid crystal display (LCD), a light emitting diode (LED) display, or some other suitable display. In one or more arrangements, the one or more displays 147 can have, or can be set to, a high contrast output so that an intended recipient can perceive the visual message presented on the display 147 in various lighting conditions. For instance, the contrast output of the displays 147 can be adjusted so that an intended recipient can perceive the visual message presented on the display 147 even in bright, sunny conditions.

The one or more displays 147 can include one or more externally facing displays. "Externally facing display" means a display that presents visual data in a direction away from the vehicle so that it is perceptible to recipients in the external environment. The one or more displays 147 can be located in any suitable portion of the vehicle 100. For instance, one or more of the displays 147 can be located within the vehicle 100, one or more of the displays 147 can be located on the exterior of the vehicle 100, and/or one or more of the displays 147 can be located so as to be exposed to the exterior of the vehicle 100. For example, one or more of the displays 147 can be located on a door, a fender, a body panel, the trunk, the roof, a wing mirror, and/or any other suitable exterior portion of the vehicle 100. A plurality of displays 147 can be provided, and the plurality of displays 147 can be distributed about the vehicle 100 in any suitable manner.

The position of one or more of the displays 147 can be fixed relative to the vehicle 100. One or more of the displays 147 can be movable so that visual data can be selectively presented to different portions of the external environment of the vehicle 100. The movement of such displays 147 can be achieved in any suitable manner. For instance, the displays 147 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. The displays 147 can have any suitable range of motion, including, for example, substantially spherical, substantially hemispherical, substantially circular, and/or substantially linear.

FIGS. 2, 3, and 4 show arrangements in which a plurality of displays 147 are integrated into various body panels of the vehicle 100. FIG. 2 shows another arrangement in which a display 147 can be mounted on the roof of the vehicle 100. The display 147 can be rotatable about an axis 181 in any suitable manner. Alternatively or in addition, the display 147 can be movable between a deployed position (as shown in FIG. 2) and a stowed position in any suitable manner.

In one or more arrangements, the visual communication system 146 can include one or more projectors 148. "Projector" means a component or a group of components that can project visual data onto a surface. In one or more arrangements, the projector 148 can project visual data in the form of light, an image, and/or a video. The projector 148 can project visual data onto any suitable surface in the external environment of the vehicle 100. For instance, the surface can be a paved surface, a portion of a structure, or a sign. In one or more arrangements, the projector 148 can be a laser or other light emitting element, device, component, and/or system. The projector 148 can emit light at any suitable wavelength on the electromagnetic spectrum.

The one or more projectors 148 can be located in any suitable portion of the vehicle 100. For instance, one or more of the projectors 148 can be located within the vehicle 100, one or more of the projectors 148 can be located on the exterior of the vehicle 100, and/or one or more of the projectors 148 can be located so as to be exposed to the exterior of the vehicle 100. For example, one or more of the projectors 148 can be located on a door, a fender, a body panel, the trunk, the roof, a wing mirror, and/or any other suitable exterior portion of the vehicle 100. A plurality of projectors 148 can be provided, and the plurality of projectors 148 can be distributed about the vehicle 100 in any suitable manner.

The position of one or more of the projectors 148 can be fixed such that its position does not change relative to the vehicle 100. One or more of the projectors 148 can be movable so that visual data can be selectively presented to different portions of the external environment of the vehicle 100. The movement of such projectors 148 can be achieved in any suitable manner. For instance, the projectors 148 can be rotatable about one or more axes, pivotable, slidable, and/or extendable, just to name a few possibilities. The projectors 148 can have any suitable range of motion, including, for example, substantially spherical, substantially hemispherical, substantially circular, and/or substantially linear.

FIG. 3 shows an arrangement in which a projector 148 is provided on the vehicle 100. The projector 148 can be rotatable about an axis (e.g., axis 182) in any suitable manner. Alternatively or in addition, the projector 148 can be movable between a deployed position and a stowed position in any suitable manner. The projector 148 can be moved to send a visual message to a person (e.g., person 301) in the external environment 300. In the example shown in FIG. 3, the projected visual message is an arrow indicating a right-hand turn.

In one or more arrangements, the visual communication system 146 can include one or more humanoid figures 149. "Humanoid figure" means a controllable figure that has an appearance resembling at least a portion of a human body. For instance, the humanoid figure 149 can resemble an entire human body, a human body from the waist up, or a human body from the chest up. In one or more arrangements, the humanoid figure 149 can have a lifelike, natural, and/or realistic appearance. The appearance of the humanoid figure 149 can vary from one arrangement to another. The humanoid figure 149 may or may not be anatomically accurate. In one or more arrangements, the humanoid figure 149 can resemble a human being. FIG. 4 shows an arrangement in which the humanoid figure 149 does not closely resemble a human being but has a number of features associated with a human being (e.g., a head 405, an arm 410, a hand 415, and/or one or more fingers 417).

In one or more arrangements, the humanoid figure 149 can be partially or entirely mechanical and/or electro-mechanical. In one or more arrangements, the humanoid figure 149 can be a robot. In one or more arrangements, the humanoid figure 149 can be a controllable mannequin. In one or more arrangements, the humanoid figure 149 can be a controllable doll or toy.

The one or more humanoid figures 149 can be configured to perform gestures, movements, and/or actions that imitate those of a human being. For instance, one or more of the humanoid figures 149 can be configured to point one or more fingers, make a sign with the fingers and/or hand (e.g., an OK sign, a thumbs up sign, and/or a stop sign), nod the head, and/or wave a hand, just to name a few possibilities. A humanoid figure 149 can be articulated as needed to perform a desired gesture.

The one or more humanoid figures 149 can be located in any suitable portion of the vehicle 100 that allows them to be viewed by recipients external to the vehicle 100. For instance, one or more of the humanoid figures 149 can be located within the vehicle 100, such as in the front passenger area of the vehicle 100. Alternatively or in addition, one or more of the humanoid figures 149 can be located on the exterior of the vehicle 100, such as on a door, a fender, a body panel, the trunk, the roof, a wing mirror, and/or any other suitable exterior portion of the vehicle 100.

FIG. 4 shows an arrangement in which a humanoid figure 149 is located in the front passenger area 400 of the vehicle 100. In one or more arrangements, an upper body portion 420 of the humanoid figure 149 can be movable (e.g., rotatable about an axis) to face the appropriate side of the vehicle 100. Alternatively or in addition, the humanoid figure 149 can be controlled so that a head 405, an arm 410, a hand 415, and/or one or more fingers 417 are moved to send a visual message, such as waving a hand as shown in FIG. 4. One or more windows of the vehicle 100 can be opened to allow the recipient to see the humanoid figure 149 from outside the vehicle 100.

“As mentioned above, the external communication system 145 can also include an audial communication system 150. “Audial communication system” means one or more elements, devices, components, systems, and/or combinations thereof that are used to send a directional audial message to one or more intended recipients (e.g., a pedestrian or a driver) in the external environment of the autonomous vehicle. “Audial message” means any communication that can be perceived by the human sense of hearing.

“In some arrangements, the audial message can be short and/or simple. Textual representations of examples of such messages include: “I See You,” “Go Ahead,” and “Turning Right.” While the audial message can form one or more words, it can alternatively be one or more sounds or groups of sounds.
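As a minimal illustration (not part of the patent's disclosure), the short audial messages above could be keyed to planned driving maneuvers with a simple lookup; the maneuver names used here are hypothetical assumptions:

```python
# Hypothetical mapping from planned driving maneuvers to short audial
# messages, echoing the textual examples above ("I See You", "Go Ahead",
# "Turning Right"). The maneuver keys are illustrative assumptions.
AUDIAL_MESSAGES = {
    "acknowledge_person": "I See You",
    "yield_to_pedestrian": "Go Ahead",
    "turn_right": "Turning Right",
}

def select_audial_message(maneuver):
    """Return the short message for a planned maneuver, or None if the
    maneuver has no associated audial message."""
    return AUDIAL_MESSAGES.get(maneuver)
```

A real system could also fall back to non-word sounds when no phrase applies, consistent with the arrangement described above.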

“The audial communication system 150 can include one or more speakers 152. “Speaker” means one or more elements, devices, components, systems, and/or combinations thereof that produce sound in response to an audio signal input. Examples of speakers include sound chips, sound cards, and electroacoustic transducers. Each speaker 152 can be connected to one or more audio output channels (not shown). “Audio output channel” means any suitable device, component, or structure for carrying audio signals.

“In one or more arrangements, the one or more speakers 152 can be configured as directional sound speakers. “Directional sound” speakers are configured to produce sound fields that spread substantially less than those of conventional speakers. Such speakers can be beneficial because the message can be delivered more directly to the intended recipients, reducing the possibility of the message being perceived by unintended recipients.

“The one or more speakers 152 can be configured to present audial data to one or more recipients in the external environment of the autonomous vehicle. The one or more speakers 152 can be located in any suitable portion of the vehicle 100. For instance, one or more speakers 152 can be located within the vehicle 100. Alternatively or additionally, one or more speakers 152 can be located on the exterior of the vehicle 100, such as on a door, body panel, trunk, roof, and/or wing mirror, just to name a few possibilities. A plurality of speakers 152 can be provided and distributed about the vehicle 100 in any suitable manner.

“One or more of the speakers 152 can be fixed in position so as not to move relative to the vehicle 100. Alternatively or additionally, one or more of the speakers 152 can be movable so that audial data can be presented to targeted regions of the external environment of the vehicle 100. Such speakers 152 can be movable in any suitable manner. For instance, the speakers 152 can be rotatable about one or more axes, pivotable, slidable, and/or extendable. The speakers 152 can be moved in any suitable path, which may include substantially spherical, substantially hemispherical, circular, and/or linear movements.
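As a sketch of how a rotatable speaker could be aimed at a targeted region, the yaw angle from the speaker toward a recipient can be computed from their relative positions. The 2-D vehicle-centered coordinate frame here is an assumption for illustration, not the patent's method:

```python
import math

def speaker_yaw_deg(speaker_xy, recipient_xy):
    """Yaw angle, in degrees counterclockwise from the vehicle's forward
    x-axis, that points a rotatable speaker toward a recipient. Both
    arguments are (x, y) positions in an assumed vehicle-centered frame."""
    dx = recipient_xy[0] - speaker_xy[0]
    dy = recipient_xy[1] - speaker_xy[1]
    # atan2 is quadrant-aware, so recipients behind or beside the
    # speaker produce the correct signed angle.
    return math.degrees(math.atan2(dy, dx))
```

A speaker rotatable about a vertical axis could then be commanded to this yaw before the directional audial message is played.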

“FIG. 5 illustrates an arrangement in which a plurality of speakers 152 is integrated into various body panels of the vehicle 100. FIG. 5 also shows an arrangement in which a speaker 152 is mounted on the roof of the vehicle 100. Such a speaker 152 can be rotatable about one or more axes, such as axis 181, in any suitable manner.”

“The vehicle 100 can include one or more vehicle systems 160. Various examples of the one or more vehicle systems 160 are shown in FIG. 1. However, the vehicle 100 can include more, fewer, or different systems. It should be appreciated that, although particular vehicle systems are separately defined, each or any of the systems or portions thereof may be otherwise combined or segregated via hardware and/or software within the vehicle 100.

The vehicle 100 can include a propulsion system 162. The propulsion system 162 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to provide powered motion for the vehicle 100. The propulsion system 162 can include an engine and an energy source.

The engine can be any suitable type of engine or motor, now known or later developed. For instance, the engine can be an internal combustion engine, an electric motor, and/or a Stirling engine, just to name a few possibilities. In some instances, the propulsion system can include a plurality of engine types. For instance, a gas-electric hybrid vehicle can include a gasoline engine and an electric motor.

The energy source can be any suitable source of energy that can be used to at least partially power the engine. The engine can be configured to convert the energy source into mechanical energy. Examples of energy sources include gasoline, diesel, and propane. Alternatively or additionally, the energy source can include fuel tanks, batteries, capacitors, and/or flywheels. In some embodiments, the energy source can also provide energy for other systems of the vehicle 100.

“The vehicle 100 can include wheels, tires, and/or tracks. Any suitable type of wheels, tires, and/or tracks can be used. The wheels, tires, and/or tracks of the vehicle 100 can be configured to rotate differentially with respect to other wheels, tires, and/or tracks of the vehicle 100. The wheels, tires, and/or tracks can be made of any suitable material.

“The vehicle 100 can include a braking system 164. The braking system 164 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to decelerate the vehicle 100. For instance, the braking system 164 can use friction to slow the wheels/tires. Alternatively or additionally, the braking system 164 can convert the kinetic energy of the wheels/tires to electric current.

“The vehicle 100 can also include a steering system 166. The steering system 166 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to adjust the heading of the vehicle 100.

The vehicle 100 can include a throttle system 168. The throttle system 168 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to control the operating speed of the engine/motor of the vehicle 100.

“The vehicle 100 can include a transmission system 170. The transmission system 170 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to transmit mechanical power from the engine/motor of the vehicle 100 to the wheels/tires. For instance, the transmission system 170 can include a gearbox, a clutch, a differential, and/or drive shafts. In instances where the transmission system 170 includes drive shafts, the drive shafts can include one or more axles configured to be coupled to the wheels/tires.

The vehicle 100 can include a signaling system 172. The signaling system 172 can include one or more mechanisms, devices, elements, components, systems, and/or combinations thereof, now known or later developed, configured to provide illumination for the driver of the vehicle 100 and/or to provide information with respect to one or more aspects of the vehicle 100. For instance, the signaling system 172 can provide information regarding the vehicle's presence, position, size, direction of travel, and/or the driver's intentions regarding direction and speed of travel. The signaling system 172 can include headlights, taillights, brake lights, hazard lights, and turn signal lights.

“The vehicle 100 can include a navigation system 174. The navigation system 174 can include one or more mechanisms, devices, elements, components, systems, applications, and/or combinations thereof, now known or later developed, configured to determine the geographic location of the vehicle 100 and/or to determine a travel route for the vehicle 100.

“The navigation system 174 can include one or more mapping applications to determine a travel route for the vehicle 100. For instance, a driver or passenger can input an origin and a destination, and the mapping application can determine one or more suitable travel routes between the origin and the destination. A travel route can be selected based on one or more parameters, such as shortest travel distance or shortest travel time. In some arrangements, the navigation system 174 can be configured to update the travel route dynamically while the vehicle 100 is in operation.
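Selecting among candidate routes by a single parameter, as described above, can be sketched as follows. The route data and field names are hypothetical; a real mapping application would supply the candidates:

```python
def select_route(routes, parameter="distance"):
    """Pick the candidate route that minimizes the chosen parameter.
    Each route is a dict with (hypothetical) 'distance' and 'time' keys."""
    return min(routes, key=lambda r: r[parameter])

# Illustrative candidate routes a mapping application might return.
candidates = [
    {"name": "highway", "distance": 12.0, "time": 15.0},
    {"name": "surface", "distance": 9.5, "time": 22.0},
]
```

With parameter="distance" the surface route is selected here, while parameter="time" selects the highway route, illustrating how the chosen parameter determines the travel route.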

“The navigation system 174 can include a global positioning system, a local positioning system, and/or a geolocation system. The navigation system 174 can be implemented with any one of a number of satellite positioning systems, such as the United States Global Positioning System (GPS), the Russian GLONASS system, the European Galileo system, the Chinese COMPASS system, the Indian Regional Navigational Satellite System, any system that uses satellites from a combination of satellite systems, or any satellite system developed in the future. Further, the navigation system 174 can use Transmission Control Protocol (TCP) and/or a Geographic Information System (GIS) and location services.

“The navigation system 174 can include a transceiver configured to estimate a position of the vehicle 100 with respect to the Earth. For example, the navigation system 174 can include a GPS transceiver to determine the vehicle's latitude, longitude, and/or altitude. The navigation system 174 can use other systems (e.g., laser-based localization systems and/or inertial-aided GPS) to determine the location of the vehicle 100.

“Alternatively or additionally, the navigation system 174 can be based on access point geolocation services, such as those using the W3C Geolocation Application Programming Interface (API). With such a system, the location of the vehicle 100 can be determined by consulting location information servers using, for example, an Internet protocol (IP) address, a Wi-Fi or Bluetooth Media Access Control (MAC) address, radio-frequency identification (RFID), a Wi-Fi connection location, or device GPS and Global System for Mobile Communications (GSM)/code division multiple access (CDMA) cell identifiers. It will be understood that the specific manner in which the geographic location of the vehicle 100 is determined will depend upon the manner of operation of the particular location tracking system used.

“The processor 110 and/or the autonomous driving module 120 can be operatively connected to communicate with the various vehicle systems 160 and/or individual components thereof. For example, returning to FIG. 1, the processor 110 and/or the autonomous driving module 120 can be in communication to send and/or receive information from the various vehicle systems 160 to control the movement, speed, maneuvering, heading, direction, etc. of the vehicle 100. The processor 110 and/or the autonomous driving module 120 may control some or all of these vehicle systems 160 and, thus, the vehicle 100 may be partially or fully autonomous.

“The processor 110 and/or the autonomous driving module 120 may be operable to control the navigation and/or maneuvering of the vehicle 100 by controlling one or more of the vehicle systems 160 and/or components thereof. For instance, when operating in an autonomous mode, the processor 110 and/or the autonomous driving module 120 can control the direction and/or speed of the vehicle 100. The processor 110 and/or the autonomous driving module 120 can cause the vehicle 100 to accelerate (e.g., by increasing the supply of fuel provided to the engine), decelerate (e.g., by decreasing the supply of fuel to the engine and/or by applying brakes), and/or change direction (e.g., by turning the front wheels). As used herein, “cause” or “causing” means to make, force, compel, direct, command, instruct, and/or enable an event or action to occur, or at least be in a state where such event or action may occur, either in a direct or indirect manner.
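The accelerate/decelerate/turn examples above can be sketched as a mapping from a high-level action to commands issued to the relevant vehicle systems. The system and command names below are hypothetical, chosen only to mirror the examples in the text:

```python
def control_commands(action):
    """Map a high-level driving action to (system, command) pairs,
    mirroring the examples above: accelerate via increased fuel,
    decelerate via decreased fuel plus brakes, and a direction change
    via the steering system. The names are hypothetical."""
    if action == "accelerate":
        return [("throttle_system", "increase_fuel")]
    if action == "decelerate":
        return [("throttle_system", "decrease_fuel"),
                ("braking_system", "apply_brakes")]
    if action == "turn":
        return [("steering_system", "turn_front_wheels")]
    raise ValueError("unknown action: " + action)
```

In a real arrangement, each (system, command) pair would be realized by sending signals to the corresponding actuators 140.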

The vehicle 100 can include one or more actuators 140. The actuators 140 can be any element or combination of elements operable to modify, adjust, and/or alter one or more of the vehicle systems 160 or components thereof responsive to receiving signals or other inputs from the processor 110 and/or the autonomous driving module 120. Any suitable actuator can be used. For instance, the one or more actuators 140 can include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and/or piezoelectric actuators, just to name a few possibilities.

The vehicle 100 can be configured for interaction between the vehicle 100 and the external environment of the vehicle 100. For instance, the vehicle 100 can be configured to activate at least a portion of the external communication system 145 to send a directional message to a human recipient in the external environment. Alternatively or additionally, the vehicle 100 can be configured to recognize non-verbal gestures made by a human in the external environment and to implement a driving maneuver in response.

“Now that the various potential systems, elements, and/or components of the vehicle 100 have been described, various methods of interaction between an autonomous vehicle and its external environment will now be described. Referring to FIG. 6, an example of a method of interacting with an external environment of an autonomous vehicle is shown. Various possible steps of method 600 will now be described. The method 600 illustrated in FIG. 6 may be applicable to the embodiments described above in relation to FIGS. 1-5, but it is understood that the method 600 can be carried out with other suitable systems and arrangements. Moreover, the method 600 may include other steps that are not shown here, and, in fact, the method 600 is not limited to including every step shown in FIG. 6. The steps illustrated here as part of the method 600 are not limited to this particular chronological order; some of the steps may be performed in a different order, and/or at least some of the steps may occur simultaneously.

“At block 605, one or more persons can be identified in the external environment of the vehicle 100. “Identify one or more persons” means that a detected object is determined to be a human being. The external environment can be detected by one or more sensors, such as the environment sensor(s) 126 and/or the camera system 127. In some instances, the detection of the external environment can be continuous or performed at any suitable interval. The detection and/or identification of persons can be performed in any suitable manner using one or more human recognition technologies. Examples of human recognition technologies include facial recognition (e.g., face detection), body recognition, and/or iris recognition. A human can also be recognized using computer vision, template matching, or any other visual data processing technology.
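The identification step at block 605 can be sketched as filtering raw detector output down to objects classified as human. The detection record format (a label plus a confidence score) and the threshold are assumptions for illustration; any of the human recognition technologies named above could produce such labels:

```python
def identify_persons(detections, min_confidence=0.8):
    """Return the subset of detections identified as persons. Each
    detection is a dict with assumed 'label' and 'confidence' keys, as
    a human recognition technology might produce."""
    return [d for d in detections
            if d["label"] == "person" and d["confidence"] >= min_confidence]
```

Low-confidence or non-person detections are discarded, so only objects determined to be human beings flow on to the later steps of the method.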

“In one or more arrangements, the presence of a person can be assumed without directly detecting or identifying the person. For instance, if it is determined that another vehicle is not an autonomous vehicle, or is operating in a manual mode, then the vehicle 100 can assume that the other vehicle has a human driver. In one or more arrangements, the vehicle 100 can send a request to the other vehicle inquiring about the operational mode of the other vehicle. If no response is received (e.g., after a predetermined amount of time), then it can be assumed that the other vehicle has a human driver. Similarly, if a response is received indicating that the other vehicle is operating in a manual mode, then it can be assumed that the other vehicle has a human driver.
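The fallback logic above (assume a human driver when the inquiry times out or the reply reports manual mode) can be sketched as follows; the response values are hypothetical:

```python
def assume_human_driver(response):
    """Decide whether to assume the other vehicle has a human driver.
    response is None when no reply arrived within the predetermined
    time, otherwise an assumed mode string such as 'autonomous' or
    'manual'."""
    if response is None:
        # No reply before the timeout: assume a human operator.
        return True
    # A reply reporting manual mode also implies a human driver.
    return response == "manual"
```

Only an explicit reply reporting autonomous operation would let the vehicle 100 skip the human-driver assumption.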
