Autonomous Vehicles – Eric James Hanson, Molly Castle Nix, Sean Chin, Dennis Zhao, Uber Technologies Inc

Abstract for “Seamless vehicle entrance”

“Systems, methods, and tangible non-transitory computer-readable media for autonomous vehicle operation are provided. One example method includes receiving trip data that includes information associated with a request for an autonomous vehicle at a pick-up location. The autonomous vehicle can then travel to the pick-up location based in part on the trip data. The autonomous vehicle can detect one or more signals associated with the trip data. Responsive to determining that the one or more signals satisfy one or more broadcast criteria, the autonomous vehicle can activate one or more vehicle systems associated with fulfilling the request for the autonomous vehicle at the pick-up location. The one or more broadcast criteria can be determined based in part on one or more properties of the one or more signals.”

Background for “Seamless vehicle entrance”

Vehicles, including autonomous vehicles, can include access systems that regulate access to the vehicle. These access systems can take many forms, including the traditional mechanical lock and key. However, many existing access systems are not flexible enough to accommodate multiple passengers or to change access conditions remotely. Further, many existing access systems can be complicated and cumbersome to use, which often results in wasted time and effort for prospective passengers. Accordingly, there exists a need for a more effective way to provide entry to an autonomous vehicle.

“Aspects and advantages of embodiments of the present disclosure will be set forth in part in the following description, or can be learned from the description or through practice of the embodiments.

“One example aspect of the present disclosure is directed to a computer-implemented method of autonomous vehicle operation. The computer-implemented method can include receiving, by an autonomous vehicle comprising one or more computing devices, trip data including information associated with a request for the autonomous vehicle at a pick-up location. The method can also include traveling, by the autonomous vehicle and based in part on the trip data, to the pick-up location. The method can also include detecting, by the autonomous vehicle, one or more signals associated with the trip data. The method can further include, responsive to determining that the one or more signals satisfy one or more broadcast criteria, activating, by the autonomous vehicle, one or more vehicle systems associated with fulfilling the request for the autonomous vehicle at the pick-up location.

“Another example aspect of the present disclosure is directed to one or more tangible, non-transitory computer-readable media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform operations. The operations can include receiving trip data including information associated with a request for an autonomous vehicle at a pick-up location. The operations can include controlling, based in part on the trip data, the autonomous vehicle to travel to the pick-up location. The operations can also include detecting one or more signals associated with the trip data. The operations can further include, responsive to determining that the one or more signals satisfy one or more broadcast criteria for authorizing access to the autonomous vehicle, activating one or more vehicle systems associated with fulfilling the request for the autonomous vehicle at the pick-up location.

“Another example aspect of the present disclosure is directed to an autonomous vehicle that includes one or more processors and one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations. The operations can include receiving trip data including information associated with a request for the autonomous vehicle at a pick-up location. The operations can include controlling, based in part on the trip data, the autonomous vehicle to travel to the pick-up location. The operations can also include detecting one or more signals associated with the trip data. The operations can further include, responsive to determining that the one or more signals satisfy one or more broadcast criteria, activating one or more vehicle systems associated with fulfilling the request for the autonomous vehicle at the pick-up location.

“Other example aspects of the present disclosure are directed to other systems, methods, vehicles, apparatuses, tangible non-transitory computer-readable media, and devices for autonomous vehicle operation.”

These and other features, aspects, and advantages of various embodiments will become better understood with reference to the following description. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the related principles.

“Example aspects of the present disclosure are directed to providing a prospective passenger of a vehicle (e.g., an autonomous vehicle, a semi-autonomous vehicle, or a manually operated vehicle) with seamless vehicle entry, based in part on the detection and transmission of signals (e.g., Bluetooth signals) by a remote computing device (e.g., a smartphone or wearable computing device). A computing system (e.g., a system that includes one or more computing devices that can communicate with and/or operate the vehicle) can receive trip data associated with a request to pick up a passenger at a pick-up location. The trip data received by the autonomous vehicle's computing system can include a unique identifier for the trip and can be associated with both the autonomous vehicle and the remote computing device (e.g., a smartphone used by the prospective passenger).

The vehicle that receives the trip data can then detect one or more other signals (e.g., radio signals that include portions of the trip data). The vehicle can activate one or more vehicle systems when the signals satisfy one or more criteria; for example, the vehicle can unlock its doors when the received signal strength exceeds a threshold value. In this way, the disclosed technology provides more effective and seamless vehicle detection and access for a prospective passenger who has requested a vehicle.
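As an illustrative sketch only (not the patent's actual implementation), the following Python fragment shows how such a check might be structured: the vehicle scans for broadcasts carrying the expected trip identifier and unlocks its doors once the received signal strength crosses a threshold. The function names, the scan interface, and the -60 dBm threshold are all hypothetical.

```python
# Illustrative sketch; function names, threshold value, and the scan API
# are hypothetical placeholders, not the patent's actual implementation.

RSSI_UNLOCK_THRESHOLD_DBM = -60  # assumed proximity threshold

def seamless_entry_loop(expected_trip_id, detect_advertisements, unlock_doors):
    """Poll detected broadcasts; unlock when the matching signal is strong enough."""
    for advert in detect_advertisements():          # e.g., Bluetooth LE scan results
        if advert.trip_id != expected_trip_id:      # ignore unrelated broadcasts
            continue
        if advert.rssi_dbm >= RSSI_UNLOCK_THRESHOLD_DBM:
            unlock_doors()                          # broadcast criteria satisfied
            return True
    return False                                    # keep scanning on the next cycle
```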

For example, a prospective passenger can request, via their mobile device, to be picked up by an autonomous vehicle at their current location. The request can include information about the passenger's location based on inputs from the passenger and/or location information derived from signals received by the mobile device from an external source, such as the Global Positioning System (GPS). A computing system associated with the autonomous vehicles can receive the request and determine the vehicle closest to the passenger's current location. The computing system can also detect any personal identification information in the trip data and modify the trip data to exclude that information.

“As the autonomous vehicle travels to the prospective passenger's location, the passenger's mobile device can broadcast signals that can be detected by the vehicle. The autonomous vehicle can detect the signals from the smartphone and stop within a predetermined distance of the pick-up location. The mobile device can likewise detect signals broadcast by the autonomous vehicle. In this way, the vehicle can stop within a predetermined range of the pick-up location or the source of the signals (e.g., the mobile device) and unlock a door for the prospective passenger.

The disclosed technology can include an autonomous vehicle and/or a computing system associated with the autonomous vehicle. The autonomous vehicle and/or the associated computing system can include one or more computing devices, which can process, generate, and/or exchange (e.g., send and/or receive) signals or data with other devices, including one or more other autonomous vehicles.

“The computing system can, for example, exchange signals (e.g., electronic signals) or data with one or more vehicle systems, including sensor systems (e.g., sensors that generate output based on the state of the physical environment external to the vehicle); communication systems (e.g., wired and/or wireless communication systems that can exchange signals or data with other devices); navigation systems (e.g., devices that can receive signals from GPS, GLONASS, or other systems used to determine a vehicle's geographic location); and/or control systems (e.g., devices that can alter the vehicle's course and/or change the vehicle's velocity).

The computing system can receive trip data associated with a request for an autonomous vehicle at a pick-up location. The trip data can be received from one or more remote computing systems via one or more communication components, including wired and/or wireless communication systems. The trip data can include a pick-up location (e.g., an identifier of a geographic location, including a latitude and longitude and/or an address); the current location of the prospective passenger; a pick-up time (e.g., a time at which the prospective passenger will meet the vehicle); and/or a prospective passenger cargo status (e.g., an indication of whether the passenger is carrying cargo that will be placed in a cargo area of the vehicle).
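A minimal sketch of how such trip data might be represented follows, assuming a simple record type; the field names and types are hypothetical and chosen only to mirror the fields listed above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TripData:
    """Hypothetical container for the trip fields described above."""
    trip_id: str                       # unique identifier for the trip
    pickup_latitude: float             # pick-up location (latitude)
    pickup_longitude: float            # pick-up location (longitude)
    pickup_address: Optional[str]      # optional street address
    passenger_latitude: float          # prospective passenger's current location
    passenger_longitude: float
    pickup_time_utc: str               # time at which the passenger meets the vehicle
    has_cargo: bool                    # whether cargo will go into a cargo area
```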

“In some embodiments, the computing system can determine portions of the trip data that contain personal identification data. For example, the computing system can determine whether personal information such as a person's name, e-mail address, telephone number, and/or payment information (e.g., a credit card number) is included in a trip request. The computing system can then exclude or make unavailable (e.g., delete, encrypt, or obfuscate) the portions of the trip data that contain personal information before the trip data is sent to the autonomous vehicle or the remote computing device. For example, the computing system can withhold the portions of the trip data associated with personal information when generating the trip data that is transmitted. In this way, the privacy and security of prospective passengers can be maintained when trip data is transmitted.
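The following fragment sketches one way such redaction could look, under the assumption that trip requests arrive as dictionaries and that the listed keys are the sensitive ones; the key names and the helper are illustrative, not the patent's implementation.

```python
# Hypothetical field names for illustration only.
PERSONAL_FIELDS = {"name", "email", "phone_number", "credit_card_number"}

def redact_trip_request(request: dict) -> dict:
    """Return a copy of the trip request with personal fields removed."""
    return {key: value for key, value in request.items()
            if key not in PERSONAL_FIELDS}

# Example: the redacted dictionary is what would be sent to the vehicle
# and to the remote computing device.
safe_trip_data = redact_trip_request({
    "trip_id": "T-1234",
    "name": "A. Passenger",          # excluded before transmission
    "pickup_latitude": 37.7749,
    "pickup_longitude": -122.4194,
})
```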

The trip data can be used to determine a location of the autonomous vehicle and/or the remote computing device. For example, the computing system can determine a latitude and longitude, an address, and/or a relative position (e.g., the position of the autonomous vehicle with respect to the remote computing device). The computing system can also determine, based in part on map data and the location of the autonomous vehicle and/or the remote computing device, a signal disruption value associated with the one or more signals being blocked or interfered with (e.g., an estimated decrease in the number of signals being received).

“The map data can include one or more indications (e.g., indications of the location and magnitude of signal interference) of one or more structures or features that could block or interfere with detection of the one or more signals, such as buildings (e.g., houses, apartment buildings, or office buildings), tunnels (e.g., underground or underwater tunnels), and sources of electromagnetic transmissions (e.g., power lines and/or mobile phone towers). In some embodiments, broadcasting the one or more signals can be based in part on the signal disruption value. For example, the signal disruption value can be used by the remote computing device or the autonomous vehicle to selectively broadcast or detect the one or more signals.
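As a rough illustration of how a signal disruption value might be derived from such map data and then used to gate broadcasting, consider the sketch below; the interference table, the 0-1 scale, and the 0.8 cutoff are assumptions made for the example, not values from the patent.

```python
# Illustrative only: structure types and interference magnitudes are assumed.
INTERFERENCE_BY_STRUCTURE = {
    "building": 0.4,
    "tunnel": 0.7,
    "power_line": 0.2,
}

def signal_disruption_value(nearby_structures) -> float:
    """Estimate disruption (0 = clear, 1 = fully blocked) from map annotations."""
    total = sum(INTERFERENCE_BY_STRUCTURE.get(s, 0.0) for s in nearby_structures)
    return min(total, 1.0)

def should_broadcast(nearby_structures, cutoff: float = 0.8) -> bool:
    """Skip broadcasting (to save energy) when disruption is too high."""
    return signal_disruption_value(nearby_structures) < cutoff

# Example: inside a tunnel next to a building, broadcasting is suppressed.
assert not should_broadcast(["tunnel", "building"])
```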

The computing system can control the autonomous vehicle to travel to the pick-up location. For example, the computing system can determine a path from the vehicle's current location to the pick-up location indicated in the trip data. The computing system can then use the path between the current location and the pick-up location to activate one or more of the vehicle systems (e.g., the engine, sensors, and/or steering) to direct the autonomous vehicle to the pick-up location.

“In some embodiments, the computing system can broadcast one or more signals that include information about the location of the autonomous vehicle (e.g., its latitude and longitude). For example, the autonomous vehicle can transmit one or more signals indicating the location of the vehicle to remote computing systems using one or more communication components associated with the vehicle, including wireless and/or wired communication systems. The one or more signals can include cellular signals, radio signals, and/or microwave signals, and can use various communication protocols such as Bluetooth or Wi-Fi. In some embodiments, one or more broadcast instructions can be included for devices or systems that receive the trip data. The instructions for broadcasting the one or more signals by the autonomous vehicle can include information about when and where to broadcast the signals and/or which signal protocol to use in broadcasting them. In some embodiments, the one or more broadcast criteria can be determined based in part on the location and/or use of the autonomous vehicle. For example, the broadcast criteria can be based on whether the autonomous vehicle is within a specific area or within a predetermined proximity of the pick-up area.

The computing system can detect one or more signals associated with the trip data (e.g., one or more signals that include portions of the trip data or other data associated with it). The autonomous vehicle can include components that can detect the one or more signals, such as a radio receiver or a Bluetooth receiver. In some embodiments, the computing system can determine the source, direction, magnitude, and/or frequency of the one or more signals. The computing system can also analyze the trip data when the trip data is included in the one or more signals.

“Responsive to determining that the one or more signals satisfy the one or more broadcast criteria, the computing system can activate one or more vehicle systems associated with fulfilling the request. For example, the autonomous vehicle can determine that the one or more broadcast criteria are satisfied when the strength of the signals received from the remote computing device exceeds a threshold strength, which indicates that the remote computing device is within a predetermined distance of the vehicle (e.g., within ten meters).

“Activating the one or more vehicle systems can include unlocking an entrance to the vehicle (e.g., unlocking one or more doors to a passenger compartment), locking an entrance, opening an entrance, unlocking one or more cargo areas of the autonomous vehicle (e.g., unlocking one or more trunk doors or other cargo/storage areas of the vehicle), slowing the vehicle to a halt (e.g., reducing the vehicle's velocity until it stops), and/or generating an indication of the availability of the vehicle (e.g., displaying, on a display area of the vehicle, an identifier that is sent to the prospective passenger).

“In some embodiments, the computing system can determine the proximity of the autonomous vehicle to the remote computing device based in part on one or more properties of the one or more signals. In some embodiments, satisfying the one or more broadcast criteria can include determining that the autonomous vehicle is within a predetermined distance of the remote computing device. For example, satisfying the one or more broadcast criteria can include determining, based on one or more properties of the signals, that the autonomous vehicle and the remote computing device are within thirty meters of one another. The one or more properties can include, for example, the received signal strength (RSS) and/or the signal-to-noise ratio (SNR) of the one or more signals.
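One common way to turn received signal strength into a distance estimate is a log-distance path-loss model; the sketch below assumes that model and illustrative calibration constants (a reference RSSI of -45 dBm at one meter and a path-loss exponent of 2.5), none of which are taken from the patent.

```python
# Illustrative log-distance path-loss estimate; constants are assumptions.
REFERENCE_RSSI_DBM = -45.0   # assumed RSSI measured at 1 meter
PATH_LOSS_EXPONENT = 2.5     # assumed environment-dependent exponent

def estimate_distance_m(rssi_dbm: float) -> float:
    """Estimate transmitter distance in meters from a single RSSI sample."""
    return 10 ** ((REFERENCE_RSSI_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def within_broadcast_range(rssi_dbm: float, limit_m: float = 30.0) -> bool:
    """Example broadcast criterion: device estimated within `limit_m` meters."""
    return estimate_distance_m(rssi_dbm) <= limit_m
```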

The computing system can generate one or more indications based on the proximity of the autonomous vehicle to the remote computing device. The one or more indications can include one or more haptic indications (e.g., vibrations of the remote computing device), one or more light indications (e.g., flashing and/or pulsating lights of one or more colors), one or more audio indications, and/or one or more pictorial indications (e.g., pictures, letters, words, and/or phrases).

“In some embodiments, a type (e.g., a color for a light indication) or magnitude (e.g., a size or brightness for a light indication) of the one or more indications can be determined based in part on the distance between the autonomous vehicle and the remote computing device. As an example, the strength or frequency of a haptic indication (e.g., a vibration) can vary with the distance between the autonomous vehicle and the remote computing device.
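One simple way to realize this, sketched below under assumed minimum and maximum ranges, is to scale the vibration intensity linearly as the vehicle approaches; the numeric bounds are hypothetical.

```python
# Illustrative mapping from vehicle distance to vibration intensity (0.0-1.0).
MAX_NOTIFY_DISTANCE_M = 100.0   # assumed: no indication beyond this range
MIN_DISTANCE_M = 5.0            # assumed: full intensity at or below this range

def vibration_intensity(distance_m: float) -> float:
    """Return a normalized haptic intensity that grows as the vehicle nears."""
    if distance_m >= MAX_NOTIFY_DISTANCE_M:
        return 0.0
    if distance_m <= MIN_DISTANCE_M:
        return 1.0
    span = MAX_NOTIFY_DISTANCE_M - MIN_DISTANCE_M
    return (MAX_NOTIFY_DISTANCE_M - distance_m) / span
```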

The computing system can generate a trip identifier. The trip identifier can include information associated with one or more symbols (e.g., letters, words, marks, pictograms, pictures, colors, and/or patterns) that can be used for identification. The computing system can also broadcast (e.g., send via a wireless network) one or more signals that include the trip identifier, which allows a remote computing device to detect the signals and identify the requested vehicle for the prospective passenger. For example, the trip identifier could include a red circle with a black letter “L” and a black number “6” inside the circle. The remote computing device associated with the prospective passenger can receive the trip identifier, and the autonomous vehicle can display the same trip identifier on a display portion of the vehicle. The remote computing device can also display the trip identifier on its own display portion. This allows a prospective passenger to identify the requested autonomous vehicle more efficiently. For example, the prospective passenger can use the trip identifier to pick out their requested autonomous vehicle when multiple vehicles are waiting in close proximity.

“In some embodiments, the trip identifier, including the information associated with the one or more symbols, can be randomly generated and/or selected from among a plurality of trip identifiers that have not been used within a predetermined period of time. For example, the computing system can generate a trip identifier by randomly combining elements of a symbol, such as its color, shape, and size. The computing system can also access a plurality of trip identifiers (e.g., trip identifiers stored in a database) that have been used recently and select a trip identifier that has not been used within the predetermined period of time (e.g., not used within a certain number of hours, days, or weeks).
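To make the idea concrete, the sketch below randomly combines a color, a shape, and a letter/number, and rejects combinations issued within an assumed reuse window; the attribute lists and the 24-hour window are illustrative, not values from the patent.

```python
import random
import time

# Illustrative symbol elements and reuse window; values are assumptions.
COLORS = ["red", "blue", "green", "black"]
SHAPES = ["circle", "triangle", "square"]
LETTERS = "ABCDEFGHJKLMNPQRSTUVWXYZ"
REUSE_WINDOW_S = 24 * 3600          # do not reuse an identifier for 24 hours

recently_used = {}                  # trip identifier -> last time it was issued

def generate_trip_identifier() -> str:
    """Randomly combine symbol elements, avoiding recently used identifiers."""
    while True:
        candidate = (f"{random.choice(COLORS)}-{random.choice(SHAPES)}-"
                     f"{random.choice(LETTERS)}{random.randint(0, 9)}")
        last_used = recently_used.get(candidate, 0.0)
        if time.time() - last_used > REUSE_WINDOW_S:
            recently_used[candidate] = time.time()
            return candidate
```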

The disclosed technology can include systems, methods, and devices that provide a range of technical effects and benefits for vehicle operation, including the coordination of the pick-up and delivery of prospective passengers by one or more vehicles. One benefit of the disclosed technology is a reduction in the number and duration of interactions between a prospective passenger and an autonomous vehicle. Each interaction between an autonomous vehicle and a prospective passenger consumes energy (e.g., fuel consumed by the autonomous vehicle and/or battery power consumed by the remote computing device used by the prospective passenger) and time (e.g., time the prospective passenger spends waiting for and boarding the autonomous vehicle). Reducing the number or duration of these interactions can therefore result in lower energy usage by both the autonomous vehicle and the prospective passenger's device.

The disclosed technology also provides enhanced privacy for prospective passengers by receiving passenger requests and then creating trip data that does not include any of the passenger's personal information (e.g., the prospective passenger's name, address, and/or payment information). The trip data can additionally be encrypted to further protect the security of prospective passengers, even if the trip data is intercepted by an unauthorized third party.

Further, by selectively broadcasting and detecting one or more signals based in part on the location of structures, objects, or other obstacles that block or interfere with the signals, the disclosed technology can more efficiently conserve scarce energy resources (e.g., the battery resources of remote computing devices such as smartphones). For example, the disclosed technology can access map data that indicates the locations of structures and objects that block the one or more signals, and can adjust the sensitivity of the sensors that detect the signals based on the locations of those interfering or blocking objects.

“Accordingly, the disclosed technology provides more effective passenger-vehicle coordination, improved vehicle-passenger interactions, enhanced passenger privacy, and battery savings through the use of map data that indicates the locations of structures and objects that block or interfere with signals. It also results in time and resource savings due to more efficient vehicle-passenger detection and more efficient prospective passenger boarding.

“With reference now to FIGS. 1-9, example embodiments of the present disclosure will be discussed in further detail. FIG. 1 depicts a diagram of an example system 100 according to example embodiments of the present disclosure. The system 100 can include a plurality of vehicles 102; a vehicle 104; a vehicle computing system 108; one or more computing devices 110; one or more data acquisition systems 112; one or more human machine interface systems 118; other vehicle systems 120; a communications system 122; a network 124; one or more sensors 128; one or more remote computing devices 130; and an operations computing system 150.

“The operations computing system 150 can be associated with a service provider that provides one or more vehicle services to a plurality of users via a fleet of vehicles that includes, for example, the vehicle 104. The vehicle services can include transportation services (e.g., rideshare services), courier services, delivery services, and/or other types of services.

“The operations computing system 150 can include multiple components that perform various operations and functions. For example, the operations computing system 150 can include and/or otherwise be associated with one or more remote computing devices that are remote from the vehicle 104. The one or more remote computing devices can include one or more processors and one or more memory devices. The one or more memory devices can store instructions that, when executed by the one or more processors, cause the one or more processors to perform operations and functions associated with operation of the vehicle, including receiving data (e.g., trip data), controlling operation of the vehicle 104, and/or activating one or more vehicle systems (e.g., the doors and/or storage areas of the vehicle 104).

“For example, the operations computing system 150 can be configured to communicate with the vehicle 104 and/or its users to coordinate the vehicle services provided by the vehicle 104. The operations computing system 150 can manage a database that includes vehicle status data. The vehicle status data can include the locations of the plurality of vehicles 102 (e.g., a latitude and longitude of each vehicle), the availability of a vehicle (e.g., whether the vehicle is available for passenger or cargo pick-up), and/or the state of objects external to the vehicle (e.g., the location and/or proximity of external objects relative to the vehicle).

For example, an indication, record, or other data indicative of the state of one or more objects, including the proximity and/or location of the one or more objects, can be stored locally in one or more memory devices of the vehicle 104. The vehicle 104 can also provide data indicative of the state of the one or more objects (e.g., the proximity and/or location of the one or more objects) within a predefined distance of the vehicle, and this data can be stored in one or more memory devices associated with the operations computing system 150.

“The operations computing system 150 can communicate with the vehicle 104 via one or more communications networks, including the communications network 140. The communications network 140 can send and/or receive signals (e.g., electronic signals) or data (e.g., data from a computing device) and can include any combination of wired (e.g., twisted pair cable) and/or wireless communication mechanisms (e.g., cellular, wireless, satellite, microwave, and radio frequency) and/or any desired network topology (or topologies). For example, the communications network 140 can include a local area network (e.g., an intranet), a wide area network (e.g., the Internet), a wireless LAN (e.g., via Wi-Fi), a cellular network, a VHF network, an HF network, and/or any other suitable communications network (or combination thereof) for transmitting data to and/or from the vehicle 104.

The vehicle 104 can be a ground-based vehicle (e.g., an automobile), an aircraft, or another type of vehicle. The vehicle 104 can be an autonomous vehicle that can perform various actions, including driving, navigating, and/or operating with minimal or no interaction from a human driver. The autonomous vehicle 104 can be configured to operate in one or more modes, including a fully autonomous operational mode, a semi-autonomous operational mode, a park mode, and/or a sleep mode. A fully autonomous (e.g., self-driving) operational mode is one in which the vehicle 104 can perform driving and navigational operations without interaction from a human driver. A semi-autonomous operational mode is one in which the vehicle 104 can operate with some interaction from a human driver. The vehicle 104 can use the park and/or sleep modes between operational modes while it performs various actions, such as waiting to provide a subsequent vehicle service or recharging.

“The vehicle 104 can include a vehicle computing system 108. The vehicle computing system 108 can include various components that perform various functions and operations. For example, the vehicle computing system 108 can include one or more computing devices 110 on board the vehicle 104. The one or more computing devices 110 can include one or more processors and one or more memory devices, each of which is on board the vehicle 104. The one or more memory devices can store instructions that, when executed by the one or more processors, cause the one or more processors to perform operations and functions, such as taking the vehicle 104 out of service, stopping the motion of the vehicle 104, or determining the state of one or more objects.

“The one or more computing devices 110 can implement, include, or otherwise be associated with various other systems on board the vehicle 104. These on-board systems can be connected to the one or more computing devices 110. For example, the one or more computing devices 110 can be configured to communicate with one or more data acquisition systems 112, an autonomy system 114 (e.g., including a navigation system), one or more control systems 116, one or more human machine interface systems 118, other vehicle systems 120, and/or a communications system 122. These systems can be connected to the one or more computing devices 110 via a network 124. The network 124 can include one or more data buses (e.g., a controller area network (CAN) bus), an on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links. The one or more computing devices 110 and the other on-board systems can send and/or receive data and/or messages among themselves via the network 124.

“The one or more data acquisition systems 112 can include various devices that acquire data associated with the vehicle 104. This can include data associated with one or more of the vehicle's systems (e.g., health data), the vehicle's interior, the vehicle's exterior, the vehicle's surroundings, and/or the vehicle's users. The one or more data acquisition systems 112 can include, for example, one or more image capture devices 126. The one or more image capture devices 126 can include one or more cameras, light detection and ranging (LIDAR) systems, two-dimensional image capture devices, three-dimensional image capture devices, static image capture devices, dynamic (e.g., rotating) image capture devices, video capture devices (e.g., video recorders), lane detectors, scanners, optical readers, electric eyes, and/or other suitable types of image capture devices. The one or more image capture devices 126 can be located in the interior and/or on the exterior of the vehicle 104. The one or more image capture devices 126 can be configured to acquire image data that can be used to operate the vehicle 104 in an autonomous mode. For example, the one or more image capture devices 126 can acquire image data that allows the vehicle 104 to implement one or more machine vision techniques (e.g., to detect objects in the surrounding environment).

“Additionally or alternatively, the one or more data acquisition systems 112 can include one or more sensors 128. The one or more sensors 128 can include motion sensors, pressure sensors, mass sensors, weight sensors, impact sensors, temperature sensors, humidity sensors, RADAR, sonar, radios (e.g., for obtaining information about the vehicle's surroundings), GPS equipment, proximity sensors, and/or any other types of sensors for obtaining data indicative of parameters associated with the vehicle 104 and/or relevant to the operation of the vehicle 104.

“The one or more data acquisition systems 112 can include one or more sensors 128 dedicated to obtaining data associated with a particular aspect of the vehicle 104, such as the fuel tank, engine, oil compartment, and/or wipers. The one or more sensors 128 can also include sensors associated with one or more mechanical and/or electrical components of the vehicle 104. For example, one or more of the sensors 128 can be configured to detect whether a vehicle door, trunk, or gas cap is in an open or closed position. The data acquired by the one or more sensors 128 can also be used to detect other vehicles and/or objects, road conditions (e.g., curves, potholes, dips, bumps, and/or changes in grade), and/or to measure the distance between the vehicle 104 and other vehicles and/or objects.

“The vehicle computing system 108 can also be configured to obtain map data. For example, a computing device of the vehicle (e.g., within the autonomy system 114) can be configured to receive map data from one or more remote computing devices, including the operations computing system 150 and/or the one or more remote computing devices 130 (e.g., devices associated with a geographic mapping service provider). The map data can include any combination of two-dimensional and/or three-dimensional geographic map data associated with the area in which the vehicle is, was, or will be travelling.

“The data acquired from the one or more data acquisition systems 112, the map data, and/or other data can be stored in one or more memory devices on board the vehicle 104. The on-board memory devices can have limited storage capacity. As such, the data stored in the one or more memory devices may need to be periodically removed, deleted, and/or downloaded to another memory device (e.g., a database of the service provider). The one or more computing devices 110 can be configured to monitor the memory devices and/or communicate with an associated processor to determine how much data is stored in a particular memory device. One or more of the other on-board systems (e.g., the autonomy system 114) can also be configured to access the data stored in the one or more memory devices.

“The autonomy system 114 can be configured to allow the vehicle 104 to operate in an autonomous mode. For example, the autonomy system 114 can obtain data associated with the vehicle 104 (e.g., data acquired by the one or more data acquisition systems 112). The autonomy system 114 can also obtain the map data. The autonomy system 114 can control various functions of the vehicle 104 based in part on the acquired data and/or the map data. For example, the autonomy system 114 can include various models to perceive road features, signage, objects, people, animals, and the like based on the data from the one or more data acquisition systems 112, the map data, and/or other data.

“In some implementations, the autonomy system 114 can include one or more machine-learned models that use the data acquired by the one or more data acquisition systems 112, the map data, and/or other data to help operate the autonomous vehicle. The acquired data can also be used to detect other vehicles and/or objects, road conditions (e.g., curves, potholes, dips, bumps, changes in grade, or the like), and to measure the distance between the vehicle 104 and other vehicles and/or objects. The autonomy system 114 can be configured to predict the position and/or movement (or lack thereof) of such elements, for example using one or more odometry techniques. Based at least in part on those predictions, the autonomy system 114 can plan the motion of the vehicle 104. The autonomy system 114 can implement the planned motion to appropriately navigate the vehicle 104 with minimal or no human intervention. For example, the autonomy system 114 can include a navigation system configured to direct the vehicle 104 to a destination location. The autonomy system 114 can regulate vehicle speed, acceleration, deceleration, steering, and/or the operation of other components to operate in an autonomous mode.

“The autonomy system 114 can determine a position and/or route for the vehicle 104 in real time or near real time. For example, using acquired data, the autonomy system 114 can calculate one or more different potential routes (e.g., a route every fraction of a second). The autonomy system 114 can then select the route to take and cause the vehicle 104 to navigate accordingly. By way of example, the autonomy system 114 can calculate one or more different straight paths, one or more lane-change paths, one or more turning paths, and/or one or more stopping paths. The vehicle 104 can select a path based in part on the acquired data, current traffic factors, and/or travelling conditions. In some implementations, different weights can be applied to different criteria when selecting a path, as sketched below. Once a path has been selected by the autonomy system 114, the vehicle 104 can follow the chosen path.
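As an illustration of the weighted selection mentioned above, the sketch below scores each candidate path as a weighted sum of simple cost terms; the cost names and weight values are hypothetical stand-ins for whatever factors an implementation would actually use.

```python
# Illustrative weighted path selection; cost terms and weights are assumptions.
WEIGHTS = {"travel_time_s": 1.0, "traffic_delay_s": 2.0, "comfort_penalty": 0.5}

def path_score(path_costs: dict) -> float:
    """Lower is better: weighted sum of per-path cost terms."""
    return sum(WEIGHTS[name] * value for name, value in path_costs.items())

def choose_path(candidate_paths: dict) -> str:
    """Pick the candidate path (by name) with the lowest weighted score."""
    return min(candidate_paths, key=lambda name: path_score(candidate_paths[name]))

# Example with three hypothetical candidates; "lane_change" wins here.
best = choose_path({
    "straight":    {"travel_time_s": 30, "traffic_delay_s": 10, "comfort_penalty": 0},
    "lane_change": {"travel_time_s": 28, "traffic_delay_s": 5,  "comfort_penalty": 4},
    "stopping":    {"travel_time_s": 90, "traffic_delay_s": 0,  "comfort_penalty": 0},
})
```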

“The one or more control systems 116 of the vehicle 104 can be configured to control one or more aspects of the vehicle 104. For example, the one or more control systems 116 can control one or more access points of the vehicle 104. The one or more access points can include features such as the vehicle's door locks, trunk lock, hood lock, fuel tank access, latches, and/or other mechanical access features that can be adjusted between one or more states, positions, and/or locations. For example, the one or more control systems 116 can control an access point (e.g., a door lock) to adjust the access point between a locked position and an unlocked position. Additionally or alternatively, the one or more control systems 116 can be configured to control one or more other electrical features of the vehicle 104 that can be adjusted between one or more states. For example, the one or more control systems 116 can control electrical features such as hazard lights and/or microphones to adjust those features between a first state (e.g., off) and a second state (e.g., on).

“The one or more human machine interface systems 118 can be configured to allow interaction between a user (e.g., a human), the vehicle 104 (e.g., the vehicle computing system 108), and/or a third party (e.g., an operator associated with the service provider). The one or more human machine interface systems 118 can include a variety of interfaces for the user to input and/or receive information from the vehicle computing system 108. For example, the one or more human machine interface systems 118 can include a graphical user interface, a direct manipulation interface, a web-based user interface, a touch user interface, an attentive user interface, a conversational and/or voice interface (e.g., via text messages, a chatter bot, a conversational agent, and/or an interactive voice response (IVR) system), a gesture interface, and/or other types of interfaces. The one or more human machine interface systems 118 can include one or more input devices, such as touchscreens, keypads, touchpads, knobs, buttons, sliders, switches, mice, gyroscopes, microphones, and/or other hardware interfaces. The one or more human machine interface systems 118 can also include one or more output devices (e.g., speakers, lights, and/or display devices) to receive and output data associated with the interfaces.

“The other vehicle systems 120 can be configured to monitor and/or control various other aspects of the vehicle 104. For example, the other vehicle systems 120 can include software update monitors, engine control units, transmission control units, on-board memory devices, and the like. The one or more computing devices 110 can be configured to communicate with the other vehicle systems 120 to receive data and/or to send signals. By way of example, a software update monitor can provide, to the one or more computing devices 110, data indicative of the current status of the software running on one or more of the on-board systems and/or whether a particular system requires a software update.

“The communications system 122 can be used by the vehicle computing system 108 and its one or more computing devices 110 to communicate with other computing devices. For example, the vehicle computing system 108 can use the communications system 122 to communicate with one or more user devices over the networks. In some implementations, the communications system 122 can allow the one or more computing devices 110 to communicate with one or more of the systems on board the vehicle 104. The vehicle computing system 108 can also use the communications system 122 to communicate with the operations computing system 150 and/or the one or more remote computing devices 130 over the networks (e.g., via one or more wireless signal connections). The communications system 122 can include any suitable components for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, or other suitable components that can help facilitate communication with one or more remote computing devices.

“In some implementations, the one or more computing devices 110 on board the vehicle 104 can obtain vehicle data indicative of one or more parameters associated with the vehicle 104. The one or more parameters can include information associated with the health and maintenance of the vehicle 104, the vehicle computing system 108, and/or one or more of the on-board systems. For example, the one or more parameters can include fuel level, engine condition, tire pressure, conditions associated with the vehicle's interior, conditions associated with the vehicle's exterior, mileage, time until the next scheduled maintenance, time since the last maintenance, the amount of data stored in the on-board memory devices, the charge level of a vehicle energy storage device, the current status of software running on the vehicle, any required software updates, and/or other health and maintenance data of the vehicle 104.

At least a portion of the vehicle data indicative of the parameters can be provided by one or more of the systems on board the vehicle 104. The one or more computing devices 110 can be configured to request the vehicle data from the on-board systems on a scheduled and/or as-needed basis. In some implementations, one or more of the on-board systems can be configured to provide vehicle data indicative of the one or more parameters to the one or more computing devices 110, whether periodically, continuously, or as needed. For example, the one or more data acquisition systems 112 can provide a parameter indicative of the vehicle's fuel level and/or the charge level of a vehicle energy storage device. In some implementations, one or more of the parameters can be indicative of user input. For example, the one or more human machine interface systems 118 can receive user input (e.g., via an interior display device of the vehicle 104) and can provide data indicative of that user input to the one or more computing devices 110.

“In some implementations, the one or more remote computing devices 130 can receive user input and provide data indicative of the user input to the one or more computing devices 110. The one or more computing devices 110 can obtain the data indicative of the user input from the one or more remote computing devices 130 (e.g., via a wireless communication).

“The one or more computing devices 110 can also be configured to determine the current state of the vehicle 104 and of the environment external to the vehicle 104, including the state of objects such as pedestrians, cyclists, motor vehicles (e.g., trucks and/or automobiles), roads, waterways, and/or buildings. The one or more computing devices 110 can also be configured to determine one or more properties of one or more signals (e.g., radio signals) detected by the one or more sensors 128, including whether the one or more signals are associated with a request for an autonomous vehicle (e.g., the autonomous vehicle 104) at a pick-up location.

“FIG. 2 depicts an environment including a vehicle and a remote computing device according to example embodiments of the present disclosure. One or more of the actions or events depicted in FIG. 2 can be implemented or performed by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. As illustrated, FIG. 2 shows an environment 200 that includes a potential passenger 210, a remote computing device 212, signals 214, and a vehicle 220.

“In the environment 200, the vehicle 220 (e.g., an autonomous vehicle) is located five meters below street level inside the tunnel 234, a subterranean tunnel with a tunnel exit 236 that faces the street 238. In this example, the vehicle 220 has received trip data requesting that the potential passenger 210 be picked up at the pick-up location 240. Further, the potential passenger 210 is using the remote computing device 212 (e.g., a smartphone), which can transmit and/or receive signals (e.g., radio signals), including the signals 214. Because the vehicle 220 is inside the tunnel 234, which lies within the signal disruption area 232, the vehicle 220 cannot detect the signals 214 through the building 242.

“In some embodiments, the vehicle 220 can access map data that includes an indication of the layout and geography of the environment 200, including an indication of the areas (e.g., the signal disruption area 232) in which one or more signals (e.g., the signals 214) may be blocked or otherwise disrupted. The vehicle 220 is travelling along the path 222, which will take it through the tunnel exit 236 and out of the signal disruption area 232. Once the vehicle 220 is on the street 238, it can detect the signals 214 and travel more precisely to the potential passenger 210, who is waiting at the pick-up area 244.

“FIG. 3 depicts an example of a remote computing device according to example embodiments of the present disclosure. One or more of the actions or events depicted in FIG. 3 can be implemented or performed by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. FIG. 3 shows a remote computing device 300 that can exchange (e.g., send and/or receive) data with one or more computing systems, including the vehicle 104 and/or the vehicle computing system 108 shown in FIG. 1. As illustrated, FIG. 3 shows the remote computing device 300 with a display area 302, a visual indication 304, and a haptic indication 306.

“In this example, the remote computing device 300 (e.g., a smartphone) of a user who has requested an autonomous vehicle has received one or more signals. The one or more signals from an autonomous vehicle (e.g., the vehicle 104 or the vehicle 220) include trip data that can be used to generate the visual indication 304 (“ATTENTION: Autonomous vehicle at corner of Main Street and Lincoln Avenue.”) in the display area 302 of the remote computing device 300.

“Further, the signals from the autonomous vehicle (e.g., the vehicle 104 or the vehicle 220) can include trip data that can be used to generate the haptic indication 306 (e.g., one or more vibrations generated by the remote computing device 300). The remote computing device 300 can generate the haptic indication 306 to indicate that the requested vehicle is nearby. Because the haptic indication 306 can be perceived without the user looking at or otherwise interacting with the remote computing device 300, the user can immediately receive an indication that the requested vehicle is nearby.

“FIG. 4 depicts an environment including multiple vehicles and a remote computing device according to example embodiments of the present disclosure. One or more of the actions or events depicted in FIG. 4 can be implemented or performed by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. As illustrated, FIG. 4 shows an environment 400 that includes a remote computing device 402, a trip identifier 404, a display area 406, a vehicle 412, a trip identifier 414, a display area 416, a vehicle 422, a trip identifier 424, and a display area 426.

“In the environment 400, the vehicle 422 (e.g., an autonomous vehicle) has received trip data, including the trip identifier 424, which can be shown on the display area 426 (e.g., a liquid crystal display screen) as a circular shape with a triangle inside it and the letter “C” and the number “6” inside the triangle. A prospective passenger can use the trip identifier to identify the vehicle they are waiting for.

“In this example, the remote computing device 402 (e.g., a smartphone) used by a prospective passenger waiting for the vehicle 422 has received one or more signals. These signals include trip data corresponding to the trip identifier 404 displayed in the display area 406 of the remote computing device 402. As the prospective passenger associated with the remote computing device 402 waits for the vehicle 422, a variety of vehicles, including the vehicle 412 and the vehicle 422, may be waiting within view of the passenger. The prospective passenger can compare the trip identifier 404 displayed in the display area 406 with the trip identifier 414 displayed in the display area 416 and with the trip identifier 424 displayed in the display area 426. The similarity of the trip identifier 404 and the trip identifier 424 can be used to determine that the vehicle 422 is the vehicle that was requested.

“FIG. 5 depicts a flow diagram of an example method 500 of autonomous vehicle operation according to example embodiments of the present disclosure. One or more portions of the method 500 can be implemented by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. Moreover, one or more portions of the method 500 can be implemented as an algorithm on the hardware components of the devices described herein (e.g., as in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 5 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.

“At 502, the method 500 can include receiving trip data including information associated with a request for an autonomous vehicle at a pick-up location. For example, the vehicle 104 and/or the vehicle computing system 108 can receive signals and/or data via the communications system 122. The vehicle 104 and/or the vehicle computing system 108 can also receive the trip data via a network that includes the network 124. The trip data can, for example, be received through one or more communication networks that include wired and/or wireless communication systems.

The trip data can be received from one or more sources, including remote computing systems used to exchange (e.g., send and/or receive) information associated with the scheduling and management of autonomous vehicle trips. The trip data can include a pick-up location (e.g., an identifier of a geographic location, including a latitude and longitude, a set of directions to the pick-up location, and/or an address of the pick-up location); the current location of the prospective passenger; a pick-up time (e.g., a time at which the prospective passenger will meet the vehicle); and/or a prospective passenger cargo status (e.g., an indication of whether the prospective passenger is carrying cargo, the area of the vehicle in which the cargo will be carried, and/or the size and/or weight of the cargo).

“At 504, the method 500 can include traveling, based in part on the trip data, to the pick-up location. For example, a computing system, such as the vehicle computing system 108, can control the autonomous vehicle's travel to the pick-up location by directing the vehicle 104 to the pick-up location. As an example, a computing system associated with the autonomous vehicle (e.g., the vehicle computing system 108) can determine a path between the vehicle's current location and the pick-up location. The computing system (e.g., the vehicle computing system 108) can then activate one or more of the vehicle systems (e.g., the sensors, navigation system, engine, brakes, and/or steering system) to direct the autonomous vehicle along the path to the pick-up location.

“At 506, the method 500 can include detecting one or more signals associated with the trip data. For example, the vehicle (e.g., the vehicle 104) can include one or more components (e.g., the communications system 122) that can detect one or more signals (e.g., one or more radio signals). In some embodiments, the computing system can determine the source, direction, magnitude, strength, and/or frequency of the signals. The trip data in the one or more signals can also be analyzed to determine, for example, whether it contains errors. One or more error detection techniques (e.g., a parity bit check) and/or error correction techniques (e.g., a Hamming code) can be applied to the trip data in the one or more signals to determine whether an error occurred in a signal (e.g., the signal is corrupted or incomplete) and/or to correct a detected error in the data.
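For illustration, the fragment below applies a simple even-parity check to a received trip-data payload; this is a minimal sketch of the parity bit check mentioned above, with a hypothetical framing in which the sender appends a parity byte.

```python
def even_parity_ok(payload: bytes) -> bool:
    """Check that the total number of 1 bits in the payload is even.

    Hypothetical framing: the sender appends a parity byte so that the
    overall bit count of the payload (data + parity) is even.
    """
    ones = sum(bin(byte).count("1") for byte in payload)
    return ones % 2 == 0

# Example: an odd bit count indicates corruption, so the payload is rejected.
received = bytes([0b10110010, 0b00000001])
if not even_parity_ok(received):
    pass  # e.g., discard the signal or request retransmission
```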

“At 508, the method 500 can include determining whether the one or more signals satisfy one or more broadcast criteria for authorizing access to the autonomous vehicle. The broadcast criteria can include, for example, a signal strength exceeding or falling below a threshold, a signal frequency being within a predetermined frequency range, and/or the signals being emitted from a particular location or in a particular direction.

“For example, the vehicle computing system 108 can determine whether the broadcast criteria have been satisfied by comparing one or more properties of the signals against thresholds associated with those properties. The vehicle computing system 108 can, for instance, compare a measured signal strength (e.g., in microvolts per meter) with a predetermined signal strength threshold, such that the broadcast criteria are satisfied when the measured signal strength of the one or more signals exceeds the threshold.
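The check at 508 might combine several such comparisons; the sketch below evaluates an assumed set of criteria (a strength threshold, a frequency band, and a maximum bearing offset toward the pick-up direction), with all threshold values chosen purely for illustration.

```python
# Illustrative broadcast-criteria evaluation; thresholds are assumptions.
MIN_STRENGTH_UV_PER_M = 150.0          # minimum field strength (microvolts/meter)
FREQ_BAND_HZ = (2.400e9, 2.4835e9)     # e.g., a 2.4 GHz band
MAX_BEARING_OFFSET_DEG = 45.0          # signal roughly from the pick-up direction

def broadcast_criteria_satisfied(strength_uv_m: float,
                                 frequency_hz: float,
                                 bearing_offset_deg: float) -> bool:
    """Return True only if every configured criterion is met."""
    return (strength_uv_m >= MIN_STRENGTH_UV_PER_M
            and FREQ_BAND_HZ[0] <= frequency_hz <= FREQ_BAND_HZ[1]
            and abs(bearing_offset_deg) <= MAX_BEARING_OFFSET_DEG)
```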

“If the broadcast criteria are satisfied, the method 500 can proceed to 510. If the broadcast criteria are not satisfied, the method 500 can end or can return to 502, 504, or 506.

“At 510, the method 500 can include activating one or more vehicle systems. The activation of the one or more vehicle systems can be associated with fulfilling the request for the autonomous vehicle at the pick-up location. For example, the vehicle computing system 108 can generate control signals to activate or control one or more vehicle systems (e.g., engine/motor systems, braking systems, steering systems, and/or door control systems) of the vehicle 104.

“In some embodiments, activating the one or more vehicle systems can include unlocking an entrance to the autonomous vehicle (e.g., unlocking one or more doors to the vehicle's passenger compartment for an authorized passenger); locking an entrance (e.g., locking one or more doors to the vehicle's trunk or other cargo/storage areas); opening a cargo area of the autonomous vehicle for an authorized passenger (e.g., opening one or more doors to the vehicle's trunk or other cargo/storage areas); and/or braking the vehicle to a stop.

“FIG. 6 depicts a flow diagram of an example method 600 of autonomous vehicle operation according to example embodiments of the present disclosure. One or more portions of the method 600 can be implemented by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104 and the vehicle computing system 108 shown in FIG. 1. Moreover, one or more portions of the method 600 can be implemented as an algorithm on the hardware components of the devices described herein (e.g., as in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 6 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.

“At 602, the method 600 can include determining, based in part on one or more properties of one or more signals, the proximity of the autonomous vehicle to a source of the one or more signals. For example, the vehicle computing system 108 can determine the distance between the autonomous vehicle 104 and one of the remote computing devices 130 (e.g., a smartphone used by a prospective passenger of the vehicle 104) using the received signal strength of the one or more signals detected by the one or more sensors 128. The one or more properties can include the received signal strength and/or the signal-to-noise ratio of the one or more signals.

“In some embodiments, satisfying the one or more broadcast criteria can include the proximity to the source of the one or more signals being within a predetermined distance. For example, satisfying the one or more broadcast criteria can include determining, based on one or more properties of the signals, that the autonomous vehicle and the remote computing device are within ten meters of one another.

“At 604, the method 600 can include generating one or more indications based in part on the proximity of the autonomous vehicle to the source of the one or more signals. For example, the vehicle computing system 108 can generate one or more control signals to activate or control one or more output devices (e.g., display devices, audio devices, lights, and/or tactile devices) of the vehicle, which can then output one or more indications (e.g., an indication that the vehicle 104 is available). For instance, one or more haptic indications (e.g., vibrations of a door handle) can be used to indicate that the vehicle 104 is available.

“In some embodiments, the type, magnitude, or frequency of the one or more indications can be determined based in part on the distance between the autonomous vehicle and the source of the one or more signals. As an example, as the vehicle gets closer to a prospective passenger, the strength or frequency of one or more light indications (e.g., blinking headlights) can vary in proportion to the distance between them.

“FIG. 7 depicts a flow diagram of an example method 700 of autonomous vehicle operation according to example embodiments of the present disclosure. One or more portions of the method 700 can be implemented by one or more devices (e.g., one or more computing devices) or systems, including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. Moreover, one or more portions of the method 700 can be implemented as an algorithm on the hardware components of the devices described herein (e.g., as in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 7 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.

“At 702, the method 700 can include determining a location associated with the autonomous vehicle. The location of the autonomous vehicle can be determined based in part on inputs such as signals from external sources and/or the use of sensors that determine the vehicle's position relative to other objects. For example, image sensors can be used to determine the vehicle's proximity to a landmark with a known location. As another example, the vehicle computing system 108 can receive signals from a source of radio-navigation data (e.g., GPS) and, based on those signals, determine the location of the vehicle 104.

“In some embodiments, the broadcast criteria can be based in part on the autonomous vehicle's location. For example, the broadcast criteria can include the autonomous vehicle being within a specific location (e.g., a pick-up area), not being within a specific location (e.g., on a busy highway), and/or being within a predetermined distance of a location (e.g., within twenty meters of a pick-up location).

“At 704, the method 700 can include broadcasting one or more signals that include information associated with the location of the autonomous vehicle. For example, the vehicle 104 can transmit one or more signals, including information about the location of the vehicle (e.g., the latitude and longitude of the vehicle 104), via the communications system 122. The autonomous vehicle can transmit the one or more signals indicating its location (e.g., a latitude and longitude and/or a location relative to a pre-established reference point) to remote computing systems using one or more communication systems associated with the vehicle.

“Further, the one or more signals can include radio signals or cellular signals, and can use various communication protocols, including Bluetooth or Wi-Fi. In some embodiments, the trip data can include one or more broadcast instructions for the devices or systems that receive the trip data. The one or more broadcast instructions can include, for example, instructions on when and for how long the autonomous vehicle is to broadcast the one or more signals; the locations (e.g., latitudes and longitudes) from which to broadcast the one or more signals; and/or the signal protocol (e.g., a wireless signal protocol) to use when broadcasting the one or more signals.

“In some embodiments, the broadcast criteria can be based in part on the location of the autonomous vehicle. For example, the broadcast criteria can be based on the autonomous vehicle being located within a certain area or within a specified proximity (e.g., a distance in meters) of the pick-up area.

“At 706, the method 700 can include determining, based in part on map data and the location of the autonomous vehicle, a signal disruption value (e.g., a data structure including one or more numerical values associated with an estimated decrease in the one or more signals being received by a receiving device of the autonomous vehicle). For example, the vehicle computing system 108 can determine, based in part on the map data, whether one or more structures (e.g., buildings) will block the one or more signals. Depending on the extent to which structures block the one or more signals, the vehicle can use different frequencies or communication channels (e.g., different wireless frequencies).

“The map data can indicate one or more structures, conditions, or features that could block or interfere with detection of the one or more signals. The one or more structures, conditions, or features can include buildings (e.g., office buildings and/or residential homes); tunnels (e.g., tunnels through mountainsides and/or under waterways); vegetation (e.g., trees and/or shrubs); weather conditions (e.g., lightning, hail, rain, snow, and/or fog); and/or sources of electromagnetic transmissions (e.g., narrowband and/or broadband interference from electrical devices). The extent of blockage or interference can depend on factors including the type of material involved (e.g., a concrete wall can block a signal more than a wooden wall of equivalent thickness) and the size of the structures.
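
The disclosure leaves the form of the signal disruption value open; the sketch below illustrates one plausible approach, summing invented attenuation weights for the kinds of map features listed above and switching communication channels when the total is high.

```python
# Hedged sketch: one way a signal disruption value could be derived from map
# data. The structure types, attenuation weights, and thresholds below are
# invented for illustration; the disclosure only says the value reflects an
# estimated decrease in received signals.

from typing import Iterable

# Assumed per-obstruction attenuation weights (arbitrary units).
ATTENUATION_WEIGHTS = {
    "concrete_building": 3.0,
    "wood_building": 1.0,
    "tunnel": 4.0,
    "dense_vegetation": 0.5,
    "heavy_rain": 1.5,
    "broadband_interference": 2.0,
}

def signal_disruption_value(obstructions: Iterable[str]) -> float:
    """Sum the assumed attenuation of map features between vehicle and passenger."""
    return sum(ATTENUATION_WEIGHTS.get(kind, 0.0) for kind in obstructions)

def choose_channel(disruption: float) -> str:
    """Switch to a lower-frequency (better-penetrating) channel when disruption is high."""
    return "sub_ghz" if disruption >= 4.0 else "2_4_ghz"

if __name__ == "__main__":
    value = signal_disruption_value(["concrete_building", "heavy_rain"])
    print(value, choose_channel(value))  # 4.5 sub_ghz
```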

“In some embodiments, detection of the one or more signals can be based in part on the signal disruption value. The signal disruption value can also be used to modify the sensitivity of the autonomous vehicle's detection of the one or more signals and/or the protocols and frequencies associated with those signals.

“FIG. 8 shows a flow diagram of an example method 800 of autonomous vehicle operation according to example embodiments of the present disclosure. One or more portions of the method 800 can be implemented by one or more devices (e.g., one or more computing devices or systems), including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. Moreover, one or more portions of the method 800 can be implemented as an algorithm on the hardware components of the devices described herein (e.g., as in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 8 depicts elements in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art will understand that the elements can be modified, altered, removed, and/or combined without deviating from the scope of the present disclosure.

“At 802, the method 800 can include generating a trip identifier to be used in identifying the autonomous vehicle. The trip identifier can include information associated with one or more symbols (e.g., letters, words, numbers, pictograms, pictures, icons, colors, and/or patterns) that can be used to identify the trip. For example, the vehicle computing system 108 can access a data structure containing data associated with trip identifiers (e.g., numbers, letters, and words) and generate the trip identifier based on one or more of those trip identifiers and/or information associated with a random number generator.

“In some embodiments, the trip identifier can include information associated with the one or more symbols and can be based in part on a randomly generated trip identifier or selected from a plurality of trip identifiers that have not been used within a predetermined time period. For example, a trip identifier can be generated randomly using a pseudorandom number generator and/or a random seed, which allows various elements of the one or more symbols (e.g., text, numbers, colors, shapes, sizes, and/or patterns) to be combined. As another example, a plurality of trip identifiers (e.g., trip identifiers stored in a database of trip identifiers) can be accessed, and a trip identifier that has not been used within the predetermined time period (e.g., not used within a specified duration) can be selected as the trip identifier.
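
One way the trip identifier generation described above could be sketched is shown below; the symbol sets, the identifier format, and the 24-hour reuse window are assumptions for illustration.

```python
import random
import time

# Hedged sketch: generating a trip identifier from randomly combined symbol
# elements while avoiding identifiers used within a recent time window. The
# element sets and the 24-hour reuse window are assumptions for illustration.

COLORS = ["red", "green", "blue", "white", "black"]
SHAPES = ["circle", "square", "triangle"]
LETTERS = "ABCDEFGHJKLMNPQRSTUVWXYZ"  # visually unambiguous subset (assumption)

def generate_trip_identifier(recently_used: dict,
                             reuse_window_s: float = 24 * 3600,
                             rng: random.Random = random.Random()) -> str:
    """Return a symbol combination not issued within the reuse window."""
    now = time.time()
    while True:
        candidate = (f"{rng.choice(COLORS)}-{rng.choice(SHAPES)}-"
                     f"{rng.choice(LETTERS)}{rng.randint(0, 99)}")
        last_used = recently_used.get(candidate)
        if last_used is None or now - last_used > reuse_window_s:
            recently_used[candidate] = now
            return candidate

if __name__ == "__main__":
    used: dict = {}
    print(generate_trip_identifier(used))  # e.g. "white-square-S39"
```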

“At 804, the method 800 can include broadcasting one or more signals including the trip identifier. For example, the trip identifier can include a white square with a black letter “S” and a green number “39” inside the square. A remote computing device associated with a prospective passenger can receive the one or more signals including the trip identifier and display the trip identifier on a display portion of the remote computing device. The autonomous vehicle can also display the same trip identifier (e.g., the trip identifier displayed on the remote computing device) on an exterior display area of the vehicle (e.g., a panel on the roof, a window, or a body panel of the vehicle). In this way, the trip identifier allows a prospective passenger to readily identify the autonomous vehicle that has been requested for their trip.

“FIG. 9 depicts an example system 900 according to example embodiments of the present disclosure. The system 900 can include a vehicle computing system 908, which can include some or all of the features of the vehicle computing system 108 shown in FIG. 1; one or more computing devices 910, which can include some or all of the features of the one or more computing devices 110; a communication interface 912; one or more processors 914; one or more memory devices 920; a memory system 922; a memory system 924; one or more input devices 926; one or more output devices 928; one or more computing devices 930, which can include some or all of the features shown in FIG. 1; one or more input devices 932; one or more output devices 934; a network 940, which can include some or all of the features of the communications network 140 shown in FIG. 1; and an operations computing system 950, which can include some or all of the features of the operations computing system 150 shown in FIG. 1.”

The vehicle computing system 908 can include the one or more computing devices 910. The one or more computing devices 910 can include one or more processors 914 and one or more memory devices 920, each of which can be included on-board a vehicle including the vehicle 104. The one or more processors 914 can include any suitable processing device, such as a microprocessor, a microcontroller, an integrated circuit, an application specific integrated circuit (ASIC), and/or a digital signal processor (DSP). The one or more processors 914 can include a single processor or a plurality of processors that are operatively and/or selectively connected. The one or more memory devices 920 can include one or more non-transitory computer-readable storage media, such as RAM, flash memory devices, and/or magnetic disks.

“The one or more memory devices 920 can store data that can be accessed and manipulated by the one or more processors 914. The one or more memory devices 920 on-board a vehicle including the vehicle 104 can include a memory system 922 that can store computer-readable instructions that can be executed by the one or more processors 914. The memory system 922 can include software written in any suitable programming language that can be implemented in hardware (e.g., computing hardware). Further, the memory system 922 can include instructions that can be executed in logically and/or virtually separate threads on the one or more processors 914. The memory system 922 can include any set of instructions that, when executed by the one or more processors 914, cause the one or more processors 914 to perform operations.

“For example, the one or more memory devices 920 on-board a vehicle including the vehicle 104 can store instructions, including specialized instructions, that, when executed by the one or more computing devices 910 on-board the vehicle, cause the one or more computing devices 910 to perform operations such as the operations and functions described herein, including operations for receiving data (e.g., path data, context data, and/or traffic regulation data) and/or activating one or more vehicle systems (e.g., one or more portions of the methods described in this disclosure, and/or any other operations or functions described herein).

The one or more memory devices 920 can also include a memory system 924 that can store data that can be retrieved, manipulated, created, and/or stored by the one or more computing devices 910. The data stored in the memory system 924 can include, for instance, data associated with a vehicle including the vehicle 104; data acquired by the one or more data acquisition systems 112; path data associated with a path of a vehicle; context data associated with a state of an environment; traffic regulation data associated with traffic regulations in an environment; data associated with user input; data associated with one or more actions and/or control command signals; data associated with users; and/or other data or information. The data in the memory system 924 can be stored in one or more databases, which can be split up so that they are located in multiple locales on-board a vehicle that can include the vehicle 104. In some implementations, the one or more computing devices 910 can obtain data from one or more memory devices that are remote from the vehicle 104.

The system 900 can also include the network 940, a communications network used to exchange signals (e.g., electronic signals) or data (e.g., data from a computing device), including signals or data exchanged between computing devices such as the operations computing system 950, the vehicle computing system 908, and/or the one or more computing devices 930. The network 940 can include any combination of wired (e.g., twisted pair cable) and/or wireless communication mechanisms (e.g., cellular, wireless, satellite, microwave, and radio frequency) and/or any desired network topology (or topologies). For example, the network 940 can include a local area network (e.g., intranet), a wide area network (e.g., the Internet), a wireless LAN network (e.g., via Wi-Fi), a cellular network, a VHF network, an HF network, and/or any other suitable communication network (or combination thereof) for transmitting data to and/or from a vehicle including the vehicle 104.

“The one or more computing devices 910 can also include the communication interface 912, which can be used to communicate with one or more other systems on-board a vehicle including the vehicle 104 and/or with remote computing devices (e.g., over the network 940). The communication interface 912 can include any suitable components for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable hardware and/or software.

Summary for “Seamless vehicle entrance”

“Another example aspect of the present disclosure is directed to other systems, methods, vehicles, apparatuses, tangible non-transitory computer-readable media, and devices for autonomous vehicle operation.”

These and other features, aspects, and benefits of various embodiments will become better understood with reference to the following description. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the related principles.

“Example aspects of the present disclosure are directed to providing prospective passengers of a vehicle (e.g., an autonomous vehicle, a semi-autonomous vehicle, or a manually operated vehicle) with seamless vehicle entry, based in part on the detection and transmission of signals (e.g., Bluetooth signals) by remote computing devices (e.g., a smartphone or a wearable computing device). A computing system (e.g., a system including one or more computing devices that can communicate with and/or operate a vehicle) can receive trip data in response to a request to pick up a passenger at a pick-up location. The trip data received by the autonomous vehicle's computing system can include a unique identifier for the trip and can be associated with the autonomous vehicle and a remote computing device (e.g., a smartphone used by a prospective passenger).

The vehicle can receive the trip data and detect one or more signals (e.g., radio signals) that contain the trip data. If the signals satisfy one or more criteria, the vehicle can activate one or more vehicle systems; for example, it can unlock the doors of the vehicle when the received signal strength exceeds a threshold value. In this way, the disclosed technology can more effectively provide seamless vehicle detection and access for a prospective passenger who has requested a vehicle.

A prospective passenger can, via their mobile device, request to be picked up by an autonomous vehicle at their current location. The request can include information about the passenger's location based on inputs from the passenger and/or location information derived from signals the device receives from an external source, such as a global positioning system (GPS) satellite. A computing system associated with an autonomous vehicle can receive the request and determine the vehicle closest to the passenger's current location. The computing system can also detect any identification information in the trip data and modify the trip data to exclude that information.

“As the autonomous vehicle travels to the location of the prospective passenger, the passenger's mobile device can broadcast signals that can be detected by the vehicle. Upon detecting the signals from the mobile device, the autonomous vehicle can stop within a predetermined distance of the pick-up location and/or the source of the signals (e.g., the mobile device) and unlock a door for the prospective passenger. The mobile device can likewise detect one or more signals broadcast by the autonomous vehicle.

The disclosed technology can include an autonomous vehicle and/or a computing system associated with the autonomous vehicle. The autonomous vehicle and/or the associated computing system can include one or more computing devices that can process, generate, or exchange signals or data with other devices, including one or more other autonomous vehicles.

“The computing system can, for example, exchange signals (e.g., electronic signals) or data with one or more vehicle systems, including sensor systems (e.g., sensors that generate output based on the state of the physical environment external to the vehicle); communication systems (e.g., wired or wireless communication systems that can exchange signals or data with other devices); navigation systems (e.g., devices that can receive signals from GPS, GLONASS, or other systems used to determine a vehicle's geographical location); and/or control systems (e.g., devices used to control braking, acceleration, and/or steering to change the vehicle's course).

The computing system can receive trip data associated with a request for an autonomous vehicle at a pick-up location. The trip data can be received from one or more remote computing systems via one or more communication components, including wired or wireless communication systems. The trip data can include a pick-up location (e.g., a geographic location identifier, including a latitude and longitude and/or an address); the current location of the prospective passenger; a pick-up time (e.g., a time at which the prospective passenger will meet the vehicle); and/or a prospective passenger cargo status (e.g., an indication of whether the passenger is carrying cargo that will go into a cargo area of the vehicle).

“In some embodiments, the computing system can determine portions of the trip data that contain personal identification data. For example, the computing system can determine whether personal information such as a person's name, email address, phone number, or payment information (e.g., a credit card number) is included in a trip request. The computing system can then exclude or make unavailable (e.g., delete, encrypt, or obfuscate) the portions of the trip data that contain personal information before the trip data is sent to the autonomous vehicle or the remote computing device. By withholding portions of the trip data associated with personal information when generating the trip data, the computing system helps keep prospective passengers safe and secure when the trip data is transmitted.
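
A minimal sketch of the personal-information exclusion described above is shown below; the field names are hypothetical stand-ins for the examples given (name, phone number, payment information).

```python
# Hedged sketch: withholding personal identification data when generating the
# trip data sent to the vehicle and remote computing device. The field names
# below are hypothetical; the disclosure only lists examples such as name,
# email, phone number, and payment information.

PERSONAL_FIELDS = {"name", "email", "phone_number", "payment_card"}

def sanitize_trip_data(request: dict) -> dict:
    """Return trip data with personal-information fields removed."""
    return {k: v for k, v in request.items() if k not in PERSONAL_FIELDS}

if __name__ == "__main__":
    request = {
        "name": "A. Passenger",
        "phone_number": "555-0100",
        "payment_card": "XXXX",
        "pickup_location": (37.7750, -122.4194),
        "pickup_time": "2018-06-01T17:30:00Z",
        "cargo": False,
    }
    print(sanitize_trip_data(request))
```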

The trip data can be used to determine the locations of the autonomous vehicle and the remote computing device. For example, the computing system can determine a latitude and longitude, an address, and/or a relative position (e.g., the position of the autonomous vehicle with respect to the remote computing device). The computing system can also determine, based in part on map data and the location of the autonomous vehicle and/or the remote computing device, a signal disruption value associated with the one or more signals being blocked or interfered with (e.g., an estimated decrease in the one or more signals being received).

“The map data can include one or more indications (e.g., indications of the location and magnitude of signal interference) of one or more structures or features that could block or interfere with detection of the one or more signals, such as buildings (e.g., houses, apartment buildings, or office buildings), tunnels (e.g., underground or underwater tunnels), and sources of electromagnetic transmissions (e.g., power lines and/or mobile phone towers). In some embodiments, broadcasting the one or more signals can be based at least in part on the signal disruption value. For example, the signal disruption value can be used by the remote computing device or the autonomous vehicle to selectively broadcast or detect the one or more signals.

The computing system can direct an autonomous vehicle to go to the pick up location. The computing system can, for example, determine the path from the current vehicle’s location to the pick up location indicated in trip data. The computing system can use the path between the current location and the pick up location to activate any or all of the vehicle systems (e.g. the engine, sensors, and/or steering) to direct the autonomous car to the pick up location.

“In some embodiments, the computing system can broadcast one or more signals that include information about the location of the autonomous vehicle (e.g., a latitude and longitude). The autonomous vehicle can broadcast the one or more signals indicating its location to remote computing devices using one or more communication systems associated with the vehicle, including wireless and wired communication systems. The signals can include cellular signals, radio signals, or microwave signals, and can use various communication protocols, such as Bluetooth or Wi-Fi. In some embodiments, the trip data can include one or more broadcast instructions for the devices or systems that receive the trip data. The instructions for broadcasting the one or more signals can include information about when and where to broadcast the signals and/or which signal protocol to use when broadcasting them. Further, one or more broadcast criteria can be determined based in part on the location of the autonomous vehicle; for example, the broadcast criteria can be based on whether the autonomous vehicle is located within a certain area or within a specified proximity of the pick-up area.

The computing system can detect one or more signals related to the trip information (e.g. one or more signals that include portions of the trip or other data associated with it). The autonomous vehicle may include components that can detect one or more signals, such as a radio receiver or a Bluetooth receiver. The computing system may be able to identify the source, direction, magnitude and frequency of one or more signals in some cases. The computing system can also analyze trip data if the trip data is part of the one or more signals.

“In response to determining that the one or more signals satisfy the one or more broadcast criteria, the computing system can activate one or more vehicle systems associated with fulfilling the request. For example, the autonomous vehicle can determine that the one or more broadcast criteria are satisfied when the strength of one or a combination of the signals from the remote computing device exceeds a threshold signal strength, which indicates that the remote computing device is within a predetermined distance (e.g., within ten meters) of the vehicle.

“Activating the one or more vehicle systems can include unlocking an entrance to the vehicle (e.g., unlocking one or more doors to a passenger compartment), locking an entrance, opening an entrance, unlocking one or more cargo areas of the autonomous vehicle (e.g., unlocking one or more trunk doors or other cargo/storage areas), slowing the vehicle to a stop (e.g., reducing the vehicle's velocity until the vehicle comes to a stop), and/or generating an indication of the vehicle's availability (e.g., displaying an identifier that is also sent to the prospective passenger).
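
The following sketch illustrates, under assumed names and thresholds, how satisfying a received-signal-strength criterion could trigger the vehicle-system actions listed above; the DemoVehicle interface is invented for the example.

```python
# Hedged sketch: activating vehicle systems once a received-signal-strength
# criterion is satisfied. The threshold, system names, and actions are
# illustrative stand-ins for the vehicle systems discussed above.

RSSI_THRESHOLD_DBM = -70.0  # assumed threshold corresponding to close proximity

def on_signal_detected(rssi_dbm: float, vehicle) -> None:
    """Unlock the doors, slow to a stop, and indicate availability when the criterion is met."""
    if rssi_dbm >= RSSI_THRESHOLD_DBM:
        vehicle.unlock_doors()
        vehicle.slow_to_stop()
        vehicle.show_availability_indication()

class DemoVehicle:
    # Minimal stand-in for a vehicle systems interface (hypothetical).
    def unlock_doors(self): print("doors unlocked")
    def slow_to_stop(self): print("slowing to a stop")
    def show_availability_indication(self): print("availability indication shown")

if __name__ == "__main__":
    on_signal_detected(-65.0, DemoVehicle())
```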

“In some embodiments, the computing system can determine the proximity of the autonomous vehicle to the remote computing device based in part on one or more properties of the one or more signals. Satisfying the one or more broadcast criteria can include determining that the autonomous vehicle is within a certain distance of the remote computing device. For example, the computing system can determine, based on one or more properties of the signals, that the autonomous vehicle and the remote computing device are within thirty meters of one another. The one or more properties can include, for example, the received signal strength (RSS) and/or the signal-to-noise ratio (SNR) of the one or more signals.

The computing system can generate one or more indications based on the proximity of the autonomous vehicle to the remote computing device. The one or more indications can include one or more haptic indications (e.g., vibrations of the remote computing device); one or more light indications (e.g., flashing and/or pulsating lights of one or more colors); one or more audio indications; and/or one or more visual indications (e.g., pictures, pictograms, letters, words, or phrases).

“In some embodiments, a type (e.g. a color or light indication) or magnitude (e.g. a size or brightness for a light indicator) of one or more indications can be determined in part based on the distance between the autonomous vehicle and the remote computing devices. As an example, the distance between the autonomous vehicle and the remote computing devices can affect the strength or frequency of haptic indicators (e.g. vibrations).

The computing system can generate a trip identifier. The trip identifier can include information associated with one or more symbols (e.g., letters, words, marks, pictograms, pictures, colors, and/or patterns) that can be used for identification. The computing system can broadcast (e.g., send via a wireless network) one or more signals that include the trip identifier, which allows a remote computing device associated with a prospective passenger to detect the signals and receive the trip identifier. For example, the trip identifier can include a red circle with a black letter “L” and a black number “6” inside the circle. The autonomous vehicle can display the trip identifier on a display portion of the vehicle, and the remote computing device can display the same trip identifier on its own display portion. This allows a prospective passenger to identify the requested autonomous vehicle more efficiently; for instance, the passenger can use the trip identifier to pick out their vehicle when multiple vehicles are waiting in close proximity.

“In some embodiments, the trip identifier can include information associated with the one or more symbols and can be based in part on a randomly generated trip identifier or chosen from among a number of trip identifiers that have not been used within a predetermined time period. The computing system can generate a trip identifier by randomly combining elements of a symbol, such as color, shape, and size. The computing system can also access a plurality of trip identifiers (e.g., trip identifiers stored in a database) and choose one that has not been used within the predetermined time period (e.g., within a certain number of hours, days, or weeks).

The disclosed technology can include systems, methods, and devices that provide a range of technical effects and benefits for vehicle operation, including the coordination of the pick-up of prospective passengers by one or more vehicles. One benefit of the disclosed technology is a reduction in the number and duration of interactions between a prospective passenger and an autonomous vehicle. Each interaction consumes energy (e.g., fuel consumed by the autonomous vehicle and/or battery power used by the remote computing device of a prospective passenger) and time (e.g., time the prospective passenger spends waiting for and/or boarding the autonomous vehicle). Reducing the number or duration of these interactions can therefore result in less energy usage by both the autonomous vehicle and the prospective passenger's device.

The disclosed technology also provides enhanced privacy to prospective passengers by receiving passenger requests and then creating trip data that does not include the passenger's personal information (e.g., the prospective passenger's name, address, and/or payment information). The trip data can further be encrypted so that the security of prospective passengers is protected even if the trip data is intercepted by an unauthorized third party.

Further, by selectively detecting and broadcasting one or several signals, based in part upon the location of structures, objects, or other obstacles to the signals, the disclosed technology can more efficiently conserve scarce energy resources (e.g. battery resources of remote computing devices including smart phones). The disclosed technology can, for example, access a map that shows the location of structures and objects that block one or more signals. The disclosed technology can adjust the sensitivity of the sensors that detect one or more signals based on the location of interfering or blocking objects.

“Accordingly, the disclosed technology allows for more efficient passenger-vehicle coordination, improved vehicle-passenger interactions, enhanced passenger privacy, and battery savings through the use of map data that indicates the locations of structures and objects that block or interfere with signals. This also results in time and resource savings due to more efficient vehicle detection and prospective passenger boarding.

“With reference now to FIGS. 1-9, example embodiments of the present disclosure will be discussed in further detail. FIG. 1 shows a diagram of an example system 100 according to example embodiments of the present disclosure. The system 100 can include a plurality of vehicles 102 including the vehicle 104; the vehicle computing system 108; one or more computing devices 110; one or more data acquisition systems 112; one or more human machine interface systems 118; other vehicle systems 120; a communications system 122; a network 124; one or more sensors 128; one or more remote computing devices 130; and an operations computing system 150.

“The operations computing system 150 can be associated with a service provider that provides one or more vehicle services to a plurality of users via a fleet of vehicles that includes, for example, the vehicle 104. The vehicle services can include transportation services (e.g., rideshare services), courier services, delivery services, and/or other types of services.

“The operations computing system 150 can include multiple components that perform various operations and functions. For example, the operations computing system 150 can include and/or be associated with one or more remote computing devices that are remote from the vehicle 104. The one or more remote computing devices can include one or more processors and one or more memory devices, and the one or more memory devices can store instructions that, when executed by the one or more processors, cause the one or more processors to perform operations and functions associated with operation of the vehicle, including receiving data (e.g., trip data), controlling the operation of the vehicle 104, and/or activating one or more vehicle systems (e.g., doors and/or a storage area of the vehicle 104).

“For example, the operations computing system 150 can be configured to communicate with the vehicle 104 and/or its users to coordinate a vehicle service provided by the vehicle 104. The operations computing system 150 can manage a database that includes vehicle status data. The vehicle status data can include the locations of the plurality of vehicles 102 (e.g., a latitude and longitude of each vehicle), the availability of a vehicle (e.g., whether the vehicle is available to pick up passengers or cargo), and the state of objects external to the vehicle (e.g., the location and/or proximity of external objects relative to the vehicle).
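
As an illustration of the vehicle status data described above, the sketch below keeps a small status list and selects the nearest available vehicle; the record fields and the planar distance approximation are assumptions, not details from the disclosure.

```python
import math

# Hedged sketch: a vehicle-status record of the kind the operations computing
# system might maintain, plus nearest-available-vehicle selection. Field names
# and the flat-earth distance approximation are assumptions for illustration.

def approx_distance_m(a, b):
    """Rough planar distance in meters between two (lat, lon) points."""
    dlat = (a[0] - b[0]) * 111_000.0
    dlon = (a[1] - b[1]) * 111_000.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def nearest_available_vehicle(status_db, pickup_location):
    """Pick the closest vehicle whose status is 'available'."""
    candidates = [v for v in status_db if v["available"]]
    return min(candidates,
               key=lambda v: approx_distance_m(v["location"], pickup_location),
               default=None)

if __name__ == "__main__":
    status_db = [
        {"id": 104, "location": (37.776, -122.420), "available": True},
        {"id": 220, "location": (37.790, -122.400), "available": False},
    ]
    print(nearest_available_vehicle(status_db, (37.775, -122.419))["id"])  # 104
```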

An indication, record, or other data indicative of the state of one or more objects, including the proximity and/or location of those objects, can be stored locally in one or more memory devices associated with the operations computing system 150. The vehicle 104 can also provide data indicative of the state of one or more objects (e.g., the proximity and/or location of one or more objects) within a predefined distance of the vehicle.

“The operations computing system 150 can communicate with the vehicle 104 via one or more communication networks, including the communications network 140. The communications network 140 can send and/or receive signals (e.g., electronic signals) or data (e.g., data from a computing device) and can include any combination of wired (e.g., twisted pair cable) and/or wireless communication mechanisms (e.g., cellular, wireless, satellite, microwave, and radio frequency), and/or any desired network topology (or topologies). For example, the communications network 140 can include a local area network (e.g., intranet), a wide area network (e.g., the Internet), a wireless LAN network (e.g., via Wi-Fi), a cellular network, a VHF network, an HF network, and/or any other suitable communication network (or combination thereof) for transmitting data to and/or from the vehicle 104.

The vehicle 104 can be a ground-based vehicle (e.g., an automobile), an aircraft, or another type of vehicle. The vehicle 104 can be an autonomous vehicle that can perform various actions, including driving, navigating, and/or operating with minimal or no interaction from a human driver. The autonomous vehicle 104 can be configured to operate in one or more modes, including a fully autonomous operational mode, a semi-autonomous operational mode, and/or a park or sleep mode. A fully autonomous (e.g., self-driving) operational mode is one in which the vehicle 104 can perform driving and navigational operations with no interaction from a human driver. A semi-autonomous operational mode is one in which the vehicle operates with some interaction from a human driver. The vehicle 104 can use park and/or sleep modes between operational modes, for example while waiting to provide a subsequent vehicle service or while recharging.
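
The operational modes described above could be represented as in the sketch below; the mode names mirror the prose, while the transition table is purely illustrative.

```python
from enum import Enum, auto

# Hedged sketch: one way the operational modes described above could be
# represented. The mode names mirror the prose; the transition table is an
# assumption, not behavior specified by the disclosure.

class OperationalMode(Enum):
    FULLY_AUTONOMOUS = auto()   # driving/navigation with no human interaction
    SEMI_AUTONOMOUS = auto()    # operation with some input from a human driver
    PARK = auto()               # waiting between vehicle services
    SLEEP = auto()              # low-power state, e.g. while recharging

ALLOWED_TRANSITIONS = {
    OperationalMode.PARK: {OperationalMode.FULLY_AUTONOMOUS,
                           OperationalMode.SEMI_AUTONOMOUS,
                           OperationalMode.SLEEP},
    OperationalMode.SLEEP: {OperationalMode.PARK},
    OperationalMode.FULLY_AUTONOMOUS: {OperationalMode.PARK,
                                       OperationalMode.SEMI_AUTONOMOUS},
    OperationalMode.SEMI_AUTONOMOUS: {OperationalMode.PARK,
                                      OperationalMode.FULLY_AUTONOMOUS},
}

def can_transition(current: OperationalMode, target: OperationalMode) -> bool:
    """Check whether a mode change is allowed under the illustrative table."""
    return target in ALLOWED_TRANSITIONS[current]

if __name__ == "__main__":
    print(can_transition(OperationalMode.PARK, OperationalMode.FULLY_AUTONOMOUS))  # True
```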

“The vehicle 104 can include the vehicle computing system 108. The vehicle computing system 108 can include various components that perform various operations and functions. For example, the vehicle computing system 108 can include one or more computing devices 110 on-board the vehicle 104. The one or more computing devices 110 can include one or more processors and one or more memory devices, each of which is on-board the vehicle 104. The one or more memory devices can store instructions that, when executed by the one or more processors, cause the one or more processors to perform operations and functions, such as taking the vehicle 104 out of service, stopping the motion of the vehicle 104, and/or determining the state of one or more objects within a predefined distance of the vehicle 104.

“The one or more computing devices 110 can implement, include, or otherwise be associated with various other systems on-board the vehicle 104. The one or more computing devices 110 can be configured to communicate with these on-board systems, including the one or more data acquisition systems 112, the one or more control systems 116 (e.g., including a navigation system), the autonomy system 114, the one or more human machine interface systems 118, the other vehicle systems 120, and/or the communications system 122. The on-board systems can be connected to the one or more computing devices 110 via the network 124. The network 124 can include one or more data buses (e.g., a controller area network (CAN)), an on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links. The one or more computing devices 110 and the other on-board systems can send and/or receive data and/or messages among themselves via the network 124.

“The one or more data acquisition systems 112 can include various devices for acquiring data associated with the vehicle 104. This can include data associated with one or more of the vehicle's systems (e.g., health data), the vehicle's interior, the vehicle's exterior, the vehicle's surroundings, and/or the vehicle's users. The one or more data acquisition systems 112 can include, for example, one or more image capture devices 126. The one or more image capture devices 126 can include one or more cameras, light detection and ranging (LIDAR) systems, two-dimensional image capture devices, three-dimensional image capture devices, static image capture devices, dynamic (e.g., rotating) image capture devices, video capture devices (e.g., video recorders), lane detectors, scanners, optical readers, electric eyes, and/or other suitable types of image capture devices. The one or more image capture devices 126 can be located in the interior and/or on the exterior of the vehicle 104, and can be configured to acquire image data that allows the vehicle 104 to be operated in an autonomous mode and/or to implement one or more machine vision techniques (e.g., to detect objects in the surrounding environment).

“Additionally or alternatively, the one or more data acquisition systems 112 can include one or more sensors 128. The one or more sensors 128 can include motion sensors, pressure sensors, mass sensors, weight sensors, impact sensors, temperature sensors, humidity sensors, RADAR, sonar, radios (e.g., for obtaining information about the vehicle's surroundings), GPS equipment, proximity sensors, and/or any other types of sensors for obtaining data indicative of parameters associated with the vehicle 104 and/or relevant to the operation of the vehicle 104.

“The one or more data acquisition systems 112 can include one or more sensors 128 dedicated to obtaining data associated with a particular aspect of the vehicle 104, such as the fuel tank, the engine, the oil compartment, and/or the wipers. The one or more sensors 128 can also include sensors associated with one or more mechanical and/or electrical components of the vehicle; for example, the one or more sensors 128 can be used to determine whether a door, a trunk, or a gas cap of the vehicle 104 is open or closed. Data from the one or more sensors 128 can also be used to detect other vehicles and/or objects, detect road conditions (e.g., curves, potholes, dips, bumps, and/or changes in grade), and measure distances between the vehicle 104 and other vehicles and/or objects.

“The vehicle computing system 108 can also receive map data. For example, a computing device of the vehicle (e.g., within the autonomy system 114) can be configured to receive map data from one or more remote computing devices, including the operations computing system 150 and/or the one or more remote computing devices 130 (e.g., associated with a geographic mapping service provider). The map data can include any combination of two-dimensional or three-dimensional geographic map data associated with the area in which the vehicle is, was, or will be travelling.

“The data acquired by the one or more data acquisition systems 112, the map data, and/or other data can be stored in one or more memory devices on-board the vehicle 104. The on-board memory devices can have limited storage capacity, so the data may need to be periodically removed, deleted, or downloaded to another device (e.g., a database of the service provider). The one or more computing devices 110 can be configured to monitor the memory devices and/or communicate with an associated processor to determine how much data is stored in each memory device. One or more of the other on-board systems (e.g., the autonomy system 114) can also be configured to access the data stored in the one or more memory devices.

“The autonomy system 114 can be configured to allow the vehicle 104 to operate in an autonomous mode. For example, the autonomy system 114 can obtain data associated with the vehicle 104 (e.g., data acquired by the one or more data acquisition systems 112) as well as the map data, and use them to control various functions of the vehicle 104. For instance, the autonomy system 114 can include various models for perceiving road features, signage, objects, people, animals, etc. based on the data from the one or more data acquisition systems 112, the map data, and/or other data.

“In some implementations, the autonomy system 114 can include one or more machine learning models that use the data from the one or more data acquisition systems 112, the map data, and/or other data to help operate the autonomous vehicle. The acquired data can also be used to detect other vehicles and/or objects and road conditions (e.g., curves, potholes, dips, changes in grade, or the like), and to measure the distance between the vehicle 104 and such elements. The autonomy system 114 can predict the movement (or lack thereof) and/or position of such elements, using, for example, one or more odometry techniques. Based at least in part on these predictions, the autonomy system 114 can plan the motion of the vehicle and implement the planned motion to appropriately navigate the vehicle 104 with minimal or no human intervention. The autonomy system 114 can include a navigation system that directs the vehicle 104 to a destination, and can regulate vehicle speed, acceleration, deceleration, steering, and/or the operation of other components to operate in an autonomous mode.

“The autonomy system 114 can determine the vehicle's position and/or route in real time or near real time. For example, using the acquired data, the autonomy system 114 can calculate one or more different potential routes (e.g., calculating a route every fraction of a second), select the route to take, and cause the vehicle 104 to travel along it. For instance, the autonomy system 114 can calculate one or more straight paths, one or more lane-change paths, one or more turning paths, and/or one or more stopping paths. The autonomy system 114 can select a path based at least in part on the acquired data, current traffic factors, travelling conditions, and the like, and different weights can be applied to these factors depending on the implementation. Once a path is selected, the autonomy system 114 can cause the vehicle 104 to travel along the selected path.
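
The weighted path selection mentioned above might look like the following sketch; the candidate paths, cost factors, and weights are invented for illustration.

```python
# Hedged sketch: selecting among candidate paths (straight, lane-change,
# stopping) with per-factor weights. The cost terms and weights are invented;
# the disclosure only notes that different weightings can be used.

CANDIDATE_PATHS = [
    {"name": "straight",    "travel_time_s": 12.0, "proximity_risk": 0.2, "comfort_penalty": 0.1},
    {"name": "lane_change", "travel_time_s": 10.0, "proximity_risk": 0.6, "comfort_penalty": 0.4},
    {"name": "stop",        "travel_time_s": 30.0, "proximity_risk": 0.0, "comfort_penalty": 0.0},
]

WEIGHTS = {"travel_time_s": 1.0, "proximity_risk": 25.0, "comfort_penalty": 10.0}

def path_cost(path: dict) -> float:
    """Weighted sum of the path's cost factors."""
    return sum(WEIGHTS[k] * path[k] for k in WEIGHTS)

def select_path(paths):
    """Choose the candidate path with the lowest weighted cost."""
    return min(paths, key=path_cost)

if __name__ == "__main__":
    print(select_path(CANDIDATE_PATHS)["name"])  # "straight" with these weights
```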

“The one or more control systems 116 of the vehicle 104 can be configured to control one or more aspects of the vehicle 104. For example, the one or more control systems 116 can control one or more access points of the vehicle 104. The one or more access points can include features such as the vehicle's door locks, trunk lock, hood lock, fuel tank access, latches, and/or other mechanical access features that can be adjusted between one or more states, positions, or locations. For instance, the one or more control systems 116 can control an access point (e.g., a door lock) to adjust the access point between a locked position and an unlocked position. Additionally or alternatively, the one or more control systems 116 can control one or more other electrical features of the vehicle 104 that can be adjusted between one or more states; for example, the one or more control systems 116 can control electrical features such as hazard lights or a microphone to switch between a first state (e.g., off) and a second state (e.g., on).

“The one or more human machine interface systems 118 can allow interaction between a user (e.g., a human), the vehicle 104 (e.g., the vehicle computing system 108), and/or a third party (e.g., an operator associated with the service provider). The one or more human machine interface systems 118 can include a variety of interfaces through which the user can provide input to and/or receive output from the vehicle computing system 108. For example, the one or more human machine interface systems 118 can include a graphical user interface, a direct manipulation interface, a web-based user interface, a touch user interface, an attentive user interface, a conversational and/or voice interface (e.g., via text messages or a chatter robot), a conversational interface agent, an interactive voice response (IVR) system, a gesture interface, and/or other types of interfaces. The one or more human machine interface systems 118 can include one or more input devices, such as touchscreens, keypads, touchpads, knobs, buttons, sliders, switches, a mouse, gyroscopes, microphones, and/or other hardware interfaces, as well as one or more output devices (e.g., speakers, lights, and/or display devices) for providing data associated with the interfaces.

“The other vehicle systems 120 can be configured to control and/or monitor other aspects of the vehicle 104. For example, the other vehicle systems 120 can include software update monitors, an engine control unit, a transmission control unit, on-board memory devices, and the like. The one or more computing devices 110 can be configured to communicate with the other vehicle systems 120 to receive data and/or to send signals. For instance, the software update monitors can provide the one or more computing devices 110 with data indicating the current status of the software on each of the on-board systems and/or whether a particular system requires a software update.

“The communications system 122 can be used by the vehicle computing system 108 and its one or more computing devices 110 to communicate with other computing devices. For example, the communications system 122 can allow the one or more computing devices 110 to communicate with one another and with the other systems on-board the vehicle 104. The vehicle computing system 108 can also use the communications system 122 to communicate with the operations computing system 150 and/or the one or more remote computing devices 130 over the communications network 140 (e.g., via one or more wireless signal connections). The communications system 122 can include any suitable components for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, and/or other components that can facilitate communication with one or more remote computing devices.

“In some implementations, the one or more computing devices 110 on-board the vehicle 104 can obtain vehicle data indicative of one or more parameters associated with the vehicle 104. The one or more parameters can include information associated with the health and maintenance of the vehicle 104, the vehicle computing system 108, and/or one or more of the on-board systems. For example, the one or more parameters can include fuel level, engine condition, tire pressure, conditions associated with the vehicle's interior, conditions associated with the vehicle's exterior, mileage, time until the next scheduled maintenance, time since the last maintenance, the amount of data stored in the on-board memory devices, the charge level of a vehicle energy storage device, current software status, needed software updates, and/or other health and maintenance data of the vehicle 104.

At least a portion of the vehicle data indicative of the parameters can be provided by one or more of the systems on-board the vehicle 104. The one or more computing devices 110 can be configured to request the vehicle data from the on-board systems on a scheduled and/or as-needed basis, and the on-board systems can be configured to provide vehicle data indicative of the one or more parameters to the one or more computing devices 110, for example, periodically, continuously, or as requested. For instance, the one or more data acquisition systems 112 can provide information about the vehicle's fuel level and/or the charge of a vehicle energy storage device. In some implementations, the one or more parameters can be indicative of user input; for example, the one or more human machine interface systems 118 can receive user input (e.g., via an interior display device of the vehicle) and provide data indicative of that user input to the one or more computing devices 110.

“In some implementations, the one or more remote computing devices 130 can receive user input and provide data indicative of the user input to the one or more computing devices 110. The one or more computing devices 110 can obtain the data indicative of the user input from the one or more remote computing devices 130 (e.g., via a wireless communication).

“The one or more computing devices 110 can be used to determine the current state of the vehicle 104 and the environment around the vehicle 104, including the state of objects such as pedestrians, cyclists, motor vehicles (e.g., trucks and/or automobiles), roads, waterways, and/or buildings. Further, the one or more computing devices 110 can determine one or more properties of one or more signals (e.g., radio signals) detected by the one or more sensors 128, including whether the one or more signals are associated with a request for an autonomous vehicle (e.g., the autonomous vehicle 104) at a pick-up location.

“FIG. 2 depicts an example environment according to example embodiments of the present disclosure. One or more of the actions or events illustrated in FIG. 2 can be performed or implemented by one or more devices (e.g., one or more computing devices or systems), including, for example, the vehicle 104, the vehicle computing system 108, and/or the operations computing system 150 shown in FIG. 1. As illustrated, FIG. 2 shows an environment 200 that includes a prospective passenger 210, a remote computing device 212, signals 214, and a vehicle 220.

“In the environment 200, the vehicle 220 (e.g., an autonomous vehicle) is located five meters below street level inside the tunnel 234, a subterranean tunnel with a tunnel exit 236 that faces the street 238. In this example, the vehicle 220 has received trip data that includes a request to pick up the prospective passenger 210 at the pick-up location 240. Further, the prospective passenger 210 is using the remote computing device 212 (e.g., a smartphone), which can transmit and/or receive signals (e.g., radio signals), including the signals 214. Because the vehicle 220 is located within the tunnel 234, which is inside the signal disruption area 232, the vehicle 220 cannot detect the signals 214, which are also blocked by the building 242.

“In some embodiments, the vehicle 220 can access map data that includes an indication of the layout and geography of the environment 200, including an indication of the areas (e.g., the signal disruption area 232) in which one or more signals (e.g., the signals 214) may be blocked or otherwise difficult to detect. The vehicle 220 can travel along the path 222, which takes it through the tunnel exit 236 and out of the signal disruption area 232. Once the vehicle 220 is on the street 238, it may be able to detect the signals 214 and travel more precisely to the prospective passenger 210, who is waiting in the pick-up area 244.

“FIG. 3 depicts an example remote computing device according to example embodiments of the present disclosure. One or more of the actions or events illustrated in FIG. 3 can be performed or implemented by one or more devices (e.g., one or more computing devices or systems), including, for example, the vehicle 104, the vehicle computing system 108, and/or the operations computing system 150 shown in FIG. 1. FIG. 3 shows a remote computing device 300 that can exchange data (e.g., send and/or receive data) with one or more computing systems, including the vehicle 104 and/or the vehicle computing system 108 shown in FIG. 1. As illustrated, the remote computing device 300 includes a display area 302, a visual indication 304, and a haptic indication 306.

“In this example, the remote computing device 300 (e.g., a smartphone) of a user who has requested an autonomous vehicle has received one or more signals. The one or more signals from the autonomous vehicle (e.g., the vehicle 104 or the vehicle 220) include trip data that is used to generate the visual indication 304 (“ATTENTION: Autonomous vehicle at corner of Main Street and Lincoln Avenue”), which is shown in the display area 302 of the remote computing device 300.

“Further, the signals from the autonomous vehicle (e.g., the vehicle 104 or the vehicle 220) can contain trip data that is used to generate the haptic indication 306 (e.g., one or more vibrations generated by the remote computing device 300). The remote computing device 300 can produce the haptic indication 306 to indicate that the requested vehicle is nearby. Because the haptic indication 306 can be perceived without looking at or otherwise interacting with the remote computing device 300, the user can immediately receive an indication that a vehicle is nearby.

“FIG. 4 depicts an example environment according to example embodiments of the present disclosure. One or more of the actions or events illustrated in FIG. 4 can be performed or implemented by one or more devices (e.g., one or more computing devices or systems), including, for example, the vehicle 104, the vehicle computing system 108, and/or the operations computing system 150 shown in FIG. 1. FIG. 4 shows an environment 400 that includes a remote computing device 402, a trip identifier 404, a display area 406, a vehicle 412, a trip identifier 414, a display area 416, a vehicle 422, a trip identifier 424, and a display area 426.

“In the environment 400, the vehicle 422 (e.g., an autonomous vehicle) has received trip data that includes the trip identifier 424, which can be shown on the display area 426 (e.g., a liquid crystal display screen) as a circular shape with a triangle inside it and the letter “C” and the number “6” inside the triangle. A prospective passenger can use the trip identifier to identify the vehicle they are waiting for.

“In this example, the remote computing device 402 (e.g., a smartphone) used by a prospective passenger waiting for the vehicle 422 receives one or more signals. The signals include trip data corresponding to the trip identifier 404, which is shown in the display area 406 of the remote computing device 402. As the prospective passenger associated with the remote computing device 402 waits for the vehicle 422, a variety of vehicles, including the vehicle 412 and the vehicle 422, may be waiting within view of the passenger. The prospective passenger can compare the trip identifier 404 shown in the display area 406 with the trip identifier 414 shown in the display area 416 and the trip identifier 424 shown in the display area 426. Because the trip identifier 404 matches the trip identifier 424, the passenger can determine that the vehicle 422 is the vehicle that was requested.

“FIG. 5 shows a flow diagram of an example method 500 of autonomous vehicle operation according to example embodiments of the present disclosure. One or more portions of the method 500 can be implemented by one or more devices (e.g., one or more computing devices or systems), including, for example, the vehicle 104, the vehicle computing system 108, or the operations computing system 150 shown in FIG. 1. Moreover, one or more portions of the method 500 can be implemented as an algorithm on the hardware components of the devices described herein (e.g., as in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 5 depicts elements in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art will understand that the elements can be modified, altered, removed, and/or combined without deviating from the scope of the present disclosure.

“At 502, the method 500 can include receiving trip data including information associated with a request for an autonomous vehicle at a pick-up location. For example, the vehicle 104 and/or the vehicle computing system 108 can receive signals and/or data via the communications system 122 and/or via one or more networks, including the network 124. The trip data can, for example, be received through one or more communication networks that include wired and/or wireless communication systems.

The trip data can be received from one or more sources, including remote computing systems that are used to exchange (e.g., send and/or receive) information related to the scheduling and management of autonomous vehicle trips. The trip data can include a pick-up location (e.g., a geographic identifier including a latitude and longitude, a set of directions to the pick-up location, and/or an address of the pick-up location); the current location of the prospective passenger; a pick-up time (e.g., a time at which the passenger will meet the vehicle); and/or a prospective passenger cargo status (e.g., an indication of whether the passenger is carrying cargo, the area of the vehicle in which the cargo will be carried, and/or the size and weight of the cargo).
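
To make the shape of such trip data concrete, here is a minimal sketch of one way the fields described above could be grouped. The class and field names are hypothetical; the disclosure does not prescribe a particular schema.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class PickupLocation:
        latitude: float
        longitude: float
        address: str | None = None       # optional street address
        directions: str | None = None    # optional human-readable directions

    @dataclass
    class CargoStatus:
        has_cargo: bool
        storage_area: str | None = None  # e.g., "trunk"
        weight_kg: float | None = None

    @dataclass
    class TripData:
        """Hypothetical container for the trip data fields described above."""
        request_id: str
        pickup_location: PickupLocation
        passenger_location: PickupLocation | None
        pickup_time: datetime
        cargo: CargoStatus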

“At 504, the method 500 can include travelling, based in part on the trip data, to the pick-up location. A computing system, such as the vehicle computing system 108, can control the autonomous vehicle's travel to the pick-up location. For example, the vehicle computing system 108 can direct the vehicle 104 to travel to the pick-up location. A computing system associated with the autonomous vehicle (e.g., the vehicle computing system 108) can determine a path from the vehicle's current location to the pick-up location and can activate one or more of the vehicle systems (e.g., the sensors, navigation system, engine, brakes, and/or steering system) to direct the autonomous vehicle along that path to the pick-up location.

“At 506, the method 500 can include detecting one or more signals associated with the trip data. For example, the vehicle 104 can include one or more components (e.g., the communication system 122) that can detect one or more signals (e.g., one or more radio signals). In some embodiments, the computing system can identify the source, direction, magnitude, strength, and/or frequency of the signals. The trip data in the one or more signals can also be analysed to determine, for example, whether any errors have occurred. One or more error detection techniques (e.g., a parity bit check) and/or error correction techniques (e.g., a Hamming code) can be applied to the trip data in the one or more signals to determine whether an error has occurred (e.g., the signal is corrupted or incomplete) and/or to correct the error in the data when one is detected.
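
As a simple illustration of the parity-bit check named above (a Hamming code works similarly but can also correct single-bit errors), the sketch below verifies an even-parity bit over a received payload. The function names are hypothetical.

    def even_parity_bit(payload: bytes) -> int:
        """Return 0 or 1 so that the total count of 1 bits, including the
        parity bit itself, is even."""
        ones = sum(bin(byte).count("1") for byte in payload)
        return ones % 2

    def parity_check_ok(payload: bytes, received_parity: int) -> bool:
        """Detect (but not locate or correct) an odd number of bit errors."""
        return even_parity_bit(payload) == received_parity

    # Example: a payload with a single flipped bit fails the check.
    trip_bytes = b"trip-1234"
    parity = even_parity_bit(trip_bytes)
    assert parity_check_ok(trip_bytes, parity)
    assert not parity_check_ok(b"trip-1235", parity)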

“At 508, the method 500 can include determining whether the one or more signals satisfy one or more broadcast criteria required to authorize access to the autonomous vehicle. The broadcast criteria can include, for example, the signal strength exceeding or falling below a threshold, the signal frequency being within a predetermined frequency range, and/or the signals being emitted from a particular location or in a particular direction.

“For example, the vehicle computing system 108 can determine whether the broadcast criteria have been satisfied by comparing one or more properties of the signals to thresholds associated with those properties. The vehicle computing system 108 can compare the signal strength, measured in microvolts per meter, to a predetermined signal strength threshold, such that the broadcast criteria are satisfied when the signal strength of the one or more signals exceeds that threshold.

“If the broadcast criteria are satisfied, the method 500 can proceed to 510. If the broadcast criteria are not satisfied, the method 500 can end or return to 502, 504, or 506.
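
A minimal sketch of the decision at 508 follows, assuming illustrative threshold values that the disclosure does not specify: the signal's field strength is compared against a threshold, its frequency against a permitted band, and the method either proceeds to 510 or returns to detecting signals.

    from dataclasses import dataclass

    @dataclass
    class DetectedSignal:
        strength_uv_per_m: float   # field strength in microvolts per meter
        frequency_mhz: float

    # Hypothetical thresholds; the disclosure does not give concrete values.
    STRENGTH_THRESHOLD_UV_PER_M = 50.0
    FREQ_RANGE_MHZ = (2400.0, 2483.5)  # e.g., a 2.4 GHz band

    def broadcast_criteria_satisfied(signal: DetectedSignal) -> bool:
        """Step 508: compare signal properties against their thresholds."""
        strong_enough = signal.strength_uv_per_m > STRENGTH_THRESHOLD_UV_PER_M
        in_band = FREQ_RANGE_MHZ[0] <= signal.frequency_mhz <= FREQ_RANGE_MHZ[1]
        return strong_enough and in_band

    def handle_signal(signal: DetectedSignal) -> str:
        """Proceed to 510 when the criteria are met; otherwise keep detecting."""
        if broadcast_criteria_satisfied(signal):
            return "activate_vehicle_systems"   # step 510
        return "continue_detecting"             # back to 502/504/506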

“At 510, the method 500 can include activating one or more vehicle systems. Activating the one or more vehicle systems can be used to fulfill the request for the autonomous vehicle at the pick-up location. For example, the vehicle computing system 108 can generate control signals to activate or control one or more vehicle systems (e.g., engine/motor systems, braking systems, steering systems, and/or door control systems) of the vehicle 104.

“Activating the one or more vehicle systems can, in some embodiments, include opening an entrance to the autonomous vehicle (e.g., unlocking one or more doors to the vehicle's passenger compartment for an authorized passenger); locking an entrance to the autonomous vehicle (e.g., locking one or more doors to the vehicle's trunk or other cargo/storage areas); opening a cargo area of the autonomous vehicle for an authorized passenger (e.g., opening one or more doors to the vehicle's trunk or other cargo/storage section); and/or braking the vehicle.
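
The sketch below shows one hypothetical way the activation step at 510 could translate an authorized request into commands for individual vehicle systems; the enum values and command strings are illustrative, not an interface defined by the disclosure.

    from enum import Enum, auto

    class VehicleSystem(Enum):
        DOOR_LOCKS = auto()
        TRUNK = auto()
        BRAKES = auto()

    def activation_commands(passenger_authorized: bool,
                            has_cargo: bool) -> list[tuple[VehicleSystem, str]]:
        """Translate a fulfilled request into per-system control commands."""
        commands: list[tuple[VehicleSystem, str]] = []
        if passenger_authorized:
            commands.append((VehicleSystem.DOOR_LOCKS, "unlock_passenger_doors"))
            if has_cargo:
                commands.append((VehicleSystem.TRUNK, "open"))
            commands.append((VehicleSystem.BRAKES, "hold"))  # keep the vehicle stopped
        return commands

    # Example: an authorized passenger travelling with luggage.
    print(activation_commands(passenger_authorized=True, has_cargo=True))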

“FIG. 6 shows a flow diagram of an example method 600 of autonomous vehicle operation, according to example embodiments of this disclosure. One or more portions of the method 600 can be implemented by one or more devices (e.g., one or more computing devices or systems, such as the vehicle 104 and the vehicle computing system 108 shown in FIG. 1). Furthermore, one or more portions of the method 600 can be implemented as an algorithm on any of the hardware components of the devices described herein (e.g., in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 6 shows elements in a particular order, for purposes of illustration and discussion. Those of ordinary skill in the art will appreciate that the elements can be adapted, rearranged, expanded, omitted, or combined without departing from the scope of this disclosure.

“At 602, the method 600 can include determining, based in part on one or more properties of the one or more signals, the proximity of the autonomous vehicle to the source of the signals. For example, the vehicle computing system 108 can determine the distance between the autonomous vehicle 104 and one of the remote computing devices 130 (e.g., a smartphone used by a prospective passenger of the vehicle 104) using the received signal strength of the one or more signals detected by the one or more sensors 128. The one or more properties can include the received signal strength and/or the signal-to-noise ratio of the one or more signals.

“In some embodiments, satisfying the one or more broadcast criteria can include the proximity to the source of the one or more signals being within a predetermined distance. For example, satisfying the one or more broadcast criteria can include determining, based on one or more properties of the signals, that the autonomous vehicle and the remote computing device are within ten meters of one another.
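
The disclosure does not prescribe how received signal strength maps to distance; one common choice is a log-distance path-loss model, sketched here with purely illustrative constants and the ten-meter criterion from the example above.

    def estimate_distance_m(rssi_dbm: float,
                            rssi_at_1m_dbm: float = -50.0,
                            path_loss_exponent: float = 2.5) -> float:
        """Estimate distance from received signal strength using a
        log-distance path-loss model (illustrative constants)."""
        return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

    def proximity_criterion_met(rssi_dbm: float, max_distance_m: float = 10.0) -> bool:
        """Broadcast criterion: the signal source is within a predetermined distance."""
        return estimate_distance_m(rssi_dbm) <= max_distance_m

    print(estimate_distance_m(-65.0))      # roughly 4 m with these constants
    print(proximity_criterion_met(-65.0))  # True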

“At 604, the method 600 can include generating one or more indications based in part on the proximity of the autonomous vehicle to the source of the signals. For example, the vehicle computing system 108 can generate one or more control signals to activate or control one or more output devices (e.g., display devices, audio devices, lights, and/or tactile devices) of the vehicle, which can then output one or more indications (e.g., an indication that the vehicle 104 is available). One or more haptic indications (e.g., one or more vibrations of a door handle) can also be used to indicate that the vehicle 104 is available.

“In some embodiments, the type, magnitude, and/or frequency of the one or more indications can be determined based in part on the distance between the autonomous vehicle and the source of the signals. For example, as the distance between the vehicle and the prospective passenger decreases, the magnitude or frequency of one or more light indications (e.g., blinking headlights) can increase.
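
A hypothetical mapping from distance to indication intensity is sketched below, assuming, consistent with the example above, that the headlights blink faster as the prospective passenger gets closer; the constants are illustrative.

    def headlight_blink_hz(distance_m: float,
                           max_hz: float = 4.0,
                           min_hz: float = 0.5,
                           max_range_m: float = 50.0) -> float:
        """Blink faster as the signal source gets closer, within [min_hz, max_hz]."""
        if distance_m >= max_range_m:
            return min_hz
        # Linear interpolation between min_hz (far away) and max_hz (adjacent).
        fraction_close = 1.0 - distance_m / max_range_m
        return min_hz + fraction_close * (max_hz - min_hz)

    print(headlight_blink_hz(40.0))  # slow blink when far away
    print(headlight_blink_hz(5.0))   # fast blink when nearby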

“FIG. 7 shows a flow diagram of an example method 700 of autonomous vehicle operation, according to example embodiments of this disclosure. One or more portions of the method 700 can be implemented by one or more devices (e.g., one or more computing devices or systems, such as the vehicle 104, the vehicle computing system 108, or the operations computing device 150 shown in FIG. 1). Furthermore, one or more portions of the method 700 can be implemented as an algorithm on any of the hardware components of the devices described herein (e.g., in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 7 shows elements in a particular order, for purposes of illustration and discussion. Those of ordinary skill in the art will appreciate that the elements can be adapted, rearranged, expanded, omitted, or combined without departing from the scope of this disclosure.

“At 702, the method 700 can include determining a location associated with the autonomous vehicle. The location of the autonomous vehicle can be determined based in part on inputs including signals from external sources and/or the use of sensors that determine the vehicle's location relative to other objects. Image sensors, for example, can be used to determine how close the vehicle is to landmark locations. As another example, the vehicle computing system 108 can receive signals from a source of radio-navigation data (e.g., GPS) and, based on those signals, determine the location of the vehicle 104.

“In some embodiments, the broadcast criteria can be based in part on the autonomous vehicle's location. The broadcast criteria can include the autonomous vehicle being within a specified location (e.g., a pick-up area), not being within a specified location (e.g., on a busy highway), and/or being within a predetermined distance of a location (e.g., within twenty meters of a pick-up location).
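
A location-based broadcast criterion of this kind could be evaluated with an ordinary great-circle distance check, as in the sketch below; the twenty-meter radius follows the example above, while the coordinates and helper names are hypothetical.

    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
        """Great-circle distance in meters between two latitude/longitude points."""
        earth_radius_m = 6_371_000.0
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * earth_radius_m * asin(sqrt(a))

    def location_criterion_met(vehicle: tuple[float, float],
                               pickup: tuple[float, float],
                               max_distance_m: float = 20.0) -> bool:
        """Vehicle within a predetermined distance of the pick-up location."""
        return haversine_m(*vehicle, *pickup) <= max_distance_m

    # Example with illustrative coordinates near a hypothetical pick-up point.
    print(location_criterion_met((40.44060, -79.99590), (40.44062, -79.99588)))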

“At 704, the method 700 can include broadcasting one or more signals including information about the location of the autonomous vehicle. For example, the vehicle 104 can transmit one or more signals including information about its location (e.g., the latitude and longitude of the vehicle 104) via the communication system 122. The autonomous vehicle can transmit one or more signals indicating its location (e.g., a latitude and longitude and/or a location relative to a pre-established reference point) to remote computing devices using one or more communication systems associated with the vehicle.

“Further, the one or more signals can include radio signals and/or cellular signals, and can use different communication protocols, such as Bluetooth or Wi-Fi. In some embodiments, one or more broadcast instructions can be included for devices or systems that receive the trip data. The instructions for broadcasting the one or more signals by the autonomous vehicle can include, for example, when and for how long to broadcast the signals; the location (e.g., latitude and longitude) from which to broadcast them; and/or the signal protocol (e.g., a wireless signal protocol) to use.
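
One hypothetical way to encode such broadcast instructions as a small configuration object is sketched below; the field names and values are illustrative only.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class BroadcastInstructions:
        """Hypothetical encoding of the broadcast instructions described above."""
        start_time: datetime        # when to begin broadcasting
        duration: timedelta         # how long to keep broadcasting
        broadcast_latitude: float   # where the signals should be broadcast from
        broadcast_longitude: float
        protocol: str               # e.g., "bluetooth-le" or "wifi"

    instructions = BroadcastInstructions(
        start_time=datetime(2018, 6, 1, 17, 30),
        duration=timedelta(minutes=5),
        broadcast_latitude=40.4406,
        broadcast_longitude=-79.9959,
        protocol="bluetooth-le",
    )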

“In some embodiments, the broadcast criteria can be based in part on the autonomous vehicle's location. For example, the broadcast criteria can be based on the autonomous vehicle being within a specified area or within a specified proximity (e.g., a distance in meters) of the pick-up area.

“At 706, the method 700 can include determining, based in part on map data and the location of the autonomous vehicle, a signal disturbance value (e.g., a data structure including one or more numerical values associated with an estimated decrease in the strength of the one or more signals received by a receiving device of the autonomous vehicle). For example, the vehicle computing system 108 can determine, using the map data, how many structures (e.g., buildings) could block the one or more signals. The vehicle can then use different frequencies or communication channels (e.g., different wireless frequencies) depending on how many structures are blocking the one or more signals.

“The map data can indicate one or more structures, conditions, or features that could block or interfere with detection of the one or more signals. The one or more structures, conditions, or features can include buildings (e.g., office buildings and/or residential homes); tunnels (e.g., tunnels through mountainsides and/or under waterways); vegetation (e.g., trees and/or shrubs); weather conditions (e.g., lightning, hail, rain, snow, and/or fog); and/or sources of electromagnetic transmissions (e.g., narrowband and/or broadband interference from electrical devices). Many factors can affect how much a given structure or condition attenuates a signal, including the type of material (e.g., a concrete wall can block a signal more than a wooden wall of equivalent thickness) and the size of the structure.

“In certain embodiments, the detection of the one or more signals can be based in part on the signal disturbance value. The signal disturbance value can also be used to modify the sensitivity with which the autonomous vehicle detects the one or more signals and/or the protocols and frequencies associated with those signals.
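
A minimal sketch of how a signal disturbance value might be derived from map data and then used to adjust the receiver is shown below; the attenuation figures, thresholds, and channel names are invented for illustration and are not taken from the disclosure.

    # Hypothetical attenuation estimates (in dB) per obstruction type.
    ATTENUATION_DB = {"concrete_building": 12.0, "wooden_building": 6.0,
                      "tunnel": 20.0, "dense_vegetation": 3.0}

    def signal_disturbance_db(obstructions: list[str]) -> float:
        """Sum an estimated attenuation over the obstructions that the map data
        places between the vehicle and the signal source."""
        return sum(ATTENUATION_DB.get(kind, 0.0) for kind in obstructions)

    def adjust_receiver(disturbance_db: float) -> dict[str, object]:
        """Pick a detection sensitivity and channel based on the disturbance value."""
        if disturbance_db > 15.0:
            return {"sensitivity_dbm": -95, "channel": "sub-ghz"}
        return {"sensitivity_dbm": -85, "channel": "2.4ghz"}

    obstructions = ["concrete_building", "dense_vegetation"]
    print(adjust_receiver(signal_disturbance_db(obstructions)))  # 15 dB -> default channel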

“FIG. 8 shows a flow diagram of an example method 800 of autonomous vehicle operation, according to example embodiments of this disclosure. One or more portions of the method 800 can be implemented by one or more devices (e.g., one or more computing devices or systems, such as the vehicle 104, the vehicle computing system 108, or the operations computing device 150 shown in FIG. 1). Furthermore, one or more portions of the method 800 can be implemented as an algorithm on any of the hardware components of the devices described herein (e.g., in FIG. 1) to, for example, receive trip data, travel to a pick-up location, and activate one or more vehicle systems. FIG. 8 shows elements in a particular order, for purposes of illustration and discussion. Those of ordinary skill in the art will appreciate that the elements can be adapted, rearranged, expanded, omitted, or combined without departing from the scope of this disclosure.

“At 802, the method 800 can include generating a trip identifier that can be used to identify the autonomous vehicle. The trip identifier can include information associated with one or more symbols (e.g., letters, words, numbers, pictograms, pictures, icons, colors, and/or patterns) that can be used to identify the trip. For example, the vehicle computing system 108 can access a data structure containing data associated with trip identifiers (e.g., numbers, letters, and words) and generate the trip identifier based on one or more existing trip identifiers and/or information from a random number generator.

“In some embodiments, the trip identifier, including the information associated with the one or more symbols, can be based in part on a randomly generated trip identifier and/or selected from a plurality of trip identifiers that have not been used within a predetermined time period. For example, a trip identifier can be generated randomly using a pseudorandom number generator and a random seed, which allows various elements of the one or more symbols (e.g., text, numbers, colors, shapes, sizes, and/or patterns) to be combined. Alternatively, a plurality of trip identifiers, such as trip identifiers stored in a database, can be accessed, and a trip identifier that has not been used within a predetermined time period can be selected for use as the trip identifier.
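
The sketch below illustrates both strategies described above: combining symbol elements with a seeded pseudorandom generator, and rejecting candidates that were used within a predetermined time period. All names and values are hypothetical.

    import random
    import string
    from datetime import datetime, timedelta

    SHAPES = ["circle", "square", "triangle", "hexagon"]
    COLORS = ["white", "green", "blue", "orange"]

    def random_trip_identifier(rng: random.Random) -> tuple[str, str, str, int]:
        """Combine symbol elements (shape, color, letter, number) at random."""
        return (rng.choice(SHAPES), rng.choice(COLORS),
                rng.choice(string.ascii_uppercase), rng.randrange(100))

    def unused_trip_identifier(recently_used: dict[tuple, datetime],
                               cooldown: timedelta,
                               rng: random.Random) -> tuple:
        """Draw identifiers until one appears that was not used within the cooldown."""
        now = datetime.now()
        while True:
            candidate = random_trip_identifier(rng)
            last_used = recently_used.get(candidate)
            if last_used is None or now - last_used > cooldown:
                return candidate

    rng = random.Random(42)  # a fixed seed makes the sketch reproducible
    print(unused_trip_identifier({}, timedelta(hours=24), rng))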

“At 804, the method 800 can include broadcasting one or more signals including the trip identifier. For example, the trip identifier can include a white square containing a black letter “S” and a green number “39” inside the square. The remote computing device associated with a prospective passenger can receive the one or more signals including the trip identifier and can then display the trip identifier on a display portion of the remote computing device. The autonomous vehicle can also display the trip identifier (e.g., the same trip identifier displayed on the remote computing device) on an exterior display area of the vehicle (e.g., a panel on the roof, a window, or a body panel of the vehicle). The trip identifier thereby allows a prospective passenger to identify the autonomous vehicle they requested for their trip.

“FIG. 9 depicts an example system 900 according to example embodiments of this disclosure. The system 900 can include a vehicle computing system 908, which can include some or all of the features of the vehicle computing system 108 shown in FIG. 1; one or more computing devices 910, which can include some or all of the features of the one or more computing devices 110; a communication interface 912; one or more processors 914; one or more memory devices 920, which can include a memory system 922 and a memory system 924; one or more input devices 926 and output devices 928; one or more remote computing devices 930, which can include some or all of the features of the one or more remote computing devices 130 shown in FIG. 1, along with one or more input devices 932 and output devices 934; a network 940, which can include some or all of the features of the network 124 shown in FIG. 1; and an operations computing system 950, which can include some or all of the features of the operations computing device 150 shown in FIG. 1.”

The vehicle computing system 908 can include the one or more computing devices 910. The one or more computing devices 910 can include one or more processors 914 and one or more memory devices 920, which can be included on board a vehicle including the vehicle 104. The one or more processors 914 can include any suitable processing device, such as a microprocessor, a microcontroller, an integrated circuit, an application specific integrated circuit (ASIC), and/or a digital signal processor (DSP), and can include a single processor or multiple processors that are operatively and/or selectively connected. The one or more memory devices 920 can include non-transitory computer-readable storage media, such as RAM, flash memory devices, and/or magnetic disks.

“The one or more memory devices 920 can store data that can be accessed by the one or more processors 914. The one or more memory devices 920, which can be included on board a vehicle including the vehicle 104, can include a memory system 922 that can store computer-readable instructions that can be executed by the one or more processors 914. The memory system 922 can include software written in any suitable programming language that can be implemented in hardware. The memory system 922 can also include instructions that can be executed in logically and/or virtually separate threads on the one or more processors 914. The memory system 922 can include any set of instructions that, when executed by the one or more processors 914, cause the one or more processors 914 to perform operations.

“For example, the one or more memory devices 920, which can be included on board a vehicle including the vehicle 104, can store instructions that, when executed by the one or more computing devices 910 on board the vehicle, cause the one or more computing devices 910 to perform operations such as the operations and functions described herein, including operations for receiving data (e.g., path data, context data, and/or traffic regulation data), activating one or more vehicle systems, and/or any other operations or functions described in this disclosure.

The one or more memory devices 920 can also include a memory system 924 that can store data that can be retrieved, manipulated, created, and/or stored by the one or more computing devices 910. The data stored in the memory system 924 can include data associated with a vehicle, including the vehicle 104; data acquired by the one or more data acquisition systems 112; path data associated with a vehicle's journey; context data associated with a state of an environment; traffic regulation data associated with traffic regulations in an environment; data associated with user input; data associated with one or more actions and/or control command signals; data associated with users; and/or any other data or information. The data in the memory system 924 can be stored in one or more databases, which can be split up so that they are located in multiple locations on board a vehicle, including the vehicle 104. The one or more computing devices 910 can also obtain data from one or more memory devices that are remote from a vehicle, including the vehicle 104.

The system 900 can include the network 940 (e.g., a communications network), which can be used to exchange signals (e.g., electronic signals) and/or data (e.g., data from a computing device), including signals or data exchanged among computing devices such as the operations computing system 950, the vehicle computing system 908, and/or the one or more remote computing devices 930. The network 940 can include any combination of wired (e.g., twisted pair cable) and/or wireless communication mechanisms (e.g., satellite, microwave, and/or radio frequency) and any desired network topology (or topologies). For example, the network 940 can include a local area network (e.g., an intranet), a wide area network (e.g., the Internet), a wireless LAN network (e.g., via Wi-Fi), a cellular network, a VHF network, and/or any other suitable communications network (or combination thereof) for transmitting data to and/or from a vehicle, including the vehicle 104.

“The one or more computing devices 910 can also include the communication interface 912, which allows for communication with one or more other systems on board a vehicle, including the vehicle 104, and/or with remote systems (e.g., over the network 940). The communication interface 912 can include any suitable components for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable hardware and/or software.


How to Search for Patents

A patent search is the first step toward getting your patent. You can do a Google patent search or a USPTO search. “Patent pending” describes a product that is covered by a pending patent application; you can search Public PAIR to find the application. After the patent office approves your application, you can do a patent number lookup to locate the issued patent, and your product is then patented. You can also use the USPTO search engine (see below for details) or get help from a patent attorney. Patents in the United States are granted by the United States Patent and Trademark Office (USPTO), which also reviews trademark applications.

Are you interested in similar patents? These are the steps to follow:

1. Brainstorm terms to describe your invention, based on its purpose, composition, or use.

Write down a brief but precise description of the invention. Don’t use generic terms such as “device,” “process,” or “system.” Consider synonyms for the terms you chose initially, and take note of important technical terms and keywords.

Use the questions below to help you identify keywords or concepts.

  • What is the purpose of the invention? Is it a utilitarian device or an ornamental design?
  • Is the invention a way to create something or perform a function, or is it a product?
  • What is the invention made of? What is its physical composition?
  • What is the invention used for?
  • What technical terms and keywords describe the invention’s nature? A technical dictionary can help you locate the right terms.

2. Use these terms to search for relevant Cooperative Patent Classifications with the Classification Search Tool. If you are unable to find the right classification for your invention, scan through the classification’s class schedules (class schemes) and try again. If you don’t get any results from the Classification Text Search, consider substituting synonyms for the words you used to describe your invention.

3. Check the CPC Classification Definition to confirm the CPC classification you found. If the selected classification title has a blue box with a “D” to its left, the hyperlink will take you to a CPC classification definition. CPC classification definitions help you determine the scope of the classification so you can choose the most relevant one. These definitions may also include search tips or other suggestions that are helpful for further research.

4. Use the Patents Full-Text and Image databases to retrieve patent documents that include the CPC classification. By focusing on the abstracts and representative drawings, you can narrow your list down to the most relevant patent publications.

5. Review this selection of patent publications closely for any similarities to your invention. Pay attention to the claims and the specification. Refer to the references cited by the applicant and the patent examiner to find additional relevant patents.

6. Retrieve published patent applications that match the CPC classification you chose in Step 3. Use the same strategy as in Step 4, narrowing your results to the most relevant applications by reviewing the abstracts and representative drawings on each page. Then examine the published patent applications carefully, paying special attention to the claims and the other drawings.

7. You can find additional US patent publications by keyword searching the AppFT and PatFT databases, by classification searching of non-US patents as described below, and by using web search engines to find non-patent literature disclosures of inventions. Here are some examples:

  • Add keywords to your search. Keyword searches may turn up documents that were not well categorized or that were missed during the classification search in Step 2. For example, US patent examiners often supplement their classification searches with keyword searches. Consider using technical engineering terminology rather than everyday words.
  • Search for foreign patents using the CPC classification by re-running the search in international patent office search engines such as Espacenet, the European Patent Office’s worldwide patent publication database of over 130 million patent publications. Other national patent databases are also available.
  • Search non-patent literature. Inventions can be disclosed in many non-patent publications, so it is recommended that you search journals, books, websites, technical catalogs, conference proceedings, and other print and electronic publications.

To review your search, you can hire a registered patent attorney to assist. A preliminary search will help you prepare to discuss your invention and related prior art with the attorney, so that the attorney does not have to spend time, and your money, on patenting basics.
