Invented by Bibhrajit HALDER, SafeAI Inc

The market for dynamically controlling sensor behavior is growing rapidly as industries across sectors recognize the importance of real-time data collection and analysis. Sensors play a crucial role in monitoring and measuring parameters in industries such as manufacturing, healthcare, agriculture, and transportation, and the ability to dynamically control sensor behavior adds a new level of flexibility and efficiency to these systems. Dynamically controlling sensor behavior refers to modifying and adjusting sensor settings, such as sampling rate, sensitivity, and data transmission frequency, in real time. This allows better adaptability to changing conditions and requirements, leading to improved accuracy, responsiveness, and cost-effectiveness.

One key driver behind the increasing demand is the rise of the Internet of Things (IoT). As more devices and systems become interconnected, sensors that can be remotely controlled and managed become essential. In a smart factory, for example, dynamically controlling sensor behavior enables real-time monitoring of production lines, allowing immediate adjustments to optimize efficiency and reduce downtime.

In healthcare, dynamically controlling sensor behavior has the potential to revolutionize patient monitoring and care. By adjusting sensor settings to individual patient needs, healthcare professionals can ensure accurate and timely data collection, leading to better diagnosis and treatment. Dynamically controlled sensors can also help prevent false alarms and reduce unnecessary interventions, improving patient comfort and safety.

Agriculture is another sector that can benefit greatly. By adjusting sensor settings based on environmental conditions such as soil moisture, temperature, and light intensity, farmers can optimize irrigation, fertilization, and pest control. This improves crop yield and quality while minimizing resource waste and environmental impact.

Transportation systems likewise leverage dynamically controlled sensors to enhance safety and efficiency. In autonomous vehicles, for example, sensors must adapt to changing road and traffic conditions in real time. By dynamically adjusting sensor behavior, these vehicles can make more informed decisions, such as adjusting speed or changing lanes, to ensure passenger safety.

The market for dynamically controlling sensor behavior is expected to see significant growth in the coming years. According to a report by MarketsandMarkets, the global smart sensor market, which includes dynamically controlled sensors, is projected to reach $91.5 billion by 2025, growing at a CAGR of 18.1% from 2020 to 2025. Key players in this market are continuously investing in research and development to enhance sensor capabilities and develop innovative solutions. For instance, advances in artificial intelligence and machine learning are enabling sensors to learn and adapt to changing conditions autonomously, further improving their performance and efficiency.

In conclusion, the market for dynamically controlling sensor behavior is expanding rapidly as industries recognize the benefits of real-time data collection and analysis. The ability to adjust sensor settings in real time allows better adaptability, accuracy, and cost-effectiveness across manufacturing, healthcare, agriculture, and transportation. As the IoT continues to grow, demand for dynamically controlled sensors is expected to increase, driving further innovation and market growth.

The SafeAI Inc invention works as follows

An infrastructure for improving the safety of autonomous systems is provided. The autonomous vehicle management system (AVMS) controls the functions or operations of a vehicle, machine, or other autonomous device to ensure that they are carried out safely. The AVMS can dynamically control the behavior of sensors attached to a vehicle. For example, for a given sensor, the AVMS can dynamically change and control what sensor data is captured by the sensor and/or communicated from the sensor to the AVMS (e.g., granularity/resolution, field of view, zoom), when the data is captured and/or communicated (e.g., on demand, according to a schedule), and how the data is captured and/or communicated (e.g., communication format, communication protocol, rate of data communication).
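The what/when/how control surface described above can be pictured as a single command structure that the AVMS sends to a sensor. The sketch below is purely illustrative: the names (`SensorCommand`, `comm_rate_hz`, etc.) and field choices are invented for this example and do not come from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorCommand:
    """Hypothetical command an AVMS might send to a sensor, grouping
    the three control dimensions: what data is captured (resolution,
    field of view, zoom), when it is captured/communicated (on demand
    or on a schedule), and how it is communicated (format, protocol,
    rate)."""
    sensor_id: str
    resolution: Optional[str] = None          # e.g. "1920x1080"
    field_of_view_deg: Optional[float] = None
    zoom_level: Optional[float] = None
    capture_mode: str = "scheduled"           # "on-demand" or "scheduled"
    schedule_hz: Optional[float] = None       # capture frequency when scheduled
    comm_format: str = "raw"                  # e.g. "raw", "compressed"
    comm_protocol: str = "can"                # e.g. "can", "ethernet"
    comm_rate_hz: float = 10.0                # data transmission frequency

# Example: ask a front camera to narrow its field of view and report
# at a higher rate, on demand.
cmd = SensorCommand(
    sensor_id="cam_front",
    field_of_view_deg=60.0,
    capture_mode="on-demand",
    comm_rate_hz=30.0,
)
```

Grouping the parameters this way keeps a single message per sensor; unset fields simply mean "leave that setting unchanged."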

Background for Dynamically Controlling Sensor Behavior

Recently, there has been a significant rise in the use and adoption of autonomous driving technology (e.g., autonomous vehicles). The large-scale adoption and application of artificial intelligence (AI) based technologies in the autonomous driving domain has played a part in this. AI-based applications for autonomous driving perform tasks such as identifying objects in an autonomous vehicle's environment and making automated decisions that affect the vehicle's motion. However, current autonomous driving solutions that use AI systems lack the tools necessary to ensure functional safety, and this is a major obstacle to the adoption and use of these technologies.

The present disclosure relates to autonomous vehicles and, more specifically, to artificial intelligence and machine learning techniques used by an autonomous vehicle management system to control the operations of an autonomous vehicle in a safe manner. Described herein are various inventive embodiments, including methods, systems, and non-transitory computer-readable storage media storing programs, code, or instructions executable by one or more processors.

An infrastructure that increases the safety of autonomous systems such as autonomous vehicles and autonomous machines is provided. The autonomous vehicle management system, also referred to as a controller system, is configured to control one or more autonomous operations of a vehicle or machine in a manner that ensures the operations are carried out safely. Examples of autonomous operations include, without limitation: autonomous driving along a route, scooping or dumping operations, moving objects or materials (e.g., moving dirt from one location to another), lifting material, driving, rolling or spreading dirt, and excavating or transporting objects or materials from one point to another.

In certain embodiments, the autonomous vehicle management system receives sensor data from one or more sensors associated with the vehicle. Based on this sensor data, it generates and maintains an internal map containing information that represents the state of the autonomous vehicle's environment (e.g., detected objects). The internal map is used in conjunction with other inputs, such as a goal (e.g., change lanes, turn right or left, perform a special operation such as digging or scooping), to generate a plan of action for the autonomous vehicle based on safety considerations. The plan of action may include a sequence of planned actions that the autonomous vehicle is to perform in order to achieve the goal in a safe manner. The autonomous vehicle management system can then control one or more vehicle systems to execute the actions specified in the plan.
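The sense/map/plan loop described above can be sketched in a few lines. All names and the toy safety rule (slow down if an object is detected within 5 m ahead and near the lane center) are illustrative assumptions, not the patent's actual planner.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class InternalMap:
    """Minimal stand-in for the AVMS internal map: detected objects
    keyed by id, each with an (x, y) position estimate in meters,
    x forward and y lateral relative to the vehicle."""
    objects: Dict[str, Tuple[float, float]] = field(default_factory=dict)

    def update(self, detections: Dict[str, Tuple[float, float]]) -> None:
        # Merge the latest sensor detections into the map.
        self.objects.update(detections)

def plan_actions(internal_map: InternalMap, goal: str) -> List[str]:
    """Very simplified planner: build an action sequence for a goal,
    prepending a safety action if any object is close ahead."""
    actions: List[str] = []
    too_close = any(x < 5.0 and abs(y) < 2.0
                    for x, y in internal_map.objects.values())
    if too_close:
        actions.append("slow_down")
    if goal == "change_lane_left":
        actions += ["signal_left", "steer_left", "center_in_lane"]
    return actions

m = InternalMap()
m.update({"pedestrian_1": (4.0, 0.5)})   # 4 m ahead, near lane center
print(plan_actions(m, "change_lane_left"))
# -> ['slow_down', 'signal_left', 'steer_left', 'center_in_lane']
```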

The autonomous vehicle management system can use a variety of artificial intelligence techniques, such as neural networks and reinforcement learning (RL), and uses models as part of its processing. For example, it may use a convolutional neural network (CNN) to identify objects in the autonomous vehicle's environment from sensor data captured by the vehicle (e.g., images taken by vehicle-mounted cameras). It can also use RL techniques to determine which actions to include in a plan of action so that the autonomous vehicle reaches its goal safely.

The autonomous vehicle management system uses different techniques to increase the safety of autonomous operations. For example, it can dynamically control the behavior of the sensors associated with the vehicle that provide it with sensor data, by sending instructions or commands to a particular sensor. For a sensor, the autonomous vehicle management system can dynamically change and control what sensor data is captured by the sensor and/or communicated from the sensor to the autonomous vehicle management system (e.g., granularity/resolution of the data, field of view of the data, partial/detailed data, how much data is communicated, zoom associated with the data, and the like), when the data is captured and/or communicated (e.g., on demand, according to a schedule), and how the data is captured and/or communicated (e.g., communication format, communication protocol, rate of data communication). The autonomous vehicle management system builds its internal map from the data provided by the sensors. By being able to dynamically control the behavior of the sensors, the information used to build and maintain the internal map can itself be dynamically controlled.

In certain embodiments, a controller system controls the autonomous operation of a vehicle. The controller system identifies a sensor whose behavior is to be changed and communicates an instruction to that sensor, causing a change in the sensor's behavior. The controller system then receives sensor data resulting from the changed behavior and, based on that data, determines an action for the vehicle to perform. This action is typically related to an autonomous operation; examples include, but are not limited to, driving along a route, a scooping or dumping operation, or another operation that involves moving or lifting an object or material. The sensor whose behavior is changed can be located onboard the vehicle or can be remote from it.
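The identify/instruct/receive/decide cycle in this paragraph can be sketched as follows. `FakeLidar`, the command dictionary, and the decision rule are all invented stand-ins chosen to make the sketch runnable, not details from the patent.

```python
class FakeLidar:
    """Stand-in sensor: applying a command changes how many points
    it reports per scan (a crude proxy for resolution)."""
    def __init__(self) -> None:
        self.points_per_scan = 100

    def apply(self, command: dict) -> None:
        self.points_per_scan = command["points_per_scan"]

    def read(self) -> dict:
        return {"points": self.points_per_scan}

def control_cycle(sensor: FakeLidar) -> str:
    """One iteration of the controller loop the passage describes."""
    # (1) The sensor whose behavior should change has been identified
    #     (here it is simply passed in).
    # (2) Communicate an instruction that changes its behavior:
    #     densify the output.
    sensor.apply({"points_per_scan": 1000})
    # (3) Receive sensor data resulting from the changed behavior.
    data = sensor.read()
    # (4) Determine the vehicle action from the richer data.
    return "proceed" if data["points"] >= 500 else "slow_down"

lidar = FakeLidar()
print(control_cycle(lidar))   # -> proceed
```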

The controller system can then control a vehicle system to perform the action. Vehicle systems include, but are not limited to, braking, steering, propulsion, electrical, and auxiliary systems.

In certain embodiments, the controller system updates the internal map it generates based on the sensor data received as a result of the change in the sensor's behavior. The updated map includes information about the vehicle's surroundings, and the action the vehicle takes may be determined using this updated internal map.

In certain embodiments, a controller system receives data indicating a vehicle goal. In response to the goal, the controller system determines a change that needs to be made in the sensor data it receives from a group of sensors. The controller system can then select a specific sensor from the set whose behavior needs to be altered in order to change the sensor data. The goal can be related to autonomous operation, to vehicle maneuvering, to vehicle interaction with an object within the vehicle’s surroundings, etc.
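One plausible, and entirely hypothetical, way to go from a goal to the sensor whose behavior should change is a capability lookup: map the goal to the data change it requires, then pick a sensor that can provide it. The goal names and capability tags below are invented for illustration.

```python
from typing import List, Optional

def select_sensor_for_goal(goal: str, sensors: List[dict]) -> Optional[str]:
    """Map a goal to the sensor-data change it needs, then select the
    first sensor in the set that can provide that change."""
    needed = {
        "turn_left": "wide_fov",        # a turn needs a wider field of view
        "inspect_object": "high_res",   # inspection needs finer resolution
    }.get(goal)
    for sensor in sensors:
        if needed in sensor["capabilities"]:
            return sensor["id"]
    return None  # no sensor can provide the required change

sensors = [
    {"id": "cam_front", "capabilities": {"high_res"}},
    {"id": "cam_left", "capabilities": {"wide_fov"}},
]
print(select_sensor_for_goal("turn_left", sensors))  # -> cam_left
```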

In certain embodiments, in order to communicate the instruction to the sensor, the controller system converts the instruction from a first format into a second format that the sensor understands, and the sensor receives the instruction in the second format. The communication may use a protocol that the sensor understands.
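The format-conversion step might look like the following sketch, where an internal dictionary instruction (the "first format") is packed into a compact binary "second format." The field ids and wire layout are invented for this example, not taken from any real sensor protocol.

```python
import struct

def to_sensor_format(instruction: dict) -> bytes:
    """Convert an internal instruction into a hypothetical binary
    format: a sequence of (field-id byte, float32 value) pairs,
    little-endian. Field ids are invented for illustration."""
    field_ids = {"rate_hz": 1, "zoom": 2}
    payload = b""
    for name, value in instruction.items():
        payload += struct.pack("<Bf", field_ids[name], float(value))
    return payload

# First format: a plain dictionary the controller works with.
raw = to_sensor_format({"rate_hz": 30, "zoom": 2.0})
len(raw)  # 10 bytes: two (id, float32) pairs of 5 bytes each
```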

In some cases, the content of the sensor data that the controller system receives from the sensor after communicating the instruction differs from the content of the data it received before. For instance, sensor data received after the instruction may capture a first volume of the vehicle's environment, while sensor data received before the instruction captured a second, different volume. In other cases, the sensor data received after the instruction has a different resolution than the data received before. In some scenarios, the change in the sensor's behavior causes the sensor to send data to the controller system at a different rate. In yet other scenarios, the instruction changes the data that the sensor captures.

These and other features and embodiments will become more apparent upon reference to the following specification, claims, and accompanying drawings.

In the following description, specific details are set forth in order to provide a thorough understanding of certain inventive embodiments. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive. The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.

Reference throughout this specification to "one embodiment," "an embodiment," or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in one embodiment," "in an embodiment," and similar language throughout this specification do not necessarily all refer to the same embodiment.

The present disclosure relates to autonomous vehicles and, more specifically, to artificial intelligence and machine learning techniques used by an autonomous vehicle management system to control the operations of an autonomous vehicle in a safe manner.


The autonomous vehicle management system described in this disclosure uses various techniques to increase the safety of autonomous operations. For example, it can dynamically control the behavior of the sensors that are associated with the vehicle and that provide the sensor data it processes: what data is captured and/or communicated (e.g., granularity/resolution, field of view, partial versus detailed data, how much data is communicated, zoom), when the data is captured and/or communicated (e.g., on demand, according to a schedule), and how the data is captured and/or communicated (e.g., communication format, communication protocol, rate of data communication). Because the system builds its internal map from this sensor data, dynamically controlling sensor behavior also dynamically controls the information used to build and maintain the internal map.

As an example, the autonomous vehicle management system can simulate and evaluate various "what-if" scenarios as part of its decision-making process. These what-if scenarios project different behavioral predictions onto the internal map of the autonomous vehicle and can be used to determine the safest sequence of actions the vehicle should take to achieve a specific goal. For example, the system can run different what-if scenarios to determine the best way to perform a turn, with each simulation modeling a different behavior pattern (e.g., varying speeds, paths, or pedestrians). The system then determines the safest course of action for the autonomous vehicle based on these simulations.
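The what-if evaluation described above amounts to rolling out each candidate action against every scenario and choosing the action with the best worst-case outcome. The sketch below is a toy version: the scenario fields and risk numbers are invented for illustration.

```python
from typing import List

def simulate(scenario: dict, speed: float) -> float:
    """Toy rollout: return a risk score for taking a turn at 'speed'
    under the scenario's conditions (all numbers illustrative)."""
    risk = speed * scenario["traffic_density"]
    if scenario["pedestrians"]:
        risk += 5.0
    return risk

def safest_speed(scenarios: List[dict], candidate_speeds: List[float]) -> float:
    """Evaluate each candidate action against every what-if scenario
    and keep the one whose worst-case risk is lowest (minimax)."""
    return min(candidate_speeds,
               key=lambda v: max(simulate(s, v) for s in scenarios))

scenarios = [
    {"traffic_density": 0.2, "pedestrians": False},  # light traffic
    {"traffic_density": 0.8, "pedestrians": True},   # busy, pedestrians
]
print(safest_speed(scenarios, [10, 20, 30]))  # -> 10
```

Taking the minimum over worst cases, rather than averages, reflects the safety emphasis of the passage: the chosen action must remain acceptable even under the most adverse predicted behavior.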
