Invented by Kevin L Miller, Labyrinth Research LLC

As technology continues to advance, the number of privacy-impacting devices in our homes and workplaces is rapidly increasing. From smart speakers and security cameras to fitness trackers and medical devices, these devices collect and transmit sensitive personal information. As a result, there is a growing need for a unified control and validation system to ensure that these devices are secure and protect users' privacy. The market for unified control and validation of privacy-impacting devices is still in its early stages, but it is quickly gaining momentum. Companies are beginning to recognize the importance of privacy and security in the design and development of their products, and consumers are demanding more control over how their personal information is collected and used.

One of the key challenges in this market is the lack of standardization across devices. Each device may have its own unique privacy settings and controls, making it difficult for users to manage their privacy across multiple devices. A unified control and validation system would provide a standardized approach to privacy management, making it easier for users to understand and control their privacy settings across all their devices.

Another challenge is the complexity of the technology involved. Privacy-impacting devices often use a variety of sensors and data collection methods, making it difficult to validate that they collect and transmit data in a secure and privacy-preserving manner. A unified control and validation system would provide a framework for testing and validating these devices, ensuring that they meet industry standards for privacy and security.

The market is still evolving, but a number of companies and organizations are already working in this space. The Open Connectivity Foundation, for example, is developing a standard for secure and privacy-preserving IoT devices, and the Global Privacy Control initiative is working to create a universal setting that allows users to signal their privacy preferences to websites and apps.

As the market for privacy-impacting devices continues to grow, the need for a unified control and validation system will only become more pressing. Consumers are becoming increasingly aware of the risks associated with these devices and are demanding more control over their personal information. Companies that prioritize privacy and security in their products will be well positioned to succeed in this market; those that don't may face significant reputational and legal risks.

In conclusion, unified control and validation of privacy-impacting devices is an important and rapidly growing area of the technology industry. A unified control and validation system will provide a standardized approach to privacy management, making it easier for users to understand and control their privacy settings across all their devices. The future of this market is bright, and we can expect continued innovation and growth in the years to come.

The Labyrinth Research LLC invention works as follows

Techniques are described to help control and verify the behavior of privacy-impacting devices in accordance with the privacy expectations of individuals or other entities. Accountability and audit systems can verify IoT devices' control states against privacy preference inputs and notify device owners when their devices have been compromised by malware or viruses. A trust-enhancing, technically transparent system architecture comprises distributed application networking, distributed ledgers, smart contracts, and/or blockchain technology.
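The patent does not publish an implementation, but the accountability idea is easy to illustrate: a minimal append-only, hash-chained log (a simplified stand-in for the distributed ledger the patent invokes) in which a device records each control state it adopts and the ruleset that justified it. All class and field names below are hypothetical.

```python
import hashlib
import json
import time

# Minimal sketch of an append-only, hash-chained audit log: a simplified
# stand-in for the distributed-ledger accountability layer the patent
# describes. Class and field names are illustrative assumptions.

class AuditLog:
    def __init__(self):
        self.entries = []

    def append(self, device_id, control_state, ruleset_id):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {
            "device_id": device_id,
            "control_state": control_state,  # e.g. {"microphone": "off"}
            "ruleset_id": ruleset_id,        # the preferences this state answers to
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(record)

    def verify_chain(self):
        """Return True if no entry has been tampered with or reordered."""
        prev_hash = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev_hash:
                return False
            if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest() != e["hash"]:
                return False
            prev_hash = e["hash"]
        return True
```

Any auditor who replays the chain can detect tampering or reordering; a full distributed-ledger deployment would replicate this log across independent parties rather than keeping it on the device.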

Background for Unified Control and Validation of Privacy-impacting Devices

Internet of Things (IoT) devices, robots, autonomous agents and 'bots', smart home appliances and devices, mobile devices, conversational agents, drones, cameras, and other sensor-laden devices have become more prevalent in the physical spaces where humans live and work. They can perceive our presence, interpret what we do and say, record video, audio, and location data, interact with us physically by touching or approaching us, and communicate with or notify us in ways that can be intrusive.

Privacy concerns arising from these privacy-impacting devices are qualitatively distinct from those encountered on traditional websites and in mobile apps. The incentives behind the current paradigm of data privacy on the web stem from the lack of viable economic models for monetizing web services and published content. Privacy on websites and mobile apps is therefore usually framed in terms of 'notice and consent': modalities focused on obtaining a consumer's consent to the sale of personal or behavioral data to third parties for marketing use. Participants in this system have allowed informational privacy, with its associated notice-and-consent modality, to dominate the data privacy conversation, from regulatory themes to the user interfaces of privacy settings for granting or refusing consent. The issue is further complicated by the fact that privacy law generally protects a person only against privacy invasions that are unreasonable or unexpected. The interaction of the notice-and-consent modality with the amorphousness of the 'reasonableness' doctrine means that, over time, our 'reasonable expectations' of privacy erode as individuals grant blanket permission to web service providers to use their personal data in exchange for 'free' use of apps and services.

These privacy-impacting devices present more complex and nuanced privacy issues than web privacy. Viewed from these devices' perspective, privacy includes some classic concerns, such as data sharing on websites, but also physical privacy. 'Physical privacy,' as understood here, includes concepts such as whether a sensor may measure and record the physical characteristics of a human being (e.g., audio recording or heart rate monitoring); whether and how closely a machine may approach a human in certain situations; and whether and how a robot may touch a human. These types of physical privacy are more closely linked to interests protected by privacy torts such as intrusion upon seclusion and battery. Traditional notice-and-consent mechanisms are unlikely to be sufficient when applied to the privacy concerns of these devices.

Despite rapid advances in these privacy-impacting devices, there are no effective techniques for solving their privacy challenges.

Unlike web-based privacy models, people, robots, and the devices they carry are mobile, which makes privacy behavior expectations dynamic and context-specific. For example, robots may move into physical spaces inhabited or occupied by different people, and different people may enter or leave a robot's proximity at any given time. Privacy expectations can also be influenced by cultural norms, group values, and physical location, and situational contexts, such as an emergency, can override other concerns. In real-world situations, managing privacy expectations becomes exponentially harder when devices must select appropriate governance measures to accommodate potentially conflicting privacy requirements of multiple people simultaneously occupying an office, home, or public space. These individuals may be interacting with a device for the very first time. Devices must be able to navigate complex privacy settings and cultural norms, and they may need to seek clarification from people nearby or compromise in order to act effectively.

Aware of these shortcomings and nuances, we describe systems, system architectures, and techniques for selecting, defining, and controlling the privacy behaviors of devices in accordance with the privacy preferences and expectations of individuals and other entities. The techniques and systems presented are designed to regulate the observation, movement, and recording activities of privacy-impacting devices in accordance with the privacy behavior expectations of the humans they interact with. The disclosed solutions facilitate a common, consistent standard that helps privacy-impacting devices act in alignment with our contextually informed values, which require granular and scenario-specific restrictions on the range of actions a robot can take in a wide variety of environments, from assisting an elderly man to handing out brochures at a shopping mall.

The following explores the system architecture and characteristics of a technical framework for making available, fusing, and reconciling privacy behavior expectation data from multiple actors across every contextual level (cultural, societal, group, locational, individual, and situational), and for transforming that data into concrete instructions a device can use as behavioral controls.
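As a rough illustration of that fusion step, here is a minimal sketch under the assumption that more specific context levels override broader ones; the level ordering and control names are illustrative, not taken from the patent's claims.

```python
# Minimal sketch of layered context resolution: privacy expectations defined
# at broader contextual levels are overridden by more specific ones. The
# level ordering and rule names below are illustrative assumptions.

CONTEXT_LEVELS = ["cultural", "societal", "group", "locational",
                  "individual", "situational"]

def resolve_behavior_controls(expectations):
    """Fuse per-level expectation dicts into one set of device controls.

    `expectations` maps a context level to a dict of behavior controls,
    e.g. {"individual": {"audio_recording": "deny"}}. Later (more
    specific) levels override earlier (broader) ones.
    """
    controls = {}
    for level in CONTEXT_LEVELS:
        controls.update(expectations.get(level, {}))
    return controls

# Example: a situational emergency overrides an individual's preference.
prefs = {
    "societal":    {"audio_recording": "deny", "approach_distance_m": 2.0},
    "individual":  {"audio_recording": "deny"},
    "situational": {"audio_recording": "allow"},  # e.g. medical emergency
}
print(resolve_behavior_controls(prefs))
# {'audio_recording': 'allow', 'approach_distance_m': 2.0}
```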

The technical framework includes, but is not limited to: techniques, systems, and apparatuses for a 'privacy enunciator' device that announces the presence of entities within a device's range of action; techniques and systems for sharing individualized privacy behavior preferences through a trusted exchange mechanism, which devices can consult to inform their sensor recording and activation, movement and actions, communication modalities, and data persistence and sharing behaviors, based on individualized, context-sensitive, and role-sensitive privacy rules; techniques, systems, and apparatuses for helping privacy-impacting devices interact with other members in a technically transparent and trust-enhancing manner; and accountability and auditing mechanisms to encourage responsible adoption, including the use of distributed applications, distributed ledgers, and blockchain architectures. The participants in this exchange are portrayed as members of an 'interacting privacy trust system.'
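The enunciator concept can be sketched as a simple presence beacon. The sketch below assumes a UDP broadcast carrying opaque entity tokens; the port, wire format, and function names are all hypothetical choices for illustration.

```python
import json
import socket
import time

# Minimal sketch of a "privacy enunciator": a beacon that periodically
# announces the privacy identities of entities present within a space, so
# nearby privacy-controlled devices can fetch their preferences.

ANNOUNCE_PORT = 50505  # hypothetical well-known port

def announce_presence(entity_ids, zone_id, repeat=3, interval=1.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    message = json.dumps({
        "zone_id": zone_id,        # e.g. "livingroom-1"
        "entity_ids": entity_ids,  # opaque tokens, not raw identities
        "timestamp": time.time(),
    }).encode()
    for _ in range(repeat):
        sock.sendto(message, ("<broadcast>", ANNOUNCE_PORT))
        time.sleep(interval)
    sock.close()

# A listening device would bind to ANNOUNCE_PORT, decode the JSON payload,
# and request privacy rulesets for each announced entity_id.
if __name__ == "__main__":
    announce_presence(["entity-105A", "entity-105B"], zone_id="zone-101")
```

Broadcasting opaque tokens rather than raw identities keeps the enunciator itself from leaking the very information the system is meant to protect.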

This Brief Summary presents a few concepts in simplified form; they are described further in the Detailed Description below. This Brief Summary is not intended to identify key or essential features of the claimed subject matter, nor is it meant to limit the scope of the claims.

Accordingly, systems, system architectures, and techniques are described for selecting, defining, controlling, verifying, and auditing the privacy behaviors of devices in accordance with the privacy behavior preferences and expectations of individuals and entities. Embodiments describe techniques and systems for controlling and verifying privacy-impacting device behavior. Apparatuses for a 'privacy enunciator' device announce the presence of entities within a device's field of influence or action. Techniques and systems are disclosed for defining and sharing individualized privacy preferences with privacy-impacting devices. Techniques and systems for privacy preference resolution protocols and merger techniques allow automated or interactive resolution of conflicts among individuals in multi-actor settings or ambiguous contexts. Accountability and auditing mechanisms are also discussed for verifying the control state of devices against their privacy behavior preference inputs. Some embodiments use distributed ledger, smart contract, or blockchain technology to perform certain aspects of these techniques and systems. In addition, many more detailed embodiments and applications of these systems and techniques are described.
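One plausible (and purely hypothetical) reading of the merger step is a "most restrictive wins" policy that resolves what it can automatically and flags the rest for interactive resolution; the policy ordering below is an assumption, not the patent's specified protocol.

```python
# Minimal sketch of a preference merger for multi-actor settings: when
# several entities in a device's detection zone disagree, the most
# restrictive setting wins, and 'ask' settings are deferred to the humans
# present for interactive resolution.

RESTRICTIVENESS = {"deny": 2, "ask": 1, "allow": 0}

def merge_preferences(actor_prefs):
    """Merge per-actor rulesets; returns (merged, needs_interaction).

    `actor_prefs` is a list of dicts like {"audio_recording": "deny"}.
    """
    merged, needs_interaction = {}, []
    behaviors = {b for prefs in actor_prefs for b in prefs}
    for behavior in behaviors:
        values = [p[behavior] for p in actor_prefs if behavior in p]
        choice = max(values, key=RESTRICTIVENESS.__getitem__)
        if choice == "ask":
            needs_interaction.append(behavior)
        merged[behavior] = choice
    return merged, needs_interaction

merged, ask = merge_preferences([
    {"audio_recording": "allow", "video_recording": "deny"},
    {"audio_recording": "ask"},
])
print(merged, ask)
# e.g. {'video_recording': 'deny', 'audio_recording': 'ask'} ['audio_recording']
```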

Participants in this exchange are presented as members of an interacting privacy trust system. In an interaction, trust exists when one party believes that the other will behave in an expected way. Participants in the privacy trust system include devices controlled by the described technical methods; entities with privacy behavior preferences; a service that assists entities in defining their privacy behavior preferences and disseminates those preferences to devices; and auditors who verify that devices perform the correct control functions in response to a set of privacy behavior preferences.

The technical benefits of these features are numerous and varied. First, they allow users to quickly and easily define privacy preferences that apply across a variety of privacy-impacting devices, rather than configuring each device or website individually. Second, devices can control their privacy-impacting behaviors according to standardized, autonomous models, making device interaction and control mechanisms more efficient and generalized. Third, the system is designed to be sensitive to cultural and contextual factors, an important technical goal. Fourth, auditable transaction telemetry provides system transparency and accountability, a key feature in the design of complex autonomous systems. Auditable controls that compare the actual state of a device to its expected state can help detect spyware and other malware. Many other beneficial technical effects are described below with reference to specific embodiments.
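That last point (comparing a device's actual control state to the state implied by its preference inputs) can be sketched directly; the mapping from ruleset settings to sensor states below is an illustrative assumption, and all names are hypothetical.

```python
# Minimal sketch of the auditable control check described above: an auditor
# compares the control state a device reports against the state expected
# from its privacy-preference inputs. A mismatch may indicate misbehavior,
# misconfiguration, or malware.

def expected_state_from_ruleset(ruleset):
    """Derive the expected sensor on/off state from a merged ruleset."""
    # Illustrative mapping: 'deny' means the sensor must be disabled.
    return {behavior: (setting != "deny")
            for behavior, setting in ruleset.items()}

def audit_device(reported_state, ruleset):
    """Return the list of behaviors whose actual state diverges."""
    expected = expected_state_from_ruleset(ruleset)
    return [b for b, want in expected.items()
            if reported_state.get(b) != want]

# A microphone left on despite a 'deny' rule is flagged for the owner.
discrepancies = audit_device(
    reported_state={"audio_recording": True, "video_recording": False},
    ruleset={"audio_recording": "deny", "video_recording": "deny"},
)
print(discrepancies)  # ['audio_recording']
```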

It may be useful to introduce some terminology before discussing specific embodiments. These descriptions are not meant to be restrictive, and further detail on any term is provided in the detailed embodiments.

A 'privacy-controlled device' (PCD), as used in this document, is a device that can affect the privacy rights, expectations, or interests of a person or entity; it may also be referred to as a 'privacy-impacting device.' With respect to techniques and systems for controlling and verifying privacy behaviors, embodiments relate to devices with a variety of capabilities and morphologies. Specific devices are described in the various example scenarios, but several device categories are contemplated, including robots, Internet of Things devices, autonomous agents or 'bots', smart home appliances and devices, mobile devices, conversational agents, drones, cameras, and other sensor-laden gadgets.

For example, a robot of any sophistication will have dozens of sensors serving a variety of purposes: orienting the robot in its surroundings; identifying important objects or persons; attenuating the force it uses when performing movements or actions (e.g., grasping); determining the operating state of the machine or device; and recording the movements, sounds, actions, or other telemetry of people or entities for historical, accounting, and behavioral analysis purposes, as well as pure surveillance. Any sensor can violate privacy preferences depending on how and when it is used and how long its data is stored. Some components combine sensor and actuator functions.

Robotic vacuums, for example, may be able to map and record the floor plans of rooms or houses, as well as interrupt people in the home while completing their tasks (e.g., a person suffering from dementia may feel disturbed by a robotic vacuum entering the room). Drones can not only fly over homes and yards to monitor private areas from above, but can also interrupt or disrupt residents' enjoyment of those spaces.

PCDs may also include other data-gathering devices or telemetric equipment. A standard computer or mobile phone with a webcam that can record a stranger passing nearby is an example of a device that could impact privacy. Computers and mobile devices can transmit data on location, movement, or other behaviors their sensors detect, and a device with a browser can record a particular user's information access activities.

An 'entity,' as used here, includes all living things, including humans. The term may also include a deceased person if a privacy behavior impacts a privacy expectation or right that survives death, including the privacy expectations or rights of the deceased individual and of any living individuals (e.g., heirs) whose privacy expectations or rights depend on the deceased individual's privacy data. 'Entity' also encompasses legal entities, such as businesses, corporations, organizations, and non-profits, as well as places like private homes, automobiles or other transportation, stores, offices, facilities, hospitals, and airports, or any other collective or group, all of which may have a privacy expectation or right associated with their status (e.g., the U.S. Transportation Security Administration's expectations regarding the privacy of travelers at an airport, or a business's rights in recording customers on closed-circuit cameras in its store). The privacy rights and expectations of more than one entity can be affected by, and come into conflict within, the detection zone of a given device.

FIG. 1A is an example system/component environment in which some implementations of techniques and systems for selecting, defining, controlling, verifying, and auditing the privacy behaviors of devices can be realized. FIG. 1A depicts one embodiment or representation of a privacy trust system. The example component environment, explored in greater detail below, includes a privacy-controlled device (PCD) 100, which houses a privacy control component (PCC) 102A. PCC 102A detects, within its detection zone 101, the privacy identity of one or more entities, such as 105A and 105B. PCC 102A communicates with privacy service 120A by sending privacy ruleset requests 110 (e.g., over a network). Privacy service 120A can include various components/units or data stores.
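The request flow in FIG. 1A can be sketched as a simple client call from the PCC to the privacy service. The endpoint URL and payload schema below are assumptions for illustration; the patent does not specify a wire protocol.

```python
import json
import urllib.request

# Minimal sketch of the FIG. 1A interaction: a privacy control component
# (PCC 102A) detects entities in its detection zone and sends a privacy
# ruleset request (110) to a privacy service (120A) over the network.

PRIVACY_SERVICE_URL = "https://privacy-service.example/rulesets"  # assumed

def request_privacy_ruleset(device_id, detected_entity_ids):
    payload = json.dumps({
        "device_id": device_id,
        "entity_ids": detected_entity_ids,  # e.g. ["entity-105A", "entity-105B"]
    }).encode()
    req = urllib.request.Request(
        PRIVACY_SERVICE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())  # merged ruleset for the zone

# The PCC would then apply the returned ruleset as behavioral controls,
# e.g. disabling audio recording while entity-105A remains in zone 101.
```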

Click here to view the patent on Google Patents.