Operationalizing Privacy in a Big Data Context

Ann Racuya-Robbins

Operationalizing privacy is largely about understanding the nature of the data you intend to analyze. Understanding the nature of the data involves intuition, ethics, and some ICT technical knowledge. An average adult has the ability to understand and make decisions on these matters.

Once the nature of the data is understood intuitively, ethically, and in a general technical way, privacy requirements for Big Data can be delineated in alignment with national standards, laws, and regulations, followed by further specification of the technical details that satisfy those requirements in deployment. Importantly, because this technical knowledge is based on intuitive and ethical parameters, the relationships between the privacy parameters and the technical ones should remain clear enough for the average person to understand.

Privacy Risk Assessment, Management, Prevention and Mitigation

Goal: Privacy Preserving Information Systems using Big Data, Big Data Analytics

Scoping the Privacy Context – A Question and Answer Tree.

Applies to both Pre-Big Data Processing/Analytics and Post-Big Data Processing/Analytics

QUESTION: The most important question is: Does your prospective data set(s) contain personal data*?

ANSWER: Yes.

FOLLOWUP QUESTION 1: How do you know?

FOLLOWUP ANSWER: Metadata Personal Data Tag

FOLLOWUP ANSWER: Provenance Report from Data Vendor or Agency

FOLLOWUP QUESTION 2: Can you verify? Reproduce?

FOLLOWUP ANSWER:

 

ANSWER: No.

FOLLOWUP QUESTION 1: How do you know?

FOLLOWUP ANSWER: Data Vendor Reports no Personal Data.

FOLLOWUP QUESTION 2: Can you verify?

FOLLOWUP ANSWER: No. Data Vendor maintains proprietary status of the data.

ANSWER: I don’t know.

FOLLOWUP QUESTION 1: How can you find out?

 

QUESTION: Does your data set contain “raw” data?

QUESTION: How large is your Data Set(s) Cluster?

< 100 GB

< 1.5 TB

< 100 TB

etc.

QUESTION: Will more than one data set be linked and analyzed?

QUESTION: What is the anticipated rate of arrival of the data? At what velocity will the Data Set(s) Cluster be processed/analyzed?

QUESTION: Is the data irregular and of multiple data types?

QUESTION: Will the processing/analytics be used for real-time decision-making?

More QUESTIONS to be determined.
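The question-and-answer tree above can be represented as a data structure so that the scoping dialogue is machine-checkable as it grows. The sketch below is illustrative only: the `Question` class and the answer keys are assumptions for this example, not part of any standard, and only the first question of the tree is encoded.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """One node of the scoping Q&A tree."""
    text: str
    # Maps an answer string to the list of follow-up Questions for that answer.
    followups: dict = field(default_factory=dict)

# Root of the tree: the most important question from the section above.
root = Question(
    "Does your prospective data set(s) contain personal data?",
    followups={
        "yes": [Question("How do you know?"),
                Question("Can you verify? Reproduce?")],
        "no": [Question("How do you know?"),
               Question("Can you verify?")],
        "don't know": [Question("How can you find out?")],
    },
)

def followups_for(question, answer):
    """Return the follow-up question texts for a given answer."""
    return [q.text for q in question.followups.get(answer, [])]
```

Encoding the tree this way lets new questions be added without changing the traversal logic, and makes every branch of the scoping dialogue auditable.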

 

*Appendix

Definitions TBD

Personal Data/Information

Data Actions

             Collection

Processing

Use

Logging

Disclosure

Generation

Retention

Transformation

Transfer

Inference

Extrusion

Pollution

Manageability
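Since the definitions above are still to be determined, the list of data actions can meanwhile be pinned down as an enumeration, so that logging and policy code refer to a fixed vocabulary rather than free-form strings. This is a minimal sketch; the `audit_entry` helper and its fields are hypothetical.

```python
from enum import Enum, auto

class DataAction(Enum):
    """The data actions enumerated in the appendix (definitions TBD)."""
    COLLECTION = auto()
    PROCESSING = auto()
    USE = auto()
    LOGGING = auto()
    DISCLOSURE = auto()
    GENERATION = auto()
    RETENTION = auto()
    TRANSFORMATION = auto()
    TRANSFER = auto()
    INFERENCE = auto()
    EXTRUSION = auto()
    POLLUTION = auto()
    MANAGEABILITY = auto()

def audit_entry(action: DataAction, dataset_id: str) -> dict:
    """Build a minimal audit-log record for a data action on a dataset."""
    return {"action": action.name, "dataset": dataset_id}
```

A fixed enumeration means a typo such as "Colection" fails loudly at lookup time instead of silently creating a new, untracked action.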

 

Personal Data Metadata Tags

General Personal Data = PD-G

Very Sensitive Personal Data = PD-VS
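The two tags above can be attached to fields as metadata and collapsed to a single sensitivity level per record. The sketch below assumes a simple ordering (PD-VS stricter than PD-G); the field names in the example record are illustrative.

```python
# The two personal-data metadata tags defined above.
PD_G = "PD-G"    # General Personal Data
PD_VS = "PD-VS"  # Very Sensitive Personal Data

def strictest_tag(tags):
    """Return the most restrictive personal-data tag present, or None.

    Assumes PD-VS dominates PD-G; untagged data yields None.
    """
    if PD_VS in tags:
        return PD_VS
    if PD_G in tags:
        return PD_G
    return None

# Tagging individual fields of a record (field names are hypothetical):
record_tags = {"zip_code": PD_G, "health_status": PD_VS}
record_level = strictest_tag(record_tags.values())
```

Collapsing per-field tags to a record-level tag is one simple policy; finer-grained handling (per-field controls) would keep the full mapping instead.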

 

Privacy Rights Risks, Harms and Mitigations (Controls)

Rights TBD

 

Harms

Appropriation: Personal information is used in ways that deny a person self-determination or fair value exchange.

Breach of Trust: Breach of implicit or explicit trusted relationship, including a breach of a confidential relationship

Distortion: The use or dissemination of inaccurate or misleadingly incomplete personal information

Exclusion: Denial of knowledge about or access to personal data. Includes denial of service.

Induced Disclosure: Pressure to divulge information.

Insecurity: Exposure to future harm, including tangible harms such as identity theft or stalking.

Loss of Liberty: Improper exposure to arrest or detainment.

Power Imbalance: Acquisition of personal information about a person that creates an inappropriate power imbalance, or takes unfair advantage of or abuses a power imbalance between the acquirer and the person.

Stigmatization: Personal information is linked to an actual identity in such a way as to create a stigma.

Surveillance: Collection or use, including tracking or monitoring, of personal information in ways that can restrict free speech and/or other permissible activities.

Unanticipated Revelation: Non-contextual use of data reveals or exposes person or facets of a person in unexpected ways.

To Be Defined:

Data Inference

Extrusion

Pollution

Bias

Discrimination

Data Subjects Intellectual Property

 

Preventions, Mitigations, Controls

 

Big Data Guidelines Repository at WIPO or

 

Human Trust Experience and Data Actions

Recently I attended a Privacy Workshop hosted by NIST. One of the insights that emerged is the difference between security language and privacy language. For example, while a phrase like "data actions" may be useful and meaningful from the security engineering perspective, from the perspective of human beings the term is quite empty. Identity is emergent, tender, and personal, lying in the field of emotions and of life and death. Identity is alive. We should not be impatient that such an important subject is hard, perhaps beyond our ability at present to speak to. Privacy, too, is new and unformed.

I sensed that by the end of the NIST Privacy Workshop there was an awareness of the raw and vast scope of the problem.

When "data actions" means inferring what a human being is going to do or think next and monetizing that to generate revenue for a third party, or releasing the recent date of your brother's death for monetary purposes, the emotional danger of these "actions" emerges.

Context is a wonderful tool to help us. But some things carry across context. I think we should look for our humanity in every context and accept nothing less.