It seems to me that much of contemporary governance is built on the premise that statistical and computational methods offer an unbiased, evidentiary basis for standards development, policy, and enforcement. In a democracy, where differences of opinion are inevitable, this matters: the best practice is to resolve differences through discussion grounded in unbiased evidence, with people encouraged to come together voluntarily and even to compromise. Big Data erodes, if not undermines, the fairness of this “unbiased evidentiary” basis of statistical/computational approaches. The lack of privacy in the Big Data setting is one expression of this erosion. It is a problem for individuals and entities alike; it is really a collective problem of our time. Introducing noise into the computation, taken overall, has roughly as many drawbacks as benefits.
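To make the noise tradeoff concrete, here is a minimal sketch in the style of the Laplace mechanism from differential privacy, which is one common way noise is introduced into a computation. All names and parameters here are illustrative assumptions, not from the source: stronger privacy (smaller `epsilon`) demands larger noise, which directly degrades the accuracy of the released statistic.

```python
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    # Illustrative: the difference of two independent Exp(1) draws,
    # scaled by `scale`, is a Laplace(0, scale) sample.
    return scale * (rng.expovariate(1.0) - rng.expovariate(1.0))

def noisy_count(true_count: int, epsilon: float, rng: random.Random) -> float:
    # For a count query (sensitivity 1), Laplace noise of scale
    # 1/epsilon gives epsilon-differential privacy.
    return true_count + laplace_noise(1.0 / epsilon, rng)

def mean_abs_error(true_count: int, epsilon: float,
                   trials: int = 10_000, seed: int = 0) -> float:
    # Average distortion of the released count over many trials.
    rng = random.Random(seed)
    return sum(abs(noisy_count(true_count, epsilon, rng) - true_count)
               for _ in range(trials)) / trials

# Smaller epsilon = stronger privacy = larger average error:
strict = mean_abs_error(1000, epsilon=0.1)  # strong privacy guarantee
loose = mean_abs_error(1000, epsilon=1.0)   # weaker privacy guarantee
```

Under these assumptions the expected absolute error scales as 1/epsilon, so a tenfold strengthening of the privacy guarantee costs roughly a tenfold loss in accuracy; this is the drawbacks-versus-benefits tension the passage describes.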
This is the conundrum I have been wrestling with, and one I hope to shed some light on in the Implications for Life in a Time of Big Data whitepaper I am working on in the NIST Big Data Public Working Group.