"[W]here privacy is afforded, it is afforded by the grace of inefficiency"
— Kerry Howley in Drone Wars: Call Me a Traitor [nymag.com], published by New York Magazine
The article is about the horrors of America’s drone wars: committing murder from afar against terrorists and civilians, and possibly about American war crimes. I’m going to avoid commenting on that topic for now. I’m going to ignore the context of the quote and talk about its substance in relation to privacy in a more general sense, mostly in the first world, where military and spy drones are not surveilling us but many other things are.
The quote struck me because, even for people far from the drone battlefield, our privacy is often also granted through inefficiency, and through inconvenience, obscurity and cost.
So much data is collected about us: videos, location data, images to be mined for facial recognition, etc., etc., etc. It’s collected by spy agencies —as in this story— but also by a plethora of private companies, big tech, law enforcement and beyond. “Privacy” for most people exists because there is too much data to process, or because the algorithms that process it are fully automated and no human actually looks unless there is an issue, or because the algorithm is not interested in your sex life or your gambling habit except in so far as they can be used to sell you something.
The government, and your angry partner, might be interested in such things, and they might hire people to follow you —physically, or digitally— but for the most part, even if Google or Amazon had the data to know you are hiding your sexual orientation, that you are philandering, or what happened in Vegas, they don’t care. That apathy on the part of the data collectors is what keeps many things private today.
The computers know all, but most of the time it’s not worth anyone’s while to look at the data on you, and the algorithms are looking for specific things. Sure, deep neural networks may accidentally find a correlation between something you don’t want exposed to the public and what a company is trying to sell, but they don’t broadcast the correlation, only output the recommendation. (There is actually a problem in machine learning around interpretability and explainability —which is basically “can you explain why the system made a decision or recommendation?” It’s an active area of research, but most complex machine learning or neural network systems can produce results that their creators have a hard time explaining; they can’t dissect the logic, the system is a magic black box.)
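To make the black-box point concrete, here is a minimal sketch of permutation importance, one common post-hoc explainability technique: you don’t open the model up, you just shuffle one input at a time and watch how much worse its predictions get. The “black box” model and the data here are toy assumptions of mine, not anything from the article.

```python
import random

random.seed(0)

# Toy "black box": we treat its internals as opaque.
# It secretly uses only features 0 and 1, never feature 2.
def black_box(x):
    return 3.0 * x[0] + 0.5 * x[1] ** 2

# Synthetic data where the target matches the model's output.
X = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(200)]
y = [black_box(x) for x in X]

def mse(model, X, y):
    return sum((model(x) - yi) ** 2 for x, yi in zip(X, y)) / len(X)

def permutation_importance(model, X, y, feature):
    """Importance = how much the error grows when one feature's
    column is shuffled, breaking its link to the target."""
    base = mse(model, X, y)
    shuffled = [x[feature] for x in X]
    random.shuffle(shuffled)
    X_perm = [x[:feature] + [s] + x[feature + 1:]
              for x, s in zip(X, shuffled)]
    return mse(model, X_perm, y) - base

importances = [permutation_importance(black_box, X, y, f) for f in range(3)]
print(importances)  # feature 0 dominates; feature 2 contributes nothing
```

This only tells you *which* inputs matter, not *why* the model combines them the way it does, which is exactly the gap interpretability research is trying to close.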
I used to joke that I gave up on worrying about the NSA reading my emails because I realized that my life is just not interesting enough for anyone to look, and if, on occasion, something is flagged by an algorithm and an analyst does actually look, they will realize very quickly that I am not a person of interest. (My life is boring, I pity my FBI agent.) As an aside, I briefly worked for “the customer” [confusion.cc] and there were, in the early 2000s, already at least a few programs being built to automate the processing of the data that was hoovered up by the TLAs. It was some advanced shit for the time but prehistoric by the standards of what is publicly available today from big tech.
A big actual problem, given that at least where big tech is concerned we actively give them all this information, is that this data never dies… it goes into the digital archives and is there forever, so if someone, for appropriate or nefarious reasons, decides to dig it up, it’s there. There needs to be an expiry date for all this collected data. The GDPR, for example, gives you a right to be forgotten. Telco companies I work with are required to keep billing data for seven years: usually available instantly for a year or two, and then archived (slower to retrieve, but often it must be producible within 24 hours for legal requests) for another five or six (it depends on the jurisdiction…), but after that they typically dump it to save on storage space. Maybe there should be a law that all the raw data collected by companies or governments on people must be archived after a year or two and deleted (and dropped from any algorithm’s calculations) after a few more years. It’s no good to Google to know what I was interested in eight or nine years ago; really, selling me things well should not take more than the most recent year or two’s data. I guess it’s more requiring that things be automatically forgotten than real privacy, but…
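That telco-style tiered schedule could be sketched as a simple lifecycle check. The function name and the exact cut-offs here are my own illustration; real retention periods vary by jurisdiction, as noted above.

```python
from datetime import date

# Illustrative tiers for the retention schedule described above
# (actual periods depend on jurisdiction and regulation).
HOT_YEARS = 2      # online, instantly available
TOTAL_YEARS = 7    # archived until here, then purged

def retention_state(record_date: date, today: date) -> str:
    """Classify a billing record as 'hot', 'archived', or 'delete'."""
    age_days = (today - record_date).days
    if age_days <= HOT_YEARS * 365:
        return "hot"       # instant retrieval
    if age_days <= TOTAL_YEARS * 365:
        return "archived"  # slow retrieval, e.g. within 24h for legal requests
    return "delete"        # past retention: purge to save storage

today = date(2021, 1, 1)
print(retention_state(date(2020, 6, 1), today))  # hot
print(retention_state(date(2016, 1, 1), today))  # archived
print(retention_state(date(2010, 1, 1), today))  # delete
```

The point of a law like the one proposed above would be to make the final `delete` tier mandatory for everyone who collects data on people, not just for telcos with storage bills.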
I remember reading about a guy who was shocked when, in a job interview, the prospective employer asked him about his messy divorce. He was shocked because the divorce took place a few years before and on the other side of the country; he had moved cross-country after the divorce and never spoke about it to anyone in his new home. But the prospective employer had googled him and found the divorce story in the local newspaper’s now-online archive, along with the court documents, which were also online. The thing is, the way the law works in the US, those court documents were always public, but before such things were posted online en masse the only way to get them was to march down to the courthouse… which a local reporter might do for a messy divorce, or the government might hire someone to do when processing a security clearance, but you would never expect a random potential employer a thousand or more miles away to have visited the courthouse or to have read the local papers. That’s why you move, to start over. So in the pre-Internet days a lot of privacy was privacy through inconvenience; our laws and, if you are older than the Internet, our expectations have not kept pace. A lot of what we get upset about is not new in concept; what was hard has become easy with the rise of technology.
I guess, in the end, all of this is to say: the laws need to be updated to match what people actually expect. The EU has made a start, California has tried something, but the US as a whole, and most places, are, as usual, legislatively way behind the technology and the businesses. Time to catch up.
Wow… this was supposed to be a short post for a nice quote. So let me stop here.