Big data analytics: a powerful force for good or evil?
With an ever-increasing amount of data generated by the Internet of Things, the most interesting questions about the digital transformation concern how we will use this data. Will it ultimately be used for good or evil?
Palantir is one of Silicon Valley’s most exciting and most secretive startups to emerge from the digital transformation. The company, founded by Peter Thiel, arose out of an effort to use data to combat credit card fraud at PayPal.
Motivated by the terrorist attacks of 9/11, Thiel founded Palantir as a “mission-driven company” focused on using data to “reduce terrorism while preserving civil liberties.” The company was launched into prominence by a $2 million investment from In-Q-Tel, the CIA’s venture capital arm, whose investors noted, “The most impressive thing about the team was how focused they were on the problem … how humans would talk with data.” (3)
Palantir’s business model involves providing software that aggregates and manages diverse data sets and organizes the information in an intuitive way, enabling customers to “ask better questions and make better decisions” in any domain of complex problem solving (2).
In the early days, Palantir worked almost exclusively with government agencies to stop fraud, find terrorists, and aid in military missions. The team executed on its mission to do good in the world through cooperation with the US military during the wars in the Middle East. The company is famous for predicting roadside bomb placements to highlight safe paths through the streets of Baghdad and for analyzing artillery fragments, location data, and social media posts to locate bomb makers; it is even rumored to have been involved in locating Osama bin Laden. (1)
However, the company’s operating model, combined with its motives for growth and profitability, has recently come into conflict with its mission-driven business model.
Palantir delivers on its business model by selling perpetual licenses to its analytics software along with support services. When Palantir sells its software, engineers deploy to the client site to help organize, tag, and integrate data and to build custom user interfaces that enable problem solving. They then leave the product with the client for ongoing use. As the company grows, and as control over clients and use cases decentralizes, maintaining its ethical standards becomes more complicated.
An example of this conflict came to light when Palantir engaged with the Northern California Regional Intelligence Center (NCRIC), a federal intelligence unit established after 9/11 to fight domestic terrorism. The NCRIC uses Palantir software to, among other things, aggregate license plate photos automatically captured by local police cars and traffic cameras. Proponents of the program boast that the software can search the database of 500 million license plate photos in a matter of seconds. They point to an example in which a child abductor was located and arrested within an hour of the child’s disappearance. Detractors decry the program as an example of government overreach and a violation of privacy. One such detractor, Michael Katz-Lacabe, filed a Freedom of Information Act request on his two license plates, and the police department sent him 112 photos that had been captured by these cameras. He reflected that with this technology, the government can “wind back the clock and see where everyone is.” Should we worry that, in the wrong hands, this technology could be used for government discrimination or for political or criminal purposes?
Palantir recognizes that its ability to continue to exist and attract customers is tied to its ability to protect data privacy and security, and it highlights two specific commitments in this area:
- Its software is designed with privacy and civil liberties capabilities baked in rather than added on as an afterthought (2)
- The Palantir Council of Advisors on Privacy and Civil Liberties, a group of advocates and policy experts, helps the company address concerns and pushes for legislation that will protect privacy and civil liberties (2)
The line between privacy and security can be blurry, and many people are worried about Palantir’s ability to walk that line (3). The pressure on Palantir to deliver shareholder returns is growing, and it will push the company to loosen its ethical standards and compromise its ability to walk away from work that does not align with its mission.
To take this commitment one step further, I would like to see Palantir create an independent (non-owner) oversight board with rotating elected membership that holds veto power over any project or use of the software that does not align with the mission. Such a board, while complicated to architect, may be the best chance we have of preventing the destruction of privacy and the sinister use of Palantir’s incredible power.
(727 Words)
I like the point you made about privacy. I recently attended a data security talk at my engineering school. It’s scary that a couple of companies now control most of the data, and people are not aware of the risk. When we use free resources on the internet, we don’t realize they actually come with a cost: through these services, companies like Facebook and Google own vast amounts of data about us. Using this data, they know us better than we know ourselves. These internet giants are ‘harvesting’ personal data and making billions of dollars a year, but they are not properly regulated.
I believe that data privacy rights will be one of the most important issues our generation grapples with. Unfortunately, big data aggregation can be somewhat paradoxical: the same technology that can be used for altruistic purposes can also be used for far more nefarious purposes. We have already seen these issues come to light through the Edward Snowden leaks, debates over tech companies’ rights to protect customers’ data, etc. I think Danny nailed the toughest question… how do you create an objective standard for privacy rights violations? Perhaps the answer is the independent board that Danny proposed. Unfortunately, I think there will need to be laws at the federal level; otherwise, accountability becomes too subjective. Even if Palantir decides to be altruistic with this type of independent board, more competitors will pop up who won’t necessarily work by the same standards. I think the second issue is citizen awareness. Citizens will need to be given the right to determine specifically who can use their data and to what end. In a sense, I think citizens need to fight for laws that grant them control over their own data.
I’m with Jesse in thinking that data privacy rights will become a huge issue in the next few years. At this point in time, people seem to place a very low value on their own data, but as more and more information is collected, and as the government and private industry develop the capability to actually process and act on that data, I suspect individuals will become more protective of their privacy.
It is particularly concerning to me to see large data companies such as Palantir and Google so tightly aligned with the government. I worry that these companies do not have the checks and controls in place to protect individuals and will begin to act solely in their own interests and the government’s. I think we will continue to see conflict-of-interest issues emerge in the coming years (such as Peter Thiel speaking at the RNC) that encourage individuals to recognize the risk of not controlling data privacy.
Thanks for the post, Danny. Great points all around. It’s pretty troubling how much data is being collected without full public knowledge or consent. Even if Palantir is able to prevent major catastrophes, the means by which it does so is undeniably a slippery slope. The public should play a role in deciding how much privacy they are willing to sacrifice for safety. I remember listening to a podcast about a company that had designed and built a camera that could be flown above a city and give authorities the ability to retrace steps and find the perpetrators of shootings, kidnappings, bombings, etc. when they occur. The company was allowed to work with authorities in Dayton, OH and Juarez to test its capabilities. Despite some really favorable results, the residents of Dayton ended up voting against the permanent integration of these cameras into the police force. They simply weren’t comfortable with 24/7 surveillance, no matter what its stated purpose was.
This was a very informative post about a company that I knew little about, though given Palantir’s secretive business model, I’m not surprised. I have a less sanguine view of Palantir’s ability to mediate the inherent conflict between maximizing shareholder value and protecting civil liberties. While the idea of an independent oversight board is commendable, I question how effective this board could actually be at protecting civil liberties. The primary problem that this board would face is information asymmetry. Management would likely control the information that the board has access to, so the board’s decisions would only be as good as the information they receive. Moreover, Thiel could stack the board with “independent” members who are actually in his back pocket through other connections.
I think this post raises a bigger question of whether or not private companies should even be in the business of analyzing big data to prevent terrorism and crime. These areas have traditionally been the responsibility of local, state, and national governments. While the government oversteps its bounds from time to time, it’s ultimately held accountable by voters. However, citizens have very limited power, if any, to control the decisions made by Palantir. I think we as a society need to assess whether or not it makes sense for governmental agencies to do business with companies like this.
This is a really informative post – thanks. I completely agree with Sam’s comment above: to what extent can society allow private companies the power to use analytics to prevent such incidents? Of particular concern is the point about shareholder pressure and the likelihood that the company will succumb and loosen its ethical standards. Does more need to be done in terms of partnering with government and national security experts to implement the right checks and balances? Your point about the non-owner board could be expanded, for example by ensuring that a good portion of board seats are filled by such experts.