What if you developed a Big Data technology so powerful that it could pull intelligence from mountains of data, and tell you things that no one else knows? How would you control it, keep it from being misused, or from falling into the wrong hands? Who can you trust to use it responsibly?
Trust and control are more critical than ever in the era of Big Data. But what those things mean is becoming more complex and nuanced. To see how, read this New York Times article about Palantir Technologies.
In the world of Big Data, Palo Alto-based Palantir has become something of a juggernaut. The company has raised $900 million from investors and is reportedly valued at $9 billion. Much of the article focuses on whether it is headed toward an IPO, as investors hope, or will stay private, as its CEO wants.
It’s interesting for Silicon Valley insiders, sure. But what matters to the rest of us is the nature of Palantir’s business. Palantir has developed some of the most sophisticated software around to detect and understand patterns in large sets of data. The company was started, in part, with money from PayPal cofounder Peter Thiel and the venture capital arm of the CIA. Its technologies are being used by insurance companies, investment firms and national governments.
By many accounts, the software has done tremendous good in the hands of humanitarian groups and other non-profits. Those successes serve to reinforce the reasons so many of us are optimistic about the benefits of Big Data.
But things can get tricky. Palantir CEO and co-founder Alex Karp says the company can’t guarantee what people and institutions will do with its powerful new tool.
“When you are saving the world, fighting fraud and slave labor, you can do great things,” Mr. Karp told the New York Times. “What concerns me,” he said, “is working with commercial entities, and non-U.S. governments.”
Privacy is a major issue. Palantir’s software ships with privacy safeguards, but there is no assurance that the companies buying it will actually use them. As New York Times reporter Quentin Hardy writes: “Executives worry more about what compromise might do to the company — and to society.”
Hardy also raises the issue of keeping control of the technology.
“As Palantir expands into offering services to the private sector — now perhaps 70 percent of its business — Mr. Karp’s worry is losing control of what happens with its software. The privacy controls are, after all, optional. And, ultimately, it can’t control who gets the software. If, for example, a tobacco company wanted Palantir technology, it could acquire an existing Palantir client.”
The absence of absolute control over one’s products is not new. Tech companies in particular have always faced scrutiny over which organizations and countries buy their products, and for what purpose. But with Big Data, there is a sense that the services and technologies reach much more deeply into the lives of ordinary people, whose information is being gathered and scrutinized. It’s significant that the folks at Palantir were willing to acknowledge those implications in a national newspaper article.
If it turns out someone does use Palantir’s software for nefarious purposes, it will damage the company’s reputation. But given Palantir’s prominence, particularly in a post-Snowden world where many are prepared to believe the worst, such an incident could also reflect badly on every company working in the Big Data realm.
The upshot is that, even if companies in this area never work directly together, they may be broadly linked in the public’s mind. That means it’s worth thinking about how your own data policies affect not just your internal systems but the wider ecosystem. It’s a lot to contemplate.
Image credit: David Robert Bliwas