Clearview – Any Chance of Redemption?

Clearview grabbed the world’s attention early this year when details of its facial recognition software, Clearview AI, were leaked to the public.  In an era defined by surveillance capitalism, data privacy and ethics are constant topics of conversation, with growing public mistrust and skepticism over how tech companies handle private data.  Clearview has done nothing to ease these concerns, and its indiscretions may compromise the company’s ability to succeed in the long run.

What do they do?

Clearview’s business is simple: Clearview AI identifies people from the slimmest of inputs, and has reportedly recognized people from reflections in mirrors and from low-resolution security footage[1].  To do this, individual pictures are ‘scraped’ from sources all over the internet, including social media and other public websites, giving Clearview AI billions of images with which to improve the accuracy of its facial recognition software. This data is then used to provide facial recognition services to large public and private institutions such as law enforcement agencies and financial services companies.  Clearview’s value creation and value capture strategy is therefore straightforward: it uses its intellectual property to provide clients with accurate facial recognition and gets paid for it.
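To make the mechanics a little more concrete, below is a minimal sketch of the general scrape-then-match pipeline a product like this relies on, using the open-source face_recognition library: images collected beforehand are turned into face embeddings, and a probe photo is compared against them by distance. The folder name, probe file, and 0.6 threshold are placeholder assumptions for illustration; this is a toy version, not Clearview’s actual system, which operates on billions of images with purpose-built infrastructure.

```python
from pathlib import Path

import face_recognition

# Build a small "gallery" of face embeddings from previously collected images.
gallery_paths = []       # where each face came from
gallery_encodings = []   # 128-dimensional face embeddings
for path in Path("scraped_images").glob("*.jpg"):   # placeholder folder name
    image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(image):
        gallery_paths.append(str(path))
        gallery_encodings.append(encoding)

# Match a probe photo (e.g. a frame of security footage) against the gallery.
probe = face_recognition.load_image_file("probe.jpg")  # placeholder file name
probe_encodings = face_recognition.face_encodings(probe)

if probe_encodings and gallery_encodings:
    # Euclidean distance between embeddings; smaller means more similar.
    distances = face_recognition.face_distance(gallery_encodings, probe_encodings[0])
    # 0.6 is the library's default match tolerance, used here as an example.
    candidates = [(gallery_paths[i], d) for i, d in enumerate(distances) if d < 0.6]
    for source, distance in sorted(candidates, key=lambda c: c[1]):
        print(f"Possible match: {source} (distance {distance:.2f})")
```

At Clearview’s scale the hard engineering problem is presumably not the matching itself but indexing billions of embeddings for fast nearest-neighbor search; the controversy, of course, is about how the gallery is assembled in the first place.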

What’s the problem?

One benefit of Clearview AI is that it helps law enforcement agencies identify and capture suspects more efficiently and accurately.  With over 600 agencies using the product in 2020[2], there appears to be a clear societal benefit for legal investigations, and Clearview emphasized exactly these benefits in a CNN interview.  While strengthening law enforcement does benefit society, the same cannot be said for Clearview’s scraping method. Individuals usually have no idea their images are being scraped, and neither do some of the companies that host them. It was therefore no surprise that when Clearview received widespread publicity due to data breaches[3], it was heavily criticized. This presented the company with a series of challenges:

  • Media Scrutiny & Company Reputation – Clearview’s most immediate threat is negative press. The media storm damaged Clearview’s reputation, with large organizations such as Facebook, Madison Square Garden, and the NBA either renouncing Clearview AI’s practices or denying they were ever clients of the company[4].
  • Lawsuits and Regulation – Clearview’s business model has been found to breach biometric and privacy laws in some states, most notably Illinois’ Biometric Information Privacy Act, resulting in a number of lawsuits against the company.
  • Ethics Violations – Clearview’s business practices appear to breach Kantian ethics: scraping people’s images without their knowledge or consent treats them purely as a means to an end and is widely considered a violation of privacy and trust. One could argue that Clearview’s model is ethical from a utilitarian perspective, since the outcome appears to offer a societal benefit from a law enforcement standpoint. However, it is important to remember that Clearview also has private-sector clients, and it is not always clear what those companies use the data for.

So what can they do?

Given these breaches of privacy and trust, I see only two options for the company:

  1. Sell Clearview AI to law enforcement agencies. The public may be far more accepting of technology like Clearview AI if it is owned by law enforcement agencies and the military; it is only natural that a private company holding such extensive personal data causes unease.
  2. Change the ‘scraping’ approach. Clearview has to find a way to obtain the data behind Clearview AI legally and ethically. This is likely to be a long and tedious process, but it is critical for Clearview AI’s long-term sustainability, especially if the company wants to remain in the private sector.

Given that the world’s attention has been gripped by the coronavirus pandemic, Clearview may survive the negative press coverage and public uproar. The crisis may even have softened the public’s hardline stance against individual data tracking: there has been widespread appreciation for Apple and Google’s initiative to inform people when they have been in close contact with a known COVID-19 carrier. While this is nowhere near the same breach of privacy as Clearview AI, it does show that, in times of need, public stances against potentially unethical practices can shift. Nevertheless, Clearview needs to change its data scraping strategy as soon as possible in order to stand a chance.

[1] https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html

[2] https://www.youtube.com/watch?v=q-1bR3P9RAw

[3] https://www.engadget.com/2020-02-27-clearview-ai-leak-businesses-facial-recognition.html

[4] https://www.theverge.com/2020/2/6/21126063/facebook-clearview-ai-image-scraping-facial-recognition-database-terms-of-service-twitter-youtube


Student comments on Clearview – Any Chance of Redemption?

  1. Thanks for the great read! This is reminiscent of personal financial data being in the control of three organizations that have pretty lax security standards, as evidenced by past leaks. I completely agree with you that the scraping approach requires some modifications, but short of being restricted by the sources (Facebook etc., which have sent legal notices) or by regulation, I don’t see why they would change it voluntarily, even if the technology were controlled by a law enforcement agency! I would in fact argue that law enforcement agencies would have an easier way around data scraping barriers! My view is that we need an open regulatory framework governing the use of biometric and facial recognition data, one that is communicated publicly in a coherent manner and allows people to make informed decisions for non-essential services.
