You have probably seen the videos in the last five years: someone famous saying or doing something, except it is not really them. Synthetic media is the official name for what became known as “deepfakes”, a portmanteau of “deep learning” and “fake”, and it is the modern AI-powered equivalent of “photoshopping” one person’s likeness into a video. While there are many concerns about how the technology could be used to spread misinformation, some companies are trying to use the underlying technology to create and capture value in an arguably more ethical manner.
D-ID is a startup founded in 2017 that initially tried to address concerns around surveillance systems and facial recognition by developing AI algorithms to digitally “de-identify” a person’s face in media (such as video or photographs) and protect individuals’ privacy. They developed technology to alter the identifiable parameters used by facial recognition algorithms while preserving the high-level structure that our brains process visually – i.e., they modify the face representation in media so that the difference is small or imperceptible to the human eye but scrambles what machine algorithms can extract.
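The idea above can be illustrated with a toy adversarial-perturbation sketch: nudge each pixel by an imperceptible amount in the direction that most disturbs a recognition model’s feature vector. D-ID’s actual pipeline is proprietary; the linear “embedding” below is a hypothetical stand-in for a face-recognition feature extractor.

```python
import numpy as np

# Toy sketch of face "de-identification" via an adversarial perturbation.
# The linear "embedding" is a hypothetical stand-in for a real face-recognition
# model; D-ID's actual approach is proprietary.

rng = np.random.default_rng(0)
W = rng.standard_normal((128, 64 * 64))  # hypothetical face-embedding weights


def embed(img):
    """Map a 64x64 grayscale image to a 128-d 'identity' vector."""
    return W @ img.ravel()


def deidentify(img, eps=0.01):
    """Shift each pixel by at most eps (imperceptible to the eye) in the
    direction that most disturbs the embedding (an FGSM-style sign step)."""
    grad = 2 * (W.T @ embed(img))  # gradient of ||embed(img)||^2 w.r.t. pixels
    return np.clip(img + eps * np.sign(grad).reshape(img.shape), 0.0, 1.0)


face = rng.random((64, 64))
protected = deidentify(face)

# Pixels barely move, but the machine-readable "identity" shifts substantially.
print("max pixel change:", np.abs(protected - face).max())
print("embedding shift:", np.linalg.norm(embed(protected) - embed(face)))
```

The trade-off mirrors the one described in the text: the human-visible structure is preserved (each pixel moves by at most 1% of its range) while the machine-readable representation is pushed far from the original.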
However, even though this privacy focus provided some value to a niche segment of the population, the pivot to generative AI for synthetic media a couple of years later allowed them to better align their business proposition with stronger customer demand. The company’s underlying expertise in deep learning on facial features positioned it to leverage its AI technology to animate photo portraits and produce human-like videos.
Deep Nostalgia: bringing old pictures to life
One of D-ID’s successful business partnerships in the synthetic media pivot has been with genealogy service MyHeritage, which released a new feature called “Deep Nostalgia” that lets users generate an animated video from an uploaded photo. This AI-powered feature served as viral marketing for MyHeritage’s services, providing a life-like representation of long-lost relatives in the family tree – allowing customers to form a stronger connection and emotional reaction to genealogy, as reported by major international newspapers. Furthermore, it also served to promote D-ID’s face-animation technology and its capability to make “live portraits” of historical figures, as some users attempted through Deep Nostalgia. D-ID offers paid access to an “AI Face” platform that provides deep learning, computer vision, and image-processing technologies to third parties such as historical organizations and museums for creating synthesized videos of well-known figures.
Creative Reality: AI Media Studio
After the success of animated photos, D-ID trained its neural networks on videos of people speaking, so that it could add audio lip-synced with the animation of input text. This product has been used in a partnership with media giant Warner Bros for promotional material for some of its film assets, such as an interactive trailer for the sci-fi movie “Reminiscence” and a “Harry Potter” exhibition.
However, its largest business segment today comes from enterprise customers looking for an easy “one-stop shop” for video production, particularly for marketing, sales, and training needs. D-ID’s studio platform enables customers to generate realistic AI personas: after uploading an input script and selecting a speech voice, the platform creates a high-quality video for corporate use. The AI studio significantly decreases the cost and complexity of video production at scale, and adds the ability to customize multiple digital presenters, voices, languages, or script edits while avoiding the hassles of traditional content creation.
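The workflow described above can be sketched as a simple request builder: supply a script, pick a presenter and a voice, and ask for a rendered talking-head video. The function, field names, and schema below are illustrative assumptions only – they are not D-ID’s real API.

```python
# Hypothetical sketch of a studio-style text-to-video workflow. The schema
# (presenter, script, voice, output fields) is an illustrative assumption,
# not D-ID's actual API.

def build_video_request(script: str, presenter_id: str, voice_id: str,
                        language: str = "en-US") -> dict:
    """Assemble the request body a text-to-video service might expect."""
    if not script.strip():
        raise ValueError("script must not be empty")
    return {
        "presenter": presenter_id,      # which digital persona to animate
        "script": {
            "type": "text",
            "input": script,
            "voice": voice_id,          # text-to-speech voice selection
            "language": language,
        },
        "output": {"format": "mp4", "resolution": "1080p"},
    }


# Swapping the voice, language, or a script line is a one-field edit --
# the cost advantage over reshooting a video with a human presenter.
request = build_video_request(
    script="Welcome to our quarterly training module.",
    presenter_id="persona-042",
    voice_id="en-US-neutral-1",
)
print(request["script"]["input"])
```

The point of the sketch is the economics: producing a localized variant of a corporate video becomes a parameter change rather than a new shoot.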
Challenges and Opportunities
Synthetic media is still a recent concept, but we have seen a proliferation of startups entering the space in recent years, whether through mobile apps for content creators on social platforms or through more established enterprise partnerships. While D-ID remains one of the leaders in this technology, and a successful partner for synthetic human media production for enterprise content needs, their foothold is not cemented yet, and multiple startups targeting different segments may eat away at their early-mover advantage.
The interesting thing about being at the cutting edge of technology is that it does not stay the “cutting edge” for long – as machine learning techniques and processing power evolve, more players join the game, and more companies try to specialize in a particular area for value capture. Rival startups like Synthesia have doubled down on a synthetic video content production platform for the training and marketing needs of well-known companies. Others, such as Revel AI, have focused on creating hyperreal content for the entertainment industry – synthetic media is now even appearing in video commercials for big brands like Adidas.
D-ID continues to explore different use cases of the same underlying technology and to weigh where to evolve (which industry to serve, and at what level to play as a provider). Depending on this choice, they may need to streamline their product portfolio, acquire (or develop) a complementary tech stack, and revamp their organization to serve their industry of choice.
And still, let us not forget that the concept of “deepfakes” carries a negative connotation for the general public. The threat of bad actors using this technology to generate fake content with the intent of misinformation will continue to be an obstacle, or at least a reason for caution, around the spread and adoption of the technology. Today, this content is not particularly regulated or contained by copyright law, so how the space evolves and what actions governments decide to take (particularly those with a stronger data privacy focus, as in Europe) will shape the future of D-ID.
D-ID has found a way to evolve with the needs of their most pressing customers through the years and remain a valued partner for value creation and capture. But will they know their real identity well enough to grasp an opportunity that solidifies their value, or will they simply get blurred out of the space as synthetic media continues to grow?