Live inside your content: The future of entertainment

Rct studio is creating a fully immersive, AI-driven movie experience that adapts to the player's reactions, with each story creating thousands of permutations. Is this the prelude to a world of AI-driven characters that continuously evolve? Will such characters force us to confront the very idea of what makes us human?

“They’re not looking for a story that tells them who they are. They already know who they are. They’re here because they want a glimpse of who they could be.”

For almost a century, the way we have consumed content has been constant: this is the story, and each of you will experience it the same way. While each of us may interpret a story differently, we all see and hear it uniformly, creating an incentive to make stories that appeal broadly or to a particular audience. But what if a single story could use AI to generate thousands of possibilities?

An industry with an evolving landscape

Throughout the history of entertainment, movies have evolved with the times. From black-and-white pictures to massive 3D special effects, the industry has constantly sought a more realistic experience for its audience. With the rise of streaming platforms, content distributors now use big data to understand their consumers better; by understanding preferences at a granular level, they hope to create content that is exactly what their audience is looking for. Yet the content itself remains a uniform experience. Advances in AI and virtual reality are now taking the next big leap in entertainment, betting on the same principle of personalization as Netflix and Disney, but in a completely new way.

Choose your own (virtual) reality

Rct studio, a start-up founded by the team behind Raven Tech, a company acquired by China’s Baidu, and a graduate of Y Combinator, is creating a fully immersive and interactive virtual reality experience. To venture through rct’s immersive worlds, users wear a virtual reality headset and control their simulated self via voice. The idea is to create a virtual environment where human players explore a multi-ending story, continually interacting with non-player characters (NPCs) that evolve throughout the story, each playing off the human player’s decisions (see trailer below). To create NPCs that are deeper and more human than those currently on the market, the company is using AI to analyze a wealth of in-game information and generate nearly limitless possibilities.

[1]

While virtual reality and interactive movies already exist in the market, it is the AI engine driving this product that makes it so unique. The company’s proprietary Morpheus engine is a natural language processing system designed to automatically analyze screenplays (fed in by the story’s creator), segmenting them by recognizing the characters and their general purpose. The system then takes this script of seemingly endless possibilities and uses a real-time text-to-render feature to turn the text into visual renderings in the game.
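
To make that pipeline a little more concrete, here is a minimal sketch of what the first step might look like: segmenting a screenplay into per-character beats that a downstream renderer could consume. It is purely illustrative, with an assumed script format and hypothetical names; it is not rct studio’s actual Morpheus code.

```python
# A minimal, illustrative sketch of screenplay segmentation, NOT rct studio's
# Morpheus engine. It assumes a simplified "CHARACTER: dialogue" script format;
# real screenplay parsing and text-to-render pipelines are far more involved.
import re
from dataclasses import dataclass

@dataclass
class Beat:
    character: str   # who speaks or acts in this beat
    text: str        # the dialogue line or scene description
    is_action: bool  # True for scene/action descriptions, False for dialogue

def segment_script(script: str) -> list:
    """Split a raw script into per-character beats a renderer could consume."""
    beats = []
    for raw_line in script.strip().splitlines():
        line = raw_line.strip()
        if not line:
            continue
        match = re.match(r"^([A-Z][A-Z ]+):\s*(.+)$", line)
        if match:
            beats.append(Beat(match.group(1).title(), match.group(2), False))
        else:
            # Lines without an all-caps speaker are treated as scene/action text.
            beats.append(Beat("Scene", line, True))
    return beats

if __name__ == "__main__":
    sample = """
    The airlock hisses open.
    AVA: Stay behind me.
    KAI: We don't have much time.
    """
    for beat in segment_script(sample):
        print(beat)
```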

Each character is driven by an independent AI-powered model that generates its potential scenarios, which are then combined with the other characters’ data points to determine the most likely outcomes under given conditions. These NPCs are grounded in common-sense behaviors: if a gun is pointed at them, for example, they will surrender. To build this, the company’s engineers trained a convolutional neural network on a vast number of industry-standard scripts so the AI could predict how characters would react, and they integrated physics into the model so the NPCs never do anything physically impossible.
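
As a rough illustration of how such layered decision-making could be structured, the sketch below combines a hard-coded common-sense rule, a stand-in for a learned policy, and a physics check as a final filter. All of the names, state fields, and rules are hypothetical assumptions rather than rct studio’s implementation; in a real system the learned policy would be a trained model and the physics check a full simulation step, but the layering is the idea.

```python
# An illustrative sketch of layering a common-sense rule over a learned policy,
# with a physics check as a final filter. Every name and rule here is a
# placeholder assumption, not rct studio's implementation.
from dataclasses import dataclass, field

@dataclass
class WorldState:
    gun_pointed_at_npc: bool = False
    exits: list = field(default_factory=list)

def common_sense_rule(state: WorldState):
    """Hard-coded priors, e.g. surrender when held at gunpoint."""
    if state.gun_pointed_at_npc:
        return "surrender"
    return None

def learned_policy(state: WorldState) -> str:
    """Stand-in for a trained model that scores candidate reactions."""
    return "walk_to_exit" if state.exits else "idle"

def physically_possible(state: WorldState, action: str) -> bool:
    """Reject actions the simulated world cannot support."""
    if action == "walk_to_exit" and not state.exits:
        return False
    return True

def choose_action(state: WorldState) -> str:
    """Common-sense rules take priority; otherwise fall back to the learned policy."""
    action = common_sense_rule(state) or learned_policy(state)
    return action if physically_possible(state, action) else "idle"

if __name__ == "__main__":
    print(choose_action(WorldState(gun_pointed_at_npc=True)))  # -> surrender
    print(choose_action(WorldState(exits=["vault_door"])))     # -> walk_to_exit
```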

From a content creation perspective, the company has already signed a letter of intent with Future Affairs Administration, China’s leading sci-fi studio and the holder of the most IP in the genre. It aims to capture a share of the $200 billion worldwide movie market, with production starting sometime this year.

Beyond virtual gaming

In addition to this interactive application, the company hopes to monetize the technology by selling its Morpheus engine to movie and animation studios. Rather than relying solely on human ingenuity to work out every branch of a story, directors can create a broad storyline, feed it into the system, and review the alternative endings and options it generates. The company estimates that this could save a typical Hollywood action film up to $20 million by automating key processes.
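
To picture that workflow, here is a small sketch of how a broad storyline with a few decision points could be expanded into every alternative version for a director to review. The storyline format and branch labels are invented for illustration and are not the interface of rct studio’s tools.

```python
# An illustrative sketch of expanding a broad storyline with branch points into
# every alternative version for review. The storyline format and branch labels
# are hypothetical, not the interface of rct studio's tools.
from itertools import product

# A broad storyline: fixed beats interleaved with decision points,
# each decision point listing its possible branches.
storyline = [
    ("beat", "The heist crew enters the vault."),
    ("choice", ["the alarm is triggered", "a guard is bribed"]),
    ("beat", "The crew reaches the roof."),
    ("choice", ["helicopter escape", "surrender to the police"]),
]

def enumerate_versions(storyline):
    """Yield every path through the branch points as one candidate version."""
    branch_points = [options for kind, options in storyline if kind == "choice"]
    for combo in product(*branch_points):
        picks = iter(combo)
        yield [content if kind == "beat" else next(picks)
               for kind, content in storyline]

if __name__ == "__main__":
    for i, version in enumerate(enumerate_versions(storyline), start=1):
        print(f"Version {i}: " + " -> ".join(version))
```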

As with most AI systems, the more data the engine gets from play, the better it becomes. With more stories and experiences, the idea is that these NPCs will use player data to become more and more human. The company has indicated that once it has ingested enough data, its system has the potential to map the human brain, creating a digital rendering of human consciousness. In a scenario straight out of science fiction, these may be the early stages of AI-driven technology that creates robots that think and act like humans. Is humanity ready for such technology?

Will humanity create consciousness?

The implications of this type of technology are vast. If a system can learn from human behavior and teach itself to become more and more lifelike within a simulation, what other possibilities lie ahead? In entertainment, the show Westworld explores an idea similar to rct studio’s, but light years ahead: robots that feel like humans, powered by an AI system almost indistinguishable from our own consciousness. Such ideas raise thought-provoking questions for all of us and force us to confront the very notion of what it means to be human. As these technologies evolve, we must also confront the complexity and ethical consequences they bring.

Sources:

[1]: https://rct-studio.com/

[2]: https://medium.com/syncedreview/direct-and-star-in-your-own-movie-with-california-ai-startup-rct-studio-a1c1a8e8e7b

[3]: https://techcrunch.com/2019/04/07/rct-studio-profile/



Student comments on Live inside your content: The future of entertainment

  1. Thank you for sharing such an interesting post. The passage in which you mention how the technology uses the data produced by gamers to learn from human behaviour in order to emulate it makes so much sense! In fact, I would assume it is the perfect kind of data for the purpose.
    I have seen an article* on children’s behaviour towards a robot, how they may develop an emotional response and why it is important to call the robot “it” in order to learn how to draw the line. But at some point, I wouldn’t be surprised to see the same kind of article about adult subjects…

    (* https://www.wsj.com/articles/why-kids-should-call-the-robot-it-11566811801)
