
AI is everywhere: new changes in film and television production

王林
Release: 2023-04-26 16:52:08


Nowadays, discussion of AI and cloud computing seems to be everywhere, no longer confined to niche technology and investment circles. Nowhere does it spread faster or carry more weight than in the entertainment business. Creative and business practitioners in entertainment are deeply concerned about how AI and the cloud will affect the skills they have built up, yet they are also full of anticipation for the possibilities the technology will open up once it becomes widespread.

At the recent Sundance Film Festival, directors, editors and other practitioners exchanged views on the impact of AI and cloud services on film and television production. Adobe executives also shared how these two new forces can take already widely used creative tools such as Premiere Pro to the next level.

This is a remarkable, fast-moving period: optimistic and pessimistic predictions ebb and flow, and major opportunities are brewing to reshape the way film and television programs are produced. After roughly a century of relative stability, film and television content may be heading into a disruptive production revolution.

The impact of AI and cloud computing on the film and television industry is only beginning to emerge. Take Comcast, which just won an Emmy technology award for using AI to quickly generate sports videos.

Stars and their agents have also noticed that AI technology is at work in every corner of the film and television industry.

For example, the talent agency CAA has signed a strategic partnership with Metaphysic, whose "de-aging" tools can make actors on screen look younger. With these tools applied, the characters played by Tom Hanks and Robin Wright can appear as teenagers and then visibly age over the course of the story.

This kind of technology matters greatly to agents: their celebrity clients can suddenly perform across the span of a lifetime.

New technology opens up new possibilities

The day-to-day work of a film or television project, however, is complex and labor-intensive, requiring close cooperation among editors, directors and other professionals. In conversations with practitioners at Sundance, we found that they increasingly treat Premiere Pro as a high-speed collaboration platform, which keeps opening up more practical uses and possibilities.

Meagan Keane, director of product marketing for Pro Video at Adobe, said of AI, "Looking at the future of film production, AI will become an important force in opening up all kinds of possibilities. We believe this will be a future that everyone can participate in."

In her view, the core significance of AI lies in amplifying human creativity rather than replacing existing practitioners. Powerful tools such as AI let practitioners draw on technologies, skills and resources they do not themselves possess to present more engaging experiences across media such as video. Even for experienced professional creators, AI greatly reduces the drudgery of repetitive tasks, and even some of the legwork of on-set shooting, allowing editors and other team members to focus on the overall planning and design of the film.

Michael Cioni, senior director of global innovation at Adobe, has long championed Camera-to-Cloud technology for Adobe's Frame.io. In his eyes, AI leaves endless room for imagination.

All digital assets are cloud assets

The explanation he gave is not complicated: "By 2031, all media and entertainment assets will be generated in the cloud and processed by cloud computing. This will become a fact; I would call it a technical certainty. Every digital asset a device produces will be a cloud asset." By contrast, local hard drives, film, videotape and memory cards will all be replaced.

The pattern of media migration is familiar to everyone. Text documents were the first to embrace the cloud and AI; today this kind of data is stored mainly on cloud services (Google Docs, Apple Pages, Microsoft Word, Adobe Acrobat and so on) for sharing and editing. Services such as Spotify, Apple Music, Amazon Music and Tidal have likewise allowed hundreds of millions of users to listen to, create, collaborate on and sell music in the cloud. Now this trend is about to sweep across our most complex medium: video.

Cioni explained, "Take the production of a movie scene as an example. After specifying the shot, we can describe it further in natural language, such as 'It starts to rain.' Rain then appears in the shot, pattering down. You can also say 'It starts to snow,' and AI will add the snow effect, all without any engineering knowledge. Plots that used to exist only in science fiction have become reality. This has already been possible with still media such as photos, and we know it will eventually make the transition to video. So we have to change our thinking and consider how to cultivate our creativity in this context."
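To make the idea concrete, here is a minimal, purely hypothetical Python sketch of how a prompt-driven scene edit might be represented under the hood: a director's natural-language note is mapped to a structured compositing instruction. The names (SceneEdit, parse_direction) and the simple rule-based parsing are illustrative assumptions, not any Adobe or Frame.io API.

```python
# Hypothetical sketch: turning a natural-language direction into a structured
# effect instruction, in the spirit of Cioni's "It starts to rain" example.
from dataclasses import dataclass


@dataclass
class SceneEdit:
    effect: str        # e.g. "rain" or "snow"
    intensity: float   # 0.0-1.0, how heavy the overlay should be
    start_frame: int   # frame at which the effect begins


def parse_direction(text: str, start_frame: int = 0) -> SceneEdit:
    """Tiny rule-based stand-in for the language model that would interpret
    a director's note such as 'It starts to rain.'"""
    lowered = text.lower()
    if "rain" in lowered:
        return SceneEdit("rain", intensity=0.4, start_frame=start_frame)
    if "snow" in lowered:
        return SceneEdit("snow", intensity=0.3, start_frame=start_frame)
    raise ValueError(f"No effect recognised in direction: {text!r}")


if __name__ == "__main__":
    edit = parse_direction("It starts to rain.", start_frame=120)
    print(edit)  # SceneEdit(effect='rain', intensity=0.4, start_frame=120)
```

A real system would hand the SceneEdit on to a generative or compositing engine; the point here is only the shape of the translation from plain language to an editable instruction.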

All of this will blur the boundaries between different roles and fundamentally change how video is created in the future.

Many young filmmakers are already used to crossing those boundaries, touching a little of everything on a project and collaborating with one another on digital platforms. With Camera-to-Cloud and AI in place, producers can already select a scene and quickly generate a rough-cut assembly as a reference for judging whether the current coverage is enough or additional shots are needed.

They can also quickly rough out basic visual effects, color correction and even sound design, then hand the work to specialists for fine-tuning once they decide to use it. For many filmmakers, Premiere is therefore no longer a non-linear video editor in the traditional sense.

Establishing a new relationship between people and images

Maximilien Van Aertryck, co-director of the documentary "Fantastic Machine," which won a Special Jury Award for Creative Vision at Sundance, put it this way: "This is our greatest passion: exploring the relationship between people and cameras, and how it affects society as a whole. We are like anthropologists who enjoy exploring."

The documentary's gentle tone calls for a more thoughtful understanding of the world created by new tools. As Axel Danielson, the film's other co-director and editor, put it, echoing the classic joke, "Who are you going to believe? Me, or your unreliable eyes?" Indeed, we increasingly need to recognize that seeing is not necessarily believing; even video footage cannot be taken on trust.

Danielson said, "Our film does not give many answers; it only raises questions. The only answer we know is that, as a collective society, humans need to take a much closer look at the messages the media spreads."

The film compiles footage spanning more than a century, some of it momentous, some of it ironic and funny. More importantly, in the hands of creators both great and terrible, this material can be given vastly different meanings and powers.

In "The Amazing Machine," Van Aertryck and Danielson spent several years gathering information about the complex relationship between humans, cameras, and the images they capture.

Van Aertryck said frankly, "We are not highly trained editing experts. We did the work ourselves because there was really no one else. Adobe became the perfect tool for us because we could throw all the material into it and easily share it with each other."

The entire shooting team numbered only four people, split between Gothenburg, Sweden and the Balearic Islands, so everyone had to rely on Premiere's cloud tools to exchange clips. Yet the whole process was smooth, as if everyone were sitting in the same room. That is the point of cloud features: any creator, anywhere in the world, can take part in the work anytime, as long as they need and want to.

Crystal Kayiza, the editor and director of "Rest Stop," which won the U.S. Short Film Jury Award at this year's Sundance Film Festival, said, "Online sharing makes it possible to try more. I can go back to my hometown in Oklahoma anytime I want without affecting the making of the film."

Last year, Adobe added direct-to-cloud capabilities to many popular cameras and other production devices, allowing video to be transmitted while it is still being shot. Editors, producers, studio executives and other post-production professionals can begin working with footage within minutes of it being captured.
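As a rough illustration of the camera-to-cloud idea rather than Adobe's actual implementation, the sketch below watches a camera card's clip folder and pushes each newly finished clip to shared storage as soon as it appears. The watch path and the upload_clip helper are hypothetical placeholders, not the Frame.io API.

```python
# Minimal sketch of "camera to cloud": poll the camera's recording folder and
# hand off each new clip so an editor can open it minutes after it was shot.
import time
from pathlib import Path

WATCH_DIR = Path("/media/camera_card/CLIPS")  # assumed mount point
SEEN: set[Path] = set()


def upload_clip(clip: Path) -> None:
    # Stand-in for a real transfer (HTTPS upload, checksums, retries).
    print(f"uploading {clip.name} ({clip.stat().st_size} bytes) ...")


def watch(poll_seconds: float = 5.0) -> None:
    while True:
        for clip in sorted(WATCH_DIR.glob("*.mov")):
            if clip not in SEEN:
                upload_clip(clip)
                SEEN.add(clip)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    watch()
```

In practice the transfer would be handled by the camera or a hardware gateway and would include proxy generation and retry logic; the loop above only shows the basic shape of shipping footage off the card as it lands.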

Even small productions can use these techniques. Daniela T. Quiroz, who won Sundance's documentary editing award, said cloud delivery was crucial to the making of "Going Varsity in Mariachi," especially early in production.

Quiroz, who lives in Brooklyn, New York, had already begun editing while directors Alejandra Vasquez and Sam Osborn were still filming a high school competition in Texas's Rio Grande Valley. Because she could receive the early footage quickly, she could process it promptly and discuss with the directors what the next shoot should focus on.

Quiroz said, "As editors, we are fortunate to witness the actors' first time on camera. You will witness your favorite characters and the music that resonates."

When filming "Sometimes I Am" There is no such condition at all when thinking about death. The film was shot on the remote Oregon coastline. Because there were no digital samples available for immediate viewing, the footage shot every day had to be saved to a hard drive and sent overnight to a digital processing facility in New Orleans. Editor Ryan Kendrick said that the emergence of new technology is indeed good news for practitioners.

Unprecedented collaboration

Now Kendrick can collaborate remotely from his home in Nashville with the director and other colleagues elsewhere.

Kendrick marveled, "I can work with my partners from home, so far apart from each other, which is so cool." It has also changed how directors, cinematographers, editors and other production staff interact.

In the past, "editors always felt far away from the set. Now, the connection is closer, and you don't feel disconnected from it all. Now the editing room has become a place of trial and error, and I You can try all kinds of things here, even if the results are completely opposite to what you expected. I always say, "Who knows, let's try it.""

In "Sometimes I Think About Death" , Kendrick, Lambert and director of photography Dustin Lane maintained a weekly discussion frequency before the actual shooting started, hoping to avoid possible problems in advance.

"We will discuss possible editing issues that may arise. When you get the script, the hardest thing is to imagine the transition from the plot to the next part. Specifically, we make the character always in the situation, not in the "movie". So we talked a lot about trying to figure out how to keep the emotional weight of the images. "

Kendrick said that because Camera-to-Cloud technology speeds up the shooting cycle, it can also play an important role in projects such as commercials and music videos.

"Business Many elements of advertising are changing rapidly. "It is with the support of cloud architecture that project materials can be smoothly transferred between Premiere Pro, Frame.io and After Effects.

" This is the biggest problem that film and television practitioners have eliminated in the past five years. The chronic problem of delays in the production process. ”

Guiding principles for building better technology

Keane said that Adobe's approach to technology is built around five "guiding principles": workflow, media, collaboration, productivity and security.

"We are really integrating AI into 'productivity.' It is not there to replace creativity, but to replace the complex and cumbersome parts of the old process."

All of Hollywood is paying close attention to the rapidly changing field of AI. Reports say the board of SAG-AFTRA, Hollywood's largest union, plans to introduce clauses on likeness rights in the next round of contract negotiations, constraining how AI technology may be applied to union members' appearance, voice or other characteristics.

Judging from the current draft, "any contractual provision intended to control the performance rights of virtual actors must be agreed with the union in advance; otherwise it will be deemed invalid and unenforceable."

Faced with this complex new problem, a new round of contract negotiations begins this spring. Actors of every generation who hope to extend their careers may be willing to accept some of AI's advantages, but without strong contractual protection, producers could simply use AI to simulate performances and avoid paying high fees to real actors.

A fan project on Twitch seems to hint at such a future. A user named "watchmeforever" has been streaming a never-ending AI-generated "Seinfeld" spin-off. The graphics are badly distorted, the synthesized voices are ridiculous, and the jokes are nothing to boast about, but as AI technology improves, none of that may remain a problem. More to the point, the actor who played Jerry stepped forward to defend his rights, bringing the "never-ending" show to an abrupt halt.

Opening the door to a new world of creativity

Looked at more positively, Adobe's Keane believes AI and cloud computing will bring a broader range of creative tools and remove the dependence of imagery, sound and visual effects on large server farms, giving everyone the computing power and the security foundation needed to produce excellent video content.

"With the help of cloud resources, everyone can tell their own stories from anywhere in the world. Collaboration at every stage can happen across regions, and everyone can exchange opinions and spark ideas at any time."

For filmmakers, security is also an issue that cannot be ignored. Keane mentioned "Plan C," a documentary about abortion whose creators needed to blur interviewees' faces and disguise their voices repeatedly while ensuring the raw material they had collected could never leak.

In addition, the spread of generative AI, deepfakes and related problems has made the authenticity of image content more important than ever. To that end, Adobe has joined the Content Authenticity Initiative alongside The New York Times, Canon and many other media and technology companies.

But Keanu Reeves, star of the science-fiction classic "The Matrix," is not optimistic about the problem. He said in an interview that deepfakes, which use AI to maliciously generate video, are "very scary." The contracts Reeves signs often contain clauses prohibiting producers from digitally tampering with his performances, a practice that traces back to a film decades ago whose producers used effects to add a tear to his face without his consent.

Reeves said, "This situation of actors losing their autonomy is frustrating. We can accept that our performance clips are edited, but the content is at least true. And once it reaches the stage of deep fakes, Then the actor doesn’t have any autonomy at all. It’s very scary, and I’m looking forward to seeing how humans respond to this kind of technology.”

Game-changing new technology

The game-changing new technology is currently available in beta in Premiere Pro: text-based editing, which uses AI to extract and transcribe dialogue from video so that the footage can be edited through its text. The AI can also highlight important sound bites, assemble rough-cut references for the final version, and support keyword searches that jump straight to specific clips or lines of dialogue. There is no doubt this will become a new and more powerful way to manage project material and data.
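For a sense of how keyword search over footage can work in principle, here is a small Python sketch that scans a timestamped transcript (produced by any speech-to-text engine) for a keyword and returns the matching segments with their timecodes. The Segment structure and the sample data are assumptions for illustration, not Premiere Pro's internal format.

```python
# Hedged sketch of text-based search: find every transcript segment that
# mentions a keyword so an editor can jump straight to it in the timeline.
from dataclasses import dataclass


@dataclass
class Segment:
    start: float  # seconds from the top of the clip
    end: float
    text: str


def find_soundbites(transcript: list[Segment], keyword: str) -> list[Segment]:
    keyword = keyword.lower()
    return [seg for seg in transcript if keyword in seg.text.lower()]


transcript = [
    Segment(12.4, 15.0, "We started rehearsing in the school gym."),
    Segment(47.8, 52.1, "The competition changed everything for us."),
    Segment(90.2, 93.5, "Winning the competition felt unreal."),
]

for seg in find_soundbites(transcript, "competition"):
    print(f"{seg.start:7.1f}s  {seg.text}")
```

A production tool would layer speaker labels, confidence scores and links back to the source clips on top of this, but the core of "search the dialogue, land on the frame" is as simple as the lookup above.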

Keane pointed out, "You can use all this data to polish the final media."

Smaller vendors have launched similar technology. Descript and Podcastle both provide tools for creating digital versions of a speaker's voice, which can read scripts from text without additional recording sessions. Descript also recently added text-based video editing, which can be used to create podcasts and other video content.

Editor Kendrick said he has used Adobe's text-based editing tools to help a friend quickly cut a mini-documentary. In his view, text-based editing also matters greatly to professional editors: in the past, producers and editors often needed several days to sort through footage so that material could be quickly assembled into a program under a tight deadline.

Audiobooks, screenplays, social media – AI is everywhere

AI is also making inroads into other areas of entertainment. Apple has launched an AI tool that generates narration for audiobooks, giving authors another option for producing low-cost audio versions of their work. As soon as the news broke, however, professional audiobook narrators and their legions of fans voiced strong opposition.

Those professional voice actors are certainly talented, and their captivating performances and timbres often elevate a work. Compared with their results, AI-generated narration still has a long way to go.

But for books that are not best-sellers, the limited revenue is simply not enough to pay for that level of listening experience. Here AI tools become a good option: low-cost, yet able to guarantee a certain quality of experience. It is this niche audiobook market, valued globally at roughly US$1.5 billion, that Apple is eyeing.

The start-up Filmusage is trying to use AI to break a script down into its production elements, including actors, props, costumes, vehicles, sounds, locations and more.
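A toy sketch of the same idea follows, using plain keyword matching in place of the machine-learning models a real breakdown tool would rely on; the category lists and the sample scene text are invented for illustration.

```python
# Illustrative script-breakdown sketch: scan scene text and bucket mentions
# into production categories (props, costumes, locations).
from collections import defaultdict

CATEGORIES = {
    "props": {"guitar", "trophy", "camera"},
    "costumes": {"charro suit", "dress", "uniform"},
    "locations": {"gym", "auditorium", "parking lot"},
}


def breakdown(scene_text: str) -> dict[str, list[str]]:
    found: dict[str, list[str]] = defaultdict(list)
    lowered = scene_text.lower()
    for category, terms in CATEGORIES.items():
        for term in terms:
            if term in lowered:
                found[category].append(term)
    return dict(found)


scene = "INT. GYM - DAY. Maria tunes her guitar in a charro suit beside the trophy case."
print(breakdown(scene))
# {'props': ['guitar', 'trophy'], 'costumes': ['charro suit'], 'locations': ['gym']}
```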

In short video, meanwhile, TikTok, Meta's Reels and Alphabet's YouTube Shorts all rely on machine learning and AI to surface "recommended" videos a user may be interested in, striving to weave advertising into the viewing experience in more seamless and appealing ways.

Rich Greenfield, an analyst at LightShed Partners, noted that Facebook paid a very high price when it pivoted to AI-driven content recommendation, but "this ultimately proves that AI content recommendation is the second major correct decision Zuckerberg has made since founding Facebook."

Source: 51cto.com