Synthetic Media & Generative AI News and Discussions

Posted: Tue May 18, 2021 10:23 pm
by Yuli Ban
For news and discussions relating to AI-generated and manipulated media, including deepfakes, style transfer, natural language generation, music synthesis, and more!

Deepfake dubs could help translate film and TV without losing an actor’s original performance
What exactly is lost in translation when TV shows and films are subbed or dubbed into a new language? It’s a hard question to answer, but for the team at AI startup Flawless, it may be one we don’t have to think about in the future. The company claims it has the solution to this particular language barrier: a technical innovation that could help TV shows and films effortlessly reach new markets around the world: deepfake dubs.

We often think of deepfakes as manipulating the entire image of a person or scene, but Flawless’ technology focuses on just a single element: the mouth. Customers feed video from a film or TV show into the company’s software, along with dubbed dialogue recorded by human voice actors. Flawless’ machine learning models then create new lip movements that match the translated speech and paste them automatically onto the actor’s head.
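As described, the workflow has three stages: isolate the mouth region, resynthesize lip movements to match the dubbed dialogue, and composite the result back onto the untouched frame. Here is a toy sketch of that data flow. Every function and class name below is hypothetical and invented purely for illustration; Flawless has not published its actual API, and the strings stand in for real pixel and audio data:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int
    mouth_region: str  # placeholder for real pixel data

def extract_mouth_regions(frames):
    # Stage 1: isolate the mouth region of each frame;
    # the rest of the image is left untouched.
    return [f.mouth_region for f in frames]

def resynthesize_lips(mouth_regions, dubbed_audio):
    # Stage 2 (the learned part): a model predicts new lip shapes
    # that match the sounds of the dubbed dialogue. Here it is
    # simulated with a simple string transformation.
    return [f"lips-for:{sound}" for sound in dubbed_audio]

def composite(frames, new_lips):
    # Stage 3: paste the generated mouths back onto the original
    # frames, preserving the rest of the performance.
    return [Frame(f.index, lips) for f, lips in zip(frames, new_lips)]

frames = [Frame(i, "orig") for i in range(3)]
dubbed = ["bon", "jour", "!"]  # stand-in for translated audio
out = composite(frames, resynthesize_lips(extract_mouth_regions(frames), dubbed))
print([f.mouth_region for f in out])
```

The point of the sketch is the narrow scope: only the mouth region is generated, which is why the rest of the actor’s performance survives intact.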

“When someone’s watching this dubbed footage, they’re not jolted out of the performance by a jarring word or a mistimed mouth movement,” Flawless’ co-founder Nick Lynes tells The Verge. “It’s all about retaining the performance and retaining the original style.”

Re: Synthetic Media & Deepfakes News and Discussions

Posted: Sat May 22, 2021 7:35 am
by Yuli Ban
[P] AI for turning anime into sketches

Re: Synthetic Media & Deepfakes News and Discussions

Posted: Tue May 25, 2021 11:26 am
by Yuli Ban
I'm hearing Ozzy, not Ramsey

Re: Synthetic Media & Deepfakes News and Discussions

Posted: Fri May 28, 2021 12:11 am
by Yuli Ban
Neural Network Enabled Filmmaking: Using Deepfake Dubs To Translate TV And Film Without Losing The Authenticity Of Performance
Creativity knows no barriers, but language is often a significant roadblock in conveying it to the masses. TV shows and films are one arena where widespread translation is used to reach wider audiences, which raises the question: does translation change the viewing experience? The answer may differ from person to person, but it could soon become moot, because the AI startup Flawless claims to have a technology that overcomes this language barrier. With its deepfake dubs, the quality and emotion of a performance would be retained, and a translated film or show would feel as authentic as the original.

Re: Synthetic Media & Deepfakes News and Discussions

Posted: Tue Jun 01, 2021 5:04 am
by Yuli Ban
Movie written by algorithm turns out to be hilarious and intense
Update, 5/30/21: It's Memorial Day weekend in the US, and staff are trying to stay away from the keyboard accordingly. As such, we're resurfacing a few classic pieces from our archives. Since catching a holiday weekend movie in 2021 is a much different proposition than in years past, we thought a second theatrical front page run was in order for Sunspring, a short film starring Thomas Middleditch and written in conjunction with an algorithm. The film originally debuted on Ars on June 9, 2016, and our interview with the humans behind the project appears unchanged below.

Ars is excited to be hosting this online debut of Sunspring, a short science fiction film that's not entirely what it seems. It's about three people living in a weird future, possibly on a space station, probably in a love triangle. You know it's the future because H (played with neurotic gravity by Silicon Valley's Thomas Middleditch) is wearing a shiny gold jacket, H2 (Elisabeth Gray) is playing with computers, and C (Humphrey Ker) announces that he has to "go to the skull" before sticking his face into a bunch of green lights. It sounds like your typical sci-fi B-movie, complete with an incoherent plot. Except Sunspring isn't the product of Hollywood hacks—it was written entirely by an AI. To be specific, it was authored by a recurrent neural network called long short-term memory, or LSTM for short. At least, that's what we'd call it. The AI named itself Benjamin.

So this is actually from 2016. That's perfect! It's a great showcase of what natural language generation tech was like before the explosion of transformers and large language models. It's the equivalent of bringing up Magenta and Flow Machines in the era of Jukebox.