
Fight fire with fire… Google launches deepfakes to fight deepfakes

by ace


Google has released a huge set of deepfake video data in an effort to support researchers working on detection tools. Deepfake videos can look so authentic that they could fuel highly convincing misinformation campaigns in the upcoming elections.

They can also cause countless problems for people like celebrities whose faces can be used to create fake porn videos that look authentic.

The tech giant filmed actors in a variety of scenes and then used publicly available deepfake generation methods to create a database of about 3,000 deepfakes.

Researchers can now use this dataset to train automated detection tools and make them as effective and accurate as possible at detecting AI-synthesized images.
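One common pattern for such detection tools is to score each video frame individually and then aggregate those scores into a single verdict for the whole clip. The sketch below is purely illustrative and is not from Google's release; the function name, the mean-based aggregation, and the 0.5 threshold are all our own assumptions.

```python
# Hypothetical sketch: a frame-level deepfake detector outputs a "fake"
# probability per frame; the video-level verdict aggregates those scores.
# The aggregation rule and threshold here are illustrative assumptions.

def video_fake_score(frame_scores, threshold=0.5):
    """Aggregate per-frame fake probabilities (each 0.0-1.0) into a
    video-level mean score and a boolean fake/real verdict."""
    if not frame_scores:
        raise ValueError("no frame scores provided")
    mean_score = sum(frame_scores) / len(frame_scores)
    return mean_score, mean_score >= threshold

# Example: three of four frames look manipulated, so the clip is flagged.
score, is_fake = video_fake_score([0.9, 0.8, 0.7, 0.2])
```

Averaging over many frames is one simple way to smooth out single-frame mistakes; real systems may instead take the maximum score or use a temporal model across frames.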


Google promises to add more videos to the database, hoping to keep up with fast-evolving deepfake generation techniques. The company said in its announcement:

“As the field is moving fast, we will add to this dataset as deepfake technology evolves over time, and will continue to work with partners in this space. We firmly believe in supporting a thriving research community to mitigate potential damage from misuse of synthetic media, and today's release of our deepfake dataset in the FaceForensics benchmark is an important step in that direction.”

Google is not the only technology company that is contributing to the fight against deepfakes. Facebook and Microsoft are also involved in an industry-wide initiative to create an open source toolkit that businesses, governments and media organizations can use to detect counterfeit video.

The social network plans to launch a similar database by the end of the year, and Google is unlikely to mind – the more samples there are, the better detection tools can become.

