Hacker News

I don't think this will be a big issue. The same could already be done with still images years ago, if not decades ago. And with enough time and resources, the same can be applied reasonably well to video, either manually frame by frame or using CGI.

It takes a lot more effort to make a fake undetectable, though, and I suspect the same applies to deepfakes. Once people get used to the fact that such face swaps are possible, they'll be more sceptical and demand verification of authenticity when sensitive material is published.



Most people barely question the validity or truth of the news they see on TV, and that's real footage.

Do you really think those same people will question whether a video is real in the first place, particularly if it reinforces their worldview?


I'm not so much concerned about people making politicians and other public figures say things they haven't said; rather, I worry that these figures will now have plausible deniability for any footage of them doing or saying something embarrassing. It's always harder to prove a negative, and in this case you'd have to prove that a video can't be a fake. Now imagine figuring out which video is real amid a flurry of fakes.



