
Deep Fakes: Application of AI to fabricate videos

FordGT90Concept

We all know the true reason this tech was created...
 
Public manipulation and character assassination.
 
You can't manipulate someone who has no stake in the game and doesn't give a crap. :peace:
Food for thought.
 
It's scary that in movies we've had evil 'clones' or facsimiles of the hero who commit evil acts to tarnish the hero's name. That scenario is now getting within reach of anyone with deep enough pockets to carry it out. What we need is the reflexive technology to identify such fakery. Digital imprints in verified appearances or something like that.
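The "digital imprint" idea resembles content-provenance signing, where a publisher cryptographically binds a tag to the exact bytes of a video so any later tampering is detectable. A minimal sketch, assuming a hypothetical publisher with a shared secret key (a real system would use asymmetric keys so anyone can verify without being able to forge):

```python
import hashlib
import hmac

# Hypothetical publisher key; in practice this would be an asymmetric
# key pair so viewers can verify without being able to forge tags.
PUBLISHER_KEY = b"example-secret-key"

def imprint(video_bytes: bytes) -> str:
    """Produce a tag binding the content to the publisher's key."""
    return hmac.new(PUBLISHER_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify(video_bytes: bytes, tag: str) -> bool:
    """True only if the content is byte-identical to what was signed."""
    return hmac.compare_digest(imprint(video_bytes), tag)

original = b"\x00\x01frames..."
tag = imprint(original)
print(verify(original, tag))          # True: untouched content verifies
print(verify(original + b"x", tag))   # False: any edit breaks the imprint
```

The catch, as discussed below, is that this only proves what *was* signed; it can't flag a fake that was never signed in the first place, so it depends on viewers refusing to trust unsigned footage.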

It's certainly not a bright future...
 
I was gonna say porn.
Pretty sure the video says it was already done (10% of cases). People were superimposing someone else's head onto porn videos to make people think such and such had a sex tape. It's not true, so the person who was character-assassinated now has to fight a landslide of false accusations (which is what I assume the speaker deals with).

This has true potential to just take it to a newer, lower level and honestly I don't see a bottom to it.
What we need is the reflexive technology to identify such fakery.
That's what is being worked on. I just don't see it really working because, until people are willing to acknowledge that what they see may be a fabrication, they'll take it at face value and be inclined to propagate it. I don't think the majority of people are going to seek to verify something before believing it and spreading it.

The only way tech can stop it is if it is implemented before publication of the content. For example, if YouTube scans videos for Deep Fakes before it will allow publication. That's akin to censorship though. Not to mention the issue of false positives.
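The false-positive problem above is a threshold choice: a hypothetical detector emits a fake-probability score, and the platform must pick a cutoff for blocking uploads. A strict cutoff lets subtle fakes through; a loose one blocks authentic videos. A toy sketch (the detector scores are invented for illustration):

```python
# Hypothetical upload gate: a deep-fake detector returns a score in [0, 1]
# and the platform blocks anything at or above a chosen threshold.
def gate(score: float, threshold: float = 0.9) -> str:
    """Decide whether an upload is published, given a detector score."""
    return "blocked" if score >= threshold else "published"

# Invented example scores showing the trade-off: with threshold=0.9 the
# subtle fake slips through; lowering it to 0.5 would also block... what
# if an authentic clip had scored 0.55? That's the false positive.
uploads = {"authentic_clip": 0.12, "subtle_fake": 0.55, "obvious_fake": 0.97}
for name, score in uploads.items():
    print(name, "->", gate(score))
```

Wherever the threshold lands, someone is wronged: either a fabricated video is published or a legitimate one is suppressed, which is the censorship concern in a nutshell.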
 
Pretty sure the video says it was already done (10% of cases).

I know it was already done which explains why it was created. This software doesn't get engineered in a vacuum...
 
Public manipulation and character assassination.

You know, if you think about it even further this will likely have the complete opposite effect over time. People will get used to it and before you know it any sort of "evidence" will be completely worthless in the public space.
 