2023-05-04 08:59:17
CHP leader and Nation Alliance presidential candidate Kemal Kılıçdaroğlu claimed in his statements that the government would engage in “dirty business” as the elections approached. In a tweet on the night of May 1, Kılıçdaroğlu wrote: “I will give my last warning. Fahrettin Altun, Serhat and their teammates Çağatay and Evren: the dark-web world you are trying to deal with will fall into the hands of foreign intelligence. Playing Cambridge Analytica is beyond your capacity, boys.” Later, Selahattin Demirtaş, the imprisoned former HDP leader, made a statement similar to Kılıçdaroğlu’s: “Provocations may be staged by hacking the accounts of well-known people, or through accounts that have earned your trust until now. If defamatory videos are circulated, PLEASE DO NOT WATCH OR SHARE THEM, AND BLOCK THOSE WHO DO, regardless of whom they target.” In Ankara’s political back rooms there are competing claims about when such tapes and videos were shot. Ten days before the election, political tension is visibly rising around tapes and videos.
So what is this Deep Fake?
It is a technology developed by Stanford engineers for the entertainment industry in 2019. For example, a flubbed line can be corrected during editing without reshooting. Image manipulation, though, is as old as imaging itself. According to Füsun Sarp Nebil of T24, CGI has likewise been on the market for years, and both techniques have found wide use in deep fakes. The person who took the idea to this level is Ohad Fried, a Princeton graduate student.
Deep Fake is based on machine learning. In other words, new images are generated by modeling the facial expressions, and especially the mouth movements, of the people in existing videos.
Deep Fake technology has reached the point where producing a fake video is as simple as editing text in a Word file. In other words, you can take a video of any person and have them say whatever you want: the editor modifies the video through its transcript. Just as in word processing, the editor can easily add new words, delete unwanted ones, or rearrange a finished video.
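The transcript-driven workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not any real tool’s API: it assumes a word-to-frame alignment has already been computed for the clip, and it skips the hard part, synthesizing new mouth frames for inserted words.

```python
# Toy illustration of text-based video editing (all data shapes assumed):
# each spoken word in the original clip is aligned to a span of frame indices.
alignment = {
    "i": (0, 10), "will": (10, 22), "never": (22, 40),
    "raise": (40, 58), "taxes": (58, 80),
}

def edit_by_text(edited_transcript, alignment):
    """Re-sequence the clip's frame spans to match an edited transcript.
    Words deleted from the text drop their frames; reordered words
    reorder their frames. Real systems additionally synthesize new
    mouth frames for inserted words; that step is omitted here."""
    return [alignment[w] for w in edited_transcript.split() if w in alignment]

# Deleting a single word flips the sentence's meaning without reshooting:
print(edit_by_text("i will raise taxes", alignment))
# -> [(0, 10), (10, 22), (40, 58), (58, 80)]
```

The point of the sketch is how low the bar is: once the alignment exists, the “edit” is ordinary list manipulation.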
According to current estimates, 96 percent of all deep fake content is pornographic. In the USA, the use of deep fakes in revenge porn (intimate images published by an ex) has grown so much that many states have introduced bans and heavy penalties.
This technology is also a tool for cybercriminals. Deep fake scams, in audio or video form, aimed at financial companies are already being reported. For example, a fake interview with Elon Musk was used in a cryptocurrency scam.

But Deep Fake also poses real risks to politics and elections.
HOW TO SPOT DEEP FAKES THAT DECEIVE THE SENSES
Although deep fake videos were developed for the entertainment industry, malicious uses unfortunately outnumber the legitimate ones. Used irresponsibly, deep fake videos pose significant threats to individuals and organizations.
Still, there are some tips that can help you recognize a Deep Fake video. Let’s go through them:
1. Look for unnatural eye movements
A common warning sign is unnatural eye movement, or the lack of it, especially the absence of blinking. A natural-looking blink is hard to fake, and eye movements are difficult to reproduce accurately, because when a person talks to someone, their eyes normally follow the other person.
2. Artificial facial movements
If a person’s face does not show the emotion that should match what they are saying, the video is most likely a deep fake. Likewise, if you can detect face swapping or image splicing, the video may be a deep fake.
3. Unnatural position of facial features
Be wary of realistic-looking videos in which a person’s face and nose point in different directions. Also check whether the facial features sit in natural positions relative to one another.
4. Compare the sound quality
Deep fake makers typically put more effort into visuals than audio, so you can often spot the difference in the voice or speaking style. Watch out for poor lip sync, robotic voices, awkward pronunciation, digital background noise, and even missing audio altogether.
5. Strange body shape or movement
Be suspicious if a person’s head turns or body moves in a jerky way, or if their motion from one frame to the next is intermittent, disjointed, or distorted.
6. Strange posture or physique
Another indication of a deep fake is a body shape that looks unnatural, or a head and body positioned in an awkward or inconsistent manner. This is one of the simpler anomalies to detect, as deep fake technology typically focuses on facial features rather than the entire body.
7. Beware of inconsistencies in color and lighting
Unusual skin tones, blemishes, strange lighting, and oddly placed shadows indicate that what you see may be fake. If you are watching a questionable video, note any discrepancies in the person’s appearance and compare them with an original reference. This will help you determine whether it is a deep fake.
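Of these tips, the blink cue from item 1 is the easiest to turn into a number. Below is a minimal Python sketch: it computes the standard eye aspect ratio (EAR) from six eye landmarks and counts blinks as short dips below a threshold. The landmark layout, the 0.21 threshold, and the two-frame minimum are illustrative assumptions; in practice the landmarks would come from a face-tracking library.

```python
import math

def eye_aspect_ratio(eye):
    """Eye aspect ratio from 6 landmark points ordered around the eye:
    p1 and p4 are the horizontal corners, (p2, p6) and (p3, p5) are
    vertical pairs. Open eye -> larger ratio; closed eye -> near zero."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

def count_blinks(ear_series, threshold=0.21, min_frames=2):
    """Count blinks: runs of at least min_frames consecutive frames
    whose EAR falls below the threshold."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

A talking adult typically blinks around 15–20 times per minute; a per-minute blink count near zero over a long clip is exactly the warning sign item 1 describes.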
DEEP FAKE DEVELOPS, DETECTION IS DIFFICULT
While listing these, let us note: as technology advances, deep fakes evolve as well. For this reason, perhaps we can add the following as an 8th item (the scenario expected in the 10 days before the election):
8. If you hear a famous person make crazy claims or things that can’t be true, it might be a deep fake
As in the Elon Musk video mentioned above, if a famous person appears to say something that contradicts what is known, exercise due care and check the claim against reputable sources, even if the video is persuasive.
There are also efforts to detect and prevent deep fakes. Last year, Intel, Adobe, and Microsoft announced the “Coalition for Content Provenance and Authenticity (C2PA)” to fight deep fakes. These initiatives are worth examining and following as well.
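The core idea behind provenance schemes like C2PA is to bind a cryptographic digest of the media to a signed manifest at capture or publication time, so that any later edit becomes detectable. The Python sketch below illustrates only that bind-then-verify idea, using an HMAC as a stand-in signature; the real C2PA standard uses X.509 certificate chains and CBOR-encoded manifests, not this toy format.

```python
import hashlib
import hmac

def content_digest(data: bytes) -> str:
    """SHA-256 digest of the raw media bytes."""
    return hashlib.sha256(data).hexdigest()

def make_manifest(data: bytes, key: bytes) -> dict:
    """Build a toy manifest: record the media digest and sign it."""
    digest = content_digest(data)
    sig = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return {"digest": digest, "signature": sig}

def verify_manifest(data: bytes, manifest: dict, key: bytes) -> bool:
    """Check that the media still matches the digest recorded at
    publication time, and that the manifest's signature is valid.
    Any re-encoding or splicing of the media changes its digest."""
    expected_sig = hmac.new(key, manifest["digest"].encode(),
                            hashlib.sha256).hexdigest()
    return (content_digest(data) == manifest["digest"]
            and hmac.compare_digest(expected_sig, manifest["signature"]))
```

For example, a clip published with its manifest verifies as authentic, while the same clip after a single byte of tampering, or a manifest signed with the wrong key, fails the check.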
It should also be noted that scammers can deliberately encode videos so as to hide their deep fake flaws. The best strategy, therefore, is not to hunt for visual clues but to rely on common sense and fact-checking skills. (T24, Füsun Sarp Nebil)
#Deep #Fake