At a Time of Fake News, 'DeepFakes' is The Technology That We Just Don't Need

"What do you mean, that isn't you on that super HD quality video I just watched?"

Raka Mukherjee | News18.com@RakaMukherjeee

Updated: July 3, 2018, 2:00 PM IST


At least 31 people were murdered across more than 10 states in the past year, all because of fake news circulated on WhatsApp.

At a time when the country is battling both fake news and trolls, the world of technology has just handed us a gift we really did not need.

When you search for 'deepfakes', the first result Google shows is a Wikipedia article that defines it as “an artificial intelligence-based human image synthesis technique. It is used to combine and superimpose existing images and videos onto source images or videos.”

The next link is one that takes you to a website where you can directly learn how to master DeepFakes.

The first step is knowledge. The second step is access.

You may not have had any clue what DeepFakes are, but rest assured, you’ve seen examples of them everywhere – on porn sites where celebrity faces are morphed onto porn stars, and in memes where prominent political figures’ heads are fixed onto cartoons. Ever wondered why they were so realistic? The answer is DeepFakes.

DeepFakes is essentially software that recognizes human facial features and imposes them onto existing videos or images. And naturally, as this tool is now easily available to anyone and everyone on the internet, people make the best (or worst) use of it.
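To make the idea concrete, here is a minimal sketch of the crudest version of that pipeline – detect a face in one image and paste it over a face in another – using the widely available OpenCV library. The file names are placeholders, and real DeepFakes tools go much further, training deep neural networks so the swapped face matches expression and lighting; this only illustrates the superimposition step.

```python
import cv2

# Illustrative sketch only: a crude face "swap" using OpenCV's bundled
# Haar-cascade face detector. File names are placeholders; actual deepfake
# tools train neural networks instead of pasting pixels like this.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    """Return the first detected face rectangle (x, y, w, h), or None."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None

source = cv2.imread("source_face.jpg")   # face to transplant (placeholder file)
target = cv2.imread("target_frame.jpg")  # image or video frame to alter (placeholder file)

src_box, dst_box = first_face(source), first_face(target)
if src_box is not None and dst_box is not None:
    sx, sy, sw, sh = src_box
    tx, ty, tw, th = dst_box
    # Resize the source face to the target face region and blend it in.
    patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
    target[ty:ty + th, tx:tx + tw] = cv2.addWeighted(
        target[ty:ty + th, tx:tx + tw], 0.2, patch, 0.8, 0)
    cv2.imwrite("swapped.jpg", target)
```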

Deep Fakes Club, which is the first learning link you see, will teach you how to go from amateur to pro in minutes.

Maybe the only superimposing you have done so far is pasting paper cut-outs of your favorite actor into a scrapbook as a kid, or simply watching political caricatures of famous people on TV shows. But this site will give you instructions down to the last T, along with links and explainer videos in case you still don’t understand. It’s almost like they’re trying to convince you why you should have this skill.

How you use this skill, however, is completely at your own discretion.

The 'About' section for the site states how it is aimed at being “a community resource to promote and develop deep learning-based face swapping techniques. We focus on education, news, and technical development.”

The site does not take any responsibility for how you use – or misuse – it. And more often than not, it is misused.

Today, there are global debates on the implications of DeepFakes – that political figures, for instance, can have their videos morphed. DeepFakes lets you create HD-quality videos that blend so seamlessly, and from so many angles, that the chances of spotting the fake are close to nil.

In India alone, the number of cases of morphed pictures and videos is at an all-time high.

At a time when people have the world on their phone screens, they also have the best opportunity to misuse it.

As recently as April this year, journalist Rana Ayyub had a morphed pornographic video of her circulated on social media – as “repercussions” for speaking up about how freedom of speech in newsrooms was being stifled.
But Ayyub is not the only person who has faced this – there are many cybercrime cases involving these kinds of videos that often go unreported.

In April 2017, two men in their mid-twenties were arrested after they allegedly morphed a minor girl’s face onto a porn clip and started circulating it on social media. The crime came to light when the girl received the clip on WhatsApp – a video containing sexually explicit content with her face superimposed on it.

Another similar instance occurred in September 2016 involving a minor in Kolkata, where the accused had threatened the school-going girl with exposing her morphed photos on porn sites.

Similarly, in April this year, a Kerala bride, after seeing her image on social media, uncovered a racket in which a photo studio would morph women’s wedding pictures into pornographic images.

A 2015 Times of India article also reported multiple instances of a woman’s face being morphed onto questionable pictures that circulated on WhatsApp.
Similarly, in March 2018, a Punjabi teen’s face was morphed into an obscene video that was uploaded to Instagram.

All of these instances bring to light a common thread – they are all discovered after the image or video has been morphed and circulated.

This brings into question access – again, the technology to create these videos is at your fingertips.

The negative implications of this growing technology are huge. In India, where we believe WhatsApp forwards carrying fake news without so much as a simple Google search to check whether they are legitimate, where would HD-quality videos, shot from different angles, all showing the same person’s face, leave us?

WhatsApp messages have gotten people killed – the number of lynchings triggered by fake WhatsApp messages has been rising steadily as people choose to act instead of questioning the content. In a situation like this, if you receive a video in which a friend, relative or neighbor is committing a questionable deed, how likely are you to believe it? And how likely are you to believe them when they say it is morphed?

The world is struggling with the menace of fake news. Social media giants are joining hands to battle the deep problem. At this point, do we really need technology like this?

