My first brush with the idea of technology gone wrong was Shelley's 'Frankenstein'. A voracious reader in my school days, I horrifically visualized the half-human, half-monster thirsting for revenge on his creator and the entire human race. What fascinated me most was the beast's strength and his ability to chase his enemy-creator to the ends of the earth, unaffected by sub-zero temperatures, consumed by grief and anger, investing all his brute strength and mental faculties in destroying his target.
Even now, the dingy laboratory where the monster was conceived sometimes appears in my dreams, complete with yellow eyes, exposed nerves, and the smell of science gone wrong. In today's advanced world, the Frankenstein monster manifests itself in other forms.
There's always been a real and a fake.
There's always been a black and a white.
But you could always tell the difference.
Until now.
The new-age fraudsters are faking it, and they are faking it deeply.
Deepfake – the use of multiple layers of algorithms to replace an original image, audio, or video recording, complete with lip-syncing effects. Fixing your face on someone else's body, and flatteringly so; putting someone else's words in your mouth, and convincingly so; attributing someone else's actions to you, and inappropriately so – to embarrass, defame, discredit, or blackmail you, or simply to have fun at your cost and prove one's beguiling adroitness with a new techno-innovation.
Suppose you have an eye for detail – spotting uneven colour tones and lighting conditions, unusual movements or the lack of them, or glitches in audio or video quality. In that case, you might be able to tell the real from the fake. However, highly skilled fraudsters leave no scope for any such anomalies. When you can't tell a third person's real from their fake, you're open to deceit, fraud, and a life that's exposed, unsafe, and vulnerable.
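For the technically curious, the "eye for detail" idea can be caricatured in a few lines of code. This is only a toy sketch under an assumed setup – real detectors are trained neural networks, and every name and threshold here is illustrative – but it shows the general principle of flagging frame-to-frame inconsistencies, such as lighting that jumps abruptly because a fake face was pasted onto each frame independently.

```python
# Toy illustration, not a real deepfake detector. Each "frame" is a small
# grid of grayscale pixel values (0-255). Real video brightness drifts
# smoothly; a sharp frame-to-frame jump is flagged as suspicious.
# Function names and the threshold are illustrative assumptions.

def mean_brightness(frame):
    """Average pixel value of one frame (a list of rows of ints)."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def suspicious_jumps(frames, threshold=40.0):
    """Return indices of frames whose mean brightness jumps sharply
    from the previous frame - a naive inconsistency cue."""
    flagged = []
    prev = mean_brightness(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = mean_brightness(frame)
        if abs(cur - prev) > threshold:
            flagged.append(i)
        prev = cur
    return flagged

# A smoothly lit "video" with one abrupt lighting jump at frame 2.
frames = [
    [[100, 102], [101, 99]],   # frame 0: mean ~100
    [[104, 103], [105, 102]],  # frame 1: mean ~103 (smooth drift)
    [[180, 178], [182, 181]],  # frame 2: mean ~180 (abrupt jump)
    [[183, 181], [180, 182]],  # frame 3: mean ~181 (smooth again)
]
print(suspicious_jumps(frames))  # → [2]
```

Of course, a skilled forger smooths exactly these statistics away – which is the author's point about experts leaving no scope for anomalies.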
Rashmika Mandanna was one of the first widely known targets of deepfakes, followed by Katrina Kaif. It took these beautiful actresses for the world to wake up to the fact that reputation, respect, and esteem can be permanently injured by an innovation that can replace, fake, and superimpose audio-visual effects and present a false you to the world.
In the words of a thoroughly confused Salman Khan in Andaaz Apna Apna:
Tum jo tum ho wo tum nahin ho, wo wo hai.
Wo jo wo hai wo wo nahin hai, wo tum ho.
Tum jo ho wo tum nahin ho.
Wo jo hai wo wo nahin hai.
Ohhhh….
Main jo hoon main main hoon,
Ya main bhi wo nahin hoon jo main hoon?
Main kaun hoon?
(Roughly: the you that you are is not you, it is him; the him that he is is not him, it is you; am I even the me that I am? Who am I?)
Karisma Kapoor’s ‘Buddhu’ sums it up!
We too are naïve, unknowing, clueless targets of this spectacular burst of technology – a.k.a. Buddhus!
Has it come to this now? Super-smart technology turning one person's intelligence against another, creating two main categories in the social order – the Victim and the Perpetrator. So much for the supremacy of the humans who created AI! Did we domesticate a dog so strong that it now takes us for a walk because we can't control the leash, or is this a typical example of a great idea backfiring?
Technology is a double-edged sword: on one hand it makes life easy, interesting, productive, and pliable; on the other, it leaves life exposed to unholy experiments by unscrupulous people.
PM Narendra Modi, former US President Barack Obama, and the current President Joe Biden have all had deepfakes dedicated to them. However, this does not mean that only the rich and famous or only actors are at risk. Everyone is at risk: social media enthusiasts, actors, professionals in showbiz, politicians, commoners, anyone. Deepfakes are not yesterday's fad; they have been around for quite some time now. And they are dangerous, what with their ability to reach far and wide within minutes, their similarity to original content, and the tendency of people to lap up inappropriate subject matter for its shock value and, at times, for voyeurism, rubbernecking, and gossip.
Deepfakes might lead to the downfall of a government, a company going bankrupt, a career getting spoilt, a relationship going sour, a life destroyed, and so on.
So, who is protecting us? Several sections of the IPC and the IT Act prescribe punishments for online fraud and forgery. But are they enough to deter the scammers? These laws might need more teeth, and more provisions to differentiate between deepfakes created for valid purposes and the malicious ones.
And we? What should we do? Protect our online accounts, pictures, and passwords with the latest security provisions; share our pictures and videos sparingly; beware of online fraudsters; refuse to believe everything on social media and use our pragmatism instead; report cyber harassment promptly; and understand that AI was created for the furtherance of good causes, and that curtailment is required when it crosses the boundaries of decency and morality.
Reverse engineering might be a solution to this problem, but at present deepfake creation is several steps ahead of deepfake detection: new techniques are introduced every day, and it is difficult to keep pace with them. Even the winning algorithm at Facebook's Deepfake Detection Challenge, held a couple of years back, could correctly identify only about 65 per cent of the deepfakes in its final test.
So, when even the experts can't do much about a technological menace, we can wait and watch, and at least make a song and dance about it!
Modiji did not dance and sing,
Getting into the elevator like that wasn’t a Rashmika thing!
Kajol never thought changing on camera was better,
A top or a towel – in which one does Katrina look hotter?
Even Obama and Biden can be faked with ease,
Choose your deepfake – notorious, provocative, or just a tease!
(Nipun Dutta, Asst. Editor – teacher by profession, ex-lawyer, and a writer by choice)