Technology / Society
Some Of Them Are Of Cricketers. All Of Them Are On WhatsApp Before Breakfast.
During the India-Pakistan conflict, deepfakes of Modi, Shah, Jaishankar, and Shehbaz Sharif all went viral within hours of being made. AI audio detection tools confirmed what most people suspected: the politicians had not said the things they appeared to say. The people who shared the videos were not sure either. They shared them anyway.
During Operation Sindoor week, BOOM fact-checkers extracted audio from viral videos of PM Modi, Home Minister Amit Shah, and External Affairs Minister S. Jaishankar and ran it through AI audio detection tools at Deepfake-o-meter, an open-source resource from the University at Buffalo. The tools confirmed it: the politicians had not said what the videos showed them saying. The videos were AI-generated. Confirming this took a few hours. By the time the confirmation was published, the videos had been shared four million times.
The deepfakes went in both directions. A deepfake of Shehbaz Sharif admitting Pakistan's defeat went viral in India. A deepfake of Modi apologising to Pakistan went viral in Pakistan. Both were false. Both were shared extensively by people who found them believable, and that is the most revealing detail: not that the deepfakes existed, but that citizens of both countries found "the enemy's leader has admitted defeat" credible enough to share without checking. We will believe anything that confirms what we already think. The technology is new. The impulse is not.
