Bharti Airtel chairman Sunil Mittal has AI voice scam ‘warning’ for you and what makes it ‘dangerous’ – Times of India


Sunil Mittal, chairman of Bharti Airtel, recently revealed how scammers cloned his voice using artificial intelligence (AI) and called one of his executives to demand a large money transfer. Speaking at the NDTV World Summit, Mittal said the official was vigilant and “sensible” enough to realise that Mittal would not ask for such a large transfer, and the scam was stopped in its tracks.
He admitted that he was completely “stunned” when he heard the recording himself, stating that “it was perfectly articulated just as I would speak”. “And anyone who would not have been vigilant may have done something about it,” Mittal said.
During the event, Mittal warned about the misuse of technology. He said that fraudsters will go a step further in the future, misusing digital signatures and even replicating faces on Zoom calls to perpetrate such acts.
“We’ll have to protect our societies from the evils of AI, and yet we have to use the goodness of AI, because those companies, and nations that will not adopt AI will be left behind. So this is a conundrum for every time you get a new technology into place, there are pluses and minuses. I remain very optimistic about the benefit of AI that the human race will achieve and be able to do jobs which are otherwise very difficult to perform,” he warned.
Mittal also touched upon the dilemma of AI adoption, pointing out that societies must strike a balance between the positive applications of AI and the risks associated with its misuse.

Fraudsters using deep fakes and voice cloning for scams

The use of sophisticated tactics like AI-generated deepfakes and voice cloning by scammers to deceive their victims has increased recently. Fraudsters often create convincing voice replicas from just a few seconds of audio found in social media videos, then use these cloned voices to trick family members into sending money. What makes these scams particularly dangerous is their growing sophistication – even tech-savvy individuals can struggle to distinguish real from fake.
One particularly troubling trend is the “digital arrest” scam, in which fraudsters impersonate law enforcement officers through video or audio calls, then intimidate victims and confine them to their homes to extract payments.
