
AI-powered voice & video scams: India the unfortunate leader


Ally Armeson, executive program director of the Cybercrime Support Network, a nonprofit organization, says, “Generative AI is evolving very quickly. Like any technology, generative AI can be misused or exploited for harmful purposes, so certainly regulation will be necessary as these generative AI tools continue to develop.”

Even before such regulation has arrived, the technology is already being misused. The growing adoption of artificial intelligence (AI) tools has made it easier than ever to manipulate the images, videos, and voices of acquaintances and relatives.

Recently, news surfaced that cyber attackers are exploiting AI-based voice cloning and facial reenactment technology to deceive people. A recent study indicates that India has the highest share of victims, with 83% of Indian victims losing money to such fraud. Criminals trick their targets into thinking they are talking to a relative who urgently needs money, citing reasons like paying for damages from a car accident or a ransom for a kidnapping. A McAfee report suggests that a majority (69%) of Indians cannot reliably tell a genuine human voice from an AI-generated one.

Fraudsters are using AI to clone voices and generate fake videos from publicly available social media photos, impersonating distressed family members, and a considerable number of Indians are falling victim to these scams.

Facial reenactment is a technique that uses AI to map the movements of one person’s face onto another person’s face in a video. Done convincingly, it makes it genuinely difficult to tell whether a recording is real or fake.
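To make the idea of “mapping” a face concrete: reenactment pipelines typically begin by extracting a set of facial landmark points from each video frame and tracking how those points move. Below is a minimal, illustrative sketch of just that first landmark-extraction step, using Google’s open-source MediaPipe Face Mesh. It assumes the mediapipe and opencv-python packages are installed, and "frame.jpg" is a placeholder file name. This is the same building block that deepfake-detection tools inspect, not a recipe for producing fakes:

    # Minimal sketch: extract facial landmarks from a single video frame.
    # Assumes: pip install mediapipe opencv-python; "frame.jpg" is a placeholder.
    import cv2
    import mediapipe as mp

    mp_face_mesh = mp.solutions.face_mesh

    image = cv2.imread("frame.jpg")               # load one frame (BGR)
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB

    with mp_face_mesh.FaceMesh(static_image_mode=True,
                               max_num_faces=1,
                               refine_landmarks=True) as face_mesh:
        results = face_mesh.process(rgb)

    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        # Each landmark is a normalized (x, y, z) point on the face surface.
        print(f"Tracked {len(landmarks)} facial landmark points")
        # Reenactment systems track how these points move frame to frame
        # and transfer that motion onto a target face.

The takeaway for readers is not the code itself but how little it requires: a single ordinary photo or frame is enough to recover hundreds of trackable facial points, which is why scammers can work from nothing more than public social media pictures.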

Recent Fraud

In a recently reported AI deepfake scam, a man from Kerala was duped out of Rs 40,000. The victim, Radhakrishnan, a native of Kozhikode, received a video call from an unknown number. The face on the call resembled one of his former colleagues from Andhra Pradesh, and to gain his trust, the caller even mentioned the names of some of their mutual friends.

Believing the call was genuine, the victim stayed on, and a few minutes in, the caller asked for Rs 40,000 as urgent assistance for a relative in hospital. Wanting to help his friend, the victim sent the amount online. A short time later, however, the same person asked for another Rs 35,000. This time the victim grew suspicious and contacted his former colleague directly to verify. Only then did he realize that the call was fake and he had been cheated. He reported the fraud to the police.

“This is, without hesitation, the scariest thing I have ever seen,” said Scott Hermann, the founder of financial and identity protection company IdentityIQ.

Most of us think we would recognize a loved one’s voice in a heartbeat. However, McAfee found that 70% of the adults it surveyed lacked confidence in telling a cloned voice from a real one. A 2019 study found that the brain does not register a significant difference between real and computer-generated voices: subjects incorrectly identified recordings as real 58% of the time, leaving plenty of room for scammers to exploit. Moreover, many people hand their real voice to scammers themselves: McAfee said 53% of adults share their voice data online weekly.

How to Be Safe

  • Police have issued an alert about online cheating that uses AI-based deepfakes. They have warned people to double-check the veracity of any such requests and to report cheating on the helpline number 1930 so that the fraudulent transactions can be frozen.
  • A caller from an unknown number may ask for personal information or money, and may sound slower or otherwise different than you remember. It’s best to end the call immediately and to avoid answering repeated calls from the same number.
  • The Federal Trade Commission advises that if you get a call from a loved one asking for money, you should call the person back on a phone number you know is theirs. If you can’t reach your loved one, try to get in touch through another family member or their friends.

Conclusion
If we are aware that such scams exist, we don’t need to be particularly worried. All it takes is the presence of mind to step back and ask a few questions that only the real loved one on the other end of the line would know the answer to.

Security experts even recommend establishing a safe word with loved ones to distinguish between a genuine emergency and a scam.

Security agencies and the government are working on solutions. Until those arrive, it is our responsibility to understand this new cyber threat and protect ourselves.




