Scammers are reportedly using artificial intelligence (AI) to target victims through text messages, social media, and dating apps and steal their money, ABC 13 reported Friday.
The victims were scammed by “women” who turned out to be fake, according to the report.
The outlet said a man named Jim was contacted through a mysterious text message by a “woman” who claimed she was in love with him and duped him into an investment.
The woman reportedly told him she had an uncle on the board of the Hong Kong stock exchange, and he sent money to invest, the report continued:
He was convinced to send $60,000 to invest in the stock exchange. He said he lost most of it because the investment tanked. Then, the woman opened up an overseas crypto account in his name, but when Jim tried to take that money out, he was going to be charged thousands in upfront tax fees. Experts say it’s a scam.
“I figured, ‘What the heck, I’ll try somebody online. It couldn’t hurt.’ I was wrong, it could,” Jim said.
Experts with Bitdefender and NordVPN examined the photos and videos of the people online and determined they were fake or altered.
AI technology has also caused great concern regarding deepfake images of everyday people, Breitbart News reported Wednesday:
Civitai, an online marketplace for AI models, has recently introduced a controversial feature allowing users to post “bounties” for creating deepfake images of real people, including normal people without significant public presence. Creators earn money for completing the bounties, which may be used for deepfake porn or other nefarious purposes.
Voicing her fears about AI, singer Dolly Parton recently explained, “I don’t think, or, at least, I hope nobody can ever replicate me or what I do. AI is a scary thing.”
“I’m sure it’s a good thing for scientists or medical things. But when it comes to trying to duplicate a human being and every little thing they are, it don’t seem right to me,” she added.
The technology can also be used for even darker purposes, as reported in the case of a North Carolina child psychiatrist who used AI to make child porn.