There are growing concerns that scammers will use AI to develop new methods of fraud and deception as the technology continues to advance.
One of the primary ways scammers are using AI is through social media platforms. By leveraging AI-powered tools, scammers can amplify their reach and manufacture a seemingly loyal fanbase of thousands of fake accounts.
These fake accounts and interactions can be used to give the illusion of credibility and popularity to their scam projects.
Scammers may even use AI-driven chatbots or virtual assistants to engage with individuals, provide investment advice, promote fake tokens and initial coin offerings, or offer high-yield investment opportunities.
The use of AI can undermine social proof-of-work, the assumption that crypto projects with larger and more loyal online followings must be legitimate.
With AI making it easier for fraudulent projects to deceive people, users must exercise caution and perform due diligence before investing in a project.
One example of how scammers are using AI is “pig butchering” scams, in which AI chatbots can spend several days befriending a victim, often an elderly or vulnerable person, before ultimately defrauding them.
AI Allows Scammers to Automate and Scale Activities
The advancement of AI technologies has enabled scammers to automate and scale fraudulent activities, potentially targeting vulnerable individuals in the cryptosphere.
In addition, by leveraging social media platforms and AI-generated content, scammers can orchestrate elaborate pump-and-dump schemes, artificially inflating the value of tokens and selling off their holdings for significant profits, leaving numerous investors with losses.
Investors have long been warned to watch out for deepfake crypto scams, which use AI to create highly realistic content that swaps faces in videos and photos, or alters audio, to make it appear as if influencers and other well-known personalities are endorsing scam projects.
As reported, the Federal Bureau of Investigation has warned about the growing threat of “deepfakes” being used in cyber extortion.
Earlier this month, the FBI said that malicious actors are using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, to create sexually themed images that appear authentic.
Last week, Twitter also suspended the account of a popular meme coin-linked AI bot, “Explain This Bob,” after it was labeled a “scam.”
The automated Twitter account “Explain This Bob” used OpenAI’s latest large multimodal model GPT-4 to comprehend and respond to tweets by those who tagged the account.
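For illustration, the following is a minimal sketch of how a mention-reply bot of this kind could be wired together, assuming Tweepy’s v2 client for Twitter access and OpenAI’s Python client; the account ID, credentials, and prompt are placeholders, and this is not a reproduction of the actual bot’s code.

```python
# Sketch of a GPT-4 mention-reply bot (assumptions: Tweepy v2 Client,
# OpenAI Python client 1.x, placeholder credentials and account ID).
import os
import tweepy
from openai import OpenAI

twitter = tweepy.Client(
    bearer_token=os.environ["TWITTER_BEARER_TOKEN"],
    consumer_key=os.environ["TWITTER_API_KEY"],
    consumer_secret=os.environ["TWITTER_API_SECRET"],
    access_token=os.environ["TWITTER_ACCESS_TOKEN"],
    access_token_secret=os.environ["TWITTER_ACCESS_SECRET"],
)
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

BOT_USER_ID = 123456789  # hypothetical numeric ID of the bot account

def reply_to_mentions():
    # Fetch recent tweets that tag the bot account.
    mentions = twitter.get_users_mentions(id=BOT_USER_ID, max_results=10)
    for tweet in mentions.data or []:
        # Ask the model to "explain" the tweet the bot was tagged under.
        completion = llm.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system",
                 "content": "Explain the tweet below in one short, plain-English sentence."},
                {"role": "user", "content": tweet.text},
            ],
        )
        reply = completion.choices[0].message.content[:280]  # respect tweet length limit
        twitter.create_tweet(text=reply, in_reply_to_tweet_id=tweet.id)

if __name__ == "__main__":
    reply_to_mentions()
```

A real deployment would also need rate limiting and deduplication of already-answered mentions, but the sketch shows how little glue code is required to put a large language model behind an automated social media account.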
While there are some positive uses for AI in the cryptocurrency industry, such as automating repetitive aspects of crypto development, users must remain vigilant and exercise caution when investing in new projects.
“Cybercriminals use AI in crypto scams, creating advanced bots impersonating family members, raising concerns about crypto industry security and promoting a skeptical mindset,” Twitter user GarageID said.