As artificial intelligence (AI) continues to advance, there are growing concerns that scammers will exploit the technology to develop new methods of fraud and deception. One primary way scammers use AI is on social media platforms, where AI-powered tools amplify their reach and manufacture a seemingly loyal fanbase of thousands of followers. These fake accounts and interactions lend an illusion of credibility and popularity to deceitful projects.
Scammers may use AI-driven chatbots or virtual assistants to engage with individuals, dispense investment advice, promote fake tokens and initial coin offerings (ICOs), or pitch high-yield investment opportunities. This undermines the notion of social proof-of-work, which assumes that crypto projects with larger and more loyal online followings must be legitimate.
In “pig butchering” scams, for example, an AI instance can spend several days befriending a target, usually an elderly or otherwise vulnerable person, before ultimately defrauding them. AI allows scammers to automate and scale these activities, potentially targeting many vulnerable individuals within the crypto sphere at once.
Additionally, scammers can use social media platforms and AI-generated content to orchestrate elaborate pump-and-dump schemes, artificially inflating the value of tokens before selling off their holdings for significant profits and leaving investors with losses. Investors have long been warned about deepfake crypto scams, which use AI to produce realistic videos, photos, or audio that falsely suggest influencers or well-known personalities endorse scam projects.
The Federal Bureau of Investigation has issued a stark warning about the increasing threat of deepfakes being used in cyber extortion. Malicious actors are reportedly using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, creating sexually themed images that appear authentic.
Recently, Twitter suspended the account of a popular meme-coin-linked AI bot after Elon Musk called it a “scam.” The automated Twitter account used OpenAI’s large multimodal model GPT-4 to comprehend and respond to users who tagged the account.
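The report does not include technical details of how the bot was built. As a rough illustration only, the sketch below shows how a generic mention-reply bot could be wired together with the tweepy and openai Python libraries; the account ID and credentials are hypothetical placeholders, and this is not the actual bot’s implementation.

```python
# Illustrative sketch only: a generic bot that answers Twitter mentions with GPT-4.
# All credentials and the account ID are hypothetical placeholders.
import os
import tweepy
from openai import OpenAI

twitter = tweepy.Client(
    bearer_token=os.environ["TWITTER_BEARER_TOKEN"],
    consumer_key=os.environ["TWITTER_API_KEY"],
    consumer_secret=os.environ["TWITTER_API_SECRET"],
    access_token=os.environ["TWITTER_ACCESS_TOKEN"],
    access_token_secret=os.environ["TWITTER_ACCESS_SECRET"],
)
openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

BOT_USER_ID = "0000000000"  # placeholder account ID

# Fetch recent tweets that tag the bot's account.
mentions = twitter.get_users_mentions(id=BOT_USER_ID, max_results=10)

for tweet in mentions.data or []:
    # Ask GPT-4 to draft a short reply to the mention.
    completion = openai_client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Reply to the user's tweet in under 280 characters."},
            {"role": "user", "content": tweet.text},
        ],
    )
    reply_text = completion.choices[0].message.content

    # Post the generated text as a reply to the original tweet.
    twitter.create_tweet(text=reply_text, in_reply_to_tweet_id=tweet.id)
```

In practice a bot like this runs on a schedule or a streaming connection rather than a one-off loop, which is what lets a single operator sustain thousands of automated interactions.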
While there are some positive uses for AI in the cryptocurrency industry, such as automating mundane aspects of crypto development, users must remain vigilant and exercise caution when investing in new projects. Experts warn that the rise of AI entails significant risks to the industry, and investors must do their due diligence to avoid falling victim to these new scams.
Source: Cryptonews