The Popularity of AI and the 2023 Introduction of WormGPT
WormGPT’s features were practical and aimed at supporting harmful activities. It offered lightning-fast responses and unlimited message length, making it easy for users to communicate and operate without interruption. Privacy was also a big selling point, with promises of secure and confidential conversations to protect user identities.
Users could pick from different AI models tailored for general or specialized uses, and they could save and revisit conversations whenever they wanted. The tool even offered advanced beta features, like “context memory” for smoother, ongoing chats and “coding formatting” to help organize code in a way that was easy to read and use.

Official cybercrime forum advertisement for WormGPT
The tool proved effective at generating malware snippets, crafting phishing campaigns, and more. Testimonials in the forum thread confirmed its effectiveness, with users praising its capabilities. As WormGPT gained traction, media outlets began to take notice, and soon more than 100 news websites had covered it. Headlines like “ChatGPT’s Evil Twin WormGPT is Secretly Entering Emails, Raiding Banks” brought the story into the mainstream.
With all this publicity, the tool’s author started to respond, claiming WormGPT was intended for ethical usage only. This claim didn’t hold up, however, given the tool’s marketing on a cybercrime forum and the blatant blackhat messaging in its ads.

Closure message from the author of WormGPT
Eventually, the author ceased sales of WormGPT and posted a message on the forum, attempting to deflect any liability by distancing themselves from the tool’s misuse. Despite these efforts, WormGPT’s rapid rise and controversial marketing left a lasting impression on the cybercrime world.
The Move From WormGPT to FraudGPT and DarkBERT
After WormGPT started to fade away and became harder to purchase, a new AI tool called FraudGPT emerged, quickly filling the gap left behind. Discovered on July 25, 2023, FraudGPT was also marketed as a tool designed specifically for cybercriminals, offering features for crafting phishing emails, generating malicious code, and even providing hacking tutorials. Promoted by a user under the alias "CanadianKingpin12," FraudGPT was advertised on various cybercrime forums as a successor to WormGPT.

Example of FraudGPT’s interface
Interestingly, some speculated that WormGPT and FraudGPT might have been created by the same group, as they shared a similar set of capabilities and marketing styles. However, due to policy violations, threads promoting the sales of FraudGPT were frequently taken down on major cybercrime forums, forcing the creator to shift promotions to decentralized platforms like Telegram, where restrictions were less stringent.

Error message generated when viewing deleted FraudGPT thread
To improve FraudGPT’s appeal, the creator claimed that it had advantages over WormGPT and even hinted at developing new AI bots, such as “DarkBERT” and “DarkBART,” which would reportedly integrate internet access and Google Lens for image-based responses. But even with these attempts to keep the tool accessible, many of FraudGPT’s sales threads ceased to exist, suggesting a decline in its availability.
The Shift to Scam-Driven AI Models and Imitations
Around the time FraudGPT began losing visibility, other variants started popping up, including EscapeGPT, EvilGPT, and WolfGPT. However, many of these tools quickly raised suspicions within the cybercrime community. Allegations emerged that these so-called "new AI tools" were simply jailbroken versions of ChatGPT with added wrappers to make them appear as standalone products.