Hackers exploiting ChatGPT: New Delhi, Jan 8 (IANS) Artificial intelligence (AI)-driven ChatGPT, which gives human-like answers to questions, is also being used by cybercriminals to develop malicious tools that can steal your data, a report has warned.

ChatGPT, the new AI sensation, is helping hackers write code and launch cyberattacks effortlessly, researchers at security firm Check Point Research said in a blog post on Friday.

In underground hacking forums, threat actors are creating “info stealers” and encryption tools, and facilitating fraud activity.

The researchers warned of fast-growing interest in ChatGPT among cybercriminals, who are using it to scale malicious activity and to enable even less-skilled hackers to carry out attacks.

Cybercriminals are Finding ChatGPT Attractive


On December 29, a thread named “ChatGPT – Benefits of Malware” appeared on a popular underground hacking forum.

The publisher of the thread disclosed that he was experimenting with ChatGPT to recreate malware strains and techniques described in research publications and write-ups about common malware.

“Although this individual may be a technical threat actor, these posts demonstrate, with out-of-the-box examples, how non-technical cybercriminals can use ChatGPT for malicious purposes, with real examples they can immediately use,” the report mentioned.

In the same month, a threat actor named USDoD released a Python script, which he described as the first script he had ever created, written with the help of OpenAI. The script can fully encrypt files on another user’s computer without any user interaction.

USDoD is not a developer and has limited technical skills, but he is a very active member of the underground community. He has been involved in various illegal activities, including selling access to compromised companies and stolen databases.

Another notable instance of ChatGPT misuse, carried out on New Year’s Eve 2022, was the creation of scripts for dark web marketplaces. A marketplace’s main role is to provide a platform for the automated trade of illegal goods such as stolen accounts or payment cards, malware, or even drugs and ammunition, with all payments made in cryptocurrency, Check Point Research highlighted.

ChatGPT Abused to Build Hacking Tools

According to a new report from Israeli security firm Check Point, hackers are exploiting ChatGPT to develop powerful hacking tools and to create chatbots designed to mimic young girls in order to lure targets.

ChatGPT can also generate code for malicious software that monitors user keystrokes, and it can be used to create ransomware. For reference, ChatGPT was developed by OpenAI as an interface for its large language models (LLMs).

However, cybercriminals have found ways to turn it into a threat to the cyber world, as its code-generation ability can easily help threat actors launch cyberattacks.

Meanwhile, Hold Security founder Alex Holden says he has observed dating scammers abusing ChatGPT to create attractive personas. Scammers create female personas to gain trust and draw out conversations with their targets.

Potential Danger

Check Point Research’s researchers explained in their blog post that an attacker could create an authentic-looking spear-phishing email designed to run a reverse shell that accepts commands in English.

Many underground hacking forums contain posts from cybercriminals using OpenAI to develop malicious tools without any development skills. In one post confirmed by Check Point, a hacker shared Android malware code created with ChatGPT. The code can steal files of interest, compress them, and send them over the Internet.

Another user shared how ChatGPT was abused to code the functionality of dark web marketplaces such as Silk Road and AlphaBay.

Another tool posted on the forum can install a backdoor on a device and upload further malware to the compromised computer. Similarly, a user shared Python code, created with OpenAI, that can encrypt files.

The Bottom Line

“The tools that we analyzed are pretty basic, and it’s only a matter of time until more sophisticated threat actors enhance the way they use AI-based tools,” said Sergey Shykevich, threat intelligence group manager at Check Point.

OpenAI, the developer behind ChatGPT, is reportedly trying to raise capital at a valuation of almost $30 billion.

Microsoft invested $1 billion in OpenAI and is now pushing ChatGPT applications for solving real-life problems.


