
Hackers are stealing data by exploiting ChatGPT to write malicious code

Updated on: 08 January, 2023 03:35 PM IST  |  New Delhi
IANS




Artificial intelligence (AI)-driven ChatGPT, which gives human-like answers to questions, is also being used by cybercriminals to develop malicious tools that can steal your data, a report has warned.


The first such instances of cybercriminals using ChatGPT to write malicious code have been spotted by Check Point Research (CPR) researchers.


In underground hacking forums, threat actors are creating "infostealers" and encryption tools, and facilitating fraud activity.


The researchers warned of the fast-growing interest in ChatGPT by cybercriminals to scale and teach malicious activity.

"Cybercriminals are finding ChatGPT attractive. In recent weeks, we're seeing evidence of hackers starting to use it to write malicious code. ChatGPT has the potential to speed up the process for hackers by giving them a good starting point," said Sergey Shykevich, Threat Intelligence Group Manager at Check Point.

Just as ChatGPT can be used for good to assist developers in writing code, it can also be used for malicious purposes.

On December 29, a thread named "ChatGPT - Benefits of Malware" appeared on a popular underground hacking forum.

The publisher of the thread disclosed that he was experimenting with ChatGPT to recreate malware strains and techniques described in research publications and write-ups about common malware.

"While this individual could be a tech-oriented threat actor, these posts seemed to be demonstrating less technically capable cybercriminals how to utilise ChatGPT for malicious purposes, with real examples they can immediately use," the report mentioned.

On December 21, a threat actor posted a Python script, which he emphasised was the 'first script he ever created'.
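The report does not reproduce the script itself. Purely as an illustration of the kind of short, beginner-level Python such forum posts described, a toy encryption utility (hypothetical code, not the actor's, and relying only on the standard library rather than a vetted cryptography package) might look like this:

```python
# Hypothetical illustration only: a toy file-encryption script of the
# beginner-level kind described in the report, NOT the actual posted code.
# Real tools should use a vetted library such as `cryptography`.
import hashlib
import os


def derive_keystream(password: bytes, salt: bytes, length: int) -> bytes:
    """Stretch a password into `length` pseudo-random bytes via PBKDF2."""
    stream = b""
    block = salt
    while len(stream) < length:
        block = hashlib.pbkdf2_hmac("sha256", password, block, 10_000)
        stream += block
    return stream[:length]


def xor_encrypt(data: bytes, password: bytes) -> bytes:
    """Toy encryption: returns salt || (data XOR keystream)."""
    salt = os.urandom(16)
    keystream = derive_keystream(password, salt, len(data))
    return salt + bytes(a ^ b for a, b in zip(data, keystream))


def xor_decrypt(blob: bytes, password: bytes) -> bytes:
    """Reverse of xor_encrypt: re-derive the keystream from the stored salt."""
    salt, payload = blob[:16], blob[16:]
    keystream = derive_keystream(password, salt, len(payload))
    return bytes(a ^ b for a, b in zip(payload, keystream))


if __name__ == "__main__":
    secret = b"example payload"
    blob = xor_encrypt(secret, b"hunter2")
    assert xor_decrypt(blob, b"hunter2") == secret
    print("round trip ok")
```

The point the researchers make is precisely that code at this level of simplicity is now within reach of someone with no programming background, because a chatbot can generate the scaffolding on request.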

When another cybercriminal commented that the style of the code resembles OpenAI code, the hacker confirmed that OpenAI gave him a "nice (helping) hand to finish the script with a nice scope."

This could mean that potential cybercriminals with little to no development skills could leverage ChatGPT to develop malicious tools and become fully-fledged cybercriminals with technical capabilities, the report warned.

"Although the tools that we analyse are pretty basic, it's only a matter of time until more sophisticated threat actors enhance the way they use AI-based tools," Shykevich said.

OpenAI, the developer behind ChatGPT, is reportedly trying to raise capital at a valuation of almost $30 billion.

Microsoft invested $1 billion in OpenAI in 2019 and is now pushing ChatGPT applications for solving real-life problems.


This story has been sourced from a third party syndicated feed, agencies. Mid-day accepts no responsibility or liability for its dependability, trustworthiness, reliability and data of the text. Mid-day management/mid-day.com reserves the sole right to alter, delete or remove (without notice) the content in its absolute discretion for any reason whatsoever



