Is ChatGPT Malware Using Python and Powershell to Steal Files?

ChatGPT and the AI projects associated with it have raised many concerns, ranging from academic cheating to job loss. One such overlooked concern is the creation of ChatGPT malware using Python and PowerShell. While these efforts are still at a basic stage, several proofs of concept have emerged indicating that low-skill or even no-skill threat actors have figured out how to manipulate ChatGPT into producing basic but viable malware.

ChatGPT Malware Using Python and Powershell to Steal Files

A report from Check Point Research describes a dark web thread posted on December 29 by a more experienced criminal actor instructing those with less skill, as well as a thread from a week earlier by a user who claimed their ChatGPT malware script was the first code they had ever written. A third thread, posted on New Year's Eve, describes how to use ChatGPT to create dark web marketplaces.

The more advanced forum user stated that he was attempting to prompt ChatGPT to recreate a variety of known malware strains and techniques. He also stated that he had succeeded in getting the AI to translate malware samples between programming languages. The method does require some basic coding knowledge, but the hacker provided detailed instructions for those wishing to replicate the technique. A second example from this poster has ChatGPT generate a short piece of Java code that downloads an SSH and telnet client and uses PowerShell to run it on a target system while avoiding detection. The script is open-ended, meaning it could just as easily be used to download and install malware on target systems instead.

With the help of ChatGPT, the less experienced forum user, who was experimenting with their first piece of Python malware, essentially created a basic ransomware tool. More knowledgeable forum members confirmed that the script would successfully encrypt a specified list of files or directories. As presented, the script included the information needed to decrypt the target files, but Check Point notes that it could be modified to remove this. Although this user's previous forum activity indicates that they are not a coder, they are active and well known in the criminal underground as a broker for stolen databases and access to compromised companies.

The third case does not involve malware, but it does show ChatGPT being used in the process of selling and transferring stolen data. This sample creates a temporary dark web marketplace script that supports cryptocurrency payment methods to facilitate transfers.

The most immediate AI malware threat is that it enhances the capabilities of unskilled threat actors.

At the moment, the tools created with ChatGPT do not pose any new or serious threats. However, keep in mind that ChatGPT is an early release of a project that is still in development, and it is only a matter of time before more sophisticated malware can be generated automatically with little to no hacking knowledge.

With the help of AI, experienced cybercriminals will eventually be able to create or refine highly customised tools in much shorter periods, while the inexperienced will benefit greatly. ChatGPT's ability to generate reasonably convincing phishing emails in another language is one example of this.

However, the "battle of AIs" remains a distant possibility, hampered by several factors. One is that ChatGPT frequently gets things wrong, but always outputs answers as if it is certain they are correct; it still requires a skilled eye to know if the generated code is functional and fit for its intended purpose. Another reason is that these advanced, expensive models remain in the hands of a small number of people who have considerable control over how they are used.

The most immediate threat is the boost that this will provide to "script kiddies," who have little coding knowledge but comb sites like GitHub and Stack Exchange to paste together prefabricated code, which can then be used maliciously.

Analytics Insight
www.analyticsinsight.net