ChatGPT can be tricked into writing malware when acting in developer mode
TOKYO – Users are able to trick ChatGPT into writing code for malicious software by entering a prompt that makes the artificial intelligence chatbot respond as if it were in developer mode.