Bok
12-06-2023, 09:50 AM
AI chatbots can convince other chatbots to instruct users how to build bombs and cook meth
More... (https://www.scientificamerican.com/article/jailbroken-ai-chatbots-can-jailbreak-other-chatbots/)