Jailbroken AI Chatbots Can Jailbreak Other Chatbots
AI chatbots can convince other chatbots to instruct users how to build bombs and cook meth