News

Two users demonstrated that Clyde, Discord's new AI chatbot, can be tricked with roleplay prompts into giving instructions for making dangerous substances such as napalm.