• FauxLiving@lemmy.world · 11 points · 20 hours ago

    You can run local models that will do this without being gaslit.

    Manipulating chatbots to bypass their refusal conditioning is pretty simple; you can find copy-paste blocks of text (jailbreak prompts) that work on most public models.

    You’re likely to get your account banned, though, as there are separate, non-LLM systems scanning your chat logs for banned terms specifically to catch these kinds of jailbreaks.

    • setVeryLoud(true);@lemmy.ca · 2 points · 8 hours ago

      I tried it with an uncensored version of Qwen; it straight up told me how to tie a noose and how to make sure the knot would be effective enough to kill me. I could even ask it for a more painful method, and it gave me one.