• 1dalm@lemmings.world · 1 day ago

    Fair. I’m an addict.

    But you still didn’t respond to the criticism that no better ideas have been offered to protect children online.

    • ignirtoq@feddit.online · 1 day ago

      First, you don’t need to respond to arguments made in bad faith. There’s no net positive outcome possible for the person who is coming to the conversation in good faith.

      Second, not having their own solution does not invalidate anything critical they say about the one under discussion. There doesn’t have to be a “better” solution to justify not implementing something that has no positive impact on the targeted problem and severe negative impacts elsewhere.

    • BenderRodriguez@lemmy.world · 1 day ago

      How will an adult offering up their photo identification protect children? Let’s say I’m a pedophile who’s targeting children. Will I upload a video of my face or Government ID to prey on children on Discord, or will I pretend to be younger and not identify myself? Your argument makes no sense. Also, what are you talking about being an addict for? It’s completely irrelevant here.

      • Zikeji@programming.dev · 1 day ago

        Isn’t the entire point to “protect” the children from “being corrupted” by that evil porn and other adult content? It’s not about protecting them from pedophiles, it’s about keeping them nice and innocent so it’s even easier for the pedophiles to groom them.

      • 1dalm@lemmings.world · 1 day ago

        I’m only going to explain this once, because this topic, like most online topics, is more about emotions than facts. But here we are.

        You didn’t understand how this helps because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety trainings over the years for professional reasons. Here’s how online child predators work: they start by trying to get the kid into a secret. They say, “hey, want to see some porn?”, and of course the kid is curious. Then the kid is told, “be sure you don’t tell your parents about this.” From there they slowly pull the kid into deeper and deeper secrets and start to blackmail them, demanding nude photos and using the growing secrecy and guilt to get more and more out of them. In the worst cases this ends in an in-person meetup with the predator.

        The easiest places for predators to start this process are porn sites, because the kids there are already visiting in secret. Those are the kids who are most vulnerable.

        So how does this protect kids? The goal is to keep them out of the spaces where they would be targeted in the first place.

        So there you go. I’m all ears for how to do this better.

          • 1dalm@lemmings.world · 1 day ago

            Some of them, yes.

            You see, one thing child predators really hate is policies. They want their interactions to be frictionless, so that at the first sign of trouble they can get out. Strong policies are a real deterrent.

            It won’t make the evil people stop doing the evil things, but it’ll cause a lot of them to move to someone else’s platform that has weaker policies.

    • U7826391786239@lemmy.zip · 1 day ago

      1) it’s not the world’s problem to come up with a solution; it’s discord’s (and all social media’s) problem. 2) not having a solution doesn’t justify taking actions that are not only not solutions, but also put users at risk.

      if you’re fine with putting up with the bullshit, that’s your business, but everyone who’s saying fuck that shit is correct.