• 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: December 8th, 2024

  • They’re pretty much just hating to hate, or basing themselves on very outdated information. ‘Missing critical features’ is a joke: if a feature actually were critical it would’ve been implemented already (plus Firefox is very extensible, with many plugins existing and forks adding specific features). If they actually had a point, they would’ve given at least a single example.

    ‘Weirdly implementing some web standards’ did apply a bit until a few years ago, when all the big browser engine developers got together and pinned down the standards. If something still breaks, that probably means the website used some out-of-spec workaround that only works in Chrome. Some things do behave differently between Firefox and Chrome (an example of my own: file input fields that accept multiple media types, e.g. both video and image, are handled differently, at least in the mobile apps; see the sketch at the end of this comment). Yet again, if they had a point, an example would’ve been great.

    Weird user agent styles..? I’m honestly just confused by that one.
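
    To make the file-input example concrete, here is a minimal sketch (TypeScript/DOM, not taken from any specific site) of a single file field that accepts both images and videos; how the native picker presents that combined list is exactly the part that can differ between Firefox and Chrome, especially in the mobile apps.

    ```typescript
    // Minimal sketch: one file input that accepts both images and videos.
    // How the native file picker interprets the combined accept list is
    // where Firefox and Chrome (especially the mobile apps) can differ.
    const input = document.createElement("input");
    input.type = "file";
    input.accept = "image/*,video/*"; // two media types on one field
    input.multiple = true;
    document.body.appendChild(input);

    input.addEventListener("change", () => {
      for (const file of Array.from(input.files ?? [])) {
        console.log(file.name, file.type); // MIME type as reported by the browser
      }
    });
    ```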



  • The problem is that it’s impossible to carve out just this one application. There don’t need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically just a naked adult, but smaller. (Of course I’m simplifying a bit.)

    Even going further and removing all nudity from the dataset has been tried… and what was found is that removing such a significant source of detailed pictures showing a lot of skin decreased the quality of any generated image that has to do with anatomy.

    The solution is not a simple ‘remove this from the training data’. (Not to mention that existing models able to generate these kinds of pictures are impossible to globally disable, even if you could affect future ones.)

    As to what could actually be done: applying and evolving server-side scanning for such pictures (not on people’s phones, though [looking at you here, EU]). That matters a lot in this case: the images got shared on a very big social app, not some fringe privacy-protecting one (on that end there is little to do short of eliminating all privacy). A rough sketch of the matching core behind such scanning is at the end of this comment.

    Regulating this at the image generation level could also be rather effective. There aren’t that many 13-year-olds savvy enough to set up a local model to generate these themselves, so further checks at the services where the images are generated would also help to some degree (see the second sketch below). Local generation is getting easier to set up by the day, though, so while this should be implemented it won’t do everything.

    In conclusion: it’s very hard to eliminate this, but ways exist to make it harder.
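
    On the scanning point, a hedged sketch: what big platforms deploy today is largely perceptual-hash matching against databases of known material (PhotoDNA-style); catching newly generated images needs classifiers on top of that, but the matching core looks roughly like the following. The hash values and threshold here are made up for illustration.

    ```typescript
    // Rough sketch of perceptual-hash matching as used for known abuse imagery.
    // The database entries and the distance threshold are placeholders; real
    // systems (e.g. PhotoDNA) use their own hash functions and thresholds.
    function hammingDistance(a: bigint, b: bigint): number {
      let x = a ^ b;
      let bits = 0;
      while (x > 0n) {
        bits += Number(x & 1n);
        x >>= 1n;
      }
      return bits;
    }

    // Placeholder database of hashes of known images.
    const knownHashes: bigint[] = [0x9f3b21c4d5e6a7b8n];

    // An upload is flagged if its hash is close enough to any known hash.
    function flagsUpload(uploadHash: bigint, threshold = 8): boolean {
      return knownHashes.some((h) => hammingDistance(uploadHash, h) <= threshold);
    }
    ```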
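
    And on the generation-side checks, another purely hypothetical sketch: a hosted service can refuse obviously abusive prompts before any image is produced. The term lists and the generateImage stub below are placeholders I made up, not any real service’s policy or API.

    ```typescript
    // Hypothetical generation-side check: refuse prompts that combine
    // minor-related and explicit terms. Term lists and the backend call are
    // placeholders; real services would use far more robust classifiers.
    const MINOR_TERMS = ["child", "kid", "teen", "minor", "13 year old"];
    const EXPLICIT_TERMS = ["nude", "naked", "undressed", "nsfw"];

    function isDisallowed(prompt: string): boolean {
      const p = prompt.toLowerCase();
      const mentionsMinor = MINOR_TERMS.some((t) => p.includes(t));
      const mentionsExplicit = EXPLICIT_TERMS.some((t) => p.includes(t));
      return mentionsMinor && mentionsExplicit;
    }

    async function generateImage(prompt: string): Promise<Uint8Array> {
      // Stand-in for whatever model backend a service actually runs.
      return new Uint8Array();
    }

    async function guardedGenerate(prompt: string): Promise<Uint8Array> {
      if (isDisallowed(prompt)) {
        throw new Error("Prompt refused by content policy");
      }
      return generateImage(prompt);
    }
    ```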