And all that data, by definition, excludes: “AI” is not built on “all of humankind’s knowledge” but on whatever a mostly Western view of the world considers relevant. Cultures outside that framework, including those that keep history and knowledge in more oral forms, are not represented. Even if those groups are not actively excluded (which, again, they very often are), there are huge populations who simply are not seen by the data and do not get a say in how they are represented. Or, if they are represented, it is only as problems: think about unsheltered people, for example.

The right loves those patterns because they confirm their prejudices: Ask an image generator for a picture of two people kissing and you most often get a heterosexual couple, often white. Because that’s what the training data looks like. That makes “AI” perfect for creating the form of idealized, fictional “past” that fascists love to allude to (“make America great again“), a past that never existed but that needs to be saved or restored (we’ll get back to that later).

We can pretty easily determine the short-term purpose of “AI”: the destruction of labor power.

This dismantling happens on multiple levels by attacking the foundation of what allows those forms of organization to take place.

The first level is very individualistic: by pointing at “the AI” that could replace a worker, that worker is pressured into working harder and not asking for raises or other improvements to their working conditions. Even though “AI” cannot do your job, the threat itself is useful to employers to undermine your individual power, your feeling of being valuable as a worker.

The second level is about attacking the idea of solidarity and connection: because “AI” will not replace you (again, “AI” cannot replace the absolute majority of workers!) “but someone using AI will”. This sets up a kind of Thunderdome in which we all have to fight each other for scraps/jobs. The framing implies that you should not unionize and connect with your fellow workers but should instead see them as your enemies, as the people who will take your job and your ability to provide for your family. We know this dynamic: it is exactly how the right presents migration as an “attack”. It also normalizes violence, again turning all of existence into an endless fight against one another (unless you are one of the few people in power, of course).

The third level is somewhat more devious, because it makes us do that dissolving of social bonds ourselves. An example: if I use an “AI” to generate an illustration instead of asking a designer, I am saying that while my skills and labor have value, the designer’s do not. This implicitly cuts my ability to form connections of solidarity with designers, whose work and livelihood I have declared irrelevant. It makes me put myself over my fellow workers, workers who face the same struggles as me, who are my comrades. But no more.

  • CriticalMiss@lemmy.world · 6 hours ago

    I dunno, claiming that fascism was always in our tech is an incorrect statement. The Amiga 1200 wanted to hurt nobody.

    I love the Amiga 1200

    • naevaTheRat@lemmy.dbzer0.com (OP) · 5 hours ago

      In his influential 1980 paper “Do Artifacts Have Politics?” Langdon Winner argues that this view of “neutral technology” does not hold up: the politics of specific artifacts come not just from who uses the technology and for what purpose, but from built-in politics that stem from the political views and goals of the people building the technology, as well as its internal structure.

      He shows this by pointing at how certain bridges were built to be racist: when the civil rights movement in the US won Black kids the right to attend the often better schools that used to accept only white kids, politicians in some places planned roads and bridges so that the buses meant to take Black kids to those schools could not pass. This was not oversight but design intent. The racism is built into the structure of the artifact itself.

      Winner also argues that certain technologies imply certain political or social structures in order to exist: the nuclear bomb implies not just scientists who can build it and a state that considers that form of destruction a valid way of acting in the world, but also a security state capable of controlling and defending it. You simply cannot build a nuclear bomb without those structures; they are implied, if not required, enforced by the artifact itself.

      Winner’s work does not argue that the embedded politics of an artifact are always absolute: we know of many potentially oppressive technologies that have been taken up by artists and activists and turned against their original use. But that is always an uphill battle: surveillance, for example, will always lean towards a more forceful, rigid, less free understanding of government. You can use (counter-)surveillance, of course, but you always have to be careful not to reproduce the logic you are trying to criticize or attack.

      Nobody is claiming all technology is fascist, but all of it embodies some politics, and some of that politics is fascist.

  • felixwhynot@lemmy.world · 12 hours ago

    I read somewhere that “building community is inconvenient” and I think about that a lot these days. Doing the inconvenient thing (i.e., asking your designer friend to design something) builds community, but it is arguably less convenient than having a bot do it for you. The benefit of the inconvenience, though, is the community.

    • yucandu@lemmy.world · 7 hours ago

      I am autistic, and expecting me to prefer asking people to do things and building community, instead of being fine with solitariness and the tools that help me achieve it, feels a bit ableist tbh.

      • Tiresia@slrpnk.net · 6 hours ago

        You chose to write a comment to another human being about how you would like to be treated. Imagine if, through doing that, the people you still have to interact with in your daily life learn to actually treat you that way - that too is community.

        A neighbor getting groceries for you, so you don’t have to go to an overstimulating store and so they can arrive at your door within a one-minute window from 10:00 to 10:01 on Tuesday and Friday, is community. Someone tailoring your shirts so you no longer notice any friction against your skin is community.

        The tools that capitalism can provide are depersonalized, poorly fitting, and often malicious, but a community that works with you can come to understand you and learn to fit your needs better than anything you’re likely to be able to buy.

  • Tim_Bisley@piefed.social · 9 hours ago

    And all that data, by definition, excludes: “AI” is not built on “all of humankind’s knowledge” but on whatever a mostly Western view of the world considers relevant.

    It’s based on the specific Western worldview the techbros have and want to project, while drinking whatever flavor of kool-aid is seen as edgy at the moment.

  • yucandu@lemmy.world · 7 hours ago

    The right loves those patterns because they confirm their prejudices: Ask an image generator for a picture of two people kissing and you most often get a heterosexual couple, often white. Because that’s what the training data looks like. That makes “AI” perfect for creating the form of idealized, fictional “past” that fascists love to allude to (“make America great again“), a past that never existed but that needs to be saved or restored (we’ll get back to that later).

    It depends which AI you ask, no? Ask AI made in a mostly white country, you’re going to get pictures of mostly white people. Ask AI made in a mostly Asian country, it’ll be mostly Asian people.

    This same dilemma came up with search engines. Everyone complained that “doctor” in Google Image search showed mostly white doctors. Except if you made the same search on Weibo, it showed mostly Asian doctors.

    Same goes for heterosexual couples, it’s just what’s more common.

    I’m starting to get the impression a lot of this anti-AI push is coming from Russia and China, the way it’s being framed as an “anti-West” thing. Because the chips for it are coming from the West. It’s just like vaccines, and how a lot of the anti-vaccine propaganda was coming from Russia and China. Because they need to make the thing that’s saving the world look bad.

    I’m not saying AI is saving the world. But a “fascist artifact”? Come on. That’s classic Russian propaganda hyperbole.

      • yucandu@lemmy.world · 7 hours ago

        I did read the whole article; it feels rambling, vague, and incoherent, and it’s full of these classic talking points.

        Like congratulations, in talking about AI, you somehow managed to shoehorn in - unhoused people, MAGA, labour value, worker solidarity, migration, the normalization of violence… you write like a tweaker.

        • naevaTheRat@lemmy.dbzer0.com (OP) · 6 hours ago

          Well, on your whole “it’s just what’s more common” point: this entrenching of the “norm” as what is true is specifically one of the criticisms raised, so it seems strange to offer it as a defense. Further, much of the training data for all models is in English and reflects global-north values and perspectives, particularly European and American, because this is what has been digitised.

          He also writes about the issues of centralisation and control, disinformation, lack of consent, and the undermining of government accountability and the devaluing of institutions and transparency. It seems weird to dismiss all of this with “just use a distillation of chatgpt run by a Chinese company”.