And all that data, by definition, excludes: “AI” is not built on “all of humankind’s knowledge” but on whatever a mostly Western view of the world deems relevant. Cultures outside that framework, cultures that might even rely on more oral forms of keeping history and knowledge, are not represented. Even if those groups are not actively excluded (which, again, they very often are), there are huge populations who simply are not seen by the data and do not get a say in how they are represented. Or, if they are represented, it is only as problems: think about unsheltered people, for example.
The right loves those patterns because they confirm their prejudices: Ask an image generator for a picture of two people kissing and you most often get a heterosexual couple, often white. Because that’s what the training data looks like. That makes “AI” perfect for creating the form of idealized, fictional “past” that fascists love to allude to (“make America great again”), a past that never existed but that needs to be saved or restored (we’ll get back to that later).
…
we can pretty easily determine the short-term purpose of “AI”: The destruction of labor power.
This dismantling happens on multiple levels by attacking the foundation of what allows those forms of organization to take place.
The first level is very individualistic: by pointing at “the AI” that can supposedly replace a worker, that worker is pressured into working harder and into not asking for raises or any other improvements to their working conditions. Even though “AI” cannot do your job, the threat itself is useful to employers: it undermines your individual power, your feeling of being valuable as a worker.
The second level is about attacking the idea of solidarity and connection: because “AI” will not replace you (again, “AI” cannot replace the absolute majority of workers!) “but someone using AI will”. This sets up a kind of Thunderdome in which we all have to fight each other for scraps/jobs. The framing implies that you should not unionize and connect with your fellow workers but should see them as your enemies, as the people who will take your job and your ability to provide for your family. We know this dynamic; it’s exactly how the right presents migration as an “attack”. It also normalizes violence, again turning all of existence into an endless fight against one another (unless you are one of the few people in power, of course).
The third level is somewhat more devious, because it makes us do that dissolving of social bonds ourselves. An example: if I use an “AI” to generate an illustration instead of asking a designer, I am saying that while my skills and labor have value, the designer’s do not. That undercuts my ability to form bonds of solidarity with designers, whose work and livelihood I have implicitly declared irrelevant. It makes me put myself over my fellow workers, workers who face the same struggles as me, who are my comrades. But no more.
I dunno, claiming that fascism was always in our tech is an incorrect statement. The Amiga 1200 wanted to hurt nobody.
I love the Amiga 1200
The right loves those patterns because they confirm their prejudices: Ask an image generator for a picture of two people kissing and you most often get a heterosexual couple, often white. Because that’s what the training data looks like. That makes “AI” perfect for creating the form of idealized, fictional “past” that fascists love to allude to (“make America great again”), a past that never existed but that needs to be saved or restored (we’ll get back to that later).
It depends which AI you ask, no? Ask an AI made in a mostly white country and you’re going to get pictures of mostly white people. Ask an AI made in a mostly Asian country and it’ll be mostly Asian people.
The same dilemma came up with search engines. Everyone complained that “doctor” in Google Image search showed mostly white doctors. Except if you made the same search on Weibo, it showed mostly Asian doctors.
Same goes for heterosexual couples, it’s just what’s more common.
I’m starting to get the impression a lot of this anti-AI push is coming from Russia and China, the way it’s being framed as an “anti-West” thing. Because the chips for it are coming from the West. It’s just like vaccines, and how a lot of the anti-vaccine propaganda was coming from Russia and China. Because they need to make the thing that’s saving the world look bad.
I’m not saying AI is saving the world. But a “fascist artifact”? Come on. That’s classic Russian propaganda hyperbole.
Read the whole article, don’t just react to a single bit I highlighted.
I did read the whole article. It feels rambling, vague, and incoherent, and full of these classic talking points.
Like congratulations, in talking about AI, you somehow managed to shoehorn in - unhoused people, MAGA, labour value, worker solidarity, migration, the normalization of violence… you write like a tweaker.
Well, on your whole “it’s just what’s more common” point: this sort of entrenching of the “norm” as what is true is specifically one of the criticisms raised, so it seems strange to offer it as a defense. Further, much of the training data for all models is in English and reflects global-north values and perspectives, particularly European and American, because this is what has been digitised.
He also writes about the issues with centralisation and control, disinformation, lack of consent, and the undermining of government accountability and devaluing of institutions and transparency. It seems weird to dismiss all of this as “just use a distillation of chatgpt run by a chinese company”.
And all that data, by definition, excludes: “AI” is not built on “all of humankind’s knowledge” but on whatever a mostly Western view of the world deems relevant.
It’s based on the specific Western worldview the techbros have, and want to project, while drinking whatever flavor of kool-aid is seen as edgy at the moment.
I read somewhere that “building community is inconvenient” and I think about that a lot these days. Doing the inconvenient thing (i.e., asking your designer friend to design something) builds community, but it is arguably less convenient than having a bot do it for you. The benefit of the inconvenience, though, is the community.
I am autistic, and expecting me to prefer asking people to do things and building community, instead of being fine with solitariness and the tools that help me achieve it, feels a bit ableist tbh.
I literally said that building community is inconvenient (for everyone).
You chose to write a comment to another human being about how you would like to be treated. Imagine if, through doing that, the people you still have to interact with in your daily life learn to actually treat you that way - that too is community.
A neighbor getting groceries for you so you don’t have to go to an overstimulating store, and so they arrive at your door within a one-minute window from 10:00 to 10:01 on Tuesday and Friday, is community. Someone tailoring your shirts so you don’t notice any friction against your skin anymore is community.
The tools that capitalism can provide are depersonalized, poorly fitting, and often malicious, but a community that works with you can come to understand you and learn to fit your needs better than anything you’re likely to be able to buy.