“no human” but machines would, since they are unaffected by nuclear winter and radiation.
And they don’t have cognition at all. They do not, and cannot, think like we do. Maybe some day we will learn to make real AI, but these LLMs are not it. It’s a cheap-trick intelligence.
Radiation absolutely fucks electronic components
I think the EMP is pretty limited to the blast zone in frying electronics. The fallout from a weapon spreads around the world, circling on the winds countless times and dropping dust everywhere, but the EMP is localized to roughly the area of physical destruction. Not sure exactly, though.
Neutron bombs, I’m not entirely sure how the physics works, but they produce relatively little blast and physical destruction and mostly just kill everything with radiation.
I repeat, radiation absolutely fucks electronic components. I am not talking about an EMP, I am talking about radiation.
Oh, how far from the blast does it reach, and how does it mess them up, do you know? I should know that, I guess; I’ve just heard about the EMP, and I’m not sure how a neutron bomb would affect electronics either.
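For a rough sense of how fast the radiation dose falls off with distance: neutron intensity drops with inverse-square geometry, and air absorption adds an exponential attenuation on top. The numbers below (total neutron output `N`, mean free path `mfp`) are purely illustrative assumptions, not weapon data — this is just a back-of-envelope sketch of the shape of the falloff.

```python
import math

def fluence(n_source, r, mfp=250.0):
    """Neutrons per m^2 at distance r (meters): inverse-square spreading
    over a sphere of radius r, times exponential attenuation in air.
    mfp (mean free path in air, meters) is an assumed illustrative value."""
    return n_source / (4 * math.pi * r**2) * math.exp(-r / mfp)

N = 1e24  # assumed total neutron output, illustrative only
for r in (200, 500, 1000, 2000):
    print(f"{r:5d} m : {fluence(N, r):.3e} neutrons/m^2")
```

The exponential term dominates quickly: doubling the distance past a few mean free paths cuts the fluence by far more than the factor of four you would expect from geometry alone.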
The electromagnetic pulse caused by a nuke would pop resistors too. AI would more likely use biological means to get rid of us.
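To see why a pulse can destroy components without any blast: an EMP couples into wiring roughly as field strength times the conductor's effective length. The peak field value below is an often-quoted figure for a high-altitude EMP's fast E1 component, and the coupling length is an assumption; this is an order-of-magnitude sketch, not an actual coupling model.

```python
# Rough order-of-magnitude estimate of EMP coupling into a conductor.
# Assumed numbers, not a field simulation.
E_peak = 50e3        # V/m, often-quoted peak E1 field for a high-altitude EMP (assumption)
cable_length = 1.0   # m, effective coupling length of an exposed cable or trace (assumption)

v_induced = E_peak * cable_length  # volts appearing across the conductor
print(f"~{v_induced / 1e3:.0f} kV induced on a {cable_length:.0f} m conductor")
```

Tens of kilovolts across a part rated for a few volts is why unshielded electronics near the pulse fail, while the same pulse does nothing to buildings.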
Assuming AI would care about itself and not just “solving the problem”.
Yeah, these doom scenarios require cascading assumptions and no real answer, except maybe “don’t”.