

Kinda the same response, to be honest: the chips are fast. In my contrived example, the MacBook Neo runs on a binned iPhone 16 chip with a broken core, yet it's fine for most people.
When I was using computers in the late '90s, the idea of a 10-year-old computer was mental. My friend would be running Windows 98 on his Pentium II, and if I'd had a 10-year-old machine, it would have been a machine from the goddam '80s — it couldn't run anything. The difference was night and day. Now I use a desktop PC that I built 9 years ago, an Intel i5 with an Nvidia 1080 Ti, and it honestly runs just fine for just about everything. It wasn't even anywhere near the top of the range back then; apart from the graphics card, it was practically budget.
We're alright. Computers are so fast now. Maybe this is my hot take of the century, but the latest and greatest is always expensive, and performance per dollar has honestly almost never been better, apart from the recent RAM price spikes.
I wouldn’t sweat it so much.
Well, two things. Three actually. First of all, no need to yell.
Secondly, the article didn't make that point very well — it mentioned the Mac mini (which IS a computer) and smartphones, both of which my MacBook Neo processor is a good analogy for. It also talked about M.2 SSDs and the RTX 5070 GPU in laptops. You can't come in here and pretend you didn't talk about computers.
But thirdly, even if that is your point, my response was mostly an example. We are not floundering for chips, and you didn't mention embedded processors, or anything else that "non-computer" devices use, anywhere in the article — not even once. My more general point still stands: I don't see any evidence that this will have the effect you claim it will.