• 0 Posts
  • 21 Comments
Joined 2 years ago
Cake day: April 27th, 2024

  • Technology moves on. Any meaningfully upgradable desktop will blow away the highest-spec iPads of 2026 once the owner finds it necessary to upgrade. And upgrading a PC doesn’t mean replacing the entire thing just because you want a new GPU; that would be like bulldozing your house because you don’t like the current wall paint colour.

    Sure, if the smallest hardware bottleneck will drive you clinically insane, then replace your whole rig if that’s viable for you, but most of the time, for most workloads, small bottlenecks don’t mean much, so upgrading individual components when it feels right is generally the better choice.

    Also, the highest-spec iPads only have 16 GB of unified RAM, and sure, with compression and a single page table shared between all the processors, it’s impressive, but realistically, what do you need all that insane memory architecture, plus the M5 chipset, for in a mobile workflow? And how are you supposed to replace the storage/RAM/processor when it begins to feel slow, after defying the trillion-dollar company by putting a desktop OS on it?

    iPads and workstations/desktops aren’t comparable; they’re entirely different classes of devices. Frankly, if you manage to put a desktop OS on an iPad, I’d like to see you try using it for gaming, productivity and other workloads for at least a decade. And if you can’t? Well, you can’t upgrade it like you can a real workstation/desktop.





  • I used to run Fedora on my first school ThinkPad (I received another when my brother graduated, which became my second school laptop). Then, after I got two considerably more powerful laptops for free, I switched the ThinkPads out of my setup, and to this day those two newer machines unfortunately still run Windows 11 (I haven’t gotten around to it, and they’re both Nvidia MX business machines, so that’s not awfully ideal). I converted the second ThinkPad into a FydeOS machine (basically ChromeOS with local accounts) to give to my mother as her own laptop, and then put ChromeOS Flex on a third ThinkPad I bought off a friend for $10, which remains my occasional-browsing-but-also-throw-around laptop with pretty friggin’ good battery life.

    I do plan to switch at least one of the MX laptops to Linux, for which I’m considering Pop!_OS because of its Nvidia driver support and GPU switching options, but I’ve still gotta back some stuff up first.



  • I hate it when they give ambiguous testing figures like “getting run over by a 15.6-tonne truck”; it tells you nothing because it isn’t specific. Do they mean a wheel pressing directly onto the chip? Or is it just getting quickly run over? Are they doing burnouts on the chip? Is the chip stuck down on the presumably regular road surface, or is it just tossed there?

    So many things could happen: the chip gets scratched and becomes unusable, the chip survives because it was stuck to the road, the chip survives or dies because the truck went too slow or too fast, etc.

    I haven’t read the article yet tho, imma read it now to see if there’s any context to this.

    Edit: the context is fuck all. They just threw the statement in, seemingly as dramatisation. Maybe they were implying that the chip would survive flawlessly while implanted in a person’s arm if that person were to get violently killed by a 15.6-tonne truck going 300 km/h. Who knows.


  • The worst thing about ‘smart TVs’ is that they advertise so many ‘cool’ features, but most of them have slower processors and less RAM than my 2019 budget Galaxy A-series phone, and that’s very telling. You can’t even use their dog-shit built-in web browsers, since everything becomes outdated after like a week, and the performance is so bad that the expensive Hisense TV my dad bought back in 2020 can’t even load Google properly.



  • Yeah nah, the AI shit he himself, as well as the industry as a whole, is peddling isn’t as useful as they say it is, even with the “revolutionary new Gemini 3 models”, which are just slop convo generators. The thing is, when AI is thrust into a person’s line of sight, the label alone doesn’t make them impulsively rework their entire workflow just to get the same quality of output with an AI in the loop; most people brush it off as system bullshit they don’t need. And even if it could help with some things in some capacity, it’s marketed as a “feed everything into me and use me for everything” machine, when honestly smaller, more accurate, specialised models should be made instead, even if their purpose is a bit dubious too.

    And sure, it might seem contradictory, but I do use an AI; it’s pretty much just for brainstorming and conversational shit, refining my ideas simply by articulating them. What I really dislike is how my computers are full of AI services for no reason, with a few of my laptops having Copilot baked in, another with Gemini baked in, and what have you.

    I suppose the only good thing to come out of LLMs would be the bump from 8 to 16 GB of RAM on many machines, but then again, that had to drop back down because of said AI companies.




  • From my perspective, he is probably referring to ChromeOS’s Crostini setup, which uses crosvm to run a VM hosting a Debian container by default (other distros work too). Since Chromebooks are popular in schools, predominantly in the USA but globally as well, students are likely to try to squeeze more functionality out of their devices, and hence experiment with Linux, get used to it, and possibly install it on other devices (or on that same Chromebook through the MrChromebox firmware) down the line. There’s a rough sketch of what that container looks like from the inside at the end of this comment.

    Edit: alternatively, he could just be referring to flooding the market with cheap Linux laptops for specific purposes like education or standard consumer workflows, the same way Chromebooks gained that footing in the market.
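
    Edit 2: for anyone curious about the “it’s basically just Debian inside” part, here’s a minimal Python sketch of how a script could guess it’s running inside the default Crostini container. It assumes a stock setup, where the container is Debian-based and its hostname is “penguin”; those are just the defaults, so treat it as illustrative rather than authoritative.

    ```python
    # Rough check for whether this script is running inside ChromeOS's default
    # Crostini container. Assumes the stock setup: a Debian-based container whose
    # hostname is "penguin". Both are defaults, not guarantees.
    import socket
    from pathlib import Path

    def os_release_id() -> str:
        """Return the ID field from /etc/os-release, or '' if it can't be read."""
        try:
            for line in Path("/etc/os-release").read_text().splitlines():
                if line.startswith("ID="):
                    return line.partition("=")[2].strip().strip('"')
        except OSError:
            pass
        return ""

    if __name__ == "__main__":
        distro = os_release_id()         # "debian" on a stock Crostini container
        hostname = socket.gethostname()  # "penguin" is the Crostini default
        print(f"distro={distro!r} hostname={hostname!r} "
              f"likely_crostini={distro == 'debian' and hostname == 'penguin'}")
    ```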