Comment by zppln 16 hours ago

I'm a little bit curious about this. Where do all the hardware from the big tech giants usually go once they've upgraded?

q3k 11 hours ago

In-house hyperscaler stuff gets shredded, after every single piece of flash storage gets first drilled through and every hard drive gets bent by a hydraulic press. Then it goes into the usual e-waste recycling stream (i.e. gets sent to poor countries where precious metals get extracted by people with a halved life expectancy).

Off-the-shelf enterprise gear has a chance to get a second life through remarketing channels, but much of it also gets shredded due to dumb corporate policies. There are stories of some companies refusing to offload a massive decom onto the second hand market as it would actually cause a crash. :)

It's a very efficient system, you see.

  • oblio 8 hours ago

    Similar to corporate laptops: due to stupid policies at most BigCos, you can't really buy or otherwise get a used laptop, even as that laptop's former corporate user.

    Super environmentally friendly.

trollbridge 16 hours ago

I use (relatively) ancient servers (5-10 years in age) because their performance is completely adequate; they just use slightly more power. As a plus, it's easy to buy spare parts, and they run on DDR3, so I'm not paying the current "RAM tax". I generally get such a server, max out its RAM, max out its CPUs, and put it to work.

  • taneq 11 hours ago

    Same, the bang for buck on a 5yo server is insane. I got an old Dell a year ago (to replace our 15yo one that finally died) and it was $1200 AUD for a maxed out recently-retired server with 72TB of hard drives and something like 292GB of RAM.

    • PunchyHamster 11 hours ago

      Just not too old. It's easy to get into "power usage makes it not worth it" territory for any use case when it runs 24/7.

      • monster_truck 11 hours ago

        Seriously. 24/7 adds up faster than most realize!

        The idle wattage per module has shrunk from 2.5-3W down to 1-1.2W between DDR3 & DDR5. Assuming a 1.3W difference per stick (so 10.4W for 8 sticks, over 8760 hours), a DDR3 machine with 8 sticks would increase your yearly power consumption by almost 1% (assuming avg 10,500kWh/yr household).

        That's only a couple dollars in most cases, but the gap only grows in every other respect. When I upgraded from Zen 2 to Zen 3 it was able to complete the same workload just as fast with half as many cores while pulling over 100W less. Sustained 100% utilization barely even heats a room effectively anymore!
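        The back-of-the-envelope math above can be sketched as a few lines of Python. The wattage gap, stick count, and household baseline are the commenter's own estimates, not measured values:

        ```python
        # Rough sketch of the idle-RAM power arithmetic above.
        # All figures are the commenter's estimates, not measurements.
        watts_per_stick = 1.3    # assumed DDR3 vs DDR5 idle gap per module
        sticks = 8
        hours_per_year = 8760
        household_kwh = 10500    # assumed average yearly household usage

        # Extra energy drawn over a year, in kWh
        extra_kwh = watts_per_stick * sticks * hours_per_year / 1000

        # As a share of the assumed household's yearly consumption
        share = extra_kwh / household_kwh * 100

        print(f"{extra_kwh:.1f} kWh/yr extra, ~{share:.2f}% of household usage")
        # -> 91.1 kWh/yr extra, ~0.87% of household usage
        ```

        So "almost 1%" checks out: about 91 kWh/yr, or roughly 0.87% of the assumed household baseline.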

      • dpe82 11 hours ago

        Maybe? The price difference on newer hardware can buy a lot of electricity, and if you aren't running stuff at 100% all the time the calculation changes again. Idle power draw on a brand new server isn't significantly different from one that's 5 years old.

      • taneq 4 hours ago

        To be clear, this server is very lightly loaded, it's just running our internal network services (file server, VPN/DNS, various web apps, SVN etc.) so it's not like we're flogging a room full of GeForce 1080Ti cards instead of buying a new 4090Ti or whatever. Also it's at work so it doesn't impact the home power bill. :D

wmf 16 hours ago

Some is sold on the used market; some is destroyed. There are plenty of used V100s and A100s available now, for example.