The biggest selling point of Java is that you can easily find programmers who know it. They'll need some training to write HFT-style code, but you'll still pay them less than C++ prima donnas and they'll churn out reasonably robust code in good time.
In theory the water stays clean and can be reused. But I assume these cheapskates will go for evaporative cooling every time? Then yeah, we need laws against that.
I guess when you're dissipating upwards of a gigawatt of power at a single site, boiling water starts to look attractive. It's a pretty impressive curveball; I definitely would never have predicted "an evil corporation boils off all the local drinking water" to be a legitimate concern. I'm pretty sure that's too absurd a plot point even for a children's movie.
I keep hearing people claiming that water is just as much an issue as energy for operating these DCs, but that just doesn't make any sense to me. However, I haven't had to step inside a DC for almost two decades.
Continuously dissipating 1 gigawatt of energy by boiling room temperature water would require approximately 1.38 million liters of water per hour.
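A quick sanity check on that number (a sketch, assuming ~20 °C feed water and textbook values for water's specific heat and latent heat of vaporization):

```python
# Rough check: water needed to carry away 1 GW by boiling room-temperature water.
POWER_W = 1e9               # 1 gigawatt of heat
SPECIFIC_HEAT = 4186.0      # J/(kg*K), liquid water
LATENT_HEAT = 2.257e6       # J/kg, vaporization at 100 C

# Heat each kg from 20 C to 100 C, then vaporize it.
energy_per_kg = SPECIFIC_HEAT * (100 - 20) + LATENT_HEAT  # ~2.59 MJ/kg

kg_per_second = POWER_W / energy_per_kg
liters_per_hour = kg_per_second * 3600  # 1 kg of water ~ 1 liter

print(f"{liters_per_hour / 1e6:.2f} million liters per hour")
```

That lands within a percent or so of the figure quoted above; the exact value shifts slightly with the assumed feed temperature.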
Seems like the environmentally responsible thing to do would be to build the datacenter near the coast and use the waste heat to desalinate water. Or at least dissipate the heat into the ocean rather than boiling off an inland freshwater supply.
Setting aside a small patch of ocean for the task seems like a much better plan than the current practice. Provided you dump it in a place with a decent current any adversely affected area should be exceedingly small.
Keep in mind that the sun is constantly dumping energy on us. Absorption averaged across the entire earth is ~200 W/m^2. Assuming I didn't misplace some zeros somewhere, a gigawatt corresponds to ~5 km^2 of ocean surface. And that's averaged over the full day-night cycle. Penetration falls off exponentially, so 75% of that only ever makes it ~10 m down.
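The arithmetic above, spelled out (using the ~200 W/m^2 average absorption figure from the comment):

```python
# Ocean surface area whose average absorbed solar flux matches 1 GW of waste heat.
POWER_W = 1e9
ABSORBED_FLUX = 200.0  # W/m^2, rough day-night global average

area_m2 = POWER_W / ABSORBED_FLUX
area_km2 = area_m2 / 1e6

print(f"{area_km2:.0f} km^2")  # 5
```

So the datacenter's heat dump is comparable to the sunlight already falling on a patch of water about 2 km on a side.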
I think the takeaway here is the utterly incomprehensible scale of the ocean.
This idea is probably more worth pursuing in Middle Eastern countries, given that 90% of their water comes from desalination plants. But given the recent war in the region, I don't really expect datacenters to be built there for quite a long time.
Yeah, they go evaporative because it's cheaper. That's an easy fix: just make it more expensive.
People talk about the water usage like it's an intrinsic feature of datacenters; it's not. You just make water more expensive so they're forced to conserve. But you wait until you have to, so you don't push them to build elsewhere.
You left out overthrowing governments with customized targeted propaganda, jamming citizen discussion with noise, artificially creating and nourishing contrarian cells in democratic societies. The machines will now be programming people.
Cool kids had C64s. I had every other boring, flawed model: Tandy MC-10, TI-99, ZX80 (not even an 81!), and some other CoCo with chiclet keys. Now I know the 6809 is actually pretty interesting, but back then, without dedicated video or graphics chips, there wasn't much you could do as a 12-year-old.
Weirdly the most fun I had was with the BASIC programmable SHARP PC-xxxx line. I still have my PC-1350 somewhere.
That 6809 bewitched my middle school self. Having already learnt Z80 assembly language, the 6809 just looked so much more elegant. It had index registers that were actually useful! It had position-independent code! It could do multiplication in one instruction! So when faced with choosing a CoCo or a C64... of course I chose the machine with the MUL instruction. Naturally, within mere months, that horrid 32x16 black-on-green display forced the harsh realization that a computer is more than just the CPU, and that the support chips could actually be far more interesting. Who cares about a multiply instruction when you could have sprites and 3-voice sound?
My worst hardware choice (later) was skimping on a monochrome VGA screen to afford a 24-pin Fujitsu dot matrix printer instead of the 9-pin Epson. It forged the person I am today.
That's an "if you have to ask, it's not for you" question. Also, the noise these things make... You better have a separate garage. The constraints of a data center are really far from those of a homelab.
Rust compilation is actually very fast if you keep things Java-esque: no macros, no monomorphization (make every trait a `dyn` trait object). Obviously most of the ecosystem isn't built that way, leaning more on the metaprogramming style because the language makes it so convenient.
Every community has these assholes. In my experience, the Rust user base is nothing but polite, understanding and pragmatic. There's no smugness, explicit or implied. The Rust lore is just a joke that's getting less funny every day someone takes it seriously.
Technically, I prefer Rust but I understand people who find Go better suited for their work. For example, devops is mostly Go nowadays and that makes more sense than Rust (or Python).
But I've never, ever seen toxic behavior from the Go community. For Rust, it's the norm, sadly.
Frankly, I see _a lot_ more uninformed attacks on Rust than actual Rust evangelism / snobbery. And most of these anti-Rust comments reek of personal insecurities and low-effort trolling. Like saying that Java is slow. It's getting old real quick.