

Why can’t these data centers just recirculate most of this water? Obviously there is some loss, which requires more water over time, but why do they require a constant supply of fresh water?


On the topic of daylight saving time, I used to prefer that we stay on the daylight saving side of the clock. But honestly, at this point I am fine with staying on standard time if that means no more switching.
Otherwise, one thing I wish were done and over with by now is physical junk mail: literal paper showing up in my mailbox that I now have to dispose of, something I never asked for and will never look at. And I can’t help but think that happens to millions of people in my country every single day, all so that a negligible number of people might even glance at it. I can’t imagine how many trees are lost each year for something with zero usefulness.
what am i going to do with 300 million potatoes if i can’t build clocks for 3 days?
awesome good to know. much better work than lushsux…
If anybody is wondering who the artist is, it appears to me to be @lushsux — it looks like his style. I say “appears” because I am not going to Instagram to verify, lol


We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains.
100% agree. I was definitely thinking inside the box, inside the brain, when I went down that path.
A better way to explain my thinking is that LLMs cannot operate like a human brain, in that they fundamentally lack almost all of its qualities. They are good but not perfect at logic, just like humans, but they completely lack creativity, intuition, imagination, emotion, and common sense — qualities that AGI would require.
Without humans being able to understand how our brains produce those qualities, it will be very hard to achieve AGI. But again, I was wrong to think we need to translate code from our brains to achieve AGI.


No, of course. I meant that if at least one party in the UK government was using Signal, with end-to-end encryption, they are presumably no longer using it, because they are now considered ‘hostile actors’.


If I remember correctly, a few weeks ago a government party had their Signal chat leaked. Those people have since stopped using Signal, right?


From reading all the comments from the community, it’s amazing (yet not surprising) that all these managers have fallen for the marketing of these LLMs. LLMs have gotten people from all levels of society to just accept the marketing without ever considering the actual results for their use cases. It’s almost as if the sycophantic nature of all LLMs has completely blinded people from being rational, just because the thing is shiny and spoke to them in a way no one has in years.
On a surface level, LLMs are cool, no doubt, and they do have some uses. But past that, everyone needs to accept their limitations. LLMs by nature cannot operate the same way as a human brain. AGI is such a long shot because of this, and it’s a scam that LLMs are being marketed as AGI. How can we attempt to recreate the human brain as AGI when we are not even close to mapping out how our brains work in a way that could be translated into code, let alone the simpler brains elsewhere in the animal kingdom?
Depends on the rice and how much starch it has. Jasmine — no need to wash; sushi rice — must wash.